OBIEE: Schedule Segments with Saved Result Sets?

Hi!
I'd like to know how it is possible to solve this problem:
Imagine that we have segments that were created with Segment Designer and configured with Saved Result Sets (SRS). Some of them are quicker to execute at night, so the goal is to schedule the execution of those segments, and the saving of their result sets, for night time. The next morning, the marketer using Siebel Marketing, when designing and building the campaign, would only have to select the segment and choose to use the contacts from that segment's SRS, instead of running the segment on the fly (online)...
Was my explanation clear? How is this possible?
Thank you.
Regards,
Filipe Ganhão

Similar Messages

  • Problem in creating Saved Result Set (SRS) in OBIEE 10.1.3.4

    Hi,
    We have migrated Siebel Analytics 7.8.5 to OBIEE 10.1.3.4, and we are now unable to create any SRS from OBIEE, though we can create a segment and the marketing cache for the segment.
    We did the following steps -
    1. Uninstall Siebel Analytics 7.8.5
    2. Install OBIEE 10.1.3.4
    3. Use MIGRATE tool (sawmigrate) to migrate RPD & WEBCAT
    4. We ALTERed the SRS tables - M_SR_HEADER, M_SR_ACCOUNT (the OBIEE version adds many new columns)
    5. We passed GLOBAL CONSISTENCY in the RPD
    6. We followed the steps in the document *"Oracle®Marketing Segmentation Guide Version 10.1.3.4 July 2008"*
    7. We created a Saved Result Set format as instructed in the document - here we are very confused about selecting the list of columns; we don't know what the exact source / format should be
    8. Then we clicked the SRS create button
    9. We got the below error -
    Error Codes: QS2QOLYY:GDL67CY9:IHVF6OM7:OPR4ONWY:U9IM8TAC
    Error in getting cursor for WorkNode (Id:0)
    Authentication Failure.
    Odbc driver returned an error (SQLDriverConnectW).
    *State: 08004. Code: 10018. [NQODBC] [SQL_STATE: 08004] [nQSError: 10018] Access for the requested connection is refused. [nQSError: 43001] Authentication failed for Administrator in repository Star: invalid user/password. (08004)*
    Can anyone help us to resolve the issue ?
    A quick response is much much appreciated.
    Many Thanks,
    Prasanta

    Hi,
    It seems like you didn't set up the Administrator user for Saved Result Sets as mentioned in the Marketing Segmentation Guide.
    Here is an extract from the guide:
    Setting Up the Web Administrator for Managing Cache and Saved Result Sets
    Some queries issued by the segmentation engine require the use of the Execute Physical stored
    procedure. These queries include delete statements on the cache, delete statements on the saved
    result sets, and insert statements for the cache and saved result set. The Execute Physical stored
    procedure must be run by a user with administrator privileges. The administrator user is set up in
    the instanceconfig.xml file.
    NOTE: The BI Administrator password and login parameters are case sensitive.
    To set up the administrative user in the instanceconfig.xml file
    1 Open a command shell and navigate to the <OracleBI>/web/bin, where <OracleBI> represents
    the root directory of the installation.
    2 Execute the following command:
    cryptotools credstore -add -infile <OracleBIData>/web/config/credentialstore.xml
    3 When prompted, enter the following values:
    Credential Alias: admin
    Username: Administrator
    Password: <enter Admin password here>
    Do you want to encrypt the password? y
    Passphrase for encryption: <password >
    Do you want to write the passphrase to the xml? n
    File "<OracleBIData>/web/config/credentialstore.xml" exists. Do you want to overwrite it? y
    4 Open the credentialstore.xml file and verify that the following section has been created:
    <sawcs:credential type="usernamePassword" alias="admin">
    <sawcs:username> Administrator </sawcs:username>
    <sawcs:password>
    <xenc:EncryptedData>

  • Error when creating a saved result set

    Hi all,
    I want to create a saved result set with the marketing tool in OBIEE. I have made a saved result list format. But when I want to create the saved result set, I get this error:
    Error in getting cursor for WorkNode (Id:0)
    Authentication Failure.
    Odbc driver returned an error (SQLDriverConnectW).
    State: 08004. Code: 10018. [NQODBC] [SQL_STATE: 08004] [nQSError: 10018] Access for the requested connection is refused. [nQSError: 43001] Authentication failed for Administrator in repository Star: invalid user/password. (08004)
    Does anyone know how to solve this?

    Hi,
    You would have to create Credential store values for the Administrator User.
    (Refere to the Chapter 5 – “Oracle BI Presentation Services Credential Store” of the OBIEE Deployment Guide). You would have to include the Credential store info and additional Administrator user tag to overcome this issue.
    Hope this helps!
    Thanks
    Yuvaraj Narayanan

  • "Saved Result Sets" message when using the replace wizard in the Admin Tool

    Hi,
    I'm just trying to replace an existing Essbase source with a new one in my Business Model using the replace wizard. (It's a cube with > 1.5k account members, so I'm really not looking forward to doing that manually.)
    So far all migrations have worked fine, but somehow this time after specifying which tables to use for the replace, instead of showing me a list of columns which will be changed, the wizard throws a "Saved Result Sets" message and just hangs. Every time I hit "Finish" the message appears again.
    Here's a screenshot .
    Anyone ever came across this?
    PS: what's up with the links these days? Jive acting up again?
    Edited by: Christian Berg on Feb 10, 2010 2:57 PM

    Update: looks like the wizard is really overwhelmed with big cubes (1.5k accounts, 20 dims @ approx 10 generations each) and just breaks.
    I haven't figured out the exact limit, but I'm able to migrate packs of about 8 dims at a time as long as I leave the accounts dim out and do that one on its own.
    Cheers,
    C.

  • Oracle 8.1.6 Thin Driver with Multiple Result Sets

    We're using Oracle 8.1.6 on NT using the latest driver release.
    Java 1.2.2
    We're experiencing problems with resultSet.next() when we have multiple result sets open. What appears to be happening is that when you've read the last result set entry and do a .next() call, which should return false, we actually get java.sql.SQLException: ORA-01002: fetch out of sequence.
    This seems to us that the driver is trying to go beyond the end of the result set.
    We've checked JDBC standards (and examples on this site) and the code we've got is compliant. We've also found that the code produces the correct results under Oracle 7.3.4 and 8.0.4.
    I can also say that there is no other activity on the db, so there are no issues such as roll back segments coming into play.
    Any solutions, help, advice etc would be gratefully appreciated!

    Phil,
    By "multiple result sets open", do you mean you are using REF Cursors, or do you have multiple statements opened, each with its own ResultSet? If you could post an example showing what the problem is, that would be very helpful.
    You don't happen to have 'for update' clause in your SQL statement, do you?
    Thanks
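
    Following up on the 'for update' question above: one well-known way to hit ORA-01002 is committing (including implicitly, via auto-commit) while a SELECT ... FOR UPDATE cursor is still being fetched, which invalidates the cursor. The sketch below only illustrates that pattern, written with modern JDBC syntax and an invented table name; it is not a confirmed diagnosis of Phil's case.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ForUpdateFetch {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password")) {
                // Keep the transaction open for the whole fetch; a commit after
                // each statement would close the FOR UPDATE cursor, and the next
                // rs.next() could then raise ORA-01002.
                conn.setAutoCommit(false);
                try (Statement stmt = conn.createStatement();
                     ResultSet rs = stmt.executeQuery(
                             "SELECT id, status FROM work_queue FOR UPDATE")) {
                    while (rs.next()) {
                        // process / update each row here
                    }
                }
                conn.commit(); // commit only after the cursor has been exhausted
            }
        }
    }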

  • Calling a Stored Procedure with a result set and returned parms

    I am calling a stored procedure from a Brio report. The stored procedure returns a result set, but also passes back some parameters. I am getting a message that states that the DBMS cannot find the SP with the same footprint. Besides the result set, the SP returns 3 OUT parameters: (Integer, char(60), char(40)). Is there something special I need to do in Brio to format the OUT parameters to be passed back from the SP?
    Thanks,
    Roland

    Did you try just declaring the vars?
    untested
    declare
      myCur SYS_REFCURSOR;
      myRaw RAW(4);
      BEGIN
        test (0, 0, myRaw, sysdate, myCur);
      END;
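
    For reference, the same footprint issue seen from the JDBC side: every parameter of the procedure, including the OUT ones and a REF CURSOR for the result set, has to appear in the call. The procedure name, parameter order, and the REF CURSOR assumption below are mine, not from Roland's post; this is a generic sketch, not Brio-specific advice.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Types;
    import oracle.jdbc.OracleTypes;

    public class CallProcWithOutParams {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password");
                 // One placeholder per parameter so the call matches the
                 // procedure's full footprint.
                 CallableStatement cs = conn.prepareCall(
                         "{call my_report_proc(?, ?, ?, ?, ?)}")) {
                cs.setInt(1, 42);                               // sample IN parameter
                cs.registerOutParameter(2, OracleTypes.CURSOR); // the result set, as a REF CURSOR
                cs.registerOutParameter(3, Types.INTEGER);      // OUT Integer
                cs.registerOutParameter(4, Types.CHAR);         // OUT char(60)
                cs.registerOutParameter(5, Types.CHAR);         // OUT char(40)
                cs.execute();

                try (ResultSet rs = (ResultSet) cs.getObject(2)) {
                    while (rs.next()) {
                        // read the report rows here
                    }
                }
                System.out.println(cs.getInt(3) + " / " + cs.getString(4) + " / " + cs.getString(5));
            }
        }
    }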

  • Working With A Result Set

    Hi Everybody,
    I'm working on an application which will retrieve data from a database for the past seven days and then find the average for each of the days. The database is basically (id, date, hour, houraverage). I can connect to the database okay, but the problem I have is working with the result set:
    ResultSet rs = d.PerformQuery("0");
    while (rs.next()) {
        if (firsttime) {
            recorddate = rs.getString(2);
            firsttime = false;
            System.out.println("First Time");
            numberofdays = 1;
        }
        if (!firsttime) {
            if (recorddate == rs.getString(2)) {
                System.out.println(" Not First Time");
                thisdaystotal = thisdaystotal + Float.parseFloat(rs.getString(4));
                numberofrecords = numberofrecords + 1;
                System.out.println(thisdaysaverage);
                System.out.println(rs.getString(2));
            }
            if (recorddate != rs.getString(2)) {
                thisdaystotal = thisdaystotal + Float.parseFloat(rs.getString(4));
                numberofdays = numberofdays + 1;
                thisdaysaverage = thisdaysaverage / numberofrecords;
                recorddate = rs.getString(2);
            }
        }
    }
    for (int dayssofar = 1; dayssofar <= numberofdays; dayssofar++) {
        System.out.println("Daily Average: " + dailyaverages[dayssofar]);
    }
    Basically the flaw in the program is that it can't tell if a record has the same date as the records before it - it can tell if they are different but not if they are the same. Can anybody see any flaws in the logic?
    Andrew

    Why don't you try using SQL to get what you need? For example, you can use:
    SELECT DATE, AVG(HOURAVERAGE)
    FROM TableName
    WHERE DATE > 'DATE1'
    GROUP BY DATE
    (Replace 'DATE1' with your specific date. If you are using Oracle, replace it with trunc(sysdate) - 7 to get the most recent 7 days.)
    Then you will get your average values. There is no need to average them in your Java program.
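
    A minimal JDBC sketch of that approach, assuming the table is called hour_averages, the date column is record_date, and Oracle connection details as placeholders (none of these names are from Andrew's post):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class DailyAverages {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password");
                 // Let the database do the grouping and averaging instead of
                 // tracking running totals row by row in Java.
                 PreparedStatement ps = conn.prepareStatement(
                         "SELECT record_date, AVG(houraverage) "
                       + "FROM hour_averages "
                       + "WHERE record_date > TRUNC(SYSDATE) - 7 "
                       + "GROUP BY record_date "
                       + "ORDER BY record_date");
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println("Daily Average for " + rs.getDate(1)
                            + ": " + rs.getDouble(2));
                }
            }
        }
    }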

  • Web Services with Large Result Sets

    Hi,
    We have an application wherein a call to a web service could potentially yield a large result set. For the sake of argument, let's say that we cannot limit the result set size, i.e., by criteria narrowing or some other means.
    Have any of you handled paging when using Web Services? If so can you please share your experiences considering Web Services are stateless? Any patterns that have worked? I am aware of the Value List pattern but am looking for previous experiences here.
    Thanks

    Joseph Weinstein wrote:
    Aswin Dinakar wrote:
    I ran the test again and I removed the ResultSet.FETCH_FORWARD and it still gave me the same OutOfMemory error.
    The problem to me is similar to what Slava has described. I am parsing the result set in memory, storing the results in a hash map and then emptying the post-processed results into a table.
    The hash map turns out to be very big and the JVM throws an OutOfMemory exception.
    I am not sure how I can turn this around -
    I can partition my query so that it returns smaller chunks or "blocks" of data each time (say a page of data or two pages of data). Then I can store a page of data in the table. The problem with this approach is that it is not exactly transactional. Recovery would be very difficult in this approach.
    I could do this in a try catch block page by page and then the catch could go ahead and delete the rows that got committed. The question then becomes: what if that transaction fails?
    It sounds like you're committing the 'cardinal performance sin of DBMS processing', of shovelling lots of raw data out of the DBMS, processing it in some small way, and sending it (or some of it) back. You should instead do this processing in a stored procedure or procedures, so the data is manipulated where it is. The DBMS was written from the ground up to be a fast, efficient set-based processor. Using clever SQL will pay off greatly. Build your saw-mills where the trees are.
    Joe
    Yes, we did think of stored procedures. Like I mentioned yesterday, some of the post processing depends on Unicode and specific character sets. Java seemed ideally suited to this since it handles these Unicode characters very well and has all these libraries we can use. Moving this to the DBMS would make it proprietary (not that we won't do it if it became absolutely essential), but it's one of the reasons why the post processing happens in Java. Now that you mention it, stored procedures seem the best option.
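
    For completeness, a sketch of the page-by-page idea Aswin describes, using keyset pagination so only one chunk is ever held in memory, with each page committed in its own transaction. Table and column names are invented; as Joe notes, pushing the work into the database is usually the better answer.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class ChunkedPostProcessing {
        private static final int PAGE_SIZE = 1000;

        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password")) {
                conn.setAutoCommit(false);   // one transaction per page
                long lastId = 0;             // keyset pagination cursor
                boolean more = true;
                while (more) {
                    more = false;
                    try (PreparedStatement ps = conn.prepareStatement(
                             "SELECT id, raw_text FROM source_rows WHERE id > ? ORDER BY id");
                         PreparedStatement ins = conn.prepareStatement(
                             "INSERT INTO processed_rows (id, clean_text) VALUES (?, ?)")) {
                        ps.setMaxRows(PAGE_SIZE);   // cap each chunk instead of loading everything
                        ps.setLong(1, lastId);
                        try (ResultSet rs = ps.executeQuery()) {
                            while (rs.next()) {
                                more = true;
                                lastId = rs.getLong(1);
                                String raw = rs.getString(2);
                                ins.setLong(1, lastId);
                                ins.setString(2, raw == null ? null : raw.trim()); // stand-in for the real post-processing
                                ins.addBatch();
                            }
                        }
                        ins.executeBatch();
                    }
                    conn.commit();   // each page commits on its own
                }
            }
        }
    }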

  • Help with streaming result sets and prepared statements

    hi all
    I create a callable statement that is capable of streaming.
    statement = myConn2.prepareCall("{call graphProc(?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}",java.sql.ResultSet.TYPE_FORWARD_ONLY,
    java.sql.ResultSet.CONCUR_READ_ONLY);
    statementOne.setFetchSize(Integer.MIN_VALUE);
    The class that contains the query is instantiated 6 times. The first instance streams the results beautifully, but then when the second
    rs = DatabaseConnect.statementOne.executeQuery();
    is executed, I get the following error:
    java.sql.SQLException: Can not use streaming results with multiple result statements
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:910)
    at com.mysql.jdbc.MysqlIO.readAllResults(MysqlIO.java:1370)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1688)
    at com.mysql.jdbc.Connection.execSQL(Connection.java:3031)
    at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:943)
    at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1049)
    at com.mysql.jdbc.CallableStatement.executeQuery(CallableStatement.java:589)
    The 6 instances are not threaded, and the result set is closed before the next query executes. Is there a solution to this problem? It would be greatly appreciated.
    thanks a lot
    Brian

    Database resources should have the narrowest scope
    possible. I don't think it's a good idea to use a
    ResultSet in a UI to generate a graph. Load the data
    into an object or data structure inside the method
    that's doing the query and close the ResultSet in a
    finally block. Use the data structure to generate
    the graph.
    It's an example of MVC and layering.
    OK, that is my bad for not elaborating on the finer points, sorry. The results are not streamed directly into the graphs from the result set; they are processed in another object and then plotted from there.
    With regard to your statement at the beginning, I would like to ask if you think it is at least a viable option to create six connections. With that said, would you be able to estimate how many users the six connections could serve under full usage?
    Just a few thoughts that I want to bounce off you if you don't mind:
    Closing the statement would defeat the object of having a callable statement
    How so? I don't agree with that.
    Again, I apologise; I assumed that since callable statements inherit from prepared statements, they would have the pre-compiled SQL statement functionality of prepared statements. Well, if you consider the example I'm about to give, maybe you will see my point, at least with regard to this.
    The statement that I create uses a connection and is created statically at the start of the program. Every time I make a call, the same statement, and thus the same connection, is used; creating a new connection each time takes up time and resources, and as you know, every second counts.
    Thanks for your thoughts,
    Brian.
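
    For what it's worth, a sketch of how Connector/J streaming is usually isolated: a dedicated statement (and here a dedicated connection) per query, with the result set fully drained and closed before that connection runs anything else. The class, interface, and URL below are made up; this is a pattern sketch, not Brian's code.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class StreamingQueryRunner {
        // One connection per streaming query avoids clashes with other statements
        // on the same connection; sharing a single static statement across callers
        // is what usually triggers "Can not use streaming results with multiple
        // result statements".
        private final String url;      // e.g. "jdbc:mysql://host/db" (placeholder)
        private final String user;
        private final String password;

        public StreamingQueryRunner(String url, String user, String password) {
            this.url = url;
            this.user = user;
            this.password = password;
        }

        /** Streams one query and hands each row to the caller via a simple callback. */
        public void stream(String sql, RowHandler handler) throws Exception {
            try (Connection conn = DriverManager.getConnection(url, user, password);
                 PreparedStatement ps = conn.prepareStatement(
                         sql,
                         ResultSet.TYPE_FORWARD_ONLY,
                         ResultSet.CONCUR_READ_ONLY)) {
                // Connector/J switches to row-by-row streaming for this combination.
                ps.setFetchSize(Integer.MIN_VALUE);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        handler.onRow(rs);
                    }
                } // result set fully consumed and closed before the connection is reused
            }
        }

        public interface RowHandler {
            void onRow(ResultSet rs) throws Exception;
        }
    }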

  • Connection with updatable result set

    I am using Java and MySQL.
    I need to get an updatable result set.
    But when I check the MetaData of the connection I get the following: "Updatable result sets are not supported"
    and I get a read-only result set.
    Can you help me to get an updatable result set?
    Thank You in advance.
    database properties
    db.driver=org.gjt.mm.mysql.Driver
    db.connectionString=jdbc\:mysql\://host/base
    See code:
    try {
        conn = getConnection();
        DatabaseMetaData meta = conn.getMetaData();
        if (meta.supportsResultSetConcurrency(
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_UPDATABLE)) {
            // Updatable result sets are supported
            setTitle( "Updatable result sets are supported" );
        } else {
            // Updatable result sets are not supported
            setTitle( "Updatable result sets are not supported" );
        }
    } catch (SQLException e) {
        e.printStackTrace();
    }

    Well, if the JDBC driver you are using does not support updatable result sets, then you should find a different JDBC driver or a different solution to your problem. Most likely you can solve your problem with a technique other than an updatable result set.
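
    A sketch of the "different solution" idea: read with an ordinary read-only result set and write back with an explicit batched UPDATE, so nothing depends on CONCUR_UPDATABLE support in the driver. The orders table and its columns are invented for illustration.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;

    public class UpdateWithoutUpdatableResultSet {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://host/base", "user", "password")) {
                // 1. Read the rows of interest with a plain, read-only result set.
                List<Long> ids = new ArrayList<>();
                try (Statement st = conn.createStatement();
                     ResultSet rs = st.executeQuery(
                             "SELECT id FROM orders WHERE status = 'NEW'")) {
                    while (rs.next()) {
                        ids.add(rs.getLong(1));
                    }
                }
                // 2. Apply the changes with an explicit UPDATE instead of
                //    ResultSet.updateRow(), so driver support for updatable
                //    result sets is irrelevant.
                try (PreparedStatement upd = conn.prepareStatement(
                        "UPDATE orders SET status = 'PROCESSED' WHERE id = ?")) {
                    for (Long id : ids) {
                        upd.setLong(1, id);
                        upd.addBatch();
                    }
                    upd.executeBatch();
                }
            }
        }
    }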

  • Stored Procedure With Multiple Result Sets As Report Source : Crosspost

    Hello Everyone,
    I have an issue where I have created a stored procedure that returns multiple result sets:
    /* Input param = @SalesOrderID */
    SELECT * FROM Orders TB1
      INNER JOIN OrderDetails TB2 ON  TB1.ID = TB2.ID
    WHERE TB1.OrderID = @SalesOrderID
    SELECT * FROM Addresses
      WHERE Addresses.OrderID = @SalesOrderID AND Addresses.AddressType = 'Shipping'
    SELECT * FROM Addresses
      WHERE Addresses.OrderID = @SalesOrderID AND Addresses.AddressType = 'Billing'
    This is just a quick sample, the actual procedure is a lot more complex but this illustrates the theory.
    When I set the report source in Crystal X to the stored procedure it is only allowing me to add rows from the first result set.
    Is there any way to get around this issue?
    The reason that I would prefer to use a stored procedure to get all the data is simply performance. Without using one big stored procedure I would have to run at least 6 sub reports which is not acceptable because the number of sub reports could grow exponentially depending on the number of items for a particular sales order.
    Any ideas or input would be greatly appreciated.
    TIA
        - Adam
    P.S
    Sorry for the cross post; I originally posted this question in another forum (original link is broken)
    but was informed that it might be the wrong forum.
    Edited by: Adam Harris on Jul 30, 2008 9:44 PM

    Adam, apologies for the redirect, but it is better to have .NET posts in one place. That way anyone can search the forum for answers. (and I do not have the rights to move posts).
    Anyhow, as long as the report is created, you should be able to pass the datasets as:
    crReportDocument.Database.Tables(0).SetDataSource(dataSet.Tables("NAME_OF_TABLE"))
    Of course alternatively, (not sure if this is possible in your environment) you could create a multi-table ADO .NET dataset and pass that to the report.
    Ludek

  • Problem with scrollable result set

    Hi,
    Can you please tell me how I can get the number of rows from a result set?
    I am using thin drivers and connecting to a remote database. Everything is working fine if I use the forward-only type, but if I use the scrollable type it gives an error.
    code:
    stmt=con.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE,ResultSet.CONCUR_READ_ONLY);
    I am getting the error at this line.
    error :
    java.lang.AbstractMethodError:oracle/jdbc/driver/oracleconnection.createStatement
    please help me

    Does your JDK include JDBC 2.0? Is your driver updated to support JDBC 2.0 features? Have you recently updated either (and forgotten to change the classpath)?
    I am running JDK 1.3 and the Oracle thin driver (classes12.zip) and have not had problems creating/using scrollable result sets.
    Jamie
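
    Once a JDBC 2.0-capable driver is on the classpath, one common way to get the row count is to jump to the last row of a scrollable result set and read getRow(). Connection details and the table name below are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class RowCountWithScrollableResultSet {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password");
                 // Requires a JDBC 2.0 capable driver on the classpath
                 // (for old Oracle thin drivers, classes12.zip or later).
                 Statement stmt = con.createStatement(
                         ResultSet.TYPE_SCROLL_INSENSITIVE,
                         ResultSet.CONCUR_READ_ONLY);
                 ResultSet rs = stmt.executeQuery("SELECT * FROM some_table")) {

                // Jump to the last row; getRow() then reports the row count.
                int rowCount = 0;
                if (rs.last()) {
                    rowCount = rs.getRow();
                    rs.beforeFirst();   // rewind so the rows can still be processed
                }
                System.out.println("Rows in result set: " + rowCount);

                while (rs.next()) {
                    // process each row here
                }
            }
        }
    }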

  • Problem with Call_Form to Return to the Previous Screen with Saved Results

    Hello All,
    I have two forms (e.g. Form A and Form B); both forms have options to call one another with a Push Button included in the form. Our users want to be able to return to Form B from Form A with the previous search result and parameters saved in Form B. I was able to use the following code to make this happen:
    1. In Form B, I added the following code in the When-Button-Pressed Trigger for the 'PB_Return_2_Form_A' item:
    CALL_FORM('Form_A', HIDE,NO_REPLACE, NO_QUERY_ONLY);
    2. In Form A, I used the following code in the When-Button-Pressed Trigger for the 'PB_Return_2_Form_B' item:
    EXIT_FORM;
    The above code works beautifully when I enter Form B first and search the database with all the required parameters entered; the parameters and search results are displayed upon return from Form A. But the problem is that when I enter Form A first, the form exits abruptly. I tried to use pre-condition testing in Form A, such as the following code:
    IF :GLOBAL.DWR01_First_Call = 'Y' THEN
    NEW_FORM('DWR_SEL', NO_ROLLBACK);
    :GLOBAL.DWR01_First_Call := 'N';
    ELSE
    EXIT_FORM;
    END IF;
    (BTW, I initialize the :GLOBAL.DWR01_First_Call variable to 'Y' in the 'When-New-Form-Instance' trigger at the form level.) But the above code does not work. For some reason the :GLOBAL.DWR01_First_Call = 'Y' condition always gets executed, such that the working part of the code is bypassed.
    If any of you can help me on resolving my problem asap, I would really appreciate it.
    Thanks to you all in advance,
    Jinlan
    --

    Thanks all for the information and help. I have read through the information provided in the links above, and modified the global variable accordingly in both the calling and called forms, but all tries failed. From my failed attempts, I realize that my current design might not work because the value of GLOBAL.DWR01_First_Call seems to be 'Y' each time when Form A is called as such the 'EXIT_FORM' statement in the ELSE condition will never be executed. This intrigues me with the following questions:
    I. How do I use the debug in Forms 6i to step thru the values of a variable at a given execution state? (Although, I am pretty certain that the :GLOBAL.DWR01_First_Call value is always going to be a 'Y' each time when Form A is called, it would be nice if I could see it myself and not just guessing. :) )
    II. By now, I am pretty sure my current design may not work... How do I make my code to work for the situation I described above? For your convenience, I restate my situation again below and hopefully it will clarify my needs:
    Both Form A and Form B have options to call one another with a Push Button included in each form. Our users want to be able to return to Form B from Form A with a previous search result and parameters saved in Form B, and the following code works:
    1. In Form B, I added the following code in the When-Button-Pressed Trigger for the 'PB_Return_2_Form_A' item:
    CALL_FORM('Form_A', HIDE,NO_REPLACE, NO_QUERY_ONLY);
    2. In Form A, I used the following code in the When-Button-Pressed Trigger for the 'PB_Return_2_Form_B' item:
    EXIT_FORM;
    But the above code kicks the user out of the application completely if the user enters Form A first and then selects Form B by pressing the 'PB_Return_2_Form_B' push button (BTW, this access mechanism is the one most commonly used by our users). So, how do I allow users to enter Form A first, then call Form B and proceed with entering parameters and running the search query, without exiting the form?
    Any and all the help you can provide on this will be greatly appreciated.
    Thanks millions in advance,
    Jinlan Tomasic
    [email protected]
    1(907)458-6899

  • EXECUTE IMMEDIATE with a result set based DDL command?

    How can I get the exec immed to work in this manner?
    execute immediate 'select ' DROP INDEX '' || INDEX_NAME || '' from dba_indexes where index_name = ''%STRING%'';
    I want to retrieve the list of INDEXES from the dba_indexes where the name is like a certain STRING and drop them.
    Is there a better way to do this all in a PL/SQL session/command?
    Thanks for any help I can get,
    Miller

    I resolved issue using a spool file and executing that like such:
    /* This will remove all objects created by the migration objects */
    set heading off
    set feedback off
    spool c:\temp\cleanup.txt
    select 'DROP INDEX ' || INDEX_NAME || ' ; ' from dba_indexes where index_name like '%99%' and index_type = 'NORMAL' and index_name not like '%SYS%';
    select 'DROP TABLE ' || TABLE_NAME || ' ;'  from dba_tables where table_name like 'DYNEGY_%' and table_name <> 'DYNEGY_SCRIPT_STATISTICS';
    spool off
    @c:\temp\cleanup.txt
    /

  • Error executing a query with large result set

    Dear all,
    after executing a query which uses key figures with exception aggregation the BIA-server log (TrexIndexServerAlert_....trc) displays the following messages:
    2009-09-29 10:59:00.433 e QMediator    QueryMediator.cpp(00324) : 6952; Error executing physical plan: AttributeEngine: not enough memory.;in executor::Executor in cube: bwp_v_tl_c02
    2009-09-29 10:59:00.434 e SERVER_TRACE TRexApiSearch.cpp(05162) : IndexID: bwp_v_tl_c02, QueryMediator failed executing query, reason: Error executing physical plan: AttributeEngine: not enough memory.;in executor::Executor in cube: bwp_v_tl_c02
    --> Does anyone know what this message means exactly? I fear that the BIA installation is running out of physical memory, but I would appreciate any other opinion.
    - Package Wise Read (SAP Note 1157582) does not solve the problem as the error message is not: "AggregateCalculator returning out of memory with hash table size..."
    - To get an impression of the data amount I had a look at table RSDDSTAT_OLAP of a query with less amount of data:
       Selected rows      : 50.000.000 (Event 9011)
       Transferred rows :   4.800.000 (Event 9010)
    It is possible to calculate the number of cells retrieved from one index by multiplying the number of records from table RSDDSTAT_OLAP by the number of key figures in the query. In my example there is only one key figure, so 4.800.000 cells are passed to the analytical engine.
    --> Is there a possibility to find this figure in some kind of statistic table? This would be helpful for complex queries.
    I am looking forward to your replies,
    Best regards
    Bjoern

    Hi Björn,
    I recommend you upgrade to rev 52 or 53. Rev. 49 is really stable, but there are some bugs in it, and if you use BW SP >= 17 you can use some features.
    Please refer to this thread: you shouldn't use more than 50% of your memory (the other 50% are for reporting); therefore I have stolen this quote from Vitaliy:
    The idea is that the data for all of your BIA indexes together does not consume more than 50% of the total RAM of the active blades (page memory cannot be counted!).
    The simplest test is, during a period when no one is using the accelerator, to remove all indexes (including temporary ones) from the main memory of the BWA, and then load all indexes for all InfoCubes into main memory. Then check your RAM utilization.
    Regards,
    -Vitaliy
    Regards,
    Jens
