Incremental Load - is it possible?

Hi,
I am new to Essbase and trying to understand whether it is possible to incrementally load data from an OLTP system. My requirement is this: I have a relational OLTP database where service orders are processed, and I'd like to extract data from it into Essbase. Because of the large volumes of data in the OLTP database, I want to pick up only data that is new or has changed, on a regular basis, and update the Essbase cube incrementally.
My questions are: How can I do this? And where can I find documentation related to incremental loads?
Thanks and Regards
Dibyendu

Hi Dibyendu,
It is possible to use MaxL, Esscmd, or the Essbase API to automate data loads. We have been using Esscmd to load data from an MSSQL/Domino web application for years, and it works well. The web app periodically performs a 'transfer', which entails writing a data load file, an organization dimension build file, a scenario dimension build file, two calc scripts (clear and calc), an Esscmd script, and a DOS batch file to the Essbase server disk, then executing the batch file. The Esscmd script has error handling and returns an error condition if an error occurred.
Jeff McAhren
Dallas, Texas
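To make the approach concrete, here is a minimal MaxL sketch of such an automated load; the application, database, and file names are placeholders rather than Jeff's actual setup:

    login 'admin' 'password' on 'essbase_server';

    /* load the changed/new records exported from the OLTP system */
    import database 'Sample'.'Basic' data
        from data_file 'transfer.txt'
        using server rules_file 'ldall'
        on error write to 'dataload.err';

    /* aggregate the newly loaded data */
    execute calculation 'Sample'.'Basic'.'CalcAll';

    logout;

A scheduled batch file can run this with the MaxL shell (essmsh). Since an Essbase data load only touches the cells present in the load file, feeding it just the new or changed OLTP rows is, in effect, an incremental load.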

Similar Messages

  • Is incremental load possible using SQL*Loader?

    Hi,
    I am working on a data warehousing project, and every day I load 3 lakh (300,000) records to the target server using SQL*Loader. Is incremental loading possible with SQL*Loader?
    Example: on the first day I loaded 3 lakh records into the target using SQL*Loader; the next day I need to load another 2 lakh records. Using SQL*Loader, how do I do the incremental load?
    Thanks in advance
    Mohan

    Hi,
    SQL*Loader has three load options:
    Append
    Replace
    Truncate
    The first option will help you: APPEND adds the new data without touching what is already loaded. To have duplicate records rejected, make sure the table has the appropriate unique constraints. A minimal control file for this approach is sketched below.
    Prashant_Arvind
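    A minimal SQL*Loader control file sketch for the APPEND approach (the table, file, and column names are made up for illustration):

        LOAD DATA
        INFILE 'orders_day2.dat'
        APPEND
        INTO TABLE target_orders
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        (order_id, order_date DATE 'YYYY-MM-DD', amount)

    With a unique constraint on target_orders(order_id), any rows repeated from the first day's load violate the constraint and land in the bad file, so re-running the load does not create duplicates in the table.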

  • Incremental load into the Dimension table

    Hi,
    I have a problem doing an incremental load of a dimension table. Before loading into the dimension table, I would like to check the data already in the dimension table.
    In my dimension table I have one NOT NULL surrogate key; the other dimension attributes are nullable. The NOT NULL surrogate key is populated with a Sequence Generator.
    To do the incremental load I have done the following:
    I made a lookup into the dimension table and looked for a key. The key from the lookup table I passed to an Expression operator. In the Expression operator I created one field and hard-coded a flag based on the key from the lookup table. I passed this flag to a Filter operator along with the rest of the fields from the source.
    By doing this, I am not able to pass the new records to the dimension table.
    Can you please help me?
    I have another question also:
    How do I update a NOT NULL key in the fact table?
    Thanks
    Vinay

    Hi Mark,
    Thanks for your help in solving my problem. I thought I would share more information by giving the SQL.
    Below are the two SQL statements I would like to implement through OWB.
    Both of the following tasks need to be accomplished after loading the fact table.
    Task 1:
    UPDATE fact_table c
    SET c.dimension_table_key =
        (SELECT nvl(dimension_table.dimension_table_key, 0)
         FROM   src_dimension_table t,
                dimension_table dimension_table
         WHERE  c.ssn = t.ssn(+)
         AND    c.date_src_key = to_number(t.date_src(+), '99999999')
         AND    c.time_src_key = to_number(substr(t.time_src(+), 1, 4), '99999999')
         AND    c.wk_src = to_number(concat(t.wk_src_year(+), concat(t.wk_src_month(+), t.wk_src_day(+))), '99999999')
         AND    nvl(t.field1, 'Y') = nvl(dimension_table.field1, 'Y')
         AND    nvl(t.field2, 'Y') = nvl(dimension_table.field2, 'Y')
         AND    nvl(t.field3, 'Y') = nvl(dimension_table.field3, 'Y')
         AND    nvl(t.field4, 'Y') = nvl(dimension_table.field4, 'Y')
         AND    nvl(t.field5, 'Y') = nvl(dimension_table.field5, 'Y')
         AND    nvl(t.field6, 'Y') = nvl(dimension_table.field6, 'Y')
         AND    nvl(t.field7, 'Y') = nvl(dimension_table.field7, 'Y')
         AND    nvl(t.field8, 'Y') = nvl(dimension_table.field8, 'Y')
         AND    nvl(t.field9, 'Y') = nvl(dimension_table.field9, 'Y'))
    WHERE c.dimension_table_key = 0;
    The fact table in the above SQL is fact_table.
    The dimension table in the above SQL is dimension_table.
    The source table for the dimension table is src_dimension_table.
    dimension_table_key is a NOT NULL key in the fact table.
    Task 2:
    UPDATE fact_table cf
    SET cf.key_1 =
        (SELECT nvl(max(p.key_1), 0)
         FROM   dimension_table p
         WHERE  p.field1 = cf.field1
         AND    p.source = 'YY')
    WHERE cf.key_1 = 0;
    The fact table in the above SQL is fact_table.
    The dimension table in the above SQL is dimension_table.
    key_1 is a NOT NULL key in the fact table.
    Is it possible to achieve the above tasks through Oracle Warehouse Builder (OWB)? I created the mappings for loading the dimension table and the fact table and they are working fine, but I have not been able to implement the two queries above through OWB. I would be thankful if you could help me out.
    Thanks
    Vinay

  • Incremental Loads and Refresh Date

    Hi all,
    Thank you for taking the time to review this post.
    Environment
    Oracle BI Applications 7.9.6 (Financial & Project Analytics)
    Oracle E-Business Suite 11.5.10
    Question
    I have a Test BI Apps 7.9.6 instance in a Test environment that is connected to a static EBS 11.5.10 data source. As part of my testing phase I'd like to do multiple incremental loads to get an accurate performance-impact and timing study for the final pre-approval before migrating to Production. After my initial full load, I can get a refresh of EBS which has a week's worth of transactions. What I'd like to do is change the Refresh Dates to "trick" the incremental load into loading only one day's worth of data at a time, rather than the full week's worth in a single incremental load. Is this possible, and if so, how?
    Example timeline:
    Today - Initial Full load using Test EBS as of today
    1 week later - Refresh static Test EBS from Production with a week of transactions
    Post Refresh - Run daily Incremental jobs using static Test EBS
    First Incremental Load - Today's position + 1 day
    Second Incremental Load - Today's position + 2 days
    Third Incremental Load - Today's position + 3 days, etc.
    As always all comments and solutions greatly appreciated.
    Kind Regards,
    Gary.

    Say on the 1st of the month you did a load.
    Then on the 8th of the month the source EBS system was itself refreshed.
    What you want to do is run a single-day refresh on the 8th (for all data from the 1st to the 2nd of the month), and then another single-day refresh -- whether on the 8th or on the 9th, you don't care -- for all data from the 3rd to the 4th.
    Unfortunately, the refresh runs from the last refresh date to the current date; you can't define a "refresh up to" date. Therefore, your first 'incremental' refresh on the 8th would refresh all data from the 2nd to the 8th in one shot. What you could try to do is:
    a. After the first load on the 1st, shut down the BI DWH.
    b. When the EBS test source is refreshed on the 8th, reset your SYSTEM CLOCK ON THE BI DWH DATABASE SERVER to the 2nd (or 3rd) of the month.
    c. Now, when you run a refresh, BI will extract all data from the 1st to the 2nd or 3rd (even though EBS is as of the 8th).
    d. Once this is done, shut down the BI DWH.
    e. Reset the SYSTEM CLOCK ON THE BI DWH DATABASE SERVER to the 3rd or 4th of the month.
    f. Run another incremental refresh.
    ... and so on ...
    Hemant K Chitale
    http://hemantoracledba.blogspot.com

  • Incremental Load with ODI

    Hi All,
    I have a few questions related to ODI.
    1. ODI can be used to migrate data from DB2/400 to Oracle 10g on AIX. This is possible, right?
    2. My DB2 system is a core system. Can ODI handle incremental loads to the target database? For example, if a particular table sees updates every 10 minutes, will ODI be able to capture the changes on this table and reflect them in the target database?
    Any suggestions would be appreciated.
    Thanks

    Yes, you can, by using triggers. The trade-off is that a trigger must be installed on each source table, and several temporary tables are created in the source database.
    As for CDC using the native journal, we are in the process of testing it.
    Edited by: oracletruedb on Oct 7, 2008 2:33 AM
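    For illustration, a minimal sketch of the kind of objects trigger-based change capture sets up (the table and trigger names here are hypothetical; ODI generates its own J$ journal objects):

        -- journal table recording the key and nature of each change
        CREATE TABLE j$orders (
          order_id NUMBER,
          jrn_flag CHAR(1),   -- 'I' = insert, 'U' = update, 'D' = delete
          jrn_date DATE
        );

        -- trigger on the source table feeding the journal
        CREATE OR REPLACE TRIGGER trg_orders_cdc
        AFTER INSERT OR UPDATE OR DELETE ON orders
        FOR EACH ROW
        BEGIN
          IF DELETING THEN
            INSERT INTO j$orders VALUES (:OLD.order_id, 'D', SYSDATE);
          ELSIF UPDATING THEN
            INSERT INTO j$orders VALUES (:NEW.order_id, 'U', SYSDATE);
          ELSE
            INSERT INTO j$orders VALUES (:NEW.order_id, 'I', SYSDATE);
          END IF;
        END;
        /

    The incremental interface then reads only the keys journaled since its last run, which is why the overhead of this approach lands on the source system.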

  • DAC Incremental load with new instance

    We have our daily incremental load running from one instance, but now we would like to run a load from another instance (not a full load). How can this be achieved?

    One possible way to do this is to create, in AWM, a cube script containing a Load command with a WHERE-clause filter that restricts which records are loaded into the cube. This cube script can then be run to load only partial data from the instance.

  • Incremental loading in OWB

    Hi all,
    This is urgent.
    I came to know that we can use a filter on a date field, but I can't work out which condition to place in the filter. The source table has fields like prod_id, prod_desc, created_date, etl_process_date.
    The target table has attributes prod_id, prod_desc, created_date, modified_date. I also have to perform a Type 2 (SCD) mapping for any modification of the prod_desc field.
    So my requirement is Type 2 with incremental loading in OWB.
    Please let me know about the incremental loading part, i.e. how I could develop it using a filter condition.

    You can do it by various methods:
    1. Load date-time: store the latest load date-time in a control table, and while loading the target table make sure the data date is after that load date-time (a Joiner can be used to specify the condition); see the sketch after this list.
    2. Create an mlog and mview (materialized view log and materialized view) to pick up incremental data from the source table, and use this set as the source of your map.
    3. If possible, add a column to the source table that indicates the status of processed records. For example, add a column named processed_flag and set its status to YES in the pre-map process; process only the records with status YES, and at the end of the map set the status to DONE. Then in the next cycle set the records with NULL in processed_flag to YES and repeat the aforementioned process.
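    A minimal SQL sketch of the control-table approach in method 1, using the poster's column names (the etl_control table, the map name, and the src_products table name are made up for illustration):

        -- pick up only rows newer than the last successful load
        SELECT s.prod_id, s.prod_desc, s.created_date
        FROM   src_products s
        WHERE  s.etl_process_date >
               (SELECT last_load_date FROM etl_control WHERE map_name = 'PRODUCTS_MAP');

        -- after a successful run, advance the watermark
        UPDATE etl_control
        SET    last_load_date = SYSDATE
        WHERE  map_name = 'PRODUCTS_MAP';

    For method 2, a CREATE MATERIALIZED VIEW LOG on the source table gives Oracle a change log that a fast-refreshable materialized view can consume.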

  • Duplicate rows in Hierarchy Table created when running incremental load

    I copied an out-of-the-box dimension and hierarchy mapping to my custom folders (task hierarchy). This should create the same WIDs from the dimension table to the hierarchy table, and on a full load it does, using the sequence generator. The problem I am getting is that whenever I run an incremental load, instead of an update, a new record is created. What would be the best place to start looking at this and testing? A full load runs with no issues. I have also checked the DAC: the SDE runs with truncate always and the SIL with truncate for full load only.
    Help appreciated

    Provide the query used for populating the child records. The issue might be due to caching.
    Thanks
    Shree

  • OBIA Financial Analytics - ETL Incremental Load issue

    Hi guys
    I have an issue while doing an ETL incremental load in DEV. Source and target are Oracle.
    The issue is with these two tasks: SDE_ORA_GL_JOURNALS and SDE_ORA_ImportReferenceExtract.
    The incremental load is stuck at SDE_ORA_GL_JOURNALS: on the database side the query is completed and the session is done, but there is no update in the session log for that session in Informatica. It just says 'SQL Query issued to database' and there is no progress from there, and the task in both the Informatica session monitor and the DAC says running, and keeps running forever. No errors are seen in any of the log files.
    Any idea what's happening? I checked session logs, DAC server logs, database alert logs, and exception logs on the source, and found nothing.
    I tried running these Informatica-generated queries in SQL Developer; they ran well and I did see the results. Moreover, the weird thing over the past three days and about 10 runs is:
    Both these tasks run in parallel, and most of the time ImportReferenceExtract completes first and then GL_JOURNALS runs forever.
    In one run GL_JOURNALS finished, but then ImportReferenceExtract ran forever. I don't exactly understand what's happening. I see both queries running in parallel on the source database.
    Please give me some idea on this. The same stuff runs fine in QA, and I don't know how. Any ideas are appreciated. Thank you, and let me know of any questions. Thanks in advance.

    Please refer this:
    http://gerardnico.com/wiki/obia/installation_7961

  • Incremental Loading of a Tree Component

    I'm working on an explorer-type interface in Flex 2 for browsing a remote file repository. It's a standard split-pane affair with a directory tree on the left and a listing on the right.
    Because the entire directory tree can be rather large, I need to load it incrementally as the user expands nodes rather than all at once. I failed to find a relevant example so I wrote my own. It works, but I'm new to Flex and am not sure if there's an easier way or if there are any pitfalls to the way I did it.
    I posted my code here:
    http://xocoatl.blogspot.com/2007/01/incremental-loading-of-tree-in-flex-2.html
    Any comments here or on the blog are appreciated; I'm guessing that having a good example of the "right" way to do this will be useful to many others.
    Thanks.

    I am also using another workaround, a CSS trick (inlineStyle): if #{node.children} is null, I place an empty 10x10 white PNG image over the expand icon on the tree node.
    The following code is the nodeStamp facet of the tree component.
    <f:facet name="nodeStamp">
      <af:group id="g1">
        <af:image source="/images/spacer.png" id="i1" inlineStyle="border: 2px solid white; position: absolute; margin-left:-14px;" rendered="#{node.children == null}"/>
        <af:commandLink text="#{node.name}" id="cl1" partialSubmit="true"/>
      </af:group>
    </f:facet>

  • 11g (11.2.0.1) - dimension operator very slow on incremental load

    The dimension operator is very slow in processing incremental loads on 11.2.0.1. We have applied the cumulative patch; still the same issue.
    Statistics have also been gathered.
    The initial load into the empty dimension performs fine (thousands of records in < 1 min); the incremental load has been running over 10 minutes and still has not loaded 165 records from the staging table.
    Any ideas?
    We saw this in 10.2.0.4 and applied a patch which cured the issue there.

    Hi,
    Thanks for the excellent suggestion.
    We have run other mappings which maintain SCD Type 2 using the dimension operator, and they behave similarly to 10g. We have raised the issue with this particular mapping with Oracle and are awaiting a response.
    One question: looking at the mappings which maintain SCD Type 2s, they appear to join on the dimension key and the surrogate IDs.
    What is best practice regarding indexing of such a dimension? E.g., is it recommended to index the dimension key and the surrogate IDs along with the natural/business keys?
    Thanks
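    For reference, a hedged sketch of a common indexing scheme for an SCD Type 2 dimension (the table and column names are illustrative, not taken from the poster's model):

        -- unique index on the surrogate key the fact table joins to
        CREATE UNIQUE INDEX dim_sk_ux ON customer_dim (dimension_key);

        -- composite index supporting the SCD2 lookup: natural key plus effective dates
        CREATE INDEX dim_nk_ix ON customer_dim (customer_natural_key, effective_date, expiration_date);

    The surrogate-key index serves fact-load key lookups, while the natural-key/date index serves the "find the current row for this business key" probe that SCD2 maintenance performs on every incremental run.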

  • Incremental load not capturing data in SSIS 2012

    Hi,
    I am having an issue with Oracle CDC for SSIS, which is new in 2012. I developed SSIS packages with full-load and incremental-load logic to load data into ODS - STAGE - DWH. The problem is that whenever I do a full load followed by an incremental load, the incremental load does not capture the updated data; if I then do a second incremental load, it captures the data.
    Is there any solution for this, so the data is picked up by the first incremental load?

    Are you sure it picks up the LSN correctly? I suspect the CDC service is not picking up the correct LSN value, which it uses to identify the changes.
    It should be in the cdc.lsn_time_mapping table, I guess:
    http://msdn.microsoft.com/en-IN/library/bb510494.aspx
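    One way to check is a quick query against that mapping table (a sketch; widen or narrow the time window to bracket your first incremental load):

        -- list the LSN ranges captured around the load window (SQL Server)
        SELECT start_lsn, tran_begin_time, tran_end_time
        FROM   cdc.lsn_time_mapping
        WHERE  tran_end_time >= DATEADD(DAY, -1, GETDATE())
        ORDER  BY tran_end_time;

    If the starting LSN recorded for the first incremental run falls at or after the updates you expected to capture, that window excludes them, which would match the behavior described.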
    Visakh

  • Business Objects Data Services Incremental Load Help

    Hi, this is my first time creating an incremental load for a batch job. My batch job consists of a try block, an initialization script, a data flow, and a catch block. When I validate my initialization script I get an error; could you review it and identify the problem with the script? My data flow consists of the datastore table I imported, followed by a Query, a Table Comparison, a Key Generation, and finally the table I am updating.
    # Set today's date
    $SYSDATE = cast(sysdate(), 'date');
    print('Today\'s date: ' || cast($SYSDATE, 'varchar(10)'));
    # Set the CDC date from the last successful batch
    $CDC_DATE = nvl(cast(sql('Target', 'SELECT MAX(BATCH_END_DATE) FROM BATCH_CONTROL WHERE BATCH_NAME = {$BATCH_NAME}
    AND BATCH_STATUS = \'SUCCESS\' '), 'date'), cast(to_date('1900-01-01', 'YYYY-MM-DD'), 'date'));
    # Mark an entry in BATCH_CONTROL
    # Columns: BATCH_NAME, BATCH_STATUS, BATCH_START_DATE, BATCH_END_DATE, LOAD_DATE
    sql('Target', 'INSERT INTO BATCH_CONTROL VALUES ({$BATCH_NAME}, \'STARTED\', {to_char($CDC_DATE, \'YYYY-MM-DD\')}, NULL, {to_char($SYSDATE, \'YYYY-MM-DD\')})');

    So I resolved the first error; now I am receiving this long error. Any ideas?
    13388  15908  SYS-170101  5/22/2014 10:39:54 AM  |Session Table_Incramental_Load
    System Exception <ACCESS_VIOLATION> occurred. Process dump is written to
    <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_MINI20140522103951_13388.DMP> and
    <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_FULL20140522103951_13388.DMP>
    Call stack:
    0023:00D305BA, TrCallStatement::process_dbdiff_xform_new()+6666 byte(s), x:\src\parser\process_predef_xform.cpp, line 7281
    0023:00D3128E, TrCallStatement::process_diff_xform()+1422 byte(s), x:\src\parser\process_predef_xform.cpp, line 0432
    0023:00D356EE, TrCallStatement::process_predef_xform_options()+0286 byte(s), x:\src\parser\process_predef_xform.cpp, line 0067+0017 byte(s)
    0023:00C313A5, TrCallStatement::processStatement()+0789 byte(s), x:\src\parser\dataflowstm.cpp, line 3307
    0023:00C310FC, TrCallStatement::processStatement()+0108 byte(s), x:\src\parser\dataflowstm.cpp, line 3201+0012 byte(s)
    0023:00C0FB55, DataFlowDef::processStatements()+0101 byte(s), x:\src\parser\dataflow.cpp, line 2331+0014 byte(s)
    0023:00C110D5, DataFlowDef::buildGraph()+1621 byte(s), x:\src\parser\dataflow.cpp, line 1723
    0023:00C12D99, DataFlowDef::processObjectDef()+2793 byte(s), x:\src\parser\dataflow.cpp, line 1290
    0023:00CB9DC5, CallStep::processStep()+2037 byte(s), x:\src\parser\planstep.cpp, line 1050
    0023:FFFFFFFF, NsiAllocateAndGetPersistentDataWithMaskTable()+-1997676757 byte(s)
    0023:00CB443C, CompoundStep::processStep()+0044 byte(s), x:\src\parser\planstep.cpp, line 4460+0044 byte(s)
    0023:00CB406F, TryStep::processStep()+0335 byte(s), x:\src\parser\planstep.cpp, line 3634
    0023:00CB33A6, Step::processStepBlock()+0134 byte(s), x:\src\parser\planstep.cpp, line 0377+0018 byte(s)
    0023:00CB443C, CompoundStep::processStep()+0044 byte(s), x:\src\parser\planstep.cpp, line 4460+0044 byte(s)
    0023:00C8A78E, PlanDef::processObjectDef()+2718 byte(s), x:\src\parser\plandef.cpp, line 0689
    0023:00ABB806, AE_Main_Process_Options()+32534 byte(s), x:\src\xterniface\actamainexp.cpp, line 3622
    0023:00ABFAB1, AE_Main()+1505 byte(s), x:\src\xterniface\actamainexp.cpp, line 0830+0030 byte(s)
    0023:00402AE9
    Registers:
    EAX=056E85F0  EBX=00000000  ECX=00000010  EDX=02250048  ESI=056E85F0
    EDI=056E85A8  EBP=04A7C590  ESP=002700F0  EIP=00D305BA  FLG=00010206
    CS=0023   DS=002B  SS=002B  ES=002B   FS=0053  GS=002B
    Exception code: C0000005 ACCESS_VIOLATION
    Fault address:  00D305BA 01:0029A5BA C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\bin\acta.dll

  • Time based incremental load AIS

    Dear all,
    Can someone explain what is going wrong when I get the error @@@@@Thread_Source 30 being exited with code -1 or @@@@@Thread_Source 20 being exited with code -1 while loading data via a time-based incremental load in Analytic Integration Services?
    The field I'm using is a date field. The SQL statements in the log run in a client environment without any errors.
    Thanks for your reaction.
    Regards,
    Johan Donkers
    Edited by: user635008 on 30-mrt-2009 0:41

  • Pre-requisite for Full Incremental Load

    Hi Friends,
    I have installed and set up a BI Apps environment with OBIEE, BI Apps, DAC, and Informatica. What are the immediate steps to follow in order to do a full incremental load for EBS R12 for Financials and SCM?
    Please guide me, as it is critical for me to accomplish the full load process.
    Thanks
    Cooper

    You can do that by changing the incremental workflows/sessions to include something like update_date < $$TO_DATE and specifying that as a DAC parameter; a sketch of the resulting filter follows below. You will have to do this manually; unfortunately there is no built-in "upper limit" date. There is a snapshot date that can extend to a future date, but not for the regular fact tables.
    However, this is not a good test of the incremental changes. Just because you manually limit what you extract does not mean you have thoroughly unit-tested your system for incremental changes. My advice is to have a source-system business user enter the changes. They also need to run any "batch processes" on the source system that can make incremental changes. You cannot count the approach you outlined as a proper unit test for incremental loads.
    Is there any reason why you cannot have a business user enter transactions in a DEV source-system environment and then run the full and incremental loads against that system? I don't mean a new refresh; I mean manual entries in your DEV source system.
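    For illustration, a hedged sketch of the bounded extract filter described above ($$LAST_EXTRACT_DATE is the standard DAC refresh-date parameter; $$TO_DATE is the custom upper-bound parameter you would add; the table and column names are placeholders):

        -- incremental extract bounded on both sides
        SELECT *
        FROM   source_table
        WHERE  last_update_date >  TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')
        AND    last_update_date <= TO_DATE('$$TO_DATE', 'MM/DD/YYYY HH24:MI:SS');

    Substitute the actual extract view and its audit column; the date format mask must match however the DAC parameters are formatted in your setup.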
