Incremental load into target

Newbie question :)
Source_table: Name, City, Insert_date
Target_table: Name, City, Insert_date
What I want to do is read Target_table to get the max(insert_date) and pass that max(insert_date) into my filter to select from the source table.
How can I do that?

The function will be executed only once ... the SQL behind the function will be like:
select * from source_table
where insert_date > (select max(insert_date) from target_table)
so it all depends on how your indexes / partitions etc. are created.
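Put together, a minimal sketch of the whole pattern (the column list comes from the tables above; the index names are made up):

-- index the date column on both sides so the MAX() probe and the range scan stay cheap
CREATE INDEX tgt_insert_date_ix ON target_table (insert_date);
CREATE INDEX src_insert_date_ix ON source_table (insert_date);

-- the load itself is then a single INSERT ... SELECT
INSERT INTO target_table (name, city, insert_date)
SELECT s.name, s.city, s.insert_date
FROM   source_table s
WHERE  s.insert_date > (SELECT MAX(t.insert_date) FROM target_table t);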
Regards
P.S. Also you can check the below link and see if anything similar can help.
http://www.geocities.com/tsibulnyak/analytfun.jpeg

Similar Messages

  • Incremental load into the Dimension table

    Hi,
    I have a problem doing the incremental load of the dimension table. Before loading into the dimension table, I would like to check the data already in the dimension table.
    In my dimension table I have one not-null surrogate key and several nullable dimension columns. The not-null surrogate key I am populating with the Sequence Generator.
    To do the incremental load I have done the following:
    I made a lookup into the dimension table and looked for a key. The key from the lookup table I passed to the expression operator. In the expression operator I created one field and hard-coded a flag based on the key from the lookup table. I passed this flag to the filter operator along with the rest of the fields from the source.
    By doing this I am not able to pass the new records to the dimension table.
    Can you please help me?
    I have another question as well:
    how do I update a not-null key in the fact table?
    Thanks
    Vinay
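    For reference, the set-based effect that the lookup-plus-filter is trying to achieve is the classic anti-join insert. A hedged sketch with hypothetical column names, with dim_seq standing in for the Sequence Generator:

    -- insert only those source rows whose natural key is not yet in the dimension
    INSERT INTO dimension_table (dimension_key, natural_key, attribute1)
    SELECT dim_seq.NEXTVAL, s.natural_key, s.attribute1
    FROM   src_dimension_table s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dimension_table d
                       WHERE  d.natural_key = s.natural_key);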

    Hi Mark,
    Thanks for your help in solving my problem. I thought I'd share more information by giving the SQL.
    Below are the two SQL statements I would like to achieve through OWB.
    Both of the following tasks need to be accomplished after loading the fact table.
    task 1:
    UPDATE fact_table c
    SET c.dimension_table_key =
        (SELECT nvl(dimension_table.dimension_table_key, 0)
         FROM   src_dimension_table t,
                dimension_table dimension_table
         WHERE  c.ssn = t.ssn(+)
         AND    c.date_src_key = to_number(t.date_src(+), '99999999')
         AND    c.time_src_key = to_number(substr(t.time_src(+), 1, 4), '99999999')
         AND    c.wk_src = to_number(concat(t.wk_src_year(+), concat(t.wk_src_month(+), t.wk_src_day(+))), '99999999')
         AND    nvl(t.field1, 'Y') = nvl(dimension_table.field1, 'Y')
         AND    nvl(t.field2, 'Y') = nvl(dimension_table.field2, 'Y')
         AND    nvl(t.field3, 'Y') = nvl(dimension_table.field3, 'Y')
         AND    nvl(t.field4, 'Y') = nvl(dimension_table.field4, 'Y')
         AND    nvl(t.field5, 'Y') = nvl(dimension_table.field5, 'Y')
         AND    nvl(t.field6, 'Y') = nvl(dimension_table.field6, 'Y')
         AND    nvl(t.field7, 'Y') = nvl(dimension_table.field7, 'Y')
         AND    nvl(t.field8, 'Y') = nvl(dimension_table.field8, 'Y')
         AND    nvl(t.field9, 'Y') = nvl(dimension_table.field9, 'Y'))
    WHERE c.dimension_table_key = 0;
    The fact table in the above SQL is fact_table,
    the dimension table is dimension_table,
    and the source table for the dimension table is src_dimension_table.
    dimension_table_key is a not-null key in the fact table.
    task 2:
    UPDATE fact_table cf
    SET cf.key_1 =
        (SELECT nvl(max(p.key_1), 0)
         FROM   dimension_table p
         WHERE  p.field1 = cf.field1
         AND    p.source = 'YY')
    WHERE cf.key_1 = 0;
    The fact table in the above SQL is fact_table,
    the dimension table is dimension_table,
    and key_1 is a not-null key in the fact table.
    Is it possible to achieve the above tasks through Oracle Warehouse Builder (OWB)? I created the mappings for loading the dimension table and the fact table and they are working fine, but I am not able to achieve the above two queries through OWB. I would be thankful if you could help me out.
    Thanks
    Vinay
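    One set-based way to express task 2, which a post-mapping process in OWB could run as-is, is a MERGE. This is only a sketch under the same table and column names; rows with no 'YY' match simply keep key_1 = 0, matching the nvl(..., 0) above:

    MERGE INTO fact_table cf
    USING (SELECT field1, MAX(key_1) AS key_1
           FROM   dimension_table
           WHERE  source = 'YY'
           GROUP  BY field1) p
    ON (p.field1 = cf.field1)
    WHEN MATCHED THEN
      UPDATE SET cf.key_1 = p.key_1
      WHERE cf.key_1 = 0;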

  • HELP! Macbook won't load into Target Disk Mode

    basically, long story short, i had about 100 MB of free space left on my 200 GB MacBook Pro. i was trying to save a Word document when everything froze; i manually restarted the computer and that's when all **** broke loose
    heres what happened
    -weird sounds coming from computer
    -the computer takes about 5-10 min to get to the login prompt
    -takes another 5-10 min after login to load desktop wallpaper
    -takes about 40 min for it to load anything on the dock, and then nothing will load after that: none of my desktop icons show up and you can't access any programs.. the computer will continue to make weird noises during this time.
    heres what ive done so far.
    -reset pram (didnt work)
    -harddrive test (Did both standard and extended ...test results came back fine nothing seemed to be wrong)
    -i tried to use disk utility to repair disk and it said:
    +"checking extents overflow file+
    +checking catalog file+
    +invalid node structure+
    +rebuilding catalog b-tree+
    +the volume macintosh HD could not be repaired+
    +error: the underlying task reported failure on exit+
    +1 hfs volum checked+
    +-1 volume could not be repaired because of an error+
    +repair attempted on 1 volume+
    +-1 volume could not be repaired"+
    -tried the fsck option and it told me basically the same thing
    -tried booting in safe mode ...computer wont boot into safe mode it will just shut itself off
    so my last option is trying to get info off of my computer using target disk mode...i have a spare macbook that im using as a host for the macbook pro...
    i bought a firewire cable today from the apple store
    i followed the directions step by step on the apple website
    -i connected the firewire cable, both computers were off.
    -i turned the host computer on (target computer remained off)
    - held the T button right as the computer started
    - then i got a grey screen with a big firewire logo floating around from left to right
    http://www.didntyouhear.com/wp-content/uploads/2007/04/firewire_logo.jpg
    what am i doing wrong!? ??
    someone PLEASE HELP!

    ok i figured it out however...
    every time i try to click on my hard drive or folders in the hard drive, the host computer freezes, as does the target computer...
    im using a macbook as the host computer to my macbook pro target...
    after putting the target computer into target disk mode and connecting the two computers via firewire...
    i can see my target drive on my host desktop....
    omg and when i click on my target drive i see all my old folders!!! yay!! that means my old files arent lost forever!
    the only problem is when i try to access the folders i get the colored pinwheel
    and its very slow..... nothing will load and then my host computer freezes and i have to force quit... i didnt even get to begin transferring the files before the host computer would freeze... as this happens, the fans on the target computer whirr up and the floating firewire logo screensaver on the target computer stops floating from left to right..
    what gives?

  • Load into PSA, then into target - in process chains

    Hi
    I have a question regarding first loading into the PSA, then into a target, using process chains.
    It can be done by executing an InfoPackage that loads only into the PSA, then using the process type "Read PSA and Update Data Target".
    My problem is that I want to split this into two separate chains, i.e. loading all data into PSAs before loading into targets. But when I do these two tasks in separate chains I get the error shown below. Can it really be true that you are not able to do this in separate chains???
    Thanks
    Karsten
    Process LOADING variant ZPAK_3WI2VMPZM3FE8Y1ELC0TKQHW7 is referenced but is not in the chain
    Message no. RSPC060
    Diagnosis
    Process LOADING, variant ZPAK_3WI2VMPZM3FE8Y1ELC0TKQHW7 must precede the current process. This process is not however available in the chain.
    System response
    The system is unable to activate the chain.
    Procedure
    Change the current process so that it no longer references LOADING ZPAK_3WI2VMPZM3FE8Y1ELC0TKQHW7, or schedule process LOADING ZPAK_3WI2VMPZM3FE8Y1ELC0TKQHW7 in the chain so that it precedes the current process.

    Hello Karsten
    I've discussed this with SAP and, even if the response was not clear, it doesn't seem to be possible. We also wanted to do this because we have a source system working 24/7. Since master data and transaction data can be created at any time, it was the only way to make sure all master data were available for the load of transaction data (loading transaction data into the PSA, then loading master data, then PSA to target). Is it for the same reason that you want to dissociate the loads?
    What I've done eventually is load into an ODS that is not BEx-relevant (so no SIDs are required), with no master data check in the InfoPackage, so the ODS behaves like the PSA. Then load master data and activate. Then load from this ODS to the target. It's working fine, and I am not considering changing this.
    Good luck
    Philippe

  • CDC Error (JOURNALIZED DATA is not loading into target database)

    Hi,
    I have enabled the source database for CDC and got a green mark on the source database model tables.
    While inserting data into the source, the J$ tables are updating JRN_SUBSCRIBER and the other fields.
    When I run the package/interface, the JOURNALIZED DATA is not loading into the target database.
    I have implemented CDC for 7 source tables,
    I am using JKM MSSQL Simple,
    and JOURNALIZED DATA is enabled at the interface level.
    Source database: MSSQL Server
    Target database: Oracle 11g
    Please advise me.
    Thanks in advance,
    Zakeer Hussain

    Zakeer, look into this link: http://odiexperts.com/?p=1096 . Hope this helps.
    Also, before running, can you right-click on the source datastore, click on Journal Data, and see the data? If the data is still not passing through, set "temporary objects" to yes in the LKM and IKM, debug, and see at which step the data is not flowing through; look for any filter or condition which is stopping it.
    If you still cannot figure it out, please tell us at which step the data is not flowing through and we will try to guide you.
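    As a quick sanity check on the capture side, you can also query the journal table directly. A hedged sketch, assuming one of the 7 source tables is hypothetically named CUSTOMER and the J$ table and columns follow the JKM defaults:

    -- run against the MSSQL work schema; 'SUNOPSIS' is ODI's default subscriber name
    SELECT JRN_SUBSCRIBER, JRN_FLAG, JRN_DATE, CUSTOMER_ID
    FROM   J$CUSTOMER
    WHERE  JRN_SUBSCRIBER = 'SUNOPSIS';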

  • Incremental Loads and Refresh Date

    Hi all,
    Thank you for taking the time to review this post.
    Environment
    Oracle BI Applications 7.9.6 (Financial & Project Analytics)
    Oracle E-Business Suite 11.5.10
    Question
    I have a Test BI Apps 7.9.6 instance in a Test environment that is connected to a static EBS 11.5.10 data source. As part of my testing phase I'd like to do multiple Incremental Loads to get an accurate performance impact and timing study for the final pre-approval before migrating to Production. I can get a refresh of EBS which has a week's worth of transactions after my Initial Full Load. What I'd like to do is change Refresh Dates to "trick" the Incremental Load into only loading one day's worth of data at a time, rather than the full week's worth of data in one Incremental Load. Is this possible, and if so, how?
    Example timeline:
    Today - Initial Full load using Test EBS as of today
    1 week later - Refresh static Test EBS from Production with a week of transactions
    Post Refresh - Run daily Incremental jobs using static Test EBS
    First Incremental Load - Today's position + 1 day,
    Second " " - Today's position + 2 days,
    Third " " - Today's position + 3 days, etc
    As always all comments and solutions greatly appreciated.
    Kind Regards,
    Gary.

    Say on the 01st of the month, you did a Load.
    Then on the 08th of the month, the source EBS system was itself refreshed.
    What you want to do is to run a single-day refresh on the 08th for all data from the 01st to the 02nd of the month, and then another single-day refresh -- whether on the 08th or on the 09th, you don't care -- for all data from the 03rd to the 04th.
    Unfortunately, the refresh is from the last refresh date to the current date. You can't define "refresh up to date". Therefore, your first 'incremental' refresh on the 08th would refresh all data from the 02nd to the 08th in one shot. What you could try to do is
    a. After the first load on the 01st, shutdown the BI DWH.
    b. When the EBS test source is refresh on the 08th, reset your SYSTEM CLOCK ON THE BI DWH DATABASE SERVER to the 2nd (or 3rd) of the month.
    c. Now, when you run a refresh, BI will extract all data from the 01st to the 02nd or 03rd (even though EBS is as of the 08th).
    d. Once this is done, shutdown BI DWH.
    e. Reset the SYSTEM CLOCK ON THE BI DWH DATABASE SERVER to the 3rd or 4th of the month.
    f. Run another Incremental Refresh.
    ... and so on ...
    Hemant K Chitale
    http://hemantoracledba.blogspot.com

  • Update the source record while loading data into target

    Hello Friends,
    I am loading data from a "staging InfoObject" into a "target InfoObject".
    I have the validation code in the 'start routine' in the transformations.
    During validations, if the conditions are met, I would like to load the record into the target and, at the same time, update one specific field of this record (say, LoadFlag) in the source InfoObject.
    I am successfully able to load the records into the target InfoObject.
    I would like to know how to update the record in the source InfoObject.
    Can anyone please let me know the ABAP syntax for this?
    I promise to award points.
    Thank you for your time.
    Pramod.

    Instead of ABAP you can:
    a) connect the update rule of the target also to the source object as input. In this update rule you can simply set the load indicator with a constant.
    b) use an APD (transaction RSANWB).

  • How to delete the data loaded into MySQL target table using Scripts

    Hi Experts,
    I created a job with a validation transformation. Data that passes the validation is loaded into a Pass table and data that fails is loaded into a Failed table.
    My requirement is: if any data was loaded into the Failed database table, then I have to delete the data loaded into the Pass table using a script.
    In the script I have written the code as
    sql('database','delete from <tablename>');
    but the SQL query execution is raising an exception for the query.
    How can I delete the data loaded into the MySQL target table using scripts?
    Please guide me on this error.
    Thanks in Advance
    PrasannaKumar

    Hi Dirk Venken
    I got the solution; the mistake I made was that the query was not correct for MySQL.
    working query:
    sql('MySQL', 'truncate world.customer_salesfact_details')
    error query:
    sql('MySQL', 'delete table world.customer_salesfact_details')
    Thanks for your concern
    PrasannaKumar
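    For what it's worth, a plain DELETE also works in MySQL when written with the FROM keyword, which is what the failing call above was missing (same datastore and table as in the post):

    sql('MySQL', 'delete from world.customer_salesfact_details');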

  • Need suggestions regarding a system design for incremental load

    Hi,
    Our client has  a set of SQL Server tables which are being fully refreshed daily using views from DB2 source tables. We need to design an approach to load them incrementally.
    We have a third party application 'XXX' which will provide us the changed records in the underlying DB2 tables into SQL Server tables daily. Let us call them CDC_<tbl>. Each CDC table will have the same schema as its source plus a flag to indicate whether it is an insert, update or delete record. From these CDC tables, we have to do the required transformations and do the insert/update/delete on the target table accordingly.
    This approach would work easily for cases where there is only one source table, but when we have multiple tables joined together to load the target table, we are unable to design an approach. If on a particular day an insert record comes in only one of the source CDC tables, we will not be getting that row in the target, since the other CDC table doesn't have a record for that particular key. We cannot join the SQL Server CDC table with the source DB2 table since that will cause performance issues.
    Please share your thoughts on how we can design an approach which will work in cases of join, union, group by etc. between source tables. We are open to suggestions on changes in CDC tables also since the third party tool is to be configured as per our design
    needs.
    Thanks in advance,
    KP

    If on a particular day, an insert record comes in only one of the source CDC tables, we will not be getting that row in the target since the other CDC table doesn't have a record for that particular key.
    If I understand correctly, you extract data using DB2 views (some with joins) and then use third party CDC software to capture all changes made to the underlying DB2 tables.  These changes are then applied to a transformed version of data extracted
    from the views.  If my understanding is correct, why use DB2 views at all?  It seems to me the transformation process must have intimate knowledge of the underlying DB2 tables anyway in order to apply the CDC data properly.
    Are you saying you are not getting the other CDC table row at all, or is it just that the CDC tables are not transactionally consistent at the time the changes are applied to the target?  I think transactional inconsistencies should be addressed by
    the third party CDC application. The other alternative as I see it is to query DB2 for each CDC key to make sure you have the latest data during the transformation process.
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com
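    To illustrate why the single-table pattern breaks down on joins, and one common workaround, here is a hedged T-SQL sketch (every table and column name is hypothetical): collect the keys touched by either CDC feed, then rebuild the joined target rows from full replica tables rather than joining delta to delta.

    -- gather every business key touched by either feed; UNION also de-duplicates
    WITH changed_keys AS (
        SELECT order_id FROM CDC_orders
        UNION
        SELECT order_id FROM CDC_order_lines
    )
    -- recompute the joined target rows only for those keys, reading consistent
    -- full images so a delta in one feed still picks up the other table's row
    SELECT o.order_id, o.customer_id, l.line_total
    FROM   replica_orders o
    JOIN   replica_order_lines l ON l.order_id = o.order_id
    WHERE  o.order_id IN (SELECT order_id FROM changed_keys);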

  • ODI : how to raise cross reference error before loading into Essbase?

    Hi John .. if you read my post, I want to say that you impress me! Really, thanks for your blog.
    Today, my problem is :
    - I received a bad quality data file from ERP extract
    - I have cross reference table (Source ==> Target)
    - >> How do I raise the error before loading into Essbase?
    My idea is the following (first of all, I'm not sure if it is a good one, and I am also having trouble doing it in ODI!):
    - Step 1: make a JOIN between data.txt and the cross-reference table ==> create a table DATA_STEP1 in the ODISTAGING schema (the columns of DATA_STEP1 are the columns of data.txt plus those of the cross-reference tables (... there are more than 20 columns in my case))
    - Step 2: check that there is no NULL value in the target columns (NULL means that the data.txt file contains a value that is not defined in my cross-reference table) by using a filter (Filter = Target_Account IS NULL or Target_Entity IS NULL or ...)
    The result of this interface is sent to a reject.txt file - if reject.txt is not empty then a mail is sent to the administrator
    - Step 3: do the opposite: Filter = NOT (Target_Account IS NULL or Target_Entity IS NULL ...) ==> the result is sent to the DATA_STEP3 table
    - Step 4: run the mapping proper: source: DATA_STEP3 (the clean and verified data!) with the cross-reference tables, and send the data into Essbase - NORMALLY, there are no rejected records!
    My main problem is: what is the right IKM to send data into the DATA_STEP1 or DATA_STEP3 table, which are Oracle tables in my ODISTAGING schema? I tried IKM Oracle Incremental Update but I get an error, and actually I don't need an update (which is time consuming), I just need an INSERT!
    I'm just looking for an 'IKM SQL to Oracle' ...
    regards
    xavier

    Thanks John: very fast!
    I understand better now which IKM is useful.
    I found other information about error follow-up with ODI: http://blogs.oracle.com/dataintegration/2009/10/did_you_know_that_odi_generate.html
    and I decided to activate integrity control in ODI:
    I load:
    - data.txt into ODITEMP.T_DATA
    - transco_account.csv into ODITEMP.T_TRANSCO_ACCOUNT
    - transco_entity.csv into ODITEMP.T_TRANSCO_ENTITY
    - and so on ...
    - Moreover, I created integrity constraints between T_DATA and T_TRANSCO_ACCOUNT and T_TRANSCO_ENTITY ... so I expected that ODI would raise the bad records for me in E$_DATA (the error table)!
    However, I have one issue when loading data.txt into T_DATA because I have no ID or primary key ... I read in a training book that I could use a SEQUENCE ... I tried but was unsuccessful ... :-(
    Is there another simple way to create a primary key automatically (T_DATA is in an Oracle schema, of course)? Thanks in advance.
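    One standard Oracle-side way to populate such a surrogate key automatically is a sequence plus a before-insert trigger, so loads from ODI never have to supply the ID. A sketch only: the schema and table come from the post, while the column, sequence, and trigger names are made up:

    ALTER TABLE oditemp.t_data ADD (id NUMBER);

    CREATE SEQUENCE oditemp.t_data_seq;

    -- fill the ID on every insert, whatever tool performs the load
    CREATE OR REPLACE TRIGGER oditemp.t_data_bir
    BEFORE INSERT ON oditemp.t_data
    FOR EACH ROW
    BEGIN
      :NEW.id := oditemp.t_data_seq.NEXTVAL;
    END;
    /

    ALTER TABLE oditemp.t_data ADD CONSTRAINT t_data_pk PRIMARY KEY (id);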

  • Business Objects Data Services Incremental Load Help

    Hi, this is my first time creating an incremental load for a batch job. My batch job consists of a try - initialization script - data flow - catch. When I validate my initialization script I am getting an error; could you review it and identify the error in the script? My data flow consists of the datastore table I imported, then a query, then a table comparison, then key generation, then the table I am updating.
    # Set Todays Date
    $SYSDATE = cast ( sysdate (), 'date' );
    print ('Today\' date:' || cast($SYSDATE, 'varchar(10)'));
    # SET CDC DATE
    $CDC_DATE = nvl (cast(sql('Target', 'SELECT MAX(BATCH_END_DATE) FROM BATCH_CONTROL WHERE BATCH_NAME = {$BATCH_NAME}
    AND BATCH_STATUS = \'SUCESS\' '), 'date'), cast(to_date('1900-01-01', 'YYYY-MM-DD'), 'date'));
    #Mark an entry in Batch_Control
    # Batch_Name    BATCH_STATUS   BATCH_START_DATE   BATCH_END_DATE Load_DATE
    sql('Target', 'INSERT INTO BATCH_CONTROL VALUES ( {BATCH_NAME}, \'STARTED', {to_char ($CDC_DATE, \'YYYY-MM-DD\')}, NULL, {to_char ($SYSDATE, \'YYYY-MM-DD\')};
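    For reference, the validation error is in that last sql() call: the '$' is missing from {BATCH_NAME}, the backslash escape before the closing quote of \'STARTED' is missing, and the statement never closes its parenthesis and quotes. A corrected sketch of that one line, reusing the variables and escaping style from the script above:

    sql('Target', 'INSERT INTO BATCH_CONTROL VALUES ({$BATCH_NAME}, \'STARTED\', {to_char($CDC_DATE, \'YYYY-MM-DD\')}, NULL, {to_char($SYSDATE, \'YYYY-MM-DD\')})');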

    So I resolved the first error; now I am receiving this long error. Any ideas?
    (log excerpt; every line was prefixed with "13388 15908 SYS-170101 5/22/2014 10:39:54 AM", trimmed here)
    |Session Table_Incramental_Load
    System Exception <ACCESS_VIOLATION> occurred. Process dump is written to <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_MINI20140522103951_13388.DMP> and <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_FULL20140522103951_13388.DMP>
    Call stack:
    0023:00D305BA, TrCallStatement::process_dbdiff_xform_new()+6666 byte(s), x:\src\parser\process_predef_xform.cpp, line 7281
    0023:00D3128E, TrCallStatement::process_diff_xform()+1422 byte(s), x:\src\parser\process_predef_xform.cpp, line 0432
    0023:00D356EE, TrCallStatement::process_predef_xform_options()+0286 byte(s), x:\src\parser\process_predef_xform.cpp, line 0067+0017 byte(s)
    0023:00C313A5, TrCallStatement::processStatement()+0789 byte(s), x:\src\parser\dataflowstm.cpp, line 3307
    0023:00C310FC, TrCallStatement::processStatement()+0108 byte(s), x:\src\parser\dataflowstm.cpp, line 3201+0012 byte(s)
    0023:00C0FB55, DataFlowDef::processStatements()+0101 byte(s), x:\src\parser\dataflow.cpp, line 2331+0014 byte(s)
    0023:00C110D5, DataFlowDef::buildGraph()+1621 byte(s), x:\src\parser\dataflow.cpp, line 1723
    0023:00C12D99, DataFlowDef::processObjectDef()+2793 byte(s), x:\src\parser\dataflow.cpp, line 1290
    0023:00CB9DC5, CallStep::processStep()+2037 byte(s), x:\src\parser\planstep.cpp, line 1050
    0023:FFFFFFFF, NsiAllocateAndGetPersistentDataWithMaskTable()+-1997676757 byte(s)
    0023:00CB443C, CompoundStep::processStep()+0044 byte(s), x:\src\parser\planstep.cpp, line 4460+0044 byte(s)
    0023:00CB406F, TryStep::processStep()+0335 byte(s), x:\src\parser\planstep.cpp, line 3634
    0023:00CB33A6, Step::processStepBlock()+0134 byte(s), x:\src\parser\planstep.cpp, line 0377+0018 byte(s)
    0023:00CB443C, CompoundStep::processStep()+0044 byte(s), x:\src\parser\planstep.cpp, line 4460+0044 byte(s)
    0023:00C8A78E, PlanDef::processObjectDef()+2718 byte(s), x:\src\parser\plandef.cpp, line 0689
    0023:00ABB806, AE_Main_Process_Options()+32534 byte(s), x:\src\xterniface\actamainexp.cpp, line 3622
    0023:00ABFAB1, AE_Main()+1505 byte(s), x:\src\xterniface\actamainexp.cpp, line 0830+0030 byte(s)
    0023:00402AE9
    Registers:
    EAX=056E85F0  EBX=00000000  ECX=00000010  EDX=02250048  ESI=056E85F0
    EDI=056E85A8  EBP=04A7C590  ESP=002700F0  EIP=00D305BA  FLG=00010206
    CS=0023   DS=002B  SS=002B  ES=002B   FS=0053  GS=002B
    Exception code: C0000005 ACCESS_VIOLATION
    Fault address:  00D305BA 01:0029A5BA C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\bin\acta.dll

  • Is it possible to do an incremental load using SQL*Loader?

    Hi
    I am working on data warehousing projects, and every day I load 3 lakh records into the target server using SQL*Loader. Is it possible to do incremental loading using SQL*Loader?
    Example: the first day, 3 lakh records are loaded into the target using SQL*Loader; the next day another 2 lakh records need to be loaded. Using SQL*Loader, how do I do the incremental load?
    Thanks in advance,
    Mohan

    Hi
    SQL*Loader has three load options:
    APPEND
    REPLACE
    TRUNCATE
    The first option will help you: it appends the data and will not reject duplicates. To have duplicate records rejected, make sure the table has constraints.
    Prashant_Arvind
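    A minimal control-file sketch of the APPEND option (the data file, table, and columns are hypothetical, loosely matching the tables from the opening post):

    LOAD DATA
    INFILE 'daily_extract.dat'
    APPEND
    INTO TABLE target_table
    FIELDS TERMINATED BY ','
    (name, city, insert_date DATE 'YYYY-MM-DD')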

  • Extracting explanations from planning app and loading into Oracle table

    Hi All,
    I have a requirement where I have to extract data from a Planning application through ODI 11g and load it into an Oracle RDBMS.
    I used Essbase as my source technology (since Planning data is stored on the Essbase side) and Oracle as my target.
    Now the data is getting extracted from the Essbase side and loaded into the Oracle table through ODI.
    The client now requires that the explanations, or text values, also be extracted from the Planning application and loaded into an Oracle table.
    How can this be achieved? Is there a table on the SQL side (since a SQL database is used at the back end for the Planning app) which stores the explanations? If yes, please let me know which table it is.
    Kindly help me with this requirement.

    Hi,
    IKM SQL Control Append is perfect if you don't need incremental updates. If you need it, go for IKM Oracle Incremental Update (MERGE) or something like that.
    Regards,
    JeromeFr

  • ETL processes - Full vs Incremental loads

    Hi,
    I am working with a customer who has already implemented Financial Analytics in the past, but now the requirement is to add Procurement and Supply Chain Analytics. Could anybody tell me how to do the extraction of these new subject areas? Could I create separate execution plans in DAC for each subject area, or do I need to create one ETL which contains the 3 areas?
    Please help me! I also need to understand the difference between a full load and an incremental load: how do I configure the DAC to execute either a full or an incremental extraction?
    Hope anybody can help me,
    Thanks!

    In regards to your "multiple execution plan" question: I usually just combine all subject areas into a single execution plan. Especially considering the impact Financial Analytics has on Procurement and Supply Chain subject areas.
    The difference between full-load and incremental-load execution plans lies mostly in the source qualifiers' date constraints. Incremental execution plans will have a $$LAST_EXTRACT_DATE comparison against the source system. Full-load execution plans will utilize $$INITIAL_EXTRACT_DATE in the SQL.
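    For illustration, such a source-qualifier constraint typically looks something like this (the table, columns, and date format are hypothetical; the DAC substitutes the $$LAST_EXTRACT_DATE value at run time):

    SELECT invoice_id, amount, last_update_date
    FROM   some_source_table
    WHERE  last_update_date > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')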
    A task is executed with a "FULL" load command when the last_refresh_date for that tasks target tables is NULL.
    Sorry this post is a little chaotic.
    - Austin

  • Error while importing CIM file into target SLD

    Hi
       We are in the process of moving our XI objects to QA.
    As a first step, we are preparing the QA SLD (we have a separate SLD instance for the XI DEV and QA environments).
    We have our standard objects loaded into the QA SLD. Now for the custom objects :
    1.I export the product and the corresponding SWCV from the source SLD .
    2. I import the zip file into the target SLD in that order. While importing the product into the target, I get a warning -
    The target namespace for the special import already contains data for one or more export lines. Continuing this import may corrupt the state of your data.
    I am not sure what this means - should I go ahead and import my custom product and its SWCV (the next step) despite this warning? Or am I missing a step? I checked the weblog on SLD preparation and did not see any additional steps other than export/import from the source/target SLDs.
    Thank you for your time in advance.

    Hi Karthik,
    Check out this link
    ==>http://help.sap.com/saphelp_nw04s/helpdata/en/b2/0aae42e5adcd6ae10000000a155106/frameset.htm
    Hope this will explain all the steps in importing... :)
    Regards,
    Sundararamaprasad.
