Load data with Delta

Hello Expert,
I loaded data from SAP R/3 into the PSA using delta mode and found that a newly created record was not loaded into SAP BI. What could be the cause?
Loading steps:
1. Init without data transfer into the PSA
2. Full load with selection criterion Fiscal Period (e.g. 001.2009 - 016.2009) into the PSA
3. Load data from the PSA into the DSO with update mode = Delta
4. Create a new transaction in SAP R/3
5. Load data from SAP R/3 into the SAP BI PSA with update mode = Delta
Expected result: the new record is loaded into the PSA.
Actual result: no record was loaded.
After the init without data transfer, is it necessary to run a full load for all fiscal periods and only then load with delta mode?
I have never seen this kind of problem before.
Is anyone familiar with this kind of problem? How did you resolve it?
Any suggestion would be appreciated.
Thank you very much
W.J.

Hi,
Is your DataSource a Logistics one? If so, how is the job scheduled in the LO Cockpit (hourly/daily)?
Did you check whether the record reached the delta queue (RSA7)?
After the init without data transfer, is it necessary to run a full load for all fiscal periods and only then load with delta mode? I have never seen this kind of problem before.
If you need specific selections, you have to do it with a repair full request, and it doesn't impact the delta loads.

Similar Messages

  • 3FI_SL_C1_TT - Procedure of extracting data with delta

    Dear experts,
    I am trying to load FI-SL data with delta.
    We use extractor 3FI_SL_C1_TT and have implemented the start routine from note 577644 to create 0balance within the transformation.
    Delta is only available for actuals so we select data only for valuetype 10 as recommended.
    I read that I have to do a full update of the closed periods.
    To load the carryforward from 2014 I have to select 000.2015 and load it with a full load.
    So far everything is clear.
    But what are the next steps?
    What do I have to select within the infopackage for the init load of the open periods (current year)?
    After executing the init load: how do I have to define the delta load?
    Thanks for your help.
    Best regards
    Mirjam

    Hi Leszek,
    we use a separate cube for each year.
    So if the periods of the previous year (2014) are not closed either, I have to treat that cube
    the same as the current-year cube: first an init for the whole year and after that a delta for the whole year.
    For the current-year cube I try init/delta with 001.2015 - 016.9999. But what do I do with the carryforward?
    I can't do the full load only once, because it changes with every new data record for 2014.
    What is the best practice for the carryforward in this case?
    Regards,
    Mirjam

  • SQL*Loader-704 and ORA-12154: error messages when trying to load data with SQL*Loader

    I have a database with two tables that is used by APEX 4.2. One table has 800,000 records; the other has 7 million records.
    The client recently upgraded from APEX 3.2 to APEX 4.2. We exported/imported the data to the new location with no problems.
    The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
    The first time I loaded the data I did it from the command line with SQL*Loader.
    Now when I try to load the data I get these messages:
    SQL*Loader-704: Internal error: ulconnect OCIServerAttach
    ORA-12154: TNS:could not resolve the connect identifier specified
    I've searched for postings on these error messages and they all seem to say that SQL*Loader can't find my TNSNAMES file.
    I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
    However, SQL Developer will not let me load a file this big.
    I have also tried to load the file within APEX (SQL Workshop / Utilities) but again, the file is too big.
    So it seems like SQL*Loader is the only option.
    I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
    Not sure what else to try or where to look.
    thanks

    Hi,
    You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in Ed's link that you were already pointed at) is the following (I assume you are on Windows?):
    open a command prompt
    set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
    This tells Oracle to use the config files it finds there and no others.
    Then try sqlldr user/pass@db (in the same DOS window) and see if that connects, and let us know.
    Cheers,
    Harry
    http://dbaharrison.blogspot.com
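    For reference, ORA-12154 means the identifier after the @ could not be resolved in the tnsnames.ora that sqlldr actually reads. A hypothetical entry of the kind it must resolve to (the alias, host, and service name below are invented for illustration, not taken from the post):
    DB =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = orcl))
      )
    With such an entry in the directory TNS_ADMIN points to, sqlldr user/pass@DB should connect; if tnsping DB fails from the same prompt, the wrong tnsnames.ora is still being picked up.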

  • SQL*Loader: Load data with format MM/DD/YYYY HH:MI:SS PM

    Please advise how to load data in the format MM/DD/YYYY HH:MI:SS PM into an Oracle table using SQL*Loader.
    - What format mask should I give in the control file?
    - What column type should the table have to hold this data?
    Sample data below:
    MM/DD/YYYY HH:MI:SS PM
    12/9/2012 2:40:20 PM
    11/29/2011 11:23:12 AM
    Thanks in advance
    Avinash

    Hello Srini,
    I had tried the creation date with the DATE datatype, but I got the error
    ORA-01830: date format picture ends before converting entire input string
    I am running SQL*Loader from the Oracle R12 EBS front-end.
    The contents of my control file are:
    LOAD DATA
    INFILE "$_FileName"
    REPLACE
    INTO TABLE po_recp_int_lines_stg
    WHEN (01) = 'L'
    FIELDS TERMINATED BY "|"
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    INDICATOR         POSITION(1) CHAR,
    TRANSACTION_MODE  "TRIM(:TRANSACTION_MODE)",
    RECEIPT_NUMBER    "TRIM(:RECEIPT_NUMBER)",
    INTERFACE_SOURCE  "TRIM(:INTERFACE_SOURCE)",
    RECEIPT_DATE      "TO_CHAR(TO_DATE(:RECEIPT_DATE,'MM/DD/YYYY'),'DD-MON-YYYY')",
    QUANTITY          "TRIM(:QUANTITY)",
    PO_NUMBER         "TRIM(:PO_NUMBER)",
    PO_LINE_NUMBER    "TRIM(:PO_LINE_NUMBER)",
    CREATION_DATE     "TO_CHAR(TO_DATE(:CREATION_DATE,'MM/DD/YYYY HH:MI:SS AM'),'DD-MON-YYYY HH:MI:SS AM')",
    ERROR_MESSAGE     "TRIM(:ERROR_MESSAGE)",
    PROCESS_FLAG      CONSTANT 'N',
    CREATED_BY        "fnd_global.user_id",
    LAST_UPDATE_DATE  SYSDATE,
    LAST_UPDATED_BY   "fnd_global.user_id"
    )
    My data file goes like this:
    H|CREATE|123|ABC|12/10/2012||||
    L|CREATE|123|ABC|12/10/2012|100|PO12345|1|12/9/2012  2:40:20 PM
    L|CORRECT|123|ABC|12/10/2012|150|PO12346|2|11/29/2011 11:23:12 AM
    Below is the description of the table:
    INDICATOR         VARCHAR2(1 BYTE)
    TRANSACTION_MODE  VARCHAR2(10 BYTE)
    RECEIPT_NUMBER    NUMBER
    INTERFACE_SOURCE  VARCHAR2(20 BYTE)
    RECEIPT_DATE      DATE
    QUANTITY          NUMBER
    PO_NUMBER         VARCHAR2(15 BYTE)
    PO_LINE_NUMBER    NUMBER
    CREATION_DATE     TIMESTAMP(0)
    ERROR_MESSAGE     VARCHAR2(4000 BYTE)
    PROCESS_FLAG      VARCHAR2(5 BYTE)
    CREATED_BY        NUMBER
    LAST_UPDATE_DATE  DATE
    LAST_UPDATED_BY   NUMBER
    Thanks,
    Avinash
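    A possible simplification (a sketch, not the poster's confirmed fix): since CREATION_DATE is TIMESTAMP(0), convert the string to a timestamp directly and make the mask cover the whole time portion; ORA-01830 is raised when a mask such as 'MM/DD/YYYY' ends before the time part of the input. The field definition below is an assumption:
    CREATION_DATE     "TO_TIMESTAMP(TRIM(:CREATION_DATE), 'MM/DD/YYYY HH:MI:SS AM')"
    Without the FX modifier, this mask also tolerates the single-digit days/months and the doubled space in sample rows like 12/9/2012  2:40:20 PM.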

  • Data with delta extraction.

    Hi Gurus
    I want a small clarification regarding data with delta extraction.
    Say today data is extracted from the R/3 system with delta into an ODS object, and the data is activated and datamarted to another ODS and a cube.
    Now if tomorrow the same records are changed, how will the old data be changed? I.e. what is the scenario in the cube if we have compressed the old data?
    Can anyone explain in detail, starting from the delta queue to the change taking place in the cube?
    Regards
    Raju

    Raju,
       Don't worry about compression of the cube. Even if you load a delta after compression, the data stays correct; the next time you compress, it will be merged into one record.
    Assume:
    Initially, in the ODS: Document 45000001 Item 0001 qty 10 Amount 20
    In the cube: Document 45000001 Item 0001 qty 10 Amount 20 (after compression)
    Qty is then changed to 5:
    In the ODS: Document 45000001 Item 0001 qty 5 Amount 10. This change creates 2 delta records for the cube, which look like this:
    Document 45000001 Item 0001 qty -10 Amount -20, record mode 'X' (before image)
    Document 45000001 Item 0001 qty 5 Amount 10, record mode ' ' (after image)
    Finally the cube contains:
    Document 45000001 Item 0001 qty 10 Amount 20
    Document 45000001 Item 0001 qty -10 Amount -20
    Document 45000001 Item 0001 qty 5 Amount 10
    If you compress the cube, you will end up with a single record carrying the last values (qty 5, Amount 10), as the toy SQL after this reply illustrates.
    All the best.
    Nagesh Ganisetti.
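    A toy SQL illustration (not SAP code; the table and column names are invented) of how the additive before/after images net out to the latest values:
    create table cube_demo (doc varchar2(10), item varchar2(4), qty number, amt number);
    insert into cube_demo values ('45000001', '0001',  10,  20);  -- compressed original
    insert into cube_demo values ('45000001', '0001', -10, -20);  -- before image (record mode 'X')
    insert into cube_demo values ('45000001', '0001',   5,  10);  -- after image
    -- compression behaves like this aggregation: one row with qty 5, amt 10
    select doc, item, sum(qty) qty, sum(amt) amt
    from cube_demo
    group by doc, item;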

  • Want to load data with selection criteria

    Hi Everyone,
    I want to load data from ODS X to ODS Y; ODS X is a DataSource for ODS Y.
    On ODS Y I don't have any data loaded.
    On ODS X I have 10 requests with 200,000 records,
    among them one request with 2 records.
    I want to load that 2-record request to ODS Y, to check the data on ODS Y.
    Can anyone help me solve this? I am new to BW and it's urgent.
    Can you tell me the step-by-step navigation?

    Hi,
    Just select full upload; it will bring up the InfoPackage, and in its Selection tab give the range value. If this is required only once, then this method (a full load) is fine; otherwise you will have to write code to pick the records. If you frequently want to load data from one ODS to another, it is better to do an init and then delta loads from the next time onwards. If you don't want to provide a selection in the InfoPackage, the other option is to load all the data from X to Y and do a selective deletion on ODS Y.
    Hope this helps.
    PB

  • Load Data with 7.0 DataSource from Flat File to Write-Optimized DSO

    Hi all,
    we have a problem loading data from a flat file using a 7.0 DataSource.
    We have to load a flat file (monthly) into a write-optimized DSO. The InfoPackage loads the file in full mode into the DataSource (PSA), and the DTP loads data in delta mode from the DataSource into the WO DSO.
    When I load the second file into the DataSource, the DTP loads all data present in the DataSource, not only the new records as expected with delta mode.
    Does anyone have any tips?
    Thank you for help.
    Regards
    Emiliano

    Hi,
    I am facing a similar problem.
    I am using a write-optimized DSO and have only 1 request in the PSA (I deleted all previous requests from the PSA and the DSO).
    When I do a delta load from the PSA to the DSO, I expect only that 1 request to get loaded into the DSO.
    But it's picking up the data from 3 other requests and doubling the records.
    Can you please tell me how you managed to get out of that issue?
    Cheers,
    Nisha

  • How can I load data with scripts in FDM to an HFM target system?

    Hi all!
    I need help because I can't find a good guide about scripting in FDM. My problem is the following.
    I plan to load my data with a data load file in FDM to an HFM target system, but I would also like to load additional data using an event script, i.e. after Validate. I would need some way to access the HFM system through FDM scripts; is that possible?
    If so, it would be wonderful to be able to get data from HFM for any point of view, reachable from FDM scripts, in order to load or read any data.
    I've looked for a good guide about scripting in FDM but couldn't find any information about accessing data on the HFM target system. Does one really exist?
    Thanks for help

    Hi,
    Take a look at the LOAD action scripts of your adapter. They might give you an idea.
    Theoretically it should be possible to load data in an additional load, but you need to be very careful: you don't want to corrupt any of the log and status information that is stored during the load process. The audit trail is an important feature in many implementations, so in this context improving automation at the risk of your system's compliance might not be a good idea.
    Regards,
    Matt

  • How to load data with carriage returns through a DRM action script?

    Hello,
    We are using DRM to manage Essbase metadata. This metadata contains a field for the member formula.
    Currently it's a string property in DRM, so we can't use carriage returns, and our formulas are really hard to read.
    But DRM supports other property data types: memo and formatted memo, where carriage returns can be used.
    In the export file we have accordingly changed the record delimiter to a character other than CRLF.
    Our issue: we regularly use action scripts to load new metadata. How do we load data properties containing carriage returns using an action script? There is no option there to change the record delimiter.
    Thanks!

    Hello Sandeep,
    here is what I want to do through an action script: load a formula that spans more than one line.
    Here I write my formula using 4 lines, but the action script cannot load it, since one line = one record:
    ChangeProp|Version_name|Hier_name|Node_name|Formula|@round(
    qty*price
    *m05*fy13

  • Load data with SQL*Loader: linking fields between the CSV file and the control file

    Hi all,
    in a SQL*Loader control file, how do you map a field in the CSV file to a column of the table?
    E.g. I want to import records into table TEST (col1, col2, col3) with data that sits in different positions in the CSV file. How do I do this?
    CSV file (with variable positions):
    test1;prova;pippo;Ferrari;
    xx;yy;hello;by;
    In table TEST I want col1 = 'prova' (xx),
    col2 = 'Ferrari' (yy),
    col3 = default N;
    the other data in the CSV file is ignored.
    So:
    load data
    infile 'TEST.CSV'
    into table TEST
    fields terminated by ';'
    col1 ?????,
    col2 ?????,
    col3 CONSTANT "N"
    Thanks,
    Attilio

    With the '?' marks I mean: how can I link col1 with the right column in the CSV file?
    Attilio
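    One standard way (a sketch, assuming the wanted values are always the 2nd and 4th fields, as in the first sample row) is to declare the unwanted fields as FILLER so they are read from the record but not loaded:
    load data
    infile 'TEST.CSV'
    into table TEST
    fields terminated by ';'
    ( skip1 FILLER,      -- 1st field (test1 / xx), read but ignored
      col1  CHAR,        -- 2nd field -> col1 ('prova')
      skip2 FILLER,      -- 3rd field (pippo / hello), read but ignored
      col2  CHAR,        -- 4th field -> col2 ('Ferrari')
      col3  CONSTANT 'N' -- consumes no input; always loads 'N'
    )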

  • Loading data with data workshop

    Hello,
    We load lots of .csv files into APEX using the Data Workshop tool. Most .csv files are 60-75K rows of data. It used to take just minutes and was a quick and easy way to load data. However, recently it has started taking much, much longer (several hours vs. the normal 5 or 6 minutes) to load data this way into production.
    Any idea why it suddenly takes much longer to load data this way? Is there anything that can be tuned?
    We are running Application Express 4.1.1.00.23 on Red Hat 5 Linux with an 11.2.0.2 database.
    Thank you for any help you can provide.
    Thank you,
    Mark

    Mehabub,
    Thank you for responding. That is essentially what we are doing: we truncate the table, upload our .csv into it, and then process the data into our other tables.
    The table does not have a particularly large number of rows (usually 60,000 - 70,000), but it does have 112 columns.
    It all works fine in our test environment, but that doesn't have the number of users production has. We have been looking at the SGA parameters and seemed to make headway, but then the customer tries an upload and it slows way down in production.
    Our customers really like this functionality we have given them with APEX, but I am running out of ideas of what to look at.
    Thank you,
    Mark

  • SQL*Loader: need to load data with "," in only one field

    Hi,
    I need to load data into one column; my data is in CSV format, like this:
    Shahzaib ismail, Imran aziz, Shahmir mehmood, Shahzad khan
    I want to upload this data into my table, which contains only one column, name.
    What would be the SQL*Loader setup to upload this data?
    Thanks
    Shahzaib ismail
    Oracle Database Express Edition, Developer 6i

    Since you mention you're using database version XE, I'll assume your database version is at least 10.2:
    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Express Edition Release 10.2.0.1.0 - Product
    and so you have the power of:
    - external tables
    http://www.oracle-base.com/articles/9i/ExternalTables9i.php
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:6611962171229
    - regular expressions
    http://nuijten.blogspot.com/2009/07/splitting-comma-delimited-string-regexp.html
    and you don't want to be using SQL*Loader anymore, never ever.
    I simply put your string 'Shahzaib ismail, Imran aziz, Shahmir mehmood, Shahzad khan' in a file called test.csv and told Oracle that file is in my Oracle directory DATA_DIR (that actually points to: c:\data on my 'filesystem' ) and then:
    SQL> create table t(name varchar2(155));
    Table created.
    SQL> -- instead of SQL*Loader use an External Table:
    SQL> create table ext_t
      2    ( textstring varchar2(4000)
      3    )
      4    organization external ( type oracle_loader
      5                            default directory DATA_DIR
      6                            access parameters (fields terminated by '' )
      7                            location ('test.csv')
      8                          );
    Table created.
    SQL> -- Now you can query your file as if it were a table!                       
    SQL> select * from ext_t;    
    TEXTSTRING
    Shahzaib ismail, Imran aziz, Shahmir mehmood, Shahzad khan
    1 row selected.
    SQL> -- and use the powers of SQL to do whatever you want (instead of cludging with those dreaded ctl files):
    SQL> select regexp_substr (textstring, '[^,]+', 1, rownum) names
      2  from   ext_t
      3  connect by level <= length(regexp_replace(textstring, '[^,]+'))+1;
    NAMES
    Shahzaib ismail
    Imran aziz
    Shahmir mehmood
    Shahzad khan
    4 rows selected.
    SQL> -- Voilà, the data is loaded into the table in one single SQL statement:
    SQL> insert into t
      2  select trim(names)
      3  from ( select regexp_substr (textstring, '[^,]+', 1, rownum) names
      4         from   ext_t
      5         connect by level <= length(regexp_replace(textstring, '[^,]+'))+1
      6       );
    4 rows created.
    SQL> --
    SQL> select * from t;
    NAME
    Shahzaib ismail
    Imran aziz
    Shahmir mehmood
    Shahzad khan
    4 rows selected.
    Don't use SQL*Loader, use an External Table.

  • Problem loading data with DTP

    Hi everyone,
    I am trying to load data from a DSO to an InfoCube, and the problem is as follows. Normally a DTP processes packages in the request. In this case the DTP doesn't process any package, yet it finishes with status green.
    When I manage the target, I see that the request finished in red.
    The symptom is that the DTP doesn't process any request, and I don't know why.
    Status: the DSO has active data, and the InfoCube is empty.
    I don't know what is happening, but I haven't been able to carry the data from the DSO to the target.
    I hope you can help me.
    Regards.
    Jose

    Answering your questions:
    Are you seeing the request red in Manage and green in Monitor Details?
    Yes, in Manage I see the requests in red, but in the Monitor Details everything is green, although no processed requests show up there.
    Are the requests green and active in the source DSO?
    OK, this DSO is direct update, and it's filled by an APD process, so the DSO doesn't have active requests; it just has the data in its active-data table. I think that doesn't matter, because I'm trying to do the same thing between two InfoCubes and it doesn't work either.
    It looks like the problem is general across the whole warehouse.
    Additionally, this DTP worked correctly last week; this week it began to fail.
    Could it be a system problem? What can I check?
    Also check if this is an authorization issue - SU53.
    I checked SU53, but everything is OK.
    In Monitor Header - Selections you should see the requests which have been loaded from the source.
    I don't see anything in this field.
    Thanks
    Jose

  • Unable to load data with impdp

    Hi friends,
    I've encountered the following errors while loading data through the impdp utility.
    ORA-31626: job does not exist
    ORA-31633: unable to create master table "TRACE.SYS_IMPORT_FULL_05"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT", line 863
    ORA-00955: insufficient privileges
    I think the problem is the last line, ORA-00955: insufficient privileges. What is your opinion? Kindly tell me what privileges the user needs to import/export a dump file.
    Looking for your help and suggestions.
    Regards,
    Abbasi

    Does this dump file consist of only TRACE schema objects, or of other schemas' objects too?
    There is no need to grant DBA privileges to TRACE; you can import using the SYS/SYSTEM user:
    impdp system/****@TNRDB directory=tnr_dump_dir dumpfile=tnrsms.dmp logfile=loading.log
    Thanks
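    If the root cause really is missing privileges (rather than, say, a leftover master table from an aborted job), here is a sketch of the grants that commonly let a non-DBA user run its own import; the user and directory names follow the post, while the USERS tablespace is an assumption:
    -- run as a DBA user
    grant create session, create table to trace;
    alter user trace quota unlimited on users;  -- the Data Pump master table needs tablespace quota
    grant read, write on directory tnr_dump_dir to trace;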

  • Requirement to load data from a source with extension .inv.dhl to a .csv target

    Hi Friends, I have got a new requirement: load data from a file with the extension .inv.dhl into a .csv target. I got a single file from our customer with the extension .inv.dhl and want to load all the information available in that file into the target as .csv. Below are the input file and the desired output files. Kindly help me with the implementation. Regards, Rajendra

    Hi Rajendra, though the file extension says it is of type .inv.dhl, the actual content of the file is XML, and it is valid too, as AnjiReddy said. I would suggest you import/create your source definition from the source file via the 'Source > Import from XML Definition' menu option, use an XML Parser transformation, and then go ahead with the rest of your business logic within the mapping and connect it to the CSV file target. Please try it and let us know if you need help at any point or run into issues. Rajani
