Load Data with 7.0 DataSource from Flat File to Write Optimized DSO

Hi all,
we have a problem loading data from a flat file using a 7.0 DataSource.
We have to load a flat file (monthly) into a WO DSO. The InfoPackage loads the file in full mode into the DataSource (PSA), and the DTP loads data in delta mode from the DataSource into the WO DSO.
When I load the second file into the DataSource, the DTP loads all data present in the DataSource, not only the new records as expected in delta mode.
Does anyone have any tips to help me?
Thank you for your help.
Regards
Emiliano

Hi,
I am facing a similar problem.
I am using a Write Optimized DSO and have only 1 request in the PSA (I have deleted all previous requests from the PSA and the DSO).
When I do a delta load from PSA to DSO, I expect only that 1 request to be loaded into the DSO.
But it is picking up the data from 3 other requests and doubling the records...
Can you please help me - how did you manage to get out of that issue?
Cheers,
Nisha
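
A conceptual sketch of the expected behaviour (plain Python, not SAP code; the request IDs are made up): a delta DTP keeps a pointer to the PSA requests it has already transferred and should pick up only the new ones.
{code}
# Hypothetical illustration - not SAP code. A delta DTP remembers transferred requests.
psa_requests = ["REQ_JAN", "REQ_FEB"]   # two monthly flat-file loads in the PSA
already_transferred = {"REQ_JAN"}       # delta pointer after the first DTP run

delta = [r for r in psa_requests if r not in already_transferred]
print(delta)   # ['REQ_FEB'] - only the new request should reach the WO DSO
{code}
If the DTP transfers both requests again, it is behaving like a full load rather than a delta.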

Similar Messages

  • Disk throughput drops when inserting data packages in write-optimized DSO

    Hi all,
    we are currently testing our new freshly installed SAN.
    To see the performance gain in BI, I'm currently doing some test loads.
    And during the monitoring of those loads, I noticed something I'd like someone to explain :-):
    I execute a DTP from PSA to a write-optimized DSO.
    The n° of parallel processes = 9
    Update method = serial extraction, immediate parallel processing
    N° of records transferred: +23.000.000
    Ok, in the first phase (read the PSA) only one process is used (serial extraction).  When I look in OS07, I notice we have very good throughput: +66.000 TransfKB/s. Very nice!
    But as soon as BI starts inserting the data packages, and parallel processing kicks in, the throughput drops to 4K or something, and sometimes we get 20K at max.  That's not too good.
    We have a massive SAN, but the BI system does not seem to use it?
    I was wondering why this is the case.  I already toyed around with the package size, but it's always the same.
    Also I noticed that the allocated processes don't seem to be active.  I allocated 9 BTC processes to this load.
    They are all used, but we only see 3 inserts at the same time, max.  Also in the DTP-monitor, only 3 packages are processed at the same time.  As it's a write-optimized DSO, RSODSO_SETTINGS does not apply I presume.
    Any ideas?
    Thanks!

    Hi,
    can you please try applying a filter in the DTP and pull the data again?
    I am not sure why the first data package takes a long time while the other data packages take less time.
    Do you have a start routine with logic like "IF datapak = 1, then do this"?
    Please check.
    regards
    Gopal

  • SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader

    I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
    The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
    The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
    The first time I loaded the data I did it from a command line with SQL*Loader.
    Now when I try to load the data I get this message:
    SQL*Loader-704: Internal error: ulconnect OCISERVERATTACH
    ORA-12154: TNS:could not resolve the connect identifier specified
    I've searched for postings on these error messages, and they all seem to say that SQL*Loader can't find my TNSNAMES file.
    I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
    However, SQL Developer will not let me load a file this big.
    I have also tried to load the file within Apex (SQL Workshop / Utilities) but again, the file is too big.
    So it seems like SQL*Loader is the only option.
    I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
    Not sure what else to try or where to look.
    thanks

    Hi,
    You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in Ed's link that you were already pointed at) is the following (I assume you are on Windows?):
    open a command prompt
    set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
    This will tell Oracle to use the config files it finds there and no others.
    then try sqlldr user/pass@db (in the same DOS window)
    see if that connects and let us know.
    Cheers,
    Harry
    http://dbaharrison.blogspot.com
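    For what it's worth, the same steps can be scripted. A minimal sketch (untested; the control file name, path, and credentials are placeholders): pin TNS_ADMIN in the environment before invoking sqlldr so it reads the intended tnsnames.ora.
    {code}
    # Hypothetical sketch - adjust paths and credentials for your system.
    import os
    import subprocess

    env = dict(os.environ)
    env["TNS_ADMIN"] = r"c:\oracle\network\admin"  # directory holding the correct tnsnames.ora

    # Equivalent to "set TNS_ADMIN=..." followed by sqlldr in the same DOS window
    subprocess.run(["sqlldr", "user/pass@db", "control=load.ctl"], env=env, check=True)
    {code}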

  • Multiple data loads in PSA with write optimized DSO objects

    Dear all,
    Could someone tell me how to deal with this situation?
    We are using write optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
    The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and solving. This also provides the opportunity to see the differences between two data loads.
    For normal operation, the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
    As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
    We already tried the "all new records" functionality in the DTP, but that only loads the oldest data package and does not process the new PSA loads.
    Does any of you have a solution for this?
    Thanks in advance.
    Harald

    Hi Ajax,
    I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
    The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with PSA IDs after they have been changed. If you use the option to delete the content of a PSA table via the process chain, it will fail when the DataSource is changed, due to a newly generated PSA table ID.
    Regards,
    Harald
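    As a plain illustration of Harald's requirement (not SAP code; request IDs and dates are made up), "load only the most recent PSA request" amounts to:
    {code}
    # Hypothetical PSA requests: (request id, load date)
    psa = [("REQ1", "2009-01-31"), ("REQ2", "2009-02-28"), ("REQ3", "2009-03-31")]
    latest = max(psa, key=lambda r: r[1])
    print(latest[0])   # REQ3 - only this request should go into the WO DSO
    {code}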

  • SQL*Loader: Load data with format MM/DD/YYYY HH:MI:SS PM

    Please advise how to load data with the format MM/DD/YYYY HH:MI:SS PM into an Oracle table using SQL*Loader.
    - What format should I give in the control file?
    - What column type should I use when creating the table to load the data?
    Sample data below:
    MM/DD/YYYY HH:MI:SS PM
    12/9/2012 2:40:20 PM
    11/29/2011 11:23:12 AM
    Thanks in advance
    Avinash

    Hello Srini,
    I tried the creation date as a DATE datatype but got the error:
    ORA-01830: date format picture ends before converting entire input string
    I am running SQL*Loader from the Oracle R12 EBS front end.
    The contents of my control file are:
    {code}
    LOAD DATA
    INFILE "$_FileName"
    REPLACE
    INTO TABLE po_recp_int_lines_stg
    WHEN (01) = 'L'
    FIELDS TERMINATED BY "|"
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    INDICATOR                POSITION(1) CHAR,
    TRANSACTION_MODE          "TRIM(:TRANSACTION_MODE)",
    RECEIPT_NUMBER               "TRIM(:RECEIPT_NUMBER)",
    INTERFACE_SOURCE          "TRIM(:INTERFACE_SOURCE)",
    RECEIPT_DATE               "TO_CHAR(TO_DATE(:RECEIPT_DATE,'MM/DD/YYYY'),'DD-MON-YYYY')",
    QUANTITY               "TRIM(:QUANTITY)",
    PO_NUMBER               "TRIM(:PO_NUMBER)",
    PO_LINE_NUMBER               "TRIM(:PO_LINE_NUMBER)",
    CREATION_DATE               "TO_CHAR(TO_DATE(:CREATION_DATE,'MM/DD/YYYY HH:MI:SS AM'),'DD-MON-YYYY HH:MI:SS AM')",
    ERROR_MESSAGE                   "TRIM(:ERROR_MESSAGE)",
    PROCESS_FLAG                    CONSTANT 'N',
    CREATED_BY                      "fnd_global.user_id",
    LAST_UPDATE_DATE                SYSDATE,
    LAST_UPDATED_BY                 "fnd_global.user_id"
    )
    {code}
    My data file goes like
    {code}
    H|CREATE|123|ABC|12/10/2012||||
    L|CREATE|123|ABC|12/10/2012|100|PO12345|1|12/9/2012  2:40:20 PM
    L|CORRECT|123|ABC|12/10/2012|150|PO12346|2|11/29/2011 11:23:12 AM
    {code}
    Below is the desc of the table
    {code}
    INDICATOR             VARCHAR2 (1 Byte)                         
    TRANSACTION_MODE        VARCHAR2 (10 Byte)                         
    RECEIPT_NUMBER             NUMBER                         
    INTERFACE_SOURCE        VARCHAR2 (20 Byte)                         
    RECEIPT_DATE             DATE                    
    QUANTITY             NUMBER                    
    PO_NUMBER             VARCHAR2 (15 Byte)                         
    PO_LINE_NUMBER             NUMBER                         
    CREATION_DATE             TIMESTAMP(0)                         
    ERROR_MESSAGE             VARCHAR2 (4000 Byte)                         
    PROCESS_FLAG             VARCHAR2 (5 Byte)                         
    CREATED_BY             NUMBER               
    LAST_UPDATE_DATE        DATE               
    LAST_UPDATED_BY             NUMBER
    {code}
    Thanks,
    Avinash
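    For what it's worth, a common fix (a sketch, not tested against this table) is to let SQL*Loader convert the fields directly with a datetime mask instead of wrapping TO_DATE inside TO_CHAR - the TO_CHAR output is a string that Oracle implicitly re-converts using the session NLS format, which is what typically raises ORA-01830:
    {code}
    RECEIPT_DATE                    DATE "MM/DD/YYYY",
    CREATION_DATE                   TIMESTAMP "MM/DD/YYYY HH:MI:SS AM",
    {code}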

  • Error encountered while loading data into an Essbase cube from Oracle DB

    Hi All,
    I am trying to load data into an Essbase cube from an Oracle DB. Essbase is installed on a 64-bit HP-UX machine, and the Oracle database is on an HP-UX 11.23 B server. While performing the above, I am getting the below error messages:
    OK/INFO - 1021013 - ODBC Layer Error: [08001] ==> [[DataDirect][ODBC Oracle Wire Protocol driver][Oracle]ORA-01034: ORACLE not available
    ORA-27101: shared memory realm does not exist
    HP-UX Error: 2: No such file or directory].
    OK/INFO - 1021014 - ODBC Layer Error: Native Error code [1034] .
    ERROR - 1021001 - Failed to Establish Connection With SQL Database Server. See log for more information.
    Can you please help me identify the issue and why it could have occurred - is it a network problem, or something related to the database?
    Regards,
    Priya

    The error is related to Oracle; I would check that it is all up and running:
    ORA-01034: ORACLE not available
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • How to get data with the raw pattern from a ResultSet?

    Would you tell me how to get data with the raw pattern from a ResultSet?
    thank you in advance!
    longgger2000

    I tried getBytes() and getObject(), but I cannot get the right result. For example, the data in the Oracle database is 01000000DFFF; when I use getBytes() or getObject(), I get the result [B@1c2e8a4 - very different. Please tell me why!
    thank you
    longgger2000
    [B is byte array - it seems that it returns a byte array to you. You need to format those bytes as a hex string yourself.
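    The same idea in Python 3 terms (illustrative only; the JDBC equivalent is to take the array from getBytes() and format each byte as two hex digits):
    {code}
    # Raw bytes print as an opaque value (like [B@1c2e8a4 in Java) until you format them.
    raw = bytes([0x01, 0x00, 0x00, 0x00, 0xDF, 0xFF])
    print(raw)                 # an opaque bytes repr - not what you want to display
    print(raw.hex().upper())   # 01000000DFFF - the value as stored in the database
    {code}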

  • Error while loading data on to the cube : Incompatible rule file.

    Hi,
    I am trying to load data into an Essbase cube from a data file. I already have a rule file on the cube, and I am getting the following error while loading the data. Is there a problem with the rule file?
    SEVERE: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
    com.essbase.api.base.EssException: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
         at com.essbase.server.framework.EssOrbPluginDirect.ex_olap(Unknown Source)
         at com.essbase.server.framework.EssOrbPluginDirect.essMainBeginDataload(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMainMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)
         at com.essbase.server.framework.EssOrbPluginDirect._invokeProtected(Unknown Source)
         at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin.essMainBeginDataload(Unknown Source)
         at com.essbase.api.datasource.EssCube.beginDataload(Unknown Source)
         at grid.BudgetDataLoad.main(BudgetDataLoad.java:85)
    Error: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
    Feb 7, 2012 3:13:37 PM com.hyperion.dsf.server.framework.BaseLogger writeException
    SEVERE: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
    com.essbase.api.base.EssException: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
         at com.essbase.server.framework.EssOrbPluginDirect.ex_olap(Unknown Source)
         at com.essbase.server.framework.EssOrbPluginDirect.essMainLoadBufferTerm(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMainMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)
         at com.essbase.server.framework.EssOrbPluginDirect._invokeProtected(Unknown Source)
         at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin.essMainLoadBufferTerm(Unknown Source)
         at com.essbase.api.datasource.EssCube.loadBufferTerm(Unknown Source)
         at grid.BudgetDataLoad.main(BudgetDataLoad.java:114)
    Error: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
    Thanks,
    Santhosh

    " Incompatible rule file. Duplicate member name rule file is used against unique name database."
    I am just guessing here, as I have never used the duplicate name functionality in Essbase, nor do I remember which versions it is in. However, with that said, I think your answer is in your error message.
    Just guessing again... It appears that your rule file is set to allow duplicate member names while your database is not. With that information in hand (given to you in the error message), I would start to explore that.

  • Problem loading data into write optimized DSO

    Hi,
    I am having a problem loading data from PSA to a write optimised DSO.
    I have changed a normal DSO into a Write Optimised DSO. I have 2 DataSources to be loaded into the write optimized DSO: one for Demand and one for Inventory.
    The loading of Demand data from PSA to DSO happens fine without any error.
    Now while loading the Inventory data from PSA to DSO, I get the below errors:
    "Data Structures were changed. Start Transaction before hand"
    Exception CX_RS_FAILED logged
    I have tried reactivating the DSO, transformation and DTP and loading the data again to the write optimised DSO, but no luck; I get the above error message every time.
    Can some one please suggest me what could be done to avoid the above error messages and load the data successfully.
    Thanks in advance.

    Hi,
    Check whether any routines are written in the transformations.
    Check the data structure of the cube and the DataSource as well.
    Are there any changes in the structure?
    Data will load up to the PSA normally; if there are any changes in the structure of the DSO, then this error may occur - just check it.
    Check the below blog:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    Let us know the status.
    Regards
    Pra

  • Duplicate Error while loading data to Write Optimized DSO

    Hi,
    When I do a data load for the Write Optimized DSO, I am getting the error "Duplicate Data Record Detected". I have Sales Document Number, Fiscal Year Variant & Billing Item as the semantic key in the DSO.
    For this DSO, I am getting data from a test ECC system, in which the Sales Document Number column is mostly blank for this DataSource.
    When I go into the Error Stack of the DSO, all the rows which have a blank Sales Document Number are displayed. For all these rows, the Item Number is 10.
    Am I getting this duplicate error because the Sales Document Number is blank & the Item Number is 10 for all of them? I read in threads that a Write Optimized DSO doesn't care about the key values; it loads the data even if the key values are the same.
    Any help is highly appreciated.
    Regards,
    Murali

    Hi Murali,
    Is the Item Number a key field?
    When all the key fields are the same, data gets aggregated depending on the setting done in the transformation for the key figures. The 2 options for key figures are:
    1. Add up the key figures
    2. Replace the key figure
    Since the Sales Document No is blank and the Item Number is the same, there is a possibility that the key figures for these records get added up or replaced, and due to this property of the SDSO it might not throw an error.
    Check the KF value in the SDSO for that Sales Doc No and Item No, and try to find out what the KF value is. It may be the sum of all the common data fields or the KF value of the last common record.
    Regards
    Raj Rai
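    A small illustration of the suspected collision (plain Python, made-up rows): with the Sales Document Number blank and the Item Number identical, all such rows share one semantic key, which is exactly what a duplicate check flags.
    {code}
    # Hypothetical rows - blank document numbers collapse onto the same semantic key.
    rows = [
        {"doc_no": "", "fisc_variant": "K4", "item": 10, "qty": 5},
        {"doc_no": "", "fisc_variant": "K4", "item": 10, "qty": 7},
    ]
    seen = set()
    for r in rows:
        key = (r["doc_no"], r["fisc_variant"], r["item"])
        if key in seen:
            print("Duplicate semantic key:", key)   # fires for the second blank-doc row
        seen.add(key)
    {code}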

  • Delta loading procedure from Write Optimized DSO to Infocube

    Hi All,
    We are using a Write Optimized DSO in our project, to which I am loading data using Standard DSO 0FI_GL_12.
    From the Write Optimized DSO, we are loading delta records into an InfoCube. Please provide your input on the following questions:
    1) I am quite interested to know how the delta records get loaded into the InfoCube when we use a Write Optimized DSO, as we don't have any image concept in a Write Optimized DSO.
    Ex: If I am using a Standard DSO, we have a change log table, and the image concept allows the updated value to reach the cube.
    Let us assume:
    Active Table
    111            50
    111            70 (overwrite)
    Change Log Table
    111            -50    (X -- before image)
    111             70    (' ' -- after image; the symbol for the after image is space)
    So if we load this record to the target as a delta, the above two records from the change log table will be loaded to the CUBE, and the cube will have 70 as the updated value.
    If I am using a Write Optimized DSO:
    Active Table
    111            50
    111            70 (overwrite)
    When these records are loaded to the cube, since an InfoCube is always 'additive', the total value will be 50+70 = 120, which is wrong.
    Correct me: what feature will work here to get the updated value '70' into the cube from the Write Optimized DSO?
    2) As the DataSource is delta-capable and has an 'ADDITIVE' delta process, are only the delta records, based on REQUEST ID, loaded into the InfoCube with the updated key figure value?
    Thanks for your inputs and much appreciated.
    Regards,
    Madhu

    Hi Madhu,
    In best practice, we use a WODSO in the initial layer and then a Standard DSO, just for mass data load/staging purposes.
    Best practice: DataSource ----> WODSO ---> Std. DSO
    In your case: DataSource ----> Std. DSO -----> WODSO
    In both cases, if the data load design is not done accurately, your cube will have incorrect entries.
    For example: today at 9 am: 111, 50 (in the active table).
    Data load to cube, same day at 11 am: the cube will have 111 50.
    Same day, the value gets changed in the Std. DSO at 1 pm: 111 70 (overwrite function in the active table).
    Same day or the next day, if you load data to the cube, it will have 2 records: one with value 50 and the other with 70. So to avoid such scenarios we should plan the load accurately, or else change your DTP setting to 'source table: change log table'.
    Coming to your case:
    After the load to the Std. DSO, load data to the WODSO by changing the DTP setting 'Delta Init. Extraction from': Change Log.
    Now the data is available in the WODSO from the change log table; then you load it to the cube in delta mode.
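    The arithmetic behind this, as a plain Python sketch (illustrative values from the question):
    {code}
    # Loading active-table snapshots additively double-counts:
    snapshots = [50, 70]          # 70 overwrote 50 in the active table
    print(sum(snapshots))         # 120 - wrong, the cube simply adds both records

    # Change-log records carry before/after images that cancel out:
    change_log = [50, -50, 70]    # initial load, before image (X), after image (' ')
    print(sum(change_log))        # 70 - the correct current value
    {code}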

  • How to fetch the date column (or month column) from the file name in a specified path in ODI 11g

    Hi all,
    Can anyone help us with how to fetch the date column (or month column) from the file name specified in the path, in a generalized way?
    For example:
    The file name subscribers (Cost) Sep13.csv is specified in the below path:
      E:\Accounting\documents\subscribers (Cost) Sep13.csv
    Here I need to fetch "Sep13" as a date column in ODI 11g in a generalized way.
    Can anyone help us with this as early as possible?

    I would suggest using a piece of Jython code for this.  Something like this...
    import os

    filelist = os.listdir("E:\\Accounting\\documents")
    for file in filelist:
        datestr = file[19:-4]   # "subscribers (Cost) Sep13.csv" -> "Sep13"
    You'd need to work out what to do with datestr next...  perhaps write it to a table or update an ODI variable with it.
    Hope this is of some help.
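    If the month string then needs to become a real date, a small follow-up sketch (assuming the "MonYY" pattern seen in the example file name):
    {code}
    # Hypothetical continuation: parse "Sep13" into a date (first day of that month).
    from datetime import datetime

    datestr = "Sep13"
    month_start = datetime.strptime(datestr, "%b%y")
    print(month_start.date())   # 2013-09-01
    {code}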

  • Error while deleting data from a write optimized DSO using a Process Chain?

    Dear Bwers,
    I am facing a strange error while using process chains to delete data from a data target which is a write-optimized DSO. The process shows as failed, but the data does get deleted in the DSO. The error message is below. Has anybody had a similar problem? Any suggestions?
    Thanks
    Raj
    Error while deleting content of InfoCube/DataStore object ZLSD_G03
    Message no. RSDODSO153

    Please check whether you get any short dump in ST22 related to this issue.

  • SQL Server 2012 Developer Edition will not install. Setup files don't even get copied completely. Win 8.1. ACT instance is loaded & can't be deleted. From log file: Error: Action "PreMsiTimingConfigAction" failed during execution.


    Hello,
    I am glad it worked.
    Thank you for visiting MSDN forums!
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • I have a file with 500 pages created from an AutoCad file; each page has a different document number. The file is not editable in PDF. How can I change all the document numbers using the "Comment" feature, or is there an alternate method? I have Adobe Acrobat X Pro and the Windows 7 platform.

    I have a PDF file with 500 pages created from an AutoCad file. Each page has a different document number. The file is not editable in PDF. How can I change all the document numbers using the "Comment" feature? Is there any alternate method? I have Adobe Acrobat X Pro and the Windows 7 platform.

    Yes, I just want to cover up those particular document-number areas on all the pages.
    Nothing sensitive about it. I just want to show the correct document numbers on all pages in the printout.
    So I wanted to cover them up with comments, but commenting on each page separately would be difficult. I wanted to apply the same comment to all pages.
