Don't load duplicates from a file to a cube through DTP

Hi gurus, I want to know whether it is possible to avoid loading the same data from files into a cube when the load runs through a DTP.
I thought of adding a step to a process chain that eliminates duplicate entries in the cube, but the problem is that I don't have any selection in the DTP; this is a full load.
Regards.

Neo,
do you want to avoid loading duplicate files to a cube using a DTP, or avoid loading duplicate records from the file to the cube?
If it is the latter, avoiding duplicate records from the file, then ideally you can use a DSO in between: the DSO's key fields with overwrite will eliminate the duplicates before the data goes on to the cube.
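Alternatively, if the duplicates have to be dropped inside the DTP load itself, a start routine in the transformation can filter each data package before it reaches the cube. A minimal sketch, assuming a BI 7.0 transformation where SOURCE_PACKAGE is the standard start-routine parameter; DOC_NUMBER and ITEM_NUM are placeholders for whatever your real semantic key is:

" Start routine body: sort by the semantic key and drop exact duplicates
" so the same record cannot be posted to the cube twice in one request.
SORT source_package BY doc_number item_num.
DELETE ADJACENT DUPLICATES FROM source_package
       COMPARING doc_number item_num.

Note that this only removes duplicates within a single request; if the same file is loaded twice in two separate full loads, only the DSO-with-overwrite approach prevents the key figures from being doubled.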

Similar Messages

  • Job cancelled While loading data from DSO to CUBE using DTP

    Hi All,
    While I am loading data from the DSO to the CUBE, the load job is getting cancelled.
    In the job overview I got the following messages:
        SYST: Date 00/01/0000 not expected.
        Job cancelled after system exception ERROR_MESSAGE
    What can be the reason for this error? I have successfully loaded the data into 2 layers of DSO before loading it into the CUBE.

    Hi,
    Are you loading the flat file to the DSO?
    Then check the date field in the PSA data and replace any wrong values with the correct date.
    I think you are using a write-optimized DSO, which behaves like the PSA: it takes over all the data from the PSA into the DSO.
    So clear the data in the PSA, then load to the DSO and then to the CUBE.
    If you don't need the date field in the CUBE, remove its mapping in the transformation to the cube, activate the transformation, activate the DTP and trigger it; it will work.
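    If the date field is needed in the cube, another option (not from the reply above, just a hedged alternative) is an individual rule routine in the transformation that catches the invalid date instead of letting the load abort. A rough sketch, assuming the standard generated names SOURCE_FIELDS and RESULT and a source date field called CALDAY:

    " Rule routine for the date target: pass all-zero or initial dates on
    " as empty instead of failing with "Date 00/01/0000 not expected".
    IF source_fields-calday IS INITIAL OR source_fields-calday = '00000000'.
      CLEAR result.
    ELSE.
      result = source_fields-calday.
    ENDIF.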

  • Loading XML File in Oracle Tables through Concurrent Request

    I am posting a working program which reads an XML file and loads it into an Oracle database table through a concurrent request. The input parameter for this program is the file name. I have created the directory ASPEN_DIR pointing to /interface/inbound (visible in the ALL_DIRECTORIES view).
    /* This is a sample program that reads an input XML file and loads the data into an Oracle database table.
    The program is executed through a concurrent request and takes an input file name.
    It also writes a log of the records read from the file and inserted into the table. */
    CREATE OR REPLACE PACKAGE BODY CBAP_ACCRUENT_XML_PKG AS
    PROCEDURE read_emp_xml_file (errbuf out varchar2,
    retcode out number,
    in_filename in varchar2)
    is
    my_dir varchar2(10) := 'ASPEN_DIR';
    l_bfile BFILE;
    l_clob CLOB;
    l_parser dbms_xmlparser.Parser;
    l_doc dbms_xmldom.DOMDocument;
    l_nl dbms_xmldom.DOMNodeList;
    l_n dbms_xmldom.DOMNode;
    l_temp VARCHAR2(1000);
    v_empno number(10);
    v_ename varchar2(50);
    v_job varchar2(30);
    v_mgr number(10);
    v_hiredate date;
    v_sal number(10);
    v_comm number(10);
    src_csid NUMBER := NLS_CHARSET_ID('UTF8');
    v_read NUMBER(5);
    v_insert NUMBER(5);
    dest_offset INTEGER := 1;
    src_offset INTEGER := 1;
    lang_context INTEGER := dbms_lob.default_lang_ctx;
    warning INTEGER;
    BEGIN
    v_read := 0;
    v_insert := 0;
    l_bfile := BFileName(my_dir, in_filename);
    dbms_lob.createtemporary(l_clob, cache=>FALSE);
    dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
    dbms_lob.loadclobfromfile(l_clob, l_bfile, dbms_lob.getlength(l_bfile), dest_offset,src_offset, src_csid, lang_context, warning);
    dbms_lob.close(l_bfile);
    -- make sure implicit date conversions are performed correctly
    dbms_session.set_nls('NLS_DATE_FORMAT','''DD/MM/RR HH24:MI:SS''');
    -- Create a parser.
    l_parser := dbms_xmlparser.newParser;
    -- Parse the document and create a new DOM document.
    dbms_xmlparser.parseClob(l_parser, l_clob);
    l_doc := dbms_xmlparser.getDocument(l_parser);
    -- Free resources associated with the CLOB and Parser now they are no longer needed.
    dbms_lob.freetemporary(l_clob);
    dbms_xmlparser.freeParser(l_parser);
    -- Get a list of all the nodes in the document using the XPATH syntax.
    l_nl := dbms_xslprocessor.selectNodes(dbms_xmldom.makeNode(l_doc),'/EMPLOYEES/EMP');
    -- Loop through the list and read each EMP record,
    -- inserting one row into the emp table per record.
    FOR cur_emp IN 0 .. dbms_xmldom.getLength(l_nl) - 1 LOOP
    l_n := dbms_xmldom.item(l_nl, cur_emp);
    v_read := v_read + 1;
    -- Use XPATH syntax to assign values to the local variables.
    dbms_xslprocessor.valueOf(l_n,'EMPNO/text()',v_empno);
    dbms_xslprocessor.valueOf(l_n,'ENAME/text()',v_ename);
    dbms_xslprocessor.valueOf(l_n,'JOB/text()',v_job);
    dbms_xslprocessor.valueOf(l_n,'MGR/text()',v_mgr);
    dbms_xslprocessor.valueOf(l_n,'HIREDATE/text()',v_hiredate);
    dbms_xslprocessor.valueOf(l_n,'SAL/text()',v_sal);
    dbms_xslprocessor.valueOf(l_n,'COMM/text()',v_comm);
    insert into emp(empno,ename,job,mgr,hiredate,sal,comm)
    values(v_empno,v_ename,v_job,v_mgr,v_hiredate,v_sal,v_comm);
    v_insert := v_insert + 1;
    END LOOP;
    -- Free any resources associated with the document now it
    -- is no longer needed.
    dbms_xmldom.freeDocument(l_doc);
    -- TODO: move the processed file to another directory
    commit;
    fnd_file.put_line(fnd_file.LOG,'Number of Records Read : '||v_read);
    fnd_file.put_line(fnd_file.LOG,'Number of Records Insert : '||v_insert);
    EXCEPTION
    WHEN OTHERS THEN
    dbms_lob.freetemporary(l_clob);
    dbms_xmlparser.freeParser(l_parser);
    dbms_xmldom.freeDocument(l_doc);
    retcode := sqlcode;
    ERRBUF := sqlerrm;
    ROLLBACK;
    END read_emp_xml_file;
    END;
    A sample input XML file:
    <?xml version="1.0" ?>
    <EMPLOYEES>
    <EMP>
    <EMPNO>7369</EMPNO>
    <ENAME>SMITH</ENAME>
    <JOB>CLERK</JOB>
    <MGR>7902</MGR>
    <HIREDATE>17-DEC-80</HIREDATE>
    <SAL>800</SAL>
    </EMP>
    <EMP>
    <EMPNO>7499</EMPNO>
    <ENAME>ALLEN</ENAME>
    <JOB>SALESMAN</JOB>
    <MGR>7698</MGR>
    <HIREDATE>20-FEB-81</HIREDATE>
    <SAL>1600</SAL>
    <COMM>300</COMM>
    </EMP>
    <EMP>
    <EMPNO>7521</EMPNO>
    <ENAME>WARD</ENAME>
    <JOB>SALESMAN</JOB>
    <MGR>7698</MGR>
    <HIREDATE>22-FEB-81</HIREDATE>
    <SAL>1250</SAL>
    <COMM>500</COMM>
    </EMP>
    <EMP>
    <EMPNO>7566</EMPNO>
    <ENAME>JONES</ENAME>
    <JOB>MANAGER</JOB>
    <MGR>7839</MGR>
    <HIREDATE>02-APR-81</HIREDATE>
    <SAL>2975</SAL>
    </EMP>
    <EMP>
    <EMPNO>7654</EMPNO>
    <ENAME>MARTIN</ENAME>
    <JOB>SALESMAN</JOB>
    <MGR>7698</MGR>
    <HIREDATE>28-SEP-81</HIREDATE>
    <SAL>1250</SAL>
    <COMM>1400</COMM>
    </EMP>
    <EMP>
    <EMPNO>7698</EMPNO>
    <ENAME>BLAKE</ENAME>
    <JOB>MANAGER</JOB>
    <MGR>7839</MGR>
    <HIREDATE>01-MAY-81</HIREDATE>
    <SAL>2850</SAL>
    </EMP>
    <EMP>
    <EMPNO>7782</EMPNO>
    <ENAME>CLARK</ENAME>
    <JOB>MANAGER</JOB>
    <MGR>7839</MGR>
    <HIREDATE>09-JUN-81</HIREDATE>
    <SAL>2450</SAL>
    </EMP>
    <EMP>
    <EMPNO>7788</EMPNO>
    <ENAME>SCOTT</ENAME>
    <JOB>ANALYST</JOB>
    <MGR>7566</MGR>
    <HIREDATE>19-APR-87</HIREDATE>
    <SAL>3000</SAL>
    </EMP>
    <EMP>
    <EMPNO>7839</EMPNO>
    <ENAME>KING</ENAME>
    <JOB>PRESIDENT</JOB>
    <HIREDATE>17-NOV-81</HIREDATE>
    <SAL>5000</SAL>
    </EMP>
    <EMP>
    <EMPNO>7844</EMPNO>
    <ENAME>TURNER</ENAME>
    <JOB>SALESMAN</JOB>
    <MGR>7698</MGR>
    <HIREDATE>08-SEP-81</HIREDATE>
    <SAL>1500</SAL>
    <COMM>0</COMM>
    </EMP>
    <EMP>
    <EMPNO>7876</EMPNO>
    <ENAME>ADAMS</ENAME>
    <JOB>CLERK</JOB>
    <MGR>7788</MGR>
    <HIREDATE>23-MAY-87</HIREDATE>
    <SAL>1100</SAL>
    </EMP>
    <EMP>
    <EMPNO>7900</EMPNO>
    <ENAME>JAMES</ENAME>
    <JOB>CLERK</JOB>
    <MGR>7698</MGR>
    <HIREDATE>03-DEC-81</HIREDATE>
    <SAL>950</SAL>
    </EMP>
    <EMP>
    <EMPNO>7902</EMPNO>
    <ENAME>FORD</ENAME>
    <JOB>ANALYST</JOB>
    <MGR>7566</MGR>
    <HIREDATE>03-DEC-81</HIREDATE>
    <SAL>3000</SAL>
    </EMP>
    <EMP>
    <EMPNO>7934</EMPNO>
    <ENAME>MILLER</ENAME>
    <JOB>CLERK</JOB>
    <MGR>7782</MGR>
    <HIREDATE>23-JAN-82</HIREDATE>
    <SAL>1300</SAL>
    </EMP>
    </EMPLOYEES>

    http://download-west.oracle.com/docs/cd/B13789_01/appdev.101/b10790/toc.htm
    Take a look at Examples 4-8 and 4-9. Even though this is 10g documentation, the code should work on 9.2.4 or later.

  • Error 4 when loading flat file for Demo cube

    Hi gurus,
    I've exported the CSV file for 0D_SD_C03 and stored it on my local disk.
    When I try to preview the file using the copy stored on my local drive, the execution closes without any error message, but during loading it shows "no data found". (I had opened the file before executing the load, and the system prompted an error message saying the file is open.)
    When I try to store the file on the application server and execute the preview, it says that error 4 occurred.
    Can anyone suggest what happened and how to rectify this?
    Thank you

    Hi,
    Possible reasons: the file is open, the format/flat file structure is not correct, the mapping/transfer structure is not correct, or there are invalid characters or data inconsistencies in the file, etc.
    Check whether the flat file is in .CSV format.
    Thanks,
    JituK

  • Error while loading data from PSA to Infoobject through DTP(URGENT)

    Hi-
    I am running into an issue while loading data from the PSA to an InfoObject.
    It says:
    Data Package 1 : error during processing.
    Updating Attributes for Infoobject
    Error in substep
    Process Terminated
    Please let me know your solution as this is very urgent.

    Data Package 1 : error during processing.
    This error occurs when your flat file is open at the time of scheduling. Please close the flat file; it will then upload correctly.
    Let me know if you have any further queries.

  • Error while loading the data into the cube using DTP.

    No SID found for the value 'UL' of characteristic 0BASE_UOM.
    Please give me an idea how to solve this error.
    Thanks & Regards
    Syam Prasad Dasari

    https://forums.sdn.sap.com/click.jspa?searchID=23985990&messageID=6733764
    https://forums.sdn.sap.com/click.jspa?searchID=23985990&messageID=5062570

  • Unable to load prn file in LedgerLink

    I am working with HE641 and have to use LedgerLink to load files since there is some account translation. For some reason I am unable to load the file. This is through Citrix. During the load process the file appears to load, because the status bar increments upward, but when I look at Data Entry in the app I cannot see any results. Can anyone help me with this?

    I had to get security access to the translation file through Citrix.

  • How to load flat file in existing Architecture.

    Hi everyone!
    There is an existing architecture that I have to upload data into. There is a cube on top, followed below by an ODS and its update rules to the cube, then an InfoSource and its corresponding update rules to the ODS above it, and finally a DataSource (it says PC FILES (PC_FILE)) with its accompanying transfer rules to the InfoSource.
    There is nothing in the ODS, no data at all. The DataSource says "emulated 3.x DataSource". (What does this mean?)
    What I want to do is upload data via a flat file. Good BI masters, how do I do this? How do I create a "dummy file" for upload? What is the specific sequence of steps I need to follow to load this flat file data?
    Thank you so much!
    Philips

    Emulated DataSource:
    http://help.sap.com/saphelp_nw04s/helpdata/en/8c/131e3b9f10b904e10000000a114084/frameset.htm
    I'm not sure what you are asking - is it how to load a file all the way to the cube in 7.0?
    From the architecture description you have given, you haven't migrated your data flow to the new 7.0 data flow, as you still have update rules and transfer rules. So although your system is 7.0, your back end is still on 3.x, and the way you load a file is the same as when your system was 3.x (before the upgrade).
    Or are you asking for the steps to load the file to a cube, no matter which environment you are in (3.x or 7.0)? If so, let us know, and do a search on the forum for the steps to load a flat file.

  • Data load through DTP giving Error while calling up FM RSDRI_INFOPROV_READ

    Hi All
    We are trying to load data into a cube through a DTP from a DSO. In the transformation we look up InfoCube data through the SAP standard function module 'RSDRI_INFOPROV_READ'. The problem we are facing is that our loads are failing with the errors 'Unknown error in SQL Interface' and a parallel process error.
    In the DTP we have changed the number of parallel processes from 3 (the default) to 1, but the issue with the data loads still exists.
    We had a similar flow developed in 3.5 (the BW 3.5 way) where we used the function module 'RSDRI_INFOPROV_READ', and there our data loads run fine.
    We suspect a compatibility issue of this FM with BI 7.0 data flows but are not sure. If anybody has any relevant inputs on this, or has used this FM with a BI 7.0 flow, please let me know.
    Thanks in advance.
    Kind Regards
    Swapnil

    Hello Swapnil,
    please check note 979660, which mentions this issue.
    Thanks,
    Walter Oliveira.

  • Not fetching the records through DTP

    Hi Gurus,
    I am facing a problem while loading data into an InfoCube through a DTP.
    I have successfully loaded the data up to the PSA, but I am not able to load the records into the InfoCube.
    The request completed successfully with status green, but only 0 records were loaded.
    Later, one of my friends executed the same DTP successfully with all the records loaded.
    Can you please tell me why it is not working with my user ID?
    I have found the following difference in the monitor:
    I am not able to see any selections for my request, but I can see REQUID = 871063 in the selection of the request started by my friend.
    Can anyone tell me why that REQUID = 871063 is not filled automatically when I start the schedule?

    Hi,
    I guess the DTP update mode is DELTA. You and your friend/colleague executed the SAME DTP object within a small time gap, and during that period no new transactions were posted in the source, so your run picked up nothing.
    Try to execute it again after a couple of hours.
    Regards

  • Error while loading data on to the cube : Incompatible rule file.

    Hi,
    I am trying to load data into an Essbase cube from a data file. I already have a rules file on the cube, and I am getting the following error while loading the data. Is there any problem with the rules file?
    SEVERE: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
    com.essbase.api.base.EssException: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
         at com.essbase.server.framework.EssOrbPluginDirect.ex_olap(Unknown Source)
         at com.essbase.server.framework.EssOrbPluginDirect.essMainBeginDataload(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMainMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)
         at com.essbase.server.framework.EssOrbPluginDirect._invokeProtected(Unknown Source)
         at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin.essMainBeginDataload(Unknown Source)
         at com.essbase.api.datasource.EssCube.beginDataload(Unknown Source)
         at grid.BudgetDataLoad.main(BudgetDataLoad.java:85)
    Error: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
    Feb 7, 2012 3:13:37 PM com.hyperion.dsf.server.framework.BaseLogger writeException
    SEVERE: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
    com.essbase.api.base.EssException: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
         at com.essbase.server.framework.EssOrbPluginDirect.ex_olap(Unknown Source)
         at com.essbase.server.framework.EssOrbPluginDirect.essMainLoadBufferTerm(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMainMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)
         at com.essbase.server.framework.EssOrbPluginDirect._invokeProtected(Unknown Source)
         at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
         at com.essbase.api.session.EssOrbPlugin.essMainLoadBufferTerm(Unknown Source)
         at com.essbase.api.datasource.EssCube.loadBufferTerm(Unknown Source)
         at grid.BudgetDataLoad.main(BudgetDataLoad.java:114)
    Error: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
    Thanks,
    Santhosh

    "Incompatible rule file. Duplicate member name rule file is used against unique name database."
    I am just guessing here, as I have never used the duplicate-name functionality in Essbase, nor do I remember which versions it appeared in. However, with that said, I think your answer is in your error message.
    Just guessing again: it appears that your rule file is set to allow duplicate member names while your database is not. With that information in hand (given to you in the error message), I would start exploring there.

  • Can I load the cube through a customised ABAP program?

    Hi all,
    I have loaded the ODS using a customised ABAP program. Is it possible to load CUBES through an ABAP program as well?
    If not, what are the reasons it cannot be loaded that way?
    Thanks
    Pooja

    Hi Pooja,
    For me,
    I'm afraid to upload directly by program into the InfoCube tables, because I don't know which tables would have to be taken into account for the upload.
    But,
    if the requirement is to upload via an ABAP program,
    I do it like this:
    1. I create an ABAP program that writes all the data to be uploaded into .csv files.
    2. I create an InfoPackage that uploads data from a file, and I define the path to point to those .csv files.
    3. Besides writing the .csv files, the ABAP program also triggers the InfoPackage.
    With that technique I am able to upload data via an ABAP program; a rough sketch follows below.
    Hopefully it can help you a lot.
    Regards,
    Niel.
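    A condensed sketch of the approach above, on the assumption that the file is written to the application server and the InfoPackage is started with BAPI_IPAK_START; the path, record layout and InfoPackage name ZPAK_FLATFILE_LOAD are made up for illustration, so check the BAPI's interface in SE37 before using it.

    REPORT z_write_and_load_csv.

    TYPES: BEGIN OF ty_rec,                " illustrative record layout
             matnr TYPE c LENGTH 18,
             menge TYPE c LENGTH 17,
           END OF ty_rec.

    DATA: lt_data   TYPE STANDARD TABLE OF ty_rec,
          ls_data   TYPE ty_rec,
          lv_line   TYPE string,
          lv_file   TYPE string,
          lv_reqid  TYPE c LENGTH 30,
          lt_return TYPE STANDARD TABLE OF bapiret2.

    lv_file = '/interface/outbound/zcube_load.csv'.   " assumed server path

    " Step 1: fill lt_data from the extraction logic (one demo row here),
    " then write it as CSV to the application server for the InfoPackage.
    ls_data-matnr = 'MAT001'. ls_data-menge = '10'. APPEND ls_data TO lt_data.
    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    LOOP AT lt_data INTO ls_data.
      CONCATENATE ls_data-matnr ls_data-menge INTO lv_line SEPARATED BY ','.
      TRANSFER lv_line TO lv_file.
    ENDLOOP.
    CLOSE DATASET lv_file.

    " Step 2: trigger the InfoPackage that points at this server-side file.
    CALL FUNCTION 'BAPI_IPAK_START'
      EXPORTING
        infopackage = 'ZPAK_FLATFILE_LOAD'   " hypothetical InfoPackage name
      IMPORTING
        requestid   = lv_reqid
      TABLES
        return      = lt_return.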

  • How to load date and time from a text file to an Oracle table through SQL*Loader

    Hi friends,
    I need you to show me what I am missing to load date and time from a text file into an Oracle table through SQL*Loader.
    This is my data, in this path (c:\external\my_data.txt):
    7369,SMITH,17-NOV-81,09:14:04,CLERK,20
    7499,ALLEN,01-MAY-81,17:06:08,SALESMAN,30
    7521,WARD,09-JUN-81,17:06:30,SALESMAN,30
    7566,JONES,02-APR-81,09:24:10,MANAGER,20
    7654,MARTIN,28-SEP-81,17:24:10,SALESMAN,30
    My table in the database is emp2:
    create table emp2 (empno number,
                      ename varchar2(20),
                      hiredate date,
                      etime date,
                      ejob varchar2(20),
                      deptno number);
    The control file is in this path (c:\external\ctrl.ctl):
    load data
    infile 'C:\external\my_data.txt'
    into table emp2
    fields terminated by ','
    (empno, ename, hiredate, etime, ejob, deptno)
    This is the error:
    C:\>sqlldr scott/tiger control=C:\external\ctrl.ctl
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon May 31 09:45:10 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 5
    C:\>
    Any help is greatly appreciated.
    thanks
    Edited by: user10947262 on May 31, 2010 9:47 AM

    load data
    infile 'C:\external\my_data.txt'
    into table emp2
    fields terminated by ','
    (empno, ename, hiredate, etime, ejob, deptno)
    Try:
    load data
    infile 'C:\external\my_data.txt'
    into table emp2
    fields terminated by ','
    (empno, ename, hiredate, etime "to_date(:etime,'hh24:mi:ss')", ejob, deptno)
    this is the error :
    C:\>sqlldr scott/tiger control=C:\external\ctrl.ctl
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon May 31 09:45:10 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 5
    C:\>
    That's not an error; you can see any errors in the log and bad files.

  • How to load flat file in BPS through Web-debugging

    Hi,
    We are working on a flat file upload in BPS using the guide 'How to load a flat file in BPS through the Web'. Can someone guide us on how to debug the function modules used while uploading the data?
    We have set a breakpoint in the function modules and also in the BSP page, but when trying to debug during the upload, it does not stop at the breakpoint that was set.
    Could you assist in setting up the breakpoint?
    Regards,
    Sreenath
    Message was edited by:
            sreenath reddy

    Hi,
    I have put an external breakpoint.
    I have hard-coded it in the coding of the function module itself.
    The code is working perfectly, but when I want to check the values of the variables in the FM at runtime, the breakpoint is not triggered. Any ideas?
    Regards,

  • Can I load several files in parallel through one file infopackage?

    Can I load several files in parallel through a file DataSource?

    Hi Alex,
    I am sorry, but I am not able to understand your requirements completely.
    If you are using a BSP, then how does an InfoPackage come into the picture?
    Through a BSP application you can load the data with the help of function modules: in your case, first load the data from the file into an internal table with the function module GUI_UPLOAD, and then write it from the internal table into the ODS with the function module RSDRI_ODSO_MODIFY (see the sketch after this reply).
    Do not forget to assign points if this helps.
    Regards,
    Monika Birdi
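    For reference, a rough sketch of the first half of that approach; the file path and record layout are assumptions, GUI_UPLOAD only works where a SAP GUI frontend is available, and the exact interface of RSDRI_ODSO_MODIFY should be checked in SE37 before coding the second step.

    TYPES: BEGIN OF ty_rec,                 " layout matching the ODS fields (assumed)
             doc_number TYPE c LENGTH 10,
             amount     TYPE c LENGTH 17,
           END OF ty_rec.

    DATA: lt_raw  TYPE STANDARD TABLE OF string,   " raw CSV lines from the file
          lt_data TYPE STANDARD TABLE OF ty_rec.   " typed records for the ODS

    " Step 1: read the local flat file into an internal table.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename = 'C:\temp\upload.csv'      " assumed local path
        filetype = 'ASC'
      TABLES
        data_tab = lt_raw
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc <> 0.
      MESSAGE 'File upload failed' TYPE 'E'.
    ENDIF.

    " Step 2: split each line of lt_raw at the commas into a row of lt_data,
    " then pass lt_data together with the ODS technical name to
    " RSDRI_ODSO_MODIFY (check its parameter names in SE37 first).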
