Import Transaction Data - Duplicate records

Hi,
I need to upload a file of transaction data into BPC using a Data Manager package. I've built the transformation and conversion files, and they validate successfully against a small data set. When I try to upload the real-life file, it fails due to duplicate records. This happens because multiple external IDs map to one internal ID: whilst there are no duplicates in the actual file produced by the client, the data that results after conversion does contain duplicates and therefore will not upload.
Apart from asking the client to perform the aggregation before sending me the file, is there any way to get BPC to accept the duplicates and simply sum them up?
Regards
Sue

Hi,
Try adding the delivered package /CPMB/APPEND and running it; it appends (sums) duplicate records instead of rejecting them. This should solve your problem.
Thanks,
Sreeni
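
If pre-aggregating the file outside BPC is also an option, the duplicates can be collapsed in a small staging step after the conversion mapping has been applied, instead of asking the client to do it. A minimal Python sketch, assuming a comma-delimited file whose last column is the signed amount (the file names and column layout are hypothetical):

import csv
from collections import defaultdict

INFILE = "converted_transactions.csv"
OUTFILE = "aggregated_transactions.csv"

# Sum amounts over identical dimension-member combinations.
totals = defaultdict(float)
with open(INFILE, newline="") as f:
    reader = csv.reader(f)
    header = next(reader)              # keep the header row
    for row in reader:
        key = tuple(row[:-1])          # every column except the amount
        totals[key] += float(row[-1])  # accumulate the amount

with open(OUTFILE, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    for key, amount in sorted(totals.items()):
        writer.writerow(list(key) + [amount])

The aggregated file then contains one record per unique key and loads through the standard import package without duplicate rejections.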

Similar Messages

  • Duplicate Records, Import transaction Data

    Hi Everybody
    I'm using BPC 7.5 NW and I get a warning saying there are duplicate records when I run the package "Load Transaction Data". The txt file that I'm using does not have duplicate records. I have the following data in my flat file:
    ACCOUNT           INTCO               AMOUNT
    61012                   I_65                       10
    61012                   I_66                       12
    61012                   I_67                       13
    I'm using a conversion file for INTCO as:
    EXTERNAL               INTERNAL
    I_65                              I_99
    I_66                              I_99
    I_67                              I_99
    When I run the package, it says there are duplicate records; the records are:
    ACCOUNT           INTCO               AMOUNT
    61012                   I_99                       10
    61012                   I_99                       12
    My question is: is it not possible to use this package together with conversion files? If I use the APPEND package it works fine, but why doesn't it work with Import Transaction Data? As far as I remember, this is possible in the MS version.
    Thanks in advance.
    Regards

    Hi,
    Originally, you had the following records:
    ACCOUNT INTCO AMOUNT
    61012 I_65 10
    61012 I_66 12
    61012 I_67 13
    However, after the conversion file is applied, the records become:
    ACCOUNT INTCO AMOUNT
    61012 I_99 10
    61012 I_99 12
    61012 I_99 13
    So there are 3 records with the same key.
    The import package will not accept the 2nd and 3rd records, because they are duplicates of the 1st record. The append package, however, will append the 2nd and 3rd records to the 1st one, so the cube ends up with a single record: 61012 I_99 35.
    Hope you got the idea.

  • Import Transaction Data from BW Cube to BPC Cube

    Hi,
    Is there any document that explains how I can import/copy transactional data from a BW cube to a BPC cube, as in the picture below? http://img841.imageshack.us/img841/6998/prof.jpg
    Thanks in advance.

    Hi Again,
    With these documents I can import transactional data with a single key figure, but if there is more than one key figure, importing and assigning key figures to dimensions gets confusing.
    For example, I have two key figures, "zbgstsad" and "zbgstsf", in the BW cube, and two account members in "S_ACCT": "S_F" and "S_A". I made a conversion file with the rows "zbgstsad (external) -> s_a (internal)" and "zbgstsf (external) -> s_f (internal)". But I am not sure what I have to put instead of the "?" below.
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=YES
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT=
    ROUNDAMOUNT=
    CONVERTAMOUNTWDIM=S_ACCT
    *MAPPING
    RPTCURRENCY=0CURRENCY
    S_ENTITY=ZBGENTITY
    S_ACCT= ?
    S_TIME=0CALMONTH
    S_CATEGORY=*NEWCOL(ACTUAL)
    S_OB=0UNIT
    S_URUNLER=ZBGURNLR
    AMOUNT= ?
    *CONVERSION
    S_ACCT=[COMPANY]B_ACCT.XLS!CONVERSION
    TIME=[COMPANY]B_TIME.XLS!CONVERSION
    What do you recommend?
    Thanks in advance.
    Burak
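    One general workaround, if the data can be staged as a flat file first, is to unpivot the two key-figure columns into separate rows before import, so each row carries a single amount plus a column holding the key-figure name that the S_ACCT conversion file can map. A minimal pandas sketch; the file and new column names are hypothetical, only "zbgstsad"/"zbgstsf" come from the post:

    import pandas as pd

    # Hypothetical flat-file extract of the BW cube with two key-figure columns.
    df = pd.read_csv("bw_extract.csv")

    # Unpivot: one row per key figure. The new ACCOUNT column holds the
    # key-figure name, which the S_ACCT conversion file maps (zbgstsad -> S_A,
    # zbgstsf -> S_F); the new AMOUNT column feeds the AMOUNT mapping.
    long = df.melt(
        id_vars=[c for c in df.columns if c not in ("zbgstsad", "zbgstsf")],
        value_vars=["zbgstsad", "zbgstsf"],
        var_name="ACCOUNT",
        value_name="AMOUNT",
    )
    long.to_csv("bpc_staging.csv", index=False)

    With the data in that shape, the two "?" placeholders could then point at the new ACCOUNT and AMOUNT columns respectively.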

  • BPC 7.5: Import Transactional Data Error

    When trying to run the Import Transactional Data package, the load will complete successfully (and all the data gets loaded), but when I go to view the log file I see the following error.
    Task name LOAD:
    BAdi not implemented for appset <APPSET>, dimension TIME, rule 00000001
    We recently transported this from our DEV system to our TEST system. There are no errors showing in the DEV system and everything transported correctly. Any ideas on why I may be seeing this error?

    Hi Howell,
    Any clue on how this error was resolved?
    We are facing this problem now in 7.5 SP13, in exactly the same scenario after transport.
    We would appreciate it if you could let us know.
    Thanks.
    Best Regards,
    Karthik AJ

  • How to exclude a selection in BW when importing transaction data from an InfoProvider

    Hi ,
    I am trying to import data from a BW InfoProvider into the BPC cube through Data Manager. It works OK if I select Year 2010-2014 and value type.
    But for G/L account I want to import 100000 to 100129, exclude 100130, and then include 100131 to 999999 again. How do I achieve this?
    I selected the accounts 100000 to 100130 on one line and 100131 to 999999 on a different line, but it did not work; it simply errors out with 0 records selected.
    Best regards,
    Sreedhar .

    Thank you, that was the issue. I used the same approach for INT_ORDER and I am still getting errors. With this conversion row:
    EXTERNAL               INTERNAL               FORMULA
    00000??????            ??????
    I get:
    000001003951 is invalid or is a calculated member
    If I instead use:
    EXTERNAL               INTERNAL               FORMULA
    *                                             js:parseInt(%external%)
    I am getting:
    1,NO_COST_CENTER,DS_CPX_ACTUAL,NO_EX_DEPT,NO_LOCATION,NO_PLAN_ID,NO_REQUEST_ID,USD,ACTUAL_TEST,ACT,160020,NaN,2012.DEC,1.00
    Line 1: Dimension: INT_ORDER member: NaN is invalid or is a calculated member
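    For what it's worth, parseInt returns NaN when the external value is empty or non-numeric, which is the likely source of the NaN member above. If the file can be staged before import, the leading zeros can be stripped defensively there instead; a minimal Python sketch (the helper name is hypothetical):

    def strip_leading_zeros(value: str) -> str:
        """Strip leading zeros from purely numeric IDs; leave anything else alone."""
        v = value.strip()
        return str(int(v)) if v.isdigit() else v

    # "000001003951" -> "1003951"; "" and "NO_ORDER" pass through unchanged,
    # instead of turning into NaN the way parseInt does.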

  • Transport Transaction Data Tool in BCS

    Hi All,
    I understand that in ECCS there is functionality to transport transaction data stored in ECMCA and ECMCT via t-code CX0TA (export and import transaction data), for example from a development system to a production system.
    In ECCS I used this function, since all testing data in the development system are real data; to avoid developing another tool to transport these data, we used the CX0TA function.
    Now, in BCS, I only see the transport tool for customising and master data; there is no transport tool for transaction data.
    My question is: is it true that BCS 4.0 no longer includes this function? Or did I miss some activation step, so that I cannot see the function in this version?
    Any advice will be highly appreciated.
    regards,
    Halim

    Hi Halim,
    AFAIK, there is no possibility to transport transaction data in 4.0:
    http://help.sap.com/saphelp_sem40bw/helpdata/en/bf/df183d30805c59e10000000a114084/frameset.htm
    And I think there is a reason for it. As you know, transaction data coming into BCS must comply with the format and consistency the system expects.
    Transport in BCS is tricky, and there is no guarantee that everything is transported correctly. Hence, the probability of a transaction data load failure is high.
    Best regards,
    Eugene

  • How do you import Comment data

    Hi folks
    Is there a way to import a bulk amount of comment data into its corresponding table (tech name: /1CPMB/*)?
    If so, could you please provide me any technical tips or related documents?
    Thanks in advance.
    -Tae

    No, I wasn't talking about copying comments from one version to another within the application.
    I meant importing comment data into its corresponding text table, the one whose technical name starts with /1CPMB/... - just like importing transaction data with the 'Import Transaction Data' package, which uploads the data from a CSV file.
    We have a situation where we need to import actual transaction data and comments at the same time, so that our end users can explore the transaction and comment data in a single custom-made BPC template.
    Thanks.
    Tae

  • Data loader : Import -- creating duplicate records ?

    Hi all,
    has anyone else encountered the behaviour with Oracle Data Loader that duplicate records are created (even with the option duplicatecheckoption=externalid)? When I check the "import request queue - view", the request parameters of the job look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    but Data Loader has created new records whose "External Unique ID" already exists.
    Very strange: when I create the same import manually (using the Import Wizard), it works correctly! There the duplicate checking method works and the record is updated.
    I know Data Loader has 2 methods, one for update and one for import; however, I would not expect the import to create duplicates if the record already exists - it should rather do nothing!
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the "Update" method works fine.
    thanks in advance, Juergen

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior: Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
    You should review all documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete

  • Duplicate Records error when processing transaction file....BPC 7.0

    Hi All,
    I have a situation. I am using BPC NW 7.0 and I have updated my dimension files. When I validate my transaction file, every single record validates successfully. But when I try to import the flat file into my application, I get a lot of duplicate record errors, and these are my questions:
    1. Can we get duplicate records in transaction files?
    2. Even if there are duplicates, since it is a cube, shouldn't it summarize them rather than raising an error and rejecting the records?
    3. Is there something I can do to accept duplicates? (I have checked the Replace option in the data package to overwrite similar records, but it only covers account, category and entity.)
    4. In my case I see identical values in all my dimensions and the $ value is the only difference. Why is it not summing up?
    Your quickest reply is much appreciated.
    Thanks,
    Alex.

    Hi,
    I have the same problem.
    In my case the file that I want to upload has different rows that differ only in the nature column. In the conversion file I map the different natures to one internal nature, e.g.:
    cost1 --> cost
    cost2 --> cost
    cost3 --> cost
    My hope was that in BPC the nature cost would end up with the value cost = cost1 + cost2 + cost3.
    The result is that only the first record is uploaded and all the other records are rejected as duplicates.
    Any suggestion?

  • Error While importing the transaction data

    All SAP BPC Gurus,
    I need your help in resolving this error. I encountered the error below while uploading (importing) the transaction data of a (non-reporting) application. Would you please help me resolve it? I don't know whether I'm doing something wrong or something is wrong with the setup.
    I used Data Manager in the BPC Excel client and ran the IMPORT transaction package.
    /CPMB/MODIFY completed in 0 seconds
    /CPMB/CONVERT completed in 0 seconds
    /CPMB/CLEAR completed in 0 seconds
    [Selection]
    FILE= DATAMANAGER\DATAFILES\IFP_BPCUSER121\rate data file.csv
    TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\SYSTEM FILES\IMPORT.XLS
    CLEARDATA= No
    RUNLOGIC= No
    CHECKLCK= No
    [Messages]
    Task name CONVERT:
    XML file (...ANAGER\TRANSFORMATIONFILES\SYSTEM FILES\IMPORT.TDM) is empty or is not found
    Cannot find document/directory
    Application: PLANNING Package status: ERROR

    Are you using the standard "Import" data package?
    Check the code in the Advanced tab of the Import data package, and check that your transformation file is in the correct format. The .TDM file named in the error is generated when the transformation file is validated and processed, so re-validating IMPORT.XLS should recreate it.
    The code in the Advanced tab should be as below:
    PROMPT(INFILES,,"Import file:",)
    PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    PROMPT(RADIOBUTTON,%CLEARDATA%,"Select the method for importing the data from the source file to the destination database",0,{"Merge data values (Imports all records, leaving all remaining records in the destination intact)","Replace && clear datavalues (Clears the data values for any existing records that mirror each entity/category/time combination defined in the source, then imports the source records)"},{"0","1"})
    PROMPT(RADIOBUTTON,%RUNLOGIC%,"Select whether to run default logic for stored values after importing",1,{"Yes","No"},{"1","0"})
    PROMPT(RADIOBUTTON,%CHECKLCK%,"Select whether to check work status settings when importing data.",1,{"Yes, check for work status settings before importing","No, do not check work status settings"},{"1","0"})
    INFO(%TEMPNO1%,%INCREASENO%)
    INFO(%ACTNO%,%INCREASENO%)
    TASK(/CPMB/CONVERT,OUTPUTNO,%TEMPNO1%)
    TASK(/CPMB/CONVERT,ACT_FILE_NO,%ACTNO%)
    TASK(/CPMB/CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    TASK(/CPMB/CONVERT,SUSER,%USER%)
    TASK(/CPMB/CONVERT,SAPPSET,%APPSET%)
    TASK(/CPMB/CONVERT,SAPP,%APP%)
    TASK(/CPMB/CONVERT,FILE,%FILE%)
    TASK(/CPMB/CONVERT,CLEARDATA,%CLEARDATA%)
    TASK(/CPMB/LOAD,INPUTNO,%TEMPNO1%)
    TASK(/CPMB/LOAD,ACT_FILE_NO,%ACTNO%)
    TASK(/CPMB/LOAD,RUNLOGIC,%RUNLOGIC%)
    TASK(/CPMB/LOAD,CHECKLCK,%CHECKLCK%)
    TASK(/CPMB/LOAD,CLEARDATA,%CLEARDATA%)

  • Data Loader inserting duplicate records

    Hi,
    There is an import that we need to run every day to load data from another system into CRM On Demand. I have set up a Data Loader script which is scheduled to run every morning and performs an insert operation.
    Every morning a file with new insert data is made available (generated by someone else) in the same location and under the same name. The Data Loader script must insert all the records in it.
    One morning there was a problem in the other job and a new file was not produced. When the Data Loader script ran, it found the old file and re-inserted its records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external ID, since the records come from another system, but I have since learned that this option only works for update operations.
    How can a situation like this be handled in future? The external ID should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to somehow mark the field as 'unique' in the UI, so that there is an error if a duplicate record is inserted? Please suggest.
    Regards,

    Hi
    You can use something like this:
    CURSOR crs IS SELECT DISTINCT deptno, dname, loc FROM dept;
    Now you can insert all the records present in this cursor.
    Assumption: you do not have duplicate entries in the dept table initially.
    Cheers
    Sudhir
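    Since Data Loader itself does not duplicate-check on insert, a more robust safeguard is to filter each daily file against the external IDs already loaded before the insert job runs. A minimal Python sketch, assuming the loaded IDs are tracked in a local log file (all file and column names are hypothetical):

    import csv
    import os

    DAILY_FILE = "daily_inserts.csv"
    LOADED_IDS_LOG = "loaded_external_ids.txt"
    FILTERED_FILE = "daily_inserts_filtered.csv"
    ID_COLUMN = "ExternalUniqueId"

    # External IDs sent in previous runs.
    seen = set()
    if os.path.exists(LOADED_IDS_LOG):
        with open(LOADED_IDS_LOG) as f:
            seen = {line.strip() for line in f}

    with open(DAILY_FILE, newline="") as fin, \
         open(FILTERED_FILE, "w", newline="") as fout, \
         open(LOADED_IDS_LOG, "a") as log:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row[ID_COLUMN] not in seen:   # drop anything already loaded
                writer.writerow(row)
                log.write(row[ID_COLUMN] + "\n")
                seen.add(row[ID_COLUMN])

    Pointing the Data Loader insert job at the filtered file means a stale input file simply yields an empty (or smaller) load instead of duplicates.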

  • Importing and Updating Non-Duplicate Records from 2 Tables

    I need some help with the code to import data from one table into another if it is not a duplicate, or if a record has changed.
    I have 2 tables, Members and NetNews. I want to check NetNews and import non-duplicate records from Members into NetNews, and update an email address in NetNews if it has changed in Members. I figured it could be as simple as checking Members.MemberNumber and Members.Email against the existence of NetNews.MemberNumber and NetNews.Email, and if a record does not exist in NetNews, creating it; and if the email address in Members.Email has changed, updating it in NetNews.Email.
    Here is what I have from all of the suggestions received in another category last year. It is not complete, but I am stuck on the solution. Can someone please help me get this code working?
    Thanks!
    <cfquery datasource="#application.dsrepl#"
    name="qryMember">
    SELECT distinct Email,FirstName,LastName,MemberNumber
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email
    <> ' '
    </cfquery>
    <cfquery datasource="#application.ds#"
    name="newsMember">
    SELECT distinct MemberNumber
    FROM NetNews
    </cfquery>
    <cfif
    not(listfindnocase(valuelist(newsMember.MemberNumber),qryMember.MemberNumber)
    AND isnumeric(qryMember.MemberNumber))>
    insert into NetNews (Email_address, First_Name, Last_Name,
    MemberNumber)
    values ('#trim(qryMember.Email)#',
    '#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
    trim(qryMember.MemberNumber)#')-
    </cfif>
    </cfloop>
    </cfquery>

    Dan,
    My DBA doesn't have the experience to help with a VIEW. Did I mention that these are 2 separate databases on different servers? This project is over a year old now and it really needs to get finished, so I thought the import would be the easiest way to go. Thanks to your help, it is almost working.
    I added some additional code to check for a changed email address and update the NetNews database. It runs without error, but I don't have a way to test it right now. Can you please look at the code and see if it looks OK?
    I am also still getting an error on line 10 after the routine runs, on the line that has this code: "and membernumber not in (<cfqueryparam list="yes" value="#valuelist(newsmember.membernumber)#" cfsqltype="cf_sql_integer">)", even with the cfif that Phil suggested.
    <cfquery datasource="#application.ds#"
    name="newsMember">
    SELECT distinct MemberNumber, Email_Address
    FROM NetNewsTest
    </cfquery>
    <cfquery datasource="#application.dsrepl#"
    name="qryMember">
    SELECT distinct Email,FirstName,LastName,MemberNumber
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email
    <> ' '
    and membernumber not in (<cfqueryparam list="yes"
    value="#valuelist(newsmember.membernumber)#"
    cfsqltype="cf_sql_integer">)
    </cfquery>
    <CFIF qryMember.recordcount NEQ 0>
    <cfloop query ="qryMember">
    <cfquery datasource="#application.ds#"
    name="newsMember">
    insert into NetNewsTest (Email_address, First_Name,
    Last_Name, MemberNumber)
    values ('#trim(qryMember.Email)#',
    '#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
    trim(qryMember.MemberNumber)#')
    </cfquery>
    </cfloop>
    </cfif>
    <cfquery datasource="#application.dsrepl#"
    name="qryEmail">
    SELECT distinct Email
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email
    <> ' '
    and qryMember.email NEQ newsMember.email
    </cfquery>
    <CFIF qryEmail.recordcount NEQ 0>
    <cfloop query ="qryEmail">
    <cfquery datasource="#application.ds#"
    name="newsMember">
    update NetNewsTest (Email_address)
    values ('#trim(qryMember.Email)#')
    where email_address = #qryEmail.email#
    </cfquery>
    </cfloop>
    </cfif>
    Thank you again for the help.
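    For reference, the intended sync logic (insert members missing from NetNews, update emails that changed) can be sketched database-neutrally. In this minimal Python sketch the connection details, table and column names, and the assumption that MemberNumber is the join key are illustrative only:

    import sqlite3  # stand-in: each connection could point at a different server

    src = sqlite3.connect("members.db")   # hypothetical source (Members)
    dst = sqlite3.connect("netnews.db")   # hypothetical target (NetNews)

    members = src.execute(
        "SELECT DISTINCT Email, FirstName, LastName, MemberNumber FROM members "
        "WHERE memberstanding <= 2 AND email IS NOT NULL AND email <> ' '"
    ).fetchall()

    # MemberNumber -> current email on the NetNews side.
    existing = dict(dst.execute(
        "SELECT MemberNumber, Email_address FROM NetNews"))

    for email, first, last, member_no in members:
        if member_no not in existing:
            # New member: insert the full record.
            dst.execute(
                "INSERT INTO NetNews (Email_address, First_Name, Last_Name, MemberNumber) "
                "VALUES (?, ?, ?, ?)",
                (email.strip(), first.strip(), last.strip(), member_no))
        elif existing[member_no] != email:
            # Known member whose email changed: update it.
            dst.execute(
                "UPDATE NetNews SET Email_address = ? WHERE MemberNumber = ?",
                (email.strip(), member_no))
    dst.commit()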

  • Duplicate Records in Transactional Load

    Dear All,
    I have an issue where data is loaded from one write-optimized DSO to another write-optimized DSO and the DTP fails because of duplicate records. It is a transactional load.
    I would be grateful if you could help me understand how to handle this situation and reload the DTP.
    I have tried searching the forum, but I can only find threads about master data loading, where a Handling Duplicate Records option can be selected.
    Thanks in Advance...
    Regards,
    Syed

    Hi Ravi,
    Thanks for your reply.
    If we uncheck the option, it would accept the duplicate records, right?
    In my scenario, data comes from one write-optimized DSO to another write-optimized DSO. In the first DSO data uniqueness is not checked, while in the second DSO uniqueness is checked, so it gives me the duplicate error message.
    I see around 28 records in the error stack, so please also let me know how I can process these error (duplicate) records.
    Many Thanks...
    Regards,
    Syed

  • Why do we have a duplicate record in the .wsr file of the mail.dat?

    Any idea how we could have created a duplicate record in the .wsr (Walk Sequence) file? We have post-presort software that errors out because of the dupe. The Mail.dat specification indicates that there are five 'key' fields (their combination must be unique) for the Walk Sequence file:

    Hi Michael
    Can you please tell me which field is being duplicated? And can you please try to run the job again and wait a couple of seconds before importing it into your post-presort software?
    Thanks
    Anita.

  • BI 7.0 - Duplicate Record Error while loading master data

    I am working on BI 7.0 and I am trying to load master data into an InfoObject.
    I created an InfoPackage and loaded the data into the PSA.
    I created a transformation and a DTP, and after I execute the DTP I get an error about duplicate records.
    I have read all the previous threads about duplicate record errors while loading master data, and most of them suggest checking the 'Ignore duplicate records' option in the InfoPackage. But in 7.0 the InfoPackage can only load to the PSA, and it doesn't have any option to ignore duplicate records.
    My data gets loaded into the PSA fine; I get the error while loading to the InfoObject using the DTP.
    I would appreciate your help to resolve this issue.
    Regards,
    Ram.

    Hi,
    Refer to:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    In 7.0 the equivalent setting should sit on the DTP itself ('Handling Duplicate Record Keys' on the Update tab) rather than on the InfoPackage.
    With rgds,
    Anil Kumar Sharma .P
