Duplicate Records, Import Transaction Data

Hi everybody,
I'm using BPC 7.5 NW, and I get a warning saying that there are duplicate records when I run the package "Load Transaction Data". The txt file that I'm using does not contain duplicate records. I have the following data in my flat file:
ACCOUNT  INTCO  AMOUNT
61012    I_65   10
61012    I_66   12
61012    I_67   13
I'm using a conversion file for INTCO as follows:
EXTERNAL  INTERNAL
I_65      I_99
I_66      I_99
I_67      I_99
When I run the package, it says that there are duplicate records; the reported records are:
ACCOUNT  INTCO  AMOUNT
61012    I_99   10
61012    I_99   12
My question is: is it not possible to use this package together with conversion files? If I use the APPEND package it works fine, but why doesn't it work with Import Transaction Data? As far as I remember, this is possible in the MS version.
Thanks in advance.
Regards

Hi,
Originally, you had the following records:
ACCOUNT INTCO AMOUNT
61012 I_65 10
61012 I_66 12
61012 I_67 13
However, after the conversion file is applied, the records become:
ACCOUNT INTCO AMOUNT
61012 I_99 10
61012 I_99 12
61012 I_99 13
So after conversion there are three records that are duplicates of one another.
The import package will not accept the second and third records, because they duplicate the key of the first. The append package, however, adds the second and third records onto the first one.
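In other words, the append package effectively aggregates rows that end up sharing the same key after conversion. As a rough illustration of that aggregation, here is a minimal ABAP sketch (the structure, field names, and lengths are assumptions for the example, not the actual BPC internals):

* The three file rows as they look after the conversion file has
* mapped I_65/I_66/I_67 to I_99.
TYPES: BEGIN OF ty_rec,
         account TYPE c LENGTH 10,
         intco   TYPE c LENGTH 10,
         amount  TYPE p LENGTH 8 DECIMALS 2,
       END OF ty_rec.

DATA: lt_in  TYPE STANDARD TABLE OF ty_rec,
      lt_out TYPE STANDARD TABLE OF ty_rec,
      ls_rec TYPE ty_rec.

ls_rec-account = '61012'. ls_rec-intco = 'I_99'.
ls_rec-amount = 10. APPEND ls_rec TO lt_in.
ls_rec-amount = 12. APPEND ls_rec TO lt_in.
ls_rec-amount = 13. APPEND ls_rec TO lt_in.

* COLLECT uses the character fields as the key and sums the numeric
* field, so lt_out ends up with one row: 61012 / I_99 / 35.
LOOP AT lt_in INTO ls_rec.
  COLLECT ls_rec INTO lt_out.
ENDLOOP.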
Hope you got the idea.

Similar Messages

  • Duplicate records In Master Data

    Hi,
    I don't understand why we get duplicate records in master data even though it has overwrite functionality.
    Any ideas will be appreciated.

    Hi,
    Solution: if the load to master data fails due to duplicate records:
    Go to the Monitor screen --> Details tab --> under Processing, find the duplicate record --> from the context menu of the error record, select 'Manual update'.
    After the above step is done, trigger the attribute change run for that InfoObject.
    This should solve your problem.
    If there is any problem in reporting, select the data using the filter option on the master data.
    Regards,
    Vijay.

  • Duplicate records in a master data InfoObject - how to delete them? Please help

    Hi all,
    How do I delete duplicate records in a master data InfoObject that has no requests, because it is updated via direct update?

    Hi,
    Right-click on the InfoObject and select Maintain; this opens the master data table. From there, select the record and delete it.
    Hope this solves your query.
    Reward points if useful.
    regards,
    ANJI

  • Import Transaction Data from BW Cube to BPC Cube

    Hi,
    Is there any document that explains how I can import/copy transactional data from a BW cube to a BPC cube, as in the picture below? http://img841.imageshack.us/img841/6998/prof.jpg
    Thanks in advance.

    Hi Again,
    With these documents I can import transactional data with a single key figure, but when there is more than one key figure, importing and assigning key figures to dimensions gets confusing.
    For example, I have two key figures, "zbgstsad" and "zbgstsf", in the BW cube, and two account members in "S_ACCT": "S_F" and "S_A". I created a conversion file with the rows "zbgstsad(ext)->s_a(int)" and "zbgstsf(ext)->s_f(int)", but I am not sure what I have to put instead of "?".
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=YES
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT=
    ROUNDAMOUNT=
    CONVERTAMOUNTWDIM=S_ACCT
    *MAPPING
    RPTCURRENCY=0CURRENCY
    S_ENTITY=ZBGENTITY
    S_ACCT= ?
    S_TIME=0CALMONTH
    S_CATEGORY=*NEWCOL(ACTUAL)
    S_OB=0UNIT
    S_URUNLER=ZBGURNLR
    AMOUNT= ?
    *CONVERSION
    S_ACCT=[COMPANY]B_ACCT.XLS!CONVERSION
    TIME=[COMPANY]B_TIME.XLS!CONVERSION
    What do you recommend?
    Thanks in advance.
    Burak

  • Deleting duplicate records from different data packets in BI data source.

    Hi,
    I am getting the same (duplicate) records from different data packages in a BI DataSource after the extraction completes.
    I tried to store the key fields of the first data package in an internal table, but the internal table does not retain that data when the second data package is extracted.
    Is there any other way to remove duplicate records after the extraction has completed?
    Thanks in advance.

    I have not worked extensively with BI routines, but I reckon there is a routine that gets executed before the data mapping part: a start routine, in which you can check the incoming data before it is passed from the DataSource to the cube.
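    A minimal sketch of what the body of such a start routine could look like in a BW 7.x DTP transformation (the key fields DOC_NUMBER and FISCPER are placeholders; note this only removes duplicates within one data package, so duplicates across packages still need a buffer in the routine's global section or a lookup against the target):
    * Start routine body (sketch): drop rows that repeat the semantic
    * key within the current data package. SOURCE_PACKAGE is the
    * standard parameter of a transformation start routine; replace the
    * placeholder key fields with the real ones.
    SORT SOURCE_PACKAGE BY doc_number fiscper.
    DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE
           COMPARING doc_number fiscper.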
    Hope this helps,
    Regards,
    Murthy.

  • Duplicate records during master data loading.

    Hello guys,
    I am reading a blog where the blogger wrote about 'Various issues in a BW production project'. I came across one issue which I could not understand:
    data loading failed due to duplicate records during master data loading.
    Why does this error occur? How can we rectify it in a production environment?
    Thanks and Regards,
    S

    Hi SChandx200,
    May I ask where you got 'Various issues in a BW production project'?
    Many Thanks,

  • BPC 7.5: Import Transactional Data Error

    When trying to run the Import Transactional Data package, the load will complete successfully (and all the data gets loaded), but when I go to view the log file I see the following error.
    Task name LOAD:
    BAdi not implemented for appset <APPSET>, dimension TIME, rule 00000001
    We recently transported this from our DEV system to our TEST system. There are no errors showing in the DEV system and everything transported correctly. Any ideas on why I may be seeing this error?

    Hi Howell,
    Any clue on how this error was resolved?
    We are facing this problem now in 7.5 SP13, in exactly the same scenario after a transport.
    I would appreciate it if you could let us know.
    Thanks.
    Best Regards,
    Karthik AJ

  • Import Transaction Data - Duplicate records

    Hi,
    I need to upload a file of transaction data into BPC using data manager package. I've done the transformation and conversion files which validate successfully on a small data set. When I try to upload the data file using the real life file, it fails due to duplicate records. This happens because multiple external ID's map to one internal ID. Therefore, whilst there are no duplicates in the actual file produced by the client, the resulting data produced after conversion does contain duplicates and will therefore not upload.
    Apart from asking the client to perform the aggregation before sending me the file, is there any way to get BPC to allow the duplicates and simply sum up?
    Regards
    Sue

    Hi,
    Try adding the delivered package /CPMP/APPEND and run it. This should solve your problem.
    Thanks,
    Sreeni

  • Duplicate Records in Transactional Load

    Dear All,
    I have an issue where data is being loaded from one write-optimized DSO to another write-optimized DSO, and the DTP fails because of duplicate records. It is a transactional load.
    I would be grateful if you could help me understand how to handle this situation and reload the DTP.
    I have searched the forum, but I can only find threads about master data loading, where 'Handle Duplicate Records' can be selected.
    Thanks in Advance...
    Regards,
    Syed

    Hi Ravi,
    Thanks for your reply.
    If we uncheck the option, it would accept the duplicate records, right?
    In my scenario, data comes from one write-optimized DSO to another write-optimized DSO. In the first DSO data uniqueness is not checked, while in the second DSO uniqueness is checked, so it gives me the duplicate error message.
    I saw around 28 records in the error stack, so please also let me know how I can process these error (duplicate) records.
    Many Thanks...
    Regards,
    Syed

  • How to delete a master record with transaction data?

    Hi,
    I am able to delete master records (G/L, vendor, customer) through OBR2 when there is no transaction data in those master records. Despite clearing all open items in a particular vendor account, I am unable to delete that master record. Please suggest how we can delete vendor master data that has transaction data.
    Thanks in advance.
    Regards,
    Satish

    Hi...
    Not sure whether this helps: you can flag the vendor record for deletion and later try to delete it.
    Logistics >> Materials management >> Purchasing >>
    (new menu) Master data >> Vendor >> Central >> Flag for deletion
    Try using XK06/FK06.
    Assign points if useful
    Regards
    Aravind

  • Duplicate records in exported data

    I'm trying to export the inventory data with a wildcard (%) filter on the
    Workstation Name.
    If I run the same filter in a query from ConsoleOne, I don't see any
    duplicate records.
    If I run the data export, the exported data for some workstations has a duplicate DN. Just the DN is duplicated; all the other fields are either empty or hold some default value.
    I have also run the manual duplicate removal process and have tried deleting the records altogether using the InventoryRemoval service.
    Any other ideas?

    Dlee,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Do a search of our knowledgebase at http://support.novell.com/search/kb_index.jsp
    - Check all of the other support tools and options available at
    http://support.novell.com.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://support.novell.com/forums)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://support.novell.com/forums/faq_general.html
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://support.novell.com/forums/

  • Ignore duplicate records for master data attributes

    Dear experts,
    How and where can I enable "ignore duplicate records" when running my DTP to load data to master data attributes?

    Hi Raj
    Suppose you are loading master data to an InfoObject, and in the PSA you have more than one record for a key.
    Let's assume you are loading some attributes of document number (0DOC_NUMBER) and the PSA holds multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key that is different for every record. But it is a problem for the InfoObject attribute table, as more than one record for the same primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
    This issue can easily be avoided by selecting "Handle Duplicate Records" in the DTP. You will find this option under the "Update" tab of the DTP.
    Regards
    Anindya

  • Duplicate records in generic data source

    Hello,
    We have created a generic DataSource using a database view joining the two tables MARA and MBEW.
    When we run the view on our DEV server, we get perfectly fine data. When we run the same view in QA, we get duplicate records.
    Does it have anything to do with the client? In QA we have two clients with the same data.
    MARA MANDT = MBEW MANDT
    MARA MATNR = MBEW MATNR
    These are the join conditions I specified.
    I hope I could explain my issue properly. Please help!
    Abhishek

    Please check for the possibility of multiple records for a given material in MBEW, as the same material can exist in multiple valuation areas.
    Moreover, you will be executing the extraction in one client, so it is very unlikely that you are seeing data from the other client.
    In DEV we normally do not have good data to test with, so it only seems as though the design is correct there.
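    One quick way to confirm this is to count the MBEW valuation segments per material. A sketch in newer Open SQL syntax (illustrative only):
    * Materials with more than one MBEW row (one row per valuation
    * area/valuation type) are returned more than once by a join on
    * MATNR alone, because BWKEY and BWTAR are part of MBEW's key.
    SELECT matnr, COUNT(*) AS seg_count
      FROM mbew
      GROUP BY matnr
      HAVING COUNT(*) > 1
      INTO TABLE @DATA(lt_multi).
    If this returns rows, exposing BWKEY (and BWTAR, if valuation types are used) in the view, or restricting the view to a single valuation area, removes the apparent duplication.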

  • Duplicate records in BW Data Loads

    In my project I am facing duplicate records in data loads when I compare the PSA and the DSO. How can I check which records are duplicates, and is there any mechanism for doing so, for example via an Excel sheet? Please help me out. Thanks in advance for your quick response.

    Hi,
    Getting duplicate records in the PSA is fine, because no keys are set in the PSA and all records come in directly from the source.
    With a standard DSO, records are always overwritten, so you would not get any duplicates there.
    If you are getting duplicate records in the PSA and need to find them:
    Go to the PSA -> Manage -> PSA maintenance -> change the number of records from 1000 to the actual number of records loaded -> in the menu, go to List -> Save -> File -> change the path from the SAP directory to some other path and save the file.
    Open the file, put the columns that form the DSO key next to each other, and sort ascending; the duplicate records will then sit next to each other.
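    The same sort-and-compare idea can also be scripted. A small ABAP sketch (the row type and key fields are placeholders for the actual DSO key columns):
    TYPES: BEGIN OF ty_row,
             keyfld1 TYPE c LENGTH 10,  " placeholder: 1st DSO key field
             keyfld2 TYPE c LENGTH 10,  " placeholder: 2nd DSO key field
           END OF ty_row.
    DATA lt_rows TYPE STANDARD TABLE OF ty_row. " fill from the PSA export

    * After sorting by the key fields, duplicate keys sit next to each
    * other; comparing the row count before and after the delete shows
    * how many duplicates there were.
    SORT lt_rows BY keyfld1 keyfld2.
    DATA(lv_before) = lines( lt_rows ).
    DELETE ADJACENT DUPLICATES FROM lt_rows COMPARING keyfld1 keyfld2.
    DATA(lv_dups) = lv_before - lines( lt_rows ).
    WRITE: / lv_dups, 'duplicate key combinations found'.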

  • Master data tables with unwanted records from transaction data upload

    Hi Friends,
    I have a master data table for InfoObject 'C' with compounding characteristics 'A' and 'B'. I upload this master data with the values given below:
    A    B     C      Short text    Long text
    P    10    BBB    Apple         Big Apples
    Q    20    XYZ    Tomatoes      Red Tomatoes
    When I load data into the ODS from a source system, I may not necessarily have data for all three of these fields in the transaction records. Example:
    A    B     C      D     E
    P    -1    FFF    20    30
    Q    10    GGG    10    40
    The problem is that when I upload the above transaction data, it also populates the master data table with the two new records P / -1 / FFF and Q / 10 / GGG, which I would like to avoid.
    Is there any way?
    Will assign full points to anyone who helps me here.
    Thanks,
    JB

    Hi JB,
    If you want to load transaction data and still prevent the population of the master data table, I don't think that is possible, as it would go against data consistency in the warehouse.
    However, if you can afford not to load transaction data in such cases, you can activate the referential integrity check for InfoObject C. Then neither transaction data nor master data enters the data warehouse until you maintain the master data for InfoObject C yourself.
    Hope this helps.
