Enabling duplicate records in the InfoPackage?

Hi,
Is there any provision to allow duplicate records in the DataSource?
I am uploading from a flat file, and the flat file has many duplicate records.
This is for testing purposes only.
Is it possible to keep the duplicate records in BW?
Thanks,
Ravi

Hi Ravi,
If you load to an ODS, there is an option "Unique Data Records" in the ODS settings. If you check this option, the ODS allows only unique records; leave it unchecked and the duplicates are accepted. From the ODS to the cube, make the ODS a data mart and then load to the cube.
In the cube there is no such option.
Cheers,
Kumar.

Similar Messages

  • Enable Duplicate Record Functionality

    Hello All,
    I have just completed a custom 6i form based on TEMPLATE.fmb.
    One of my requirements is to be able to copy an entire record and duplicate it with the standard functionality at Edit > Duplicate > Row.
    Is there any way to enable this functionality?
    I enabled the menu item with app_special.enable. This enables the menu item, but it won't duplicate a record.
    Thanks for any help,
    Bradley

    Review the Developer's Guide, in the topic on controlling records in a window. It explains why this functionality is disabled by default and the cautions to take when implementing it, and gives an example of its use.

  • 36 duplicate record found -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    ( green light ) Update PSA (50000 records posted): no errors
    ( green light ) Transfer rules (50000 -> 50000 records): no errors
    ( green light ) Update rules (50000 -> 50000 records): no errors
    ( green light ) Update (0 new / 50000 changed): no errors
    ( red light ) Processing end: errors occurred
         Processing 2 finished
         - 36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
         - 36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
    This error repeats with all the data packets.
    What could be the reason for the error, and how can it be corrected?
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the InfoObject 0PM_ORDER; the DataSource is 0PM_ORDER_ATTR.
    The workaround we have been using is a manual push from the Details tab of the monitor. Oddly, we don't have this issue in our test and dev systems, and even in production it doesn't occur some days. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue? If so, please let me know, and I will do likewise.
    -Venkat

  • Duplicate records error?

    Hello all,
    While extracting master data I am getting a duplicate records error. How do I rectify this?
    On the InfoPackage screen, in the Processing tab, will I get the option "Ignore double data records"? When is this option enabled?
    Regards

    Hello
    This option is available only for master data, not for transactional data. You can control duplicate records for transactional data in an ODS; there is an option in the ODS settings.
    From the F1 help:
    Flag: Handling of duplicate data records
    From BW 3.0 you can determine, for DataSources for master data attributes and texts, whether the extractor transfers more than one data record in a request for a value belonging to time-independent master data.
    Independently of the extractor settings (the extractor may deliver duplicate data records), you can use this indicator to tell BW whether or not you want it to handle any duplicate records.
    This is useful if the setting telling the extractor how to handle duplicate records is not active, but the system is told by another party that duplicate records are being transferred (for example, when data is loaded from flat files).
    Sankar

  • Duplicate record error

    Hi,
    I am using an ODS as the source to update a master data InfoObject with flexible update. The issue is that in spite of using the option "Only PSA (update subsequently in data targets)" in the InfoPackage, with error handling enabled, I am getting a duplicate record error. This happens only when I update through a process chain; if I run it manually, the error doesn't come. Please let me know the reason.
    Thanks

    Hi Maneesh,
    Since we are loading from an ODS to an InfoObject, we don't get the option "Do not update duplicate records if they exist".
    First, did you check whether any duplicate records exist in the PSA? If so, delete them from the PSA.
    Another option is to enable error handling in the InfoPackage.
    Or check for inconsistencies of the InfoObject in RSRV and, if any are found, repair them and load again. Check for inconsistencies in the P, X, and Y tables and for the complete object as well.
    Assign points if helpful.

  • Duplicate records in PSA and Error in Delta DTP

    Hi Experts,
    I have enabled delta on the CALDAY field for a generic DataSource in the source system.
    1) Initially I created one record in the source system and ran the delta InfoPackage, so the PSA contains one record.
    2) I ran the delta DTP; that one record moved to the DSO.
    3) I created one more record in the source system and ran the delta InfoPackage again. Now I have 2 records in the PSA: the one from step 1 and the new one.
    4) When I try to run the delta DTP, I get an error saying that duplicate records exist.
    Would you please tell me how to rectify this issue?
    My requirement is that I need to run the load multiple times a day. (Note: the DataSource is delta enabled.)
    Thanks and Regards,
    K.Krishna Chaitanya.

    You can use this feature only when loading master data, as it ignores lines: if you had, for example, a changed record with a before and an after image, one of the two lines would be ignored.
    You can keep searching and asking, but if you want to load a delta more than once a day, you can't use a calendar day as the generic delta field; it will not work.
    The best approach is to have a timestamp. If the DataSource has a date and a time, you can build the timestamp yourself; in that case you need a function module. Search the forum for 'generic delta function module' and you will certainly find code you can reuse (a sketch of the conversion follows below).
    M.
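
    For illustration, here is a minimal ABAP sketch of the date-plus-time to timestamp conversion such a function module would perform. The structure and field names (ty_row, erdat, erzet, ztstmp) are invented for this example, not taken from the thread:

    TYPES: BEGIN OF ty_row,           " hypothetical extract structure
             erdat  TYPE d,           " creation date from the source table
             erzet  TYPE t,           " creation time from the source table
             ztstmp TYPE timestamp,   " generic delta field to be filled
           END OF ty_row.
    DATA ls_row TYPE ty_row.

    " inside the extractor's row loop: combine date and time into a timestamp
    CONVERT DATE ls_row-erdat TIME ls_row-erzet
            INTO TIME STAMP ls_row-ztstmp TIME ZONE sy-zonlo.

    The timestamp field can then be declared as the generic delta field of the DataSource, which makes several delta loads per day possible (with a suitable safety interval).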

  • BI 7.0 - Duplicate Record Error while loading master data

    I am working on BI 7.0 and trying to load master data to an InfoObject.
    I created an InfoPackage and loaded into the PSA.
    I created a transformation and a DTP, and after I execute the DTP I get an error about duplicate records.
    I have read all the previous threads about duplicate record errors while loading master data, and most of them suggest checking the 'Ignore duplicate records' option in the InfoPackage. But in 7.0 I can only load to the PSA with an InfoPackage, and it doesn't have any option for me to ignore duplicate records.
    My data is loaded to the PSA fine, and I get this error while loading to the InfoObject using the DTP.
    I would appreciate your help to resolve this issue.
    Regards,
    Ram.

    Hi,
    Refer:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    With rgds,
    Anil Kumar Sharma .P

  • Duplicate records found while loading master data (very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found; 1 record used in table /BI0/PTCTQUERY, and the same record occurred in the /BI0/PTCTQUERY tables.
    Can anyone give me the solution? It's very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the checkbox on the Processing tab page: tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored (see the sketch after this reply).
    The help says:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA and then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records, or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope this clears your doubt; otherwise let me know.
    Regards
    Kiran
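
    As a small illustration of this last-record-wins behaviour, here is a hedged ABAP sketch on a plain internal table; the structure and field names are invented for the example:

    TYPES: BEGIN OF ty_rec,
             matnr TYPE c LENGTH 18,   " master data key
             recno TYPE i,             " position of the record in the request
             attr  TYPE c LENGTH 20,   " some attribute
           END OF ty_rec.
    DATA lt_data TYPE STANDARD TABLE OF ty_rec.

    " per key, bring the record that was loaded last to the front ...
    SORT lt_data BY matnr ASCENDING recno DESCENDING.
    " ... and keep only that one; earlier records with the same key are ignored
    DELETE ADJACENT DUPLICATES FROM lt_data COMPARING matnr.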

  • Ignore duplicate records for master data attributes

    Dear experts,
    How and where can I enable "ignore duplicate records" when running my DTP to load data to master data attributes?

    Hi Raj
    Suppose you are loading master data to an InfoObject and in the PSA you have more than one record per key.
    Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and in the PSA you have multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key that is different for every record. But it is a problem for the InfoObject attribute table, because more than one record for the primary key (here 0DOC_NUMBER) violates the primary key constraint in the database; the sketch after this reply makes the two key models concrete.
    This issue is easily avoided by selecting "Handle Duplicate Records" in the DTP. You will find this option under the "Update" tab of the DTP.
    Regards
    Anindya
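
    To make the difference between the two key models concrete, here is a hedged ABAP sketch; the structures are simplified stand-ins, not the real PSA or attribute table definitions:

    " PSA-style technical key: request / data package / record number,
    " so the same document number can occur any number of times
    TYPES: BEGIN OF ty_psa_row,
             request    TYPE c LENGTH 30,
             datapakid  TYPE n LENGTH 6,
             record     TYPE n LENGTH 9,
             doc_number TYPE n LENGTH 10,   " not part of the key
             attr1      TYPE c LENGTH 20,
           END OF ty_psa_row.

    " InfoObject attribute table: semantic key, one row per document number
    TYPES: BEGIN OF ty_p_row,
             doc_number TYPE n LENGTH 10,   " primary key
             attr1      TYPE c LENGTH 20,
           END OF ty_p_row.

    DATA: lt_psa TYPE STANDARD TABLE OF ty_psa_row,
          lt_p   TYPE SORTED TABLE OF ty_p_row WITH UNIQUE KEY doc_number,
          ls_p   TYPE ty_p_row.

    ls_p-doc_number = '0000000001'.
    INSERT ls_p INTO TABLE lt_p.   " first row: sy-subrc = 0
    INSERT ls_p INTO TABLE lt_p.   " same key again: sy-subrc = 4, row rejected,
                                   " just like the primary key constraint in the database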

  • SQL*Loader control file parameters to avoid loading duplicate records

    Hi All,
    I am looking for an option in the control file that restricts SQL*Loader from loading duplicate records. I know that if we apply a constraint on the table itself, it won't allow duplicate data to load and will write the rejected rows out, but I need to do it in the control file so that SQL*Loader itself avoids loading duplicates. Can you please suggest which option in the control file enables this?
    Can you please also explain the exact difference between the bad file and the reject file? In what scenarios does a record go to the reject file versus the bad file?
    Regards

    Hey
    I don't think there is any option to avoid loading duplicate records via a parameter in the control file.
    On the difference between the bad and reject files, try this link:
    http://www.exforsys.com/content/view/1587/240/
    Regards,
    Sushant

  • SQL*Loader loads duplicate records even when there is a PK defined

    Hi,
    I have created a table with a PK on one of the columns and loaded the table using SQL*Loader. The flat file has duplicate records. The load completed without any error, but now the index has become unusable.
    The requirement is to fail the process if there are any duplicates. Please help me understand why this is happening.
    Below is the ctl file:
    OPTIONS(DIRECT=TRUE, ERRORS=0)
    UNRECOVERABLE
    load data
    infile 'test.txt'
    into table abcdedfg
    replace
    fields terminated by ',' optionally enclosed by '"'
    (
      col1,
      col2
    )
    I defined the PK on col1.

    Check out:
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_modes.htm#sthref1457
    It states:
    During a direct path load, some integrity constraints are automatically disabled. Others are not. For a description of the constraints, see the information about maintaining data integrity in the Oracle Database Application Developer's Guide - Fundamentals.
    Enabled Constraints
    The constraints that remain in force are:
    NOT NULL
    UNIQUE
    PRIMARY KEY (unique constraints on not-null columns)
    Since the OP has the primary key in place before starting the direct path load, this looks contradictory to what the documentation says, or is it a bug? In fact the same chapter resolves it: during a direct path load, UNIQUE and PRIMARY KEY constraints are verified only when the index is rebuilt at the end of the load, so the duplicate rows are loaded anyway and the index is left in an unusable state, which is exactly what the OP observed.

  • Duplicate records-problems

    Hi gurus
    We created a text datasource in R/3 and replicated it into BW 7.0
    An InfoPackage (loading to the PSA) and a data transfer process were created and included in a process chain.
    The job failed because of duplicate records.
    We now discovered that the setting "Delivery of Duplicate Records" for this DataSource in BW is set to "Undefined".
    When the DataSource was created in R/3, there were no settings for the "Delivery of duplicate records".
    In BW, I've tried to change the setting "Delivery of Duplicate Data Records" to NONE, but when I go into change mode, the "Delivery of duplicate" setting is not changeable.
    Does anyone have any suggestion on how to solve this problem?
    Thanks,
    @nne Therese

    Hi Muraly,
    I do have the same issue. I am loading texts from R/3 to the PSA using an InfoPackage with full update. From the PSA I am using a DTP with delta, with the option "Valid records update, no reporting (request red)".
    It was running fine for the last few weeks: the transferred and added records matched the PSA request every day.
    Suddenly the load to the InfoObject failed. I deleted the request from the InfoObject and reloaded using the DTP; it failed again. I tried loading with full update (as it is texts); it failed again. When I analysed the error, it said duplicate records. So I changed the DTP by checking the option Handle Duplicate Records and loaded with full update. It worked fine: more than 50000 records were transferred, and the added records matched the exact number of the PSA request.
    I reset the DTP back to delta and loaded today, but the transferred records are 14000 and the added records (3000) are the same as the PSA request. If you look at the history of loads, the numbers of transferred and added records in the InfoObject and the number of records in the PSA request used to match every day.
    Why is there a difference now? In production I have no issues. Since I changed the DTP, will transporting it to production make any difference? This is my first time working with BI 7.0.
    Please advise, and correct me if I am wrong.
    Thanks,
    Sudha..

  • Duplicate records generated for InfoCube

    hi all,
    When I load data from the DataSource to the InfoCube I get duplicate records. The data is loaded into the DataSource from a flat file; the first time I executed the load it worked, but after I changed the flat file structure I don't get the modified content in the InfoCube, it shows the duplicates instead. The DataSource 'preview data' option shows the required data (i.e. the modified flat file). I made all the necessary changes in the DataSource, InfoCube, InfoPackage, and DTP, but I still get the duplicates, even after deleting the data in the InfoCube. What is the ideal solution for this problem? One way is to create a new DataSource with the modified flat file, but I don't think that is ideal. What is a possible solution without creating the DataSource again?

    Finally I got it. I deleted the request IDs in the InfoPackage (right-click the InfoPackage and go to Manage), then executed the transformation and the DTP, and finally got the required output without duplicates.

  • Duplicate records in delta load

    Hi all,
    I am extracting payroll data with DataSource 0hr_py_1 for the 0py_c02 cube.
    I ran a full load with selection criteria 01.2007 to 02.2007 in the InfoPackage and extracted 20,000 records. Then I ran an init of delta without data transfer, which extracted 0 records, as expected.
    Then I ran a delta with selection criteria 02.2007 to 01.2010 in the InfoPackage and extracted 4,500 records, where the February records were extracted again.
    What could be the reason for duplicate records occurring in the delta load?
    I have seen the same records in the full load with selection criteria 01.2007 to 02.2007 as well as with selection criteria 02.2007 to 01.2010. How is that possible?
    Actually, the DataSource 0hr_py_1 does not support delta. Apart from this, what other reasons are there for duplicate records occurring? Please help.
    Will assign points.

    Your selection criteria were 01.2007 to 02.2007 for the full load and 02.2007 to 01.2010 for the delta.
    Both of your selections include the month 02.2007, so the records extracted again most likely all fall under 02.2007.
    Have you checked that?
    Regards,
    Naveen Natarajan

  • Duplicate records in Infoobject

    Hi gurus,
    While loading into an InfoObject, it throws an error: 150 duplicate records are found in the Details tab. My doubt is about the temporary correction, i.e. in the InfoPackage we can either increase the error handling number and load, or give the third option (valid records update) and update to the InfoObject.
    My question: when the error occurs, I can see that some records show up under "added". After the error occurred, I forced the request to red (delta), deleted it, gave the third option (valid records update), and updated to the InfoObject. The load was successful and all the records were transferred, but the number of added records is 0. I also read in an article that the error records are stored as a separate request in the PSA, but I cannot find that. I want to know when or how the new records will be added.

    Hi Jitu,
    If it is only an InfoObject, then under the Processing tab select "Only PSA" and check the first option. After this, rerun the InfoPackage. Go to the monitor and see whether you got the same number of records, including duplicates. Then follow the steps below:
    1. Go to transaction RSRV, expand Master Data, and double-click the second option.
    2. Click the option on the right side of the panel. A pop-up window appears; give the name of the InfoObject that failed.
    3. Select that particular InfoObject and click the Delete button; if necessary, save it and click Execute (do this only if necessary). Usually deleting resolves the problem. After clicking the Delete button, go back to the monitor screen and process the data packet manually by clicking the wheel button.
    4. Select the request and click the read-manually button (wheel button).
    Don't forget to save the InfoPackage back by selecting only the InfoObject.
    Hope this works. Let me know if you have doubts.
    Assign points if you are able to find the solution.
    Regards,
    Manjula
