Duplicate records in PSA and Error in Delta DTP

Hi Experts,
I have enabled delta on the CALDAY field for a generic DataSource in the source system.
1) Initially I created one record in the source system and ran the delta InfoPackage, so the PSA contains one record.
2) I ran the delta DTP. Now that one record has moved to the DSO.
3) I created one more record in the source system and ran the delta InfoPackage again. Now the PSA contains 2 records: the record created in step 1 (extracted again) and the new record created in this step.
4) When I try to run the delta DTP, I get an error saying that duplicate records exist.
So would you please tell me how to rectify this issue?
My requirement is that I need to run the load multiple times a day. (Note: the DataSource is delta enabled.)
Thanks and Regards,
K.Krishna Chaitanya.

You can use this feature only when loading master data, as it simply ignores lines. If you had, for example, a changed record with a before and an after image, one of the two lines would be ignored.
You can keep searching and asking, but if you want to load a delta more than once a day, you cannot use a calendar day as the generic delta field. This will not work.
The best option is to have a timestamp.
If the DataSource has a date and a time, you can create a timestamp yourself.
In that case you need to use a function module.
Search the forum for the terms 'generic delta function module' and you will certainly find code that you can reuse.
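For illustration, a hedged ABAP sketch of the date/time-to-timestamp conversion such a function module would perform (the field names ERDAT/ERZET and the UTC time zone are my assumptions, not from the original post):

* Build a delta-relevant timestamp from separate date and time fields,
* as you would inside a generic extractor function module.
DATA: lv_date      TYPE d,          " creation date, e.g. ERDAT (assumed)
      lv_time      TYPE t,          " creation time, e.g. ERZET (assumed)
      lv_timestamp TYPE timestamp.  " value for the generic delta field

lv_date = sy-datum.
lv_time = sy-uzeit.

* Combine date and time into one timestamp; using UTC avoids
* daylight-saving jumps in the delta pointer.
CONVERT DATE lv_date TIME lv_time
        INTO TIME STAMP lv_timestamp TIME ZONE 'UTC'.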
M.

Similar Messages

  • Duplicate records in PSA

    Hi all,
How do I identify and eliminate duplicate records in the PSA?

    Hi,
Here is the F1 help for the 'Handle Duplicate Record Keys' option on the Update tab:
    "Indicator: Handling Duplicate Data Records
    If this indicator is set, duplicate data records are handled during an update in the order in which they occur in a data package.
For time-independent attributes of a characteristic, the last data record with the corresponding key within a data package defines the valid attribute value for the update for a given data record key.
    For time-dependent attributes, the validity ranges of the data record values are calculated according to their order (see example).
    If during your data quality measures you want to make sure that the data packages delivered by the DTP are not modified by the master data update, you must not set this indicator!
    Use:
    Note that for time-dependent master data, the semantic key of the DTP may not contain the field of the data source containing the DATETO information. When you set this indicator, error handling must be activated for the DTP because correcting duplicate data records is an error correction. The error correction must be "Update valid records, no reporting" or "Update valid records, reporting possible".
    Example:
    Handling of time-dependent data records
    - Data record 1 is valid from 01.01.2006 to 31.12.2006
    - Data record 2 has the same key but is valid from 01.07.2006 to 31.12.2007
    - The system corrects the time interval for data record 1 to 01.01.2006 to 30.06.2006. As of 01.07.2006, the next data record in the data package (data record 2) is valid."
    By flagging this option in the DTP, you are allowing it to take the latest value.
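For illustration, a hedged ABAP sketch of this 'last record wins' behavior for time-independent attributes (the key field MATNR and the structure are my assumptions, not the actual DTP implementation):

" Within one data package, keep only the last record per key - this
" mirrors what the indicator does for time-independent attributes.
TYPES: BEGIN OF ty_rec,
         record TYPE i,            " record number within the package
         matnr  TYPE c LENGTH 18,  " assumed characteristic key
         attr   TYPE c LENGTH 30,  " some attribute value
       END OF ty_rec.
DATA lt_package TYPE STANDARD TABLE OF ty_rec.

" Sort so the last-delivered record per key comes first, then drop the
" rest; each key keeps its highest RECORD number.
SORT lt_package BY matnr ASCENDING record DESCENDING.
DELETE ADJACENT DUPLICATES FROM lt_package COMPARING matnr.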
    There is further information at this SAP Help Portal link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/content.htm
    Rgds,
    Colum

  • Record locking/blocking and error msgs

There are some cases where a user is blocked from updating a record which is being held by another session, and the user gets a
    "Could not reserve record (2 tries). Keep trying?"
    In other cases, the user just gets an hourglass. What causes the difference in the Forms behavior?
    ---=cf

The first case happens when a user from a different machine has either updated the same record and not yet saved, or has simply locked it, programmatically or manually (the LOCK_RECORD built-in).
The second case might be that Forms is waiting for a process to finish, or there is just too much network traffic.
When the hourglass (busy mouse cursor) is gone, does the process continue normally or does it throw an error?
    Tony

  • Duplicate records in BW Data Loads

In my project I am facing duplicate records in data loads when I compare the PSA and the DSO. How can I check which records are duplicates, and is there any mechanism (e.g. via an Excel sheet) to do so? Please help me out. Thanks in advance for your quick response.

    Hi ,
    Getting duplicate records in PSA is fine because there are no keys set in PSA and all the records come directly from the source .
    In case of a standard DSO, records are always overwritten you would not get any duplicates .
    In case you are getting duplicate records in PSA and need to find them,
    Go to PSA -> manage -> PSA maintainance->change the no of records from 1000 to the actual no of records that have come ->IIn the menu tab, go to list ->Save-> file -> change the path from SAP directory to some other path and save the file .
    Open the file ,take the columns forming the DSO keys together and sort ascending .you will find the duplicate records in PSA .
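Alternatively, if you look up the generated PSA table name in the PSA maintenance screen, you can find the duplicates directly in ABAP. A hedged sketch (the table name /BIC/B0001234000 and the key field names are assumptions):

TYPES: BEGIN OF ty_dup,
         keyfield1 TYPE c LENGTH 10,  " first DSO key field (assumed)
         keyfield2 TYPE c LENGTH 10,  " second DSO key field (assumed)
         cnt       TYPE i,            " number of occurrences
       END OF ty_dup.
DATA lt_dups TYPE STANDARD TABLE OF ty_dup.

" List every key combination that occurs more than once in the PSA
SELECT keyfield1 keyfield2 COUNT( * )
  FROM /bic/b0001234000
  INTO TABLE lt_dups
  GROUP BY keyfield1 keyfield2
  HAVING COUNT( * ) > 1.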

  • Data Load Fails due to duplicate records from the PSA

    Hi,
I have loaded the master data twice into the PSA. Then I created a DTP to load the data from the PSA to the InfoProvider. The data load is failing with the error "duplicate key/records found".
Is there any setting I can configure so that, even though I have duplicate records in the PSA, I can successfully load only one set of data (without duplicates) into the InfoProvider?
How can I set up the process chains to do so?
Your answers to the above two questions are appreciated.
    Thanks,

    Hi Sesh,
There are two places where the DTP checks for duplicates.
First, it checks previous error stacks. If the records you are loading are still contained in the error stack of a previous DTP run, it will throw the error at this stage. In that case you first have to clean up the previous error stack.
The second stage cleans up duplicates across data packages, provided the option is set in your DataSource. Note, however, that this will not solve the problem if you have duplicates within the same data package. In that case you can do the filtering yourself in the start routine of your transformation, as sketched below.
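A hedged sketch of such a start-routine filter (MATNR as the semantic key is my assumption; SOURCE_PACKAGE and its RECORD field are the standard source table of a BI 7.x transformation start routine):

" Drop duplicates within the package, keeping the last record per
" semantic key (replace MATNR with your actual key fields).
SORT source_package BY matnr ASCENDING record DESCENDING.
DELETE ADJACENT DUPLICATES FROM source_package COMPARING matnr.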
    Hope this helps,
    Pieter

  • Duplicate record identifier and update

My records look like:

Name  City  DuplicateIndicator
SAM   NYC   0
SAM   NYC1  0
SAM   ORD   0
TAM   NYC   0
TAM   NYC1  0
DAM   NYC   0

For some reason numeric characters were inserted into City, which duplicated my records.
I need to check for duplicate records by Name (if the Name repeats), then check the City: records having the same city (NYC and NYC1 count as the same city here) are considered duplicates. I am OK with doing this for one city at a time.
SAM has a duplicate record (NYC and NYC1), so the record SAM NYC1 0 must be updated to SAM NYC1 1.

Good day tatva,
Since the city names are not exactly the same, you will need to parse the text to clean the digits from the name. This is best done with SQLCLR using a regular expression (if this fits your need, I can post the CLR code for you).
In this case you use a simple regular-expression replace function.
On the result of that function you use a simple query with ROW_NUMBER() OVER (PARTITION BY RegularExpressionReplace(ColumnName, '[0-9]') ORDER BY ColumnName).
In that result, every row with ROW_NUMBER greater than 1 is a duplicate.
I hope this is useful :-)
      Ronen Ariely
     [Personal Site]    [Blog]    [Facebook]

  • Duplicate records in database view for ANLA and ANLC tables

Hi all,
Can anyone please suggest how to remove duplicate records from the ANLA and ANLC tables when creating a database view?
    thanks in advance,
    ben.

    Hi,
Suppose we have two tables, one with a single key field and another with two:
TAB1 - key field KEY1
TAB2 - key fields KEY1 & KEY2.
Now if we create a database view on these two tables, we can do so by joining them on the key field KEY1.
Suppose that in the View tab we have included TAB1-KEY1.
Now let's say the following entries are in table TAB1: (AAA), (BBB), (CCC),
and the following entries are in table TAB2: (AAA, 1), (AAA, 2), (BBB, 3), (BBB, 5), (DDD, 3).
The database view will then show the following entries:
AAA,
AAA,
BBB,
BBB,
These entries are duplicated in the output because TAB2 has multiple entries for the same key value of TAB1.
If we want to remove the multiple entries from the output, we need to add a selection condition such as TAB2-KEY2 = '1'; see the sketch below.
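For illustration, a hedged ABAP sketch of the same effect on the real tables: ANLA joins to one ANLC row per fiscal year and depreciation area, so the asset key repeats, and the GJAHR restriction here is just an example of a selection condition that removes the repetition (the chosen fields and year are assumptions):

TYPES: BEGIN OF ty_row,
         bukrs TYPE anla-bukrs,   " company code
         anln1 TYPE anla-anln1,   " main asset number
         anln2 TYPE anla-anln2,   " asset subnumber
         gjahr TYPE anlc-gjahr,   " fiscal year
       END OF ty_row.
DATA lt_rows TYPE STANDARD TABLE OF ty_row.

" Without the WHERE clause, each asset appears once per matching ANLC row
SELECT a~bukrs a~anln1 a~anln2 c~gjahr
  INTO TABLE lt_rows
  FROM anla AS a
  INNER JOIN anlc AS c
    ON  c~bukrs = a~bukrs
    AND c~anln1 = a~anln1
    AND c~anln2 = a~anln2
  WHERE c~gjahr = '2011'.   " example selection condition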
    Regards,
    Pranav.

  • Write Optimized DSO Duplicate records

    Hi,
We are facing a problem while doing a delta load to a write-optimized DataStore object.
It gives the error "Duplicate data record detected (DS <ODS name>, data package: 000001, data record: 294)".
But it cannot be a duplicate record, since the data comes from a DSO, and we have also checked the particular record in the PSA and could not find any duplicate there.
There is no very complex routine involved either.
Has anyone ever faced this issue and found the solution? Please let me know if so.
    Thanks
    VJ

    Ravi,
We have checked that there are no duplicate records in the PSA.
Also, the source ODS has two keys and the target ODS has three keys.
The records mentioned in the error message have record mode "N" (new).
It seems to be an issue with the write-optimized DSO.
    Regards
    VJ

  • Full and Delta DTP doubts

There was an ongoing delta load from DSO 1 to DSO 2.
I deleted all requests from DSO 2.
Then I ran a first full DTP (DSO 1 to DSO 2) and, immediately after the full DTP, a delta DTP (DSO 1 to DSO 2).
In the full DTP, 10,000 records were added to DSO 2.
In the delta DTP, 2,000 records were added to DSO 2.
I was a bit surprised to see data added by the delta, since I had run the full DTP just 5 minutes before; my understanding was that the delta would not bring anything.
From various threads I learned that a full DTP gets data from the active table, while a delta DTP gets data from the change log table.
However, if the same records are present in both the active table and the change log, they will be picked up by the full DTP run and again by the delta DTP run. If the target DSO is in additive mode, duplicate records will be added and the final result will be wrong (i.e. the data gets doubled up).
How do we avoid such a situation? Any thoughts?

Hi,
Full DTP: if you have one request in the PSA today and your DTP extraction mode is full, it will load that data into the target. If you run the InfoPackage again the next day and get another request, you then have two requests in the PSA; the full DTP will try to load both of them and you will get an error saying there are duplicate records.
Delta DTP: it loads only the fresh requests from the source.
Note, however, that we do not do any initialization in the DTP; full and delta are only extraction modes for getting data from the source, and it is the InfoPackage that fetches the deltas once the initialization is done.
Hope it helps.
    Thanks & Regards
    krishna.y

  • Duplicate records in DTP

    Guys,
My DTP failed 4 days back, but the full load from the source system to the PSA keeps executing successfully.
To correct the DTP load, I deleted the last DTP request and also all the PSA requests except the last one, yet when I re-executed the DTP I got duplicate records. Also, in the failed DTP header I can still see all the deleted PSA requests.
How do I resolve this issue? I cannot check 'Handle duplicate records' since this is time-dependent data.
    Thanks,
    Kumar

    Hi Kumar,
1. Deleting from the PSA and reloading is not a permanent solution.
First check which records are creating the duplicates in the PSA.
The duplicates may not originate in the PSA at all; there may already be records in the target object itself, so deleting from the PSA will not solve the issue.
Check the source tables themselves to see why the wrong data is coming into the PSA like that.
Then you can correct the records in the PSA and update the data into the target using the DTP.
You can create an error DTP for that, so it will be easy to trace the duplicates.
2. You have the option 'Handle duplicate records' in the DTP.
Check the box and try to load the data again.
If this is time-dependent master data, then also include 'valid to' as a key along with the other objects in the semantic group option of the DTP.
Check this and then try to load the data.
    http://help.sap.com/saphelp_nw70/helpdata/EN/42/fbd598481e1a61e10000000a422035/content.htm
    Regards
    Sudheer

  • Duplicate value in MAPL cause error

We use the CWBQM transaction to change inspection plans/multiple specifications. So far we have not implemented change-number functionality at the inspection-plan level, so we do not use a change number to change an inspection plan. But because of a miscommunication/misunderstanding, one user used a change number to delete one of the MICs from an inspection plan in CWBQM.
Now whenever we want to change this particular plan, the system forces us to enter a change number.
We now have a requirement to change the text on the "material task list assignment" tab. If we make any change using a change number, the system creates one more entry with a different valid-from date (change number). First the system triggers error CZCL014 "no valid material-routing allocation exists for xxxxxx". If we continue, the system displays one more entry, which is a duplicate record in MAPL, and at save time a runtime error occurs.
Have you ever faced this error? How can we change the text on the "material task list assignment" tab using a change number?

I believe the system throws the error because of the older version of the change number, so it creates one more entry with the latest change number. Did you try deleting the older version to see if that is reflected in MAPL?
    Regards,
    Mahee

  • Duplicate Records in DTP's Temporary Storage Area

I am getting duplicate records (36 records) in the DTP's temporary storage while loading data to 0MAT_SALES (text). In the error DTP I am also not getting the error stack. What could be the reason?
    Ram Mohan

As I mentioned in the previous thread, I have duplicate records.
There are no duplicates in the PSA, but when I execute the DTP I get duplicate records in the DTP's temporary storage area.
    Ram

  • Oracle 10 - Avoiding Duplicate Records During Import Process

I have two databases on different servers (DB1 & DB2) and a dblink connecting the two. In DB2, I have 100 million records in table DB2TARGET.
I tried to load 100 million more records from table DB1SOURCE into DB2TARGET on top of the existing 400 million records, but after an hour I got a network error from DB2.
The load failed after inserting 70% of the records. Now I have three tasks. First, I have to find the duplicate records between DB1 and DB2. Second, I have to find the remaining 30% of the records missing from DB2TARGET.
Third, I have to re-load those remaining 30%. What is the best solution?
To find the duplicates:
SELECT COUNT(*), A, B FROM DB2TARGET
GROUP BY A, B
HAVING COUNT(*) > 1;
To re-load:
MERGE INTO DB2TARGET tgt
USING DB1SOURCE src
ON (tgt.A = src.A)
WHEN NOT MATCHED THEN
INSERT (tgt.A, tgt.B)
VALUES (src.A, src.B);
Thanks for any guidance.

When I execute this I get the following error message:
SQL Error: ORA-02064: distributed operation not supported
02064. 00000 - "distributed operation not supported"
*Cause: One of the following unsupported operations was attempted:
1. array execute of a remote update with a subquery that references a dblink, or
2. an update of a long column with a bind variable and an update of a second column with a subquery that both references a dblink and a bind variable, or
3. a commit issued in a coordinated session from an RPC procedure call with OUT parameters or a function call.
*Action: simplify the remote update statement

  • Duplicate records in Fact Tables

    Hi,
We are using BPC 7.0 MS SP7. BPC created duplicate records in the WB and FAC2 tables. We faced a similar issue before and the solution was to reboot the server and clean up the extra data that had been created. I do not think it is an issue with our script logic files; we had the issue across all applications. The data is fine now after the server reboot and re-running the same logic files. I want to know if anyone else has faced this issue and whether there is any solution other than a reboot. I appreciate your help.
    Thanks
    Raj

    Hi Sorin,
I know this thread is rather old, but I have a problem closely related to it and would appreciate your assistance. I have a client running on 7.0 MS who has been using it for the past 3 years.
It is a heavily customized system with many batch files running daily to update dimensions, copy data and sort. And yes, we do use custom packages that incorporate stored procedures.
Recently, with no change in the environment, our FACTWB table ballooned out of nowhere. The fact table contains less than 1 GB of data, but FACTWB holds 200 GB and has practically paralyzed the system. There is also an equivalent 300 GB increase in the log files.
We have not been able to find out what caused this, or whether the 200 GB of records in WB are even valid records that got duplicated. Is there a way to troubleshoot this?

  • How to delete Duplicate records from the informatica target using mapping?

Hi, I have a scenario like this: in my mapping I have a source, which may contain unique or duplicate records. Source and target are different tables. The target in my mapping contains duplicate records. Using the mapping, I have to delete the duplicate records in the target table. The target table has no surrogate key. We can use the target table as a lookup table, but it cannot be used as a source in the mapping, and we cannot use post-SQL.

Hi all, I have multiple flat files which I need to load into a single table. I did that using the indirect option at session level, but I still need to work out how to populate a substring of the header into the Name column of the target table. I have two columns, Id and Name. Each input file has only one column, 'Id', with a header like H|ABCD|Date. I need to populate the target as in the example below.

File 1:        File 2:
H|ABCD|Date    H|EFGH|Date
1              4
2              5
3              6

Target table:
Id  Name
1   ABCD
2   ABCD
3   ABCD
4   EFGH
5   EFGH
6   EFGH

Can anyone help with the logic to get this data into a table in Informatica?
