Duplicate Records in InfoCube: Is Full Repair Possible Without a DSO?

Hello Gurus,
I have a critical situation in BI production. We have more than 3 years of data in our inventory InfoCubes (PSA -> ODS -> Cube). Everything was working fine until December 2009, but for January 2010 to March 2010 we are getting double records in the InfoCubes. In this particular scenario we do not have an ODS in between.
The solution I found is to delete all data from January 2010 until today from the InfoCube and also from the PSA, and on the R/3 side to delete the setup tables and refill them for January 2010 to March 2010, then create an InfoPackage for a Full Repair request. But I have the questions below about this solution:
(1) For a Full Repair InfoPackage (Full Repair request) we don't need to delete the init request; correct me if I am wrong.
(2) For the full repair request, do we need to run 'Initialize Delta Without Data Transfer' first and then do the Full Repair?
(3) If we don't have a DSO in this scenario, can we still solve this with a full repair request?
Regards,
Komik Shah

Hi Venu,
We have data in the PSA only up to 13/04/2010, because the process chain has been failing for the last 15 days, so we have not been getting new records into the PSA either. We are using BI 7.0.
The whole scenario is like this:
Data has been in the inventory cube for the last 3 years, but nobody monitored the process chain for the last 15 days, and nobody analyzed any reports after Dec-09. Now they analyzed the reports in April-10 and for some months they are getting double records. The process chain had been failing for the last 15 days, and the reasons were indexing problems as well as wrong records in the PSA.
So my plan was to delete the data from Jan-2010 to April-2010 and fill the setup tables for Jan-2010 to April-2010. I will get the data into the PSA, but when I load it into the cube, won't I get double records, whether it is a full repair or not?
Regards,
Komik Shah
Edited by: komik shah on Apr 30, 2010 12:38 PM

Similar Messages

  • Some records missing during full repair from DataSource 2LIS_11_VAITM

    hi friends,
    we are using DataSource 2LIS_11_VAITM. If we run a full repair with a selection, say for a particular dealer, some document numbers are missing. Even if we run a full repair without any selection, the same thing happens.
    We could not spot where the records are getting filtered out. Is there any way to check the DataSource for any filtering?
    Can anyone tell me from which tables the data flows into this DataSource?

    Hi.
    First fill the setup table for application 11 with the selections you want pulled into BW.
    A full repair load for an LO DataSource is read from the setup tables, and a full repair will pull whatever is in the setup tables.
    So fill the setup table for the values you want pulled into BW first.
    Right now your setup tables probably do not contain the records you want in BW.
    Then check transaction RSA3 with the same selection and see whether those records appear there.
    If they show up in RSA3, schedule a full repair and they will then show up in BW.
    The major underlying tables are VBAK and VBAP; you can verify the values there.
    Do not forget to delete the setup tables before refilling them.
    Thanks
    Ajeet
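    Before scheduling the full repair, it is worth confirming that the setup table actually got filled for your selection. A minimal sketch, assuming the usual LO naming convention (extract structure + SETUP, i.e. MC11VA0ITMSETUP for 2LIS_11_VAITM; verify the name in SE11 first) and that the statistical setup for application 11 is run via OLI7BW:
      REPORT z_check_setup_fill.

      DATA lv_count TYPE i.

      " A full repair can only return what the statistical setup run
      " (OLI7BW for application 11) wrote into the setup table.
      SELECT COUNT( * ) FROM mc11va0itmsetup INTO lv_count.

      IF lv_count = 0.
        WRITE: / 'Setup table is empty - run the statistical setup first.'.
      ELSE.
        WRITE: / 'Setup table contains', lv_count, 'records.'.
      ENDIF.
    If the count looks right but RSA3 still filters records, compare the InfoPackage selections with the selections used when filling the setup table.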

  • Locate and remove duplicate records in an InfoCube.

    Hi!!
    we have found the infocube 0PUR_C01 contians duplicate records for the month april 2008, approx 1.5lac records are extracted to this infocube, similar situations may be occuring in the subsequent months.
    How do I locate these records and remove them for the infocube?
    How do I ensure that duplicate records are not extracted in the infocube?
    All answers/ links are welcome!!
    Yours Truly
    K Sengupto

    First:
    1. How do I locate duplicate records in an InfoCube, other than downloading all the records to an Excel file and using Excel functionality to find them?
    A literal duplicate may not exist as such: records are sent to a cube with + and - signs that summarize the data accordingly, which makes the search for duplicates that much more troublesome.
    If you have a DSO to load from, delete the data for that month and reload if possible; this is quicker and cleaner than removing duplicate records.
    If you had
    ABC|100 in your DSO and it got doubled,
    it would be
    ABC|+100
    ABC|+100
    against different requests in the cube, and added to this will be your correct deltas as well.
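    If a reload from a DSO is not an option, one rough way to shortlist candidates is to query the cube's F fact table and count identical dimension-key combinations while ignoring the package dimension, which differs per request. A hedged sketch for standard-content cube 0PUR_C01 (fact table /BI0/F0PUR_C01 by naming convention), simplified to two dimension columns; in practice, group by every dimension key except KEY_0PUR_C01P:
      REPORT z_find_cube_duplicates.

      TYPES: BEGIN OF ty_dup,
               key_t TYPE rsdimid,  " time dimension key
               key_1 TYPE rsdimid,  " first user-defined dimension key
               cnt   TYPE i,
             END OF ty_dup.

      DATA: lt_dup TYPE STANDARD TABLE OF ty_dup,
            ls_dup TYPE ty_dup.

      " The same dimension-key combination appearing more than once
      " (under different package/request keys) hints at a doubled load.
      SELECT key_0pur_c01t key_0pur_c011 COUNT( * )
        FROM /bi0/f0pur_c01
        INTO TABLE lt_dup
        GROUP BY key_0pur_c01t key_0pur_c011
        HAVING COUNT( * ) > 1.

      LOOP AT lt_dup INTO ls_dup.
        WRITE: / ls_dup-key_t, ls_dup-key_1, ls_dup-cnt.
      ENDLOOP.
    The dimension keys still have to be decoded via the dimension and SID tables to see the actual characteristic values, but this narrows the search without downloading the whole cube to Excel.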

  • Duplicate records in ODS change log

    Hi Experts,
    I have loaded records from R/3 into an ODS (PSA and then the data target).
    The records look good in the ODS active data.
    Then I did a delta load from the ODS to the InfoCube and received all the changed entries in the cube.
    Then I checked the ODS, and the data mart status was set.
    But when I did a delta load again a couple of days later from the ODS to the InfoCube, I received some duplicate records. I checked the change log of the ODS, and the duplicate records were there.
    I am not sure why my ODS is transporting entries from an ODS request whose data mart status was already set.
    (I cannot do selective deletion because there are about 1000 duplicate entries, in random order.)
    I believe my best bet would be to:
    1. Delete both delta load requests (the request with the correct records and the request with the duplicate records) from the InfoCube.
    2. Remove the data mart status from the ODS.
    3. Run the delta load from the ODS to the InfoCube.
    Please let me know whether this will solve the issue, or whether I will still get the duplicate records in the InfoCube.
    Also, do I need to delete the change log data?

    Hi Ramesh,
    The entries in the active data of the ODS look good. No duplicates.
    When I checked the change log,
    I had entries in the ODS change log from two different requests.
    Request 1 (full repair request via PSA) in the ODS (ODSR_xxxx1) has the same records as in R/3
    (e.g., the entry for document A has a key figure value of +$1000).
    Request 2 (delta load via PSA) in the ODS (ODSR_xxxx2) has two records, with positive and negative key figures
    (e.g., entries for document A with key figure values +$1000 and -$1000).
    So I am not sure whether I should call these records 'duplicates'.
    But I see 3 entries in the InfoCube for document A, with key figure values +$1000, -$1000 and +$1000.
    I believe the ODS change log should actually cancel the earlier +$1000 and load the new +$1000 entry.
    But it seems not to be replacing the old one.
    I hope I am clear.

  • Duplicate records in report

    hello guys,
    He was asking: I have duplicate records in the report; how do we rectify them?
    Why and how do duplicate records appear in reporting? How is that possible?
    Please explain to me how this can happen.
    thanks & regards

    Hi,
    It may be that your data target is reading data from a DSO (for example).
    If this DSO has account as a key field but not center, then accounts with different centers but the same amount can accumulate into what looks like duplicate data.
    This case can occur with a flat file load, and the records then need to be corrected. The flat file load also works correctly when we have both account and center as key fields of that particular DSO.
    This is a scenario that can happen, apart from the above.
    Best Regards,
    Arpit
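    To make the key-collision point concrete, here is a toy illustration with hypothetical values. COLLECT stands in for the additive update: because center is not part of the key, two distinct source rows collapse onto the same account and their amounts add up, which looks like doubled data in the report:
      REPORT z_key_collision_demo.

      TYPES: BEGIN OF ty_rec,
               account TYPE c LENGTH 10,           " the only key field
               amount  TYPE p LENGTH 8 DECIMALS 2,
             END OF ty_rec.

      DATA: lt_target TYPE STANDARD TABLE OF ty_rec,
            ls_rec    TYPE ty_rec.

      " Two source rows for the same account but different centers;
      " the center column was dropped because it is not in the key.
      ls_rec-account = 'R0605019'.
      ls_rec-amount  = '100.00'.
      COLLECT ls_rec INTO lt_target.  " first center's posting
      COLLECT ls_rec INTO lt_target.  " second center's posting: adds up

      LOOP AT lt_target INTO ls_rec.
        WRITE: / ls_rec-account, ls_rec-amount.  " shows 200.00, not 100.00
      ENDLOOP.
    With center added to the key, the two rows would stay separate, and the report would show two records of 100.00 each instead of one record of 200.00.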

  • Duplicate records to DSO

    Hello Friends,
    we have an issue with duplicate records in a DSO; let me explain the scenario.
    The header and detail data are loaded into separate DSOs,
    and the data of these 2 DSOs should get merged into a third one.
    The key fields are:
    DSO 1: DWRECID, 0AC_DOC_NO
    DSO 2: DWRECID, DWPOSNR
    DSO 3 fetches data from the above 2;
    its key fields are:
    DWTSLO,
    DWRECID,
    DWEDAT,
    0AC_DOC_NO,
    DWPOSNR,
    0CUSTOMER
    Now the data should be merged into a single record in the third DSO.
    Note that DSO 1 does not have the DWPOSNR object, not even in its data fields.
    We also have a start routine in the load from DSO 1 to populate some values in the result fields from DSO 2.
    Please provide any inputs you have on merging the data record-wise,
    and also give me all the possibilities or options we have to overwrite the data, apart from the mappings.

    Hi,
    You should consider creating an InfoSet instead of a third DSO.
    Provide the keys of the DSOs in the InfoSet, and the common records with those keys will be merged in the InfoSet.
    Hope it helps.
    Regards
    Praeon

  • Write-Optimized DSO Duplicate Records

    Hi,
    We are facing a problem while loading a delta to a write-optimized DataStore object.
    It gives the error "Duplicate data record detected (DS <ODS name>, data package: 000001, data record: 294)".
    But it cannot contain a duplicate record, since the data comes from a DSO, and
    we have also checked the particular record in the PSA and could not find a duplicate there.
    There is no very complex routine involved either.
    Has anyone faced this issue and found the solution? Please let me know if yes.
    Thanks
    VJ

    Ravi,
    We have checked that there are no duplicate records in the PSA.
    Also, the source ODS has two keys and the target ODS has three keys.
    The records mentioned in the error message have record mode "N" (new).
    It seems to be an issue with the write-optimized DSO.
    Regards
    VJ
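    One thing worth verifying here: the technical key of a write-optimized DSO (request / data package / record number) is always unique, so the "duplicate data record" check fires on the semantic key defined in the DSO settings, when uniqueness checking is active. A hedged sketch for listing colliding semantic-key values in the active table; /BIC/AZWODSO00 follows the usual active-table naming convention, and DOC_NUMBER/ITEM are hypothetical stand-ins for your actual key columns:
      REPORT z_check_semantic_key.

      TYPES: BEGIN OF ty_dup,
               doc_number TYPE c LENGTH 10,  " hypothetical key part 1
               item       TYPE n LENGTH 6,   " hypothetical key part 2
               cnt        TYPE i,
             END OF ty_dup.

      DATA: lt_dup TYPE STANDARD TABLE OF ty_dup,
            ls_dup TYPE ty_dup.

      " Any semantic-key value occurring more than once is exactly what
      " the duplicate check in the load would reject.
      SELECT doc_number item COUNT( * )
        FROM /bic/azwodso00
        INTO TABLE lt_dup
        GROUP BY doc_number item
        HAVING COUNT( * ) > 1.

      LOOP AT lt_dup INTO ls_dup.
        WRITE: / ls_dup-doc_number, ls_dup-item, ls_dup-cnt.
      ENDLOOP.
    If nothing collides in the target, also check whether the same semantic key occurs twice within the incoming data package itself.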

  • Duplicate records generated for an InfoCube.

    hi all,
    When I load data from the DataSource into the InfoCube, I get duplicate records. The data is loaded into the DataSource from a flat file. When I executed this the first time it worked, but after I changed the flat file structure I do not get the modified content in the InfoCube; instead it shows the duplicates. The DataSource 'preview data' option shows the required data (i.e., the modified flat file), but I still get the duplicates in the InfoCube, even though I made all the necessary changes in the DataSource, InfoCube, InfoPackage and DTP. I even deleted the data in the InfoCube, and still I get the duplicates. What is the ideal solution for this problem? One way is to create a new DataSource with the modified flat file, but I think that is not ideal. What is a possible solution without creating the DataSource again?
    Edited by: dharmatejandt on Oct 14, 2010 1:46 PM
    Edited by: dharmatejandt on Oct 14, 2010 1:52 PM
    Edited by: dharmatejandt on Oct 14, 2010 1:59 PM

    Finally I got it. I deleted the request IDs in the InfoPackage (right-click the InfoPackage and choose Manage), then executed the transformation and the DTP, and finally got the required output without duplicates.
    Edited by: dharmatejandt on Oct 14, 2010 4:05 PM

  • Duplicate records in the InfoCube: how should I fix them?

    Hi All,
    We have different values between R/3 and BW. In a first check I found that the cube contains duplicate records. What should I do, step by step, to investigate and fix this problem?
    The InfoCube receives data from 7 ODSs.
    Let me know if you need further detail about our data model or loads.
    Thanks a lot for your help
    Bilal

    Hello All,
    please, I need further details so that I don't make critical errors in the cube.
    When I check the data in my InfoCube (right-click ==> View Data) with this selection:
    0GL_ACCOUNT = R060501950
    0PSTNG_DATE = from 01.01.2009 to 31.03.2009
    I find duplicate records for all of this info:
    0GL_ACCOUNT, 0CO_DOC_NO, 0DOC_DATE, 0PSTNG_DATE, 0COORDER, 0FISCPER... and all the key figures.
    To delete these duplicate records I have to make selections via Manage ==> Contents tab ==> Selective Deletion (at the right corner). At this step, what should I do?
    Do I have to start it in the background, or can I just choose "Selective Deletion"?
    For this selective deletion, which information do I have to enter for my problem explained before:
    0GL_ACCOUNT = R060501950
    0PSTNG_DATE = from 01.01.2009 to 31.03.2009
    IF I ENTER THIS INFO AND EXECUTE, which records will the system delete? All the records matching these selections, or only the DUPLICATE records?
    Thanks a lot for your help
    Bilal

  • How to delete duplicate records in a table without a primary key

    I have a table that contains around 1 million records, and there is no primary key or auto-number column. I need to delete the duplicate records from this table. What is a simple, effective way to do this?

    Please see this link:
    Remove duplicate records ...
    sqldevelop.wordpress.com
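    If the table lives in an SAP system, one common ABAP-side approach is sketched below; table ZKEYLESS and its columns DOCNO and AMOUNT are hypothetical stand-ins, and the approach is only practical when the table fits in memory (around 1 million narrow rows usually does):
      REPORT z_dedup_keyless_table.

      DATA lt_rows TYPE STANDARD TABLE OF zkeyless.

      SELECT * FROM zkeyless INTO TABLE lt_rows.

      " Sort on every column so exact duplicates become adjacent, then
      " keep only the first copy of each.
      SORT lt_rows BY docno amount.
      DELETE ADJACENT DUPLICATES FROM lt_rows COMPARING ALL FIELDS.

      " Rewrite the table in a single unit of work: clear it, then
      " re-insert only the unique rows.
      DELETE FROM zkeyless.
      INSERT zkeyless FROM TABLE lt_rows.
      COMMIT WORK.
    For a table too large for memory, the same idea can be applied database-side, e.g., by copying distinct rows into a helper table and then swapping the tables.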

  • Full Repair/Init/Delta & LO Cockpit Information required

    Hello
    I'm pretty new to BW and I'm starting to dig into the deeper stuff now. I've created cubes with extractors from R/3, and now I'm interested in understanding more about the LO Cockpit's various specificities.
    For a better understanding, let's establish a scenario: Activation of R/3 SD Billing --> BW --> DSO --> InfoCube.
    Ok, first, let me start by explaining what my understanding is.
    First, I need to go into R/3, t-code LBWE, and under "13: SD Billing BW" activate the extractors I am interested in (Header & Item, as an example, in my scenario). I am also assuming that the default fields of the extractors suit my needs, so no adjustments are needed for now. I should also schedule a job here that will create records in the waiting queue on a daily basis.
    Next, I need to fill the setup tables from t-code SBIW; I enter a date in the future, it runs for a while, and then my data is prepared. The setup table then includes all the existing R/3 invoices as of now, right? (1) Let's say new invoices are created minutes after I have filled the setup table; will they be sent to the waiting queue as soon as the documents are created, or will the job scheduled earlier post them to that queue? (2) If the job sends them to the waiting queue, where do the pending documents sit before being sent? (3)
    Now, on my BW system, once the DataSource has been replicated, what do I need to create and execute to be able to proceed with a delta process? (4) I know I have to create an "Init" package first, but I have no clue why, except that it is needed before I can create a "Delta" package afterwards. Technically, what does the "Init" package really do? (5) What is the reason for having an "Init without data" and an "Init with data"? I mean, I know literally what they mean, but again, why would I choose one over the other? (6)
    I'd also like to understand the concept behind a "Full Repair" request, whose purpose I have no clue about. I guess some of these properties (Full Repair, Init) determine where the data is taken from on the R/3 side: setup tables or waiting queue? (7)
    Please provide clear responses to my questions (identified by a number) and don't assume I know what you are thinking of; don't hesitate to provide a long answer if needed, as it's better to provide more information than not enough.
    Thank you in advance for all of your help!
    P.s. I'm working with BI 7.0.

    Hi.....
    So if I understand properly, whenever I run the init, it toggles a flag on my source system (R/3), which means that all changes performed on invoices are going to be sent to the delta queue? Until I run an init, changes are not written anywhere (other than the internal SAP tables/structures required by the R/3 invoice processes).
    If so, does this mean that I could lose some transactions between the time I run the creation of the setup tables and the time I execute the init on my BW system? Let's say I fill the setup tables today and execute the init only tomorrow; all invoices created in between won't be transferred to BW?
    Yes. Suppose you fill the setup table today and run the init tomorrow; in between, many new records may come, but those new records will not be in the setup table, since those transactions happened after the filling of the setup table, and new records only go to the delta queue after you run the init.
    I have to admit that it is hard for me to suppose I did not want to do an init; what would be a good reason not to? I understand that both init types toggle the init flag on, but I still cannot figure out appropriate business scenarios for both types.
    Suppose your delta mechanism is corrupted, for which you need to do a full repair. First, you have to delete the existing init flag; second, you fill the setup table; third, you load the records with a Full Repair; the 4th step is then running an init without data transfer, only to set the init flag (it will not pick up any records); and then the delta.
    An init with data transfer picks up all the previous records, and we cannot give it any selection; for selections we have to use a Full Repair and then an init without data transfer. Generally, for transaction data we go for Full Repair, because otherwise the number of records will be very large, and there may be duplicate records.
    Another thing to mention: we mainly use full repair for an ODS. It is just like a full upload. In the case of an InfoCube we can use a full upload instead of a full repair, but in the case of an ODS a full repair is a must, since an ODS does not support full upload and delta upload in parallel; otherwise the ODS activation will fail.
    In that case you then have to use the program RSSM_SET_REPAIR_FULL_FLAG to convert the full upload request into a full repair request.
    Hope this helps....
    Regards,
    Debjani.......
    Edited by: Debjani  Mukherjee on Nov 7, 2008 10:45 PM

  • A full repair request

    Hello Gurus,
         what is a full repair request?
    Many thanks,

    Hi,
    Q. What is a repair full request?
    Ans: You can find the option in the InfoPackage. Open the InfoPackage and press Shift+Ctrl+F1, or use the menu Scheduler > Repair Full Request. Tick the checkbox and it will be done.
    You can go for a repair full request when you have missed delta loads or there are data corruption issues. By doing a full repair load, you can ensure that your data is correct and has good integrity.
    With this option you can continue to use your existing delta and not worry about resetting it.
    But make sure that the ODS is in overwrite mode and not additive. If it is additive, you will face the problem of duplicate data; in that case you have to delete all the contents and then do the full repair request.
    Say your current scenario has an init/delta setup to your ODS,
    and for some reason you have to do a full load, because you missed some data or you have to load some historic data.
    If you then do a plain full load to the ODS, the request will be loaded but will not activate, because a full load to an ODS is not allowed once delta/init is set up.
    So in these scenarios you have to go for the repair full request option when loading to an ODS.
    You have the same option when loading to a cube, but for a cube it does not matter, as a full load is possible even with delta/init in place.
    So doing a repair full will not disturb the delta mechanism in your ODS.
    OSS Note on Repair Full Request.
    Some data is incorrect or missing in the PSA table or in the ODS object (Enterprise Data Warehouse layer).
    There may be a number of reasons for this problem: Errors in the relevant application, errors in the user exit, errors in the DeltaQueue, handling errors in the customers posting procedure (for example, a change in the extract structure during production operation if the DeltaQueue was not yet empty; postings before the Delta Init was completed, and so on), extractor errors, unplanned system terminations in BW and in R/3, and so on.
    Solution
    Read this note in full BEFORE you start actions that may repair your data in BW. Contact SAP Support for help with troubleshooting before you start to repair data.
    BW offers you the option of a full upload in the form of a repair request (as of BW 3.0B). If you want to use this function, we recommend that you use the ODS object layer.
    Note that you should only use this procedure if you have a small number of incorrect or missing records. Otherwise, we always recommend a reinitialization (possibly after a previous selective deletion, followed by a restriction of the Delta-Init selection to exclude areas that were not changed in the meantime).
    1. Repair request: Definition
    If you flag a request as a repair request with full update as the update mode, it can be updated to all data targets, even if these already contain data from delta initialization runs for this DataSource/source system combination. This means that a repair request can be updated into all ODS objects at any time without a check being performed. The system supports loading by repair request into an ODS object without a check being performed for overlapping data or for the sequence of the requests. This action may therefore result in duplicate data and must thus be prepared very carefully.
    The repair request (of the "Full Upload" type) can be loaded into the same ODS object in which the 'normal' delta requests run. You will find this request under the "Repair Request" option in the InfoPackage (Maintenance) menu.
    2. Prerequisites for using the "Repair Request" function
    2.1. Troubleshooting
    Before you start the repair action, you should carry out a thorough analysis of the possible cause of the error to make sure that the error cannot recur when you execute the repair action. For example, if a key figure has already been updated incorrectly in the OLTP system, it will not change after a reload into BW. Use transaction RSA3 (Extractor Checker) in the source system for help with troubleshooting. Another possible source of the problem may be your user exit. To ensure that the user exit is correct, first load the data with a Probe-Full request into the PSA table and check whether the data is correct. If it is not correct: search for the error in the user exit. If you do not find it, we recommend that you deactivate the user exit for testing purposes and request a new full upload. If the data then arrives correctly, it is highly probable that the error is indeed in the user exit.
    We always recommend that you load the data into the PSA table in the first step and check the result there.
    2.2. Analyze the effects on the downstream targets
    Before you start the Repair request into the ODS object, make sure that the incorrect data records are selectively deleted from the ODS object. However, before you decide on selective deletion, you should read the Info Help for the "Selective Deletion" function, which you can access by pressing the extra button on the relevant dialog box. The activation queue and the ChangeLog remain unchanged during the selective deletion of the data from the ODS object, which means that the incorrect data is still in the change log afterwards. After the selective deletion, you therefore must not reconstruct the ODS object if it is reconstructed from the ChangeLog. (Reconstruction is usually from the PSA table but, if the data source is the ODS object itself, the ODS object is reconstructed from its ChangeLog). You MUST read the recommendations and warnings about this (press the "Info" button).
    You MUST also take into account the fact that the delta for the downstream data targets is created from the changelog. If you perform selective deletion and then reload data into the deleted area, this may result in data inconsistencies in the downstream data targets.
    If you only use MOVE and do not use ADD for updates in the ODS object, selective deletion may not be required in some cases (for example, if incorrect records only have to be changed, rather than deleted). In this case, the DataMart delta also remains intact.
    2.3. Analysis of the selections
    You must be very precise when you perform selective deletion: Some applications do not provide the option of selecting individual documents for the load process. Therefore, you must first ensure that you can load the same range of documents into BW as you would delete from the ODS object. This note provides some application-specific recommendations to help you "repair" the incorrect data records.
    If you updated the data from the ODS object into the InfoCube, you can also delete it there using the "Selective deletion" function. However, if it is compressed at document level there and deletion is no longer possible, you must delete the InfoCube content and fill the data in the ODS object again after repair.
    You can only perform this action after a thorough analysis of all effects of selective data deletion. We naturally recommend that you test this first in the test system.
    The procedure generally applies for all SAP applications/extractors. The application determines the selections. For example, if you cannot use the document number for selection but you can select documents for an entire period, then you are forced to delete and then update documents for the entire period in the data target. Therefore, it is important to look first at the selections in the InfoPackage exactly before you delete data from the data target.
    Some applications have additional special features:
    Logistics cockpit: As preparation for the repair request, delete the SetUp table (if you have not already done so) and fill it selectively with concrete document numbers (or other possible groups of documents determined by the selection). Execute the Repair request.
    Caution: You can currently use the transactions that fill SetUp tables with reconstruction data to select individual documents or entire ranges of documents (at present, it is not possible to select several individual documents if they are not numbered in sequence).
    FI: The Repair request for the Full Upload is not required here. The following efficient alternatives are provided: In the FI area, you can select documents that must be reloaded into BW again, make a small change to them (for example, insert a period into the assignment text) and save them -> as a result, the document is placed in the delta queue again and the previously loaded document under the same number in the BW ODS object is overwritten. FI also has an option for sending the documents selectively from the OLTP system to the BW system using correction programs (see note 616331).
    3. Repair request execution
    How do you proceed if you want to load a repair request into the data target? Go to the maintenance screen of the InfoPackage (Scheduler), set the type of data upload to "Full", and select the "Scheduler" option in the menu -> Full Request Repair -> Flag request as repair request -> Confirm. Update the data into the PSA and then check that it is correct. If the data is correct, continue to update into the data targets.
    Another scenario where we use a repair full request:
    If your data load is delta and you now need to run a full load, you can go for a repair full request.
    E.g., if you have missed data for 3 days while the subsequent delta loads are running successfully, and you want to fill in the data for just those 3 days, you can run the statistical setup for that particular DataSource for those 3 days and load the result as a full load flagged as a repair request.
    It is particularly used when uploading data into an ODS that already has delta loads.
    This is required if you want any full uploads (with or without selections in the InfoPackage) to go into the ODS, because the ODS does not allow you to activate plain full uploads if there are delta loads to it.
    But make sure that you check the option "Indicate request as repair" in the Scheduler before you start the InfoPackage.
    Thanks,
    Kiran Manyam

  • Full repair scenario for multiple objects?

    Hi experts,
    I did a full repair in the first-level ODS. Now there is a second-level ODS and an InfoCube on top of it; the cube is a daily full load. Do I have to delete the records from the second-level ODS and the cube too?
    Thanks in advance.
    Sharat.

    Kalpana,
    We are not using DTPs here, but it is a BW 7.0 system. Yes, I am using InfoPackages to load from the first-level DSO to five different objects (3 cubes and 2 DSOs).
    So, like you said, I will delete the particular records from the DSOs and the cubes and do a full repair request.
    1. What I understood is that it is mandatory to selectively delete the records from the DSOs and the cubes before I do the full repair request, right?
    2. Otherwise, will the load fail because the exact record is already in the DSO? Or, since the DSO has overwrite functionality, will it just overwrite, even if the exact record already exists in the DSO?
    3. How does this work in the cube? I think the cube is additive; will it double the quantities for an identical record that already exists in the InfoCube?
    I will assign you the full points. Please help me understand the process.
    Thanks in advance.
    Sharat.

  • Duplicate records in delta load????? Please help!!!! Will assign points

    Hi all,
    I am extracting payroll data with DataSource 0HR_PY_1 for the 0PY_C02 cube.
    I ran a full load with selection criteria 01.2007 to 02.2007 in the InfoPackage and extracted 20,000 records. Then
    I ran an init of the delta without data transfer, which extracted 0 records, as expected.
    Then I ran a delta with selection criteria 02.2007 to 01.2010 in the InfoPackage and extracted 4,500 records, in which the February records were extracted again.
    What could be the reason for duplicate records occurring in the delta load?
    I have seen the same records in the full load with selection criteria 01.2007 to 02.2007 as well as under selection criteria 02.2007 to 01.2010. What is happening, and how is this possible?
    Actually, the DataSource 0HR_PY_1 does not support delta. Apart from this, what other reasons are there for duplicate records to occur? Please help!!!!!!!!!!!!!!
    Will assign points.

    Your selection criteria were
    01.2007 to 02.2007 for the full load, and 02.2007 to 01.2010 for the delta.
    Both of your selections include the month 02.2007,
    so all the duplicates might come from 02.2007.
    Have you checked that?
    Regards,
    Naveen Natarajan
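    To make the overlap explicit, here is a toy check with the two ranges hard-coded (values taken from the post above): any record in period 02.2007 falls inside both InfoPackage selections and is therefore extracted twice:
      REPORT z_selection_overlap_check.

      " Periods as YYYYMM so that they compare numerically.
      DATA: lv_full_lo  TYPE n LENGTH 6 VALUE '200701',  " full: 01.2007
            lv_full_hi  TYPE n LENGTH 6 VALUE '200702',  " ...to 02.2007
            lv_delta_lo TYPE n LENGTH 6 VALUE '200702',  " delta: 02.2007
            lv_delta_hi TYPE n LENGTH 6 VALUE '201001'.  " ...to 01.2010

      " Two closed intervals overlap when each one starts no later than
      " the other one ends.
      IF lv_delta_lo <= lv_full_hi AND lv_full_lo <= lv_delta_hi.
        WRITE: / 'Selections overlap; records in the shared period(s)',
                 'will be extracted by both InfoPackages.'.
      ENDIF.
    Starting the second selection at 03.2007 would remove the shared month.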

  • Duplicate Records in InfoProvider

    Hi,
    I am loading transaction data from flat files into the DataSources.
    Initially I had one request (data from one flat file) loaded into the PSA and the InfoCube, with, say, 100 records.
    Later, I loaded another flat file with 50 records into the PSA (without deleting the initial request). Now I have 150 records in the PSA.
    But I would like to load only the 50 new records into the InfoCube. When I execute the DTP, it loads 150 records, i.e., in total 100 (initial records) + 150 = 250 records.
    Is there any option by which I can avoid loading the duplicate records into my InfoCube?
    I found an option in the DTP that says "Get Data by Request". I tried checking it, but no luck.
    How can I solve this issue, and what exactly does the "Get Data by Request" setting do?
    Thanks,

    Hi Sesh,
    There is an option in the DTP with which you can load only the new records. I think you have to select the "Do not allow duplicate records" radio button (I guess)... then try to load the data. I am not sure, but you can look for that option in the DTP...
    Regards,
    Kishore
