Duplicate records in delta load? Please help! Will assign points.

Hi all,
I am extracting payroll data with DataSource 0hr_py_1 for the 0py_c02 cube.
I ran a full load with selection criteria 01.2007 to 02.2007 in the InfoPackage and extracted 20,000 records. Then
I ran an init of delta without data transfer, which extracted 0 records, as expected.
Then I ran a delta with selection criteria 02.2007 to 01.2010 in the InfoPackage and extracted 4,500 records, where the February 2007 records were extracted again.
What could be the reason for duplicate records occurring in the delta load?
I have seen the same records in the full load with selection criteria 01.2007 to 02.2007 as well as with selection criteria 02.2007 to 01.2010. How is that possible?
Also note that the DataSource 0hr_py_1 does not support delta. Apart from this, what other reasons are there for duplicate records to occur? Please help!
Will assign points.

Your selection criteria:
01.2007 to 02.2007, as well as 02.2007 to 01.2010.
Both of your selections include the month 02.2007.
All the duplicates might fall under 02.2007.
Have you checked that?
Regards,
Naveen Natarajan
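The overlap Naveen points out can be confirmed mechanically. A small sketch in Python (the MM.YYYY month values are the ones quoted in the thread; the helper functions are illustrative, not part of any SAP tool):

```python
# Check whether two MM.YYYY selection ranges overlap, and how many
# months they share. The ranges are the ones quoted in the thread.
def month_index(s):
    """Convert 'MM.YYYY' to a single month count for easy comparison."""
    mm, yyyy = s.split(".")
    return int(yyyy) * 12 + int(mm)

def overlap(range_a, range_b):
    """Return the overlapping (start, end) month indices, or None."""
    lo = max(month_index(range_a[0]), month_index(range_b[0]))
    hi = min(month_index(range_a[1]), month_index(range_b[1]))
    return (lo, hi) if lo <= hi else None

full_load = ("01.2007", "02.2007")
delta_load = ("02.2007", "01.2010")

shared = overlap(full_load, delta_load)
if shared:
    months = shared[1] - shared[0] + 1
    print(f"Ranges overlap on {months} month(s)")  # 02.2007 sits in both ranges
```

Any records posted to the shared month 02.2007 are selected by both InfoPackages, which would explain seeing the same records twice.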

Similar Messages

  • BI 7.0 - Duplicate Record Error while loading master data

    I am working on BI 7.0 and I am trying to load master data to an info object.
    I created an Infopackage and loaded into PSA.
    I created transformation and DTP and I get an error after I execute the DTP about duplicate records.
    I have read all previous threads about the duplicate record error while loading master data, and most of them suggested checking the 'Ignore duplicate records' option in the InfoPackage. But in 7.0 I can only load to the PSA with the InfoPackage, and it doesn't have any option for me to ignore duplicate records.
    My data is getting loaded to PSA fine and I get this error while loading to info object using DTP.
    I would appreciate your help to resolve this issue.
    Regards,
    Ram.

    Hi,
    Refer:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    With rgds,
    Anil Kumar Sharma .P

  • Urgent, please help; will reward points

    How do I join two transparent tables with indexes?
    Please help; will reward points.

    Hi,
    Just copy and paste this code, then activate and test.
    INNER JOIN:
    TYPES: BEGIN OF t_vbak,
             vbeln TYPE vbeln,
           END OF t_vbak.
    TYPES: BEGIN OF t_vbap,
             vbeln TYPE vbeln_va,
             posnr TYPE posnr_va,
           END OF t_vbap.
    DATA: it_vbak TYPE STANDARD TABLE OF t_vbak.
    DATA: wa_vbak TYPE t_vbak.
    DATA: it_vbap TYPE STANDARD TABLE OF t_vbap.
    DATA: wa_vbap TYPE t_vbap.
    SELECT-OPTIONS: s_vbeln FOR wa_vbak-vbeln.
    SELECT a~vbeln b~posnr INTO TABLE it_vbap FROM vbak AS a
      INNER JOIN vbap AS b ON a~vbeln = b~vbeln
      WHERE a~vbeln IN s_vbeln.
    LOOP AT it_vbap INTO wa_vbap.
      WRITE:/ wa_vbap-vbeln, wa_vbap-posnr.
    ENDLOOP.
    LEFT OUTER JOIN:
    SELECT a~vbeln b~posnr INTO TABLE it_vbap FROM vbak AS a
      LEFT OUTER JOIN vbap AS b ON a~vbeln = b~vbeln
      WHERE a~vbeln IN s_vbeln.
    LOOP AT it_vbap INTO wa_vbap.
      WRITE:/ wa_vbap-vbeln, wa_vbap-posnr.
    ENDLOOP.
    DIFFERENCE BETWEEN INNER AND LEFT OUTER JOIN:
    With respect to the above example:
    INNER JOIN fetches only those VBAK documents that have matching entries in VBAP.
    LEFT OUTER JOIN fetches the same matched records, but it also fetches those documents which exist in VBAK but not in VBAP; for those rows the VBAP fields are returned with initial values.
    Hope this helps.
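Outside ABAP, the same join semantics can be illustrated with a short Python sketch. The document numbers below are invented purely for illustration; the dictionaries stand in for the VBAK (header) and VBAP (item) tables:

```python
# Simulate VBAK (sales order headers) and VBAP (items) with toy data.
# Document numbers here are made up for illustration only.
vbak = [{"vbeln": "0001"}, {"vbeln": "0002"}, {"vbeln": "0003"}]
vbap = [{"vbeln": "0001", "posnr": "10"},
        {"vbeln": "0001", "posnr": "20"},
        {"vbeln": "0002", "posnr": "10"}]

# INNER JOIN: only headers that have at least one matching item.
inner = [(h["vbeln"], i["posnr"])
         for h in vbak for i in vbap if h["vbeln"] == i["vbeln"]]

# LEFT OUTER JOIN: every header appears; unmatched headers get an
# initial item number, just as ABAP fills right-hand fields with
# initial values for unmatched rows.
outer = []
for h in vbak:
    items = [i for i in vbap if i["vbeln"] == h["vbeln"]]
    if items:
        outer.extend((h["vbeln"], i["posnr"]) for i in items)
    else:
        outer.append((h["vbeln"], ""))

print(inner)  # order 0003 is missing: it has no items
print(outer)  # order 0003 appears with an empty item number
```

The point to notice is that order 0003 shows up only in the outer result, with its item field initial, which mirrors how the LEFT OUTER JOIN keeps unmatched VBAK rows.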

  • Help....Duplicate records found in loading customer attributes!!!!

    Hi ALL,
    We have a full update master data with attributes job that failed with this error message (note: the processing is PSA and then data targets):
    1 duplicate record found. 66 recordings used in table /BIC/PZMKE_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/XZTSA_CUST RSDMD 199     
    1 duplicate record found. 66 recordings used in table /BIC/XZMKE_CUST RSDMD 199     
    Our DataSource is 0CUSTOMER_ATTR. I tried to use transaction RSO2 to get more information about this DataSource, so I could find the original data in R/3, but when I execute it I get this message:
    DataSource 0CUSTOMER_ATTR is extracted using function module MDEX_CUSTOMER_MD
    Can you please help me? What should I do to correct the data and reload, or to find the duplicates so I can ask the person on the R/3 system to delete them?
    Thanks
    Bilal

    Hi Bilal,
    Could you try the following, please?
    Select Only PSA and Update Subsequently into Data Targets in the Processing tab.
    In the Update tab use Error Handling and mark "Valid Records Update, Reporting Possible (Request Green)".
    If my assumptions are right, this should collect the duplicate records in a separate request, from which you can get the data that needs to be corrected.
    To resolve the issue itself, just load using Only PSA and Update Subsequently into Data Targets with Ignore Duplicate Records checked.
    Cheers,
    Praveen.

  • Duplicate Records in Transactional Load

    Dear All,
    I have an issue where data is loaded from one write-optimized DSO to another write-optimized DSO, and the DTP fails because of duplicate records. It is a transactional load.
    I would be grateful if you could help me understand how to handle this situation and run the DTP again.
    I have searched the forum, but most threads are about master data loading, where you can select Handling Duplicate Records.
    Thanks in Advance...
    Regards,
    Syed

    Hi Ravi,
    Thanks for your reply.
    If we uncheck the option, it would accept the duplicate records, right?
    In my scenario, data comes from one write-optimized DSO to another write-optimized DSO. In the first DSO the data uniqueness check is not set, while in the second DSO it is, so I get the duplicate error message.
    I see around 28 records in the error stack. So please let me know how I can process these error (duplicate) records as well.
    Many Thanks...
    Regards,
    Syed

  • Duplicate records found while loading master data(very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found. 1 record used in /BI0/PTCTQUERY, and the same record occurred again in the /BI0/PTCTQUERY table.
    Can anyone give me the solution? It is very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope it clears your doubt; otherwise let me know.
    Regards
    Kiran

  • K7N2 Delta problems, please help (the German forum is down, so I am trying here)

    Hi there,
    I just bought an MSI K7N2 Delta, but I have a few problems. I also bought an Athlon XP 2500+ with the Barton die.
    So now my problem: I am using the AMD boxed cooler and I get an idle temperature of 29 or 30 degrees. This can't be normal with this cheap cooler and this processor; how does that happen?
    The other problem is that after I change something in the BIOS (or sometimes without changing anything), my processor is autodetected by the BIOS as 1100 MHz. How come?
    I really don't know what to do. Please help me and give me answers.
    Thanks, cman

    The AMD stock coolers are not too bad and will perform almost as well as a branded 3rd-party cooler; as long as you don't overclock, they perform well within spec. A review of the AMD coolers was done on Tom's Hardware. The BIOS is probably reading the CPU temperature wrong; this is a known problem, and the reading can be up to 10 degrees below the true temperature.
    As for the CPU speed, you need to change the FSB to 166.

  • DMS document distribution problem: please help, will be appreciated

    When executing T-Code CVI8 (Document Distribution), there is a system message that the order was successfully created. But when we look at the distribution log using T-Code CVI9, the status of the document remains SY and the processing of the distribution stops. We have already finished the settings for document distribution, and still this error persists. What could be wrong? Please advise.
    Mail Me On:- [email protected]
    Regards,
    Akshit A. Patel

    Hi Athol,
    I have checked the workflow configuration accordingly.
    Workflow templates 20000104 and 20000137 are active:
    a. 20000104 BUS1082 x 2 (Initiated and Resend)
    b. 20000137 BUS1082002 x 1 (Resend)
    But the release status for these templates is NOT DEFINED. Should it be RELEASED? The container tab for 20000104 shows a warning icon (mandatory import parameter) on Partial Order with initial value not set, and 20000137 likewise shows a warning icon on Distribution Order Package with initial value not set.
    What does this mean? Is there any setting required for this?
    2. Transaction SWU3: ensure that everything has been generated (activated). Pay special attention to Maintain Runtime Environment -> Configure RFC Destination.
    I have checked the above; all are active except Configure RFC Destination. What is the setting required for that?
    The 3rd, 4th, and 5th settings you suggested are OK; I have checked them.
    Please advise on points 1 and 2.
    Thanks in advance.

  • MSI GE620DX: please help, will be appreciated

    Help! Just yesterday I got this issue with my GE620DX: upon startup, instead of the blue light on the power button, it is stuck on red, which means the NVIDIA graphics are active. My laptop is overheating on startup. I have already updated the BIOS to the latest version, 10R. Any help?


  • Multiple Take Recording Issues...pls help.

    Hey everyone, I'm trying to record multiple takes in Logic Pro 9 with my keyboard (external instrument). The problem I'm experiencing is that when I go right into my second take, I hear the first take as well. Does anyone know how to mute the first take so that my second take is clean, without the distraction of the first take?

    Under the FILE menu, go to "Project Settings" > Recording
    Under MIDI - Overlapping Regions, select "Create Tracks and Mute in Cycle Record"
    See if that does it

  • Delta in Duplicate records.

    Hi Gurus,
    Daily we are uploading CRM data through process chain.
    Three or four times a week the chain fails due to duplicate records (24 or 34, something like that).
    We then delete the red request in the target and load again.
    Why are we getting duplicate records in the delta load?
    What is the reason?
    Your help is appreciated.
    Thanks
    Ramu
    Message was edited by:
            Ramu T

    Hi Ramu,
    Once, try this way: check the keys of the table from which the DataSource is built, and add the corresponding InfoObjects for those key fields in the Compounding tab of the master data characteristic.
    Then it checks uniqueness.
    I have done this and it worked for me.
    Hope this helps.
    Regards
    karthik

  • Table with Full / Delta Load information?

    Is there a table I can go to where I can see whether a cube was loaded via a full load or a delta?
    Thanks, I will assign points!
    ~Nathaniel

    Hi,
    Check the table ROOSPRMSC in R/3. It gives you the complete details of your init and delta loads.
    Hope this helps.
    Assign points if useful.
    Regards,
    Venkat

  • Master Delta Load - Error "38 Duplicate records found"

    Hi Guys,
    In one of our delta uploads for master data we are getting the error "38 duplicate records found". I checked the PSA data, but there were no duplicate records in it.
    Once the data upload fails, I manually update the failed packet and it goes fine.
    Does anyone have a solution for this?

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope it clears your doubt; otherwise let me know.
    You can see the same in my previous thread:
    Re: duplicate records found while loading master data(very urgent)
    Regards
    Kiran

  • Duplicate records in BW Data Loads

    In my project I am facing duplicate records in data loads when I compare the PSA with the DSO. How can I check which records are duplicates? Is there any mechanism, for example via an Excel sheet? Please help me out. Thanks in advance for your quick response.
    Edited by: svadupu on Jul 6, 2011 3:09 AM

    Hi,
    Getting duplicate records in the PSA is fine, because no keys are set in the PSA and all records come directly from the source.
    In the case of a standard DSO, records are always overwritten, so you would not get any duplicates.
    In case you are getting duplicate records in the PSA and need to find them:
    Go to PSA -> Manage -> PSA Maintenance -> change the number of records from 1000 to the actual number of records that came in. In the menu, go to List -> Save -> File, change the path from the SAP directory to some other path, and save the file.
    Open the file, put the columns forming the DSO keys together, and sort ascending. You will find the duplicate records in the PSA.
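The sort-and-compare step can also be scripted instead of done by eye. A sketch in Python, assuming the PSA export was saved as a tab-separated file with a header row; the file name and key field names in the commented usage are placeholders, not real objects from this system:

```python
import csv
from collections import Counter

def find_duplicate_keys(path, key_columns, delimiter="\t"):
    """Count rows per DSO key and return the keys occurring more than once."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f, delimiter=delimiter):
            key = tuple(row[c] for c in key_columns)
            counts[key] += 1
    return {k: n for k, n in counts.items() if n > 1}

# Hypothetical usage; the file name and key columns are assumptions:
# dupes = find_duplicate_keys("psa_export.txt", ["DOC_NUMBER", "FISCPER"])
# for key, n in dupes.items():
#     print(key, "occurs", n, "times")
```

Any key that comes back with a count above one is a record the DSO uniqueness check would reject.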

  • Hi, I can't see some web photos on my iPad; they show 'unable to load'. I can see some photos, but almost 90% I can't. Please help me clear this issue. Thanks in advance.

    Hi
    My iPad is not able to show some of the photos. I can see them with Google Images and such, but when I am loading some applications or Safari images I get the same issue: first it shows 'loading', then it becomes 'unable to load'. Please help me solve the issue. Thanks.

    When you say that you can't find them anywhere, I assume that you mean that you can't find them on the iPad even though the storage figure says that the photos are there.
    You can try this. I don't know if this will work for you but this was recently discovered as solution for phantom photos. This was copied from a post started by gail from maine
    Delete all of your Recently Deleted photos from your Recently Deleted folder
    Then go to Settings>General>Date & Time, and change the date back to last September when iOS 8 came out, and then go back to the Recently Deleted folder. Although at the Album level it will still say "0" if you tap on the Album, you will be greeted by all of the "Recently Deleted" photos that have surpassed the 30 day limit. Mine went back to when I installed iOS 8.
    She found this in a post where txforever  was the person that discovered how to do this.
    More photos in settings than photos app shows
