Records missing in data load.

Hi All,
We have one load (DSO --> Cube).
Example: in the DSO there are 20 entries for customer number 1.
After uploading the data into the cube, only 2 or 3 records are loaded for this customer.
Even after applying the delete conditions on the source package in the DSO --> Cube transformation it should load 10 entries,
but only 2 or 3 records are updated.
When I debug the DTP load, the correct number of records is shown after the end routine, i.e. 10 records for that customer,
but in the cube only 3 records appear.
What might be the reason?
Is there any chance that records are dropped after the end routine?
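For illustration, by "delete source package conditions" I mean a start-routine filter of this kind (a minimal sketch; the field and value are made up, not our actual condition):

" Start routine of the DSO -> Cube transformation:
" drop the records that should not reach the cube
DELETE SOURCE_PACKAGE WHERE /bic/zstatus <> 'A'.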
Please share your ideas.
Thanks
Krishna.

Hi Asish,
The key figure totals do not match either.
I tried it for that one customer as well,
but it behaves the same (in debugging it shows the correct number of records, but not in the cube),
and there are no filter conditions used in the DTP.
Thanks
Krishna

Similar Messages

  • Duplicate records during master data loading.

    hello guys,
    I am reading a blog where the blogger wrote about 'Various issues in a BW Production project'. I came across one issue which I could not understand:
    a data load failed due to duplicate records during master data loading.
    Why does this error occur? How can we rectify it in a production environment?
    Thanks and Regards,
    S

    Hi SChandx200,
    May I ask where you got "Various issues in a BW production project"?
    Many Thanks,

  • Duplicate records in BW Data Loads

    In my project I am facing duplicate records in data loads when I compare the PSA with the DSO. How can I check which records are duplicates, and is there any mechanism for this, e.g. via an Excel sheet? Please help me out. Thanks in advance for your quick response.

    Hi,
    Getting duplicate records in the PSA is fine, because no keys are set in the PSA and all records come directly from the source.
    In the case of a standard DSO, records are always overwritten, so you would not get any duplicates.
    In case you are getting duplicate records in the PSA and need to find them:
    Go to PSA -> Manage -> PSA maintenance -> change the number of records from 1000 to the actual number of records that have arrived. In the menu, go to List -> Save -> File -> change the path from the SAP directory to some other path and save the file.
    Open the file, take the columns forming the DSO key together and sort ascending; you will find the duplicate records in the PSA.
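    The same check can also be done in ABAP instead of a spreadsheet. A minimal sketch, assuming the PSA records are already in an internal table LT_PSA and KEY1/KEY2 stand for the DSO key fields (all names here are placeholders):

    DATA: lt_check  LIKE lt_psa,
          lv_total  TYPE i,
          lv_unique TYPE i.

    lt_check = lt_psa.
    DESCRIBE TABLE lt_check LINES lv_total.

    " Sort by the DSO key fields so that duplicates become adjacent
    SORT lt_check BY key1 key2.
    DELETE ADJACENT DUPLICATES FROM lt_check COMPARING key1 key2.
    DESCRIBE TABLE lt_check LINES lv_unique.

    " lv_total - lv_unique = number of duplicate records in the PSA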

  • Duplicated records on InfoObject data load

    Hi,
    I have a problem when loading data to the 0UCINSTALLA InfoObject.
    The request goes to a red flag and reports duplicated records in the /BI0/QUCINSTALLA and /BI0/YUCINSTALLA tables.
    I checked the InfoPackage: the "PSA only" checkbox is selected, and the "Continuing..." and "Ignore duplicate records" checkboxes are selected, too.
    If "Ignore duplicated records" is selected, why is the error reported?
    I don't know what to do with this problem.
    Any ideas?
    Thanks for the help.
    Mauricio.

    In the transfer structure, write a start routine that deletes the duplicate records, like this (FIELD1/FIELD2/FIELD3 stand for your key fields):
    " Sort so that records with the same key are adjacent
    SORT DATAPAK BY /bic/field1 DESCENDING /bic/field2 /bic/field3.
    " Keep only the first record of each key combination
    DELETE ADJACENT DUPLICATES FROM DATAPAK COMPARING /bic/field1 /bic/field2.
    Hope it helps.
    Regards

  • Adding leading zeros before data is loaded into DSO

    Hi
    In the PROD_ID field below, leading zeros are missing from some IDs when the data is loaded into BI from SRM. The data type is character, total length 40. If the leading zeros are missing, the DSO activation fails and we have to add them manually in the PSA table. I want to add the leading zeros, if they are missing, before the data is loaded into the DSO. For example, if the value is 1502 there should be 36 zeros in front of it, and if it is 265721 there should be 34 zeros. Only values of length 4 or 6 arrive, so 34 or 36 leading zeros are always needed when zeros are missing.
    Can we use the CONVERSION_EXIT_ALPHA_INPUT function module? As this is a character field, I'm not sure how to use it in that case. Do we need to convert the value to an integer first?
    Can someone please give me sample code? We're using the BW 3.5 data flow to load data into the DSO. Please give sample code and say where the code needs to be written, in the rule type or in the start routine.

    Hi,
    Can you check at InfoObject level which conversion routine it uses?
    Use transaction RSD1, enter your InfoObject and display it.
    At the data source level you can also see whether the external or internal format is maintained.
    If your InfoObject uses the ALPHA conversion routine, it will get the leading zeros automatically.
    Also check in RSA3 how the data arrives from the source.
    If you are receiving this issue for certain records only, then you need to check those records.
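    If you pad the value in the transfer rule yourself, there is no need to convert it to an integer first; CONVERSION_EXIT_ALPHA_INPUT accepts character input. A minimal sketch for a BW 3.5 transfer rule routine (the source field name PROD_ID is taken from your post; adapt it to your transfer structure):

    " The function module pads numeric values with leading zeros up to
    " the length of the receiving field (CHAR40 here):
    " '1502' -> 36 zeros followed by '1502'
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = TRAN_STRUCTURE-prod_id
      IMPORTING
        output = RESULT.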
    Thanks

  • Rate data loader

    Hey, Gurus!
    Does anyone know how many records Oracle Data Loader On Demand sends per package?
    I didn't find anything in the documentation (Data Loader FAQ, Data Loader Overview for R17, Data Loader User Guide).
    thanks in advance
    Rafael Feldberg

    Rafael, there is no upper limit on the number of records that the Data Loader can import. However, after doing a test import using the Import Wizard, I would recommend keeping the number of records at a reasonable level.

  • Missing records while fetching data from Data mart

    Hi,
    I have some missing records in the ODS. The data is fetched from another BW system.
    It is a delta load and all the loads have been successful so far, but still some records are missing.
    In the reconstruction tab, some requests show the transfer structure status as a clock ("transfer structure has changed since the last request"). Is it because of this that the timestamp status changed?
    I have done a re-initialization and the missing data was fetched, but I would like to know the cause of the missing records.
    Regards,
    Anita

    Hi Kedar,
    If there was a timestamp difference, the data load would have failed, and all the delta loads were successful.
    But now they have realised that some records are missing. Just for analysis, I was looking into the reconstruction tab, and the transfer structure status was displayed as a clock,
    although there was actually no change in the transfer structure.
    Sometimes we used to get a timestamp error, and we used to replicate the data source and activate the transfer structure.
    What needs to be done to avoid this in future? This load is triggered through a process chain, and each time the data load status was successful.
    We cannot replicate the data source every time we load through the process chain, unless the transfer structure has changed.
    My concern is whether this is the cause of the missing records, or something else.
    Regards,
    Anita

  • SPM Data Loads: Fewer records getting loaded in the Invoice Inbound DSO

    Dear Experts,
    We are working on a project where data from different non-SAP source systems is loaded into SPM via flat file loads. We came across a very weird situation.
    For the other master and transaction data objects it worked fine, but when we loaded the invoice file, fewer records were loaded into the inbound DSO. The invoice file contained 80,000 records, but the inbound DSO has only 78,500 records; we are losing 1,500 records.
    We are unable to figure out which 1,500 records we are missing. We couldn't find any logs in the inbound invoice DSO, and we cannot tell whether the records are erroneous or something else is wrong. Is there a way to analyze the situation / the inbound invoice DSO?
    If there is an issue with the outbound DSO or the cube, we know it is possible to check it via the data load request, but for the inbound DSO we do not know how to analyze why it is receiving fewer records.
    Regards
    Pankaj

    Hi,
    Yes, this can happen in a DSO: the data records share the same semantic key, so after the key-field aggregation you end up with fewer records. For example, if two file rows carry the same values in all key fields of the DSO, the second overwrites the first, which is how 80,000 rows can become 78,500.
    If you have any routines, check the code for conditions that filter out records.
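    A quick way to verify this is to sort the file data by the DSO key and list the keys that occur more than once. A rough sketch (IT_FILE and the key fields DOC_NO/ITEM_NO are placeholders for your structure):

    DATA: ls_rec  LIKE LINE OF it_file,
          ls_prev LIKE LINE OF it_file.

    SORT it_file BY doc_no item_no.
    LOOP AT it_file INTO ls_rec.
      " same key as the previous row -> the rows collapse into one in the DSO
      IF ls_rec-doc_no = ls_prev-doc_no AND ls_rec-item_no = ls_prev-item_no.
        WRITE: / 'Duplicate key:', ls_rec-doc_no, ls_rec-item_no.
      ENDIF.
      ls_prev = ls_rec.
    ENDLOOP.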
    Regards.

  • Data records missing from 2LIS_13_VDITM

    Hi,
    1.) 2LIS_13_VDITM data is loaded, but certain billing items are missing.
    2.) I figured out how to create a "repair request" to manually load individual records.
    3.) Loading works in BWD (development), where not so many records exist.
    4.) In BWP it does not pick up the individual records I select, since too many data records exist (the table is too big).
    Any idea how we can load the individual records which failed to load from the PSA to the cube on that particular day, without creating duplicate records in BW, even if the requests are compressed?
    Thank you in advance

    There is an interesting thread on this discussion; check it out:
    A BW puzzle for you...I'll Award POINTS!
    OSS note 739863 will give you some idea of how to fix your problem.
    This is applicable if you are missing the data of the 20th and getting all the delta data after that!
    If you are not getting any delta after that, then you need to check your update methods, particularly the collective run if you are using the V3 update.
    Regards,
    Sree

  • Data Load : Number of records count

    Hi Experts,
    I want to document the number of records transferred to BW during an InfoPackage execution.
    I want to automate the process by running a report in the background which fetches, from SAP tables, the number of records transferred by all my InfoPackages.
    I would like to know how to proceed.
    I want to know the system tables which contain the same data that transaction RSMO displays.
    Kindly help with valuable replies.

    Hi,
    In order to get the record counts, you need to create a report based on the tables below:
    RSSELDONE, RSREQDONE, RSLDPIOT, RSMONFACT
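    As a starting point, here is a minimal sketch that lists the request numbers executed by one InfoPackage. It only uses RSSELDONE; the field names are from memory, so verify them in SE11, and see the linked document below for the complete report including the record counts:

    REPORT z_ipak_requests.

    " InfoPackage technical name (assumed field: RSSELDONE-LOGDPID)
    PARAMETERS p_ipak TYPE rsseldone-logdpid.

    DATA: lt_sel TYPE STANDARD TABLE OF rsseldone,
          ls_sel TYPE rsseldone.

    " All requests (RNR) executed by this InfoPackage
    SELECT * FROM rsseldone INTO TABLE lt_sel
      WHERE logdpid = p_ipak.

    LOOP AT lt_sel INTO ls_sel.
      WRITE: / ls_sel-rnr.  " join RNR to RSREQDONE / RSMONFACT for the counts
    ENDLOOP.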
    Check the link below, which explains this in detail, with the report code as well.
    [Data load Quick Stats|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/90215bba-9a46-2a10-07a7-c14e97bdb764]
    This doc also explains how to trigger a mail with the details to all.
    Regards
    KP

  • Data loader : Import -- creating duplicate records ?

    Hi all,
    has anyone else encountered the behaviour with Oracle Data Loader that duplicate records are created (even if I set the option duplicatecheckoption=externalid)? When I check the "import request queue - view", the request parameters of the job look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    but Data Loader has created new records where the "External Unique ID" already exists.
    Very strangely, when I create the import manually (using the Import Wizard), exactly the same import works correctly! There the duplicate checking method works and the record is updated.
    I know the Data Loader has two methods, one for update and the other for import; however, I do not expect the import to create duplicates if the record already exists, rather than doing nothing!
    Is anyone else experiencing the same? I hope that this is not expected behaviour! By the way, the Update method works fine.
    Thanks in advance, Juergen

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior --- Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
    You should review all documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete

  • Data Loader inserting duplicate records

    Hi,
    There is an import that we need to run every day in order to load data from another system into CRM On Demand. I have set up a Data Loader script which is scheduled to run every morning. The script performs the insert operation.
    Every morning a file with new insert data is available in the same location (generated by someone else) and with the same name. The Data Loader script must insert all the records in it.
    One morning there was a problem in the other job and a new file was not produced. When the Data Loader script ran, it found the old file and re-inserted the records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external ID, since the records come from another system, but I came to know that this option works for update operations only.
    How can a situation like this be handled in future? The external ID should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to somehow specify the field as "unique" in the UI, so that there is an error if a duplicate record is inserted? Please suggest.
    Regards,

    Hi
    You can use something like this:
    CURSOR crs IS
      SELECT DISTINCT deptno, dname, loc
      FROM dept;
    Now you can insert all the records present in this cursor.
    Assumption: you do not have duplicate entries in the dept table initially.
    Cheers
    Sudhir

  • Data Loader - Only imports first record; remaining records fail

    I'm trying to use Data Loader to import a group of opportunities. Every time I run the Data Loader, it only imports the first record. All the other records fail with the message "An unexpected error occurred during the import of the following row: 'External Unique Id: xxxxxxx'". After running the Data Loader, I can modify the data file and remove the first record that was imported. By running the Data Loader again, the first row (previously the second row) imports successfully.
    Any idea what could be causing this behavior?

    We need a LOT more information, starting with the OS, and the version of ID, including any applied patches.
    Next we need to know whether you are doing a single record per page or multiple records, whether the placeholders are on the master page, how many pages are in the document and whether they all have fields on them (some screen captures might be useful; embed them using the camera icon on the editing toolbar on the webpage rather than attaching, if it works [there seem to be some issues at the moment, though only for some people]).
    What else is on the page? Are you really telling it to merge all the records, or just one?
    You get the idea: give a full description of what you have, what you are doing, and what you get instead of what you expect.

  • Number of records in a full load doesn't square with the number in an init with data transfer

    Hello to all BI/FI SDNers,
    FI AP and AR extractors like 0FI_AR_3 and 0FI_AP_3 pull 1785 records to the PSA when loaded full, but an init with data transfer for the same pulls only 1740 records to the PSA.
    I am very sceptical that even a delta after a repair full will bring all the records.
    Which update methodologies are qualified for AP and AR?
    The OSS notes I found really don't answer my concern here. Please comment, SDNers!

    Somehow it worked after redoing it!

  • Zero Record Data Load Problem

    Hi,
    Please give your suggestions for the following problem.
    We are loading data from ETL (flat file, DataStage) into SAP BW 3.1.
    The data may contain zero records. When we try to push the data into BW, the ETL side shows a successful data transfer, but the BW side stays in the "processing" state (yellow light) and all BW resources hang.
    When we then try to send another data load from the ETL side, we cannot push the data, because the BW resources are still held by the previous process.
    Whenever we get this kind of problem, we kill the process and continue with another data re-load, but this is not a permanent solution, and it is happening more and more often.
    What is the solution for this problem?
    One of my colleagues made the following suggestion. Shall I consider it?
    Summary: when loading empty files, data may remain in the processing state in BW.
    Details: when a user loads an empty file (it must be truly empty and must not contain any line returns; the user can check the data file in binary mode), the data is loaded into BW with 0 records. BW will be in the yellow (processing) state with 0 records showing, and in the PSA inside BW one data packet will appear with nothing inside. Depending on how the system is configured, the BW server can either accept the 0-record packet or deny it. If the BW server is configured to accept it, the load request changes to the green (finished) state; if it is configured to deny it, the load request stays in the yellow state.
    Please give me your suggestions.
    Thanks in advance.
    Regards,
    VPR

    hi VPR,
    have you tried setting the traffic-light evaluation?
    Go to the monitor of one request, menu Settings -> Evaluation of Requests (Traffic Light); in the next screen, "Evaluation of Requests", for "If no data is available in the system, the request" choose the option "is judged to be successful" (green).
    Also set the delta load to complete when there is no delta data.
    Hope this helps.
