Duplicate records in cube.

Hi Experts,
I have checked my PSA and it contains no duplicate records, but when I load the data to the cube I get duplicate records in the cube.
Can anyone help me with this?

Hi Satish,
Please check in R/3.
You said it is a delta load, so go to RSO2, select the required DataSource and press Enter.
There, click on the Generic Delta tab and check the settings you have given:
Safety interval lower limit: give a suitable value here, so that the system knows exactly from where the data has to be loaded and duplicate records are eliminated.
Then check the two options below:
New status for changed records
Additive delta
Select the appropriate one for your target.
If helpful, please try this.
Regards
Swathi
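
For context: the safety interval lower limit shifts the delta selection window back, so records in the overlap are extracted in two consecutive requests. Whether that creates duplicates depends on the delta type: "New status for changed records" needs an overwriting target (a DSO), while "Additive delta" sends only the difference and can be summed in a cube. A minimal sketch of the overlap, with hypothetical table and column names:

-- Hypothetical delta selection: the last delta ran up to :last_ts.
-- With a lower safety interval of 1800 seconds the extractor re-reads
-- that window, so rows changed there arrive in two requests.
SELECT doc_no, amount, changed_at
FROM   sales_docs                       -- hypothetical source table
WHERE  changed_at >  :last_ts - 1800    -- lower limit shifted back
AND    changed_at <= :current_ts;
-- Loaded additively into a cube, the overlap rows are double-counted;
-- a DSO with overwrite (or a true additive delta) absorbs them.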

Similar Messages

  • Getting duplicate records in cube from each data packet.

    Hi Guys,
    I am using the BI 3.x version and I am getting duplicate records in the cube. To delete these duplicate records I have written code in a start routine, but it still gives the same result.
    The duplication depends on the number of data packets.
    E.g.: if the number of packets is 2, I get 2 duplicate records;
    if the number of packets is 7, I get 7 duplicate records.
    How can I modify my code so that it keeps only one record and eliminates the duplicates? Any other solution is also welcome.
    Thanks in advance.

    Hi Andreas, Mayank,
    Thanks for your replies.
    I created my own DSO, but it gives an error, and I tried with the standard DSO too; it still gives the same "could not activate" error.
    The error names the function module RSB1_OLTPSOURCE_GENERATE.
    I searched in R/3 but could not find it.
    Even DSOs I created on a trial basis give the same problem.
    I think it is a problem on the BASIS side.
    Please help if you have any idea.
    Thanks.

  • How to delete duplicate records in cube

    Hi,
    Can you help me with how to delete the duplicate records in my cube?
    And can you tell me some predefined cubes and DataSources for the MM and SD modules?

    Hi Anne,
    about "duplicate records" could you be more precise?.
    The must be at least one different Characteristic to distinguish one record from the other (at least Request ID). In order to delete Data from InfoCubes (selectively) use ABAP Report RSDRD_DELETE_FACTS (be carefull it does not request any confirmation as in RSA1 ...).
    About MM and SD Cubes see RSA1 -> Business Content -> InfoProvider by InfoAreas. See also for MetadataRepository about the same InfoProviders.
    About DataSources just execute TCode LBWE in you source sys: there you see all LO-Cockipt Extrators.
    Hope it helps (and if so remember reward points)
    GFV

  • Duplicate records in Cube Level ( BI 7.0 )

    Dear All
    I am working on BI 7.0 and have an issue: I am loading data from a flat file to an ODS and from the ODS to a cube. In the ODS we selected the Overwrite option; at cube level we have the Summation option. The load from the flat file to the ODS is fine, and the load from the ODS to the cube also runs fine, but in the cube I am getting duplicate records.
    What are the best options to go ahead in such a situation?
    Regards
    KK

    I am sharing a case that occurred for me; please see if it applies to you.
    Sometimes, when any kind of problem occurs in the step that loads the cube, we restart the load. If the cube load prompts "the last load was unsuccessful ... reload?", this problem may occur: it loads the records from the previous load as well.
    Verify what has been duplicated by comparing the ODS change log table with the cube load record count. If the number of records updated is the total of the records across the different ODSR requests in the change log table, delete the previous load in the cube (provided no other side effect is produced, e.g. from a start routine).
    Cheers.
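
    A hedged sketch of that check at database level; the change log table name is system-generated, so the one below is a placeholder (look the real one up via the ODS object's Manage screen):

    -- Count change log records per request (table name is a placeholder).
    SELECT request, COUNT(*) AS recs
    FROM   "/BIC/B0000123000"
    GROUP  BY request
    ORDER  BY request;
    -- If the rows added to the cube equal the total over several ODSR
    -- requests rather than the latest one alone, the restart re-posted an
    -- earlier request, and that cube load should be deleted.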

  • Delete overlapping/duplicate records from cube

    Hi All,
    Kindly let me know how to delete overlapping requests from a cube. The cube is loaded from various InfoSources, but some records get duplicated and are not wanted, so how do I delete the duplicate records from the cube?
    Regards,
    dola

    I think what Arun said is perfectly right:
    use a DSO for consolidation of the various requests from the different InfoSources,
    then load from the DSO to the cube. It is very much possible, though it will require a little work. A sketch of the overwrite semantics follows below.
    The "delete duplicate records" option is usually used for master data; with transaction data I don't think it is advisable.
    Regards,
    RK
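
    For illustration, a DSO with overwrite behaves like a key-based upsert: however often the same logical key arrives, only the last version survives, so the subsequent cube load carries no duplicates. A minimal SQL sketch of that semantics, with entirely hypothetical table and column names:

    -- Collapse the incoming request to one row per key (last record wins),
    -- then upsert into the DSO-like table: this is overwrite semantics.
    MERGE INTO sales_dso d
    USING (SELECT doc_no, item,
                  MAX(amount) KEEP (DENSE_RANK LAST ORDER BY record_no) AS amount
           FROM   incoming_request
           GROUP  BY doc_no, item) s
    ON (d.doc_no = s.doc_no AND d.item = s.item)
    WHEN MATCHED     THEN UPDATE SET d.amount = s.amount
    WHEN NOT MATCHED THEN INSERT (doc_no, item, amount)
                          VALUES (s.doc_no, s.item, s.amount);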

  • Duplicate Records in COPA cube

    Dear Friends,
    We have two BW systems with one R/3 as the source system. There are identical COPA cubes in both BW systems. We created the COPA cube in the second BW system recently and did a full load. When I validate the data between the two COPA cubes, I see duplicate records in the second COPA cube. I am wondering why this is so. Is there any setting on the R/3 side that I am missing, or anything else? Any ideas?
    Thanks
    Raj

    Hi,
    I am also facing the same problem: in ZCOPA_C01_Q06 I am getting double values. Kindly help with the steps, as I am new to BI.
    Regards
    Amarendra

  • How to get rid of duplicate records generated from a hierarchical cube in SQL?

    Hi All,
    Database version: 10gR2.
    I am trying to aggregate data for two hierarchical dimensions, specifically organization and product.
    I am using one ROLLUP for each dimension, i.e. two ROLLUPs in the GROUP BY clause, to do the aggregation for every level of organization and product included in the hierarchy.
    The troubling part is that products that have data in the corresponding fact table are not always located at the lowest level (which is 6) of the product hierarchy.
    e.g.
    product_id                                            level
    0/01/0101/010102/01010201                             5    --> 01010201, at level 5, has data in the fact table
    0/01/0101/010103                                      4    --> 010103, at level 4, has data in the fact table as well
    0/02/0201/020102/02010203/0201020304/020102030405     6    --> at level 6 (the lowest level), has data in the fact table

    We have a flat product hierarchy stored in a table as below:

    prod_id    up_code_1  up_code_2  up_code_3  up_code_4  up_code_5  up_code_6
    01010201   0          01         0101       010102     01010201   NULL
    010103     0          01         0101       010103     NULL       NULL

    Due to the NULL levels (up_code_6 for 01010201), when I run the query below, one duplicate record will be generated; for 010103 there will be 2 duplicate records, and for 020102030405 there will be none.
    I encounter the same issue with the organization dimension.
    Currently I am using DISTINCT to get rid of the duplicate records, but it doesn't feel right to do it this way.
    So I wonder: is there a more formal and standard way to do this?
    select distinct ORG_ID, DAY_ID, TRADE_TYPE_ID, CUST_ID, PRODUCT_ID, QUANTITY_UNIT, COST_UNIT, SOURCE_ID,
           CONTRACT_AMOUNT, CONTRACT_COST, SALE_AMOUNT, SALE_COST, ACTUAL_AMOUNT, ACTUAL_COST, TRADE_COUNT
    from (
      select coalesce(UP_ORG_ID_6, UP_ORG_ID_5, UP_ORG_ID_4, UP_ORG_ID_3, UP_ORG_ID_2, UP_ORG_ID_1) as ORG_ID,
             a.DAY_ID as DAY_ID,
             a.TRADE_TYPE_ID as TRADE_TYPE_ID,
             a.CUST_ID,
             coalesce(UP_CODE_6, UP_CODE_5, UP_CODE_4, UP_CODE_3, UP_CODE_2, UP_CODE_1) as PRODUCT_ID,
             QUANTITY_UNIT,
             COST_UNIT,
             a.SOURCE_ID as SOURCE_ID,
             sum(CONTRACT_AMOUNT) as CONTRACT_AMOUNT,
             sum(CONTRACT_COST) as CONTRACT_COST,
             sum(SALE_AMOUNT) as SALE_AMOUNT,
             sum(SALE_COST) as SALE_COST,
             sum(ACTUAL_AMOUNT) as ACTUAL_AMOUNT,
             sum(ACTUAL_COST) as ACTUAL_COST,
             sum(TRADE_COUNT) as TRADE_COUNT
      from DM_F_LO_SALE_DAY a, DM_D_ALL_ORG_FLAT b, DM_D_ALL_PROD_FLAT d --, DM_D_LO_CUST e
      where a.ORG_ID = b.ORG_ID
        and a.PRODUCT_ID = d.CODE
      group by rollup(UP_ORG_ID_1, UP_ORG_ID_2, UP_ORG_ID_3, UP_ORG_ID_4, UP_ORG_ID_5, UP_ORG_ID_6),
               a.TRADE_TYPE_ID,
               a.DAY_ID,
               a.CUST_ID,
               rollup(UP_CODE_1, UP_CODE_2, UP_CODE_3, UP_CODE_4, UP_CODE_5, UP_CODE_6),
               a.QUANTITY_UNIT,
               a.COST_UNIT,
               a.SOURCE_ID );

    Note: GROUPING_ID does not seem to help; at least I didn't find it useful in this scenario.
    Any recommendations, links or ideas would be highly appreciated, as always.
    Thanks

    Has anyone ever encountered this kind of problem?
    Any thoughts would be appreciated.
    Thanks
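
    A hedged sketch of the usual alternative to DISTINCT: filter the redundant rollup rows with the GROUPING function, which returns 1 for a column that has been rolled up and 0 for a column still grouped by. A level column that is still grouped by must carry a real value; when the flat hierarchy stores NULL at that level, the detail row merely repeats its own subtotal and can be dropped. This assumes facts attach only at the deepest non-NULL level of each branch, as in the sample data. The product rollup is shown trimmed to one key figure; the same HAVING pattern applies to the UP_ORG_ID_* rollup:

    -- Keep a rollup row only if every level still grouped by (GROUPING = 0)
    -- carries a real value; rows whose detail level is NULL in the flat
    -- hierarchy duplicate their own subtotal and are filtered out.
    select coalesce(UP_CODE_6, UP_CODE_5, UP_CODE_4,
                    UP_CODE_3, UP_CODE_2, UP_CODE_1) as PRODUCT_ID,
           sum(SALE_AMOUNT) as SALE_AMOUNT
    from   DM_F_LO_SALE_DAY a, DM_D_ALL_PROD_FLAT d
    where  a.PRODUCT_ID = d.CODE
    group  by rollup(UP_CODE_1, UP_CODE_2, UP_CODE_3,
                     UP_CODE_4, UP_CODE_5, UP_CODE_6)
    having (grouping(UP_CODE_2) = 1 or UP_CODE_2 is not null)
       and (grouping(UP_CODE_3) = 1 or UP_CODE_3 is not null)
       and (grouping(UP_CODE_4) = 1 or UP_CODE_4 is not null)
       and (grouping(UP_CODE_5) = 1 or UP_CODE_5 is not null)
       and (grouping(UP_CODE_6) = 1 or UP_CODE_6 is not null);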

  • How to delete duplicate record in Query report

    Hi Experts,
    I have created an InfoSet and a query in my SAP system, but I want to delete some duplicate records before the list output. Can we add some code in the Extras code section to delete the duplicates? And how? Please give me a brief example.
    Joe

    Hi,
    You can try to restrict the characteristic values in the filter area of the Query Designer so that the query gives the correct result.
    But I would still suggest that you do not keep the duplicate records in the cube, since they are not required and give you wrong results.
    Reload the correct records into the cube in order to avoid such problems in the future.
    Regards,
    Amit

  • Duplicate record with same primary key in Fact table

    Hi all,
    Can the fact table have duplicate records with the same primary key? When I checked a cube, I could see records with the same primary-key combination but different key figure values. My cube has 6 dimensions (including Time, Unit and Data Package) and 2 key figures, so 6 fields combine to form the composite primary key of the fact table. When I checked the records in SE16, I could see duplicate records with the same primary key. There is no parallel loading happening for the cube.
    BW system version: 3.1
    Database: Oracle 10.2
    I am not sure how this is possible.
    Regards,
    PM

    Hi Krish,
    I checked the data packet dimension also. Both records have the same dimension ID (141); except for the key figure values there is no other difference between the fact table records. I know this is against the basic DBMS primary-key rule, but I have records like this in the cube.
    Can this situation arise when the same record is in different data packets of the same request?
    Thx,
    PM
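
    For what it's worth, the F fact table typically has only a non-unique index over the dimension keys rather than an enforced primary key, so the database will not reject such rows. A hedged sketch for locating them at database level; the names follow the usual /BIC/F<cube> and KEY_<cube><dim> pattern but are placeholders for a cube called ZCUBE:

    -- Find dimension-key combinations that occur more than once in the
    -- F fact table (all names are placeholders for a cube named ZCUBE).
    SELECT key_zcubep, key_zcubet, key_zcubeu,
           key_zcube1, key_zcube2, key_zcube3,
           COUNT(*) AS dup_rows
    FROM   "/BIC/FZCUBE"
    GROUP  BY key_zcubep, key_zcubet, key_zcubeu,
              key_zcube1, key_zcube2, key_zcube3
    HAVING COUNT(*) > 1;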

  • Duplicate data in cube

    Hi,
    We have 2 reports, and for the same dates there is a difference between them: one is showing exactly double the data.
    Can anybody help me check whether the cube contains double data,
    and explain how to check the contents of a cube?
    Thanks
    Rajini

    Hi Rajini,
    As everybody said, you can use transaction LISTCUBE: give the cube's technical name and execute it. Then click on the field selection for output and select the characteristics you want. Since you say the data is doubled, don't select all the fields; you may want to select fields like 0CALDAY and the characteristics through which you can track down the double records. Then execute it and you will find your data.
    My guess is that since the data is doubled, there are duplicate records, i.e. the data was loaded twice. So once you are done checking the data, and if you are sure the same data was loaded twice, delete one of the two request IDs.
    Before loading data the next time, there is an option in the DTP where you have to tick a checkbox; the message next to it is something like "Avoid Duplicate Records". It will avoid loading the same data twice.
    Guess this should solve your problem.
    Guru
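
    As a complement to LISTCUBE, a hedged database-level sketch that totals one key figure per load request: a doubled load shows up as two requests with identical totals. The package dimension table carries the request SID; all names below are placeholders for a cube named ZCUBE:

    -- Sum a key figure per request via the package dimension (placeholders).
    SELECT p.sid_0requid  AS request_sid,
           SUM(f.amount)  AS total_amount,
           COUNT(*)       AS rows_loaded
    FROM   "/BIC/FZCUBE" f,
           "/BIC/DZCUBEP" p
    WHERE  f.key_zcubep = p.dimid
    GROUP  BY p.sid_0requid
    ORDER  BY p.sid_0requid;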

  • Duplicate records: Process : Delete Overlapping Requests from InfoCube

    Hi Experts,
    We are loading data into a standard costing cube with the standard Full upload option. In our process chain we have included the process type "Delete Overlapping Requests from InfoCube". In our scenario we always load yesterday's and today's data: after loading yesterday's data, we need to check and delete the overlapping requests and then upload today's data.
    Many times this deletion process fails with the message "Could not lock cube" because the cube is already locked by user ALEREMOTE. This causes the system to duplicate the records in the cube.
    How can we avoid this?
    Alok

    I tried running it again and it failed again. I checked SM12 and found this entry:
    800  ALEREMOTE  08/14/2007  E  RSENQ_PROT_ENQ  CREA_INDX  ZCCA_C11  DATATARGET  CREA_INDX
    This lock has not been released since the 14th. Is there a way to remove the lock using some process?

  • Duplicate Records error when processing transaction file....BPC 7.0

    Hi All,
    I have a situation: I am using BPC NW 7.0 and I have updated my dimension files. When I try to validate my transaction file, every single record is validated successfully, but when I try to import the flat file into my application I get a lot of duplicate-record errors. These are my questions:
    1. Can we get duplicate records in transaction files?
    2. Even if there are duplicates, since it is a cube, shouldn't it summarize them rather than flagging an error and rejecting the records?
    3. Is there something I can do to accept duplicates? (I have checked the Replace option in the data package to overwrite similar records, but it applies only to account, category and entity.)
    4. In my case I see identical values in all my dimensions and the $ value is the only difference. Why is it not summing up?
    Your quickest reply is much appreciated.
    Thanks,
    Alex.

    Hi,
    I have the same problem.
    In my case the file that I want to upload has rows that differ only in the nature column, and in the conversion file I map the different natures to one internal nature, e.g.:
    cost1 --> cost
    cost2 --> cost
    cost3 --> cost
    My hope was that in BPC the nature cost would take the result cost = cost1 + cost2 + cost3.
    The result is that only the first record is uploaded and all the other records are rejected as duplicates.
    Any suggestions?
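
    One common workaround (an assumption here, not something confirmed in this thread) is to pre-aggregate the file so that each key combination occurs only once after conversion; in SQL terms, with a hypothetical staging table:

    -- Aggregate the staged file by the converted (internal) nature so each
    -- dimension-key combination occurs once before the BPC import.
    SELECT entity, category, account,
           'cost'           AS nature,       -- cost1/cost2/cost3 map here
           SUM(signed_data) AS signed_data
    FROM   staged_upload                     -- hypothetical staging table
    WHERE  nature IN ('cost1', 'cost2', 'cost3')
    GROUP  BY entity, category, account;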

  • Loacate and remove duplicate records in infocube.

    Hi!
    We have found that the InfoCube 0PUR_C01 contains duplicate records for April 2008; approx. 1.5 lakh records were extracted into this InfoCube, and similar situations may be occurring in the subsequent months.
    How do I locate these records and remove them from the InfoCube?
    How do I ensure that duplicate records are not extracted into the InfoCube?
    All answers/ links are welcome!!
    Yours Truly
    K Sengupto

    First:
    1. How do I locate duplicate records in an InfoCube, other than downloading all the records to an Excel file and using Excel functionality to locate them?
    This is not really possible, since an exact duplicate would not exist: records are sent to a cube with + and - signs that summarize the data accordingly, so your search for duplicate data becomes that much more troublesome.
    If you have a DSO to load it from, delete the data for that month and reload if possible; this is quicker and cleaner than removing duplicate records.
    If you had
    ABC|100 in your DSO and it got doubled,
    it would be
    ABC|+100
    ABC|+100
    against different requests in the cube, and added to this will be your correct deltas as well.

  • Duplicate records in the InfoCube: how should I fix them?

    Hi All,
    We have different values between R/3 and BW. In a first check I found duplicate records in the cube. How should I check and fix this problem step by step?
    The InfoCube receives data from 7 ODS objects.
    Let me know if you need further detail about our data model or loads.
    thanks a lot for your help
    Bilal

    Hello All,
    Please, I need further detail so that I don't make critical errors in the cube.
    When I check the data in my InfoCube (right click ==> view data) with this selection:
    0GL_ACCOUNT = R060501950
    0PSTNG_DATE = from 01.01.2009 to 31.03.2009
    I find duplicate records across all of this info:
    0GL_ACCOUNT, 0CO_DOC_NO, 0DOC_DATE, 0PSTNG_DATE, 0COORDER, 0FISCPER ... and all the key figures.
    To delete these duplicate records I have to make selections via Manage ==> Contents tab ==> Selective Deletion (in the corner on the right). At this step, what should I do?
    Do I start it in the background, or can I simply choose "Selective Deletion"?
    For this selective deletion, what information do I have to enter for the problem explained above:
    0GL_ACCOUNT = R060501950
    0PSTNG_DATE = from 01.01.2009 to 31.03.2009
    If I enter this and execute, which records will the system delete: all the records matching this selection, or only the duplicate records?
    Thanks a lot for your help
    Bilal
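
    On the semantics being asked about: selective deletion removes every record that matches the selection, originals and duplicates alike; it is not a de-duplication, and the deleted slice must be reloaded afterwards. Purely as an illustration, against a hypothetical flat view of the cube (not how the deletion is actually executed):

    -- Selective deletion deletes ALL rows in the selection, not just the
    -- duplicates; reload the slice from the source afterwards.
    DELETE FROM cube_facts                  -- hypothetical flat view
    WHERE  gl_account = 'R060501950'
    AND    pstng_date BETWEEN DATE '2009-01-01' AND DATE '2009-03-31';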

  • Duplicate records in delta load? Please help! (will assign points)

    Hi all,
    I am extracting payroll data with DataSource 0HR_PY_1 for the 0PY_C02 cube.
    I ran a full load with selection criteria 01.2007 to 02.2007 in the InfoPackage and extracted 20,000 records, then
    I ran an init of delta without data transfer and extracted 0 records, as expected.
    Then I ran a delta with selection criteria 02.2007 to 01.2010 in the InfoPackage and extracted 4,500 records, in which the February records were extracted again.
    What could be the reason for duplicate records occurring in the delta load?
    I have seen the same records in the full load with selection criteria 01.2007 to 02.2007 as well as in selection criteria 02.2007 to 01.2010. What is happening, and how is it possible?
    Actually, the DataSource 0HR_PY_1 does not support delta. Apart from this, what other reasons are there for duplicate records to occur? Please help!
    Will assign points.

    Your selection criteria:
    01.2007 to 02.2007, and then 02.2007 to 01.2010.
    Both of your selections include the month 02.2007,
    so all records for 02.2007 fall under both loads.
    Have you checked that?
    Regards,
    Naveen Natarajan
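
    Naveen's point in a nutshell: the two period selections share the boundary month, so the full load and the second load both select the February rows, which are then summed in the cube. A toy illustration with a hypothetical source table:

    -- Both InfoPackage selections include month 2007/02, so its rows are
    -- extracted by each load and land twice in an additively updated cube.
    SELECT COUNT(*) FROM payroll_results    -- hypothetical source table
    WHERE  calmonth BETWEEN '200701' AND '200702';   -- full load selection

    SELECT COUNT(*) FROM payroll_results
    WHERE  calmonth BETWEEN '200702' AND '201001';   -- second load selection
    -- Fix: make the ranges disjoint (e.g. start the second at '200703').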
