Regarding allowing duplicate records

Hi,
   We are getting duplicate records from the R/3 system, and we need these duplicate records in the cube. Which option do we have to select at the cube to allow duplicates?
   Can anyone please let me know?

Hi,
Using T-code RSKC you can allow the BW system to accept special characters in the data coming from the source system. The list of characters can be obtained after analyzing the source system, or you can confirm it with the client.
If it is very important, go to RSKC and edit…; if it is not required, go to the PSA and delete…
Thanks
sudhakar.

Similar Messages

  • Does the fact table allow duplicate records?

    Does the fact table allow duplicate records?

    What do you mean by duplicate records? It could be that what appears to be a duplicate to you has some other technical key information that differs in the fact table.
    At the technical level, there shouldn't be duplicate records in the fact table [in SAP's BW/BI there are two fact tables for each cube, the F and E tables, which can itself cause some confusion].

  • Duplicate records should not be allowed to be entered

    hi
    i have three columns:
    sno Number;
    in time date;
    out time date;
    This is a multi-record tabular layout.
    The user enters values at run time like this:
    1 01:15 03:00
    1 01:15 03:00
    Here, for the second record, the user should not be allowed to enter the same Sno for the same In time and Out time.
    Before entering the data, the total number of records is 0.
    Can you give me any suggestion on how to validate the columns?

    If you have a unique constraint on the table for these three columns, it will prevent duplicate records.
    "1 01:15 03:00 Here second record User should not allow to enter same Sno for Same In time and End time"
    By "second record" do you mean a record entered by a user in a separate Forms session? If so, there is no way to check whether the user in session 2 is entering a duplicate of a record entered by a user in session 1, except through the use of a unique constraint on the table.
    There are methods available for checking for duplicates within the same Forms session. Check out "Avoid duplicates in the same block" and "Forms: Record Group Processing (Duplicate Value Checking)" for more information.
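    As a rough sketch, the composite unique constraint could look like this (the table and column names here are hypothetical):

        -- Hypothetical table; the composite unique constraint rejects a
        -- second row with the same sno, in_time and out_time combination.
        ALTER TABLE time_entries
          ADD CONSTRAINT uq_time_entries UNIQUE (sno, in_time, out_time);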
    Hope this helps,
    Craig B-)
    If someone's response is helpful or correct, please mark it accordingly.

  • SQL*Loader control file parameters to avoid loading duplicate records

    Hi All,
    I am looking for an option in the control file that restricts SQL*Loader from loading duplicate records. I know that if we apply a constraint on the table itself, it won't allow duplicate data to load and will reject it into the bad file, but I have to do it in the control file so that SQL*Loader avoids loading duplicate records. Can you please suggest which option in the control file enables this?
    Can you please also tell me the exact difference between the bad file and the reject file? In what scenarios will it write to the reject file or to the bad file?
    Regards

    Hey,
    I don't think there is any option to avoid loading duplicate records using a parameter in the control file.
    Roughly, the distinction between the files is: records rejected by SQL*Loader or by the database (conversion errors, constraint violations) go to the bad file, while records filtered out by a WHEN clause in the control file go to the discard file. For more detail try this link:
    http://www.exforsys.com/content/view/1587/240/
    Regards,
    Sushant
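    A common workaround is a unique constraint on the target table, so that during a conventional-path load the database rejects the duplicates into the bad file; a minimal sketch (the table and column names are assumptions):

        -- Hypothetical staging table; duplicate order_id values now violate
        -- the constraint and are written to the bad file during the load.
        ALTER TABLE stage_orders
          ADD CONSTRAINT uq_stage_orders UNIQUE (order_id);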

  • Import Transaction Data - Duplicate records

    Hi,
    I need to upload a file of transaction data into BPC using a Data Manager package. I've done the transformation and conversion files, which validate successfully on a small data set. When I try to upload the real-life data file, it fails due to duplicate records. This happens because multiple external IDs map to one internal ID. Therefore, whilst there are no duplicates in the actual file produced by the client, the data produced after conversion does contain duplicates and will therefore not upload.
    Apart from asking the client to perform the aggregation before sending me the file, is there any way to get BPC to allow the duplicates and simply sum up?
    Regards
    Sue

    Hi,
    Try adding the delivered package /CPMP/APPEND and run it. This should solve your problem.
    Thanks,
    Sreeni

  • Duplicate Records in InfoProvider

    Hi,
    I am loading transaction data from flat files into the DataSources.
    Initially I have one request (data from one flat file) loaded into the PSA and the InfoCube, with say 100 records.
    Later, I loaded another flat file into the PSA with 50 records (without deleting the initial request). Now in the PSA I have 150 records.
    But I would like to load only the 50 new records into the InfoCube. When I execute the DTP, it loads all 150 records, so the cube would then contain 100 (initial records) + 150 = 250 records in total.
    Is there any option by which I can avoid loading the duplicate records into my InfoCube?
    I can see an option "Get Data by Request" in the DTP. I tried checking it, but no luck.
    How can I solve this issue, and what exactly does the "Get Data by Request" check do?
    Thanks,

    Hi Sesh,
    There is an option in the DTP with which you can load only the new records. I think you have to select the "Do not allow Duplicate records" radio button (I guess)... then try to load the data... I am not sure, but you can look for that option in the DTP...
    Regards,
    Kishore

  • What is the use of 'Ignore Duplicate Records'?

    Hi gurus,
    What is the use of 'Ignore Duplicate Records'? Will it not allow duplicate records when you are loading master data?
    Actually, master data will not have duplicate records, so why do we use this option? And what will happen if we do not select the checkbox?
    Does it support only flat files, or R/3 source systems too? If it supports flat files, please tell me the procedure for both.
    Thanks
    Reddy

    Hi,
    If you check 'Ignore Duplicate Records', the system will not terminate the load when duplicate records arrive; the last record with a given key is kept.
    Master data itself cannot store duplicate records. The option is useful in scenarios where the source can deliver the same key more than once within a request.
    regards
    SR

  • DELIVERY OF DUPLICATE RECORDS?

    hi friends,
    Under the General Info tab of a DataSource in BI there is "delivery of duplicate records", with the options
    undefined, allow, none.
    In my view 2 options would be enough: allowed or not allowed.
    What is the purpose of "undefined" and "none"?
    regards
    suneel.

    This indicator gives information on how the DataSource behaves within a request with regard to duplicate records:
    ' ' The status of the DataSource is unknown.
    '0' The DataSource does not deliver any duplicate records within a request, with reference to its key.
    '1' The DataSource can deliver duplicate records within a request, with reference to its key. However, no duplicate records are delivered within a single data package.
    This indicator is particularly important for delta-capable attribute tables and text tables.
    For the settings '0' and '1' you also have to define a key for the DataSource. This can be done either in the DDIC, using the maintenance for the corresponding field property of the extract structure fields, or (alternatively or additionally) in the metadata of the DataSource. A field in the DataSource also has the additional attribute 'DataSource Key Field', which transfers or corrects the DDIC property where necessary.
    Use
    DataSources can, for a key, transfer time-independent master data or time-independent texts from multiple data requests in one request to BW. If data records within a request are transferred to BW more than once, in some circumstances this can be application-related and is therefore not considered an error. BW provides functionality for handling duplicate data records that can deal with such ambiguity.
    Dependencies
    The DataSource transfers the information on whether it may deliver duplicate data records. This information is given to the scheduler when creating new InfoPackages. In the scheduler you can determine how the system responds to duplicate data records.

  • Duplicate records to DSO

    Hello Friends,
    We have an issue with duplicate records in a DSO; let me explain the scenario.
    The header and detail data are loaded to separate DSOs,
    and the data of these 2 DSOs should get merged in a third one.
    The key fields are:
    DSO 1: DWRECID, 0AC_DOC_NO
    DSO 2: DWRECID, DWPOSNR
    DSO 3 will fetch data from the above 2.
    Its key fields are:
    DWTSLO,
    DWRECID,
    DWEDAT,
    AC_DOC_NO,
    DWPOSNR,
    0CUSTOMER
    Now the data should be merged into a single record in the 3rd DSO.
    DSO 1 does not have the DWPOSNR object in its data fields either.
    We even have a start routine for the data from DSO 1 to populate some values in the result fields from DSO 2.
    Please provide any inputs you have on merging the data record-wise,
    and also give me all the possibilities or options we have to overwrite the data, apart from the mappings.

    Hi,
    You should go for creating an InfoSet instead of creating a third DSO.
    In the InfoSet, provide the keys of the DSOs, and the common records with those keys will be merged in the InfoSet.
    Hope it helps.
    Regards
    Praeon

  • How to delete Duplicate records in IT2006

    Dear Experts,
    We have a situation where we have duplicate records with the same start and end dates in IT2006 (Absence Quotas). This is because of an incorrect configuration, which we have now corrected, but we need to do a clean-up of the existing duplicate records. Any idea on how to clean them up? I ran report RPTKOK00 to find these duplicates, but I could not delete the duplicate/inconsistent records using report RPTBPC10 or HNZUPTC0; I could only delete the deductions made against the record.
    Is there any standard report or any other means of deleting the duplicate records created in IT2006?
    Thanks in advance for all your help.
    Regards
    Vignesh.

    You could probably use SE16N to identify the duplicates and create the list of quotas to delete, and you could probably use transaction LSMW to write up a script to delete them, but be aware that you can't delete a quota if it has been deducted from.
    You'd have to delete the absence/attendance first, then delete the quota, then recreate the absence/attendance.
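    To spot the duplicates, a query along these lines could help (a sketch only; PA2006 is the transparent table behind IT2006, and the exact key fields to compare are an assumption here):

        -- List quota records that share the same employee, subtype and
        -- validity period more than once (field names assumed).
        SELECT pernr, subty, begda, endda, COUNT(*) AS cnt
        FROM   pa2006
        GROUP  BY pernr, subty, begda, endda
        HAVING COUNT(*) > 1;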

  • How to delete duplicate records in a query report

    Hi Experts,
    I have created an InfoSet and a query in my SAP system, but I want to delete some duplicate records before the list output. Can we add some code in the Extras coding section to delete the duplicates? How would I do that? Would you please give me a simple brief?
    Joe

    Hi,
    You can try restricting the values for the characteristic in the filter area of the Query Designer, which should give the correct result.
    But I would still suggest that you do not keep the duplicate records in the cube, as they are not part of your requirement and are giving you wrong results.
    So reload the correct records into the cube in order to avoid such problems in the future as well.
    Regards,
    Amit

  • Write Optimized DSO Duplicate records

    Hi,
    We are facing a problem while loading a delta to a write-optimized DataStore object.
    It gives the error "Duplicate data record detected (DS <ODS name>, data package: 000001, data record: 294)".
    But it cannot have a duplicate record, since the data comes from a DSO, and
    we have also checked the particular record in the PSA and couldn't find a duplicate there.
    There is no very complex routine either.
    Has anyone ever faced this issue and found the solution? Please let me know if yes.
    Thanks
    VJ

    Ravi,
    We have checked that there are no duplicate records in the PSA.
    Also, the source ODS has two keys and the target ODS has three keys.
    Also, the records it mentioned have record mode "N" (New).
    It seems to be an issue with the write-optimized DSO.
    Regards
    VJ
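    One more check that may help: look for duplicate semantic keys directly in the active table of the source DSO (a sketch; the DSO name ZSRC and its key fields are assumptions, and /BIC/A<name>00 is the usual active-table naming convention for a standard DSO):

        -- Count rows per semantic key in the source DSO's active table;
        -- a count above 1 explains the duplicate-record error downstream.
        SELECT doc_number, item_number, COUNT(*) AS cnt
        FROM   "/BIC/AZSRC00"
        GROUP  BY doc_number, item_number
        HAVING COUNT(*) > 1;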

  • BI 7.0 - Duplicate Record Error while loading master data

    I am working on BI 7.0 and I am trying to load master data into an InfoObject.
    I created an InfoPackage and loaded the data into the PSA.
    I created a transformation and a DTP, and I get an error about duplicate records after I execute the DTP.
    I have read all the previous threads about the duplicate record error while loading master data, and most of them suggested checking the 'Ignore duplicate records' option in the InfoPackage. But in 7.0 I can only load into the PSA with an InfoPackage, and it doesn't have any option for me to ignore duplicate records.
    My data gets loaded into the PSA fine, and I get this error while loading it into the InfoObject using the DTP.
    I would appreciate your help in resolving this issue.
    Regards,
    Ram.

    Hi,
    In the 7.0 dataflow, I believe the DTP itself has a 'Handle Duplicate Record Keys' indicator on the Update tab for master data loads. Also refer to:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    With rgds,
    Anil Kumar Sharma .P

  • Duplicate records found while loading master data (very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It is showing the following error: duplicate record found; 1 record used in /BI0/PTCTQUERY, and the same record occurred in the /BI0/PTCTQUERY table.
    Can anyone give me the solution? It's very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the checkbox on the Processing tab page. Tick the 'Ignore Duplicate Data Records' indicator. When multiple data records that have the same key are transferred, the last data record in the request is updated to BI. Any other data records in the request with the same key are ignored.
    The help says:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA and then, after it has been successfully written to the PSA, updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records, or if you manually set the Ignore Duplicate Data Records indicator, the 'PSA Only' update type is automatically selected in the scheduler.
    Hope that clears your doubt; otherwise let me know.
    Regards
    Kiran

  • Duplicate record with same primary key in Fact table

    Hi all,
       Can the fact table have duplicate records with the same primary key? When I checked a cube, I could see records with the same primary key combination but different key figure values. My cube has 6 dimensions (including Time, Unit and Data Package) and 2 key figures, so 6 fields combine to form the composite primary key of the fact table. When I checked the records in SE16, I could see duplicate records with the same primary key. There is no parallel loading happening for the cube.
    The BW system version is 3.1.
    The database is Oracle 10.2.
    I am not sure how this is possible.
    Regards,
    PM

    Hi Krish,
       I checked the data packet dimension as well. Both records have the same dimension ID (141). Except for the key figure values, there is no other difference in the fact table records. I know this is against the basic DBMS primary key rule, but I have records like this in the cube.
    Can this situation arise when the same record is in different data packets of the same request?
    Thx,
    PM
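    For what it's worth, duplicates like this can be listed straight from the fact table with a GROUP BY query (a sketch only: the cube name ZSALES and the dimension key columns are assumptions; on Oracle, the BW fact table's logical key is typically not enforced by a unique index, which is how such rows can coexist):

        -- Count rows per dimension-key combination in the F fact table of
        -- a hypothetical cube ZSALES; a count above 1 means duplicates.
        SELECT key_zsalesp, key_zsalest, key_zsalesu,
               key_zsales1, key_zsales2, key_zsales3, COUNT(*) AS cnt
        FROM   "/BIC/FZSALES"
        GROUP  BY key_zsalesp, key_zsalest, key_zsalesu,
                  key_zsales1, key_zsales2, key_zsales3
        HAVING COUNT(*) > 1;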
