How to avoid duplicate records in an InfoCube

Hi friends,
What is the procedure to avoid duplicate records in an InfoCube?
Please send me notes.
I will be waiting for your reply.

Hi,
1. Stage the data in a DSO before the cube, because the DSO has overwrite functionality.
2. Select the "ignore duplicate records" checkbox at InfoPackage level while loading the data.
Regards
CSM Reddy
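
For anyone who wants to see what the overwrite in point 1 buys you, here is a rough illustration in plain SQL; this is not SAP code, and the tables dso_active and new_request with their columns are invented for the sketch. An upsert keyed on the record's semantic key replaces the existing row instead of inserting a second copy, which is why staging in a DSO removes duplicates before they reach the cube.

    -- Illustration only: a re-delivered record overwrites instead of duplicating.
    MERGE INTO dso_active t
    USING new_request s
    ON (t.doc_no = s.doc_no AND t.item_no = s.item_no)
    WHEN MATCHED THEN
      UPDATE SET t.amount = s.amount
    WHEN NOT MATCHED THEN
      INSERT (doc_no, item_no, amount)
      VALUES (s.doc_no, s.item_no, s.amount);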

Similar Messages

  • How to avoid duplicate records while joining two tables

    Hi,
    I am trying to join three tables; two of them are essentially the same, one being a history table, so I wrote a query like:
    select e.id,
           e.seqno,
           e.name,
           d.resdate,
           d.details
    from   employees e
    join   (select * from dept
            union
            select * from dept_hist) d
      on   d.id = e.id
     and   d.seqno = e.seqno;
    but it returns duplicate records.
    Could anyone please tell me how to avoid the duplicate records in this query?

    Actually, once a record is processed it is moved to the history table, so the two tables will not contain the same records; I need records from both tables, which is why I take the union of them, so d holds the union of both.
    But I am still getting duplicate records, even when I use DISTINCT.
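
    If the duplicates come from several rows sharing the same (id, seqno) on the d side, a common fix, sketched here against the same hypothetical tables (this assumes dept and dept_hist both have id, seqno, resdate and details columns), is to keep exactly one row per join key with an analytic function before joining:

    -- Combine both dept tables, keep one row per (id, seqno)
    -- (here: the one with the latest resdate), then join.
    select e.id, e.seqno, e.name, d.resdate, d.details
    from   employees e
    join  (select id, seqno, resdate, details
           from (select u.*,
                        row_number() over (partition by id, seqno
                                           order by resdate desc) rn
                 from (select * from dept
                       union all
                       select * from dept_hist) u)
           where rn = 1) d
      on  d.id = e.id
     and  d.seqno = e.seqno;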

  • How to avoid duplication of vendor name and vendor account group

    Let me know the menu path for the setting in SAP for controlling/avoiding duplication of vendor name and vendor account group within one purchasing organisation, during vendor creation or mass data upload.
    with regards
    vv

    Hi,
    The vendor code is a unique number, but the name is not unique at all, so the system allows duplicate records to be created.
    You can only control this by giving the authorization to a single person.
    System discipline is what is actually required.

  • How to avoid duplication of mails on MacBook and iPhone

    I am using Apple Mail on a MacBook (Lion) and an iPhone 4 with iOS.
    I am using a corporate mail server, i.e. [email protected]
    Despite having done the mail settings with the option "Delete mail immediately on removing the mail from Inbox", the same mail still gets downloaded on the other device. That is, I receive all mails on both my devices, MacBook and iPhone, even if a mail has been deleted from the Inbox on one device.
    I am looking for a way to avoid this duplication of mails; that is, deleted mails shouldn't come back on the other device.
    Please help.
    Thanks

    That depends on which passwords you are talking about. Most passwords you have the option to change whenever you want.

  • How to avoid records being cleared when tab pages change

    Hi all,
    I am using Oracle Forms 10g and DB 10g.
    I have created a form with four tab pages, namely "EXPENSE", "AMOUNT_DETAILS", "SUPPLIER" and "ACCOUNT".
    When I enter data on page 1 (EXPENSE), move on to page 2 (AMOUNT_DETAILS), enter data on page 3 (SUPPLIER), and then come back to page 1 (EXPENSE) or page 2 (AMOUNT_DETAILS), the data in those tab pages has been cleared. The data is gone and I have to enter it again manually.
    Can anyone suggest how to avoid this clearing of data?
    Thanks & Regards
    Srikkanth

    Hi,
    Thanks once again for your quick response.
    I have checked it; I was working with Oracle Apps. Now I have entered all the data in the four tabs and pressed the Save button on the screen, and at that point the data also gets cleared. Can you please tell me whether there is a workaround for this?
    regards
    Srikkanth

  • How to check the size of an InfoCube?

    Dear all,
    Is there any t-code which can be used to see the total disk space occupied by a specific cube?
    I know ST14 can be used for the top 30 cubes and the like, but my cube is not in the top 30, so how should I see the total size of my cube in bytes?
    Please help me; I need it urgently.
    Please don't recommend the SAP note on sizing; I have it, and the calculation gave me a wrong answer!

    Hi,
    We can estimate the size of an InfoCube using this information:
    each key figure occupies 10 bytes of memory, and
    each characteristic occupies 6 bytes of memory.
    An InfoCube has at most 256 fields: 233 key figures, 16 dimensions and 6 special characteristics.
    So the maximum width of a fact-table row is
    233 (key figures) * 10 + 16 (dimensions) * 6 + 6 (special chars) * 6 = 2,462 bytes.
    In general, an InfoCube should not exceed 100 GB of data.
    When sizing, you must consider factors like the years of data required for reporting, acceptable response time, database limitations, archiving options, etc.
    Please read this document:
    [https://websmp209.sap-ag.de/sizing]
    Hope this helps,
    regards
    CSM Reddy
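
    If you have database access, a more direct route is to sum the segment sizes yourself. Here is a hedged sketch for Oracle, assuming a custom cube named ZSALES whose tables follow the standard naming convention (Business Content cubes use /BI0/ table names instead of /BIC/):

    -- Size of the cube's fact (F/E) and dimension (D) tables in MB.
    SELECT segment_name,
           ROUND(SUM(bytes) / 1024 / 1024) AS size_mb
    FROM   dba_segments
    WHERE  segment_name IN ('/BIC/FZSALES', '/BIC/EZSALES')
       OR  segment_name LIKE '/BIC/DZSALES%'
    GROUP  BY segment_name
    ORDER  BY size_mb DESC;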

  • How to find the last activity of an InfoCube or any other InfoProvider

    Hello everyone,
    Is there any way to find out the last activity date of an InfoCube or any other InfoProvider?
    One way is through Tools --> Statistics for BI (but this is not working for me, as I do not have enough authorizations in the office).
    So, can anyone please give me a way to find the last activity date of all InfoProviders, so that the appropriate action can be taken on each InfoProvider according to its usage?
    I also need the procedure to check the activity of the queries.
    Hope to get an answer soon!
    Thanks in advance!
    Deepak

    Hi,
    Are you looking for activity related to data retrieval or to data loads (when data was last loaded into the InfoProvider, or when a query was last executed on it)?
    If that is your requirement, then BI statistics is the only way to find out, so get the authorization for it.
    If what you need is the last change / last data load, then you can check the "last changed at" field in InfoProvider maintenance, and the data load dates.
    I hope it will help.
    Thanks,
    S

  • How to avoid duplication when making a SUM calculated field

    Hi, I'm trying to solve this problem.
    In my dataset (in a custom folder in the Administrator) I have a list of tickets (with some attributes), and some of them are duplicated, for example:
    Ticket_id     Group_id        Ticket_date               Resource_name    Ticket_status
    5416          100000401       10/12/2007 7:10:31 am     Mr. A            2
    5416          100000401       9/1/2008 11:00:44 pm      Mr. A            2
    57381         100000401       27/12/2007 11:37:11 am    Mr. A            2
    57381         100000401       15/1/2008 9:33:12 am      Mr. A            2
    I want this duplication because I need it when I filter the dataset (using Ticket_date) with two parameters:
    Ticket_date between lower_limit_date and upper_limit_date
    So, inside the report, regarding the number of tickets: if I take 2 records with the same ticket_id, I have a calculated field where I use COUNT_DISTINCT, and for each pair of rows the count is always 1; this way, inside the report, I have the following fields:
    Group_id        Count_tickets    Resource_name
    100000401       2                Mr. A            <- this is OK!
    Now, if I want to count how many tickets have status = 2 (meaning they are closed), I want to obtain 2 and NOT 4.
    I tried this calculation (Tickets Closed): SUM(CASE WHEN "Tickets Report #7 COMPL".Ticket Status Id = 2 THEN 1 ELSE 0 END)
    but I always got the result 4, and that is not correct, because there are only 2 tickets.
    I also tried some analytic functions using OVER (PARTITION BY ...), but I didn't obtain the correct result.
    In other words, I'd like to achieve this:
    Group_id        Count_tickets    Tickets Closed    Resource_name
    100000401       2                2                 Mr. A
    Any help will be appreciated.
    Alex

    Hi Rod, thanks for the reply.
    I tried this calculation:
    COUNT_DISTINCT(CASE WHEN "Tickets Report #7 COMPL".Incident Status Id = 2 THEN 1 ELSE 0 END)
    but it returns 1. That is right as far as it goes, because the result of the CASE expression is always the same value (the Ticket_status is always 2, four times), but it is not what I expect.
    I'd like to calculate how many tickets are effectively closed (Ticket_status = 2), and that result is 2, because I have only 2 ticket_ids (5416 and 57381); the problem is that inside the dataset each of these two ticket_ids is repeated twice.
    I can't use "Hide Duplicate Rows" because that is only a layout solution (behind it the duplication is still there); I can't use one aggregate function inside another because I get the error "Nested aggregate functions are not allowed"...
    To solve this, I think I have to restrict the records that are processed by the CASE expression: instead of passing all four records to the calculation (ticket_id 5416 twice and ticket_id 57381 twice), I have to pass only two records (ticket_id 5416 once and ticket_id 57381 once)... but how?
    Alex
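
    The trick this thread is circling is to make the CASE expression return the ticket id itself rather than a constant, so that a distinct count collapses the duplicated rows. Here is a minimal sketch in plain SQL, assuming a table TICKETS with the columns shown above; the Discoverer equivalent would be COUNT_DISTINCT over the same CASE expression:

    -- CASE yields the ticket_id for closed tickets and NULL otherwise;
    -- COUNT(DISTINCT ...) ignores NULLs and counts each ticket once,
    -- however many duplicate rows it has.
    SELECT group_id,
           resource_name,
           COUNT(DISTINCT ticket_id) AS count_tickets,
           COUNT(DISTINCT CASE WHEN ticket_status = 2
                               THEN ticket_id END) AS tickets_closed
    FROM   tickets
    GROUP  BY group_id, resource_name;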

  • How to avoid duplicate records in a file-to-file scenario

    Hi guys,
    Could you please provide a solution to avoid duplicate entries in a flat file, based on a key field?
    I am asking in terms of standard functions, either at message-mapping level or by configuring the file adapter.
    warm regards
    mahesh.

    Hi Mahesh,
    Write a module processor to check for the duplicate records in the file adapter,
    or
    eliminate the duplicate records with a Java/ABAP mapping.
    Also check these links:
    Re: How to Handle this "Duplicate Records"
    Duplicate records
    Ignoring Duplicate Records--urgent
    Re: Duplicate records frequently occurred
    Re: Reg ODS JUNK DATA
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    regards
    srinivas

  • Scenario Webservice - XI - BW: how to avoid duplicate records?

    Hi all,
    Webservice --> XI --> BW.
    A BPM has been used to send the response back.
    BPM:
    start -> Receive (request) -> Transformation (response map) -> Send (send to BW) -> Send (send response) -> stop.
    We are making use of the MSGID to maintain the uniqueness of each message coming from the webservice. Uniqueness is maintained over the combination sale-num:trans-num:sap-saletype:sale-type, as below; one MSGID is registered in XI for each unique message.
    ex:  sale-num:trans-num:sap-saletype:sale-type
         1983:5837:E:NEW
    If a duplicate message is received again, XI sends the response back to the webservice as "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a".
    It is working correctly. The only problem is when XI is down, or when a communication failure happens in the middle of processing, as in the example below.
    A sample case which failed recently: a webservice call failed three times, with these reasons.
    First time:
    "FAILED TO INVOKE WEB SERVICE OPERATION OS_CWUSales
    Error receiving Web Service Response: Fatal Error: csnet read operation failed (No such file or directory) (11773)"
    Second time:
    MessageExpiredException: Message c9237200-0c69-2a80-dd11-79d5b47b213a(OUTBOUND) expired.
    Third time:
    "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a"
    Notice that on the second call the MSGID was registered, but because the server was down, or for some other reason, the message could not be processed any further. So the MSGID was registered but the message was never actually processed. When they retried a third time, sending the same call again, they got the "DUPLICATE GUID" error.
    A DUPLICATE GUID implies that the message has been processed and the records have been updated in the backend system, which is not what happened here.
    The final result is:
    the status in the webservice shows "it has been updated in the receiving system", since a duplicate GUID is being reported,
    but it has not been updated in the backend system, which is the problem.
    Firstly, are there any suggestions on how to solve this problem?
    Is there any better way to handle this duplicate check instead of the MSGID?
    Please help me in solving this.
    Thanks & Regards
    Deepthi.

    >> My suggestion: you can have a Webservice - BW synch-synch scenario without BPM. A sender SOAP adapter sends a synchronous request message; it is mapped to the BW request; the response comes back from BW, is mapped to the webservice response, and is sent to the webservice without any receiver SOAP adapter.
    Thanks for the suggestion; it looks like a good idea.
    >> Regarding the problem of the duplicate check: when your BW system gets the request message, it processes it and then sends a response message. In this response message, have a STATUS field with the values S for success and E for error; send this response to the webservice and store it in the database for that request's MSGID. Then, in the webservice application, check the response code for a MSGID, and only if it is E for error resend the message to BW.
    Initially they planned it the same way, but sometimes the response comes back from BW very late. So now, once the request reaches XI, they immediately send the response back from XI itself with a hardcoded "OK" status; they do not wait for the BAPI response from BW.
    So if the message is successful, the status goes back as "OK".
    If the message is not successful, the status goes back blank, and this indicates that there is some problem on the other side. Then they check the SOAP fault message, which will contain errors like:
    "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a"
    "FAILED TO INVOKE WEB SERVICE OPERATION"
    Right now the open issues are only the duplicate data and the response turnaround time; using the MSGID solved the duplicates, but because of it we see many messages failing daily with this DUPLICATE error in production, which is hampering performance.
    So we are thinking of getting rid of the MSGID method and the BPM. From your first answer I think we can achieve this without BPM, and after that I need to check how fast the response goes back to the webservice.
    -Deepthi.
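
    For what it's worth, the underlying flaw described above is that the MSGID is registered before the backend update is committed, so the two can diverge when something crashes in between. One common pattern, sketched here as hypothetical SQL (the table name and columns are invented; the point is the placement of the COMMIT, not the exact syntax), is to make the duplicate check part of the same unit of work as the posting:

    -- Register the GUID and apply the backend update in ONE transaction.
    -- A crash before COMMIT leaves no GUID behind, so a retry is not
    -- falsely rejected as a duplicate.
    CREATE TABLE processed_msgs (
      msg_guid     VARCHAR2(36) PRIMARY KEY,
      processed_at DATE DEFAULT SYSDATE
    );

    -- Per incoming message, inside one transaction:
    INSERT INTO processed_msgs (msg_guid)
    VALUES ('c9237200-0c69-2a80-dd11-79d5b47b213a');  -- ORA-00001 on a true duplicate
    -- ... post the sales data to the backend here ...
    COMMIT;  -- only now does the message count as processed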

  • How to check the number of records in an InfoCube

    Hi experts,
    My InfoCube contains millions of records, and I have performed a selective deletion.
    How can I now check the number of records in the InfoCube?
    Please update.

    To fully estimate the volume of data in a cube, I'd also suggest that you check the number of records in each dimension table of the cube:
    go to t-code LISTSCHEMA,
    enter the cube name,
    and all relevant tables will be displayed. Right-click on a dimension table -> call SE16;
    on this page hit the button "Number of Entries".
    Hope this helps...
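
    If you prefer to count at database level, here is a hedged sketch in Oracle SQL, assuming a custom cube named ZSALES (the table names are illustrative; adjust the /BIC/ prefix to /BI0/ for Business Content cubes):

    -- Row counts of the uncompressed (F) and compressed (E) fact tables.
    SELECT (SELECT COUNT(*) FROM "/BIC/FZSALES") AS f_fact_rows,
           (SELECT COUNT(*) FROM "/BIC/EZSALES") AS e_fact_rows
    FROM   dual;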

  • How to avoid duplicates in CROSS JOIN Query

    Hi,
    I am using a CROSS JOIN to get all pairs of a table column's values, as shown below:
    PRODUCT (column header)
    Bag
    Plate
    Biscuit
    The cross join gives:
    Bag Bag
    Bag Plate
    Bag Biscuit
    Plate Bag
    Plate Plate
    Plate Biscuit ..... and so on.
    By placing the condition prod1 <> prod2 I avoid the 'Bag Bag' and 'Plate Plate' rows, so the output is like this:
    Bag Plate
    Bag Biscuit
    Plate Bag
    Plate Biscuit
    Now 'Bag Plate' and 'Plate Bag' are the same combination; how do I avoid such records? My expected result is:
    Bag Biscuit
    Plate Biscuit
    How do I derive this?
    Sridhar

    Hi,
    This is the solution I found to fit the OP's question, but Visakh16 already posted the same idea (assuming the names are unique) from the start, and I don't think anyone noticed it!
    Sridhar.DPM, did you check Visakh16's response (the second response received)?
    I will mark his response as the answer. If this is not what you need, please clarify and you can unmark it :-)
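    For the record, the standard trick, presumably the idea Visakh16 posted (it assumes the product names are unique), is to replace <> with <, so that exactly one ordering of each pair survives:

    -- '<' keeps one ordering of each pair: 'Bag Plate' but not 'Plate Bag'.
    SELECT p1.product AS prod1,
           p2.product AS prod2
    FROM   products p1
    JOIN   products p2
      ON   p1.product < p2.product;

    Note that this still returns the 'Bag Plate' pair, which a 'same combination' rule would normally keep.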

  • Delta records are not loading from DSO to InfoCube

    My query is about delta loading from a DSO to an InfoCube (a filter is used in the selection).
    Delta records are not loading from the DSO to the InfoCube. I have tried all the options available in the DTP, but no luck:
    I selected "Change log" and "Get one request only" and ran the DTP, but 0 records were updated in the InfoCube.
    I selected "Change log" and "Get all new data request by request", but again 0 records were updated.
    I selected "Change log" and "Only get the delta once"; in that case all delta records were loaded to the InfoCube as they were in the DSO, but it gave the error message "Lock Table Overflow".
    When I run a full load using the same filter, data loads from the DSO to the InfoCube.
    Can anyone please help me get the delta records from the DSO to the InfoCube?
    Thanks,
    Shamma

    Data is loading in the case of a full load with the same filter, so I don't think the filter is the issue.
    When I follow the sequence below, I get the lock table overflow error:
    1. Full load from the active table, with or without archive.
    2. Then, with the same settings, if I run an init, the final status stays yellow, and when I change the status to green manually it gives the lock table overflow error.
    When I change the DTP settings to an init run:
    1. I select change log and "get only one request" and run the init; it completes successfully with green status.
    2. But when I run the same DTP for delta records, it does not load any data.
    Please help me to resolve this issue.

  • Sales order line-level information in InfoCube

    Hi BW gurus,
    We are using the following scenario:
    InfoCube = sales order line-level information (yes, we want to capture product-level sales order information in an InfoCube for our internal reporting purposes)
    ODS = none; we are not using one
    InfoSource = same as the data source (the standard one)
    Data source = 2LIS_11_VAITM (the standard Business Content data source for sales order information)
    Delta in the above data source = contract-related information (5 extra fields), working fine, added as an append structure to the data source
    Extractor = standard, using SBIW - Perform setup - SD Sales Order (SD 11)
    Number of sales order line records per month = 80,000
    Historical sales order data to be uploaded = 1 year = 80,000 * 12 = nearly 1 million records
    Hardware configuration = 4 x 750 MHz, 8 GB RAM
    Please advise on the following:
    1) How long will it take to upload 1 million records into BW?
    2) Is our server capacity sufficient for a 1-million-record load?
    3) Which delta procedure should we follow for the day-to-day update of sales order line-level data? Please advise on the data selection procedure.
    4) In case a sales order from last month is changed, how do we reflect that in our InfoCube? Please suggest a suitable method.
    Waiting for your reply.
    Thanks
    Rajiv

    Hi Rajiv,
    1) How long will it take to upload 1 million records into BW?
    This depends on your network, the width of your data records, the performance of your machine, the settings for communication between BW and R/3, and so on. It can be done in 2 hours or 3 hours, or it can take a whole day; no exact answer is possible.
    2) Is our server capacity sufficient for a 1-million-record load?
    If your disk space is enough, no problem. If you have already loaded your master data, not much additional space is needed for the transactional data.
    3) Which delta procedure should we follow for the day-to-day update of sales order line-level data?
    Just run your initial setup and initialize your delta, then run a daily delta package (this is what most companies do).
    4) In case a sales order from last month is changed, how do we reflect that in our InfoCube?
    The extractor takes care of this; just post the delta to your cube, that's it.
    Hope this helps,
    regards
    Siggi

  • Is it possible to use the same data source for two InfoCubes

    Hi,
    My problem is that in BW we cannot get the value of material at storage location level; in R/3, too, the value is maintained at plant level.
    We searched and found a how-to document for a summarized display of stock values at storage location level.
    The problem is that we went live last December and we are using 0AFMM_C02, which contains around 18,126,000 records, and according to the note we have to use 0IC_C03.
    Both cubes use the same data sources. So how do we get the data for 0IC_C03, and how do we delete the data of the existing InfoCube? And is it possible to delete data selectively from an InfoCube?
    Please help.
    Regards,
    Viren.

    Hi,
    You can't create update rules from the PSA. You can create them from the InfoSource, from an ODS, from cube to cube, or from ODS to ODS.
    In your scenario, what you can do is create update rules from the ODS to the new cube and then transfer the data from there. Or, from the InfoSource, create rules to the new data target, upload the full data, and then set up the delta.
    The third option is to create update rules from the existing cube to the new cube and load all the data one time. Then you can deactivate the update rules, as they were needed only for the one-time data transfer.
    Cheers,
    Kedar
