Synchronization error CRTranDB::AddNew : Duplicate record Id - FFFFF99B

I very recently upgraded to the new BlackBerry Desktop Software for my Torch 9810. The transition seemed to go well, and I did a number of syncs without problems. This morning I suddenly received the sync error in the subject line, which is preventing me from completing a sync. Help?
Syncing with Microsoft Outlook.
Matt

Hello,
To resolve the synchronization error "CRTranDB::AddNew : Duplicate record Id", please back up your BlackBerry® smartphone and then clear the Address Book All database.
The following KB articles provide the steps to back up and to clear the Address Book All database:
"How to back up data on a BlackBerry smartphone" http://bbry.lv/IWfPl0
 "How to clear the user content databases from the BlackBerry smartphone" http://bbry.lv/JtQKcw
Once the Address Book All database has been cleared, please try to synchronize again.
Hope this helps.
-FS
Come follow your BlackBerry Technical Team on Twitter! @BlackBerryHelp
Be sure to click Kudos! for those who have helped you.
Click Solution? for posts that have solved your issue(s)!

Similar Messages

  • Customizing error message of duplicate record

    Hi All,
    I need to customize the error message shown when a duplicate record is inserted into a field with a unique constraint. I want to show the message in an alert. How can I do this?
    Arif

    Hi,
    You can use an ON-MESSAGE trigger, check the error code, and then display your custom error message; a sketch follows below.
    Regards,
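    A minimal sketch of that approach (assuming an alert named MY_ALERT has already been created in the form; note that unique-constraint violations from the database usually surface in the ON-ERROR trigger, where DBMS_ERROR_CODE exposes the underlying ORA-00001):
        -- Form-level ON-ERROR trigger: show a custom alert for duplicates,
        -- keep the default behaviour for every other error.
        DECLARE
          v_button NUMBER;
        BEGIN
          IF DBMS_ERROR_CODE = -1 THEN  -- ORA-00001: unique constraint violated
            SET_ALERT_PROPERTY('MY_ALERT', ALERT_MESSAGE_TEXT,
                               'This record already exists. Please enter a unique value.');
            v_button := SHOW_ALERT('MY_ALERT');
          ELSE
            MESSAGE(ERROR_TYPE || '-' || TO_CHAR(ERROR_CODE) || ': ' || ERROR_TEXT);
            RAISE FORM_TRIGGER_FAILURE;
          END IF;
        END;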

  • Master data failed with error 'too many duplicate records'

    Dear all,
    Below is the error message:
    Data records for package 1 selected in PSA - error 4 in the update
    The long text is as follows:
    Error 4 in the update
    Message no. RSAR119
    Diagnosis
    The update delivered the error code 4.
    Procedure
    You can find further information on this error in the error message of the update.
    Working on BI 7.0.
    Any solutions?
    Thanks
    satish .a

    Hi,
    Go through these threads; they deal with the same issue:
    Master data load: Duplicate Records
    Re: Master data info object - duplicate records
    Re: duplicate records in master data info object
    Regards
    Raj Rai

  • Error due to duplicate records

    Hello friends,
    I have done a full upload to a particular characteristics InfoObject using direct update (PSA and directly to the data target); I used the 'PSA and subsequently into data target' option.
    When I load the data into the object through a process chain, I get an error that duplicate records exist and the request turns red in the PSA.
    But no duplicate records exist in the data package, and when we try to manually load the records from the PSA to the data target it works fine.
    Can anyone throw some light on this error?
    Regards
    Sre....

    Hello Roberto and Paolo
    There was an OSS note saying that we should not use that option, only 'PSA with delete duplicate records and update into data target'.
    I don't know the exact reason.
    Can you throw some light on this? Why is it like that?
    Thanks for the reply, Paolo and Roberto.
    Regards
    Sri

  • Duplicate records in PSA and Error in Delta DTP

    Hi Experts,
    I have enabled delta on the CALDAY field for a generic DataSource in the source system.
    1) Initially I created one record in the source system and ran the delta InfoPackage, so the PSA contains one record.
    2) I ran the delta DTP; that one record moved to the DSO.
    3) I created one more record in the source system and ran the delta InfoPackage again. The PSA now contains 2 records: the new one and the one from step 1.
    4) When I try to run the delta DTP, I get an error saying that duplicate records exist.
    Would you please tell me how to rectify this issue?
    My requirement is that I need to run the load multiple times a day. (Note: the DataSource is delta enabled.)
    Thanks and Regards,
    K.Krishna Chaitanya.

    You can use this feature only when loading master data, as it simply ignores lines: if you had, for example, a changed record with a before and an after image, one of the two lines would be ignored.
    You can keep searching and asking, but if you want to load a delta more than once a day, you can't use calendar day as the generic delta field; that will not work.
    The best option is a timestamp.
    If the DataSource has a date and a time, you can create a timestamp yourself.
    In that case you need to use a function module.
    Search the forum for the terms 'generic delta function module' and you'll surely find code you can reuse; a rough illustration of the idea follows below.
    M.
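    As M. says, the real implementation is an ABAP function module; purely to illustrate the principle, here is the effect it needs to achieve, sketched in SQL (my_source_table, created_date and created_time are made-up names, with DATS/TIMS-style 'YYYYMMDD'/'HHMMSS' character values assumed):
        -- Combine the date and time columns into one sortable timestamp and
        -- select only the rows newer than the last delta pointer (illustrative).
        SELECT *
        FROM   my_source_table
        WHERE  TO_TIMESTAMP(created_date || created_time, 'YYYYMMDDHH24MISS')
               > TO_TIMESTAMP('20130630121500', 'YYYYMMDDHH24MISS');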

  • Data Loader inserting duplicate records

    Hi,
    There is an import that we need to run every day in order to load data from another system into CRM On Demand. I have set up a data loader script which is scheduled to run every morning; the script performs an insert operation.
    Every morning a file with new insert data is available in the same location (generated by someone else) and under the same name. The data loader script must insert all the records in it.
    One morning there was a problem in the other job and a new file was not produced. When the data loader script ran, it found the old file and re-inserted its records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external id, since the records come from another system, but I came to know that the option works for update operations only.
    How can a situation like this be handled in the future? The external id should be checked for duplicates before the insert operation is performed. If we can't check on the data loader side, is it possible to somehow mark the field as 'unique' in the UI so that there is an error when a duplicate record is inserted? Please suggest.
    Regards,

    Hi
    You can use something like this:
    CURSOR crs IS SELECT DISTINCT deptno, dname, loc FROM dept;
    Now you can insert all the records present in this cursor.
    Assumption: you do not have duplicate entries in the dept table initially.
    Cheers
    Sudhir
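    Since the dept cursor above is only a generic sample, here is a sketch of a duplicate-safe insert closer to the original question (staging_contacts, target_contacts and external_id are placeholder names): only rows whose external id is not already present in the target are inserted, so accidentally re-processing yesterday's file becomes harmless.
        -- Insert only those staged rows whose external_id is not yet in the target.
        INSERT INTO target_contacts (external_id, first_name, last_name)
        SELECT s.external_id, s.first_name, s.last_name
        FROM   staging_contacts s
        WHERE  NOT EXISTS (SELECT 1
                           FROM   target_contacts t
                           WHERE  t.external_id = s.external_id);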

  • Duplicate records - problems

    Hi gurus
    We created a text datasource in R/3 and replicated it into BW 7.0
    An infopackage (loading to PSA) and DataTransferProcess was created and included in a process chain.
    The job failed because of duplicate records.
    We have now discovered that the setting 'Delivery of Duplicate Data Records' for this DataSource in BW is set to 'Undefined'.
    When creating the DataSource in R/3, there were no settings for 'Delivery of Duplicate Data Records'.
    In BW I've tried to change 'Delivery of Duplicate Data Records' to NONE, but when I go into change mode the setting is not changeable.
    Does anyone have a suggestion on how to solve this problem?
    Thanks,
    @nne Therese

    Hi Muraly,
    I have the same issue. I am loading texts from R/3 to the PSA using an InfoPackage with full update. From the PSA I am using a DTP with delta, with the option 'valid records update, no reporting (request red)'.
    It was running fine for the last few weeks: the transferred and added records matched the PSA request every day.
    Suddenly the load to the InfoObject failed. I deleted the request from the InfoObject and reloaded using the DTP; it failed again. I tried loading with full update, since it is texts, and it failed again. When I analysed the error it said duplicate records. So I changed the DTP by checking the option 'Handling Duplicate Records' and loaded with full update. It worked fine: it transferred more than 50000 records, and the added records exactly matched the PSA request.
    I reset the DTP back to delta and loaded today, but the transferred records are 14000 and the added records (3000) are the same as the PSA request. Looking at the load history, the numbers of transferred and added records in the InfoObject used to match the PSA request every day.
    Why this difference now? In Production I have no issues. Since I changed the DTP, will transporting it to Production make any difference? This is my first time on BI 7.0.
    Please advise, and correct me if I am wrong.
    Thanks,
    Sudha..

  • Duplicate records in Infoobject

    Hi gurus,
    While loading into an InfoObject, it throws the error '150 duplicate records found' in the Details tab. My question is about a temporary correction: in the InfoPackage we can either increase the error-handling number and load, or choose the third option (valid records update) and update to the InfoObject.
    When the error occurs I can see that some records show under 'added'. After the error occurred I forced the (delta) request to red, deleted it, chose the third option (valid records update), and updated to the InfoObject. The load was successful and all the records were transferred, but the number of added records is 0. I also read in an article that the error records are stored as a separate request in the PSA, but I cannot find that. I want to know when, or how, the new records get added.

    Hi Jitu,
    If it is only an InfoObject, then under the Processing tab select only PSA and check the first option. After this, rerun the InfoPackage. Go to the monitor and see whether you got the same number of records, including duplicates. Then follow the steps below:
    1. Go to transaction RSRV, expand Master Data, and double-click on the second option.
    2. Click on the option on the right side of the panel. A pop-up window will appear; give the name of the InfoObject that failed.
    3. Select that particular InfoObject and click on the Delete button;
    if necessary, save it and click on Execute (do this action only if necessary). Usually the problem is resolved by deleting. After clicking the Delete button, go back to the monitor screen and process the data packet manually by clicking on the wheel button.
    4. Select the request and click on the 'read manually' button (the wheel button).
    Don't forget to save the InfoPackage back by selecting only the InfoObject.
    Hope this works. Let me know if you have doubts.
    Assign points if you are able to find the solution.
    Regards,
    Manjula

  • Duplicate records in material master

    Hi All
    I am trying to initialize the material master and I am getting this error message:
    "281 duplicate record found. 0 recordings used in table /BI0/XMATERIAL"
    This is not the first time I am initializing. Deltas were running for a month, then I had to run a repair request to capture some changes, and ever since running the delta from that point I have been getting this message.
    I started by deleting all the records and running this init packet.
    I have tried error handling, and I also do not see any option for 'ignore duplicate records' in the packet.
    I cannot see any error (red) records in the PSA either, even though the message says there are errors.
    Please advise.
    Thanks

    Hi,
    The duplicate record check is in the extraction program; I would suggest you do not deactivate it or comment it out.
    What you should do is go back to your material master records in the source system and sort out the materials that do not have a unique identifier (a query along the lines of the sketch below can locate them). Once this is sorted out, you can re-run your delta, and you shouldn't have the problem again. I once had the same problem with an HR extraction, and I had to go back to the source data and ask the business to correct the duplication: a record for an employee had been changed and there was an overlap of dates in the employee's records, which the BW extraction program saw as a duplicate record.
    I hope this helps.
    Do not forget to award the points please.
    Regards,
    Jacob
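    A sketch of such a check (material_master and material_id are placeholder names; run the equivalent against the real source table): any identifier appearing more than once is a candidate for the extractor's duplicate-record complaint.
        -- List every key that occurs more than once in the source table.
        SELECT   material_id, COUNT(*) AS occurrences
        FROM     material_master
        GROUP BY material_id
        HAVING   COUNT(*) > 1
        ORDER BY occurrences DESC;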

  • Duplicate record in Masterdata: failure in chain, successful when run manually

    Hi all,
    A daily load updates 0BPARTNER_ATTR from CRM.
    Recently this load started to fail, with the following error message:
    xx duplicate records found. xxxx recordings used in table /BI0/XBPARTNER
    The data loads through the PSA, but there is no red record in the PSA.
    After copying the PSA data to Excel for further analysis, I concluded that there are actually no duplicates in the PSA.
    Also, the option 'Ignore double data records' in the InfoPackage is checked.
    When I manually update from the PSA, the load is successful.
    This started to happen about two weeks ago, and I haven't found an event which could have caused it.
    Now it happens every two or three days, and each time the manual update is successful.
    Any suggestions, anyone?
    Thanks, Jan.

    Hi Jan,
    Possibly you have two requests in the PSA when you try to update the data target.
    Delete all the requests in the PSA, schedule once again to bring the data to the PSA, and then update it into the data target.
    Thank you,
    Arvind

  • BI 7.0 - Duplicate Record Error while loading master data

    I am working on BI 7.0 and I am trying to load master data to an info object.
    I created an Infopackage and loaded into PSA.
    I created transformation and DTP and I get an error after I execute the DTP about duplicate records.
    I have read all the previous threads about duplicate record errors while loading master data, and most of them suggested checking the 'Ignore duplicate records' option in the InfoPackage. But in 7.0 I can only load to the PSA with an InfoPackage, and it doesn't have any option for me to ignore duplicate records.
    My data is getting loaded to PSA fine and I get this error while loading to info object using DTP.
    I would appreciate your help to resolve this issue.
    Regards,
    Ram.

    Hi,
    Refer:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    With rgds,
    Anil Kumar Sharma .P

  • 36 duplicate record found -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    (green light) Update PSA (50000 records posted): no errors
    (green light) Transfer rules (50000 → 50000 records): no errors
    (green light) Update rules (50000 → 50000 records): no errors
    (green light) Update (0 new / 50000 changed): no errors
    (red light) Processing end: errors occurred
         Processing 2 finished
    - 36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
    - 36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
    This error repeats with all the data packets.
    What could be the reason for the error, and how can it be corrected?
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the infoobject 0PM_ORDER, the datasource is 0PM_ORDER_ATTR.
    The workaround we have been using is a manual push from the Details tab of the monitor. Weirdly, we don't have this issue in our test and dev systems, and even in Production it doesn't occur some days. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue, if so please let me know and likewise I will.
    -Venkat

  • Duplicate Records error when processing transaction file - BPC 7.0

    Hi All,
    I have a situation. I am using BPC NW 7.0 and I have updated my dimension files. When I try to validate my transaction file, every single record is validated successfully. But when I try to import the flat file into my application, I am getting a lot of duplicate record errors, and these are my questions:
    1. Will we get duplicate records in transaction files?
    2. Even if there are duplicates, since it is a cube, shouldn't it summarize rather than flag them as errors and reject the records?
    3. Is there something I can do to accept duplicates? (I have checked the Replace option in the data package to overwrite similar records, but it applies only to account, category and entity.)
    4. In my case I see identical values in all my dimensions and the $value is the only difference. Why is it not summing up?
    Your quickest reply is much appreciated.
    Thanks,
    Alex.

    Hi,
    I have the same problem.
    In my case the file that I want to upload has rows that differ only in the nature column. In the conversion file I map the different natures to one internal nature, e.g.:
    cost1 --> cost
    cost2 --> cost
    cost3 --> cost
    My hope was that in BPC the nature cost would assume the result cost = cost1 + cost2 + cost3.
    The result is that only the first record is uploaded and all the other records are rejected as duplicates.
    Any suggestion?
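    What both posters expect, expressed as a sketch (staged_rows and signeddata are placeholder names): once the conversion file has mapped cost1/cost2/cost3 to the single nature 'cost', the three rows share one key and should be summed rather than rejected.
        -- Aggregate the converted rows so each key carries one summed value.
        SELECT   account, category, entity, 'cost' AS nature,
                 SUM(signeddata) AS signeddata
        FROM     staged_rows
        GROUP BY account, category, entity;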

  • Calendar and Address Book error: Duplicate records found for GUID

    Hi all,
    I have a Mountain Lion Server running on a Mac mini and everything was working well.
    This morning one of my users is unable to connect to his calendar and address book. I found this error in the log files:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate records found for GUID ****USER_GUID****:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate: ***USER_Shortname***
    Apparently there is a duplicate match in the database. How can I fix this issue?
    In the Server app this user is only listed once.
    Mail and other services for this user are working correctly.
    Thanks for any advice!

    Hi Samuel,
    You may try:
    select code, count(code)
    from [dbo].[@XTSD_XA_CMD]
    group by code having count(code) > 1
    What is the result?
    Thanks,
    Gordon

  • Duplicate records error?

    Hello all,
    While extracting master data I am getting a duplicate records error. How do I rectify this?
    On the InfoPackage screen, in the Processing tab, will I get the option 'Ignore double data records'?
    When will this option be enabled?
    regards

    Hello
    This option is available only for master data, not for transactional data. You can control duplicate records for transactional data in the ODS; there is an option in the ODS settings.
    From the F1 help:
    Flag: Handling of duplicate data records
    From BW 3.0 you can determine, for DataSources for master data attributes and texts, whether the extractor transfers more than one data record in a request for a value belonging to time-independent master data.
    Independently of the extractor settings (the extractor potentially delivers duplicate data records), you can use this indicator to tell BW whether or not you want it to handle any duplicate records.
    This is useful if the setting telling the extractor how to handle duplicate records is not active, but the system is told by another party that duplicate records are being transferred (for example, when data is loaded from flat files).
    Sankar
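    In practice, handling duplicate data records for a time-independent attribute means keeping exactly one row per key, typically the last one delivered in the request. A sketch with placeholder names (incoming_records, material_id, attribute_value, record_no):
        -- Keep only the last-delivered row for each key.
        SELECT material_id, attribute_value
        FROM  (SELECT material_id, attribute_value,
                      ROW_NUMBER() OVER (PARTITION BY material_id
                                         ORDER BY record_no DESC) AS rn
               FROM   incoming_records) ranked
        WHERE  rn = 1;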
