Lightroom 1.1 created thousands of duplicate DNG records

Preview: Lightroom 1.1 created thousands of duplicate records
Forum Pro Digital Talk
Subject Lightroom 1.1 created thousands of duplicate records
Posted by dvine [CLICK FOR PROFILE]
Date/Time 10:07:47 PM, Friday, July 13, 2007
I thought I had about 20,000 raw images in crw, cr2, nef and tif camera raw formats.
Over the years, I had backups and duplicates scattered about, but each unique image had the same base name... yymmdd_serial#.
So I looked forward to using LR to help organize the mess.
I created a LR database and incorporated my most complete collection, converting all to DNG as I went. I ended up with about 20,000 images.
I then began importing from other sources with similar, but not identical, file structures, with 'ignore duplicates' checked. I was surprised to learn that I actually had about 30,000 images.
Unfortunately about 10,000 were duplicates of two dng types.
1) Many were copies with -2.dng added to the original file name. This must be a problem with the DNG converter, which goes through any directory and simply copies and renames duplicates. These are not too difficult to eliminate with Vista's search for *-2.dng.
2) Much nastier was the fact that DNG files with the same name that existed in different subdirectories were imported, and only when I tried to clean up some of these subdirectories in LR was I informed there were other files with the same name. So I had to use BreezeBrowser to find and eliminate about 5,000 duplicate DNG files... Now they are not exactly duplicates, because many were generated
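The two-step cleanup described above (catching the "-2.dng" renames, then same-named files scattered across subdirectories) could be sketched as a small script. This is a hypothetical helper, not part of the original post; Python is assumed, and the root path is whatever folder holds the converted images:

```python
from pathlib import Path
from collections import defaultdict

def find_dng_duplicates(root):
    """Report two kinds of suspect DNGs under *root*: renamed copies
    ending in '-2.dng', and base names appearing in more than one
    subdirectory."""
    renamed = []                   # files like 070713_001-2.dng
    by_name = defaultdict(list)    # base name -> list of paths
    for p in Path(root).rglob("*.dng"):
        if p.stem.endswith("-2"):
            renamed.append(p)
        else:
            by_name[p.name].append(p)
    # names that exist in several places are candidate duplicates
    same_name = {n: ps for n, ps in by_name.items() if len(ps) > 1}
    return renamed, same_name
```

Reviewing the two lists before deleting anything matters here, since as the post notes, same-named files are not always exact duplicates.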

That may be a consequence of posting not on the forum directly, but by email or RSS.
I much prefer direct forum posting, myself.
Don
Don Ricklin, MacBook 1.83GHz Core 2 Duo running 10.4.10 & Win XP, Pentax *ist D
http://donricklin.blogspot.com/

Similar Messages

  • TS4001 iCloud created thousands of duplicate bookmarks

    My iCloud account created thousands of duplicate bookmarks.  I would like to delete them all.  Any thoughts on how I could go about this?

    Resetting Safari/iCloud bookmarks

  • Photos created thousands of duplicates

    After setting up iCloud Photo Library on my iPhone and Mac, I realized that I suddenly had 10,000 photos and 400 videos (I originally had half of each). I called Apple and they basically said that my iPhoto Library (where I and most Mac users had imported and stored iPhone pictures) had merged with my phone photo library, and there is no way to reverse the process.
    Not only have most of my photos been duplicated, but they are all out of order. Pictures from the night of November 19th show up again on November 20th, for example. This is a massive pain...the idea of sorting through 10,000+ pictures and deleting half of them sounds like an absolute nightmare. I also had to upgrade from 20GB of iCloud storage to 200GB because APPLE doubled my pictures.
    At least iPhoto warned you when you wanted to import duplicate photos...now they just do it without even asking you! Who the **** designed this Photos app?
    Does anyone know of a way I could easily delete duplicate photos in the Photos app? None of the photo-duplicate-deleter apps I've seen in the App Store can select the Photos Library folder... so I'm not sure what to do. Also, are other people having this problem? How did Apple let this happen?

    Fastest solution: restore from back up from prior to when this happened.
    This can happen if the HD icon or a system folder is inadvertently dragged onto the iPhoto icon.

  • How to remove thousands of duplicate emails in Apple Mail?

    Hi everybody,
    I know there are a couple of discussions on the web already about this issue, but I cannot find one that clearly answers my question or solves my problem.
    I'm working on Lion with Mail version 5.1 and I have a Gmail-account. I have been messing around a bit with IMAP and POP to re-collect all my Gmail-mails into Apple Mail, but now I see that almost every email has 1 to 5 or even more duplicates.
    I've tried Andreas Amann's "Remove Duplicates" script, but whatever Mailbox I check, it doesn't work with Lion (yet). I get this error:
    There is another script I tried, the one from this page: http://hints.macworld.com/comment.php?mode=view&cid=115625
    It just makes my computer freak out, so not OK.
    A third script I found here: http://jollyroger.kicks-***.org/software/
    I downloaded the "RemoveDuplicateMessages-Mail.zip"-file and installed it. It does say it finds and relocates duplicate mails from the Mailbox that is selected in Mail:
    It relocates all the duplicates to a folder on my Desktop, called "Remove Duplicate Messages". I do find them there, but the mails in the selected Mailbox in Mail.app still show their duplicates! Even when I throw the files from the new folder on Desktop in the Trash and empty it, nothing changes. Not even after restarting Mail. Or my Mac.
    So I'm still stuck with thousands of duplicates. Any help on this one?
    Thanks a lot,
    Alexander

    I have found a very satisfactory solution to eliminating my duplicates in Apple Mail, which is to stop using it and switch to Thunderbird.
    There's lots that I like better in Apple Mail, but Thunderbird is light-years more flexible in managing your email than is Apple Mail, because Thunderbird is an open platform with excellent Add-ons available, which include the nifty Remove Duplicate Messages: http://bit.ly/tbird-dups. It is stupid that Apple Mail hides duplicate messages in the background & won't allow users to search and destroy them. Frustrating!
    As a prelude to that, you need the TB import/export Add-on: http://bit.ly/tbirdimportexport
    Managing my mail locally on my Mac was a huge project for me, involving integrating tens of thousands of emails. Mail came from Gmail, Yahoo, and many years of Outlook PST files.
    Now that the project is complete, I'm very happy with Thunderbird to manage my email archives. Though I actually would prefer the Lion Apple Mail interface for day to day use, it just isn't flexible enough.
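Scripts like "Remove Duplicate Messages" typically fingerprint mails by their Message-ID header. A minimal sketch of that idea over a standard mbox file, using Python's stdlib mailbox module (the path is a placeholder; this is an illustration, not any of the add-ons mentioned above):

```python
import mailbox

def find_duplicate_message_ids(mbox_path):
    """Count Message-ID headers in an mbox file and return those
    seen more than once -- the usual duplicate fingerprint."""
    seen = {}
    for msg in mailbox.mbox(mbox_path):
        mid = msg.get("Message-ID")
        if mid:
            seen[mid] = seen.get(mid, 0) + 1
    return {m: c for m, c in seen.items() if c > 1}
```

Apple Mail stores messages as individual .emlx files rather than mbox, so this applies directly only to exported or Thunderbird-style mailboxes.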

  • HT201812 I have thousands of duplicates. "Show Duplicates" does half the work. Help.

    I recently consolidated a couple incomplete backups into a brand new iTunes library on my Macbook Air.
    Despite needing to load both collections to cover all my music, there are thousands of duplicate song files in iTunes.
    I can use "Show Duplicates" to show all of them in one screen. But Apple's advice was to simply "select the duplicates I want to delete."
    Now, I suppose I can take an extra day off work, Press Command and start to click on every other file listed for several hours and hope I don't mess up at any point and have to start over.
    But I have to believe in this technologically advanced world, that there is a better way to accomplish this goal.
    Please help me dear sweet Internet.

    The first thing you should try to fix this problem is to sync your iPhone with iTunes over USB instead of WiFi. If that doesn't work, delete the music that is corrupted on your iPhone from your iTunes library. Re-sync your iPhone with iTunes and make sure it deletes the music. Then reload the music and sync via USB. If none of that works, the only possible solution is to restore your iPhone's firmware.

  • Consolidation made thousands of duplicates

    I have never had problems with consolidation until today. But today I clicked the consolidate button and after about a minute of watching the blue bar, I realised that something was wrong, as I thought there were only about 100 songs or so to move. I tried to Stop, but it wouldn't respond. Finally, when it did respond, about three minutes later, I found that it had copied all other tunes into the itunes folder, whether they were already there or not, so now I have thousands of duplicates.
    Could somebody tell me why this has happened, how I can stop this from happening again, and what the **** I can do with all these duplicates. (The duplicates do not show up in the itunes library).
    (I have just upgraded to itunes 6.0.4 from 6.0.3: might that have had anything to do with it? )

    I'd just like to add a little more here.
    All the duplicates are like this: Lullaby.mp3 and Lullaby 1.mp3.
    I assumed that I could simply find all the 1.mp3 files with Spotlight and delete them. However, I have found that for some reason the iTunes library refers to the 1.mp3 files. Therefore, if I delete them, I would then have to delete all of the library entries, and then find and add all the relevant files from the iTunes folder into the library. This is going to take ages, and if anybody has a better idea I'd like to know.
    Just as importantly, I'd like to know why this happened, and what steps I can take to avoid it happening again.
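Listing the suspect pairs before touching the library can be automated. A minimal sketch (hypothetical helper, Python assumed) that finds "Song 1.mp3" files whose plain "Song.mp3" sibling also exists:

```python
from pathlib import Path

SUFFIX = " 1.mp3"

def find_suffixed_pairs(music_root):
    """Find 'Song 1.mp3' files whose plain 'Song.mp3' sibling also
    exists -- candidate duplicate pairs to review, not auto-delete,
    since the library may reference the ' 1' copy."""
    pairs = []
    for p in Path(music_root).rglob("*" + SUFFIX):
        original = p.with_name(p.name[:-len(SUFFIX)] + ".mp3")
        if original.exists():
            pairs.append((original, p))
    return pairs
```

Because the iTunes library may point at the " 1" copy, as the poster found, the output is a review list rather than a delete list.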

  • ERROR ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found

    Hi,
    SAPSSRC.log
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: START OF LOG: 20071018195059
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#12 $ SAP
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: version R6.40/V1.4
    Compiled Nov 30 2005 20:41:21
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe -ctf I C:/temp/51030721/EXP2/DATA/SAPSSRC.STR C:\Program Files\sapinst_instdir\NW04\SYSTEM\ABAP\ORA\NUC\DB/DDLORA.TPL C:\Program Files\sapinst_instdir\NW04\SYSTEM\ABAP\ORA\NUC\DB/SAPSSRC.TSK ORA -l C:\Program Files\sapinst_instdir\NW04\SYSTEM\ABAP\ORA\NUC\DB/SAPSSRC.log
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: job completed
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: END OF LOG: 20071018195059
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: START OF LOG: 20071018195133
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#12 $ SAP
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: version R6.40/V1.4
    Compiled Nov 30 2005 20:41:21
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe -dbcodepage 4103 -i C:\Program Files\sapinst_instdir\NW04\SYSTEM\ABAP\ORA\NUC\DB/SAPSSRC.cmd -l C:\Program Files\sapinst_instdir\NW04\SYSTEM\ABAP\ORA\NUC\DB/SAPSSRC.log -stop_on_error
    DbSl Trace: ORA-1403 when accessing table SAPUSER
    (DB) INFO: connected to DB
    (DB) INFO: DbSlControl(DBSL_CMD_NLS_CHARACTERSET_GET): WE8DEC
    (DB) INFO: ABTREE created #20071018195134
    (IMP) INFO: import of ABTREE completed (39 rows) #20071018195134
    (DB) INFO: ABTREE~0 created #20071018195134
    (DB) INFO: AKB_CHKCONF created #20071018195134
    (IMP) INFO: import of AKB_CHKCONF completed (0 rows) #20071018195134
    (DB) INFO: AKB_CHKCONF~0 created #20071018195134
    (DB) INFO: AKB_INDX created #20071018195134
    (IMP) INFO: import of AKB_INDX completed (0 rows) #20071018195134
    (DB) INFO: AKB_INDX~0 created #20071018195134
    (DB) INFO: AKB_USAGE_INFO created #20071018195134
    (IMP) INFO: import of AKB_USAGE_INFO completed (0 rows) #20071018195134
    (DB) INFO: AKB_USAGE_INFO~0 created #20071018195134
    (DB) INFO: AKB_USAGE_INFO2 created #20071018195134
    (IMP) INFO: import of AKB_USAGE_INFO2 completed (0 rows) #20071018195134
    (DB) INFO: AKB_USAGE_INFO2~0 created #20071018195134
    (DB) INFO: APTREE created #20071018195134
    (IMP) INFO: import of APTREE completed (388 rows) #20071018195134
    (DB) INFO: APTREE~0 created #20071018195134
    (DB) INFO: APTREE~001 created #20071018195134
    (DB) INFO: APTREET created #20071018195134
    (IMP) INFO: import of APTREET completed (272 rows) #20071018195134
    DbSl Trace: Error in exec_immediate()
    DbSl Trace: ORA-1452 occurred when executing SQL statement (parse error offset=35)
    (DB) ERROR: DDL statement failed
    (CREATE UNIQUE INDEX "APTREET~0" ON "APTREET" ( "SPRAS", "ID", "NAME" ) TABLESPACE PSAPBW1 STORAGE (INITIAL 44981 NEXT 0000000040K MINEXTENTS 0000000001 MAXEXTENTS 2147483645 PCTINCREASE 0 ) )
    DbSlExecute: rc = 99
    (SQL error 1452)
    error message returned by DbSl:
    ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
    (DB) INFO: disconnected from DB
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: END OF LOG: 20071018195134
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: START OF LOG: 20071018195314
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#12 $ SAP
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: version R6.40/V1.4
    Compiled Nov 30 2005 20:41:21
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe -dbcodepage 4103 -i C:\Program Files\sapinst_instdir\NW04\SYSTEM\ABAP\ORA\NUC\DB/SAPSSRC.cmd -l C:\Program Files\sapinst_instdir\NW04\SYSTEM\ABAP\ORA\NUC\DB/SAPSSRC.log -stop_on_error
    DbSl Trace: ORA-1403 when accessing table SAPUSER
    (DB) INFO: connected to DB
    (DB) INFO: DbSlControl(DBSL_CMD_NLS_CHARACTERSET_GET): WE8DEC
    (DB) ERROR: DDL statement failed
    (DROP INDEX "APTREET~0")
    DbSlExecute: rc = 103
    (SQL error 1418)
    error message returned by DbSl:
    ORA-01418: specified index does not exist
    (IMP) INFO: a failed DROP attempt is not necessarily a problem
    DbSl Trace: Error in exec_immediate()
    DbSl Trace: ORA-1452 occurred when executing SQL statement (parse error offset=35)
    (DB) ERROR: DDL statement failed
    (CREATE UNIQUE INDEX "APTREET~0" ON "APTREET" ( "SPRAS", "ID", "NAME" ) TABLESPACE PSAPBW1 STORAGE (INITIAL 44981 NEXT 0000000040K MINEXTENTS 0000000001 MAXEXTENTS 2147483645 PCTINCREASE 0 ) )
    DbSlExecute: rc = 99
    (SQL error 1452)
    error message returned by DbSl:
    ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
    (DB) INFO: disconnected from DB
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    C:\usr\sap\BW1\SYS\exe\run/R3load.exe: END OF LOG: 20071018195315
    I'm getting this error "duplicate keys found". I finished installing the central instance, and during the database instance I got this error. I'm installing BW 3.5 on the x64 Windows Server 2003 platform. I'm using NU kernel 6.40.
    Thanks for your suggestions on how to resolve this error.
    Reward points guaranteed.

    Issue solved by deleting the central and database instances and starting a new build. It finished without an error.
    Thank you.

  • Having corrected my corrupt catalog (PSE 11) I now have thousands of duplicate images. Is there any way of of removing the extra copies? I've had suggestions that apply to Windows, and iPhoto but can't find any advice for Photoshop Elements on Mac Maveric

    Having corrected my corrupt catalog (PSE 11) I now have thousands of duplicate images. Is there any way of of removing the extra copies? I've had suggestions that apply to Windows, and iPhoto but can't find any advice for Photoshop Elements on Mac Maverick

    You used the data. Verizon cannot see what it was used for; however, your phone can see what apps used the data. Go to Settings > Data Usage. There will be a place that says data usage cycle; line the dates up with your cycle. Then there will be a bar graph below that. Extend both white bars, one all the way to the left and one all the way to the right. Below that will be a list of apps; there should be one that used over 2 GB, and that will show you what app used that data.

  • W_PARTY_D_U1 = ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found.

    Hi,
    We are implementing OBIA 11.1.1.7.1, which comes with ODI, for Finance and Procurement analytics. When we do a full load from EBS, the load plan fails and throws the error below:
    Caused By: java.sql.SQLException: ORA-20000: Error creating Index/Constraint : W_PARTY_D_U1 => ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found.
    Please help us to resolve the same.
    Thanks
    Rama

    This might need Patch 10402735.
    If it helps, please mark.
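In both ORA-01452 threads above, the index DDL fails because the key columns contain repeated combinations. The standard diagnostic is a GROUP BY ... HAVING COUNT(*) > 1 over the intended key columns. A minimal sketch, with sqlite3 standing in for Oracle and the APTREET key columns (SPRAS, ID, NAME) taken from the log:

```python
import sqlite3

def find_duplicate_keys(conn, table, cols):
    """Return key tuples (plus their count) that appear more than
    once -- the rows that make CREATE UNIQUE INDEX fail with
    ORA-01452."""
    col_list = ", ".join(cols)
    sql = ("SELECT %s, COUNT(*) FROM %s GROUP BY %s HAVING COUNT(*) > 1"
           % (col_list, table, col_list))
    return conn.execute(sql).fetchall()

# Demo with sqlite3 standing in for Oracle:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE aptreet (spras TEXT, id TEXT, name TEXT)")
conn.executemany("INSERT INTO aptreet VALUES (?,?,?)",
                 [("E", "1", "A"), ("E", "1", "A"), ("D", "1", "A")])
dups = find_duplicate_keys(conn, "aptreet", ["spras", "id", "name"])
```

On the real system, the same SELECT against APTREET shows which rows must be removed before the unique index can be created.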

  • How to create thousands of sql scripts easily?

    Hello,
    I would like to create thousands of scripts like the following:
    insert into tb_test values ('aaa');   update tb_class set nm = 'aaa';
    insert into tb_test values ('basd');  update tb_class set nm = 'basd';
    insert into tb_test values ('asdfa'); update tb_class set nm = 'asdfs';
    insert into tb_test values ('xxxyy'); update tb_class set nm = 'xxxyy';
    Is there any simple way to generate those scripts automatically? I am thinking of:
        define var_str = ('aaa', 'bbb', 'asdfd',...'xxxyy');
        select 'insert into tb_test values (''' || var_str || '''); update tb_class set nm = ''' || var_str || ''';'
        from dual;
    Then how do I generate those scripts from the var_str?
    Thanks,

    942572 wrote:
    I have more than 5000 parameters in a text file as followings, which can be opened by notepad or excel:
    'aaa',
    'basd',
    'asdfs',
    'xxxyy'
    OK, so assuming this is a one-off event, use Excel.
    Setup a quick formula to turn each into a proc call:
    [edit] if the values are in column "A", then in column "B", you could do:   =concatenate("p_proc('", a1, "');")
    then copy that formula down.
    [/edit]
    p_proc('aaa');
    p_proc('basd');
    etc.
    Then set up a little p_proc as Frank described (and please, use a WHERE clause on that update ... or you're going to destroy some data).
    Then copy/paste those commands into a routine that ends up looking like:
    declare
    procedure p_proc ( in_value varchar2 )
      is
      begin
          -- Frank's code goes here
      end;
    begin
       -- proc list goes here
      p_proc('aaa');
    ..... etc.
    end;
    That's 1 very primitive way to do it.
    If you need to reproduce this in future, you may want to consider loading that list of parameter values into a table, then you can do some more funky things. *shrug*
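The Excel approach above can equally be scripted. A minimal sketch (Python assumed; table and column names taken from the question, and the WHERE placeholder echoes the reply's warning about unconditioned updates):

```python
def make_statements(values):
    """Emit one insert/update pair per parameter value, mirroring
    the CONCATENATE formula suggested above.  The WHERE placeholder
    must be filled in: an unconditioned UPDATE rewrites every row."""
    lines = []
    for v in values:
        lines.append("insert into tb_test values ('%s');" % v)
        lines.append(
            "update tb_class set nm = '%s' where /* key = ... */;" % v)
    return "\n".join(lines)
```

Feeding it the 5000-line parameter file (one quoted value per line, quotes stripped) produces the whole script in one go.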

  • How to create thousands of authorization roles?

    Hi gurus, I am working on a project where I have to create thousands of authorization roles and limit information in the DW by groups of users. Do you know if there's a tool that would help me create that bunch of objects fast and easily? I mean, I'd have to create thousands of roles! There might be one!
    Thanks!

    Hi,
    For reporting users, in transaction PFCG you have to add the following authorization objects to your profile and maintain the required settings. Security of reporting users mainly focuses on Info Area, Info Provider /Query.
    S_RS_COMP: Info Area, Info Provider, Reporting Component
    S_RS_COMP1: Reporting Component, Query Owner
    S_RS_ICUBE / S_RS_ODSO - Info Area, Info Provider, Sub Object.
    S_RS_FOLD: To switch the display of Info Areas ON/OFF.
    You have to include the authorization objects S_RFC and S_TCODE. RRMX is the transaction which is to be checked for Reporting purpose.
    For securing at Info Object level, you can define your own custom authorization objects in transaction RSSM and add them to your profile.
    For securing work books,
    the authorization objects available are S_GUI and S_BDS_DS - for saving workbooks to Favourites.
    S_USER_AGR and S_USER_TCD (Transaction code: RRMX) is used for saving workbooks to roles.
    For Administrative users, you can use templates provided by SAP as reference for defining Authorization Objects.
    The main authorization object which provides Primary level of security for Administrative users is S_RS_ADMWB.
    Hope this helps.
    DST

  • User Exit to Search for Duplicate BP Records

    Hi Guys,
    I want to use a user exit that gives me a warning message when I am creating a BP with the same name and zipcode as one already present in the system.
    Right now we are not getting any warning message, and hence we are getting duplicate BP records.
    So I want the warning pop-up to show the number of the BP already present in the system.
    Your help is appreciated.

    Hi Gregor,
    We are on CRM 5.0 and face a similar requirement of duplicate check for BPs. Can you please tell me if it can be achieved without using third party tools? If yes, could you please give an approach (BADIs that will have to be implemented). I'm on CRM 5.0 and could not find the views/tables CRMV_BP_* in TCode SM30. Please help.
    Thanks in advance,
    Vishal

  • ABAP/4 Open SQL array insert results in duplicate database records in SM58

    Hi Everyone,
    I am testing a file-to-IDoc scenario in my Quality system. When I passed the input file, the mapping executed successfully and there are no entries in SMQ2, but still the IDoc wasn't created in the ECC system. When I checked in TRFC, I am getting the error "ABAP/4 Open SQL array insert results in duplicate database records" for the IDOC_INBOUND_ASYNCHRONOUS function module. I thought this was a data issue, so I tested with fresh data that was never used for testing in Quality, but even then I get the same error. Kindly advise.
    Thanks,
    Laawanya

    Use FM IDOC_STATUS_WRITE_TO_DATABASE to change the IDoc status from 03 to 30, and then run WE14 or RSEOUT00 to change the status back to 03.
    Resending an IDoc from status 03 is a data duplication issue on the receiving side... why do you need to do that?
    Use transaction WE19 to debug. In WE19:
    1) Choose your IDoc number in the existing-IDoc textbox
    2) Press execute
    3) Your IDoc structure will be displayed
    4) Double-click on any field to modify its content
    5) Press the 'Std Outbound Processing' button to process the modified IDoc
    That's it.

  • Duplicate data records through DTP

    Hi Guys,
    I am loading duplicate data records to customer master data.
    Data up to PSA level is correct.
    Now when I load it from PSA to customer master through DTP,
    and in the DTP's update tab I select the check box for duplicate data records, at the bottom it shows the
    message *ENTER VALID VALUE*.
    After this message I am unable to click any function, and the same message repeats again & again.
    So please give me a solution so that the above-mentioned message doesn't appear and
    I will be able to execute the load.
    Thanks .
      Saurabh jain.

    Hi,
    if you get duplicate data for your customer there might be something wrong with your data source or the data in psa. But anyway, leave the dtp by restarting rsa1. Edit or create the dtp again and press save immediately after entering edit mode. Leave the dtp again and start editing it. That should do the trick.
    regards
    Siggi

  • Duplicate data records through DTP for attribute

    Hi Guys,
    I am loading data to customer master data, but it contains duplicate data in large volume.
    I have to load both attribute and text data.
    Data up to PSA level is correct, and the text data loaded successfully.
    When I load attribute data to customer master, it fails due to duplicate data records.
    Then in the DTP update tab, I select the check box for duplicate data records.
    As I select this check box, at the bottom it shows the
    message *ENTER VALID VALUE*.
    After this message I am unable to click any function, and the same message repeats again & again.
    So I am unable to execute the DTP.
    A helpful answer will get full points.
    So please give me a solution so that the above-mentioned message does not appear and
    I will be able to execute the load.
    Thanks .
    Saurabh jain.

    Hi,
    if you get duplicate data for your customer there might be something wrong with your data source or the data in psa. But anyway, leave the dtp by restarting rsa1. Edit or create the dtp again and press save immediately after entering edit mode. Leave the dtp again and start editing it. That should do the trick.
    regards
    Siggi
