Date bug in import process: 2 sets of dates from same-day shots

I took some photos the other day using my Canon 400D. Some were taken around 1:00 pm, and another series was taken around 4:00 pm. I then went to import them into Lightroom from a CompactFlash card, using a card reader, with the organize-by-date option set to the format 2005\2005-12-17. When Lightroom's import process gathered the file data for the images, I discovered that there were two sets of dates, viz. 2007\2007-02-24 and 2007\2007-02-25. Considering that I was doing the import on 2007-02-24, the second date was clearly wrong.
I checked the EXIF data and the files seemed OK. The timestamps on the files on the card were also as expected.
I then manually copied the image files to a temporary folder on my PC. When I pointed Lightroom's import function to that folder, lo and behold, the files were picked up and the correct single date for the target folder was set up for me.
Has anyone else come across this problem? Any suggestions on how to work around it from WITHIN Lightroom if it occurs again?
Witold

Are you by chance using Vista? That would be a "Known issue", the workaround being copying the images to the hard drive and importing from there instead of importing directly from the card/camera.
Alexander.
Canon EOS 400D (aka. XTi) • 20" iMac Intel • 12" PowerBook G4 • OS X 10.4 • LR 1 • PSE 4

Similar Messages

  • SMS_DISCOVERY_DATA_MANAGER Message ID 2636 and 620. Discovery Data Manager failed to process the discovery data record (DDR)

    Hi
    I'm seeing this critical error on my primary.
    SMS_DISCOVERY_DATA_MANAGER Message ID 2636 and 620. 
    Discovery data manager failed to process the discovery data record (DDR)"D:\Prog.....\inboxes\auth\ddm.box\userddrsonly\adu650dh.DDR", because it cannot update the data source.
    These DDRs actually end up under the ddm.box\userddrsonly\BAD_DDRS folder.
    I see a ton of DDR files in that folder. I'm not sure if I can delete them, so I moved them to a temp folder. AD User Discovery keeps generating them.
    Any help?
    Thanks
    UK
    

    Check the ddm.log file for more information.
    My Blog: http://www.petervanderwoude.nl/
    Follow me on twitter: pvanderwoude

  • Importing two sets of pictures into same collection

    Hi,
    I have two sets of pictures on 2 different CF cards and would like to import them into the same collection because they belong together. I already imported one set of photos from one CF card into the appropriate collection name. Since there doesn't seem to be an ADD TO EXISTING COLLECTION option, how does one add pictures from another folder to an existing collection? I know one can drag and drop them, but I have over 100 additions.
    Don

    > I'm using LR 2. My folders are named Switzerland 1-6 for all my Swiss photos taken on 6 CF cards.
    Are you splitting the shoot into 6 folders just because one CF card happened not to have enough room for all the photos? Why not just put them into one folder?
    > I didn't see an ADD TO COLLECTION option so I just dragged and dropped this time, but it would be so much easier to ADD them.
    Select photos in folder view, right-click the target collection and choose "Add Selected Photos to this Collection".
    > My collections are named according to the town or canton where the photos were taken.
    If the collections are based on metadata, then smart collections are indeed the best way. Use the proper fields in the Metadata panel to add towns and cantons, then define a smart collection that says "City contains Zurich". And you're done. Automatically.

  • EHS - CG36 - Import report - Set MSDS version from key file data

    Hi.
    This is in reference to thread EHS - CG36 - Import report - how to define MSDS version in key file?
    I'm faced with the same client requirement, came across this discussion, and am wondering whether a solution was ever found.
    I'm on ECC 6.0. My client requests to retain the version of the MSDS at the time of export (CG54 Dok-X, VER key file data) when it gets imported into another system (CG36). Example (similar to Roy's): if the reports in the export system are 1.0 and 2.0, then they must be created in the import system as 1.0 and 2.0, not 1.0 and 1.1.
    Apparently, as Christoph has stated, the standard CG36 Import process doesn't make use of the VER key file data besides storing it into the Report's Additional Info (DMS Class charact.).
    I tried to get around this via the user exit. In the IMPORT FM, I set the version/subversion of the report to be created, but the C1F3 FM that does the actual report creation just ignores it. If you've taken a similar approach, what have I missed? I'm also afraid I may have to clone the C1F3 FM...
    I appreciate your thoughts and inputs.
    Thanks in advance.
    Excerpt from my fm ZC13G_DOKX_SDB_IMPORT:
    FORM l_create_ibd_report...
      IF e_flg_error = false.
    *   fill the report_head
        e_report_head-subid     = i_subid.
        e_report_head-sbgvid    = i_sbgvid.
        e_report_head-langu     = i_langu.
        e_report_head-ehsdoccat = i_ehsdoccat.
        e_report_head-valdat    = sy-datum.
        e_report_head-rem       = i_remark.
    *beg-LECK901211-ins
        e_report_head-version    = i_ver.
        e_report_head-subversion = i_sver.
    *end-LECK901211-ins
    * Begin Correction 15.06.2004 745589 ***********************************
        IF ( l_api_subjoin_tab[] IS INITIAL ).
    *     create the report
          CALL FUNCTION 'C1F3_REPORT_CREATE'
            EXPORTING
              i_addinf            = i_addinf
              i_flg_header        = true
              i_flg_subjoin       = false
            IMPORTING
              e_flg_lockfail      = l_flg_lockfail
              e_flg_error         = l_flg_error
              e_flg_warning       = l_flg_warning
            CHANGING
              x_api_header        = e_report_head
            EXCEPTIONS
              no_object_specified = 1
              parameter_error     = 2
              OTHERS              = 3.
        ELSE.

    The solution is to incorporate the logic used by transaction CG36VEN, that is, performing a direct update of table ESTDH after the new report is saved to the database.

  • Can I import two sets of data to Essbase at the same time?

    For example, if I use two MaxL scripts to import data into one Essbase application database at the same time, is there any impact? Will one of them fail, or will nothing happen except that the loads run slower?

    Hi,
    It is possible to run two data loads at the same time. There will probably be more of a performance impact than just loading one file at a time; I am not sure how big, so you will have to test.
    If you are talking about a BSO cube, you could also have a look at the Essbase configuration setting DLTHREADSWRITE to see if you can improve the data load time by increasing the number of threads.
    If it is an ASO cube, then you should be able to do multiple data loads into different buffers, though you have to commit them to the database at the same time.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Best Practice for data pump or import process?

    We are trying to copy existing schema to another newly created schema. Used export data pump to successfully export schema.
    However, we encountered some errors when importing dump file to new schema. Remapped schema and tablespaces, etc.
    Most errors occur in PL/SQL... For example, we have views like below in original schema:
    CREATE VIEW oldschema.myview AS
    SELECT col1, col2, col3
    FROM oldschema.mytable
    WHERE col1 = 10
    Quite a few functions, procedures, packages and triggers contain "oldschema.mytable" in DML (insert, select, update) statements, for example.
    Getting the following errors in import log:
    ORA-39082: Object type ALTER_FUNCTION:"TEST"."MYFUNCTION" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TEST"."MYPROCEDURE" created with compilation warnings
    ORA-39082: Object type VIEW:"TEST"."MYVIEW" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"TEST"."MYPACKAGE" created with compilation warnings
    ORA-39082: Object type TRIGGER:"TEST"."MYTRIGGER" created with compilation warnings
    A lot of actual errors/invalid objects in new schema are due to:
    ORA-00942: table or view does not exist
    My question is:
    1. What can we do to fix those errors?
    2. Is there a better way to do the import with such condition?
    3. Update PL/SQL and recompile in new schema? Or update in original schema first and export?
    Your help will be greatly appreciated!
    Thank you!

    I routinely get many (MANY) errors as follows and they always compile when I recompile using utlrp.
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_LASTOUTPUNCH" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_REFPERIODENDFOREMP" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_TAILOFFSECS" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."FN_GDAPREPORTGATHERER" created with compilation warnings
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/ALTER_PROCEDURE
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ABSENT_EXCEPTION" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_BAL_PROJ" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_DETAILS" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_SUMMARY" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACTUAL_SCHEDULE" created with compilation warnings
    It works. In all my databases: peoplesoft, kronos, and others...
    I should qualify that it may still be necessary to debug specific problems, but the most common ones are easily resolved using utlrp.sql. The usual problems I run into are caused by database links that point to another database, such as a production environment that we firewall off from our test and development databases (for obvious reasons).
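    In case it helps, here is a minimal sketch of that recompile step, run in SQL*Plus connected as a DBA (utlrp.sql itself is meant to be run as SYSDBA); the schema name TEST is just the remapped target taken from the log excerpts above:
    -- List the objects that came over invalid after the import
    SELECT object_name, object_type
    FROM   dba_objects
    WHERE  owner = 'TEST'
    AND    status = 'INVALID';
    -- Recompile everything invalid in the database, as suggested above
    @?/rdbms/admin/utlrp.sql
    -- Or recompile just the one schema
    EXEC DBMS_UTILITY.COMPILE_SCHEMA(schema => 'TEST');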

  • Constraint on a Date Column to Not Allow Setting Weekend Dates

    We are utilizing SQL Server 2008 R2 for scheduling AD user migrations. We keep track of all user accounts to be migrated and have a column for Migration_DATE with data type "Date". 
    I am looking to add a constraint to the Migration_DATE column to not allow scheduling dates on weekends. In other words, I don't want anyone setting dates that fall on a weekend, or dates from a list of dates in a table. Can anyone help me with how to do this, or point me in the right direction?
    Thanks much!

    Hello,
    Based on your description, you have a table used to track AD user migrations, and you want the date column "Migration_DATE" to disallow weekend dates. If I understand correctly, you can add a CHECK constraint to validate inserted values:
    ALTER TABLE table_name ADD CONSTRAINT ck_Migration_date
    CHECK (DATENAME(WEEKDAY, Migration_DATE) NOT IN ('Saturday','Sunday'));
    Reference: Creating and Modifying CHECK Constraints
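    A quick, self-contained way to try the constraint out (the table, column values and sample dates below are placeholders, not the poster's real schema; also note that DATENAME(WEEKDAY, ...) returns day names in the session language, so the check assumes an English language setting):
    CREATE TABLE dbo.UserMigration (
        UserName       nvarchar(128) NOT NULL,
        Migration_DATE date          NOT NULL,
        CONSTRAINT ck_Migration_date
            CHECK (DATENAME(WEEKDAY, Migration_DATE) NOT IN ('Saturday', 'Sunday'))
    );
    -- Succeeds: 2014-03-03 was a Monday
    INSERT INTO dbo.UserMigration (UserName, Migration_DATE) VALUES (N'jdoe', '2014-03-03');
    -- Fails with a CHECK constraint violation: 2014-03-01 was a Saturday
    INSERT INTO dbo.UserMigration (UserName, Migration_DATE) VALUES (N'jdoe', '2014-03-01');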
    Regards,
    Fanny Liu
    TechNet Community Support

  • Data modeler - how to assign process/function to data model

    Hello all
    Problem path:
    sd11 (data modeler)
    i.e. model: UNIMODELL (this is the model used for training on help.sap.com)
    There is a [Functions/Processes] button (F8).
    When using UNIMODELL there is a process assigned to the model and I can choose it (it goes to the Display Module: A.... screen).
    But:
    when I create my own data model and click [Functions/Processes], I get the
    "No functions assigned" information.
    Where can I add functions/processes to my data model?
    Thanks for any help
    Mateusz

    What method did you use?
    If FK is created and not removed then no scope clause is added - no need for that.
    It works for me:
    CREATE TABLE TABLE_2 (
      Column_2 REF StructuredType_1,
      Column_3 REF StructuredType_1
    );
    ALTER TABLE TABLE_2
      ADD ( SCOPE FOR ( Column_2 ) IS TABLE_1 );
    ALTER TABLE TABLE_2
      ADD ( SCOPE FOR ( Column_3 ) IS TABLE_3 );
    table_1 and table_3 are of StructuredType_1
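    For anyone trying this outside the modeler, here is a minimal self-contained sketch of the same DDL (the object type attribute and the two object tables are assumptions added only so the script runs on its own):
    CREATE TYPE StructuredType_1 AS OBJECT (attr1 NUMBER);
    /
    CREATE TABLE TABLE_1 OF StructuredType_1;
    CREATE TABLE TABLE_3 OF StructuredType_1;
    CREATE TABLE TABLE_2 (
      Column_2 REF StructuredType_1,
      Column_3 REF StructuredType_1
    );
    -- Each SCOPE clause restricts its REF column to rows of one object table
    ALTER TABLE TABLE_2 ADD ( SCOPE FOR ( Column_2 ) IS TABLE_1 );
    ALTER TABLE TABLE_2 ADD ( SCOPE FOR ( Column_3 ) IS TABLE_3 );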
    Philip

  • Process chains have been running for 3 days

    Hi BW experts,
    Daily our process chains run in the night time.
    But Friday's jobs are still running, so I stopped all process chains on Saturday.
    Friday's jobs have still not finished; they are still running and show as active.
    Some process chains are stopped at CREATE INDEXES.
    Some process chains are stopped at ODS Activation.
    These are in yellow status.
    Please can anyone tell me the solution for this? I have to start our process chains tonight.
    How can I rectify this problem? Can we stop the yellow-status jobs?
    Thanks & Regards
    Anjali

    Hi 
    Check in the process monitor whether the loads have completed or not.
    Sometimes the status will show yellow even though the load has completed. Otherwise, check the source systems from which you are getting the data; there may be a problem there.
    Cheers
    Sunil Reddy LCP

  • 2nd Display & iMac w/ 2 Different Sets of Content from Same iMac

    Can I do this? (How can I do this?) I'd like to connect a second display to my iMac and use the iMac to generate content (like live feeds or video content from CNN.com) to play on the second display while I "work" on the iMac and its screen. Thus, each display would have its own content, but the content is generated by the one iMac. Could I use an HD TV as the second display?
    Thanks for your guidance.
    Bill

    Depends on the iMac, but some of them do support this.
    Look yours up at  http://www.everymac.com/systems/apple/imac/index-imac.html  and look for the section about displays and 2nd display support under your model.
    With the second display connected to the appropriate port, you use the Displays pane in System Preferences to configure them. If you choose "mirror", both displays show the same thing. If you choose not to mirror, you get an "extended desktop" where you can place different windows on different monitors and do exactly what you want to do. An HD TV might be usable, but you have to look up what your iMac can handle in terms of pixels and resolution and compare that to your HD TV's specs. You can certainly buy an inexpensive monitor to use if your HD TV isn't compatible.

  • LR Import organization with visible date/time

    The new LR3 Import interface has proven to be a challenge for the workflow of my business application.
    I take numerous shots throughout my work day of a variety of subject matter that subsequently needs to be imported and grouped by client name. I delegate the import duties to a staff member. The task requires that the staff member organize the images in the import process into separate subfolders named for each client. The staff member views the image thumbnails in the import interface, selects the appropriate images to import for each client with the "Check Box" functionality, and names an output folder for the imported images in the Destination panel (with "Organize: Into one folder" designated).
    Here is the problem: our workflow requires that we perform the import process only occasionally throughout the day, which necessitates handling and separating the images of multiple clients. While LR3 does display thumbnails on import, it no longer initially displays a list that shows the date and time of the image capture. LR2 used to display an initial list that showed the date and time. It was easy for my staff member to determine which images belonged to which client after the fact by coordinating capture time with client appointment time. That ability is now removed.
    I realize there are workarounds possible, including using separate CF cards for each client or shooting an initial frame of the client's name. While minor, these steps seem to be unnecessary inconveniences. I would much prefer to rely on the software for the solution.
    I noticed that a "rollover" with the mouse temporarily displays the date and time of the capture. Since LR has the capture info readily available, my enhancement request would be simply to display the date and time of the capture on the image border frame. This would eliminate the need to roll over to view the data and would make our import organization easier and more streamlined. (For those users who might feel that this creates too much "clutter" in their initial viewing of images to be imported, the feature could be made a user preference that could be toggled on and off, like the data preferences selected in View Options with Ctrl-J and toggled with the keyboard shortcut I.)

    Sorry but you're going to need to explain that again.

  • What is this message: Base line date for rule 11 not set?

    Hello all,
    I am doing some test.
    Scenario:
    I need to create a Scheduling Agreement in transaction: VA41
    The material I am using to create it in VA41 is one I have just created.
    In MM01 I created the views: Basic Data 1 and 2, Sales Views 1, 2 and 3, and Accounting View 1.
    Issue:
    When I am creating the VA41 and I enter the material, the system shows the messages:
    1) Dates from:: Base line date for rule 11 not set
    2) Dates from:: Base line date for rule 9 not set
    Note: the messages do not block the creation in VA41; they are information messages.
    Question:
    What does that mean? What will happen if I do not fix it? What should I do in order to stop getting these messages?
    Tks & Rgds,
    Barbara

    Hi
    Please check whether you have assigned payment terms at both the company code and sales area level.
    VVR

  • How can I set document date automatically as system date?

    I use FBV1, FBV2, F-02 and FB02. The posting date is automatically set to the system date, and I want the document date to be set to the system date automatically as well. How can I do that?
    Wbr.
    İlker Çokkeçeci
    Computer Engineer
    Ankara, Turkey

    Hi Expert,
    Please refer to the link below:
    Document date to be defaulted as system date for all FI transactions
    As mentioned there, please go to transaction code SHD0.
    Regards,
    GK
    SAP

  • Data packet not yet processing in ODS load??

    Hi all,
    I got an error when I loaded data from IS to the ODS. Can someone let me know why and how to resolve it. Thank you in advance.
    Here is the error message in the monitor:
    Warning: data packets 1 & 2 arrived in BW; processing: data packet not yet processed.
    (No data packet numbers could be determined for request REQU_77H7ERP54VXW5PZZP5J6DYKP7)
    Processing end:
    Transfer rules (0 records): missing messages
    Update PSA (0 records): missing messages
    Update rules (0 records): missing messages

    John,
    I don't think it's a space problem. In ST22, go through the detailed note.
    It explains what happened and how to correct it, and will help you to solve the problem.
    Check note 613440 also.
    Note 647125:
    Symptom
    A DYNPRO_FIELD_CONVERSION dump occurs on screen 450 of the RSM1 function group (saplrsm1).
    Other terms
    DYNPRO_FIELD_CONVERSION, 450, SAPLRSM1
    Reason and Prerequisites
    This is caused by a program error.
    The screen contains unused, hidden fields/screen elements that are too small for the screen check that was intensified with the current Basis patch (kernel patch 880). These fields originate in the 4.0B period of BW 1.0 and are never used.
    Solution
    Depending on your BW system release, you must solve the problem as follows:
    BW 3.0B
               Import Support Package 14 for 3.0B (BW 3.0B Patch 14 or SAPKW30B14) into your BW system. This Support Package will be available when note 571695 with the short text "SAPBWNews BW 3.0B Support Package 14", which describes this Support Package in more detail, is released for customers.
    BW 3.1 Content
               Import Support Package 8 for 3.1 Content (BW 3.10 Patch 08 or SAPKW31008) into your BW system. This Support Package will be available when note 571743 with the short text "SAPBWNews BW 3.1 Content Support Package 08" is released for customers.
    The dump occurs with the invisible G_NEW_DATUM date field on the bottom right of the screen, which is only 1 byte long and can be deleted.
    You can delete the following unused fields/screen elements:
    %A_G_NEW_NOW     Selection field group
    G_NEW_ZEIT       Input/output field
    G_NEW_UNAME      Input/output field
    G_NEW_DATUM      Input/output field
    %#AUTOTEXT021    Text field
    G_NEW_NOW        Selection button
    G_NEW_BATCH      Selection button
    You can delete these fields/screen elements because they are not used anywhere.
    This deletion does not cause any problems.
    After you delete the fields/screen elements, you must also delete the following rows in the flow logic in screen 450:
    FIELD G_NEW_DATUM           MODULE DOKU_NEW_DATUM.
    FIELD G_NEW_ZEIT            MODULE DOKU_NEW_ZEIT.
    The function group is then syntactically correct.
    Unfortunately, we cannot provide an advance correction.
    The aforementioned notes may already be available to provide information in advance of the Support Package release. However, in this case the short text will still contain the words "preliminary version".
    For more information on BW Support Packages, see note 110934.
    Thanks
    Ram

  • User - Maximum logon duration, set expiry date ?

    Hello,
    Wanted to know whether BPC NW supports user setting of:
    Maximum logon duration: does the session become inactive after a certain amount of time if you do not execute anything?
    Expiry date of user: can you set expiry dates, and does a user automatically expire when you don't log on for a certain number of days?
    I would appreciate your inputs.
    Thanks.

    Hi Ellora,
    I don't think it is possible in BPC NW, whereas in the BPC MS version we have the Management Console, which gives information about all users who have been online for a given period of time.
    In BPC NW we have User Activity. BPC logs user and administrator behavior by recording information about each remote function call made from .NET to ABAP.
    Regards,
    Raghu
