EXIF time altered on conversion from PEF to DNG

Converting Pentax PEF files from my *istDS to DNG using the Adobe DNG Converter (4.1) results in the EXIF date/times all being shifted forward by six hours. I gather that the camera and the converter are miscommunicating about the time zone. But the question is: short of running a batch operation to correct the times after the fact, what can be done to prevent this from happening in the future? I've also asked in the Pentax forums, in case someone there knows of an answer from the camera side of the equation.
One workaround I've found is to generate XMP files for my images before running the converter. I have discovered that if the EXIF info is present in the XMP file - and ACDSee Pro 2 will do this for me by default as soon as I try to write copyright info or anything else to my PEF - then the DNG Converter will use that instead of the "real" EXIF info to populate the EXIF fields of the resulting DNG file. Actually, I suspect it is still mangling the "real" EXIF version of the info in the DNG, but it also reproduces the XMP version, ACDSee uses the latter, and that's probably good enough for now. Still, I'd prefer other alternatives.

I still don't understand the issue *fully*. But examining my files with ExifTool, I see that the DNG Converter is *not* in fact altering the EXIF info per se at all. What it is doing is adding an XMP block to the DNG file that contains a modified version of the DateTimeDigitized field. This modified version actually contains the time as shot, but with the current time zone offset appended to it. That is, if the original EXIF dates read 09:00 and I am located in a time zone 6 hours behind GMT, the new DateTimeDigitized field created by the DNG Converter reads 09:00-06:00. Some applications will still display this as 9:00 AM, but others will display it as 3:00 PM - the GMT equivalent of 9:00 AM for folks in this time zone. In the case of the application I have been using (ACDSee Pro 2), if it sees time zone info on this field, it goes ahead and displays *all* times for the file in GMT, making it look like the converter has modified more than it has.
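To see why the same stored value can show up two different ways, here is a minimal Java sketch (the timestamp is made up for illustration):
    import java.time.OffsetDateTime;
    import java.time.ZoneOffset;
    import java.time.format.DateTimeFormatter;

    public class XmpOffsetDemo {
        public static void main(String[] args) {
            // What the DNG Converter writes: the as-shot wall-clock time with the
            // converting computer's UTC offset appended (here, six hours behind GMT).
            OffsetDateTime digitized = OffsetDateTime.parse("2006-04-05T09:00:00-06:00");
            DateTimeFormatter clock = DateTimeFormatter.ofPattern("h:mm a");

            // An application that keeps the stored offset shows the time as shot.
            System.out.println(digitized.format(clock));                                       // 9:00 AM

            // An application that normalizes to GMT shows the equivalent UTC instant instead.
            System.out.println(digitized.withOffsetSameInstant(ZoneOffset.UTC).format(clock)); // 3:00 PM
        }
    }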
I don't know if the converter is wrong to append the current time zone offset, or if ACDSee (and, apparently, some other applications) is wrong to display the time in GMT. The folks at ACDSee are looking into whether and how they should change the behavior of their application. I would also suggest Adobe consider whether appending the time zone info from the computer on which the conversion is being run really makes sense (perhaps it could be made an optional behavior). But I am inclined to suspect the real problem is an overly vague specification - there may be no definitively correct behavior here.
So for me, this is enough understanding to feel that my workaround is the way to go. It actually suits my workflow better to generate XMP files before running the converter - it's the most convenient time for me to enter location information. Another workaround would presumably be to run ExifTool to delete the XMP:DateTimeDigitized field immediately after running the converter.
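That second workaround is easy to script. Here is a minimal sketch that shells out to ExifTool (assuming exiftool is on the PATH; assigning an empty value to a tag is ExifTool's syntax for deleting it):
    import java.io.IOException;

    public class StripXmpDigitized {
        public static void main(String[] args) throws IOException, InterruptedException {
            // -overwrite_original skips the _original backup copy ExifTool would otherwise keep.
            ProcessBuilder pb = new ProcessBuilder(
                    "exiftool", "-XMP:DateTimeDigitized=", "-overwrite_original", args[0]);
            pb.inheritIO();
            int exit = pb.start().waitFor();
            System.out.println(exit == 0 ? "Tag removed." : "ExifTool reported an error.");
        }
    }
Run it as java StripXmpDigitized photo.dng on each file right after the converter finishes.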

Similar Messages

  • Recovering data from Time Capsule after conversion from Snow Leopard to Lion

    Had a problem with my iMac, which was running Snow Leopard O/S.  Data was backed up on Time Capsule using Time Machine, although recent back-ups had failed (but earlier back-ups had been OK).  Apple support determined that I needed to re-install the O/S and they walked me through it.  After blowing away all data (believing that once O/S reinstalled, could recover from Time Capsule), the reinstall failed.  Took to Apple store, where they installed Lion at my request.  Then tried to recover from Time Capsule, but couldn't do so.  Sparse file doesn't work (it shows 10GB, which is wrong as it had several hundred GB's before due to lots of photos).  Apple Support couldn't solve issue, believes it was corrupted with the last backup.  Ultimately recommended that I have professional data recovery service recover data from the iMac hard-drive (as opposed to Time Capsule).  Spoke to recovery service recommended by Apple, and it is very expensive (also, given the O/S reinstall, anything they recover will have new file names and no directory structure).  They also said getting data off the Time Capsule is even more difficult and expensive to recover.  I'm not a happy camper. 
    I would welcome any thoughts on what to do.  FWIW, I live in South Florida and there appear to be lots of data recovery companies locally, but Apple support had said that they have only a few authorized vendors that can do it without voiding warranty, so I'm reluctant to even call them.  Thanks.  Steve

    Your backups may be corrupted.
    Try to repair them, via Disk Utility (in your Applications/Utilities folder).
    Mount the sparse bundle by opening the TC in the Finder and double-clicking on the sparse bundle. Drag the sparse bundle into Disk Utility's sidebar, and do a Repair Disk (not permissions) on it. If it finds errors it can't fix, run it again (and again), until it either fixes them all, or can't fix any more.

  • Run time error message - Conversion from string "Mobile Telecommunications" to type 'Boolean' is not valid.

    I get an error message when I select an element in the combo box. Please review my variables:
     Public Sector As String
        Public Index As Decimal
    ...and code
     Select Case ComboBox1.Text
                Case "Banks"
                    Sector = "Bank" And Index = -0.086
                Case "Mobile Telecommunications"
                    Sector = "Mobile Telecommunications" And Index = -0.024
                Case "Real Estate Investment Trusts"
                    Sector = "Real Estate Investment Trusts" And Index = 0.132
            End Select

    And is a Boolean operator, not a way to chain two statements, so VB evaluates Sector = "Mobile Telecommunications" And Index = -0.024 as a single expression and tries to convert the string to a Boolean - hence the error. Put each assignment on its own line instead:
    Select Case ComboBox1.Text
        Case "Banks"
            Sector = "Bank"
            Index = -0.086
        Case "Mobile Telecommunications"
            Sector = "Mobile Telecommunications"
            Index = -0.024
        Case "Real Estate Investment Trusts"
            Sector = "Real Estate Investment Trusts"
            Index = 0.132
    End Select
    Still lost in code, just at a little higher level.

  • Error while loading : Time conversion from 0CALDAY to 0FISCPER

    Hi ,
    I get the following error when loading to an ODS.
    My 0FISCPER is mapped to one of the date fields from the extractor (document date), and I also have 0FISCVARNT as part of the data fields.
    " Time conversion from 0CALDAY to 0FISCPER (fiscal year ) failed with value 20060405 " .
    thanks .

    Hello CG
    Please refer to this forum thread:
    Time conversion problem
    Thanks
    Chandran

  • Date/time conversion from UTC to CET

    Hello,
    Just a little question for the SAP BI community!
    Is there a BI function or a good way to convert a (datetime) string DDMMYYYYHH:MM:SS in the UTC time zone into a string DDMMYYYYHH:MM:SS in the CET time zone?
    We receive this string as input in a flat file and need to convert it from UTC to CET.
    Or do you know any other way to convert a date/time from UTC to CET, even using an ABAP function?
    Thanks a lot,
    Matthieu

    Hi,
    Check this thread for time conversion from UTC to Local time
    [ABAP statement for converting UTC date/time to local date/time],
    Regards,
    Daya Sagar
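    Outside ABAP, the same conversion is only a few lines in any language that ships a time-zone database. A minimal Java sketch, assuming the input really is DDMMYYYYHH:MM:SS in UTC and taking Europe/Paris as the CET zone (so the summer-time switch to CEST is handled automatically):
    import java.time.LocalDateTime;
    import java.time.ZoneId;
    import java.time.ZoneOffset;
    import java.time.format.DateTimeFormatter;

    public class UtcToCet {
        public static void main(String[] args) {
            DateTimeFormatter fmt = DateTimeFormatter.ofPattern("ddMMyyyyHH:mm:ss");

            // Parse the flat-file value as a plain wall-clock time, then pin it to UTC.
            LocalDateTime raw = LocalDateTime.parse("2512200917:30:00", fmt);

            // Convert the UTC instant to Central European Time and format it the same way.
            String cet = raw.atOffset(ZoneOffset.UTC)
                            .atZoneSameInstant(ZoneId.of("Europe/Paris"))
                            .format(fmt);

            System.out.println(cet); // 2512200918:30:00 (CET is UTC+1 in December)
        }
    }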

  • Record 1 :Time conversion from 0CALDAY to 0FISCPER (fiscal year A1 ) failed

    We get a red error when loading delta data; the error is:
    Record 1 :Time conversion from 0CALDAY to 0FISCPER (fiscal year A1 ) failed with value 20060426
    We went to the RSA1 -> Source Systems page, right-clicked on the source system, chose Transfer Global Settings, checked "Fiscal year variants" and "Rebuild tables", and ran it twice, but it did not help - we still get the above red error!
    Any idea?
    Thanks

    hi Anurag,
    You are right: running OB29 on BW shows there is no period entry for variant A1.  Since all the variants should come from R3 by running "Transfer Global Settings" on the source system, I ran OB29 on R3 and found the same thing - no period entry for fiscal year variant A1.  I will go ahead and ask the R3 people to correct it on R3, then run "Transfer Global Settings" again.
    Thanks

  • Time conversion from … Any help on this error in the PSA?

    Hi,
    In IDES, I chose a random InfoSource to test init and delta loads; in the process I created an InfoPackage for the load. During the load, the Monitor showed failure (red). The details of the message in the PSA are:
    “Time conversion from 0Calday to 0FISCPER (fiscal year) failed with value 19911021“
    Any guide on how to fix this problem?
    Thanks

    hi Amanda,
    0fiscper has compound infoobject 0fiscvarnt (you can check RSD1 tab 'compound').
    When uploading data, the conversion will fail if 0fiscvarnt is blank/initial or the date format/value is incorrect. Your date from 0calday seems correct, so 0fiscvarnt is probably blank.
    That seems to be your case. Check your transfer rules and the mapping of the 0fiscvarnt infoobject; you can fill it with a constant value - choose the option 'constant' and fill in K4. You can only use values that exist in table T009T.
    You can transfer these values from R/3 via RSA1 -> source system -> Transfer Global Settings: mark the fiscal year variants option and 'update tables', then execute.
    hope this helps.

  • Time Conversion from Local to UTC Help! - Daylight Savings Problem?

    Hi,
    I am trying to use the generic conversion of local time to UTC time.  I thought I had everything working since I use "Get Date/Time in Seconds" and then convert with "Seconds to Date/Time" with the "to UTC" flag set true.  Then I convert back to a timestamp.
    I thought I had everything working until I hit Daylight Savings for Pacific Standard Time.  Now LabVIEW is off by 1 hour!
    Is there an update to the subVIs that do this?  Is this a known error?  Please help.   Thank you!
    Attachments:
    test25_LocalToUTC_Time.vi 8 KB

    Here is the path you are taking that causes the logic to fall apart for you.
    1.  You are getting the current time.  In your images, it is 12:41 pm PDT which is 7:41 pm GMT
    2.  You are asking for the date/time record (that cluster that breaks out everything) but are saying it is UTC, so you get a date/time record that 19:41 UTC (7:41 pm).  And the DST flag is 0.  (Because UTC doesn't observe daylight savings time.)
    3.  Now  you are taking that date/time record and converting back to an actual date/time.  You don't wire up the UTC flag so it defaults to false.  Thus the date/time record is interpreted as local.  That DST flag is still 0 in your cluster.  So you are actually converting the time to 7:41 pm PST, which is actually 8:41 PDT.  (+1 hour for spring forward based on the month/date info.)
    4.  You are displaying that time stamp in the indicator labelled "Current Date/Time (UTC)", but it is not truly UTC, it is actually the conversion of a local time from PST to PDT, and it is not even the current local time.  It is actually a "local" time 8 hours into the future.  If you put the carat into that indicator's display format, you'll see that the UTC time is in the future as well.  You call it UTC, but you are displaying a future local time.
    The inconsistent conversions from local to UTC, and not accounting for the change of the DST flag between daylight time and UTC, are what is confusing you.  You kind of get lucky during standard time because the DST flag is 0 for both local standard time and UTC.  But the conversions are still wrong - it is a case of two wrongs making it look right.  Even in standard time, your input timestamp and your output timestamp indicator don't match, which you would see if you used an Equals? function on them.
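    The same round-trip trap is easy to reproduce in any language: break a UTC time into its calendar fields, then reinterpret those fields as local time, and the result drifts by the zone offset. A rough Java illustration (the date and zone are arbitrary; java.time has no separate DST flag, so this shows only the basic zone drift - LabVIEW's date/time record additionally carries a DST flag, and leaving it at 0 adds the extra hour described above):
    import java.time.LocalDateTime;
    import java.time.ZoneId;
    import java.time.ZoneOffset;
    import java.time.ZonedDateTime;

    public class UtcRoundTripTrap {
        public static void main(String[] args) {
            ZoneId pacific = ZoneId.of("America/Los_Angeles");

            // Steps 1-2 of the VI: take the current local time and break it out as if it were UTC.
            ZonedDateTime local = ZonedDateTime.of(LocalDateTime.of(2008, 6, 15, 12, 41), pacific);
            LocalDateTime utcFields = local.withZoneSameInstant(ZoneOffset.UTC).toLocalDateTime(); // 19:41

            // Step 3 of the VI: the same fields reinterpreted as *local* time instead of UTC.
            ZonedDateTime misread = utcFields.atZone(pacific);

            System.out.println(local);   // 2008-06-15T12:41-07:00[America/Los_Angeles]
            System.out.println(misread); // 2008-06-15T19:41-07:00[America/Los_Angeles] - hours in the future
        }
    }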

  • Project Conversion from 11.03 to R12- need help

    Hi All,
    I need some information regarding conversion from 11.03 to R12 along with Capital Projects conversion: do we need to take care of Asset Assignment and Assets information at the time of Project conversion? I am doing the Project and Task conversions separately. Is there any good mapping document or conversion FD? If anybody has one, please send it to my mail ID: [email protected]
    Thanks for your help.
    Thanks,
    Srini..

    Usually only Open Invoices and Unapplied Receipts are migrated from the old system to the new instance.
    Post-migration reconciliation will be tough in that scenario. Reconsider/rediscuss with the client.
    Migration can be done using APIs or Custom Scripts using Standard Interfaces.
    Hope this is helpful
    Regards,
    Sridhar

  • Deleting a Single Conversation from one Contact

    I have to say I find this a real let down with Skype.
    It is a breach of users' rights not to be able to delete one conversation and have it removed from all devices used.
    I use my Skype 24/7 for work and staying in touch with family, and therefore my history becomes FULL.
    Some of the conversations I have I need to keep for work reference, others with family, and non important stuff I need to delete.
    It is crazy that this cannot be done, and all you can do is delete your whole history - which is a non-option for me; otherwise I would lose things like customer quote details, details for flights, etc.
    Also deleting your history deletes all your favourites, and all the groups we have set up, eg Sales / Support etc !!!!
    Really stupid and not very efficient at all.
    Can someone from Skype PLEASE answer why this has been done as such ?

    Why would it be possible to delete a single conversation using Mac and not PC?  This thread, in the meantime is from 2013 and the issue has been raised multiple times, yet no response from Skype.  The management of the contact list and of information stored in the account is certainly as fundamental as being able to throw away old telephone bills and account statements.  The necessity to maintain messy and irrelevant has never been explained by skype that I can find.  Please change this, as only being able to "hide" a conversation leaves one with an ominous feeling of Big Brother is Watching You.
    Thank you.

  • Page numbers incorrect after conversion from Excel to pdf

    Page numbers incorrect after conversion from Excel to pdf
    ""This above link (thread:834599) is from a case back in 2011 that claims to solve this problem, but it does not solve this problem. I think that customer only cared about having continuous page numbering, not discrete page numbering per sheet.
    ========================
    I still have this issue in Acrobat XI and MS Office Professional Plus 2010. I keep upgrading to no avail. This regression has resulted in a huge time drain for me. If you fixed it, please explain how I can get my hands on the resolution.
    Previous versions of Excel and Adobe Acrobat enabled flexibility around the "Page #" of "Number of Pages" (Page &[Page] of &[Pages]) token, depending on context and usage. The "# of pages" token could represent EITHER the number of pages in the workbook OR the number of pages in the tab/sheet, depending on how you generated the PDF:
    You could select "Selected Sheets" and then select all or some of the individual sheets in the workbook, and the PDF would honor the discrete numbering of each of the sheets, so the first page of each sheet was p1 and the "# of pages" was the number of pages in the sheet; not the number of pages in the workbook; or
    You could select "Entire Workbook" and the PDF would honor continuous page numbers across all sheets, as a single document.
    Now, it only honors the total number of pages in the workbook, regardless of the method you use to publish to PDF: saving as PDF, printing to PDF, using "createPDF" from Acrobat plugin to Excel's menu ribbon; selecting all sheets, some sheets, or Entire Workbook; automatic First page number or "1" under Page Setup > Page> First page number. (This last option, btw, does restart every sheet at p1, but it hardly makes sense if the total number of pages is still the total number in the workbook instead of the number in the sheet.)
    I spent a lot of time trying each which way that the blog posts recommended and have tried this on multiple versions of Excel and Acrobat now.
    NONE of these time-consuming experiments gave me what I wanted. They all insist that "Page #" of "Number of Pages" (Page &[Page] of &[Pages]) is the total number of pages in the workbook or the total number of pages in the selected sheets combined.
    The numbering is correct in Excel's Page Layout view.
    The same issue happens when using LibreOffice calc. (Although, I never tested with Libre Office before, so I don't know that it ever worked).
    The workaround now is to create PDF for each spreadsheet one at a time, and then compile them using the Acrobat combine/binder feature. All alternatives are extremely time consuming and tedious. It used to be automatic. This is a major regression that has gone untreated for over a year now, maybe two years.
    My task takes infinitely more time to complete than it did with previous versions of Acrobat. That means that days are added to my project, when the functionality used to enable a quick pdf generation that was ready for review, now I have to do this very manual time-consuming set of steps to generate a draft. As the project has grown and more tabs are added, my pdf-generation task takes that much longer. We require lots of drafts. It used to be easy and fast. Now it is hard and time-consuming.
    In my opinion, the problem is not Excel; it is Acrobat because it was introduced with an upgrade in Acrobat, not an upgrade in Excel. The problem was introduced in Acrobat 9 or 10. Please provide a patch or add-on or something.

    If you are setting up the page numbers in Excel, the resulting PDF will display the page numbers created in Excel. The Excel 2010 support page (http://office.microsoft.com/en-us/excel-help/insert-and-remove-page-numbers-on-worksheets-HA010342619.aspx#BM2) states the following "tip", which indicates that by default Excel 2010 starts numbering each tab at 1. Excel's workaround tip is below:
    Set a different number for the starting page
    Tip   To number all of the worksheet pages in a workbook sequentially, first add page numbers to all worksheets in the workbook, and then use the following procedure to begin the page numbering for each worksheet with the appropriate number. For example, if your workbook contains two worksheets that will both be printed as two pages, you would use this procedure to begin the page numbering for the second worksheet with the number 3.
    On the Page Layout tab, in the Page Setup group, click the Dialog Box Launcher next to Page Setup.
    On the Page tab, in the First page number box, type the number that you want to use for the first page.
    Tip   To use the default numbering system, type Auto in the First page number box.
    Also helpful in the same section is the note on viewing page numbers. To see whether the page-numbering dilemma originates in Excel, make sure you are using Page Layout view; see below:
    If you want numbers shown on pages when you print a worksheet  you can insert page numbers in the headers or footers of the worksheet pages. Page numbers that you insert are not displayed on the worksheet in Normal view — they are shown only in Page Layout view and on the printed pages.
    Overall, it may be easier not to create the page numbers in Excel but instead to create them in Acrobat using the Headers and Footers option in Acrobat.  I hope this helps - it sounds like a frustrating issue you are experiencing.

  • Conversion from milliseconds to Date in 1.5

    A java.util.Date object can be constructed by passing the number of milliseconds since 1 January 1970 as a constructor argument. This Date can then be formatted to a human-readable format with SimpleDateFormat.
    I have tested this conversion from milliseconds to a date both in Java SDK 1.4.1 and Java 1.5.0 to see if there are differences in the way dates are calculated from milliseconds. It seems that there are differences when the system timezone is set to Europe/Berlin. The dates from 1.5.0 are one hour ahead of those from 1.4.1 in a certain week in May 1945 and a day in September 1945.
    This means that milliseconds that are generated from a date by using the Java 1.4.1 runtime and then stored are interpreted differently when they are retrieved when using java runtime 1.5.0, if they happen to be one of those days in 1945. This could cause discrepancies when an application is migrated to JDK 1.5.0.
    This is only a minor problem, but is there any way to know what caused these changes in SDK 1.5.0 and what these changes are? Is there historical data that the Sun implementation is based on to calculate dates from millisecond values?
    Any help is appreciated.
    Kind regards

    I found the following at "http://thedailywtf.com/forums/70146/ShowPost.aspx"
    In summer 1945, Berlin and the Soviet-occupied part of Germany observed a daylight savings time of two hours. Unfortunately, Sun's JRE 1.4 implementation of GregorianCalendar defines a maximum DST of one hour and, in non-lenient mode, rejects the 2 hours as invalid when recalculating all fields after the millisecond field is set.
    Here's the bug report: http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4639407
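    For anyone checking this today, the java.time API introduced in Java 8 uses the full IANA time-zone history, which includes the two-hour 1945 Berlin rule. A small sketch, assuming the JDK's bundled tz data still carries those historical rules (the millisecond value is just an arbitrary instant from late May 1945):
    import java.time.Instant;
    import java.time.ZoneId;
    import java.time.ZonedDateTime;

    public class Berlin1945 {
        public static void main(String[] args) {
            ZoneId berlin = ZoneId.of("Europe/Berlin");

            // Roughly 30 May 1945; milliseconds before 1970-01-01T00:00:00Z are negative.
            long millis = -776_000_000_000L;
            ZonedDateTime local = Instant.ofEpochMilli(millis).atZone(berlin);

            // The printed offset is what Berlin actually observed at that moment:
            // +03:00 with current tz data, i.e. two hours of DST on top of CET -
            // the very value that JRE 1.4's GregorianCalendar capped at one hour.
            System.out.println(local);
        }
    }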

  • IMessage - "new conversations from" ios 7   iMessage backups

    Hi, So I'm having two issues at the moment but I'll start with the "new conversations" issue. I just upgraded to the 5S and did a iTunes restore on the phone, restored using my backup and then activated iMessage which took a few hours before it actually happened. Any time I try and start a new conversation from my mobile (cell) number, iMessage automatically sends it as a standard SMS message (green) and not as a blue iMessage but when I start the conversation from an email address, iMessage works as normal but from "Messages" on my Macbook, I can send from my mobile number just fine so I'm not sure what's wrong.
    The second issue I'm having is that on my Mac, I have way more messages from one of my contacts, they go back quite a while but on my iphone they only go back as far as start of September. Any way how I can get iMessage on my Mac to force sync to my iphone and reload all my messages?
    Regards,
    Michael

    bump. first issue resolved, second I'm still experiencing issues with.

  • Implicit Conversion from data type sql_variant to datetime is not allowed.

    Getting an odd error. This code was working perfectly before a SQL Server upgrade.
    The linked database is working, I'm able to pull up data from it in separate queries just fine.
    I'm getting the following error.
    Implicit conversion from data type sql_variant to datetime is not allowed. Use the CONVERT function to run this query.
    Invalid column name 'TotalDay'. (.Net SqlClient Data Provider)
    Can anyone spot the issue? I've tried several variations of the same code, but still get the same thing.
    If I put this section in a query by it self it works just fine.
    ( DATEDIFF(ss,
          CONVERT(VARCHAR(10), ( SELECT TOP ( 1 ) TimeDate
                                 FROM [nav].AcsLog.dbo.EvnLog AS X3
                                 WHERE UDF2 = E.No_
                                   AND CONVERT(VARCHAR(10), X3.TimeDate, 101) = CONVERT(VARCHAR(10), @sdate, 101)
                                 ORDER BY TimeDate ASC ), 101),
          CONVERT(VARCHAR(10), ( SELECT TOP ( 1 ) TimeDate
                                 FROM [nav].AcsLog.dbo.EvnLog AS X4
                                 WHERE UDF2 = E.No_
                                   AND CONVERT(VARCHAR(10), X4.TimeDate, 101) = CONVERT(VARCHAR(10), @sdate, 101)
                                 ORDER BY TimeDate DESC ), 101)) ) AS TotalDayBadge ,

    > AND CONVERT(VARCHAR(10), X3.TimeDate, 101) = CONVERT(VARCHAR(10), @sdate, 101)
    It is not a good idea to use string dates for predicates in WHERE clauses.
    Use DATETIME or DATE in predicates.
    If you are not interested in the time part of DATETIME, use DATE datatype.
    Example:
    SELECT CONVERT(DATE, getdate());
    -- 2014-08-25
    Datetime conversions:
    http://www.sqlusa.com/bestpractices/datetimeconversion/
    Between dates:
    http://www.sqlusa.com/bestpractices2008/between-dates/
    Kalman Toth Database & OLAP Architect
    SQL Server 2014 Design & Programming
    New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012

  • Data conversion from 212 format

    I found a problem during data conversion.
    I have a file which contains ASCII characters (at least they look like ASCII in any editor). I have to convert them to integers, but I know the data is organised as pairs of 12-bit numbers packed into byte triplets (format 212).
    I made a VI (which converts two consecutive ASCII characters to binary, then splits them into pairs of 4 bits, then does some rotation, etc.), but the whole conversion takes too much time (converting a 64 KB file takes approx. 1 minute).
    Thank you for any suggestions,
    Michael ([email protected])

    Dear Michal,
    I hope this VI works faster.
    Input: 3 bytes, data type unsigned int (U8)
    Output: 2 values, data type integer (INT16)
    Of course, read the whole file into memory; don't read from the file 3 bytes at a time.
    Deniss
    PS: I sent the VI by e-mail.
    "Michal Szaj" wrote in message
    news:[email protected]..
    > On Wed, 18 Sep 2002 02:27:25 GMT, Greg McKaskle
    > wrote:
    >
    > >This is one of those things that is quite sensitive to how it is coded.
    > > If you make your VI available, others, including myself can give you a
    > >hand with speeding it up.
    > >
    > >As for guessing at what might be going on, first look at the array wire.
    > > If you are using locals or property nodes to access the array, that is
    > >the cause. If the array is written to a global and later accessed via
    > >the global, that is causing it.
    > >
    > >I suspect that the bit manipulations on 64K should take something like a
    > >few seconds.
    > >
    > >Greg McKaskle
    >
    > Thank you for your support.
    >
    > In this case I have only one vi and I don't use any global/local
    > variables.
    > At the beginning of the vi I load all text character from file to
    > memory and then I'm making further operations.
    >
    > If it helps I can send the vi with data file.
    >
    > Thnx,
    >
    > Michael ([email protected])
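    For reference, the 212 unpacking itself is just a few bit operations per byte triplet, so a 64 KB file should take well under a second if the whole buffer is processed in one pass. A rough Java sketch, assuming the MIT-BIH style 212 layout (low byte of sample 1, a shared nibble byte, then low byte of sample 2) and 12-bit two's-complement samples:
    public class Format212 {
        /** Unpacks format-212 byte triplets into signed 12-bit samples. */
        public static short[] unpack(byte[] raw) {
            int pairs = raw.length / 3;
            short[] out = new short[pairs * 2];
            for (int i = 0; i < pairs; i++) {
                int b0 = raw[3 * i] & 0xFF;
                int b1 = raw[3 * i + 1] & 0xFF;
                int b2 = raw[3 * i + 2] & 0xFF;

                int s1 = ((b1 & 0x0F) << 8) | b0; // low nibble of the middle byte = high bits of sample 1
                int s2 = ((b1 & 0xF0) << 4) | b2; // high nibble of the middle byte = high bits of sample 2

                // Sign-extend from 12 bits to 16 bits (two's complement).
                out[2 * i]     = (short) ((s1 << 20) >> 20);
                out[2 * i + 1] = (short) ((s2 << 20) >> 20);
            }
            return out;
        }

        public static void main(String[] args) {
            // 0x23, 0x81, 0x45 -> samples 0x123 (291) and 0x845 (sign-extended to -1979).
            short[] samples = unpack(new byte[] { 0x23, (byte) 0x81, 0x45 });
            System.out.println(samples[0] + ", " + samples[1]);
        }
    }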
