Lens Data

When Pentax updated the K10D firmware to version 1.1, the raw files (PEF or DNG) changed in a way that prevents LR from reading the lens data, so it no longer knows which lens was used. As I understand it, the internal numbering Pentax writes into the raw file to identify the lens changed, and LR does not yet recognize the new codes.
I believe this is probably a temporary situation; eventually a newer version of LR will come out that supports lens metadata for these K10D raw files.
My concern is this: when and if this problem is ever corrected, will I have to re-import my raw images so that their lens metadata will be acquired into the LR database?
A lot of "if's", I know. Hopefully my question actually makes sense! haha

Talk to Canon and Tokina. If that data is properly passed to the camera and properly recorded in the EXIF data by the camera, then iPhoto will display it. If it is not there, iPhoto cannot do anything about it; only the manufacturers can fix that.
LN

Similar Messages

  • Lens data from Exif

    This is a question that was raised in the Lightroom Forum, but I think it might fit better here. In Lightroom there is an option to sort images by the lens used, a very neat function, but when using raw files from Minolta (.mrw), or DNGs converted from them, no lens data is provided. The only information in there is "Unknown lens". Can this be fixed?

    Dana Gartenlaub wrote:
    > In File Info, Camera Data 1 lists any lens that is recent enough
    > to communicate its identity to the camera. This works on Pentax
    > PEFs or DNGs even when run through the DNG converter.
    It may work this way with Pentax PEF files but unfortunately doesn't with Minolta MRW or Sony ARW files. Of course, Minolta AF (and now Sony Alpha) lenses do report their identities to the camera, and the camera does write this info into the EXIF data but it ends up as an obscure lens ID in the Minolta (or Sony) Maker Notes where most 3rd-party software (including Adobe's) cannot (or would not) access it.
    Eric DeSilva wrote:
    > Adobe apps will store it in XMP format, but all the makernote
    > EXIF data is gone.
    This is not true! Adobe XMP won't read the Maker Notes sections but it does preserve them. When I convert an MRW file to DNG (with Adobe DNG Converter) and then create a TIFF or JPEG file from the DNG (with Camera Raw) then the Minolta Maker Notes will still be present in the final TIFF or JPEG file (and of course in the DNG file, too).
    Thomas Knoll wrote:
    > The problem of the lens data for Minolta/Sony cameras is the
    > lens ID codes have lots and lots of overlaps. Mostly because
    > Sigma released lenses that duplicate lots of the Minolta lenses
    > ID codes.
    Sigma lenses don't have Minolta/Sony lens IDs but just random numbers. Kindly ignore what Sigma lenses do, and please don't worry about which lens 'was really used.' Just provide us with the lens ID found in the EXIF Minolta/Sony Maker Notes data. Please!
    By the way, is there a way to access the Minolta Maker Note LensID tag (or any tag) in the EXIF data through the JavaScript API? If not then please add one!
    Currently, I enter the names of the Minolta/Sony lenses manually into the metadata. I customized my File Info panel 'Camera Data 1' so I can enter the data there in a fairly comfortable way. This is not too hard to do because I usually use no more than two or three different lenses per assignment. This way, the lens names go into the exif/1.0/aux tag named 'Lens.' Unfortunately, due to a bug in Bridge or in XMP, raw files and DNG files won't accept changes in this and a few other tags; only TIFF and JPEG files will. Instead, the change gets stored in Bridge's cache only. So I wrote a Bridge script to manipulate the XMP sidecar files' contents directly, to 'burn' the changes from the volatile cache permanently into the disk files' metadata.
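    (For what it's worth, a command-line tool such as exiftool can already decode the Minolta MakerNotes lens code and copy the human-readable name into the aux Lens tag of an existing XMP sidecar. A rough sketch, with the tag names assumed rather than verified:
    exiftool -tagsfromfile photo.mrw "-XMP-aux:Lens<LensType" photo.xmp
    This is only a sketch of the idea, not the script described above.)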
    All this would become *much* easier when 1) we had access to the lens ID in the Minolta (or Sony) Maker Notes and 2) someone would eventually care about fixing that $#!+ metadata bug in Bridge (or XMP) which was already there in Bridge 1.0 and still is around in Bridge 2.1.
    -- Olaf

  • Panasonic G1 lens data in LR 2.2

    It appears that LR 2.2 does not recognize lens data as encoded by the Panasonic G1, including that for the Lumix kit lenses. Looking at the RW2 files, I can't find any lens model data at all. The JPEGs do carry lens model information, but perhaps it is in some non-standard format?
    So, the questions:
    Does anyone know for sure that lens data is not encoded in RW2 files by the G1, including in some proprietary way?
    Does anyone know if the JPEG lens data from the G1 is in some way non-standard?
    Does anyone know if LR will be able to handle such lens data as may be present in either type of file in the future?
    Perhaps it would be possible to re-encode the data from the JPEGs in some way so that LR can use it?
    Thanks -
    Tim

    I don't like it either that Adobe chooses to handle only certain camera brands' lens data. It doesn't properly recognize lenses on my Olympus E620 either. Sometimes it will include something about the lens, but that's not the real lens name; it has picked it up from other fields in the metadata. Yes, you can pre-process the jpg files (and probably RAW too) to put the lens information in a place that Lightroom picks up.
    I also discovered that the Panasonic G1 and the Olympus E620 do not call the same lens by the same name, so at the same time I standardized the names for myself, so that if the 70-300mm lens is on either camera it shows up as the same lens.
    And the process is not so bad because now I'm merging GPS data into the files, as well as filling out the location fields automatically from the GPS data, so I am now doing a whole lot of pre-processing.
    Panasonic and Olympus put the lens name in the field LensType. If you put text in the XMP:Lens field then Lightroom picks it up.
    You can use exiftool (in Windows at least) to do this. You would need something like this one line to copy the information from the LensType field to the XMP:Lens field. Mine is more complicated because I put the lens name in explicitly.
    exiftool -overwrite_original  "-xmp:lens<${LensType}"  *.jpg
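    For instance, to write a standardized name explicitly instead of copying LensType, something along these lines would do (the lens name here is only an illustration):
    exiftool -overwrite_original -XMP:Lens="Lumix G Vario 45-200mm F4.0-5.6" *.jpg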
    I thought there was a LR plugin that would allow you to execute a command on import, but I don't see it now.
    Judy

  • Export Full EXIF (including lens data)

    It would be nice if exporting would include all of the EXIF from the original image. In particular, I would like the lens data included. (Yes, I do not have "Minimize Embedded Metadata" checked.)
    For example (among other things), the original image (taken with Nikon D200) includes (exported with exiftool):
    Focus Distance : 10.00 m
    Focus Mode : AF-S
    Focus Position : 0x05
    Lens : 18-50mm f/2.8 G
    Lens Data Version : 0201
    Lens F Stops : 6.00
    Lens ID : Unknown (7F 48 2D 50 24 24 1C 06)
    Lens ID Number : 127
    Lens Type : G
    Max Aperture At Max Focal : 2.8
    Max Aperture At Min Focal : 2.8
    Max Focal Length : 50.4mm
    Min Focal Length : 18.3mm
    All of which I would love to see in the exported images. (I'm sure there are other ones too that I just didn't notice off-hand.)
    jon

    Yes, I found this morning that camera data is not exported. This is a major problem. I am entering contests which require the metadata to verify that the photo was taken within a certain period. I just realized that this data is lacking on every picture that I have exported from LR. Luckily I can process those photos in DxO to retain the data.
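    As a possible workaround, if the original raw files are still at hand, exiftool can copy the maker-note block from the original back into an exported JPEG. A sketch only (the file names are placeholders, and the result is untested against LR exports):
    exiftool -tagsfromfile original.nef -makernotes exported.jpg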

  • No lens data under dpp 3.13.0

    Hi, I recently updated to DPP 3.13.0 and all of the lens data that I had brought in under the previous version has disappeared. If I try to update the lens data, the application returns the error "can't connect to server". Under the previous version of DPP I was able to connect and download the lens data. Is anyone else having this problem? I'm running on Mac OS 10.7.5.

    Kernel 3.13.7 contains the patch to fix this bug, from here on the mailing list: https://lkml.org/lkml/2014/3/20/819
    So now I don't have to hold back my kernel anymore, and this issue is resolved. Woopee!

  • Cyber Duck killed my Canon DPP Lens Data Downloads?

    I run a programme called DPP, Digital Photo Professional, which was working fine and I was able to download 'lens data' for this program.
    I then tried out Cyber Duck for some FTP transfers which went OK on my MBP.
    But now I find that although Mail and Safari work fine, the data download for DPP no longer works, all I get is 'unable to connect to server'.
    My MacMini has a copy of DPP on it and that is still able to download lens data, but I have not run Cyber Duck on that machine as yet.
    Even if I reinstall DPP it remains the same reporting 'unable to connect to server'.
    I tried contacting Cyber Duck, who just bumped me off saying it was a 'third party problem', which was not particularly helpful of them.
    So I am now at a loss as to what to do.
    It would seem to point to something that may have been altered by using Cyber Duck, but what, I do not know. All the internet connections seem to be correct as far as I can tell.
    Anyone have any idea what could have been changed to give me this problem?
    Many thanks for any help.

    I rolled my system back to a clone that I made before I downloaded and used Cyberduck.
    DPP is now able to download its Lens Data. So it would appear that Cyberduck does alter something which breaks DPP and stops it from connecting to the Lens Data server.

  • Lens Data- Why Can I Not Find It?

    I use a Mac and ran the trial version of Aperture and liked it, but it did not run very smoothly, so I have purchased LR3. The features and utilities are about the same with one exception: Aperture automatically gives you the lens name along with all the other EXIF photo data (aperture, zoom, etc.). Lightroom gives me aperture, zoom, ISO and shutter, just like Aperture, but will not tell me the lens make and model. Am I missing something? I looked at that lens profiler program but didn't understand why I need it (and looking at the instructions, with a checkerboard and other extra steps, it seemed unnecessary). If the camera records the lens along with the other pertinent information automatically (it must, as Aperture picked it right up), why can I not find it?

    I use a Pentax *ist DS and have a few different lenses. Some are older manual focus and a few are autofocus ones. The camera body reads the autofocus ones, and whenever the photos were loaded into Aperture, it gave the usual EXIF shot data as well as which lens was being used (whenever the autofocus ones were attached). I'm hoping that LR3 can do the same thing without having to load a separate plug-in program to do it. I figured if the camera does it and Aperture picks it up, LR3 should too, right?

  • How to poll Aperture lens data for usage graph - Filters?

    Does anyone know how to set up a Filter in Aperture 3.1.1 so that one can see a list (or make a list) of the most to least common focal length one shoots at with interchangeable lenses?
    I assume it will probably have to be a search for all 20mm, then all 28mm, then all 35mm, then all 50mm, etc., and then look at the totals for each, pencil them down, and see which focal lengths are the most and least used.
    I have set up separate camera smart folders, so I would need to conduct this search/filter only on the two DSLR bodies I have. Otherwise, a Nikon scanner, Minolta scanner, a G10 and some other point and shoots will come into the mix and make it inaccurate.
    My objective is to see which lenses are redundant and which are less used, then sell the extras.

    Joseph Coates wrote:
    My objective is to see which lenses are redundant and which are less used, then sell the extras.
    Seems you should filter by lens and not by focal length.
    I keep a permanent set of Smart Albums, one for each lens I have. (Create one, then dupe it and change the Lens field and the Album name. Get the lens name from the metadata of a shot taken with the lens.) Periodically I check the totals to see my usage pattern.
    I recommend also setting up a series of Smart Albums for focal length. (As above, create one, then dupe and change the filter field and Album name. Group them in one Folder.) If you use cameras with different size sensors, you might want to set up both an actual focal length set, and a "35mm equivalent focal length" set. Use whatever ranges you find meaningful.
    Note that you can select any number of Folders to open all of them in the Browser and get a total number of images. Note that this number will change as you expand or collapse Stacks, and so the totals given depend somewhat on how you use Stacks. Note, too, that Smart Albums use very little overhead (the only thing stored are the criteria). Don't hesitate to make as many as you find useful.

  • Lens data show properly for Sony lenses

    Guys,
    Any possibility of correcting this? I read through old messages and saw feedback on how some Sigma lens IDs overlap with Sony's, but showing neither is not a good thing. Can you make the lens ID user-definable in the worst case?
    Thanks

    "I read somewhere that sigma provided all the profiles for their lenses and looking at the long long Canon and Nikon lists, I can't imagine Adobe went and bought/rented a warehouse of premium lenses purely to profile them."
    True by all accounts! Canon and Nikon are not doing what is in the best interest of their own customers (OK, this is my own statement). A number of other lens manufacturers have indeed followed SIGMA's example and provided Adobe with lens profiles. The list is growing. The user community has also helped create close to 1000 lens profiles. Thank you all!
    -Simon

  • Full lens data for Minolta / Sony Lenses

    At the moment the metadata browser for Minolta and Sony users is pretty useless. The most attractive feature is being able to automatically list the metadata for the lenses, but Lightroom is unable to do this. C'mon Adobe, if small apps like Dalifer can decode the MakerNote info, why can't you?

    Yep! My lens metadata needs to be decoded by Lightroom too...

  • Photoshop not seeing lens model in exif data from A3 export

    Hi, I am trying to use the lens correction features in CS5, but the lens type can't be read in the EXIF data of an image I've exported from A3. It sees the focal length and body type. The lens type is listed in the Metadata Inspector in A3. I took the image into iPhoto and checked "extended photo info" and the lens type isn't there either. This happens with at least two lenses, one prime and one zoom. Any ideas?

    Hi again
    Having played with metadata a bit more, it is possible to include EXIF data for the lens, e.g. EF300mm f2.8L IS USM. But it doesn't append the manufacturer's name; I use Canon, yet 'Canon' is not appended to the Lens data like it is to the camera name.
    In the Library Inspector click Metadata; where it says 'General' click the down arrow, then click Edit; from the window that appears click EXIF, and from the drop-down list click Lens (scroll down a bit).
    The lens you used will now be appended to your data.
    I have not tried exporting this to Photoshop, so maybe you could post back whether it's possible to export this data to PS or not.
    Maybe someone has information on this and can chip in here with advice....
    Hope this is helpful....... Gerry....
    PS to all reading this, I also noticed while playing around with this (I have not delved into metadata too much since Ap2) that my personal details like mobile number, address, email etc. are all appended by default; I imagine my address book info has been used here.
    A useful feature indeed while selling photos, or if you intend it to be there..... but not so good if you are not aware it's there in the first place.....!
    Just a shared thought to whom it may concern....... Gerry.

  • CS6 Bridge "Lens Specification Data" appears incorrect (W7 Pro 64Bit SP1)

    Viewing Camera Data (Exif) in CS6 Bridge
    No changes to default settings applied in CS6 Bridge
    Observed Results:
    The new "Lens Specification Data" under the Camera Data (Exif) appears to be be correctly reporting the lens type but shows the lens as "f/0" which is incorrect
    This appears to be the case with pictures taken with two different lenes both of which are reported as f/0
    Further up in the same tab the "Lens" data is correctly reported for the lens type and maximum aperture.
    Expected Result:
    Wanted to see the maximum aperture of the lens reported in addition to the lens type, which I would have expected to be the same data as reported under the "Lens" information higher up in the same panel.
    Which begs the question: what is the difference between the "Lens" information and the "Lens Specification Data" meant to be?
    Observed these results with the following two lenses:
    EF-S17-55mm f/2.8 IS USM
    EF70-200mm f/2.8L IS II USM
    System Info:
    Won't let me paste in :(

    Hi Baichao,
    Thank you for your response.
    The camera in question is a Canon 7D (actually I shoot with a pair of 7Ds and the results are the same for both cameras), the files are both .CR2 and .JPG, with the same results being seen for both file formats. The images have not been edited in any other applications, but viewed immediately after download to the PC from the memory card.
    I have also viewed these and other older images in both the CS6 and CS5 versions of Bridge for comparison purposes and see the same differences irrespective of how old the image is or whether it has been previously edited in CS5, for both .CR2 and .JPG files.
    In addition I also don't see an indication of the subject distance displayed in CS6 bridge, although the option is selected and works as expected in CS5 bridge.
    I will send you a separate email with screen shots showing an example in both CS5 and CS6.
    Regards Mark

  • EXIF data LENS MODEL information is not displayed

    I have imported my photos into Aperture 2.0.1 and notice that in the EXIF data, the Lens Model is blank. I know the information is in the RAW image file because when I open it with other RAW converters I can see the lens information that I used to take the shot. I am not sure why this information isn't showing up or how to get it to see it or display it.
    Don't know if it matters but I shoot with a Pentax K10D.

    The lens data doesn't appear for my K10D either, so I guess it does matter. The images I'm working on were just imported into Aperture 2.0.1. I've switched between Raw 1.0 and Raw 2.0 with no change. The field is still blank. It gets the focal length right, so I expect the data is there. I'll bet it works fine for most of the Canikons.
    I really would like to be able to use this field for searching. I'll keep checking though.

  • Metadata - how to show lens specs for ARW images

    In the metadata listing (in Library Mode) when I open a NEF image, data regarding the lens used are listed (e.g. 85mm f1.8).
    However when I open a Sony A700 ARW file no lens data are listed. How can I show lens data for ARW files too?
    Thanks for any advice!

    Hope my suggestion of just extracting Month and Year worked?

  • BTREE and duplicate data items: over 300 people read this, nobody answers?

    I have a btree consisting of keys (a 4-byte integer) and data (an 8-byte integer).
    Both integral values are "most significant byte (MSB) first" since BDB does key compression, though I doubt there is much to compress with such a small key size. But MSB also allows me to use the default lexical order for comparison and I'm cool with that.
    The special thing about it is that for a given key, there can be a LOT of associated data, thousands to tens of thousands. To illustrate, a btree with an 8192-byte page size has 3 levels, 0 overflow pages and 35208 duplicate pages!
    In other words, my keys have a large "fan-out". Note that I wrote "can", since some keys only have a few dozen or so associated data items.
    So I configure the b-tree for DB_DUPSORT. The default lexical ordering with set_dup_compare is OK, so I don't touch that. I'm getting the data items sorted as a bonus, but I don't need that in my application.
    However, I'm seeing very poor "put (DB_NODUPDATA) performance", due to a lot of disk read operations.
    While there may be a lot of reasons for this anomaly, I suspect BDB spends a lot of time tracking down duplicate data items.
    I wonder if in my case it would be more efficient to have a b-tree with the combined (4-byte integer, 8-byte integer) as key and a zero-length (or 1-byte) dummy data item (in case zero-length is not an option).
    I would lose the ability to iterate with a cursor using DB_NEXT_DUP, but I could simulate it using DB_SET_RANGE and DB_NEXT, checking whether my composite key still has the correct "prefix". That would be a pain in the butt for me, but still workable if there's no other solution.
    Another possibility would be to just add all the data integers as a single big giant data blob item associated with a single (unique) key. But maybe this is just doing what BDB does... and would probably exchange "duplicate pages" for "overflow pages"
    Or, the slowdown is a BTREE thing and I could use a hash table instead. In fact, what I don't know is how duplicate pages influence insertion speed. But the BDB source code indicates that in contrast to BTREE the duplicate search in a hash table is LINEAR (!!!) which is a no-no (from hash_dup.c):
     while (i < hcp->dup_tlen) {
          memcpy(&len, data, sizeof(db_indx_t));
          data += sizeof(db_indx_t);
          DB_SET_DBT(cur, data, len);
          /*
           * If we find an exact match, we're done. If in a sorted
           * duplicate set and the item is larger than our test item,
           * we're done. In the latter case, if permitting partial
           * matches, it's not a failure.
           */
          *cmpp = func(dbp, dbt, &cur);
          if (*cmpp == 0)
               break;
          if (*cmpp < 0 && dbp->dup_compare != NULL) {
               if (flags == DB_GET_BOTH_RANGE)
                    *cmpp = 0;
               break;
          }
    What's the expert opinion on this subject?
    Vincent

    Hi,
    > The special thing about it is that with a given key, there can be a LOT
    > of associated data, thousands to tens of thousands. To illustrate, a btree
    > with a 8192 byte page size has 3 levels, 0 overflow pages and 35208
    > duplicate pages!
    > In other words, my keys have a large "fan-out". Note that I wrote "can",
    > since some keys only have a few dozen or so associated data items.
    > So I configure the b-tree for DB_DUPSORT. The default lexical ordering
    > with set_dup_compare is OK, so I don't touch that. I'm getting the data
    > items sorted as a bonus, but I don't need that in my application.
    > However, I'm seeing very poor "put (DB_NODUPDATA) performance", due to
    > a lot of disk read operations.
    In general, performance slowly decreases when there are a lot of duplicates associated with a key. For the Btree access method, lookups and inserts have O(log n) complexity (which implies that the search time depends on the number of keys stored in the underlying db tree). When doing puts with DB_NODUPDATA, leaf pages have to be searched in order to determine whether the data is a duplicate. Thus, given that for each key there is (in most cases) a large number of associated data items (up to thousands, or tens of thousands), an impressive number of pages has to be brought into the cache to check against the duplicate criteria.
    Of course, the problem of sizing the cache and the database's pages arises here. Your settings for these should tend toward large values; this way the cache will be able to accommodate large pages (in which hundreds of records can be hosted).
    Setting the cache and the page size to their ideal values is a process of experimenting.
    http://www.oracle.com/technology/documentation/berkeley-db/db/ref/am_conf/pagesize.html
    http://www.oracle.com/technology/documentation/berkeley-db/db/ref/am_conf/cachesize.html
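    To make the sizing and flags concrete, here is a minimal sketch in C of configuring such a database before opening it (the file name and the cache/page values are illustrative assumptions, not recommendations):

         #include <db.h>

         int open_dup_btree(DB **dbpp)
         {
             DB *dbp;
             int ret;

             if ((ret = db_create(&dbp, NULL, 0)) != 0)
                 return ret;

             /* Illustrative values: 64 MB cache in one region, 8 KB pages.
              * Both must be set before the database is opened. */
             dbp->set_cachesize(dbp, 0, 64 * 1024 * 1024, 1);
             dbp->set_pagesize(dbp, 8192);

             /* Sorted duplicates, default lexical duplicate comparison. */
             dbp->set_flags(dbp, DB_DUPSORT);

             if ((ret = dbp->open(dbp, NULL, "dups.db", NULL,
                                  DB_BTREE, DB_CREATE, 0664)) != 0) {
                 dbp->err(dbp, ret, "open");
                 dbp->close(dbp, 0);
                 return ret;
             }
             *dbpp = dbp;
             return 0;
         }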
    > While there may be a lot of reasons for this anomaly, I suspect BDB
    > spends a lot of time tracking down duplicate data items.
    > I wonder if in my case it would be more efficient to have a b-tree
    > with as key the combined (4 byte integer, 8 byte integer) and a
    > zero-length or 1-length dummy data (in case zero-length is not an option).
    Indeed, this should be the best alternative, but testing must be done first. Try this approach and provide us with feedback.
    You can have records with a zero-length data portion.
    Also, you could provide more information on whether or not you're using an environment and, if so, how you configured it. Have you thought of using multiple threads to load the data?
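    As a rough sketch of that composite-key idea (C again; the helper names and the MSB-first packing are assumptions for illustration, untested), the put and the prefix scan could look like this:

         #include <db.h>
         #include <string.h>

         /* Pack the 4-byte key and the 8-byte value into one 12-byte,
          * MSB-first composite key and store it with zero-length data. */
         int put_composite(DB *dbp, const unsigned char key4[4],
                           const unsigned char val8[8])
         {
             DBT key, data;
             unsigned char buf[12];

             memcpy(buf, key4, 4);
             memcpy(buf + 4, val8, 8);

             memset(&key, 0, sizeof(key));
             memset(&data, 0, sizeof(data));
             key.data = buf;
             key.size = sizeof(buf);
             /* data.size stays 0: a zero-length data portion is allowed. */

             /* DB_NOOVERWRITE rejects exact duplicates, playing the role
              * DB_NODUPDATA played with the duplicate-data layout. */
             return dbp->put(dbp, NULL, &key, &data, DB_NOOVERWRITE);
         }

         /* Visit every value stored under one 4-byte prefix by seeding a
          * cursor with DB_SET_RANGE and walking forward with DB_NEXT. */
         int scan_prefix(DB *dbp, const unsigned char key4[4])
         {
             DBC *cur;
             DBT key, data;
             unsigned char seek[12];
             int ret;

             memset(seek, 0, sizeof(seek));
             memcpy(seek, key4, 4);

             memset(&key, 0, sizeof(key));
             memset(&data, 0, sizeof(data));
             key.data = seek;
             key.size = sizeof(seek);

             if ((ret = dbp->cursor(dbp, NULL, &cur, 0)) != 0)
                 return ret;
             for (ret = cur->c_get(cur, &key, &data, DB_SET_RANGE);
                  ret == 0;
                  ret = cur->c_get(cur, &key, &data, DB_NEXT)) {
                 if (key.size < 4 || memcmp(key.data, key4, 4) != 0)
                     break;          /* left the prefix: done */
                 /* bytes 4..11 of key.data hold one associated value */
             }
             cur->c_close(cur);
             return (ret == 0 || ret == DB_NOTFOUND) ? 0 : ret;
         }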
    > Another possibility would be to just add all the data integers as a
    > single big giant data blob item associated with a single (unique) key.
    > But maybe this is just doing what BDB does... and would probably
    > exchange "duplicate pages" for "overflow pages"
    This is a terrible approach, since bringing an overflow page into the cache is more time-consuming than bringing in a regular page, and a performance penalty results. Also, processing the entire collection of keys and data implies more work from a programming point of view.
    > Or, the slowdown is a BTREE thing and I could use a hash table instead.
    > In fact, what I don't know is how duplicate pages influence insertion
    > speed. But the BDB source code indicates that in contrast to BTREE the
    > duplicate search in a hash table is LINEAR (!!!) which is a no-no
    > (from hash_dup.c):
    The Hash access method gives O(1) access to the bucket, but, as you observed, the search within a duplicate set is linear, so the lookup time is proportional to the number of items in that set. Combined with the fact that you don't want duplicate data, using the Hash access method may not improve performance.
    This is a performance/tuning problem and it involves a lot of resources on our part to investigate. If you have a support contract with Oracle, then please don't hesitate to put your issue up on Metalink, or indicate that you want this issue to be taken private, and we will create an SR for you.
    Regards,
    Andrei
