No information retained

Hi- I copied my photos from 'My Pictures' on my old pc onto my external hard drive. Then I imported those photos from the HD into iPhoto on my MacBook. However, the dates on the PC photos say 1/1/2000 for a whole slew of photos. When I look at the Photos on the HD with the PC, I can see the original date of the photo when hovering over the thumbnail. Besides renaming everything, is there a better approach to move my photo library from my PC to my Mac? Thanks

Try opening any photo in Preview on the Mac. (double-clicking might do it, or drag to the application icon in /Applications/)
In Preview, Cmd-I should bring up the info window, which has a tab that will let you see EXIF data. If the date there is correct, iPhoto '08 should be able to identify the photo.
If there is no data in the Info window in Preview.app, the photo's data could have been changed or converted during the transfer.
When you selected the roll to Create New Roll, did you see all the photos in the roll highlighted? Sorry if this is an obvious question but if you're new to iPhoto it might be important. Did you see the roll renumber itself?
The photos you have already imported will also have information in iPhoto: Cmd-I there brings up an info window where you can view EXIF data. Check whether the date stamps are correct - there might be as many as three or four (modified, digitized, imported, original, or something like that).
If you have the date stamps in there then there might be no need to redo everything. I'd suggest letting iPhoto '08 sort it out, it's a lot more capable than iPhoto 6.
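If the EXIF capture dates are intact and only the file dates were reset to 1/1/2000 by the copy, one workaround is to rewrite each file's modification date from its EXIF data before re-importing. This is just a rough sketch, not an iPhoto feature - it assumes the photos are JPEGs with EXIF and that Python with the Pillow library is available, and the external-drive path is a made-up placeholder:

    import os
    import time
    from PIL import Image  # pip install pillow

    EXIF_IFD = 0x8769          # pointer to the Exif sub-IFD
    DATETIME_ORIGINAL = 36867  # capture date tag in the Exif sub-IFD
    DATETIME = 306             # plain DateTime tag in the base IFD

    def fix_file_date(path):
        """Set the file's modification time from its EXIF capture date, if any."""
        with Image.open(path) as im:
            exif = im.getexif()
        stamp = exif.get_ifd(EXIF_IFD).get(DATETIME_ORIGINAL) or exif.get(DATETIME)
        if not stamp:
            return False  # no EXIF date: leave the file alone
        t = time.mktime(time.strptime(stamp, "%Y:%m:%d %H:%M:%S"))
        os.utime(path, (t, t))  # set both access and modification time
        return True

    for root, _dirs, files in os.walk("/Volumes/ExternalHD/My Pictures"):  # hypothetical path
        for name in files:
            if name.lower().endswith((".jpg", ".jpeg")):
                fix_file_date(os.path.join(root, name))

Try it on a copy of one folder first; files without an EXIF date are left untouched.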

Similar Messages

  • PCR, ISR_SPECIAL_DATA_GET -Salary Information

    Hi All,
I need a few pointers for the problem described below.
    We have a custom PCR (ISR) for Transfer Outside Team which should go to the next-level manager if the receiving manager has proposed a salary change; if there is no salary change it should go to the HR Rep.
    I need to evaluate the form fields in real time. I was trying to use ISR_SPECIAL_DATA_GET; it gives you everything else but no salary details. I was hoping to see the salary figures so I could compare them and decide where to route the workflow.
    Is there a simple way of retrieving the salary information retained in PCR forms (which is BUS7051 from a workflow perspective)?
    Regards

>>Can you tell me how data is getting loaded into ISR_SPECIAL_DATA?
    The form data is stored in XML format using the Business Document Service (BC-SRV-BDS), but what you get out of the function call is what is already stored - which in your case is wrong anyway. Debugging this function will not help you; you have to debug the function call that creates the form instead. Did you try running the function module ISR_PROCESS_EVENT in test mode in the backend for this particular PCR scenario?
    ~Suresh

  • Converting from PDF directly to Java Objects/XML (and PDF format questions)

    Hi,
    I posted this originally in the Acrobat Windows forums but was told I might have more luck here, so here goes:
I am desperately trying to find a tool (preferably open source, but commercial is fine also) that will sit on top of a PDF and allow me to query its text for content and formatting (I don't care about images). I have found some tools that get me part of the way there, but nothing that provides an end-to-end solution while staying lightweight. My main question is WHY are there so many tools that go from PDF to RTF, and many tools that go from RTF to XML, but NONE that I can find that go from PDF to XML?
    To clarify, by formatting I simply mean whether a line/block of text is bold/italic, and its font size. I am not concerned with exact position on the page. The background is that I will be searching PDFs and assigning importance to whether text is a heading/bodytext etc. We already have a search tool in place so implementing a pure PDF search engine is not an option. I need a lightweight tool that simply allows me to either make calls directly to the PDF OR converts to XML which I can parse.
    Some tools I have tried:
    1) PDFBox (Java Library) - Allows the extraction of text content easily, but doesn't seem to have good support for formatting.
    2) JPedal (Java Library) - Allows extraction of text content easily, and supports formatting IF XML structured data is in the PDF (not the case for my data).
    3)  Nitro PDF (Tool) + RTF to XML (script) - This works quite nicely and shows that PDF to XML is possible, but why do I have to use 2 tools? Also, these are not libraries I can integrate into my app.
    4) iText (Java Library) - Seems great at creating PDFs but poor at extracting content.
    I don't really expect someone to give me a perfect solution (although that would be nice!).
Instead, what I'd like to know is WHY tools support PDF to RTF/Word/whatever while retaining formatting, and other tools support RTF to XML with the formatting information retained. What is it about PDF and RTF/Word that makes it feasible to convert that way, but not to XML? Also, as I found in 3) above, it is perfectly feasible to end up with XML from a PDF, so why do no tools support this reliably?
    Many thanks for any advice from PDF gurus.

XML doesn't mean anything by itself - it's just a generic concept for structuring information. You need a specific GRAMMAR of XML for it to mean anything. So what grammar would you use? Something standard? Make up your own?
    However, there are a number of commercial and open source products that can convert PDF to various XML grammars - SVG, ABW, and various custom grammars.
    But the other thing you need to understand is that most PDF files do not have any structure associated with them (as you saw when using JPedal). As such, any concepts of paragraphs/sections/tables/etc. are WILD GUESSES by the software in question.
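    For what it's worth, here is a rough sketch of the "make up your own grammar" route. It is only an illustration, not a recommendation of a specific product: it assumes Python with the pdfminer.six library, and the <doc>/<line> element names are invented for the example. Per-character font name and size are usually enough to guess heading vs. body text, with all the caveats about wild guesses noted above.

        from xml.sax.saxutils import escape
        from pdfminer.high_level import extract_pages
        from pdfminer.layout import LTChar, LTTextContainer, LTTextLine

        def pdf_to_simple_xml(path):
            out = ["<doc>"]
            for page in extract_pages(path):
                for element in page:
                    if not isinstance(element, LTTextContainer):
                        continue
                    for line in element:
                        if not isinstance(line, LTTextLine):
                            continue
                        chars = [c for c in line if isinstance(c, LTChar)]
                        if not chars:
                            continue
                        # Average glyph size plus a crude bold test drive the markup.
                        size = round(sum(c.size for c in chars) / len(chars), 1)
                        bold = any("Bold" in c.fontname for c in chars)
                        out.append('<line size="%s" bold="%s">%s</line>'
                                   % (size, str(bold).lower(), escape(line.get_text().strip())))
            out.append("</doc>")
            return "\n".join(out)

        print(pdf_to_simple_xml("sample.pdf"))  # "sample.pdf" is a placeholder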

  • Hard drive is gone after Leopard install

Guess that's what I get for being an early adopter... The install looked like it was going fine. I did the Archive and Install option and preserved user information. When the system finally came up, it had no user information retained and the OS was jerky and hanging up. Finally it hung with a spinning beach ball for 30 minutes, so I hit the power button.
    After that it all went haywire... no boot, no drive, etc. Just got a grey screen with the folder and a question mark. It finally booted off the DVD and I was able to run Disk Utility. Cannot format, mount or verify the hard drive. This is frustrating.

    Matt
    I did read the post.
    It's like this.
When a HD directory is corrupt there is a good chance that it will have to be trashed.
    However, in my experience I have found that sometimes different versions of utilities, and different utilities can see different things.
    There is just a chance that the version of disk utility that created the directory might be able to see the HD. Worth a try I think.
    Cheers

  • Brand new Surface Pro 3 with Office Prof Plus 2013

    Just purchased today and brought home a brand new Surface Pro 3.
    The first thing I needed for work, Office, and so I started to install, via USB DVD drive, the program Office Professional Plus 2013 - the fully loaded flagship version of Office that I purchased in March of 2014.
    I was shocked that I am unable to install it. I first tried installing the 64-Bit Office version, since Windows 8.1 is 64-bit. Next I learned that I must uninstall the 32-bit trial versions of Office that came pre-loaded onto the Surface Pro 3. I even used
the Microsoft Office Fixit utilities and the Support descriptions of manual removal of previous versions of Office. Resetting the Surface Pro 3, uninstalling all Office products from it, and installing as 32-bit did not help either.
    The errors I see are not describable, because nothing happens, the install just hangs with no messages. The machine keeps operating fine, but the Office installs do not work at all.
All of these efforts have only led to frustration. It appears to me that the flagship 2014 Surface Pro 3 and Office Professional Plus 2013 (purchased only this Spring) are incompatible. This came as a great shock to me because in all my history of installing
    current MS software on current systems, I have never had this problem. Maybe back in the days of Windows 3.1 it might have happened. If anyone has any advice, please advise.

    Thanks Melon Chen for your quick and helpful reply. Somehow (I will explain below), I was at last able to install the Office 2013 Pro Plus Suite.
Before I received your reply with detailed instructions I kept trying many other solutions... eventually I got Office 2013 Pro Plus to install, although I'm not exactly sure what ended up fixing my ability to install from DVD.
    Here's what I did:
    1) I attempted to install Office 2013 Pro Plus after running the Fixit utility and again after a clean boot, but those did not help. I received the same error message that I described (i.e. no specific error code, only a message that Office 2013 Pro Plus failed to install). After rebooting with the "clean" reset option, I noticed some of my personal setup information was retained on the system (I can't recall exactly what it was, but it was clear to me that the reset did not return my system to a pristine factory image). I discovered that an even cleaner reset is to restore the Surface Pro 3 'absolutely' to the factory image using a downloaded factory recovery image that is available here:
    -->
    http://www.microsoft.com/surface/en-us/support/warranty-service-and-recovery/downloadablerecoveryimage?lc=1033
    This also required creating a USB drive based install and this reset the Surface Pro 3 to factory condition as described here:
    -->
    https://www.microsoft.com/surface/en-us/support/warranty-service-and-recovery/usbrecovery
Now, after all this, regarding the Office Pro Plus 2013 64-bit install: it failed yet again, but one good thing was that the error message said something different, containing the text
    " Background installation ran into a problem. ... We'll automatically resume...." . With this information, I explored the internet over the weekend and I found the following valuable site on the web:
    http://answers.microsoft.com/en-us/office/forum/office_2013_release-office_install/unable-to-install-office-2013-error-codes-30102-1/fde35d75-bddc-4dd6-bd0a-e01d07420f49
    I did not observe the error codes as depicted (apparently MS has stopped showing many error codes), but the symptoms were identical. Reading the link, there was one user, Rohn007, who compiled a beautiful listing of this problem of installing Office 2013 Pro with suggested solutions; lo and behold, the solution describing multiple wireless devices seemed to solve the problem. In other words, being a heavy MS early adopter, during the Office 2013 Pro Plus install I had a couple of MS devices, like a Lumia phone, a PC, etc., all linked to the same wireless router during my attempts to install to the Surface Pro 3. I have only a single MSN user ID and password for all the devices. I could be wrong, but this situation with shared wireless apparently can mess up Office 2013 Pro Plus installs - and it is often disguised and murky because no error codes are generated. By turning off all the wireless connections except for the one connecting to the Surface Pro 3, I was AT LAST ABLE TO INSTALL OFFICE 2013.
    In the end, I installed Office 2013 32-bit, but I really wanted 64-bit. I will now try to uninstall everything again and then go for the 64-bit version.
    I was very worn out by this experience. A full 2 days trying to install a brand new Office 2013 onto a brand new Surface Pro 3 with Win 8.1 apparently all came down to the fact that maybe I had too many wireless connections going! In the end I am happy that I can finally discover this beautiful piece of hardware, but I am a bit scared since I also would like to install Adobe CS4 from the DVD drive and I am worried that I will again have much trouble. Do you have any suggestions for installing the Adobe CS4 Master Collection; are there suggested steps to take? I like the idea of running Event Viewer - do you think that can be helpful during installs?

  • Mac Mini and Bluetooth Keyboard and Mouse

I have a Mac Mini as a server. I've decided to do a clean install of Mountain Lion and Mountain Lion Server. I don't have any wired keyboard or mouse and am having trouble getting the Mac Mini to see the keyboard and mouse. I can buy cheap wired ones but I'd like to use my bluetooth ones if at all possible. Can anyone offer some tips for doing this successfully? I need a method that assumes I have a Mac Mini with a clean, formatted disk and a bootable flash drive with the latest copy of Mac OS X Mountain Lion.

I did unpair the mouse. The keyboard has only ever been paired with the Mac Mini. Today I was finally able to get both devices talking with the Mac Mini, but I really don't know how I did this. I just played with turning them on and off. If the Mac Mini has been newly formatted (as I finally managed to do), is the pairing information retained?
    I'm thinking there is more to this issue than just bluetooth pairing.
    Thanks for your quick reply. (I'm still redownloading Mountain Lion from the Apple store and making a new bootable flash drive as the one I had won't let me install. Not sure why.)
    (If the issue is just one of pairing it would appear I don't need a wired mouse and keyboard.)

  • New Oracle DBA - Need help with backup & restore procedure via RMAN

    Hello everyone,
    I've been a SQL Server DBA for 12 years now, but new to the Oracle space. My first assignment at work was to refresh our training environment with production. So with that said, I took a full backup of our production database via RMAN and backed up the Control File. I then copied both the Control File and full backup from our production environment to training. I followed the procedures listed in the URL below:
    http://www.dba-oracle.com/t_rman_clone+copy_database.htm
    I then connected to RMAN and executed a 'show all' which is as follows:
    RMAN configuration parameters are:
    CONFIGURE RETENTION POLICY TO REDUNDANCY 1; # default
    CONFIGURE BACKUP OPTIMIZATION OFF; # default
    CONFIGURE DEFAULT DEVICE TYPE TO DISK; # default
    CONFIGURE CONTROLFILE AUTOBACKUP OFF; # default
    CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO '%F'; # default
    CONFIGURE DEVICE TYPE DISK PARALLELISM 1 BACKUP TYPE TO BACKUPSET; # default
    CONFIGURE DATAFILE BACKUP COPIES FOR DEVICE TYPE DISK TO 1; # default
    CONFIGURE ARCHIVELOG BACKUP COPIES FOR DEVICE TYPE DISK TO 1; # default
    CONFIGURE MAXSETSIZE TO UNLIMITED; # default
    CONFIGURE ARCHIVELOG DELETION POLICY TO NONE; # default
I set CONFIGURE CONTROLFILE AUTOBACKUP to ON but received a message that the database needed to be mounted, so I mounted the database and made the change, but when I shut down the database and performed the startup nomount again, the settings were gone. Are these settings valid only when the database is mounted? How can I successfully refresh this training environment with production data? I'm at a standstill here so any help would be very much appreciated.
    Thank you,
    Pete

> I set CONFIGURE CONTROLFILE AUTOBACKUP to ON ... but when I shut down the database and performed the startup nomount again, the settings were gone.
    These are persistent settings and they are stored in the control file. To read them, the database instance must be in MOUNT or OPEN stage, which is why you had to mount the instance before the SHOW ALL command worked in RMAN.
    > Are these settings valid only when the database is mounted?
    Not only MOUNT - they are also available in OPEN stage.
    > How can I successfully refresh this training environment with production data?
There are several ways, such as RMAN duplication. But since you took a full backup of the production database using BACKUP DATABASE through RMAN, you will also get an AUTOBACKUP of the control file/spfile. Copy these backup files and all available archive logs to the training server and perform the steps below.
    1) Make sure the ORACLE_HOME and ORACLE_SID environment variables are set properly. Then first restore the spfile:
       rman target /
       RMAN> startup force nomount;
       RMAN> restore spfile from 'autobackup_location';
       RMAN> startup force nomount;
    2) Now restore the controlfile and mount the database:
       RMAN> restore controlfile from 'autobackup_location';
       RMAN> alter database mount;
    3) Now catalog (i.e. register) all backup files and archive logs in the newly restored controlfile:
       RMAN> catalog start with 'backup_location';
    4) Finally, restore and recover your database:
       RMAN> restore database;
       RMAN> recover database;
       RMAN> alter database open resetlogs;
    If you want to restore the database to a new location, then before executing the RESTORE DATABASE command you can use the SET NEWNAME FOR DATAFILE clause. First refer to the Backup and Recovery guide in the online documentation.
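    If this refresh will be repeated regularly, the same sequence can also be scripted. The following is only a sketch, not part of the steps above: it assumes Python is available, rman is on the PATH, ORACLE_HOME/ORACLE_SID are already set for the training instance, and both paths are hypothetical placeholders for wherever the backup pieces were copied.

        import subprocess

        AUTOBACKUP_PIECE = "/u01/backups/from_prod/ctlfile_autobackup"  # hypothetical
        BACKUP_DIR = "/u01/backups/from_prod"                           # hypothetical

        rman_script = f"""
        startup force nomount;
        restore spfile from '{AUTOBACKUP_PIECE}';
        startup force nomount;
        restore controlfile from '{AUTOBACKUP_PIECE}';
        alter database mount;
        catalog start with '{BACKUP_DIR}' noprompt;
        restore database;
        recover database;
        alter database open resetlogs;
        """

        # Feed the whole command sequence to RMAN on stdin and fail loudly on error.
        subprocess.run(["rman", "target", "/"], input=rman_script, text=True, check=True)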

  • Creating collection vs. materialized view - better performance?

Hi, I am trying to improve the performance of our application and am looking at everything possible. I am wondering if the use of multiple, complex collections is slowing down our application. Would the use of materialized views, as opposed to collections, improve things? Thanks, Karen

To provide more info, here is the process which creates the list of species based on the favorite species identified, followed by the query that selects from this collection:
declare
      yes_are     NUMBER;
      pCount      NUMBER;
      l_seq_id    NUMBER;
      yes_hms     NUMBER;
      found_area  NUMBER;
      found_unit  NUMBER;
      unitmeasure VARCHAR2(2);
      pbCount     NUMBER;
      pbPrice     NUMBER;
    begin
      -- create the license collection so the information is retained if a submit errors out
      if apex_collection.collection_exists('LICENSE_COLLECTION') then
        apex_collection.delete_collection('LICENSE_COLLECTION');
      end if;
      -- create the vessel collection so the information is retained if a submit errors out
      if apex_collection.collection_exists('SUPVES_COLLECTION') then
        apex_collection.delete_collection('SUPVES_COLLECTION');
      end if;
      apex_collection.create_or_truncate_collection('FP_COLLECTION');
      -- create collection to save landings
      apex_collection.create_or_truncate_collection('SPECIES_COLLECTION');
      -- loop through the favorite species and populate with pre-existing data
      for rec in (select *
                    from frequent_species
                   where permit_id = :G_PERMIT_ID
                   order by fav_order)
      loop
        -- check to see if there is a priceboard entry for the favorite species
        select count(*)
          into pbCount
          from price_board
         where permit_id = :G_PERMIT_ID
           and species_itis = rec.species_itis
           and grade_code = rec.grade_code
           and market_code = rec.market_code
           and unit_of_measure = rec.unit_measure
           and price is not null;
        if pbCount = 1 then
          -- there is a price board entry: get the default price for that species combination
          select price
            into pbPrice
            from price_board
           where permit_id = :G_PERMIT_ID
             and species_itis = rec.species_itis
             and grade_code = rec.grade_code
             and market_code = rec.market_code
             and unit_of_measure = rec.unit_measure
             and price is not null;
          -- add landings row with price board data
          l_seq_id := apex_collection.add_member('SPECIES_COLLECTION',
                        null, null, null,
                        rec.species_itis, rec.grade_code, rec.market_code, rec.unit_measure,
                        nvl(rec.disposition_code, :G_FIRST_DISPOSITION),
                        0, null, pbPrice, null);
        else
          -- no price board entry: add landings row without any priceboard data
          l_seq_id := apex_collection.add_member('SPECIES_COLLECTION',
                        null, null, null,
                        rec.species_itis, rec.grade_code, rec.market_code, rec.unit_measure,
                        nvl(rec.disposition_code, :G_FIRST_DISPOSITION),
                        0, null, null, null);
        end if;
        apex_collection.update_member_attribute(p_collection_name => 'SPECIES_COLLECTION',
                                                p_seq             => l_seq_id,
                                                p_attr_number     => 14,
                                                p_attr_value      => 'ERROR');
        -- set first disposition
        :G_FIRST_DISPOSITION := nvl(rec.disposition_code, :G_FIRST_DISPOSITION);
        found_area := 0;
        -- all rows need to be checked to determine if additional info is needed, based on partner_options
        -- check if AREA will be needed
        select count(*)
          into found_area
          from partner_options
         where partner_id = :G_ISSUING_AGENCY
           and substr(species_itis, 1, 6) = rec.species_itis
           and option_type = 'ARE'
           and nvl(inactivate_option_date, sysdate) >= sysdate;
        if found_area > 0 then
          -- landing row requires AREA data
          apex_collection.update_member_attribute(p_collection_name => 'SPECIES_COLLECTION',
                                                  p_seq             => l_seq_id,
                                                  p_attr_number     => 13,
                                                  p_attr_value      => 'Y');
        else
          -- landing row does NOT require AREA data
          apex_collection.update_member_attribute(p_collection_name => 'SPECIES_COLLECTION',
                                                  p_seq             => l_seq_id,
                                                  p_attr_number     => 13,
                                                  p_attr_value      => 'N');
        end if;
        found_unit := 0;
        -- check if COUNT will be needed
        select count(*)
          into found_unit
          from partner_options
         where partner_id = :G_ISSUING_AGENCY
           and substr(species_itis, 1, 6) = rec.species_itis
           and option_type = 'LBC'
           and nvl(inactivate_option_date, sysdate) >= sysdate;
        if found_unit > 0 then
          -- landing row requires UNIT data
          select unit_measure
            into unitmeasure
            from partner_options
           where partner_id = :G_ISSUING_AGENCY
             and substr(species_itis, 1, 6) = rec.species_itis
             and option_type = 'LBC'
             and nvl(inactivate_option_date, sysdate) >= sysdate;
          apex_collection.update_member_attribute(p_collection_name => 'SPECIES_COLLECTION',
                                                  p_seq             => l_seq_id,
                                                  p_attr_number     => 17,
                                                  p_attr_value      => 'Y');
          apex_collection.update_member_attribute(p_collection_name => 'SPECIES_COLLECTION',
                                                  p_seq             => l_seq_id,
                                                  p_attr_number     => 19,
                                                  p_attr_value      => unitmeasure);
        else
          -- landing row does NOT require UNIT data
          apex_collection.update_member_attribute(p_collection_name => 'SPECIES_COLLECTION',
                                                  p_seq             => l_seq_id,
                                                  p_attr_number     => 17,
                                                  p_attr_value      => 'N');
        end if;
        -- check if HMS
        select count(*)
          into yes_hms
          from HMSSpecies a
         where hmsspeciesitis = rec.species_itis;
        if yes_hms > 0 and rec.grade_code = '10' then
          -- landing row requires HMS data
          apex_collection.update_member_attribute(p_collection_name => 'SPECIES_COLLECTION',
                                                  p_seq             => l_seq_id,
                                                  p_attr_number     => 20,
                                                  p_attr_value      => 'Y');
        else
          -- landing row does NOT require HMS data
          apex_collection.update_member_attribute(p_collection_name => 'SPECIES_COLLECTION',
                                                  p_seq             => l_seq_id,
                                                  p_attr_number     => 20,
                                                  p_attr_value      => 'N');
        end if;
      end loop;
    end;
    and the query for the region:
    SELECT
    apex_item.text(1,seq_id,'','','id="f01_#ROWNUM#"','','') "DeleteRow",
    apex_item.text_from_LOV(c004,'SPECIES')||'-'||apex_item.text_from_LOV(c005,'GRADE')||'-'||apex_item.text_from_LOV(c006,'MARKETCODE')||'-'||apex_item.text_from_LOV_query(c007,'select unit_of_measure d, unit_of_measure r from species_qc') unit,
    apex_item.select_list_from_LOV(6,c008,'DISPOSITIONS','onchange="getAllDisposition(#ROWNUM#)"','YES','0',' -- Select Favorite -- ','f06_#ROWNUM#','') Disposition,
    apex_item.select_list_from_LOV(7,c009,'GEARS','style="background-color:#FBEC5D; "onFocus="checkGearPreviousFocus(#ROWNUM#);"onchange="getAllGears(#ROWNUM#)"','YES','0','-- Select Favorite --','f07_#ROWNUM#','') Gear,
    apex_item.text(8,TO_NUMBER(c010),5,null,'onchange="setTotal(#ROWNUM#)"','f08_#ROWNUM#','') Quantity,
    apex_item.text(9,TO_NUMBER(c011),5,null,'onchange="getPriceBoundaries(#ROWNUM#)"','f09_#ROWNUM#','') Price,
    apex_item.text(10, TO_NUMBER(c012),5,null, 'onchange="changePrice(#ROWNUM#)" onKeyPress="selectDollarsFocus(#ROWNUM#);"','f10_#ROWNUM#','') Dollars,
    apex_item.select_list_from_LOV_XL(11, c014,'AREAFISHED','style="background-color:#FBEC5D; "onchange="getAllAreaFished(#ROWNUM#)"','YES','ERROR','-- Select Area Fished --','f11_#ROWNUM#','') Area_Fished,
    apex_item.text(12, c018,4,null,'style="background-color:#FBEC5D; "onBlur="setUnitQuantity(#ROWNUM#)"','f12_#ROWNUM#','') UNIT_QUANTITY,
    apex_item.text(13, 'CN',3,null,'readOnly=readOnly','f13_#ROWNUM#','') UNIT_COUNT,
    apex_item.checkbox(14,'Y','id="f14_#ROWNUM#" style="background-color:#FBEC5D; " onClick="alterYes(#ROWNUM#);" onKeyPress="alterYes(#ROWNUM#);"',c021) FinsAttached,
    apex_item.checkbox(15,'N','id="f15_#ROWNUM#" style="background-color:#FBEC5D; " onClick="alterNo(#ROWNUM#);" onKeyPress="alterNo(#ROWNUM#);"',c022) FinsNotAttached,
    apex_item.checkbox(16,'U','id="f16_#ROWNUM#" style="background-color:#FBEC5D; " onClick="alterUnk(#ROWNUM#);" onKeyPress="alterUnk(#ROWNUM#);"',c023) FinsUnknown
    from apex_collections
    where collection_name = 'SPECIES_COLLECTION' order by seq_id
    /

  • Clear Settings does not delete .xmp sidecars

    Using latest CR 6.4, Bridge 4.0.4.2 & Ps 12.0.4
    Posting here because most knowledgeable folks here rarely if ever seem to visit the Bridge forum anymore. And it does relate to CR settings.
    In Bridge, if you choose Develop Settings/Clear settings, the settings are indeed cleared but the sidecar files remain, even if you refresh the Windows folder and/or close and reopen both Bridge and Ps. Additionally, even if you manually delete the sidecars, the size of the exported cache file Bridge.T is not reduced but remains the same.
    Anyone else noticed this?

    Andrew_Hart wrote:
    …In Bridge, if you choose Develop Settings/Clear settings, the settings are indeed cleared but the sidecar files remain…
    That has always been the case, and it's perfectly normal, expected behavior.
    Your mistake is assuming that the xmp file would contain only ACR edit information.  That is just not the case.
    Here's an example of the kind of information retained in an xmp file in CS4 after clearing the develop settings in Bridge.  This should make it clear to you why the xmp file must remain after clearing the ACR development settings rather than having to be built from scratch all over again next time you open the file:
    <x:xmpmeta xmlns:x="adobe:ns:meta/" x:xmptk="Adobe XMP Core 4.2-c020 1.124078, Tue Sep 11 2007 23:21:40        ">
    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
      <rdf:Description rdf:about=""
        xmlns:tiff="http://ns.adobe.com/tiff/1.0/">
       <tiff:Make>PENTAX</tiff:Make>
       <tiff:Model>PENTAX K20D</tiff:Model>
       <tiff:Orientation>1</tiff:Orientation>
      </rdf:Description>
      <rdf:Description rdf:about=""
        xmlns:exif="http://ns.adobe.com/exif/1.0/">
       <exif:ExifVersion>0221</exif:ExifVersion>
       <exif:ExposureTime>1/160</exif:ExposureTime>
       <exif:ShutterSpeedValue>7321928/1000000</exif:ShutterSpeedValue>
       <exif:FNumber>8/1</exif:FNumber>
       <exif:ApertureValue>6/1</exif:ApertureValue>
       <exif:ExposureProgram>0</exif:ExposureProgram>
       <exif:DateTimeOriginal>2010-02-10T18:29:45-08:00</exif:DateTimeOriginal>
       <exif:DateTimeDigitized>2010-02-10T18:29:45-08:00</exif:DateTimeDigitized>
       <exif:ExposureBiasValue>20/10</exif:ExposureBiasValue>
       <exif:MeteringMode>5</exif:MeteringMode>
       <exif:FocalLength>1900/100</exif:FocalLength>
       <exif:SensingMethod>2</exif:SensingMethod>
       <exif:FocalLengthIn35mmFilm>28</exif:FocalLengthIn35mmFilm>
       <exif:CustomRendered>0</exif:CustomRendered>
       <exif:ExposureMode>1</exif:ExposureMode>
       <exif:WhiteBalance>0</exif:WhiteBalance>
       <exif:SceneCaptureType>0</exif:SceneCaptureType>
       <exif:Contrast>2</exif:Contrast>
       <exif:Saturation>0</exif:Saturation>
       <exif:Sharpness>2</exif:Sharpness>
       <exif:SubjectDistanceRange>3</exif:SubjectDistanceRange>
       <exif:ISOSpeedRatings>
        <rdf:Seq>
         <rdf:li>800</rdf:li>
        </rdf:Seq>
       </exif:ISOSpeedRatings>
       <exif:Flash rdf:parseType="Resource">
        <exif:Fired>True</exif:Fired>
        <exif:Return>0</exif:Return>
        <exif:Mode>1</exif:Mode>
        <exif:Function>False</exif:Function>
        <exif:RedEyeMode>False</exif:RedEyeMode>
       </exif:Flash>
      </rdf:Description>
      <rdf:Description rdf:about=""
        xmlns:xap="http://ns.adobe.com/xap/1.0/">
       <xap:ModifyDate>2010-02-10T18:29:45-08:00</xap:ModifyDate>
       <xap:CreateDate>2010-02-10T18:29:45-08:00</xap:CreateDate>
       <xap:CreatorTool>K20D Ver 1.03</xap:CreatorTool>
       <xap:Rating>0</xap:Rating>
      </rdf:Description>
      <rdf:Description rdf:about=""
        xmlns:aux="http://ns.adobe.com/exif/1.0/aux/">
       <aux:LensInfo>160/10 450/10 40/10 40/10</aux:LensInfo>
       <aux:Lens>smc PENTAX-DA 16-45mm F4 ED AL</aux:Lens>
       <aux:LensID>4 254</aux:LensID>
      </rdf:Description>
      <rdf:Description rdf:about=""
        xmlns:crs="http://ns.adobe.com/camera-raw-settings/1.0/">
       <crs:RawFileName>_IMG0187.PEF</crs:RawFileName>
       <crs:Version>5.7</crs:Version>
       <crs:ProcessVersion>5.0</crs:ProcessVersion>
       <crs:HasSettings>False</crs:HasSettings>
       <crs:HasCrop>False</crs:HasCrop>
       <crs:AlreadyApplied>False</crs:AlreadyApplied>
      </rdf:Description>
      <rdf:Description rdf:about=""
        xmlns:photoshop="http://ns.adobe.com/photoshop/1.0/">
       <photoshop:SidecarForExtension>PEF</photoshop:SidecarForExtension>
      </rdf:Description>
    </rdf:RDF>
    </x:xmpmeta>
Note the creation date: <exif:DateTimeOriginal>2010-02-10T18:29:45-08:00</exif:DateTimeOriginal>. Yet, I just cleared the Develop Settings in the last few minutes.
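    If you want to check for yourself what a sidecar still holds after Clear Settings, it is plain XML and can be inspected with a few lines of script. A minimal sketch, assuming Python and a sidecar named _IMG0187.xmp in the current folder (the name is just taken from the example above):

        import xml.etree.ElementTree as ET

        # Print every element that still carries a value in the sidecar, e.g.
        # '{http://ns.adobe.com/exif/1.0/}DateTimeOriginal = 2010-02-10T18:29:45-08:00'
        for elem in ET.parse("_IMG0187.xmp").iter():
            text = (elem.text or "").strip()
            if text:
                print(elem.tag, "=", text)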
    Wo Tai Lao Le
    我太老了

  • Preventing importing of low resolution images?

I'm in the process of moving a bucketload of images into a Lightroom library from various sources on the drives of the Macs at home, and am having a problem with Lightroom importing a bunch of low resolution images into the library. These files include various web exports, iPhoto library thumbnails, etc. Now, this wouldn't be a problem, except:
    1) I have no control over the "Don't import suspected duplicates" feature, so I don't know if Lightroom is importing a thumbnail version of a shot (with EXIF) or the original. I suspect it is importing both, but I can't be sure, and the import report is just too tedious to go through, as the import includes over 100k exposures.
    2) Lightroom doesn't have a feature that'd let me search for scaled-down images. I can't search by resolution, and the scaling flag search doesn't pick up images scaled outside Lightroom. I could throw away some shots with no EXIF data, but exports with the EXIF information retained cannot be weeded out of the library that way.
    Anyone here have ideas on how to solve this? A solution that causes Lightroom to ignore files that aren't of the size that's the same as a the camera original, or one that lets me search for resized images from a library would both suffice.

    dorin_nicolaescu wrote:
    If you are going to judge whether to import or not by file size alone, an easy solution would be to:
Use Windows Explorer (or Mac search) to search your Pictures folder (and subfolders) for files smaller than, say, 300 Kb.
    Delete them or move to another folder.
    Import the clean folder into Lightroom.
Heh, I guess you missed a detail in my post. I'm importing more than 100,000 exposures contained in a few thousand folders. The total size of the library being imported right now is about 650 GB. If I can't automate this, I can't do it.
    I shuffled through the Lightroom plugin API doc and it seems I can't create a plugin that'd automate this either, as there's no call to initialize the value from the plugin itself - I had an idea to provide a searchable width/height metadata plugin myself, but apparently that's a no-go.
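    Since the cull has to be automated, one pre-import option is to sweep the source folders with a small script and move anything below a pixel threshold aside before Lightroom ever sees it. This is just a sketch, not a Lightroom feature - it assumes Python with Pillow, and the paths and the 2-megapixel threshold are made-up placeholders:

        import os
        import shutil
        from PIL import Image  # pip install pillow

        SOURCE = "/Volumes/Photos/ToImport"   # hypothetical source tree
        REJECTS = "/Volumes/Photos/LowRes"    # small files get moved here
        MIN_PIXELS = 2_000_000                # assumed threshold: roughly 2 MP
        EXTS = {".jpg", ".jpeg", ".png", ".tif", ".tiff"}

        for root, _dirs, files in os.walk(SOURCE):
            for name in files:
                if os.path.splitext(name)[1].lower() not in EXTS:
                    continue
                path = os.path.join(root, name)
                try:
                    with Image.open(path) as im:
                        w, h = im.size  # only the header is read, so this stays fast
                except OSError:
                    continue  # unreadable/corrupt file: leave it for manual review
                if w * h < MIN_PIXELS:
                    dest = os.path.join(REJECTS, os.path.relpath(path, SOURCE))
                    os.makedirs(os.path.dirname(dest), exist_ok=True)
                    shutil.move(path, dest)

    Because only the image header is read for its dimensions, this stays reasonably quick even over 100k files.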

  • Hi. Are there any overviews of my purchases from the Apple Store available on your website?

Unfortunately I couldn't find a community for anything other than the App Store, but my question is aimed at purchases from the Apple Store here on your site. I would like to print out a receipt for my Mac Mini, which I bought in 2009.
    Tural

    If you made your purchase from the online Apple Store, did you create an account? If so you can sign into your account and view the purchases that you have made in the recent past. Unfortunately, the information retained by Apple and available for your viewing only covers the last 18 months.
If you made the purchase with a bank card, perhaps you can locate that information in your bank card records, and with that information Apple may be able to locate the purchase for you. Call 1-800-MY-APPLE in the US when you have located the bank information for the purchase.

  • PHP and frames

My site contains a scrolling marquee below my header, which stays the same on every page. Each time someone clicks a link anywhere on the site the marquee restarts; I would like it to continue without restarting until all the info has been shown, and then restart as normal.
    My index.php file looks like this:
    <?php
    include("header.php");
    include("marquee.php");
    include("mainpage.php");
    include("footer.php");
    ?>
    I've been told to use frames, but now I've been told not to use them as they are going out of fashion!
    How do I get my marquee to keep running without restarting when a link is clicked on my site?
    I have tried frames and been told to use iframes, but I have no idea how to get the content to resize the frames to fit.
    Can someone please help me out here? I've been looking for too long now.
    Thanks

> been told to use frames but now been told not to use them as they are going out of fashion!
    Frames are bad for all sorts of reasons.
    BUT, they are useful when you need a 'chunk' of information to retain a separate state from the rest of the content you want to display... which is exactly what you want to do in your case. You want your marquee to retain its uninterrupted state while the content on the rest of your site changes as one goes from page to page.
    So, the answer is to have a frameset with two frames. One frame is your scrolling marquee and the other holds all your content pages on the site. This will accomplish what you want.
    Does being able to do that outweigh the disadvantages of using frames? I don't know. That's up to you, I suppose. Personally, I'd say 'no', but it's your call.
    > i have tried to use frames and been told to use iframes, but have no idea how to get the content to resize the frames to fit
    An iframe won't work in this case, as an iframe is always a child of a parent page. You actually want two separate pages.
    -Darrel

  • Double items CCM

    Hi guys,
    We're currently facing an issue with double items in CCM and I ran out of ideas.
    The contentmanager has updated a catalog and published the Master. At first he didn't see the changes. He then removed the catalog, published the Master, uploaded the catalog again, performed the mapping, and published the Master again.
In TREXADMIN I can see that there are double items. But when I go to the positions of the catalog (via maintain catalog) I can only see the recent items, as it should be.
    So somewhere between maintaining and publishing the catalog something is not updated.
    There are no errors in /CCM/CHECK_TREX.
    In table /CCM/D_CTLG_REQ there's only 1 active entry for process SE, Master catalog. This should be OK. There's also only 1 active entry for the catalog concerned, with process AI.
    Checked table /CCM/D_PUB_ST, but there are no errors here. All publications are successful.
    Table /CCM/D_UPD_STATE gives me a result for this particular catalog with status 05 (completed), which also looks OK to me.
    Any ideas what to do in order to remove the double items in the catalog?
    Thanks,
    Timo

    Hi Timo,
    You should apply the following notes for correct operation of the CSE:
    AP-CAT-AUT     1232769     Deleted model information retained in CSE after full pubn.
    AP-CAT-AUT     1231427     Deleted view content retained in CSE after full publication
    AP-CAT-AUT     1154194     Deleted items persist in CSE after full publication
    Regards,
    Jason

  • Pinterest and the like

I understand that the possibility of using Pinterest technology within DPS apps is being looked at for future releases... BUT in the meantime, does anyone have any alternative solutions for having clickable content within an app that passes a URL to an external website? Of course I can make anything a hyperlink... but ideally I want someone to have logged into the app so the information passed in the URL is specific to them.
    This brings up the question too of whether I can have someone log in to my app and have that information retained for passing into a variable on all links?
    Also... if I have a link within my DPS app, can I force the external site to open up within my DPS framework so that it looks exactly like part of the app?
    Thanks for any help

I had the same problem as you. What I did was go to Settings > General > Reset, then tapped Reset Location & Privacy; when I then went to the app it asked again for access to my photos and the problem was fixed.

  • Hyper-V Replica Networking question

    Hi
Just looking for some best practice guidance here. I have two standalone Hyper-V hosts in my HQ (192.168.50.x), one replicating to the other, and in my DR site I have extended replica to there (172.16.10.x). If and when I fail over a VM from HQ with an IP of (192.168.50.x), it is not contactable in the DR subnet. What is best practice here, as I don't want to have to manually change the IP of the app server to suit the new DR site? Just to add, I don't run VLANs as I don't have any switches capable of that. What are my options for seamless failover and failback?
    Thanks in advance
    Spud
    Spudney

    Hi,
    You can also set the Failover IP Settings for Replica Virtual Machine (on Hyper-v console).
    More information:
    Retaining IP Address after failover using Hyper-V Recovery Manager
    http://blogs.technet.com/b/scvmm/archive/2014/04/04/retaining-ip-address-after-failover-using-hyper-v-recovery-manager.aspx
    Hope this helps.
