Ex post data writing in XMP files

Dear all
Usually I do not use the option to include Lightroom modifications in XMP files; I rely on the catalogue as the storage for all of that information.
Nevertheless, I have read a few articles recommending the XMP capability of LR as a way to keep a record of how a raw file has been processed into the final picture, independent of LR itself.
Is there a way to run an ex post process that checks all the information LR holds on my files and automatically creates the corresponding XMP files for all photos?
Any help is highly appreciated.
Best regards
dornkaat

Those articles aren't of much value! The XMP files are only useful for that purpose if you open the files in Bridge/ACR, or if you open them in a text editor and can figure out what the code means. Their value is only for opening the files directly in Bridge/ACR or for exchanging keywords and IPTC metadata with another program. (If you do want to create them retrospectively, select all photos and choose Metadata > Save Metadata to Files, or Ctrl+S / Cmd+S.) Note as well that the XMP doesn't contain all your work - e.g. history steps and virtual copies. So use the catalogue, and focus on backing it up and testing that you could restore it if disaster strikes.
John

Similar Messages

  • Writing to XMP file or one file per folder?

    I am about to start using LR2, having been used to PS and Bridge. I use DNGs, TIFFs and JPEGs. I am used to having one file per folder for Bridge's info, and I do NOT want an XMP file per picture file. How does LR2 handle this? I have read a few posts about XMP files. Is there a setting to make one file per folder, or does LR2 make XMP files for each picture?

    >XMP sidecars are only created for proprietary raw files (eg CR2, NEF, etc) and only when you save metadata out to file.
    Good, so since I do not use proprietary raws, I am safe. Thanks. I do plan to save metadata to the files so that other programs can use it in the future.

  • Jdeveloper 11.1.1.6 fail to post data with uploaded orddoc file

    I migrated our application from 11.1.1.4 to 11.1.1.6.
    Everything works fine except for posting data for an entity with an OrdDocDomain attribute.
    Does anyone have the same issue?
    The log is:
    JBO-26041: Failed to post data to database during "Update": SQL Statement "UPDATE DOCUMENT Document SET FILENAME=:1,LOBCONTENT=:2,UPDATED_BY=:3,UPDATED_ON=:4,OPTLOCK_VERSION=:5 WHERE ID=:6".
         at com.sun.el.parser.AstValue.invoke(Unknown Source)
         at com.sun.el.MethodExpressionImpl.invoke(Unknown Source)
    Caused by: oracle.jbo.DMLException: JBO-26041: Failed to post data to database during "Update": SQL Statement "UPDATE DOCUMENT Document SET FILENAME=:1,LOBCONTENT=:2,UPDATED_BY=:3,UPDATED_ON=:4,OPTLOCK_VERSION=:5 WHERE ID=:6".
         at oracle.jbo.server.OracleSQLBuilderImpl.doEntityDML(OracleSQLBuilderImpl.java:583)
         at oracle.jbo.server.EntityImpl.doDMLWithLOBs(EntityImpl.java:8647)
         at oracle.jbo.server.EntityImpl.doDML(EntityImpl.java:8579)
         at com.intralot.igp.model.base.EntityImpl.doDML(EntityImpl.java:96)
         at oracle.jbo.server.EntityImpl.postChanges(EntityImpl.java:6816)
         at oracle.jbo.server.DBTransactionImpl.doPostTransactionListeners(DBTransactionImpl.java:3290)
         at oracle.jbo.server.DBTransactionImpl.postChanges(DBTransactionImpl.java:3093)
         at oracle.jbo.server.DBTransactionImpl.commitInternal(DBTransactionImpl.java:2097)
         at oracle.jbo.server.DBTransactionImpl.commit(DBTransactionImpl.java:2378)
         at oracle.adf.model.bc4j.DCJboDataControl.commitTransaction(DCJboDataControl.java:1615)
         at oracle.adf.model.dcframe.LocalTransactionHandler.commit(LocalTransactionHandler.java:139)
         at oracle.adf.model.dcframe.DataControlFrameImpl.commit(DataControlFrameImpl.java:1226)
         at com.intralot.igp.view.utils.ADFUtils.dcCommit(ADFUtils.java:466)
         at com.intralot.igp.taskflows.view.backing.documents.Documents.editDocumentListener(Documents.java:67)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         ... 72 more
    Caused by: java.sql.SQLException: Undefined type
         at oracle.jpub.runtime.Util._convertToOracle(Util.java:277)
         at oracle.jpub.runtime.Util.convertToOracle(Util.java:167)
         at oracle.jpub.runtime.MutableStruct.getDatumAttribute(MutableStruct.java:323)
         at oracle.jpub.runtime.MutableStruct.getDatumAttributes(MutableStruct.java:347)
         at oracle.jpub.runtime.MutableStruct.toDatum(MutableStruct.java:118)
         at oracle.ord.im.OrdSource.toDatum(OrdSource.java:93)
         at oracle.jpub.runtime.Util._convertToOracle(Util.java:183)
         at oracle.jpub.runtime.Util.convertToOracle(Util.java:167)
         at oracle.jpub.runtime.MutableStruct.getDatumAttribute(MutableStruct.java:323)
         at oracle.jpub.runtime.MutableStruct.getDatumAttributes(MutableStruct.java:347)
         at oracle.jpub.runtime.MutableStruct.toDatum(MutableStruct.java:118)
         at oracle.ord.im.OrdDocBase.toDatum(OrdDocBase.java:96)
         at oracle.jdbc.driver.OraclePreparedStatement.setORADataInternal(OraclePreparedStatement.java:10450)
    Anyone have the same issue?
    Thanks

    We are getting the same issue after migrating from WLS 10.3.3.0 to the latest WLS version 10.3.6.0 (ADF runtime 11.1.1.6).
    The exception we get while trying to post an ORD type object is:
    java.sql.SQLException: Undefined type
    at oracle.jpub.runtime.Util._convertToOracle(Util.java:277)
    at oracle.jpub.runtime.Util.convertToOracle(Util.java:167)
    at oracle.jpub.runtime.MutableStruct.getDatumAttribute(MutableStruct.java:323)
    at oracle.jpub.runtime.MutableStruct.getDatumAttributes(MutableStruct.java:347)
    at oracle.jpub.runtime.MutableStruct.toDatum(MutableStruct.java:118)
    at oracle.ord.im.OrdSource.toDatum(OrdSource.java:93)
    at oracle.jpub.runtime.Util._convertToOracle(Util.java:183)
    at oracle.jpub.runtime.Util.convertToOracle(Util.java:167)
    at oracle.jpub.runtime.MutableStruct.getDatumAttribute(MutableStruct.java:323)
    at oracle.jpub.runtime.MutableStruct.getDatumAttributes(MutableStruct.java:347)
    at oracle.jpub.runtime.MutableStruct.toDatum(MutableStruct.java:118)
    at oracle.ord.im.OrdImageBase.toDatum(OrdImageBase.java:99)
    at oracle.jdbc.driver.OraclePreparedStatement.setORADataInternal(OraclePreparedStatement.java:10450)
    at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:11651)
    at oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:11631)
    at oracle.jdbc.driver.OraclePreparedStatementWrapper.setObject(OraclePreparedStatementWrapper.java:253)
    at weblogic.jdbc.wrapper.PreparedStatement.setObject(PreparedStatement.java:357)
    at oracle.jbo.server.OracleSQLBuilderImpl.bindInsertStatement(OracleSQLBuilderImpl.java:2035)
    at oracle.jbo.server.EntityImpl.bindDMLStatement(EntityImpl.java:10516)
    at oracle.jbo.server.OracleSQLBuilderImpl.doEntityDML(OracleSQLBuilderImpl.java:412)
    at oracle.jbo.server.EntityImpl.doDMLWithLOBs(EntityImpl.java:8647)
    at oracle.jbo.server.EntityImpl.doDML(EntityImpl.java:8579)
    at oracle.apps.grc.cms.model.eo.MediarepositoryimagesImpl.doDML(MediarepositoryimagesImpl.java:52)
    Can anyone please help with this? Is there any workaround available?

  • Will somebody please post an Aperture-written xmp file *with* custom metadata in it - thanks.

    It is my understanding that Adobe's migration tool will NOT support transferal of custom metadata from Aperture to Lightroom.
    I have recently enhanced my Custom Metadata plugin to read custom metadata from Expression Media xmp, and it seems like a no-brainer to have a "preset" which can read/import/transfer custom metadata from Aperture xmp too - I just need a sample to work with - thanks.
    If you don't want to post here, please send directly to me.
    robcole.com - Contact Me
    Rob

    To attempt a wrap-up for this thread:
    Aperture does NOT provide a means to save custom metadata in XMP - the initial premise of this thread was incorrect; thank you to John Beardsworth for setting me straight. The specific question asked has been answered.
    The purpose of the thread, however, was to learn how to get information out of Aperture and into Lightroom (final destination: custom metadata fields).
    Here is one way to get custom metadata in Aperture to custom metadata in Lightroom, recommended if you have enough free IPTC fields to house the custom data temporarily:
    * Create an AppleScript file with the following contents, modified to support your custom metadata fields; note that it must also be modified to specify the IPTC fields to be used:
    tell application "Aperture"
        set imageSel to the selection
        repeat with i from 1 to count of imageSel
            tell library 1
                tell item i of imageSel
                    set customFieldValue to value of custom tag "myCustomFieldName"
                    set value of IPTC tag "myFreeIptcFieldName" to customFieldValue
                end tell
            end tell
        end repeat
    end tell
    Note: you must adapt script depending on your custom fields, and which IPTC fields you have free.
    * Select photos in Aperture.
    * Execute the script.
    * Once the photos are in Lightroom with the IPTC fields populated, go to Lr's plugin manager, select the 'Custom Metadata' plugin, click the 'Transfer Metadata' button, then "follow the yellow-brick road" (you will be prompted with instructions). Metadata will be transferred from the IPTC fields to custom metadata in Lightroom.
    If you do NOT have sufficient IPTC fields to use, or would rather not use IPTC fields for this, then you'd have to modify the script to populate a row in a CSV file for each photo, then use the 'Import from CSV' function in the Custom Metadata plugin. I can help if it comes to this.
    PS - There may be other ways as well, but maybe this is enough.
    Rob

  • How do I modify invoice request xml file by adding posting date?

    Hi,
    We import customer invoice requests via xml files from an external data source. Currently the standard SAP xml file does not include the posting date, and invoices enter SAP with a blank posting date. When the invoice is released, the posting date is taken from the invoice date.
    Due to our month end processes, we have hundreds of invoice requests every month where we do not want posting date to equal the invoice date, and for each of those invoices the posting date is manually entered one invoice at a time during the release process. This is very time consuming.
    We would like to build functionality in our external system to create the posting date at the xml file generation stage. Could anyone let us know the following:
    - what is the name of the posting date field on invoice requests (invoice documents)?
    - where would we place the additional script in the xml file?
    I'm attaching one of our current xml files (which already contains one section that has been customized)
    Your suggestions would be appreciated.
    Thanks,
    Kerstin

    Kerstin,
    The closest I could find in the WSDL of the Manage Invoice Request web service that handles this integration is "<ProposedInvoiceDate>", or possibly "<ProposedDeviatingPostingdate>"; both sit directly under the <CustomerInvoiceRequest> element.
    For more information, go to the Service Explorer, find the ManageInvoiceRequestsIn Web Service, download the WSDL, and open it in SOAP-UI or something similar. This way you can see all the fields that you can write to, which is where I found these two elements.
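    If the external system builds the XML with a script, the element can be injected at file-generation time. Below is only a rough Python sketch: the element names are the two found in the WSDL above, but the exact namespace and the position required by the schema must be verified against the ManageInvoiceRequestsIn WSDL, and the file names are placeholders.
    # Sketch only - verify element name, namespace and required position against the WSDL.
    import xml.etree.ElementTree as ET

    def add_posting_date(in_path, posting_date, out_path):
        tree = ET.parse(in_path)
        for elem in tree.getroot().iter():
            # match the element regardless of namespace prefix
            if elem.tag.endswith("CustomerInvoiceRequest"):
                date_elem = ET.SubElement(elem, "ProposedDeviatingPostingdate")
                date_elem.text = posting_date          # e.g. "2014-01-31"
        tree.write(out_path, xml_declaration=True, encoding="utf-8")

    # add_posting_date("invoice_request.xml", "2014-01-31", "invoice_request_posted.xml")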

  • Editing xmp files in Windows

    Windows XP SP2 Pro
    Just read on another forum that it is possible to edit xmp data in a text editor in Windows. If possible, could someone kindly steer me in the right direction or post a link?
    I took a few dozen shots the other day and, for some reason, when imported into LR 99% of them showed the correct lens in the Metadata browser whilst the remaining 1% were indicated as 'Unknown Lens'. I would like to edit this data if possible, so if the above can be done as suggested I would appreciate some direction on how to proceed. Thanks.

    Bogdan
    The situation is this - last weekend I was asked to do a shoot of a friend's daughter at her Church Communion. The shoot was a good result and I couldn't be more pleased with what I captured - the family are too :-)
    For the shoot I used a Canon 30D with an EF 17-40 f/4L USM and shot off countless images. There was no camera or lens change throughout the shoot and all images were shot in RAW.
    Upon import to LR (only import, no editing done up to this point), the Metadata browser in the left-hand panel of the Library indicated that 99% of these images were shot with the camera/lens combination above, whilst the remaining 1%, while showing the correct camera, were indicated as 'Unknown Lens'. This may or may not be a corruption of the EXIF data embedded within the RAW file itself, but that is not the important point here.
    My goal was to change the 'Unknown Lens' entry in the Metadata panel to reflect the lens actually used. To this end I first exported an xmp file, opened the file, found where the unknown lens was indicated and replaced it with the lens actually used. I first checked another xmp with the correct info shown, to ensure that I typed the data in precisely the format shown in the reference file. I then saved the modified xmp file, went back to LR and 'Removed' the RAW image from the Library. Following this I re-imported the image and imported the xmp file, just in case it hadn't been brought in at the same time as the RAW file.
    Unfortunately, when highlighting this RAW image in the Library, the Metadata browser panel still indicated the lens used as 'Unknown'. I checked back in the xmp file and it still reflected my changes, i.e. no mention of an unknown lens and the correct lens shown. I can only assume from this that upon import of the .CR2 file into LR, LR gives priority to the EXIF data buried deep within the .CR2 file itself and only takes data from the xmp file that is not contained within the EXIF data. I am no expert in these matters and therefore cannot come up with any other reason why, when the xmp file has been corrected, this is not reflected in the Metadata browser window in LR.
    This is certainly not the end of the world, but it is something that would obviously be desirable to fix if the problem keeps repeating itself. If LR is to show 'statistics' of cameras and lenses used, then for them to have worth they must reflect accurate data. Not that I am suggesting for one moment that this is a failure within LR itself, you understand.
    Edit: Ignore the references to camera model (e.g. 30D/20D), which may be confusing the issue. It was just that when I failed in my attempts to change the 'Unknown Lens' entry to the correct data, I then tried to see whether it was possible to change any other data, e.g. camera model. This too was unsuccessful, but as stated it can be dismissed; it is not the important issue here (the lens is) and was only done for experimentation purposes.
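    For anyone hitting the same thing, one way to see where the conflicting lens value actually lives is to dump every lens-related tag together with its group name from both the raw file and the sidecar. A rough sketch driving ExifTool from Python (ExifTool must be installed and on the PATH; the file names are placeholders):
    # Diagnostic sketch only: lists lens tags with their source group (EXIF, MakerNotes, XMP...).
    import subprocess

    for path in ["IMG_0001.CR2", "IMG_0001.xmp"]:
        print("---", path)
        subprocess.run(
            ["exiftool", "-a", "-G1", "-Lens", "-LensType", "-LensModel", path],
            check=False,   # a missing tag or file should not abort the loop
        )
    Comparing the two outputs shows which copy of the lens information Lightroom is actually displaying.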

  • How open xmp file in window 7

    When I post-process a raw file, an XMP file is automatically created.
    1. How do I open this file in Windows 7?
    2. Do I need to keep this file, given that I have the same file in Adobe 12?
    3. What happens if I delete it?
    4. If it is not needed, how do I prevent it from being created?

    Also, before going into any adjustment we can (I always do) make a copy or duplicate of the original and work with the copy.
    This is completely unnecessary.
    Adobe software never overwrites or modifies your RAW files. (Not true for original JPG files, in which case Adobe software could overwrite the original JPG file)
    It has nothing to do with saving as DNG or saving as JPG. Furthermore, if you are editing a RAW, I generally see no point in saving it as a DNG when saving it as a RAW by clicking "Done" accomplishes the same thing.

  • Batch extract file resolution from image xmp file

    Hi all,
    I am looking for a batch process that extracts the X & Y pixel dimensions for a bunch of TIFF files.
    I find the data in the XMP file when I save out the file:
    <rdf:Description rdf:about=''
      xmlns:tiff='http://ns.adobe.com/tiff/1.0/'>
      <tiff:XResolution>3000000/10000</tiff:XResolution>
      <tiff:YResolution>3000000/10000</tiff:YResolution>
      <tiff:ResolutionUnit>2</tiff:ResolutionUnit>
    </rdf:Description>
    I do not know if I need to start with the XMP file or if I can start with the source TIFF file.
    Ultimately I would like to save the dimensions to a database.
    Any help would be great.
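    One possible approach, assuming Python with Pillow is available: skip the XMP sidecars entirely, read the pixel dimensions and resolution straight from the TIFFs, and store them in a small SQLite database. The folder path below is a placeholder.
    # Sketch: harvest width/height and DPI from TIFFs into SQLite (pip install pillow).
    import sqlite3
    from pathlib import Path
    from PIL import Image

    conn = sqlite3.connect("dimensions.db")
    conn.execute("CREATE TABLE IF NOT EXISTS dims (file TEXT, width INT, height INT, xdpi REAL, ydpi REAL)")

    for tif in Path("/path/to/tiffs").glob("*.tif"):       # placeholder folder
        with Image.open(tif) as im:
            xdpi, ydpi = im.info.get("dpi", (0, 0))        # e.g. 3000000/10000 = 300 dpi
            conn.execute("INSERT INTO dims VALUES (?, ?, ?, ?, ?)",
                         (tif.name, im.width, im.height, float(xdpi), float(ydpi)))

    conn.commit()
    conn.close()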

    It looks like every time I capture an image of a vm, a snapshot is taken of each disk and then a vhd object is also created. So can I convert the vhd's to a disk without going through the create-a-vm-from-image process? Going even further, if I just took
    snapshots of my vm's, can I turn those snapshots directly into disks?

  • Merge? or get rid of XMP files

    At some point in the past I managed to get XMP files associated with some of my raw (CR2) files.  It looks like something I did about a year ago made it happen on all my photos up to that date.  Later raw files don't have the same XMP files.
    Is there a way to merge the data in the XMP files back into the raw files?

    SWoolc wrote:
    Thinking about it, I guess the data won't be merged back to raw files, but rather the catalog file.   Can that be done?
    As mentioned above, the simple answer is no: the xmp cannot be merged into any file or catalog, apart from DNG, which contains within its structure the information that would otherwise be held in the xmp file for a proprietary raw such as your CR2s.
    The information is contained in the Lightroom catalog, but having xmp files can be advantageous as a safety measure in case of a catalog failure (the catalog ought to be backed up) or a change of applications. And again, as mentioned, they can be safely deleted.

  • Can someone help with a previous post labeled "Writing to a data file with time stamp - Help! "

    Can someone possibly help with a previous post labeled "Writing to a data file with time stamp - Help! "
    Thanks

    What's the problem?
    Aquaphire
    ---USING LABVIEW 6.1---

  • Problem writing meta data changes in xmp in spite of enabled settings

    Dear Adobe Community
    After struggling with this for two full days and one night, you are my last hope before I give up and migrate to Aperture instead.
    I am having problems with Lightroom 5 writing metadata changes into XMP and including develop settings in JPEGs, in spite of having ticked all three boxes in the catalog settings.
    In spite of having checked all the boxes, Lightroom refused to actually perform the actions. I allowed the save action to take a lot longer than the saving indicator showed was needed, but regardless of this no edits made to the photo would be visible outside Lightroom. I also tried unticking and ticking the boxes and restarting my computer.
    Therefore, I uninstalled the program and then reinstalled it (the trial version both times). I added about 5000 images to Lightroom (i.e. referenced). After making a couple of changes to one photo's develop settings, I tried closing the program. However, this message was then displayed:
    I left the program open and running for about 5 hours, then tried closing it, but the message still came up, so I closed the program and restarted the computer. I tried making changes to another photo, saving and then closing, and the same message came up. The program also becomes unresponsive, and of course still no metadata has been saved to the photo, i.e. when opening it outside Lightroom the edits are not shown.
    What to do? I would greatly appreciate any insights, since I have now completely hit the wall.
    Oh yes, that´s right:
    What version of Lightroom? Include the minor version number (e.g., Lightroom 4 with the 4.1 update).
    Lightroom 5.3
    Have you installed the recent updates? (If not, you should. They fix a lot of problems.)
    I installed the program two days ago and then for the second time today.
    What operating system? This should include specific minor version numbers, like "Mac OSX v10.6.8"---not just "Mac".
    Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.63 Safari/537.36
    What kind(s) of image file(s)? When talking about camera raw files, include the model of camera.
    JPEG
    If you are getting error message(s), what is the full text of the error message(s)?
    Please see screen dumps above
    What were you doing when the problem occurred?
    Trying to save metadata + trying to open images that it seemed I had saved meta data to
    Has this ever worked before?
    No
    What other software are you running?
    For some time Chrome, Firefox, Itunes. Then I closed all other software.
    Tell us about your computer hardware. How much RAM is installed?  How much free space is on your system (C:) drive?
    4 GB 1333 MHz DDR3
    Has this ever worked before?  If so, do you recall any changes you made to Lightroom, such as adding Plug-ins, presets, etc.?  Did you make any changes to your system, such as updating hardware, printers or drivers, or installing/uninstalling any programs?
    No, the problems have been there all the time.

    AnnaK wrote:
    Hi Rob
    I think you succeeded in partly convincing me. :) I think I will go for a non-destructive program like LR when I am back in Sweden, but will opt for a destructive one for now. Unfortunately, I have an Olympus - so judging from your comment NX2 might not be for me.
    Hi AnnaK (see below).
    AnnaK wrote:
    My old snaps are JPEG, but I recently upgraded to an Olympus e-pl5 and will notw (edited by RC) start shooting RAW.
    Note: I edited your statement: I assume you meant now instead of not.
    If you start shooting raw, then you're gonna need a raw processor, regardless of what the next step in the process will be. And there are none better for this purpose than Lightroom, in my opinion. As has been said, you can export those back to Lightroom as jpeg then delete the raws, if storage is a major issue, or convert to Lossy DNG. Both of those options assume you're willing to adopt a non-destructive workflow, from there on out anyway (not an absolute requirement, but probably makes the most sense). It's generally a bad idea to edit a jpeg then resave it as a jpeg, because quality gets progressively worse every time you do that. Still, it's what I (and everybody else) did for years before Lightroom, and if you want to adopt such a workflow then yeah: you'll need a destructive editor that you like (or as I said, you can continue to use Lightroom in that fashion, by exporting new jpegs and deleting originals - really? that's how you want to go???). Reminder: NX2 works great on jpegs, and so is still very much a candidate in my mind - my biggest reservation in recommending it is uncertainty of its future (it's kinda in limbo right now).
    AnnaK wrote:
    Rob Cole wrote:
    There is a plugin which will automatically delete originals upon export, but relying on plugins makes for additional complication too.
    Which plugin is this?
    Exportant (the option is invisible by default, but can be made visible by editing a text config file). To be clear: I do not recommend using Exportant for this purpose until after you've got everything else setup and functioning, and even then it would be worth reconsidering.
    AnnaK wrote:
    Rob Cole wrote:
    What I do is auto-publish to all consumption destinations after each round of edits, but that takes more space.
    How do you do this?
    Via Publish Services.
    PS - I also use features in 'Publish Service Assistant' and 'Change Manager' plugins (for complete automation), but most people just select publish collections and/or sets and click 'Publish' - if you only have a few collections/services it's convenient enough.
    AnnaK wrote:
    Would you happen to have any tips on which plugins I may want to use together with Photoshop Elements?
    No - sorry, maybe somebody else does.
    Did I get 'em all?
    Rob

  • Exif and gps-data in xmp-file overwritten, sometimes

    I have a problem where LR4 on import seems to rewrite the xmp files, and in that process data - notably GPS data - is lost.
    My workflow:
    Create xmp-file with Exiftool from CR2-files (sets copyright etc)
    Geocode with Geosetter
    Color labels with Photo Mechanic
    Import to LR4
    After import, geo data is gone for around 30% of the files. It seems that files that have received a color label from PM are not affected, and many files without a color label are also correctly imported.
    I never lose any data before importing into LR4.
    Anyone have any suggestions about what could be going on?
    I use a Canon 7D, Exiftool and Geosetter are updated to latest versions, LR version is 4.3.
    I couldn't find how to attach files, but if anyone is interested in seeing how the xmp looks before and after LR4 import, I could copy-paste the code in an answer.
    Best regards,
    /Pär

    Hi Pär,
    Do I understand correctly that you have an xmp sidecar alongside your CR2 raws when importing into Lightroom?
    Then LR should read it during import.
    Maybe the format standards are not clear, maybe Geosetter does not write the xmp-parts where LR expects the data?
    But to save your original xmps: change your catalog preference settings, to NOT automatically write xmp to files.
    Then you will have time, because then LR just reads the xmps on import, but does not write anything, unless you invoke it by selecting images and hitting <ctrl> s.
    Which you would only do once you have assured that everything from your original xmp has arrived in LR's catalog.
    Maybe it does not read that data during import, but would on a second read afterwards?
    You could check, once you have disabled auto-xmp-writing, by invoking another explicit Read Metadata from File.
    ...just a wild guess, to rule out a potential bug there...
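    One quick way to see exactly which fields change is to dump the GPS tags, with their group names, from a copy of the sidecar made before import and from the sidecar after Lightroom has rewritten it. A rough sketch (assumes ExifTool is installed; file names are placeholders):
    # Diagnostic sketch: compare GPS tags before/after the LR4 import rewrites the sidecar.
    import subprocess

    for path in ["IMG_1234_before.xmp", "IMG_1234_after.xmp"]:
        print("---", path)
        subprocess.run(["exiftool", "-a", "-G1", "-GPS*", path], check=False)
    If the "before" file carries the coordinates in a group that Lightroom does not read, that would explain the loss.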
    Cornelia

  • How can I open a tdms file and replace a subset of data then save that change without re-writing the entire file again?

    Hi all,
    Is it possible to open a TDMS file, make a small change to an array subset, and then save the file without having to save the whole dataset as a different file with a new name? That is to say, is there something similar to "Save" in MS Word rather than "Save As"? I only want to change a 1D array of four data points in a file of 7M data points.
    I am not sure if this makes sense. Any help is appreciated.
    Thanks,
    Jack

    You can use either one, but for your application I would use the synchronous version. It requires far less setup. When you open the file, set both 'enable asynchronous' and 'disable buffering' to FALSE so that you can use synchronous access with arbitrary data sizes.
    Attached code is LabVIEW 2011.
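    For readers who just want the general idea of "Save" rather than "Save As": an in-place patch only seeks to the affected region and overwrites those bytes, so a 4-value change stays cheap even in a 7M-value file. The sketch below illustrates the concept on a plain binary file of float64 samples; it is not TDMS-specific (TDMS files have their own segment headers, so inside LabVIEW use the TDMS functions as described above).
    # Illustration only - in-place overwrite of a few samples in a flat binary file.
    import struct

    def patch_doubles(path, start_index, new_values):
        """Overwrite len(new_values) float64 samples starting at start_index."""
        with open(path, "r+b") as f:             # r+b = read/write without truncating
            f.seek(start_index * 8)              # 8 bytes per float64 sample
            f.write(struct.pack("<%dd" % len(new_values), *new_values))

    # patch_doubles("flat_data.bin", 1000, [0.1, 0.2, 0.3, 0.4])   # placeholder file name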
    This account is no longer active. Contact ShadesOfGray for current posts and information.
    Attachments:
    UpdateTDMS.zip ‏20 KB

  • Creating Multiple IDOCs and Line Items based on Posting date from file

    Hi All,
    My scenario is File to IDOC(MBGMCR01)...
    I need your suggestions and help on how to go about this...
    Source file structure is
    DC61|2009-03-15|000000000001200051|00000005.00|200|0001|1234|
    DC61|2009-03-15|000000000001200363|00000001.00|300|0001|1234|
    DC61|2009-03-15|000000000001200334|00000002.00|400|0001|1234|
    DC61|2009-03-16|000000000001201145|00000001.00|200|0001|1234|
    DC61|2009-03-16|000000000001201086|00000002.00|100|0001|1234|
    DC61|2009-03-17|000000000001200051|00000003.00|200|0001|1234|
    DC61|2009-03-17|000000000001200052|00000003.00|200|0001|1234|
    DC61|2009-03-17|000000000001200053|00000003.00|200|0001|1234|
    DC61|2009-03-18|000000000001200056|00000003.00|200|0001|1234|
    And target IDOC(MBGMCR01) is
    IDOC (0…99999)
    E1BP2017_GM_ITEM_CREATE (0…999999)
         Date
    For each new posting date (column 2) in the source, a new IDoc is to be created, and the corresponding records for each posting date are to be added as E1BP2017_GM_ITEM_CREATE segments.
    The output for the above should be like this:
    IDOC(2009-03-15)
    E1BP2017_GM_ITEM_CREATE=2009-03-15
    E1BP2017_GM_ITEM_CREATE=2009-03-15
    E1BP2017_GM_ITEM_CREATE=2009-03-15
    IDOC(2009-03-16)
    E1BP2017_GM_ITEM_CREATE=2009-03-16
    E1BP2017_GM_ITEM_CREATE=2009-03-16
    IDOC(2009-03-17)
    E1BP2017_GM_ITEM_CREATE =2009-03-17
    E1BP2017_GM_ITEM_CREATE=2009-03-17
    E1BP2017_GM_ITEM_CREATE=2009-03-17
    IDOC(2009-03-18)
    E1BP2017_GM_ITEM_CREATE=2009-03-18
    I will be thankful if anyone gives a hint.
    Thanks and regards,
    Sridhar

    I actually meant a picture of your mapping - anyway, I hope this is correct:
    Your source structure:
    <MT_IAR>
      <IAR_Recordset>
        <IAR_Details>
          <Inv_adj_date>
        </IAR_Details>
        <IAR_Details>
          <Inv_adj_date>
        </IAR_Details>
    Then the mapping should be like this:
    <Inv_adj_date>        ==> SplitbyValue        ==> IDOC
    Please let me know if this doesn't work.
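    Outside the graphical mapping, the grouping being asked for can be stated simply: sort the source rows by the posting date in column 2, then emit one IDoc per distinct date with one E1BP2017_GM_ITEM_CREATE segment per row. A small illustration of that logic in Python (not PI mapping code; the file name is a placeholder):
    # Illustration of the SplitByValue grouping: one IDoc per posting date.
    from itertools import groupby

    with open("source.txt") as f:                          # placeholder file name
        rows = [line.rstrip("\n").split("|") for line in f if line.strip()]

    rows.sort(key=lambda r: r[1])                          # posting date is column 2

    for posting_date, items in groupby(rows, key=lambda r: r[1]):
        print("IDOC(%s)" % posting_date)
        for _ in items:
            print("  E1BP2017_GM_ITEM_CREATE=%s" % posting_date)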

  • Recording data at particular iterations and writing to text file

    Hi all,
    this is my first time posting on the NI boards. I'm running into a couple of problems I can't seem to figure out how to fix.
    I'm collecting data using a LabJack U3-HV DAQ. I've taken one of the out-of-the-box streaming functions that comes with the LabJack and modified it for my purposes. I am attempting to simply save the recorded data to a text file in columns, one for each of my 4 analog strain gauge inputs and one for time. For some reason, when the 'Write to Measurement File.vi' executes it puts everything in rows, and the data is unintelligible.
    The second issue I am facing, which is not currently visible in my VI, is that I am running my test for 60,000 cycles, which generates a ton of data. I'm measuring creep/fatigue with my strain gauges, so I don't need data for every cycle - probably just the first 1000, then 2k, 4k, 6k, 8k, 10k, 20k, etc.; more of an exponential curve. I tried using some max/min functions and matching the 'Write to Measurement File.vi' with a case structure that only permitted it to write on particular iterations, but can't seem to get it to work. (The write/skip condition I'm after is sketched below.)
    Thanks in advance for any help!
    Attachments:
    3.5LCP strain gages v2.vi ‏66 KB
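    The condition for the case structure in the second issue is just a boolean test on the cycle count. A rough sketch of that test, written in Python rather than LabVIEW (the thresholds are only the example values from the post; substitute an exponential schedule if preferred):
    # Record every cycle up to 1000, then only every 2000th cycle thereafter.
    def should_record(cycle):
        if cycle <= 1000:
            return True
        return cycle % 2000 == 0

    logged = [c for c in range(1, 60001) if should_record(c)]
    print(len(logged), logged[:3], logged[-3:])    # 1030 writes instead of 60,000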

    Hey carfreak,
    I've attached a screenshot that shows three different ways of trying to keep track of incrementing data and/or time in a while loop. The top loop just shows a program that demonstrates how shift registers can be used to transfer data between loops. This code just writes the iteration value to the array using the Build Array function.
    The first loop counts iterations in an extremely round-about way... the second shows that really you can just build the array directly using the iteration count (the blue "i" in the while loop is just an iteration counter).
    The final loop shows how you can use a time stamp to actually keep track of the relative time when a loop executes.
    Please note that these three should not actually be implemented together in one VI. I just built them in one BD for simplicity's sake for the screenshot. As described above, the producer-consumer architecture should be used when running parallel loops.
    Does that answer your question?
    Chris G
    Applications Engineer
    National Instruments
    Attachments:
    While Loops and Iterations.JPG ‏83 KB
