Reverse File for metadata in HFM

Hi,
I'm trying to reverse-engineer a flat file, but when I do, I get this error: com.microsoft.sqlserver.jdbc.SQLServerException: Violation of PRIMARY KEY constraint 'PK_REV_COL'. Cannot insert duplicate key in object 'dbo.SNP_REV_COL'.
Before this I reverse-engineered another flat file successfully. I deleted that model and ran the reverse again, but now the reverse returns nothing.
What I don't understand is why it worked the first time and not now (everything is completely new), and why one file throws an error while the other doesn't.
We already had a problem with reverse-engineering and HFM before; we received a patch for it.
Best Regards

You probably have duplicate column names specified in the file. Even if only the case differs, ODI will throw this error.
Thanks,
Sutirtha
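Since the reply above points at duplicate column names (even when only the case differs), a quick pre-check of the flat file's header row can catch the problem before running the reverse. A minimal sketch, assuming a comma-separated header line (adjust the separator to your file's format):

```python
def find_duplicate_columns(header_line, sep=","):
    """Return the column names that collide case-insensitively."""
    seen = {}
    dupes = []
    for name in header_line.strip().split(sep):
        key = name.strip().lower()   # case-insensitive comparison, as ODI does
        if key in seen:
            dupes.append(name.strip())
        else:
            seen[key] = name.strip()
    return dupes

# Example: "Amount" and "AMOUNT" collide even though the case differs.
print(find_duplicate_columns("Entity,Account,Amount,AMOUNT"))
```

Any name the function reports would trip the PK_REV_COL constraint during the reverse.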

Similar Messages

  • Reviewer's Guide: Review.mdl file for demo

    Hi,
    I downloaded OWB904 from technet, and am going through the reviewer's guide. To do the demos, it calls out importing metadata from Review.mdl on the install drive.
    I have searched all through my computer, and all through the OWB site, and can't seem to find the files for the demos.
    Where can one obtain these files? We are very excited about trying out this new release of OWB.
    Thanks,
    Scott Rappoport

    I'm also going through the Reviewer's Guide, and I'm trying to load the metadata repository with the Review.mdl file located in the Reviewer's Guide zip. When I try to import Review.mdl using the OWB Client I get the following error:
    "Error: MDL1194: Import data file for Metadata Loader release 9.2 not supported by Metadata Loader release 3.1. Use OWB MDL Upgrade Utility to convert older MDL data files to the latest supported format."
    I tried using the OWB MDL File Upgrade Utility to convert Review.mdl as suggested by the error message and received another error message as follows:
    "CNV0002-0023(ERROR): MDLRELEASE value is not compatible with the driver file version."
    At this point I'm stuck. Please advise as soon as possible on how to proceed with loading the metadata repository.
    Thanks in advance.
    ...Phillip

  • Error when Reversing metadata from HFM

    Hi,
    I'm getting this error when I try to reverse metadata from HFM:
    "com.hyperion.odi.common.ODIHAppException: Error occurred while loading driver."
    I have already tried the suggestions in this forum, including:
    Cause
    The Java Library Path of the HFMDriver.dll is missing in Wrapper configuration file.
    Solution
    1. Backup and edit the ODI_HOME\tools\wrapper\conf\snpsagent.conf and include the following entry in Java Library Path section :
    wrapper.java.library.path.2=../drivers
    After adding it will look as follows:
    # Java Library Path (location of Wrapper.DLL or libwrapper.so)
    wrapper.java.library.path.1=../tools/wrapper/lib/
    wrapper.java.library.path.2=../drivers
    Note: HFMDriver.dll is usually present in the <odi_home>\oracledi\drivers folder. If it is located in a different path, refer to that location instead. Also, the numbering (wrapper.java.library.path.1, wrapper.java.library.path.2, ...) must be incremental, with no duplicates and no gaps.
    2. Save the file and recreate the ODI Agent service with the new configuration parameters (the classes are loaded when the service is created), then re-test the issue.
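    The numbering rule in the note above (incremental, no duplicates, no gaps) is easy to verify mechanically. A minimal sketch, assuming the standard wrapper.conf key format:

    ```python
    import re

    def library_path_numbering_ok(conf_lines):
        """Collect the N in wrapper.java.library.path.N entries and check that
        they form the sequence 1, 2, 3, ... with no duplicates or gaps."""
        nums = []
        for line in conf_lines:
            m = re.match(r"wrapper\.java\.library\.path\.(\d+)=", line.strip())
            if m:
                nums.append(int(m.group(1)))
        return sorted(nums) == list(range(1, len(nums) + 1))

    conf = [
        "# Java Library Path (location of Wrapper.DLL or libwrapper.so)",
        "wrapper.java.library.path.1=../tools/wrapper/lib/",
        "wrapper.java.library.path.2=../drivers",
    ]
    print(library_path_numbering_ok(conf))  # True
    ```

    A duplicated or skipped index makes the check fail, which is exactly the misconfiguration the note warns about.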
    But it still fails. Does anyone know another solution?
    By the way, I'm using ODI 10.1.3.5 and HFM 9.3.1.
    They are not on the same machine, and I have already installed the HFM client on the ODI machine.
    Thanks in advance.

    OK, I've solved the problem above. It looks like I forgot to add the driver folder to the PATH variable.
    But now I get another error, and this time the problem seems to lie in HFM itself.
    It says the server/cluster is incorrectly configured, yet from my HFM client I can access the application successfully.
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 38, in ?
    com.hyperion.odi.common.ODIHAppException: Error occurred in driver while connecting to Financial Management application [KALBEAPP] on [HFMTest] using user-name [adm].
         at com.hyperion.odi.hfm.ODIHFMAppConnection.<init>(ODIHFMAppConnection.java:58)
         at com.hyperion.odi.hfm.ODIHFMAppReverser.connect(ODIHFMAppReverser.java:27)
         at com.hyperion.odi.common.ODIModelImporter.importModels(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
         at java.lang.reflect.Method.invoke(Unknown Source)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
         at org.python.pycode._pyx0.f$0(<string>:38)
         at org.python.pycode._pyx0.call_function(<string>)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyCode.call(PyCode.java)
         at org.python.core.Py.runCode(Py.java)
         at org.python.core.Py.exec(Py.java)
         at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Caused by: com.hyperion.odi.hfm.wrapper.HFMException: Error occurred in driver while connecting to Financial Management application [KALBEAPP] on [HFMTest] using user-name [adm].
         at com.hyperion.odi.hfm.wrapper.HFMConnection.<init>(HFMConnection.java:57)
         at com.hyperion.odi.hfm.wrapper.HFMServer.getConnection(HFMServer.java:89)
         at com.hyperion.odi.hfm.ODIHFMAppConnection.<init>(ODIHFMAppConnection.java:52)
         ... 33 more
    Caused by: com.hyperion.odi.hfm.wrapper.HFMException: Opening HFM application failed. Error code: 0x80041143 [Server/Cluster is incorrectly configured. Please reconfigure your Cluster or Server connection.]
         at com.hyperion.odi.hfm.wrapper.HFMDriverJNI.getConnection(Native Method)
         at com.hyperion.odi.hfm.wrapper.HFMConnection.<init>(HFMConnection.java:51)
         ... 35 more
    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: Error occurred in driver while connecting to Financial Management application [KALBEAPP] on [HFMTest] using user-name [adm].
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Thanks in advance...

  • Error in Loading Meta Data File for Service 'CL_ACCOUNTING_DOCUMENT_DP'

    Hi Guys,
    I need your assistance in solving the errors below.
    1. Error while loading the metadata file for various services that require connectors to be created, for example CL_ACCOUNTING_DOCUMENT_DP:
    BEP | ZCB_COST_CENTER_SRV | 1 | Cost Center Service | CB_COST_CENTER_SRV | 1
    BEP | ZCB_GOODS_RECEIPT_SRV | 1 | Goods Receipt Service | CB_GOODS_RECEIPT_SRV | 1
    2. While expanding the node for connectors in ESH_COCKPIT for SAPAPPLH, the error below occurs:
    Could not rename Data Type "SIG_IL_USA_2" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "SIG_IL_SDR_2" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "SIG_IL_RES_2" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "SIGN_TYPE_UD_1" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "SIGN_TYPE_SM_1" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "SIGN_TYPE_RR_1" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "RMXTE_TRIALID_1" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "QZUSMKZHL_1" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "QWERKVORG_1" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "QVNAME_2" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "QVMENGE_2" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "QVINSMK_2" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "QVGRUPPE_2" in SWC EAAPPLH - errors occurred during renaming
    Could not rename Data Type "QVEZAEHLER_2" in SWC EAAPPLH - errors occurred during renaming

    Hi,
    have you solved this issue? We have the same problem with ESH_COCKPIT and the SAPAPPLH component.
    Regards,
    Martin Sindlar

  • Error while importing zip file for second time to B2B through ant script

    [echo] args=import
    importstatus:
    [echo] Commandline arguments 1: [import]
    [echo] Import with overwrite=true
    [echo] B2BCommandLineUtility: importRepository: Error messages:
    [echo] MDS-00521: error while reading document /soa/b2b/tpa_JOvrtiV-7030143
    019445112136.xml from metadata repository
    [echo] MDS-00520: failure to read document /soa/b2b/tpa_JOvrtiV-70301430194
    45112136.xml because it is not in the metadata repository
    [echo] MDS-00911: Document with name "/soa/b2b/tpa_JOvrtiV-7030143019445112
    136.xml" and version 16 does not exist in the repository.
    [echo] ORA-01403: no data found
    [echo] ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 407
    [echo] ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 603
    [echo] ORA-06512: at line 1
    [echo]
    [echo] ORA-01403: no data found
    [echo] ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 407
    [echo] ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 603
    [echo] ORA-06512: at line 1
    [echo]
    [echo] ORA-01403: no data found
    [echo] ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 407
    [echo] ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 603
    [echo] ORA-06512: at line 1
    [echo]
    [echo]
    Can anybody please help?
    Thanks in advance!

    Hi,
    I am also getting the same error when importing the .zip file a second time through the B2B console (Administration --> Import/Export tab).
    But then it works on the third attempt.
    Error Message:
    Import of file OriginalProject.zip failed.
    MDS-00521: error while reading document /soa/b2b/tpa_MnjhBHh-70301432125893445241.xml from metadata repository
    MDS-00520: failure to read document /soa/b2b/tpa_MnjhBHh-70301432125893445241.xml because it is not in the metadata repository
    MDS-00911: Document with name "/soa/b2b/tpa_MnjhBHh-70301432125893445241.xml" and version 34 does not exist in the repository.
    ORA-01403: no data found
    ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 407
    ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 603
    ORA-06512: at line 1
    ORA-01403: no data found
    ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 407
    ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 603
    ORA-06512: at line 1
    ORA-01403: no data found
    ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 407
    ORA-06512: at "DEV_MDS.MDS_INTERNAL_SHREDDED", line 603
    ORA-06512: at line 1
    Please help...
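    No root cause is confirmed in this thread, but since the import reportedly fails on one attempt and then succeeds on the next, a blunt workaround is to retry the import before giving up. A generic retry helper sketch; the actual import call (run_import below) is a hypothetical stand-in for whatever command or API you already invoke:

    ```python
    def retry(action, attempts=3):
        """Call action() until it returns without raising, or until the
        attempt budget is exhausted; re-raise the last failure."""
        last_err = None
        for _ in range(attempts):
            try:
                return action()
            except Exception as err:  # narrow this to the actual import failure in practice
                last_err = err
        raise last_err

    # Hypothetical usage: wrap your existing import invocation.
    # retry(lambda: run_import("OriginalProject.zip"), attempts=3)
    ```

    This only papers over the symptom; the MDS errors above still deserve investigation.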

  • Adobe Camera Raw (ACR), Jpeg Files and Metadata

    I have been using Adobe Camera Raw (ACR) with my Canon EOS 30D, an 8 MP camera, for a while now. I would make non-destructive changes to the raw file (.CR2) in ACR where the changes would be stored in an adjoining .xmp file. Jpegs of the unedited and edited camera raw files would be created so I have a before and after versions of the images that can be viewed anywhere.
    Now I have a Canon EOS 5D Mark III, a 22 MP camera, and the raw files are much larger. I’m looking at using camera raw on jpeg files for some of my more casual photo shoots in order to save disk space. From what I have read, I can use ACR on jpeg files and the changes would be stored in the metadata in the jpeg file. Is there a way for the changes to be stored in an adjoining .xmp file so that the original jpeg file is not modified, much like it is done with the .CR2 files?
    I am using Adobe Photoshop CS5 on a Windows 7 machine. All software is up-to-date.
    Also, I have found that Adobe’s camera raw handling is different from the raw handling of the photos using Canon’s software (DPP). Is there a way to make ACR closer to what Canon’s software does?
    Thanks,
    Mike

    Probably not going to happen.
    I agree, Canon's color is better than Adobe's in general - I just didn't know how good the Camera Standard profile might be for your particular camera.  I had hoped maybe they'd made it a very close match.
    Some time ago I got a very nice genius-level Camera Raw forum member named Vit to make me a custom profile that exactly matches the Canon color for my 40D, even to the point of emulating the way Canon fits the entire gamut of the captured image into the sRGB color space, so I'm more than happy.
    Others might tell you that you're silly for wanting the color to match, but I understand completely your position - if you do get that kind of feedback just ignore it and push on.
    Once you've set up a default to use Camera Standard, you may well be able to tweak the dozens of color controls to bring the Adobe default into line with the Canon color.  I did that once before getting my special 40D profile, comparing embedded raw file JPEGs with the Camera Raw preview display with a variety of images - it was tedious but effective.
    Best of luck.
    -Noel

  • Do I need to use Flex Files for a 24 fps edit?

    Hello.
    I am shooting on 16mm film, transferring to video (with key code burn in), editing on Final Cut 4, then matching back to film.
    I would like to save money by not having to create Flex Files, and was told that if I do a reverse telecine in Cinema Tools (converting 29.97 transfer to 24), and then edit at 24 in FCP, I will be able to generate (manually) my own cut list when I am done by noting the key code burned at the head and tail frame of my shots. The Cinema Tools manual even implies it would be able to generate a cut list itself?
    Can anyone speak to this or tell me why this plan wouldn't work--why I absolutely need a flex file? This will be a short film (under 10 minutes) and I will be capturing all the footage for the edit, so I don't need the flex file for help with capture, and I don't mind the time needed to log key codes myself for the cut list.
    Thanks so much in advance for any help!

    Hi. It costs enough to make a difference for my very, very small budget--a flat fee each time I transfer and I am shooting 100 foot rolls so there could be up to five or more transfers. Since I am capturing everything (entire camera rolls), I'm trying to understand if I need the flex files--if the key code is burned in, can't I use that for cut list? In other words, if I don't need the files for importing, what other reasons would you strongly advise me to have them made?

  • The method getBean() is not valid for metadata object /ex/model/remittance/client/common/bc4j.xcfg

    Hi,
    I am getting the below error when running my application. I am using JDeveloper 11.1.2.3.
    <Nov 6, 2013 9:52:57 AM AST> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to STANDBY>
    <Nov 6, 2013 9:52:57 AM AST> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to STARTING>
    <BC4JConfigLifeCycleCallBack> <contextInitialized> MDS-01702: The method getBean() is not valid for metadata object /ex/model/remittance/client/common/bc4j.xcfg - it is allowed on only bean metadata objects.
    oracle.mds.exception.MDSRuntimeException: MDS-01702: The method getBean() is not valid for metadata object /ex/model/remittance/client/common/bc4j.xcfg - it is allowed on only bean metadata objects.
      at oracle.mds.core.MetadataObject.getBean(MetadataObject.java:327)
      at oracle.adf.share.jndi.MDSBackingStore.getMOBean(MDSBackingStore.java:558)
      at oracle.bc4j.mbean.RuntimeMXBeanImpl.init(RuntimeMXBeanImpl.java:120)
      at oracle.bc4j.mbean.RuntimeMXBeanImpl.<init>(RuntimeMXBeanImpl.java:110)
      at oracle.bc4j.mbean.RuntimeMXBeanImpl.<init>(RuntimeMXBeanImpl.java:101)
      at oracle.bc4j.mbean.BC4JConfigLifeCycleCallBack.contextInitialized(BC4JConfigLifeCycleCallBack.java:114)
      at weblogic.servlet.internal.EventsManager$FireContextListenerAction.run(EventsManager.java:481)
      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
      at weblogic.servlet.internal.EventsManager.notifyContextCreatedEvent(EventsManager.java:181)
      at weblogic.servlet.internal.WebAppServletContext.preloadResources(WebAppServletContext.java:1872)
      at weblogic.servlet.internal.WebAppServletContext.start(WebAppServletContext.java:3153)
      at weblogic.servlet.internal.WebAppModule.startContexts(WebAppModule.java:1508)
      at weblogic.servlet.internal.WebAppModule.start(WebAppModule.java:482)
      at weblogic.application.internal.flow.ModuleStateDriver$3.next(ModuleStateDriver.java:425)
      at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:52)
      at weblogic.application.internal.flow.ModuleStateDriver.start(ModuleStateDriver.java:119)
      at weblogic.application.internal.flow.ScopedModuleDriver.start(ScopedModuleDriver.java:200)
      at weblogic.application.internal.flow.ModuleListenerInvoker.start(ModuleListenerInvoker.java:247)
      at weblogic.application.internal.flow.ModuleStateDriver$3.next(ModuleStateDriver.java:425)
      at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:52)
      at weblogic.application.internal.flow.ModuleStateDriver.start(ModuleStateDriver.java:119)
      at weblogic.application.internal.flow.StartModulesFlow.activate(StartModulesFlow.java:27)
      at weblogic.application.internal.BaseDeployment$2.next(BaseDeployment.java:636)
      at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:52)
      at weblogic.application.internal.BaseDeployment.activate(BaseDeployment.java:205)
      at weblogic.application.internal.EarDeployment.activate(EarDeployment.java:58)
      at weblogic.application.internal.DeploymentStateChecker.activate(DeploymentStateChecker.java:161)
      at weblogic.deploy.internal.targetserver.AppContainerInvoker.activate(AppContainerInvoker.java:79)
      at weblogic.deploy.internal.targetserver.BasicDeployment.activate(BasicDeployment.java:184)
      at weblogic.deploy.internal.targetserver.BasicDeployment.activateFromServerLifecycle(BasicDeployment.java:361)
      at weblogic.management.deploy.internal.DeploymentAdapter$1.doActivate(DeploymentAdapter.java:51)
      at weblogic.management.deploy.internal.DeploymentAdapter.activate(DeploymentAdapter.java:200)
      at weblogic.management.deploy.internal.AppTransition$2.transitionApp(AppTransition.java:30)
      at weblogic.management.deploy.internal.ConfiguredDeployments.transitionApps(ConfiguredDeployments.java:240)
      at weblogic.management.deploy.internal.ConfiguredDeployments.activate(ConfiguredDeployments.java:169)
      at weblogic.management.deploy.internal.ConfiguredDeployments.deploy(ConfiguredDeployments.java:123)
      at weblogic.management.deploy.internal.DeploymentServerService.resume(DeploymentServerService.java:180)
      at weblogic.management.deploy.internal.DeploymentServerService.start(DeploymentServerService.java:96)
      at weblogic.t3.srvr.SubsystemRequest.run(SubsystemRequest.java:64)
      at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
      at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    <Nov 6, 2013 9:53:11 AM AST> <Notice> <LoggingService> <BEA-320400> <The log file C:\Users\maroof\AppData\Roaming\JDeveloper\system11.1.2.3.39.62.76.1\DefaultDomain\servers\DefaultServer\logs\DefaultDomain.log will be rotated. Reop
    I have seen the Oracle docs solution:
    MDS-01702: The method getBean() is not valid for metadata object {0} - it is allowed on only bean metadata objects.
    Cause: An attempt was made to access the metadata object using the bean API but the metadata object was not a bean metadata object. The schema for the root element may not have been registered with MDS, or the beans were not available on the classpath.
    Action: Regenerate the beans from the schema definitions and ensure that the correct versions are on the classpath. Check that the schemas are registered with the MDS instance.
    Level: 1
    Type: ERROR
    Impact: Configuration
    But I am not sure what I should do in my application to remove this error.
    Maroof

    Do you use MDS in your application?
    It's possible that you see a leftover of an earlier deployment.
    Do you get the same error if you only start the integrated server (application server navigator->integrated server->debug or run)?
    If you don't get it when you only start the server without running your application, the problem is somewhere in your app. If you do get the error, it's probably an old deployment or a configuration error of your server. In that case stop JDev, rename the DefaultDomain folder C:\Users\maroof\AppData\Roaming\JDeveloper\system11.1.2.3.39.62.76.1\DefaultDomain to C:\Users\maroof\AppData\Roaming\JDeveloper\system11.1.2.3.39.62.76.1\xxxDefaultDomain and start JDev again. Then use the application server navigator to create a new integrated server (you should see this option, as the DefaultDomain folder is not present) and, once the new integrated server is created, test again.
    Timo

  • How to use certain entries from a txt file as metadata?

    So during my plugin development I stumbled on a problem.
    I have created a txt file for each picture telling me who's in the photos. So the first column in the file shows all the names that appear in the photo.
    I want to add those names into LR for each photo. I thought using them as metadata would be a good idea. Then I could let the user enter a name of some person and create a smart collection having all photos with that person in them.
    But putting it into practice isn't that easy.
    So I started with the classic metadata definition:
    metadataFieldsForPhotos = {
      id = 'Person',
      title = 'Person',
      dataType = 'string',
      searchable = true,
      browsable = true,
    }
    Now I don't know how to actually read only the column entries of the txt file and display all the names appearing in it. Another problem is that each photo has its own txt file, "nameofphoto.jpg.txt", so LR must know which txt file belongs to which photo.
    I know my question isn't very specific but it would be great if someone could help...
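    For the reading part, the naming scheme described above ("nameofphoto.jpg.txt" alongside each photo) makes the sidecar lookup mechanical. A minimal sketch of the same logic in Python; the whitespace-separated layout with the name in the first column is an assumption about the txt format, and in the plugin itself the equivalent would use LrFileUtils.readFile with Lua string functions:

    ```python
    def names_in_sidecar(photo_path):
        """Given a photo's path, read its 'photo.ext.txt' sidecar file and
        return the first-column entry (a person's name) from every line."""
        names = []
        with open(photo_path + ".txt", encoding="utf-8") as f:
            for line in f:
                fields = line.split()        # assumes whitespace-separated columns
                if fields:
                    names.append(fields[0])  # first column holds the name
        return names

    # Hypothetical usage: names_in_sidecar("holiday/IMG_0001.jpg")
    # would read holiday/IMG_0001.jpg.txt
    ```

    Note the first-column assumption breaks if a name itself contains spaces; a delimiter like tab or comma would be more robust.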

    Thank you for the reply. Reading the file how I wanted worked fine and I chose to use keywords, as you suggested.
    So now I tried to add the keywords to the pictures and I get:
    "LrCatalog:withWriteAccessDo: could not execute action 'keywordProvider'. It was blocked by another write access call, and no timeout parameters were provided."
    There's my function so far:
    function Finder.findPerson(smartCollectionName)
         local catalog = LrApplication.activeCatalog()
         local photos = catalog:getMultipleSelectedOrAllPhotos()
           LrTasks.startAsyncTask( function()
                for i=1,#photos do
                     local photo = photos[i]
                     local photopath = photo:getRawMetadata("path")
                     local photoname = photo:getFormattedMetadata("fileName")
                     local folderpath = string.sub(photopath, 1, string.len(photopath) - string.len(photoname))
                     local metadataPath = folderpath .. "\.metaface\\" .. photoname .. ".txt"
                     local text = LrFileUtils.readFile(metadataPath)
                     local metadataValues = stringSplit(text)
                     for i = 0, #metadataValues do
                          catalog:withWriteAccessDo( "keywordProvider", function()
                               local keywords = catalog:createKeyword(metadataValues[i])
                               photos:addKeyword(keywords)
                          end)
                     end
               end
          end)
    end
    So what am I doing wrong?

  • Embedding data from xml file into metadata of a pdf

    Hi All
    I'm wanting to do the following, but struggling to figure out the right way to go about it.
    I want to embed data from my MIS into a PDF's metadata (see screenshot). I can create a standalone XML file with all the data I require, but I'm unsure how to automate embedding it into the PDF's advanced metadata. I know this can be done, as it worked at a previous employer, but I didn't get the chance to find out how they did it.
    I'm wanting to do this so I can carry out a more advanced search of the metadata in Bridge.
    Any advice would be appreciated!

    Hi Northern,
    I have modified the ModifyingXMP sample for you. After this change, put your XMP file as sample.xml and also put the PDF file in the same folder as the ModifyingXMP executable. After merging my changes, ModifyingXMP will read sample.xml and embed it into the PDF file.
    Please follow these steps:
    1. Download XMPToolkit SDK and follow the steps to compile Sample
    2. Open ModifyingXMP file, replace all the content of that file with the below content
    3. Compile the ModifyingXMP file.
    4. The ModifyXMP.exe will be generated in folder (samples\target\windows\Debug), if you have changed the output folder it will be generated there.
    5. In parallel to ModifyingXMP.exe put the sample.xml (the xml file you have) and also the pdf file (say pdf file name is mypdf.pdf)
    6. Go to console and change directory to the directory where ModifyingXMP is and pass the following command
    ModifyingXMP mypdf.pdf
    7. Open the pdf file and check that value/properties
    For your reference, here is the content of the sample.xml file too. Put this content in sample.xml next to any PDF, and you will find the subject field gets added.
    *************** Content of the sample.xml file (create a file named sample.xml with this content and put it alongside ModifyingXMP.exe) ***************
    <rdf:RDF xmlns:rdf='http://www.w3.org/1999/02/22-rdf-syntax-ns#'>
        <rdf:Description rdf:about='' xmlns:dc='http://purl.org/dc/elements/1.1/'>
            <dc:subject>
                <rdf:Bag>
                    <rdf:li>XMP</rdf:li>
                    <rdf:li>SDK</rdf:li>
                    <rdf:li>Sample</rdf:li>
                </rdf:Bag>
            </dc:subject>
            <dc:format>image/tiff</dc:format>
        </rdf:Description>
    </rdf:RDF>
    ******************* MODIFIED CONTENT OF THE MODIFYINGXMP.CPP FILE *******************
    // =================================================================================================
    // Copyright 2008 Adobe Systems Incorporated
    // All Rights Reserved.
    // NOTICE:  Adobe permits you to use, modify, and distribute this file in accordance with the terms
    // of the Adobe license agreement accompanying it.
    // =================================================================================================
    /*
    * Tutorial solution for Walkthrough 2 in the XMP Programmers Guide, Modifying XMP.
    * Demonstrates how to open a file for update, and modify the contained XMP before writing it back to the file.
    */
    #include <cstdio>
    #include <vector>
    #include <string>
    #include <cstring>
    // Must be defined to instantiate template classes
    #define TXMP_STRING_TYPE std::string
    // Must be defined to give access to XMPFiles
    #define XMP_INCLUDE_XMPFILES 1
    // Ensure XMP templates are instantiated
    #include "public/include/XMP.incl_cpp"
    // Provide access to the API
    #include "public/include/XMP.hpp"
    #include <iostream>
    #include <fstream>
    using namespace std;
    /*
    * Creates an XMP object from an RDF string. The string is used
    * to simulate creating an XMP object from multiple input buffers.
    * The last call to ParseFromBuffer has no kXMP_ParseMoreBuffers option,
    * thereby indicating this is the last input buffer.
    */
    #include <sstream>
    SXMPMeta createXMPFromRDF()
    {
        string rdf;
        // Open the RDF file and put its content into the rdf buffer
        ifstream inFile;
        inFile.open("sample.xml"); // open the input file
        if (!inFile.is_open()) {
            cout << "Couldn't open xml file" << endl;
            exit(1);
        }
        stringstream strStream;
        strStream << inFile.rdbuf(); // read the file
        rdf = strStream.str();       // str holds the content of the file
        SXMPMeta meta;
        // Loop over the rdf string and create the XMP object,
        // 10 characters at a time
        int i;
        for (i = 0; i < (long)rdf.size() - 10; i += 10)
            meta.ParseFromBuffer(&rdf[i], 10, kXMP_ParseMoreBuffers);
        meta.ParseFromBuffer(&rdf[i], (XMP_StringLen)rdf.size() - i);
        return meta;
    }
    int main ( int argc, const char * argv[] )
        if ( argc != 2 ) // 2 := command and 1 parameter
            cout << "usage: ModifyingXMP (filename)" << endl;
            return 0;
        string filename = string( argv[1] );
        if(!SXMPMeta::Initialize())
            cout << "Could not initialize toolkit!";
            return -1;
        XMP_OptionBits options = 0;
        #if UNIX_ENV
            options |= kXMPFiles_ServerMode;
        #endif
        // Must initialize SXMPFiles before we use it
        if(SXMPFiles::Initialize(options))
            try
                // Options to open the file with - open for editing and use a smart handler
                XMP_OptionBits opts = kXMPFiles_OpenForUpdate | kXMPFiles_OpenUseSmartHandler;
                bool ok;
                SXMPFiles myFile;
                std::string status = "";
                // First we try and open the file
                ok = myFile.OpenFile(filename, kXMP_UnknownFile, opts);
                if( ! ok )
                    status += "No smart handler available for " + filename + "\n";
                    status += "Trying packet scanning.\n";
                    // Now try using packet scanning
                    opts = kXMPFiles_OpenForUpdate | kXMPFiles_OpenUsePacketScanning;
                    ok = myFile.OpenFile(filename, kXMP_UnknownFile, opts);
                // If the file is open then read get the XMP data
                if(ok)
                    cout << status << endl;
                    cout << filename << " is opened successfully" << endl;
                    // Create the XMP object and get the XMP data
                    SXMPMeta meta;
                    myFile.GetXMP(&meta);
                    // Create a new XMP object from an RDF string
                    SXMPMeta rdfMeta = createXMPFromRDF();
                    // Append the newly created properties onto the original XMP object
                    // This will:
                    // a) Add ANY new TOP LEVEL properties in the source (rdfMeta) to the destination (meta)
                    // b) Replace any top level properties in the source with the matching properties from the destination
                    SXMPUtils::ApplyTemplate(&meta, rdfMeta, kXMPTemplate_AddNewProperties | kXMPTemplate_ReplaceExistingProperties | kXMPTemplate_IncludeInternalProperties);
                    // Check we can put the XMP packet back into the file
                    if(myFile.CanPutXMP(meta))
                        // If so then update the file with the modified XMP
                        myFile.PutXMP(meta);
                    // Close the SXMPFile.  This *must* be called.  The XMP is not
                    // actually written and the disk file is not closed until this call is made.
                    myFile.CloseFile();
                else
                    cout << "Unable to open " << filename << endl;
            catch(XMP_Error & e)
                cout << "ERROR: " << e.GetErrMsg() << endl;
            // Terminate the toolkit
            SXMPFiles::Terminate();
            SXMPMeta::Terminate();
        else
            cout << "Could not initialize SXMPFiles.";
            return -1;
        return 0;
    Please let me know if you run into any issues or need assistance.
    -Sunil
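For reference, createXMPFromRDF() in the sample above expects a sample.xml file containing an RDF serialization of XMP properties. A minimal sketch of such a file might look like this (the dc:creator property and its value are illustrative placeholders, not part of the original post):

```xml
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
  <rdf:Description rdf:about=""
      xmlns:dc="http://purl.org/dc/elements/1.1/">
    <!-- An ordered array property; ApplyTemplate would copy this
         top-level property into the file's existing XMP -->
    <dc:creator>
      <rdf:Seq>
        <rdf:li>Example Author</rdf:li>
      </rdf:Seq>
    </dc:creator>
  </rdf:Description>
</rdf:RDF>
```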

  • Annoying but not fatal bug uncovered in Aperture file size metadata.

    The bug described below was found as a result of testing the relative print sharpness of the Aperture print routines. I printed an adjusted Canon 5D image with Aperture's print routine to a 4X6, 300ppi print output, then took the same image through P'shop (as a PSD), downsized the file to the same dimensions, and converted it to 8-bit color depth. Then I printed the returned file through Aperture.
    To my surprise, when performing this comparison I found that the file size shown in the metadata panel of the test file was not reduced, even though I had significantly downsized the file (from an "outbound" 72.2GB to an "inbound" file of slightly less than 7GB). So I repeated the round trip to P'shop, using the OSX Inspector to examine the relevant .approject size at three points in the workflow:
    1) Before executing the "Open in External Editor" command;
    2) After the PSD version had been created by Aperture; and
    3) After re-sizing the file in P'Shop and saving it back to Aperture.
    The net of the test was as follows:
    1) The size of the base (pre-export) .approject increased by approximately 72.2GB when it created the 16-bit PSD file for export to P'shop - exactly what the metadata showed and what I expected for a 5D file at 16-bit color depth.
    2) When the resized file was returned from P'shop, the base .approject file was now only 7.5GB larger, indicating the size of the returned file was in fact reduced by the downsizing.
    So the problem is not creation of an abnormally large Aperture PSD file. The metadata file size is simply not updated when the downsized file is returned from P'shop.
    I submitted a bug report to Apple on this, but I'm not sure what priority it will get. So this is an FYI in case others have observed a similar phenomenon and been concerned about it.
    BTW: The file downsized via the P'Shop round trip was noticeably sharper on printing (from Aperture) than the print produced directly out of Aperture. So it appears the print output sharpening routines in Aperture still need considerable improvement. That's the improvement I'd rather have Apple spending their time on than an annoying but not fatal bug in file size metadata.
    Mike
    PowerMac G5 Dual 2.0GHz, Radeon X800XT Graphics Card; MacBookPro 2.0Ghz   Mac OS X (10.4.6)   iMac800 and Powerbook G4 also in household

    I was trying to imagine how you were managing to take more than a few photographs if your files were increasing the space taken by 72GB a pop.
    I was also imagining a camera the size of an aircraft carrier.

  • Reading and writing this file's metadata (XMP) has been disabled. Renders unplayable/corrupted files

    Hi everyone,
    I've been using Premiere Pro CC for a while now, as I have a YouTube channel, and I've rendered, exported and edited around a dozen different videos, and have had no issues until now. I haven't done anything to my settings, but randomly, I am now unable to export any videos. I've tried exporting multiple times, using different formats, and I still get the same error:
         File importer detected an inconsistency in the file structure of [FILE NAME]. Reading and writing this file’s metadata (XMP) has been disabled.
    I've looked around, and I've seen just about no help in fixing the issue, so I was hoping that someone could help me out.
    Some of my computer details are:
    Premiere Pro CC
    Toshiba Z10T
    Windows 8.1
    Hopefully that's enough to go by
    Thanks guys

    After moving LR3.5 and my catalogs from an XP system to a new computer running Windows 7, I suddenly had 6400 files with exclamation points - failure-to-sync-metadata or similar warnings on every file. If I thought I had one without the error, clicking on it made the error show up. It was very frustrating and I was wondering if I was going to have to keep my old computer forever.
    I searched for a couple of hours and found that there have been several discussions on this forum and others concerning failures to sync XMP sidecar files and failures to write metadata. Some of the postings had angry rants against Adobe. On my system it was not Adobe that should be blamed, it was Microsoft:
    On one of the Adobe threads, I noticed a post from someone with multiple external drives who said the problem did not show itself on one of the drives, and that changing the permissions on the other hard drives solved it. I tried this and it worked for me. I did not have this problem with this same drive on my XP system. I will try to find that other forum and thank the person who gave me the solution.

  • Reverse factoring for vendors

    Hi
    We are currently working on a solution for reverse factoring for vendors.
    The process is that when a vendor sends an invoice to us, it should be marked as part of reverse factoring.
    We invite the vendor to participate in this process, and they can choose to send all invoices through it or select only the ones they want.
    When we receive the invoice, it should be marked, and a CSV or XML file should be generated and sent to our bank.
    Our bank pays the invoice and takes the money directly from our account, the one we have agreed upon.
    The vendor receives the payment quickly from our bank.
    The bank sends us a file listing the invoices they have paid, and we need to match it so that the clearing happens automatically.
    So, the easy part is defining the layout of the file we send to the bank, but I don't know how to mark the incoming invoice so that we know which invoices to put into the file for the bank. Also, what information do we need to make the clearing at the end?
    Hope anyone knows this process and has implemented it before, and can provide me with some more information.
    Looking forward to hearing from you.
    Best regards
    Birgitte Hamborg Jakobsen
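The two file-handling steps described above (building the outbound file from flagged invoices, and matching the bank's paid-invoice file back for clearing) can be sketched roughly as follows. This is only an illustrative sketch, not SAP code: the Invoice fields, the semicolon-separated layout, and the id-plus-amount matching rule are all assumptions; a real implementation would use whatever marking field and bank file layout you agree on.

```cpp
#include <map>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical invoice record; field names are illustrative only.
struct Invoice {
    std::string id;        // vendor invoice number
    double amount;         // invoice amount
    bool reverseFactoring; // marked at invoice entry as part of reverse factoring
};

// Build one CSV line per invoice marked for reverse factoring.
// Only flagged invoices go into the file sent to the bank.
std::string buildBankFile(const std::vector<Invoice>& invoices) {
    std::ostringstream out;
    for (const auto& inv : invoices)
        if (inv.reverseFactoring)
            out << inv.id << ';' << inv.amount << '\n';
    return out.str();
}

// Match the bank's paid-invoice file back to the open invoices so the
// clearing can be posted automatically: here keyed on invoice id + amount.
std::vector<std::string> matchForClearing(const std::vector<Invoice>& open,
                                          const std::map<std::string, double>& paid) {
    std::vector<std::string> cleared;
    for (const auto& inv : open) {
        auto it = paid.find(inv.id);
        if (it != paid.end() && it->second == inv.amount)
            cleared.push_back(inv.id);
    }
    return cleared;
}
```

The design point is that the marking must happen when the invoice is entered (the reverseFactoring flag here), and the same key fields you write into the outbound file (invoice number and amount at minimum) are exactly what you need back from the bank to post the clearing.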

    Hi Joss,
    is your requirement related to the HCM module?
    If so, it should be posted in the link below:
    SAP ERP Human Capital Management (SAP ERP HCM)
    -santosh.

  • How to reset/delete autocomplete entries in file info metadata?

    I've got some autocomplete junk in some fields of the file info metadata - how do I delete it? Vista 64.
    Thanks.

    Yes, I agree, that was a vague description.
    I open the File Info window for a single image or group of images to add location information. Once I've added, say, a state such as New Jersey in the state/province field, the next time I start to type "N" in that field the New Jersey autocomplete pops up. This is fine. The problem is when I've added something wrong, such as Pensvylania (misspelled), and then I have that misspelled autocomplete popping up whenever I start to type "P". I also have some fragments, such as "Ne", where the autocomplete popped up and I just tabbed, rather than entered, and now I have "Ne" as an autocomplete choice.
    There must be a file somewhere that stores those autocomplete possibilities. I wanted to find that file and delete the bad ones. I could not see how to delete that choice when I was working directly in the File Info panes.
    Does this describe the problem a bit better?
    Thanks.

  • Placing DNG files for Mockup Design in Indesign

    Hi there
    I am a magazine designer and I get 400-500 photo selections from my photographer. We have finally upgraded to CS5 (yes yes, took our time). In CS2 InDesign I used to be able to place the DNG files for mockups of the pages, but now in CS5 it says: "Cannot place this file. No filter found for requested operation." Is there a filter available to do this, or do I need to convert all of the DNG files to JPEG to use in the mockups?

    The ability to place Camera Raw images directly into InDesign has been a much-requested but exceptionally hotly-debated feature request ever since Photoshop started supporting Camera Raw. DNG is actually just a special, TIFF-like, standardized version of a raw image format: an attempt to provide a universal container for the common features of the various camera vendors' "raw" formats, plus a common method of storing the vast amount of vendor- and model-dependent "secret sauce" needed to decode the raw image data into more industry-standard TIFF, JPEG, etc. images.
    In many respects, a raw image is the digital equivalent of analog photography's negative, albeit not reversed in tonality and/or color. A significant amount of processing is required to obtain a usable image from raw image data. When shooting in JPEG mode with a digital camera, this processing is done in-camera and includes significant noise reduction, sharpening, color adjustment, and brightness / contrast adjustments, depending upon the camera's capabilities. All that processing is postponed to Photoshop's Camera Raw feature (or the vendor's own host computer-resident conversion software) when you elect to get raw-only images out of your digital cameras. A very large number of professional and amateur photographers who shoot raw mode always process their own images to TIFF or JPEG before submitting them for any editorial review or layout. Trying to directly place the DNG file's raw image data, even for mockups, is like trying to do mockups using the negatives in the old days. Exactly what default settings would you use for these images, and how meaningful would the visual representation of those default settings be?
    I have been involved with InDesign ever since the first prereleases of InDesign 1.0 (not CS1) and I don't recall any support ever being given for direct placement of DNG or any other raw image format. I don't even know of any third-party InDesign plug-ins that ever did that! InDesign itself was never distributed with the libraries and data used for such conversions.
    However, many cameras have a mode in which both a raw image and a camera-created JPEG file are written to the camera's memory card. Perhaps you had some directories of images that had these dual image files. (Also, there are indeed only a handful of very specialized programs that produce DNG files directly. Most professional camera manufacturers have elected to continue to output proprietary "secret sauce" image formats. To get DNG files from such formats, you need to run a batch process to convert them, for example with Adobe's own DNG Converter, which can convert entire directories of raw image files to DNG.)
              - Dov
