Load HFM Metadata using VB 2010

Hi,
I am trying to load HFM metadata from a .APP file using VB 2010, and I am getting this error. Error: <?xml version="1.0"?>
<EStr><Ref>{26DB5B4E-982D-4416-A623-4FEC83C03847}</Ref><AppName>TCHFM</AppName><User/><DBUpdate>1</DBUpdate><ESec><Num>-2147024809</Num><Type>0</Type><DTime>6/28/2012 11:26:39 AM</DTime><Svr>FCS-P2R-006-D</Svr><File>CHsvMetadata.cpp</File><Line>998</Line><Ver>11.1.2.1.102.3324</Ver></ESec></EStr>
Number: 5
If anyone who has done this can provide some help in resolving it, that would be a great help.
Thanks
Sagar
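
A note on decoding the error: the <Num> value in that error string is a standard Windows HRESULT, and "Number: 5" is likely the matching VB runtime error 5 ("Invalid procedure call or argument"). A minimal VB 2010 sketch of the decoding (this shows what the code means, not what caused it):

' Decode the HRESULT reported in the <Num> element of the HFM error string.
Dim hr As Integer = -2147024809
Console.WriteLine("0x{0:X8}", hr)   ' prints 0x80070057, i.e. E_INVALIDARG
' The low word is the Win32 error code 87: "The parameter is incorrect".
Console.WriteLine(New System.ComponentModel.Win32Exception(hr And &HFFFF).Message)

In other words, HFM rejected one of the arguments passed to the load call, so the file path and the load-options argument you pass in are the first things to check.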

Thanks for the feedback. Appreciate it.
Yes, you are correct. Dimensions are different in HFM and Planning. I understand why we need different targets, but my concern now is: since Option B is correct, let's say I am extracting the Net Income rollup (under the Account dimension) from EBS into the Shared Library. Will it automatically update Net Income in both HFM and Planning? (Note: the Net Income rollup exists in both HFM and Planning.) Or will I have to manually drag and drop Net Income from the Shared Library to the HFM and Planning applications?
In other words, does it update the Shared Library and the applications at the same time?
Thanks again.

Similar Messages

  • Suggestions to create HFM Metadata using excel template

    Hi, I'm new to HFM and in the process of creating the metadata for a sample HFM application for learning purposes. I need some sample Excel templates for HFM metadata creation. I would appreciate your help on this. Thanks.

    Hi Masoud,
    Could you please share those Excel files with me at [email protected]? They might help me in creating an HFM app. Can you also send me the process, information, and tools used for the Enterprise-to-HFM 9.3.1 migration?
    I appreciate your help in this regard.
    Thanks,
    PR

  • Loading Metadata using API or VB scripting

    Hi,
    We are exploring options for loading metadata into a classic HFM application. Is there any way to load the metadata using an HFM API or script?
    Please help!
    Regards
    Vkunda

    Review the HsvMetadata object, which is documented at http://docs.oracle.com/cd/E17236_01/epm.1112/hfm_objects.pdf
    There is a load method for Classic applications.
    Regards,
    John A. Booth
    http://www.metavero.com
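
    As a minimal illustration, here is a late-bound VB 2010 sketch of a classic metadata load built around the HsvMetadataLoadACV object described in that PDF. The server, application, credentials, and file paths are placeholders, and the exact member names and call pattern should be verified against the object model documentation for your release:

    ' Late binding against the HFM COM objects; requires Option Strict Off.
    ' NOTE: verify member names and signatures against hfm_objects.pdf.
    Dim client As Object = CreateObject("Hyperion.HsxClient")
    Dim hfmServer As Object = Nothing
    Dim hfmSession As Object = Nothing
    client.SetLogonInfoSSO("", "admin", "", "password")   ' domain, user, SSO token, password
    client.OpenApplication("HFMCLUSTER", "Financial Management", "TCHFM", hfmServer, hfmSession)
    Dim loader As Object = CreateObject("Hyperion.HsvMetadataLoadACV")
    loader.SetSession(hfmSession)
    loader.FileName = "C:\loads\TCHFM.app"      ' metadata file to load
    loader.LogFileName = "C:\loads\TCHFM.log"   ' load log written by HFM
    loader.Load()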

  • Query regarding Metadata load in Essbase using rule file

    Hi
    In our application, we load metadata using an Essbase rule file, and we are facing one problem. Suppose we have to load a level 0 member called "xxxx" under two different parents; assume the parent members are Parent_AAAA and Parent_BBBB. Our requirement is that the consolidation operator for the first occurrence (i.e., under Parent_AAAA) is "+", and for the shared member under Parent_BBBB it is "-". But when we load using the rule file, the consolidation operator for the shared member is always loaded as "+". Is there any trick to change the consolidation operator for a shared member other than changing it manually in the outline?
    Please help. Thanks in advance.

    I don't think there is a way to change the default setting.
    However, what you can do is have a separate load rule and a separate text file containing only the metadata that needs the shared-member exception. That way you can achieve it without having to change the existing rule file; see the sketch below.
    Regards
    ORACLE | Essbase
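
    For illustration, a sketch of that exception approach (the member names come from the question above; the column order is an assumption to adapt to your own rule file). The separate text file would contain just the shared member with its consolidation property:

    Parent_BBBB,xxxx,-

    In the separate rule file, the third field would be mapped to the "Property" field type for the dimension, and "Allow property changes" would need to be enabled in the dimension build settings so the "-" operator is applied to the shared occurrence.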

  • Saving a document using Word 2010 into a managed metadata hierarchy

    Hi
    Can I save a document using Word 2010 into a document library that has a managed metadata hierarchy, and have the document appear in the correct place in the hierarchy?
    Ideally the Word 2010 "Save As" browser would show the managed metadata hierarchy; the user would select a node in the hierarchy, and the document would be saved with the appropriate metadata to place it in the correct position in the hierarchy.
    Thanks
    Nigel
    Nigel Price NJPEnterprises

    Hi Nigel,
    I think this is a normal difference between the "Open" and "Save As" options in MS Office applications, by design.
    Whenever we create a document from a SharePoint library and click "Save As", there is no "Arrange by" part in the dialog.
    When we open the Office Word application and open a library location from a SharePoint site, if the "Configure Navigation Hierarchies" option is disabled for that particular library, there will be no "Arrange by" part in the "Open" dialog either.
    It seems SharePoint doesn't provide the same function for "Save As" as for "Open". If we want to save a document to the needed location in that library through the "Save As" dialog, we can add the proper metadata column value (e.g., a managed metadata column, if there is one) in the "Document Information Panel", and/or navigate to the proper library folder location.
    Thanks
    Daniel Yang
    TechNet Community Support

  • Re-load the metadata back into HFM

    Hi
    I extracted the metadata out of the HFM client; however, I am unable to re-load it back into HFM.
    Thanks in advance.

    Why are you unable to reload the metadata?
    What are the exact steps you are following?
    Answering these questions will help you get helped...

  • Exception message, failed to get value, could not load managed metadata, invalid field name

    Hi, I have created some site collection columns with managed metadata and taxonomy term sets. I have then created some site content types from those site columns. Some of them function properly and some don't.
    When I create or upload a document to the document library, I start to "tag" the document by first choosing which content type I want to use, but when it comes to saving the document, it renders an error message (this is not the full content of the message):
    "exception message, failed to get value, could not load managed metadata, invalid field name"
    I have created some other site content types before with the same site columns, and they do not generate an error message. Is there a solution for my dilemma?

    try these links:
    https://prashobjp.wordpress.com/2014/02/14/failed-to-get-value-of-the-column-name-column-from-the-managed-metadata-field-type-control-see-details-in-log-exception-message-invalid-field-name-guid-site-url/
    http://www.sharepointconfig.com/2011/03/issues-provisioning-sharepoint-2010-managed-metadata-fields/
    http://blog.goobol.com/category/sharepoint/sharepoint-issue-troubleshooting/
    http://www.instantquick.com/index.php/correctly-provisioning-managed-metadata-columns?c=elumenotion-blog-archive/random-whatnot
    https://pholpar.wordpress.com/2010/03/03/taxonomy-issues/

  • NI 5660 Driver DLL Errors when using Teststand 2010 and LabVIEW Run-Time Engine 2010

    This problem seems similar to the post "Resource not found error in executable on development machine," but I didn't want to repost under that thread because I only happened upon it by chance and none of my searches brought me there... so I made a more descriptive subject.
    I am working on a system that uses a PXI chassis with an NI 5600 downconverter and an NI 5620 high-speed digitizer, among other PXI cards.
    I inherited working code written in LabVIEW 2010, running with the LabVIEW Run-Time Engine 2010. The code was using a custom executive, and my task was to rewrite the test using TestStand 2010. I reused the majority of the old code. The old code used NI-5660 to control the 5600 and 5620. When I run my sequence using the LV Development System and TestStand, it runs without any issues. When I change the adapter over to LabVIEW Run-Time Engine 2010, all of my NI 5660 VIs become broken due to DLL issues. It warns that nipxi5600u.dll was not initialized correctly. Many of the errors are associated with NI Tuner and NI Scope. After this, LabVIEW will crash randomly, and the sequence will not work in TestStand even when back with the LV Development Adapter. The only way to recover after this is to restart the computer - TestStand automatically reverts back to the development system, the VIs are no longer broken, and the sequence works again.
    I have all of my VIs associated with a project. After reading a little bit about DLLs and TestStand, I found all of the DLLs in the dependencies section of my project and added them to my TestStand workspace. I also used Dependency Walker to track down the problems with nipxi5600u.dll; the 2 DLL files that it said were not found already existed in the same folder as the original DLL (C:\Windows\System32). I have also performed a mass compile to make sure everything was running in LV 2010. If I skip the steps involving the 5660, my entire sequence runs fine.
    The previous code was running with the LabVIEW Run-Time Engine without any issues.  Is there just a step I am missing?  Has anyone seen anything like this before?  I can send screenshots of errors to provide more detail if necessary. 

    I have tried some more things and still can't get it to work.  I have added the VIs mentioned in the Notes On Creating Modulation Executables KB both to the TestStand workspace and the LabVIEW project holding all of my VIs.  This did not change the results. 
    When I try to run my sequence, The first error I get is shown in Error 1445.bmp.  This happens when I try to use the NI 5660 initialize.vi.  If I click ignore, the next error I see is shown in Error -20551.bmp.  When I try to open the VI to look at it, I get the 2 DLL errors shown in Error loading nipxi5600u.bmp and Error loading nidaq32.bmp.  When I close TestStand, I get the error LabVIEW Fatal Error.bmp. 
    Attachments:
    Error1445.JPG (164 KB)
    Error -20551.JPG (174 KB)
    Error loading nipxi5600u.JPG (9 KB)

  • Workspace Error Running Financial Reports Post HFM Metadata Change

    We recently updated the metadata in HFM to use the Custom4 dimension. We are now trying to run reports from Workspace and get an error 5200 - error executing query. I have confirmed that the database connection in Workspace is OK. I am able to run the reports from Financial Reporting Studio. Any ideas what's wrong when running from Workspace?

    That error means that the report is trying to retrieve data from a member that either does not exist or is not valid. I have seen it happen when the user POV chooses [Year] for period.
    I would start by trying to get the report to run from Report Studio where you can control the POV settings. If you still get the error, you will need to go through every data point in the report and verify that each member exists.

  • I am attempting to print a number of DL size envelopes using Word 2010 and Excel 2010 on an HP Officejet 6500A Plus

    I am attempting to print a number of DL size envelopes using Word 2010 and Excel 2010 on an HP Officejet 6500A Plus printer under Windows 7. Using the mail merge facility, the printer reports a paper mismatch. Word does not appear to have envelope DL size in its list of paper sizes, only something called Envelope #10. When I run out of stationery, the print driver software asks me to load more stationery and hit the OK button.
    Questions
    1. How do I add envelope DL size to Word 2010?
    2. Where is the Officejet 6500A Plus OK button?
    3. Has this problem been experienced by others?
    Regards

    I am having the same problem. Does anyone have an answer? There are many people having the same problem.

  • Building a portal using SharePoint 2010

    Hi,
    We are trying to build a portal using SharePoint 2010. 
    The primary requirement for the portal is to allow users to upload, view, and manage different types of documents (xls, doc, ppt, vsd).
    The uploaded documents need to be mapped to the different categories as per their classification. 
    For example, the documents fall into 5 main categories: A, B, C, D, E. These categories are further divided into sub-categories and so on for two more levels.
    Example, Main categories: A, B, C, D, E
    Sub categories for A: 1, 2, 3, 4, 5 and so on
    So, if one clicks on node A and then click on node 4, then all related documents will be displayed. (A.4)
    From the front end point of view, the user should be able to access these documents by just clicking on the nodes representing different categories.
    There are some more functionalities to be built for this portal - search, versioning, etc.
    Can someone please provide the high-level approach for building this portal and some reference material from a features/architecture/technical-requirements point of view?
    This would be of great help!
    Thanks!

    Sure. Most of what you're describing is around the topic of document storage.
    As you may be aware, the object that documents are stored in within SharePoint is a document library. You can have sub-folders within that library, but they're to be avoided if possible and shouldn't be a default choice.
    To classify your documents there are several routes you can take. The least complicated is to use a single Content Type with a single Managed Metadata column. This column would use a Managed Metadata term taken from a term set with the structure A (1, 2, 3, 4, 5), B (x, y, z), etc.
    That will allow your users to tag the document on upload (or modify it later) with a term such as A:1 or B:52.
    To browse only the relevant items you can use the Managed Navigation tools in the document library. This allows you to show a term set and filter at various levels, for example all A terms (i.e. A:1, A:2, etc.), or drill down to A:1 specifically.
    Reading list:
    Intro to Managed Metadata:
    http://office.microsoft.com/en-gb/office365-sharepoint-online-enterprise-help/introduction-to-managed-metadata-HA102832521.aspx
    Metadata navigation for a list or library:
    http://office.microsoft.com/en-gb/sharepoint-server-help/configure-metadata-navigation-for-a-list-or-library-HA101820113.aspx
    Optional Extras:
    Intro to content types:
    http://msdn.microsoft.com/en-us/library/ms472236%28v=office.14%29.aspx
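
    To make the term set structure concrete, here is a minimal server-side VB 2010 sketch using the SharePoint 2010 taxonomy object model. The site URL, group name, and term set name are placeholders; it provisions the A (1-5) hierarchy described above:

    Imports Microsoft.SharePoint
    Imports Microsoft.SharePoint.Taxonomy

    Module ProvisionTerms
        Sub Main()
            ' Placeholder site URL; requires rights on the Managed Metadata Service.
            Using site As New SPSite("http://portal")
                Dim session As New TaxonomySession(site)
                Dim store As TermStore = session.TermStores(0)      ' default term store
                Dim termGroup As Group = store.CreateGroup("Portal")
                Dim termSet As TermSet = termGroup.CreateTermSet("Categories")
                Dim catA As Term = termSet.CreateTerm("A", 1033)    ' top-level category
                For Each child In New String() {"1", "2", "3", "4", "5"}
                    catA.CreateTerm(child, 1033)                    ' sub-categories A:1..A:5
                Next
                store.CommitAll()                                   ' persist the changes
            End Using
        End Sub
    End Module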

  • Graphics using graphic API are drawn below the loaded content created using Flash authoring tool

    Here is my problem.
    I am developing an Analog Dial component. I am extending UIComponent and loading a SWF file generated using the Flash CS3 authoring tool (the SWF basically has the circle and needle), and adding the loaded content as a child of the Dial class.
    Next, I use the Graphics API to draw the major and minor ticks on the dial.
    As mentioned in the documentation of the curveTo method of the flash.display.Graphics class (http://livedocs.adobe.com/flex/201/langref/index.html), if you are using the Graphics API and also loading content created in the Flash authoring environment, the vector graphics will be drawn underneath the loaded content.
    Well, is there any way to make the graphics appear on top of the loaded content?

    It appears that this may be accomplished more easily if I do something similar to the StrobeMediaPlayback implementation. Looking at the StrobeMediaPlayback source code it looks like Adobe has done something a little different than their ControlBarPlugin, placing the controlbar and root media element inside separate MediaContainers and then adding those containers to the display list. Is this recommended over using the frameworks ParallelElements? If so, is communication between the control bar and root media element still a matter of just updating the target reference via metadata?

  • TSV_TNEW_PAGE_ALLOC_FAILED error while loading the DATA using DTP

    Hi,
    While loading the data using DTP for 2 DSOs, we are getting the error
    TSV_TNEW_PAGE_ALLOC_FAILED.
    Can anyone kindly help me out regarding the same?
    Thank You,
    Poornima.

    Hi Soundarya,
    Thanks a lot for the reply. But I found that it runs fine in development, whereas in quality it throws an error. This happened for two DSOs. In both transformations I have identified that the transformation names are different between development and quality.
    There are no routines written for them, and no SELECT statements have been used.
    Can you please advise me regarding the same?

  • Can't load IDoc Metadata!

    Hi all,
    I don't know why, but I can't see the IDoc metadata in IDX2 on my Integration Server.
    I have already defined the RFC connection between these two systems (the user used to connect has the required authorizations). The port and the partner profile are created (WE20 and WE21 on the backend side).
    When I try to "load meta data" manually (XI Integration Server, transaction IDX2) with IDoc type ZSX_P1100 and the source port (the port of the backend), I receive the message "Basic type ZSX_P1100 does not exist"!? But this type exists on the backend side... I have already checked! Why can't I load it?
    Does anybody know what's happening?
    Thanks in advance,
    Ricardo.

    Hi Ricardo -
    >>> "I need to release the IDoc segments before the transfer of metadata to my Integration Server (XI -> IDX2)?"
    It's definitely good practice when you set up a new port in IDX1 to see if you can manually load the metadata from IDX2, but it's not a requirement.  At runtime, if the metadata isn't there, it will go and get it and load it into the cache.
    Regards,
    Jin

  • Purpose of extracting the metadata after loading the metadata

    Hi Hyperion experts,
    After loading the metadata from a file, what is the reason to go for extracting, and where would we use the extract?

    I keep the XML files around after every load for a variety of reasons :
    - Loading to other applications : I'll generally make changes in dev / test and move to production. I guess I could probably get LCM to do the moving around; however, we're not big users of it at this point. I just load the file into the next app. It's pretty easy to open the client and hit load metadata. ;)
    - Historical Records : Since I keep a copy of every 'major' load into the system, I can easily 'roll back' or tell you when a change was made as far back as I want.
    - SOX Audits : Part of our controls is to ensure that changes loaded to production are approved through our documentation process. Having the files allows them to confirm what changes were made when.
    - Point in Time comparisons : Pretty much just like the SOX Audit, I will periodically review two points in time to see what has changed. For instance, every year when I roll out our budget Smartview template, I compare the current metadata to the prior year's to do a quick check of which accounts have changed. Depending on what has changed during the last year, I may need to update my template accordingly, etc. [I use an XML differencing tool to do the comparisons]
    - Insurance : ' I'm not paranoid, I just know everyone is after me ' While I keep backups of my databases, etc, etc, I like having the file versions 'just in case'. If any of my other tools malfunction or I lose a backup, I can always grab my trusty files and reload.
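
    As a trivial illustration of the "keep every load file" habit (the paths are placeholders), archiving the file with a timestamp right after each load is a couple of lines in VB 2010:

    ' Copy the metadata load file to a dated archive for audit/rollback purposes.
    Dim source As String = "C:\loads\TCHFM.app"
    Dim archive As String = String.Format("C:\archive\TCHFM_{0:yyyyMMdd_HHmmss}.app", Date.Now)
    System.IO.File.Copy(source, archive)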
