Planning 11.1.2.4 Metadata load errors RemoteException occurred in server thread

Hi,
I am trying to load metadata using ODI 11.1.1.7 into Hyperion Planning 11.1.2.4. It errored out saying
Cannot load dimension member, error message is: RemoteException occurred in server thread; nested exception is:
    java.rmi.UnmarshalException: unrecognized method hash: method not supported by remote object.
Source - File
Please let me know if there is any fix to this.
Thanks,
Sravan

Did you definitely follow the steps in "How to Apply ODI Patch 18687916 for Hyperion Planning and Errors that May Occur if Patch Has Not Been Applied Correctly (Doc ID 1683307.1)"
If you are still getting the error after everything in the support doc has been done, then maybe the issue relates to 11.1.2.4.
It is interesting to see that the Oracle statement of direction doc has the following statement:
"The KM's for EPM release 11.1.2.3 and ODI release 11.1.1.7 are not certified with EPM Release 11.1.2.4."
Cheers
John
http://john-goodwin.blogspot.com/

Similar Messages

  • Error Msg: RemoteException occurred in server thread; nested exception is:

    Hi Everyone,
    I got the above msg and does anyone have any idea what could be the root of the problem? Attached below is an extract from the server log file.
    Thanks,
    Kelvin
    Server Log
    [#|2005-09-28T23:28:59.836+0800|INFO|sun-appserver-pe8.1_01|javax.enterprise.system.container.ejb|_ThreadID=21;|EJB5018: An exception was thrown during an ejb invocation on [SupplierTableFacadeBean]|#]
    [#|2005-09-28T23:28:59.836+0800|INFO|sun-appserver-pe8.1_01|javax.enterprise.system.container.ejb|_ThreadID=21;|
    javax.ejb.EJBException
         at com.sun.ejb.containers.BaseContainer.processSystemException(BaseContainer.java:2807)
         at com.sun.ejb.containers.BaseContainer.completeNewTx(BaseContainer.java:2713)
         at com.sun.ejb.containers.BaseContainer.postInvokeTx(BaseContainer.java:2521)
         at com.sun.ejb.containers.BaseContainer.postInvoke(BaseContainer.java:819)
         at com.sun.ejb.containers.EJBObjectInvocationHandler.invoke(EJBObjectInvocationHandler.java:137)
         at $Proxy89.createSupplier(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at com.sun.corba.ee.impl.presentation.rmi.StubInvocationHandlerImpl.invoke(StubInvocationHandlerImpl.java:167)
         at com.sun.corba.ee.impl.presentation.rmi.bcel.BCELStubBase.invoke(Unknown Source)
         at supplier._SupplierTableFacadeRemote_DynamicStub.createSupplier(_SupplierTableFacadeRemote_DynamicStub.java)
         at org.apache.jsp.process.supplier.addsupplier_005fprocess_jsp._jspService(addsupplier_005fprocess_jsp.java:141)
         at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:105)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:860)
         at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:336)
         at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:301)
         at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:251)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:860)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at org.apache.catalina.security.SecurityUtil$1.run(SecurityUtil.java:249)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAsPrivileged(Subject.java:500)
         at org.apache.catalina.security.SecurityUtil.execute(SecurityUtil.java:282)
         at org.apache.catalina.security.SecurityUtil.doAsPrivilege(SecurityUtil.java:165)
         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:257)
         at org.apache.catalina.core.ApplicationFilterChain.access$000(ApplicationFilterChain.java:55)
         at org.apache.catalina.core.ApplicationFilterChain$1.run(ApplicationFilterChain.java:161)
         at java.security.AccessController.doPrivileged(Native Method)
         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:157)
         at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:263)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:551)
         at org.apache.catalina.core.StandardContextValve.invokeInternal(StandardContextValve.java:225)
         at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:173)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:551)
         at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:161)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:551)
         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:132)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:551)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:933)
         at org.apache.coyote.tomcat5.CoyoteAdapter.service(CoyoteAdapter.java:184)
         at com.sun.enterprise.web.connector.grizzly.ProcessorTask.process(ProcessorTask.java:653)
         at com.sun.enterprise.web.connector.grizzly.ProcessorTask.process(ProcessorTask.java:534)
         at com.sun.enterprise.web.connector.grizzly.ProcessorTask.doTask(ProcessorTask.java:403)
         at com.sun.enterprise.web.connector.grizzly.WorkerThread.run(WorkerThread.java:55)
    Caused by: java.lang.NullPointerException
         at supplier.SupplierTableFacadeBean.createSupplier(SupplierTableFacadeBean.java:89)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at com.sun.enterprise.security.SecurityUtil$2.run(SecurityUtil.java:153)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sun.enterprise.security.application.EJBSecurityManager.doAsPrivileged(EJBSecurityManager.java:950)
         at com.sun.enterprise.security.SecurityUtil.invoke(SecurityUtil.java:158)
         at com.sun.ejb.containers.EJBObjectInvocationHandler.invoke(EJBObjectInvocationHandler.java:128)
         ... 44 more
    |#]
    -----------------------------------------------------------------------------------------------------------------------------------------------------

    I have attached my createSupplier method from the session bean. I hope it helps in finding the problem.
    Regards,
    Kelvin
    public String createSupplier(String supid, String password, String name, String address, String contactno, String faxno, String email, String kcpersonnal, String kcpdesignation, String cpcno, String industry, String natureofbiz, String paidupcapital, String preferredcontract, Integer creditrating, String others) throws FinderException, UserException {
        SupplierTableLocal supplierTable = null;
        String key;
        boolean ok;
        // Checking for the minimum creation values.
        if (supid.equals("") || password.equals("") || name.equals("") || address.equals("") || contactno.equals("") || faxno.equals("") || email.equals("") || kcpersonnal.equals("") || kcpdesignation.equals("") || cpcno.equals("") || industry.equals("") || natureofbiz.equals("") || paidupcapital.equals("")) {
            throw new UserException("Please enter the userid, password, name, address, contactno, faxno, email, kcpersonnel, kcpdesignation, cpcno, industry, natureofbiz, paidupcapital.");
        // Checks if supplierid is unique.
        } else {
            Collection list = supplierHome.findBySupid(supid);
            if (list.size() != 0) {
                throw new UserException("SupplierID already exists.");
            // If SupplierID chosen is unique
            } else {
                do {
                    key = supid;
                    // Check if the Primary Key has already been taken
                    try {
                        supplierTable = supplierHome.findByPrimaryKey(key);
                        ok = true;
                    } catch (javax.ejb.FinderException ex) {
                        ok = false;
                    }
                } while (ok);
                try {
                    supplierTable = supplierHome.create(supid, password, name, address, contactno, faxno, email, kcpersonnal, kcpdesignation, cpcno, industry, natureofbiz, paidupcapital, preferredcontract, creditrating, others);
                } catch (javax.ejb.CreateException e) {
                    return null;
                }
                return key;
            }
        }
    }
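    A hedged reading of the stack trace: the NullPointerException is raised inside createSupplier itself (SupplierTableFacadeBean.java:89), which is consistent with either supplierHome never being initialized before the finder call, or one of the String arguments (e.g. supid) arriving as null from the JSP, since supid.equals("") throws exactly this when supid is null. Below is a minimal sketch of the home lookup that is often missing from such a bean; the home interface name and JNDI name are assumptions, not taken from the post:
    private SupplierTableLocalHome supplierHome;

    public void ejbCreate() throws javax.ejb.CreateException {
        try {
            javax.naming.Context ctx = new javax.naming.InitialContext();
            // JNDI name is an assumption; use the ejb-local-ref declared
            // in the bean's deployment descriptor.
            supplierHome = (SupplierTableLocalHome)
                    ctx.lookup("java:comp/env/ejb/SupplierTableLocal");
        } catch (javax.naming.NamingException e) {
            throw new javax.ejb.CreateException("Unable to look up SupplierTableLocalHome: " + e.getMessage());
        }
    }
    If supplierHome is already initialized elsewhere, check which expression actually sits on line 89 of the bean and which of its operands can be null.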

  • Just trying out my new mac but email won't load, error says "The mail server denied access to the account because an administrator or other mail client was using it when Mail tried to log in. Try again later." A little lock is beside the email inbox

    Just trying out my new mac but email won't load, error says "The mail server denied access to the account because an administrator or other mail client was using it when Mail tried to log in. Try again later." A little lock is beside the email inbox account, no password prompted and account is online and enabled... Thoughts?

    Have you tried clicking on the lock to see if it will then ask for a password? Otherwise, try rebooting.

  • RemoteException: 111 java.rmi.ServerError: Error occurred in server thread

    Hi,
    I'm new to the RMI side of Java. I'm trying to write a program that uses the math.jar file on the server side, so that the client end can invoke the methods in math.jar and get the value back.
    I got the error messages as below:
    RemoteException: 111 java.rmi.ServerError: Error occurred in server thread; nested exception is:
         java.lang.NoClassDefFoundError: org/mathwhizz/Heron
    It works fine if I use the compiled classes (*.class) directly instead of the jar; the failure only happens when I try to use the jar file, so I guess my setup and configuration are otherwise OK. Is there anything I should be careful about when running RMI with a jar file on the server side?
    Do I need to use JNLP and a Web Start application for this?
    I'd appreciate your responses... thanks...
    Sincerely,
    Brandon

    I've got a problem with RMI; when I launch my server, I get this error:
    Remote error: java.rmi.ServerError: Error occurred in server thread; nested exception is:
    java.lang.NoClassDefFoundError: com/borland/dx/dataset/DataSetData
    Help me please
    Thank you
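    In both cases, java.lang.NoClassDefFoundError on the server side usually just means the server JVM cannot see the class at runtime: the jar is not on the classpath of the process that runs the remote object (or of rmiregistry), or it is not advertised through the RMI codebase. A minimal sketch under those assumptions (class name, jar location, and launch command are illustrative, not from the posts):
    // Launch with the jar on the classpath (';' on Windows, ':' on Unix):
    //   java -cp .;math.jar MathServer
    public class MathServer {
        public static void main(String[] args) throws Exception {
            // If the registry or remote clients must download classes from this
            // server, advertise the jar before exporting/binding any remote object.
            System.setProperty("java.rmi.server.codebase", "file:/C:/libs/math.jar");
            // ... create the registry and bind the remote object as before ...
        }
    }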

  • An Unknown Error Has Occurred - Project Server 2010 - Tasks Page

    Greetings! Some of my resources are receiving the error "An Unknown Error has occurred" (in red) when trying to access their Tasks page in Project Server 2010.
    I've researched this error quite a bit and am finding suggestions and answers pretty much all over the board. Before I have my server people start installing a slew of hotfixes, I want to know if anyone has an idea of the underlying cause of this error. Since this issue doesn't affect everyone (i.e. I can access my Tasks page), is it possibly related to the actual schedule in which they have assignments?
    Any help is appreciated.
    Thanks!
    ~Randy 

    I have found the solution for this issue. You need to do this:
    In the logs you must search for the "exception"; the log gives you an ID. This ID identifies the task assignment for the specific resource, so that assignment is corrupt.
    You need to match this ID in the Project Server database in SQL Server: join the task ID with the project ID and the resource ID. Once you have identified the project and the resource, you can
    DELETE this assignment from the project in Project Professional, because the task has a corrupt assignment. You must delete it
    ONLY FROM Project Professional.
    This solution fixes this specific problem for this assignment, which
    IS DIFFERENT FROM THE SOLUTION in this post by Brian Smith:
    http://blogs.msdn.com/b/brismith/archive/2010/07/02/project-server-2010-an-unknown-error-has-occurred-in-project-center-resource-center-or-tasks.aspx?Redirected=true
    Erick Gutiérrez PMI Membership #ID 2089740 MTCS - Microsoft Project Server Managing Projects

  • MetaData Load error in 11.1.1.3

    Hi,
    I am trying to load metadata into a 11.1.1.3 Planning application using the outline load utility and encountered the below error -
    D:\Hyperion\products\Planning\bin>OutlineLoad /A:UMB01 /U:admin /N /I:d:\ENT_MD.csv /D:Entity
    /L:ENT_MD.log /X:ENT_MD.err
    Enter password:
    [INFO] RegistryLogger - REGISTRY LOG INITIALIZED
    [INFO] RegistryLogger - REGISTRY LOG INITIALIZED
    D:\Hyperion\common\config\9.5.0.0\product\planning\9.5.0.0\planning_1.xml
    displayName = Planning
    componentTypes =
    priority = 50
    version = 9.5.0.0
    build = 1
    location = D:\Hyperion\products\Planning
    taskSequence =
    task =
    using Java property for Hyperion Home D:\Hyperion
    Setting Arbor path to: D:\Hyperion\common\EssbaseRTC\9.5.0.0
    d{ISO8601} INFO main com.hyperion.audit.client.runtime.AuditRuntime - Audit Client has been created for the server http://<server>:28080/interop/Audit
    [Wed Oct 14 13:45:24 EDT 2009]Unable to obtain dimension information and/or perform a data load : Unrecognized column header value(s), refer to previous messages. (Note: column header values are case sensitive.)
    d{ISO8601} INFO pinging com.hyperion.audit.client.cache.AuditConfigFilter - Client Enable Status false
    d{ISO8601} INFO Thread-16 com.hyperion.audit.client.cache.AuditConfigFilter - Client Enable Status false
    d{ISO8601} INFO filterConfig com.hyperion.audit.client.cache.AuditConfigFilter - Client Enable Status false
    In the first row of the Excel sheet, which is a CSV file, I have the header names, viz.
    Entity, Parent, Data Storage, Aggregation, in columns A, B, C, D respectively. I don't understand why it throws "Unrecognized column header values".
    Please clarify.
    Thanks

    Hi,
    Here is an example of how I loaded one record to the entity dimension
    entityload.csv =
    Parent,Entity,Alias: Default,Data Storage,Aggregation (Consol)
    Entity,Alliance,Joint Venture Alliance,Store,~
    command line =
    OutlineLoad -f:passfile.txt /A:plansamp /U:admin /I:entityload.csv /D:Entity /L:outlineLoad.log /X:outlineLoad.exc /N
    outlineload.log =
    [Thu Oct 22 16:45:11 BST 2009]Successfully located and opened input file "E:\Hyperion\products\Planning\bin\entityload.csv".
    [Thu Oct 22 16:45:11 BST 2009]Header record fields: Parent, Entity, Alias: Default, Data Storage, Aggregation (Consol)
    [Thu Oct 22 16:45:11 BST 2009]Located and using "Entity" dimension for loading data in "PLANSAMP" application.
    [Thu Oct 22 16:45:11 BST 2009]Load dimension "Entity" has been unlocked successfully.
    [Thu Oct 22 16:45:11 BST 2009]A cube refresh operation will not be performed.
    [Thu Oct 22 16:45:11 BST 2009]Create security filters operation will not be performed.
    [Thu Oct 22 16:45:11 BST 2009]Examine the Essbase log files for status if Essbase data was loaded.
    [Thu Oct 22 16:45:11 BST 2009]Planning Outline load process finished (with no data load as specified (/N)). 1 data record was read, 1 data record was processed, 1 was accepted, 0 were rejected.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Metadata load error

    HI,
    When I was trying to update metadata using OutlineLoad, it reported that 10 members loaded successfully.
    But when I checked in Planning, nothing was updated...
    I checked the log, and it shows that BPMA is enabled and the load was unable to update metadata...
    Can you please help me with this?
    Thanks in advance..........

    [Mon Jul 02 17:35:03 ICT 2012]Located and using "BusinessUnit" dimension for loading data in "MFG" application.
    java.lang.RuntimeException: Planning Adapter loads may cause issues on BMPA enabled applications if metadata is modified.
    at com.hyperion.planning.HyperionPlanningBean.beginLoad(Unknown Source)
    at com.hyperion.planning.utils.HspOutlineLoad.parseAndLoadInputFile(Unknown Source)
    at com.hyperion.planning.utils.HspOutlineLoad.halAdapterInfoAndLoad(Unknown Source)
    at com.hyperion.planning.utils.HspOutlineLoad.loadAndPrintStatus(Unknown Source)
    at com.hyperion.planning.utils.HspOutlineLoad.main(Unknown Source)
    [Mon Jul 02 17:35:03 ICT 2012]Load dimension "BusinessUnit" has been unlocked successfully.
    [Mon Jul 02 17:35:03 ICT 2012]A cube refresh operation will not be performed.
    [Mon Jul 02 17:35:03 ICT 2012]Create security filters operation will not be performed.
    [Mon Jul 02 17:35:03 ICT 2012]Examine the Essbase log files for status if Essbase data was loaded.
    [Mon Jul 02 17:35:03 ICT 2012]Planning Outline data store load process finished.
    10 data records were read, 10 data records were processed, 10 were successfully
    loaded, 0 were rejected.

  • Metadata Load Error in HFM 9.2

    All,
    When we try to load a metadata file with the "Merge" option, it works fine. But with "Replace" it throws the following error message. We'll have to use the "Replace" option to remove unnecessary elements from the hierarchy. I would really appreciate it if you can help me out on this, thanks much!
    Metadata referential integrity check started at 16:38:48
    <refint>
    <Module Name="Journals">
    <Deleted>
    <Dim Name="ENTITYNODES">
    <Table Name="LTD_JLTMPENT" Scenario="" Year="">
    <Member ParentName="TotLLS" ChildName="LLSAdj" Count="1" Label="14_INTERCO BALANCE" Value="" Period="[None]">
    </Member>
    </Table>
    <Table Name="LTD_JLTMPENT" Scenario="" Year="">
    <Member ParentName="TotMast" ChildName="MASTAdj" Count="1" Label="PUSHDOWNS B&O" Value="" Period="[None]">
    </Member>
    <Member ParentName="TotMast" ChildName="MASTAdj" Count="1" Label="PUSHDOWNS SG&A" Value="" Period="[None]">
    </Member>
    <Member ParentName="TotMast" ChildName="MASTAdj" Count="2" Label="05_MAST IC CLEAR" Value="" Period="[None]">
    </Member>
    <Member ParentName="TotMast" ChildName="MASTAdj" Count="1" Label="14_INTERCO BALANCE" Value="" Period="[None]">
    </Member>
    <Member ParentName="TotMast" ChildName="MASTAdj" Count="2" Label="44_MAST INV TRUE UP" Value="" Period="[None]">
    </Member>
    </Table>
    <Table Name="LTD_JLTMPENT" Scenario="" Year="">
    <Member ParentName="TotBA" ChildName="BAAdj" Count="1" Label="PUSHDOWNS B&O" Value="" Period="[None]">
    </Member>
    <Member ParentName="TotBA" ChildName="BAAdj" Count="1" Label="PUSHDOWNS SG&A" Value="" Period="[None]">
    </Member>
    <Member ParentName="TotBA" ChildName="BAAdj" Count="1" Label="13_LZ PROFIT ELIM" Value="" Period="[None]">
    </Member>
    <Member ParentName="TotBA" ChildName="BAAdj" Count="1" Label="14_INTERCO BALANCE" Value="" Period="[None]">
    </Member>
    </Table>
    </Dim>
    </Deleted>
    <Changed>
    </Changed>
    </Module>
    </refint>
    Load ended at: 16:39:04
    Elapsed time: 00:00:16

    What exactly are you getting rid of in your metadata?
    It appears your metadata is causing a conflict with a journal entry and/or journal template. I'd look carefully at the error message below and make sure that those elements (i.e. TotLLS, LLSAdj, TotMast, MASTAdj, TotBA, BAAdj) still exist in your metadata.

  • "RABAX" occurs on server side error while testing asynchoronous web service

    I got an error while testing an asynchronous web service in WS Navigator. I created the asynchronous web service from an RFC, then configured it in SOAMANAGER. When I tested it, I got the error "RABAX occurs on server side". I also got a dump in ST22: 'UNCAUGHT_EXCEPTION - CX_SOAP_SEQ_SCD_ERROR'.
    I tested a synchronous web service and it works fine. I found a difference between the two web services' WSDL files in the values of the parameters below -
    commit
    blocking
    transaction
    wsrm
    I tried different ways, but no solution. Please suggest if someone has any idea... it would be very helpful.

    your problem:
    Missing class: oracle.tip.adapter.jms.JmsManagedConnectionFactory
    Dependent class: oracle.tip.adapter.fw.wsdl.WSDLUtils
    Loader: oracle.bpel.common:10.1.3
    Code-Source: /oraclesoa/oraclesoa/product/10.1.3.1/OracleAS_1/bpel/lib/orabpel.jar
    Configuration: <code-source> in /oraclesoa/oraclesoa/product/10.1.3.1/OracleAS_1/j2ee/home/config/server.xml
    It happens when the server is a custom install; try to reinstall it as a full installation and the problem should disappear. Otherwise you need another full installation from which to retrieve and replace the orabpel.jar file (or maybe more).

  • "MDL1223: Metadata Loader cannot be executed ..." error?

    Hi folks,
    Am getting this strange error from OMBPlus (11Gr2):
    MDL1223: Metadata Loader cannot be executed while other user(s) are accessing the workspace.
    when trying to import an MDL file, when using this statement:
    catch {OMBIMPORT MDL_FILE '$projname.mdl'\
                                USE UPDATE_MODE\
                                CONTROL_FILE 'control.txt'\
                                OUTPUT LOG '$projname.log'} result
    where the CONTROL_FILE (control.txt) simply has:
    IGNOREUOID=Y
    CONFIGPARAM=N
    SINGLEUSER=Y
    Am thinking that the SINGLEUSER setting may have something to do with it? Have got an SR open with Oracle on it but no luck there so far - just wondering if anyone else has come across something like this...? Seems to certainly sound like a new 11G-ish/workspace-y kind of thing. Am really curious to know how to determine what users are currently accessing a workspace?
    Thanks,
    Jim

    Hi Jim,
    It would be the SINGLEUSER tag. In 11.1 and 11.2 you should connect using the USE SINGLE_USER_MODE option and remove the SINGLEUSER setting from the param file.
    Regards,
    John

  • ODI Planning metadata load takes very long time

    Hi,
    I am using ODI for Hyperion Planning metadata load.
    We are facing performance issues, as the process is taking a very long time.
    The process uses "LKM File to SQL" and "IKM SQL to Hyperion Planning"
    The number of rows to process in the file is around 70000. The file is generated from DRM. The ODI integration process takes around 2 hours to process and load the file to Planning. Even if we add 1 new row to the file and everything else remains the same, the process takes that long.
    So, the whole process takes around 3 hours to load to Planning for all dimensions.
    I tried increasing the fetch rows to 200 in source but there is no significant increase in performance. The Heap size is set to maximum of 285 MB in odiparams.bat.
    How can the performance be improved?
    Can I use different LKM or change any other setting?
    Thanks,
    RS

    Hi John,
    In my current implementation, the dimension hierarchies are maintained in DRM.
    So, business directly makes changes in DRM and exports the hierarchies in a text file.
    I agree that loading 70000 records on a regular basis is odd, but it makes it easier for the business to retain control of their hierarchies, and maintenance is easy in DRM.
    On bulk loading to DB table and loading to Planning, I have 2 questions:
    1. Do you think that "LKM SQL to SQL" [Oracle to Planning] will give a significant improvement in performance over "LKM File to SQL" [File to Planning], given that we are still using the "Sunopsis Memory Engine" as the staging area?
    2. I checked your blog; there you have suggested using the "Sunopsis Memory Engine" for "LKM SQL to SQL".
    Is it mandatory to use the "Sunopsis Memory Engine" as the staging area, or can we use another user-defined staging area [Oracle tables]?
    Cheers,
    RS

  • Automated MetaData Load

    Greetings,
    I'm curious to know how some of the users in this forum perform automated metadata loads against dimensions in the Shared Library (and subsequently against any EPMA Planning apps). We have several apps whose dimensions are shared amongst various Planning apps and which are updated manually when the data load fails because of missing members. These are updated manually on a monthly basis due to the relatively small number of missing members.
    However, we are building an app with a dimension that is quite dynamic (a lot of new members are added and a few are deleted). This would be insane to maintain manually in the Shared Library. Thus I'm looking for any suggestions on how to automate this via batch file or via any other means, including using "Create Member Properties".
    Any suggestions or ideas would be greatly welcomed...
    Many thanks.
    cg

    CG,
    A .err file is generated only when
    there is no proper name for a member, or
    no proper alias name, etc., while loading data.
    These things can also be achieved via a programming language like Java, but you need to know all the possible errors, and your program should be able to find out where the load went wrong so that it can rebuild if any new member or child is missing from the data load.
    On Unix you can use sed to find and replace any possible outcomes.
    It happened to me some time back: I used Java to replace some members that kept getting rejected during the data load. But that was specific; in your case you have to know all the possible outcomes...
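    As a rough illustration of the Java approach described above (the file name and the idea of scanning a load exception file are assumptions for the sketch, not a documented interface), a small scanner over the rejected-records file could flag what needs fixing before an automated reload:
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class ScanRejects {
        public static void main(String[] args) throws IOException {
            int rejects = 0;
            // Read the exception file written by the load (e.g. OutlineLoad /X:ENT_MD.err).
            try (BufferedReader in = new BufferedReader(new FileReader("ENT_MD.err"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    if (!line.trim().isEmpty()) {
                        System.out.println("Rejected record: " + line);
                        rejects++;
                    }
                }
            }
            // A non-zero count could trigger a rebuild of the missing members or an alert.
            System.out.println(rejects + " record(s) rejected.");
        }
    }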

  • HFM Metadata Loading On-going Problem

    Hi There,
    We just migrated to HFM 11.1.1.3 from HFM 4.
    We used to have an issue in HFM 4 where almost every time we loaded metadata, the system would hang and become unresponsive. The load screen would just sit there without ever completing the load. You could still use the system, but the metadata load would never actually load. The only thing that would resolve it was to reboot all of the HFM application servers.
    This happened to us again the other day but now we're on the new HFM 11.1.1.3. Again, the resolution was to reboot the HFM applications. We tried just restarting the services on the various servers but that didn't work. A full reboot was required.
    And nothing was wrong with the metadata itself as it quickly loaded without errors into our DEV and QA environments. As well, we kicked all of the users out of the system prior to the metadata load. Most get out immediately and certainly no heavy calculations or consolidations are being performed during the metadata load.
    Has anyone else experienced this issue? Should a reboot always precede or accompany a metadata load? Is there any recommendation as to how often you should reboot the Hyperion servers (monthly, quarterly, etc.) as good practice?
    Many Thanks,
    Mike

    We are having a similar issue with version 11.1.2.0.0
    We try to run an application metadata load from the client and we get an error message
    Unexpected error: 80005005 occurred.
    Running the load from the workspace, we receive the following message.
    An error has occurred. Please contact your administrator.
    Show Details:
    Error Reference Number: {F187C156-ABDA-40DD-A687-B471F35535E3};User Name: mpus54965@gsd1w105c
    Num: 0x80004005;Type: 0;DTime: 4/27/2011 1:27:41 PM;Svr: GSD4W023C;File: CHsvMetadataLoadACV.cpp;Line: 347;Ver: 11.1.2.0.0.2762;
    This is the second time we have encountered this problem. Oracle support was not able to determine the cause. The fix is to reboot the database server, but we are looking for insight into the problem. Note: Our current development environment is a single-server virtual environment connected to a database server. We are planning to move to a more robust test environment with four applications and one database server in a few weeks.
    Thanks for your help,

  • ETL Load error

    Hi
    When I ran the ETL load for Project Analytics, it errored out.
    In the DAC, these 3 items errored:
    1. SIL_GlobalCurrencyGeneral_Update
    2. SDE_ORA_UserDimension
    3. SDE_ORA_EmployeeDimension
    Below is the error from the log file. Any help will be appreciated.
    9 SEVERE Thu Sep 24 17:54:48 PDT 2009
    START OF ETL
    10 SEVERE Thu Sep 24 17:57:14 PDT 2009 Unable to evaluate method getNamedSourceIdentifier for class com.siebel.analytics.etl.etltask.PauseTask
    11 SEVERE Thu Sep 24 17:57:15 PDT 2009 Unable to evaluate method getNamedSource for class com.siebel.analytics.etl.etltask.PauseTask
    12 SEVERE Thu Sep 24 17:57:20 PDT 2009 Unable to evaluate method getNamedSourceIdentifier for class com.siebel.analytics.etl.etltask.InformaticaTask
    13 SEVERE Thu Sep 24 17:57:20 PDT 2009 Unable to evaluate method getNamedSource for class com.siebel.analytics.etl.etltask.InformaticaTask
    14 SEVERE Thu Sep 24 17:58:27 PDT 2009 Unable to evaluate method getNamedSourceIdentifier for class com.siebel.analytics.etl.etltask.TaskPrecedingActionScriptTask
    15 SEVERE Thu Sep 24 17:58:27 PDT 2009 Unable to evaluate method getNamedSource for class com.siebel.analytics.etl.etltask.TaskPrecedingActionScriptTask
    16 SEVERE Thu Sep 24 17:58:39 PDT 2009 Starting ETL Process.
    17 SEVERE Thu Sep 24 17:59:14 PDT 2009 Informatica Status Poll Interval new value : 20000(milli-seconds)
    19 SEVERE Thu Sep 24 18:06:59 PDT 2009 /oracle/dac/OracleBI/bifoundation/dac/Informatica/parameters/input/ORACLE specified is not a currently existing directory
    20 SEVERE Thu Sep 24 18:06:59 PDT 2009 /oracle/dac/OracleBI/bifoundation/dac/Informatica/parameters/input/Oracle specified is not a currently existing directory
    21 SEVERE Thu Sep 24 18:06:59 PDT 2009 /oracle/dac/OracleBI/bifoundation/dac/Informatica/parameters/input/oracle specified is not a currently existing directory
    22 SEVERE Thu Sep 24 18:06:59 PDT 2009 /oracle/dac/OracleBI/bifoundation/dac/Informatica/parameters/input/ORACLE (THIN) specified is not a currently existing directory
    24 SEVERE Thu Sep 24 18:07:10 PDT 2009 /oracle/dac/OracleBI/bifoundation/dac/Informatica/parameters/input/FLAT FILE specified is not a currently existing directory
    25 SEVERE Thu Sep 24 18:08:23 PDT 2009 Request to start workflow : 'SILOS:SIL_CurrencyTypes' has completed with error code 0
    26 SEVERE Thu Sep 24 18:08:26 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_ProductMultipleCategories_Full' has completed with error code 0
    27 SEVERE Thu Sep 24 18:08:26 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_ExchangeRateGeneral_Full' has completed with error code 0
    28 SEVERE Thu Sep 24 18:08:26 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_Product_Categories_Derive' has completed with error code 0
    29 SEVERE Thu Sep 24 18:08:26 PDT 2009 Request to start workflow : 'SILOS:SIL_Parameters_Update' has completed with error code 0
    30 SEVERE Thu Sep 24 18:08:26 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_Stage_GLAccountDimension_FinSubCodes' has completed with error code 0
    31 SEVERE Thu Sep 24 18:08:27 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_EmployeeDimension_Addresses_Full' has completed with error code 0
    32 SEVERE Thu Sep 24 18:08:27 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_UserDimension_Full' has completed with error code 0
    33 SEVERE Thu Sep 24 18:08:27 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_GeoCountryDimension' has completed with error code 0
    34 SEVERE Thu Sep 24 18:08:27 PDT 2009 Request to start workflow : 'SILOS:SIL_GlobalCurrencyGeneral_Update' has completed with error code 0
    35 SEVERE Thu Sep 24 18:19:53 PDT 2009 Error while contacting Informatica server for getting workflow status for SDE_ORA_UserDimension_Full
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [8.6.0 HotFix4], build [272.1017], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    Invoked at Thu Sep 24 18:19:40 2009
    Connected to Integration Service: [Integ_r1211].
    Integration Service status: [Running]
    Integration Service startup time: [Thu Sep 24 11:50:39 2009]
    Integration Service current time: [Thu Sep 24 18:19:43 2009]
    Folder: [SDE_ORAR12_Adaptor]
    Workflow: [SDE_ORA_UserDimension_Full] version [1].
    Workflow run status: [Failed]
    Workflow run error code: [36331]
    Workflow run error message: [WARNING: Session task instance [SDE_ORA_UserDimension_Full] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SDE_ORA_UserDimension_Full] will be failed.]
    Workflow run id [1611].
    Start time: [Thu Sep 24 18:08:26 2009]
    End time: [Thu Sep 24 18:15:45 2009]
    Workflow log file: [oracle/Informatica/PowerCenter8.6.0/server/infa_shared/WorkflowLogs/SDE_ORA_UserDimension_Full.log]
    Workflow run type: [User request]
    Run workflow as user: [Administrator]
    Run workflow with Impersonated OSProfile in domain: []
    Integration Service: [Integ_r1211]
    Disconnecting from Integration Service
    Completed at Thu Sep 24 18:19:43 2009
    =====================================
    ERROR OUTPUT
    =====================================
    37 SEVERE Thu Sep 24 18:19:53 PDT 2009 pmcmd startworkflow -sv Integ_r1211 -d Domain_r1211 -u Administrator -p **** -f SDE_ORAR12_Adaptor -lpf /oracle/Informatica/PowerCenter8.6.0/server/infa_shared/SrcFiles/ORA_R12_Flatfile.DataWarehouse.SDE_ORAR12_Adaptor.SDE_ORA_Stage_GLAccountDimension_FinSubCodes.txt SDE_ORA_Stage_GLAccountDimension_FinSubCodes
    Status Desc : Succeeded
    WorkFlowMessage : Workflow executed successfully.
    Error Message : Successfully completed.
    ErrorCode : 0
    36 SEVERE Thu Sep 24 18:19:53 PDT 2009 Error while contacting Informatica server for getting workflow status for SIL_GlobalCurrencyGeneral_Update
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [8.6.0 HotFix4], build [272.1017], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    Invoked at Thu Sep 24 18:19:12 2009
    Connected to Integration Service: [Integ_r1211].
    Integration Service status: [Running]
    Integration Service startup time: [Thu Sep 24 11:50:39 2009]
    Integration Service current time: [Thu Sep 24 18:19:21 2009]
    Folder: [SILOS]
    Workflow: [SIL_GlobalCurrencyGeneral_Update] version [1].
    Workflow run status: [Failed]
    Workflow run error code: [36331]
    Workflow run error message: [WARNING: Session task instance [SIL_GlobalCurrencyGeneral_Update] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SIL_GlobalCurrencyGeneral_Update] will be failed.]
    Workflow run id [1610].
    Start time: [Thu Sep 24 18:08:26 2009]
    End time: [Thu Sep 24 18:15:47 2009]
    Workflow log file: [oracle/Informatica/PowerCenter8.6.0/server/infa_shared/WorkflowLogs/SIL_GlobalCurrencyGeneral_Update.log]
    Workflow run type: [User request]
    Run workflow as user: [Administrator]
    Run workflow with Impersonated OSProfile in domain: []
    Integration Service: [Integ_r1211]
    Disconnecting from Integration Service
    Completed at Thu Sep 24 18:19:21 2009
    =====================================
    ERROR OUTPUT
    =====================================
    38 SEVERE Thu Sep 24 18:19:53 PDT 2009 Could not attach to workflow because of errorCode 36331 For workflow SDE_ORA_UserDimension_Full
    39 SEVERE Thu Sep 24 18:19:53 PDT 2009 Could not attach to workflow because of errorCode 36331 For workflow SIL_GlobalCurrencyGeneral_Update
    40 SEVERE Thu Sep 24 18:19:53 PDT 2009
    ANOMALY INFO::: Error while executing : INFORMATICA TASK:SDE_ORAR12_Adaptor:SDE_ORA_UserDimension_Full:(Source : FULL Target : FULL)
    MESSAGE:::
    Irrecoverable Error
    Error while contacting Informatica server for getting workflow status for SDE_ORA_UserDimension_Full
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [8.6.0 HotFix4], build [272.1017], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    Invoked at Thu Sep 24 18:19:40 2009
    Connected to Integration Service: [Integ_r1211].
    Integration Service status: [Running]
    Integration Service startup time: [Thu Sep 24 11:50:39 2009]
    Integration Service current time: [Thu Sep 24 18:19:43 2009]
    Folder: [SDE_ORAR12_Adaptor]
    Workflow: [SDE_ORA_UserDimension_Full] version [1].
    Workflow run status: [Failed]
    Workflow run error code: [36331]
    Workflow run error message: [WARNING: Session task instance [SDE_ORA_UserDimension_Full] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SDE_ORA_UserDimension_Full] will be failed.]
    Workflow run id [1611].
    Start time: [Thu Sep 24 18:08:26 2009]
    End time: [Thu Sep 24 18:15:45 2009]
    Workflow log file: [oracle/Informatica/PowerCenter8.6.0/server/infa_shared/WorkflowLogs/SDE_ORA_UserDimension_Full.log]
    Workflow run type: [User request]
    Run workflow as user: [Administrator]
    Run workflow with Impersonated OSProfile in domain: []
    Integration Service: [Integ_r1211]
    Disconnecting from Integration Service
    Completed at Thu Sep 24 18:19:43 2009
    =====================================
    ERROR OUTPUT
    =====================================
    Re-Queue to attempt to run again or attach to running workflow
    if Execution Plan is still running or re-submit Execution Plan to execute the workflow.
    EXCEPTION CLASS::: com.siebel.analytics.etl.etltask.IrrecoverableException
    com.siebel.analytics.etl.etltask.InformaticaTask.doExecute(InformaticaTask.java:179)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.doExecuteWithRetries(GenericTaskImpl.java:410)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.execute(GenericTaskImpl.java:306)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.execute(GenericTaskImpl.java:213)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.run(GenericTaskImpl.java:585)
    com.siebel.analytics.etl.taskmanager.XCallable.call(XCallable.java:63)
    java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    java.util.concurrent.FutureTask.run(FutureTask.java:138)
    java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
    java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    java.util.concurrent.FutureTask.run(FutureTask.java:138)
    java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:885)
    java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:907)
    java.lang.Thread.run(Thread.java:619)
    41 SEVERE Thu Sep 24 18:20:01 PDT 2009
    ANOMALY INFO::: Error while executing : INFORMATICA TASK:SILOS:SIL_GlobalCurrencyGeneral_Update:(Source : FULL Target : FULL)
    MESSAGE:::
    Irrecoverable Error
    Error while contacting Informatica server for getting workflow status for SIL_GlobalCurrencyGeneral_Update
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :

    Shyam,
    I have attached the info you asked for.
    When I reran, this time only SDE_ORA_ExchangeRateGeneral failed. Below is the output of the 2 files:
    1. SDE_ORA_ExchangeRateGeneral_Full
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [8.6.0 HotFix4], build [272.1017], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    Invoked at Fri Sep 25 10:19:38 2009
    Connected to Integration Service: [Integ_r1211].
    Integration Service status: [Running]
    Integration Service startup time: [Thu Sep 24 11:50:39 2009]
    Integration Service current time: [Fri Sep 25 10:19:43 2009]
    Folder: [SDE_ORAR12_Adaptor]
    Workflow: [SDE_ORA_ExchangeRateGeneral_Full] version [1].
    Workflow run status: [Failed]
    Workflow run error code: [36331]
    Workflow run error message: [WARNING: Session task instance [SDE_ORA_ExchangeRateGeneral_Compress_Full] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SDE_ORA_ExchangeRateGeneral_Full] will be failed.]
    Workflow run id [1806].
    Start time: [Fri Sep 25 10:18:00 2009]
    End time: [Fri Sep 25 10:19:18 2009]
    Workflow log file: [oracle/Informatica/PowerCenter8.6.0/server/infa_shared/WorkflowLogs/SDE_ORA_ExchangeRateGeneral_Full.log]
    Workflow run type: [User request]
    Run workflow as user: [Administrator]
    Run workflow with Impersonated OSProfile in domain: []
    Integration Service: [Integ_r1211]
    Disconnecting from Integration Service
    Completed at Fri Sep 25 10:19:43 2009
    =====================================
    ERROR OUTPUT
    =====================================
    2. SDE_ORA_ExchangeRateGeneral_Full_SESSIONS
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMREP, version [8.6.0 HotFix4], build [272.1017], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    This Software may be protected by U.S. Patent Numbers 6,208,990; 6,044,374; 6,014,670; 6,032,158; 5,794,246; 6,339,775; 6,850,947; 6,895,471; 7,254,590 and other U.S. Patents Pending.
    Invoked at Fri Sep 25 10:08:55 2009
    [[REP_57066] Request timed out.]
    [09/25/2009 10:12:05-[REP_55112] Unable to connect to the Repository Service [Rep_r1211] since the resilience time is up.]
    [Failed to connect to repository service [Rep_r1211].]
    An error occurred while accessing the repository[Failed to connect to repository service [Rep_r1211].]
    [09/25/2009 10:12:05-[REP_55102] Failed to connect to repository service [Rep_r1211].]
    Repository connection failed.
    Failed to execute listobjectdependencies.
    Completed at Fri Sep 25 10:12:05 2009
    =====================================
    ERROR OUTPUT
    =====================================

  • Metadata export error

    hi,
    I am trying to transfer my metadata with Oracle WB Transfer.
    I traced it and received the following log:
    **! Transfer logging started at Mon Feb 07 16:47:45 IRST 2005 !**
    **! Client is 127.0.0.1 !**
    **! Server is 127.0.0.1 !**
    OWB Bridge processed arguments
    Bridge Parameters are:
    <BA>=<KAFA_COLLECTION2>
    <LANGUAGE>=<All Languages>
    <LOCKREPOSITORY>=<True>
    <file>=<C:\TEMP\bridges\null-nullMy_Metadata_Transfer1107782265830.XMI>
    Using attribute sorting
    export to file
    Default local= fa_IR
    Exporting project:KAFA_PROJECT2
    initializing project:KAFA_PROJECT2
    Initializing module :KAFA_DW_MODULE
    exportSchema:KAFA_DW_MODULE SCHM14314
    Exporting cube:EMPLOYEMENT_CUBE
    exportCube:EMPLOYEMENT_CUBE/EMPLOYEMENT_CUBE Cube14396
    exportCubeDimUse:CDU14344FK14400
    exportFactLevelUse:FLU14396FK14400
    exportCubeDimUse:CDU14359FK14403
    exportFactLevelUse:FLU14396FK14403
    exportCubeDimUse:CDU14379FK14397
    exportFactLevelUse:FLU14396FK14397
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_0
    exportMeasure:SALARY/SALARY MEA38108
    Exporting cube:HOKM_CUBE
    exportCube:HOKM_CUBE/HOKM_CUBE Cube38100
    exportCubeDimUse:CDU14344FK38101
    exportFactLevelUse:FLU38100FK38101
    exportCubeDimUse:CDU14359FK38104
    exportFactLevelUse:FLU38100FK38104
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_2
    exportMeasure:ESTEKHDAMDATE/ESTEKHDAMDATE MEA38109
    exportMeasure:SALARY/SALARY MEA38110
    Exporting dimension:DEP_DIM
    exportDimension:DEP_DIM/DEP_DIM DIM14359
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_3
    exportLevel:DEP_MAIN/DEP_MAIN LEV14360
    exportClassificationEntry for:Description CLE_4
    exportLevelAttribute:DEP_ID/DEP_ID LATR14363
    exportLevelAttribute:DEP_NAME/DEP_NAME LATR14367
    exportKey:DEP_MAIN_UK/DEP_MAIN_UK KEY14361
    exportKeyAttributeUse:KEYAU14363KEY14361
    exportHierarchy:HIE_DEP/HIE_DEP HEIR14374
    exportHierarchyLevelUse:HLU14374LEV14360
    Exporting dimension:EMP_DIM
    exportDimension:EMP_DIM/EMP_DIM DIM14344
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_6
    exportLevel:EMP_MAIN/EMP_MAIN LEV14345
    exportClassificationEntry for:Description CLE_7
    exportLevelAttribute:EMP_ID/EMP_ID LATR14348
    exportLevelAttribute:EMP_NAME/EMP_NAME LATR14352
    exportKey:EMP_MAIN_UK/EMP_MAIN_UK KEY14346
    exportKeyAttributeUse:KEYAU14348KEY14346
    exportHierarchy:HIE_EMP/HIE_EMP HEIR14354
    exportHierarchyLevelUse:HLU14354LEV14345
    Exporting dimension:JOB_DIM
    exportDimension:JOB_DIM/JOB_DIM DIM14379
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_8
    exportLevel:JOB_MAIN/JOB_MAIN LEV14380
    exportClassificationEntry for:Description CLE_9
    exportLevelAttribute:JOB_ID/JOB_ID LATR14383
    exportLevelAttribute:JOB_NAME/JOB_NAME LATR14389
    exportKey:JOB_MAIN_UK/JOB_MAIN_UK KEY14381
    exportKeyAttributeUse:KEYAU14383KEY14381
    exportHierarchy:HIE_JOB/HIE_JOB HEIR14391
    exportHierarchyLevelUse:HLU14391LEV14380
    Exporting mappings
    exportDimensionTableMap:DTMAP14359
    exportItemMap:IMAP14364
    exportItemUse:TIU14363
    exportItemUse:SIU14364
    exportItemMap:IMAP14368
    exportItemUse:TIU14367
    exportItemUse:SIU14368
    exportDimensionEntityUse:EU_DIM14359
    exportDimensionTableUse:EU_TAB14359
    exportDimensionTableMap:DTMAP14344
    exportItemMap:IMAP14349
    exportItemUse:TIU14348
    exportItemUse:SIU14349
    exportItemMap:IMAP14353
    exportItemUse:TIU14352
    exportItemUse:SIU14353
    exportDimensionEntityUse:EU_DIM14344
    exportDimensionTableUse:EU_TAB14344
    exportDimensionTableMap:DTMAP14379
    exportItemMap:IMAP14384
    exportItemUse:TIU14383
    exportItemUse:SIU14384
    exportItemMap:IMAP14390
    exportItemUse:TIU14389
    exportItemUse:SIU14390
    exportDimensionEntityUse:EU_DIM14379
    exportDimensionTableUse:EU_TAB14379
    exportFactTableMap:FTMAP14396
    exportMapDependency:MAPDEP14396DTMAP14344
    exportMapDependency:MAPDEP14396DTMAP14359
    exportMapDependency:MAPDEP14396DTMAP14379
    exportItemMap:IMAP38108
    exportItemUse:TIU38108
    exportItemUse:SIU38108
    exportCubeEntityUse:EU_Cube14396
    exportFactUse:EU_TAB14396
    exportFactLevelGroup:FLG_TAB14396
    exportFactTableMap:FTMAP38100
    exportMapDependency:MAPDEP38100DTMAP14344
    exportMapDependency:MAPDEP38100DTMAP14359
    exportItemMap:IMAP38109
    exportItemUse:TIU38109
    exportItemUse:SIU38109
    exportItemMap:IMAP38110
    exportItemUse:TIU38110
    exportItemUse:SIU38110
    exportCubeEntityUse:EU_Cube38100
    exportFactUse:EU_TAB38100
    exportFactLevelGroup:FLG_TAB38100
    Exporting table:DEP_DIM
    exportTable:DEP_DIM/DEP_DIM TAB14359
    exportKey:DEP_DIM_DEP_MAIN_UK/DEP_DIM_DEP_MAIN_UK KEY14362
    exportKeyAttributeUse:KEYAU14364KEY14362
    exportColumn:DEP_MAIN DEP_ID/DEP_MAIN_DEP_ID COL14364
    exportColumn:DEP_MAIN_DEP_NAME/DEP_MAIN_DEP_NAME COL14368
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_10
    Exporting table:EMP_DIM
    exportTable:EMP_DIM/EMP_DIM TAB14344
    exportKey:EMP_DIM_EMP_MAIN_UK/EMP_DIM_EMP_MAIN_UK KEY14347
    exportKeyAttributeUse:KEYAU14349KEY14347
    exportColumn:EMP_MAIN EMP_ID/EMP_MAIN_EMP_ID COL14349
    exportColumn:EMP_MAIN_EMP_NAME/EMP_MAIN_EMP_NAME COL14353
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_11
    Exporting table:JOB_DIM
    exportTable:JOB_DIM/JOB_DIM TAB14379
    exportKey:JOB_DIM_JOB_MAIN_UK/JOB_DIM_JOB_MAIN_UK KEY14382
    exportKeyAttributeUse:KEYAU14384KEY14382
    exportColumn:JOB_MAIN JOB_ID/JOB_MAIN_JOB_ID COL14384
    exportColumn:JOB_MAIN_JOB_NAME/JOB_MAIN_JOB_NAME COL14390
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_12
    Exporting table:EMPLOYEMENT_CUBE
    exportTable:EMPLOYEMENT_CUBE/EMPLOYEMENT_CUBE TAB14396
    exportKey:SEG_UK_EMPLOYEMENT_CUBE/SEG_UK_EMPLOYEMENT_CUBE KEY14406
    exportKeyAttributeUse:KEYAU14401KEY14406
    exportKeyAttributeUse:KEYAU14404KEY14406
    exportKeyAttributeUse:KEYAU14398KEY14406
    exportForeignKey:FK_EMPLOYEMENT_CUBE_14362/FK_EMPLOYEMENT_CUBE_14362 FK14403
    exportKeyAttributeUse:KEYAU14404FK14403
    exportForeignKey:FK_EMPLOYEMENT_CUBE_14347/FK_EMPLOYEMENT_CUBE_14347 FK14400
    exportKeyAttributeUse:KEYAU14401FK14400
    exportForeignKey:FK_EMPLOYEMENT_CUBE_14382/FK_EMPLOYEMENT_CUBE_14382 FK14397
    exportKeyAttributeUse:KEYAU14398FK14397
    exportColumn:DEP_MAIN_DEP_ID/DEP_MAIN_DEP_ID COL14404
    exportColumn:EMP_MAIN_EMP_ID/EMP_MAIN_EMP_ID COL14401
    exportColumn:JOB_MAIN_JOB_ID/JOB_MAIN_JOB_ID COL14398
    exportColumn:SALARY/SALARY COL38108
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_13
    Exporting table:HOKM_CUBE
    exportTable:HOKM_CUBE/HOKM_CUBE TAB38100
    exportForeignKey:FK_HOKM_CUBE_14362/FK_HOKM_CUBE_14362 FK38104
    exportKeyAttributeUse:KEYAU38105FK38104
    exportForeignKey:FK_HOKM_CUBE_14347/FK_HOKM_CUBE_14347 FK38101
    exportKeyAttributeUse:KEYAU38102FK38101
    exportColumn:DEP_MAIN_DEP_ID/DEP_MAIN_DEP_ID COL38105
    exportColumn:EMP_MAIN_EMP_ID/EMP_MAIN_EMP_ID COL38102
    exportColumn:ESTEKHDAMDATE/ESTEKHDAMDATE COL38109
    exportColumn:SALARY/SALARY COL38110
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_14
    exportContext:CTXT00
    exportTypeSet:TSET
    Exporting datatypes
    exportScalarDatatype:NUMBER DTYP7836
    exportScalarDatatype:VARCHAR2 DTYP7839
    exportClassification:KAFA_COLLECTION2 CLAS1
    exportClassificationType:Warehouse Builder Business Area CLT15
    exportClassification:Description CLAS5
    exportClassificationType:Dimensional Attribute Descriptor CLT16
    Exporting project KAFA_PROJECT2 complete.
    **! Target bridge jvm parameters = "..\..\..\jdk\jre\bin\javaw" -mx50m -DORCLCWM_META_MODEL_FILE=..\..\bridges\admin\orcl_cwm.xml -classpath .;..\..\bridges\lib\bridge_cwmlite10.jar;..\..\bridges\lib\bridge_parser10.jar;..\..\bridges\lib\vek10.jar;..\..\..\lib\xmlparserv2.jar;..\..\..\jdbc\lib\ojdbc14.jar;..\..\bridges\lib\util10.jar;..\..\lib\int\util.jar;..\..\lib\int\rts.jar;..\..\..\jlib\rts2.jar;..\..\bridges\lib\bridge_cwmlite10.jar;..\..\bridges\lib\bridge_parser10.jar;..\..\bridges\lib\vek10.jar;..\..\..\lib\xmlparserv2.jar;..\..\..\jdbc\lib\ojdbc14.jar;..\..\bridges\lib\util10.jar;..\..\lib\int\util.jar;..\..\lib\int\rts.jar;..\..\..\jlib\rts2.jar;"..\..\..\jdk\jre\lib\rt.jar;..\..\..\jdk\jre\lib\charsets.jar";"..\..\bridges\admin;..\..\bridges\lib\bridge_wrapper10.jar;..\..\bridges\lib\util10.jar;..\..\lib\int\reposimpl.jar;..\..\lib\int\repossdk.jar;..\..\lib\int\rts.jar;..\..\..\jlib\rts2.jar;..\..\lib\int\util.jar" oracle.cwm.tools.bridge.BridgeWrapper -bridge_name oracle.cwm.bridge.cwmlite.ImportMain !**
    **! Target bridge parameters = -log_level 2 -log_file C:\TEMP\bridges\log\null-nullMy_Metadata_Transfer1107782265830.log -olapimp.deploytoaw Y -olapimp.awname dw_targetschema -olapimp.awobjprefix dw_ts -olapimp.loadcubedata Y -olapimp.awaggregate 1 -olapimp.dimuniquekeys Y -olapimp.awuser -olapimp.createviews Y -olapimp.viewprefix -olapimp.viewaccesstype OLAP -olapimp.creatematviews Y -olapimp.viewscriptdir -olapimp.deploy Y -olapimp.username dw_targetschema -olapimp.password password -olapimp.host 200.20.20.11 -olapimp.port 1521 -olapimp.sid ora10g -olapimp.inputfilename C:\TEMP\bridges\null-nullMy_Metadata_Transfer1107782265830.XMI -olapimp.outputfilename C:\Documents and Settings\stabatabaee\Desktop\test.sql -paramfile C:\TEMP\bridges\1107782266112.par !**
    setting parameter: olapimp.deploytoaw = Y
    setting parameter: olapimp.awname = dw_targetschema
    setting parameter: olapimp.awobjprefix = dw_ts
    setting parameter: olapimp.loadcubedata = Y
    setting parameter: olapimp.awaggregate = 1
    setting parameter: olapimp.dimuniquekeys = Y
    setting parameter: olapimp.awuser =
    setting parameter: olapimp.createviews = Y
    setting parameter: olapimp.viewprefix =
    setting parameter: olapimp.viewaccesstype = OLAP
    setting parameter: olapimp.creatematviews = Y
    setting parameter: olapimp.viewscriptdir =
    setting parameter: olapimp.deploy = Y
    setting parameter: olapimp.username = dw_targetschema
    setting parameter: olapimp.host = 200.20.20.11
    setting parameter: olapimp.port = 1521
    setting parameter: olapimp.sid = ora10g
    setting parameter: olapimp.inputfilename = C:\TEMP\bridges\null-nullMy_Metadata_Transfer1107782265830.XMI
    setting parameter: olapimp.outputfilename = C:\Documents and Settings\stabatabaee\Desktop\test.sql
    Loading Metadata
    Loading XMI input file
    connecting ...
    processing dim: DEP_DIM
    processing level: DEP_MAINin dimension DEP_DIM
    processing level attribute use: DEP_MAIN_DEP_ID in level DEP_MAIN for level attribute DEP_ID
    processing level attribute : DEP_ID in level DEP_MAIN
    processing level attribute use: DEP_MAIN_DEP_NAME in level DEP_MAIN for level attribute DEP_NAME
    processing level attribute : DEP_NAME in level DEP_MAIN
    declare
    id number;
    s_id number;
    l_id number;
    levelday_id number;
    levelmonth_id number;
    levelquarter_id number;
    levelyear_id number;
    f_id number;
    begin
    select descriptor_id into l_id from all_olap_descriptors where descriptor_value='Long Description';
    select descriptor_id into s_id from all_olap_descriptors where descriptor_value='Short Description';
    select descriptor_id into levelday_id from all_olap_descriptors where descriptor_value='Day';
    select descriptor_id into levelmonth_id from all_olap_descriptors where descriptor_value='Month';
    select descriptor_id into levelquarter_id from all_olap_descriptors where descriptor_value='Quarter';
    select descriptor_id into levelyear_id from all_olap_descriptors where descriptor_value='Year';
    cwm_olap_dimension.Set_Description (USER, 'DEP_DIM', '');
    cwm_olap_dimension.Set_Display_Name(USER, 'DEP_DIM', 'DEP_DIM');
    cwm_olap_dimension.Set_Plural_Name(USER, 'DEP_DIM', 'DEP_DIM');
    cwm_olap_level.Set_Description (USER, 'DEP_DIM', 'DEP_MAIN', '');
    cwm_olap_level.Set_Display_Name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_MAIN');
    begin
    cwm_olap_level_attribute.set_name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_MAIN_DEP_ID', 'DEP_ID');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_Display_Name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_ID', 'DEP_ID');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_description(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_ID', '');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.set_name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_MAIN_DEP_NAME', 'DEP_NAME');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_Display_Name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_NAME', 'DEP_NAME');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_description(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_NAME', '');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'DEP_DIM', 'Short_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'DEP_DIM', 'Short_Description', 'Short_Description', 'Short Description');
    select descriptor_id into id from all_olap_descriptors where descriptor_value='Short Description';
    cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'DEP_DIM', 'Short_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'DEP_DIM', 'Long_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'DEP_DIM', 'Long_Description', 'Long_Description', 'Full Description');
    select descriptor_id into id from all_olap_descriptors where descriptor_value='Long Description';
    cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'DEP_DIM', 'Long_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Add_Level_Attribute(USER, 'DEP_DIM', 'Short_Description', 'DEP_MAIN', 'DEP_NAME');
    exception when cwm_exceptions.attribute_already_exists then null; when cwm_exceptions.attribute_not_found then null;
    end;
    begin
    cwm_olap_dim_attribute.Add_Level_Attribute(USER, 'DEP_DIM', 'Long_Description', 'DEP_MAIN', 'DEP_NAME');
    exception when cwm_exceptions.attribute_already_exists then null; when cwm_exceptions.attribute_not_found then null;
    end;
    commit;
    exception when others then dbms_output.enable(1000000); cwm_utility.dump_error(); raise program_error;
    end;
    processing dim: EMP_DIM
    processing level: EMP_MAINin dimension EMP_DIM
    processing level attribute use: EMP_MAIN_EMP_ID in level EMP_MAIN for level attribute EMP_ID
    processing level attribute : EMP_ID in level EMP_MAIN
    processing level attribute use: EMP_MAIN_EMP_NAME in level EMP_MAIN for level attribute EMP_NAME
    processing level attribute : EMP_NAME in level EMP_MAIN
    declare
    id number;
    s_id number;
    l_id number;
    levelday_id number;
    levelmonth_id number;
    levelquarter_id number;
    levelyear_id number;
    f_id number;
    begin
    select descriptor_id into l_id from all_olap_descriptors where descriptor_value='Long Description';
    select descriptor_id into s_id from all_olap_descriptors where descriptor_value='Short Description';
    select descriptor_id into levelday_id from all_olap_descriptors where descriptor_value='Day';
    select descriptor_id into levelmonth_id from all_olap_descriptors where descriptor_value='Month';
    select descriptor_id into levelquarter_id from all_olap_descriptors where descriptor_value='Quarter';
    select descriptor_id into levelyear_id from all_olap_descriptors where descriptor_value='Year';
    cwm_olap_dimension.Set_Description (USER, 'EMP_DIM', '');
    cwm_olap_dimension.Set_Display_Name(USER, 'EMP_DIM', 'EMP_DIM');
    cwm_olap_dimension.Set_Plural_Name(USER, 'EMP_DIM', 'EMP_DIM');
    cwm_olap_level.Set_Description (USER, 'EMP_DIM', 'EMP_MAIN', '');
    cwm_olap_level.Set_Display_Name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_MAIN');
    begin
    cwm_olap_level_attribute.set_name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_MAIN_EMP_ID', 'EMP_ID');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_Display_Name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_ID', 'EMP_ID');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_description(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_ID', '');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.set_name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_MAIN_EMP_NAME', 'EMP_NAME');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_Display_Name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_NAME', 'EMP_NAME');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_description(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_NAME', '');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'EMP_DIM', 'Short_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'EMP_DIM', 'Short_Description', 'Short_Description', 'Short Description');
    select descriptor_id into id from all_olap_descriptors where descriptor_value='Short Description';
    cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'EMP_DIM', 'Short_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.dBRD-02501: Internal Error: Bridge run failed due to a system exception! Please contact Oracle Support with the stack trace and the details on how to reproduce it.
    java.lang.NullPointerException
    at java.io.Writer.write(Writer.java:126)
    at oracle.cwm.bridge.cwmlite.ImportCommon.trace(ImportCommon.java:51)
    at oracle.cwm.bridge.cwmlite.ImportUtil.traceOut(ImportUtil.java:648)
    at oracle.cwm.bridge.cwmlite.ImportUtil.runPLSQL(ImportUtil.java:628)
    at oracle.cwm.bridge.cwmlite.ImportCube.importCubeAW(ImportCube.java:139)
    at oracle.cwm.bridge.cwmlite.ImportMain.runBridge(ImportMain.java:152)
    at oracle.cwm.tools.bridge.BridgeWrapper.run(BridgeWrapper.java:509)
    at java.lang.Thread.run(Thread.java:534)
    **! Transfer logging stopped at Mon Feb 07 16:49:32 IRST 2005 !**
    What is the problem???
    thanks,
    shima

    Does anyone else have any insight into a 'named pipes error' when attempting to export WSUS metadata from the Windows Internal Database?
    RRS
    Truly, it makes very little sense. Named Pipes is the ONLY way to communicate with a Windows Internal Database. If Named Pipes were not enabled, the WSUS Server would not work.
    From your original post...
    I have enabled the TCP/IP and Named Pipes protocols in SQL. They are selected under surface area configuration as well.
    You're aware that you cannot do this for the Windows Internal Database?
    Which then brings us to a second variation on the previous question -- How many, and what type, of SQL Server instances are installed on this machine?
    Lawrence Garvin, M.S., MCSA, MCITP:EA, MCDBA
    SolarWinds Head Geek
    Microsoft MVP - Software Packaging, Deployment & Servicing (2005-2014)
    My MVP Profile: http://mvp.microsoft.com/en-us/mvp/Lawrence%20R%20Garvin-32101
    http://www.solarwinds.com/gotmicrosoft
    The views expressed on this post are mine and do not necessarily reflect the views of SolarWinds.
