Metadata Load Error in HFM 9.2
All,
When we try to load a metadata file with the "Merge" option, it works fine, but with the "Replace" option it throws the following error message. We have to use the "Replace" option to remove unnecessary elements from the hierarchy. I would really appreciate it if you could help me out with this. Thanks much!
Metadata referential integrity check started at 16:38:48
<refint>
<Module Name="Journals">
<Deleted>
<Dim Name="ENTITYNODES">
<Table Name="LTD_JLTMPENT" Scenario="" Year="">
<Member ParentName="TotLLS" ChildName="LLSAdj" Count="1" Label="14_INTERCO BALANCE" Value="" Period="[None]">
</Member>
</Table>
<Table Name="LTD_JLTMPENT" Scenario="" Year="">
<Member ParentName="TotMast" ChildName="MASTAdj" Count="1" Label="PUSHDOWNS B&O" Value="" Period="[None]">
</Member>
<Member ParentName="TotMast" ChildName="MASTAdj" Count="1" Label="PUSHDOWNS SG&A" Value="" Period="[None]">
</Member>
<Member ParentName="TotMast" ChildName="MASTAdj" Count="2" Label="05_MAST IC CLEAR" Value="" Period="[None]">
</Member>
<Member ParentName="TotMast" ChildName="MASTAdj" Count="1" Label="14_INTERCO BALANCE" Value="" Period="[None]">
</Member>
<Member ParentName="TotMast" ChildName="MASTAdj" Count="2" Label="44_MAST INV TRUE UP" Value="" Period="[None]">
</Member>
</Table>
<Table Name="LTD_JLTMPENT" Scenario="" Year="">
<Member ParentName="TotBA" ChildName="BAAdj" Count="1" Label="PUSHDOWNS B&O" Value="" Period="[None]">
</Member>
<Member ParentName="TotBA" ChildName="BAAdj" Count="1" Label="PUSHDOWNS SG&A" Value="" Period="[None]">
</Member>
<Member ParentName="TotBA" ChildName="BAAdj" Count="1" Label="13_LZ PROFIT ELIM" Value="" Period="[None]">
</Member>
<Member ParentName="TotBA" ChildName="BAAdj" Count="1" Label="14_INTERCO BALANCE" Value="" Period="[None]">
</Member>
</Table>
</Dim>
</Deleted>
<Changed>
</Changed>
</Module>
</refint>
Load ended at: 16:39:04
Elapsed time: 00:00:16
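Since the <refint> block in the load output is plain XML, the journal references that block a Replace load can be listed programmatically instead of eyeballed. A minimal sketch in Python, assuming the log has been saved off and the raw & characters in the labels escaped to &amp; first (HFM writes them unescaped):

```python
import xml.etree.ElementTree as ET

def blocking_members(refint_xml):
    """Return (parent, child, label) tuples for members that journals
    still reference, i.e. the rows a Replace load would delete."""
    root = ET.fromstring(refint_xml)
    return [(m.get("ParentName"), m.get("ChildName"), m.get("Label"))
            for m in root.iter("Member")]

# Small sample modeled on the log above (ampersands escaped for parsing).
sample = """<refint><Module Name="Journals"><Deleted><Dim Name="ENTITYNODES">
<Table Name="LTD_JLTMPENT" Scenario="" Year="">
<Member ParentName="TotMast" ChildName="MASTAdj" Count="1"
        Label="PUSHDOWNS B&amp;O" Value="" Period="[None]"></Member>
<Member ParentName="TotBA" ChildName="BAAdj" Count="1"
        Label="14_INTERCO BALANCE" Value="" Period="[None]"></Member>
</Table></Dim></Deleted><Changed></Changed></Module></refint>"""

for parent, child, label in blocking_members(sample):
    print(parent, child, label)
```

Each tuple points at a journal or journal template that must be deleted or repointed before the Replace load can remove those entities.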
What exactly are you getting rid of in your metadata?
It appears your metadata is causing a conflict with a journal entry and/or journal template. I'd look carefully at the error message and make sure that those elements (i.e. TotLLS, LLSAdj, TotMast, MASTAdj, TotBA, BAAdj) still exist in your metadata.
Similar Messages
-
Hi,
I am trying to load metadata using ODI 11.1.1.7 into Hyperion Planning 11.1.2.4. It errored out saying
Cannot load dimension member, error message is: RemoteException occurred in server thread; nested exception is:
java.rmi.UnmarshalException: unrecognized method hash: method not supported by remote object.
Source - File
Please let me know if there is any fix to this.
Thanks,
Sravan
Did you definitely follow the steps in "How to Apply ODI Patch 18687916 for Hyperion Planning and Errors that May Occur if Patch Has Not Been Applied Correctly (Doc ID 1683307.1)"?
If you are still getting the error after everything in the support doc has been done, then maybe the issue relates to 11.1.2.4.
It is interesting to see that the Oracle statement of direction doc has the following statement:
"The KM's for EPM release 11.1.2.3 and ODI release 11.1.1.7 are not certified with EPM Release 11.1.2.4."
Cheers
John
http://john-goodwin.blogspot.com/ -
MetaData Load error in 11.1.1.3
Hi,
I am trying to load metadata into a 11.1.1.3 Planning application using the outline load utility and encountered the below error -
D:\Hyperion\products\Planning\bin>OutlineLoad /A:UMB01 /U:admin /N /I:d:\ENT_MD.csv /D:Entity
/L:ENT_MD.log /X:ENT_MD.err
Enter password:
[INFO] RegistryLogger - REGISTRY LOG INITIALIZED
[INFO] RegistryLogger - REGISTRY LOG INITIALIZED
D:\Hyperion\common\config\9.5.0.0\product\planning\9.5.0.0\planning_1.xml
displayName = Planning
componentTypes =
priority = 50
version = 9.5.0.0
build = 1
location = D:\Hyperion\products\Planning
taskSequence =
task =
using Java property for Hyperion Home D:\Hyperion
Setting Arbor path to: D:\Hyperion\common\EssbaseRTC\9.5.0.0
d{ISO8601} INFO main com.hyperion.audit.client.runtime.AuditRuntime - Audit Client has been created for the server http://<server>:28080/interop/Audit
[Wed Oct 14 13:45:24 EDT 2009]Unable to obtain dimension information and/or perform a data load : Unrecognized column header value(s), refer to previous messages. (Note: column header values are case sensitive.)
d{ISO8601} INFO pinging com.hyperion.audit.client.cache.AuditConfigFilter - Client Enable Status false
d{ISO8601} INFO Thread-16 com.hyperion.audit.client.cache.AuditConfigFilter - Client Enable Status false
d{ISO8601} INFO filterConfig com.hyperion.audit.client.cache.AuditConfigFilter - Client Enable Status false
In the first row of the Excel sheet (saved as a CSV file), I have the header names, viz.
Entity, Parent, Data Storage, Aggregation in columns A, B, C, D respectively. I don't understand why it throws "Unrecognized column header value(s)".
Please clarify.
Thanks
Hi,
Here is an example of how I loaded one record to the entity dimension
entityload.csv =
Parent,Entity,Alias: Default,Data Storage,Aggregation (Consol)
Entity,Alliance,Joint Venture Alliance,Store,~
command line =
OutlineLoad -f:passfile.txt /A:plansamp /U:admin /I:entityload.csv /D:Entity /L:outlineLoad.log /X:outlineLoad.exc /N
outlineload.log =
[Thu Oct 22 16:45:11 BST 2009]Successfully located and opened input file "E:\Hyperion\products\Planning\bin\entityload.csv".
[Thu Oct 22 16:45:11 BST 2009]Header record fields: Parent, Entity, Alias: Default, Data Storage, Aggregation (Consol)
[Thu Oct 22 16:45:11 BST 2009]Located and using "Entity" dimension for loading data in "PLANSAMP" application.
[Thu Oct 22 16:45:11 BST 2009]Load dimension "Entity" has been unlocked successfully.
[Thu Oct 22 16:45:11 BST 2009]A cube refresh operation will not be performed.
[Thu Oct 22 16:45:11 BST 2009]Create security filters operation will not be performed.
[Thu Oct 22 16:45:11 BST 2009]Examine the Essbase log files for status if Essbase data was loaded.
[Thu Oct 22 16:45:11 BST 2009]Planning Outline load process finished (with no data load as specified (/N)). 1 data record was read, 1 data record was processed, 1 was accepted, 0 were rejected.
Cheers
John
http://john-goodwin.blogspot.com/ -
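Since the column headers are case-sensitive and must match the dimension property names exactly, generating the load file programmatically can rule out header typos. A quick sketch along the lines of John's working example above (the header names are copied from it; this is not an official utility):

```python
import csv
import io

# Header names must match Planning property names exactly (case-sensitive);
# these are the ones from the working example above.
HEADERS = ["Parent", "Entity", "Alias: Default", "Data Storage",
           "Aggregation (Consol)"]

def write_entity_csv(rows):
    """Produce OutlineLoad-ready CSV text for the Entity dimension."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(HEADERS)
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()

text = write_entity_csv(
    [["Entity", "Alliance", "Joint Venture Alliance", "Store", "~"]])
print(text)
```

Write the result to a file and pass it to OutlineLoad with /I: as in the example above.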
Hi,
When I was trying to update metadata using OutlineLoad, it showed 10 members loaded successfully.
But when I checked in Planning, the metadata was not updated.
I checked the log, and it shows that BPMA is enabled and it was unable to update the metadata.
Can you please help me with this?
Thanks in advance.
[Mon Jul 02 17:35:03 ICT 2012]Located and using "BusinessUnit" dimension for loading data in "MFG" application.
java.lang.RuntimeException: Planning Adapter loads may cause issues on BMPA enabled applications if metadata is modified.
at com.hyperion.planning.HyperionPlanningBean.beginLoad(Unknown Source)
at com.hyperion.planning.utils.HspOutlineLoad.parseAndLoadInputFile(Unknown Source)
at com.hyperion.planning.utils.HspOutlineLoad.halAdapterInfoAndLoad(Unknown Source)
at com.hyperion.planning.utils.HspOutlineLoad.loadAndPrintStatus(Unknown Source)
at com.hyperion.planning.utils.HspOutlineLoad.main(Unknown Source)
[Mon Jul 02 17:35:03 ICT 2012]Load dimension "BusinessUnit" has been unlocked successfully.
[Mon Jul 02 17:35:03 ICT 2012]A cube refresh operation will not be performed.
[Mon Jul 02 17:35:03 ICT 2012]Create security filters operation will not be performed.
[Mon Jul 02 17:35:03 ICT 2012]Examine the Essbase log files for status if Essbase data was loaded.
[Mon Jul 02 17:35:03 ICT 2012]Planning Outline data store load process finished.
10 data records were read, 10 data records were processed, 10 were successfully
loaded, 0 were rejected. -
Hi Experts
I have an issue when I load metadata via ODI to HFM. I get this message:
Error code: 0x80040154 [Class not registered
com.hyperion.odi.common.ODIHAppException: Metadata load failed. Error code: 0x80040154 [Class not registered
After some searching on the net, I see that this is due to a patch which has already been installed (ODI 10.1.3.5.5). HFMDriver.dll was renamed to HFMDriver32.dll, and HFMDriver64.dll was renamed to HFMDriver.dll.
Patch 9377717: ORACLE DATA INTEGRATOR 10.1.3.6.0 PATCH
1. I have reversed the HFM application TestApp into target for ODI and everything seems fine in the operator
2. I have created a simple source flat file for the Account dimension
3. I have created an interface to the Account dim. and verified the mapping with no errors
4. The process stops when trying to load the metadata to HFM (step 5 of 7)
5. When I search the log I see the Error code: 0x80040154 [Class not registered]
Does anyone have any idea why the interface does not load the metadata?
Brs
Inge Andre
Edited by: 819836 on Apr 14, 2011 12:49 PM
These instructions, given to us by support, fixed the problem in the first topic.
1. Unregister the adapter by opening the command prompt, changing the path to C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\, and running:
RegSvcs.exe /u C:\Hyperion\products\FinancialDataQuality\SharedComponents\AdapterComponents\fdmAdapterCOMadmin\fdmAdapterCOMadmin.dll
and:
RegSvcs.exe /u C:\Hyperion\products\FinancialDataQuality\SharedComponents\AdapterComponents\fdmFM11xG5C\fdmFM11xG5C.dll
Please verify the correct path to the dll's before proceeding
2. Re-register the adapter using the 64 bit version of RegSvcs, C:\WINDOWS\Microsoft.NET\Framework64\v2.0.50727\RegSvcs.exe <PathToDLL> i.e. ..\Hyperion\products\FinancialDataQuality\SharedComponents\AdapterComponents\fdmFM11xG5C\fdmFM11xG5C.dll
N.B. Do not re-register the ComAdmin.dll because that is not a 64 bit component.
3. Open the FDM Workbench and configure the adapter by right-clicking the adapter -> Configure and re-entering the username and password.
Regards. -
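The crux of the steps above is matching the bitness of RegSvcs to each DLL. A small helper that only builds the command lines (the paths are the ones quoted above; verify them against your own install before running anything):

```python
def regsvcs_cmd(dll_path, use_64bit=True, unregister=False):
    """Build the RegSvcs command line for (un)registering an FDM adapter DLL.
    Paths mirror the support steps above; verify them on your own server."""
    framework = "Framework64" if use_64bit else "Framework"
    regsvcs = rf"C:\WINDOWS\Microsoft.NET\{framework}\v2.0.50727\RegSvcs.exe"
    cmd = [regsvcs]
    if unregister:
        cmd.append("/u")  # /u unregisters; omit it to register
    cmd.append(dll_path)
    return cmd

# 64-bit re-registration of the FM adapter (path from the steps above)
dll = (r"C:\Hyperion\products\FinancialDataQuality\SharedComponents"
       r"\AdapterComponents\fdmFM11xG5C\fdmFM11xG5C.dll")
print(" ".join(regsvcs_cmd(dll)))
```

Per the N.B. above, the ComAdmin DLL is not a 64-bit component, so it would be built with use_64bit=False.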
HFM Metadata Loading On-going Problem
Hi There,
We just migrated to HFM 11.1.1.3 from HFM 4.
We used to have an issue in HFM 4 where almost every time we loaded metadata, the system would hang and become unresponsive. The load screen would just sit there without ever completing. You could still use the system, but the metadata load would never actually finish. The only thing that would resolve it was rebooting all of the HFM application servers.
This happened to us again the other day but now we're on the new HFM 11.1.1.3. Again, the resolution was to reboot the HFM applications. We tried just restarting the services on the various servers but that didn't work. A full reboot was required.
And nothing was wrong with the metadata itself as it quickly loaded without errors into our DEV and QA environments. As well, we kicked all of the users out of the system prior to the metadata load. Most get out immediately and certainly no heavy calculations or consolidations are being performed during the metadata load.
Has anyone else experienced this issue? Should a reboot always precede or accompany a metadata load? Is there any recommendation as to how often you should reboot the Hyperion servers (monthly, quarterly, etc.) as good practice?
Many Thanks,
Mike
We are having a similar issue with version 11.1.2.0.0.
We try to run an application metadata load from the client and we get an error message
Unexpected error: 80005005 occurred.
Running the load from the workspace, we receive the following message.
An error has occurred. Please contact your administrator.
Show Details:
Error Reference Number: {F187C156-ABDA-40DD-A687-B471F35535E3};User Name: mpus54965@gsd1w105c
Num: 0x80004005;Type: 0;DTime: 4/27/2011 1:27:41 PM;Svr: GSD4W023C;File: CHsvMetadataLoadACV.cpp;Line: 347;Ver: 11.1.2.0.0.2762;
This is the second time we have encountered this problem. Oracle support was not able to determine the cause. The fix is to reboot the database server, but we are looking for insight into the problem. Note: our current development environment is a single-server virtual environment connected to a database server. We are planning to move to a more robust test environment with four applications and one database server in a few weeks.
Thanks for your help, -
Hi,
I'm loading data into HFM from a flat file. When the interface is executed, only some of the data get loaded. When I checked for the errors in the log, I found the following error message:
'Line: 56, Error: Invalid cell for Period Apr'
Then I found that it's an invalid intersection in HFM which I am trying to load.
In FDM there is an option to validate invalid intersections during data load.
I would like to know how to do the same in ODI to overcome this kind of error, i.e. is there any option in ODI to ignore it?
Kindly help me.
Thanks in advance
Hi,
I think even if the metadata exists, there might still be some issues with HFM forbidden cells. There are HFM rules that determine which intersections are editable/loadable and which are not. Please check with your HFM admin regarding forbidden rules, or otherwise change the property of the Custom dimensions so that they accept data into all intersections.
Thanks,
Debasis -
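Since ODI itself offers no switch for this, one workaround is to pre-filter the source rows against a list of known-forbidden intersections before the load, mimicking what FDM's validation does. A minimal sketch with made-up field names (the forbidden set would have to be maintained from the HFM admin's rules):

```python
def filter_valid_rows(rows, forbidden):
    """Split source rows into loadable and rejected, based on a set of
    known-forbidden (entity, account, period) intersections."""
    ok, rejected = [], []
    for row in rows:
        key = (row["entity"], row["account"], row["period"])
        (rejected if key in forbidden else ok).append(row)
    return ok, rejected

# Hypothetical data: one forbidden intersection, two source rows.
forbidden = {("E100", "Sales", "Apr")}
rows = [
    {"entity": "E100", "account": "Sales", "period": "Apr", "value": 10},
    {"entity": "E200", "account": "Sales", "period": "Apr", "value": 20},
]
ok, rejected = filter_valid_rows(rows, forbidden)
print(len(ok), len(rejected))
```

The rejected list can then be logged for review instead of failing the whole interface.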
"MDL1223: Metadata Loader cannot be executed ..." error?
Hi folks,
Am getting this strange error from OMBPlus (11Gr2):
MDL1223: Metadata Loader cannot be executed while other user(s) are accessing the workspace.
when trying to import an MDL file, when using this statement:
catch {OMBIMPORT MDL_FILE '$projname.mdl'\
USE UPDATE_MODE\
CONTROL_FILE 'control.txt'\
OUTPUT LOG '$projname.log'} result
where the CONTROL_FILE (control.txt) simply has:
IGNOREUOID=Y
CONFIGPARAM=N
SINGLEUSER=Y
Am thinking that SINGLEUSER setting may have something to do with it? Have got an SR open with Oracle on it but no luck there so far - just wondering if anyone else has come across something like this...? Certainly seems like a new 11G-ish/workspace-y kind of thing. Am really curious to know how to determine what users are currently accessing a workspace?
Thanks,
Jim
Hi Jim,
It would be the SINGLEUSER tag. In 11.1 and 11.2 you should connect using the USE SINGLE_USER_MODE option and remove the SINGLEUSER setting from the param file.
Regards,
John -
Automate DRM Metadata load into HFM
Is there a way to automate a DRM metadata load into HFM? In both instances, we are using the latest Fusion versions and I wanted to know if there is a way to automate loading the flat file produced by DRM into HFM. How would this be done? Does EPMA versus classic HFM setup play a role? If there is no automation, I imagine this would require an admin to manually export from DRM and manually import into HFM.
Thanks in advance
Any thoughts on this guys? I'd like to open the question up a bit.
How is DRM metadata used to source Hyperion applications? I see there is an XML output of metadata. There is some talk about generating an .ads file. What are the other ways? SQL database views?
I'm using a classic Planning application. Appreciate any thoughts.
-Chris -
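In the absence of a packaged DRM-to-HFM connector, the usual pattern is a scheduled script that picks up the DRM export file and hands it to whatever load mechanism is available. A bare-bones sketch of the file-pickup half (the directory, file pattern, and load tool are all placeholders, not real product names):

```python
from pathlib import Path

def new_exports(watch_dir, pattern, seen):
    """Single scan: return DRM export files that have not been handed
    to the load step yet. 'seen' persists between scans."""
    found = sorted(p for p in Path(watch_dir).glob(pattern)
                   if p.name not in seen)
    seen.update(p.name for p in found)
    return found

# A scheduler (Windows Task Scheduler, cron) would run a scan like this
# periodically and pass each new file to the load step, e.g.:
#
#   for f in new_exports(r"D:\drm\exports", "entities_*.txt", seen):
#       subprocess.run([load_tool, "/file:" + str(f)], check=True)
#
# where 'load_tool' stands in for an EPMA batch client script or a
# custom HFM API wrapper -- hypothetical here.
```

Whether EPMA or classic is in play mainly changes what that load step is; the pickup logic stays the same.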
Error in loading data to HFM via ODI
Hi,
I extracted a slice of HFM data (input level) with an interface (extract HFM to SQL) successfully. However, when I tried to load the same extracted data (no modification) back to HFM with another interface (load SQL to HFM), it failed. We're using ODI 11g (11.1.1.6.0); HFM is 11.1.2.0.0.
The following is the error message; can anyone help identify what is wrong?
Thanks
org.apache.bsf.BSFException: exception from Jython:
Traceback (most recent call last):
File "<string>", line 3, in <module>
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:265)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: 1
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:619)
Caused by: Traceback (most recent call last):
File "<string>", line 3, in <module>
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:265)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: 1
at org.python.core.PyException.fillInStackTrace(PyException.java:70)
at java.lang.Throwable.<init>(Throwable.java:181)
at java.lang.Exception.<init>(Exception.java:29)
at java.lang.RuntimeException.<init>(RuntimeException.java:32)
at org.python.core.PyException.<init>(PyException.java:46)
at org.python.core.PyException.<init>(PyException.java:43)
at org.python.core.Py.JavaError(Py.java:455)
at org.python.core.Py.JavaError(Py.java:448)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)
at org.python.core.PyObject.__call__(PyObject.java:355)
at org.python.core.PyMethod.__call__(PyMethod.java:215)
at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
at org.python.core.PyMethod.__call__(PyMethod.java:206)
at org.python.core.PyObject.__call__(PyObject.java:397)
at org.python.core.PyObject.__call__(PyObject.java:401)
at org.python.pycode._pyx15.f$0(<string>:6)
at org.python.pycode._pyx15.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java:165)
at org.python.core.PyCode.call(PyCode.java:18)
at org.python.core.Py.runCode(Py.java:1204)
at org.python.core.Py.exec(Py.java:1248)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
... 19 more
Caused by: com.hyperion.odi.common.ODIHAppException: 1
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:265)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
... 33 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
at com.hyperion.odi.hfm.ODIHFMDataLoader$OptionsDataLoad.createConsolidateOptions(ODIHFMDataLoader.java:118)
at com.hyperion.odi.hfm.ODIHFMDataLoader$OptionsDataLoad.<init>(ODIHFMDataLoader.java:49)
at com.hyperion.odi.hfm.ODIHFMDataLoader$OptionsDataLoad.validate(ODIHFMDataLoader.java:72)
at com.hyperion.odi.hfm.ODIHFMDataLoader.validateOptions(ODIHFMDataLoader.java:294)
at com.hyperion.odi.hfm.ODIHFMAppStatement.validateLoadOptions(ODIHFMAppStatement.java:152)
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:202)
... 38 more
Hi,
Try to check the connection and the name of the source system.
Use SM59 to do this, or right-click on the source system and choose "check".
Hope it helps,
Bhaskar -
Hi,
I am trying to transfer my metadata with Oracle Warehouse Builder Transfer.
I traced it and received the following log:
**! Transfer logging started at Mon Feb 07 16:47:45 IRST 2005 !**
**! Client is 127.0.0.1 !**
**! Server is 127.0.0.1 !**
OWB Bridge processed arguments
Bridge Parameters are:
<BA>=<KAFA_COLLECTION2>
<LANGUAGE>=<All Languages>
<LOCKREPOSITORY>=<True>
<file>=<C:\TEMP\bridges\null-nullMy_Metadata_Transfer1107782265830.XMI>
Using attribute sorting
export to file
Default local= fa_IR
Exporting project:KAFA_PROJECT2
initializing project:KAFA_PROJECT2
Initializing module :KAFA_DW_MODULE
exportSchema:KAFA_DW_MODULE SCHM14314
Exporting cube:EMPLOYEMENT_CUBE
exportCube:EMPLOYEMENT_CUBE/EMPLOYEMENT_CUBE Cube14396
exportCubeDimUse:CDU14344FK14400
exportFactLevelUse:FLU14396FK14400
exportCubeDimUse:CDU14359FK14403
exportFactLevelUse:FLU14396FK14403
exportCubeDimUse:CDU14379FK14397
exportFactLevelUse:FLU14396FK14397
exportClassificationEntry for:KAFA_COLLECTION2 CLE_0
exportMeasure:SALARY/SALARY MEA38108
Exporting cube:HOKM_CUBE
exportCube:HOKM_CUBE/HOKM_CUBE Cube38100
exportCubeDimUse:CDU14344FK38101
exportFactLevelUse:FLU38100FK38101
exportCubeDimUse:CDU14359FK38104
exportFactLevelUse:FLU38100FK38104
exportClassificationEntry for:KAFA_COLLECTION2 CLE_2
exportMeasure:ESTEKHDAMDATE/ESTEKHDAMDATE MEA38109
exportMeasure:SALARY/SALARY MEA38110
Exporting dimension:DEP_DIM
exportDimension:DEP_DIM/DEP_DIM DIM14359
exportClassificationEntry for:KAFA_COLLECTION2 CLE_3
exportLevel:DEP_MAIN/DEP_MAIN LEV14360
exportClassificationEntry for:Description CLE_4
exportLevelAttribute:DEP_ID/DEP_ID LATR14363
exportLevelAttribute:DEP_NAME/DEP_NAME LATR14367
exportKey:DEP_MAIN_UK/DEP_MAIN_UK KEY14361
exportKeyAttributeUse:KEYAU14363KEY14361
exportHierarchy:HIE_DEP/HIE_DEP HEIR14374
exportHierarchyLevelUse:HLU14374LEV14360
Exporting dimension:EMP_DIM
exportDimension:EMP_DIM/EMP_DIM DIM14344
exportClassificationEntry for:KAFA_COLLECTION2 CLE_6
exportLevel:EMP_MAIN/EMP_MAIN LEV14345
exportClassificationEntry for:Description CLE_7
exportLevelAttribute:EMP_ID/EMP_ID LATR14348
exportLevelAttribute:EMP_NAME/EMP_NAME LATR14352
exportKey:EMP_MAIN_UK/EMP_MAIN_UK KEY14346
exportKeyAttributeUse:KEYAU14348KEY14346
exportHierarchy:HIE_EMP/HIE_EMP HEIR14354
exportHierarchyLevelUse:HLU14354LEV14345
Exporting dimension:JOB_DIM
exportDimension:JOB_DIM/JOB_DIM DIM14379
exportClassificationEntry for:KAFA_COLLECTION2 CLE_8
exportLevel:JOB_MAIN/JOB_MAIN LEV14380
exportClassificationEntry for:Description CLE_9
exportLevelAttribute:JOB_ID/JOB_ID LATR14383
exportLevelAttribute:JOB_NAME/JOB_NAME LATR14389
exportKey:JOB_MAIN_UK/JOB_MAIN_UK KEY14381
exportKeyAttributeUse:KEYAU14383KEY14381
exportHierarchy:HIE_JOB/HIE_JOB HEIR14391
exportHierarchyLevelUse:HLU14391LEV14380
Exporting mappings
exportDimensionTableMap:DTMAP14359
exportItemMap:IMAP14364
exportItemUse:TIU14363
exportItemUse:SIU14364
exportItemMap:IMAP14368
exportItemUse:TIU14367
exportItemUse:SIU14368
exportDimensionEntityUse:EU_DIM14359
exportDimensionTableUse:EU_TAB14359
exportDimensionTableMap:DTMAP14344
exportItemMap:IMAP14349
exportItemUse:TIU14348
exportItemUse:SIU14349
exportItemMap:IMAP14353
exportItemUse:TIU14352
exportItemUse:SIU14353
exportDimensionEntityUse:EU_DIM14344
exportDimensionTableUse:EU_TAB14344
exportDimensionTableMap:DTMAP14379
exportItemMap:IMAP14384
exportItemUse:TIU14383
exportItemUse:SIU14384
exportItemMap:IMAP14390
exportItemUse:TIU14389
exportItemUse:SIU14390
exportDimensionEntityUse:EU_DIM14379
exportDimensionTableUse:EU_TAB14379
exportFactTableMap:FTMAP14396
exportMapDependency:MAPDEP14396DTMAP14344
exportMapDependency:MAPDEP14396DTMAP14359
exportMapDependency:MAPDEP14396DTMAP14379
exportItemMap:IMAP38108
exportItemUse:TIU38108
exportItemUse:SIU38108
exportCubeEntityUse:EU_Cube14396
exportFactUse:EU_TAB14396
exportFactLevelGroup:FLG_TAB14396
exportFactTableMap:FTMAP38100
exportMapDependency:MAPDEP38100DTMAP14344
exportMapDependency:MAPDEP38100DTMAP14359
exportItemMap:IMAP38109
exportItemUse:TIU38109
exportItemUse:SIU38109
exportItemMap:IMAP38110
exportItemUse:TIU38110
exportItemUse:SIU38110
exportCubeEntityUse:EU_Cube38100
exportFactUse:EU_TAB38100
exportFactLevelGroup:FLG_TAB38100
Exporting table:DEP_DIM
exportTable:DEP_DIM/DEP_DIM TAB14359
exportKey:DEP_DIM_DEP_MAIN_UK/DEP_DIM_DEP_MAIN_UK KEY14362
exportKeyAttributeUse:KEYAU14364KEY14362
exportColumn:DEP_MAIN DEP_ID/DEP_MAIN_DEP_ID COL14364
exportColumn:DEP_MAIN_DEP_NAME/DEP_MAIN_DEP_NAME COL14368
exportClassificationEntry for:KAFA_COLLECTION2 CLE_10
Exporting table:EMP_DIM
exportTable:EMP_DIM/EMP_DIM TAB14344
exportKey:EMP_DIM_EMP_MAIN_UK/EMP_DIM_EMP_MAIN_UK KEY14347
exportKeyAttributeUse:KEYAU14349KEY14347
exportColumn:EMP_MAIN EMP_ID/EMP_MAIN_EMP_ID COL14349
exportColumn:EMP_MAIN_EMP_NAME/EMP_MAIN_EMP_NAME COL14353
exportClassificationEntry for:KAFA_COLLECTION2 CLE_11
Exporting table:JOB_DIM
exportTable:JOB_DIM/JOB_DIM TAB14379
exportKey:JOB_DIM_JOB_MAIN_UK/JOB_DIM_JOB_MAIN_UK KEY14382
exportKeyAttributeUse:KEYAU14384KEY14382
exportColumn:JOB_MAIN JOB_ID/JOB_MAIN_JOB_ID COL14384
exportColumn:JOB_MAIN_JOB_NAME/JOB_MAIN_JOB_NAME COL14390
exportClassificationEntry for:KAFA_COLLECTION2 CLE_12
Exporting table:EMPLOYEMENT_CUBE
exportTable:EMPLOYEMENT_CUBE/EMPLOYEMENT_CUBE TAB14396
exportKey:SEG_UK_EMPLOYEMENT_CUBE/SEG_UK_EMPLOYEMENT_CUBE KEY14406
exportKeyAttributeUse:KEYAU14401KEY14406
exportKeyAttributeUse:KEYAU14404KEY14406
exportKeyAttributeUse:KEYAU14398KEY14406
exportForeignKey:FK_EMPLOYEMENT_CUBE_14362/FK_EMPLOYEMENT_CUBE_14362 FK14403
exportKeyAttributeUse:KEYAU14404FK14403
exportForeignKey:FK_EMPLOYEMENT_CUBE_14347/FK_EMPLOYEMENT_CUBE_14347 FK14400
exportKeyAttributeUse:KEYAU14401FK14400
exportForeignKey:FK_EMPLOYEMENT_CUBE_14382/FK_EMPLOYEMENT_CUBE_14382 FK14397
exportKeyAttributeUse:KEYAU14398FK14397
exportColumn:DEP_MAIN_DEP_ID/DEP_MAIN_DEP_ID COL14404
exportColumn:EMP_MAIN_EMP_ID/EMP_MAIN_EMP_ID COL14401
exportColumn:JOB_MAIN_JOB_ID/JOB_MAIN_JOB_ID COL14398
exportColumn:SALARY/SALARY COL38108
exportClassificationEntry for:KAFA_COLLECTION2 CLE_13
Exporting table:HOKM_CUBE
exportTable:HOKM_CUBE/HOKM_CUBE TAB38100
exportForeignKey:FK_HOKM_CUBE_14362/FK_HOKM_CUBE_14362 FK38104
exportKeyAttributeUse:KEYAU38105FK38104
exportForeignKey:FK_HOKM_CUBE_14347/FK_HOKM_CUBE_14347 FK38101
exportKeyAttributeUse:KEYAU38102FK38101
exportColumn:DEP_MAIN_DEP_ID/DEP_MAIN_DEP_ID COL38105
exportColumn:EMP_MAIN_EMP_ID/EMP_MAIN_EMP_ID COL38102
exportColumn:ESTEKHDAMDATE/ESTEKHDAMDATE COL38109
exportColumn:SALARY/SALARY COL38110
exportClassificationEntry for:KAFA_COLLECTION2 CLE_14
exportContext:CTXT00
exportTypeSet:TSET
Exporting datatypes
exportScalarDatatype:NUMBER DTYP7836
exportScalarDatatype:VARCHAR2 DTYP7839
exportClassification:KAFA_COLLECTION2 CLAS1
exportClassificationType:Warehouse Builder Business Area CLT15
exportClassification:Description CLAS5
exportClassificationType:Dimensional Attribute Descriptor CLT16
Exporting project KAFA_PROJECT2 complete.
**! Target bridge jvm parameters = "..\..\..\jdk\jre\bin\javaw" -mx50m -DORCLCWM_META_MODEL_FILE=..\..\bridges\admin\orcl_cwm.xml -classpath .;..\..\bridges\lib\bridge_cwmlite10.jar;..\..\bridges\lib\bridge_parser10.jar;..\..\bridges\lib\vek10.jar;..\..\..\lib\xmlparserv2.jar;..\..\..\jdbc\lib\ojdbc14.jar;..\..\bridges\lib\util10.jar;..\..\lib\int\util.jar;..\..\lib\int\rts.jar;..\..\..\jlib\rts2.jar;..\..\bridges\lib\bridge_cwmlite10.jar;..\..\bridges\lib\bridge_parser10.jar;..\..\bridges\lib\vek10.jar;..\..\..\lib\xmlparserv2.jar;..\..\..\jdbc\lib\ojdbc14.jar;..\..\bridges\lib\util10.jar;..\..\lib\int\util.jar;..\..\lib\int\rts.jar;..\..\..\jlib\rts2.jar;"..\..\..\jdk\jre\lib\rt.jar;..\..\..\jdk\jre\lib\charsets.jar";"..\..\bridges\admin;..\..\bridges\lib\bridge_wrapper10.jar;..\..\bridges\lib\util10.jar;..\..\lib\int\reposimpl.jar;..\..\lib\int\repossdk.jar;..\..\lib\int\rts.jar;..\..\..\jlib\rts2.jar;..\..\lib\int\util.jar" oracle.cwm.tools.bridge.BridgeWrapper -bridge_name oracle.cwm.bridge.cwmlite.ImportMain !**
**! Target bridge parameters = -log_level 2 -log_file C:\TEMP\bridges\log\null-nullMy_Metadata_Transfer1107782265830.log -olapimp.deploytoaw Y -olapimp.awname dw_targetschema -olapimp.awobjprefix dw_ts -olapimp.loadcubedata Y -olapimp.awaggregate 1 -olapimp.dimuniquekeys Y -olapimp.awuser -olapimp.createviews Y -olapimp.viewprefix -olapimp.viewaccesstype OLAP -olapimp.creatematviews Y -olapimp.viewscriptdir -olapimp.deploy Y -olapimp.username dw_targetschema -olapimp.password password -olapimp.host 200.20.20.11 -olapimp.port 1521 -olapimp.sid ora10g -olapimp.inputfilename C:\TEMP\bridges\null-nullMy_Metadata_Transfer1107782265830.XMI -olapimp.outputfilename C:\Documents and Settings\stabatabaee\Desktop\test.sql -paramfile C:\TEMP\bridges\1107782266112.par !**
setting parameter: olapimp.deploytoaw = Y
setting parameter: olapimp.awname = dw_targetschema
setting parameter: olapimp.awobjprefix = dw_ts
setting parameter: olapimp.loadcubedata = Y
setting parameter: olapimp.awaggregate = 1
setting parameter: olapimp.dimuniquekeys = Y
setting parameter: olapimp.awuser =
setting parameter: olapimp.createviews = Y
setting parameter: olapimp.viewprefix =
setting parameter: olapimp.viewaccesstype = OLAP
setting parameter: olapimp.creatematviews = Y
setting parameter: olapimp.viewscriptdir =
setting parameter: olapimp.deploy = Y
setting parameter: olapimp.username = dw_targetschema
setting parameter: olapimp.host = 200.20.20.11
setting parameter: olapimp.port = 1521
setting parameter: olapimp.sid = ora10g
setting parameter: olapimp.inputfilename = C:\TEMP\bridges\null-nullMy_Metadata_Transfer1107782265830.XMI
setting parameter: olapimp.outputfilename = C:\Documents and Settings\stabatabaee\Desktop\test.sql
Loading Metadata
Loading XMI input file
connecting ...
processing dim: DEP_DIM
processing level: DEP_MAIN in dimension DEP_DIM
processing level attribute use: DEP_MAIN_DEP_ID in level DEP_MAIN for level attribute DEP_ID
processing level attribute : DEP_ID in level DEP_MAIN
processing level attribute use: DEP_MAIN_DEP_NAME in level DEP_MAIN for level attribute DEP_NAME
processing level attribute : DEP_NAME in level DEP_MAIN
declare
id number;
s_id number;
l_id number;
levelday_id number;
levelmonth_id number;
levelquarter_id number;
levelyear_id number;
f_id number;
begin
select descriptor_id into l_id from all_olap_descriptors where descriptor_value='Long Description';
select descriptor_id into s_id from all_olap_descriptors where descriptor_value='Short Description';
select descriptor_id into levelday_id from all_olap_descriptors where descriptor_value='Day';
select descriptor_id into levelmonth_id from all_olap_descriptors where descriptor_value='Month';
select descriptor_id into levelquarter_id from all_olap_descriptors where descriptor_value='Quarter';
select descriptor_id into levelyear_id from all_olap_descriptors where descriptor_value='Year';
cwm_olap_dimension.Set_Description (USER, 'DEP_DIM', '');
cwm_olap_dimension.Set_Display_Name(USER, 'DEP_DIM', 'DEP_DIM');
cwm_olap_dimension.Set_Plural_Name(USER, 'DEP_DIM', 'DEP_DIM');
cwm_olap_level.Set_Description (USER, 'DEP_DIM', 'DEP_MAIN', '');
cwm_olap_level.Set_Display_Name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_MAIN');
begin
cwm_olap_level_attribute.set_name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_MAIN_DEP_ID', 'DEP_ID');
exception when OTHERS then null;
end;
begin
cwm_olap_level_attribute.Set_Display_Name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_ID', 'DEP_ID');
exception when OTHERS then null;
end;
begin
cwm_olap_level_attribute.Set_description(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_ID', '');
exception when OTHERS then null;
end;
begin
cwm_olap_level_attribute.set_name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_MAIN_DEP_NAME', 'DEP_NAME');
exception when OTHERS then null;
end;
begin
cwm_olap_level_attribute.Set_Display_Name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_NAME', 'DEP_NAME');
exception when OTHERS then null;
end;
begin
cwm_olap_level_attribute.Set_description(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_NAME', '');
exception when OTHERS then null;
end;
begin
cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'DEP_DIM', 'Short_Description');
exception when others then null;
end;
begin
cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'DEP_DIM', 'Short_Description', 'Short_Description', 'Short Description');
select descriptor_id into id from all_olap_descriptors where descriptor_value='Short Description';
cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'DEP_DIM', 'Short_Description');
exception when others then null;
end;
begin
cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'DEP_DIM', 'Long_Description');
exception when others then null;
end;
begin
cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'DEP_DIM', 'Long_Description', 'Long_Description', 'Full Description');
select descriptor_id into id from all_olap_descriptors where descriptor_value='Long Description';
cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'DEP_DIM', 'Long_Description');
exception when others then null;
end;
begin
cwm_olap_dim_attribute.Add_Level_Attribute(USER, 'DEP_DIM', 'Short_Description', 'DEP_MAIN', 'DEP_NAME');
exception when cwm_exceptions.attribute_already_exists then null; when cwm_exceptions.attribute_not_found then null;
end;
begin
cwm_olap_dim_attribute.Add_Level_Attribute(USER, 'DEP_DIM', 'Long_Description', 'DEP_MAIN', 'DEP_NAME');
exception when cwm_exceptions.attribute_already_exists then null; when cwm_exceptions.attribute_not_found then null;
end;
commit;
exception when others then dbms_output.enable(1000000); cwm_utility.dump_error(); raise program_error;
end;
processing dim: EMP_DIM
processing level: EMP_MAIN in dimension EMP_DIM
processing level attribute use: EMP_MAIN_EMP_ID in level EMP_MAIN for level attribute EMP_ID
processing level attribute : EMP_ID in level EMP_MAIN
processing level attribute use: EMP_MAIN_EMP_NAME in level EMP_MAIN for level attribute EMP_NAME
processing level attribute : EMP_NAME in level EMP_MAIN
declare
id number;
s_id number;
l_id number;
levelday_id number;
levelmonth_id number;
levelquarter_id number;
levelyear_id number;
f_id number;
begin
select descriptor_id into l_id from all_olap_descriptors where descriptor_value='Long Description';
select descriptor_id into s_id from all_olap_descriptors where descriptor_value='Short Description';
select descriptor_id into levelday_id from all_olap_descriptors where descriptor_value='Day';
select descriptor_id into levelmonth_id from all_olap_descriptors where descriptor_value='Month';
select descriptor_id into levelquarter_id from all_olap_descriptors where descriptor_value='Quarter';
select descriptor_id into levelyear_id from all_olap_descriptors where descriptor_value='Year';
cwm_olap_dimension.Set_Description (USER, 'EMP_DIM', '');
cwm_olap_dimension.Set_Display_Name(USER, 'EMP_DIM', 'EMP_DIM');
cwm_olap_dimension.Set_Plural_Name(USER, 'EMP_DIM', 'EMP_DIM');
cwm_olap_level.Set_Description (USER, 'EMP_DIM', 'EMP_MAIN', '');
cwm_olap_level.Set_Display_Name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_MAIN');
begin
cwm_olap_level_attribute.set_name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_MAIN_EMP_ID', 'EMP_ID');
exception when OTHERS then null;
end;
begin
cwm_olap_level_attribute.Set_Display_Name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_ID', 'EMP_ID');
exception when OTHERS then null;
end;
begin
cwm_olap_level_attribute.Set_description(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_ID', '');
exception when OTHERS then null;
end;
begin
cwm_olap_level_attribute.set_name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_MAIN_EMP_NAME', 'EMP_NAME');
exception when OTHERS then null;
end;
begin
cwm_olap_level_attribute.Set_Display_Name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_NAME', 'EMP_NAME');
exception when OTHERS then null;
end;
begin
cwm_olap_level_attribute.Set_description(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_NAME', '');
exception when OTHERS then null;
end;
begin
cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'EMP_DIM', 'Short_Description');
exception when others then null;
end;
begin
cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'EMP_DIM', 'Short_Description', 'Short_Description', 'Short Description');
select descriptor_id into id from all_olap_descriptors where descriptor_value='Short Description';
cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'EMP_DIM', 'Short_Description');
exception when others then null;
end;
begin
cwm_olap_dim_attribute.d
BRD-02501: Internal Error: Bridge run failed due to a system exception! Please contact Oracle Support with the stack trace and the details on how to reproduce it.
java.lang.NullPointerException
at java.io.Writer.write(Writer.java:126)
at oracle.cwm.bridge.cwmlite.ImportCommon.trace(ImportCommon.java:51)
at oracle.cwm.bridge.cwmlite.ImportUtil.traceOut(ImportUtil.java:648)
at oracle.cwm.bridge.cwmlite.ImportUtil.runPLSQL(ImportUtil.java:628)
at oracle.cwm.bridge.cwmlite.ImportCube.importCubeAW(ImportCube.java:139)
at oracle.cwm.bridge.cwmlite.ImportMain.runBridge(ImportMain.java:152)
at oracle.cwm.tools.bridge.BridgeWrapper.run(BridgeWrapper.java:509)
at java.lang.Thread.run(Thread.java:534)
**! Transfer logging stopped at Mon Feb 07 16:49:32 IRST 2005 !**
What is the problem???
thanks,
shima
-
Does anyone else have any insight into a 'named pipes error' when attempting to export WSUS metadata from the Windows Internal Database?
RRS
Truly, it makes very little sense. Named Pipes is the ONLY way to communicate with a Windows Internal Database. If Named Pipes were not enabled, the WSUS Server would not work.
From your original post...
I have enabled the TCP/IP and Named Pipes protocols in SQL. They are selected under surface area configuration as well.
You're aware that you cannot do this for the Windows Internal Database?
Which then brings us to a second variation on the previous question -- How many, and what type, of SQL Server instances are installed on this machine?
Lawrence Garvin, M.S., MCSA, MCITP:EA, MCDBA
SolarWinds Head Geek
Microsoft MVP - Software Packaging, Deployment & Servicing (2005-2014)
My MVP Profile: http://mvp.microsoft.com/en-us/mvp/Lawrence%20R%20Garvin-32101
http://www.solarwinds.com/gotmicrosoft
The views expressed on this post are mine and do not necessarily reflect the views of SolarWinds. -
Metadata Loads (.app) - What is best practice?
Dear All,
Our metadata scan and load takes approximately 20 minutes (a full load using the replace option). The business HFM admin has suggested partial dimension loads in an effort to speed up the loading process.
The HFM system admins prefer metadata loads with the replace option, as there seems to be less associated risk.
With partial loads there appears to be a risk to cross-dimension integrity checking: changes are merged, potentially duplicating members when they are moved in the hierarchy.
Are there any other risks with partial loads?
Which approach is considered best practice?
When we add new entities to our structure and load them with the merge option, they always appear at the bottom of the structure. But when we use the replace option they appear in the order that we want. For us, and for user-friendliness, we always use the replace option. And for us the metadata load usually takes at least 35 minutes. Last time it took 1:15...
-
Greetings,
I'm curious to know how some of the users in this forum perform automated metadata loads against dimensions in the Shared Library (and subsequently against any EPMA Planning apps). We have several apps whose dimensions are shared among various Planning apps; these are updated manually whenever a data load fails because of missing members. Monthly manual updates have been workable due to the relatively small number of missing members.
However, we are now building an app with a dimension that is quite dynamic (a lot of new members are added and a few are deleted). It would be insane to update the Shared Library manually for this, so I'm looking for suggestions on how to automate it via a batch file or any other means, including using "Create Member Properties".
Any suggestions or ideas would be greatly welcomed...
Many thanks.
cg
CG,
An .err file is generated only when there is no proper name for a member, no proper alias name, and so on, while loading data.
These checks can also be scripted in a programming language such as Java, but you need to know all the possible errors, and your program should be able to find out where the load went wrong so that it can rebuild the file if any new member or child is missing from the data load.
On Unix you can use sed to find and replace any of the possible outcomes.
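That sed-style find-and-replace can be sketched in Python as well. Everything below is a hypothetical illustration, not part of any Hyperion tooling: the error-log text, the load-file text, and the mapping of rejected names to corrected ones are all made-up examples.

```python
def fix_rejected_members(err_text: str, load_text: str, fixes: dict) -> str:
    """Substitute corrected member names into a load file.

    `fixes` maps a member name rejected in the .err log to its corrected
    spelling; only names that actually appear in the error log are
    replaced, so unrelated members are left untouched.
    """
    for bad, good in fixes.items():
        if bad in err_text:
            load_text = load_text.replace(bad, good)
    return load_text
```

The equivalent one-off fix on Unix would be `sed 's/OldName/NewName/g' loadfile`, as suggested above; the Python version just lets you drive the replacements from what the .err file actually rejected.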
It happened to me some time back: I used Java to replace some members that kept getting rejected while loading data. But that was specific to my case; in your case you have to know all the possible outcomes ... -
Metadata load running for 24 hours and still not finished
Hi all,
I have a problem with the metadata load on HFM 9.3.1.3
It seems it has been running for about 24 hours.
It has happened before that the log was not produced by the HFM client, but I have always found in the log that the metadata load finished successfully.
This time I have no clue whether the load has completed.
Some time ago I performed the delete invalid records task, which completed successfully.
Can someone tell me which drivers make the metadata load process last so long?
Regards, Andrea Zannetti.
Changes to the Entity or Currency dimensions can take far longer than changes to the Account, Custom, or Scenario dimensions. The reason lies primarily with the calc status tables: changes to Entity or Currency require HFM to make changes, including inserts, to all of the CSE/CSN tables. Changes made to the Accounts or Customs do not require updates to the CSE/CSN tables.
Another common place for very long metadata loads lies with journals. The "Check integrity" box for metadata loads tells HFM to read through each and every journal and journal template in the entire application, checking to see whether even a single journal or template could become invalid if it allowed the metadata to be loaded. NEVER uncheck this, since once metadata has invalidated even one journal, you can never load metadata with the check in place afterward.
Another possibility is that metadata loads have to wait for all currently running data loads and consolidations to complete before the metadata load can start. Once it starts, subsequent data loads and consolidations have to wait for the metadata load to complete before they can start. This is to preserve data integrity.
One last place to look for long metadata load times is an application whose database is performing poorly, possibly because of missing, broken, or out-of-date indexes.
--Chris -
Hi,
I am a newbie to Hyperion. I am trying to load metadata into a classic application for a particular dimension.
I get the error "Unrecognized column header value".
I removed all spaces in the csv file, but the error is still the same.
Please give me a clue on how I should proceed with my metadata load.
Thanks
Swami
Hi John,
The log file says the following information.
Unrecognized column header value "Cost Analysis".
Unrecognized column header value "Alias_Default".
Unrecognized column header value "Data_Storage"
Unrecognized column header value "Aggregation_(OPEX)".
My header in the csv is as below.
"Cost_Analysis", "Alias_Default", "Data_Storage" and so on..
So how do I correct it?
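One thing worth checking: with a header like `"Cost_Analysis", "Alias_Default"`, the space after each comma can make a strict parser treat the space and quotes as part of the header value, so the name never matches. A small Python sketch of that cleanup; the header names come from the post above, but the normalization rules (strip surrounding spaces, turn blanks inside names into underscores) are assumptions, not documented Planning behavior:

```python
import csv
import io

def normalize_header(csv_text: str) -> str:
    """Re-emit a load file with a cleaned header row: spaces after
    commas are dropped, and blanks inside header names become
    underscores (e.g. "Cost Analysis" -> "Cost_Analysis")."""
    rows = list(csv.reader(io.StringIO(csv_text), skipinitialspace=True))
    rows[0] = [name.strip().replace(" ", "_") for name in rows[0]]
    buf = io.StringIO()
    csv.writer(buf, lineterminator="\n").writerows(rows)
    return buf.getvalue()
```

For example, `normalize_header('"Cost Analysis", "Alias_Default"\n100,opex\n')` yields a file whose first line is `Cost_Analysis,Alias_Default`, with the data rows left as they were.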