FDM Multi-load error in system 11

Hi All,
I'm getting an error while loading data into HFM through FDM in System 11. We just migrated from 9.2, and this worked fine in the old system. As the output files below show, FDM writes "PER" instead of "PERIODIC" in the System 11 file. For the [MISSING_VALUE] entries, I think we'll have to add "* to [NONE]" maps in the other dimensions, but I'm clueless about the "PER" value. Can you share your thoughts, please? Thanks in advance!
FDM generated the file in system 9.2:
!Data
!Period=February...January
!Column_Order = Scenario,Year,View,Entity,Value,Account,ICP,Custom1,Custom2,Custom3,Custom4
GUIDANCEQ3;2010;PERIODIC;BAWOS;<Entity Currency>;999010;[ICP None];SALESCOGSPCS;0050;[None];[None];;9440;;;;;;;;;;
FDM generated the file in system 11:
!Data
!Period=February...January
!Column_Order = Scenario,Year,View,Entity,Value,Account,ICP,Custom1,Custom2,Custom3,Custom4
GUIDANCEQ4;2010;PER;BAWOS;<Entity Currency>;999010;[MISSING_VALUE];SALESCOGSPCS;0050;[MISSING_VALUE];[MISSING_VALUE];;9440;;;;;;;;;;
I'm using the below header in the data file for multi-load:
WOS
GuidanceQ4
02/28/2010
12
R,R,0,,,,Y,N,,PERIODIC
c,ud1,ud2,A,v,v,v,v,v,v,v,v,v,v,v,v
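For reference, the multi-load header is positional: location, category, start period, number of periods, an options line, and a column-order line. Below is a minimal sketch of how those six lines break down (Python used purely for illustration; the meaning of the individual options fields beyond the trailing view value is assumed here):

```python
def parse_multiload_header(lines):
    """Split the six fixed-position FDM multi-load header lines
    into named parts (a simplified sketch, not a full validator)."""
    return {
        "location": lines[0].strip(),
        "category": lines[1].strip(),
        "start_period": lines[2].strip(),
        "period_count": int(lines[3].strip()),
        "options": lines[4].strip().split(","),
        "columns": lines[5].strip().split(","),
    }

header = parse_multiload_header([
    "WOS",
    "GuidanceQ4",
    "02/28/2010",
    "12",
    "R,R,0,,,,Y,N,,PERIODIC",
    "c,ud1,ud2,A,v,v,v,v,v,v,v,v,v,v,v,v",
])
# The last options field carries the view that FDM should write to
# the output file; here it is "PERIODIC", not the truncated "PER".
```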

There are KM articles that address these two issues, and several forum posts cover them as well. Did you search the forum or support.oracle.com?
https://support.oracle.com/CSP/main/article?cmd=show&type=NOT&doctype=PROBLEM&id=840319.1
https://support.oracle.com/CSP/main/article?cmd=show&type=NOT&doctype=PROBLEM&id=1055134.1

Similar Messages

  • FDM Multi-load integration script?

    Hi,
    Can we use Multi-Load with an integration import script, or can it only be used with flat files?
    Thanks in advance


  • FDM Locations Lock Error in System 11

    All,
    We've just migrated our 9.2 system to 11.1.1.3.00. When we try to lock all locations (Current Category, Current Period), it always throws an error saying "Multiple-step operation generated errors. Check each status value". We're using the admin ID, which has "Administrator" access, and "grant access to all locations" is checked.
    This worked fine in System 9.2. We have a total of 60 locations in FDM. We've raised a ticket with Oracle, but they are taking too much time to respond. If any of you have experienced this issue before, please let me know.
    Thanks in advance!!

    This is more than likely an Oracle Client issue. Which Oracle client is installed, and which database version is being used? What is your CURSOR_SHARING database parameter set to? Have you had the DBA run a trace while this process was performed? This is going to be more complicated than a forum discussion can resolve; I would continue to work through the SR process.

  • FDM Batch Loader Error

    Hello,
    I'm getting an error when trying to run a script from outside the FDM Workbench. The script works fine inside Workbench, and it's almost entirely copied out of the documentation.
    Script
    Sub autoload()
        Dim lngProcessLevel
        Dim strDelimiter
        Dim blnAutoMapCorrect
        Dim EssbaseFilePath
        Dim BATCHENG
        Set BATCHENG = CreateObject("upsWBatchLoaderDM.clsBatchLoader")
        BATCHENG.mInitialize API, SCRIPTENG
        lngProcessLevel = 10 'up to export
        strDelimiter = "_"
        blnAutoMapCorrect = False
        Set BATCHENG.PcolFiles = BATCHENG.fFileCollectionCreate(CStr(strDelimiter))
        'single processing
        BATCHENG.mFileCollectionProcess BATCHENG.PcolFiles, CLng(lngProcessLevel), , CBool(blnAutoMapCorrect)
    End Sub
    Batch
    C:\Hyperion\UpStream\WebLinkDataMart\SharedComponents\upsShell.exe CustomScriptEx=HFM2Ess2~<user>~<password>~<loadbalancer>~"e:\log"~autoload~LanguageCode=1033~EncodeUnicode=0
    Error
    ** Begin FDM Runtime Error Log Entry [7/26/2011 1:01:24 PM] **
    ERROR:
    Code............ 438
    Description..... Object doesn't support this property or method
    Procedure....... clsBlockProcessor.ActExport
    Component....... upsWBlockProcessorDM
    Version......... 920
    Thread.......... 38140
    IDENTIFICATION:
    User............<ADMIN>
    Computer Name... <SERVER>
    App Name........ <APP>
    Client App...... BatchEngine
    ** Begin FDM Runtime Error Log Entry [7/26/2011 1:01:24 PM] **
    ERROR:
    Code............ 438
    Description..... File [a_Essbase_FDMActual_Apr-2011_RA.dat] Caused - Object doesn't support this property or method
    Procedure....... clsBatchLoader.mFileCollectionProcess
    Component....... upsWBatchLoaderDM
    Version......... 920
    Thread.......... 33432
    My goal is just to create a simple automated process that grabs files from the openbatches directory, but I'm stuck on why this runs fine in Workbench yet throws errors from outside, using the syntax provided in the API/Admin guide...
    Edited by: user13048127 on Jul 26, 2011 12:58 PM

    You're right. It does appear to be failing at the export step even when stepping through the process manually from the web interface.
    It is generating a different error when doing that though:
    ** Begin Runtime Error Log Entry [7/27/2011 9:16:56 AM] **
    ERROR:
    Code............ 2500
    Description..... [EsbPutObject] Threw Code=
    Procedure....... clsHPDataManipulation.fCopyToServer
    Component....... upsEB7XG
    Version......... 100
    Thread.......... 35588
    ** Begin FDM Runtime Error Log Entry [7/27/2011 9:16:57 AM] **
    ERROR:
    Code............ 4100
    Description..... Stream Failed, Invalid file path provided!
    \\<server>\fdmdata\HFM2Ess2\HFM2Ess2\Data\33.ua2
    Procedure....... clsAppServer.fFileGetStream
    Component....... upsAppSv
    Version......... 920
    Thread.......... 38304
    In this situation the application is looking for a 33.ua2 file, but there is only a 33.ua1 file in the directory. This was reproducible, with different numbers for the files (e.g. 26.ua1 is present but it was looking for 26.ua2).
    I'm a little confused as to why the application would look for a file name different from the one it generated.
    Could this be related to permissions or access?

  • Open Batch Multi Load file type error

    Hi,
    I have been trying to use Open Batch Multi-Load via FDM Workbench, but encountered an error saying the .csv file type is unknown at the Import stage.
    A couple of things that I have tried:
    - Using multi-load via the FDM Web Interface to load the .csv file: Success
    - Using open batch multi-load via Workbench to load the .csv file: Failed
    - Renaming the .csv file to .txt (without changing the content) and using open batch multi-load via Workbench to load the .txt file: Success
    It seems that when I execute open batch multi-load, FDM is able to read the CSV format but refuses to process the .csv file type.
    Am I missing something for using open batch multi-load to load a .csv file?
    Thanks,
    Erico
    *[File Content]*
    LOC1
    Budget
    1/31/2008
    2
    R,M
    Center,Description,ACCouNT,UD1,UD3,UD4,DV,v,v,v,UD2
    Sunnyvale,Description,Sales,GolfBalls,,,Periodic,1001,2001,3000,Customer2
    Sunnyvale,Description,Purchases,GolfBalls,,,Periodic,2001,3001,4000,Customer2
    Sunnyvale,Description,OtherCosts,GolfBalls,,,Periodic,4001,5001,5000,Customer2
    *[Error Message Received]*
    Invalid object Node=ML40942.3981712963_P1?MULTILOADFILE.,CSV_UNKNOWN FILE TYPE IN MULTILOAD BATCH FOR FILE [MULTILOADFILE.CSV]! - 35610
    *[FDM Version]*
    FDM 11.1.2.1

    Kemp2 wrote:
    Hi Erico,
    What is the fix for this issue? I am having same issue.
    Thx
    Kemp

    Hi Kemp,
    I didn't get a fix for this issue. Since we decided not to use Open Batch Multi-Load (not because of this issue), I stopped researching it.
    But here are a couple of workarounds that you might want to try:
    - Simply save the source file with a .txt file type
    - Since open batch uses a script, change the file type from .csv to .txt with the script before executing the Multi-Load script
    Hope this helps.
    -Regards
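The second workaround could be scripted just before the multi-load step runs, for example (a rough sketch; the real directory would be your FDM application's open-batch folder, which is not shown here):

```python
import os
import tempfile

def rename_csv_to_txt(directory):
    """Rename every .csv file in the directory to .txt so the
    open-batch multi-load accepts it."""
    renamed = []
    for name in sorted(os.listdir(directory)):
        if name.lower().endswith(".csv"):
            new_name = name[:-4] + ".txt"
            os.rename(os.path.join(directory, name),
                      os.path.join(directory, new_name))
            renamed.append(new_name)
    return renamed

# Demo in a scratch directory; in practice this would be the FDM
# open-batch folder (path depends on your application's Inbox).
demo_dir = tempfile.mkdtemp()
open(os.path.join(demo_dir, "multiloadfile.csv"), "w").close()
renamed = rename_csv_to_txt(demo_dir)
```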

  • FDM 11.1.2.1 taking too much time to Export (Multi-load)

    Hi...
    There is something very odd here...
    I'm using FDM 11.1.2.1 to multi-load a file (300 MB).
    The steps: Import (about 5 min), Validate (about 5 min). I have some explicit maps (about 2,300).
    Export is my problem: it's taking more than 1 hour.
    I made a test using an Essbase rule to load this data with SQL, joining on the FDM mapping table TDATAMAP (because of the maps). That way, it takes about 3 minutes to load everything.
    The question is: why does the export take so much time through the FDM web client?
    Target: Planning App
    Thanks in advance
    -Cheers,
    Rafael

    Yes, you are exactly right. That is the correct order for the outline. Sorry, it's been a long time since I've used Essbase and my memory isn't what it used to be. Still, what keeps data load times to a minimum is to have the sparse fields in sort order so that the same data blocks don't have to be pulled into memory multiple times. The sparse fields should be sorted and in the same order as the outline. The best way to ensure that this is occurring is to have the sparse fields first, dense fields last.
    With dense dimensions sorted first, you can see how the sparse fields can get out of sort order, causing blocks to be revisited:
    D1A -> D2A -> S1A -> S2A -> S3A -> Data
    D1A -> D2A -> S1A -> S2A -> S3B -> Data
    D1B -> D2A -> S1A -> S2A -> S3A -> Data
    The third record forces Essbase to revisit the data block that was already loaded by the first record.
    On the other hand, if sparse fields are first and sorted left to right, the first two records are loaded at the same time to the same block:
    S1A -> S2A -> S3A -> D1A -> D2A -> Data
    S1A -> S2A -> S3A -> D1B -> D2A -> Data
    S1A -> S2A -> S3B -> D1A -> D2A -> Data
    Sorry for the confusion. Hope this clarifies field sort order vs. outline order.
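The block-revisit effect described above can be illustrated with a toy counter (a sketch only; actual Essbase block handling is more involved than this):

```python
# Each record: (sparse1, sparse2, sparse3, dense1, dense2, value).
# The three sparse members together identify the data block.
records = [
    ("S1A", "S2A", "S3A", "D1A", "D2A", 100),
    ("S1A", "S2A", "S3B", "D1A", "D2A", 200),
    ("S1A", "S2A", "S3A", "D1B", "D2A", 300),
]

def block_switches(recs):
    """Count how often the sparse combination changes between
    consecutive records, i.e. how often a (possibly already
    visited) block must be brought into memory."""
    switches, prev = 0, None
    for rec in recs:
        block = rec[:3]  # sparse members identify the block
        if block != prev:
            switches += 1
            prev = block
    return switches

unsorted_switches = block_switches(records)        # block A, B, then A again
sorted_switches = block_switches(sorted(records))  # each block visited once
```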

  • Specified Driver could not be loaded due to system error 5

    Hello,
    I am trying to run Oracle 9i on my laptop. I installed the package from the website and got everything installed correctly. I checked my sqlnet.ora and tnsnames.ora files, and the test connection in the ODBC driver works fine.
    However, as I try to connect to the database through my ASP application, I get the error:
    Code: 80004005
    Specified Driver could not be loaded due to system error 5
    I have been searching these forums for an answer, and the closest solution I found was that something may be wrong with my PATH variable. I checked it and everything looks fine; it points to my /bin directory.
    I am not sure whether something is wrong with my driver (though I installed the latest one available) or with something else. I would appreciate any help anyone can offer.
    Thanks
    samir.

    The user that IIS runs as doesn't have appropriate access to the Oracle directory or to ODBC. There's a good, recent Usenet thread on this
    http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&oe=UTF-8&threadm=3d119df8%240%24234%24ed9e5944%40reading.news.pipex.net&rnum=2&prev=/groups%3Fq%3DASP%2Bsystem%2Berror%2B5%2Bgroup:comp.databases.oracle.*%26hl%3Den%26lr%3D%26ie%3DUTF-8%26oe%3DUTF-8%26scoring%3Dd%26selm%3D3d119df8%25240%2524234%2524ed9e5944%2540reading.news.pipex.net%26rnum%3D2
    I'm assuming that you've verified that you can, say, create a new DSN with the Oracle driver and successfully test it with the "test connection" button.
    Justin

  • How can I load data with scripts in FDM to an HFM target system?

    Hi all!
    I need help because I can't find a good guide about scripting in FDM. My problem is the following:
    I plan to load my data with a data load file in FDM to an HFM target system, but I would also like to load additional data using an event script, i.e. after Validate. I would need some way to access the HFM system through FDM scripts. Is that possible?
    If so, it would be wonderful to get data from HFM for any point of view, reachable from FDM scripts, in order to load it or work with it.
    I've been looking for a good guide about scripting in FDM, but I couldn't find any information about accessing data in the HFM target system. Does such a guide really exist?
    Thanks for help

    Hi,
    Take a look at the LOAD Action scripts of your adapter. This might give you an idea.
    Theoretically it should be possible to load data in an additional load, but you need to be very careful: you don't want to corrupt any of the log and status information that is stored during the load process. The audit trail is an important feature in many implementations, so in this context it might not be a good idea to improve automation at the risk of your system's compliance.
    Regards,
    Matt

  • Load error in FDM

    Has anyone ever gotten a data load error message when performing the Export step in FDM? Once I load my data, I get a two-line error file that says the data load started and the data load completed, yet it is flagged as an error file. My file loads, but I cannot get a gold fish next to the Check step.

    Hi friends,
    I have a solution for my environment (11.1.1.3). Try this:
    1) Open user preferences (workspace / file / preferences)
    2) Select "Consolidation" and select target application
    3) Go to the bottom of this dialog/form and UNCHECK "Save all files in Unicode format" option.
    4) Try export again.
    Vladislav

  • Specified driver could not be loaded due to system error 127: The specified procedure could not be found. (...SQORA32.DLL).

    I recently upgraded from 10g to 11g and upon this upgrade, a website of mine stopped working.
    It is a classic ASP website.  I added a "testdatabase" page to the website to test the basic functionality of opening a connection; I've pasted the testdatabase page below.  Anyway, upon running this page, I get this error:
    Specified driver could not be loaded due to system error 127: The specified procedure could not be found. (Oracle in OraClient11g_home1_32bit, D:\app\oracleadm\product\11.2.0\client_1\BIN\SQORA32.DLL).
    I have two other environments where I have done the same upgrade and the site works just fine.  I have compared all the IIS settings in my other environments to that of the one suffering from the error.  Also, in the IIS settings, I have made sure that "Allow 32bit" was checked.
    D:\app\oracleadm\product\11.2.0\client_1\BIN\SQORA32.DLL is actually where this DLL resides. 
    I have also checked to make sure the environment variables are correct.
    The very first environment variable in the PATH setting is D:\app\oracleadm\product\11.2.0\client_1\bin; and there are no other references to oracle in the PATH environment variable.
    I have done a Google search, and people also tell me to check ORACLE_HOME to make sure it's correct.  I've checked this in both of the working environments and there is nothing there.  There is also nothing there in the broken environment.  What should be there, if anything?
    I don't know where else to look.  What could be causing this error?
    <%
    Option Explicit
    Response.Expires = 0
    Response.Buffer  = true
    Dim conn
    Err.Clear
    On Error Resume Next
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open "DSN=RPS11;UID=FFXQM;PWD=ffxqm6prd$"
    If Err.Number <> 0 Then
        Response.Write (Err.Description & "<br><br>")
        Response.End
        conn.Close
        Set conn = Nothing
    End If
    conn.Close
    Set conn = Nothing
    On Error GoTo 0
    %>

    As a temporary solution, add the Administrators group to the IUSR and IWAM user IDs and see if that works. If it solves the problem, then it IS a matter of setting the user privileges correctly.

  • Error message for loading process "DD8x": Operating system error!

    Using the OLE command I receive the following error:
    Cannot load the file "File Name" with the loading process "DD8x". Error message for loading process:
    Operating system error <> !
    strCommand ="DATAFILEIMPORT('J:\ctf\GM\040119-2\dat7.dat','',0)"
    lsuccess = IDIACommand.CmdExecuteSync(strCommand)
    I get the same error with the "DataLoadHdFile" command.
    After the error happens, I cannot load any files in the directory until I reboot DIAdem.

    Hi punkmonkey,
    I'm curious why you compare the results of the "DataLoadHdFile" command with those of the "DataFileImport" command. The former is a time-honored DAT-file-specific command that reads only the header file contents, NOT the data, and the latter is a brand-new DIAdem 9.0 function for loading all sorts of data files, TDM files included.
    If you want to load a DAT file using an OLE command, why don't you try the time-honored "DataLoad" command, which is specifically designed for loading DAT files? The "DataLoadHdFile" command is really only used to do selective loading (specific channels or rows) of part of a DAT file. Do you want to load selectively, or do you want to load all the channels from the DAT file?
    When you get the same error with
    the "DataLoadHdFile" command, are you executing that command also through OLE automation, or directly in DIAdem? What happens if you try both the "DataLoadHdFile" and the "DataFileImport" commands directly from DIAdem (no OLE)?
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments

  • Specified driver could not be loaded due to system error 126

    I have some problems connecting to Oracle through ODBC. It's an ODBC error, but I cannot find anything on the web to help me, so I hope someone has already encountered this problem and can answer.
    So I created a new Oracle database (SID=orcl) on one of our database servers, and a user KRISS with DBA rights. It is Oracle 9i. When I tried to create a new ODBC data source (System DSN):
    Data Source Name : orcl
    TNS Service name : orcl
    User ID : kriss.
    When I click on Test Connection, I get the following error message:
    Unable to connect
    SQLState=IM003
    Specified driver could not be loaded due to system error 126 ( Oracle in OraHome90)
    The configuration of my system is :
    Windows 2000 Server SP3 5.00.2195
    Oracle 9i
    Any help would be great, thanks in advance,
    Kriss

    I found the answer in another part of this forum. Thank you, Justin, for:
    "Note that Microsoft doesn't recommend using the Microsoft ODBC driver with an Oracle9 client; they suggest sticking with 8.1.7."
    Now that I use the 8.1.7 ODBC driver, I can connect to my Oracle 9i db!

  • Hadoop ODBC linked server error: Specified driver could not be loaded due to system error 182

    Hi,
    I am trying to connect to a Hadoop server through a linked server in SQL Server. I downloaded and installed the Microsoft Hive ODBC driver (HiveODBC64.msi), but while trying to create the linked server using this ODBC driver, I get the error below.
    "Specified driver could not be loaded due to system error 182 (Microsoft Hive ODBC driver, D:\....\HiveODBC64.dll). (Microsoft SQL Server, Error: 7303)"
    When I tested it locally and on another server, it worked properly. I am getting this error only on this server. It looks like a DLL error, but I cannot identify where or what the error is. Could someone help me fix this issue?
    Server info:
    OS Name: Microsoft Windows Server 2008 R2 Enterprise Version 6.1.7601 Service Pack 1 Build 7601
    SQL SERVER: Microsoft SQL Server 2008 R2(SP1)

    Hi LinkedServer,
    According to your description, you want to create a linked server in SQL Server to connect to a Hadoop server. You created it successfully on the test server, and the problem is that the issue occurs in the production environment, right?
    Based on the error message, the issue seems to be related to the HiveODBC64.dll assembly. Since it works on the test server, you can copy HiveODBC64.dll from there to the production server's C:\Program Files\Microsoft Hive ODBC Driver\lib directory, or you can reinstall the Microsoft Hive ODBC driver. Here is a blog that describes, step by step, how to create a SQL Server linked server to HDInsight Hive using the Microsoft Hive ODBC driver; please refer to the link below to double-check that your steps are correct.
    http://blogs.msdn.com/b/igorpag/archive/2013/11/12/how-to-create-a-sql-server-linked-server-to-hdinsight-hive-using-microsoft-hive-odbc-driver.aspx
    Regards,
    Charlie Liao
    TechNet Community Support

  • I have an alert message in Disk Utility. Error: storage system verify or repair failed. Problems were found with the partition map which might prevent booting.

    While running Verify Disk in Disk Utility on the 251 GB SSD (SM256C Media), I got this message: "Alert: system verify or repair failed." In the description box of the history it said problems were found with the partition map which might prevent booting, followed by a message in red reading "Error: Storage system verify or repair failed." At the time, I was downloading RAW pictures from my camera through a card reader, across my computer, to a relatively new 2 TB Western Digital portable hard drive; after about 13 to 15 pictures downloaded, that drive failed. My MacBook Air works fine, but I now get this message every time I run Verify Disk on this disk. The Macintosh HD checks out fine.


  • Data load from Legacy system to BW Server through BAPI

    Requirements: We have different kinds of legacy systems and an SAP BW server. We want to load all legacy system data into the SAP BW server using BAPIs. Before loading, we have to validate all data. If there is bad or missing data, we have to let the legacy system user/operator know to fix the data in their system, with a detailed explanation. When it is fixed, we have to load the data again.
    Load scenario: We have two options to load data from the legacy systems to the BW server.
    1. Load data directly from the legacy system to the BW server using a BAPI program.
    2. The legacy systems' data would be on workstations or a flash drive as a .txt (one line, comma-separated) or .csv file. Load from the .txt/.csv file to the BW server using a BAPI program.
    What do we want in the BAPI program code?
    It will read/retrieve data from the text/CSV file and put it into an internal table. The internal table structure would be based on the BAPI InfoObject structure.
    Call the BAPI InfoObject function module 'BAPI_IOBJ_CREATE' to create the InfoObject, include all necessary/default components, do the error check, load the data, and return the status.
    Could someone help me with sample code, please? I am new to ABAP/BAPI coding.
    Is there any better way to load data from a legacy system to the BW server? BTW, we are using BW 3.10. Is there a better option with BI 7.0 to resolve the issue? I appreciate your help.
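As an aside, the pre-load validation step described above can be sketched generically (Python here purely for illustration, since the target environment is ABAP; the required field names are hypothetical):

```python
import csv
import io

# Hypothetical required fields; a real load would take these from
# the target InfoObject structure.
REQUIRED = ["entity", "account", "amount"]

def validate_rows(text):
    """Return (good_rows, errors). Each error tells the legacy
    system operator which line to fix and what is missing."""
    good, errors = [], []
    reader = csv.DictReader(io.StringIO(text))
    for lineno, row in enumerate(reader, start=2):  # header is line 1
        missing = [f for f in REQUIRED if not (row.get(f) or "").strip()]
        if missing:
            errors.append("line %d: missing %s" % (lineno, ", ".join(missing)))
        else:
            good.append(row)
    return good, errors

sample = "entity,account,amount\nE100,4000,125.50\nE200,,90.00\n"
good, errors = validate_rows(sample)
```

Only the rows that pass would be loaded; the error list goes back to the operator, and the load is repeated once the source is fixed.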

    My answers:
    1. This is a scenario for a data push into SAP BW. You can only use SOAP-based transfer of data.
    http://help.sap.com/saphelp_nw04/helpdata/en/fd/8012403dbedd5fe10000000a155106/frameset.htm
    (here for BW 3.5, but you'll find something similar for 7.0)
    In this scenario you'll have an RFC created dynamically for every InfoSource you need to transfer data to.
    2. You can make a process chain for each data load, and call the RFC "RSPC_API_CHAIN_START" to start the chain externally.
    The second solution is simpler and available on every release.
    Regards,
    Sergio
