FDM data target system load error

Hi,
I am trying to load data from FDM 11.1.1.3 to Planning 11.1.1.3.
I am using the Essbase adapter ES11X-G4-E. I imported data from a CSV file; the import succeeded and the export file was also created successfully.
My question: after this, I get an error in another window, Target System Load, with the message Error: Invalid use of null.
Can anybody tell me what the reasons for this error could be?
Regards,
SV

One of the values on the Hyperion Essbase Integration Setup tab of the adapter is currently empty. Perform the following:
a) Right-click the Essbase adapter in the Workbench and choose Configure.
b) Click the Hyperion Essbase Integration Setup tab.
c) Click the Load/Check button.
d) Verify that the List1, List2, and List3 boxes are not empty. They must have values:
List1: 0 -Load
List2: opt1
List3: opt1

Similar Messages

  • Master Data Flat File Loading Error

    Hi guys.
    I received the following error when trying to preview a master data flat file:
      109       ENDIF.                                                           
      110     ENDIF.
    May I know how to solve this?
    Thanks!

    Fulham,
    Please tell us the nature of the data you are loading, and also try to simulate the same load and see whether it goes through.
    Which InfoObjects are you loading for the master data?
    Arun

  • PSA data change not loaded to the target.

    Hi all,
    We have data in the PSA (it is master data). While transferring the data from the PSA to the target, the system throws an error. On analysing the error, we found that some characters in the data are wrong; these were corrected in the PSA. On reloading to the target after the correction, the system ignores the corrected records and loads only the other records.
    What could be the reason for this behaviour?
    Can the experts help?
    Regards,
    M.M

    Dear Mr.Goud/Murali,
    Thank you for your immediate reply and instructions.
    Mr. Goud,
    The PSA QM status is green, but even then the corrected record is not loaded to the target.
    Mr. Murali,
    As per your information, we do not have an error DTP; I have only one DTP and we are on BI 7.00. I have deleted the request from the target, corrected the erroneous record, and am trying to load only that record using the filter option in the DTP, but the data is still not loaded.
    Can you suggest a method to adopt? I am not able to understand why the corrected record is not loaded. Is it that a manually corrected record in the PSA will not be loaded into the target?
    Kindly clarify.
    Regards,
    M.M

  • How can I activate the transfer rules for the ODS updating a data target?

    We are on BW 3.5 and I'm loading data into the 0FIGL_O10 ODS and then uploading the data into the cube 0FIGL_C10. The data loads just fine to the ODS, but when I try to 'update the data target' I get a date & time stamp error on the InfoPackage transfer rules.
    I then replicate the DataSource 80FIGL_O01.
    I must then 'activate' the transfer rules.
    However, I cannot get the transfer rules for 80FIGL_O10 into change mode to activate them.
    How can I activate the transfer rules for the ODS updating a data target?
    The error text is as follows:
    DataSource 80FIGL_O10 has to be replicated (time stamp, see long text)
    Message no. R3016
    Diagnosis
    DataSource 80FIGL_O10 does not have the same status as the source system in the Business Information Warehouse.
    The time stamp in the source system is 02/15/2007 10:42:33.
    The time stamp in the BW system is 11/07/2006 13:11:54.
    System response
    The load process has been terminated.
    Procedure
    Copy the DataSource again and then activate the transfer rules that belong to it. You have to activate the transfer rules in every case, even if they are still active after the DataSource has been copied.
    Thanks for your assistance.
    Denny

    Hi Dennis,
    Try using Business Content to activate your data source.
    Hope this will help you.
    How to activate Business Content:
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a66d5e07211d2acb80000e829fbfe/frameset.htm

  • Missing Data Target in Infopackage for Update ODS Data in Data Target Cube

    Hello & Best Wishes for the New Year to all of you,
    I have 3 ODS objects (1 on full update and 2 with delta updates). All 3 ODS objects update data to a single cube. In my development system this works correctly: data loads from PSA to ODS to cube.
    Now I transported this to my QA and production systems. In QA and production I am able to load data up to all 3 ODS objects and ACTIVATE the data in all of them.
    When I try "Update ODS Data in Data Target" to load data from ODS to cube in QA/PD, the system-created InfoPackage (ODS to CUBE) does not get the data target details (Initial Upload / Full Upload). The Data Target tab is blank (the cube details were expected).
    I have tried to transport again after deleting the update rules.
    Can you suggest what the problem could be?
    regards - Rajesh Sarin

    Thanks Dinesh,
    I have the ODS-to-CUBE update rules ACTIVE in the QA and PD systems. Still the problem exists only in QA and PD. In DV the ODS-to-CUBE data target is available in the InfoPackage and loads the data correctly to the cube.
    Listing all the trials I have done:
    1) Originally transported with all the collected objects. In spite of having active update rules, the data target cube was missing in QA and PD.
    2) After this problem, I again transported only the update rules through the Transport Connection; still the problem did not get solved in QA and PD.
    3) Again, I sent a transport to delete the update rules, which deleted the ODS-to-CUBE update rules in (DV), QA, and PD. After that I sent another request to CREATE the ODS-to-CUBE update rules in (DV), QA, and PD. Still the data target is missing, in spite of having active update rules in QA and PD.
    In DV the ODS-to-CUBE data target is available and loads the data correctly to the cube, even now.
    4) I have also tried "Generate Export Data Source" for the 3 ODS objects in QA and PD. Still it does not help.
    Can you please suggest?
    regards - Rajesh Sarin

  • BW 3.5 - Update to PSA only / Read PSA and Update Data Target

    Hi Folks,
    I am planning to use, on BW 3.5, a data source and InfoPackage that load to PSA only (as the source system job runs quite long and we would run other preparation loads in parallel before pushing to the final cube), and then, later in the process chain, use the process Read PSA and Update Data Target to load the data to the final cube.
    As we plan to run the process chain three times a day, I was wondering whether Read PSA and Update Data Target always takes only the latest request loaded to the PSA (as it is always the same InfoPackage), or whether we have to delete the PSA request at the end of each process chain run so that the next run does not load the same PSA request again while it is still in the PSA.
    Thanks for all replies in advance,
    Axel

    You are loading the delta to the PSA multiple times a day, and after each update to the PSA you send the data to the cube.
    Once it is in the PSA, you perform further tasks and then update the cube multiple times (once after each delta).
    Well, the first load to the cube should be OK.
    But from the next load onwards it would pick up all the requests from the PSA, and you will have wrong values in the cube.
    Because, I think, while using 'Read PSA & Update Data Target', you can't set it for a FULL or a DELTA load.
    It will bring in everything that exists in the PSA.
    I would advise you to clean the PSA before starting the next delta.
    But there is a counter-argument:
    If the data mart status is set in the PSA for the request that has been loaded into the cube, then the next time the system will pick up only the requests which do not have a data mart symbol.
    You can always give this a try in your Dev system.
    Edited by: Vishal Sanghvi on Apr 1, 2011 1:58 PM

  • How can I load data with scripts in FDM to an HFM target system?

    Hi all!
    I need help because I can't find a good guide about scripting in FDM. My problem is the following.
    I plan to load my data with a data load file in FDM to an HFM target system, but I would also like to load additional data using an event script, e.g. after Validate. I would need some way to access the HFM system through FDM scripts; is that possible?
    If so, it would be wonderful to get data from HFM for any point of view, reachable from FDM scripts, in order to load or retrieve data.
    I've been looking for a good guide about scripting in FDM but couldn't find any information about accessing data in the HFM target system. Does such a guide really exist?
    Thanks for help

    Hi,
    Take a look at the LOAD Action scripts of your adapter. This might give you an idea.
    Theoretically it should be possible to load data in an additional load, but you need to be very careful. You don't want to corrupt any of the log and status information that is stored during the load process. The audit trail is an important feature in many implementations, so in this context it might not be a good idea to improve automation at the risk of the compliance of your system.
    Regards,
    Matt
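    As a rough, hedged illustration of the pattern Matt describes (not documented FDM behaviour): the object and method names below are taken from the batch loader scripts further down this page, while the initialization call, the event used, and the POV values are assumptions that need to be verified against your own adapter's action scripts.
    'Illustrative fragment for the body of an event script (e.g. AftValidate) that
    'drives the HFM target adapter's block processor from FDM. It does not load
    'data itself; it only shows how such an object might be created and called.
    Dim BLOCKPROC
    Set BLOCKPROC = CreateObject("upsWBlockProcessorDM.clsBlockProcessor")
    'Assumption: the block processor needs the API/SCRIPTENG context, just as the
    'batch loader object is initialized in the standard batch scripts.
    BLOCKPROC.mInitialize API, SCRIPTENG
    'Assumption: ActConsolidate(location, category, start period, end period)
    'asks the HFM target to consolidate the given POV slice (placeholder values).
    BLOCKPROC.ActConsolidate "MYLOCATION", "ACTUAL", "Jan - 2011", "Jan - 2011"
    As Matt notes, test anything like this in a sandbox application first so the audit trail and load status information are not compromised.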

  • Error "cannot load request real time data targets" for new cube in BI 7.

    Hi All,
    We have recently upgraded our SCM system from 4.1 to SCM 7.0, which incorporates BI 7.0.
    I am using BI 7.0 for the first time and have the following issue:
    I created a new InfoCube and a flat-file DataSource and successfully created the transformation and Data Transfer Process. Everything looked fine. I added the flat file, checked the preview, and could see data. Now, when I start the job to load data into the InfoCube, the following error is shown: "cannot load request real time data targets".
    I checked the cube type in the InfoCube settings and it shows as Standard. When I double-clicked the error, the following message showed up:
    You are trying to load data into a real-time InfoCube using a DTP.
    This is only possible if the correct load settings have been defined for the InfoCube.
    Procedure
    In the object tree of the Data Warehousing Workbench, call Load Behavior of Real-Time InfoCube from the context menu of the InfoCube. Switch load behavior to Transactional InfoCube can be loaded; planning not allowed.
    I do not understand what is meant or how to make the change. Can someone advise and walk me through it?
    Thanks
    KV

    Hi Kverma,
    Real-time InfoCubes can be filled with data using two different methods: using the transaction for entering planning data, or using BI staging; planning data cannot be loaded at the same time. For a real-time cube you can select which update method you want to use:
    Real-Time Data Target Can Be Loaded With Data; Planning Not Allowed, or
    Real-Time Data Target Can Be Planned; Data Loading Not Allowed.
    You can change this behaviour by right-clicking the cube, selecting Change Real-Time Load Behaviour, and choosing the first option. You will then be able to load the data.
    Regards,
    Kams

  • Error while loading the data from PSA to Data Target

    Hi to all,
    I'm facing an error while loading the data to the data target.
    Error :  Record 1 :Value 'Kuldeep Puri Milan Joshi ' (hex. '004B0075006C0064006500650070002000500075007200690
    Details:
    Requests (messages): Everything OK
    Extraction (messages): Everything OK
    Transfer (IDocs and TRFC): Errors occurred
          Request IDoc : Application document posted
          Info IDoc 2 : Application document posted
          Info IDoc 1 : Application document posted
          Info IDoc 4 : Application document posted
          Info IDoc 3 : Application document posted
          Data Package 1 : arrived in BW ; Processing : Data records for package 1 selected in PSA - 1 er
    Processing (data packet): Errors occurred
          Update PSA ( 2462  Records posted ) : No errors
          Transfer Rules ( 2462  -> 2462  Records ) : No errors
          Update rules ( 2462  -> 2462  Records ) : No errors
          Update ( 0 new / 0 changed ) : Errors occurred
          Processing end : Errors occurred
    I'm totally new to this issue. Please help me solve this error.
    Regards,
    Saran

    Hi,
    I think you are facing an invalid-character issue.
    This issue can be resolved by correcting the error records in the PSA and updating them into the target. The first step should be to identify whether all the records are in the PSA. You can find this out by checking the Details tab in RSMO, the job log, or the PSA (sorting records based on status). Once it is confirmed, force the request to red and delete the particular request from the target cube. Then go to the PSA and edit the incorrect records (correcting or blanking out the invalid entries in the particular field/InfoObject for the incorrect record) and save. Once all the incorrect records are edited, go to RSA1 > PSA, find the particular request, and update it to the target manually (right-click the PSA request > Start update immediately).
    I will add the step-by-step procedure to edit PSA data and update it into the target (request based).
    In your case the error message says Error: Record 1: Value 'Kuldeep Puri Milan Joshi '. You just need to convert this to capital letters in the PSA and reload.
    Edit the field to KULDEEP PURI MILAN JOSHI in the PSA and push it to the target.
    Identifying incorrect records:
    The system won't show all the incorrect records the first time. You need to search the PSA table manually to find all of them.
    1. First check RSMO > Details > expand the update rules / processing tabs and you will find some of the error records.
    2. Then go to the PSA and filter using the status of the records. Filter all the red requests. This may still not show all the incorrect records.
    3. Then go to the PSA and filter the incorrect records based on the particular field.
    4. If this also does not work, go to the PSA and sort (not filter) the records based on the particular field with the incorrect values and it will show all the records. Note down the record numbers and then edit them one by one.
    If you want to confirm, find the PSA table and search it manually.
    Also run the report RS_ERRORLOG_EXAMPLE. With this report you can display all the incorrect records of the data, and you can also find out whether the error occurred in the PSA or in the transfer rules.
    Steps to resolve this:
    1. Force the request to red in RSMO > Status tab.
    2. Delete the request from the target.
    3. Go to RSMO > at the top right you can see the PSA maintenance button > click it and go to the PSA.
    4. Edit the record.
    5. Save the PSA data.
    6. Go to RSA1 > search by request name > right-click > update the request from the PSA to the target.
    Refer to How to Modify PSA Data:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40890eda-1b99-2a10-2d8b-a18b9108fc38
    This should solve your problem for now.
    As a long-term solution you can apply a user exit on the source system side, change your update rules to ensure that this field is blanked out before it is loaded into the cube, or add that particular character to the permitted character list in BW:
    RSKC --> type ALL_CAPITAL --> F8 (Execute)
    OR
    Go to SE38 and execute the program RSKC_ALLOWED_CHAR_MAINTAIN and give ALL_CAPITAL or the char you want to add.
    Check the table RSALLOWEDCHAR. It should contain ALL_CAPITAL or the char you have entered.
    Refer
    /people/sap.user72/blog/2006/07/23/invalid-characters-in-sap-bw-3x-myths-and-reality-part-2
    /people/sap.user72/blog/2006/07/08/invalid-characters-in-sap-bw-3x-myths-and-reality-part-1
    /people/aaron.wang3/blog/2007/09/03/steps-of-including-one-special-characters-into-permitted-ones-in-bi
    http://help.sap.com/saphelp_nw04/helpdata/en/64/e90da7a60f11d2a97100a0c9449261/frameset.htm
    For adding other characters, refer to:
    OSS note #173241 – “Allowed characters in the BW System”
    Thanks,
    JituK
    Edited by: Jitu Krishna on Mar 22, 2008 1:52 PM

  • Error in Source System While Uploading Data from ODS to Data Target

    Hi All,
    I am getting the following errors while uploading data from an ODS to a data target:
    a) Error in source system.
    b) DataSource 80MKT_DS01 has to be replicated (time stamp, see long text)
    Message no. R3016
    Diagnosis
    DataSource 80MKT_DS01 does not have the same status as the source system in the Business Information Warehouse.
    The time stamp in the source system is 21.06.2005 07:31:33.
    The time stamp in the BW system is 27.11.2004 17:41:45.
    Regards,
    Chakri

    This generally occurs when you change or enhance the data source, or when the connection to the data source is broken. To resolve this sort of error:
    1. Check the connection between the source and target systems.
    2. Replicate the data sources.
    3. Activate the transfer rules, update rules, and the rest of the flow. Also check in the source system whether the structure was modified; if so, make the corresponding changes in the InfoSource, transfer rules, and update rules, and then try the load again. Hope it allows you to load.
    Thanks ,
    PSG

  • Error: Unable to retrieve target System Data.

    This error suddenly started appearing after we moved some code from Dev to Test. Not sure what changed. Does anybody have any idea? I get this error when I click on
    Metadata --> Control tables.
    ** Begin FDM Runtime Error Log Entry [2011-10-05 06:08:03] **
    ERROR:
    Code............................................. -2147467259
    Description......................................
    At Line: 45
    Procedure........................................ clsBlockProcessor.ActConnect
    Component........................................ upsWBlockProcessorDM
    Version.......................................... 1112
    Thread........................................... 12592
    IDENTIFICATION:
    User............................................. admin
    Computer Name.................................... NLSVEPMHFMT01
    App Name......................................... BAABV
    Client App....................................... WebClient
    CONNECTION:
    Provider......................................... ORAOLEDB.ORACLE
    Data Server......................................
    Database Name.................................... HFMTEST2
    Trusted Connect.................................. False
    Connect Status.. Connection Open
    GLOBALS:
    Location......................................... TANZANIA
    Location ID...................................... 749
    Location Seg..................................... 3
    Category......................................... ACTUAL SCENARIO
    Category ID...................................... 12
    Period........................................... Apr - 2011
    Period ID........................................ 4/30/2011
    POV Local........................................ False
    Language......................................... 1033
    User Level....................................... 1
    All Partitions................................... True
    Is Auditor....................................... False
    ** Begin FDM Runtime Error Log Entry [2011-10-05 06:08:03] **
    ERROR:
    Code............................................. -2147467259
    Description......................................
    At Line: 45
    Procedure........................................ clsBlockProcessor.DimensionList
    Component........................................ upsWBlockProcessorDM
    Version.......................................... 1112
    Thread........................................... 12592
    IDENTIFICATION:
    User............................................. admin
    Computer Name.................................... NLSVEPMHFMT01
    App Name......................................... BAABV
    Client App....................................... WebClient
    CONNECTION:
    Provider......................................... ORAOLEDB.ORACLE
    Data Server......................................
    Database Name.................................... HFMTEST2
    Trusted Connect.................................. False
    Connect Status.. Connection Open
    GLOBALS:
    Location......................................... TANZANIA
    Location ID...................................... 749
    Location Seg..................................... 3
    Category......................................... ACTUAL SCENARIO
    Category ID...................................... 12
    Period........................................... Apr - 2011
    Period ID........................................ 4/30/2011
    POV Local........................................ False
    Language......................................... 1033
    User Level....................................... 1
    All Partitions................................... True
    Is Auditor....................................... False
    ** Begin FDM Runtime Error Log Entry [2011-10-05 06:08:03] **
    ERROR:
    Code............................................. -2147467259
    Description......................................
    At Line: 45
    Procedure........................................ ObjScriptReturnMarshaler.GetDimensionList
    IDENTIFICATION:
    User............................................. admin
    Computer Name.................................... NLSVEPMHFMT01
    App Name......................................... BAABV
    Client App....................................... WebClient
    CONNECTION:
    Provider......................................... ORAOLEDB.ORACLE
    Data Server......................................
    Database Name.................................... HFMTEST2
    Trusted Connect.................................. False
    Connect Status.. Connection Open
    GLOBALS:
    Location......................................... TANZANIA
    Location ID...................................... 749
    Location Seg..................................... 3
    Category......................................... ACTUAL SCENARIO
    Category ID...................................... 12
    Period........................................... Apr - 2011
    Period ID........................................ 4/30/2011
    POV Local........................................ False
    Language......................................... 1033
    User Level....................................... 1
    All Partitions................................... True
    Is Auditor....................................... False

    You need to register/configure the HFM adapter on the FDM application server. The adapter needs to be configured to run under the FDM service account. You will want to place the "AdapterComponents" folder that was extracted from the .zip file of the FM11x-G5-E adapter in the Oracle\Middleware\EPMSystem11R1\Products\FinancialDataQuality\SharedComponents directory.
    A) Log in to the FDM application via the Workbench on the FDM application server.
    B) Choose File > Register Adapter, browse to the AdapterComponents\fdmFm11xG5E\ folder, select the fdmFM11xG5E.dll file, and choose Open.
    C) Expand the Target System adapters, right-click the FM11x-G5-E adapter, and choose Configure.
    D) Populate the COM Admin screen with the FDM service account user ID, password, confirm password, and domain, and click OK.

  • Need to post Full Load data (55,000 records) to the target system.

    Hi All,
    We are getting the data from the SAP HR system and we need to post this data to a partner system, so we configured a Proxy (SAP) to File (Partner) scenario. We need to append the data of each message to the target file. Since this is a very critical interface, we have used dedicated queues. The scenario is working fine in D. When the interface was transported to Q, they tested it with a full load, i.e. with 55,000 messages. All messages were processed successfully in the Integration Engine, but processing in the Adapter Engine took nearly 37 hours. We need to post all 55,000 records within 2 hours.
    The design of this interface is simple: we use a direct mapping and the size of each message is 1 KB, but we need to append all messages to one file on the target side. We are using the Advantco SFTP adapter as receiver and a proxy as sender.
    Could you please suggest a solution to process all 55,000 messages within 2 hours?
    Thanks,
    Soumya.

    Hi Soumya,
    I understand your scenario as: HR data has to be sent to a third-party system once a day. I guess they are synchronizing employee (55,000) data in the third-party system with SAP HR data daily.
    I would design this scenario as follows:
    I would ask an ABAPer to write an ABAP program which runs at 12:00, picks up the 55,000 records from the SAP HR tables, and places them in one file. That file will be placed in the SAP HR file system (you can see it using AL11). At 12:30, a PI file channel will pick up the file and transfer it to the third-party target system as it is, without any transformation: a file-to-file pass-through scenario (no ESR objects). Then ask the target system team to take the file and run their program (they should have some SQL routines); that SQL program will insert these records into the target system tables.
    If 55,000 records make a huge file on the SAP HR system, ask the ABAPer to split it into parts. PI will pick them up in sequence based on the file name.
    In this approach, I would ask both the SAP HR (sender) and third-party (target) system people to be flexible. Otherwise, I would say it is not technically possible with the current PI resources. In my opinion, PI is middleware, not a system in which huge computations can be done. If messages were coming from different systems, then collecting them in the middleware would make sense. In your case, collecting a large number of messages from a single system at high frequency is not advisable.
    If the third-party target system people are not flexible, then go for a File-to-JDBC scenario. Ask the SAP HR ABAPer to split the input file into a larger number of files (10-15; your PI system should be able to handle that). At the receiver JDBC side, use native SQL. You need a Java mapping to construct the SQL statements in PI. Don't convert the flat file to the JDBC XML structure; in your case PI cannot handle such a huge XML payload.
    Note that a hardware upgrade is very difficult (you need a lot of approvals, depending on your client's process) and very costly. In my experience a hardware upgrade takes 2-3 months.
    Regards,
    Raghu_Vamsee

  • How to trigger the interface in the target system to load the data automatically

    Hi Friends,
    This is a proxy-to-file interface. My requirement is to "trigger" the interface in the target system to load the data automatically. For example, if you are sending a customer data file "/exe_test/custd.dat", then a trigger file should be created as "/exe_test/custd.trg". How do we actually create this trigger file?
    Thanks in advance,
    Prathibha.

    Hi Prathibha,
    File (Trigger) -> BP
    BP <-> ABAP server proxy (synchronous)
    BP -> File
    is one option, and the standard alternative. Just make your interface synchronous, add a message type for the request (I think you need only one field), take the existing one for the response, and regenerate your proxy.
    Alternative 2 would be:
    File (Trigger) -> ABAP server proxy (asynchronous)
    ABAP server proxy calling a ABAP client proxy
    ABAP client proxy -> File (asynchronous)
    You avoid using BPM, which can be an advantage with heavy traffic (performance). However, this solution is less valuable for the future, because a person who did not develop the process would not see that the two messages belong together.
    Regards,
    Udo

  • Not able to export data from FDQM to the target system

    Hi,
    I am not able to export data from FDQM to my target system, which is Hyperion Enterprise 6.4.
    I have imported and validated the mapping, but as soon as I click Export, after the export file has been created, the system throws the error "Error: Arguments are of the wrong type, are out of acceptable range, or are in conflict with one another". I am also pasting the error log below.
    ** Begin Enterprise Adapter Runtime Error Log Entry [2009-08-20-16:10:27] **
    ERROR:
    Code.............. 10230
    Description....... Data Load Errors.
    Enterprise API Return Code: ALREADY_LOCKED_RO-.
    Procedure......... clsHPDataManipulation.fDBLoad
    Component......... upsHE6xG4A
    Version........... 100
    Thread............ 6028
    IDENTIFICATION:
    User.............. administrator
    Computer Name..... HYPERION
    ENTERPRISE CONNECTION:
    App Name.......... GCIP_S
    Connect Status.... Connection Open
    GLOBALS:
    Zero-For-No....... True
    INI File Path..... C:\WINDOWS\HypEnt.ini
    NameCat.txt Path.. C:\Hyperion\FDM\GCIP\Outbox\Logs\NameCat.txt
    NameCat Entity....
    NameCat Category..
    NameCat Exists.... False
    Any suggestions, anyone? What should I do?

    Hello,
    Is it possible that the entity that you are loading to in Enterprise is locked? It appears that it may be, per the ALREADY_LOCKED return code in the log. You can only load to unlocked intersections, so I would start by checking whether the category and entity combination is locked.
    Regards
    JF

  • How do I consolidate in FDM AFTER all files load to target?

    Hi,
    We are using FDM to import forecasting data and export it out to HFM. The files are sent via ETL to the OpenBatch folder, and then I use parallel processing to import, validate, and consolidate the files to the target system, which is HFM. The problem is performance: as each file is loaded by the batch loader script, it is loaded and consolidated file by file. I would like to find a way to load and export all files to HFM FIRST, and then, when the last file (the 12th) is processed, run a consolidation in HFM so that everything consolidates at once and saves some time. Here are the steps I would like to accomplish:
    1. Import, Validate, and Export the first 11 files to HFM.
    2. Import, Validate, Export, and Consolidate the 12th file to HFM - the idea is that all the previous 11 files will consolidate at one time instead of the out-of-the-box way using the standard BatchLoader script.
    Does anyone know HOW I can make this happen? Out of the box this is not possible without some major script customization. Could someone please give me some idea of how to make this happen? I really appreciate the help. Below is the standard batch loader script I am using, which I would like to customize:
    Sub webBatchStdParallel()
    'Oracle Hyperion FDM CUSTOM Script:
    'Parallel Processing
    'STANDARD Batch Parallel Processing
    'Declare Local Variables
    Dim lngProcessLevel
    Dim strDelimiter
    Dim blnAutoMapCorrect
    Dim lngParallelProcessCount
    Dim strLoadBalanceServerName
    Dim BATCHENG
    Set BATCHENG = CreateObject("upsWBatchLoaderDM.clsBatchLoader")
    BATCHENG.mInitialize API, SCRIPTENG
    'Initialize variables
    lngProcessLevel = 10                     'Up-To-Consolidate
    strDelimiter = "_"                         'File Name Delimiter
    blnAutoMapCorrect = True                    'Auto Map Correction is On
    lngParallelProcessCount = 2                    'Number of Parallel Processes (Should equal # of Processors)
    strLoadBalanceServerName = "Everest0555"     'Load balance server for parallel process
    'Create the file collection
    Set BATCHENG.PcolFiles = BATCHENG.fFileCollectionCreate(CStr(strDelimiter))
    'Execute a Standard Parallel batch
    BATCHENG.mFileCollectionProcessParallel BATCHENG.PcolFiles, CLng(lngProcessLevel), CLng(lngParallelProcessCount), CStr(strLoadBalanceServerName), , CBool(blnAutoMapCorrect)
    End Sub

    Hi,
    Thanks for all the input so far, I really appreciate it. I have made a lot of progress and added code to the batch loader script.
    Again, what I am trying to do is load 12 forecasting files (periodic), one for each month, through FDM and then load them to HFM. AFTER the data has been loaded through FDM, I want to kick off the consolidation process in the target system (HFM), to save time by running the consolidation all at once in HFM instead of file by file in FDM. Here is some more info:
    1. I have 12 files loaded to the OpenBatch Folder - Each file is a monthly file.
    2. The 12 files (Starting from November 2010 - Oct 2011) are exported and loaded to HFM. No consolidation takes place.
    3. I want to add a procedure AFTER the load to run a consolidation on the last file loaded (Oct 2011) in HFM, so the consolidation kicks off in HFM and consolidates the remaining months there instead of in FDM. I tested this manually and the time improvement is about 10-fold.
    4. I have potentially identified that the ActConsolidate method could be used to run the consolidation in the target system. In my code you can see that I have added the ActConsolidate call AFTER the batch load, hoping this would run the consolidation directly in HFM. However, when I run the script I keep getting the following error:
    "91 - Object variable or With block variable not set At Line: 52". Apparently I am not calling or setting up the object correctly. Am I missing something? I sincerely appreciate everyone's help and promise to give big kudos to anyone who can help me along. I am almost there. This is very difficult for me, and I know it's a tough scenario because no one seems to have a solution posted anywhere on this forum.
    Below is my code:
    Sub webBatchStdParallelFcst()
    'Oracle Hyperion FDM Custom Script:
    'Created By:     
    'Date Created:     10/4/2010 9:27:32 AM
    'Purpose:
    'STANDARD Batch Parallel Processing and consolidation - Foresacting
    'Declare Local Variables for Batch Processing
    Dim lngProcessLevel
    Dim strDelimiter
    Dim blnAutoMapCorrect
    Dim lngParallelProcessCount
    Dim strLoadBalanceServerName
    Dim BATCHENG
    Set BATCHENG = CreateObject("upsWBatchLoaderDM.clsBatchLoader")
    BATCHENG.mInitialize API, SCRIPTENG
    'Declare Local Variables for consolidation
    Dim strLoc
    Dim strCat
    Dim strStartPer
    Dim strEndPer
    Dim CONSOLENG
    Set CONSOLENG = CreateObject("upsWBlockProcessorDM.clsBlockProcessor")
    'Initialize variables for Batch Processing
    lngProcessLevel = 8                     'Up-To-Load
    strDelimiter = "_"                    'File Name Delimiter
    blnAutoMapCorrect = True               'Auto Map Correction is On
    lngParallelProcessCount = 2               'Number of Parallel Processes (Should equal # of Processors)
    strLoadBalanceServerName = "everest3105"     'Load balance server for parallel process
    'Initialize ActConsolidate Variables
    strLoc = "FORECAST"
    strCat = "FDMPLAN"
    strStartPer = "Oct - 2011"
    strEndPer = "Nov - 2011"
    'Create the file collection
    Set BATCHENG.PcolFiles = BATCHENG.fFileCollectionCreate(CStr(strDelimiter))
    'Execute a Standard Parallel batch
    BATCHENG.mFileCollectionProcessParallel BATCHENG.PcolFiles, CLng(lngProcessLevel), CLng(lngParallelProcessCount), CStr(strLoadBalanceServerName), , CBool(blnAutoMapCorrect)
    'Execute Consolidation on Target System
    CONSOLENG.ActConsolidate strLoc, strCat, strStartPer, strEndPer
    End Sub
    However, I am getting the following error: "91 - Object variable or With block variable not set At Line: 52". I don't understand; the objects appear to be set up properly.
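    One hedged observation, based only on the code in this thread: BATCHENG is initialized with the API/SCRIPTENG context (BATCHENG.mInitialize API, SCRIPTENG) before it is used, but CONSOLENG is never initialized before ActConsolidate is called, which is a plausible cause of the "Object variable or With block variable not set" error. A minimal sketch of the change, assuming clsBlockProcessor follows the same initialization pattern as clsBatchLoader (verify the exact method name against the adapter's action scripts, which create the same object):
    'Create AND initialize the block processor before using it
    Dim CONSOLENG
    Set CONSOLENG = CreateObject("upsWBlockProcessorDM.clsBlockProcessor")
    'Assumption: the block processor needs the same API/SCRIPTENG context that
    'the batch loader object receives before any of its Act* methods are called.
    CONSOLENG.mInitialize API, SCRIPTENG
    'Then, after the parallel batch has finished loading:
    CONSOLENG.ActConsolidate strLoc, strCat, strStartPer, strEndPer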

Maybe you are looking for

  • How to Install AS 10g version 10.1.2.3 on Windows server 2008 64bit

    Dear Exparts, Hope you are fine. Application Server: Standalone 10.1.2.3 OS: Windows Server 2008 64bit Hardware: Intel Xeion ProcessorI know Oracle Application Server 10g & Standalone my version is 10.1.2.3 is not competible with Windows Server 2008

  • Airport Express and iPhones can't see network-macs and ipad can?

    I have an Airport Extreme base station connected to a comcast modem, with 2 express units for airtunes. After some fiddling around due to the comcast outage, now I can't see either of the Airport Express units in the Airport Utility, AND neither my o

  • Auto-insert digital signature

    Using LCD 8.1 Is it possible to generate a non-interactive form with a digital signature already in-place? We're generating invoices out of SAP and need to have a digital signature in place when we email the PDF to our vendor. The digital signature w

  • Java need in SAP PI/XI

    Hello, I am new to SAP XI. I have worked 3years as ABAP developer. I do not know anything in Java. Please suggest the amount of java knowledge that I need to develop to become profound in SAP XI.Can anyone suggest any material that i can go thru to d

  • HTML Client: How to display the edit screen of an item given its ID?

    To edit a selected item, I could easily add a new button on the command line and choose the existing method, "editSelected" But how to edit another item instead of the selected item? Like How to edit an item whose ID # is 10? i.e How to display the e