Automating the cube load process

Hi All,
I have created an Essbase cube using Hyperion Integration Services 9.3.1; my data source is a star schema residing in an Oracle database. I can successfully load the cube and see the results in Smart View.
I want to know how I can automate the data loading process (both dimensions and fact data) into the Essbase cubes, using either Unix scripts or Windows .bat scripts.
Your inputs are appreciated. Thanks in advance.
Regards
vr

What everyone has shown you is the use of ESSCMD or MaxL; however, you stated you are using EIS. Until you get to 11.1 and use Essbase Studio, these won't help you. What you need to do is go into EIS and select your metadata outline. Pretend you are going to build the cube manually. Start scheduling it; when you get to the screen that allows you to either schedule it or create a file, choose to create the file only. You will want the name to end in .cbs. (Hint: if you select both the outline build and the data load, you will get all the statements in one script.)
After you are done, if you look on the EIS server under arborpath\EIS\Batch, you should see the .cbs file you created. It is a regular text file, so you can look at it. If you open it, you will see loaddata and/or loaddimensions statements; these are what do the work.
To run it, create a batch file with the command olapicmd -fscriptFileName > logFileName
For more information about the Integration Services Shell, look at http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/eis_sysadmin.pdf - Chapter 5 tells you everything you need to know.
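The wrapper batch file can be very small. Here is a minimal sketch, assuming hypothetical names and paths (MyCubeLoad.cbs, the D:\Hyperion\EIS locations and the log file are placeholders; only the olapicmd -f syntax comes from the documentation above):
rem run_cube_load.bat - minimal sketch; adjust paths, script and log names to your environment
rem The .cbs script created in EIS lives under arborpath\EIS\Batch on the EIS server
cd /d D:\Hyperion\EIS\Batch
olapicmd -fMyCubeLoad.cbs > D:\Hyperion\EIS\Batch\MyCubeLoad.log 2>&1
You can then point Windows Task Scheduler at this batch file (or wrap the same olapicmd call in a Unix shell script and schedule it with cron) to run the dimension and data load unattended.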

Similar Messages

  • Automate the Cube loading process using script

    Hi,
    I have created an Essbase cube using Hyperion Essbase Studio 11.1.1, and my data source is Oracle.
    How can I automate the data loading process into the Essbase cubes using .bat scripts?
    I am very new to Essbase. Can anyone help me with this in detail?
    Regards
    Karthi

    You could automate the dimension build and data load using ESSCMD or MaxL scripts and then call them via .bat scripts.
    Various threads are available related to this topic. Anyway, you could follow these steps.
    In any script, first provide the login credentials and select the database:
    LOGIN server username password ;
    SELECT Applic_name DB_name;
    To build a dimension:
    BUILDDIM location rulobjName dataLoc sourceName fileType errorLog
    Eg: BUILDDIM 2 rulfile 4 username password 4 err_file;
    For the data load:
    IMPORT numeric dataFile fileType y/n ruleLoc rulobjName y/n [ErrorFile]
    Eg: IMPORT 4 username password 2 "rulfile_name" "Y";
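    Putting those statements together, a minimal ESSCMD script plus a one-line batch file might look like the sketch below. The server name, the Sample/Basic application and database, the rule objects (dimbuild, dataload), the SQL credentials and the error file are all placeholders; the numeric arguments simply mirror the BUILDDIM and IMPORT examples above.
    LOGIN "essserver" "admin" "password" ;
    SELECT "Sample" "Basic" ;
    BUILDDIM 2 "dimbuild" 4 "sqluser" "sqlpass" 4 "dimbuild.err" ;
    IMPORT 4 "sqluser" "sqlpass" 2 "dataload" "Y" ;
    LOGOUT ;
    EXIT ;
    Save this as, say, load_cube.scr and call it from the .bat file with a single line such as esscmd C:\scripts\load_cube.scr > C:\scripts\load_cube.log, which Windows Task Scheduler can then run on whatever schedule you need.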
    Regards,
    Cnee

  • Optimize the data load process into BPC Cubes on BW

    Hello Gurus,
    We would like to know how to optimize the data load process. Our scenario is that we have the ECC classic General Ledger, and we are looking for the best way to load data into the BW InfoCubes from an ECC source.
    To complement the question above: from which tables must the data be extracted and then passed to BW so that the consolidation is done? Also, are there any other modules that have to be considered, like FI or EC-CS?
    Best Regards,
    Rodrigo

    Hi Rodrigo,
    Have you looked at the BW Business Content extractors available for the classic GL? If not, I suggest you take a look. BW business content provides all the business logic you will normally need to get data out of ECC and into BW for pretty much every ECC application component in existence: [http://help.sap.com/saphelp_nw70/helpdata/en/17/cdfb637ca5436fa07f1fdc0123aaf8/frameset.htm]
    Ethan

  • How to automate the data load process using data load file & task Scheduler

    Hi,
    I am building an automated process to load data into a Hyperion Planning application with the help of a Data_Load.bat file and Task Scheduler.
    I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
    Could you help me automate the data load process using the Data_Load.bat file and Task Scheduler, or tell me which other files are required to achieve this?
    Thanks

    To follow up on your question: are you using MaxL scripts for the data load?
    If so, I have seen an issue where, if the batch file (e.g. load_data.bat) does not use the full path to the MaxL script, then when it is run through Task Scheduler the task appears to work but the log and/or error file is not created. In other words, the batch claims it ran from Task Scheduler although it didn't do what you needed it to.
    If you are using MaxL, use this as the batch:
    essmsh C:\data\DataLoad.mxl
    You can also use the full path to essmsh itself; either way works. The only reason the MaxL might then not work is if the batch has not been updated with all the MaxL PATH changes, or if you need to update your environment variables so that the essmsh command works in a command prompt.

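    To make the answer above concrete, here is a minimal sketch of the MaxL file the batch calls. Everything in it is a placeholder (the server essserver, the admin credentials, the PlanApp.PlanDB database, the plandata.txt source file and the dataload rules file); substitute your own names and paths.
    /* DataLoad.mxl - logs in, loads the data through a server rules file, logs out */
    login admin password on essserver;
    import database PlanApp.PlanDB data
        from data_file 'C:\data\plandata.txt'
        using server rules_file 'dataload'
        on error write to 'dataload.err';
    logout;
    exit;
    Data_Load.bat then needs only the single essmsh line shown above, with full paths and the output redirected to a log file, and that batch file is what Task Scheduler calls.
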
  • Automate the data load process using task scheduler

    Hi,
    I am building an automated process to load data into a Hyperion Planning application with the help of Task Scheduler.
    I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
    Could you help me automate the data load process using the Data_Load.bat file and Task Scheduler?
    Thanks

    Thanks for your help.
    I have done the data loading using the Data_Load.bat file; the .bat file calls a .msh file. Following your steps, the automatic data load is now possible.
    If you need this, please let me know.
    Thanks again.
    Edited by: 949936 on Oct 26, 2012 5:46 AM

  • Error in generating/deleting the index of the cube in Process chain

    Hi All,
    I am getting an error in the process chain during the deletion/creation of the indexes on the cube.
    The error says: "Deleting/reconstructing indexes for InfoCube 0CSAL_C01 is not permitted".
    I need your input on this.
    Thanks & Regards,
    Anup

    Hi Anup,
    Try to analyse the error in the Manage tab of that cube, and try to reconstruct the indexes manually.
    Good day!
    Regards
    Arun S

  • Automating the import / export process in 9.0.2

    In Portal 3.0.9.x the export and import were entirely command-line-driven tools, so it was possible to automate the process to a certain degree.
    In 9.0.2 the import/export process is a combination of scripts and manual actions through the Portal UI to import and export the transport sets.
    Is there any way in 9.0.2 to automate this process?

    You make me feel better. I am having the same problem. The logs say it was all a success, but NOTHING is imported. I tried with Portal 9.0.2.2.22 on Windows 2000 and now I'll try it with 9.0.2.2.14A on Sun. What did you use?

  • Planning Cube Load process

    Hi Gurus,
    How do I load data into DP cubes from a BW 3.5 system?

    If the BW 3.5 system is an external system, then you need to build an InfoCube in the APO data mart and first load the data from the external BW system into APO BW. Then create the planning area, POS, CVCs, etc., and load the planning data from the APO InfoCube into the planning area using transaction /SAPAPO/TSCUBE. DP cubes are virtual ones and have no physical existence. Please check this [DP Process|http://help.sap.com/saphelp_scm50/helpdata/en/8f/9d6937089c2556e10000009b38f889/frameset.htm] and also go through the [Best Practices|http://help.sap.com/bp_scmv250/BBLibrary/HTML/DPL_EN_DE.htm]. Open the link and click on the configuration guide; it will take you through the step-by-step process.

  • Automating the book building process in FM11 - best tool?

    Hi - we're gearing up to create an automated book-building process using FM11.
    Before we start, a quick straw poll:
    What do you folks recommend / suggest as the best automation tool?
    We want to trigger the process when SGML files are dropped into a directory. When we've done directory-triggered automation in the past under FM7.1, we've tended to use VB.NET code to trigger an instance of an FM7.1 API client (usually to build and collate PDFs, etc.)... but, since we're in a new era with a new release of FM, we thought we'd take an open-minded, 'blue sky' approach here and see what tools members of the FrameMaker community are using.
    If one particular approach seems to be easier / more advantageous / more widely adopted than the others, then I think that might nudge us down that road...
    Thanks.

    Hi Nigel...
    You might want to look at our plugin, AutoFM:
         http://leximation.com/tools/info/autofm.php
    This lets you drive FrameMaker from an XML file and a batch file (or other command-line scripting). The XML file contains the instructions on which files to open and what to do with them. You can perform various basic operations like save-as, print, update book, etc. It also lets you call other automation options like ExtendScript, FrameScript, or FDK clients to extend the processing as needed.
    Download the 30-day trial and see if it does what you need.
    AutoFM is the piece used to drive DITA-FMx (via FMx-Auto) in this video on automated DITA-to-PDF publishing, but it can be used for any type of FM automation:
         http://blog.leximation.com/2012/02/automated-dita-to-pdf-publishing-with-fmx-auto/
    Keep in mind that various types of FrameMaker automation may require FrameMaker Server, so be sure to check the EULA and use the right FM version for your needs.
    Cheers,
    ...scott
    Scott Prentice
    Leximation, Inc.
    www.leximation.com

  • Automating the Auto Levels process

    As part of my workflow, after importing the files into Aperture and applying metadata, I apply Auto Levels to each image in turn. This is a very manual process involving the mouse and the right-arrow key: move to the next image, wait for it to be read, press the Auto Levels button, then right-arrow to the next image.
    Is there a way to get Aperture to go through all the images in an album, applying the Auto Levels command to each image individually?
    It could even be a keystroke-macro process, if there is such a thing for OS X.
    I do not want to lift the adjustments from one corrected image and apply them to all the images, as this does not reflect the auto needs of each image.
    After auto-levelling each image I select the images for customised work.
    Thanks for any suggestions.

    Yes, it worked - you have to redefine the action each time you run Aperture, because the button for the Auto Levels function moves around the screen according to where you have the Aperture window.
    You also have to allow for Aperture slowing down, so you have to build in a delay before pressing the Auto button.
    While running this process you cannot run another app that will sit over the Aperture window or take focus away from Aperture.
    What surprises me is that the two Auto buttons do not have keystroke alternatives (or at least I could not find them in the documentation).

  • Cube compression process chain creating

    Hi,
    I want to create a cube compression process chain so that the compression of the cube is automated.
    Could anyone please provide the steps to create a process chain that automates the cube compression?
    Regards,
    Prem.
    Please search the forum
    Edited by: Pravender on Aug 27, 2010 11:54 AM

    Hi,
    You can modify your process chain in RSPC.
    Switch to change mode in the planning view; in the left-hand panel, among the different process types, you will find the Compression process.
    Drag it to the right-hand section, create a variant for it, and enter the cube name. Once that is done, place it after the cube load process (if a roll-up step exists, it should come after the roll-up; if not, directly after the cube load process).
    Once all changes are done, activate the chain and collect it in a transport to move the changes to the next level.
    Hope this helps.
    Murali

  • Report availability doesn't show up in the cube

    Hi Gurus
    We have a cube, and every day full updates are loaded into it through process chains.
    We have a delete-overlapping-request step in the process chain for this cube.
    It is a basic cube and it doesn't have any aggregates.
    Every day around 20 requests load into this cube. Today all the requests loaded, but surprisingly the report-availability symbol does not show up even though all the requests are successful.
    All the requests in the cube are successful without showing any RA symbol.
    When I checked the Manage tab of the cube, no requests are available for reporting even though the requests are successful.
    I have already checked the transferred and added records; everything is fine. When I checked the InfoCube content, the data is available, but no RA symbol shows up for any of the requests.
    Please help me.
    I appreciate your swift replies.
    regards

    Gurus
    I have checked all these options.
    No job is running that loads into the cube.
    I refreshed the cube many times in RSA1 and logged off and on again, but in vain.
    Still no RA symbol shows up, and all the technical and QA statuses of the requests are green.
    What to do, I wonder?!
    cheerios

  • Data load process for FI module

    Dear all,
    We are using BI 7.00, and in one of our FI data sources, 0EC_PCA_1, we had a data load failure. The cause of the failure was analysed and we did the following:
    1) deleted the data from the cube and the PSA
    2) reloaded the data (full load) - without disturbing the init.
    This solved our problem. Now that the data reconciliation is done, we find doubled entries for some of the G/L codes.
    I have a doubt here.
    Since there is no setup table for FI transactions (correct me if I am wrong), the full load picked up data that was also present in the delta queue, and subsequently the delta load loaded the same data again (some G/Ls which were available as delta).
    Kindly explain how FI data loads function. Should we go for downtime, and how do FI data loads work without setup tables?
    Can the experts provide a solution for addressing this problem? Can anyone provide the step-by-step process that has to be adopted to solve this problem permanently?
    Regards,
    M.M

    Hi Magesh,
    The FI datasources do not involve setup tables when performing full loads, and they do not involve the outbound queue during delta loads.
    A full load happens directly from your datasource view into BI, and deltas are captured in the delta queue.
    Yes, you are right in saying that when you did a full load, some of the values were pulled that were also present in the delta queue; hence you have double loads.
    You need to reinitialise completely, as the full load process has been disturbed. Whether to take downtime depends on how frequently the transactions are happening.
    You need to:
    1. Completely delete the data in BW, including the initialisation.
    2. Take downtime if necessary.
    3. Reinitialise the whole datasource from scratch.
    Regards,
    Pramod

  • Issue while cube load

    Hi All,
    I am getting the following error while doing the cube load.
    ***Error Occured in __XML_PARALLEL_LOADER: In __XML_UNIT_LOADER: In __XML_RUN_CC_AUTOSOLVE: Incremental aggregation over the dense DIMENSION OLAP_AGG!SCENARIO is not supported when aggregating a VARIABLE dimensioned by a COMPRESSED COMPOSITE.
    I understand this error occurs when we try to aggregate a compressed cube and the dimensions are not limited to ALL (limited to some values only). I faced this issue when manually running the aggmap of the cube after limiting the dimensions.
    But this is now happening in the cube load through the script generated by AWM. I cannot run the cube load through AWM, as this is the UAT environment.
    One more clue is that PERMIT_READ is implemented and it does not allow some private scenarios to be seen, but this should not cause the issue, as PERMIT_READ is only executed when a user connects in read-only mode.
    Here is the XML load log content:
    13:08:47 ***Error Occured: Failed to Build(Refresh) XPRO_OLAP_AGG.OLAP_AGG Analytic Workspace.
    13:08:46 ***Error Occured in __XML_PARALLEL_LOADER: In __XML_UNIT_LOADER: In __XML_RUN_CC_AUTOSOLVE: Incremental aggregation over the dense DIMENSION OLAP_AGG!SCENARIO is not supported when aggregating a VARIABLE dimensioned by a COMPRESSED COMPOSITE.
    13:08:46 Job #AWXML$_9179_3238 Stopped.
    13:08:44 Job #AWXML$_9179_3237 Stopped.
    13:08:42 Incremental aggregation over the dense DIMENSION OLAP_AGG!SCENARIO is not supported when aggregating a VARIABLE dimensioned by a COMPRESSED COMPOSITE.
    13:08:29 Started Auto Solve for Measures: OPER_CUST_OPER_DATA from Cube OPERATIONAL_CUSTOMER.CUBE. Q2 2009 Partition.
    13:08:29 Finished Load of Measures: OPER_CUST_OPER_DATA from Cube OPERATIONAL_CUSTOMER.CUBE. Q2 2009 Partition. Processed 71359 Records. Rejected 0 Records.
    13:08:21 Started Load of Measures: OPER_CUST_OPER_DATA from Cube OPERATIONAL_CUSTOMER.CUBE. Q2 2009 Partition.
    13:08:21 Started Load of Measures: OPER_CUST_OPER_DATA from Cube OPERATIONAL_CUSTOMER.CUBE. Q1 2009 Partition.
    13:08:21 Started Load of Measures: OPER_CUST_OPER_DATA from Cube OPERATIONAL_CUSTOMER.CUBE. Q1 2008 Partition.
    13:08:20 Attached AW XPRO_OLAP_AGG.OLAP_AGG in MULTI Mode.
    13:08:20 Attached AW XPRO_OLAP_AGG.OLAP_AGG in MULTI Mode.
    13:08:19 Attached AW XPRO_OLAP_AGG.OLAP_AGG in MULTI Mode.
    13:08:17 Running Jobs: AWXML$_9179_3237, AWXML$_9179_3238, AWXML$_9179_3239. Waiting for Free Process...
    13:08:17 Started 3 Finished 0 out of 5 Tasks.
    13:08:17 Running Jobs: AWXML$_9179_3237, AWXML$_9179_3238, AWXML$_9179_3239.
    13:08:17 Started 3 Finished 0 out of 5 Tasks.
    13:08:17 Running Jobs: AWXML$_9179_3237, AWXML$_9179_3238.
    13:08:17 Started 2 Finished 0 out of 5 Tasks.
    13:08:17 Running Jobs: AWXML$_9179_3237.
    13:08:17 Started 1 Finished 0 out of 5 Tasks.
    13:08:17 Starting Parallel Processing.
    13:08:17 Detached AW XPRO_OLAP_AGG.OLAP_AGG.
    13:07:17 Attached AW XPRO_OLAP_AGG.OLAP_AGG in RW Mode.
    13:07:15 Started Build(Refresh) of XPRO_OLAP_AGG.OLAP_AGG Analytic Workspace.
    13:07:14 Job# AWXML$_9179 to Build(Refresh) Analytic Workspace XPRO_OLAP_AGG.OLAP_AGG Submitted to the Queue.
    Any help on this error?
    Thanks
    Brijesh

    In AWM, right-click on the cube to maintain it. Once done, rebuild the cube. If this doesn't work, then first save the cube to a template, delete the cube, recreate it from the saved template, and then maintain it, followed by restarting all services and doing a full build.
    Regards
    Ankur Patel
    Edited by: user8337346

  • CO-PA (0COPC_C02) Cube loading..

    Hi Friends,
    Can anybody help me out from this problem.
    I want to load data into the CO-PA InfoCube 0COPC_C02.
    This InfoCube doesn't have a usable predefined datasource in R/3 (one exists, but it returns 0 records).
    So I have to create a datasource for this InfoCube (0COPC_C02).
    How do I select the dimensions, key figures and navigational attributes? Please suggest. It's very urgent.
    Cheers
    Suresh

    Hi,
    The following datasource covers your cube requirement:
    0CO_PC_ACT_1: Actual Costing/Material Ledger - Costs
    Check the above datasource on the R/3 side; otherwise, based on it, create an LIS datasource on the R/3 side. Replicate that datasource into BW and do the standard loading process on the BW side. Then, based on your datasource, create the dimensions, key figures and so on in your cube. Hope this helps.
    If you want to know more about the cube, check the following link:
    http://help.sap.com/saphelp_nw70/helpdata/en/23/1569372b2b7d20e10000009b38f842/frameset.htm
    If helpful, please award points.
    Regards
    Harikrishan N
