Automation of load

Hello -
I have a scenario like this; I'd appreciate it if any BW gurus could help me.
Existing Cube : A.
Cubes : X(daily snapshot data) and Y(monthly snapshot data).
MultiProvider : Z
Load the latest 3 months of data (92 daily requests) into cube X from the existing cube A.
Once a new month's load starts, we need to automate loading the first-day-of-month data requests for the previous 3rd month from X into cube Y. After a successful load into Y, delete all the daily load requests in X for that previous 3rd month only, so that X holds only the daily loads for the current 3 months.
Cube Y should only contain data for the 1st day of every month, excluding the current 3 months.
This process needs to be automated when there is a start of New Month.
I wanted to know where exactly I need to write the code for the automation.
Hope this is clear.
Thanks in advance.
Praveen

Hi Praveen,
You can do this using a process chain.
1) You can make the process chain be triggered by an event.
You do so by having the job that starts the PC start after an event.
Then define your set of jobs (through SM37) and make sure these jobs raise the event that starts the PC.
There is a function module you can use to raise events (FM BP_EVENT_RAISE).
2) In the Start process of your process chain, select Direct Scheduling and then click the Change Selections icon. This brings you to the Start Time window, where you can enter the date/time you want the process chain to start. At the bottom of this window, tick the Periodic job box and you will notice another icon at the very bottom called Period Values. Click it to set the frequency at which the job should be rescheduled (i.e. daily, weekly, etc.).
3) Alternatively, right-click on the start process, choose "Maintain Variant", then "Direct Scheduling". Next select "Change Selections", pick your date and time, set your period values, and you're done.
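For the deletion step in the original question, the fiddly part is working out the first day of the "previous 3rd month". A minimal sketch of that date arithmetic (shell, purely illustrative; in BW this logic would live in an ABAP program or routine step of the process chain):

```shell
# compute the first day of the month three months before a given year/month
# (pure POSIX arithmetic, so it does not rely on GNU date extensions)
year=2024
month=2
m=$(( month - 3 ))
y=$year
if [ "$m" -le 0 ]; then
  m=$(( m + 12 ))
  y=$(( y - 1 ))
fi
# requests from this month would be moved from cube X to Y, then deleted from X
printf '%04d-%02d-01\n' "$y" "$m"
```

Running this for February 2024 prints 2023-11-01, i.e. the month whose daily requests would be rolled into Y and dropped from X.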
Hope this helps.
****Assign Points If Helpful****
Thanks,
Amith

Similar Messages

  • Automated MetaData Load

    Greetings,
    I'm curious to know how some of the users in this forum perform automated metadata loads against dimensions in the Shared Library (and subsequently against any EPMA Planning apps). We have several apps whose dimensions are shared among various Planning apps and are updated manually when the data load fails because of missing members. These are updated manually on a monthly basis due to the relatively small number of missing members.
    However, we are building an app with a dimension that is quite dynamic (a lot of new members are added and a few are deleted). This would be insane to maintain manually in the Shared Library, so I'm looking for suggestions on how to automate this via a batch file or any other means, including using "Create Member Properties".
    Any suggestions or ideas would be greatly welcomed...
    Many thanks.
    cg

    CG,
    A .err file is generated only when something goes wrong during the data load, e.g. improper member names or improper alias names.
    These things can also be handled via a programming language such as Java, but you need to know all the possible errors, and your program should be able to find out where the load went wrong so that it can rebuild any new member or child that is missing after the data load.
    On Unix, you can use sed to find and replace any of the possible outcomes.
    It happened to me some time back: I used Java to replace some members that kept getting rejected during data loading. But that was a specific case; in your case you have to know all the possible outcomes.
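A tiny sketch of the sed approach mentioned above: patching a known-bad member name in the load file before re-running the load (the file name and member names here are made up):

```shell
# create a sample data file containing a member name that would be rejected
printf 'Sales,NewYrok,100\n' > data.csv
# find-and-replace the bad member, as the sed suggestion describes
sed 's/NewYrok/NewYork/g' data.csv > data_fixed.csv
cat data_fixed.csv
```

In practice the replacement pairs would come from whatever the .err file reports as rejected.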

  • Automated data load from APO to SEM transactional cube

    Hi ,
    We have BW-SEM system integrated with APO system.
    We can see automated data loads from APO to the SEM transactional cube.
    The InfoPackage name shows as "Request loaded using the APO interface without monitor log".
    I don't see any InfoPackage by this name in either system (APO or SEM).
    I am not sure how it is configured.
    I'd appreciate any input on how this happens.
    Thanks in advance

    Hi,
    As I mentioned, the starting point is transaction BPS0. There will be two planning areas created (if I am correct): one for the SEM cube and the other for the APO cube. The easier way to find them is to go to transaction SE16, enter table UPC_BW_AREA, and key in the cube names in the cube field. This gives you the planning area names; now look for a multi-planning area that includes the two areas (available in table UPC_AREAM).
    Then go to BPS0, where you will have to find which function is being used to post the data.
    thanks

  • Jobs and Tasks for Automation of load data in Essbase ASO Cube

    Hi, all people,
    My question is about creating jobs or tasks to automate loading data into Essbase cubes.
    I use a .bat file which runs a MaxL script to load data from an Oracle database into an Essbase cube. A Windows Task is created to run the .bat file.
    Are there any other ways to run the .bat file besides Windows Tasks? Maybe there is a special utility in Oracle EPM System or elsewhere.
    I am using Essbase version 11.1.1.2.0.
    Thanks for any reply!

    There is no internal scheduler. You either have to use Windows Scheduler, Unix cron or a third-party tool. Take a look at Star Analytics Command Center; it is designed specifically for Hyperion applications: http://www.staranalytics.com/products/command_center.htm
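On Unix, the cron route might look like this: a small wrapper script plus a crontab entry (paths, script names, and the 02:00 schedule are assumptions; the essmsh call is commented out because it needs a real Essbase installation):

```shell
# wrapper script that cron would invoke to run the MaxL load
cat > run_load.sh <<'EOF'
#!/bin/sh
# essmsh /hyperion/scripts/load_data.msh >> /hyperion/logs/load.log 2>&1
EOF
chmod +x run_load.sh
# matching crontab entry: run the load daily at 02:00
echo '0 2 * * * /hyperion/scripts/run_load.sh'
```

The same wrapper can carry environment setup (ARBORPATH, PATH) so the load behaves identically under cron and an interactive shell.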

  • Flex app behaves differently when Adobe automation files loaded

    We have an application built with Flex 3.1. I am attempting to create QTP tests for this application.  The instrumentation files are loaded at run time, not compiled into the application.
    I am using QTP 9.5 with the Adobe Flex Add-in 3.0. I am able to see the controls and run tests to some extent.
    The main issue I am trying to resolve is that there is a difference in behavior of the application when it is started with the instrumentation files. The differences occur both inside and outside of the QTP environment.
    The observed differences are:
    The initial page is a login screen. After entering the id and password, I can click ‘Enter’ or click the “Log in” command button. With the instrumentation files loaded, the ‘Enter’ key no longer works.
    When I log out of the application, a progress screen displays while the environment clean-up occurs.  When the progress bar hits 100%, the original login screen is displayed. With instrumentation files loaded, activity stops when the progress bar hits 100% and the original login screen is never displayed.
    After logging in, a window with perspective tabs displays.  Each perspective tab page contains a window containing a list of resource associated with the perspective. When instrumentation files are loaded, the tab pages exist, but the perspective windows are not created.
    I realize I am dealing with application-specific behavior here and I am pursuing this with our developers. What I am looking for is any information related to differences in application behavior when instrumentation files are loaded.

    I can't seem to reproduce what you are seeing with this sample application:
    <s:Application xmlns:fx="http://ns.adobe.com/mxml/2009"
                   xmlns:s="library://ns.adobe.com/flex/spark">
        <s:states>
            <s:State name="state1" />
            <s:State name="state2" />
        </s:states>
        <s:controlBarContent>
            <s:Button label="change state" click="currentState=currentState=='state1'?'state2':'state1'" />
        </s:controlBarContent>
        <s:Button x="50" y="50" label.state1="state1" label.state2="state2" skinClass="CustomButtonSkin" />
    </s:Application>
    Can you provide a similar sample application that demonstrates the issue?

  • EIS error while automating data loading

    Hi,
    We daily load ASO cube using EIS through planned jobs (.cbs).
    This morning the load data process was still running after 3 hours; it normally takes 20 minutes. The only error I got was the following, and there was nothing explicit in olapisvr.log!
    Error :
    Environment variable ARBORPATH is not set.
    Unable to open input file
    Essbase Integration Services Shell - 11.1.1.3.0 (Build EIS111110B021)
    Copyright(c) 1991, 2009, Oracle Corporation
    All rights reserved
    [LOCAL] ->
    Script :
    %EIS_BIN%\olapicmd.exe -f%EIS_BATCH%\dwh_cos.cbs

    Hi,
    Did you set ARBORPATH when creating the batch file?
    If not, include SET ARBORPATH=<location of the Essbase server> in the batch file and give it a try.
    Thanks,
    Prathap
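A POSIX-style sketch of the fix: export ARBORPATH before olapicmd runs (the install path here is an assumption; in a Windows .bat file the equivalent is SET ARBORPATH=...):

```shell
# point ARBORPATH at the Essbase installation so the EIS shell can start
ARBORPATH="/opt/hyperion/products/Essbase/EssbaseClient"
export ARBORPATH
echo "ARBORPATH=$ARBORPATH"
# the EIS command shell would then be invoked as in the original script:
# "$EIS_BIN/olapicmd" -f "$EIS_BATCH/dwh_cos.cbs"
```

Setting it inside the batch file (rather than relying on the user's environment) is what makes the scheduled, non-interactive run reliable.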

  • Automation of Load Rule in Unix

    Hi,
    We have Essbase installed on a Linux server. When we run a shell script to run a calc or a load rule, we get the error "Essbase client Directory Not Found".
    Do we need to install the Essbase client on the server to resolve this?
    Any help is appreciated.


  • OLE automation application load and unload in each step in the sequence of the TestStand.

    What do I need to configure so that the OLE automation application unloads after each step? Changing the RunOptions of the step doesn't help.
    I need to work with a single application instance during this test, across all sequences.
    Thanks, and I'd appreciate an answer ASAP. TestStand version 1.02.
    The OLE automation server was built in VC++ 6.0.
    I am using the ActiveX Automation Adapter.

    I think the problem might be with COM. You will need to make your sequence release its reference before COM can unload the object. To do that in TestStand, set the reference to Nothing (Nothing is a keyword in TestStand). This causes TestStand to release the reference to your object, and COM will then unload it.
    Please give us an update if that worked.
    Anand Jain
    Applications Engineer
    National Instruments
    http://www.ni.com/ask

  • InfoPackage to load multiple flat file for different plant

    Hi
    There is a requirement to load data from multiple flat files, one per plant, into BI; the source data comes from another system. Each flat file has its own naming convention.
    The data has to be loaded into the target cube via InfoPackages, so I would have to create one InfoPackage per flat file provided. However, new plants will be created as the business expands, and an additional InfoPackage would be required whenever a new plant is added.
    Is there a way to create one InfoPackage that loads the flat file data for any plant? Your advice is much appreciated.
    Thanks and regards
    Kang Ring

    Hi,
    The best way I can suggest is:
    1. Create a folder in AL11 for your requirement.
    2. In the InfoPackage, under the Extraction tab, next to the File Name field there is an ABAP Routines icon. Click it, give the routine a name, and it takes you into the ABAP editor. There you can write code that accesses the AL11 folder and picks up whatever file is in it, irrespective of the file name.
    3. Load the data using a process chain.
    4. At the end of the process chain, delete the file in the AL11 folder, because we don't need it for the next load.
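The "take whatever file is in the folder" logic of step 2 can be sketched like this (shell, purely illustrative; in BW it would be an ABAP routine reading the AL11 directory, and the file name here is made up):

```shell
# simulate the AL11 folder with one incoming plant file
dir="./al11_inbox"
mkdir -p "$dir"
touch "$dir/PLANT_1000_20100604.csv"
# pick the first (here: only) file, regardless of its name
file=$(ls "$dir" | head -n 1)
echo "file to load: $file"
# after a successful load, the last process-chain step deletes it (step 4)
rm "$dir/$file"
```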
    Else
    See the thread
    Re: Automation of load from application server
    Thanks
    Reddy
    Edited by: Surendra Reddy on Jun 4, 2010 11:03 AM

  • Automate SQL*LOADER

    Hi,
    I am working on a Decision Support System project.
    I need to load flat files into Oracle tables through SQL*Loader, and the entire process should be invoked through a Java front end.
    How do I go about it?
    I will deeply appreciate any help.
    Raghu.

    Hi,
    In our previous project, we customized (automated) SQL*Loader. We were on Unix, so we used shell scripts to create the control file (a script can create the control file automatically) and then call SQL*Loader to load the data into the tables.
    You can use the same logic here.
    If your flat file contents are always in the same format, use static control files (i.e. create the control files at installation time) and call them whenever needed. Don't go for dynamic control files.
    1. If you are using Java as the front end, you can use native methods and call the sqlldr executable. The problem is that you cannot invoke it from the client machine; you can do it only from the server side.
    2. You can also try the external procedure method: call a shared library, and let the shared library invoke SQL*Loader (write a small C shared-library program for invoking it). This way you can invoke SQL*Loader from the client machine as well.
    3. One more way is listener technology: create a listener program and run it on the server side as a background process. Whenever there is a request, it will run SQL*Loader.
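For the static-control-file suggestion above, a minimal sketch: the control file is written once at installation time, and the sqlldr call (commented out, since it needs an Oracle client) would then be launched from the server side, e.g. via Runtime.exec from Java or from the shared library. Table, column, and credential names are made up:

```shell
# write a static SQL*Loader control file
cat > employees.ctl <<'EOF'
LOAD DATA
INFILE 'employees.dat'
APPEND INTO TABLE employees
FIELDS TERMINATED BY ','
(emp_id, emp_name, salary)
EOF
echo "control file written"
# sqlldr userid=scott/tiger control=employees.ctl log=employees.log
```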
    With regards,
    Boby Jose Thekkanath
    [email protected]
    Dharma Computers(p) Ltd.
    Bangalore-India.

  • Data Load -- Best Practices Analytics 9.0.1

    We are currently implementing Essbase, and I would be interested in feedback concerning data load practices.
    We have a front-end system which delivers live operational-type data in a SQL database. Currently, I use Access to run queries against the data to load into Enterprise, but I would like to move to an automated, daily load for Essbase. At this point in Essbase, I have several load rules that I apply to Excel files exported from Access (not a good solution). I would assume that a better answer would be a SQL load, but I wonder how others typically go about loading information. What about loading financial data consolidated in another system (Enterprise)?
    Thanks for any feedback,
    Chris

    Wanted to give an update of my progress today.
    I again began with a clean installation of 9.0.0. Brought up the CF Administrator and completed the installation. From there, I went directly to installing the 9.0.1 update and the 9.0.1 hotfix. To my amazement, the CF Administrator came up without an issue. But . . .
    I then went into the Administrator to install my 'customizations' (i.e. my datasources, my SMTP mail server, my custom tags, etc.). Truly nothing unusual. Almost sad to say: vanilla. I then shut down the service as recommended to have some of the changes 'take effect'. Boom, the CF Administrator no longer appears; it gives me a blank screen and the same error messages I listed in my first note. So again, it must be "something either I turned on/off incorrectly, but don't even know where to look".
    Would this be considered a bug?
    Libby H

  • OWB and AWM for loading DWH

    Hello,
    I am on the following version:
    DB: Oracle 10.2.0.1 (both source DB and target DWH)
    OWB : 10.2.0.1
    AWM : 10.2.0.1
    I have the following process:
    From the source DB I use OWB to load relational dimensions (SCD 2), and I use AWM to load data from these dimension tables into an analytic workspace. I use OWB because I need to track changes (SCD), and AWM because OWB currently does not support parent-child dimensions, whereas AWM does. I would love to load the AW from OWB directly if it had parent-child support.
    My question is:
    All the loads should be trigger-based. When my process (SQL) completes on the source table, the OWB process that loads the relational dimensions should start; that is not a problem, as I am using run_owb_code.sql for it. Once run_owb_code is complete, I want AWM's Maintain Data process to start, and I don't know how to do that. I tried saving the maintenance task to a script and executing it, but it doesn't seem to work. I don't want to open AWM and run cube maintenance every time the cube needs to be refreshed.
    Please note I don't have a cube in the relational world. My fact (cube) in the AW is a direct load from a table (as I don't need SCD here), so basically I use OWB just to load the dimensions. AWM creates the actual cube from the dimension tables loaded using OWB and the fact table from the source, so I don't think I can use dbms_awm.refresh_awcube(...).
    Can somebody please help me in automating this load process?
    Thanks a lot in advance!
    Maruthi

    Hello,
    I am sorry, after some research, I came across this post
    Re: Cube and dimension refresh proactively
    As per the above post, the script generated by AWM works only if the AW is not attached by any user.
    I detached the AW and executed the script and the cube was populated with data.
    Thanks,
    Maruthi
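The detach-then-run sequence could be chained behind the same trigger that runs run_owb_code.sql; a rough sketch of such a chain script (credentials and the maintenance script name are hypothetical; the sqlplus calls are commented out since they need a real database):

```shell
# write the chain script the trigger would invoke: OWB load first,
# then the AWM-generated maintenance script (the AW must be detached)
cat > refresh_dwh.sh <<'EOF'
#!/bin/sh
# sqlplus -s dwh_user/secret @run_owb_code.sql
# sqlplus -s dwh_user/secret @maintain_aw_cube.sql
EOF
chmod +x refresh_dwh.sh
echo "chain script written"
```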

  • Issue about Automated Process Traffic light status

    Hi experts,
    In our system (BW 3.1), we found something interesting. The automated data load finished, but when monitoring the status using transaction RSMO, we found some traffic lights set to red. The reason for the red light is a timeout.
    I figure there is something like a 'finish flag' that turns the traffic light green; after the data load finished, the system didn't receive it, so it waited until the timeout.
    I am not sure of it, and can't find any material about it.
    Could anyone help me to explain this phenomenon?
    many thanks and best regards,
    Zehua

    Hi,
    Data can end up in the cube even when the traffic light is red; whether the load made it into the cube depends on the processing stage at which it failed. You can see this in the monitor.
    Go to the Details tab. There are multiple headings: Extraction, Transfer, Processing.
    Extraction and Transfer relate to how the data is brought into the system. Processing is when the data is actually loaded somewhere, either the PSA or an InfoProvider or both, depending on the processing type set in the InfoPackage.
    In your process chain, the various processes are triggered depending on the type of link maintained between two processes. These are of 3 types: 1) Successful, 2) Failed, 3) Always.
    With the 1st, the chain moves forward if the previous step is green; with the 2nd, if it is red; with the 3rd, always.
    Regarding why the statuses are not being updated: check where the load is yellow. If it is in Transfer, one possibility is that IDocs aren't being processed properly; if it is a custom DataSource with a function module, check whether the NO_MORE_DATA exception is being raised properly.
    Also check whether there are any short dumps in the source system or the BW system.
    Hope this helps,
    Regards,

  • Automated creation of resources, subinventories and locations

    hi folks,
    We are on R12.1.3, and I have been tasked with setting up an automated load of new subinventories for new inventory orgs (we will have plenty of both) in the Inventory module, as well as setting up resource addresses and subinventories in the Service module.
    My charter is to set up subinventories (usable and defective, one of each per field resource) for these new inventory orgs, set up resources (people, logical), and finally associate the subinventory addresses with resources.
    I am sorry for the rather open-ended question. I would like to know if there are any particular methods that are better than others, and any precedence or sequencing of events to keep in mind. Is a UI / custom table necessary at all, or will a flat file suffice?
    Broadly, if there are suggestions on how to go about this in a logical and unobtrusive fashion while minimizing pitfalls, I would be thankful.
    Merry Christmas and a happy New Year to everyone.
    Kind regards.

    Hi Dennis,
    Did you solve this issue?
    Can you please share the solution with me?
    Bye, Angelo

  • Business Object - Automation

    Hi,
    I have a dashboard whose data comes from different teams. The teams send their data through email, so I collect the data and put it into the dashboard manually.
    Is there any way to automate this in BO instead of feeding the data into the dashboard manually each time?
    I have done a semi-automation: I loaded all the data received by email from the different teams into an Access DB, created a universe connected to that Access DB, and then mapped all the data into the dashboard through the query browser, so the data refreshes automatically.
    But instead of loading all the data manually every time, is there any way to automate this in BO or Xcelsius?
    Can anyone give a suggestion on this: how to avoid the emailed data and load data automatically into the database?
    Regards,
    Rajesh

    You have to automate this before it gets to BO. Collecting data via email is obsolete.
    Give your users mobile apps connected to your backend DB for automated, up-to-date data storage, then connect BO to that DB for proper reporting.
    BO's purpose is to report on data already collected; collecting data is not BO's purpose.
