How to automate FDM load process

Hi,
I am using an import script to extract data from the GL into HFM through FDM.
I know how to automate the process if I have to load a flat file into the target system through FDM.
But I am not using a flat file; I am using an import script to load GL data directly into HFM.
So can anyone let me know if it is possible to automate this?
Thanks,
Hima

Yes, put an empty file in the open batch directory named according to the batch naming standards. This will trigger the import integration script.
Please flag this answer as correct if you find it to be so.
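For reference, the trigger can be scripted. The sketch below just creates an empty, correctly named file in the open batch folder; the directory path, the POV segment values, and the "~" delimiter are illustrative assumptions — check your FDM configuration and the admin guide for the naming standard configured on your system.

```shell
# Hedged sketch: create an empty open-batch trigger file for FDM.
# All names below (directory, POV segments, delimiter) are examples,
# not your actual configuration.
BATCH_DIR="./OpenBatch"   # hypothetical; typically under the FDM app's Inbox

mkdir -p "$BATCH_DIR"

# The file is empty on purpose: the import integration script pulls the
# GL data itself, so only the file NAME matters -- it sets the POV, e.g.
# FileID~Location~Category~Period~LoadMethod
touch "$BATCH_DIR/1~NEWYORK~ACTUAL~Jan-2012~RR.txt"
```

A scheduler (Windows Task Scheduler or cron) can drop this file and then kick off the FDM batch loader.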

Similar Messages

  • How to design data load process chain?

    Hello,
    I am designing data load process chains for the first time and would like some general information on best practices in that area.
    My situation is as follows:
    I have 3 source systems (R3 and two for which I use flat files).
    What do you suggest: should I define one big chain for my whole loading process (I have about 20 InfoSources), or a few shorter ones, e.g.
    1. Master data R3
    2. Master data flat file system 1
    3. Master data flat file system 2
    4. Transaction data R3
    5. Transaction data file sys 1
    ... and execute them one after another on successful completion?
    Could you also suggest me any links or manuals on that topic?
    Thank you
    Andrzej

    Andrzej,
    My advice is to make separate chains for master and transaction data (always load in this order!) and afterwards make a 'master chain' where you insert these two chains one after the other (so: Start process -> Master data chain -> Transaction data chain).
    Regarding the separate chains: parallelize as much as possible (where functionally allowed). Normally, the number of parallel ('vertical') chains equals the number of CPUs available (check with your Basis person).
    Hope this provides you with enough info to start off with!
    Regards,
    Marco

  • Flat file loading... how to automate the loading of multiple files

    Hi all,
    I'm wondering if it's possible to create a routine that automates the loading of several flat files into a PSA table.
    I only want a single InfoPackage, and I would like it to load file 1, then file 2, then file 3, and so on.
    The alternative is to create info packages for each flat file, but this seems clunky and inefficient.
    Is this possible?
    Regards,
    Frederick

    Hi,
    It is not possible with a single InfoPackage; you have to create separate InfoPackages. But you can put them in a process chain in sequence to load the different flat files.

  • How to automate the loading of Correspondence attachment and templates

    Under the "Administration - Document" screen in our application (version 8.0.0.3, Public Sector), we have 4 tabs/views: "Correspondence Attachments", "Case Attachments", "Correspondence Templates" and "Literature". We are going to have at least 3000 files that need to be loaded into each of these views. Is there a way to automate the bulk loading of these documents/files into Siebel (the Siebel operation performed could be Insert, Update or Delete)?
    Your help is really appreciated.
    Thanks
    S

    "2. To avoid the above painful process as an ongoing task, setup the document server for
    handling this more smoothly. Refer Correspondence, Proposals, and Presentations Guide."
    We do have the document server component up and running. But if we have 3000 templates that need to be loaded into Siebel, how would the document server help without having someone manually create a new Correspondence Template record for each one? I am looking for an automated process that can be reused repeatedly.

  • How to automate the loading of excel file

    Hi Guru,
    As we know, we can load an Excel file through ODI, but there is some manual work to do each time, and it is only supported on the Windows platform.
    How can we load an Excel file through ODI on a Unix system? It should be automated, so that when the file arrives in a given directory the data is loaded with no manual intervention required.

    On Windows the JDBC-ODBC bridge is used to read from the Excel file. Such an ODBC setup is available only on Windows, hence the restriction. If a type 4 JDBC driver were available for Excel, it would work on Unix too.

  • How to automate manual idoc processing in BD87

    I am trying to automate the manual processing of IDocs in BD87. I used the following code to pass the IDoc ID to the global parameter 'DCN' and then skip the first screen of BD87 to go directly to processing. After running the following code
    SET PARAMETER ID 'DCN' FIELD itabhdr-idoc_id.
    CALL TRANSACTION 'BD87' AND SKIP FIRST SCREEN.
    I still get the first screen, because it does not pick up my IDoc ID in BD87. How can I pass the IDoc ID to the global parameter? I have used the same code to go to VA02 with VBELN and it worked perfectly.

    Hi,
    Try this small BDC program...This worked for me..
    PARAMETERS: p_idoc TYPE edidc-docnum DEFAULT '0000000000519276'.
    DATA: t_bdcdata TYPE STANDARD TABLE OF bdcdata.
    PERFORM bdc_begin USING 'RBDMON00' '1100'.
    PERFORM bdc_valu  USING: 'BDC_CURSOR' 'SX_UPDDA-HIGH',
                             'BDC_OKCODE' '=CRET',
                             'SX_DOCNU-LOW' p_idoc ,
                             'SX_CRETI-LOW' '00:00:00',
                             'SX_CRETI-HIGH' '00:00:00',
                             'SX_UPDDA-LOW' '',
                             'SX_UPDDA-HIGH' ''.
    CALL TRANSACTION 'BD87' USING t_bdcdata MODE 'E'.
    *       FORM bdc_begin                                                *
    *  -->  RV_PROGRAM                                                    *
    *  -->  RV_SCREEN                                                     *
    FORM bdc_begin USING rv_program rv_screen.
      DATA: s_bdcdata TYPE bdcdata.
      s_bdcdata-program = rv_program.
      s_bdcdata-dynpro  = rv_screen.
      s_bdcdata-dynbegin = 'X'.
      APPEND s_bdcdata TO t_bdcdata.
    ENDFORM.
    *       FORM bdc_valu                                                 *
    *  -->  RV_FNAM                                                       *
    *  -->  RV_FVAL                                                       *
    FORM bdc_valu USING rv_fnam rv_fval.
      DATA: s_bdcdata TYPE bdcdata.
      s_bdcdata-fnam = rv_fnam.
      s_bdcdata-fval = rv_fval.
      APPEND s_bdcdata TO t_bdcdata.
    ENDFORM.
    Thanks,
    Naren

  • How to automate the scheduling process in DI

    Hi all,
       I need to schedule jobs whenever my source gets updated. For example, I have a source in Oracle; as soon as it is updated, my jobs should be scheduled automatically.
    Thank you

    You didn't specify your source environment, and the solution might depend on it. Since you didn't, I'll assume you are using Oracle; nonetheless, the same procedure would apply to other database systems.
    You can create a PL/SQL routine that could be used as a trigger for any inserts/updates.
    With Oracle you can use the HOST command; see:
    http://www.oracle.com/webapps/online-help/forms/10g/state?navSetId=_&navId=3&vtTopicFile=f1_help/builth_m/host.html&vtTopicId=
    e.g. host('c:\dir.bat')
    You can use the job .bat file that is generated for your job when you do an "export execution command" from the Management Console.
    Hope that helps,
    -Gera Mats

  • Help Please : Cron Job to automate load process

    Hi
    I am trying to automate data load process. I am loading data into a number of tables from flat files.
    Currently I have a UNIX (SunOS) file with a bunch of SQLLDR commands and I changed permission on this file to executable. Every morning I execute this file to load data.
    Now I want to automate this process by writing a cron job and scheduling it. I am running into a number of problems: I exported ORACLE_SID, ORACLE_HOME and PATH, but cron is still unable to find sqlldr.
    What else am I missing? Here are my command file and cron file.
    Please help!
    ORAENV VARiables
    export ORACLE_HOME=/export/opt/oracle/product/8.1.6
    export ORACLE_SID=fid1
    export PATH=$PATH:$ORACLE_HOME/bin
    .profile
    . $HOME/oraenv
    daily_full.sql file
    export ORACLE_SID ORACLE_HOME PATH
    sqlldr userid=user/pwd control=acct.ctl log=acct.log
    sqlldr .......
    Cron Job
    16 11 * * 1-5 /apps/fire/data/loadscripts/daily_full.sql >> /apps/fire/data/loadscripts/fulllog.log 2>&1
    Output fulllog.log file
    /apps/fire/data/loadscripts/daily_full.sql: sqlldr: not found
    /apps/fire/data/loadscripts/daily_full.sql: sqlldr: not found
    Thanks
    Shanthi

    Hi Ramayanapu,
    first, you have written a shell script, not an SQL script. Please rename your file from daily_full.sql to daily_full.sh.
    I suggest you run the cron job as a user whose environment has the ORACLE_SID and ORACLE_HOME variables set.
    In this case cron will operate from the $HOME of this user.
    Also, your export may clobber the .kshrc settings. The statement has no effect in your script, so please remove it.
    Change your sqlldr statement as follows:
    $ORACLE_HOME/bin/sqlldr userid=user/pwd control=<path>acct.ctl log=acct.log
    where <path> is replaced with the path to your control file.
    user/pwd must correspond to an Oracle user who has the right to insert into the destination table.
    Your log file will be placed in the $HOME directory.
    Hope that I could help solve your problems
    with kind regards
    Hans-Peter
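    Putting these points together, a self-contained version of the loader script might look like the sketch below. Paths and credentials are the placeholders from the thread, not real values; the guard around sqlldr is only there to keep the sketch harmless on a machine without Oracle.

```shell
#!/bin/sh
# daily_full.sh -- self-contained so cron needs nothing from .profile.
# cron starts jobs with a minimal environment, so set it all here.
# Paths and credentials below are placeholders from the thread.
ORACLE_HOME=/export/opt/oracle/product/8.1.6
ORACLE_SID=fid1
PATH=$PATH:$ORACLE_HOME/bin
export ORACLE_HOME ORACLE_SID PATH

# Call sqlldr by its full path so the script works even if PATH is odd.
if [ -x "$ORACLE_HOME/bin/sqlldr" ]; then
  "$ORACLE_HOME/bin/sqlldr" userid=user/pwd control=acct.ctl log=acct.log
else
  echo "sqlldr not found under $ORACLE_HOME/bin" >&2
fi

# Matching crontab entry (note the .sh extension):
# 16 11 * * 1-5 /apps/fire/data/loadscripts/daily_full.sh >> /apps/fire/data/loadscripts/fulllog.log 2>&1
```

    Because the script sets its own environment, it runs identically from cron and from an interactive shell.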

  • Automate data load from BPC cube to BW cube

    Hi Gurus,
    I've got all my budgeting & forecasting data in BPC cube. Now I need to load it to a BW cube and combine with Actuals in another BW cube through Multiprovider and build reports on the multiprovider.
    My question is:
    What is the best way to automate the loading process of BPC cube data to BW cube ??
    I should also be able to load the property values of BPC dimensions to the BW info objects.
    The methods I followed are:
    1. Run the "Export" data package to write BPC data to a CSV file, then run a BW DTP/InfoPackage to load the CSV file into the BW cube. The problem with this is that I cannot automate these two steps, and even if I did, I cannot export property values to a flat file.
    2. Build transformations directly from the BPC cube to an InfoSource and from the InfoSource to the BW cube. The problem with this is that in the transformations I cannot use the "Read Master Data" rule. I may have to write a routine, but my ABAP is not good enough.
    Please help with an alternative solution
    Thanks,
    Venkat

    Thanks for the reply. I know I would have more options if I were on BPC 7.5 NW, but I'm on BPC 7.0 NW.
    I managed to pull the attribute values of BPC dimensions using routines. But it is still risky, because the technical names of BPC objects may change if someone modifies the dimensions or runs optimization.
    It's a shame SAP hasn't provided a robust solution to load BPC cube data into a BW cube.
    I don't want to load BW cube data into the BPC cube and depend on EVDRE reports for my 'Plan vs Actual' reports. I (and the end users) always lean towards BEx reports.

  • How to automate backup on SSM?

    Hi guys,
    I need assistance in how to automate the backup process on SAP Strategy Management 7.5 (SP10).
    Actually, I'm using the Transporter to generate the ZIP file that contains the structure (metadata) in SSM.
    Also, I'm generating a dump file from my dimensional model via the Application Server client.
    But I didn't find a way to automate these processes.
    Any ideas for this demand?
    I appreciate the suggestions.
    Best Regards,
    Bruno Heissler

    Guys,
    Just to let you know, to automate the backup process for PAS (the dimensional model) I've worked with the batch command line. Something like this:
    PASADMIN.EXE -j DUMPCUBO;CUBENAME -u USER
    It's working fine, generating the dump files in the Application Server/Home directory.
    Any idea how to automate the backup process for the Transporter tool?
    Best Regards,
    Bruno Heissler
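    For several models, the same PASADMIN call can be wrapped in a loop. The sketch below only writes the commands it would run into a log file, since PASADMIN.EXE exists only on the SSM server; the cube names are hypothetical.

```shell
# Sketch: generate one PASADMIN dump command per model, using the flags
# from the post above (-j job;cube -u user). On the SSM server you would
# execute each line instead of logging it; cube names are placeholders.
LOG="pas_backup_commands.log"
: > "$LOG"
for CUBE in CUBEA CUBEB CUBEC; do
  echo "PASADMIN.EXE -j DUMPCUBO;$CUBE -u USER" >> "$LOG"
done
cat "$LOG"
```

    The generated file doubles as an audit trail of which models were dumped in each scheduled run.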

  • How to automate the data load process using data load file & task Scheduler

    Hi,
    I am automating the process to load data into a Hyperion Planning application with the help of a Data_Load.bat file and Task Scheduler.
    I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
    Could you help me automate the data load process using the Data_Load.bat file and Task Scheduler, or tell me what other files are required to achieve this?
    Thanks

    To follow up on your question: are you using MaxL scripts for the data load?
    If so, I have seen an issue where, if the batch (e.g. load_data.bat) does not contain the full path to the MaxL script, running it through the Task Scheduler appears to work, but the log and/or error files are not created. In other words, the batch claims it ran from the Task Scheduler although it didn't do what you needed it to.
    If you are using MaxL, use this in the batch:
    essmsh C:\data\DataLoad.mxl
    Or you can use the full path for essmsh; either way works. The only reason the MaxL might then not work is if the batch is not updated with all the MaxL PATH changes, or if you need to update your environment variables so the essmsh command works in a command prompt.
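    As a concrete illustration of the full-path advice, a scheduler-safe wrapper might look like the sketch below. The essmsh and script locations are assumptions for the example; substitute the paths from your own install.

```shell
# Sketch of a scheduler-safe wrapper: full paths everywhere, because
# Task Scheduler does not inherit your interactive PATH or working
# directory. All file locations here are hypothetical.
ESSMSH="/opt/hyperion/products/Essbase/EssbaseClient/bin/essmsh"  # essmsh.exe on Windows
SCRIPT="/data/DataLoad.mxl"
LOGDIR="${TMPDIR:-/tmp}/essbase_logs"

mkdir -p "$LOGDIR"
if [ -x "$ESSMSH" ]; then
  "$ESSMSH" "$SCRIPT" > "$LOGDIR/dataload.log" 2> "$LOGDIR/dataload.err"
else
  echo "essmsh not found at $ESSMSH; fix the path for your install" >&2
fi
```

    Redirecting stdout and stderr to explicit log paths is what makes silent scheduler failures visible.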

  • Automate the Cube loading process using script

    Hi,
    I have created the Essbase cube using Hyperion Essbase Studio 11.1.1 and my data source is Oracle.
    How can I automate the data loading process into the Essbase cubes using .bat scripts?
    I am very new to Essbase. Can anyone help me on this in detail?
    Regards
    Karthi

    You could automate the dimension building and data loading using Esscmd/MaxL scripts and then call them via .bat scripts.
    Various threads are available on this topic. Anyway, you could follow these steps.
    For any script provide the login credentials and select the database.
    LOGIN server username password ;
    SELECT Applic_name DB_name;
    To build dimension:
    BUILDDIM location rulobjName dataLoc sourceName fileType errorLog
    Eg: BUILDDIM 2 rulfile 4 username password 4 err_file;
    For Dataload
    IMPORT numeric dataFile fileType y/n ruleLoc rulobjName y/n [ErrorFile]
    Eg: IMPORT 4 username password 2 "rulfile_name" "Y";
    Regards,
    Cnee
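    The ESSCMD statements above can be collected into a script file that a .bat (or shell) wrapper runs; that pairing is what the scheduler calls. Server name, credentials, and rule-file names below are the placeholders from the post, not real values.

```shell
# Sketch: write the ESSCMD statements from the post into a script file,
# then run it with esscmd from the scheduled batch. All credentials and
# object names are placeholders.
cat > nightly_load.scr <<'EOF'
LOGIN "server" "username" "password" ;
SELECT "Applic_name" "DB_name" ;
BUILDDIM 2 "rulfile" 4 "username" "password" 4 "err_file" ;
IMPORT 4 "username" "password" 2 "rulfile_name" "Y" ;
EXIT ;
EOF

# On the Essbase server this would simply be:  esscmd nightly_load.scr
if command -v esscmd >/dev/null 2>&1; then
  esscmd nightly_load.scr
else
  echo "esscmd not on PATH; run the script on the Essbase server"
fi
```

    Keeping the statements in a script file means the scheduler only ever calls one wrapper, and the load logic can be edited without touching the schedule.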

  • Automate the data load process using task scheduler

    Hi,
    I am automating the process to load data into a Hyperion Planning application with the help of Task Scheduler.
    I have created a Data_Load.bat file, but I am unable to complete the rest of the process.
    Could you help me automate the data load process using the Data_load.bat file and Task Scheduler?
    Thanks

    Thanks for your help.
    I have done the data loading using the Data_Load.bat file. The .bat file calls a .msh file; following your steps, the automatic data load is possible.
    If you need this, please let me know.
    Thanks again.

  • How to set delta load in process chain

    hello all
    The scenario is: I created a process chain for transaction data, and all the InfoPackages are set to initial load. Now I want to do delta loads. How do I set delta in the process chain, so that only the new records are loaded?
    kindly anyone let me know.
    regards
    balaji

    Hi Balaji,
    In most projects they don't have the "init" InfoPackages in the process chain. They have 2 InfoPackages for a load: one for init and the other for delta. Now that you have created the process chain, what I suggested was to change the InfoPackage settings to delta load and change the description of the InfoPackage.
    The other option would be to create new InfoPackages for delta and include them in the chain, replacing the "init" ones.
    Bye
    Dinesh

  • How to make udev load ALSA driver automaticly?

    I added "modprobe snd-via82xx" (the ALSA driver) to /etc/profile.d/mysh.sh, and I could use almost every multimedia application except mp3blaster.
    But after I installed udev, udev loads "via82cxxx_audio" (the OSS driver) automatically instead of snd-via82xx. mp3blaster works now, but the esd output plugin of XMMS does not, and neither does StarDict (has anyone here used it?).
    I modprobe snd-via82xx manually, but it has no effect.
    How can I make OSS and ALSA work in parallel? And how can I make udev load ALSA automatically instead of OSS?

    I already know this method. What I mean is that udev loads the OSS driver at boot. Although I modprobe the ALSA driver, my application (StarDict) cannot play a WAV sound (other applications can).
    I want udev NOT to load the OSS driver at boot. Or, best of all, make OSS and ALSA work in parallel.
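    A common approach on systems of that era is to blacklist the OSS module so the autoloader picks the ALSA one. The exact blacklist file depends on the distribution and udev/module-init-tools version, so treat the paths below as assumptions; the demo writes to a local file rather than /etc.

```shell
# Sketch only: stop via82cxxx_audio (OSS) from autoloading so that
# snd-via82xx (ALSA) handles the card. The target blacklist file varies
# by distribution (/etc/modprobe.d/blacklist, or /etc/hotplug/blacklist
# on older setups); this demo writes to a local stand-in file instead.
BLACKLIST="./blacklist.demo"   # stand-in for the real blacklist file
echo "blacklist via82cxxx_audio" >> "$BLACKLIST"
cat "$BLACKLIST"
# After editing the real file, make sure snd-via82xx is in the
# boot-time modules list, then reboot (or reload udev/hotplug).
```

    With the OSS module blocked, only ALSA binds the card; OSS applications can then go through ALSA's OSS emulation modules instead of the native OSS driver.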
