FDM Script to import data

Hi,
I'm new to FDM scripting, so any pointers will be appreciated. I need to import a trial balance (TB) with the following format from an Excel file.
Row 1 - Cost centre number
Row 2 - TB name
Row 3 - Other parameters
Row 4 - Blank
Row 5 - Data headers
Row 6 onwards - Data rows, in the format: Account, Desc, Opening Bal, Closing Bal
I have created an import format with the four fields CostCentre, Account, Desc and Closing Bal. Now my question is:
1. How do I create a script that picks the cost centre number from row 1?
Thanks in Advance!!

That format will not work in an Excel file. FDM requires a specific Excel template, and the FDM Excel templates do not use import scripts, so first convert the file to CSV.
You will then need to use a temporary variable (e.g. RES.PvarTemp1) and create two import scripts.
Assuming that the cost centre is the only value on line one of the file, what you need to do is:
Attach an import script to the Amount entry in your import format with the following code (this assumes the cost centre is always on the first header line and that your script is called GetCostCentre):
Function GetCostCentre(strField, strRecord)
    'Capture the cost centre once, from the first line processed
    If Not Len(RES.PvarTemp1) > 0 Then
        RES.PvarTemp1 = DW.Utilities.fParseString(strRecord, 4, 1, ",")
    End If
    'Return the amount field unchanged
    GetCostCentre = strField
End Function
Then attach an import script called SetCostCentre to the CostCentre entry in the import format; it simply returns the stored value:
Function SetCostCentre(strField, strRecord)
    'Return the cost centre captured by GetCostCentre
    SetCostCentre = RES.PvarTemp1
End Function
There is a very similar example in the Scripting chapter of the FDM admin guide, which has a bit more description.

Similar Messages

  • Automate script for importing data

    I want to automate a script for importing data to the cube from txt files.
    For example, I have the following two files to import:
    aaaa.txt
    bbbb.txt
    For the above I can simply write an ESSCMD or MaxL script to import the files.
    ESSCMD:
    IMPORT 3 "\aaaa.txt" 4 "N";
    IMPORT 3 "\bbbb.txt" 4 "N"
    My question: suppose more files are added later, for example cccc.txt, how can I write code to handle this?

    Well, I guess this will do:
    #!/bin/bash
    dataDir=""          # give the data directory path here
    maxlFile="maxl1.msh"
    cd "$dataDir" || exit 1
    for file in *
    do
        maxlStmt="import database sample.basic data from data_file '$file' using rules_file 'xxx.rul' on error write to 'xxx.log';"
        printf '%s\n' "$maxlStmt" >> "$maxlFile"
    done
    The glob * picks up every file in the directory. If you want to be more specific, and you think your data folder may contain other files, you can use *.txt if the extension of the files to load is txt.
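    One note to add: the script only generates the MaxL file; to actually load the data you would execute the generated file with the MaxL shell (e.g. essmsh maxl1.msh), and the file will also need a login statement at the top so it can connect to Essbase - adapt that part to your environment.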
    I haven't tested this as I am on vacation, so just follow a similar approach.
    This only covers the third point you mentioned.

  • Using  FDM Import Action script to import data from ERPI table "tdataseg_t"

    Hi Experts
    I have extracted data from EBS using ERPI and loaded it into the intermediate table "tdataseg_t".
    I am trying to use an Import Action script within FDM to extract data from the "tdataseg_t" table (where the ERPI extract is stored), as I could not use a normal import script.
    Requirement: I have to restrict the Custom2 dimension based on the Account dimension.
    Dim Account
    Account = ????   ' this is where I am stuck
    If Account >= 1000 And Account <= 5000 Then
        Custom2 = Right(Product, 5)
    End If
    But I could not find the exact syntax to read the Account dimension from the "tdataseg_t" table.
    Please Advise
    Thanks
    Sak

    Hello,
    Importing data via a manual script defeats the purpose of ERPi, as it will not provide you with an audit trail, drill-through/drill-back, etc.
    Your best bet is to use the default settings of the software. Just because it returns more records than you want is not a bad thing. You can conditionally change/alter the information either in an event script inside FDM or by mapping the unneeded information to IGNORE.
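    For illustration, here is a minimal sketch of the mapping option - a conditional map script on the Custom2 dimension. The varValues positions below are assumptions for illustration only; check the conditional-mapping reference in the FDM admin guide for the correct indexes in your release:
    Dim lngAccount
    lngAccount = CLng(varValues(14))    'assumed position of the source Account
    If lngAccount >= 1000 And lngAccount <= 5000 Then
        Result = Right(varValues(19), 5)    'assumed position of the source product field
    Else
        Result = "IGNORE"
    End If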
    Thank you,

  • Is it possible to create an Apple Script to import data from a web data feed (RESTful XML/JSON)?

    Before I get started looking for a programmer, does anyone know if it is possible to create a script that will allow Numbers to connect to an external data feed in either XML or JSON? I want to enable some colleagues who use Macs to connect to the same data feed that Excel for Office connects to. Thanks,

    It is definitely not hard to import XML or JSON data via an AppleScript. As an amateur tinkerer, I found a way, for example, to bring in OFX data from my bank (OFX is a flavor of XML) using the XML parser built into System Events, which is AppleScriptable.
    How satisfied your colleagues would be with the solution, though, will depend on what you mean by "connect to an external data feed." If you are dealing with datasets in the thousands of rows and need frequent and rapid response, then you may not be happy. In that kind of situation you would need to expect at least a few seconds of wait time every time you "refresh."
    If, as in my case, you have only a few hundred rows and don't need to bring the data in every few seconds, then the AppleScript solution is very practical. The programmer could probably goose the performance and make the job easier by using AppleScript as a wrapper for a script in Python or another language that is faster than AppleScript at parsing XML and JSON.
    SG

  • Script for Importing data

    Hi All,
    I want to run the import of all the datafiles with the help of a script.
    I have many datafiles from different users and I want to run a DB import.
    Please help me write such a script.
    Thanking You

    GaurAVS,
    I don't know the reasons that made you select this option; did you try any other options to move data from source to destination? A couple of questions for you:
    1. Is there one export dump per schema user, or more than one file?
    2. I assume from your earlier post that you have more than 50 schema users in your database?
    Here is a simple bash script, assuming you have one export dump file for each schema user. Make a directory called data and move all the dump files into it; please read the script carefully for more instructions and modify it according to your needs.
    In addition, before running this script on a Linux box, run the following from the command prompt:
    $ dos2unix import.sh
    #!/bin/bash
    function importfile
    {
        # $1 -- username, $2 -- password; uses the global $file set by listfile
        echo "importing $file for user $1"
        imp $1/$2 full=y file=$file log=./data/`basename $file dmp`log
        test=$?
        if [ $test -ne 0 ]
        then
            echo "Importing for user $1 failed ..."
            exit 1;
        fi
    }
    function listfile
    {
        for file in ./data/*.dmp
        do
            echo "processing $file"
            # This assumes you used the username as the filename for each export dump
            unset username
            username=`basename $file .dmp`
            echo "Username is set to $username"
            # Now get the password for this username; create a file with
            # one username=password entry per line
            unset pw
            pw=`cat ./data/pwd.txt.txt | grep $username | cut -d'=' -f2`
            echo "Password is set to $pw"
            importfile $username $pw
        done
    }
    listfile

  • Import data to FDM fails for planning app

    Hi,
    I have a problem importing data from an Oracle view using an FDM script.
    I have to do two things:
    1. Import data to HFM app (it works fine)
    2. Import data to Planning
    First of all, I have configured the HFM integration and set up all the necessary information - it works fine (I can simply import data into the import view using FDM).
    In the next step I'm trying to do the same with the Planning data.
    I have set up the connection to Essbase (I tested it by manually populating the period table with values from the Essbase Period and Year dimensions).
    When running the import process I get a message that the import failed. And that's all - no more information in the logs.
    I have also checked that the data is imported from the Oracle view into my repository (a new table with the data was created), but there is a problem with showing it on the FDM import screen.
    Can anybody help me with this one?
    regards
    Karol

    Hi,
    I take it you are using an integration script to load the data from the SQL repository; are you using the example script in the documentation and updating it?
    If you are using the example script, make sure the assignment SQLIntegration = True matches the name of the function. So if the function is
    Function SqlDload(strLoc, lngCatKey, dblPerKey, strWorkTableName)
    then it needs to be SqlDload = True.
    I have used the script to load data directly from a SQL table into Essbase, so it does work in FDM.
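    For reference, here is a condensed sketch along the lines of the admin guide example; the connection string, query and source column names are placeholders/assumptions, and the real example in the guide sets a few more work-table columns:
    Function SQLIntegration(strLoc, lngCatKey, dblPerKey, strWorkTableName)
        Dim cnSS, rs, rsAppend
        Set cnSS = CreateObject("ADODB.Connection")
        Set rs = CreateObject("ADODB.Recordset")
        'Open the FDM work table for appending
        Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)
        cnSS.Open "Provider=...;Data Source=...;"              'your source connection
        rs.Open "SELECT Account, Amount FROM MY_VIEW", cnSS    'your source query
        Do While Not rs.EOF
            rsAppend.AddNew
            rsAppend.Fields("PartitionKey") = RES.PlngLocKey
            rsAppend.Fields("CatKey") = lngCatKey
            rsAppend.Fields("PeriodKey") = dblPerKey
            rsAppend.Fields("Account") = rs.Fields("Account").Value
            rsAppend.Fields("Amount") = rs.Fields("Amount").Value
            rsAppend.Update
            rs.MoveNext
        Loop
        rs.Close
        cnSS.Close
        'This assignment must match the function name
        SQLIntegration = True
    End Function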
    If you are still having trouble then let me know further information or the script you are trying to run and I might be able to spot something.
    Cheers
    John

  • FDM Scripting Query for last imported source file using Batch Processing

    Hi Experts,
    I'm currently in the process of automating the FDM load process on our version of FDM 9.3.3 using batch processing and the FDM Task Manager. Most of the process works fine, including an email alert which notifies users when a data load has taken place.
    As part of that email alert I am trying to attach the source file that has been loaded in batch processing. I have managed to get an attachment using the following FDM script object:
    "API.MaintenanceMgr.fPartLastFile(strLoc, True, False)".
    But I have noticed that this only attaches the last manually imported file rather than the last file imported using batch processing.
    My question is: can someone steer me in the right direction, either to a more appropriate API or to a step I have missed in my script?
    Any help as always would be much appreciated.
    Cheers
    Pip

    Unfortunately the batch process does not work the same way as on-line. I am assuming you are using the normal batch load and not Multiload (although the batch is similar).
    The batch file name gets recorded in the tBatchContents table, and the file is moved to the import/batches folder under the folder for the current batch run. However, if the load is successful the file gets deleted (and from memory it does not get archived). To attach the import file to the e-mail after a successful load, I think you will need to store a copy of it prior to importing the file.
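    A minimal sketch of that copy step in VBScript; strFile and strArchiveDir are placeholder names for the batch file path and your archive folder, and you would wire this into whichever event in your batch script sees the file before the import:
    Dim objFSO
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    If objFSO.FileExists(strFile) Then
        'Keep a copy so it can be attached to the notification e-mail later
        objFSO.CopyFile strFile, strArchiveDir & "\" & objFSO.GetFileName(strFile), True
    End If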

  • Performance issue with FDM when importing data

    In the FDM Web console, we have detected a performance issue when importing data (.txt files).
    In less than 10 seconds the .txt and .log files are created, in the INBOX folder (the .txt file) and in OUTBOX\Logs (the .log file).
    At that moment the system shows the message "Processing, please wait" for 10 minutes. Eventually the information is displayed; however, if we want to see the second page, we have to wait more than 20 seconds.
    It seems to be a performance issue when the system tries to show the imported data in the web page.
    It has also been noted that when a user tries to import a txt file directly by clicking on the "Select File From Inbox" tab, the user also has to wait another 10 minutes before the information is displayed on the web page.
    Thx in advance!
    Cheers
    Matteo

    Hi Matteo
    How much data is being imported / displayed when users are interacting with the system?
    There is a report that may help you to analyse this but unfortunately I cannot remember what it is called and don't have access to a system to check. I do remember that it breaks down the import process into stages showing how long it takes to process each mapping step and the overall time.
    I suspect that what you are seeing is normal behaviour but that isn't to say that performance improvements are not possible.
    The copying of files is the first part of the import process, before FDM starts the import, so that will be quick. The processing is the time taken to import the records, process the mappings and write to the tables. If users are clicking 'Select file from Inbox' then they are re-importing, so it will take just as long as it would for you to import it; they are not just retrieving previously imported data.
    Hope this helps
    Stuart

  • Table Import Data - "Insert script" - National characters

    Hi all,
    it looks like there is a problem with support for national characters in the imported data file when the "Insert script" method is chosen.
    Table -> Import Data -> Open datafile "csv".
    In the preview window the national characters from the csv data file are displayed properly, and when I choose the "Insert" or "SQL Loader" method the data is imported into the table correctly.
    But when I use the "Insert script" method, the national characters in the generated script are turned into garbage:
    http://imm.io/V0J9
    SQL Developer: Version 3.2.20.09
    OS: Windows XP SP3
    Client code page: WIN-1250
    Tested databases: 10g, 11g

    This has been fixed in the latest build. The patch is now available for download: http://www.oracle.com/technology/software/products/sql/index.html
    Regards
    Sue

  • Error. Import data in FDM

    Hi all,
    when I try to import data into FDM I get the following error:
    ** Begin FDM Runtime Error Log Entry [2009-04-02-17:29:08] **
    ERROR:
    Code......................................... 5
    Description.................................. Invalid procedure call or argument
    Procedure.................................... clsImpProcessMgr.fLoadAndProcessFile
    Component.................................... upsWObjectsDM
    Version...................................... 931
    Thread....................................... 5284
    IDENTIFICATION:
    User......................................... vnovokhatskiy
    Computer Name................................ ARSSRV99
    App Name..................................... Test1
    Client App................................... WebClient
    CONNECTION:
    Provider..................................... ORAOLEDB.ORACLE
    Data Server..................................
    Database Name................................ HYPER_srv34
    Trusted Connect.............................. False
    Connect Status............................... Connection Open
    GLOBALS:
    Location..................................... DAILYEUR
    Location ID.................................. 750
    Location Seg................................. 4
    Category..................................... ACTUAL
    Category ID.................................. 12
    Period....................................... Feb - 2009
    Period ID.................................... 2/28/2009
    POV Local.................................... False
    Language..................................... 1033
    User Level................................... 1
    All Partitions............................... True
    Is Auditor................................... False
    What can cause it?

    What, in that error log, tips you off it has something to do with dimensions being improperly mapped?

  • Need a script to import the data from flat file

    Hi Friends,
    Does anyone have any scripts to import data from flat files into an Oracle database (Linux OS)? I have to automate the script to run every 30 minutes, to check for flat files in an incoming directory and process them without user interaction.
    Thanks.
    Srini

    Here is my init.ora file
    # $Header: init.ora 06-aug-98.10:24:40 atsukerm Exp $
    # Copyright (c) 1991, 1997, 1998 by Oracle Corporation
    # NAME
    # init.ora
    # FUNCTION
    # NOTES
    # MODIFIED
    # atsukerm 08/06/98 - fix for 8.1.
    # hpiao 06/05/97 - fix for 803
    # glavash 05/12/97 - add oracle_trace_enable comment
    # hpiao 04/22/97 - remove ifile=, events=, etc.
    # alingelb 09/19/94 - remove vms-specific stuff
    # dpawson 07/07/93 - add more comments regarded archive start
    # maporter 10/29/92 - Add vms_sga_use_gblpagfile=TRUE
    # jloaiza 03/07/92 - change ALPHA to BETA
    # danderso 02/26/92 - change db_block_cache_protect to dbblock_cache_p
    # ghallmar 02/03/92 - db_directory -> db_domain
    # maporter 01/12/92 - merge changes from branch 1.8.308.1
    # maporter 12/21/91 - bug 76493: Add control_files parameter
    # wbridge 12/03/91 - use of %c in archive format is discouraged
    # ghallmar 12/02/91 - add global_names=true, db_directory=us.acme.com
    # thayes 11/27/91 - Change default for cache_clone
    # jloaiza 08/13/91 - merge changes from branch 1.7.100.1
    # jloaiza 07/31/91 - add debug stuff
    # rlim 04/29/91 - removal of char_is_varchar2
    # Bridge 03/12/91 - log_allocation no longer exists
    # Wijaya 02/05/91 - remove obsolete parameters
    # Example INIT.ORA file
    # This file is provided by Oracle Corporation to help you customize
    # your RDBMS installation for your site. Important system parameters
    # are discussed, and example settings given.
    # Some parameter settings are generic to any size installation.
    # For parameters that require different values in different size
    # installations, three scenarios have been provided: SMALL, MEDIUM
    # and LARGE. Any parameter that needs to be tuned according to
    # installation size will have three settings, each one commented
    # according to installation size.
    # Use the following table to approximate the SGA size needed for the
    # three scenarious provided in this file:
    #                  -------Installation/Database Size------
    #                   SMALL           MEDIUM          LARGE
    # Block        2K   4500K           6800K           17000K
    # Size         4K   5500K           8800K           21000K
    # To set up a database that multiple instances will be using, place
    # all instance-specific parameters in one file, and then have all
    # of these files point to a master file using the IFILE command.
    # This way, when you change a public
    # parameter, it will automatically change on all instances. This is
    # necessary, since all instances must run with the same value for many
    # parameters. For example, if you choose to use private rollback segments,
    # these must be specified in different files, but since all gc_*
    # parameters must be the same on all instances, they should be in one file.
    # INSTRUCTIONS: Edit this file and the other INIT files it calls for
    # your site, either by using the values provided here or by providing
    # your own. Then place an IFILE= line into each instance-specific
    # INIT file that points at this file.
    # NOTE: Parameter values suggested in this file are based on conservative
    # estimates for computer memory availability. You should adjust values upward
    # for modern machines.
    # You may also consider using Database Configuration Assistant tool (DBCA)
    # to create INIT file and to size your initial set of tablespaces based
    # on the user input.
    # replace DEFAULT with your database name
    db_name=DEFAULT
    db_files = 80 # SMALL
    # db_files = 400 # MEDIUM
    # db_files = 1500 # LARGE
    db_file_multiblock_read_count = 8 # SMALL
    # db_file_multiblock_read_count = 16 # MEDIUM
    # db_file_multiblock_read_count = 32 # LARGE
    db_block_buffers = 100 # SMALL
    # db_block_buffers = 550 # MEDIUM
    # db_block_buffers = 3200 # LARGE
    shared_pool_size = 3500000 # SMALL
    # shared_pool_size = 5000000 # MEDIUM
    # shared_pool_size = 9000000 # LARGE
    log_checkpoint_interval = 10000
    processes = 50 # SMALL
    # processes = 100 # MEDIUM
    # processes = 200 # LARGE
    parallel_max_servers = 5 # SMALL
    # parallel_max_servers = 4 x (number of CPUs) # MEDIUM
    # parallel_max_servers = 4 x (number of CPUs) # LARGE
    log_buffer = 32768 # SMALL
    # log_buffer = 32768 # MEDIUM
    # log_buffer = 163840 # LARGE
    # audit_trail = true # if you want auditing
    # timed_statistics = true # if you want timed statistics
    max_dump_file_size = 10240 # limit trace file size to 5 Meg each
    # Uncommenting the line below will cause automatic archiving if archiving has
    # been enabled using ALTER DATABASE ARCHIVELOG.
    # log_archive_start = true
    # log_archive_dest = disk$rdbms:[oracle.archive]
    # log_archive_format = "T%TS%S.ARC"
    # If using private rollback segments, place lines of the following
    # form in each of your instance-specific init.ora files:
    # rollback_segments = (name1, name2)
    # If using public rollback segments, define how many
    # rollback segments each instance will pick up, using the formula
    # # of rollback segments = transactions / transactions_per_rollback_segment
    # In this example each instance will grab 40/5 = 8:
    # transactions = 40
    # transactions_per_rollback_segment = 5
    # Global Naming -- enforce that a dblink has same name as the db it connects to
    global_names = TRUE
    # Edit and uncomment the following line to provide the suffix that will be
    # appended to the db_name parameter (separated with a dot) and stored as the
    # global database name when a database is created. If your site uses
    # Internet Domain names for e-mail, then the part of your e-mail address after
    # the '@' is a good candidate for this parameter value.
    # db_domain = us.acme.com      # global database name is db_name.db_domain
    # FOR DEVELOPMENT ONLY, ALWAYS TRY TO USE SYSTEM BACKING STORE
    # vms_sga_use_gblpagfil = TRUE
    # FOR BETA RELEASE ONLY. Enable debugging modes. Note that these can
    # adversely affect performance. On some non-VMS ports the db_block_cache_*
    # debugging modes have a severe effect on performance.
    #_db_block_cache_protect = true # memory protect buffers
    #event = "10210 trace name context forever, level 2" # data block checking
    #event = "10211 trace name context forever, level 2" # index block checking
    #event = "10235 trace name context forever, level 1" # memory heap checking
    #event = "10049 trace name context forever, level 2" # memory protect cursors
    # define parallel server (multi-instance) parameters
    #ifile = ora_system:initps.ora
    # define two control files by default
    control_files = (ora_control1, ora_control2)
    # Uncomment the following line if you wish to enable the Oracle Trace product
    # to trace server activity. This enables scheduling of server collections
    # from the Oracle Enterprise Manager Console.
    # Also, if the oracle_trace_collection_name parameter is non-null,
    # every session will write to the named collection, as well as enabling you
    # to schedule future collections from the console.
    # oracle_trace_enable = TRUE
    # Uncomment the following line, if you want to use some of the new 8.1
    # features. Please remember that using them may require some downgrade
    # actions if you later decide to move back to 8.0.
    #compatible = 8.1.0
    Thanks.
    Srini

  • Import data from XML file into ICM script

    Hello,
    Is it possible to read data from an XML file into an ICM Admin script? If yes, using which node?
    Thanks,
    Justine.

    Hi,
    as far as I know, not directly. The only methods I can think of right now are via a CVP application (Java class, custom node) or an IVR script.
    To import data you only have the Database node, which only allows sources from the Database Lookup Explorer, or passing data along to/from external scripts.
    Fabian

  • Is it possible to import data into a Reader Extended PDF created in Livecycle?

    Hi there,
    I make a lot of Reader Extended forms for my company. When I issue form updates, sometimes staff have to re-copy or re-type records into forms with repeating subforms. I'd like to jumpstart their process by importing data from an old, filled-out form into my new, updated form.
    I've found I can do this by making an XML data file of the data from the old form, but I can't import it into an Extended form using Adobe Reader. Most of our staff don't have Acrobat, so they aren't able to save data into non-extended forms.
    Is it clear what I'm asking? I want to import XML data into a Reader Extended form.
    Thanks for any help you can give,
    Laura

    Hi,
    It's indeed possible to import data into a Reader-enabled form with Reader.
    Here's a sample:
    LiveCycle Blog: XML per Skript in Adobe Reader importieren//Import XML via Script into Adobe Reader

  • BUG 3.0/3.1 Import data from CSV in non-English locale

    Hello,
    I have 3.0 on a German XP (NLS Decimal "," and Group ".") but with
    AddVMOption -Duser.language=en
    because of bug 9231534
    I try to import data from a delimited file that contains the number "123,23".
    When I use the Import Data wizard it generates an insert with "12323.0". The import works, but gives me the wrong value: 12323 instead of 123,23.
    In 3.1 with the German UI I try the same import.
    It generates a correct insert, "123.23", but on executing it fails, because it expects the German decimal separator. This is wrong for two reasons:
    The generated script has a dot as the decimal separator, and it would not make sense to use a comma, because that is the value separator in an insert script.
    To execute the script, it should use the same NLS settings as were used to generate it.
    Regards
    Marcus

    Hello,
    has anybody found a solution?
    Test case: SQL Developer 3.1 on a German XP with default NLS settings
    CREATE TABLE "TEST_TABLE" (
           "NUM" NUMBER
          ,"VCH"  VARCHAR2(10 BYTE)
        );
    Test file test_insert.dsv:
    num;vch
    1;KL
    1,5;tz
    12345,45;oo
    Importing using the wizard inserts the first row correctly; for the others I get:
    SET DEFINE OFF
    --Einfügen für Zeilen  1  bis  3  nicht erfolgreich
    --ORA-01722: invalid number
    --Zeile 2
    INSERT INTO TEST_TABLE (NUM, VCH) VALUES (1.5,'tz');
    --Zeile 3
    INSERT INTO TEST_TABLE (NUM, VCH) VALUES (12345.45,'oo');
    Beside the wrong umlaut in the message, the insert statement itself is correct, because you cannot use the German decimal separator "," in the script. The bug is that it should use the same NLS settings for generating and for running the script.
    Regards
    Marcus

  • Import data in After Effects

    hello,
    is it possible to import data from other sources by using scripts in AE? I would like to generate a composition that uses a variable "name" whose value is taken from a list of names in a database or another source outside AE.
    Thks,

    ExtendScript does have the capability of doing file I/O. It's described in the JavaScript Tools Guide. From the ExtendScript editor: Help > JavaScript Tools Guide CS6.
    Dan
