How to load a flat file with a lot of records

Hi,
I am trying to load a flat file with hundreds of records into an Apps table. When I create the process and deploy it onto the console, it asks for an input in an HTML form. Why does it ask for an input when I have specified the input file directory in my process? Is there any way around this where it just reads all the records from the flat file directly? Are custom queues in any way related to what I am trying to do? Any documents on this process will be greatly appreciated. If anyone can help me on this it will be great. Thank you, guys.

After deploying it, do you see from the BPEL Console's BPEL Processes tab that it is active and its status is on? It should not ask for input unless you are clicking it from the Dashboard tab. Do not click it from the Dashboard. Instead, put some files into the input directory. Wait a few seconds and you should see instances of the BPEL process being created, which start to process the files asynchronously.

Similar Messages

  • How to load a flat file in UTF-8 format in ODI as a source file?

    Hi All,
    Does anybody know how we can load a flat file in UTF-8 format as a source file in ODI? Please guide me.
    Regards,
    Sahar

    Could you explain which problem you are facing?
    Francesco

  • How can we load a flat file with very, very long lines into a table?

    Hello:
    We have to load a flat file with OWB. The problem is that each line in the file might be up to 30,000 characters long (up to 1,000 units of information per line, each 30 characters long).
    Of course, our mapping should insert these units of information as independent rows in a table (1,000 rows, in our example).
    We do not know how to go about it. We usually load flat files using table functions, but we are not sure that they will be able to cope with these huge lines. And how should we pivot those lines? Will the Pivot operator do the trick? Or maybe we should pivot those lines outside the database before loading them?
    We are a bit lost. Any suggestion would be appreciated.
    Regards

    Yes, well, we could define a 1,000-column external table and then map those 1,000 columns to the Pivot operator; perhaps it would work. But we have been investigating a little, and we think we have found a better solution: there is a Unix utility called "fold". This utility can split our 30,000-character lines into 1,000 lines of 30 characters each: just what we needed. Then we can load the resulting file using an external table.
    We think this is a much better solution than handling 1,000 columns in the external table and in the Pivot operator.
    Thanks for your help.
    Regards
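    For completeness, a minimal sketch of what the external table over the folded file could look like. The directory object, file names, table name, and single 30-character column are assumptions for illustration only; adjust them to the real layout.
    -- Assumes the wide file was first split with the Unix utility: fold -w 30 wide.dat > narrow.dat
    CREATE TABLE units_ext (
      unit_info VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir            -- hypothetical directory object
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS (
          unit_info POSITION(1:30) CHAR(30) -- one 30-character unit per folded line
        )
      )
      LOCATION ('narrow.dat')               -- hypothetical file name
    )
    REJECT LIMIT UNLIMITED;
    From there a plain INSERT ... SELECT into the target table replaces the Pivot step, since each unit is already its own row.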

  • How to load Unicode data files with fixed record lengths?

    Hi!
    To load Unicode data files with fixed record lengths (in terms of characters, not bytes!) using SQL*Loader manually, I found two ways:
    Alternative 1: one record per row
    SQL*Loader control file example (without POSITION, since POSITION always refers to bytes!):
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode.dat
    INTO TABLE STG_UNICODE
    TRUNCATE
    (
    A CHAR(2) ,
    B CHAR(6) ,
    C CHAR(2) ,
    D CHAR(1) ,
    E CHAR(4)
    )
    Datafile:
    001111112234444
    01NormalDExZWEI
    02ÄÜÖßêÊûÛxöööö
    03ÄÜÖßêÊûÛxöööö
    04üüüüüüÖÄxµôÔµ
    Alternative 2: variable length records
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode_var.dat "VAR 4"
    INTO TABLE STG_UNICODE
    TRUNCATE
    (
    A CHAR(2) ,
    B CHAR(6) ,
    C CHAR(2) ,
    D CHAR(1) ,
    E CHAR(4)
    )
    Datafile:
    001501NormalDExZWEI002702ÄÜÖßêÊûÛxöööö002604üuüüüüÖÄxµôÔµ
    Problems
    Implementing these two alternatives in OWB, I encounter the following problems:
    * How to specify LENGTH SEMANTICS CHAR?
    * How to suppress the POSITION definition?
    * How to define a flat file with variable length and how to specify the number of bytes containing the length definition?
    Or is there another way that can be implemented using OWB?
    Any help is appreciated!
    Thanks,
    Carsten.

    Hi Carsten
    If you need to support the LENGTH SEMANTICS CHAR clause in an external table, then one option is to use an unbound external table and capture the access parameters manually. To create an unbound external table, you can skip the selection of a base file in the external table wizard. When you then edit the external table you will get an Access Parameters tab where you can define the parameters. In 11gR2 the File to Oracle external table can also add this clause via an option.
    Cheers
    David
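    For reference, a rough sketch of what manually captured access parameters might look like for Carsten's layout. The table name, directory object, and file name are placeholders, and STRING SIZES ARE IN CHARACTERS is used here as the access-parameters counterpart of character length semantics; verify the exact clauses against the documentation for your Oracle version.
    CREATE TABLE stg_unicode_ext (
      a VARCHAR2(2 CHAR),
      b VARCHAR2(6 CHAR),
      c VARCHAR2(2 CHAR),
      d VARCHAR2(1 CHAR),
      e VARCHAR2(4 CHAR)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir        -- placeholder directory object
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        CHARACTERSET UTF8
        STRING SIZES ARE IN CHARACTERS  -- field lengths counted in characters, not bytes
        FIELDS (
          a CHAR(2),                    -- no POSITION: each field starts after the previous one
          b CHAR(6),
          c CHAR(2),
          d CHAR(1),
          e CHAR(4)
        )
      )
      LOCATION ('unicode.dat')          -- placeholder file name
    )
    REJECT LIMIT UNLIMITED;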

  • How to create a flat file with fixed-length records

    I need help to export an Oracle table to a flat file with fixed-length records and without column separators.
    The fixed length is the most important requirement.
    My table has 50 columns of VARCHAR, DATE, and NUMBER types.
    Date and number columns may be empty, null, or populated.
    Thanks a lot for any help.
    [email protected]

    Hi,
    You can use this trick:
    SQL>desc t
    Name                                      Null?    Type
    NAME                                               VARCHAR2(20)
    SEX                                                VARCHAR2(1)
    SQL>SELECT LENGTH(LPAD(NAME,20,' ')||LPAD(SEX,1,' ')), LPAD(NAME,20,' ')||LPAD(SEX,1,' ') FROM T;
    LENGTH(LPAD(NAME,20,'')||LPAD(SEX,1,'')) LPAD(NAME,20,'')||LPA
                                          21                    aF
                                          21                    BM
                                          21                    CF
                                          21                    DM
    4 rows selected.
    SQL>SELECT *  FROM t;
    NAME                 S
    a                    F
    B                    M
    C                    F
    D                    M
    4 rows selected.
    Regards
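    For the original 50-column case with nullable DATE and NUMBER columns, here is a variation of the same padding trick, written as a hedged sketch: the column names (name, hire_date, salary), widths, and formats are made up for illustration and need to be adjusted to the real table.
    SET PAGESIZE 0 FEEDBACK OFF TRIMSPOOL ON LINESIZE 200
    SPOOL fixed_length.dat
    SELECT RPAD(name, 20, ' ')                                       -- VARCHAR2 column, left-aligned
        || RPAD(NVL(TO_CHAR(hire_date, 'YYYYMMDD'), ' '), 8, ' ')    -- DATE column, blanks when NULL
        || LPAD(NVL(TO_CHAR(salary, 'FM9999990.00'), ' '), 11, ' ')  -- NUMBER column, blanks when NULL
    FROM   t;
    SPOOL OFF
    Each column contributes a constant number of characters, so every record comes out the same length with no separators.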

  • How to parse a flat file with C#

    I need to parse a flat file with data that looks like
    01,1235,555
    02,2135,558
    16,156,15614
    16,000,000
    You get the idea. Anyway, I'd like to just use a derived column and move on, except I need to put a line number on each row as it comes by, so the end result looks like:
    1,01,1235,555
    2,02,2135,558
    3,16,156,15614
    4,16,000,000
    I'm trying to do this with a script transformation, but I can't seem to get the hang of the syntax. I've tried looking at various examples, but everybody seems to prefer VB and I'd like to keep all of my packages in C#. I've set up my input and my output columns; I just need to figure out how to write the code that says something like:
    row_number = 1
    line_number = row_number
    record_type = input.split.get the second data element
    data_point_1 = input.split.get the third data element
    row_number = row_number ++

    /* Microsoft SQL Server Integration Services Script Component
    * Write scripts using Microsoft Visual C# 2008.
    * ScriptMain is the entry point class of the script.*/
    using System;
    using System.Data;
    using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
    using Microsoft.SqlServer.Dts.Runtime.Wrapper;
    [Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
    public class ScriptMain : UserComponent
    {
        private int rowCounter = 0;
        // Method that runs once before the rows start to pass
        public override void PreExecute()
        {
            base.PreExecute();
            // Lock the SSIS variable for reading
            VariableDispenser variableDispenser = (VariableDispenser)this.VariableDispenser;
            variableDispenser.LockForRead("User::MaxID");
            IDTSVariables100 vars;
            variableDispenser.GetVariables(out vars);
            // Seed the internal counter with the value of the SSIS variable
            rowCounter = (int)vars["User::MaxID"].Value;
            // Unlock the variable
            vars.Unlock();
        }
        // Method that runs for each record in the data flow
        public override void Input0_ProcessInputRow(Input0Buffer Row)
        {
            // Increment the counter and fill the new output column
            rowCounter++;
            Row.MaxID = rowCounter;
        }
    }
    Here is a script to get an incremental ID. In the ReadWriteVariables of the script, add the "User::MaxID" variable so the last number can be read. On the Inputs and Outputs tab, create an output column; in the code here it is MaxID, with a numeric data type.

  • How to load multiple flat files

    Hi,
    I am new to ODI 10. I have a requirement where I need to load flat files from a folder that is about 11 GB in size.
    I want to load them all in a single instance, using a single data server, a single physical schema, and a single logical schema.
    How can we do this?
    Also, what steps and precautions do we need to follow to execute this in a package?
    Thx

    Is the data in your files in the same format?
    If so, simply follow one of the many guides to looping over files that share a common structure and load them; you can do this in parallel if you want.
    http://odiexperts.com/multiple-files-single-interface/
    or
    http://www.odigurus.com/2011/05/multiple-files-single-target-table.html

  • How to load several transformation files with a single action

    Hi everybody,
    We are loading data from a BI cube into a BPC cube. We are working on SAP BPC 7.0 and we have designed several transformation files in order to load each key figure we need.
    Now we want to load all the transformation files by executing only one action. What is the best way to do it?
    We thought it would be possible to build a single process chain, where we would call the target cube and all the transformation files. That way the administrator only has to execute a single package once, which would run the process chain. We don't want the administrator to execute a package several times for the different transformation files.
    How can we do it? Is there any example or document related to it?
    Any idea out there?
    Kind regards
    Albert Mas

    Hi Scott,
    I am facing a problem when I run two rounds in one transformation file.
    I need to distribute a source field into BPC by means of two conversion files; the data follows.
    Transformation file
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=YES
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT=
    ROUNDAMOUNT=
    CONVERTAMOUNTWDIM=ZOUTPUT
    *MAPPING
    CATEGORY=*NEWCOL(ACT)
    PAO=0COSTCENTER
    TIME=0FISCYEAR
    ZOUTPUT=0FUNDS_CTR
    SIGNEDDATA=0DEB_CRE_LC
    *CONVERSION
    PAO=PAO_CONVER.XLS
    ZOUTPUT=ZOUTPUT_CONVER.xls
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=YES
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT=
    ROUNDAMOUNT=
    CONVERTAMOUNTWDIM=ZOUTPUT
    *MAPPING
    CATEGORY=*NEWCOL(ACT)
    PAO=0COSTCENTER
    TIME=0FISCYEAR
    ZOUTPUT=0FUNDS_CTR
    SIGNEDDATA=0DEB_CRE_LC
    *CONVERSION
    PAO=PAO_CONVER.XLS
    ZOUTPUT=AMOUNT_CONVER.XLS
    Conversion file 1 (PAO=PAO_CONVER.XLS)
    EXTERNAL    INTERNAL
    ID0001      F08001
    ID0002      F08001
    ID0003      F08001
    DG0001      F08001
    DG0002      F08001
    Conversion file 2 (ZOUTPUT=ZOUTPUT_CONVER.xls)
    EXTERNAL    INTERNAL    FORMULA
    ID0001      FX01        VALUE*1
    ID0002      FX01        VALUE*1
    ID0003      FX01        VALUE*.40
    DG0001      FX02        VALUE*1
    DG0002      FX02        VALUE*1
    Conversion file 3 (ZOUTPUT=AMOUNT_CONVER.XLS)
    EXTERNAL    INTERNAL    FORMULA
    ID0003      FX02        VALUE*.60
    I am getting the following error
    [Start validating transformation file]
    Validating transformation file format
    Start validation transformation 1/2
    Validating options...
    Validation of options was successful.
    Validating mappings...
    Validation of mappings was successful.
    Validating conversions...
    Validation of the conversion was successful
    Start validation transformation 2/2
    Validating options...
    Validation of options was successful.
    Validating mappings...
    Validation of mappings was successful.
    Validating conversions...
    Validation of the conversion was successful
    Creating the transformation xml file. Please wait...
    Transformation xml file has been saved successfully.
    Begin validate transformation file with data file...
    [Start test transformation file]
    Validate has successfully completed
    ValidateRecords = YES
    Reject count: 0
    Record count: 6
    Skip count: 0
    Accept count: 6
    0COSTCENTER is not a valid command or column 0COSTCENTER does not exist in source
    Validation with data file failed

  • How to load a flat file into BW-BPS using a web browser

    Hello, I have a problem with the how-to paper. I want to upload an Excel CSV file, but the paper only describes a .txt file upload. Can anybody help me? Thanks!

    You need to parse the lines coming in from the flat file.
    You can do this with generic types in your flat file structure (string).
    Then you loop through the table of strings that is your flat file and parse each string so that the line is broken up at each comma. There is an ABAP command for this, SPLIT; its syntax is as follows:
    SPLIT dobj AT sep INTO
          { {result1 result2 ...} | {TABLE result_tab} }
          [IN {BYTE|CHARACTER} MODE].
    Regards,
    Zane

  • How to spool SQL*Plus output to a flat file with a header record

    Hi all,
    I have a requirement to spool data from a table to a flat file along with column headings in the first row. I'm getting the data, but how do I get a header record in the first row as well?
    Thanks,
    Mahender.

    Hi, Mahender,
    If you give this SQL*Plus command before you start SPOOLing
    SET PAGESIZE 50000
    then you can get the usual SQL*Plus column headings, and they won't repeat unless you have more than 50,000 rows of output.
    When I can't do that, I use PROMPT:
    SPOOL foo.txt
    PROMPT empno ename job mgr ...
    SELECT empno, ename, job, mgr ...
    PROMPT will trim leading whitespace (unless you enclose the text in quotes), but it will leave the spacing between the columns alone.
    You could also do a separate query, where you select some literals from dual:
    SPOOL foo.txt
    SELECT ' empno ename job mgr ...'
    FROM dual;
    SELECT empno, ename, job, mgr ...
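    Putting the pieces together, a minimal self-contained sketch of the literals-from-dual approach; EMP here is just the classic demo table, so swap in your own table, columns, and file name.
    SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON LINESIZE 200
    SPOOL emp_extract.txt
    -- header record first, then the data rows
    SELECT 'EMPNO,ENAME,JOB,MGR' FROM dual;
    SELECT empno || ',' || ename || ',' || job || ',' || mgr FROM emp;
    SPOOL OFF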

  • How to load a flat file

    Hi,
    I have created an empty BPEL process which should take a flat file from my input directory and write it to the database. But how is this process initiated? Should the process be triggered by the presence of a flat file in the directory (which is not happening in my case)? Or should it be initiated by giving an input through an HTML form? If so, how should that be done? Any reply would be highly appreciated. Thanks in advance.

    Hi,
    You can use the File Adapter to read the file from your specified directory.
    Oracle JDeveloper BPEL Designer has built-in support for configuring the file adapter to read files from an input directory.
    HTH,
    Rakesh

  • How to Load a Flat File into BW-BPS Using a Web Browser

    Hello,
    I'm using the upload functionality described in the how-to guide.
    We want to have this functionality available for 12 different planning levels. Do I have to create the Web Interface (as described in the how-to guide) for each planning level separately, or can I pass a parameter in the URL (when calling the file upload functionality) to determine which planning level and function to use?
    This is the piece of code I want to make a bit more flexible:
    *Execute planning function
    CALL FUNCTION 'API_SEMBPS_FUNCTION_EXECUTE'
            EXPORTING
              i_area     = 'ZIPM0001' " <<<< ADJUST
              i_plevel   = 'ZCAPB006' " <<<< ADJUST
              i_package  = '0-ADHOC'  " <<<< ADJUST
              i_function = 'ZEX00001' " <<<< ADJUST
              i_param    = 'Z0000001' " <<<< ADJUST
            IMPORTING
              e_subrc    = l_subrc
            TABLES
              etk_return = lt_bapiret.
    Does someone have an idea ?
    Thank you
    Dieter

    Hi Dieter,
    You should be able to grab the variable value with the following statement (in this case 'area' is being passed along; it works for whatever you want to send):
    data: l_area type upc_y_area.
    l_area = request->get_form_field( 'area' ).
    In this case the calling URL looks like:
    <normal URL>?area=example_area
    l_area will then contain 'example_area'.
    Then, depending on the value, execute your different SEM functions.
    Note that if you want to load different flat file formats, more has to change in the functions, as indicated in the white paper.
    Hope it helps,
    Regards,
    Marc
    I got it from the following document I found on SAPNet or SDN (I forget which) some time back:
    How To… Call a BPS Web Interface with Predefined Selections

  • How to create two flat files with a single program

    Hi All,
    I am trying to create two files on the application server using OPEN DATASET and CLOSE DATASET.
    Let's say the first file is file1 and the second file is file2. But when I go to transaction AL11 and check, only the second file appears there.
    That may be because I am using OPEN DATASET and CLOSE DATASET twice in my program.
    Can you help me understand how to get both files to appear on the application server, i.e. in AL11?
    It's very urgent, please help me.
    Thanks!
    Vipin

    Hi, do one thing:
    Start your program in debugging mode, complete the first OPEN DATASET and CLOSE DATASET, and then check whether the file got created; also check the sy-subrc value when you execute your OPEN DATASET.
    If the first file is not created, no file will appear in AL11, including your new file name.
    In this way you can find out whether the file got created or not.
    Also check that you are giving the files different names; otherwise you will keep overwriting the existing one.
    Regards,
    Sasi

  • Extract of data from Oracle to a flat file with column and record separators

    I have a case where I have to extract all the data from an Oracle table, with the columns separated by |@|
    and the records by ^*^.
    The reason for this is that my data contains spaces and newlines, so for my program to recognize each column and record I want them separated by special characters.
    I tried this, but it was not much help:
    set echo off newpage 0 space 0 pagesize 0 feed off head off trimspool on;
    spool on;
    set colsep '|@|';
    set recsepchar '^*^';
    spool "T_COMPLAINT.dat";
    select * from T_COMPLAINT where ROWNUM < '100' order by cptoid;
    spool off;

    Having '@' and '*' characters in the data will not make any difference if you are using a combined column separator of '|@|', provided any process you use subsequently can handle it.
    However, the recsepchar parameter appears to be restricted to a single character which is repeated right across the page, so I don't think you can get a single occurrence of '^*^' as a record separator using this method:
    SQL> set  newpage 0 space 0 pagesize 0 feed off head off trimspool on
    SQL> set recsep EACH
    SQL> set recsepchar '*'
    SQL> set colsep '|@|'
    SQL> select * from testa;
    A         |@|1@        |@|22-JUN-2010
    B         |@|2*        |@|22-JUN-2010
    B         |@|2*        |@|22-JUN-2010
    ********************************************************************************
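    If the single-character limitation on recsepchar is the blocker, one workaround is to build both separators yourself in the SELECT and spool a single concatenated column. This is only a sketch: the column names after cptoid are made up, so substitute the real columns of T_COMPLAINT.
    SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON LINESIZE 32767
    SPOOL T_COMPLAINT.dat
    -- column separator |@| and record terminator ^*^ are written explicitly
    SELECT cptoid || '|@|' || complaint_text || '|@|' || created_date || '^*^'
    FROM   T_COMPLAINT
    WHERE  ROWNUM < 100;
    SPOOL OFF
    Because the whole record is one string, colsep and recsepchar are no longer needed at all.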

  • How to handle a flat file with variable delimiters in the file sender adapter

    Hi friends,
    I have some flat files on the FTP server and want to poll them into XI, but before processing in XI I want to do some content conversion in the file sender adapter. According to the general solution, I just need to specify the field names, field separator, end separator, etc. But the question is:
    The fields in the test data may have a different number of delimiters (,), for example:
    ORD01,,,Z4XS,6100001746,,,,,2,1
    OBJ01,,,,,,,,,,4,3     
    Some fields only have one ',' as the delimiter, but some of them have multiple ','.
    How can I handle it in the content conversion?
    Regards,
    Bean

    Hi Bing,
    Please refer to the following blogs; they will give you an idea.
    File content conversion blogs:
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
    /people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
    /people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion
    /people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    /people/venkat.donela/blog/2005/06/08/how-to-send-a-flat-file-with-various-field-lengths-and-variable-substructures-to-xi-30
    /people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/michal.krawczyk2/blog/2004/12/15/how-to-send-a-flat-file-with-fixed-lengths-to-xi-30-using-a-central-file-adapter
    /people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion
    http://help.sap.com/saphelp_nw04/helpdata/en/d2/bab440c97f3716e10000000a155106/content.htm
    Regards,
    Vinod.
