Load multiple files into multiple tables

Hi all,
I'm trying to understand how to tackle a problem. Every night I receive 38 files in the format:
File1-DD-MM-YYYY.txt
File2-DD-MM-YYYY.txt
File3-DD-MM-YYYY.txt
File4-DD-MM-YYYY.txt
that I have to read and insert into some staging tables. My current consultant did this:
- an interface runs a shell command like "ls * > list_of_files.txt"
- another interface reads the file list and performs 5 steps for each file in a sort of loop
Is this the best way? It seems too handmade, and the consultant is losing many hours trying to understand why sometimes data is missing, data is not fully loaded, etc...
Thanks for any hints
Stefano

Well, the process seems to be fine, since in that file he is getting all the file names and then reading them one by one and loading the data.
In order to check why and where you are losing the data, do this.
Pick a file and run the process, with the Delete Temporary Objects option set to False so the C$ and I$ work tables remain for inspection.
Step 1. Load the file completely and check whether any *.bad or *.error files are generated in the source files folder.
Step 2. Check the number of rows in the file against the number of rows in the C$ table.
Step 3. If step 2 matches, the file is loading correctly.
Step 4. Now check the counts and data in I$; if something looks wrong, check the join or filter condition (if present) that could be causing it.
Step 5. If step 4 is also fine, then check the last insert/update step.
Debugging step by step will make it easier to find the exact issue (a couple of sample queries follow below).
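For steps 2 and 4, a couple of throwaway count queries make the comparison concrete (a minimal sketch; the C$_/I$_ work table names and the STAGING schema are hypothetical and depend on your interface):
-- rows that reached the loading (C$) table vs. rows in the integration (I$) table
SELECT COUNT(*) AS loaded_rows     FROM STAGING.C$_0MY_FILE;
SELECT COUNT(*) AS integrated_rows FROM STAGING.I$_MY_TARGET;
-- rows that finally landed in the target staging table
SELECT COUNT(*) AS target_rows     FROM STAGING.MY_TARGET;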

Similar Messages

  • Sql*loader map multiple files to multiple tables

    Can a single control file map multiple files to multiple different tables? If so, what does the syntax look like? I've tried variations of the following, but haven't hit the jackpot yet.
    Also, I understand that a direct load will automatically turn off most constraint checking. I'd like to turn this back on when I'm done loading all tables. How/when do I do that? I can find multiple references to 'REENABLE DISABLED CONSTRAINTS', but I don't know where to say that.
    TIA.
    LOAD DATA
    INFILE 'first.csv'
    TRUNCATE
    INTO TABLE first_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (a,b,c)
    INFILE 'second.csv'
    TRUNCATE
    INTO TABLE second_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (x,y,z,xx,yy,zz)
    etc.

    Here you go, this is what you want:
    http://www.psoug.org/reference/sqlloader.html
    LOAD DATA
    INFILE 'c:\temp\demo09a.dat'
    INFILE 'c:\temp\demo09b.dat'
    APPEND
    INTO TABLE denver_prj
    WHEN projno = '101' (
    projno position(1:3) CHAR,
    empno position(4:8) INTEGER EXTERNAL,
    projhrs position(9:10) INTEGER EXTERNAL)
    INTO TABLE orlando_prj
    WHEN projno = '202' (
    projno position(1:3) CHAR,
    empno position(4:8) INTEGER EXTERNAL,
    projhrs position(9:10) INTEGER EXTERNAL)
    INTO TABLE misc_prj
    WHEN projno != '101' AND projno != '202' (
    projno position(1:3) CHAR,
    empno position(4:8) INTEGER EXTERNAL,
    projhrs position(9:10) INTEGER EXTERNAL)
    Thanks
    Aravindh
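    The REENABLE part of the question isn't covered by the example above; a simple fallback is to re-enable (and validate) the constraints yourself once all the direct loads have finished. A sketch, with illustrative table and constraint names:
    -- find out what the direct path load left disabled
    select constraint_name, status
    from   user_constraints
    where  table_name in ('FIRST_TABLE', 'SECOND_TABLE')
    and    status = 'DISABLED';
    -- then re-enable each one it reports
    alter table first_table  enable validate constraint first_table_chk;
    alter table second_table enable validate constraint second_table_fk;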

  • SQL*LOADER control file for Multiple Tables

    Dear DBA's
    I am loading data through SQL*LOADER in Oracle. I have a problem: while
    inserting data into multiple tables I have created a single control file in the
    following way, which is giving me an error. Data is stored in separate files named
    tablename.dat, so each table's data is stored in a separate .DAT file. I want to
    insert data from a single control file, taking data from multiple .dat files into
    multiple tables. Is it possible, or is there a different method?
    Please help me as early as possible.
    Thanks & Regards
    Shailesh
    CREATE TABLE T1 (
    L1 VARCHAR2(10),
    L2 NUMBER(10),
    L4 NUMBER(10),
    MYDATE DATE);
    CREATE TABLE T2 (
    L1 VARCHAR2(10),
    L2 NUMBER(10),
    L3 LONG,
    L4 NUMBER(10),
    MYDATE DATE)
    CONTROL FILE :-
    UNRECOVERABLE LOAD DATA
    INFILE 't2.dat'
    INSERT INTO TABLE t2
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( l1 char,l2 char,l4 char,mydate date,l3 char(2000000))
    INFILE 't1.dat'
    INSERT INTO TABLE t1
    FIELDS TERMINATED BY ','
    OPTIONALLYENCLOSED BY '"'
    TRAILING NULLCOLS
    ( f1 char,f2 char,f4 char))
    T2.DAT
    Raju,14,1,09-NOV-2000,"Has a powerful data parsing,engine which putslittle
    limitation on the format of the data in thedatafile."
    Manu,14,2,09-NOV-2000,"Can load data from multiple datafiles, during the same
    load session. "
    T1.DAT
    Raj,14,1,09-NOV-2000
    Mau,14,2,09-NOV-2000

    Hi,
    I tried to use your control file and got numerous errors.
    Try using separate control files:
    T1.CTL:
    LOAD DATA
    INFILE 'D:\T1.dat'
    INSERT INTO TABLE t1
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( l1 char,l2 char,l4 char,mydate date)
    T2.CTL:
    LOAD DATA
    INFILE 'D:\T2.dat'
    INSERT INTO TABLE t2
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( l1 char,l2 char,l4 char,mydate date,l3 char(20000))

  • Loading data from multiple files to multiple tables

    How should I approach creating an SSIS package to load data from multiple files to multiple tables? Also, the files will have data which might overlap, so I might have to create a stored procedure for it. Ex.: the 1st day's file has data from Aug. 1 to Aug. 10 and the 2nd day's
    file might have data from Aug. 5 to Aug. 15. So I might have to look for the max and min date and truncate the table within that date range.

    That's ok. A ForEachLoop will be able to iterate through the files. You can declare a variable inside the loop to capture the filenames. Choose "fully qualified" as the retrieval option in the loop.
    Then inside the loop:
    1. Add an Execute SQL task to delete overlapping data from the table (a sketch of this delete follows after this reply). One question here is where will you get the date from? Does it come inside the filename?
    2. Add a data flow task with a file source pointing to the file. For this, add a suitable connection manager (Excel/Flat file etc.) and map the connection string property to the filename variable using expressions.
    3. Add an OLEDB destination pointing to the table. You can use the "table or view name from variable - fast load" option and map it to a variable to make the table name dynamic; just set the corresponding value for the variable to get the correct table name.
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs
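    For step 1, the Execute SQL Task comes down to a ranged delete that runs before the data flow (a sketch only; the table and column names are hypothetical, and the two ? placeholders would be mapped to SSIS date variables):
    -- remove rows already covered by the incoming file's date range
    DELETE FROM dbo.StagingSales
    WHERE SaleDate BETWEEN ? AND ?;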

  • Loading XML file into DB Table

    Hi
    I'm quite new to loading XML files into database tables.
    It would be great if anyone could guide me through it.
    Now,
    I have an XML file which has to be loaded into a DB table.
    What are the steps involved in doing this? How do I go from here?
    Your help is greatly appreciated!
    Thank you so much!!
    -Shashi

    OK - Although you really should read the XMLDB FAQ on this forum, here is some sample code of ONE of the ways of doing it
    (there are multiple ways - and this is not the most simple one)
    Based on Oracle 11gR1
    -- sqlplus /nolog
    clear screen
    set termout on
    set feed on
    set lines 40
    set long 10000000
    set serveroutput on
    set lines 100
    set echo on
    connect / as sysdba
    col filename for a80
    col xml      for a80
    -- Create schema “OTN”
    drop user OTN cascade;
    purge dba_recyclebin;
    create user OTN identified by OTN;
    grant dba, xdbadmin to OTN;
    EXECUTE dbms_java.grant_permission( 'OTN', 'java.io.FilePermission','G:\OTN\xmlstore','read' );
    prompt pause
    pause
    clear screen
    -- Create directory
    connect OTN/OTN;
    show user
    drop directory OTN_USE_CASE;
    CREATE directory OTN_USE_CASE AS 'G:\OTN\xmlstore';
    SELECT extract((XMLTYPE(bfilename('OTN_USE_CASE','ABANDA-20030407215829881GMT.xml'),NLS_CHARSET_ID('AL32UTF8'))),'*') AS "XML"
    from   dual;
    prompt pause
    pause
    clear screen
    -- Directory Listing - Tom Kyte
    create global temporary table DIR_LIST
    ( filename varchar2(255) )
    on commit delete rows
    /
    create or replace
      and compile java source named "DirList"
    as
    import java.io.*;
    import java.sql.*;
    public class DirList
    { public static void getList(String directory)
                         throws SQLException
      { File path = new File( directory );
        String[] list = path.list();
        String element;
        for(int i = 0; i < list.length; i++)
        { element = list[i];
          #sql { INSERT INTO DIR_LIST (FILENAME)
                 VALUES (:element) };
        }
      }
    }
    /
    create or replace procedure get_dir_list( p_directory in varchar2 )
    as language java
    name 'DirList.getList( java.lang.String )';
    /
    prompt pause
    pause
    clear screen
    -- The content of the global temporary table
    exec get_dir_list( 'G:\OTN\xmlstore' );
    select * from dir_list;
    -- "COMMIT" will clear / truncate the global temporary table...
    prompt pause
    pause
    clear screen
    -- Combined: Reading XML content from multiple XML files
    commit;
    exec get_dir_list( 'G:\OTN\xmlstore' );
    select * from dir_list where filename like '%.xml'
    and rownum <= 10;
    prompt pause
    pause
    clear screen
    select extract((XMLTYPE(bfilename('OTN_USE_CASE',dl.filename),NLS_CHARSET_ID('AL32UTF8'))),'*') AS "XML"
    from dir_list dl
    where dl.filename like '%.xml' and rownum <= 2;
    prompt pause
    pause
    clear screen
    -- If you can select it you can insert it...
    -- drop table OTN_xml_store purge;
    create table OTN_xml_store of xmltype
    xmltype store as binary xml;
    commit;
    exec get_dir_list( 'G:\OTN\xmlstore' );
    set time on timing on
    insert into OTN_xml_store
    select XMLTYPE(bfilename('OTN_USE_CASE',dl.filename),NLS_CHARSET_ID('AL32UTF8')) AS "XML"
    from dir_list dl
    where dl.filename like '%.xml';
    set time off timing off
    commit;
    select count(*) from OTN_xml_store;
    prompt pause
    pause
    clear screen
    -- If you can select it you can create resources and files
    set time on timing on
    commit;
    exec get_dir_list( 'G:\OTN\xmlstore' );
    select count(*) from dir_list where filename like '%.xml';
    set serveroutput on size 10000
    DECLARE
    XMLdoc XMLType;
    res BOOLEAN;
    v_foldername varchar2(4000) := '/public/OTN/';
    cursor c1
    is
    select dl.filename FNAME
    , XMLTYPE(bfilename('OTN_USE_CASE',dl.filename),NLS_CHARSET_ID('AL32UTF8')) XMLCONTENT
    from dir_list dl
    where dl.filename like '%.xml'
    and rownum <= 100;
    BEGIN
    -- Create XDB repository Folder
    if (dbms_xdb.existsResource(v_foldername))
    then
    dbms_xdb.deleteResource(v_foldername,dbms_xdb.DELETE_RECURSIVE_FORCE);
    end if;
    res:=DBMS_XDB.createFolder(v_foldername);
    -- Create XML files in the XDB Repository
    for r1 in c1
    loop
    if (DBMS_XDB.CREATERESOURCE(v_foldername||r1.fname, r1.xmlcontent))
    then
    dbms_output.put_line(v_foldername||r1.fname);
    null;
    else
    dbms_output.put_line('Loop Exception :'||sqlerrm);
    end if;
    end loop;
    EXCEPTION WHEN OTHERS THEN
    dbms_output.put_line('Others Exception: '||sqlerrm);
    END;
    set time off timing off
    commit;
    prompt pause
    pause
    clear screen
    -- FTP and HTTP
    clear screen
    prompt
    prompt *** FTP - Demo ***
    prompt
    prompt pause
    pause
    host ftp
    -- open localhost 2100
    -- user OTN OTN
    -- cd public
    -- cd OTN
    -- ls
    -- bye
    clear screen
    prompt
    prompt *** Microsoft Internet Explorer - Demo ***
    prompt
    prompt pause
    pause
    host "C:\Program Files\Internet Explorer\IEXPLORE.EXE" http://OTN:OTN@localhost:8080/public/OTN/
    prompt pause
    pause
    -- Accessing the XDB Repository content via Resource View
    -- Selecting content from a resource via XBDUriType
    clear screen
    prompt set long 300
    set long 300
    prompt Relative Path - (path)
    SELECT path(1) as filename
    FROM RESOURCE_VIEW
    WHERE under_path(RES, '/public/OTN', 1) = 1
    and rownum <= 10
    prompt pause
    pause
    clear screen
    prompt Absolute Path - (any_path)
    select xdburitype(any_path).getClob() as xml
    FROM RESOURCE_VIEW
    WHERE under_path(RES, '/public/OTN', 1) = 1
    and rownum <= 1;
    prompt pause
    pause
    -- CLEANUP ENVIRONMENT
    clear screen
    prompt
    prompt >>>>> Clean UP !!! <<<<<<
    prompt
    prompt Cleanup environment and drop user...!!!
    prompt
    pause
    clear screen
    conn / as sysdba
    alter session set current_schema=OTN;
    begin
    dbms_xdb.deleteResource('/public/OTN',dbms_xdb.DELETE_RECURSIVE_FORCE);
    commit;
    end;
    alter session set current_schema=sys;
    drop user OTN cascade;
    Based on http://www.liberidu.com/blog/?p=1053

  • Loading XML File to Physical table in BW

    Hi,
    I have a requirement to load an XML file into a BW physical table.
    The XML file that I am getting looks pretty complex compared to the XML files I have seen online.
    I need help in transforming the file and the ABAP code to load the file into the physical table.
    I have already created the table in SE11.
    XML file
    <?xml version="1.0"?>
    <?mso-application progid="Excel.Sheet"?>
    <Row ss:AutoFitHeight="0" ss:Height="36">
        <Cell ss:StyleID="s62"><Data ss:Type="String">First Name</Data></Cell>
        <Cell ss:StyleID="s62"><Data ss:Type="String">Bank Name -
    add. info</Data></Cell>
       </Row>
       <Row ss:AutoFitHeight="0" ss:Height="22.5" ss:StyleID="s67">
        <Cell><Data ss:Type="String">John Mayor</Data></Cell>
    <Cell><Data ss:Type="String">New: Local bank</Data></Cell>
       </Row>
    My requirement is to get these values into the physical table, i.e.
    First name                 bank name
    John Mayor               new: local bank
    thanks

    No longer working on the issue.

  • Un-'Locking' multiple files in multiple folders....

    So I just spent 2 hours at the 'Genius' bar manually 'unlocking' hundreds if not thousands of photos in my 'iPhoto' (05) library in order to upgrade to iPhoto '06... apparently, when I imported some pictures from Windows... it brought them in as 'locked' (i.e. when you 'get info...' on a file, and it has the 'Locked' check-box checked), well apparently, in order to upgrade to iPhoto '06, you have to 'unlock' every single file... given the way that iPhoto stores your files, if you've got a few locked, scattered throughout your library, it's going to take a LONG time to find them all and unlock them.
    My question for the Mac OS X specialists at large is, "Is there a way to un-lock multiple files in multiple directories and folders, WITHOUT doing it manually? I can't believe I actually sat at a genius bar and did this for TWO hours. According to my Genius, that was the only way to do it. Are there any other Genius' out there that may have a differing opinion? Keep in mind, he was one of 2 or 3 native english speakers (here in the very busy Shibuya, Tokyo store) any solutions would be appreciated, because it appears I've got multiple files in my MacBook iPhoto library that are 'locked' and I'd like to find an easier way to unlock them.
    FYI, we went to finder and searched for 'other' "Files Write Protected" etc., but we were never able to find ONLY files that are locked... is there a better way? Surely there has to be. Looking forward to learning something new.
    Hal W.
    Tokyo, Japan
    Mac Mini, MacBook 13.3"   Mac OS X (10.4.7)  

    There's also a Terminal command that will work:
    Launch Terminal from your utilities folder, and enter this command:
    find /Users/yourname/Pictures/"iPhoto Library"/ -flags uchg -exec chflags nouchg {} \;
    Be sure you get the "spaces" right, including the ones before and after the curly braces--{}--and there is no space between \; and it should work just fine. You may want to just copy and paste the above into a text program, fill in your short user name, then copy and paste into Terminal. And it must be all one line. After you've entered the command, hit the Return key to execute it. It will look in your iPhoto Library folders for all files that have the locked flag, then change the flag to unlocked.
    Francine Schwieder

  • Loading XML files into Database table

    Hi, I have some XML files, say 100 files, in a virtual directory created using the "Create or replace directory" command, and those files need to be loaded into a table having a column of XMLTYPE. 1) How do I load them using Oracle-provided procedures/packages?

    Check out the Oracle XDB Developer's Guide, Chapter 3. There is an example of using BFileName function to load the xml files from a directory object created using create or replace directory. It works really well.
    Ben
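    In case the chapter isn't to hand, the BFILENAME approach Ben refers to looks roughly like this (a sketch; the directory path, file name, and table name are placeholders):
    CREATE OR REPLACE DIRECTORY xml_dir AS '/data/xml_files';
    CREATE TABLE xml_tab OF XMLTYPE;
    -- one insert per file; loop over the directory listing for all 100 files
    INSERT INTO xml_tab
    VALUES (XMLTYPE(BFILENAME('XML_DIR', 'sample.xml'),
                    NLS_CHARSET_ID('AL32UTF8')));
    COMMIT;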

  • Loading hex files with multiple records, where each line starts with a ':' record marker. How do I load this hex file into a front panel table? Thanks in advance, Jeff.

    I have written a routine that loads a hex file, checks for the record marker, length, address etc., then loads the data into a table on the front panel. However, my routine only loads single-record hex files. I need to be able to load multiple-record files; I need to identify the ':' colon record marker at the start of each record in the hex file, but I just can't see how to loop my routine so that all records with the ':' prefix marker are loaded.

    Jeff,
    I'm not exactly sure of what you are trying to do. Is it possible you could post your code, or a screen shot of the diagram in question, so that I can get a better understanding, and possibly offer some assistance?
    I am familiar with hex files, but not with the : marker separating records. This may be able to be solved with simple text parsing, and as I am not familiar with how you are retrieving and/or displaying the files, I require more information.
    Thanks

  • Loading XML files into multiple tables

    I've got XML like so...
    <?xml version="1.0" encoding="UTF-8"?>
    <MainTitle Version="1.0" Date="2009-01-11">
    <MainName>
    <ID1>A</ID1>
    <ID2>ABC</ID2>
    <ID3>ABC123</ID3>
    <Desc>Some text</Desc>
    <feature>f1</feature>
    <feature>f2</feature>
    <Category>
    <name>n1</name>
    <attribute>more stuff</attribute>
    </Category>
    <Category>
    <name>n2</name>
    <attribute>even more stuff</attribute>
    </Category>
    <Category>
    <name>n3</name>
    <attribute>different stuff</attribute>
    </Category>
    <Category>
    <name>n4</name>
    <attribute>More of the same</attribute>
    <attribute>But different still</attribute>
    <attribute>Even more different junk</attribute>
    </Category>
    </MainName>
    </MainTitle>
    Where each MainName instance in the file can have 0 or more ( unbounded ) Category and Feature tags and each Category instance can have multiple attribute tags. The file contains many thousands of MainName instances and has embedded a good mix of possible tags.
    I believe I can load this into 9i xmltype table or a 9i table with an xmltype column, then query the data to get it out...
    SQL> create table mytab (
    2 xmlraw XMLType
    3 );
    Table created.
    SQL>
    SQL> insert into mytab values ( sys.xmltype.createxml(
    2 '<?xml version="1.0" encoding="UTF-8"?>
    3 <MainTitle Version="1.0" Date="2009-01-11">
    4 <MainName>
    5 <ID1>A</ID1>
    6 <ID2>ABC</ID2>
    7 <ID3>ABC123</ID3>
    8 <Desc>Some text</Desc>
    9 <feature>f1</feature>
    10 <feature>f2</feature>
    11 <Category>
    12 <name>n1</name>
    13 <attribute>more stuff</attribute>
    14 </Category>
    15 <Category>
    16 <name>n2</name>
    17 <attribute>even more stuff</attribute>
    18 </Category>
    19 <Category>
    20 <name>n3</name>
    21 <attribute>different stuff</attribute>
    22 </Category>
    23 <Category>
    24 <name>n4</name>
    25 <attribute>More of the same</attribute>
    26 <attribute>But different still</attribute>
    27 <attribute>Even more different junk</attribute>
    28 </Category>
    29 </MainName>
    30 </MainTitle>')
    31 );
    1 row created.
    1 select
    2 extract(a.xmlraw,'/MainTitle/MainName/ID1/text()'),
    3 extract(a.xmlraw,'/MainTitle/MainName/ID2/text()'),
    4 extract(a.xmlraw,'/MainTitle/MainName/ID3/text()'),
    5 extract(a.xmlraw,'/MainTitle/MainName/Desc/text()'),
    6 extract(a.xmlraw,'/MainTitle/MainName/feature/text()'),
    7 extract(a.xmlraw,'/MainTitle/MainName/Category/text()'),
    8 extract(a.xmlraw,'/MainTitle/MainName/Category/name/text()'),
    9 extract(a.xmlraw,'/MainTitle/MainName/Category/attribute/text()')
    10* from mytab a
    SQL> /
    A
    ABC
    ABC123
    Some text
    f1f2
    n1n2n3n4
    more stuffeven more stuffdifferent stuffMore of the sameBut different stillEven
    more different junk
    This all works just fine; however, it's not quite what I need. For starters, the multiple-tag data is concatenated, and when I try to specifically query it out using a WHERE clause I get ORA-22950. So, not sure how to deal with that.
    Is it possible to use sqlldr to get the 200MB XML file loaded into a table like that above?
    Now, given multiple "feature" and "category" data per "MainName", I need to use the SQL to dump the XML data into a set of tables built to model the structure of the XML...
    roughly..
    Main_Table (
    ID1 Varchar2(10)
    ID2 varchar2(10)
    ID3 varchar2(10)
    desc varchar2(100)
    Features_Table (
    ID1 varchar2(10)
    feature varchar2(100)
    Category_Table (
    ID1 varchar2(10)
    name varchar2(100)
    attribute varchar2(100)
    What are the groups recommendations here? Should I continue down this route or is there a better way?

    When I suggested the option to parse the XML in PL/SQL, I was referring to pulling the data into PL/SQL and then performing all parsing activity against the PL/SQL copy, so you don't need to make SQL calls.
    Here is a quick sample for parsing out all the Category/name elements from the XML once it is loaded into PL/SQL
    DECLARE
      l_index     PLS_INTEGER;
      l_category  XMLTYPE;
      l_db_row    XMLTYPE := XMLTYPE('<?xml version="1.0" encoding="UTF-8"?>
    <MainTitle Version="1.0" Date="2009-01-11">
       <MainName>
          <ID1>A</ID1>
          <ID2>ABC</ID2>
          <ID3>ABC123</ID3>
          <Desc>Some text</Desc>
          <feature>f1</feature>
          <feature>f2</feature>
          <Category>
             <name>n1</name>
             <attribute>more stuff</attribute>
          </Category>
          <Category>
             <name>n2</name>
             <attribute>even more stuff</attribute>
          </Category>
          <Category>
             <name>n3</name>
             <attribute>different stuff</attribute>
          </Category>
          <Category>
             <name>n4</name>
             <attribute>More of the same</attribute>
             <attribute>But different still</attribute>
                   <attribute>Even more different junk</attribute>
              </Category>
         </MainName>
    </MainTitle>');
    BEGIN
       l_index := 1;
       WHILE l_db_row.Existsnode('/MainTitle/MainName/Category[' ||
                                 To_Char(l_index) || ']') > 0
       LOOP
          l_category := l_db_row.Extract('/MainTitle/MainName/Category[' ||
                                           To_Char(l_index) || ']');
          dbms_output.put_line(l_category.extract('Category/name/text()').getStringVal());
          l_index := l_index + 1;
       END LOOP;
    END;
    You could repeat the WHILE loop to parse out the attribute column as well, since it repeats. This is what Dave's post was showing and what I was referring to.
    Hint: If you are trying to use .extract to go after an optional node, you need to verify the node exists via existsNode first. If you don't, you can get an "ORA-30625: method dispatch on NULL SELF argument is disallowed" error when trying to extract a non-existent node.
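    If a set-based shred is preferred over the PL/SQL loop, the same repeating Category nodes can be turned into relational rows directly in SQL. A sketch against the mytab table from the question (XMLSEQUENCE and extractValue are available on 9i/10g; newer releases would typically use XMLTABLE instead):
    SELECT extractValue(value(c), '/Category/name')         AS category_name,
           extractValue(value(c), '/Category/attribute[1]') AS first_attribute
    FROM   mytab t,
           TABLE(XMLSEQUENCE(extract(t.xmlraw, '/MainTitle/MainName/Category'))) c;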

  • Loading multiple files from multiple users.

    Our system is moving from a standalone app to a web system. The users will have export files generated by our app which they will need to import up to the web. In the web system, the users are connecting via SSO, so apps server is using a single JDBC connection and we are querying the CLIENT_IDENTIFIER at the database end to see who is doing what.
    The export file is essentially a zip with the first file being a list of which filenames in the zip translate to what tables in the database they are from.
    The new system will require a little work on each file to update certain things prior to actually inserting the data to its final destination.
    My confusion is how best to do this. What we essentially need to do is move the data from the text file into a table along with some flag identifying the user that put it there, then update the data as needed and finally insert it into its final table destination. The first thought was to use external tables. However, if you have two users importing at the same time, how do you differentiate the data? The other idea was to use sqlldr. The trouble is there is no way (that I'm aware of) to add the flag for whose data this is on the way over with sqlldr; it will only bulk copy the data from the file over to the table you specify.
    So the basic question is how do I get data for a single table into the system when I have multiple users (SSO signed on using the same DB connection via apps) uploading their own copies of data ultimately headed for the same database table but the data needs a little modification on the way? What's the best way to do this?
    Thanks.

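    One way the CLIENT_IDENTIFIER flag could travel with the rows, sketched under the assumption that an external table (the hypothetical upload_ext below) sits over the uploaded text file and that the identifier is already being set as described. This does not by itself solve two users sharing one external table location at the same time (each upload would still need its own file/LOCATION), but it shows the stamping part:
    INSERT INTO staging_upload (user_tag, col1, col2, col3)
    SELECT SYS_CONTEXT('USERENV', 'CLIENT_IDENTIFIER'),  -- who uploaded this copy
           ext.col1, ext.col2, ext.col3
    FROM   upload_ext ext;                               -- external table over the uploaded file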

  • Loading xml file with multiple rows

    I am loading data from XML files using XSL for transformation. I have created the XSLs and loaded some of the data. In an XML file with multiple rows, it's only loading one (the first) row. Any idea how I can get it to read and load all the records in the file?

    Could someone please help me with the above? I desperately need to move forward.

  • How to define mapping from multiple files to Oracle Tables in 9i

    Around 100-200 flat files are created every 30 minutes and each filename is different: the filename has a datetime stamp as part of the name, in addition to the product code as the first 12 characters.
    Can anyone guide me on how to define mappings to these files using OWB?
    What I could do is consolidate all files into one known single file name and map that file to the Oracle tables, but I don't want to do that because I need to reject erroneous files.
    Can anyone provide me some tips on this?
    Thanks in Advance.
    Sohan.

    As you know, in OWB you need to define the flat file source in a 'static' way (name, location, etc. have to be defined previously), so you cannot deal directly with dynamically generated file names. One solution would be to consolidate them into a single file (which you can define statically in OWB), but prefix every record with the filename. In this way it is easy to understand which file the rejected records came from. If you are using unix, it is very easy to write a script to do this. Something like this will do:
    awk '{printf "%s,%s\n",FILENAME,$0}' yourfilename >> onefile
    where yourfilename is the name of the file you are currently processing, while onefile is the name of the consolidated file. You can run this for all files in your directory by substituting yourfilename with *.
    You can then disregard the file name field in OWB, while processing the rejected records based on the file name prefix by using unix utilities like grep and similar.
    Regards:
    Igor

  • Error while loading Flat file to the table (ORA-00936: missing expression)

    Hi Gurus,
    I am receiving the following error while trying to load a flat file to the database:
    ODI-1228: Task test_file_load (Integration) fails on the target ORACLE connection DEMO_STAGE.
    Caused By: java.sql.SQLSyntaxErrorException: ORA-00936: missing expression
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:457)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
         at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:889)
         at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:476)
         at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:204)
         at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:540)
         at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:217)
         at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1079)
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1466)
         at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3752)
         at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:3937)
         at oracle.jdbc.driver.OraclePreparedStatementWrapper.execute(OraclePreparedStatementWrapper.java:1535)
         at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)
         at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
         at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)
    The file which I have tried to load is SRC_SALES_PERSON and the table structure is:
    CREATE table "TRG_SALES_PERSON"(
    "SALES_PERSON_ID" NUMBER(8,0) NOT NULL,
    "FIRST_NAME" VARCHAR2(80),
    "LAST_NAME" VARCHAR2(80),
    "DATE_HIRED" VARCHAR2(80),
    "DATE_UPDATED" DATE NOT NULL)
    The knowledge modules used are:
    LKM File to SQL
    IKM SQL Incremental Update
    We are using ODI 11g R2.
    Thanks, and I really appreciate any help in this regard.

    Hi there,
    I am facing the same issue while loading data from SRC_SALES_PERSON (flat file) to TRG_CUSTOMER.
    I don't see any errors in the steps, however the data is not loaded at the end. Here are the SQL commands.
    On Source
    select     ID     C11_ID,
         LASTNAME     C9_LASTNAME
    from      TABLE
    /*$$SNPS_START_KEYSNP$CRDWG_TABLESNP$CRTABLE_NAME=SRC_SALES_PERSONSNP$CRLOAD_FILE=D:\Pratima\Softwares\ODI\ofm_odi_companion_generic_11.1.1.5.1_disk1_1of1[1]\demo\oracledi-demo\oracledi\demo\file/SRC_SALES_PERSON.txtSNP$CRFILE_FORMAT=FSNP$CRFILE_SEP_FIELD=0x0009SNP$CRFILE_SEP_LINE=0x000D0x000ASNP$CRFILE_FIRST_ROW=0SNP$CRFILE_ENC_FIELD=SNP$CRFILE_DEC_SEP=SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=IDSNP$CRTYPE_NAME=STRINGSNP$CRORDER=1SNP$CRLINE_OFFSET=1SNP$CRLENGTH=11SNP$CRPRECISION=11SNP$CRACTION_ON_ERROR=0SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=FIRSTNAMESNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=12SNP$CRLENGTH=50SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=LASTNAMESNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=62SNP$CRLENGTH=50SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=DATE1SNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=112SNP$CRLENGTH=20SNP$CRPRECISION=20SNP$CR$$SNPS_END_KEY*/
    On Target
    insert into STAGING.C$_0TRG_CUSTOMER
         C11_ID,
         C9_LASTNAME
    values
         :C11_ID,
         :C9_LASTNAME
    The actual code at the source fails, however the step is shown in green.
    Thanks in Advance,
    Pratima
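    As a side note, the pasted target statement is missing the parentheses that INSERT ... VALUES syntax requires (they may simply have been lost in the forum paste, since ODI normally generates valid SQL); the well-formed shape, using the same names as in the post, would be:
    INSERT INTO STAGING.C$_0TRG_CUSTOMER
      (C11_ID, C9_LASTNAME)
    VALUES
      (:C11_ID, :C9_LASTNAME)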

  • How to load csv file in oracle table

    Hi,
    I have CSV files and I want to load the CSV file data (and table structure) into Oracle. My CSV files are on a Windows machine but the database is running on IBM AIX, so how do I do it? Please, can anyone help me with this? Thanks a lot in advance.
    I know the syntax below:
    $sqlldr userid=username/password control=<filename> log=<log filename>

    Hello,
    Yes, with SQL*Loader you can do it.
    But first you have to create a table in your database with columns and datatypes matching
    the content of your "csv" file.
    Then, you'll have to prepare a control file and use the option FIELDS TERMINATED BY ',' as it's
    a "csv" format (a sketch follows after this reply).
    Then, you could execute your statement.
    Please, find enclosed a link with some example about SQL*Loader and "csv" file:
    [http://www.orafaq.com/wiki/SQL*Loader_FAQ]
    Hope this help.
    Best regards,
    Jean-Valentin
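    A bare-bones sketch of what Jean-Valentin describes, with made-up table and column names that you would replace with your own:
    -- emp.ctl : load a comma-separated, optionally quoted csv into EMP_STG
    LOAD DATA
    INFILE 'emp.csv'
    APPEND
    INTO TABLE emp_stg
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (empno, ename, hiredate DATE "DD-MM-YYYY")
    Run it with the syntax already quoted above (sqlldr userid=username/password control=emp.ctl log=emp.log). Since the files are on Windows and the database is on AIX, you can either run sqlldr from the Windows client against the remote database (userid=user/password@tns_alias) or copy the files to the AIX server and run sqlldr there.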
