How to load multiple CSV files into Oracle in one go

Hi,
I have a project requirement like this: I'll be getting one CSV file per record for the source table.
So I need to know whether there is any way to load multiple CSVs in one go. I have searched a lot on the net about external tables (but the problem is I can only use one consolidated CSV at a time),
and with UTL_FILE the same thing applies: a consolidated file is required for the load.
But in my scenario I'll have 1000+ CSV files (one record each) for the table, and it would be hectic to open each CSV file, copy the record in it, and paste it into another file to make a consolidated one.
Please help me; it's a very new thing for me. I have used an external table for one CSV file in the past, but here I have to use many files.
Table description given below.
desc td_region_position
Name           Type           Notes
-------------  -------------  -----------
POSITION_KEY   NUMBER         Primary key
POSITION_NAME  VARCHAR2(20)
CHANNEL        VARCHAR2(20)
LEVEL          VARCHAR2(20)
IS_PARTNER     CHAR(1)
MARKET_CODE    VARCHAR2(20)
CSV file example:
POSITION_KEY|POSITION_NAME|CHANNEL|LEVEL|IS_PARTNER|MARKET_CODE
123002$$FLSM$$Sales$$Middle$$Y$$MDM2203
The delimiter is $$. My database version is as follows:
Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit
PL/SQL Release 10.2.0.5.0 - Production
CORE 10.2.0.5.0 Production
TNS for IBM/AIX RISC System/6000: Version 10.2.0.5.0 - Production
NLSRTL Version 10.2.0.5.0 - Production
Edited by: 974253 on Dec 10, 2012 9:58 AM

If your CSV files match some mask, say "mask*.csv", or are named "mask1.csv", "mask2.csv", and so on,
you can create this.bat:
FOR %%c in (C:\tmp\loader\mask*.csv) DO (
   c:\oracle\db\dbhome_1\BIN\sqlldr <user>@<sid>/<password> control=C:\tmp\loader\loader.ctl data=%%c
)
and C:\tmp\loader\loader.ctl is:
OPTIONS (ERRORS=0,SKIP=1)
LOAD DATA
  APPEND 
  INTO TABLE scott.td_region_position
  FIELDS TERMINATED BY '$$' TRAILING NULLCOLS
  ( POSITION_KEY,
     POSITION_NAME ,
     CHANNEL,
     -- LVL is used here because LEVEL is a reserved word in Oracle
     LVL,
     IS_PARTNER,
     MARKET_CODE
  )
Test:
C:\Documents and Settings\and>sqlplus
SQL*Plus: Release 11.2.0.1.0 Production on Mon Dec 10 11:03:47 2012
Copyright (c) 1982, 2010, Oracle.  All rights reserved.
Enter user-name: scott
Enter password:
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL> select * from td_region_position;
no rows selected
SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
C:\Documents and Settings\and>cd C:\tmp\loader
C:\tmp\loader>dir
Volume in drive C has no label.
Volume Serial Number is F87F-9154
Directory of C:\tmp\loader
12/10/2012  10:51 AM    <DIR>          .
12/10/2012  10:51 AM    <DIR>          ..
12/10/2012  10:55 AM               226 loader.ctl
12/10/2012  10:38 AM               104 mask1.csv
12/10/2012  10:39 AM               108 mask2.csv
12/10/2012  10:57 AM               151 this.bat
               4 File(s)            589 bytes
               2 Dir(s)   4,523,450,368 bytes free
C:\tmp\loader>this.bat
C:\tmp\loader>FOR %c in (C:\tmp\loader\mask*.csv) DO (c:\oracle\db\dbhome_1\BIN\sqlldr user@orcl/password control=C:\tmp\loader\loader.ctl data=%c )
C:\tmp\loader>(c:\oracle\db\dbhome_1\BIN\sqlldr user@orcl/password control=C:\tmp\loader\loader.ctl data=C:\tmp\loader\mask1.csv )
SQL*Loader: Release 11.2.0.1.0 - Production on Mon Dec 10 11:04:27 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
Commit point reached - logical record count 1
C:\tmp\loader>(c:\oracle\db\dbhome_1\BIN\sqlldr user@orcl/password control=C:\tmp\loader\loader.ctl data=C:\tmp\loader\mask2.csv )
SQL*Loader: Release 11.2.0.1.0 - Production on Mon Dec 10 11:04:28 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
Commit point reached - logical record count 1
C:\tmp\loader>sqlplus
SQL*Plus: Release 11.2.0.1.0 Production on Mon Dec 10 11:04:46 2012
Copyright (c) 1982, 2010, Oracle.  All rights reserved.
Enter user-name: scott
Enter password:
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL> select * from td_region_position;
POSITION_KEY POSITION_NAME        CHANNEL              LVL                  IS_PARTNER MARKET_CODE
      123002 FLSM                 Sales                Middle               Y          MDM2203
      123003 FLSM1                Sa2les               M2iddle              Y          MDM22203
SQL>
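Worth noting: an external table is not actually limited to one consolidated CSV; the LOCATION clause accepts a list of files, so a single INSERT ... SELECT can load many CSVs at once. Below is an untested sketch along those lines. It assumes the files sit in C:\tmp\loader on the database server, and it relies on REJECT LIMIT UNLIMITED to let each file's header line fall into the bad file; for 1000+ files you would generate the LOCATION list with a script.
CREATE OR REPLACE DIRECTORY csv_dir AS 'C:\tmp\loader';

CREATE TABLE td_region_position_ext (
  position_key  NUMBER,
  position_name VARCHAR2(20),
  channel       VARCHAR2(20),
  lvl           VARCHAR2(20),
  is_partner    CHAR(1),
  market_code   VARCHAR2(20)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '$$'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('mask1.csv', 'mask2.csv')  -- extend this list to cover all files
)
REJECT LIMIT UNLIMITED;  -- header rows fail numeric conversion and are rejected, not fatal

INSERT INTO td_region_position SELECT * FROM td_region_position_ext;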

Similar Messages

  • How to load excel-csv file into oracle database

Hi
I wanted to load an Excel file with a CSV extension (named trial.csv) into an Oracle database table (named dept). I have given my experiment below.
I am getting the following error; how should I rectify this?
For the ID column I have defined the datatype as NUMBER(5), and in my control file I have defined it as INTEGER EXTERNAL(5), so why does the datatype for the ID column come out as CHARACTER in the error log file?
1) My Oracle database table is:
    SQL> desc dept;
    Name Null? Type
    ID NUMBER(5)
    DNAME CHAR(20)
2) My data file (trial.csv) is:
    ID     DNAME
    11     production
    22     purchase
    33     inspection
3) My control file (trial.ctl) is:
    LOAD DATA
    INFILE 'trial.csv'
    BADFILE 'trial.bad'
    DISCARDFILE 'trial.dsc'
    APPEND
    INTO TABLE dept
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
(ID POSITION(1:5) INTEGER EXTERNAL(5)
NULLIF ID=BLANKS,
DNAME POSITION(6:25) CHAR
     NULLIF DNAME=BLANKS
)
4) My command at the cmd prompt is:
c:\>sqlldr scott/tiger@xxx control=trial.ctl direct=true;
    Load completed - logical record count 21.
5) My log file error message is:
Column Name  Position  Len  Term  Encl  Datatype
ID           1:5       5    ,     O(")  CHARACTER
    NULL if ID = BLANKS
DNAME        6:25      20   ,     O(")  CHARACTER
    NULL if DNAME = BLANKS
    Record 1: Rejected - Error on table DEPT, column ID.
    ORA-01722: invalid number
    Record 21: Rejected - Error on table DEPT, column ID.
    ORA-01722: invalid number
    Table DEPT:
    0 Rows successfully loaded.
    21 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Bind array size not used in direct path.
    Column array rows : 5000
    Stream buffer bytes: 256000
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 21
    Total logical records rejected: 21
    Total logical records discarded: 0
    Total stream buffers loaded by SQL*Loader main thread: 0
    Total stream buffers loaded by SQL*Loader load thread: 0
6) Result:
    SQL> select * from dept;
    no rows selected
    by
    balamuralikrishnan.s

Hi everyone!
The following are the steps to load an Excel sheet into an Oracle table. I tried to be as simple as possible.
Thanks
Step # 1
Prepare a data file (Excel sheet) to be uploaded to the Oracle table.
"Save As" the Excel sheet to ".csv" (comma separated values) format.
Then save that as a ".dat" file.
e.g. datafile.dat
    1,Category Wise Consumption Summary,Expected Receipts Report
    2,Category Wise Receipts Summary,Forecast Detail Report
    3,Current Stock List Category Wise,Forecast rule listing
    4,Daily Production Report,Freight carrier listing
    5,Daily Transactions Report,Inventory Value Report
Step # 2
Prepare a control file that defines the data file to be loaded, the column separator, and the names and types of the table columns into which the data is to be loaded. The control file extension should be ".ctl".
e.g. I want to load the data into the tasks table from the datafile.dat file, using the control file controlfile.ctl.
    SQL> desc tasks;
    Name Null? Type
    TASK_ID NOT NULL NUMBER(14)
    TASK_NAME VARCHAR2(120)
    TASK_CODE VARCHAR2(120)
controlfile.ctl:
    LOAD DATA
    INFILE 'e:\datafile.dat'
    INTO TABLE tasks
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (TASK_ID INTEGER EXTERNAL,
    TASK_NAME CHAR,
    TASK_CODE CHAR)
    The above is the structure for a control file.
Step # 3
The final step is to run the sqlldr command to execute the load:
    sqlldr userid=scott/tiger@abc control=e:\controlfile.ctl log=e:\logfile.log
    Message was edited by:
    user578762
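A side note on the original error: record 1 is the header line (ID DNAME), which can never convert to a number, and the control file mixes fixed POSITION fields with a ',' terminator although the sample data looks tab separated. A hedged corrected control file, assuming the fields really are tab separated and the header row should be skipped:
OPTIONS (SKIP=1)
LOAD DATA
INFILE 'trial.csv'
BADFILE 'trial.bad'
DISCARDFILE 'trial.dsc'
APPEND
INTO TABLE dept
-- X'09' = tab; use ',' instead if the file is truly comma separated
FIELDS TERMINATED BY X'09' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( ID    INTEGER EXTERNAL NULLIF ID=BLANKS,
  DNAME CHAR NULLIF DNAME=BLANKS
)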

  • How to load a java file into Oracle?

Hello,
I have a problem loading a Java class into Oracle 8i. I used the "loadjava -user <userName/Password> -resolve <javaClassName>" command.
Java version used: JDK 1.5
Calling this class from a trigger gives this error:
ORA-29541: class ANTONY1.DBTrigger could not be resolved.
Can anyone help me to resolve this problem?

    Hello,
    Which username did you use to load the class, and which user is running the trigger?
    The default resolver searches only the current user's schema and PUBLIC so the java class needs to be loaded into one of these.
    cheers,
    Anthony
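For illustration only, a hypothetical invocation that loads and resolves the class as the trigger owner (ANTONY1, per the error message) might look like:
loadjava -user antony1/<password> -resolve -verbose DBTrigger.java
Note also that Oracle 8i's embedded JVM predates JDK 1.5, so the class may additionally need to be compiled with an older -target before it can resolve.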

  • Loading multiple .csv files into a table using SSIS

    Hi,
I have a requirement where I have 200+ CSV files to be loaded into a Netezza table using SSIS.
The issue I am facing is that the files have different numbers of columns; for example, file 1 has columns A,B,C and file 2 has columns C,D,E. My target table has all columns from A to E in it.
But when I use a For Each Loop container, only the file whose filepath+filename I specified in the loop variable gets loaded. From the rest of the files no data gets loaded, and the package executes successfully.
    Any help is appreciated.
    Regards,
    VT

If you want to iterate through files, then all files should be in the same folder and you should use the file enumerator type within the ForEach Loop. Then inside the loop you might need a Script Task to generate the data flow on the fly, based on the available input columns from each file, and do the mapping to the destination. So I assume you put NULLs (or some default value) for the columns missing from a file.
http://blog.quasarinc.com/ssis/best-solution-to-load-dynamically-change-csv-file-in-ssis-etl-package/
Visakh
http://visakhm.blogspot.com/

  • Loading data from .csv file into Oracle Table

    Hi,
I have a requirement where I need to populate data from a .csv file into an Oracle table.
Is there a mechanism for doing this?
Any help will be fruitful.
    Thanks and regards

You can use SQL*Loader or external tables for your requirement.
Missed Karthick's post... already there :)
    Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM
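A handy bridge between the two, hinted at by the external_table option in SQL*Loader's help (quoted in the next message): let SQL*Loader generate the external table DDL from an existing control file. A sketch, assuming loader.ctl already describes the CSV:
sqlldr scott/tiger control=loader.ctl external_table=GENERATE_ONLY log=gen.log
The log file then contains the CREATE TABLE ... ORGANIZATION EXTERNAL statement, which can be edited and run by hand.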

  • "how to load a text file to oracle table"

hi to all
Can anybody help me with "how to load a text file to an Oracle table"? This is the first time I am doing this; please give me the steps.
    Regards
    MKhaleel

Usage: SQLLDR keyword=value [,keyword=value,...]
    Valid Keywords:
    userid -- ORACLE username/password
    control -- Control file name
    log -- Log file name
    bad -- Bad file name
    data -- Data file name
    discard -- Discard file name
    discardmax -- Number of discards to allow (Default all)
    skip -- Number of logical records to skip (Default 0)
    load -- Number of logical records to load (Default all)
    errors -- Number of errors to allow (Default 50)
    rows -- Number of rows in conventional path bind array or between direct path data saves (Default: Conventional path 64, Direct path all)
    bindsize -- Size of conventional path bind array in bytes (Default 256000)
    silent -- Suppress messages during run (header, feedback, errors, discards, partitions)
    direct -- use direct path (Default FALSE)
    parfile -- parameter file: name of file that contains parameter specifications
    parallel -- do parallel load (Default FALSE)
    file -- File to allocate extents from
    skip_unusable_indexes -- disallow/allow unusable indexes or index partitions (Default FALSE)
    skip_index_maintenance -- do not maintain indexes, mark affected indexes as unusable (Default FALSE)
    commit_discontinued -- commit loaded rows when load is discontinued (Default FALSE)
    readsize -- Size of Read buffer (Default 1048576)
    external_table -- use external table for load; NOT_USED, GENERATE_ONLY, EXECUTE
    (Default NOT_USED)
    columnarrayrows -- Number of rows for direct path column array (Default 5000)
    streamsize -- Size of direct path stream buffer in bytes (Default 256000)
    multithreading -- use multithreading in direct path
    resumable -- enable or disable resumable for current session (Default FALSE)
    resumable_name -- text string to help identify resumable statement
    resumable_timeout -- wait time (in seconds) for RESUMABLE (Default 7200)
    PLEASE NOTE: Command-line parameters may be specified either by position or by keywords. An example of the former case is 'sqlldr scott/tiger foo'; an example of the latter is 'sqlldr control=foo userid=scott/tiger'. One may specify parameters by position before but not after parameters specified by keywords. For example, 'sqlldr scott/tiger control=foo logfile=log' is allowed, but 'sqlldr scott/tiger control=foo log' is not, even though the position of the parameter 'log' is correct.
    SQLLDR USERID=GROWSTAR/[email protected] CONTROL=D:\PFS2004.CTL LOG=D:\PFS2004.LOG BAD=D:\PFS2004.BAD DATA=D:\PFS2004.CSV
    SQLLDR USERID=GROWSTAR/[email protected] CONTROL=D:\CLAB2004.CTL LOG=D:\CLAB2004.LOG BAD=D:\CLAB2004.BAD DATA=D:\CLAB2004.CSV
    SQLLDR USERID=GROWSTAR/[email protected] CONTROL=D:\GROW\DEACTIVATESTAFF\DEACTIVATESTAFF.CTL LOG=D:\GROW\DEACTIVATESTAFF\DEACTIVATESTAFF.LOG BAD=D:\GROW\DEACTIVATESTAFF\DEACTIVATESTAFF.BAD DATA=D:\GROW\DEACTIVATESTAFF\DEACTIVATESTAFF.CSV

  • How to insert a image file into oracle database

hi all
Can anyone guide me on how to insert an image file into an Oracle database?
I have created the table using
create table imagestore(image blob);
but when inserting I am totally lost and don't know how to write the query to insert an image file.

Hi, I don't have time to explain fully, but I had to do this a while ago, so I will post a code snippet. This uses the Commons FileUpload framework.
Firstly you need a multipart form (if you are using a web page). If you are not using a web page, ignore this bit.
out.println("<form name=\"imgFrm\" method=\"post\" enctype=\"multipart/form-data\" action=\"FileUploadServlet?thisPageAction=reloaded\" onSubmit=\"return submitForm();\"><input type=\"FILE\" name=\"imgSource\" size='60' class='smalltext' onKeyPress='return stopUserInput();' onKeyUp='stopUserInput();' onKeyDown='stopUserInput();' onMouseDown='noMouseDown(event);'>");
out.println("   <input type='submit' name='submit' value='Submit' class='smalltext'>");
out.println("</form>");
Import this once you have the jar file:
import org.apache.commons.fileupload.*;
Now a method I wrote to upload the file. I am not saying that this is the best way to do it, I am just saying it works for me.
private boolean uploadFile(HttpServletRequest request, HttpSession session) throws Exception {
        boolean result = true;
        String fileName = null;
        byte[] fileData = null;
        String imageType = "";
        DiskFileUpload fb = new DiskFileUpload();
        List fileItems = fb.parseRequest(request);
        Iterator it = fileItems.iterator();
        while (it.hasNext()) {
            FileItem fileItem = (FileItem) it.next();
            if (!fileItem.isFormField()) {
                fileName = fileItem.getName();
                fileData = fileItem.get();
                // Derive the image type from the file name extension
                if (fileName != null) {
                    int dotPos = fileName.indexOf('.');
                    if (dotPos >= 0 && dotPos != fileName.length() - 1) {
                        imageType = fileName.substring(dotPos + 1).toLowerCase();
                        if (imageType.equals("jpg")) {
                            imageType = "jpeg";
                        }
                    }
                }
            }
        }
        String filePath = request.getParameter("FILE_PATH");
        session.setAttribute("filePath", filePath);
        session.setAttribute("fileData", fileData);
        session.setAttribute("fileName", fileName);
        session.setAttribute("imageType", imageType);
        return result;
    }
And now finally the method to actually write the file to the database:
private int writeImageFile(byte[] fileData, String fileName, String imageType, String mode, Integer signatureIDIn, HttpServletRequest request) throws Exception {
        // If the previous code found a file that can be uploaded then
        // save it into the database via a PreparedStatement
        String sql = "";
        UtilDBquery udbq = getUser(request).connectToDatabase();
        Connection con = null;
        int signatureID = 0;
        PreparedStatement pstmt = null;
        try {
            udbq.setUsePreparedStatements(true);
            con = udbq.getPooledConnection();
            con.setAutoCommit(false);
            if ((!mode.equals("U")) || (mode.equals("U") && signatureIDIn == 0)) {
                // Insert case: fetch a new ID from the sequence
                sql = "SELECT SEQ_SIGNATURE_ID.nextval FROM DUAL";
                pstmt = con.prepareStatement(sql);
                ResultSet rs = pstmt.executeQuery();
                while (rs.next()) {
                    signatureID = rs.getInt(1);
                }
                if (fileName != null && imageType != null) {
                    sql = "INSERT INTO T_SIGNATURE (SIGNATURE_ID, SIGNATURE) values (?,?)";
                    InputStream is2 = new ByteArrayInputStream(fileData);
                    pstmt = con.prepareStatement(sql);
                    pstmt.setInt(1, signatureID);
                    pstmt.setBinaryStream(2, is2, fileData.length);
                    pstmt.executeUpdate();
                    pstmt.close();
                    con.commit();
                }
            }
            if (mode.equals("U") && signatureIDIn != 0) {
                // Update case: overwrite the existing BLOB
                signatureID = signatureIDIn.intValue();
                if (fileName != null && imageType != null) {
                    sql = "UPDATE T_SIGNATURE SET SIGNATURE = ? WHERE SIGNATURE_ID = ?";
                    InputStream is2 = new ByteArrayInputStream(fileData);
                    pstmt = con.prepareStatement(sql);
                    pstmt.setBinaryStream(1, is2, fileData.length);
                    pstmt.setInt(2, signatureID);
                    pstmt.executeUpdate();
                    pstmt.close();
                    con.commit();
                }
            }
        } catch (Exception e) {
            throw new Exception(e.toString());
        }
        return signatureID;
    }
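If the image file is accessible on the database server itself, a much simpler server-side alternative is PL/SQL with DBMS_LOB.LOADFROMFILE. This is a minimal untested sketch; the directory object IMG_DIR, its path, and the file name photo.jpg are made-up placeholders:
CREATE OR REPLACE DIRECTORY img_dir AS '/u01/app/oracle/images';

DECLARE
  l_bfile BFILE := BFILENAME('IMG_DIR', 'photo.jpg');
  l_blob  BLOB;
BEGIN
  -- create the row with an empty BLOB locator, then fill it from the server-side file
  INSERT INTO imagestore (image) VALUES (EMPTY_BLOB())
  RETURNING image INTO l_blob;
  DBMS_LOB.open(l_bfile, DBMS_LOB.lob_readonly);
  DBMS_LOB.loadfromfile(l_blob, l_bfile, DBMS_LOB.getlength(l_bfile));
  DBMS_LOB.close(l_bfile);
  COMMIT;
END;
/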

  • Example for loading a csv file into diadem from a labview application

Hi everyone, I'm using LabVIEW 8.2 and DIAdem 10.1.
I've been searching in the NI Example Finder but I've had no luck so far.
I have already downloaded the LabVIEW Connectivity VIs.
Can anyone provide an example that can help me load a CSV file into DIAdem from a LabVIEW application?
Thanks

Hi Alexandre,
I attach an example for you.
Best Regards.
Message edited by R_Duval on 01-15-2008 02:44 PM
Romain D.
National Instruments France
Attachments:
Classeur1.csv (1 KB)
Load CSV to Diadem.vi (15 KB)

  • How to load 100 CSV file to DSO / InfoCube

Hi
How do I load 100 CSV files of transactional data into a DSO / InfoCube?

Hi,
You can achieve this by writing a .bat (Windows batch) program that converts the .xls files to .csv format.
In the process chain you can use the OS COMMAND option while creating the process chain;
it will first trigger the .bat file and then the Excel files and the InfoPackage.
I am also working on the same scenario now.
Please find the below docs on how to create a process chain using an OS command:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e0c41c52-2638-2e10-0b89-ffa4e2f10ac0?QuickLink=index&…
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/3034bf3d-9dc4-2d10-9e93-d4ba01505ed9?QuickLink=index&…
Thanks,
Phani.

  • How to read/write .CSV file into CLOB column in a table of Oracle 10g

I have a requirement which is nothing but a table with two columns:
create table emp_data (empid number, report clob)
Here the REPORT column is a CLOB, used to load the data from the .csv file.
The requirements are:
1) How to load data from a .CSV file into the CLOB column, along with empid, using the DBMS_LOB utility.
2) How to read the report column so that it returns all the columns present in the .CSV file (dynamically, because every CSV file may have a different number of columns), along with the primary key empid.
e.g.: empid report_field1 report_field2
1 x y
Any help would be appreciated.

    If I understand you right, you want each row in your table to contain an emp_id and the complete text of a multi-record .csv file.
    It's not clear how you relate emp_id to the appropriate file to be read. Is the emp_id stored in the csv file?
    To read the file, you can use functions from [UTL_FILE|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#BABGGEDF] (as long as the file is in a directory accessible to the Oracle server):
    declare
        lt_report_clob CLOB;
        l_max_line_length integer := 1024;   -- set as high as the longest line in your file
        l_infile UTL_FILE.file_type;
        l_buffer varchar2(1024);
        l_emp_id report_table.emp_id%type := 123; -- not clear where emp_id comes from
        l_filename varchar2(200) := 'my_file_name.csv';   -- get this from somewhere
    begin
       -- open the file; we assume an Oracle directory has already been created
        l_infile := utl_file.fopen('CSV_DIRECTORY', l_filename, 'r', l_max_line_length);
        -- initialise the empty clob
        dbms_lob.createtemporary(lt_report_clob, TRUE, DBMS_LOB.session);
        loop
          begin
         utl_file.get_line(l_infile, l_buffer);
         -- note: get_line strips the line terminator; append a newline here if line breaks matter in the CLOB
         dbms_lob.append(lt_report_clob, l_buffer);
          exception
             when no_data_found then
                 exit;
          end;
        end loop;
        insert into report_table (emp_id, report)
        values (l_emp_id, lt_report_clob);
        -- free the temporary lob
        dbms_lob.freetemporary(lt_report_clob);
       -- close the file
       UTL_FILE.fclose(l_infile);
end;
This simple line-by-line approach is easy to understand, and gives you an opportunity (if you want) to take each line in the file and transform it (for example, you could transform it into a nested table, or into XML). However it can be rather slow if there are many records in the csv file - the lob_append operation is not particularly efficient. I was able to improve the efficiency by caching the lines in a VARCHAR2 up to a maximum cache size, and only then appending to the LOB - see [three posts on my blog|http://preferisco.blogspot.com/search/label/lob].
    There is at least one other possibility:
    - you could use [DBMS_LOB.loadclobfromfile|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_lob.htm#i998978]. I've not tried this before myself, but I think the procedure is described [here in the 9i docs|http://download.oracle.com/docs/cd/B10501_01/appdev.920/a96591/adl12bfl.htm#879711]. This is likely to be faster than UTL_FILE (because it is all happening in the underlying DBMS_LOB package, possibly in a native way).
    That's all for now. I haven't yet answered your question on how to report data back out of the CLOB. I would like to know how you associate employees with files; what happens if there is > 1 file per employee, etc.
    HTH
    Regards Nigel
    Edited by: nthomas on Mar 2, 2009 11:22 AM - don't forget to fclose the file...
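For what it's worth, the DBMS_LOB.loadclobfromfile route mentioned above might look like the following untested sketch (re-using the CSV_DIRECTORY object and the placeholder emp_id 123 from the code above); it handles character-set conversion for you and avoids the line-by-line appends:
DECLARE
  l_clob        CLOB;
  l_bfile       BFILE := BFILENAME('CSV_DIRECTORY', 'my_file_name.csv');
  l_dest_offset INTEGER := 1;
  l_src_offset  INTEGER := 1;
  l_lang_ctx    INTEGER := DBMS_LOB.default_lang_ctx;
  l_warning     INTEGER;
BEGIN
  -- create the row with an empty CLOB locator, then fill it straight from the file
  INSERT INTO report_table (emp_id, report) VALUES (123, EMPTY_CLOB())
  RETURNING report INTO l_clob;
  DBMS_LOB.open(l_bfile, DBMS_LOB.lob_readonly);
  DBMS_LOB.loadclobfromfile(l_clob, l_bfile, DBMS_LOB.getlength(l_bfile),
                            l_dest_offset, l_src_offset,
                            DBMS_LOB.default_csid, l_lang_ctx, l_warning);
  DBMS_LOB.close(l_bfile);
  COMMIT;
END;
/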

How to upload a CSV file with more than 20,000 records into an Oracle table, Oracle APEX 3.2

Can anyone help me upload a CSV of more than 20,000 records using the APEX 3.2 upload process? I am using the regular upload process with a BLOB file:
    SELECT blob_content,id,filename into v_blob_data,v_file_id,v_file_name
    FROM apex_application_files
    WHERE last_updated = (select max(last_updated)
    from apex_application_files WHERE UPDATED_BY = :APP_USER)
    AND id = (select max(id) from apex_application_files where updated_by = :APP_USER);
I tried to upload, but my page is timing out. My application works fine up to 1,000 records; after that it times out. Each record takes 2 seconds to store in the Oracle table, so 1,000 records take 7 minutes, after which the APEX upload web page times out.
Please help me speed up the CSV file upload process, or suggest a better approach, with a source example.
    Thanks,
    Sant.
    Edited by: 994152 on Mar 15, 2013 5:38 AM

See this posting:
Internet Explorer Cannot Display
There I provided a couple of links on this particular issue. You need to change the timeout on your application server.
    Denes Kubicek
    http://deneskubicek.blogspot.com/
    http://www.apress.com/9781430235125
    http://apex.oracle.com/pls/apex/f?p=31517:1
    http://www.amazon.de/Oracle-APEX-XE-Praxis/dp/3826655494

  • Error loading local CSV file into external table

    Hello,
I am trying to load a CSV file located on my C:\ drive on a Windows XP system into an external table. Everything used to work correctly when using Oracle XE (installed locally on my Windows system).
However, when trying to load the same file into an Oracle 11g R2 database on UNIX, I get the following error:
ORA-29913: Error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
error opening file ...\...\...nnnnn.log
Please let me know if I can achieve the same functionality I had with Oracle XE.
(Note: I cannot use the SQL*Loader approach; I am invoking Oracle stored procedures from VB.Net that attempt to load data into external tables.)
Regards,
M.R.

user7047382 wrote the question quoted above. The reply:
So your database is on a unix box, but the file is on your Windows desktop machine? So .... how is it you are making your file (on your desktop) visible to your database (on a unix box)?
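In other words, the CSV must first be copied to a filesystem the database server itself can see, and the directory object must point there. A minimal sketch, with a made-up UNIX path and schema name:
-- run after copying the file to the database host
CREATE OR REPLACE DIRECTORY csv_dir AS '/u01/app/oracle/load';
GRANT READ, WRITE ON DIRECTORY csv_dir TO your_schema;  -- WRITE lets the external table create its log/bad files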

  • How to load a XML file into a table

Hi,
I've been working with Oracle for many years, but for the first time I've been asked to load an XML file into a table.
As an example I found this on the web, but it doesn't work.
Can someone tell me why? I hoped this example could help me.
The file acct.xml is this:
    <?xml version="1.0"?>
    <ACCOUNT_HEADER_ACK>
    <HEADER>
    <STATUS_CODE>100</STATUS_CODE>
    <STATUS_REMARKS>check</STATUS_REMARKS>
    </HEADER>
    <DETAILS>
    <DETAIL>
    <SEGMENT_NUMBER>2</SEGMENT_NUMBER>
    <REMARKS>rp polytechnic</REMARKS>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>3</SEGMENT_NUMBER>
    <REMARKS>rp polytechnic administration</REMARKS>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>4</SEGMENT_NUMBER>
    <REMARKS>rp polytechnic finance</REMARKS>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>5</SEGMENT_NUMBER>
    <REMARKS>rp polytechnic logistics</REMARKS>
    </DETAIL>
    </DETAILS>
    <HEADER>
    <STATUS_CODE>500</STATUS_CODE>
    <STATUS_REMARKS>process exception</STATUS_REMARKS>
    </HEADER>
    <DETAILS>
    <DETAIL>
    <SEGMENT_NUMBER>20</SEGMENT_NUMBER>
    <REMARKS> base polytechnic</REMARKS>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>30</SEGMENT_NUMBER>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>40</SEGMENT_NUMBER>
    <REMARKS> base polytechnic finance</REMARKS>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>50</SEGMENT_NUMBER>
    <REMARKS> base polytechnic logistics</REMARKS>
    </DETAIL>
    </DETAILS>
    </ACCOUNT_HEADER_ACK>
For the two tags HEADER and DETAILS I have the table:
create table xxrp_acct_details(
status_code number,
status_remarks varchar2(100),
segment_number number,
remarks varchar2(100)
);
Before that, I created a directory:
create directory test_dir as 'c:\esterno'; -- where I have my acct.xml
Can you give me a script for loading the data by using XMLTABLE? I've tried this but it doesn't work:
    DECLARE
    acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
    BEGIN
    insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
    select x1.status_code,
            x1.status_remarks,
            x2.segment_number,
            x2.remarks
    from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns header_no      for ordinality,
              status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
    ) x1,
    xmltable(
      '$d/ACCOUNT_HEADER_ACK/DETAILS[$hn]/DETAIL'
      passing acct_doc as "d",
              x1.header_no as "hn"
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS'
) x2;
END;
    This should allow me to get something like this:
    select * from xxrp_acct_details;
STATUS_CODE STATUS_REMARKS    SEGMENT_NUMBER REMARKS
100         check             2              rp polytechnic
100         check             3              rp polytechnic administration
100         check             4              rp polytechnic finance
100         check             5              rp polytechnic logistics
500         process exception 20             base polytechnic
500         process exception 30
500         process exception 40             base polytechnic finance
500         process exception 50             base polytechnic logistics
    but I get:
    Error report:
    ORA-06550: line 19, column 11:
    PL/SQL: ORA-00932: inconsistent datatypes: expected - got NUMBER
    ORA-06550: line 4, column 2:
    PL/SQL: SQL Statement ignored
    06550. 00000 -  "line %s, column %s:\n%s"
    *Cause:    Usually a PL/SQL compilation error.
    and if I try to change the script without using the column HEADER_NO to keep track of the header rank inside the document:
    DECLARE
    acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
    BEGIN
    insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
    select x1.status_code,
            x1.status_remarks,
            x2.segment_number,
            x2.remarks
    from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
    ) x1,
    xmltable(
      '/ACCOUNT_HEADER_ACK/DETAILS'
      passing acct_doc
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS'
) x2;
END;
    I get this message:
    Error report:
    ORA-19114: error during parsing the XQuery expression: 
    ORA-06550: line 1, column 13:
    PLS-00201: identifier 'SYS.DBMS_XQUERYINT' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    ORA-06512: at line 4
    19114. 00000 -  "error during parsing the XQuery expression: %s"
    *Cause:    An error occurred during the parsing of the XQuery expression.
    *Action:   Check the detailed error message for the possible causes.
My Oracle version is 10gR2 Express Edition.
I need a script for loading XML files into a table as soon as possible. Please give me a simple example that works on 10gR2 Express Edition and is easy to understand.
Thanks in advance!

    The reason your first SQL statement
    select x1.status_code,
            x1.status_remarks,
            x2.segment_number,
            x2.remarks
    from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns header_no      for ordinality,
              status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
    ) x1,
    xmltable(
      '$d/ACCOUNT_HEADER_ACK/DETAILS[$hn]/DETAIL'
      passing acct_doc as "d",
              x1.header_no as "hn"
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS'
    ) x2
    returns the error you noticed
    PL/SQL: ORA-00932: inconsistent datatypes: expected - got NUMBER
    is because Oracle is expecting XML to be passed in.  At the moment I forget if it requires a certain format or not, but it is simply expecting the value to be wrapped in simple XML.
    Your query actually runs as is on 11.1 as Oracle changed how that functionality worked when 11.1 was released.  Your query runs slowly, but it does run.
    As you are dealing with groups, is there any way the input XML can be modified to be like
    <ACCOUNT_HEADER_ACK>
    <ACCOUNT_GROUP>
    <HEADER>....</HEADER>
    <DETAILS>....</DETAILS>
    </ACCOUNT_GROUP>
      <ACCOUNT_GROUP>
      <HEADER>....</HEADER>
      <DETAILS>....</DETAILS>
      </ACCOUNT_GROUP>
    </ACCOUNT_HEADER_ACK>
    so that it is easier to associate a HEADER/DETAILS combination?  If so, it would make parsing the XML much easier.
    Assuming the answer is no, here is one hack to accomplish your goal
    select x1.status_code,
            x1.status_remarks,
            x3.segment_number,
            x3.remarks
    from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns header_no      for ordinality,
              status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
    ) x1,
    xmltable(
      '$d/ACCOUNT_HEADER_ACK/DETAILS'
  passing acct_doc as "d"
      columns detail_no      for ordinality,
              detail_xml     xmltype       path 'DETAIL'
    ) x2,
    xmltable(
      'DETAIL'
      passing x2.detail_xml
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS') x3
    WHERE x1.header_no = x2.detail_no;
    This follows the approach you started with.  Table x1 creates a row for each HEADER node and table x2 creates a row for each DETAILS node.  It assumes there is always a one and only one association between the two.  I use table x3, which is joined to x2, to parse the many DETAIL nodes.  The WHERE clause then joins each header row to the corresponding details row and produces the eight rows you are seeking.
    There is another approach that I know of, and that would be using XQuery within the XMLTable.  It should require using only one XMLTable but I would have to spend some time coming up with that solution and I can't recall whether restrictions exist in 10gR2 Express Edition compared to what can run in 10.2 Enterprise Edition for XQuery.
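For the record, one possible shape of that single-XMLTable XQuery is sketched below. It is untested, and given that 10gR2 Express Edition already raised PLS-00201 for SYS.DBMS_XQUERYINT above, it may well not run there at all:
select x.status_code,
       x.status_remarks,
       x.segment_number,
       x.remarks
from xmltable(
  -- pair each HEADER (by position $i) with the DETAIL nodes of the matching DETAILS block
  'for $h at $i in $d/ACCOUNT_HEADER_ACK/HEADER,
       $dt in $d/ACCOUNT_HEADER_ACK/DETAILS[$i]/DETAIL
   return element r {$h/STATUS_CODE, $h/STATUS_REMARKS, $dt/SEGMENT_NUMBER, $dt/REMARKS}'
  passing acct_doc as "d"
  columns status_code    number        path 'STATUS_CODE',
          status_remarks varchar2(100) path 'STATUS_REMARKS',
          segment_number number        path 'SEGMENT_NUMBER',
          remarks        varchar2(100) path 'REMARKS'
) x;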

  • How to load a BDB file into memory?

    The entire BDB database needs to reside in memory for performance reasons, it needs to be in memory all the time, not paged in on demand. The physical memory and virtual process address space are large enough to hold this file. How can I load it into memory just before accessing the first entry? I've read the C++ API reference, and it seems that I can do the following:
    1, Create a DB environment;
    2, Call DB_ENV->set_cachesize() to set a memory pool large enough to hold the BDB file;
    3, Call DB_MPOOLFILE->open() to open the BDB file in memory pool of that DB environment;
    4, Create a DB handle in that DB environment and open the BDB file (again) via this DB handle.
    My questions are:
    1, Is there a more elegant way instead of using that DB environment? If the DB environment is a must, then:
    2, Does step 3 above load the BDB file into memory pool or just reserve enough space for that file?
    Thanks in advance,
    Feng

    Hello,
    Does the documentation on "Memory-only or Flash configurations" at:
    http://download.oracle.com/docs/cd/E17076_02/html/programmer_reference/program_ram.html
    answer the question?
    From there we have:
    By default, databases are periodically flushed from the Berkeley DB memory cache to backing physical files in the filesystem. To keep databases from being written to backing physical files, pass the DB_MPOOL_NOFILE flag to the DB_MPOOLFILE->set_flags() method. This flag implies the application's databases must fit entirely in the Berkeley DB cache, of course. To avoid a database file growing to consume the entire cache, applications can limit the size of individual databases in the cache by calling the DB_MPOOLFILE->set_maxsize() method.
    Thanks,
    Sandra

  • Performance problems loading an XML file into oracle database

    Hello ODI Guru's,
I am trying to load an XML file into the database after doing simple business validations, but the interface takes hours to complete.
1. The XML files are large, more than 200 MB in size. We have an XSD file for the schema definition instead of a DTD.
    2. We used the external database feature for loading these files in database.
    The following configuration was used in the XML Data Server:
    jdbc:snps:xml?f=D:\CustomerMasterData1\CustomerMasterInitialLoad1.xml&d=D:\CustomerMasterData1\CustomerMasterInitialLoad1.xsd&re=initialLoad&s=CM&db_props=oracle&ro=true
3. We then reverse engineered the XML files and created models using ODI Designer.
4. The same was done for the target, i.e. an Oracle database table.
5. Next we created a simple interface with one-to-one mapping from the XSD schema to the Oracle database table and executed the interface. This execution takes more than one hour to complete.
    6. We are running ODI client on Windows XP Professional SP2.
    7. The Oracle database server(Oracle 10g 10.2.0.3) for the target schema as well as the ODI master and work repositories are on the same machine.
8. I tried changing the following properties, but it is not making much visible difference:
    use_prepared_statements=Y
    use_batch_update=Y
    batch_update_size=510
    commit_periodically=Y
    num_inserts_before_commit=30000
I have another problem: when I set batch_update_size to a value greater than 510, I get the following error:
    java.sql.SQLException: class org.xml.sax.SAXException
    class java.lang.ArrayIndexOutOfBoundsException said -32413
    at com.sunopsis.jdbc.driver.xml.v.a(v.java)
The main concern is why the interface takes so long to execute.
Please send suggestions to resolve the problem.
    Thanks in advance,
    Best Regards,
    Nikunj

    Approximately how many rows are you trying to insert?
One of the techniques which I found improved performance for this scenario was to extract from the XML to a flat file, then use SQL*Loader or external tables to load the data into Oracle.
