How to read a txt file and insert its contents into DB tables

Hi,
Please help me to solve this problem. This is what I need to do:
1. Read a text file which holds the values to be inserted into some database tables; for example, it holds ename, deptno, sal, and dname for the emp and dept tables.
2. Insert this data into DB tables such as emp and dept, checking for duplicates and constraint violations. There are usually more than 100,000 records to insert, so I am also looking for a fast way to do this.
3. This operation happens regularly, so I need to schedule the job. I know this can be done using procedures and dbms_job.
Any help will be greatly appreciated.

Generally, in 8i, folks would use SQL*Loader to accomplish this. SQL*Loader does exactly what you want, but it is an external application, so it would probably be easiest to schedule it with your operating system's scheduler (e.g. cron on Unix, Task Scheduler on Windows). You could certainly write a PL/SQL procedure that calls the external SQL*Loader application and use Oracle's job facilities to schedule it, but that strikes me as more effort than it's worth.
If you're using 9i, you could look into external tables, which would simplify the process greatly.
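For illustration, here is a minimal sketch of both routes. The file name emp.txt, the directory path, and the load_emp_from_file wrapper procedure are assumptions, and only the emp columns from the question are shown. A bare-bones SQL*Loader control file would look like:

LOAD DATA
INFILE 'emp.txt'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ','
(ename, deptno, sal)

Rows that violate constraints are written to the bad file rather than aborting the load. The 9i external-table equivalent, plus a dbms_job call for the scheduling requirement, might look like this:

CREATE DIRECTORY data_dir AS '/path/to/files';

CREATE TABLE emp_ext (
  ename  VARCHAR2(30),
  deptno NUMBER,
  sal    NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('emp.txt')
);

-- Set-based insert that skips rows already present (one way to handle duplicates)
INSERT INTO emp (ename, deptno, sal)
SELECT e.ename, e.deptno, e.sal
  FROM emp_ext e
 WHERE NOT EXISTS (SELECT 1 FROM emp t WHERE t.ename = e.ename);

-- Schedule the load to run daily at 02:00.
-- load_emp_from_file is a hypothetical wrapper procedure around the INSERT above.
DECLARE
  l_job BINARY_INTEGER;
BEGIN
  DBMS_JOB.SUBMIT(l_job, 'load_emp_from_file;',
                  SYSDATE, 'TRUNC(SYSDATE) + 1 + 2/24');
  COMMIT;
END;
/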
Justin

Similar Messages

  • How to read a CSV file and Insert data into an Oracle Table

    Hi All,
    I have a CLOB as an IN parameter in my procedure. The content is comma separated. I need a procedure that would parse this CLOB variable and populate an Oracle table.
    Please let me some suggestions on this.
    Thanks,
    Chandra R

    jeneesh wrote:
    And, please don't "hijack" a 5-year-old thread. Better to start a new one. I've just split it off to a thread of its own. ;)
    @OP,
    I have a CLOB file as an IN parameter in my PROC. The file is comma separated. I need a procedure that would parse this CLOB variable and populate an Oracle table.
    You don't have a "CLOB file", as there's no such thing. CLOB is a datatype for storing large character-based objects. A file is something on the operating system's filesystem.
    So, why have you stored comma separated data in a CLOB?
    Where did this data come from? If it came from a file, why didn't you use SQL*Loader or, even better, External Tables to read and parse the data into structured format when populating the database with it?
    If you really do have to parse a CLOB of data to pull out the comma separated values, then you're going to have to write something yourself to do that, reading "lines" by looking for the newline character(s), and then breaking the "lines" up into their component data by looking for commas within them, using normal string functions such as INSTR and SUBSTR or, if necessary, REGEXP_INSTR and REGEXP_SUBSTR. If you have string data that contains commas but uses double quotes around the string, then you'll also have the added complexity of ignoring commas within such string data.
    Like I say... it's much easier with SQL*Loader or External Tables, as these are designed to parse such CSV type data.
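    That said, if you're stuck with the CLOB, a minimal sketch of the manual approach might look like this (the table target_table and its columns col1/col2 are hypothetical; quoted fields are not handled, and lines are assumed to fit in 4000 characters):

    CREATE OR REPLACE PROCEDURE parse_csv_clob (p_data IN CLOB) IS
      l_pos  PLS_INTEGER := 1;
      l_nl   PLS_INTEGER;
      l_line VARCHAR2(4000);
    BEGIN
      WHILE l_pos <= LENGTH(p_data) LOOP
        l_nl := INSTR(p_data, CHR(10), l_pos);
        IF l_nl = 0 THEN
          l_line := SUBSTR(p_data, l_pos);          -- last line, no trailing newline
          l_pos  := LENGTH(p_data) + 1;
        ELSE
          l_line := SUBSTR(p_data, l_pos, l_nl - l_pos);
          l_pos  := l_nl + 1;
        END IF;
        l_line := RTRIM(l_line, CHR(13));           -- cope with CR/LF line endings
        IF l_line IS NOT NULL THEN
          -- REGEXP_SUBSTR picks out the n-th comma separated field
          -- (note: this simple pattern skips empty fields)
          INSERT INTO target_table (col1, col2)
          VALUES (REGEXP_SUBSTR(l_line, '[^,]+', 1, 1),
                  REGEXP_SUBSTR(l_line, '[^,]+', 1, 2));
        END IF;
      END LOOP;
    END parse_csv_clob;
    /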

  • How to read a txt file and update DB table

    I have to read a plain text file which holds some table entries; then I should update my table where the table entries and the data in the text file do NOT match, otherwise keep the table entry as it is.

    This should help:
    data : begin of t_temp occurs 0,
             text(256),
           end of t_temp.
    data : s_file type string.
    SELECTION-SCREEN BEGIN OF BLOCK B1 WITH FRAME TITLE TEXT-S01.
    PARAMETERS: P_FILE LIKE RLGRAP-FILENAME.
    SELECTION-SCREEN END OF BLOCK B1.
    clear s_file.
    move p_file to s_file.
    * Upload the source file into t_temp
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        FILENAME                = s_file
        FILETYPE                = 'ASC'
        HAS_FIELD_SEPARATOR     = 'X'
      TABLES
        DATA_TAB                = t_temp
      EXCEPTIONS
        FILE_OPEN_ERROR         = 1
        FILE_READ_ERROR         = 2
        NO_BATCH                = 3
        GUI_REFUSE_FILETRANSFER = 4
        INVALID_TYPE            = 5
        NO_AUTHORITY            = 6
        UNKNOWN_ERROR           = 7
        BAD_DATA_FORMAT         = 8
        HEADER_NOT_ALLOWED      = 9
        SEPARATOR_NOT_ALLOWED   = 10
        HEADER_TOO_LONG         = 11
        UNKNOWN_DP_ERROR        = 12
        ACCESS_DENIED           = 13
        DP_OUT_OF_MEMORY        = 14
        DISK_FULL               = 15
        DP_TIMEOUT              = 16
        OTHERS                  = 17.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    Text deleted by Moderator.  Do NOT request points
    Regards
    Saimedha

  • I use version 4.01. How do I backup bookmarks and then import them into my laptop?

    I want to export my bookmarks from my desktop and then import them into Firefox on my laptop. The Help file tells me to open "Organize Bookmarks" but I can't find it! Help!

    Sorry, that menu item was renamed. You want "Show All Bookmarks" now.

  • Read info about files in specific folder into internal table

    Hi Experts
    I need to have the last modified date and filename information read into an internal table. I need to read in the information for all files in a specific (UNIX) folder (I do not know the names of the individual files).
    I really hope someone can help.
    Thanks a lot
    Kind regards,
    Torben

    Hi Guys
    Thanks a lot for your input.
    I managed to get my program to work as follows:
    REPORT  ZDELETE_ARCHIVING_FILES.
    *Step 1: Get the list of files in the directory into internal table :
    DATA: DLIST    LIKE EPSFILI OCCURS 0 WITH HEADER LINE,
          DPATH    LIKE EPSF-EPSDIRNAM,
          MDATE    LIKE SY-DATUM,
          MTIME    LIKE SY-UZEIT.
    DATA: BEGIN OF FATTR OCCURS 0,
              FILE_NAME  LIKE EPSF-EPSFILNAM,
              FILE_SIZE  LIKE EPSF-EPSFILSIZ,
              FILE_OWNER LIKE EPSF-EPSFILOWN,
              FILE_MODE  LIKE EPSF-EPSFILMOD,
              FILE_TYPE  LIKE EPSF-EPSFILTYP,
              FILE_MTIME(12),
          END OF FATTR.
    DATA: P_PATH(50) TYPE C.
    CONCATENATE '/ARCHIVE/' sy-sysid '/archive' INTO P_PATH.
    *        WRITE: / P_PATH.
    DPATH = P_PATH.
    *Get files in folder - read into internal table DLIST
    *if filenames are longer than 40 characters
    *then use FM SUBST_GET_FILE_LIST.
    CALL FUNCTION 'EPS_GET_DIRECTORY_LISTING'
         EXPORTING
              DIR_NAME               = DPATH
         TABLES
              DIR_LIST               = DLIST
         EXCEPTIONS
              INVALID_EPS_SUBDIR     = 1
              SAPGPARAM_FAILED       = 2
              BUILD_DIRECTORY_FAILED = 3
              NO_AUTHORIZATION       = 4
              READ_DIRECTORY_FAILED  = 5
              TOO_MANY_READ_ERRORS   = 6
              EMPTY_DIRECTORY_LIST   = 7
              OTHERS                 = 8.
    IF SY-SUBRC EQ 0.
    *Step 2: Read the file attributes into an internal table
    *Read info about the files (attributes) into the internal table FATTR
    *as we need info about the last time it was changed (MTIME)
      LOOP AT DLIST.
        CALL FUNCTION 'EPS_GET_FILE_ATTRIBUTES'
             EXPORTING
                  FILE_NAME              = DLIST-NAME
                  DIR_NAME               = DPATH
             IMPORTING
                  FILE_SIZE              = FATTR-FILE_SIZE
                  FILE_OWNER             = FATTR-FILE_OWNER
                  FILE_MODE              = FATTR-FILE_MODE
                  FILE_TYPE              = FATTR-FILE_TYPE
                  FILE_MTIME             = FATTR-FILE_MTIME
             EXCEPTIONS
                  READ_DIRECTORY_FAILED  = 1
                  READ_ATTRIBUTES_FAILED = 2
                  OTHERS                 = 3.
        IF SY-SUBRC EQ 0.
          FATTR-FILE_NAME = DLIST-NAME.
          APPEND FATTR.
        ENDIF.
      ENDLOOP.
      SORT FATTR BY FILE_NAME.
      DATA: time(10), date LIKE sy-datum.
      DATA: months TYPE i.
      DATA: e_file TYPE string.
      LOOP AT FATTR.
        CLEAR: time, months, e_file.
        IF FATTR-FILE_NAME(10) = 'archive_BW'.
    *Convert the unix time into readable time
          PERFORM p6_to_date_time_tz(rstr0400) USING FATTR-FILE_MTIME
                                                     time
                                                     date.
    *Calculate the number of months between date (MTIME) and current day
    *ex 31.03.2009 - 01.04.2009 = 1 month
    *ex 01.02.2009 - 01.04.2009 = 2 months
          CALL FUNCTION 'MONTHS_BETWEEN_TWO_DATES'
            EXPORTING
              i_datum_von = date
              i_datum_bis = sy-datum
            IMPORTING
              e_monate    = months.
          CONCATENATE DPATH '/' FATTR-FILE_NAME INTO e_file.
    *        WRITE: / FATTR-FILE_NAME,
    *                 FATTR-FILE_SIZE,
    *                 date,
    *                 time,
    *                 sy-datum,
    *                 months,
    *                 e_file.
    *Step 3: Delete files where months > 1
          IF months > 1.
            OPEN DATASET e_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
            DELETE DATASET e_file.
            CLOSE DATASET e_file.
          ENDIF. "months > 1
        ENDIF.
      ENDLOOP.
    ENDIF.

  • How to read from txt file that has words in between?

    Hi all,
    I am using LabVIEW 8.2.
    I would like to read from a text file. I have data (averaged over 100 waveforms each time) repeatedly stored in the file. The idea is to further improve SNR in post-processing by again averaging the data (that has already been averaged over the 100 waveforms).
    I can get LabVIEW to save the data repeatedly into the file, so it keeps getting appended.
    The problem is reading the data back in LabVIEW so I can average it again: LabVIEW separates the sets of data with the following header:
    " Channels    1    
    Samples    9925    
    Date    2008/10/28    
    Time    17:16:11.638363    
    X_Dimension    Time    
    X0    -3.0125000000000013E-3    
    Delta_X    2.500000E-6    
    ***End_of_Header***        "
    So when I read it, it only sees the first set of data.
    Can someone please tell me how to read all the sets of data in LabVIEW?
    I have attached the file I want to read, "acquiredwaveform.txt", and the basic VI (really basic, btw) to read the file.
    Thanks
    Attachments:
    ReadFileAndAverage.vi ‏48 KB
    acquiredWaveform.txt ‏605 KB

    Thanks again, DFGray, for the comments.
    After the correlations to find the peak positions, I just take the max value. And you are right, the accuracy is limited by the number of samples per cycle. Perhaps it would be clearer if you see the code.
    1) Basically I get a signal on the up and down slope of the sine wave. On the down slope, however, the signal is negative, i.e. it is flipped. So before I shift and average, I 'cut' the waveform into 4 bits (when cycles per buffer = 2, I get 4 signals: 2 on the up slope and 2 on the down slope). Counting from one, I flip the even-numbered ones, cut them, and build an array of waveforms which is then sent to be convolved and shifted.
    2) The array of waveforms is stored to be phase shifted (convolve and shift VI) and averaged (entire averaging VI, which uses the convolve and shift VI as a sub-VI).
    * Phase shifting doesn't work when I cut and put it together (so something is wrong in the cut waveform VI).
    * Also, if it isn't too time consuming, could you give me an example of the interpolating and shifting approach?
    * Also, if you have any comments regarding the following VIs, please let me know.
    Thanks 
    Attached is:
    1) Cut waveform vi
    2) Convolve and shift
    3) Entire averaging 
    Attachments:
    SubVICutWaveforms.vi ‏37 KB
    SubVIConvolveShift.vi ‏30 KB
    SubVIEntireAveraging2.vi ‏43 KB

  • Help! How to read a .txt file into a Java class and make 2D array?

    Hi guys,
    I'm a newbie with arrays, just started really using them... please bear with me if I don't seem to understand much.
    I have a .txt file that contains either a square or a rectangle (random width and length). How can I read each line of the file into a 2D array (rows and columns) in a Java class?

    Example:
    import javax.swing.*;
    import java.util.ArrayList;
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.File;

    public class ReadInto2DArrayExample {
        public static void main(String[] args) {
            ArrayList<char[]> array = new ArrayList<char[]>();
            char[][] twoDimensionArray = null;
            try {
                JFileChooser fileChooser = new JFileChooser();
                if (fileChooser.showOpenDialog(null) == JFileChooser.APPROVE_OPTION) {
                    File file = fileChooser.getSelectedFile();
                    BufferedReader reader = new BufferedReader(new FileReader(file));
                    String data;
                    // Read from file: one line per row
                    while ((data = reader.readLine()) != null) {
                        // Convert each line to a char array and collect it
                        array.add(data.toCharArray());
                    }
                    reader.close();
                    // Create a 2D char array based on the number of lines read
                    twoDimensionArray = new char[array.size()][];
                    // Convert from ArrayList to 2D array
                    for (int i = 0; i < array.size(); i++) {
                        twoDimensionArray[i] = array.get(i);
                    }
                    // Test the 2D array
                    for (int y = 0; y < twoDimensionArray.length; y++) {
                        char[] temp = twoDimensionArray[y];
                        for (int x = 0; x < temp.length; x++) {
                            System.out.print(temp[x]);
                        }
                        System.out.println();
                    }
                }
            } catch (Exception ex) {
                ex.printStackTrace();
            }
        }
    }

  • SAP GRC - Exporting rules from GRC - how to read the .txt file generated ?

    Hi there,
    I am using GRC Compliance Calibrator and have downloaded the default Global rules defined in Compliance Calibrator using the Rule Architect -> Utilities->Export rules.
    This gave me a massive txt file with a lot of tables and data. Reading through this forum, I did figure out that lines starting with M are the header rows for the tables and D rows are the data rows.
    My question is: how do I figure out what each of the Virsa tables stands for (e.g. VIRSA_CC_FUNCACT, VIRSA_CC_FUNCPRM)?
    I tried SE11, looking up these tables in the SAP environment associated with this CC install; however, it says that the table was not found.
    Could someone please point me to:
    A) a list of the common Virsa CC tables and their descriptions?
    OR
    B) how I can find what these tables stand for, online or in the SAP environment?
    Many thanks !

    Hi Santosh,
    There is no option available to export only the customized rule sets to another system. The export rules option will give all the rules that are available in that system.
    You can do it in the following manner:
    a) Extract the data via Export rules
    b) Open that text file in a spreadsheet and edit the spreadsheet [remove the rule sets & the rules not required in the production system]
    c) Save the spreadsheet as a UTF-8 text file
    d) Upload it to production.
    The above procedure is a bit complex and cumbersome, as changing the text file is risky: even a stray space will stop any rules being generated in RAR. I would suggest renaming the new rule set with a different naming convention and uploading it in your test environment before uploading the text files in production.
    But, using the Export and Import option, you cannot upload only the customized rule set, as the extract happens for the entire set of rules available in the system.
    Thanks and Best Regards,
    Srihari.K

  • Can't figure out how to read from .txt file...

    It looks so simple:
    I have a txt file with one line in it:
    1038
    It is actually a high score for a little game. But how do I get this out of the file into an int?
    Thank you.

    this reads out a complete text file:
    // needs: import java.io.*;
    try {
        FileInputStream fis = new FileInputStream(fileName); // fileName: path to your text file
        BufferedInputStream bis = new BufferedInputStream(fis);
        DataInputStream dis = new DataInputStream(bis);
        String tmp = "";
        // dis.readLine() is deprecated, but works for simple ASCII files
        while (dis.available() > 0) { tmp += dis.readLine() + "\n"; }
        dis.close();
        bis.close();
        fis.close();
    } catch (Exception e) { e.printStackTrace(); }
    if you want to convert a string to an int, use
    int i = Integer.parseInt(tmp.trim());

  • How to read LARGE .TXT FILES from server

    I'm unable to read large .txt (>100 KB) files from the server.
    Also, is there any way to store the content on the mobile phone?
    PLEASE HELP, as I am in dire need and only a few days are left.
    thanks

    Hi
    > I'm unable to read large .txt (>100 KB) files from the server.
    What do you mean by that? Are you trying to display that amount of text and getting an error? Or are you trying to read from the server and getting an error there?
    > Also, is there any way to store the content on the mobile phone?
    Using RMS. But be aware of the size limit, which is specific to every phone.
    Mihai

  • Read Excel file and insert it into a table

    Hi All
    I'm developing a web app using JDeveloper 11.1.1.3.
    I want to upload an Excel file into a table. Is there a way in the framework to do this easily?
    Tx

    Convert the Excel file to a .csv file, expose the CSV file as a data control, and then use it in the table:
    http://technology.amis.nl/blog/2306/uploading-csv-files-into-dynamic-adf-faces-tables-programmtically-creating-columns

  • How to save the pdf file or word doc into sap table

    Hi Experts,
    I have a PDF file on my presentation server. Now I want to save the PDF file into an SAP table using a module pool program. Whenever I need it, I want to open that file from the table and show it on the screen. Please can anyone tell me how I can save the file, and what the table name is?
    Regards,
    S.Nehru.

    Hi,
    Try the following code
    FORM gui_upload.
      DATA: lv_filetype(10) TYPE c,
            lv_gui_sep TYPE c,
            lv_file_name TYPE string.
      lv_filetype = 'PDF'.
      lv_file_name = <name of your file>.
    DATA: tb_file_data TYPE TABLE OF text4096.
    * FM call to upload file
      CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename                = lv_file_name
          filetype                = lv_filetype
          has_field_separator     = lv_gui_sep
        TABLES
          data_tab                = tb_file_data
        EXCEPTIONS
          file_open_error         = 1
          file_read_error         = 2
          no_batch                = 3
          gui_refuse_filetransfer = 4
          invalid_type            = 5
          no_authority            = 6
          unknown_error           = 7
          bad_data_format         = 8
          header_not_allowed      = 9
          separator_not_allowed   = 10
          header_too_long         = 11
          unknown_dp_error        = 12
          access_denied           = 13
          dp_out_of_memory        = 14
          disk_full               = 15
          dp_timeout              = 16
          OTHERS                  = 17.
      IF sy-subrc <> 0.
    * MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *         WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
    ENDFORM.                    "gui_upload
    I don't think you can save the data into SAP tables in PDF format.
    With the above code you can load the file into an internal table.
    Regards,
    Manish

  • Import a txt file from a location into the table through toad

    Hi,
    I have a location which is mounted on my Oracle database server.
    I have created a directory object for that location.
    I need to load one .txt file present on that mapped location, with the help of TOAD.
    My delimiter is '|'.
    Please help me out.
    Thanks

    The environment you described (and your intentions, if I got them right) seems to call for a solution using External Tables (see Managing Tables).
    You shouldn't have any problems using TOAD.
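    For example, a minimal external table over a pipe-delimited file might look like this (the directory object my_dir, the file name data.txt, and the column list are placeholders for your own):

    CREATE TABLE ext_pipe_data (
      col1 VARCHAR2(50),
      col2 VARCHAR2(50)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY my_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '|'
      )
      LOCATION ('data.txt')
    );

    -- A plain INSERT ... SELECT then moves the rows into the real table
    INSERT INTO target_table SELECT * FROM ext_pipe_data;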
    Regards
    Etbin

  • IO - Read two image files and put them into one file

    Hi,
    I have 3 files in all: two image files and one text file. I need to place the image from the first image file, followed by the text from the text file, and then the image from the second image file, into one file.
    Can anyone tell me how I go about doing this?
    I tried using FileInputStream and FileOutputStream, which works fine if all 3 files contain text; but when the first and the third file are images, the code doesn't give any error, yet the result file displays only the image from the first file and nothing else.
    I am running short of time and need to do this really soon.
    If anyone has done anything like this, please let me know.
    thanx,
    poonam

    One approach would be to programmatically create a single zip/jar file from the three input files. You can use the java.util.zip and java.util.jar packages for this purpose.
    The other approach would be to create a single image by drawing the images and text strings on a BufferedImage object.
    I think the first approach is preferable because you can easily extract the individual files from the zip/jar file.

  • How to extract data from a CLOB and insert it into the DB?

    Hi PL/SQL Gurus,
    I have no experience in PL/SQL, but I have a requirement now where I have to use it.
    We have a table with 10 columns; one of them is a CLOB and it holds XML data. The XMLs are very large.
    Two new columns have been added to the table, and they have to be filled for the existing records from the corresponding XML. The XML has all the data as attributes. I started searching on the internet and tried to see whether I could extract the data out of the XML.
    SELECT extractValue(value(x), '/Order/Package/@Code',
    'xmlns:ns0="http://mycompany.com/Order/OrderType.xsd"' )
    FROM ORDER_TABLE A
    , TABLE(
    XMLSequence(
    extract(
    xmltype(A.XML_DATA)
    , '/ns0:Root'
    , 'xmlns:ns0="http://mycompany.com/Order/OrderType.xsd"'
    ) x;
    But this isn't working. Could anyone please provide some ideas.
    I just want to confirm that I am able to extract data. Once this is done I would like to create a procedure so that all the existing records can be updated.
    Thanks in advance.
    Regards,
    Fazzy

    Can this be achieved using a SQL statement?
    Yes, you can do it with the DML error logging clause.
    Here's a quick example I've just set up :
    Base table
    SQL> create table order_table (
      2   order_id number,
      3   package_code varchar2(30),
      4   package_desc varchar2(100),
      5   xml_data clob
      6  );
    Table created
    SQL> alter table order_table add constraint order_table_pk primary key (order_id);
    Table altered
    Adding data...
    SQL> insert into order_table (order_id, xml_data)
      2  values (1, '<?xml version="1.0" encoding="UTF-8"?>
      3  <ns0:Root xmlns:ns0="http://mycompany.com/Order/OrderType.xsd" BookingDate="2009-06-07">
      4  <ns1:OrderKey xmlns:ns1="http://mycompany.com/Order/OrderKeyTypes.xsd" SystemCode="THOMAS" Id="458402-TM1" Version="1"/>
      5  <ns0:Package Code="0001" Desc="ProductName1"/>
      6  <ns0:PromotionGroup Code="DSP" Name="OrderThomasPortal"/>
      7  <ns0:Promotion Code="TH902" Name="OrderThomasPortal" SellingName="OrderThomasPortal" BackOfficeName="OrderThomasPortal"/>
      8  <ns0:FinancialSupplier Code="HHT" Name="NYOrdSupp"/>
      9  <ns0:SellingCurrency Code="EUR" Name="Euro"/>
    10  <ns0:Owner ClientId="02654144" ClientMandator="T" ClientSystem="ROSY"/>
    11  <ns0:Agent PersonInAgency="5254" AgentCode="000009" CommissionAmount="2.8" CommissionDueDate="2009-07-01+02:00" VatOnCommission="0"/>
    12  </ns0:Root>');
    1 row inserted
    SQL> insert into order_table (order_id, xml_data)
      2  values (2, '<?xml version="1.0" encoding="UTF-8"?>
      3  <ns0:Root xmlns:ns0="http://mycompany.com/Order/OrderType.xsd" BookingDate="2009-06-07">
      4  <ns1:OrderKey xmlns:ns1="http://mycompany.com/Order/OrderKeyTypes.xsd" SystemCode="THOMAS" Id="458402-TM1" Version="1"/>
      5  <ns0:Package Code="0002" Desc="ProductName2/>
      6  <ns0:PromotionGroup Code="DSP" Name="OrderThomasPortal"/>
      7  <ns0:Promotion Code="TH902" Name="OrderThomasPortal" SellingName="OrderThomasPortal" BackOfficeName="OrderThomasPortal"/>
      8  <ns0:FinancialSupplier Code="HHT" Name="NYOrdSupp"/>
      9  <ns0:SellingCurrency Code="EUR" Name="Euro"/>
    10  <ns0:Owner ClientId="02654144" ClientMandator="T" ClientSystem="ROSY"/>
    11  <ns0:Agent PersonInAgency="5254" AgentCode="000009" CommissionAmount="2.8" CommissionDueDate="2009-07-01+02:00" VatOnCommission="0"/>
    12  </ns0:Root>');
    1 row inserted
    SQL> commit;
    Commit complete
    Note that the second row inserted contains a not well-formed XML (no closing quote for the /Root/Package/@Desc attribute).
    Creating the error logging table...
    SQL> create table error_log_table (
      2   ora_err_number$ number,
      3   ora_err_mesg$   varchar2(2000),
      4   ora_err_rowid$  rowid,
      5   ora_err_optyp$  varchar2(2),
      6   ora_err_tag$    varchar2(2000)
      7  );
    Table created
    Updating...
    SQL> update (
      2    select extractvalue(doc, '/ns0:Root/ns0:Package/@Code', 'xmlns:ns0="http://mycompany.com/Order/OrderType.xsd"') as new_package_code
      3         , extractvalue(doc, '/ns0:Root/ns0:Package/@Desc', 'xmlns:ns0="http://mycompany.com/Order/OrderType.xsd"') as new_package_desc
      4         , package_code
      5         , package_desc
      6    from (
      7      select package_code
      8           , package_desc
      9           , xmltype(xml_data) doc
    10      from order_table t
    11    )
    12  )
    13  set package_code = new_package_code
    14    , package_desc = new_package_desc
    15  log errors into error_log_table ('My update process')
    16  reject limit unlimited
    17  ;
    1 row updated
    SQL> select order_id, package_code, package_desc from order_table;
      ORDER_ID PACKAGE_CODE                   PACKAGE_DESC
             1 0001                           ProductName1
             2                               
    One row has been updated as expected; the other has been rejected and logged into the error table:
    SQL> select * from error_log_table;
    ORA_ERR_NUMBER$ ORA_ERR_MESG$                                                   ORA_ERR_ROWID$     ORA_ERR_OPTYP$ ORA_ERR_TAG$
              31011 ORA-31011: XML parsing failed                                   AAAF6PAAEAAAASnAAB U              My update process
                    ORA-19202: Error occurred in XML processing
                    LPX-00244: invalid use of less-than ('<') character (use &lt;)
                    Error at line 5
                    ORA-06512: at "SYS.XMLTYPE", line 272
                    ORA-06512: at line 1
