Best way of reading a CLOB and loading it into a table

Hi,
I'm loading data from a CLOB into one of our tables. Right now it takes 10 minutes for 8,000 records, and down the road we are expecting more than 20,000 records.
Is there a faster way to load the data with this approach? Please help me out.
The source table is lob_effect1; the target table can be any table.
CREATE TABLE lob_effect1 (
  id  INTEGER NULL,
  loc CLOB    NULL
)
STORAGE (
  NEXT 1024K
);
CREATE OR REPLACE FUNCTION f_convert(p_list IN VARCHAR2)
  RETURN VARCHAR2
AS
  -- walks the comma-separated input and rebuilds it with every
  -- value wrapped in single quotes: a,b,c -> 'a','b','c'
  l_string      VARCHAR2(32767) := p_list || ',';
  l_comma_index PLS_INTEGER;
  l_index       PLS_INTEGER := 1;
  v_col_val     VARCHAR2(32767);
  v_col_val_str VARCHAR2(32767);
BEGIN
  LOOP
    l_comma_index := INSTR(l_string, ',', l_index);
    EXIT WHEN l_comma_index = 0;
    v_col_val     := SUBSTR(l_string, l_index, l_comma_index - l_index);
    v_col_val_str := v_col_val_str || ',' || chr(39) || v_col_val || chr(39);
    v_col_val_str := ltrim(v_col_val_str, ',');
    l_index       := l_comma_index + 1;
  END LOOP;
  RETURN v_col_val_str;
END f_convert;
/
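(For reference, a quick check of what f_convert returns; the expected output below follows from the logic above:)

SELECT f_convert('a,b,c') FROM dual;
-- expected: 'a','b','c'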
CREATE OR REPLACE PROCEDURE p_load_clob1(
    p_date     IN DATE DEFAULT NULL,
    p_tab_name IN VARCHAR2,
    p_clob     IN CLOB DEFAULT NULL)
IS
  var_clob                 CLOB;
  var_clob_line            VARCHAR2(4000);
  var_clob_line_count      NUMBER;
  var_clob_line_word_count NUMBER;
  v_col_val                VARCHAR2(32767);
  v_col_val_str            VARCHAR2(32767);
  v_tab_name               VARCHAR2(200) := 'coe_emea_fi_fails_new_tmp';  -- unused
  v_sql                    VARCHAR2(32767);
  n_id                     NUMBER;
  -- fn_split_str is an existing helper that returns the comma-separated
  -- tokens of p_str as a collection
  CURSOR cur_col_val(p_str VARCHAR2)
  IS
    SELECT * FROM TABLE(fn_split_str(p_str));
BEGIN
  -- stage the incoming CLOB (the post as written inserted into lob_effect;
  -- lob_effect1 matches the SELECT below)
  INSERT INTO lob_effect1
  VALUES (seq_lob_effect.nextval, p_clob)
  RETURNING id INTO n_id;
  COMMIT;
  SELECT loc INTO var_clob FROM lob_effect1 WHERE id = n_id;
  -- number of lines = newline count + 1
  var_clob_line_count := LENGTH(var_clob) - NVL(LENGTH(REPLACE(var_clob, chr(10))), 0) + 1;
  FOR i IN 1 .. var_clob_line_count
  LOOP
    var_clob_line := regexp_substr(var_clob, '^.*$', 1, i, 'm');
    -- number of fields on the line = comma count + 1
    var_clob_line_word_count := LENGTH(var_clob_line) - NVL(LENGTH(REPLACE(var_clob_line, ',')), 0) + 1;
    v_col_val_str := NULL;
    v_col_val     := NULL;
    -- rebuild the line as a quoted, comma-separated value list
    FOR rec_col_val IN cur_col_val(var_clob_line)
    LOOP
      v_col_val     := rec_col_val.column_value;
      v_col_val_str := v_col_val_str || ',' || chr(39) || v_col_val || chr(39);
      v_col_val_str := ltrim(v_col_val_str, ',');
    END LOOP;
    -- a literal-only dynamic INSERT per line: every row is a brand-new
    -- statement text, so the SQL engine hard-parses each one
    v_sql := 'insert into ' || p_tab_name || ' values (' || v_col_val_str || ')';
    EXECUTE IMMEDIATE v_sql;
  END LOOP;
  COMMIT;
EXCEPTION
WHEN OTHERS THEN
  -- WHEN OTHERS without RAISE hides failures; fine for debugging only
  dbms_output.put_line('Error:' || SQLERRM);
END;
/
Thanks & Regards,
Ramana.
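
For what it's worth, two things dominate the run time here: every line produces a brand-new INSERT built entirely from literals (a hard parse per row, nothing shareable between rows), and REGEXP_SUBSTR rescans the CLOB from the start for each line. If the data originates from a flat file, the usual fix is to skip the CLOB round-trip and let the SQL engine parse the file through an external table. A minimal sketch, assuming a directory object load_dir pointing at the file's location, a file named data.csv, and a three-column layout (all placeholders to adapt):

-- placeholder columns; the list must match the real file layout
CREATE TABLE lob_effect_ext (
  col1 VARCHAR2(4000),
  col2 VARCHAR2(4000),
  col3 VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('data.csv')
);

-- the whole load becomes one set-based statement: no per-row parse,
-- no per-row PL/SQL-to-SQL context switch
INSERT /*+ APPEND */ INTO coe_emea_fi_fails_new_tmp
SELECT col1, col2, col3 FROM lob_effect_ext;
COMMIT;

If the CLOB really has to stay the source, the cheapest change is to stop concatenating literal values and bind them instead, so the INSERT text stays constant and is parsed once.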


Similar Messages

  • Best way to read NTFS and WRITE to it?

    I know there is a laborious 3rd-party software package called MacFuse that will allow Macs to read AND write to an NTFS drive... but does anyone know of a more reliable and easier way? MacFuse sounds like a tech-savvy way to go about it....
    Just curious...
    Thanks!

    There is a shareware program called Paragon NTFS. However, I see no reason not to use MacFuse and NTFS-3g. Both are open-source programs that are easy to use.
    Simply download and install MacFuse and then do the same for NTFS-3g. They both use preference panes for settings, but they work "as-is". Once installed together, NTFS volumes work the same as any other volume in OS X. It's not only for the "tech-savvy".

  • Best way to read KEKO and KEPH data

    Hi All,
    I have to collect data from KEKO and KEPH.
    I tried the FM 'CK_F_KEKO_KEPH_READING', but it takes a long time. Is there a better way to read that data? I have to collect data for multiple materials and multiple plants.
    Thanks in advance..!

    Try
    CK_F_KEKO_KEPH_DIRECT_READ
    CK_F_KEKO_KEPH_READING
    Also look at function group "CK2U".
    Hope this gives you an idea!
    Please award the points.
    Good luck
    Thanks
    Saquib Khan
    "Some are wise and some are otherwise"

  • Best way to read a CSV file into an internal table

    Hi,
    I have a comma-separated file which I need to read into an internal table and be able to look at particular values - basically looping round and populating BDC fields. What's the easiest way of doing this?

    The easiest way would be to define a flat internal table.
    Data: begin of itab occurs 0,
          rec(1000) type c,
          end of itab.
    Then call the function module GUI_UPLOAD to get the data into your internal table from the file on the frontend. I assume that you know how to use this function module.  You could also use the method GUI_UPLOAD of the class CL_GUI_FRONTEND_SERVICES.
    For example, say you have a comma-delimited file with material number, plant, and quantity in it.
    4000001,0004,1.00
    4000002,0006,2.00
    Define another internal table to hold the data.
    Data: begin of itab2 occurs 0,
          matnr type mara-matnr,
          werks type marc-werks,
          quant(10) type c,
          end of itab2.
    Then loop at the internal table (ITAB) and use the SPLIT statement to split each record into the appropriate fields of the other internal table (ITAB2).
    Loop at itab.
    clear itab2.
    split itab-rec at ',' into itab2-matnr
                               itab2-werks
                               itab2-quant.
    append itab2.
    endloop.
    That's it.
    Regards,
    Rich Heilman

  • What is the best way to read, process, and write an Excel File Server side...SQL Server Agent Job

    So I was using dynamic Excel commands to open and save as using...
    Microsoft.Office.Interop.Excel.Application
    and
     workbook.SaveAs(StringDestinationFile, XlFileFormat.xlExcel8, Type.Missing, Type.Missing, Type.Missing, Type.Missing, XlSaveAsAccessMode.xlExclusive, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing);
    which worked all fine and dandy client side. Then when I attempted to create a SQL Server Agent Job, this failed because the SQL Server side cannot execute dynamic Excel commands.
    So do I need to try to do this via Microsoft.ACE.OLEDB.12.0 commands? And where can I find the commands and syntax to open and save as? I have to open a .xlsx file, save it as a .xls file, and then open this newly created .xls file and save it as a .csv file.
    Thanks for your review and am hopeful for a reply.
    ITBobbyP85

    I think you might be over complicating things.
    You can use SSIS with Excel Source/Destination connections to read in, or output to an excel sheet/file.

  • What is the best way to read and manipulate large data in excel files and show them in Sharepoint

    Hi ,
    I have a large excel file that has 700,000 records in it. The excel file has a few columns that change every day.
    What is the best way to read the data from the Excel file in the fastest and most efficient way?
    2nd problem:
    I have one excel file that has many rows each row contain some data that has certain keywords.
    What I want is to segregate the data of the rows into respective sheets (tabs) in the workbook.
    For example, the rows contain the following data:
    1. alfa
    2. beta
    3. gama
    4. beta
    5. gama
    6. gama
    7. alfa
    I want there to be 3 tabs in the workbook, one for each of the keywords alfa, beta and gama.

    Hi,
    I don't really see any better options for SharePoint. SharePoint uses another product called 'Office Web Apps' to allow users to view/edit Microsoft Office documents (Word, Excel etc.). But the web version of Excel doesn't support that many records, and there are size limitations as well (the default max size is probably 10MB).
    Regarding the second problem, I think you need a custom solution (like a SharePoint timer job/web part) to read and present the data.
    However, if you can reduce the Excel file to somewhere near 16k records (which is the supported row count in the web version of Excel), then you can use SharePoint Excel Services to refresh the data in the Excel file in SharePoint automatically from external sources.
    Thanks,
    Sohel Rana
    http://ranaictiu-technicalblog.blogspot.com

  • What is the best way to open emails and attachments without using wifi?

    For iPhone and iPad, what is the best way to open emails and attachments without using wifi? I turned off wifi in my settings, but my boss thinks there may be another, better way to use something else instead of wifi. Any help would be appreciated! Thank you!

    Thanks! That is a very good question. My boss asked me that, and I am assuming he is having issues with using wifi wherever he is. My boss is the kind of person who, when he asks something, you look into it and ask him no questions - he's the only one asking questions! But thank you for your response; I will tell him what you told me and hopefully that will help!

  • How to read from and write into the same file from multiple threads?

    I need to read from and write into the same file from multiple threads.
    How can we do that without any data contamination?
    Can you please provide code for this type of task?
    Thanks in advance.

    Assuming you are using RandomAccessFile, you can use the locking functionality in the Java NIO library to lock sections of a file that you are reading/writing from each thread (or process).
    If you can't use NIO, and all your threads are in the same application, you can create your own in-process locking mechanism that each thread uses prior to accessing the file. That would take some development, and the OS already has the capability, so using NIO is the best way to go if you can use JDK 1.4 or higher.
    - K

  • What is the best way to merge a file content into log file

    What is the best way to merge a file content into log file.
    In the worst case, I will read the file line by line as a string, then use
    logger.info(lineString) to output to the log file.
    However, is there better way to do this?
    The eventual log file will be something like:
    log message 1
    log message 2
    content from file line 1
    content from file line 2
    content from file line 3
    log message 3
    log message 4
    Thanks

    John618 wrote:
    Thank you and let me explain:
    1. What do you mean by better?
    I would like to see better performance. Reading line by line and logging each line as a string can be slow.
    Did you measure this and determine that it is actually a problem for your application? Or are you guessing? Regardless of what you do, you are still going to need to read the file.
    2. The only better way I can think of is not having to do it, but I assume you have a very good reason to want to do this.
    Yes, I have to do it because the requirement is to have that file content be part of the logging. Any idea?
    How is it supposed to be part of it? For example, which of the following is better?
            File AAA - contents
                       First Line
                       Second Line XXX
            Log 1
                    2009-03-27 DEBUG: Random preceding line
                    2009-03-27 DEBUG: First Line
                    2009-03-27 DEBUG: Second Line XXX
                    2009-03-27 DEBUG: Random following line
            Log 2
                    2009-03-27 DEBUG: Random preceding line
                    2009-03-27 DEBUG: ----- File: AAA -------------
                    First Line
                    Second Line XXX
                2009-03-27 DEBUG: Random following line
    Both of the above have some advantages and disadvantages.
    The first, in a multi-threaded app, can end up with intermittent log entries in between lines, so having log lines with thread ids becomes important.
    The first can be created by reading one line at a time and posting one at a time.
    The second can be created by reading the entire file as a single string and then posting using a single log statement.

  • What's the best way to read JSON data?

    Hi all;
    What is the best way to read in JSON data? And once it is read in, is the best way to use it to turn it into XML and apply XPath?
    thanks - dave

    jtahlborn wrote:
    Without having a better understanding of what your definition of "use it" is, this question is essentially unanswerable. Jackson is a fairly popular library for translating JSON to/from Java objects. The JSON website provides a very basic library for parsing to/from XML. Which one is the "best" depends on what you want to do with it.
    Good point. We have a reporting product (www.windward.net) and we've had a number of people ask us for JSON support. But how complex the data is and what they want to pull is all over the place. The one thing that's common is they generally want to pull down the JSON data, and then put specific items from that in the report.
    XML/XPath struck me as a good way to do this for a couple of reasons. First, it seems to map well to the JSON data layout. Second, it provides a known query language. Third, we have a really good XPath wizard and we could then use it for JSON also.
    ??? - thanks - dave

  • Best way to read from a file

    What would be the best way to read from a file? Which classes do I need to use?
    I have to write a program which reads data from a comma-separated flat file, parses it, and after applying some business logic inserts it into a database.
    I will have to read the data line by line.
    Any help?

    I would use:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.StringTokenizer;

    private String[][] data;

    // Reads "comp.txt" into a two-dimensional String array,
    // splitting each line on the "?" delimiter.
    public void readData() {
        try {
            data = new String[this.countRows("comp.txt")][];
            BufferedReader br = new BufferedReader(new FileReader("comp.txt"));
            for (int x = 0; x < data.length; x++) {
                StringTokenizer temp = new StringTokenizer(br.readLine(), "?");
                data[x] = new String[temp.countTokens()];
                for (int y = 0; y < data[x].length; y++) {
                    data[x][y] = temp.nextToken();
                }
            }
            br.close();
        } catch (Exception e) {
            System.out.println(e.toString());
        }
    }

    // Counts the lines in the file so the array can be sized up front.
    private int countRows(String f) {
        int t = 0;
        try {
            BufferedReader brCountRows = new BufferedReader(new FileReader(f));
            while (brCountRows.readLine() != null) {
                t++;
            }
            brCountRows.close();
        } catch (Exception e) {
            System.out.println(e.toString());
        }
        return t;
    }

    It works deliciously!

  • Efficient way to read CLOB data

    Hello All,
    We have a stored procedure in Oracle with a CLOB out parameter. When it is executed from Java the stored proc runs fast, but reading the data from the CLOB datatype using the 'subString' functionality takes more time (approx. 6 sec for 540KB of data).
    Could someone please suggest an efficient way to read data from the CLOB? (We need to read the data from the CLOB and write it into a file.)
    Thanks & Regards,
    Prashant

    Hi,
    you can try buffered reading / writing the data, it usually speeds the process up.
    See example here:
    http://www.oracle.com/technology/sample_code/tech/java/sqlj_jdbc/files/advanced/LOBSample/LOBSample.java.html
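
    If writing the file server-side is an option, the Java layer can be bypassed entirely: read the CLOB in chunks with DBMS_LOB.READ and write them out with UTL_FILE. A minimal PL/SQL sketch (a server-side alternative, not the buffered-Java approach from the link above), assuming a DIRECTORY object named OUT_DIR exists, the table/column/file names are placeholders, and no single line exceeds 32767 bytes:

    DECLARE
      l_clob   CLOB;
      l_file   UTL_FILE.FILE_TYPE;
      l_buffer VARCHAR2(32767);
      l_amount PLS_INTEGER;
      l_pos    PLS_INTEGER := 1;
    BEGIN
      SELECT clob_col INTO l_clob FROM some_table WHERE id = 1;  -- placeholder row
      l_file := UTL_FILE.FOPEN('OUT_DIR', 'clob_out.txt', 'w', 32767);
      LOOP
        l_amount := 32767;  -- DBMS_LOB.READ lowers this to the amount actually read
        BEGIN
          DBMS_LOB.READ(l_clob, l_amount, l_pos, l_buffer);
        EXCEPTION
          WHEN NO_DATA_FOUND THEN EXIT;  -- past the end of the LOB
        END;
        UTL_FILE.PUT(l_file, l_buffer);
        l_pos := l_pos + l_amount;
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END;
    /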

  • Best way to insert millions of records into the table

    Hi,
    From a performance point of view, I am looking for suggestions on the best way to insert millions of records into a table.
    Please also guide me on how to implement it in an easy way with good performance.
    Thanks,
    Orahar.

    Orahar wrote:
    Its Distributed data. No. of clients and N no. of Transaction data fetching from the database based on the different conditions and insert into another transaction table which is like batch process.
    Sounds contradictory.
    If the source data is already in the database, it is centralised.
    In that case you ideally do not want the overhead of shipping that data to a client, the client processing it, and the client shipping the results back to the database to be stored (inserted).
    It is much faster and more scalable for the client to instruct the database (via a stored proc or package) what to do, and for that code (running on the database) to process the data.
    For a stored proc, the same principle applies. It is faster for it to instruct the SQL engine what to do (via an INSERT..SELECT statement) than to pull the data from the SQL engine using a cursor fetch loop and then push that data back to the SQL engine using an insert statement.
    An INSERT..SELECT can also be done as a direct path insert. This introduces some limitations, but is faster than a normal insert.
    If the data processing is too complex for an INSERT..SELECT, then pulling the data into PL/SQL, processing it there, and pushing it back into the database is the next best option. This should be done using bulk processing, though, in order to optimise the data transfer between the PL/SQL and SQL engines.
    Other performance considerations are the constraints on the insert table, the triggers, the indexes and so on. Make sure that data integrity is guaranteed (e.g. via PKs and FKs), and that the design is optimal (e.g. FK columns should be indexed). Using triggers may not be the best approach (for example, using a trigger to assign a sequence value when it can be done faster in the insert SQL itself). Personally, I avoid triggers - I would rather have that code residing in a PL/SQL API for manipulating data in that table.
    The type of table also plays a role. Make sure that the decision about the table structure, hashed, indexed, partitioned, etc, is the optimal one for the data structure that is to reside in that table.
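
    To make the two options above concrete, here are minimal sketches of each; the table names (txn_source, txn_target) and the 1000-row batch size are hypothetical, and txn_target is assumed to have the same column structure as txn_source:

    -- option 1: set-based, optionally direct path via the APPEND hint
    INSERT /*+ APPEND */ INTO txn_target
    SELECT * FROM txn_source;
    COMMIT;

    -- option 2: bulk processing in PL/SQL, for when the transformation
    -- is too complex to express in a single INSERT..SELECT
    DECLARE
      TYPE t_rows IS TABLE OF txn_source%ROWTYPE;
      l_rows t_rows;
      CURSOR c IS SELECT * FROM txn_source;
    BEGIN
      OPEN c;
      LOOP
        FETCH c BULK COLLECT INTO l_rows LIMIT 1000;
        EXIT WHEN l_rows.COUNT = 0;
        -- (apply the complex row-by-row logic to l_rows here)
        FORALL i IN 1 .. l_rows.COUNT
          INSERT INTO txn_target VALUES l_rows(i);
      END LOOP;
      CLOSE c;
      COMMIT;
    END;
    /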

  • Best Way to Stack 2d and 3d in a Composite

    Hey everyone, it seems I'm really on a 3d binge with Photoshop CS6. It's just a blast! Anyway, I'm looking to figure out the best way to stack 2d and 3d items together in the same document, and maximize the editibility...
    I'm working on a digital room setting, and I have part of the room in 3d and part in 2d. So, for example, the walls, ceiling, floor, table, chairs and corner hutch are 3d, but the regular hutch and buffet server are in 2d.
    - The table and chairs need to be in the foreground
    - The regular hutch and server are behind the table and chairs
    - The corner hutch needs to be behind all other pieces of furniture
    So, you may be catching on by now to my little dilemma... I want all the 3D items to have the same lighting and reflect off one another; however, I need the 2D layers to somehow fit "in between" the 3D objects. The simple question is: how can this be done? Is there a way to mask the foreground 3D objects so that they appear to be in front of the 2D objects? My only current solution is this:
    - Move the whole 3D layer (the scene layer) all the way to the back
    - Make the final decision on the position of the table and chairs
    - Hide all other meshes in the scene besides the table and chairs
    - Render, so the edges are clean
    - Load the newly rendered 3D layer as a selection (which loads the table and chairs only)
    - Either create a new layer from the selection of the rendered 3D layer, or mask the 3D layer so that the table and chairs appear to be in the foreground
    I didn't anticipate this post being this long, so I apologize... but as you can assume, the method above means I can no longer edit/reposition/scale the 3D at all, so if there's a better way to stack these room elements I would be extremely grateful to whoever knows how!
    Thanks!
    Andy

    First, I was wrong about layers needing to be hidden to prevent their appearance being included in a new Postcard. I misunderstood something that happened when working with a 3D document a while ago.
    Materials support transparency/opacity. A material has an opacity control that can use a texture. The material opacity percentage is multiplied by the opacity of the texture.
    When a Postcard is created, its default material's opacity control will contain an instance of the texture that's in the diffuse control, which is a PSB file containing the 2D layer (pixels, Shape, Smart Object or Group) from which the Postcard was created.
    Here is an example. Two solid Pyramid meshes and a Postcard of a cloud Shape layer:
    The Shape layer from which the Postcard was created is in the document that can be opened by picking Edit in the dropdown of the buttons highlighted above:

  • What's the best way for reading this binary file?

    I've written a program that acquires data from a DAQmx card and writes it on a binary file (attached file and picture). The data that I'm acquiring comes from 8 channels, at 2.5MS/s for, at least, 5 seconds. What's the best way of reading this binary file, knowing that:
    -I'll need it also on graphics (only after acquiring)
    -I also need to see these values and use them later in Matlab.
    I've tried the "Array to Spreadsheet String" function, but LabView runs out of memory (even if I don't use all of the 8 channels, but only 1).
    LabView 8.6
    Solved!
    Go to Solution.
    Attachments:
    AcquireWrite02.vi ‏15 KB
    myvi.jpg ‏55 KB

    But my real problem, at least now, is how can I divide the information to get not only one graphic but eight?
    I can read the file, but I get this (with only two channels):
    So what I tried was, using a for loop, saving 250 elements into different arrays and then writing them to the .txt file. But it doesn't come out right... I used 250 because that's what I got from the graphic: every 250 points it plots the other channel.
    Am I missing something here? How should I treat the information coming from the binary file, if not the way I'm doing?
    (attached are the .vi files I'm using to save in the .txt format)
    (EDITED. I just saw that I was dividing my graph's data in 4 just before plotting it... so It isn't 250 but 1000 elements for each channel... Still, the problem has not been solved)
    Attachments:
    mygraph.jpg ‏280 KB
    Read Binary File and Save as txt - 2 channels - with SetFilePosition.vi ‏14 KB
    Read Binary File and Save as txt - with SetFilePosition_b_save2files_with_array.vi ‏14 KB
