Question regarding bulk binding

Gurus,
How does Oracle decide whether a statement should use bulk binding or run as a normal query? Is it the keyword FORALL that makes the difference, or is there some other distinction?
Please help
Regards

It's not up to Oracle to decide; it's the developer who decides. Bulk binding is used to reduce context switches between the PL/SQL and SQL engines, and you achieve it with the bulk-bind constructs Oracle provides, such as FORALL and BULK COLLECT.
Read [Overview of Bulk Binds|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14251/adfns_packages.htm#sthref853] for more info.
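For illustration, a minimal sketch of both constructs; the staging table emp_stage and its columns are only hypothetical examples:
DECLARE
  TYPE t_empno IS TABLE OF emp_stage.empno%TYPE INDEX BY PLS_INTEGER;
  TYPE t_sal   IS TABLE OF emp_stage.sal%TYPE   INDEX BY PLS_INTEGER;
  v_empno t_empno;
  v_sal   t_sal;
BEGIN
  -- BULK COLLECT pulls all rows into PL/SQL collections in one round trip
  SELECT empno, sal
    BULK COLLECT INTO v_empno, v_sal
    FROM emp_stage;

  -- FORALL sends all updates to the SQL engine in a single context switch,
  -- instead of one switch per row as an ordinary FOR loop would
  FORALL i IN 1 .. v_empno.COUNT
    UPDATE emp_stage
       SET sal = v_sal(i) * 1.1
     WHERE empno = v_empno(i);
END;
/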
Thanks,
Karthick.

Similar Messages

  • Dynamic SQL and Bulk Bind... Interesting Problem !!!

    Hi Forum !!
    I've got a very interesting problem involving dynamic SQL and bulk bind. I really hope you guys have some suggestions for me...
    Table A contains a column named TX_FORMULA. There are many strings holding expressions like '.3 * 2 + 1.5' or '(3.4 + 2) / .3', all well formed numeric formulas. I want to calculate each formula, finding the number obtained as a result of each calculation.
    I wrote something like this:
    DECLARE
      TYPE T_FormulasNum IS TABLE OF A.TX_FORMULA%TYPE
        INDEX BY BINARY_INTEGER;
      TYPE T_MontoIndicador IS TABLE OF A.MT_NUMBER%TYPE
        INDEX BY BINARY_INTEGER;
      V_FormulasNum    T_FormulasNum;
      V_MontoIndicador T_MontoIndicador;
    BEGIN
      SELECT DISTINCT CD_INDICADOR,
             TX_FORMULA_NUMERICA
        BULK COLLECT INTO V_CodIndicador, V_FormulasNum
        FROM A;
      FORALL i IN V_FormulasNum.FIRST..V_FormulasNum.LAST
        EXECUTE IMMEDIATE
          'BEGIN
             :1 := TO_NUMBER(:2);
           END;'
          USING V_FormulasNum(i) RETURNING INTO V_MontoIndicador;
    END;
    But I'm getting the following messages:
    ORA-06550: line 22, column 43:
    PLS-00597: expression 'V_MONTOINDICADOR' in the INTO list is of wrong type
    ORA-06550: line 18, column 5:
    PL/SQL: Statement ignored
    ORA-06550: line 18, column 5:
    PLS-00435: DML statement without BULK In-BIND cannot be used inside FORALL
    Any idea how to solve this problem?
    Thanks in Advance !!

    Hello,
    many, many errors...
    1. You can use FORALL only with DML statements; in your case you must use a simple FOR loop.
    2. You can bind only values with bind variables, not parts of the statement text; the formula expression itself has to be concatenated in as a literal (which means hard parsing for each distinct formula).
    3. The RETURNING INTO clause is not appropriate here; use an OUT bind variable instead.
    4. Remark: FOR i IN tab.FIRST..tab.LAST is not entirely safe: if there are no results the bounds are NULL and you get an exception. Use 1..tab.COUNT instead.
    This code works.
    DECLARE
      TYPE T_FormulasNum IS TABLE OF VARCHAR2(255)
        INDEX BY BINARY_INTEGER;
      TYPE T_MontoIndicador IS TABLE OF NUMBER
        INDEX BY BINARY_INTEGER;
      V_FormulasNum    T_FormulasNum;
      V_MontoIndicador T_MontoIndicador;
    BEGIN
      SELECT DISTINCT CD_INDICATOR,
             TX_FORMULA_NUMERICA
        BULK COLLECT INTO V_MontoIndicador, V_FormulasNum
        FROM A;
      FOR i IN 1..V_FormulasNum.COUNT
      LOOP
        EXECUTE IMMEDIATE
          'BEGIN
             :v_motto := TO_NUMBER(' || v_formulasnum(i) || ');
           END;'
          USING OUT V_MontoIndicador(i);
        dbms_output.put_line(v_montoindicador(i));
      END LOOP;
    END;
    You have to read more about bulk binding and dynamic SQL.
    HTH
    Regards
    Dmytro
    Test table
    a
    (cd_indicator number,
    tx_formula_numerica VARCHAR2(255))
    CD_INDICATOR TX_FORMULA_NUMERICA
    2 (5+5)*2
    1 2*3*4
    Message was edited by:
    Dmytro Dekhtyaryuk
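    As an alternative, a minimal sketch that evaluates each formula string by selecting the expression from DUAL, using the test table above (a(cd_indicator, tx_formula_numerica)):
    DECLARE
      v_result NUMBER;
    BEGIN
      FOR r IN (SELECT tx_formula_numerica FROM a) LOOP
        -- evaluate the formula text by embedding it in a query against DUAL
        EXECUTE IMMEDIATE
          'SELECT ' || r.tx_formula_numerica || ' FROM dual'
          INTO v_result;
        dbms_output.put_line(r.tx_formula_numerica || ' = ' || v_result);
      END LOOP;
    END;
    /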

  • Bulk Binds-How to avoid naming all columns?

    When using bulk binds for inserts/updates, one has to explicitly name all the columns in the declaration. This is maintenance intensive, as the addition of any column requires changes to the code in multiple locations.
    Is there any way to reference the columns through other means (such as records)?
    thanks
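    A minimal sketch of one common workaround, assuming hypothetical tables t (target) and t_source (a source with the same columns): bulk-collect %ROWTYPE records and let FORALL insert whole records, so no column list appears in the PL/SQL code:
    DECLARE
      TYPE t_rows IS TABLE OF t%ROWTYPE INDEX BY PLS_INTEGER;
      v_rows t_rows;
    BEGIN
      -- the collection picks up new columns automatically via %ROWTYPE
      SELECT * BULK COLLECT INTO v_rows FROM t_source;

      FORALL i IN 1 .. v_rows.COUNT
        INSERT INTO t VALUES v_rows(i);   -- whole-record insert, no column list
    END;
    /
    For updates, the corresponding whole-record form is UPDATE t SET ROW = v_rows(i) WHERE ..., keeping in mind that FORALL cannot reference individual fields of a record element.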

    Zia wrote:
    1) I have a detail report which contains more than 100 thousand rows. This report is already on the menu and available to users. Users have an option on the menu to select the output as either 'screen' or 'spreadsheet'. If the user selects 'spreadsheet', the report opens in an Excel file, but in the old Excel format, and the excess rows are truncated because .xls has a limit of roughly 65 thousand rows. I do not want to use the 'csv' format, as direct conversion into Excel is more convenient for users. If the report contains fewer than 65 thousand rows then everything is fine and there are no issues with the report. I hope I have explained the real problem this time.
    65 thousand rows is the limitation of the Excel version you use. I use Microsoft Office 2010, where the limit is 1,048,576 rows per sheet.
    One solution is to uninstall the present Office version and use Office 2010.
    2) Regarding the second point, I have a summary report which is fine and shows the desired output in the desired format. But when users want the output in an Excel file, some report columns occupy more than one Excel column, which causes difficulties for users. Users do need to convert a few reports into Excel and work with formulas there, but in this situation they have to make a lot of formatting changes before applying formulas. A sample screenshot of a report and of the converted report in an Excel file can be seen at this link [https://skydrive.live.com/?cid=573511bde4261fe6#cid=573511BDE4261FE6&id=573511BDE4261FE6%21120]
    I'm not sure about the solution. Most probably you have extra space in the repeating frame, which causes this. Shorten the space in the frame and try again.
    Hope this will help you.

  • Bulk binding issues in 9i/8i

    Hi all,
    I would like to know if there is any change in the bulk-binding feature of PL/SQL in Oracle 9i.
    Please guide me on the following issues:
    1) We have used it since it was introduced in 8i, and we are now migrating to 9i. I would be happy to exploit any new feature, if there is one.
    2) When I am inserting or updating with bulk binding, the collection I use must be dense (not sparse). Is there any way to overcome this limitation?
    3) Suppose there is a table of 1 million rows. I have to transfer the data from this table to another using bulk binding. What I want to do is transfer the rows in batches of 10,000 rows, as collecting all 1 million records at once in a collection does not look like a good idea.
    Please advise.
    Regards.

    There are almost always new features with each successive release. There are a few sections in the PL/SQL User's Guide http://download-west.oracle.com/docs/cd/B10501_01/appdev.920/a96624/05_colls.htm#23723 that discuss bulk operations in 9.2; you probably want to take a look at that. There is also a book from Oracle Press called "Oracle 9i New Features" which is excellent.
    The LIMIT clause may have been introduced in 9i, i.e.
    FETCH <<some cursor>>
      BULK COLLECT INTO <<some collection>>
      LIMIT 1000;
    There is a section in the PL/SQL User's Guide pertaining to the LIMIT clause http://download-west.oracle.com/docs/cd/B10501_01/appdev.920/a96624/05_colls.htm#29027
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC
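    For point 3 of the original question, a minimal sketch of batch-wise transfer combining BULK COLLECT ... LIMIT with FORALL (source_tab and target_tab are hypothetical names):
    DECLARE
      CURSOR c IS SELECT * FROM source_tab;
      TYPE t_rows IS TABLE OF source_tab%ROWTYPE INDEX BY PLS_INTEGER;
      v_rows t_rows;
    BEGIN
      OPEN c;
      LOOP
        -- fetch at most 10000 rows per round trip
        FETCH c BULK COLLECT INTO v_rows LIMIT 10000;
        EXIT WHEN v_rows.COUNT = 0;

        -- insert the whole batch in one context switch
        FORALL i IN 1 .. v_rows.COUNT
          INSERT INTO target_tab VALUES v_rows(i);

        COMMIT;   -- optional: commit per batch
      END LOOP;
      CLOSE c;
    END;
    /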

  • LOOP inside FORALL in bulk binding

    Can I use a loop inside FORALL in bulk-bind updates?
    As I understand it, a FORALL statement applies only to the single DML statement that immediately follows it, looping over the collection up to the bulk limit size.
    I am attempting to use a loop there to update more than one table.
    cursor c is select id from temp where dt_date > sysdate-30;
    BEGIN
    loop
    fetch c into v_id;
    limit 1000;
    forall i in 1..v_id.count
    UPDATE table_one set new_id = v_id(i);
    exit when C%NOTFOUND;
    end loop;
    end;
    I want to update another table table_two also immediately after updating table_one like this:
    forall i in 1..v_id.count
    UPDATE table_one set new_id = v_id(i);
    BEGIN select nvl(code,'N/A') into v_code from T_CODES where ID_NO = v_id(i); EXCEPTION WHEN NO_DATA_FOUND v_code='N/A'; END;
    UPDATE table_two set new_code =v_code;
    exit when C% not found.
    This is not working, and when I run it I get an error saying "encountered BEGIN when one of the following is expected".
    I got around this by having a separate FOR loop just to set up the values in another array variable, and then using that array in a second FORALL loop to update table_two.
    Is there any way to do these multiple-table updates with bulk binding under one FORALL loop, in a way that allows some derivation/calculation among variables if needed [not array variables, regular datatype variables]?
    Can we have like
    forall j in 1.. v_id.count
    LOOP
    update table 1;
    derive values for updating table 2;
    update table 2;
    END LOOP;
    Thank You.

    Well, without questioning the reasons why you want this, you are confusing bulk select and FORALL. You need:
    begin
        loop
          fetch c bulk collect into v_id limit 1000;
          exit when v_id.count = 0;
          forall i in 1..v_id.count
            UPDATE table_one set new_id = v_id(i);
        end loop;
    end;
    /
    SY.
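    And for the multi-table case, a minimal sketch using the names from the original post: derive the codes in an ordinary FOR loop, then issue one FORALL per DML statement:
    DECLARE
      CURSOR c IS SELECT id FROM temp WHERE dt_date > SYSDATE - 30;
      TYPE t_ids      IS TABLE OF temp.id%TYPE INDEX BY PLS_INTEGER;
      TYPE t_codelist IS TABLE OF VARCHAR2(30) INDEX BY PLS_INTEGER;
      v_id   t_ids;
      v_code t_codelist;
    BEGIN
      OPEN c;
      LOOP
        FETCH c BULK COLLECT INTO v_id LIMIT 1000;
        EXIT WHEN v_id.COUNT = 0;

        -- derive the code for each row in an ordinary loop (lookups allowed here)
        FOR i IN 1 .. v_id.COUNT LOOP
          BEGIN
            SELECT NVL(code, 'N/A') INTO v_code(i)
              FROM t_codes
             WHERE id_no = v_id(i);
          EXCEPTION
            WHEN NO_DATA_FOUND THEN v_code(i) := 'N/A';
          END;
        END LOOP;

        -- one FORALL per DML statement
        FORALL i IN 1 .. v_id.COUNT
          UPDATE table_one SET new_id = v_id(i);
        FORALL i IN 1 .. v_id.COUNT
          UPDATE table_two SET new_code = v_code(i);
      END LOOP;
      CLOSE c;
    END;
    /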

  • Jdbc thin driver bulk binding slow insertion performance problem

    Hello All,
    We have a third party application reporting slow insertion performance. I traced the session and found that most of the elapsed time for one insert execution is spent on "SQL*Net more data from client"; bulk binding appears to be in use because one execution inserts 200 rows. I am wondering whether this has something to do with their JDBC thin driver (version 10.1.0.2) and our database version 9.2.0.5. Do you have any similar experience with this? What other possible directions should I explore?
    Here is the trace report from a 10046 event; I have hidden the table name for privacy reasons.
    Besides, I tested bulk binding in PL/SQL to insert 200 rows in one execution, with no problem at all. The network folks confirm that the network should not be an issue either: ping time from the app server to the db server is sub-millisecond and they are in the same data center.
    INSERT INTO ...
    values
    (:1, :2, :3, :4, :5, :6, :7, :8, :9, :10, :11, :12, :13, :14, :15, :16, :17,
    :18, :19, :20, :21, :22, :23, :24, :25, :26, :27, :28, :29, :30, :31, :32,
    :33, :34, :35, :36, :37, :38, :39, :40, :41, :42, :43, :44, :45)
    call     count    cpu  elapsed  disk  query  current  rows
    Parse        1   0.00     0.00     0      0        0     0
    Execute      1   0.02    14.29     1     94     2565   200
    Fetch        0   0.00     0.00     0      0        0     0
    total        2   0.02    14.29     1     94     2565   200
    Misses in library cache during parse: 1
    Optimizer goal: CHOOSE
    Parsing user id: 25
    Elapsed times include waiting on following events:
    Event waited on                          Times Waited  Max. Wait  Total Waited
    ---------------------------------------  ------------  ---------  ------------
    SQL*Net more data from client                      28       6.38         14.19
    db file sequential read                             1       0.02          0.02
    SQL*Net message to client                           1       0.00          0.00
    SQL*Net message from client                         1       0.00          0.00
    ********************************************************************************

    I have exactly the same problem. I tried to find out what is going on and changed several JDBC drivers on AIX, but with no luck. I also ran the process on my laptop, which gave better and faster performance.
    Therefore I made a special (not practical) workaround by creating flat files and defining the data as an external table; Oracle reads the data in those files as if it were data inside a table, which gave me very fast insertion into the database. But I am still looking for an answer to your question here. Using Oracle on an AIX machine is a normal business setup used by a lot of companies, and there must be a solution for this.

  • Question Regarding MIDI and Sample Accuracy

    Hi,
    I have 2 questions regarding MIDI.
    1. MIDI is moved by ticks. In the arrange window however, you can move a region by samples. When doing this, you can move within values of the ticks (which you can see on your position box that pops up) Now, will this MIDI note actually be played back at that specific sample point, or will it round the event to the closest tick? (example, if I have a MIDI note directly on 1.1.1.1, and I move the REGION in the arrange... will that MIDI note now fall on the sample that I have moved the region to, or will it be rounded to the closest tick?)
    2. When making a midi template from an audio region, will the MIDI information land exactly on the sample of the transient, or will it be rounded to the closest tick?
    I've looked through the manual, and couldn't find any specific answer to these questions.
    Thanks!
    Message was edited by: Matthew Usnick

    Ok, I've done some experimenting, and here are my results.
    I believe those numbers ARE samples. I came to this conclusion by counting (for some reason it starts on 11) and cutting a region to be 33 samples long (so, minus 11, is 22 actual samples). I then went to the Audio Bin window, and chose to view region length as samples. And there it said it: 22 samples. So, you can in fact move MIDI regions by samples!
    Second, I wanted to see if the MIDI notes in the region itself would be quantized to the nearest tick. I cut a piece of audio so it had a 1-sample attack (zoomed in as far as I could in the sample editor, selected the smallest portion, faded in, and made the start point the region start position). I saved the region as a new audio file, and loaded it up in the EXS sampler.
    I then made a MIDI region and triggered the sample on beat 1 (quantized, on the money). I then went into the arrange window, made a fixed cycle length, and bounced the audio. I then moved the MIDI region by one sample to the right. I did this 22 times (which is the number of samples in a tick, at 120, apparently). After bouncing all of these (the cycle position remained fixed, only the MIDI region was moving) I imported all the audio into the arrange on new tracks, and YES!!! The sample start was cascaded by a sample each time!
    SO.
    Not only can you move MIDI regions by sample, but the positions are NOT quantized to Logic's ticks!
    This is very good news, and glad I worked this out!
    (if anyone thinks this sounds wrong, please correct me, but I'm pretty sure I proved it, in my test)
    Message was edited by: Matthew Usnick

  • Question regarding homehub and Open reach router -...

    Hi all,
      I had infinity installed earlier this month and am happy with it so far. I do have a few questions regarding the service and hardware though.
      I run both my BT openreach router and BT Home hub from the same power socket. The problem is, if I turn the plug on so both the Homehub and Openreach Router start up at the same time, the home hub will never get an Internet connection from the router. To solve this I have to turn the BT home hub on first and leave it for a minute, then start the router up and it all works fine. I'm just curious if this is the norm or do I have some faulty hardware?
    Secondly, I appreciate that the estimated speed BT quotes isn't always accurate; I was quoted 49 Mbit/s down but received 38 Mbit/s down, which I was happy with. Recently though it has dropped to 30, and I am worried this might continue to drop over time; at present I am 20 Mbit/s below the estimate. For the record, 30 Mbit/s is actually fine and probably more than I would ever need, but if I could boost it somehow I would be interested to hear from you.
    Thanks, .

    Just a clarification: the two boxes are the HomeHub (router, black) and the modem (white).  The HomeHub has its own power switch, the modem doesn't.
    There is something wrong if the HomeHub needs to be turned on before the modem.  As others have said, in general best to leave the modem on all the time.  You should be able to connect them up in any order, or together.  (For example, I recently tripped the mains cutout, and when I restored power the modem and HomeHub went on together and everything was ok).
    Check if the router can connect/disconnect from the broadband using the web interface.  Leaving the modem and HomeHub on all the time, go to http://192.168.1.254/ on a browser on a connected computer, and see whether the Connect/Disconnect button works.

  • Question regarding IWDTree and context Value Node naming

    Hi,
    I have a question regarding the IWDTree / IWDTreeNodeType components.
    I have a context looking like this:
    Context
      + ResponseNode
        + PersonNode (1..1)
          + PersonAddressNode                    (empty node, placeholder)
          | + AdresNode (0..n)
          + PersonChildNode                      (empty node, placeholder)
          | + PersonNode (0..n)
          |   + PersonAddressNode                (empty node, placeholder)
          |     + AddressNode (0..n)
          + PersonParentsNode                    (empty node, placeholder)
            + PersonNode (0..n)
              + PersonAddressNode                (empty node, placeholder)
                + AddressNode (0..n)
    The context represents a person, a person's address, and a person's children and parents with their respective addresses.
    As a result, a PersonNode and an AddressNode can appear on different branches.
    And for some strange reason, all PersonNodes and AddressNodes link to the same ResponseNode.PersonNode.PersonParentsNode.PersonNode and ResponseNode.PersonNode.PersonParentsNode.PersonNode.PersonAddressNode.AddressNode respectively, regardless of their branch...
    Is it illegal to have multiple PersonNode and AddressNode node names, and should they be named uniquely?

    Generally, node names need to be unique inside the context; attributes in different nodes can have the same names. I wonder if the context structure you described will even compile without errors.
    The WD Tree can only be used with recursive context nodes or with a hierarchy of non-singleton child nodes.
    Can you give an example how your tree should look like at runtime?

  • Question regarding roaming and data usage

    I am currently out of my main country of service, and as such I have a question regarding roaming and data usage.
    I am told that airplane mode is sufficient to keep the phone from roaming, but does this apply to any background data usage for applications and such?
    If the phone is in airplane mode, is all use of the phone, including wifi and application use over wifi, free of any extra roaming charges?

    Ann154 wrote:
    If you are getting charged to use the wifi, then it is possible.  Otherwise no
    Just to elaborate here, Ann154 is referring to access charges for wifi, which have nothing to do with Verizon. So if you are using wifi in a plane, a hotel, an internet cafe, etc. that charges for wifi rather than offering it for free, those charges apply. Verizon does not charge you for (or indeed know about!) wifi usage, or any other usage that is not on their cellular network (such as using a foreign SIM in a global phone), so these charges, if any, will not show up in the Verizon bill app. Having the phone in airplane mode prevents all cellular data traffic, so you should be fine.

  • Question regarding MM and FI integration

    Hi Experts
    I have a question regarding MM and FI integration
    Is the transaction key in OMJJ the same as the OBYC transaction key?
    If yes, then why can't I see transaction key BSX for movement type 101?
    Thanks

    No, they are not the same.  The movement type transaction (OMJJ) links the account key and account modifier to a specific movement types.  Transaction code (OBYC) contains the account assignments for all material document postings, whether they are movement type dependent or not.  Account key BSX is not movement type dependent.  Instead, BSX is dependent on the valuation class of the material, so it won't show in OMJJ.
    thanks,

  • Question regarding 3G and wifi

    I have a question regarding 3G and wifi. I have 3G activated as well as wifi. When I go to retrieve mail, for example, I get a pop-up asking me if I want to connect to a wifi network. Should I have wifi and 3G activated at the same time, and why am I getting the pop-up?
    Thanks

    You can have them on at the same time, but they will not be used at the same time for data. The order of preference for data is WiFi > 3G > EDGE > GPRS. You're getting the pop up, most likely, because you have Settings > Wi-Fi > Ask to Join Networks set to ON. You can set that to OFF, and the iPhone will still join known (i.e. previously used) WiFi networks automatically.

  • Question regarding Dashboard and column prompt

    My question regarding Dashboard and column prompt:
    1) A dashboard prompt usually works only for columns which are in the subject area. In my report I've created some columns which are based on other columns; for example, I have a daysNumber column that is based on two other columns, as it calculates the difference of two dates. When I create a dashboard prompt I can't find this column there, but I need to make a prompt on this column.
    2) One of the columns has only two values, 1 and 0. When I create a prompt for this column, is it possible that the drop-down list shows 'Yes' for 1 and 'No' for 0 and still filters the request?

    Hi Toony,
    I think there is another way of doing this.
    In the dashboard prompt go to the Show option and select SQL Results from the dropdown.
    There you need to write your logical SQL, like:
    SELECT CASE WHEN 1=0 THEN PERIODS.YEAR ELSE <difference-of-dates expression> END FROM SubjectAreaName
    Here PERIODS.YEAR is a column which already exists in the repository's presentation layer, and the difference-of-dates expression is the formula of the column which you want to show in the drop-down.
    Also write the same CASE WHEN 1=0 THEN PERIODS.YEAR ELSE <difference-of-dates expression> END code in the fx of that prompt.
    I think this helps you do it.
    Just check and let me know if it works.
    Thanks & Regards
    Kishore Guggilla
    Edited by: Kishore Guggilla on Oct 31, 2008 9:35 AM
