Performance issue loading 4000 records from XML

Hello, I'm trying to load records into a table, using the SQL statements below, from an XML file with content of this type:
<?xml version="1.0" encoding="UTF-8"?>
<custom-objects xmlns="http://www.mysite.com/xml/impex/customobject/2006-10-31">
    <custom-object type-id="NEWSLETTER_SUBSCRIBER" object-id="[email protected]">
  <object-attribute attribute-id="customer-no"><value>BLY00000001</value></object-attribute>
  <object-attribute attribute-id="customer_type"><value>registered</value></object-attribute>
        <object-attribute attribute-id="title"><value>Mr.</value></object-attribute>
        <object-attribute attribute-id="first_name"><value>Jean paul</value></object-attribute>
        <object-attribute attribute-id="is_subscribed"><value>true</value></object-attribute>
        <object-attribute attribute-id="last_name"><value>Pennati Swiss</value></object-attribute>
        <object-attribute attribute-id="address_line_1"><value>newsletter ADDRESS LINE 1 data</value></object-attribute>
        <object-attribute attribute-id="address_line_2"><value>newsletter ADDRESS LINE 2 data</value></object-attribute>
        <object-attribute attribute-id="address_line_3"><value>newsletter ADDRESS LINE 3 data</value></object-attribute>
        <object-attribute attribute-id="housenumber"><value>newsletter HOUSENUMBER data</value></object-attribute>
        <object-attribute attribute-id="city"><value>newsletter DD</value></object-attribute>
        <object-attribute attribute-id="post_code"><value>6987</value></object-attribute>
        <object-attribute attribute-id="state"><value>ASD</value></object-attribute>
        <object-attribute attribute-id="country"><value>ES</value></object-attribute>
        <object-attribute attribute-id="phone_home"><value>0044 1234567 newsletter phone_home</value></object-attribute>
        <object-attribute attribute-id="preferred_locale"><value>fr_CH</value></object-attribute>
        <object-attribute attribute-id="exported"><value>true</value></object-attribute>
        <object-attribute attribute-id="profiling"><value>true</value></object-attribute>
        <object-attribute attribute-id="promotions"><value>true</value></object-attribute>
        <object-attribute attribute-id="source"><value>https://www.mysite.com</value></object-attribute>
        <object-attribute attribute-id="source_ip"><value>10.10.1.1</value></object-attribute>
        <object-attribute attribute-id="pr_product_serial_number"><value>000123345678 product serial no.</value></object-attribute>
        <object-attribute attribute-id="pr_purchased_from"><value>Store where product to be registered was purchased</value></object-attribute>
        <object-attribute attribute-id="pr_date_of_purchase"><value></value></object-attribute>
        <object-attribute attribute-id="locale"><value>fr_CH</value></object-attribute> 
    </custom-object>
    <custom-object type-id="NEWSLETTER_SUBSCRIBER" object-id="[email protected]">
   <object-attribute attribute-id="customer-no"><value></value></object-attribute>
   <object-attribute attribute-id="customer_type"><value>unregistered</value></object-attribute>
        <object-attribute attribute-id="title"><value>Mr.</value></object-attribute>
        <object-attribute attribute-id="first_name"><value>Jean paul</value></object-attribute>
        <object-attribute attribute-id="is_subscribed"><value>true</value></object-attribute>
        <object-attribute attribute-id="last_name"><value>Pennati Swiss</value></object-attribute>
        <object-attribute attribute-id="address_line_1"><value>newsletter ADDRESS LINE 1 data</value></object-attribute>
        <object-attribute attribute-id="address_line_2"><value>newsletter ADDRESS LINE 2 data</value></object-attribute>
        <object-attribute attribute-id="address_line_3"><value>newsletter ADDRESS LINE 3 data</value></object-attribute>
        <object-attribute attribute-id="housenumber"><value>newsletter HOUSENUMBER data</value></object-attribute>
        <object-attribute attribute-id="city"><value>newsletter CASLANO</value></object-attribute>
        <object-attribute attribute-id="post_code"><value>6987</value></object-attribute>
        <object-attribute attribute-id="state"><value>TICINO</value></object-attribute>
        <object-attribute attribute-id="country"><value>CH</value></object-attribute>
        <object-attribute attribute-id="phone_home"><value>0044 1234567 newsletter phone_home</value></object-attribute>
        <object-attribute attribute-id="preferred_locale"><value>fr_CH</value></object-attribute>
        <object-attribute attribute-id="exported"><value>true</value></object-attribute>
        <object-attribute attribute-id="profiling"><value>true</value></object-attribute>
        <object-attribute attribute-id="promotions"><value>true</value></object-attribute>
        <object-attribute attribute-id="source"><value>https://www.mysite.com</value></object-attribute>
        <object-attribute attribute-id="source_ip"><value>85.219.17.170</value></object-attribute>
        <object-attribute attribute-id="pr_product_serial_number"><value>000123345678 product serial no.</value></object-attribute>
        <object-attribute attribute-id="pr_purchased_from"><value>Store where product to be registered was purchased</value></object-attribute>
        <object-attribute attribute-id="pr_date_of_purchase"><value></value></object-attribute>
        <object-attribute attribute-id="locale"><value>fr_CH</value></object-attribute> 
    </custom-object>
</custom-objects>
I use the following sequence of queries to do the inserts (XML_FILE is passed to the procedure as an XMLType):
INSERT INTO DW_CUSTOMER.NEWSLETTERS (
   BRANDID,
   CUSTOMER_EMAIL,
   DW_WEBSITE_TAG)
Select
p_brandid as BRANDID,
CUSTOMER_EMAIL,
p_website
FROM
(select XML_FILE from dual) p,
XMLTable(
xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31'),
'/custom-objects/custom-object' PASSING p.XML_FILE
COLUMNS
customer_email PATH '@object-id'
) CUSTOMER_LEVEL1;
INSERT INTO DW_CUSTOMER.NEWSLETTERS_C_ATT (
   BRANDID, 
   CUSTOMER_EMAIL,
   CUSTOMER_NO, 
   CUSTOMER_TYPE,
   TITLE,
   FIRST_NAME,
   LAST_NAME,
   PHONE_HOME,
   BIRTHDAY,
   ADDRESS1,
   ADDRESS2,
   ADDRESS3,
   HOUSENUMBER,
   CITY,
   POSTAL_CODE,
   STATE,
   COUNTRY,
   IS_SUBSCRIBED,
   PREFERRED_LOCALE,
   PROFILING,
   PROMOTIONS,
   EXPORTED,
   SOURCE,
   SOURCE_IP,
   PR_PRODUCT_SERIAL_NO,
   PR_PURCHASED_FROM,
   PR_PURCHASE_DATE,
   LOCALE,
   DW_WEBSITE_TAG)
    with mainq as (
        SELECT
        CUST_LEVEL1.customer_email as CUSTOMER_EMAIL,
        CUST_LEVEL2.*
        FROM
        (select XML_FILE from dual) p,
        XMLTable(
        xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31'),
        '/custom-objects/custom-object' PASSING p.XML_FILE
        COLUMNS
        customer_email PATH '@object-id',
        NEWSLETTERS_C_ATT XMLType PATH 'object-attribute'
        ) CUST_LEVEL1,
        XMLTable(
        xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31'),
        '/object-attribute' PASSING CUST_LEVEL1.NEWSLETTERS_C_ATT
        COLUMNS
        attribute_id PATH '@attribute-id',
        thevalue PATH 'value'
        ) CUST_LEVEL2
    )
    select
    p_brandid
    ,customer_email
    ,nvl(max(decode(attribute_id,'customer-no',thevalue)),SET_NEWSL_CUST_ID) customer_no
    ,max(decode(attribute_id,'customer_type',thevalue)) customer_type
    ,max(decode(attribute_id,'title',thevalue)) title
    ,substr(max(decode(attribute_id,'first_name',thevalue)) ,1,64)first_name
    ,substr(max(decode(attribute_id,'last_name',thevalue)) ,1,64) last_name
    ,substr(max(decode(attribute_id,'phone_home',thevalue)) ,1,64) phone_home
    ,max(decode(attribute_id,'birthday',thevalue)) birthday
    ,substr(max(decode(attribute_id,'address_line_1',thevalue)) ,1,100) address_line_1
    ,substr(max(decode(attribute_id,'address_line_2',thevalue)) ,1,100) address_line_2
    ,substr(max(decode(attribute_id,'address_line_3',thevalue)) ,1,100) address_line_3
    ,substr(max(decode(attribute_id,'housenumber',thevalue)) ,1,64) housenumber
    ,substr(max(decode(attribute_id,'city',thevalue)) ,1,128) city
    ,substr(max(decode(attribute_id,'post_code',thevalue)) ,1,64) postal_code
    ,substr(max(decode(attribute_id,'state',thevalue)),1,256) state
    ,substr(max(decode(attribute_id,'country',thevalue)),1,32) country
    ,max(decode(attribute_id,'is_subscribed',thevalue)) is_subscribed
    ,max(decode(attribute_id,'preferred_locale',thevalue)) preferred_locale
    ,max(decode(attribute_id,'profiling',thevalue)) profiling
    ,max(decode(attribute_id,'promotions',thevalue)) promotions
    ,max(decode(attribute_id,'exported',thevalue)) exported   
    ,substr(max(decode(attribute_id,'source',thevalue)),1,256) source   
    ,max(decode(attribute_id,'source_ip',thevalue)) source_ip       
    ,substr(max(decode(attribute_id,'pr_product_serial_number',thevalue)),1,64) pr_product_serial_number
    ,substr(max(decode(attribute_id,'pr_purchased_from',thevalue)),1,64) pr_purchased_from   
    ,substr(max(decode(attribute_id,'pr_date_of_purchase',thevalue)),1,32) pr_date_of_purchase
    ,max(decode(attribute_id,'locale',thevalue)) locale
    ,p_website   
    from
    mainq
    group by customer_email, p_website;
I cannot manage to insert 4,000 records in less than 30 minutes!
Can you help or advise how to reduce this to a reasonable time?
Thanks

Simplified example on a few attributes :
-- First stage the document in the XMLType table: INSERT INTO tmp_xml VALUES ( xml_file );
INSERT ALL
  INTO newsletters (brandid, customer_email, dw_website_tag)
  VALUES (p_brandid, customer_email, p_website)
  INTO newsletters_c_att (brandid, customer_email, customer_no, customer_type, title, first_name, last_name)
  VALUES (p_brandid, customer_email, customer_no, customer_type, title, first_name, last_name)
SELECT o.*
FROM tmp_xml t
   , XMLTable(
       xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31')
     , '/custom-objects/custom-object'
       passing t.object_value
       columns customer_email varchar2(256) path '@object-id'
             , customer_no    varchar2(256) path 'object-attribute[@attribute-id="customer-no"]/value'
             , customer_type  varchar2(256) path 'object-attribute[@attribute-id="customer_type"]/value'
             , title          varchar2(256) path 'object-attribute[@attribute-id="title"]/value'
             , first_name     varchar2(64)  path 'object-attribute[@attribute-id="first_name"]/value'
             , last_name      varchar2(64)  path 'object-attribute[@attribute-id="last_name"]/value'
     ) o
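For what it's worth, the single-pass pattern behind this XMLTable (one row per custom-object, each attribute picked out by an attribute-id predicate) can be sketched outside the database. A minimal Python illustration on the sample document above; the `parse_subscribers` helper is invented for this sketch, not part of the original solution:

```python
import xml.etree.ElementTree as ET

NS = {"co": "http://www.mysite.com/xml/impex/customobject/2006-10-31"}

SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<custom-objects xmlns="http://www.mysite.com/xml/impex/customobject/2006-10-31">
  <custom-object type-id="NEWSLETTER_SUBSCRIBER" object-id="[email protected]">
    <object-attribute attribute-id="customer-no"><value>BLY00000001</value></object-attribute>
    <object-attribute attribute-id="city"><value>newsletter DD</value></object-attribute>
  </custom-object>
</custom-objects>"""

def parse_subscribers(xml_text):
    """One pass over the document: one dict per custom-object,
    attributes keyed by attribute-id (like the XMLTable predicate paths)."""
    root = ET.fromstring(xml_text)
    rows = []
    for obj in root.findall("co:custom-object", NS):
        row = {"customer_email": obj.get("object-id")}
        for attr in obj.findall("co:object-attribute", NS):
            value = attr.find("co:value", NS)
            row[attr.get("attribute-id")] = value.text if value is not None else None
        rows.append(row)
    return rows

rows = parse_subscribers(SAMPLE)
print(rows[0]["customer_email"], rows[0]["customer-no"])
```

The point is the shape, not the language: each record is visited once and its attributes are pivoted in that same visit, instead of re-joining the whole attribute set per record.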

Similar Messages

  • Performance problem with selecting records from BSEG and KONV

    Hi,
    I am having a performance problem while selecting records from the BSEG and KONV tables. As these two tables contain a large amount of data, the selects are taking a lot of time. Can anyone help me improve the performance? Thanks in advance.
    Regards,
    Prashant

    Hi,
    Some steps to improve performance:
    1. Avoid using SELECT...ENDSELECT... construct and use SELECT ... INTO TABLE.
    2. Use WHERE clause in your SELECT statement to restrict the volume of data retrieved.
    3. Design your Query to Use as much index fields as possible from left to right in your WHERE statement
    4. Use FOR ALL ENTRIES in your SELECT statement to retrieve the matching records at one shot.
    5. Avoid using nested SELECT statement SELECT within LOOPs.
    6. Avoid using INTO CORRESPONDING FIELDS OF TABLE. Instead use INTO TABLE.
    7. Avoid using SELECT * and Select only the required fields from the table.
    8. Avoid nested loops when working with large internal tables.
    9. Use ASSIGN instead of INTO in LOOPs for table types with large work areas.
    10. When in doubt call transaction SE30 and use the examples and check your code
    11. Whenever using READ TABLE use BINARY SEARCH addition to speed up the search. Be sure to sort the internal table before binary search. This is a general thumb rule but typically if you are sure that the data in internal table is less than 200 entries you need not do SORT and use BINARY SEARCH since this is an overhead in performance.
    12. Use "CHECK" instead of IF/ENDIF whenever possible.
    13. Use "CASE" instead of IF/ENDIF whenever possible.
    14. Use "MOVE" with individual variable/field moves instead of "MOVE-CORRESPONDING"; it creates more coding but is more efficient.
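Tip 11 (sort first, then BINARY SEARCH) applies in any language, not just ABAP. A rough Python analogue of READ TABLE ... BINARY SEARCH using the standard bisect module; the record values are made up for illustration:

```python
import bisect

# Unsorted internal table of (key, payload) records.
records = [("C003", "gamma"), ("A001", "alpha"), ("B002", "beta")]

# SORT before BINARY SEARCH (tip 11): bisect requires sorted input.
records.sort(key=lambda r: r[0])
keys = [r[0] for r in records]

def read_table_binary(key):
    """READ TABLE ... BINARY SEARCH analogue: O(log n) lookup."""
    i = bisect.bisect_left(keys, key)
    if i < len(keys) and keys[i] == key:
        return records[i]
    return None

print(read_table_binary("B002"))
```

As the tip notes, the sort is an overhead of its own, so for very small tables a linear read can be cheaper than sort-plus-binary-search.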

  • Sqlldr is loading only 1st record from xml document

    Hi,
    I am trying to load XML doc with multiple records using sql*loader.
    I have registered my XSD perfectly.
    This is my control file
    LOAD DATA
    INFILE *
    INTO TABLE Orders APPEND
    XMLType(xmldata)
    FIELDS(
         xmldata LOBFILE (CONSTANT FULDTL_2.xml)
    TERMINATED BY '???')
    BEGINDATA
    FULDTL_2.xml
    -- Here, what I have to give for TERMINATED BY '???'
    My xml doc
    <Order ID="146120486" Status="CL" Comments="Shipped On 08/05/2008"/>
    <Order ID="143417590" Status="CL" Comments="Handset/Device has been received at NRC" ShipDate=""/>
    sqlldr is loading only 1st record from the file.
    How can I make it to load all the records from my xml doc.
    Thanks in advance.

    thanks for both the replies above - essentially the same correct solution.
    something worth noting now that I've written and tested both a SAX solution and a DOM solution is that there is a significant (4 x) time penalty using SAX.
    I am considering dividing the vector I am storing/recovering into chunks and saving each chunk separately using DOM to speed things up...
    any thoughts on this approach?
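Whether SAX really costs more than DOM depends mostly on per-callback overhead versus the cost of building the whole tree in memory; SAX usually wins on memory for large inputs. A tiny Python sketch of the SAX style, counting records in a stream without materializing a DOM (the document and handler here are invented for illustration, not taken from the thread):

```python
import xml.sax

DOC = "<orders><order id='1'/><order id='2'/><order id='3'/></orders>"

class OrderCounter(xml.sax.ContentHandler):
    """SAX streams the document: constant memory, one callback per element."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def startElement(self, name, attrs):
        if name == "order":
            self.count += 1

handler = OrderCounter()
xml.sax.parseString(DOC.encode(), handler)
print(handler.count)  # 3
```

Chunking the data and parsing each chunk with DOM, as suggested above, is a reasonable middle ground when the per-event overhead of SAX dominates.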

  • Performance issues while query data from a table having large records

    Hi all,
    I have performance issues with queries on the mtl_transaction_accounts table, which has around 48,000,000 rows. One of the queries is as below:
    SQL ID: 98pqcjwuhf0y6 Plan Hash: 3227911261
    SELECT SUM (B.BASE_TRANSACTION_VALUE)
    FROM
    MTL_TRANSACTION_ACCOUNTS B , MTL_PARAMETERS A  
    WHERE A.ORGANIZATION_ID =    B.ORGANIZATION_ID 
    AND A.ORGANIZATION_ID =  :b1 
    AND B.REFERENCE_ACCOUNT =    A.MATERIAL_ACCOUNT 
    AND B.TRANSACTION_DATE <=  LAST_DAY (TO_DATE (:b2 ,   'MON-YY' )  )  
    AND B.ACCOUNTING_LINE_TYPE !=  15  
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      3      0.02       0.05          0          0          0           0
    Fetch        3    134.74     722.82     847951    1003824          0           2
    total        7    134.76     722.87     847951    1003824          0           2
    Misses in library cache during parse: 1
    Misses in library cache during execute: 2
    Optimizer mode: ALL_ROWS
    Parsing user id: 193  (APPS)
    Number of plan statistics captured: 1
    Rows (1st) Rows (avg) Rows (max)  Row Source Operation
             1          1          1  SORT AGGREGATE (cr=469496 pr=397503 pw=0 time=237575841 us)
        788242     788242     788242   NESTED LOOPS  (cr=469496 pr=397503 pw=0 time=337519154 us cost=644 size=5920 card=160)
             1          1          1    TABLE ACCESS BY INDEX ROWID MTL_PARAMETERS (cr=2 pr=0 pw=0 time=59 us cost=1 size=10 card=1)
             1          1          1     INDEX UNIQUE SCAN MTL_PARAMETERS_U1 (cr=1 pr=0 pw=0 time=40 us cost=0 size=0 card=1)(object id 181399)
        788242     788242     788242    TABLE ACCESS BY INDEX ROWID MTL_TRANSACTION_ACCOUNTS (cr=469494 pr=397503 pw=0 time=336447304 us cost=643 size=4320 card=160)
       8704356    8704356    8704356     INDEX RANGE SCAN MTL_TRANSACTION_ACCOUNTS_N3 (cr=28826 pr=28826 pw=0 time=27109752 us cost=28 size=0 card=7316)(object id 181802)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
          1   SORT (AGGREGATE)
    788242    NESTED LOOPS
          1     TABLE ACCESS   MODE: ANALYZED (BY INDEX ROWID) OF
                    'MTL_PARAMETERS' (TABLE)
          1      INDEX   MODE: ANALYZED (UNIQUE SCAN) OF
                     'MTL_PARAMETERS_U1' (INDEX (UNIQUE))
    788242     TABLE ACCESS   MODE: ANALYZED (BY INDEX ROWID) OF
                    'MTL_TRANSACTION_ACCOUNTS' (TABLE)
    8704356      INDEX   MODE: ANALYZED (RANGE SCAN) OF
                     'MTL_TRANSACTION_ACCOUNTS_N3' (INDEX)
    Elapsed times include waiting on following events:
      Event waited on                             Times   Max. Wait  Total Waited
      ----------------------------------------   Waited  ----------  ------------
      row cache lock                                 29        0.00          0.02
      SQL*Net message to client                       2        0.00          0.00
      db file sequential read                    847951        0.40        581.90
      latch: object queue header operation            3        0.00          0.00
      latch: gc element                              14        0.00          0.00
      gc cr grant 2-way                               3        0.00          0.00
      latch: gcs resource hash                        1        0.00          0.00
      SQL*Net message from client                     2        0.00          0.00
      gc current block 3-way                          1        0.00          0.00
    ********************************************************************************
    On a 5-node RAC environment the program completes in 15 hours, whereas on a single-node environment it completes in 2 hours.
    Is there any way I can improve the performance of this query?
    Regards
    Edited by: mhosur on Dec 10, 2012 2:41 AM
    Edited by: mhosur on Dec 10, 2012 2:59 AM
    Edited by: mhosur on Dec 11, 2012 10:32 PM

    CREATE INDEX mtl_transaction_accounts_n0
      ON mtl_transaction_accounts (
                                   transaction_date
                                 , organization_id
                                 , reference_account
                                 , accounting_line_type
                                  )
    /

  • Performance issues with FDK in large XML documents

    In my current project with FrameMaker 8 I'm experiencing severe performance issues with some FDK API calls.
    The documents are about 3-8 MB in size. Formatted, they cover 150-250 pages.
    When importing such an XML document I do some extensive "post-processing" using FDK. This processing happens in Sr_EventHandler() during the SR_EVT_END_READER event. I noticed that some FDK functions calls which modify the document's structure, like F_ApiSetAttribute() or F_ApiNewElementInHierarchy(), take several seconds, for the larger documents even minutes, to complete one single function call. I tried to move some of these calls to earlier events, mostly to SR_EVT_END_ELEM. There the calls work without a delay. Unfortunately I can't rewrite the FDK client to move all the calls that are lagging to earlier events.
    Does anybody have a clue why such delays happen, and possibly can make a suggestion, how to solve this issue? Thank you in advance.
    PS: I already thought of splitting such a document in smaller pieces by using the FrameMaker book function. But I don't think, the structure of the documents will permit such an automatic split, and it definitely isn't an option to change the document structure (the project is about migrating documents from Interleaf to XML with the constraint of keeping the document layout identical).

    FP_ApplyFormatRules sounds really good--I'll give it a try on Monday. I wonder how I could miss it, as I already tried FP_Reformatting and FP_Displaying to no avail?! By the way, what is actually meant by FP_Reformatting (when I used it I assumed it would do exactly what FP_ApplyFormatRules sounds like it does), or is that another of Lynne's well-kept secrets?
    Thanks for all the helpful suggestions, guys. On Friday I already had my first improvements in a test version of my client: I did some (not all necessary) structural changes using XSLT pre-processing, and processing time went down from 8 hours(!) to 1 hour--Yeappie! I was also playing with the idea of writing a wrapper for F_ApiNewElementInHierarchy() which actually pastes an appropriate element, created in a small flow on the reference pages, at the intended insertion location. But now, with FP_ApplyFormatRules on the horizon, I'm quite confident I can get even the complicated stuff under control, which cannot be handled by the XSLT pre-processing, as it depends on the actual formatting of the document at run-time and cannot be anticipated in pre-processing.
    --Franz

  • How to parse and retrieve records from xml files into columns in Table

    Hi,
    I have attached what I tried.
    Table to hold the XML COntent:
    create table xmlfile(xml_con sys.xmltype);
    Inserting Xml file content into the Above table:
    insert into xmlfile values(sys.xmltype.CreateXml('<Root><name>RAM</name><age>23</age></Root>'))
    SQL> select * from xmlfile;
    XML_CON
    <Root>
    <name>RAM</name>
    <age>23</age>
    </Root>
    SQL> select extractValue(xml_con, '/Root/name') content from xmlfile;
    CONTENT
    RAM
    This one works fine
    But if the file content is as below (contains multiple records):
    insert into xmlfile values(sys.xmltype.CreateXml('<Root><Record><name>RAM</name><age>23</age></Record><Record><name>SAM</name><age>23</age></Record></Root>'))
    SQL> select extractValue(xml_con, '/Root/Record/name') content from xmlfile;
    ERROR at line 1:
    ORA-19025: EXTRACTVALUE returns value of only one node
    Can anyone help me with this issue: how to extract multiple records from the XML file in this manner (from PL/SQL, without using Java)?
    OR
    If there is anyother way to do this please tell me?

    SQL> SELECT EXTRACTVALUE (COLUMN_VALUE, '//name') NAME,
           EXTRACTVALUE (COLUMN_VALUE, '//age') age
      FROM TABLE
              (XMLSEQUENCE
                  (EXTRACT
                      (XMLTYPE
                          ('<Root>
                              <Record>
                                <name>RAM</name>
                                <age>23</age>
                              </Record>
                              <Record>
                                <name>SAM</name>
                                <age>23</age>
                              </Record>
                             </Root>')
                      , '/Root/Record')));
    NAME       AGE      
    RAM        23       
    SAM        23       
    2 rows selected.
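The same split-then-extract idea, outside the database: a short Python sketch with ElementTree that produces one row per /Root/Record, analogous to XMLSEQUENCE plus EXTRACTVALUE (illustrative only, not the Oracle answer itself):

```python
import xml.etree.ElementTree as ET

DOC = """<Root>
  <Record><name>RAM</name><age>23</age></Record>
  <Record><name>SAM</name><age>23</age></Record>
</Root>"""

# One row per /Root/Record, like XMLSEQUENCE splitting the document
# into individual Record nodes before values are extracted.
rows = [(rec.findtext("name"), rec.findtext("age"))
        for rec in ET.fromstring(DOC).findall("Record")]
print(rows)  # [('RAM', '23'), ('SAM', '23')]
```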

  • Load Sales Order from Xml file

    Hi,
    I want to load a sales order from an XML file. How can I do so? Where can I get the XML schema for the Sales Order or other documents (delivery, invoice, etc.)?
    Please reply with code and an XML file.

    When I'm going to load SO from xml file, it's showing an error:
    "The connected value 0 was not found in table Usage of Nota Fiscal".
    What is this table for? Which attribute is related with this table?
    How can I find out that?

  • How to show records from xml file

    Hi All,
    I have created one search region which has 5 items, plus a result table region. I want to search records based on those 5 items and show the output in the table. I have a table named hr_api_transactions which contains a lot of columns, including one column named TRANSACTION_DOCUMENT of type CLOB that holds an XML file for each record.
    I want to extract data from that XML file and display it.

    I have created one region on a seeded page, and in that region I have created one table for the output. It is a search region with 5 text-field items and 2 submit buttons, GO and Clear. I want to search based on those 5 items and display the matching records in the table I created in that region. I have one seeded table containing a column that holds an XML file for each individual record; that XML file contains the values I want to display.
    My problems are:
    how can I extract data from the XML file?
    how can I show all the values for each record in that table?
    how can I search based on those 5 items?
    Right now I am able to extract a single value from the XML file by using this SQL command:
    select xmltype(transaction_document).extract('//IrcPostingContentsVlEORow/CreationDate/text()').getStringVal() CreationDate
    from hr_api_transactions
    where transaction_ref_table = 'PER_ALL_VACANCIES'
    and transaction_ref_id = 4693;
    How can I extract more than one record from that XML file?

  • Issues loading .txt files from server...

    Hi.
    I have an issue with a script I created that loads text strings from a file on my server.
    The swf file loads a message from a .txt file, displays it in an animation when the animation finishes it loops back to the start and loads the next message and displays it in the animation.
    All works fine. However, the script reads the text file from the server on every loop (3 seconds). This will be hard on my server, so... is there a way to read the text file once only and then loop through the eight quotes?
    Thanks. Z
    Actionscript 2. Flash CS3. Flash Player 10.
    quotes.txt - text file containing eight quotes
    quote_txt - Dynamic field within animation
    ranQuote = new LoadVars();
    ranQuote.onLoad = function(success) {
        if (success) {
            RanNum = RanNum+1;
            if (RanNum>=9) {
                RanNum = 1;
                ran = this["quote"+RanNum];
                quote_txt.text = ran;
            } else {
                ran = this["quote"+RanNum];
                quote_txt.text = ran;
            }
        } else {
            quote_txt.text = "Display if can't load txt file";
        }
    };
    ranQuote.load("quotes.txt");

    Unless I'm missing something, each time you load the file you are loading the entire file.  So you should have all of what's in it stored.  Your load function appears to just be picking out one of them out randomly after they're all loaded.
    I assumed wrongly that this isn't all the code you have and other loading code is what you are trying to inhibit.
    What you probably want to do is take most of your onLoad code and place it into a function.  What you have that is pulling up the quotes should utilize that function.  You shouldn't be using that loading code each time some button or whatever loop cycles.
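The load-once idea the reply describes, sketched in Python for clarity: parse the name=value&... payload a single time into a list, then cycle through it without touching the server again (`load_quotes` and the payload string are invented for this sketch):

```python
from itertools import cycle

def load_quotes(text):
    """Parse a LoadVars-style payload (name=value&name=value) once into a list."""
    pairs = dict(p.split("=", 1) for p in text.split("&"))
    return [pairs[f"quote{i}"] for i in range(1, len(pairs) + 1)]

# Simulated quotes.txt content, fetched exactly once.
payload = "quote1=First&quote2=Second&quote3=Third"
quotes = cycle(load_quotes(payload))  # loop forever without reloading

print(next(quotes), next(quotes), next(quotes), next(quotes))
# First Second Third First
```

The same shape in ActionScript would be: fill an array in the one onLoad callback, then have the animation loop index into that array instead of calling load() again.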

  • To load special records from worksheet in main memory: C#

    Hello
    I used this code. I do not get any errors. My program runs in 55 seconds. My data is large.
    Right now all records load into RAM. How can I load specific records into RAM? For example, I want to load record 2 from Book1.xlsx into RAM.
    Microsoft.Office.Interop.Excel.Application app_main = new Microsoft.Office.Interop.Excel.Application();
    Microsoft.Office.Interop.Excel.Workbooks workbooks_main = app_main.Workbooks;
    Microsoft.Office.Interop.Excel.Workbook workbook_main = workbooks_main.Open(Application.StartupPath + @"\Book1.xlsx");
    Microsoft.Office.Interop.Excel.Worksheet worksheet_main = workbook_main.Worksheets[1];
    var cellValue_main = (worksheet_main.Cells[2, 1] as Excel.Range).Value;
    workbook_main.Close();
    workbooks_main.Close();
    System.Runtime.InteropServices.Marshal.ReleaseComObject(workbook_main);
    System.Runtime.InteropServices.Marshal.ReleaseComObject(workbooks_main);
    app_main.Quit();
    System.Runtime.InteropServices.Marshal.ReleaseComObject(app_main);

    Hi ARZARE,
    I have made a test with your code and the value of “cellValue_main”, and I modified your code as below:
    Microsoft.Office.Interop.Excel.Application app_main = new Microsoft.Office.Interop.Excel.Application();
    Microsoft.Office.Interop.Excel.Workbooks workbooks_main = app_main.Workbooks;
    Microsoft.Office.Interop.Excel.Workbook workbook_main = workbooks_main.Open(@"D:\Backup\Desktop\Test.xlsx");
    Microsoft.Office.Interop.Excel.Worksheet worksheet_main = workbook_main.Worksheets[1];
    var cellValue_main = (worksheet_main.Cells[2, 1] as Microsoft.Office.Interop.Excel.Range).Value;
    var rowValue_main = (worksheet_main.Rows[1] as Microsoft.Office.Interop.Excel.Range).Value2; //row value
    workbook_main.Close();
    workbooks_main.Close();
    System.Runtime.InteropServices.Marshal.ReleaseComObject(workbook_main);
    System.Runtime.InteropServices.Marshal.ReleaseComObject(workbooks_main);
    app_main.Quit();
    System.Runtime.InteropServices.Marshal.ReleaseComObject(app_main);
    If you want to get the value of row 2, you could store the “rowValue_main” and query from it. Also, you could loop the cell value of the row to get all the data of the row.
    In addition, if you want to query data from the sheet, I think you could do as the reply from Joel. You could store the data to a dataset or datatable, and query from it.
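The caching idea described here, reduced to a language-neutral sketch in Python: read the sheet once into an in-memory structure, then serve every lookup from that cache instead of going back to Excel per query (the sample rows are invented for illustration):

```python
# Load all rows once into memory, then answer lookups from the cache
# (analogous to storing the worksheet in a DataTable and querying it).
sheet = [
    ["id", "name", "score"],   # row 1: header
    [2, "Alice", 90],          # row 2
    [3, "Bob", 85],            # row 3
]

cache = {row[0]: row for row in sheet[1:]}  # key rows by first column

def get_record(key):
    """Lookup from RAM; no per-query round trip to the workbook."""
    return cache.get(key)

print(get_record(2))
```

The one-time load still costs the 55 seconds, but every subsequent "record 2" style query is then a dictionary lookup rather than another Interop call.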
    If you have any further questions, please feel free to post in this forum.
    Best Regards,
    Edward

  • Inserting nested records from XML to DB

    Hi,
    I am facing a problem with inserting nested records from XML into the DB. For example, I have this XML:
    <?xml version="1.0" encoding="utf-8" ?>
    <ns0:CutLOTUpdate xmlns:ns0="http://LayoutTracking/v1.0">
    <C>1</C>
    <COMMENTS>Main1</COMMENTS>
    <CUT_DATA>
    <CUT>
    <D>1</D>
    <COMMENTS>2Main1</COMMENTS>
    <IT>
    <E>11</E>
    <COMMENTS>3Det1</COMMENTS>
    </IT>
    <IT>
    <E>12</E>
    <COMMENTS>3Det2</COMMENTS>
    </IT>
    </CUT>
    <CUT>
    <D>2</D>
    <COMMENTS>2Main2</COMMENTS>
    <IT>
    <E>21</E>
    <COMMENTS>3Det1</COMMENTS>
    </IT>
    <IT>
    <E>22</E>
    <COMMENTS>3Det2</COMMENTS>
    </IT>
    </CUT>
    </CUT_DATA>
    </ns0:CutLOTUpdate>
    I would like to insert these data into the following table in a denormalized form:
    CREATE TABLE A (
    C NUMBER,
    D NUMBER,
    E NUMBER,
    C_COMMENTS VARCHAR2(50),
    D_COMMENTS VARCHAR2(50),
    E_COMMENTS VARCHAR2(50))
    I have tried using this procedure:
    CREATE OR REPLACE PROCEDURE insc (Cut_Clob CLOB) AS
    Cut XMLType;
    BEGIN
    /*Converts Cut_Clob parameter into XML */
    Cut := sys.xmltype.createXML(Cut_Clob);
    /*Inserts data from XML to table*/
    INSERT INTO a
    ( C ,
    C_COMMENTS ,
    D ,
    D_COMMENTS ,
    E ,
    E_COMMENTS )
    SELECT DISTINCT
    ExtractValue(CUT, '/ns0:CutLOTUpdate/C', 'xmlns:ns0="http://LayoutTracking/v1.0"') C,
    ExtractValue(CUT, '/ns0:CutLOTUpdate/COMMENTS', 'xmlns:ns0="http://LayoutTracking/v1.0"') C_COMMENTS,
    ExtractValue(value(ct), '/CUT/D') D,
    ExtractValue(value(ct), '/CUT/COMMENTS') D_COMMENTS,
    ExtractValue(value(it), '/IT/E') E,
    ExtractValue(value(it), '/IT/COMMENTS') E_COMMENTS
    FROM TABLE (xmlsequence(extract(CUT, '/ns0:CutLOTUpdate/CUT_DATA/CUT', 'xmlns:ns0="http://LayoutTracking/v1.0"'))) ct,
    TABLE (xmlsequence(extract(CUT, '/ns0:CutLOTUpdate/CUT_DATA/CUT/IT', 'xmlns:ns0="http://LayoutTracking/v1.0"'))) it;
    COMMIT;
    END;
    However, this resulted in a Cartesian product.
    Is it possible for me to insert this XML into such table? If yes, can anyone show me how?
    I apologize if this seems trivial to you and I appreciate your time for helping me.
    Thank you,
    Kaye

    Hi,
    I have tried:
    FROM TABLE (xmlsequence(extract(CUT, '/ns0:CutLOTUpdate/CUT_DATA/CUT', 'xmlns:ns0="http://LayoutTracking/v1.0"'))) ct,
    TABLE (xmlsequence(extract(CUT, '/ns0:CutLOTUpdate/CUT_DATA/CUT/IT', 'xmlns:ns0="http://LayoutTracking/v1.0"'))) it;
    This did not work - resulting in Cartesian product.
    I am working with Oracle 10g DB 10.2.0.1.
    If it's not too much, I am hoping that someone could show me a script to parse this XML and actually place it in a denormalized form.
    If you think this is not possible, can you please just point me to an example where the same XML (with nested information) can be inserted into 3 different tables (relational)?
    I have tried searching on different sources, to no avail. I am a beginner on this... I apologize for any inconveniece caused.
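    For what it's worth, the Cartesian product arises because the second collection (`it`) is extracted from the document root, so every IT node in the document joins to every CUT row. A minimal sketch of a correlated version (untested here; Oracle 10g syntax, with the COMMENTS paths assumed from the posted XML) drives the inner TABLE expression off `value(ct)`:

    ```sql
    SELECT ExtractValue(Cut, '/ns0:CutLOTUpdate/C', 'xmlns:ns0="http://LayoutTracking/v1.0"') C,
           ExtractValue(Cut, '/ns0:CutLOTUpdate/COMMENTS', 'xmlns:ns0="http://LayoutTracking/v1.0"') C_COMMENTS,
           ExtractValue(value(ct), '/CUT/D') D,
           ExtractValue(value(ct), '/CUT/COMMENTS') D_COMMENTS,
           ExtractValue(value(it), '/IT/E') E,
           ExtractValue(value(it), '/IT/COMMENTS') E_COMMENTS
      FROM TABLE (xmlsequence(extract(Cut, '/ns0:CutLOTUpdate/CUT_DATA/CUT',
                                      'xmlns:ns0="http://LayoutTracking/v1.0"'))) ct,
           -- correlated: only the IT children of the current CUT row
           TABLE (xmlsequence(extract(value(ct), '/CUT/IT'))) it;
    ```

    Because `it` now depends on `ct`, each detail row pairs only with its own master row, which is what the denormalized insert needs.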

  • I have a problem loading all items from XML with for each

    Hi all,
    I'm a novice AS3 programmer and I need help with this code.
    I have a gallery XML file:
    <gallery>
        <item>
            <id>1</id>
            <strana>0</strana>
            <naslov>Lokacije</naslov>
            <aktuelno>1</aktuelno>
            <slika>1.jpg</slika>
        </item>
        <item>
            <id>2</id>
            <strana>2</strana>
            <naslov>Coaching</naslov>
            <aktuelno>1</aktuelno>
            <slika>2.jpg</slika>
        </item>
        <item>
            <id>3</id>
            <strana>0</strana>
            <naslov><![CDATA[O.Š. Bratstvo - panel]]></naslov>
            <aktuelno>0</aktuelno>
            <slika>3.jpg</slika>
        </item>  
    </gallery>
    And:
    var loader:URLLoader = new URLLoader();
    loader.addEventListener(Event.COMPLETE, onLoaded);
    loader.load(new URLRequest("gallery.xml"));
    function onLoaded(evt:Event):void {
        var xml:XML = new XML(evt.target.data);
        for each (var item:XML in xml..item) {
            centralniText.htmlText = item.slika;
        }
    }
    This only shows the last item from the XML file:
    3.jpg
    I want all of them. Please help.

    Can adding a count item to the XML help somehow?
    <gallery>
        <item>
            <id>1</id>
            <strana>0</strana>
            <naslov>Lokacije</naslov>
            <aktuelno>1</aktuelno>
            <slika>1.jpg</slika>
        </item>
        <item>
            <id>2</id>
            <strana>2</strana>
            <naslov>Coaching</naslov>
            <aktuelno>1</aktuelno>
            <slika>2.jpg</slika>
        </item>
        <item>
            <id>3</id>
            <strana>0</strana>
            <naslov><![CDATA[O.Š. Bratstvo - panel]]></naslov>
            <aktuelno>0</aktuelno>
            <slika>3.jpg</slika>
        </item>   
        <last>3</last>
    </gallery>
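    No count element is needed: the loop assigns `item.slika` to `centralniText.htmlText` on every pass, so each iteration overwrites the previous one and only the last image name survives. A minimal sketch of the fix (assuming `centralniText` is a TextField on the stage) appends instead of assigning:

    ```actionscript
    var loader:URLLoader = new URLLoader();
    loader.addEventListener(Event.COMPLETE, onLoaded);
    loader.load(new URLRequest("gallery.xml"));

    function onLoaded(evt:Event):void {
        var xml:XML = new XML(evt.target.data);
        centralniText.htmlText = "";            // start empty
        for each (var item:XML in xml..item) {
            // += appends each slika; plain = kept overwriting it
            centralniText.htmlText += item.slika + "<br/>";
        }
    }
    ```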

  • Deleting records from XML file--just a little problem

    Hi!
    My xml file has a simple form like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <AdressBook>
    <Record>
        <Name>Suzy</Name>
        <Lastname>Love</Lastname>
        <Adress>You street 22</Adress>
        <Phone>911</Phone>
        <Email>[email protected]</Email>
      </Record>
    <Record>
       <Name>Judy</Name>
       <Lastname>Goblin</Lastname>
       <Adress>Milkyway</Adress>
       <Phone>911</Phone>
      <Email>[email protected]</Email>
    </Record>
    </AdressBook>
    Now, whenever I delete a record from that file it leaves a remaining <Record /> tag at the location where the previous record was. How do I delete that remaining tag or, better yet, how do I delete the Record element as a whole? I am using JDOM.
    Here is the code I use to delete records:
    public void deleteRecordFromFile(Record r) {
         Element root = doc.getRootElement();
         List records = root.getChildren();
         Iterator listIt = records.iterator();
         while (listIt.hasNext()) {
              Element rec = (Element) listIt.next();
              Record p = new Record("", "", "", "", "");
              List recChildren = rec.getChildren();
              Element e = null;
              for (int i = 0; i < recChildren.size(); i++) {
                   e = (Element) recChildren.get(i);
                   switch (i) {
                        case 0: p.setName(e.getText()); break;
                        case 1: p.setLastName(e.getText()); break;
                        case 2: p.setAddress(e.getText()); break;
                        case 3: p.setPhone(e.getText()); break;
                        case 4: p.setEmail(e.getText()); break;
                        default: ;
                   }
              }
              if (p.equals(r)) {
                   rec.removeContent();
                   System.out.println("Record deleted!");
              }
         }
         writeToFile(filename);
    }
    As you can see above, I use the method removeContent(); should I use some other method? I need to get rid of those remaining tags in order to make editing the records stored there easier. I guess I could write a separate routine to clean the file of those tags, but I think that would be too time-consuming... or not?
    Please help me out here.:)
    Message was edited by:
    byteminister

    You cannot remove elements of a List while iterating over it without risking a ConcurrentModificationException, whatever method you use. So you can only delete the records in two steps: collect them all first, and delete them afterwards.
    public void deleteRecordFromFile(Record r) {
         Element root = doc.getRootElement();
         List records = root.getChildren();
         Iterator listIt = records.iterator();
         // Container for references to the records that will be deleted.
         ArrayList<Element> deleteList = new ArrayList<Element>();
         while (listIt.hasNext()) {
              Element rec = (Element) listIt.next();
              Record p = new Record("", "", "", "", "");
              List recChildren = rec.getChildren();
              Element e = null;
              for (int i = 0; i < recChildren.size(); i++) {
                   e = (Element) recChildren.get(i);
                   switch (i) {
                        case 0: p.setName(e.getText()); break;
                        case 1: p.setLastName(e.getText()); break;
                        case 2: p.setAddress(e.getText()); break;
                        case 3: p.setPhone(e.getText()); break;
                        case 4: p.setEmail(e.getText()); break;
                        default: ;
                   }
              }
              if (p.equals(r)) {
                   // This record will be deleted.
                   deleteList.add(rec);
              }
         }
         // Delete all the records in deleteList.
         records.removeAll(deleteList);
         writeToFile(filename);
    }
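    JDOM aside, the collect-then-remove pattern itself can be shown with a plain java.util.List; the class and method names below are illustrative only:

    ```java
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    public class CollectThenRemove {

        // Collect matching elements while iterating, remove them afterwards:
        // mutating the list inside the loop (other than via the iterator itself)
        // would risk a ConcurrentModificationException.
        static void purge(List<String> records, String target) {
            List<String> deleteList = new ArrayList<String>();
            for (String rec : records) {
                if (rec.equals(target)) {
                    deleteList.add(rec);   // only mark while iterating
                }
            }
            records.removeAll(deleteList); // safe: iteration has finished
        }

        public static void main(String[] args) {
            List<String> book = new ArrayList<String>(Arrays.asList("Suzy", "Judy", "Suzy"));
            purge(book, "Suzy");
            System.out.println(book);      // prints [Judy]
        }
    }
    ```

    With JDOM specifically, calling Iterator.remove() inside the loop would work too, since the live child list's own iterator is allowed to mutate it.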

  • Performance Issue - Fetching latest date from a507 table

    Hi All,
    I am fetching data from table A507 for a material and batch combination. I want to fetch the latest record based on the value of field DATBI. I have written the code below, but the SELECT query is taking too long. I don't want to put a condition on DATBI in the WHERE clause because I have already tried that option.
    SELECT kschl
               matnr
               charg
               datbi
               knumh
        FROM a507
        INTO TABLE it_a507
        FOR ALL ENTRIES IN lit_mch1
        WHERE kschl = 'ZMRP'
        AND   matnr = lit_mch1-matnr
        AND   charg = lit_mch1-charg.
    SORT it_a507 BY kschl matnr charg datbi DESCENDING.
      DELETE ADJACENT DUPLICATES FROM it_a507 COMPARING kschl matnr charg.

    Hi,
    Tables like this one store large volumes of data, so when selecting from them it is important to use as many primary key fields as possible in the WHERE condition. Here you can try adding KAPPL, since it is specific to a requirement; if it is for purchasing, use 'M' and try.
    IF NOT lit_mch1[] IS INITIAL.
      SELECT kschl
             matnr
             charg
             datbi
             knumh
        FROM a507
        INTO TABLE it_a507
        FOR ALL ENTRIES IN lit_mch1
        WHERE kappl = 'M'
        AND   kschl = 'ZMRP'
        AND   matnr = lit_mch1-matnr
        AND   charg = lit_mch1-charg.
    ENDIF.
    SORT it_a507 BY kschl matnr charg datbi DESCENDING.
    DELETE ADJACENT DUPLICATES FROM it_a507 COMPARING kschl matnr charg.
    This should considerably improve performance.
    Regards,
    Vik

  • Performance issue in selecting data from a view because of a NOT IN condition

    Hi experts,
    I have a requirement to select data from a view that is not present in a fact table, with certain join conditions. But the fact table contains 2 crore (20 million) rows, so this view is not workable at all; it runs for a long time. I am pasting the query here; please help me tune it. The whole query before the NOT IN executes in 15 minutes, but when I add the NOT IN condition it runs for many hours, as the second table has millions of records.
    CREATE OR REPLACE FORCE VIEW EDWOWN.MEDW_V_GIEA_SERVICE_LEVEL11 (
       SYS_ENT_ID,
       SERVICE_LEVEL_NO,
       CUSTOMER_NO,
       BILL_TO_LOCATION,
       PART_NO,
       SRCE_SYS_ID,
       BUS_AREA_ID,
       CONTRACT,
       WAREHOUSE,
       ORDER_NO,
       LINE_NO,
       REL_NO,
       REVISED_DUE_DATE,
       REVISED_QTY_DUE,
       QTY_RESERVED,
       QTY_PICKED,
       QTY_SHIPPED,
       ABBREVIATION,
       ACCT_WEEK,
       ACCT_MONTH,
       ACCT_YEAR,
       UPDATED_FLAG,
       CREATE_DATE,
       RECORD_DATE,
       BASE_WAREHOUSE,
       EARLIEST_SHIP_DATE,
       LATEST_SHIP_DATE,
       SERVICE_DATE,
       SHIP_PCT,
       ALLOC_PCT,
       WHSE_PCT,
       ABC_CLASS,
       LOCATION_ID,
       RELEASE_COMP,
       WAREHOUSE_DESC,
       MAKE_TO_FLAG,
       SOURCE_CREATE_DATE,
       SOURCE_UPDATE_DATE,
       SOURCE_CREATED_BY,
       SOURCE_UPDATED_BY,
       ENTITY_CODE,
       RECORD_ID,
       SRC_SYS_ENT_ID,
       BSS_HIERARCHY_KEY,
       SERVICE_LVL_FLAG )
    AS
       SELECT SL.SYS_ENT_ID,
                 SL.ENTITY_CODE
              || '-'
              || SL.order_no
              || '-'
              || SL.LINE_NO
              || '-'
              || SL.REL_NO
                 SERVICE_LEVEL_NO,
              SL.CUSTOMER_NO,
              SL.BILL_TO_LOCATION,
              SL.PART_NO,
              SL.SRCE_SYS_ID,
              SL.BUS_AREA_ID,
              SL.CONTRACT,
              SL.WAREHOUSE,
              SL.ORDER_NO,
              SL.LINE_NO,
              SL.REL_NO,
              SL.REVISED_DUE_DATE,
              SL.REVISED_QTY_DUE,
              NULL QTY_RESERVED,
              NULL QTY_PICKED,
              SL.QTY_SHIPPED,
              SL.ABBREVIATION,
              NULL ACCT_WEEK,
              NULL ACCT_MONTH,
              NULL ACCT_YEAR,
              NULL UPDATED_FLAG,
              SL.CREATE_DATE,
              SL.RECORD_DATE,
              SL.BASE_WAREHOUSE,
              SL.EARLIEST_SHIP_DATE,
              SL.LATEST_SHIP_DATE,
              SL.SERVICE_DATE,
              SL.SHIP_PCT,
              0 ALLOC_PCT,
              0 WHSE_PCT,
              SL.ABC_CLASS,
              SL.LOCATION_ID,
              NULL RELEASE_COMP,
              SL.WAREHOUSE_DESC,
              SL.MAKE_TO_FLAG,
              SL.source_create_date,
              SL.source_update_date,
              SL.source_created_by,
              SL.source_updated_by,
              SL.ENTITY_CODE,
              SL.RECORD_ID,
              SL.SRC_SYS_ENT_ID,
              SL.BSS_HIERARCHY_KEY,
              'Y' SERVICE_LVL_FLAG
         FROM (  SELECT SL_INT.SYS_ENT_ID,
                        SL_INT.CUSTOMER_NO,
                        SL_INT.BILL_TO_LOCATION,
                        SL_INT.PART_NO,
                        SL_INT.SRCE_SYS_ID,
                        SL_INT.BUS_AREA_ID,
                        SL_INT.CONTRACT,
                        SL_INT.WAREHOUSE,
                        SL_INT.ORDER_NO,
                        SL_INT.LINE_NO,
                        MAX (SL_INT.REL_NO) REL_NO,
                        SL_INT.REVISED_DUE_DATE,
                        SUM (SL_INT.REVISED_QTY_DUE) REVISED_QTY_DUE,
                        SUM (SL_INT.QTY_SHIPPED) QTY_SHIPPED,
                        SL_INT.ABBREVIATION,
                        MAX (SL_INT.CREATE_DATE) CREATE_DATE,
                        MAX (SL_INT.RECORD_DATE) RECORD_DATE,
                        SL_INT.BASE_WAREHOUSE,
                        MAX (SL_INT.LAST_SHIPMENT_DATE) LAST_SHIPMENT_DATE,
                        MAX (SL_INT.EARLIEST_SHIP_DATE) EARLIEST_SHIP_DATE,
                        MAX (SL_INT.LATEST_SHIP_DATE) LATEST_SHIP_DATE,
                        MAX (
                           CASE
                              WHEN TRUNC (SL_INT.LAST_SHIPMENT_DATE) <=
                                      TRUNC (SL_INT.LATEST_SHIP_DATE)
                              THEN
                                 TRUNC (SL_INT.LAST_SHIPMENT_DATE)
                              ELSE
                                 TRUNC (SL_INT.LATEST_SHIP_DATE)
                           END)
                           SERVICE_DATE,
                        MIN (
                           CASE
                              WHEN TRUNC (SL_INT.LAST_SHIPMENT_DATE) >=
                                      TRUNC (SL_INT.EARLIEST_SHIP_DATE)
                                   AND TRUNC (SL_INT.LAST_SHIPMENT_DATE) <=
                                          TRUNC (SL_INT.LATEST_SHIP_DATE)
                                   AND SL_INT.QTY_SHIPPED = SL_INT.REVISED_QTY_DUE
                              THEN
                                 100
                              ELSE
                                 0
                           END)
                           SHIP_PCT,
                        SL_INT.ABC_CLASS,
                        SL_INT.LOCATION_ID,
                        SL_INT.WAREHOUSE_DESC,
                        SL_INT.MAKE_TO_FLAG,
                        MAX (SL_INT.source_create_date) source_create_date,
                        MAX (SL_INT.source_update_date) source_update_date,
                        SL_INT.source_created_by,
                        SL_INT.source_updated_by,
                        SL_INT.ENTITY_CODE,
                        SL_INT.RECORD_ID,
                        SL_INT.SRC_SYS_ENT_ID,
                        SL_INT.BSS_HIERARCHY_KEY
                   FROM (SELECT SL_UNADJ.*,
                                DECODE (
                                   TRIM (TIMA.DAY_DESC),
                                   'saturday',   SL_UNADJ.REVISED_DUE_DATE
                                               - 1
                                               - early_ship_days,
                                   'sunday',   SL_UNADJ.REVISED_DUE_DATE
                                             - 2
                                             - early_ship_days,
                                   SL_UNADJ.REVISED_DUE_DATE - early_ship_days)
                                   EARLIEST_SHIP_DATE,
                                DECODE (
                                   TRIM (TIMB.DAY_DESC),
                                   'saturday',   SL_UNADJ.REVISED_DUE_DATE
                                               + 2
                                               + LATE_SHIP_DAYS,
                                   'sunday',   SL_UNADJ.REVISED_DUE_DATE
                                             + 1
                                             + LATE_SHIP_DAYS,
                                   SL_UNADJ.REVISED_DUE_DATE + LATE_SHIP_DAYS)
                                   LATEST_SHIP_DATE
                           FROM (SELECT NVL (s2.sys_ent_id, '00') SYS_ENT_ID,
                                        cust.customer_no CUSTOMER_NO,
                                        cust.bill_to_loc BILL_TO_LOCATION,
                                        cust.early_ship_days,
                                        CUST.LATE_SHIP_DAYS,
                                        ord.PART_NO,
                                        ord.SRCE_SYS_ID,
                                        ord.BUS_AREA_ID,
                                        ord.BUS_AREA_ID CONTRACT,
                                        NVL (WAREHOUSE, ord.entity_code) WAREHOUSE,
                                        ORDER_NO,
                                        ORDER_LINE_NO LINE_NO,
                                        ORDER_REL_NO REL_NO,
                                        TRUNC (REVISED_DUE_DATE) REVISED_DUE_DATE,
                                        REVISED_ORDER_QTY REVISED_QTY_DUE,
                                        --   NULL QTY_RESERVED,
                                        --    NULL QTY_PICKED,
                                        SHIPPED_QTY QTY_SHIPPED,
                                        sold_to_abbreviation ABBREVIATION,
                                        --        NULL ACCT_WEEK,
                                        --      NULL ACCT_MONTH,
                                        --    NULL ACCT_YEAR,
                                        --  NULL UPDATED_FLAG,
                                        ord.CREATE_DATE CREATE_DATE,
                                        ord.CREATE_DATE RECORD_DATE,
                                        NVL (WAREHOUSE, ord.entity_code)
                                           BASE_WAREHOUSE,
                                        LAST_SHIPMENT_DATE,
                                        TRUNC (REVISED_DUE_DATE)
                                        - cust.early_ship_days
                                           EARLIEST_SHIP_DATE_UnAdj,
                                        TRUNC (REVISED_DUE_DATE)
                                        + CUST.LATE_SHIP_DAYS
                                           LATEST_SHIP_DATE_UnAdj,
                                        --0 ALLOC_PCT,
                                        --0 WHSE_PCT,
                                        ABC_CLASS,
                                        NVL (LOCATION_ID, '000') LOCATION_ID,
                                        --NULL RELEASE_COMP,
                                        WAREHOUSE_DESC,
                                        NVL (
                                           DECODE (MAKE_TO_FLAG,
                                                   'S', 0,
                                                   'O', 1,
                                                   '', -1),
                                           -1)
                                           MAKE_TO_FLAG,
                                        ord.CREATE_DATE source_create_date,
                                        ord.UPDATE_DATE source_update_date,
                                        ord.CREATED_BY source_created_by,
                                        ord.UPDATED_BY source_updated_by,
                                        ord.ENTITY_CODE,
                                        ord.RECORD_ID,
                                        src.SYS_ENT_ID SRC_SYS_ENT_ID,
                                        ord.BSS_HIERARCHY_KEY
                                   FROM EDW_DTL_ORDER_FACT ord,
                                        edw_v_maxv_cust_dim cust,
                                        edw_v_maxv_part_dim part,
                                        EDW_WAREHOUSE_LKP war,
                                        EDW_SOURCE_LKP src,
                                        MEDW_PLANT_LKP s2,
                                        edw_v_incr_refresh_ctl incr
                                  WHERE ord.BSS_HIERARCHY_KEY =
                                           cust.BSS_HIERARCHY_KEY(+)
                                        AND ord.record_id = part.record_id(+)
                                        AND ord.part_no = part.part_no(+)
                                        AND NVL (ord.WAREHOUSE, ord.entity_code) =
                                               war.WAREHOUSE_code(+)
                                        AND ord.entity_code = war.entity_code(+)
                                        AND ord.record_id = src.record_id
                                        AND src.calculate_back_order_flag = 'Y'
                                        AND NVL (cancel_order_flag, 'N') != 'Y'
                                        AND UPPER (part.source_plant) =
                                               UPPER (s2.location_code1(+))
                                        AND mapping_name = 'MEDW_MAP_GIEA_MTOS_STG'
                                      --  AND NVL (ord.UPDATE_DATE, SYSDATE) >=
                                      --         MAX_SOURCE_UPDATE_DATE
                                        AND UPPER (
                                               NVL (ord.order_status, 'BOOKED')) NOT IN
                                               ('ENTERED', 'CANCELLED')
                                        AND TRUNC (REVISED_DUE_DATE) <= SYSDATE) SL_UNADJ,
                                EDW_TIME_DIM TIMA,
                                EDW_TIME_DIM TIMB
                          WHERE TRUNC (SL_UNADJ.EARLIEST_SHIP_DATE_UnAdj) =
                                   TIMA.ACCOUNT_DATE
                                AND TRUNC (SL_UNADJ.LATEST_SHIP_DATE_Unadj) =
                                       TIMB.ACCOUNT_DATE) SL_INT
                  WHERE TRUNC (LATEST_SHIP_DATE) <= TRUNC (SYSDATE)
               GROUP BY SL_INT.SYS_ENT_ID,
                        SL_INT.CUSTOMER_NO,
                        SL_INT.BILL_TO_LOCATION,
                        SL_INT.PART_NO,
                        SL_INT.SRCE_SYS_ID,
                        SL_INT.BUS_AREA_ID,
                        SL_INT.CONTRACT,
                        SL_INT.WAREHOUSE,
                        SL_INT.ORDER_NO,
                        SL_INT.LINE_NO,
                        SL_INT.REVISED_DUE_DATE,
                        SL_INT.ABBREVIATION,
                        SL_INT.BASE_WAREHOUSE,
                        SL_INT.ABC_CLASS,
                        SL_INT.LOCATION_ID,
                        SL_INT.WAREHOUSE_DESC,
                        SL_INT.MAKE_TO_FLAG,
                        SL_INT.source_created_by,
                        SL_INT.source_updated_by,
                        SL_INT.ENTITY_CODE,
                        SL_INT.RECORD_ID,
                        SL_INT.SRC_SYS_ENT_ID,
                        SL_INT.BSS_HIERARCHY_KEY) SL
        WHERE (SL.BSS_HIERARCHY_KEY,
               SL.ORDER_NO,
               Sl.line_no,
               sl.Revised_due_date,
               SL.PART_NO,
               sl.sys_ent_id) NOT IN
                 (SELECT BSS_HIERARCHY_KEY,
                         ORDER_NO,
                         line_no,
                         revised_due_date,
                         part_no,
                         src_sys_ent_id
                    FROM MEDW_MTOS_DTL_FACT
                   WHERE service_lvl_flag = 'Y');
                   thanks
    asn

    Also, 'NOT IN' combined with nullable columns can be an expensive combination - and may not give the expected results. For example, compare these:
    with test1 as ( select 1 as key1 from dual )
       , test2 as ( select null as key2 from dual )
    select * from test1
    where  key1 not in
           ( select key2 from test2 );
    no rows selected
    with test1 as ( select 1 as key1 from dual )
       , test2 as ( select null as key2 from dual )
    select * from test1
    where  key1 not in
           ( select key2 from test2
             where  key2 is not null );
          KEY1
             1
    1 row selected.
    Even if the columns do contain values, if they are nullable Oracle has to perform a resource-intensive filter operation in case they are null. An EXISTS construction is not concerned with null values and can therefore use a more efficient execution plan, which has led people to believe it is inherently faster in general.
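    Applied to the posted view, a NOT EXISTS rewrite of the final filter would look roughly like the sketch below (columns copied from the query above; verify that the changed null semantics are acceptable for your data):

    ```sql
     WHERE NOT EXISTS
           (SELECT 1
              FROM MEDW_MTOS_DTL_FACT f
             WHERE f.service_lvl_flag  = 'Y'
               AND f.BSS_HIERARCHY_KEY = SL.BSS_HIERARCHY_KEY
               AND f.ORDER_NO          = SL.ORDER_NO
               AND f.line_no           = SL.line_no
               AND f.revised_due_date  = SL.revised_due_date
               AND f.part_no           = SL.PART_NO
               AND f.src_sys_ent_id    = SL.sys_ent_id);
    ```

    Note that, unlike NOT IN, NOT EXISTS simply treats a comparison against NULL as a non-match, so rows with null key columns are returned rather than silently discarded.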
