Performance issue loading data out of AS400

Hi,
For loading data out of the AS400, I have created a view containing a join between three AS400 tables connected through a database link (plus some changes in the tnsnames.ora file and the listener; a hell of a job with Oracle, but it finally works).
When I use the tool Toad, the results of this query are shown in about 20 seconds.
When I use this view in OWB to load this data into a target table, the load takes about 15 MINUTES!
Why is this so slow?
Do I have to configure something in OWB to make this load faster?
Other loads where I'm using views (on Oracle tables) to load data run fast.
It seems that Oracle internally does more than just running the view.
Who knows?
Regards,
Maurice

Maurice,
OWB generates optimized code based on whether sources are local or remote. With remote sources, Warehouse Builder will generate code that uses inline views in order to minimize network traffic.
In your case, you confuse that code generation by creating a view that performs remote/local joins while telling OWB that the object is local (which is only partly true).
Perhaps what you could do is create one-to-one views and leave it up to OWB to join the objects, as in the sketch below. One additional advantage you gain with this approach is that you can keep track of impact analysis based on your source tables, rather than on views that are based on the tables with flat text queries.
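For illustration only (the table and database-link names below are made up), such one-to-one views could look like this, with the join itself left to the OWB mapping:
CREATE OR REPLACE VIEW v_as400_orders AS
  SELECT * FROM orders@as400_link;
CREATE OR REPLACE VIEW v_as400_customers AS
  SELECT * FROM customers@as400_link;
-- Import both views into OWB and define the join in the mapping, so the
-- generated code can push the join to the remote site as an inline view.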
Mark.

Similar Messages

  • Performance issues with data warehouse loads

    We have performance issues with our data warehouse ETL load process. I have run ANALYZE and DBMS_STATS and checked the database environment. What other things can I do to optimize performance? I cannot use Statspack since we are running Oracle 8i. Thanks
    Scott

    Hi,
    you should analyze the DB after you have loaded the tables.
    Do you use sequences to generate PKs? Do you have a lot of indexes and/or triggers on the tables?
    If yes:
    make sure your sequence caches values (ALTER SEQUENCE s CACHE 10000),
    drop all unneeded indexes while loading, and disable triggers if possible.
    How big is your redo log buffer? When loading a large amount of data, it may be an option to enlarge this buffer.
    Do you have more than one DBWR process? Writing in parallel can speed things up when a checkpoint is needed.
    Is it possible to use a direct load? Or do you already load direct-path? See the sketch below.
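    For illustration (the object names are placeholders, not from your post), the sequence and direct-path points could look like this:
    ALTER SEQUENCE pk_seq CACHE 10000;
    ALTER TABLE fact_sales DISABLE CONSTRAINT fk_sales_dim;  -- re-enable after the load
    -- The APPEND hint requests a direct-path insert, which bypasses the buffer cache:
    INSERT /*+ APPEND */ INTO fact_sales
    SELECT * FROM staging_sales;
    COMMIT;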
    Dim

  • Performance issue and data getting interchanged in BO Webi report + SAP BW

    Hi,
    We are using SAP BW queries as the source for creating some BO reports.
    Environments:
    SAP - SAP BI 7.1
    BO - BO XI 3.1
    Issues:
    The reports were working fine in Dev and QA with less data. But when we point the universes to BW production (where we have much more data), the reports take quite a long time to refresh and get timed out. The query has some key figures with customer exits defined to show only one month of data, and BW Accelerator indexes are maintained for the InfoCubes behind this query. The BO report does return data if we apply a filter in the Webi 'Query Panel' to show only current-month dates, but then the values get interchanged for many objects. For example, for the two objects ABS version and Market region, the values are swapped at the BO level.
    Please let us know if anything needs to be done in BO or BW to fix this issue, if anyone has faced the same.
    Also, please let us know whether customer exits and accelerators work fine with BO.
    Thanks
    Sivakami

    Hi,
    Thanks Roberto, we'll check the notes.
    @Ingo,
    We were able to solve the performance issue by removing unused key figures and dimensions from the query, but the column value interchange issue still persists.
    The build version is 12.3.0.
    Query Stripping
    Where should we enable query stripping? When I went through some documentation, it said it is enabled automatically from XI 3.1 SP3. Can you please let us know whether that is so, and what we need to do to enable it?
    The column interchange happens when we use dimensions in a certain order: when Product type is used along with Market region, Market region shows the values of Product type as well in the Webi report.
    Thanks & Regards,
    Sivakami

  • Performance issue - Loading and Calculating

    Hi,
    I am having 5 GB of data. It is taking 1 hr to load and 30 min to calculate.
    I did the following things to improve the performance:
    1) Sorted the data and loaded it in the order of largest sparse first, followed by smallest, and then dense.
    2) Enabled parallel load, with 6 threads for prepare and 4 for writing.
    3) Increased the data file cache to 400 MB, the data cache to 50 MB, and the index cache to 100 MB.
    4) Calculation only for 4 dimensions out of 9; of those, 2 are dense and 2 are sparse.
    5) Parallel calculation with 3 threads and CALCTASKDIMS set to 2.
    But I am not getting any improvements.
    While doing the calculation I got the following messages in the logs. I feel that CALCTASKDIMS is not working:
    [Fri Jan  6 22:01:54 2006]Local/tcm2006/tcm2006/biraprd/Info(1012679)
    Calculation task schedule [2870,173,33,10,4,1]
    [Fri Jan  6 22:01:54 2006]Local/tcm2006/tcm2006/biraprd/Info(1012680)
    Parallelizing using [1] task dimensions. Usage of Calculator cache caused reduction in task dimensions
    [Fri Jan  6 22:33:54 2006]Local/tcm2006/tcm2006/biraprd/Info(1012681)
    Empty tasks [2434,115,24,10,2,0]
    Can anyone explain what the above log messages are telling me, and what else can be done to improve the performance?
    Regards
    prsan

    It's not a problem with your CALCTASKDIMS setting.
    Calculation task schedule [2870,173,33,10,4,1] indicates that your parallel calc can start with 2870 calculations in parallel, after which 173 can be performed in parallel, then 33, 10, 4 and 1.
    Empty tasks [2434,115,24,10,2,0] means that many tasks don't need any calculation, either because there is no data or because they are marked clean due to intelligent calc.
    The problem lies with your calc cache setting: as the log says, usage of the calculator cache caused the reduction in task dimensions. Try increasing the calc cache settings in your cfg file and use the high calc cache setting in your calc.
    Hope this works.
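    To make that concrete (the sizes below are made-up placeholders to adapt, not recommendations), the change would be along these lines. In essbase.cfg, raise the calculator cache limits (values in bytes):
    CALCCACHEHIGH 200000000
    CALCCACHEDEFAULT 10000000
    Then in the calc script, request the high setting alongside your existing parallel settings:
    SET CACHE HIGH;
    SET CALCPARALLEL 3;
    SET CALCTASKDIMS 2;
    With a larger calculator cache available, Essbase should no longer have to reduce the number of task dimensions, which is what the Info(1012680) message is reporting.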

  • Adding items to custom list - performance issues - Request timed out error

    Hi,
    I am using VS 2010, c#.
    I am inserting records from an Excel file, which contains 6000 records, into a custom list.
    I am using a Visual Web Part and an OLE DB connection. I am not using the OpenXML format, as I need to support both "xls" and "xlsx" types.
    These are the steps I currently follow:
    1) From the file upload control, I save the Excel file to a custom folder under the SharePoint layouts directory, to keep the records.
    2) I use the OLE DB connection to read the records and fill them into a DataTable.
    3) I insert the DataTable values into the custom list.
    Looping through the DataTable and inserting 3000–4000 records into the list, I get the "Request timeout" error.
    Below is the sample code where I loop over the DataTable and insert the records:
    SPList prjCmpList = oWeb.Lists[strListName];
    SPListItemCollection prjCmpListItems = prjCmpList.Items;
    SPListItem prjCmpListItem;
    foreach (DataRow dr in dt.Rows)
    {
        try
        {
            prjCmpListItem = prjCmpListItems.Add();
            prjCmpListItem["Title"] = dr[0];
            prjCmpListItem["FirstName"] = dr[1];
            prjCmpListItem.Update();
        }
        catch (Exception)
        {
            // ignore the failed row and continue with the next one
        }
    }
    I am using the "using block" for the spsite and web objects. When I see the execution time out in my web.config it is set to 3600.  But I am getting the request time out after 2 mins only.
    I have checked these link's : http://msdn.microsoft.com/en-us/library/aa973248%28v=office.12%29.aspx
    http://www.etechplanet.com/blog/how-to-import-an-excel-spreadsheet-in-sharepoint-and-save-it-as-a-custom-list.aspx
    http://social.msdn.microsoft.com/Forums/sharepoint/en-US/8664375c-fae0-483a-b43f-ce7d353b896a/importing-3000-records-to-sharepoint-list-causing-requesting-time-out-error?forum=sharepointadminlegacy
    However, I am following the best practices, and the insertion of records has to be done programmatically.
    Has anyone faced this kind of issue? How to fix this?

    Hi,
    Thanks for the reply.
    If the list has custom columns of string and date types, can the same types be used? Or should the list be an empty list without any columns?
    In the XML, if "onerror=continue" is set, how do we fetch the records that errored out?
    Also, in the DataTable code block which I put in the first post, I am also trying the following:
    if (dr[1] != null)
    {
        if (!string.IsNullOrEmpty(dr[1].ToString()))
        {
            prjCmpListItem["FirstName"] = dr[1].ToString();
        }
    }
    Can these condition checks be performed with the ProcessBatchData approach?
    Thanks

  • Issue loading data from PSA to Cube

    Hi Gurus:
    I have a strange issue. My DELTA load from R/3 to the ODS comes through fine. However, the load fails while moving from the ODS to the cube. The error is in 'Fiscal Year/Period'. It is a straight 1:1 mapping in the update rules. If I bring the data just to the PSA there is no error. But when the data moves from the PSA to the cube, in one record in each of the 2 packages the value of Fiscal Year/Period changes from "012/2009" to "015/2009".
    Any idea why this might be happening, and any suggestions on how I can move this data to the cube?
    Thanks in advance and a very happy New Year!
    Best.....PBSW

    Are you using or copying Business Content in your start routines?
    In the SD data extractors, the Fiscal Year/Period was provided WITHOUT a fiscal year variant.
    In the Start Routine, the fiscal year variant was hard-coded to "K4" (I think).
    If you are using that code with a fiscal calendar other than the hard-coded one, you might get this type of error.
    If so, consider setting the Fiscal Variant as a constant in the update rules.
    Good Luck,
    John Hawk

  • Performance issue on Date filter

    In my WHERE condition I wanted a date filter on the sale date. I want all sales after 9/1/2014.
    CASE1
    If I explicitly use a date literal like
    SaleDate > '2014-09-01 00:00:00:000', I get the result in 2 seconds.
    CASE2
    Since I may need to use this date value again, I created a date table with a single column "date" and loaded the value '2014-09-01 00:00:00:000'.
    So now my WHERE clause is:
    SaleDate > (Select date from dateTable)
    When I run this, the result does not show up even after 10 minutes. Both date types are datetime. I am baffled. Why is this query not coming up with the result?

    As mentioned by Erland, the two situations are very different for the optimizer. With a literal, the optimizer can properly estimate the number of qualifying rows and adapt the query plan appropriately. With a scalar subquery, the value is unknown at compile time, and the optimizer will use heuristics to accommodate any value. In this case, the selection of all rows more recent than September 1st 2014 is probably a small percentage of the table.
    I can't explain why the optimizer or engine goes awry, because the subquery's result is a scalar and shouldn't cause such a long runtime. If you are unlucky, the optimizer expanded the query and actually joins the two tables. That would make the indexes on table dateTable relevant, as well as the distribution and cardinality of dateTable's row values. If you want to know, you would have to inspect the (actual) query plan.
    In general, I don't think your approach is a smart thing to do. I don't know why you want to have your selection date in a table (as opposed to a parameter of a stored procedure), but if you want to stick to it, maybe you should break the query up into something like this. The optimizer would still have to use heuristics (instead of more reliable estimates), but some unintended consequences could disappear.
    Declare @min_date datetime
    Set @min_date = (SELECT date FROM dateTable)
    SELECT ...
    FROM ...
    WHERE SaleDate > @min_date
    If you use a parameter (or an appropriate query hint), you will probably get performance close to your first case.
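    One such hint, shown here as an illustration rather than the definitive fix, is OPTION (RECOMPILE), which makes the optimizer compile the statement with the variable's actual runtime value, much like your CASE1:
    Declare @min_date datetime
    Set @min_date = (SELECT date FROM dateTable)
    SELECT ...
    FROM ...
    WHERE SaleDate > @min_date
    OPTION (RECOMPILE)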
    Gert-Jan

  • Issues loading data in table component after deploying to Tomcat 5.5.28

    Hi
    I exported a WAR file from Sun Java Studio Creator 2 Update 1 into Tomcat 5.5.28 and also set up JNDI to point to the data source properly, but on displaying a table component with the data loaded from Creator I'm getting the following error:
    type Exception report
    message
    description The server encountered an internal error () that prevented it from fulfilling this request.
    exception
    javax.servlet.ServletException: Servlet execution threw an exception
         com.sun.rave.web.ui.util.UploadFilter.doFilter(UploadFilter.java:194)
    root cause
    java.lang.AbstractMethodError: oracle.jdbc.driver.OracleDatabaseMetaData.locatorsUpdateCopy()Z
         com.sun.sql.rowset.CachedRowSetXImpl.execute(CachedRowSetXImpl.java:972)
         com.sun.sql.rowset.CachedRowSetXImpl.execute(CachedRowSetXImpl.java:1410)
         com.sun.data.provider.impl.CachedRowSetDataProvider.checkExecute(CachedRowSetDataProvider.java:1219)
         com.sun.data.provider.impl.CachedRowSetDataProvider.absolute(CachedRowSetDataProvider.java:283)
         com.sun.data.provider.impl.CachedRowSetDataProvider.getRowKeys(CachedRowSetDataProvider.java:232)
         com.sun.data.provider.impl.CachedRowSetDataProvider.cursorFirst(CachedRowSetDataProvider.java:351)
         com.sun.data.provider.impl.CachedRowSetDataProvider.setCachedRowSet(CachedRowSetDataProvider.java:182)
         com.sun.data.provider.impl.CachedRowSetDataProvider.close(CachedRowSetDataProvider.java:209)
         epnl_idbadge.managers_browse_screen.destroy(managers_browse_screen.java:380)
         com.sun.rave.web.ui.appbase.faces.ViewHandlerImpl.destroy(ViewHandlerImpl.java:580)
         com.sun.rave.web.ui.appbase.faces.ViewHandlerImpl.renderView(ViewHandlerImpl.java:316)
         com.sun.faces.lifecycle.RenderResponsePhase.execute(RenderResponsePhase.java:87)
         com.sun.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:221)
         com.sun.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:117)
         javax.faces.webapp.FacesServlet.service(FacesServlet.java:198)
         com.sun.rave.web.ui.util.UploadFilter.doFilter(UploadFilter.java:194)
    note The full stack trace of the root cause is available in the Apache Tomcat/5.0.28 logs.
    Please, can anyone show me the way out?
    Thanks in advance

    Thanks for your response. I have the following drivers in my Tomcat\common\lib:
    ojdbc14.jar
    ojdbc14_g.jar
    ojdbc14dms.jar
    ojdbc14dms_g.jar
    orai18n.jar
    Please check if there is anything I need to do to make it work right.
    Thanks

  • Performance issue with using out parameter sys_refcursor

    Hello,
    I'm using Oracle 10g, with ODP.Net in a C# application.
    I'm using about 10 stored procedures, each having one OUT parameter of SYS_REFCURSOR type. When I use one function in C# to call these 10 SPs, it takes about 78 ms to execute the function.
    Now, to improve the performance, I created one SP with 10 output parameters of SYS_REFCURSOR type, and I just call this one SP. The time taken has increased; now it takes about 95 ms.
    Is this the right approach, or how can we improve the performance when using SYS_REFCURSOR?
    Please suggest; it is urgent, I'm stuck with this issue.
    thanks
    shruti

    With 78 ms and 95 ms, are you talking about milliseconds or minutes? If it's milliseconds, then what's the problem? Does it really matter if there is a difference of 17 milliseconds, as that could just be caused by network traffic or something? If you're talking minutes, then we would need more information about what you are attempting to do, what tables and processing are involved, what indexes you have, etc.
    Query optimisation tips can be found in this thread: When your query takes too long ....
    Without more information we can't really tell what's happening.

  • Performance issues. Took out EVENT.ADDED_TO_STAGE, Slower!?

    I don't understand. I went through all my code doing the following:
    init() {
         this.addEventListener(Event.ADDED_TO_STAGE, stageReady);
         this.addEventListener("creationComplete", creationComplete);
    }
    public function creationComplete(event:Event=null):void {
         this.removeEventListener("creationComplete", creationComplete);
    }
    public function stageReady(event:Event=null):void {
         this.removeEventListener(Event.ADDED_TO_STAGE, stageReady);
    }
    Now my app runs super slow! Even slower than before. This isn't making any sense... I read this was supposed to help free up memory!!

    I have gone in and checked the box next to "Perform Grouping on Server". The check box next to "Use Indexes or Server for Speed" was already checked. The links for the subreport are set up as follows.
    For subreport: Supported vs. Unsupported (This one comes from the ComputerSystem table)
    Container Report fields to link to: AST_AssetPeople.Asset_ID
    AST_AssetPeople.Asset_ID field link:
    Subreport parameter field to use: ?Pm-AST_AssetPeople.Asset_ID_
    Select data in subreport based on field: AST_ComputerSystem.Asset_ID_
    Asset ID is the only field that I found the two tables have in common. I believe that I have the links set up correctly, but I could be wrong.
    Either way, I am still having the same problems.

  • Performance issue loading 4000 records from XML

    Hello, I'm trying to load into a table, with the SQL statements below, records from an XML file with content of this type:
    <?xml version="1.0" encoding="UTF-8"?>
    <custom-objects xmlns="http://www.mysite.com/xml/impex/customobject/2006-10-31">
        <custom-object type-id="NEWSLETTER_SUBSCRIBER" object-id="[email protected]">
      <object-attribute attribute-id="customer-no"><value>BLY00000001</value></object-attribute>
      <object-attribute attribute-id="customer_type"><value>registered</value></object-attribute>
            <object-attribute attribute-id="title"><value>Mr.</value></object-attribute>
            <object-attribute attribute-id="first_name"><value>Jean paul</value></object-attribute>
            <object-attribute attribute-id="is_subscribed"><value>true</value></object-attribute>
            <object-attribute attribute-id="last_name"><value>Pennati Swiss</value></object-attribute>
            <object-attribute attribute-id="address_line_1"><value>newsletter ADDRESS LINE 1 data</value></object-attribute>
            <object-attribute attribute-id="address_line_2"><value>newsletter ADDRESS LINE 2 data</value></object-attribute>
            <object-attribute attribute-id="address_line_3"><value>newsletter ADDRESS LINE 3 data</value></object-attribute>
            <object-attribute attribute-id="housenumber"><value>newsletter HOUSENUMBER data</value></object-attribute>
            <object-attribute attribute-id="city"><value>newsletter DD</value></object-attribute>
            <object-attribute attribute-id="post_code"><value>6987</value></object-attribute>
            <object-attribute attribute-id="state"><value>ASD</value></object-attribute>
            <object-attribute attribute-id="country"><value>ES</value></object-attribute>
            <object-attribute attribute-id="phone_home"><value>0044 1234567 newsletter phone_home</value></object-attribute>
            <object-attribute attribute-id="preferred_locale"><value>fr_CH</value></object-attribute>
            <object-attribute attribute-id="exported"><value>true</value></object-attribute>
            <object-attribute attribute-id="profiling"><value>true</value></object-attribute>
            <object-attribute attribute-id="promotions"><value>true</value></object-attribute>
            <object-attribute attribute-id="source"><value>https://www.mysite.com</value></object-attribute>
            <object-attribute attribute-id="source_ip"><value>10.10.1.1</value></object-attribute>
            <object-attribute attribute-id="pr_product_serial_number"><value>000123345678 product serial no.</value></object-attribute>
            <object-attribute attribute-id="pr_purchased_from"><value>Store where product to be registered was purchased</value></object-attribute>
            <object-attribute attribute-id="pr_date_of_purchase"><value></value></object-attribute>
            <object-attribute attribute-id="locale"><value>fr_CH</value></object-attribute> 
        </custom-object>
        <custom-object type-id="NEWSLETTER_SUBSCRIBER" object-id="[email protected]">
       <object-attribute attribute-id="customer-no"><value></value></object-attribute>
       <object-attribute attribute-id="customer_type"><value>unregistered</value></object-attribute>
            <object-attribute attribute-id="title"><value>Mr.</value></object-attribute>
            <object-attribute attribute-id="first_name"><value>Jean paul</value></object-attribute>
            <object-attribute attribute-id="is_subscribed"><value>true</value></object-attribute>
            <object-attribute attribute-id="last_name"><value>Pennati Swiss</value></object-attribute>
            <object-attribute attribute-id="address_line_1"><value>newsletter ADDRESS LINE 1 data</value></object-attribute>
            <object-attribute attribute-id="address_line_2"><value>newsletter ADDRESS LINE 2 data</value></object-attribute>
            <object-attribute attribute-id="address_line_3"><value>newsletter ADDRESS LINE 3 data</value></object-attribute>
            <object-attribute attribute-id="housenumber"><value>newsletter HOUSENUMBER data</value></object-attribute>
            <object-attribute attribute-id="city"><value>newsletter CASLANO</value></object-attribute>
            <object-attribute attribute-id="post_code"><value>6987</value></object-attribute>
            <object-attribute attribute-id="state"><value>TICINO</value></object-attribute>
            <object-attribute attribute-id="country"><value>CH</value></object-attribute>
            <object-attribute attribute-id="phone_home"><value>0044 1234567 newsletter phone_home</value></object-attribute>
            <object-attribute attribute-id="preferred_locale"><value>fr_CH</value></object-attribute>
            <object-attribute attribute-id="exported"><value>true</value></object-attribute>
            <object-attribute attribute-id="profiling"><value>true</value></object-attribute>
            <object-attribute attribute-id="promotions"><value>true</value></object-attribute>
            <object-attribute attribute-id="source"><value>https://www.mysite.com</value></object-attribute>
            <object-attribute attribute-id="source_ip"><value>85.219.17.170</value></object-attribute>
            <object-attribute attribute-id="pr_product_serial_number"><value>000123345678 product serial no.</value></object-attribute>
            <object-attribute attribute-id="pr_purchased_from"><value>Store where product to be registered was purchased</value></object-attribute>
            <object-attribute attribute-id="pr_date_of_purchase"><value></value></object-attribute>
            <object-attribute attribute-id="locale"><value>fr_CH</value></object-attribute> 
        </custom-object>
    </custom-objects>
    I use the following sequence of queries to do the inserts (XML_FILE is passed to the procedure as XMLType):
    INSERT INTO DW_CUSTOMER.NEWSLETTERS (
       BRANDID,
       CUSTOMER_EMAIL,
   DW_WEBSITE_TAG
    )
    Select
    p_brandid as BRANDID,
    CUSTOMER_EMAIL,
    p_website
    FROM
    (select XML_FILE from dual) p,
    XMLTable(
    xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31'),
    '/custom-objects/custom-object' PASSING p.XML_FILE
    COLUMNS
    customer_email PATH '@object-id'
    ) CUSTOMER_LEVEL1;
    INSERT INTO DW_CUSTOMER.NEWSLETTERS_C_ATT (
       BRANDID, 
       CUSTOMER_EMAIL,
       CUSTOMER_NO, 
       CUSTOMER_TYPE,
       TITLE,
       FIRST_NAME,
       LAST_NAME,
       PHONE_HOME,
       BIRTHDAY,
       ADDRESS1,
       ADDRESS2,
       ADDRESS3,
       HOUSENUMBER,
       CITY,
       POSTAL_CODE,
       STATE,
       COUNTRY,
       IS_SUBSCRIBED,
       PREFERRED_LOCALE,
       PROFILING,
       PROMOTIONS,
       EXPORTED,
       SOURCE,
       SOURCE_IP,
       PR_PRODUCT_SERIAL_NO,
       PR_PURCHASED_FROM,
       PR_PURCHASE_DATE,
       LOCALE,
       DW_WEBSITE_TAG)
    with mainq as (
            SELECT
            CUST_LEVEL1.customer_email as CUSTOMER_EMAIL,
            CUST_LEVEL2.*
            FROM
            (select XML_FILE from dual) p,
            XMLTable(
            xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31'),
            '/custom-objects/custom-object' PASSING p.XML_FILE
            COLUMNS
            customer_email PATH '@object-id',
            NEWSLETTERS_C_ATT XMLType PATH 'object-attribute'
            ) CUST_LEVEL1,
            XMLTable(
            xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31'),
            '/object-attribute' PASSING CUST_LEVEL1.NEWSLETTERS_C_ATT
            COLUMNS
            attribute_id PATH '@attribute-id',
            thevalue PATH 'value'
        ) CUST_LEVEL2
        )
        select
        p_brandid
        ,customer_email
    ,nvl(max(decode(attribute_id,'customer-no',thevalue)),SET_NEWSL_CUST_ID) customer_no
        ,max(decode(attribute_id,'customer_type',thevalue)) customer_type
        ,max(decode(attribute_id,'title',thevalue)) title
        ,substr(max(decode(attribute_id,'first_name',thevalue)) ,1,64)first_name
        ,substr(max(decode(attribute_id,'last_name',thevalue)) ,1,64) last_name
    ,substr(max(decode(attribute_id,'phone_home',thevalue)) ,1,64) phone_home
        ,max(decode(attribute_id,'birthday',thevalue)) birthday
    ,substr(max(decode(attribute_id,'address_line_1',thevalue)) ,1,100) address_line_1
    ,substr(max(decode(attribute_id,'address_line_2',thevalue)) ,1,100) address_line_2
    ,substr(max(decode(attribute_id,'address_line_3',thevalue)) ,1,100) address_line_3
        ,substr(max(decode(attribute_id,'housenumber',thevalue)) ,1,64) housenumber
        ,substr(max(decode(attribute_id,'city',thevalue)) ,1,128) city
        ,substr(max(decode(attribute_id,'post_code',thevalue)) ,1,64) postal_code
        ,substr(max(decode(attribute_id,'state',thevalue)),1,256) state
        ,substr(max(decode(attribute_id,'country',thevalue)),1,32) country
        ,max(decode(attribute_id,'is_subscribed',thevalue)) is_subscribed
        ,max(decode(attribute_id,'preferred_locale',thevalue)) preferred_locale
        ,max(decode(attribute_id,'profiling',thevalue)) profiling
        ,max(decode(attribute_id,'promotions',thevalue)) promotions
        ,max(decode(attribute_id,'exported',thevalue)) exported   
        ,substr(max(decode(attribute_id,'source',thevalue)),1,256) source   
        ,max(decode(attribute_id,'source_ip',thevalue)) source_ip       
        ,substr(max(decode(attribute_id,'pr_product_serial_number',thevalue)),1,64) pr_product_serial_number
        ,substr(max(decode(attribute_id,'pr_purchased_from',thevalue)),1,64) pr_purchased_from   
        ,substr(max(decode(attribute_id,'pr_date_of_purchase',thevalue)),1,32) pr_date_of_purchase
        ,max(decode(attribute_id,'locale',thevalue)) locale
        ,p_website   
        from
        mainq
        group by customer_email, p_website
    I CANNOT MANAGE TO INSERT 4000 records in less than 30 minutes!
    Can you help, or advise how to reduce this to a reasonable time?
    Thanks

    Simplified example on a few attributes :
    -- INSERT INTO tmp_xml VALUES ( xml_file );
    INSERT ALL
      INTO newsletters (brandid, customer_email, dw_website_tag)
      VALUES (p_brandid, customer_email, p_website)
      INTO newsletters_c_att (brandid, customer_email, customer_no, customer_type, title, first_name, last_name)
      VALUES (p_brandid, customer_email, customer_no, customer_type, title, first_name, last_name)
    SELECT o.*
    FROM tmp_xml t
       , XMLTable(
           xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31')
         , '/custom-objects/custom-object'
           passing t.object_value
           columns customer_email varchar2(256) path '@object-id'
                 , customer_no    varchar2(256) path 'object-attribute[@attribute-id="customer-no"]/value'
                 , customer_type  varchar2(256) path 'object-attribute[@attribute-id="customer_type"]/value'
                 , title          varchar2(256) path 'object-attribute[@attribute-id="title"]/value'
                 , first_name     varchar2(64)  path 'object-attribute[@attribute-id="first_name"]/value'
                 , last_name      varchar2(64)  path 'object-attribute[@attribute-id="last_name"]/value'
         ) o
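    The commented-out first line assumes the document is loaded once into a helper table of XMLType. A minimal sketch of that table (binary XML storage is an assumption, available from 11g; any XMLType storage will do):
    CREATE TABLE tmp_xml OF XMLType
      XMLTYPE STORE AS BINARY XML;
    Querying the document from a table, rather than passing a PL/SQL XMLType variable through dual, usually lets the optimizer rewrite the XMLTable calls into streaming XPath evaluation instead of row-by-row functional evaluation, which is typically where runtimes like your 30 minutes come from.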

  • Loading data from a flatfile versus xml

    As of now we are loading data from the AS400 by using flatfiles.
    For the sake of uniformity and flexibility we want to start using xml instead of flatfiles.
    But from what I have seen, the loading of XML seems to be a lot slower than flat files.
    I think I have to choose between two options:
    loading into XML DB - XMLType
    or
    making use of DBMS_XMLSTORE.
    I think it would be best to use the PL/SQL package, because we will not need extra storage for the XML table, and then the fastest way should be DBMS_XMLSTORE since it is written in C.
    If someone can come up with another, faster way, fine with me.
    If I want to use the package, I have the following problem:
    I know how to tell it which columns to load, but my source XML file needs to be stored in at least 2 tables.
    It was possible with wb_xml_load, but I think DBMS_XMLSTORE is the faster solution.
    Can anybody tell me how to proceed?
    A little performance degradation is OK, but we still need to load lots of data, so the faster the better.
    regards.

    Hi Priya
    There is a post on leveraging XDB with some interesting details;
    http://blogs.oracle.com/warehousebuilder/2007/09/leveraging_xdb.html
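    On your DBMS_XMLSTORE question about 2 target tables: a context maps to a single table, so one approach (a rough sketch with made-up table names, assuming you can split the document, or use a distinct row tag, per table) is simply one context per target:
    DECLARE
      ctx  DBMS_XMLSTORE.ctxType;
      cnt  NUMBER;
    BEGIN
      -- First target table; <ROW> is the assumed row element name
      ctx := DBMS_XMLSTORE.newContext('ORDERS_HDR');
      DBMS_XMLSTORE.setRowTag(ctx, 'ROW');
      cnt := DBMS_XMLSTORE.insertXML(ctx, v_hdr_doc);  -- v_hdr_doc: CLOB holding the header rows
      DBMS_XMLSTORE.closeContext(ctx);
      -- Second target table, same pattern
      ctx := DBMS_XMLSTORE.newContext('ORDERS_DTL');
      DBMS_XMLSTORE.setRowTag(ctx, 'ROW');
      cnt := DBMS_XMLSTORE.insertXML(ctx, v_dtl_doc);  -- v_dtl_doc: CLOB holding the detail rows
      DBMS_XMLSTORE.closeContext(ctx);
    END;
    /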
    Cheers
    David

  • Urgent: general ABAP performance issue

    Hi folks,
    I did some development in a new Smartform and it is working fine, but I have a performance issue: the database share of the runtime is 76%. I use code similar to the block below, with various conditions, in 12 different places. Is it possible to reduce the database load of this type of code? Please check it and let me know how I can do it, as soon as possible. Also, what % is considered good for this type of performance measure?
    DATA : BEGIN OF ITVBRPC OCCURS 0,
           LV_POSNR LIKE VBRP-POSNR,
           END OF ITVBRPC.
    DATA : BEGIN OF ITKONVC OCCURS 0,
            LV_KNUMH LIKE KONV-KNUMH,
            LV_KSCHL LIKE KONV-KSCHL,
           END OF ITKONVC.
    DATA:  BEGIN OF ITKONHC OCCURS 0,
           LV_KNUMH LIKE KONH-KNUMH,
           LV_KSCHL LIKE KONH-KSCHL,
           LV_KZUST LIKE KONH-KZUST,
           END OF ITKONHC.
    DATA: BEGIN OF ITKONVC1 OCCURS 0,
           LV_KWERT LIKE KONV-KWERT,
           END OF ITKONVC1.
    DATA :  BEGIN OF ITCALCC OCCURS 0,
           LV_KWERT LIKE KONV-KWERT,
           END OF ITCALCC.
    DATA: COUNTC(3) TYPE n,
           TOTALC LIKE KONV-KWERT.
    SELECT POSNR FROM VBRP INTO ITVBRPC
      WHERE VBELN = INV_HEADER-VBELN AND ARKTX = WA_INVDATA-ARKTX .
    APPEND ITVBRPC.
    ENDSELECT.
    LOOP AT ITVBRPC.
    SELECT KNUMH KSCHL FROM KONV INTO ITKONVC WHERE KNUMV =
    LV_VBRK-KNUMV AND KPOSN = ITVBRPC-LV_POSNR AND KSCHL = 'ZLAC'.
    APPEND ITKONVC.
    ENDSELECT.
    ENDLOOP.
    SORT ITKONVC BY LV_KNUMH.
    DELETE ADJACENT DUPLICATES FROM ITKONVC.
    LOOP AT ITKONVC.
    SELECT KNUMH KSCHL KZUST FROM KONH INTO ITKONHC WHERE KNUMH = ITKONVC-LV_KNUMH AND KSCHL = 'ZLAC' AND KZUST = 'Z02'.
    APPEND ITKONHC.
    ENDSELECT.
    ENDLOOP.
    LOOP AT ITKONHC.
    SELECT KWERT FROM KONV INTO ITKONVC1 WHERE KNUMH = ITKONHC-LV_KNUMH AND
    KSCHL = ITKONHC-LV_KSCHL AND KNUMV = LV_VBRK-KNUMV.
    MOVE ITKONVC1-LV_KWERT TO ITCALCC-LV_KWERT.
    APPEND ITCALCC.
    ENDSELECT.
    endloop.
    LOOP AT ITCALCC.
    COUNTC = COUNTC + 1.
    TOTALC = TOTALC + ITCALCC-LV_KWERT.
      ENDLOOP.
    MOVE ITKONHC-LV_KSCHL TO LV_CKSCHL.
    MOVE TOTALC TO LV_CKWERT.
    it's urgent ..........
    thanks .
    bbbbye
    suresh

    You need to use FOR ALL ENTRIES instead of a SELECT inside the loop.
    Try this:
    DATA : BEGIN OF ITVBRPC OCCURS 0,
    VBELN LIKE VBRP-VBELN,
    LV_POSNR LIKE VBRP-POSNR,
    END OF ITVBRPC.
    DATA: IT_VBRPC_TMP like ITVBRPC occurs 0 with header line.
    DATA : BEGIN OF ITKONVC OCCURS 0,
    LV_KNUMH LIKE KONV-KNUMH,
    LV_KSCHL LIKE KONV-KSCHL,
    END OF ITKONVC.
    DATA: BEGIN OF ITKONHC OCCURS 0,
    LV_KNUMH LIKE KONH-KNUMH,
    LV_KSCHL LIKE KONH-KSCHL,
    LV_KZUST LIKE KONH-KZUST,
    END OF ITKONHC.
    DATA: BEGIN OF ITKONVC1 OCCURS 0,
    KNUMH LIKE KONV-KNUMH,
    KSCHL LIKE KONV-KSCHL,
    LV_KWERT LIKE KONV-KWERT,
    END OF ITKONVC1.
    DATA : BEGIN OF ITCALCC OCCURS 0,
    LV_KWERT LIKE KONV-KWERT,
    END OF ITCALCC.
    DATA: COUNTC(3) TYPE n,
    TOTALC LIKE KONV-KWERT.
    *SELECT POSNR FROM VBRP INTO ITVBRPC
    *WHERE VBELN = INV_HEADER-VBELN AND ARKTX = WA_INVDATA-ARKTX .
    *APPEND ITVBRPC.
    *ENDSELECT.
    SELECT VBELN POSNR FROM VBRP INTO TABLE ITVBRPC
    WHERE VBELN = INV_HEADER-VBELN AND
                     ARKTX = WA_INVDATA-ARKTX .
    If sy-subrc eq 0.
      IT_VBRPC_TMP[] = ITVBRPC[].
      Sort IT_VBRPC_TMP by vbeln posnr.
      Delete adjacent duplicates from IT_VBRPC_TMP comparing vbeln posnr.
    SELECT KNUMH KSCHL FROM KONV
                   INTO TABLE ITKONVC
                   FOR ALL ENTRIES IN IT_VBRPC_TMP
                   WHERE KNUMV = LV_VBRK-KNUMV AND
                   KPOSN = IT_VBRPC_TMP-LV_POSNR AND
                    KSCHL = 'ZLAC'.
    if sy-subrc eq 0.
       SORT ITKONVC BY LV_KNUMH.
        DELETE ADJACENT DUPLICATES FROM ITKONVC COMPARING LV_KNUMH.
       SELECT KNUMH KSCHL KZUST FROM KONH
                 INTO TABLE ITKONHC
                 FOR ALL ENTRIES IN ITKONVC
                 WHERE KNUMH = ITKONVC-LV_KNUMH AND
                               KSCHL = 'ZLAC' AND
                               KZUST = 'Z02'.
       if sy-subrc eq 0.
    SELECT KNUMH KSCHL KWERT FROM KONV
                   INTO TABLE ITKONVC1
                   FOR ALL ENTRIES IN ITKONHC
                   WHERE KNUMH = ITKONHC-LV_KNUMH AND
                                  KSCHL = ITKONHC-LV_KSCHL AND
                                   KNUMV = LV_VBRK-KNUMV.
        Endif.
    Endif.
    Endif.
    *LOOP AT ITVBRPC.
    *SELECT KNUMH KSCHL FROM KONV INTO ITKONVC WHERE KNUMV =
    *LV_VBRK-KNUMV AND KPOSN = ITVBRPC-LV_POSNR AND KSCHL = 'ZLAC'.
    *APPEND ITKONVC.
    *ENDSELECT.
    *ENDLOOP.
    *SORT ITKONVC BY LV_KNUMH.
    *DELETE ADJACENT DUPLICATES FROM ITKONVC.
    *LOOP AT ITKONVC.
    *SELECT KNUMH KSCHL KZUST FROM KONH INTO ITKONHC WHERE KNUMH = ITKONVC-LV_KNUMH AND KSCHL = 'ZLAC' AND KZUST = 'Z02'.
    *APPEND ITKONHC.
    *ENDSELECT.
    *ENDLOOP.
    *LOOP AT ITKONHC.
    *SELECT KWERT FROM KONV INTO ITKONVC1 WHERE KNUMH = ITKONHC-LV_KNUMH *AND
    *KSCHL = ITKONHC-LV_KSCHL AND KNUMV = LV_VBRK-KNUMV.
    *MOVE ITKONVC1-LV_KWERT TO ITCALCC-LV_KWERT.
    *APPEND ITCALCC.
    *ENDSELECT.
    *endloop.
    LOOP AT ITKONVC1.
    COUNTC = COUNTC + 1.
    TOTALC = TOTALC + ITKONVC1-LV_KWERT.
    ENDLOOP.
    MOVE ITKONHC-LV_KSCHL TO LV_CKSCHL.
    MOVE TOTALC TO LV_CKWERT.

  • How to invoke the "loading data" javascript

    Hi
    I would like to be able to reuse the piece of code which displays the 'loading data' meter in the Flash charts.
    I would like to enable the same 'loading data' behaviour for SQL reports which take some seconds to execute.
    I guess it is a piece of JavaScript? But I'm not able to locate it.
    Any ideas?
    Best regards, Jesper

    Hi Guido
    Thanks - but not really.
    My application makes heavy use of the new excellent Flash charts and, besides that, a lot of SQL reports.
    I want the 'loading data' meter to appear exactly the same for both the Flash charts and the SQL reports.
    I know how to make a meter, but the one from the Flash charts looks very nice, and I want to apply that to the SQL reports too.
    Best regards
    Jesper

  • Trouble loading data

    I am having an issue loading data with Toad 9.7 into an Oracle 10g database. The files I am loading are around 3.5 MB apiece. For some reason Toad is having trouble with these files: when I load the first one, it will skip the second and the third, then crash Toad. If I try to load it as one big file, it will just skip the file altogether and not load anything. Here is the script file I am using to do it:
    execute support_tools.set_constraints('AIRSRAQS','PARAMETER_QUALIFIERS','ENABLED');
    execute support_tools.set_triggers('AIRSRAQS','PARAMETER_QUALIFIERS','ENABLED');
    execute support_tools.disable_constraints;
    execute support_tools.disable_triggers;
    TRUNCATE TABLE parameter_qualifiers;
    @@table_parameter_qualifiers_1.sql;
    @@table_parameter_qualifiers_2.sql;
    @@table_parameter_qualifiers_3.sql;
    execute support_tools.enable_constraints;
    execute support_tools.enable_triggers;
    The reason I have it split into 3 files is that when I load it as one big file it just skips it and completes, loading no data. Using this method it will load the first one, but then it goes to load the second, stalls, skips it, and crashes. =\

    Something that I discovered while testing this is that Toad will not run the script properly but SQL*Plus will. Just a bit of information for someone who may be relatively new to SQL like myself.
