OWB 10g - The time taken for data load is too high

I am loading data on the test data warehouse server, and the load times are very high. The data is around 7 GB (the size of the flat files on the OS).
On the production server, loading the same amount of data from the staging area to the presentation area (the data warehouse) takes at most about 8 hours.
In the test environment, however, a single mapping (containing 300,000 records) takes 8 hours by itself.
Both the test and production servers run the same database version, Oracle 9i.
The production server configuration is: 4 Pentium III processors (2.7 GHz each), 2 GB RAM, Windows 2000 Advanced Server, 8 KB primary cache, 512 KB secondary cache, 440.05 GB usable hard drive capacity, 73.06 GB hard drive free space.
The test server configuration is: 4 Pentium III processors (2.4 GHz each), 1 GB RAM, Windows 2000 Advanced Server, 8 KB primary cache, 512 KB secondary cache, 144.96 GB usable hard drive capacity, 5.22 GB hard drive free space.
Can you please help me identify the possible causes of this erratic behaviour of the OWB 10g tool?
Thanks & Best Regards,
Harshad Borgaonkar
PwC

Hello Harshad,
2 GB of RAM doesn't seem like very much to me, and your test server only has 1 GB. My guess is that your bottleneck is I/O; you should investigate this (keep an eye on long-running processes). You also haven't said much about your target database design. Do you have a lot of indexes on the target tables, and if so, have you tried dropping them before loading? Do your OWB mappings require a lot of lookups (if so, appropriate indexes on the lookup tables are very useful)? Do you use external tables? Are you loading dimension tables, fact tables, or both? Please supply some more information so that we can help you better.
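As an illustration of the index idea, here is a minimal JDBC sketch of dropping an index before a bulk load and rebuilding it afterwards. The connection details, table, and index names are made up, and the INSERT ... SELECT merely stands in for your OWB mapping:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class BulkLoadIndexToggle {
        public static void main(String[] args) throws SQLException {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@testserver:1521:dwh", "dwh_user", "secret");
                 Statement stmt = con.createStatement()) {
                con.setAutoCommit(false);
                // 1. Drop the index so the load does not maintain it row by row.
                stmt.execute("DROP INDEX sales_fact_cust_idx");
                // 2. Run the load (INSERT ... SELECT as a stand-in for the mapping).
                stmt.execute("INSERT /*+ APPEND */ INTO sales_fact SELECT * FROM stg_sales");
                con.commit();
                // 3. Rebuild the index once, after all rows are in place.
                stmt.execute("CREATE INDEX sales_fact_cust_idx ON sales_fact (customer_id)");
            }
        }
    }

Rebuilding the index once at the end is almost always cheaper than maintaining it for every inserted row.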
Regards,
Jörg

Similar Messages

  • Essbase 9.3.1, more time taken for data load.

    Hi,
    I am trying to load 15 GB of data (taken from two separate Oracle databases) directly into my ASO application.
    The load is very slow. What factors should I consider to make the load faster?
    Will increasing the RAM size help in this context?
    I have gone through the admin documentation for Essbase 9.3.1; it gives the hard disk and RAM requirements, but only for block storage.
    Is there any difference between that estimation and the ASO estimation?
    If anyone can guide me on what has to be taken care of while loading a cube with this much data,
    I shall be very thankful.

    The statements that matter for aggregate storage are:
    DLSINGLETHREADPERSTAGE FALSE
    DLTHREADSPREPARE Sample Basic 3
    The write option (DLTHREADSWRITE Sample Basic 4) has no impact on the data write speed in ASO.
    These statements have to be included in essbase.cfg, the configuration file of the Essbase server.
    As per the documentation, this should be done through MaxL, ESSCMD, or the Analytic Services console.
    Questions:
    1) Should we simply put these statements in essbase.cfg without a semicolon and restart the server/application?
    2) Can these statements be executed through MaxL? If so, please let us know how.
    3) How do we know the current number of threads being used for read/write?
    Thanks

  • How to analyse the time taken for a query

    Hey gurus,
    How do I find the time taken for a query to execute?
    Regards,
    Venkatesh

    Hi,
    Time taken to execute a query = front-end time + OLAP time + DB time.
    Front-end time is the time taken for formatting in BEx.
    OLAP time is the time taken to aggregate data in the OLAP buffer.
    DB time is the time taken to collect data from the data target.
    To find all of this information,
    go to RSRT -> enter the query name -> Execute + Debug -> it will display all the fields -> check whichever fields you want.
    Regards,
    Haritha.

  • How to find the time taken for creating execution plan alone

    Hi,
    Is there any way to find out the split between the time taken for query parsing, creating the execution plan, and actual data retrieval separately? If I enable 'set timing on' I see the elapsed time, which is the total of all three.
    Some of my queries take a long time the first time I run them, so I want to know where the time goes: is it parsing the query or creating the execution plan? (Since my queries run faster the second time, I assume the major part was parsing or creating the plan, not the data retrieval.)
    Also, where does Oracle keep the execution plan? Is it in the buffer cache? I tried flushing the buffer cache and restarting the database, but the query execution still seems faster compared to the first time. How long does Oracle keep the execution plan in the cache, and is there any way to increase this cache size?
    Thanks in advance!

    user13169027 wrote:
    Hi,
    Is there any way to find out the split between the time taken for query parsing, creating the execution plan, and actual data retrieval separately? If I enable 'set timing on' I see the elapsed time, which is the total of all three.
    Some of my queries take a long time the first time I run them, so I want to know where the time goes: is it parsing the query or creating the execution plan? (Since my queries run faster the second time, I assume the major part was parsing or creating the plan, not the data retrieval.)
    The ideal way to answer these questions is to run a SQL trace of your query executions. To see the difference in the trace files, trace the first execution in one session and the second execution in another session; that gives you two separate trace files, which you can run through tkprof and investigate individually.
    Also, where does Oracle keep the execution plan? Is it in the buffer cache? I tried flushing the buffer cache and restarting the database, but the query execution still seems faster compared to the first time. How long does Oracle keep the execution plan in the cache, and is there any way to increase this cache size?
    Execution plans are held in the shared pool, not the buffer cache. As far as I know they are kept in memory on an LRU (least recently used) basis, just like database blocks in the buffer pool (I know this is not entirely correct, but for all practical purposes you can think of it this way).
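    To make the tracing suggestion concrete, here is a minimal JDBC sketch; the connection details and the query are placeholders, and ALTER SESSION SET sql_trace traces the current session only:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.sql.Statement;

        public class SqlTraceDemo {
            public static void main(String[] args) throws SQLException {
                try (Connection con = DriverManager.getConnection(
                        "jdbc:oracle:thin:@dbhost:1521:orcl", "scott", "tiger");
                     Statement stmt = con.createStatement()) {
                    // Tag the trace file so it is easy to find in user_dump_dest.
                    stmt.execute("ALTER SESSION SET tracefile_identifier = 'first_run'");
                    // Enable SQL trace for this session only.
                    stmt.execute("ALTER SESSION SET sql_trace = TRUE");
                    try (ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM my_table")) {
                        while (rs.next()) {
                            System.out.println(rs.getLong(1));
                        }
                    }
                    stmt.execute("ALTER SESSION SET sql_trace = FALSE");
                    // Run tkprof on the generated trace file, then repeat the query
                    // in a second session and compare the parse/execute/fetch times.
                }
            }
        }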

  • PO Lead Time cannot capture the time taken for shipping!

    Dear All
    I understand that PO lead time = PO processing time (working days) + planned delivery time (calendar days) + GR processing time (working days).
    This PO lead time is added on top of the PO creation date to defer the actual goods availability date.
    My question:
    1. Planned delivery time is the time the vendor takes to send the goods to my warehouse. What about an overseas purchase, where goods leaving the vendor's port first arrive at my country's customs, take 3 days to clear, and are then delivered by the forwarding agent from customs to my warehouse? In this case, how do I capture the planned delivery time in the SAP system, given that it now consists of four periods:
    a. Time taken from the vendor's port to reach my country's port
    b. Time taken for my country's customs to do the clearing
    c. Time taken for the forwarding agent to fetch the goods from customs to my warehouse
    d. Time taken to unpack, take out, count, inspect and put to use (GR processing time)
    Do I need to use a user exit?
    Thanks
    Edited by: Daimos on Apr 27, 2009 6:52 PM

    Dear dogboy,
    I think we must use the confirmation control (CC) key feature at PO item level:
    ED - Estimated time of departure from overseas port.
    EA - Actual time of departure from overseas port.
    EA - Estimated time of arrival.
    AA - Actual time of arrival.
    The purchaser maintains the value of each CC key whenever the vendor notifies them.
    We also need to come up with a customised report to capture the CC dates entered, so that finance can prepare money in advance the moment the EA (the estimated date of arrival at customs) is maintained.
    But the problem is that the PO user exit only covers the header of the confirmation control key and does not capture the DATE field entered for each CC.
    That was the problem I last encountered.

  • How to set the time interval for data obtained through LabVIEW via Write Measurement

    I need to write a program to obtain data from an oscilloscope. Currently my laptop is able to communicate with an Agilent DSO6014A oscilloscope via USB. I include Write Measurement in my program and save the data to a .lvm file. But the time intervals in the data are too small, on the order of microseconds. Is there a way to modify the interval values?

    The attached file is a capture of my diagram. Thanks.
    Attachments:
    Capture.PNG ‏56 KB

  • SD Data Load (Deliveries Too High In Number)

    Hi All,
    I am loading data for application components 11, 12 and 13.
    Filling the setup table for application component 12 is taking a very long time,
    so I am thinking of taking an alternative path.
    Say the company has the following sales organizations: 1100, 1200, 1300.
    Steps:
    1. Clear the init from RSA7, delete data from the setup table in LBWG, clear the queue in LBWQ.
    2. Go to transaction OLI8BW (Deliveries) and enter Sales Org. 1100.
    3. Trigger an init with data after the setup table is filled for sales org 1100.
    4. Fill the setup table again for the other sales orgs 1200 and 1300 and trigger a repair full request for each of them.
    Please let me know if the steps above are correct; if not, please mention the correct steps.
    Thanks.
    Regards
    Madhusudan

    Hi,
    Assuming you are setting up the LO data load flow for application 12 for the first time,
    your DataSource won't yet exist in RSA7/LBWQ.
    First of all, lock the source system and the related transaction codes.
    1. Delete (LBWG) and fill (OLI8BW) the setup tables as you need:
           a. selection on sales org 1100.
    2. Trigger an info package with data, so the sales org 1100 data is moved to the PSA and the delta pointer is set for your DataSource; the DataSource becomes visible in RSA7. You can then load the PSA data to further data targets.
    3. Fill the setup tables for sales orgs 1200 and 1300.
    4. Run a repair full request with selections on sales orgs 1200 and 1300. Load the same data from the PSA to further targets.
    5. Select the V3 update method and schedule the V3 job to move delta records from the source (SM13/LBWQ) to RSA7.
    6. Unlock the source system.
    7. Once you see records in RSA7, you can load delta records into BW using a delta info package.
    Thanks

  • How to show the processing time taken for a BPEL process in BAM report.

    Hi All,
    I have the data below in the data object. I would like the report to show the time taken for each order to complete.
    instance Id | order Id | product Name | product Code | price | status    | instance Time         | updaterName
    1360010     | ord004   | Guitar       | prod003      | 2000  | requested | 9/22/2008 12:12:11 PM | Invoke_InsertSalesOrder
    1360010     | ord004   | Guitar       | prod003      | 2000  | Approved  | 9/22/2008 12:15:11 PM | Invoke_OrderStatusUpdate
    This data comes from a simple BPEL process with sensors configured at the start and end of the process, plus a human task activity in between that creates the time difference.
    In Enterprise Link design studio, I tried to calculate the time difference using the expression calculator and store it as a calculated field. But that doesn't seem to work: when I execute the plan, the second sensor's data arrives only after human approval, while the first sensor's data sits waiting for the calculation, so ultimately nothing reaches the data object.
    How and where should the calculation be done to show the processing time in the report? Please, someone, throw some light on this.
    Regards
    Jude.
    Edited by: user600726 on Sep 30, 2008 1:30 AM

    I would suggest modifying your data object so that all the data for an order sits in a single row, and using the sensor at the end of the process to upsert (update) the row created by the sensor at the start of the process. The time difference between two fields in the same row is then an easy calculation in a BAM report; no EL plan should be needed.
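    Once both timestamps live in the same row, the duration calculation itself is trivial. A minimal Java sketch using the two sensor times from the sample rows above:

        import java.sql.Timestamp;
        import java.time.Duration;

        public class OrderDuration {
            public static void main(String[] args) {
                // The two instance times from the sample rows above.
                Timestamp requested = Timestamp.valueOf("2008-09-22 12:12:11");
                Timestamp approved  = Timestamp.valueOf("2008-09-22 12:15:11");
                // Elapsed time between order request and approval.
                Duration taken = Duration.between(requested.toInstant(), approved.toInstant());
                System.out.println("Processing time: " + taken.toMinutes() + " min");
            }
        }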

  • Time taken for a method to run?

    I have a query regarding ascertaining the time taken
    for a method to execute.
    I have a SQL statement that reads 10,000 rows:
    String a_SQL = "Select.....from TableA";
    try {
         IQuery query = m_UC.createQuery();
         SimpleTableModel stm = query.executeSelect(a_SQL);
    } catch (SQLException se) {
         // Query over.
    }
    private void displayJTable() {
         // display rows read from the query above.
    }
    As the query is executing, I want to display a progress bar
    showing the status of the query.
    Now I can't use this:
    start = System.currentTimeMillis();
    try {
         IQuery query = m_UC.createQuery();
         SimpleTableModel stm = query.executeSelect(a_SQL);
    } catch (SQLException se) {
         // handle the exception
    }
    end = System.currentTimeMillis();
    System.out.println("Time taken to display " + (end - start) / 1000);
    The above gives me the elapsed time in seconds for the SQL query to execute, but
    this is not what I want.
    What I want is to use this:
           // Progress bar running, so I need to know the time
           // taken for the query to run.
            try {
                // the SQL execution
            } catch (SQLException se) {
                // Query over.
                // Stop progress bar.
                // Display JTable.
            }
    How can I know when to stop the progress bar, given that I have no handle
    on the time taken to execute the query?
    Any help will be appreciated.

    You can have a separate thread (or maybe it's just part of the regular GUI update thread? I don't know the details of GUIs) that puts up an hourglass, spinning clock hands, a dancing hamster, or whatever, to indicate that something is going on. But as noted, you can't know ahead of time how long a query will take to run, so you can't show a percent done. You also generally don't know ahead of time how many rows will be returned, so that doesn't help you either.
    You might be able to do something for the processing of the returned data once the query completes, because at that point you can often get a count of the rows: process each row in a loop and update the percent-done counter each time through the loop.
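    To make the first idea concrete, here is a minimal, self-contained sketch using a SwingWorker and an indeterminate JProgressBar; the Thread.sleep merely stands in for the query.executeSelect(a_SQL) call from your post:

        import javax.swing.JFrame;
        import javax.swing.JProgressBar;
        import javax.swing.SwingUtilities;
        import javax.swing.SwingWorker;

        public class QueryProgressDemo {
            public static void main(String[] args) {
                SwingUtilities.invokeLater(() -> {
                    JFrame frame = new JFrame("Running query...");
                    JProgressBar bar = new JProgressBar();
                    bar.setIndeterminate(true); // duration unknown, so no percentage
                    frame.add(bar);
                    frame.pack();
                    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                    frame.setVisible(true);

                    new SwingWorker<Void, Void>() {
                        @Override
                        protected Void doInBackground() throws Exception {
                            Thread.sleep(5000); // stand-in for the real query
                            return null;
                        }

                        @Override
                        protected void done() {
                            // Query over: stop the progress bar; this is also
                            // where displayJTable() would be called.
                            bar.setIndeterminate(false);
                            bar.setValue(bar.getMaximum());
                        }
                    }.execute();
                });
            }
        }

    Because done() runs on the Event Dispatch Thread, it is safe to touch the progress bar and the JTable there.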

  • How to find time taken for a search

    Hi all,
       I need to find the time taken for a particular search in the KM Search iView. I referred to the following thread:
    /message/5739737#5739737 [original link is broken]
    but was not able to get the duration (time taken). Is there any other way to achieve this?
    All help will be appreciated.
    Regards,
    Shanthakumar.

    Hi Shanthakumar,
    Do you want to implement your own Search iView?
    Best regards,
    Denis

  • Data folder cannot be opened in Finder on an AirPort Time Capsule: "The operation can't be completed because the original item for 'data' does not exist."

    Hi,
    I have an AirPort Time Capsule (firmware 7.7.3). When I try to open the data folder in Finder, I get the message "The operation can't be completed because the original item for 'data' does not exist". I have a lot of data and I cannot understand why I get this message.
    Can anyone help? Thanks.
    Br. Bo

    Get a USB drive of 2 TB or more, assuming your TC is 2 TB. Either buy it preformatted for Mac, or plug it into your Mac and format it as standard Mac OS Extended (Journaled) in Disk Utility.
    Do a full archive of the TC using AirPort Utility. Do not click Erase Disk, just the archive; that backs up the internal disk to the USB disk. It is not fast: expect the process to run at around 40-50 GB/hr.
    Once you complete the archive, it is a direct image of the data on your TC. You can then plug the USB drive into your computer directly and try to open the files you lost. If you cannot open them, open Disk Utility and fix the permissions.
    http://osxdaily.com/2015/01/13/repair-disk-permissions-mac-os-x/
    Or try the methods Apple recommends:
    OS X Yosemite: Set permissions for items on your Mac
    It is possible to fix things on the USB drive because it is locally mounted, but you cannot fix them on the TC, which is a network drive.

  • When I change the time zone of the clock, the "Date created" time information for my documents and image files in the Finder window (and in Get Info) is changed. Can I make the time info in "Date created" remain fixed regardless of the clock's timezone?

    When I change the time zone of the clock, the "Date created" time information for my documents and image files in the Finder window (and in Get Info) is changed. Can I make the time info in "Date created" remain fixed regardless of the clock's timezone?

  • Increase the number of background work processes for data load performance

    Hi all,
    There are 10 available background work processes in the BW system. We're doing a mass load to multiple ODS objects, but the system uses only 3 background processes. How can I
    increase the number of background work processes used for a new data load?
    I tried to change the number of processes with RSODSO_SETTINGS, but without success. Are there any other settings that need to be changed?
    thanks,
    Yigit

    Hi Sankar,
    I entered the maximum process number in ROIDOCPRMS, but it doesn't make a difference; the system still uses only 3 background processes. RSCUSTA2 was replaced by
    RSODSO_SETTINGS in BI 7.0, and that transaction can only change the processes for data activation, SID generation and rollback. I need to change the number of processes for data extraction.

  • Can the time out for loading a page be extended for busy sites?

    (Error:)
    Problem loading page
    The connection has timed out
    The server at xxx.xxx is taking too long to respond.
    * The site could be temporarily unavailable or too busy. Try again in a few moments.
    Question: Is there a way to extend the time out for sites which are known to be busy?

    This issue appears under two different Mozilla questions, both of which appear to have the same outdated, ineffective answers. A lot of us are on Windows 7 now and still have erratic network speeds due to cable multiplexing, but we have been trained to leave the registry alone. Has no one solved this problem, other than switching to the notoriously unreliable FiOS?

  • Can I buy a smaller MacBook Air (64 GB) but use the Time Capsule for additional data?

    Can I buy a smaller MacBook Air (64 GB) but use the Time Capsule for additional data?

    If you plan to store important "original" or "master" files and documents on the Time Capsule, then you might want to think about a way to back those files up to another hard drive.
    Why? If the Time Capsule has a problem and you have no backups, you lose everything.

    Thanks to Forte France consulting group and Marc Sonnet (Forte France), here is one solution for our problem. We don't have test it yet but this seems pretty good. thanks for all your replies, Corinne Barbat ____________________________ Séparateur Ré