Data loading time too long

Hi there,
I have two InfoPackages, one for year 2003 and one for year 2004. For 2003 I use the interval 200301-200312 as data selection, and for 2004 I use the interval 200401-200412. Then I started to load data from R/3. For 2003 we have 4.9 million records and the load took 6 hours; for 2004 we have 5.5 million records and the load took 46 hours. The interesting thing is that when I tried to use the 2003 InfoPackage with the interval 200401-200412 as data selection to load the 2004 data, it took only 7.x hours! Why? Is something wrong with the InfoPackage for 2004? Any ideas and suggestions are greatly appreciated.
Weidong

Hi Weidong,
Check the processing type in both InfoPackages. Maybe one of them is set to "PSA and then into Data Target" and the other to "PSA and Data Target in parallel".
Bye
Dinesh

Similar Messages

  • 0PM_ORDER loading time too long

Hi All. We have about 600,000 PM orders in the system. Loading the master data attributes for these into BW/BI takes almost an hour, and there seems to be no delta load available. Most of this run time is on the ECC6 side. I cannot restrict the orders in the InfoPackage, because there are orders created a long time ago which are still active.
    Any suggestions?
    Thanks,
    Johan Loock

We have just under 600,000. After performing a full load of all orders, what we did to deal with 0PM_ORDER load times was to use the ABAP conversion exit on the InfoPackage's Data Selection tab. The DataSource contains Created On (ERDAT) and Changed On (AEDAT) dates, and we rely on those dates to filter the orders we extract.
    We load two Infopackages (full loads) each day, one that loads any Orders created in the last 7 days and another that loads any Orders changed in the last 7 days.
    ABAP conversion exit code for "created in the last 7 days" -
    data: l_low_date like sy-datum,
          l_idx      like sy-tabix.
    * Locate the existing selection row for Created On
    read table l_t_range with key
          fieldname = 'ERDAT'.
    l_idx = sy-tabix.
    * Build an inclusive BT (between) interval covering today and the 6 prior days
    l_low_date       = sy-datum - 6.
    l_t_range-sign   = 'I'.
    l_t_range-option = 'BT'.
    l_t_range-low    = l_low_date.
    l_t_range-high   = sy-datum.
    modify l_t_range index l_idx.
    p_subrc = 0.
    The Changed code is the same, but it references AEDAT.
    The load times for the Created and Changed Infopackages are about 1 - 2 minutes each.
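The exit above just computes a rolling seven-day date window. For illustration, the same logic sketched in Python (the helper name is made up, and the date-string format mirrors ABAP's internal YYYYMMDD):

```python
from datetime import date, timedelta

def seven_day_window(today: date) -> tuple[str, str]:
    """Return (low, high) bounds in YYYYMMDD, covering today and the 6 prior days."""
    low = today - timedelta(days=6)
    return low.strftime("%Y%m%d"), today.strftime("%Y%m%d")

# Equivalent to the ABAP 'BT' range: low = sy-datum - 6, high = sy-datum
low, high = seven_day_window(date(2024, 3, 10))
```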

  • Email loading times too long (Gmail, E75)

    Hi dudes, and thanks for reading.
    Email provider: Gmail
    Phone: E75
    Phone firmware: 202.12.01 (the newest)
    I'm getting pretty long waiting times when my phone loads my emails.
    I have set up an account in the phone's email application. It lists the emails in real time, but when I try to open one, the loading time gets overwhelming.
    It is not a pleasure to use the email application.
    Can I do anything about this? Another app?
    Any help would be appreciated!

    There are multiple possible reasons, too many to guess.
    First troubleshooting step: reset the network settings.

  • Loading time too long for my native application

    Hello,
    I've built an exe with adt, and I regret that there is no screen to inform the user that the application is "preparing the installation" when you execute it, as the AIR application does.
    The problem is: when I execute the exe, it seems that nothing happens for about ten seconds.
    I use this syntax to build my app :
    adt -package %OPTIONS% %SIGNING_OPTIONS% -target native [WINDOWS_INSTALLER_SIGNING_OPTIONS] %EXE_PATH%\%EXE_NAME%.exe %APP_XML% %FILE_OR_DIR%
    Does adt, or Adobe, offer a solution to this problem? (I'd like to avoid having to hack my app's installation myself or with the help of software like "Advanced Installer".)
    Thanks a lot.

    How large is the installer?  I suspect the delay is due to the self extraction taking place.
    I took a look in our internal database and it appears we had something similar entered back in 2010 that was unfortunately deferred. At this point, I'd recommend using a captive runtime and rolling your own installer with its own UI. The other alternative is to add a bug/feature request over at bugbase.adobe.com, then let the community know so that others can add comments and cast votes for it.
    Thanks,
    Chris

  • How to tune data loading time in BSO using 14 rules files ?

    Hello there,
    I'm using Hyperion-Essbase-Admin-Services v11.1.1.2 and the BSO Option.
    In a nightly process using MaxL I load new data into one Essbase cube.
    In this nightly update process, 14 account members are updated by running 14 rules files one after another.
    These rules files connect 14 times via SQL connection to the same Oracle database and the same table.
    I use this procedure because I cannot load 2 or more data fields using one rules file.
    It takes a long time to load the 14 accounts one after the other.
    Now my question: how can I minimise this data loading time?
    This is what I found on Oracle Homepage:
    What's New
    Oracle Essbase V.11.1.1 Release Highlights
    Parallel SQL Data Loads- Supports up to 8 rules files via temporary load buffers.
    In an older thread John said:
    As it is version 11, why not use parallel SQL loading? You can specify up to 8 load rules to load data in parallel.
    Example:
    import database AsoSamp.Sample data
    connect as TBC identified by 'password'
    using multiple rules_file 'rule1','rule2'
    to load_buffer_block starting with buffer_id 100
    on error write to "error.txt";
    But this is for the ASO option only.
    Can I use it in my MaxL for BSO as well? Is there a sample?
    What else can be done to tune the nightly update time?
    Thanks in advance for every tip,
    Zeljko
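Independent of whether parallel SQL loading is supported for BSO, the serial-vs-parallel trade-off being asked about can be sketched in Python; `run_load` here is a hypothetical stand-in for one rules-file load, not an Essbase API:

```python
from concurrent.futures import ThreadPoolExecutor

def run_load(rule: str) -> str:
    # Stand-in for "import database ... using rules_file <rule>"
    return f"loaded {rule}"

rules = [f"rule{i}" for i in range(1, 15)]  # the 14 rules files

# Serial: one load after another (the current nightly process)
serial_results = [run_load(r) for r in rules]

# Parallel: up to 8 loads at a time, mirroring the documented 8-rules limit
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel_results = list(pool.map(run_load, rules))
```

Wall-clock time drops roughly with the degree of parallelism, as long as the source database and the load buffers can keep up.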

    Thanks a lot for your support. I’m just a little confused.
    I will use an example to illustrate my problem a bit more clearly.
    This is the basic table, in my case a view, which is queried by all 14 rules files:
    column1 --- column2 --- column3 --- column4 --- ... ---column n
    dim 1 --- dim 2 --- dim 3 --- data1 --- data2 --- data3 --- ... --- data 14
    Region -- ID --- Product --- sales --- cogs ---- discounts --- ... --- amount
    West --- D1 --- Coffee --- 11001 --- 1,322 --- 10789 --- ... --- 548
    West --- D2 --- Tea10 --- 12011 --- 1,325 --- 10548 --- ... --- 589
    West --- S1 --- Tea10 --- 14115 --- 1,699 --- 10145 --- ... --- 852
    West --- C3 --- Tea10 --- 21053 --- 1,588 --- 10998 --- ... --- 981
    East ---- S2 --- Coffee --- 15563 --- 1,458 --- 10991 --- ... --- 876
    East ---- D1 --- Tea10 --- 15894 --- 1,664 --- 11615 --- ... --- 156
    East ---- D3 --- Coffee --- 19689 --- 1,989 --- 15615 --- ... --- 986
    East ---- C1 --- Coffee --- 18897 --- 1,988 --- 11898 --- ... --- 256
    East ---- C3 --- Tea10 --- 11699 --- 1,328 --- 12156 --- ... --- 9896
    Following 3 out of 14 (load-) rules files to load the data columns into the cube:
    Rules File1:
    dim 1 --- dim 2 --- dim 3 --- sales --- ignore --- ignore --- ... --- ignore
    Rules File2:
    dim 1 --- dim 2 --- dim 3 --- ignore --- cogs --- ignore --- ... --- ignore
    Rules File14:
    dim 1 --- dim 2 --- dim 3 --- ignore --- ignore --- ignore --- ... --- amount
    Is the table design above what GlennS described as the "Data" column concept, which allows only a single numeric data value?
    In that case I can't tag two or more columns as "Data fields"; I can only tag one column as "Data field" and have to tag the other data columns as "ignore fields during data load". Otherwise, when I validate the rules file, the error "only one field can contain the Data Field attribute" occurs.
    Or may I ignore this error message and just try to tag all 14 fields as "Data fields" and load the data?
    Please advise.
    Am I right that the other way is to restructure the table/view (and the rules files) as follows, to load all of the data in one pass:
    dim 0 --- dim 1 --- dim 2 --- dim 3 --- data
    Account --- Region -- ID --- Product --- data
    sales --- West --- D1 --- Coffee --- 11001
    sales --- West --- D2 --- Tea10 --- 12011
    sales --- West --- S1 --- Tea10 --- 14115
    sales --- West --- C3 --- Tea10 --- 21053
    sales --- East ---- S2 --- Coffee --- 15563
    sales --- East ---- D1 --- Tea10 --- 15894
    sales --- East ---- D3 --- Coffee --- 19689
    sales --- East ---- C1 --- Coffee --- 18897
    sales --- East ---- C3 --- Tea10 --- 11699
    cogs --- West --- D1 --- Coffee --- 1,322
    cogs --- West --- D2 --- Tea10 --- 1,325
    cogs --- West --- S1 --- Tea10 --- 1,699
    cogs --- West --- C3 --- Tea10 --- 1,588
    cogs --- East ---- S2 --- Coffee --- 1,458
    cogs --- East ---- D1 --- Tea10 --- 1,664
    cogs --- East ---- D3 --- Coffee --- 1,989
    cogs --- East ---- C1 --- Coffee --- 1,988
    cogs --- East ---- C3 --- Tea10 --- 1,328
    discounts --- West --- D1 --- Coffee --- 10789
    discounts --- West --- D2 --- Tea10 --- 10548
    discounts --- West --- S1 --- Tea10 --- 10145
    discounts --- West --- C3 --- Tea10 --- 10998
    discounts --- East ---- S2 --- Coffee --- 10991
    discounts --- East ---- D1 --- Tea10 --- 11615
    discounts --- East ---- D3 --- Coffee --- 15615
    discounts --- East ---- C1 --- Coffee --- 11898
    discounts --- East ---- C3 --- Tea10 --- 12156
    amount --- West --- D1 --- Coffee --- 548
    amount --- West --- D2 --- Tea10 --- 589
    amount --- West --- S1 --- Tea10 --- 852
    amount --- West --- C3 --- Tea10 --- 981
    amount --- East ---- S2 --- Coffee --- 876
    amount --- East ---- D1 --- Tea10 --- 156
    amount --- East ---- D3 --- Coffee --- 986
    amount --- East ---- C1 --- Coffee --- 256
    amount --- East ---- C3 --- Tea10 --- 9896
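For what it's worth, the restructuring shown above is a standard wide-to-long unpivot. A minimal pure-Python sketch, using a two-account subset of the example data (column names taken from the tables above):

```python
# Wide rows: one row per (Region, ID, Product), one column per account
wide = [
    {"Region": "West", "ID": "D1", "Product": "Coffee", "sales": 11001, "cogs": 1322},
    {"Region": "East", "ID": "S2", "Product": "Coffee", "sales": 15563, "cogs": 1458},
]
accounts = ["sales", "cogs"]

# Long rows: the account name becomes a leading dimension column,
# and every value lands in a single "data" column
long_rows = [
    {"Account": acct, "Region": r["Region"], "ID": r["ID"],
     "Product": r["Product"], "data": r[acct]}
    for acct in accounts
    for r in wide
]
```

In practice you would do the same unpivot in the Oracle view (e.g. with a UNION ALL per account column), so that a single rules file can load the one "data" column.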
    And the third way is to adjust the essbase.cfg parameters DLTHREADSPREPARE and DLTHREADSWRITE (and DLSINGLETHREADPERSTAGE).
    I just want to be sure that I understand your suggestions.
    Many thanks for awesome help,
    Zeljko

  • Average processing time too long in workload monitoring (ST03N)

    Hi expert,
    I wonder why the average processing time is so long in workload monitoring (ST03N).
    for example
    avg.response time : 4700 (ms)
    avg.processing time : 4200 (ms)
    avg.CPU time : 300 (ms)
    avg.DB time : 200(ms)
    OS (CPU and memory ) status is nomal.
    Please tell me why the processing time is so long, and what else I should check.
    thanks

    Hi
    Processing time is the total time taken, i.e. from when the user clicks the submit button to send data to the SAP system until the SAP system processes everything and shows the updated screen back to the user. It includes network time as well.
    Average processing time is the processing time per dialog step.
    In the ST03N workload monitor there are different task types, like Dialog, Background, RFC, Update, and so on.
    Could you be more specific about where you find the average processing time high? I mean, for which task type?
    If you have users complaining for Performance problem check their Network Time also.
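As a rough back-of-the-envelope check against the numbers in the original post (this is plain arithmetic for illustration, not the exact ST03N time breakdown):

```python
avg_response_ms = 4700
avg_cpu_ms = 300
avg_db_ms = 200

# Time not explained by CPU or DB work. Here it matches the reported
# processing time of 4200 ms, which points away from CPU/DB and toward
# network or wait effects, consistent with the advice above.
unaccounted_ms = avg_response_ms - avg_cpu_ms - avg_db_ms
```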
    Let us know, if this information was helpful to you.
    with regards,
    Parin Hariyani

  • How can data load time can be optimized using routine

    Hi,
    Data load is taking too much time. How can the data load time be optimized using a routine?
    Regards,
    Vivek

    Hi Vivek,
    If you have code written, please post it, so that I can help you optimise it.
    General guidelines:
    1. Don't use a SELECT statement inside a loop.
    2. Where possible, use a READ statement with the BINARY SEARCH option instead of a LOOP statement.
    3. Try to avoid joins if you are using any; instead use FOR ALL ENTRIES.
    4. When you use FOR ALL ENTRIES, make sure that your internal table is not initial.
    5. Sort internal tables and delete adjacent duplicates wherever needed.
    6. Use the CLEAR statement wherever required.
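Point 2 (READ with BINARY SEARCH on a sorted table instead of a LOOP) has the same shape in any language. A small Python sketch of the idea using the standard bisect module:

```python
import bisect

# Sorted "internal table" of (key, value) pairs. Sorting is the
# precondition, just as SORT is required before READ ... BINARY SEARCH.
table = sorted([(3, "c"), (1, "a"), (2, "b")])
keys = [k for k, _ in table]

def read_binary(key):
    """O(log n) lookup; returns None when the key is absent."""
    i = bisect.bisect_left(keys, key)
    if i < len(keys) and keys[i] == key:
        return table[i][1]
    return None
```

A linear loop over the table does the same job in O(n); on a large table inside a hot loop, the difference dominates the load time.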
    Regards,
    Nanda.S

  • Java.sql.SQLException:ORA-01801:date format is too long for internal buffer

    Hi,
    I am getting the following exception when I try to insert data into a table through a stored procedure:
    oracle.apps.fnd.framework.OAException: java.sql.SQLException: ORA-01801: date format is too long for internal buffer
    When I execute this stored procedure from an anonymous block, it executes successfully, but when I use an OracleCallableStatement to execute the procedure I get this error.
    Please let me know how to resolve this error.
    Is this error something to do with the database configuration?
    Thanks & Regards
    Meenal

    I don't know if this will help, but we were getting this error in several of the standard OA framework pages and after much pain and aggravation it was determined that visiting the Sourcing Home Page was changing the timezone. For most pages this just changed the timezone that dates were displayed in, but some had this ORA-01801 error and some others had an ORA-01830 error (date format picture ends before converting entire input string). Unfortunately, if you are not using Sourcing at your site, this probably won't help, but if you are, have a look at patch # 4519817.
    Note that to get the same error, try the following query (I got this error in 9.2.0.5 and 10.1.0.3):
    select to_date('10-Mar-2006', 'DD-Mon-YYYY________________________________________________HH24:MI:SS') from dual;
    It appears that you can't have a date format that is longer than 68 characters.
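If the ~68-character limit observed above holds on your Oracle version (it was only verified on 9.2.0.5 and 10.1.0.3), you could guard format strings on the client side before calling the procedure. A hypothetical Python check:

```python
MAX_ORACLE_DATE_FMT = 68  # empirical limit from the experiment above; version-dependent

def check_date_format(fmt: str) -> bool:
    """True if fmt is short enough to avoid ORA-01801, per the observed limit."""
    return len(fmt) <= MAX_ORACLE_DATE_FMT

ok = check_date_format("DD-Mon-YYYY HH24:MI:SS")
too_long = check_date_format("DD-Mon-YYYY" + "_" * 60 + "HH24:MI:SS")
```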

  • Why my adobe edge animation taking load time too much

    Hi
    Why is my Adobe Edge animation taking so long to load?
    Please guide me.

    Hi, Rohan-
    Can you upload your file for us to take a look at?  It's hard for us to tell what's going on without taking a look.
    Thanks,
    -Elaine

  • How to find Data load time ?

    Hi,
    Where do I look for the total data load time, from DataSource to PSA?

    Hi Honar,
    1) Go to the monitor of the InfoPackage; in the header tab you can find the runtime of the InfoPackage.
    2) If you are loading data from a source system into BW: copy the request number from the InfoPackage header tab, go to the source system from which the data is being loaded,
    then go to SM37 and search for the request number with the prefix BI. You will find the total run time of the job, with the job log.
    Hope this helps.

  • Data Load time

    I have 12 files which are similar in size. When they were loaded on one server, the load time was pretty consistent; for example, it took about 100 seconds to load each of the 12 files. On the other server, however, the data load time increases dramatically for the later files, i.e. 100s, 120s, 180s... The Essbase versions are the same on both servers (Windows 2000). What could cause this problem? Is it something to do with server settings or Essbase settings? Thanks for your response.
    Lin

    Hi,
    Can you please specify the version of the Essbase server and of the development tool (Application Manager, Administration Services)?
    Have you checked the dense/sparse configuration?
    Grofaty

  • Data load taking very long time between cube to cube

    Hi
    In our system, data loading from cube to cube using a DTP in BI 7.0 is taking a very long time;
    most of the time is consumed in the "start of extraction" step alone. Can anybody help with reducing the start-of-extraction time?
    Thanks
    Kiran

    Kindly elaborate your issue a little. What is the mapping between the two cubes: is it one-to-one, or is there a routine in the transformation? Any filter or routine in the DTP? Also, did you delete the cube's indexes before loading the data?
    Regards,
    Sushant

  • Movie Load Time Too Slow

    Hello! 
    I've created three movies for a new website; all photo slide shows with 10-14 photos and some text.  The photos have all been optimized in Photoshop and saved for web...most are under 100kb, however the movies are taking a long time to load on the webpages.  Is there anything I can do through Edge Animate to reduce the load time?  Even with the preloader, the load time is way too long.  I've inserted the movies into the html pages using iframe.  Any suggestions are much appreciated.  Thanks!

    Hi Simon,
    I want to reduce the Power View blank-canvas / Power View report load time from an SSAS Tabular source in the SharePoint 2013 portal.
    I have observed that a Power View report with 1 view loads faster than a Power View report with multiple (4) views, so I think that your statement "Power View only retrieves the data it needs at any given time for a data visualization" might be incorrect.
    I have read the link you provided and have all the patches applied; besides, I am not using a Power Pivot source.
    My Tabular cube is complex and has about 200 measures, and the blank Power View canvas takes about 13 seconds to load in SharePoint 2013 from a web browser. I'd appreciate any insights here.
    Thanks, Ashish Singh

  • Profile WS run time too long

    My first test run (in dev) of the custom Java PWS against a SQL Server database seems to be taking too long. It seems to be taking approximately 1 second or more per user, which will be a problem when processing around 20,000 users. We can't have a PWS running for 4-6 hours every night.
    I'd really appreciate any tips to optimize this for faster performance. I am opening a DB connection and running 2 queries for each user, then closing the statement and connection. Can I do this another way, so that I open the DB connection (and maybe the statement) for the profile source only once and release the connection after the whole job is done, or in case of errors?
    Please help as this could be a huge bottleneck for us!
    Thanks.
    Vanita
    Staples

    Hi Akash,
    Thanks for your quick reply. Unfortunately I don't have any signature field to limit the profile sync by. I am currently running the PWS in Tomcat 4.1.30; we plan to deploy on WAS in dev. I am not sure whether Tomcat 4.1.30 does any connection pooling; do you have any information on that? I am doing the DB connect and the 2 property queries in the getUserProperties method.
    1. Would the connection pooling (if done in WAS) make any difference to the run time?
    2. Would opening the connection in initialize() method versus GetUserProperties() make any difference?
    As always, thanks for your help.
    Vanita
    ------- Akash Jain wrote on 3/1/05 1:25 PM -------
    Hi Vanita,
    On recent hardware, you should be able to perform an initial profile sync at a rate of ~10/second. This means you should be able to perform your 20k-user profile sync in under an hour. Resyncs should be much faster if you use a signature attribute.
    I'm going to assume you're hitting some database backend with a table structure like the following:
    Users Table: String UserGUID, Date LastModified
    Properties Table: int PropID, String UserGUID, String PropValue
    You have your users keyed off a unique name (a GUID in this example) and properties in a separate table keyed off PropID and GUID.
    Let's review the protocol and each step:
    a) initialize() - sends the parameters of the profile sync to the PWS. This is a good place to do a single query against your database in order to cache all user unique names and signatures (LastModified dates) in a HashTable. This will make re-syncs much faster, since subsequent attachToUser() and getUserSignature() calls will be served from this HashTable.
    b) attachToUser() - in this call you can simply look up the user record in the HashTable created in initialize(). If no entry is found, throw a NoSuchUserException. If the user does exist, continue.
    c) getUserSignature() - again, use the HashTable created in initialize() to look up the signature for this user, and return it as a String.
    d) getUserProperties() - if called, this means the signature you sent back in step (c) has changed since the last profile sync. You now make one call to your properties table (a single DB call) to load all the property values for the user, and return these as a UserPropertyInfo object.
    During an initial sync, you will always get to step (d) above. During re-syncs, assuming there is low churn, I'd say a maximum of 1% of your calls will get to step (d), and thus the re-sync should be an order of magnitude faster in most cases.
    With respect to database connections: if you are opening and closing a connection for each user, this is pretty poor for performance. Your best bet in Java is to use a single connection (this is a single-threaded process) which is set up in initialize(). In the shutdown() method, close this connection.
    I hope this helps. The combination of using a single connection, using the signature attribute, and caching all the user unique names and signatures in one call at the start of the profile sync should drastically increase performance.
    Thanks,
    Akash
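Akash's four-step protocol can be sketched in plain Python (the real PWS interface is Java; the class, method, and fake-database names below are hypothetical stand-ins, not the actual portal API):

```python
class ProfileSource:
    """Sketch of the sync protocol above: one bulk query up front,
    a per-user property query only when the signature has changed."""

    def __init__(self, db):
        self.db = db          # object with the two query methods assumed below
        self.signatures = {}  # user GUID -> last-modified signature

    def initialize(self):
        # Single query caches every user's unique name and signature (step a).
        self.signatures = dict(self.db.all_user_signatures())

    def attach_to_user(self, guid):
        # Step b: cheap dictionary lookup instead of a DB round trip.
        if guid not in self.signatures:
            raise KeyError(f"no such user: {guid}")

    def get_user_signature(self, guid):
        # Step c: served from the cache built in initialize().
        return self.signatures[guid]

    def get_user_properties(self, guid):
        # Step d: one DB call, and only for users whose signature changed.
        return self.db.user_properties(guid)


class _FakeDB:  # minimal stand-in so the sketch is runnable
    def all_user_signatures(self):
        return [("u1", "2005-01-01"), ("u2", "2005-02-01")]

    def user_properties(self, guid):
        return {"guid": guid}


source = ProfileSource(_FakeDB())
source.initialize()
sig = source.get_user_signature("u1")
```

The single connection lives for the whole sync (opened in initialize(), closed in shutdown()), which is the other half of the advice above.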

  • Quicktime page loads taking too long

    I've got a page I'm building with 10 short mp3 clips. Here's a sample of the code for each clip:
    <object classid="clsid:02bf25d5-8c17-4b23-bc80-d3488abddc6b" codebase=
    "http://www.apple.com/qtactivex/qtplugin.cab"
    width="49" height="16">
    <param name="src" value="http://www.mysite.com/test/mp3/lookoflove.mp3">
    <param name="autoplay" value="false">
    <param name="controller" value="true">
    </object>
    Is there a parameter I can add that would keep the page from preloading all of the files, so that a file wouldn't actually load until its 'play' button is pressed? The files are all around 600-700 KB each, which by themselves don't take too long, but all together they make the page a little too long to wait for.

    Let's put it this way: if you do File > Export > QuickTime, you'll make a QuickTime file with the same compressor as your sequence, and it'll render faster. If you select File > Export > QuickTime Conversion, you can make a QuickTime file with a compression of your choice (although the conversion process takes time), so you could compress it into a DVCPRO 50 QuickTime, an uncompressed QuickTime, or an H.264 QuickTime. Any of these settings will play in QuickTime; you can select them through the Options button in the converter window. We just suggest you use Export > QuickTime because it's faster and easier, and in your situation it's pointless to convert.
