TOP 10 Data Loads

I need to find the top 10 data loads that take the longest. How do I find this? Any info is helpful.
Thanks

I would take a look at transaction ST03. You need to use Expert Mode (this is set in the upper left area).
Then select BW System Load. This should open a small pane below that offers different time periods, e.g. Today, Today -1, Today -2, This Week, Week -1, Week -2, This Month, Month -1, and Month -2.
Then choose Load Data. This will bring up all kinds of InfoSource load stats to review. The captured data below gives you an idea of what's available; there is much more than this.
Name of Infosource | Target Obj | # Requests | Parallel | Total time | Avg. t/Req | Read/hour | Written/h | t(Source) | % Source | Avg. Source | # Read | Avg. Read | Read/hour | t(Read) | t(Transf.) | % Transfer | Avg. Transf. | # Transfer
TOTAL | TOTAL | 64 | 76 | 1,237.4 | 19.3 | 373,019 | 92,895 | 422.7 | 34.16 | 6.6 | 128,215 | 2,003.4 | 1,091,847 | 3.3 | 19.9 | 1.61 | 0.3 | 128,215
ZFI_PARKED_EARMARKEDFUNDS | ZFM_C01 | 1 | | 204.5 | 204.5 | 65,846 | 59,279 | 195.2 | 95.46 | 195.2 | 3,740 | 3,740.0 | 68,979 | 52.2 | 0.4 | 0.20 | 0.4 | 3,740
0BP_DEF_ADDRESS_ATTR | 0BPARTNER | 1 | | 119.7 | 119.7 | 414,394 | 0 | 57.4 | 47.98 | 57.4 | 13,777 | 13,777.0 | 863,641 | 4.2 | 2.3 | 1.90 | 2.3 | 13,777
0BPARTNER_ATTR | 0BPARTNER | 1 | | 74.3 | 74.3 | 896,913 | 0 | 8.1 | 10.90 | 8.1 | 18,523 | 18,523.0 | 8,226,351 | 0.4 | 2.0 | 2.73 | 2.0 | 18,523
ZFI_GL_4 | ZGL_O_02 | 1 | | 46.3 | 46.3 | 391,732 | 391,732 | 22.6 | 48.83 | 22.6 | 5,043 | 5,043.0 | 802,280 | 4.5 | 1.0 | 2.17 | 1.0 | 5,043

Similar Messages

  • How to find out the top 10 data loads based on time duration?

    Hi,
    We are working on performance tuning, so we want to find the top 10 longest-running loads, by date.
    We need the start time and end time of each load process, plus the InfoSource and data target name.
    There are nearly 1,000 loads, so it is very difficult to collect the load timings in RSMO.
    Is there an alternative way to collect the top 10 loads by duration for a given date?
    Thanks & Regards,
    Raju

    Hi Gangaraju,
    You can install BI Statistics to get this type of data.
    Or check the tables RSDDSTAT or RSMDATASTATE_EXT for the load process times.
    Or go to transaction ST13 to get a detailed analysis of a process chain for a given period.
    Hope this helps.
    Regards,
    Ravi Kanth
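
    If you would rather pull this from the statistics tables directly, a small ABAP sketch along these lines can list the longest loads. Treat it strictly as a sketch: the table and field names used here (RSDDSTAT with INFOCUBE and RUNTIME) are assumptions, and the statistics tables differ by release (RSDDSTATWHM or RSDDSTAT_DM may be the right source on your system), so verify everything in SE11 first.

    REPORT z_top10_loads.

    " List the ten longest loads from the BW statistics.
    " Table and field names are assumptions - check SE11 first.
    TYPES: BEGIN OF ty_load,
             infocube TYPE c LENGTH 30,            " data target (assumed field)
             runtime  TYPE p LENGTH 16 DECIMALS 3, " load runtime (assumed unit: seconds)
           END OF ty_load.

    DATA: lt_loads TYPE STANDARD TABLE OF ty_load,
          ls_load  TYPE ty_load.

    " Pull the data target and runtime of each load.
    SELECT infocube runtime
      FROM rsddstat
      INTO TABLE lt_loads.

    " Longest first, then keep only the top ten.
    SORT lt_loads BY runtime DESCENDING.
    DELETE lt_loads FROM 11.

    LOOP AT lt_loads INTO ls_load.
      WRITE: / ls_load-infocube, ls_load-runtime.
    ENDLOOP.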

  • Data Loader: Import creating duplicate records?

    Hi all,
    has anyone else encountered this behaviour with Oracle Data Loader, where duplicate records are created (even with the option duplicatecheckoption=externalid set)? When I check the "import request queue - view", the request parameters of the job look fine ->
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    but Data Loader has created new records where the "External Unique ID" already exists.
    Very strangely, when I create the import manually (using the Import Wizard), exactly the same import works correctly! There the duplicate checking method works and the record is updated.
    I know Data Loader has 2 methods, one for update and the other for import; however, I would expect the import to do nothing rather than create duplicates when the record already exists!
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the "Update" method works fine.
    thanks in advance, Juergen

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior: Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
    You should review all documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete

  • Data Load process for 0FI_AR_4 failed

    Hi!
    I am about to implement the SAP Best Practices scenario "Accounts Receivable Analysis".
    When I schedule the data load process (in dialog, immediately) for transaction data 0FI_AR_4 and check it in the Monitor, the status is yellow:
    At the top I can see the following information:
    12:33:35  (194 from 0 records)
    Request still running
    Diagnosis
    No errors found. The current process has probably not finished yet.
    System Response
    The ALE inbox of BI is identical to the ALE outbox of the source system
    or
    the maximum wait time for this request has not yet been exceeded
    or
    the background job has not yet finished in the source system.
    Current status
    No Idocs arrived from the source system.
    Question:
    which actions can I take to run the loading process successfully?

    Hi,
    The job seems to still be in progress.
    You could monitor the job that was created in R/3 (by copying the technical name from the monitor, prefixing it with "BI", and searching for this in SM37 in R/3).
    Keep an eye on ST22 as well if this job is taking too long, as you may already have a short dump for it that has not been reported to the monitor yet.
    Regards,
    De Villiers
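
    A side note on the naming convention above: the background job in the source system normally carries the request's technical name prefixed with "BI", so a request REQU_... shows up as job BIREQU_... in SM37. A trivial sketch (the request ID below is made up):

    " Build the SM37 job name from a BW request's technical name.
    " The request ID here is a made-up example.
    DATA: lv_request TYPE c LENGTH 30 VALUE 'REQU_4ABCD1234567890',
          lv_jobname TYPE c LENGTH 32.

    CONCATENATE 'BI' lv_request INTO lv_jobname.
    WRITE: / lv_jobname.  " -> BIREQU_4ABCD1234567890, search this in SM37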

  • Is the Data load option in 4.1 really helpful?

    Hi,
    While trying to create a data load page in my application, I ran into a "no data found" issue. I was able to resolve it at the time, but it raised another question, which I thought of posting separately rather than in an existing thread.
    I was trying to load data into a DB table from a .csv file, and at the data validation page/step it would throw a "no data found" error. I then noticed that on the second data/table mapping page, where it shows all the rows, there is a "Column Name" list box at the top whose values default to "Do not load" and which contains all the column names.
    So just to confirm: do we need to select all the columns every time? In the very first step I did select the option that the column names are in the first row of the csv file. We have quite a few columns, and doing this manually would be tedious for the business. Shouldn't the application just use the csv first-row headers as the column names?
    How do you guys work around this?
    Thanks,
    Sun

    Hi VC,
    It does take the column names from the first row of the file. The problem was that one column value had a trailing space, hence the issue. I was able to fix that.
    Thanks,
    Sun

  • Error in Data Loading Step

    Hi all,
    One of our process chains failed at the data loading step. In the monitor screen, the status tab shows the message:
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    In the error message option, I can see:
    1) No Planning Version Selected (ID SAPAPO/TSM, ID no. 224)
    2) Errors in source system (ID RSM, ID no. 340)
    Please help me to resolve the issue.

    Hi
    Right-click the failed DTP and select Monitor.
    At the top you will find "Job overview"; select this
    and check the job log of this DTP - you will see what the exact error is.
    Also activate the objects related to this DTP (I mean the source and target).
    Regards,
    Venkatesh

  • Attribute data load fails for 0RPA_MARM InfoObject

    Hi BW experts,
    The attribute data load to the InfoObject 0RPA_MARM (not an InfoProvider) stays in yellow status for a long time and then throws the error below:
    Runtime Error: UNCAUGHT_EXCEPTION
    Exception: CX_RSR_X_MESSAGE
    I checked the forums for this issue, but only found solutions where the target is an InfoProvider.
    Please suggest.
    Regards,
    Leo.

    Hi Leo,
    Loading data into a master data InfoObject and making it an InfoProvider are two different things. Loading data into a master data InfoObject works the same way for all InfoObjects; when you make one an InfoProvider, you gain the advantage of reporting directly on top of that InfoObject.
    Regards,
    Durgesh.

  • Oracle Database Table data load into Excel

    Hello All,
    Please I need your help for this problem:
    I need to load Oracle database table data into Excel and save it in xls format.
    Example: Select * from Slase, and load the result into Excel.
    I would appreciate any sample code to help me do that. Please help me out. This is very urgent.
    Thanks a lot and best regards,
    anbu

    >
    I need to load Oracle database table data into Excel and save it in xls format.
    Example: Select * from Slase, and load the result into Excel.
    I would appreciate any sample code to help me do that. Please help me out. This is very urgent.
    >
    Nothing in these forums is 'urgent'. If you have an urgent problem you should contact Oracle support or hire a consultant.
    You have proven over and over again that you are not a good steward of the forums. You continue to post questions that you say are 'urgent' but rarely take the time to mark your questions ANSWERED when they have been.
    Total Questions: 90 (78 unresolved)
    Are you willing to make a commitment to revisit your 78 unresolved questions and mark them ANSWERED if they have been?
    The easiest way to export Oracle data to Excel is to use sql developer. It is a free download and this article by Jeff Smith shows how easy it is
    http://www.thatjeffsmith.com/archive/2012/09/oracle-sql-developer-v3-2-1-now-available/
    >
    And One Last Thing
    Speaking of export, sometimes I want to send data to Excel. And sometimes I want to send multiple objects to Excel – to a single Excel file that is. In version 3.2.1 you can now do that. Let's export the bulk of the HR schema to Excel, with each table going to its own worksheet in the same workbook.
    >
    And you have previously been asked to read the FAQ at the top of the thread list. If you had done that, you would have seen that the FAQ has links to many ways, with code, to export data to Excel.
    5. How do I read or write an Excel file?
    SQL and PL/SQL FAQ

  • Refresh Webi reports automatically after BI data loads ?

    Hello,
    We are about to install BOE X.1 on top of BI 701.
    The idea is that the users would perform their daily WebI reporting, but obviously only after the SAP BI night batch scheduling is finished and the BI InfoProviders are filled.
    I've read that BOE offers you the ability to refresh the reports up front.
    We were wondering if there is a way to easily implement this logical dependency: to refresh the WebI report only after the end of the BI data loads.
    There is, of course, the possibility of using an external scheduler, but we have checked the licensing and it's quite expensive.
    Is there another way to do so ?
    Many thanks for your attention.
    Best Regards.
    Raoul

    Hi Alan,
    Thank you very much for your quick answer.
    I would like to make sure that I understand you, since I'm not very familiar with BOE:
    First, we create an event in the CMC and connect it to a file location.
    Then we schedule the document and add the file event: do you mean schedule the WebI report?
    Finally, we create the file as part of the BEx query refresh process: how exactly do we do that, in the BI process chains?
    Thank you very much in advance for your help.
    Best Regards.
    Raoul

  • DATA LOAD WARNINGS IN ASO CUBES

    Hi Everyone,
    While loading data into ASO cubes in Essbase we are getting warnings like "Data load stream contains 1.25797e+08 and [0] #missing cells". My data file has #missing values, 0's, and special characters like E. I want to load the complete data without warnings. Kindly let me know if anyone knows the solution: whether I need to change any settings in the rule file, or how to ignore those cells.
    Thanks,
    Vikram

    The warnings are really informational messages to let you know what it loaded and did not load. Which is fine, as those values tend to bloat a cube (the zeros). #missing is not going to load anyway, and the E is the exponential format of numbers, which should not be a problem: Excel will display it this way, but you can format it without the E. You don't mention whether you are doing this from EAS or MaxL, or what version you are on. In version 11, the EAS load dialog has options across the top to turn the loading of zeros and missing on or off. In MaxL, I don't see the syntax in the Tech Reference, but I thought it was there in 9.

  • Data load performance using InfoSet vs. view

    Hi Guru,
    I am performing generic extraction, loading data to a cube, but my DataSource is based on an InfoSet in R/3.
    It is taking 30 min. to load 10,00,000 (ten lakh) records; ideally it should take 10 to 15 min., right?
    Can anybody suggest where I need to check to improve performance? Or shall I create the DataSource over a view and try to load data that way; will it help data load performance?
    thanks,
    ganesh.

    hi Ganesh,
    Primary Index ->
    When you create a database table in the ABAP Dictionary, you must specify the combination of fields that enables an entry within the table to be uniquely identified. These key fields must be specified at the top of the table field list and defined as key fields. A minimum of 1 and a maximum of 16 key fields can be defined.
    When the table is activated, an index formed from all key fields is created on the database (with Oracle, Informix, DB2), in addition to the table itself. This index is called the primary index. The primary index is unique by definition.
    In addition to the primary index, you can define one or more secondary indexes for a table in the ABAP Dictionary and create them on the database. Secondary indexes can be unique or non-unique.
    If you dispatch an SQL statement from an ABAP program to the database, the database searches for the requested data records either in the database table itself (full table scan) or by using an index (index unique scan or index range scan). If all requested fields are found in the index during an index scan, the table records do not need to be accessed at all.
    The index records are stored in the index tree, sorted by index field. This is what enables accelerated access via the index; the table records in the table blocks are not sorted.
    An index should not consist of too many fields. Having a few very selective fields increases the chance of reusability and reduces the chance of the database optimizer selecting an unsuitable access path.
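
    As a sketch of that point (the table ZSALES and its index fields are hypothetical, not from this thread): a WHERE clause that restricts the leading fields of a secondary index lets the optimizer use an index range scan instead of a full table scan.

    " Hypothetical table ZSALES with a secondary index on BUKRS + GJAHR.
    " Restricting both leading index fields permits an index range scan.
    DATA: lt_sales TYPE STANDARD TABLE OF zsales.

    SELECT * FROM zsales
      INTO TABLE lt_sales
      WHERE bukrs = '1000'    " company code: leading index field
        AND gjahr = '2010'.   " fiscal year: second index field
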
    To create an index ->
    You have to use transaction SE11 in the Dev system.
    Enter the database table name and choose
    Display -> Indexes -> Create
    Enter the index name.
    Choose Maintain logon language.
    Enter a short description and the index fields.
    Then press Save and create the request to transport the index to QA and PRD. Then press Activate.
    Hope this helps,
    VA

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I’m starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is the SRM (Supplier Relationship Management) 5.0.
    BW is working on client 001 while SRM is on client 100 and I want to load data from the SRM into BW.
    I’ve configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added a SAP source system (named SRMCLNT100), installed SRM Business Content, replicated the data sources from this source system and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I've created an InfoPackage on one standard metadata DataSource (with data, checked through RSA3 on client 100, the source system). I've started the data load process, but the monitor says that "no Idocs arrived from the source system" and keeps the status yellow forever.
    Additional information:
    BW Monitor Status:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
    BW Monitor Details:
    0 from 0 records
    – but there are 2 records on RSA3 for this data source
    Overall status: Missing messages or warnings
    -     Requests (messages): Everything OK
    o     Data request arranged
    o     Confirmed with: OK
    -     Extraction (messages): Missing messages
    o     Missing message: Request received
    o     Missing message: Number of sent records
    o     Missing message: Selection completed
    -     Transfer (IDocs and TRFC): Missing messages or warnings
    o     Request IDoc: sent, not arrived ; Data passed to port OK
    -     Processing (data packet): No data
    Transactional RFC (SM58):
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    System Log (SM21):
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction.  The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER 601, "No service for system &1, client &2 in Integration Directory": no documentation exists for this message ID.
    RFC Destinations (SM59):
    Both RFC destinations look fine, with connection and authorization tests successful.
    RFC Users (SU01):
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Could someone help?
    Thanks,
    Guilherme

    Guilherme
    I don't see any reason why it's not bringing the data in. Are you doing a full extraction or a delta? If delta, please check whether the extractor is delta-enabled; sometimes this can cause problems.
    Also check this weblog on basic checks for data load errors; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Thanks
    Sat

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic DataSource on R3 and replicated it to our BI system, then created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
    After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" is also created under this ODS in the InfoProvider view. In the Data Transfer Process, the Extraction tab has 'Full' picked in the Extraction Mode field, and the Execute tab has an 'Execute' button. We clicked this button (note: so far we had not created an InfoPackage yet), which sounds like it conducts the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R3 table).
    Then we tried to create an InfoPackage. In the Processing tab, the 'Only PSA' radio button is checked and all the others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! In the Data Target tab, the ODS can't be selected as a target! There are also some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under the column 'DTP(S) are active and load to this target' there is an inactive icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in; but since the 'Only PSA' radio button is checked with all the others dimmed, no data goes to this ODS! Why, in BI 7.0, can the 'Only PSA' radio button be checked with all the others dimmed?
    Many new features in BI 7.0! Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!

    You don't have to select anything.
    Once the data is loaded to the PSA, the DTP gives you the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the last load in the PSA.
    Go through these links for lucid explanations:
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    Pre-requisite -
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Open Hub (SAP BW) to SAP HANA through DB Connection data loading: 'Delete data from table' option is not working. Please, anyone from this forum, help.

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB Connection (named HANA).
    Whenever I create an Open Hub destination as a DB table with the help of the DB Connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the 'Deleting data from table' option:
    data loaded with 16 records from BW to HANA.
    The second time I executed it from BW to HANA, 32 records came (it is appending).
    Then I executed the Open Hub service with the 'Deleting data from table' option checked:
    now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    Checking from SAP BW system to SAP BW system, it works fine.
    Does this option work through a DB Connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the OH level (definition level: DESTINATION tab and FIELD DEFINITION).
    There is a check box there which I have already selected; that is exactly my issue: even though it is selected,
    the deletion is not performed at the target level.
    SAP BW to SAP HANA via DBC connection:
    1. First execution from BW, say 16 records: DTP executed, loaded to HANA, 16 records there as well.
    2. Second execution from BW: the HANA side is appended, so 16+16 = 32.
    3. So I selected the 'Deleting data from table' check box at the OH level.
    4. Now executing the DTP throws a short dump: DBIF_RSQL_TABLE_KNOWN.
    Now please tell me how to resolve this. Does this option (deleting data from the table) apply to HANA at all?
    Thanks
    Santhosh Kumar

  • Data load times

    Hi,
    I have a question regarding data loads. We have a process chain which includes 3 ODSs and a cube.
    Basically, ODS A gets loaded from R/3, and from ODS A the data then loads into 2 other ODSs (ODS B, ODS C) and CUBE A.
    When I went to the monitor screen of this load (ODS A -> ODS B, ODS C, CUBE A), the total time shows as 24 minutes.
    We have some other steps in the process chain where ODS B -> ODS C and ODS C -> CUBE 1.
    When I go to the monitor screen of these data loads, the total time shown for them is 40 minutes.
    I am surprised, because the total run time for the chain itself is 40 minutes, and the chain includes the data extraction from R/3 and the ODS activations, indexes...
    Can anybody throw me some light?
    Thank you all

    Hi All,
    I am not asking which steps need to be included in which chain.
    My question is: the process chain run time says the total time is 40 minutes, and when you go to RSMO to check the time taken for the data load from the ODS to the 3 other data targets, it also shows 40 minutes.
    But the process chain also includes the ODS activation, building indexes, and extracting data from R/3.
    So what are the times we see when we click on a step in a process chain and display its messages, and what is the time we see in RSMO?
    Let's take an example:
    In process chain A there is a step LOAD DATA, from ODS A -> ODS B, ODS C, Cube A.
    When I right-click it and display the messages for the successful load, it shows all the messages like:
    Job started...
    Job ended...
    The total time here shows as 15 minutes.
    When I go to RSMO for the same step, it shows 30 minutes...
    I am confused. Please help me!
