Time of data load

Hi,
In a query, how can I display the date and time of the last data load?
Thanks.

Hi there,
Every time you execute a query, there is a button named Information that shows that information.
Diogo.

Similar Messages

  • Takes Long time for Data Loading.

    Hi All,
Good morning. I am new to SDN.
I am currently using the DataSource 0CRM_SRV_PROCESS_H, which contains 225 fields; I use around 40 of them in my report.
Can I hide the remaining fields at the DataSource level itself (transaction RSA6)?
Currently, loading the data from the PSA to the ODS (ODS 1) takes a long time.
I am also pulling some data from another ODS (ODS 2) as a lookup, and it takes a long time to update the data in the active data table of the ODS.
Can you please suggest how to improve the data load performance in this case?
    Thanks & Regards,
    Siva.

Hi,
Yes, you can hide them: just check the Hide box for those fields. Are you on BI 7.0 or BW 3.x? Either way, is the number of records huge? If so, you can split the records and execute, i.e. use the same InfoPackage but run it with different selections.
Check in ST04 whether there are any locks or lock waits. Then go to SM37 and check whether any long-running job exists and whether it is progressing: double-click the job, copy the PID from the job details, go to ST04, expand the node, and check whether you can find that PID there.
Also check the system log in SM21 and short dumps in ST22.
To improve performance you can try to increase the virtual memory or the number of servers, if possible; this will increase the number of work processes, since when many jobs run at the same time there may be no free work processes left.
Regards,
Debjani

  • Time dependent data load to BW using Fixed Time Interval selection

    Hi,
    I'm facing a problem in extracting time-dependent master data to BW. I would really appreciate if anyone can help in this regard.
We have a table in R/3 which has time-dependent data. For example, for a single OBJID we could have multiple records with different BEGDA/ENDDA and different attributes. When we try to load this data into a BW InfoObject, we see a time-dependent data selection on the Update tab of the InfoPackage. We need to select BEGDA/ENDDA in the DataSource, since otherwise we would get a "DUPLICATE RECORDS" load error in BW during loads.
The problem is that when we enter a fixed time interval Start Date and End Date in the InfoPackage, only the records with exactly that start date and end date are loaded. Ideally I would want to see all the records whose BEGDA/ENDDA fall within the range of StartDate and EndDate.
I tried modifying the ROOSFIELD table for the DataSource: I set the BEGDA SELOPTS to GE (16) and the ENDDA SELOPTS to LE (64). But it still doesn't seem to work. I checked in RSA3 whether the change to ROOSFIELD has any effect; unless we give a range for BEGDA, it doesn't behave as expected. Could someone let me know how I can load all records to BW whose BEGDA/ENDDA fall within the range of StartDate and EndDate?
    Thanks,
    Anil.

    Anil,
If you are using a custom DataSource, make sure the DataSource fields for BEGDA and ENDDA are labeled DATEFROM and DATETO, so that BW recognizes them as the validity interval.
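For illustration, a minimal sketch of the interval-overlap selection the extractor effectively needs; the table and parameter names ZTIMETABLE, P_START and P_END are hypothetical:

    " A record is relevant when its validity interval overlaps the
    " requested interval, not when BEGDA/ENDDA equal the dates exactly.
    DATA lt_data TYPE STANDARD TABLE OF ztimetable.
    SELECT * FROM ztimetable INTO TABLE lt_data
      WHERE begda <= p_end      " validity starts on or before the interval end
        AND endda >= p_start.   " and ends on or after the interval start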
    Cheers,
    T-

  • Underlying RS Table :: Date & Time for Data Load Requests

    Dear SAP BW Community,
In BW 3.5, does anybody know the underlying "RS" table where I can check the earliest date and time at which a data target was loaded, by providing the data target's technical name in SE16?
    Thanks!

OK, I've found the timestamp of the data load requests in the table RSMONICDP.
To get the earliest data load for InfoCube FRED, I'm going to Oracle via SQL*Plus, as follows:
    select to_char(min(TIMESTAMP)) from sapr3.RSMONICDP where ICUBE = 'FRED' ;
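For reference, the same lookup can be done from inside the system, e.g. in a small ABAP report, instead of querying Oracle directly. A minimal sketch, using the ICUBE and TIMESTAMP fields from the query above:

    " Earliest load request timestamp for InfoCube FRED
    DATA lv_ts TYPE rsmonicdp-timestamp.
    SELECT MIN( timestamp ) FROM rsmonicdp INTO lv_ts
      WHERE icube = 'FRED'.
    WRITE: / lv_ts.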

  • Time stamp data load

    Hi Gurus
How can we upload timestamp data in BW to a single InfoObject?
Example: 15/06/08 12:50pm.
I need it urgently.
    Thanks & regards
    Ogeti

    Hi,
You could use a routine in the transformation.
The routine would look something like this:
    DATA: v_timestamp(14) TYPE c.
    " Current system date and time as YYYYMMDDHHMMSS
    CONCATENATE sy-datum sy-uzeit INTO v_timestamp.
    RESULT = v_timestamp.
You need not map any field; just write the routine.
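Note that the routine above stores the load time, not a value from the source. If the timestamp has to come from the source record (as in your example 15/06/08 12:50pm), the routine needs to parse it first. A minimal sketch, assuming the value arrives as a character field in the format DD/MM/YY hh:mmAM/PM and that the years belong to century 20; the input field LV_IN is hypothetical and would come from your source structure:

    DATA: lv_in(16)       TYPE c VALUE '15/06/08 12:50PM',
          lv_hh(2)        TYPE n,
          lv_date         TYPE sy-datum,
          lv_time         TYPE sy-uzeit,
          v_timestamp(14) TYPE c.
    " Date part: DD/MM/YY -> YYYYMMDD
    CONCATENATE '20' lv_in+6(2) lv_in+3(2) lv_in+0(2) INTO lv_date.
    " Time part: 12-hour clock -> HHMMSS
    lv_hh = lv_in+9(2).
    IF lv_in+14(2) = 'PM' AND lv_hh < 12.
      lv_hh = lv_hh + 12.
    ELSEIF lv_in+14(2) = 'AM' AND lv_hh = 12.
      lv_hh = 0.
    ENDIF.
    CONCATENATE lv_hh lv_in+12(2) '00' INTO lv_time.
    CONCATENATE lv_date lv_time INTO v_timestamp.
    RESULT = v_timestamp.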
    Assign points if helpful
    Regards

  • Issue at the time of Data load

    Hi Team,
While running an Import from the Data Manager package, the system does not load the data and throws an error.
The log is as follows:
    Package:                     Import
    Appset:                      RCOM1
    Application:                 LEGALAPP
    Request Time:                2009-03-31 04:58:14
    Start   Time:                2009-03-31 04:58:14
    End     Time:                2009-03-31 04:58:15
    Total   Time:                00:00:01
    TOTAL STEPS  3
    1. Assign initial parameters:        completed  in 1 sec.
    2. Convert data:                     Failed  in 0 sec.
    [Selection]
    FILE= DataManager\DataFiles\My Files\TB_FEB_2009_1.txt
    TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\SYSTEM FILES\IMPORT.XLS
    CLEARDATA= No
    RUNLOGIC= No
    [The list of conversion files in each transformation file]
    Transformation file: DATAMANAGER\TRANSFORMATIONFILES\SYSTEM FILES\IMPORT.XLS
    [Messages]
    Convert data
    The data file is empty. Please check the data file and try again.
The file uses the same members and the same dimensions as the application. For this reason I haven't used any custom transformation file for loading data into the legal application.
Please help me resolve this error.
    Regards
    Naveen.KV

Hello,
It looks like the Convert Data task is receiving an empty file to load. Are you using the standard Import package or a custom one?
Please try to activate the debug information by adding the instruction DEBUG(ON), and verify the data in the file passed to the conversion task.
    Regards,
    Mihaela

  • How can I specify the PARTITION name at the time of data load?

    Hi,
I have a table with 4 partitions. Using SQL*Loader, I am going to load data into it, and the records should go to the 2nd partition only.
Where should I specify the partition name?
Please clarify with sample code.
    Thanks.

Assuming that the partition is empty before the load, I would load the data into a temp table with the same structure as the partitioned table. After all the data has been loaded successfully, exchange the partition of the table with the temp table using 'alter table ... exchange partition ...'.
    Another question is, how is your table partitioned?
    Message was edited by:
    Jens Petersen

  • Increasing Max run time in DATA load

    Hi,
I know that to increase the maximum run time we can set it at the InfoPackage level, in RSCUSTV2, or from the monitor.
My question is how to increase the timeout for a data load while the load is running.
Assume the R/3 job is running and the BW job is about to turn red due to the timeout setting; I want to extend the timeout during this data load.
How can I do that?
    Thanks

Hi PVC,
This is a Basis task; you will have to change the setting in the instance profile.
The transaction is RZ10.
Select your profile and look for the parameter rdisp/max_wprun_time; you will have to increase it there.
You will have to restart the server after doing this.
Contact your Basis team to do it.
Hope this helps
    K.Mohan

  • Data load times

    Hi,
I have a question regarding data loads. We have a process chain which includes 3 ODSs and a cube.
Basically, ODS A gets loaded from R/3, and then from ODS A the data is loaded into two other ODSs, ODS B and ODS C, and into Cube A.
When I went to the monitor screen of this load (ODS A -> ODS B, ODS C, Cube A), the total time shown is 24 minutes.
We have some other steps in the process chain, where ODS B -> ODS C and ODS C -> Cube 1.
When I go to the monitor screen of these data loads, the total time shown for the data loads is 40 minutes.
I am surprised, because the total run time for the chain itself is 40 minutes, and the chain includes the data extraction from R/3 as well as ODS activations, indexes, etc.
Can anybody throw some light on this?
    Thank you all
    Edited by: amrutha pal on Sep 30, 2008 4:23 PM

    Hi All,
I am not asking which steps need to be included in which chain.
My question is: the process chain run time says the total time is 40 minutes, and when you go to RSMO to check the time taken for the data load from one ODS to 3 other data targets, it also shows 40 minutes.
But the process chain also includes ODS activation, building indexes, and extracting data from R/3.
So what are the times we see when we click on a step in the process chain and display its messages, and what is the time we see in RSMO?
Let's take an example:
In process chain A there is a step LOAD DATA from ODS A -> ODS B, ODS C, Cube A.
When I right-click and display the messages for the successful load, it shows all the messages like:
Job started...
Job ended...
The total time here shows 15 minutes.
When I go to RSMO for the same step, it shows 30 minutes.
I am confused. Please help me.

  • Insert OR Update with Data Loader?

    Hello,
Can I insert or update at the same time with Data Loader?
    How can i do this?
    Thanks.

The GUI loader wizard does allow for this, including automatically adding values to the picklist fields.
However, if you mean the command line bulk loader, the answer is no. And to compound the problem, the command line version will actually create duplicates for some of the objects. It appears that the "External Unique Id" is not really "unique" (as in constrained via a unique index) for some of the objects. So be very careful when you prototype something with the GUI loader and then reuse the map in the command line version.
You will find that some objects work well with the command line loader, and some do not.
    Works well (just a few examples):
Account (assuming your NAME, LOCATION fields are unique).
    Financial Product
    Financial Account
    Financial Transaction
    Will definitely create duplicates via command line bulk loader:
    Contact
    Asset
Also be aware that you might hear that during a go-live Oracle will (temporarily) remove the 30k record limit on bulk loads. I have not had any luck with Oracle Support making that change (2 clients specifically in the last 12 months).

  • Data loading delay

Hi friends,
May I have an answer to one issue?
The issue is: every day I load to one InfoCube, and whatever the cube is, each load takes 2 hours; but once it took 5 hours. What might be the reason? I'm just confused by that. Can anybody clarify?
    Regards.,
    Balaji Reddy K.

    Reddy,
1. Is the time taken loading to the PSA, or loading from the PSA to the cube? If it is the load to the PSA, then usually the problem lies at the extractor.
2. If it is the load to the cube, then check whether statistics are being maintained for the cube; they would give an accurate picture of where the data load is spending most of its time.
Do an SQL trace during the data load. If you find a lot of master data lookups, make sure that master data is loaded; and if there are a lot of lookups to table NRIV, check whether number range buffering is on, so that DIM IDs get generated faster.
Check whether the data load is faster if you drop any existing indexes.
Are you loading any aggregates after the data load? Check whether the aggregates are necessary, or whether they have been enabled for delta loads.
If you have indexes active and there is a huge data load, then depending on the index the data load can get delayed.
If the cube is not compressed, the data load can sometimes get delayed.
Also, while the data load is running, check in SM50 and SM37 that the jobs are active; this means that the data load is active on both sides.
Always update the statistics for the cube before and after the load; this helps in deciphering the time the data load takes. After activating the statistics, check table RSDDSTAT or the standard reports available as part of the BW Technical Content.
Hope it helps.
Arun
Assign points if helpful

  • Data load status stays yellow

    Hi,
My BW platform is BW 7.01 with SP 5. I am loading ECCS data hourly with the 3.5 method, which includes a transfer rule and an update rule. Most of the time the data loads complete successfully. The total number of records is about 180,000, extracted in 4 packets. Once in a while, one of the data packets randomly stays yellow and cannot complete the load; but in the next hourly data load, the data loads successfully. We know it is not a data issue. Does anyone know why the data load is not consistent? Any suggestions are much appreciated.
    Thanks for your suggestions,
    Frank

Hi Frank,
This might be because some of the tRFCs or IDocs got hung.
Check whether the source system job finished.
If the source system job completed, check the tRFCs and IDocs for any hung entries.
In the monitor screen, choose from the menu bar Environment > Job Overview > In Source System (enter user ID and password) and check the job status.
Once that is done, to check the tRFCs: InfoPackage monitor > Environment > Transactional RFCs > In Source System. It will display all the tRFCs; check for your hung tRFCs and try to flush them manually (F6).
If IDocs are hung, process the IDocs manually.
    Regards
    KP

  • Data load to DSO takes long time to finish

    Dear All,
We have a data load from a DataSource to a standard DSO. The data load takes 5 hours to complete 6,000 records in a single data package, which is a long time.
The process monitor shows yellow status at one of the steps for a long time ("No message: Transformation End"), and after approximately 5 hours it completes successfully.
Please find the snapshot of the process monitor (attached file: Process monitor.png).
There is an end routine, and the transformation has direct mapping except for one target object, exchange rate, which is a master data lookup on a DSO (attached file: Transformation rule.png).
The lookup DSO /BI0/AFIGL_DS00 in the code below has DOCNUM as a primary key, but not POSKY. Since one of the fields is not a primary key, a secondary index was created for the lookup DSO. But it still takes a huge amount of time to finish the last step mentioned in the snapshot.
The setting for parallel processing is 1.
DTP > Update tab > Error handling is set to "No update, no reporting". There is also an error DTP present, which I believe is not used when "No update, no reporting" is chosen.
Can you please suggest the reason for such a long runtime? Also, please suggest how to find the exact place where it consumes the most time.
    End routine Logic:
        IF NOT RESULT_PACKAGE IS INITIAL.
          REFRESH IT_FIG.
          SELECT DOCNUM  POSKY DEBCRE LOCC
          FROM /BI0/AFIGL_DS00 INTO TABLE IT_FIG
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE DOCNUM = RESULT_PACKAGE-BILNO AND
                POSKY = '02'.
        LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
            READ TABLE IT_FIG INTO WA_FIG WITH KEY
                       DOCNUM = <RESULT_FIELDS>-BILNO.
            IF SY-SUBRC EQ 0.
              <RESULT_FIELDS>-DEB = WA_FIG-DEBCRE.
              <RESULT_FIELDS>-LOC_CURRC2 = WA_FIG-LOCC.
            ENDIF.
        ENDLOOP.
        ENDIF.
    Thanks in advance
    Regards
    Pradeep

Hi,
Check the code below and try to load the data:
    IF RESULT_PACKAGE IS NOT INITIAL.
      SELECT DOCNUM
             POSKY
             DEBCRE
             LOCC
        FROM /BI0/AFIGL_DS00 INTO TABLE IT_FIG
        FOR ALL ENTRIES IN RESULT_PACKAGE
        WHERE DOCNUM = RESULT_PACKAGE-BILNO
          AND POSKY  = '02'.
      LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
        READ TABLE IT_FIG INTO WA_FIG
             WITH KEY DOCNUM = <RESULT_FIELDS>-BILNO.
        IF SY-SUBRC EQ 0.
          <RESULT_FIELDS>-DEB        = WA_FIG-DEBCRE.
          <RESULT_FIELDS>-LOC_CURRC2 = WA_FIG-LOCC.
        ENDIF.
      ENDLOOP.
    ENDIF.
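If the lookup itself is the bottleneck, declaring IT_FIG as a sorted table lets the READ TABLE inside the loop use a binary search instead of a linear scan on every record. A minimal sketch under that assumption; the field lengths in TY_FIG are placeholders and must be matched to the actual DSO fields:

    TYPES: BEGIN OF TY_FIG,
             DOCNUM TYPE C LENGTH 10,  " placeholder: use the DSO field type
             POSKY  TYPE C LENGTH 2,   " placeholder
             DEBCRE TYPE C LENGTH 2,   " placeholder
             LOCC   TYPE C LENGTH 5,   " placeholder
           END OF TY_FIG.
    DATA: IT_FIG TYPE SORTED TABLE OF TY_FIG
                 WITH NON-UNIQUE KEY DOCNUM,
          WA_FIG TYPE TY_FIG.

    IF RESULT_PACKAGE IS NOT INITIAL.
      SELECT DOCNUM POSKY DEBCRE LOCC
        FROM /BI0/AFIGL_DS00 INTO TABLE IT_FIG
        FOR ALL ENTRIES IN RESULT_PACKAGE
        WHERE DOCNUM = RESULT_PACKAGE-BILNO
          AND POSKY  = '02'.
      LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
        " Binary search via the sorted key instead of a linear scan
        READ TABLE IT_FIG INTO WA_FIG
             WITH TABLE KEY DOCNUM = <RESULT_FIELDS>-BILNO.
        IF SY-SUBRC EQ 0.
          <RESULT_FIELDS>-DEB        = WA_FIG-DEBCRE.
          <RESULT_FIELDS>-LOC_CURRC2 = WA_FIG-LOCC.
        ENDIF.
      ENDLOOP.
    ENDIF.

The same effect can be achieved with a standard table by doing SORT IT_FIG BY DOCNUM after the SELECT and adding BINARY SEARCH to the READ TABLE.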
If you are still getting any errors, please let us know. Also:
1. Decrease the data package size in the DTP, e.g. to 10,000 or 20,000.
2. Increase the number of parallel processes at DTP level.
    Thanks,
    Phani.

  • How to tune data loading time in BSO using 14 rules files?

    Hello there,
I'm using Hyperion Essbase Administration Services v11.1.1.2 with the BSO option.
In a nightly process using MaxL, I load new data into one Essbase cube.
In this nightly update process, 14 account members are updated by running 14 rules files one after another.
These rules files connect 14 times via SQL connection to the same Oracle database and the same table.
I use this procedure because I cannot load 2 or more data fields using one rules file.
It takes a long time to load the 14 accounts one after the other.
Now my question: how can I minimize this data loading time?
This is what I found on the Oracle homepage:
    What's New
    Oracle Essbase V.11.1.1 Release Highlights
    Parallel SQL Data Loads- Supports up to 8 rules files via temporary load buffers.
In an older thread, John said:
As it is version 11, why not use parallel SQL loading; you can specify up to 8 load rules to load data in parallel.
    Example:
    import database AsoSamp.Sample data
    connect as TBC identified by 'password'
    using multiple rules_file 'rule1','rule2'
    to load_buffer_block starting with buffer_id 100
    on error write to "error.txt";
But this is for the ASO option only.
Can I use it in my MaxL script for BSO as well? Is there a sample?
What else can be done to tune the nightly update time?
    Thanks in advance for every tip,
    Zeljko

    Thanks a lot for your support. I’m just a little confused.
    I will use an example to illustrate my problem a bit more clearly.
    This is the basic table, in my case a view, which is queried by all 14 rules files:
column1 --- column2 --- column3 --- column4 --- ... --- column n
    dim 1 --- dim 2 --- dim 3 --- data1 --- data2 --- data3 --- ... --- data 14
    Region -- ID --- Product --- sales --- cogs ---- discounts --- ... --- amount
    West --- D1 --- Coffee --- 11001 --- 1,322 --- 10789 --- ... --- 548
    West --- D2 --- Tea10 --- 12011 --- 1,325 --- 10548 --- ... --- 589
    West --- S1 --- Tea10 --- 14115 --- 1,699 --- 10145 --- ... --- 852
    West --- C3 --- Tea10 --- 21053 --- 1,588 --- 10998 --- ... --- 981
    East ---- S2 --- Coffee --- 15563 --- 1,458 --- 10991 --- ... --- 876
    East ---- D1 --- Tea10 --- 15894 --- 1,664 --- 11615 --- ... --- 156
    East ---- D3 --- Coffee --- 19689 --- 1,989 --- 15615 --- ... --- 986
    East ---- C1 --- Coffee --- 18897 --- 1,988 --- 11898 --- ... --- 256
    East ---- C3 --- Tea10 --- 11699 --- 1,328 --- 12156 --- ... --- 9896
The following are 3 of the 14 (load) rules files used to load the data columns into the cube:
    Rules File1:
    dim 1 --- dim 2 --- dim 3 --- sales --- ignore --- ignore --- ... --- ignore
    Rules File2:
    dim 1 --- dim 2 --- dim 3 --- ignore --- cogs --- ignore --- ... --- ignore
    Rules File14:
    dim 1 --- dim 2 --- dim 3 --- ignore --- ignore --- ignore --- ... --- amount
Is the table design above what GlennS described as the "Data" column concept, which only allows a single numeric data value?
In that case I can't tag two or more columns as data fields. I can tag only one column as "Data field"; the other data columns I have to tag as "ignore fields during data load". Otherwise, when I validate the rules file, the error "only one field can contain the Data Field attribute" occurs.
Or may I ignore this error message and just try to tag all 14 fields as "Data fields" and load the data?
    Please advise.
Am I right that the other way is to restructure the table/view (and the rules files) as follows, so as to load all of the data in one pass:
    dim 0 --- dim 1 --- dim 2 --- dim 3 --- data
    Account --- Region -- ID --- Product --- data
    sales --- West --- D1 --- Coffee --- 11001
    sales --- West --- D2 --- Tea10 --- 12011
    sales --- West --- S1 --- Tea10 --- 14115
    sales --- West --- C3 --- Tea10 --- 21053
    sales --- East ---- S2 --- Coffee --- 15563
    sales --- East ---- D1 --- Tea10 --- 15894
    sales --- East ---- D3 --- Coffee --- 19689
    sales --- East ---- C1 --- Coffee --- 18897
    sales --- East ---- C3 --- Tea10 --- 11699
    cogs --- West --- D1 --- Coffee --- 1,322
    cogs --- West --- D2 --- Tea10 --- 1,325
    cogs --- West --- S1 --- Tea10 --- 1,699
    cogs --- West --- C3 --- Tea10 --- 1,588
    cogs --- East ---- S2 --- Coffee --- 1,458
    cogs --- East ---- D1 --- Tea10 --- 1,664
    cogs --- East ---- D3 --- Coffee --- 1,989
    cogs --- East ---- C1 --- Coffee --- 1,988
    cogs --- East ---- C3 --- Tea10 --- 1,328
    discounts --- West --- D1 --- Coffee --- 10789
    discounts --- West --- D2 --- Tea10 --- 10548
    discounts --- West --- S1 --- Tea10 --- 10145
    discounts --- West --- C3 --- Tea10 --- 10998
    discounts --- East ---- S2 --- Coffee --- 10991
    discounts --- East ---- D1 --- Tea10 --- 11615
    discounts --- East ---- D3 --- Coffee --- 15615
    discounts --- East ---- C1 --- Coffee --- 11898
    discounts --- East ---- C3 --- Tea10 --- 12156
    amount --- West --- D1 --- Coffee --- 548
    amount --- West --- D2 --- Tea10 --- 589
    amount --- West --- S1 --- Tea10 --- 852
    amount --- West --- C3 --- Tea10 --- 981
    amount --- East ---- S2 --- Coffee --- 876
    amount --- East ---- D1 --- Tea10 --- 156
    amount --- East ---- D3 --- Coffee --- 986
    amount --- East ---- C1 --- Coffee --- 256
    amount --- East ---- C3 --- Tea10 --- 9896
And the third way is to adjust the essbase.cfg parameters DLTHREADSPREPARE and DLTHREADSWRITE (and DLSINGLETHREADPERSTAGE).
    I just want to be sure that I understand your suggestions.
Many thanks for the awesome help,
    Zeljko

  • Error while data loading in real time cube

Hi experts,
I have a problem. I am loading data from a flat file. The data loads correctly up to the DSO, but when I try to load it into the cube, I get an error.
The cube is a real-time cube for planning. I have changed its status to allow data loading, but the DTP still gives an error.
It shows the error "error while extracting from DataStore", together with an RSBK 224 error and an RSAR 051 error.

What was the resolution to this issue? We are having the same issue, only with an external system (not a flat file). We get the RSAR 051 message with a return code of 238, as if it is not even getting to the RFC connection (DI_SOURCE). We have been facing this issue for a while and have even opened a message with SAP.
