Real-time data load in BI

Hello experts ,
I have a table on the R/3 side that is updated very frequently.
How can I get real-time data from that table into BI?
Thanks
Pankaj

Hi,
You can create a generic DataSource based on that table and enable the delta mechanism on it. This lets you capture deltas, so the frequently changed data comes across to BI.
The following link gives you an idea of how to create a generic delta DataSource:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
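The delta mechanism itself is configured in the DataSource maintenance (transaction RSO2) rather than coded, but the idea behind a timestamp-based generic delta can be sketched as follows (the rows, names, and the 5-minute safety interval are invented for illustration): the extractor keeps a pointer from the previous run and re-reads a small safety interval so that records posted while the last load ran are not lost.

```python
from datetime import datetime, timedelta

# Illustrative rows standing in for the R/3 table: (document key, last-changed timestamp).
rows = [
    ("DOC1", datetime(2024, 1, 1, 8, 0)),
    ("DOC2", datetime(2024, 1, 1, 9, 30)),
    ("DOC3", datetime(2024, 1, 1, 11, 0)),
]

def extract_delta(rows, last_pointer, safety=timedelta(minutes=5)):
    """Select only rows changed since the last run.

    The pointer from the previous run is moved back by a small safety
    interval so records posted while the previous load ran are not lost;
    the new pointer is stored for the next run.
    """
    lower = last_pointer - safety
    delta = [(key, ts) for key, ts in rows if ts > lower]
    new_pointer = max((ts for _, ts in rows), default=last_pointer)
    return delta, new_pointer

delta, pointer = extract_delta(rows, last_pointer=datetime(2024, 1, 1, 9, 0))
# Only DOC2 and DOC3 lie after (pointer - safety interval)
```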

Similar Messages

  • Time Data loading with missing format

    Hi experts,
    After we load time data, we find the following wrongly formatted values:
    0CALDAY:
    20.080.112 (error)
    2008.01.12 (correct)
    0CALMONTH:
    200.801 (error)
    2008.01 (correct)
    0CALQUARTER:
    20.081 (error)
    20081 (correct)
    Could anyone tell us, step by step, how to correct the wrongly formatted data? Many thanks!

    What is the source for this data?
    Either have it corrected in the source system,
    or correct it in the PSA and load it from there.
    Does it happen only for a few records?
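    If the values have to be repaired in the PSA, the fix amounts to stripping the stray grouping characters and restoring the digits-only internal format (YYYYMMDD for 0CALDAY, YYYYMM for 0CALMONTH, YYYYQ for 0CALQUARTER). In BW this would be a small ABAP routine; here is a Python sketch of the same cleanup:

```python
def fix_internal_value(raw, length):
    """Drop everything but digits, then keep the expected number of
    characters of the internal format (e.g. 8 for YYYYMMDD)."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    return digits[:length]

# The garbled values from the load, restored to internal format:
assert fix_internal_value("20.080.112", 8) == "20080112"  # 0CALDAY
assert fix_internal_value("200.801", 6) == "200801"       # 0CALMONTH
assert fix_internal_value("20.081", 5) == "20081"         # 0CALQUARTER
```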

  • Table Name Storing date/time Data Loaded.

    Hi All,
    In BI I have a few InfoProviders to which data is loaded frequently. Is there a table in BI that stores information such as the date and time data was loaded to the corresponding InfoProvider? If so, which transaction do I use to reach that table?
    Thanks in Advance,
    Rajesh.

    Check table RSSTATMANPART in transaction SE11.

  • Message while data load is in progress

    Hi,
    We load data into the InfoCube twice every day, in the morning and in the afternoon.
    The loading methodology is always delete-and-load, so at any given point we have only one request in the cube. The data load takes around 20-30 minutes.
    When a user runs the query during the data load, he gets the message 'No Applicable Data Found'. Can anyone please advise how we could show a more helpful message instead, such as 'Data is being updated in the system. Please try again later.'?
    We are using the BEx Browser with a template and a query attached to the template.
    Please advise.
    Regards
    Ramesh Ganji

    Hi,
    Tell the users the data load schedule so that they know when loads are in progress and data is not yet available for reporting, and ask them to avoid running reports during that window. Add a buffer of around 15-20 minutes, as there might be an issue somewhere down the line.
    You could also reschedule the process chain so that it finishes before the users come in.
    As for the functionality you are referring to, I am not sure it can be achieved.
    Regards
    Puneet

  • Data loading problem with Movement types

    Hi Friends,
    I extracted data to the BW side using the DataSource General Ledger: Line Item Data (0FI_GL_4).
    The problem is that the movement type is missing for some documents,
    even though RSA3 shows it correctly.
    When I restrict the InfoPackage on the BW side to one particular document, the data loads perfectly.
    The DataSource has 53,460 records; among them, 400 records of document type 'WE' are missing their movement types.
    Please give me a solution for loading the data with the movement types.
    I checked the particular document 50000313 in RSA3 and it shows the movement types, but when I load the data to BW, the movement types do not come across. If I restrict the InfoPackage to document 50000313, the movement types load correctly. The extractor has about 55,000 records.
    This is a very urgent problem; please reply as soon as you can.
    Thanks & Regards,
    Guna.
    Edited by: gunasekhar raya on May 8, 2008 9:40 AM

    Hi,
    We enhanced the General Ledger (0FI_GL_4) extractor with the movement type field (MSEG-BWART).
    The field is populated with data for all accounting document numbers,
    except the range 50000295 to 50000615, where we are not getting movement type values.
    We did not write any routines at the transfer or update rule level;
    we just mapped the BWART field to the 0MOVETYPE InfoObject.
    When we restrict the InfoPackage to the particular document 50000313, the data loads into the cube with movement types.
    But when we remove the restriction and load, the movement type data is missing for documents 50000295 to 50000615.
    Please give me a solution; I need to solve this very urgently.
    Thanks,
    Guna.

  • Common errors in Data Loading

    Can anybody tell me the common errors we come across in BW/BI for the following:
    1] Data loading into InfoCube/ODS/DSO
    2] Aggregates
    3] PSA
    4] DTP
    5] Transformations
    Thanks in advance

    Hi,
    Here is a list of common issues we face while loading data into BW:
    Data loading into InfoCube/ODS/DSO:
    1) Missing SID or missing master data
    2) Data load cancelled because of various exceptions, e.g. Message Type 'X', DBIF SQL error
    3) Data records found in duplicate
    4) ALPHA-conforming value error
    5) Invalid time interval
    6) Time overlap error
    7) Job cancelled in the source system
    8) RFC connection error
    9) DataSource replication error
    10) Data load locked by an active change run
    11) Attributes/target locked by user 'xyz'
    12) Job cancelled because the previous load is still running
    13) Target locked by a change run
    (Sometimes data loads don't fail but run much longer than usual without any progress, because of stuck tRFCs in the source system, poor source system performance, the job sitting in released state for a long time, etc.)
    Aggregate rollups:
    1) No filled aggregates available, rollup not possible
    2) Rollup locked by an active change run
    3) Job cancelled because of various exceptions
    PSA updates:
    1) Job cancelled because of various exceptions
    2) Missing SID value
    ODS activations:
    1) Request to be activated should be green (loading of data into the ODS is still going on)
    2) Key exists in duplicate
    3) Job cancelled because of various exceptions (short dumps)
    Attribute change run:
    1) Locked by a rollup
    2) Job cancelled because of various exceptions
    3) Locked by another change run
    Hope it helps...
    Cheers,
    Habeeb

  • Data load times

    Hi,
    I have a question regarding data loads. We have a process chain which includes 3 ODS objects and a cube.
    Basically, ODS A gets loaded from R/3, and from ODS A the data is then loaded into two other targets, ODS B and ODS C, and into CUBE A.
    When I go to the monitor screen of this load (ODS A -> ODS B, ODS C, CUBE A), the total time shows as 24 minutes.
    We have some other steps in the process chain, where ODS B -> ODS C and ODS C -> CUBE 1.
    When I go to the monitor screen of these data loads, the total time shows as 40 minutes.
    I am surprised, because the total run time for the chain itself is 40 minutes, and the chain also includes the data extraction from R/3, the ODS activations, the indexes, and so on.
    Can anybody throw me some light?
    Thank you all
    Edited by: amrutha pal on Sep 30, 2008 4:23 PM

    Hi All,
    I am not asking which steps need to be included in which chain.
    My question is: the process chain run time says the total time is 40 minutes, and when I go to RSMO to check the time taken for the data load from one ODS to 3 other data targets, it also shows 40 minutes.
    But the process chain also includes the ODS activations, building indexes, and extracting the data from R/3.
    So what times do we see when we click on a step in the process chain and display its messages, and what time do we see in RSMO?
    Let's take an example:
    In process chain A there is a step LOAD DATA from ODS A -> ODS B, ODS C, Cube A.
    When I right-click and display the messages for the successful load, it shows messages like
    Job started...
    Job ended...
    The total time here shows 15 minutes.
    When I go to RSMO for the same step, it shows 30 minutes...
    I am confused...
    Please help me!

  • Takes Long time for Data Loading.

    Hi All,
    Good Morning.. I am new to SDN.
    Currently I am using the DataSource 0CRM_SRV_PROCESS_H, which contains 225 fields; I use around 40 of them in my report.
    Can I hide the remaining fields at the DataSource level itself (transaction RSA6)?
    Currently the data load from PSA to ODS (ODS 1) takes a long time.
    I am also pulling some data from another ODS (ODS 2) as a lookup, and it takes a long time to update the data in the active data table of the ODS.
    Can you please suggest how to improve the data load performance in this case?
    Thanks & Regards,
    Siva.

    Hi,
    Yes, you can hide them: just check the Hide box for those fields. Are you on BI 7.0 or BW 3.x? Either way, is the number of records huge?
    If so, you can split the records and execute, i.e. use the same InfoPackage but run it several times with different selections.
    Check in ST04 whether there are any locks or lock waits. Then go to SM37 and check whether any long-running job exists, and whether it is progressing: double-click the job, copy the PID from the job details, go to ST04, expand the node, and check whether you can find that PID there.
    Also check the system log in SM21 and the short dumps in ST22.
    To improve performance, you can try to increase the virtual memory or the servers, if possible; this increases the number of work processes, since when many jobs run at the same time there may be no free work processes left to proceed.
    Regards,
    Debjani

  • Data load to DSO takes long time to finish

    Dear All,
    We have a data load from a DataSource to a standard DSO. The load takes 5 hours to complete 6,000 records in a single data package, which is a very long time.
    The process monitor shows a yellow status for a long time at one step, "No message: Transformation End", and after approximately 5 hours it completes successfully.
    Please find the snapshot of the process monitor (attached file: Process monitor.png).
    There is an end routine, and the transformation has direct mapping, except for one target object, the exchange rate, which is a master data lookup on a DSO (attached file: Transformation rule.png).
    The lookup DSO /BI0/AFIGL_DS00 in the code below has DOCNUM as a primary key, but not POSKY. Since one of the fields is not part of the primary key, a secondary index was created on the lookup DSO. Still, the last step mentioned in the snapshot takes a huge amount of time.
    The setting for parallel processes is 1.
    DTP -> Update tab -> Error handling -> "No update, no reporting". There is also an error DTP present, which I believe serves no purpose when "No update, no reporting" is chosen.
    Can you please suggest the reason for such a long runtime, and how to find the exact place where the time is spent?
    End routine logic:
        IF NOT RESULT_PACKAGE IS INITIAL.
          REFRESH IT_FIG.
          " Fetch debit/credit indicator and local currency for all billing numbers
          SELECT DOCNUM POSKY DEBCRE LOCC
            FROM /BI0/AFIGL_DS00 INTO TABLE IT_FIG
            FOR ALL ENTRIES IN RESULT_PACKAGE
            WHERE DOCNUM = RESULT_PACKAGE-BILNO
              AND POSKY  = '02'.
          LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
            " Linear search on a standard table, executed once per result row
            READ TABLE IT_FIG INTO WA_FIG
                 WITH KEY DOCNUM = <RESULT_FIELDS>-BILNO.
            IF SY-SUBRC EQ 0.
              <RESULT_FIELDS>-DEB        = WA_FIG-DEBCRE.
              <RESULT_FIELDS>-LOC_CURRC2 = WA_FIG-LOCC.
            ENDIF.
          ENDLOOP.
        ENDIF.
    Thanks in advance
    Regards
    Pradeep

    Hi,
    Check the code below and try the load with it. Apart from removing the stray assignments, it sorts the lookup table once and uses a binary search instead of a linear read per row:
        IF RESULT_PACKAGE IS NOT INITIAL.
          SELECT DOCNUM POSKY DEBCRE LOCC
            FROM /BI0/AFIGL_DS00 INTO TABLE IT_FIG
            FOR ALL ENTRIES IN RESULT_PACKAGE
            WHERE DOCNUM = RESULT_PACKAGE-BILNO
              AND POSKY  = '02'.
          " Sort once so the READ below can use a binary search
          SORT IT_FIG BY DOCNUM.
          LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
            READ TABLE IT_FIG INTO WA_FIG
                 WITH KEY DOCNUM = <RESULT_FIELDS>-BILNO
                 BINARY SEARCH.
            IF SY-SUBRC EQ 0.
              <RESULT_FIELDS>-DEB        = WA_FIG-DEBCRE.
              <RESULT_FIELDS>-LOC_CURRC2 = WA_FIG-LOCC.
            ENDIF.
          ENDLOOP.
        ENDIF.
    If you are still getting any error, please let us know.
    1. Decrease the data packet size in the DTP, e.g. to 10,000 or 20,000.
    2. Increase the number of parallel processes at DTP level.
    Thanks,
    Phani.
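    The performance point behind such a lookup, independent of ABAP: a READ TABLE ... WITH KEY on a standard table scans the lookup table once per result row (roughly O(n*m)), while a sorted table with BINARY SEARCH or a hashed table finds each key in logarithmic or constant time. The same contrast in a Python sketch (the data is invented; `dict` plays the role of an ABAP hashed table):

```python
# (DOCNUM, DEBCRE)-style lookup pairs and a package of billing numbers.
lookup_rows = [(i, i * 2) for i in range(2000)]
result_package = list(range(2000))

def linear_lookup(package, rows):
    """One full scan per row -- what READ TABLE does on a standard table."""
    out = []
    for doc in package:
        for key, value in rows:
            if key == doc:
                out.append(value)
                break
    return out

def keyed_lookup(package, rows):
    """Build a key index once -- the effect of a hashed table in ABAP."""
    index = dict(rows)
    return [index[doc] for doc in package if doc in index]

# Both return the same result; the keyed version does far less work.
assert keyed_lookup(result_package, lookup_rows) == linear_lookup(result_package, lookup_rows)
```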

  • How to tune data loading time in BSO using 14 rules files ?

    Hello there,
    I'm using Hyperion Essbase Administration Services v11.1.1.2 and the BSO option.
    In a nightly process using MaxL, I load new data into one Essbase cube.
    In this nightly update process, 14 account members are updated by running 14 rules files one after another.
    These rules files connect 14 times via a SQL connection to the same Oracle database and the same table.
    I use this procedure because I cannot load two or more data fields using one rules file.
    It takes a long time to load the 14 accounts one after the other.
    Now my question: how can I minimise this data loading time?
    This is what I found on Oracle Homepage:
    What's New
    Oracle Essbase V.11.1.1 Release Highlights
    Parallel SQL Data Loads- Supports up to 8 rules files via temporary load buffers.
    In an Older Thread John said:
    As it is version 11 why not use parallel sql loading, you can specify up to 8 load rules to load data in parallel.
    Example:
    import database AsoSamp.Sample data
    connect as TBC identified by 'password'
    using multiple rules_file 'rule1','rule2'
    to load_buffer_block starting with buffer_id 100
    on error write to "error.txt";
    But this is for the ASO option only.
    Can I use it in my MaxL for BSO as well? Is there a sample?
    What else can be done to tune the nightly update time?
    Thanks in advance for every tip,
    Zeljko

    Thanks a lot for your support. I’m just a little confused.
    I will use an example to illustrate my problem a bit more clearly.
    This is the basic table, in my case a view, which is queried by all 14 rules files:
    column1 --- column2 --- column3 --- column4 --- ... ---column n
    dim 1 --- dim 2 --- dim 3 --- data1 --- data2 --- data3 --- ... --- data 14
    Region -- ID --- Product --- sales --- cogs ---- discounts --- ... --- amount
    West --- D1 --- Coffee --- 11001 --- 1,322 --- 10789 --- ... --- 548
    West --- D2 --- Tea10 --- 12011 --- 1,325 --- 10548 --- ... --- 589
    West --- S1 --- Tea10 --- 14115 --- 1,699 --- 10145 --- ... --- 852
    West --- C3 --- Tea10 --- 21053 --- 1,588 --- 10998 --- ... --- 981
    East ---- S2 --- Coffee --- 15563 --- 1,458 --- 10991 --- ... --- 876
    East ---- D1 --- Tea10 --- 15894 --- 1,664 --- 11615 --- ... --- 156
    East ---- D3 --- Coffee --- 19689 --- 1,989 --- 15615 --- ... --- 986
    East ---- C1 --- Coffee --- 18897 --- 1,988 --- 11898 --- ... --- 256
    East ---- C3 --- Tea10 --- 11699 --- 1,328 --- 12156 --- ... --- 9896
    Following 3 out of 14 (load-) rules files to load the data columns into the cube:
    Rules File1:
    dim 1 --- dim 2 --- dim 3 --- sales --- ignore --- ignore --- ... --- ignore
    Rules File2:
    dim 1 --- dim 2 --- dim 3 --- ignore --- cogs --- ignore --- ... --- ignore
    Rules File14:
    dim 1 --- dim 2 --- dim 3 --- ignore --- ignore --- ignore --- ... --- amount
    Is the table design above what GlennS described as the "data column" concept, which only allows a single numeric data value?
    In this case I can't tag two or more columns as data fields; I can only tag one column as the data field and have to tag the other data fields as "ignore fields during data load". Otherwise, when I validate the rules file, the error "only one field can contain the Data Field attribute" occurs.
    Or may I skip this error message and just try to tag all 14 fields as data fields and load the data?
    Please advise.
    Am I right that the other way is to reconstruct the table/view (and the rules files) like follows to load all of the data in one pass:
    dim 0 --- dim 1 --- dim 2 --- dim 3 --- data
    Account --- Region -- ID --- Product --- data
    sales --- West --- D1 --- Coffee --- 11001
    sales --- West --- D2 --- Tea10 --- 12011
    sales --- West --- S1 --- Tea10 --- 14115
    sales --- West --- C3 --- Tea10 --- 21053
    sales --- East ---- S2 --- Coffee --- 15563
    sales --- East ---- D1 --- Tea10 --- 15894
    sales --- East ---- D3 --- Coffee --- 19689
    sales --- East ---- C1 --- Coffee --- 18897
    sales --- East ---- C3 --- Tea10 --- 11699
    cogs --- West --- D1 --- Coffee --- 1,322
    cogs --- West --- D2 --- Tea10 --- 1,325
    cogs --- West --- S1 --- Tea10 --- 1,699
    cogs --- West --- C3 --- Tea10 --- 1,588
    cogs --- East ---- S2 --- Coffee --- 1,458
    cogs --- East ---- D1 --- Tea10 --- 1,664
    cogs --- East ---- D3 --- Coffee --- 1,989
    cogs --- East ---- C1 --- Coffee --- 1,988
    cogs --- East ---- C3 --- Tea10 --- 1,328
    discounts --- West --- D1 --- Coffee --- 10789
    discounts --- West --- D2 --- Tea10 --- 10548
    discounts --- West --- S1 --- Tea10 --- 10145
    discounts --- West --- C3 --- Tea10 --- 10998
    discounts --- East ---- S2 --- Coffee --- 10991
    discounts --- East ---- D1 --- Tea10 --- 11615
    discounts --- East ---- D3 --- Coffee --- 15615
    discounts --- East ---- C1 --- Coffee --- 11898
    discounts --- East ---- C3 --- Tea10 --- 12156
    amount --- West --- D1 --- Coffee --- 548
    amount --- West --- D2 --- Tea10 --- 589
    amount --- West --- S1 --- Tea10 --- 852
    amount --- West --- C3 --- Tea10 --- 981
    amount --- East ---- S2 --- Coffee --- 876
    amount --- East ---- D1 --- Tea10 --- 156
    amount --- East ---- D3 --- Coffee --- 986
    amount --- East ---- C1 --- Coffee --- 256
    amount --- East ---- C3 --- Tea10 --- 9896
    And the third way is to adjust the essbase.cfg parameters DLTHREADSPREPARE and DLTHREADSWRITE (and DLSINGLETHREADPERSTAGE)
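    For reference, essbase.cfg settings of that kind look roughly as follows (the application and database names, and the thread counts, are placeholders to tune, not recommendations):

```
; essbase.cfg -- allow multiple threads in the prepare and write stages
DLSINGLETHREADPERSTAGE MyApp MyDb FALSE
DLTHREADSPREPARE       MyApp MyDb 3
DLTHREADSWRITE         MyApp MyDb 3
```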
    I just want to be sure that I understand your suggestions.
    Many thanks for awesome help,
    Zeljko

  • How to retrieve the new records which are posted at the time of loading data

    Hi experts,
    I have a doubt.
    If we are performing a load operation and the client posts some new records relevant to the current load while it runs, will those records be transferred to the target with the current load, or do we have to pick them up with the delta load? What problems can occur in this situation?
    Also, there is an option in RSA3, BLOCKED ORDERS; is it helpful in this situation?
    I also found an answer suggesting that at load time we should lock the base tables so that new postings are blocked. Is that the solution for the above scenario?
    Thanks in advance

    Hi Lokesh,
    It is not clear whether you are referring to the posting of records during an initialization activity or during normal delta/full loads. In the case of an initialization for an LO Cockpit DataSource, you cannot allow any postings in the source (ECC) system. In the case of normal delta or full loads, the changes are stored in tables/extract structures/delta queues and are not affected by changes in the source system during that time; changes made during the load are captured in the next delta run.
    Hope this helps!
    You may want to refer to blogs from Roberto on the extraction methods and their operations.
    Regards,
    Kunal Gandhi

  • Long time to load data from PSA to DSO -Sequential read RSBKDATA_V

    Hi,
    It is taking a long time to load data from PSA to DSO. The job does a sequential read on RSBKDATA_V, although the table contains no data.
    We are at SAPKW70105. This started yesterday; there have been no changes to the system parameters.
    Please advise.
    Thanks
    Nilesh

    Hi Nilesh,
    I guess the following SAP Note will help you in this situation.
    [1476842 - Performance: read RSBKDATA only when needed|https://websmp107.sap-ag.de/sap/support/notes/1476842]
    Just note that the reference to Support Packages is wrong. It is also included in SAP_BW 701 06.

  • Data form taking long time to load

    Hi All,
    My data form is taking a very long time to load. It has a Period dimension with 53 weeks. If I collapse the Period dimension, the data form opens quickly; otherwise it takes around 7 minutes to open.
    What can be done to improve the speed?
    Thanks

    Abhishek wrote:
    Hope this helps
    Improving data form performance
    So do I :)
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Revaluate data record at the time of loading from flat file or BI Cube

    Hello Friends,
    I want to revaluate a data record at load time using the transformation or conversion file, based on some condition.
    For example, I have a rule to identify whether a record is supposed to be multiplied by -1 or not:
    *if (ID(1:5) = str(00070) then(Record-1)
          ID(1:5) = str(00071) then (Record-2)
    Can you please guide me on how I can achieve this using the transformation file or conversion file?
    Regards,
    Vishal.

    Hi Nilanjan,
    Thanks for the reply.
    I tried the script you suggested in the conversion file for Account,
    but it is not working for me.
    Even simple multiplication and addition in the Formula column do not work:
    External   -->   *
    Internal    -->    *
    Formula   --->  Value * -1
    The conversion file above for Account was not working for me,
    so I then tried
    Formula  --> Value + 100
    It also did not work.
    Kindly suggest if I am doing anything wrong in the above file.
    Thanks,
    Nilanjan.

  • Error "cannot load request real time data targets" for new cube in BI 7.

    Hi All,
    We have recently upgraded our SCM system from 4.1 to SCM 7.0, which incorporates BI 7.0.
    I am using BI 7.0 for the first time and have the following issue:
    I created a new InfoCube and a flat-file DataSource, and successfully created the transformation and the data transfer process. Everything looked fine: I added the flat file, checked the preview, and could see the data. But when I start the job to load data into the InfoCube, the error "cannot load request real time data targets" is shown.
    I checked the cube type in the InfoCube settings; it shows as Standard. When I double-clicked on the error, the following message showed up:
    You are trying to load data into a real-time InfoCube using a DTP.
    This is only possible if the correct load settings have been defined for the InfoCube.
    Procedure
    In the object tree of the Data Warehousing Workbench, call Load Behavior of Real-Time InfoCube from the context menu of the InfoCube. Switch load behavior to Transactional InfoCube can be loaded; planning not allowed.
    I did not understand what is meant or how to change the setting. Can someone advise and walk me through it?
    Thanks
    KV

    Hi Kverma,
    Real-time InfoCubes can be filled with data using two different methods: using the transaction for entering planning data, or using BI staging; planning data cannot be loaded simultaneously. For a real-time cube you can choose the update method:
    Real-time data target can be loaded with data; planning not allowed, or
    Real-time data target can be planned; data loading not allowed.
    You can change this behaviour by right-clicking the cube, choosing Change Real-Time Load Behavior, and selecting the first option. You will then be able to load the data.
    Regards,
    Kams
