Loading data – data mart scenario

We are loading from a cube to 20 other cubes with one InfoPackage.
If I created, say, 2 InfoPackages and loaded 10 cubes from each, would this reduce the load runtime? If so, what do I need to ensure is available, i.e.
the number of processes used to load each cube (does anyone know what this is, and whether it runs in background or dialog), etc.?
Thanks

Hi,
This depends on many things:
1) The number of available work processes (dialog/background) in your system.
2) The type of transfer/update rules you have defined.
3) The data packet size you have defined at the InfoPackage (IP) level.
4) How quickly the IDocs move, i.e. what you see in SM58.
5) The extraction job and the performance of the target cubes (e.g. deleting indexes before the load).
Whenever you start a load, the extraction job runs in background only, but the processing of each data packet runs in dialog. So the extraction job usually finishes quickly, while the processing and updating take the time; you can monitor this in SM51.
The dialog processes can be configured at the InfoPackage level under Scheduler > Default Data Transfer; there you can check the number of dialog processes and also increase the data packet size.
If you are on BI 7.x, use DTPs: they always run in background and are much faster because there is no tRFC layer (SM58) involved; you can also increase the number of batch/parallel processes.
Since the load is huge (you are loading to 20 cubes), I suggest running one after the other if you use 2 InfoPackages.
Use SPRO > Display IMG > Maintain Control Parameters for Data Transfer.
Regards,
Edited by: Krishna Rao on May 14, 2009 1:00 PM
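To see why a second InfoPackage only helps while the system still has free dialog processes, here is a rough back-of-the-envelope model. All numbers, including the assumption of 3 dialog processes per InfoPackage, are hypothetical:

```python
import math

def load_runtime(total_records, packet_size, secs_per_packet,
                 dialog_procs, n_infopackages):
    """Rough model: data packets from all InfoPackages compete for the
    same pool of dialog work processes, so parallelism is capped by
    whichever is smaller -- the pool or the InfoPackages' own limit."""
    packets = math.ceil(total_records / packet_size)
    # Assumption: each InfoPackage can drive up to 3 dialog processes.
    parallel = min(dialog_procs, n_infopackages * 3)
    waves = math.ceil(packets / parallel)
    return waves * secs_per_packet

one_ip = load_runtime(2_000_000, 50_000, 30, dialog_procs=6, n_infopackages=1)
two_ip = load_runtime(2_000_000, 50_000, 30, dialog_procs=6, n_infopackages=2)
print(one_ip, two_ip)  # → 420 210
```

With 6 free dialog processes, the second InfoPackage halves the runtime; with only 3 free processes it would change nothing, which is why the answer above stresses checking the available resources first.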

Similar Messages

  • Runtime Errors "UNCAUGHT_EXCEPTION" while loading data

    Hi Gurus: Need your help to resolve this loading issue (Data Mart).
    Scenario:
        Loading data from BW layer to BI layer - Cube to Cube loading of data.
        Data loaded into the DW layer cube with no issues. However, the following error occurs when loading into the
        BI layer cube (it is a 1:1 mapping except for a couple of conversion routines on standard objects such
        as Material Division, periods and Unit).
    Message:
         *Runtime Errors         UNCAUGHT_EXCEPTION*
         *Exception               CX_RSR_X_MESSAGE*   
    This happens in Program - 'SAPLRRMS'; Include -"LRRMSU13"
    What happened?
        The exception 'CX_RSR_X_MESSAGE' was raised, but it was not caught anywhere
        along the call hierarchy. Since exceptions represent error situations and this error was
        not adequately responded to, the running ABAP program 'SAPLRRMS' has to be
        terminated.
    We are aware of the following 2 OSS notes:
    855424 - Error in Formula Compiler
    872193 - Run Time Error 'UNCAUGHT_EXCEPTION' during upload
    Any other solution for this? Please let me know.
    Thanks, SMaalya

    Please replicate the DataSources of the data mart (DM) source system.
    -- sameer

  • How to load the existing Data Mart request into the cube

    Hi Friends,
    I have a scenario: I have a DSO and one cube, and I am loading data from the DSO to the cube.
    The number of records in the DSO is large, and due to some calculation in the routine I need to load the data year by year (I have data for 2007 and 2008). I have loaded the 2007 data into the InfoCube, and in the DSO I can see the data mart symbol against the request ID once the load has finished successfully.
    When I try to load the 2008 data from the DSO to the cube, I get 0 records. I realised that the data mart symbol already exists in the DSO.
    How can I load the 2008 data in this scenario? If I delete the data mart symbol, I am deleting the cube request.
    Does anyone have an idea on this?
    Thanks in advance.

    Hi,
    Things are not clear: how is the loading happening? Is it a delta or a full load, through a DTP or the 3.x flow?
    In any case, if you do a full load or a full repair based on a year selection, it should pick the records from the source; the data mart status has nothing to do with full loads.
    The data mart status only comes into play when you schedule delta loads.
    Do a full load from the DSO to the cube with a selection on the year; there is no need to delete the data mart status (deleting it would bring that request again once a delta is scheduled). Does that request in the DSO contain data for 2008 only? If yes, you can just delete the data mart status for it and run a delta; if not, do full loads as described.
    Thanks
    Ajeet
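    Ajeet's distinction between full and delta behaviour can be illustrated with a toy model (the request IDs and the flag name below are made up for illustration; the real data mart status lives in BW's administration tables):

    ```python
    # Toy model: requests in a DSO, each with a "data mart" flag once
    # the request has been sent onward to a target.
    requests = [
        {"id": 1, "year": 2007, "datamart": True},   # already sent to the cube
        {"id": 2, "year": 2008, "datamart": False},
    ]

    def delta_load(reqs):
        """A delta only picks requests not yet marked with the data mart
        status, and marks whatever it picks."""
        picked = [r for r in reqs if not r["datamart"]]
        for r in picked:
            r["datamart"] = True
        return picked

    def full_load(reqs, year):
        """A full load with a year selection ignores the data mart status."""
        return [r for r in reqs if r["year"] == year]

    print([r["id"] for r in full_load(requests, 2008)])  # picks request 2
    print([r["id"] for r in delta_load(requests)])       # also request 2, then marks it
    ```

    This is why a full load with a 2008 selection works regardless of the symbol, while a delta would bring request 1 again if its data mart status were deleted.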

  • Error Caller 09 contains error message - Data Marts loading(cube to ODS)

    Dear all,
          Please help me with this problem; it is very urgent.
          I have one process chain that loads data within BW only, through data marts. In that process chain, one process loads data from a cube (created by us) to an ODS (also created by us). Data is loaded through full update, for the period specified in the 'Calendar Day' field in the data selection.
         Previously I was able to load data for 2 months, but some days ago the extraction process suddenly got stuck in the background for a long time and showed the following error:
              Error message from the source system
              Diagnosis
             An error occurred in the source system.
              System Response
             Caller 09 contains an error message.
             Further analysis:
             The error occurred in Extractor . 
             Refer to the error message.
             Procedure
             How you remove the error depends on the error message.
             Note
             If the source system is a client workstation, it is possible that the file you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
                  Then we killed that process on the server. On another attempt it showed a calmonth/timestamp error. After reducing the data selection period, the load succeeded, and after that I was able to load data for 20 days. After some days the process got stuck again; I followed the same procedure, reduced the period to 15 days and continued. Now I can't even load data for 5 days successfully in one attempt: I have to kill the background process and repeat it, and then it sometimes loads.
             Please suggest some solutions as soon as possible; I am waiting for your reply. Points will be assigned.
             Thanks,
              Pankaj N. Kude
    Edited by: Pankaj Kude on Jul 23, 2008 8:33 AM

    Hi Friends !
                      I didn't find any short dump for it in ST22.
                      What actually happens is that the request continues to run in the background indefinitely. At that time the
    Status tab in the process monitor shows these messages:
                        Request still running
                        Diagnosis
                        No errors found. The current process has probably not finished yet.
                         System Response
                         The ALE inbox of BI is identical to the ALE outbox of the source system
                           or
                         the maximum wait time for this request has not yet been exceeded
                           or
                        the background job has not yet finished in the source system.
                       Current status
                       in the source system
                        And the Details tab shows the following messages:
                        Overall status: missing messages or warnings
                        Requests (messages): everything OK
                                Data request arranged
                                Confirmed with: OK
                        Extraction (messages): missing messages
                                Data request received
                                Data selection scheduled
                                Missing message: number of sent records
                                Missing message: selection completed
                        Transfer (IDocs and tRFC): everything OK
                                Info IDoc 1: application document posted
                                Info IDoc 2: application document posted
                        Processing (data packet): no data
                        This process runs for an infinite time; then I have to kill it from the server, and it shows the Caller 09 error in the Status tab.
                        Should I change the value of that memory parameter on the server or not? We are planning to try it today. Does it really relate to this problem? Will it be helpful? What are the risks?
                        Please give your suggestions as early as possible; I am waiting for your reply.
      Thanks,
    Pankaj N. Kude

  • Automatic loading from data marts

    Hi All,
    I have a cube which has a data flow wherein I have an ODS1 which gives data to ODS2 and then finally to the cube.
    I have put this entire process in a process chain.
    But many times, due to an erroneous record, the ODS1 load fails, and then I have to manually correct the PSA and do a manual activation in ODS1 as well.
    I now want to know whether the system will pick up the processes from the process chain again and continue the loading automatically, OR whether automated loading settings can be defined in the InfoPackage so that subsequent data targets are loaded once ODS1 has been loaded successfully.
    Also, if I make such a setting, will the job run under my username or under BWALEREMOTE?
    Thanks,
    Sharmishtha Biswas

    Thanks for your reply.
    But is it possible that the data mart does this automatically?
    I have observed that some of the subsequent InfoProviders get loaded automatically after I manually load and activate one data mart.
    I wanted to find an explanation for this.
    Thanks,
    sharmishtha

  • Data mart data load InfoPackage gets short dumps

    This is related to the solution Vijay provided in the thread "What is the functionality of 0FYTLFP [Cumulated to Last Fiscal Year/Period]".
    I encountered the problem again with a data mart load: I created different initial-load InfoPackages with different data selections and ran them separately, so the initial data packets got messed up, and whenever I try to create a new InfoPackage I always get short dumps. RSA7 on the BW system itself doesn't show the faulty entry.
    I tried the program RSSM_OLTP_INIT_DELTA_UPDATE you provided; it has three parameters:
    LOGSYS (required)
    DATASOUR (required)
    ALWAYS (not required)
    I filled in LOGSYS with our BW system's source system name (as in the InfoPackage) and filled in DATASOUR with the DataSource name 80PUR_C01, but nothing happens when I click the execute button!
    Then I tried another option you suggested by checking the entries in the following three tables:
    ROOSPRMS Control Parameters Per DataSource
    ROOSPRMSC Control Parameter Per DataSource Channel
    ROOSPRMSF Control Parameters Per DataSource
    I find there is no entry in the 1st table for DataSource 80PUR_C01, but there are two entries in each of the 2nd and 3rd tables. Should I go ahead and delete these two entries from those two tables?
    Thanks

    Kevin,
    Sorry, I didn't follow your problem/question, but be careful when you want to modify the content of these tables!
    There is a high risk of inconsistencies... (why don't you ask for SAP support via OSS for this situation?)
    Hope it helps!
    Bye,
    Roberto

  • Data Mart load does not support Delta load?

    For the data mart load, we create an InfoPackage for delta load. Under the Data Selection tab, for the InfoObject 0FISCPER, we pick the OLAP variable 0FPER [Current Fiscal Year/Period (SAP Exit)] and save the InfoPackage. But when we open it again, we find that on the 0FISCPER row under the Data Selection tab, the From Value column has become 001/2005 and the To Value column 011/2005, and the whole Data Selection tab screen has become non-editable!
    The correct behavior should be that after saving the InfoPackage, the From Value/To Value columns for 0FISCPER are blank, only the Type column value is 7, and after reopening the InfoPackage the Data Selection tab screen is editable.
    But if we select Full update mode and then switch back to the Data Selection tab, the behavior is correct.
    We wonder: if SAP doesn't support delta load for data mart loads, why does the Delta radio button show up under the Update tab after we have run an initialization? It doesn't make any sense!

    Dear,
    As already suggested, you cannot use different selection criteria for your delta, since the delta automatically loads on the basis of the initialization selection. If you think about it, this is logical: you cannot initialize a flow for certain conditions and then decide to upload other (different) records in delta mode...
    Hope now is clearer...
    Bye,
    Roberto
    Clearly, if you perform a full load (like a 'repair' full, from a functional point of view), this problem doesn't exist (data is taken from the active table and not from the change log!)

  • Issue in loading data from data mart

    The actual load from APO to BW is happening, but the load fails when data is loaded from the base cube to the target cube (data mart). This flow is in 3.x.
    Attached document contains the system msg.

    Your load is still in process.
    Where is the load failure message?
    Try to activate the source (the 8DSO/cube export DataSource) and later activate the update rules through the standard programs.
    You can check your job details in SM37:
    from the InfoPackage monitor, copy the request ID;
    go to SM37 and enter the copied request ID as the job name;
    use job status Active/Cancelled/Finished;
    use proper date ranges;
    User: *

  • Problem loading from DATA MART to ODS, SERVICE-API

    Hi gurus,
    I have a problem loading data from a data mart to an ODS (full load).
    But if I test the extractor itself (RSA3), it works fine.
    I have already replicated, generated and checked the transfer rules for the data mart, but when I try to load data I get these two messages:
    Message no. R3005
    "The InfoSource 8TEST_MM specified in the data request, is not defined in the
    source system."
    Message no. RSM340
    Errors in source system.
    BTW: this test system was copied from the production system. So far I have had no problems with the system, but I have never tried loading from data marts before.
    Any ideas?
    Regards, Uros

    Thanks for your answer.
    I already did that and everything is fine: I see the InfoSource, and if I test the extractor it works fine, but the InfoPackage gives me the errors mentioned above.
    I have already looked through the notes and couldn't find anything useful.
    I forgot to mention that I generated the export DataSource from a transactional ODS.
    Regards, Uros

  • Data Mart Staging Area Initial Load

    Hi
    I have to populate a Data Mart staging schema using production data. The data I need to move is around 2TB (a few tables are 100GB in size). An RMAN backup is not an option because I am moving a subset of the data; transportable tablespaces are not an option for the same reason; Data Pump might be an option, but we might not be able to get a consistent view in the database (ORA-01555 snapshot too old) because the load is high.
    I am thinking of this option: using Data Pump to export one table at a time (for the large tables) and, in the replicat, adding a filter CSN > flashback_scn for these tables. Is this possible?
    Thanks

    Yes, you can.
    Just use this in the replicat parameter file:
    map apps.fnd_users, target apps.fnd_users, FILTER ( @GETENV ("TRANSACTION", "CSN") > 1234);
    where 1234 is your expdp flashback_scn. If you have 10 SCNs for 10 tables, just use a different FILTER CSN in each of the mapping statements.
    Regards
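    The effect of that FILTER clause can be sketched in Python (a toy model of the filtering logic only, not GoldenGate syntax; the CSN values are made up):

    ```python
    # Toy re-implementation of the replicat FILTER: only apply trailing
    # changes whose commit CSN is greater than the export's flashback SCN,
    # so rows already contained in the Data Pump export are not re-applied.
    FLASHBACK_SCN = 1234  # hypothetical SCN used for the expdp of this table

    def should_apply(change, flashback_scn=FLASHBACK_SCN):
        return change["csn"] > flashback_scn

    changes = [
        {"csn": 1200, "op": "INSERT"},  # already contained in the export
        {"csn": 1300, "op": "UPDATE"},  # committed after the export
    ]
    applied = [c for c in changes if should_apply(c)]
    print([c["csn"] for c in applied])  # → [1300]
    ```

    This is why each table exported at a different SCN needs its own FILTER value: the cut-off must match that table's export point exactly.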

  • Table on delta loads from data mart

    Hi,
    I am loading data from two DSOs (let's call them A and B) to another DSO (C) in BI 7 with a BW 3.x delta InfoPackage.
    Now I want to know where I can find information on the timestamp or last request of the last delta load from A and B to C.
    So in fact I would like to know how the system knows, at the next delta load, which requests in A and B have not yet been loaded to C.
    In which table can I find the information for ODS A and B that the system uses to determine which data in the change log has or has not been loaded to ODS C or other targets? (In fact there should be a comparison table in BW, like the ROOS* tables in R/3.)
    Thanks in advance!
    Kind regards,
    Bart

    Hi Guys,
    Thanks for the answers.
    I know how to check everything in the Workbench, but I want to know where the delta information is stored technically.
    Just for the sake of completeness:
    Due to some issues, several successive loads from A and B were correctly loaded into DSO C (the 'new' table) but could not be activated. It is not possible to do a repeat or anything else; I won't go into too much detail, just take this for granted.
    The only way we can 'solve' the problem is to make the system believe that the last 3 loads (activated data) in A and the last two loads in B have not been loaded to C yet. Just deleting the last deltas in C and running a new delta from A and B to C will not work.
    Therefore I want to 'manipulate' the table that is read by a delta load. If I can change the timestamp or request numbers in that table, I can make the system believe that some requests have not been loaded to C yet.
    Dirty, but I think it should work. I am still trying to figure out which table contains information about the data mart DataSources (8A and 8B) and the last delta load to the targets.
    Hope this is more clear.
    Thanks in advance!
    Kind regards,
    Bart
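    The bookkeeping Bart wants to manipulate can be sketched as follows (a toy Python model; the request IDs and pointer structure are invented for illustration and do not name the real BW tables):

    ```python
    # Toy sketch of data mart delta bookkeeping: per (source, target) pair
    # the system stores a pointer to the last request already delivered;
    # a delta picks everything above the pointer. Lowering the pointer
    # makes old requests look undelivered again -- the effect Bart wants.
    delta_pointer = {("8A", "C"): 103, ("8B", "C"): 55}  # last delivered request IDs
    source_requests = {"8A": [101, 102, 103], "8B": [54, 55]}

    def next_delta(source, target):
        ptr = delta_pointer[(source, target)]
        new = [r for r in source_requests[source] if r > ptr]
        if new:
            delta_pointer[(source, target)] = max(new)
        return new

    print(next_delta("8A", "C"))      # nothing above the pointer: []
    delta_pointer[("8A", "C")] = 100  # "manipulate" the pointer downwards
    print(next_delta("8A", "C"))      # → [101, 102, 103]
    ```

    The real mechanism in BW is more involved (timestamps, request status, change-log tables), but the principle is the same: the delta is driven by a stored pointer, not by the data itself.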

  • Error message while loading data from data mart

    Please can some one help on this.
    While loading data from one cube to another (I am actually creating a backup of the original cube), I get the error message "internal error occurred with time split" for one record. The long text of the error message says Message no. RSAU101.

    Take a look at this thread:
    Internal error occured with time split
    Hope it helps,
    Gilad

  • Loaded data is not yet visible in Reporting (see long text) Message no. RSM

    Hi All,
    I have an issue in the BI part of Solution Manager. One Solution Manager scenario uses its own BI component (BI_CONT 7.03 SP08, SAP_BW 700 SP15).
    There is a background job which has to update the data in the InfoCube 0SMD_PE2D. This job has been failing for the past 2 months.
    When I check the logs of that background job, the long text is as below.
    Loaded data is not yet visible in Reporting (see long text)
    Message no. RSM1130
    Diagnosis
    There is an inconsistency between the load status of the data and the option of reporting on this data.
    There is data in the InfoProvider that is OK from a quality point of view, but is not yet displayed in reporting.
    The problem, for example, is to do with request 0000000494, number APO_R4CEHVODV7XS7MRL741GSMV9UB.
    Procedure
    Choose Refresh to remove the inconsistency.
    I went to RSA1, selected that cube and chose Manage. On the right-hand side, under the Requests tab, it shows all the requests with request status green. In the Monitor there is no data available.
    I found a similar thread [Loaded data is not yet visible in Reporting for Data Mart destination cub; unfortunately no solution was posted there.
    Could any one help me
    regards
    Naveen

    Hi Aduri,
    In LISTCUBE I am getting the following error:
    SQL Error 0
    Unknown error in SQL interface
    System error in program SAPLRS_EXCEPTION and form RS_EXCEPTION_TO_MESSAGE
    The source system is Solution Manager itself (production client), and BI by default runs in client 001. We don't have any APO system in our landscape.
    Kindly advise.
    Regards
    Naveen

  • Table Owners and Users: Best Practice for Data Marts

    2 Questions:
    (1) We are developing multiple data marts that share the same instance. We want to deny access to users while tables are being updated. We have one generic user (BI_USER) with read access through one of the popular BI tools. For the current (first) data mart we denied access by revoking the privilege from BI_USER; however, going forward, the tables of the other data marts will be updated on different schedules and we do not want to deny access to all the data marts at once. What is the best approach?
    (2) What is the best methodology for ownership of tables that are shared across data marts? Can we create one generic ETL_USER to update tables with different owners?
    Thanks,
    Jim Masterson

    If you have to go with generic logins, I would at least have separate generic logins for each data mart.
    Ideally, data loads should be transactional (or nearly transactional), so you don't have to revoke access ever. One of the easier tricks to accomplish this is to load data into a shadow table and then rename the existing table and the shadow table. If you can move the data from the shadow table to the real table in a single transaction, though, that's even better from an availability standpoint.
    If you do have to revoke table access, you would generally want to revoke SELECT access to the particular object from a role while the object is being modified. If this role is then assigned to all the Oracle user accounts, everyone will be prevented from viewing the table. Of course, in this scenario, you would have to teach your users that "table not found" means that the table is being refreshed, which is why the zero downtime approach makes sense.
    You can have generic users that have UPDATE access on a large variety of tables. I would suggest, though, that you have individual user logins to the database and use roles to grant whatever ad-hoc privileges users need. I would then create one account per data mart (with perhaps one additional account for the truly generic tables) to own each data mart's objects. Those owning accounts would then grant different database privileges to different roles, and you would then grant those roles to different users. That way, Sue in accounting can have SELECT access to portions of one data mart and UPDATE access to another data mart without granting her every privilege under the sun. My hunch is that most users should not be logging in to, let alone modifying, all the data marts, so their privileges should reflect that.
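    The shadow-table refresh described above can be sketched as follows (using Python's sqlite3 purely for illustration; in Oracle you would use `ALTER TABLE ... RENAME TO` or a partition exchange, and the table names here are hypothetical):

    ```python
    import sqlite3

    # Minimal sketch of the "shadow table" refresh: load into a shadow
    # table while readers keep using the live one, then swap the names so
    # readers never see a half-loaded table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
    conn.execute("INSERT INTO sales VALUES (1, 10.0)")

    # 1) Build and fill the shadow copy while readers still query `sales`.
    conn.execute("CREATE TABLE sales_shadow (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO sales_shadow VALUES (?, ?)",
                     [(1, 10.0), (2, 99.5)])

    # 2) Swap the two tables inside one short transaction.
    with conn:
        conn.execute("ALTER TABLE sales RENAME TO sales_old")
        conn.execute("ALTER TABLE sales_shadow RENAME TO sales")
    conn.execute("DROP TABLE sales_old")

    print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])  # → 2
    ```

    The swap itself takes milliseconds regardless of table size, which is what makes this approach close to zero downtime.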
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Data Mart - Configuration steps - Please help

    We would like to load data from our BW DEV system to a BW sandbox. We have an RFC connection, and we have created DEV as a source system in the sandbox.
    We have similar cubes, ODSs and InfoSources in both DEV and the sandbox, but there is no data in the sandbox for any ODS/cube.
    Can anyone explain the data mart configuration on the DEV side and the sandbox side, and how to load data from a source (DEV) ODS/cube to a target (sandbox) ODS/cube, step by step?
    We would like to load data for a particular period, for example only for January 2007.
    Advance Thanks.

    Hi Bhanu Gupta,
    Thanks for your advice. I have a few questions, please help.
    For example:
    1. I have InfoSource 2LIS_17_AAA in both systems (DEV and sandbox). I see that the data flows from this InfoSource (2LIS_17_AAA) to cube AAA in DEV.
    What do I need to do to load data from cube AAA (DEV) to cube BBB (sandbox)? (I am confused whether I have to create an export DataSource for cube AAA, or what I should do with the InfoSource 2LIS_17_AAA.)
    2. Another case: I have two InfoSources, 2LIS_17_CCC and 2LIS_17_DDD, in both systems (DEV and sandbox). In DEV the data flows from these InfoSources (2LIS_17_CCC & 2LIS_17_DDD) to ODS EEE, and then from this ODS (export DataSource 8ODSEEE) to cube FFF.
    In the sandbox we have these InfoSources, ODS EEE, 8ODSEEE and cube FFF.
    Can you please explain the steps? Thanks.
