ODS to Cube Zero Record Load

Hi,
A daily load from ODS to Cube is scheduled, but the source may or may not have records. Every time the source has zero records, the load only reaches the ODS and the request status in the ODS stays yellow.
I manually change it to green and activate the request so that the process continues. Is there a way to automate this so that the ODS request is activated even when the load contains zero records?
Help would be appreciated.

Hi
You can do this:
RSMO > select your request > Settings > Evaluation of requests (traffic lights) >
"If no data is available in the system, the request..." --> select the radio button for status "Successful".
Hope it helps.
Thanks
Teja
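
A related option, if the goal is to automate this fully (for example in a process chain), is a small ABAP step that flags such zero-record requests green. The sketch below is only an outline: the function module RSBM_GUI_CHANGE_USTATE is the one commonly mentioned in similar threads for changing a request's QM status, but its exact interface should be checked in SE37, and the parameter names and the 'green' value used here are assumptions.

    " Sketch only: set a known zero-record request to green from a process-chain ABAP step.
    " Verify the function module and its parameters in SE37 first; the names I_RNR,
    " I_USTATE and the value 'G' are assumptions taken from similar forum posts.
    REPORT z_set_request_green.

    PARAMETERS p_rnr(30) TYPE c.   " request number of the zero-record load

    CALL FUNCTION 'RSBM_GUI_CHANGE_USTATE'
      EXPORTING
        i_rnr    = p_rnr   " request to be set to green
        i_ustate = 'G'.    " assumed code for green/OK - check the FM documentation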

Similar Messages

  • Cube to Cube vs ODS to Cube good for loading performance

    BW Experts,
    I have the option of loading from cube to cube or from ODS to cube. The source has about 20 million records.
    In terms of loading performance, which one is better?
    Thanks in advance,
    BWer

    Hi,
    It depends (you didn't mention whether the level of detail differs) on whether there is a lot more data in the DSO than in the cube.
    A DSO is a single table, which makes it easier to extract data from and to build additional indexes on to speed up the load.
    The InfoCube, as you know, is more complex, since it is built on the multidimensional model.
    Kind regards
    /martin

  • Zero Record load should trigger warning

    Hi,
    I have one more requirement: the client should get a warning whenever a delta load brings zero records.
    From your point of view, where is this monitoring best implemented - on the BI side or on the source system side?
    I feel we could achieve this in a process chain using email. If yes, how?
    Regards,
    Anita

    Yes, this can be done. There is a table RSMONICDP which stores the information you see in RSMO, so just check this table in a report with a WHERE condition on load type = delta and records transferred = 0 (see the sketch after this reply).
    Have a look at RSMONICDP and RSMDATASTAT.
    Regards,
    Aadil.
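
    A minimal report along those lines might look like the sketch below. It only lists the hits; the list could then be mailed from the job or process chain that runs it. The field names for the request number, update mode and transferred record count are assumptions - check the actual columns of RSMONICDP (and RSMDATASTAT) in SE11 and adjust the WHERE clause accordingly.

        " Sketch only: list delta requests that transferred zero records.
        " The fields RNR, UPDMODE and RECORDS are assumed - verify them in SE11.
        REPORT z_zero_delta_check.

        TYPES: BEGIN OF ty_mon,
                 rnr(30)    TYPE c,   " request number (assumed field RNR)
                 updmode(1) TYPE c,   " update mode, 'D' = delta (assumed field)
                 records    TYPE i,   " records transferred (assumed field)
               END OF ty_mon.

        DATA: lt_mon TYPE STANDARD TABLE OF ty_mon,
              ls_mon TYPE ty_mon.

        SELECT rnr updmode records
          FROM rsmonicdp
          INTO TABLE lt_mon
          WHERE updmode = 'D'
            AND records = 0.

        IF lt_mon IS INITIAL.
          WRITE: / 'No zero-record delta loads found.'.
        ELSE.
          LOOP AT lt_mon INTO ls_mon.
            " Each hit is a candidate for an email alert from the process chain.
            WRITE: / 'Zero-record delta request:', ls_mon-rnr.
          ENDLOOP.
        ENDIF.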

  • Issue in loading ODS to CUBE(URGENT)

    Hi All,
    We are facing an issue loading the AR, AP and G/L ODS to the cube.
    The steps we have done are:
    1) A repair full request was done in the ODS and brought in around 35 lakh records.
    2) A full load was done in the cube and brought in 35 lakh records.
    3) A delta load was done in the ODS and brought in 189 records.
    4) A delta load was done in the cube and brought in 1 lakh records.
    This is where we couldn't figure out what went wrong.
    1 lakh records to the cube is not possible, is it?
    Did we go wrong anywhere?
    Could anyone guide us on what has to be done?
    Kind Regards,
    Shanbagavalli.S

    Hi,
    Thanks for the response.
    The delta has already been done in the cube and the ODS.
    No, we didn't do an initialization. We did a repair full in the ODS and a full update to the cube.
    What we have done is:
    We planned to do a selective deletion from the G/L cube for Q2 2007.
    We did a repair full request to the ODS,
    then a full load from the ODS to the cube (as we had deleted the data from the cube).
    After doing this it pulled data for Saturday, Sunday, Monday and Tuesday (the past five days).
    But the problem is with the delta: it is still pulling only the same 189 records that were pulled in yesterday evening's data load.
    Can you tell me what methods there are to load those days' records, as well as the delta records, to the cube?
    Thanks in advance.
    shanba
    Message was edited by:
            shanbagavalli shivasankaran

  • Loading through Process Chains 2 Delta Loads and 1 Full Load (ODS to Cube).

    Dear All,
    I am loading through process chains with 2 delta loads and 1 full load from ODS to cube in BW 3.5. I am still in development.
    My loading process is:
    Start - 2 delta loads - 1 full load - ODS activation - Delete index - Further update - Delete overlapping requests from InfoCube - Create index.
    My question is:
    When I load for the first time I get some data; for the next load I should get zero records, because there is no new data, but I am getting the same number of records again. Maybe it is taking the data from the full upload, I guess. Please guide me.
    Krishna.

    Hi,
    The reason you are getting the same number of records is, as you said, the full load: after running the deltas you get all the changed records, but after those two deltas you again have a full load step, which picks up the whole of the data all over again.
    You are getting the same number of records because:
    1> You are running the chain for the first time.
    2> You ran these delta InfoPackages for the first time. If you initialized the deltas with "Initialization without data transfer", then the first delta run picks up the whole of the data, and running a full load after that will pick up the same number of records as well.
    If the two deltas you are talking about run one after another, then you got the data because of some changes; since you are loading from a single ODS to a cube, both your delta and your full load will pick up the same data "for the first time" during data marting, because they have the same DataSource (the ODS).
    Hopefully this serves your purpose.
    Thanks & Regards
    Vaibhave Sharma
    Edited by: Vaibhave Sharma on Sep 3, 2008 10:28 PM

  • Error while loading data from write optimized ODS to cube

    Hi All,
    I am loading data from a write-optimized ODS to a cube.
    I have done Generate Export DataSource and
    scheduled the InfoPackage with one selection for a full load.
    It then gave me the following error under Transfer IDocs & TRFC:
    Info IDOC 1: IDOC with errors added
    Info IDOC 2: IDOC with errors added
    Info IDOC 3: IDOC with errors added
    Info IDOC 4: IDOC with errors added
    Data package 1 arrived in BW; Processing: selected number does not agree with transferred number
    The Processing section below that is green and
    shows an update of 4 new records for data package 1.
    Please provide inputs for the resolution.
    Thanks & Regards,
    Rashmi.

    Please let me know what more details you need.
    If I press F1 for the error details, I get the following message:
    Messages from source system
    see also Processing Steps Request
    These messages are sent by IDoc from the source system. Both the extractor itself as well as the service API can send messages. When errors occur, several messages are usually sent together.
    From the source system, there are several types of messages that can be differentiated by the so-called Info-IDoc-Status. The IDoc with status 2 plays a particular role here; it describes the number of records that have been extracted in a source system and sent to BI. The number of the records received in BI is checked against this information.
    Thanks & Regards,
    Rashmi.

  • ODS to CUBE loading - taking too much time

    Hi Experts,
    I am loading data from R/3(4.7) to BW (3.5).
    I am loading with option --> PSA and then Data Target (ODS ).
    I have a selection criteria in Infopackage while loading from standard Datasource to ODS.
    It takes me 20 mins to load 300K records.
    But, from ODS to Infocube ( update method: Data Target Only), it is taking 8 hours.
    The data packet size in Infopackage is 20,000 ( same for ODS and Infocube).
    I also tried changing the data packet size, and tried a full load and a load with initialization.
    I tried scheduling it as a background job too.
    I do not have any selection criteria in the infopackage from ODS to Cube.
    Please let me know how can I decrease this loading time from ODS to Infocube.

    Hi,
    To improve the data load performance:
    1. If these are full loads, see whether you can make them delta loads.
    2. Check if there are complex routines/transformations being performed in any layer. If so, see if you can optimize that code with the help of an ABAPer.
    3. Ensure that you follow the standard steps in the chain, like deleting indexes/secondary indexes before loading and rebuilding them afterwards (see the sketch after this reply).
    4. Check whether the system processes are free when this load is running.
    5. Try to make the load as parallel as possible if it currently runs serially. Remove the PSA step if it is not needed.
    6. When the load does not get processed because of a huge volume of data, or too many records per data packet, try the following:
    1) Reduce the IDoc size to 8000 and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
    2) Run the load only to the PSA.
    3) Once that load is successful, push the data from the PSA to the targets.
    In this way you can overcome the issue.
    Also watch the data packet sizing, number range buffering, PSA partition size and the upload sequence, i.e. always load master data first, perform the change run, and then run the transaction data loads.
    Check this doc on BW data load performance optimization:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
    BI Performance Tuning
    FAQ - The Future of SAP NetWeaver Business Intelligence in the Light of the NetWeaver BI & Business Objects Roadmap
    Thanks,
    JituK
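
    For point 3, the index handling is normally done with the standard "Delete index" / "Generate index" process types in the chain, but it can also be scripted in ABAP steps around the load. A rough sketch is below; the function modules RSDU_INFOCUBE_INDEXES_DROP and RSDU_INFOCUBE_INDEXES_REPAIR are the ones usually quoted for this, but verify their interfaces in SE37 - the parameter name and the cube name used here are assumptions.

        " Sketch only: drop the cube's secondary indexes before a large load and
        " rebuild them afterwards. The parameter name I_INFOCUBE is an assumption.
        REPORT z_cube_index_handling.

        PARAMETERS p_cube(30) TYPE c DEFAULT 'ZSALES_C01'.   " hypothetical cube name

        " Before the load: drop the indexes so the inserts run faster.
        CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
          EXPORTING
            i_infocube = p_cube.

        " ... the load itself runs here, e.g. as the next step of the process chain ...

        " After the load: rebuild the indexes for reporting.
        CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
          EXPORTING
            i_infocube = p_cube.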

  • How to store as single record data coming from different ods to cube

    Hi All,
    We have the following scenario:
    the same contract values are uploaded from 3 ODS to the cube, and in the cube this information is stored as 3 different records.
    Is there any option so that records with the same contract number and the same fiscal period are stored as a single record in the cube?
    Thanks in advance.
    Regards,
    Shradda.

    Hi Shradda,
    On the performance side, consider the points below:
    1. Indexes on the cube (delete the indexes before the load and recreate them after the load).
    2. Aggregate design (decisions on base aggregates, roll-up hierarchy, BW statistics, etc.).
    3. Partitioning of the InfoCube (basically the decision on the number of partitions).
    4. Data package size (always try to have a larger data package so that pre-aggregation reduces the number of data records).
    The best source is service.sap.com; on the BI page you will find material on performance that will help you.
    To assign points, use the radio buttons on the left of the screen for each person who has responded.
    Regards,
    Vijay

  • Delta loading daily ods to cube

    Hi guys,
    The delta is not loading from the ODS to the cube. The last load into the ODS was on 3/06.
    When I try to load I get the error:
    Subsequent processing (messages): errors occurred
    Activation of M records from DataStore object 0FIGL_O10 terminated,
    Request REQU_D3K2IR788QJC4G9W7H8FWBNHK (2.130) has not, or not correctly, been updated.
    The same message is shown under
    Other (messages): errors occurred
    Diagnosis
        Request REQU_D3K2IR788QJC4G9W7H8FWBNHK (2.130 ) for DataSource 0FI_GL_10
        from source system P02CLNT800 has the status green and a lesser SID (and
        is therefore older) than the request that you currently want to update
        into the DataStore object.
        This is not possible because the sequence of requests has to be
        followed.
        Delta- and init requests have to be updated to the DataStore object in
        the request sequence.
    Please advise on this.
    Regards,
    patel

    Patel,
    Go to RSRQ and enter this request number, REQU_D3K2IR788QJC4G9W7H8FWBNHK. There you will find the message that this request has not been properly updated, and a button to update the request. Update it to the data target and activate it. You will get through this time.
    Enjoy!!
    Shambhu

  • How to delete init request when load is taking place from ods to Cube

    Dear All,
    I initiated the init load from the ODS to the cube, but the init request was not successful. Now when I click the init InfoPackage I get a short dump.
    I am trying to access this InfoPackage so that I can delete the old init and reschedule it again.
    To avoid this error I need to delete the init request somehow. Since the data load happens within the BW system only, there is no way to go to a delta queue in a source system and delete the init there.
    How can I delete the init request in this case, or how do I avoid this dump?
    Thanks & Regards,
    Anup

    From SE11, go to the table and view its contents. In that window, mark the checkbox for the relevant entries in the first column and click the Display button (F7).
    Enter "/h" in the command field and press Enter to switch on debugging.
    Click the topmost green button to execute.
    In the debugger, double-click the variable "CODE", change its value from "SHOW" to "DELE" and save (by pressing Enter in BI 7, or by clicking the pencil icon in BW 3.x).
    Press F8.
    In the result screen you will find a Delete entry in the application toolbar.
    Delete the entries one by one after checking the record content.
    It is better to ask an ABAPer to delete the table entries in debug mode, because it is a production server; otherwise you can call me on 9500066350 (I prefer that you go with an ABAPer; only call if you have no other option).

  • Load fail - ODS to Cube - INIT req

    Hi experts,
    I am loading an init request from an ODS to my cube. The request contains 12 million records in the ODS. I got the following error message in RSMO:
    Job termination in source system
    Diagnosis
    The background job for data selection in the source system has been terminated. It is very likely that a short dump has been logged in the source system
    Procedure
    Read the job log in the source system. Additional information is displayed here.
    To access the job log, use the monitor wizard (step-by-step analysis) or the menu path Environment -> Job Overview -> In Source System.
    Error correction:
    Follow the instructions in the job log messages.
    This load is from ODS to cube, so how is the source system involved here?
    It says the error is in the source system.
    How do I correct this error?
    thanks & regards
    ragu

    Hi Ragu
    When you load from ODS to cube, the "myself" source system (your own BW system) acts as the source system.
    As given in the Procedure section, to access the job log use the monitor wizard (step-by-step analysis) or the menu path Environment -> Job Overview -> In Source System, and look at the job log there.
    Also check ST22 for a short dump and let us know the error description in detail.
    Regards
    Pradip

  • Can I do Parallel Full loads from ODS to Cube.

    Hi,
    Usually I do a single full update load from the ODS to the cube. To speed up the process, can I do parallel full update loads from the ODS to the cube?
    Please advise.
    Thanks, Vijay,

    Assuming the only connection we are talking about is between a single ODS and a single cube:
    I think the only time you could speed anything up is in a full drop-and-reload scenario. You could create multiple InfoPackages based on selections and execute them simultaneously.
    If the update is a delta there is really no way to do it.
    How many records are we talking about? Is there logic in the update rule?

  • Data load error from ODS to Cube?

    Gurus,
    Here is the status of a data load; please advise.
    We are loading data from an ODS to a cube with a full load. The load runs through a process chain every half hour. Each time, the earlier load is deleted and the next load is taken into the cube. Two days ago the load went wrong. When I looked for the cause of the failure in the monitor details, I could see that the update and processing steps were missing. In the process chain log view I got information that an invalid characteristic value had appeared. But when I checked the PSA, all of the data packets are green. If there were any invalid characteristic values, the PSA should show a red status, but it does not. Please guide me on how I can solve this.
    Points will definitely be awarded for proper answers.
    Thanks in advance.
    vasu.

    Hi Vasuvasu,
    Once the records are edited and the data is reloaded, the PSA will no longer show them in red.
    Hope it helps!
    Regards
    KISHORE M REDDY
    **Winners Don't Do Different things,They Do things Differently...!**
    > I don't think this is a disk space issue, because after
    > every full load we are deleting the data.

  • Delta Load failure from ODS to Cube

    Experts,
    After a CRM upgrade I had to reinitialize the delta process. I first reloaded the ODS with an init without data transfer and then a delta load with the new data created in CRM. This worked well so far. After that, when I loaded the cube with an init without data transfer, there were no issues. But the delta load from ODS to cube doesn't work. Does anybody have any suggestions, please?
    Thanks,
    Nimesh
    The following error was observed in the Status tab for the delta load from ODS to cube:
    Error when transferring data; communication error when analyzing
    Diagnosis
    Data Packets or Info Packets are missing in BW, but there were - as far as can be seen - no processing errors in the source system. It is
    therefore probable that an error arose in the data transfer.
    With the analysis an attempt was made to read the ALE outbox of the source system, which led to an error.
    It is possible that no connection exists to the source system.
    Procedure
    Check the TRFC overview in the source system.
    Check the connection of the source system for errors and check the
    authorizations and profiles of the remote user in both the BW and
    source systems.
    Check the ALE outbox of the source system for IDocs that have not been

    Hi,
    As far as I understand, you have successful deltas loading to the ODS and you want to update the entire ODS data to the cube, followed by daily deltas.
    If this is the case:
    Make sure that active update rules exist between the ODS and the cube.
    First go to your ODS, right-click and select the option 'Update ODS data in data target'.
    Select init update. It will take you to the init InfoPackage, where you select init with data transfer (init with data transfer will bring all the records from the ODS to the cube).
    Once the init is completed, you can schedule deltas to load regularly from the ODS to the cube by following the same steps.
    Hope this helps
    Praveen
    Message was edited by:
            Praveen Vujjini

  • Error while loading from ODS to CUBE

    Hi guys,
    I am loading data from the source system to an ODS and from the ODS to a cube. The data came through successfully from the source system to the ODS, but while loading from the ODS to the cube it shows an error at data packet 13. In the cube I loaded the data using "Update ODS data to data target"; at that point it shows two options, FULL UPDATE and INIT UPDATE. I selected FULL UPDATE, and when I checked the Processing tab there was only one option enabled, i.e. "only data target". Then I loaded it. Now, after getting the error, where can I correct it? There is no PSA option in the monitor.
    Otherwise, how can I change the option in the Processing tab of the InfoPackage to include the PSA? I know that when we load data from one target to another, the only option available in the Processing tab is "only data target". How can I change that option, and how can I correct the error?
    Thanks
    Rajesh

    Hi,
    I solved my own question as follows:
    go to the monitor of the cube load and select the option "Read everything manually" -> it shows another screen for correcting the records -> correct all the records and load the data again.
    Thanks
    Rajesh
