Data loaded but request is in red

Hi Friends,
I have an ODS and a Cube; the data gets loaded into the ODS first and then into the Cube from that ODS. Everything happens through a process chain. Yesterday the ODS data loaded and is available for reporting, but the request is red, and the further load into the cube failed. My doubt is: if the data is loaded and available for reporting but the request is red, will that cause any problem? I have tried to change the QM status but it did not allow the change. Please guide me on this. Max points will be awarded.
Thanks,
Prasad

Hi,
I think the request in your ODS is only partially activated, because the QM status is red yet the data shows as available for reporting. You cannot change the status manually while it is in that state.
If the load is a full upload, delete the request from the target and reload it.
If it is a delta load, change the QM status of the InfoPackage request to red in RSMO; at the target level, i.e. in the Manage screen, it will not allow you to change it. For a full upload it is also better to set the QM status to red first, then delete the request and reload it.
Also, is there an ODS activation step in your process chain? If not, check the settings of the ODS in RSA1: on the Settings tab there is a checkbox "Activate Data Automatically". If it is checked, the data gets activated automatically after it is loaded into the ODS. But when you are using a process chain this is not good practice; it is better to keep a separate ODS activation step in the chain.
Hope this helps.
Regards,
Debjani

Similar Messages

  • Data Load Issue "Request is in obsolete version of DataSource"

    Hello,
    I am getting a very strange data load issue in production. I am able to load the data up to the PSA, but when I run the DTP to load the data into 0EMPLOYEE (a master data object), I get the message below:
    Request REQU_1IGEUD6M8EZH8V65JTENZGQHD not extracted; request is in obsolete version of DataSource
    The request REQU_1IGEUD6M8EZH8V65JTENZGQHD was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then and the request cannot be extracted with the DTP anymore.
    I have taken the following actions:
    1. Replicated the DataSource
    2. Deleted all requests from the PSA
    3. Activated the DataSource using RSDS_DATASOURCE_ACTIVATE_ALL
    4. Re-transported the DataSource, transformation, and DTP
    I am still getting the same issue.
    If you have any idea please reply asap.
    Samit

    Hi
    Generate your datasource in R/3 then replicate and activate the transfer rules.
    Regards,
    Chandu.
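
    For completeness, the activation step Samit mentions can also be run directly in SE38. A minimal hedged sketch follows; RSDS_DATASOURCE_ACTIVATE_ALL is referenced above as a program, and you fill its selection screen with your source system and DataSource (nothing beyond the program name is taken from the thread):

    REPORT z_reactivate_datasources.

    * Hedged sketch: run the DataSource activation report referenced above.
    * Enter the source system / DataSource on its selection screen.
    SUBMIT rsds_datasource_activate_all
           VIA SELECTION-SCREEN
           AND RETURN.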

  • Master Data Loaded but not able to see.

    Loaded master data successfully. I am able to see it in the P and M tables, but not when going through Manage -> Contents at the InfoProvider level.
    Many requests are visible, showing the number of records transferred and updated.
    What could be the possible reason?

    Hi Sarabjit,
    Welcome to SDN!!
    You need to load the transaction data into the Info-provider to see the contents and not just the master data.
    In case you have done the transaction load, then you need to run the "Apply Hierarchy/Attribute Change" for the master data loaded: go to RSA1 -> Tools (menu) -> Apply Hierarchy/Attribute Change -> select your InfoObject and execute.
    Bye
    Dinesh
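
    If you prefer to trigger the change run from a job rather than the RSA1 menu, here is a minimal hedged sketch. RSDDS_AGGREGATES_MAINTAIN is, as far as I recall, the standard report behind RSA1 -> Tools -> Apply Hierarchy/Attribute Change; verify the name in SE38 before relying on it, and select your InfoObject on its selection screen:

    REPORT z_trigger_change_run.

    * Hedged sketch: start the attribute/hierarchy change run report.
    * RSDDS_AGGREGATES_MAINTAIN is the report name as I recall it - verify
    * in your release; choose the InfoObjects on its selection screen.
    SUBMIT rsdds_aggregates_maintain
           VIA SELECTION-SCREEN
           AND RETURN.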

  • Data load by Request ID!

    Hi Friends,
    I have a scenario. I want to load data from one InfoCube to a DSO. I generated an export DataSource for the InfoCube and created update rules between the InfoCube and the DSO. I have 10 requests in the InfoCube for which data was loaded. Now, I do NOT want to load the data belonging to all those 10 requests. Instead, I want to load the data for one particular request ID. How can I do this?
    Your help will be appreciated in terms of points.
    Thanks,
    Manmit

    Actually, thinking about it, there is an easier way.
    1) Create a query on the InfoCube, having all the required fields (the ones that need to be loaded into the ODS) in rows and columns.
    2) Have a variable on 0REQUID in the global filter section of the query
    3) Create a transactional ODS which should have all the fields present in the query.
    4) Create an APD which will enable you to save the output of the above query into the transactional ODS.
    5) Now create update rules between the transactional ODS and the ODS that needs to be loaded with data.
    6) Execute the APD, passing the request ID of interest to the query. The output will be saved into the transactional ODS, and this output is the data in the cube for the request ID selected during query execution.
    7) Load from the transactional ODS to your original ODS. You can safely do a full load, because the transactional ODS only has the data of interest to you.
         Good luck!
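
    As an alternative to the APD route, a start routine in the update rules can also drop everything except the request you want. The fragment below is only a sketch, not from the thread: it assumes the communication structure of the export DataSource (8<cube>) exposes the cube's request ID (0REQUID) as field REQUID, and the request value shown is a hypothetical placeholder; verify both in your own structure before using it.

    * Start routine fragment (BW 3.x update rules) - sketch only.
    * Assumption: the communication structure carries 0REQUID as field REQUID;
    * check the field name in your own structure first.
    CONSTANTS:
      c_wanted_req(30) TYPE c VALUE 'REQU_XXXXXXXXXXXXXXXXXXXXX'. "hypothetical ID

    * Keep only the rows belonging to the request of interest.
    DELETE data_package WHERE requid <> c_wanted_req.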

  • Trying to Pull data from ECC data loading but all garbag values

    Hi Experts,
    I am trying to pull the data from sap ECC. Standard extractor (2LIS_VA_ITM) & 2LIS_VA_HDR. I have imported 2LIS_11_VAITM in ODP objects.
    I have created the flow.
    I ran the job and it completed successfully, and it loaded the records, but the records are not in a readable format (garbage values, "???").
    This is Data Services 4.2 SP1. Can you please let me know how I can correct this?
    Please find the attached screenshot: after loading, "???" is shown for VBELN.
    Regards,
    Reddy.


  • Master data load failed

    Hi Experts,
    One of my master data full loads failed; see the error message below from the Status tab:
    Incorrect data records - error requests (total status RED)
    Diagnosis
    Data records were recognized as incorrect.
    System response
    The valid records were updated in the data target.
    The request was marked as incorrect so that the data that was already updated cannot be used in reporting.
    The incorrect records were not written in the data targets but were posted retroactively under a new request number in the PSA.
    Procedure
    Check the data in the error requests, correct the errors and post the error requests. Then set this request manually to green.
    Can anyone please give me a solution?
    Thanks
    David

    HI,
    I am loading the data from the application server; I don't have R/3 for this load. The message below is shown in the Status tab:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    Thanks
    David

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic datasource on R3 and replicated it to our BI system, created an InfoSource, the Transformation from the datasource to the InfoSource, an ODS, the transformation from the InfoSource to the ODS. 
    After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" also appears under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab we pick 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) sounds like it should run the data load, but we find there is no data available even though all statuses show green (we do have a couple of records in the R3 table).
    Then we tried to create an InfoPackage. On the Processing tab, the 'Only PSA' radio button is checked and all the others, like 'PSA and then into Data Targets (Package by Package)', are dimmed!  On the Data Target tab, the ODS cannot be selected as a target!  There are also some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under the column 'DTP(s) are active and load to this target' there is an inactive icon, which is strange since we have already activated the Data Transfer Process!  Anyway, we started the data load with the InfoPackage, and the monitor shows the records are brought in, but since 'Only PSA' is checked with everything else dimmed, no data reaches this ODS!  Why, in BI 7.0, can the 'Only PSA' radio button be checked with all the others dimmed?
    Many new features with BI 7.0!  Any one's idea/experience is greatly appreciate on how to load data in BI 7.0!

    You don't have to select anything.
    Once the data is loaded to the PSA, in the DTP you have the option of Full or Delta: Full loads all the data from the PSA, and Delta loads only the PSA requests that have not been transferred yet.
    Go through the links below for lucid explanations.
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    <b>Pre-requisite-</b>
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Data load failed at infocube level

    Dear Experts,
    I have data loads from the ECC source system for DataSource 2LIS_11_VAITM to 3 different data targets in the BI system. The data load is successful up to the PSA; when it comes to loading the data targets, the load succeeds for 2 of them and fails at one data target, i.e. the InfoCube. I got the following error message:
    Error 18 in the update
    Diagnosis
        The update delivered the error code 18 .
    Procedure
        You can find further information on this error in the error message of
        the update.
    Here I tried to activate the update rules once again by executing the program, and tried to reload using the reconstruction facility, but I get the same error message.
    Kindly, please help me to analyze the issue.
    Thanks&Regards,
    Mannu

    Hi,
    Here I tried to trigger a repeat delta under the impression that the error would not recur, but then I encountered the following issues:
    1. The data load status in RSMO is red, whereas in the data target the status shows green.
    2. When I try to analyze the PSA from RSMO, the PSA transaction gives me a dump with the following.
    The following analysis is from transaction ST22:
    Runtime Errors         GETWA_NOT_ASSIGNED
    Short text
         Field symbol has not yet been assigned.
    What happened?
         Error in the ABAP Application Program
         The current ABAP program "SAPLSLVC" had to be terminated because it has
         come across a statement that unfortunately cannot be executed.
    What can you do?
         Note down which actions and inputs caused the error.
         To process the problem further, contact you SAP system
         administrator.
         Using Transaction ST22 for ABAP Dump Analysis, you can look
         at and manage termination messages, and you can also
         keep them for a long time.
    Error analysis
        You attempted to access an unassigned field symbol
        (data segment 32821).
        This error may occur if
        - You address a typed field symbol before it has been set with
          ASSIGN
        - You address a field symbol that pointed to the line of an
          internal table that was deleted
        - You address a field symbol that was previously reset using
          UNASSIGN or that pointed to a local field that no
          longer exists
        - You address a global function interface, although the
          respective function module is not active - that is, is
          not in the list of active calls. The list of active calls
          can be taken from this short dump.
    How to correct the error
        If the error occures in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "GETWA_NOT_ASSIGNED" " "
        "SAPLSLVC" or "LSLVCF36"
        "FILL_DATA_TABLE"
    Here I have activated the include LSLVCF36, reactivated the transfer rules and update rules, and retriggered the data load, but I am still getting the same error.
    Could anyone please help me to resolve this issue?
    Thanks a lot,
    Mannu
    Thanks & Regards,
    Mannu
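
    For anyone unfamiliar with this dump: GETWA_NOT_ASSIGNED is raised when a field symbol is dereferenced before ASSIGN succeeds (or after UNASSIGN). The following is a tiny, self-contained illustration of the mechanism only; it is generic ABAP and has nothing to do with SAPLSLVC itself.

    REPORT z_getwa_not_assigned_demo.

    * Minimal illustration of the GETWA_NOT_ASSIGNED dump reported above.
    DATA: lv_value TYPE i VALUE 42.
    FIELD-SYMBOLS: <fs_value> TYPE i.

    * Correct pattern: assign first, then check before use.
    ASSIGN lv_value TO <fs_value>.
    IF <fs_value> IS ASSIGNED.
      WRITE: / 'Assigned value:', <fs_value>.
    ENDIF.

    UNASSIGN <fs_value>.

    * Uncommenting the next line raises GETWA_NOT_ASSIGNED at runtime -
    * the same class of error the dump in SAPLSLVC is reporting.
    * WRITE: / <fs_value>.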

  • Data loaded in the target are not reportable.

    Dear all,
    In one of our cubes, data is loaded daily. A couple of days back a load failed with an ABAP dump ("Raise exception"); the loads on the following days have brought in data, but it is not reportable.
    How do we handle this?
    Can anyone provide the reasons for the error and the steps for making the requests reportable?
    Experts' help will be appreciated.
    Note: further details of the error show "ADD_PARTITION_FAILED".
    Regards,
    M.M

    Hi,
    Can you have a look in ST22 for short dumps on the day this load failed? There you can get some more details of the error, which will help us understand it.
    Have a look at SAP Note 539757:
    Summary
    Symptom
    The process of loading transaction data fails because a new partition cannot be added to the F fact table. The loading process terminates with a short dump.
    Other terms
    RSDU_TABLE_ADD_PARTITION_ORA , RSDU_TABLE_ADD_PARTITION_FAILED , TABART_INCONSITENCY, TSORA , TAORA , CATALOG
    Reason and Prerequisites
    The possible causes are:
    SQL errors when creating the partition.
    Inconsistencies in the Data Dictionary control tables TAORA and TSORA. There must be valid entries for the data type of the table in these tables.
    Solution
    BW 3.0A & BW 3.0B
               If there are SQL errors: Analyze the SQL code in the system log or short dump and if possible, eliminate the cause. The cause is often a disk space problem or lock situations on the database catalog or, less frequently: The partitioning option in the ORACLE database is not installed.
               In most cases, there are inconsistencies in the Data Dictionary control tables TAORA and TSORA. As of Support Package 14 for BW 3.0B/Support Package 8 for BW 3.1C, the TABART_INCONSITENCY exception is issued in this case.
               1. ) Check whether there is an entry for the corresponding data type of the table in the TAORA table.
               The TAORA table contains the assignment of data classes to data tablespaces and their attributes, for example:
               Data class   Tablespace
               DDIM         PSAPDIMD
               DFACT        PSAPFACTD
               DODS         PSAPODSD
               2. ) Check whether the TSORA contains an entry for the tablespace from the TAORA table.
               The TSORA table contains the assignment of the data tablespace to the index tablespace:
               TABSPACE     INDSPACE
               PSAPDIMD     PSAPDIMD
               PSAPFACTD    PSAPFACTD
               PSAPODSD     PSAPODSD
               For more information, see Notes 502989 and 46272.
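
    If you want to eyeball those two control tables without going through SE16, here is a small hedged ABAP sketch. The table names TAORA and TSORA come from the note above; the field names (TABSPACE in both tables) are my assumption from the note's column headings, so verify them in SE11 before relying on the check.

    REPORT z_check_taora_tsora.

    * Hedged consistency check of the DDIC control tables from Note 539757.
    * TAORA maps data classes to data tablespaces, TSORA maps data
    * tablespaces to index tablespaces. Field names are assumptions - verify.
    DATA: lt_taora TYPE STANDARD TABLE OF taora,
          lt_tsora TYPE STANDARD TABLE OF tsora,
          ls_taora TYPE taora,
          ls_tsora TYPE tsora,
          lv_taora TYPE i,
          lv_tsora TYPE i.

    SELECT * FROM taora INTO TABLE lt_taora.
    SELECT * FROM tsora INTO TABLE lt_tsora.

    DESCRIBE TABLE lt_taora LINES lv_taora.
    DESCRIBE TABLE lt_tsora LINES lv_tsora.
    WRITE: / 'TAORA entries:', lv_taora,
           / 'TSORA entries:', lv_tsora.

    * Every data tablespace referenced in TAORA should have a TSORA entry.
    LOOP AT lt_taora INTO ls_taora.
      READ TABLE lt_tsora INTO ls_tsora
           WITH KEY tabspace = ls_taora-tabspace.
      IF sy-subrc <> 0.
        WRITE: / 'No TSORA entry for tablespace', ls_taora-tabspace.
      ENDIF.
    ENDLOOP.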

  • Error while data loading in real time cube

    HI experts,
    I have a problem. I am loading data from a flat file. The data loads correctly up to the DSO, but when I try to load it into the cube I get an error.
    The cube is a real-time cube for planning. I have changed its status to allow data loading, but the DTP still gives an error.
    It shows the error "error while extracting from DataStore", together with an RSBK 224 error and an RSAR 051 error.

    What was the resolution to this issue? We are having the same issue, only with an external system (not a flat file). We get the RSAR 051 error with a return code of 238, as if it is not even getting to the RFC connection (DI_SOURCE). We have been facing this issue for a while and even opened a message with SAP.

  • Update the on-hand quantities through Data Loader

    Dear All,
    I need to update the on-hand quantities of a number of items (5000 items).
    I hope I can do this with Data Loader, but which screen do I need to go to to update the stock? Is that the right way, or is there another way I can accomplish this stock update?
    Please advise.
    Many thanks in advance.

    Hi Friend
    Whenever we need to upload inventory, we use the Create Immediate form in OA, which consists of Reason Code / Item / Lot / WH / Qty. For this you need to create a Data Loader script which looks like the one below:
    Reason code Tab Item Tab Lot tab SB WH tab Qty save SB *NR
    POST tab raw1 tab L101 tab sb W1 tab 150 save sb NR
    ADD tab FG123 tab F101 tab *sb W2

  • Blank Row in table during Master Data Load

    I am having some success with my master data loads, but when I maintain the master data I have noticed that every table has had a blank row inserted.
    Does anybody know why and what I should do with the row (i.e. delete it)?

    This blank row is created by default and you cannot delete it. Even if you delete it, a new row with blank (initial) values will be appended. It is required for technical reasons when the table is read within ABAP programs.
    This applies to SAP-generated tables and may not be required for custom-developed ones unless you want to use them in screen programs.
    Regards,
    Raj
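
    To illustrate why the blank (initial) record is usually harmless when you read master data yourself, here is a small hedged sketch that reads a P table and simply skips the initial key. /BI0/PMATERIAL and its fields MATERIAL / OBJVERS are used only as a stand-in; substitute the P table of your own InfoObject.

    REPORT z_read_p_table_demo.

    * Sketch: read master data attributes and skip the generated blank record.
    * /BI0/PMATERIAL is an example - use your own InfoObject's P table.
    DATA: lt_mat TYPE STANDARD TABLE OF /bi0/pmaterial,
          ls_mat TYPE /bi0/pmaterial.

    SELECT * FROM /bi0/pmaterial INTO TABLE lt_mat
      WHERE material <> space          "skip the initial (blank) record
        AND objvers  = 'A'.            "active version only

    LOOP AT lt_mat INTO ls_mat.
      WRITE: / ls_mat-material.
    ENDLOOP.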

  • Poor Data Load Performance Issue - BP Default Addr (0BP_DEF_ADDR_ATTR)

    Hello Experts:
    We are having a significant performance issue with the Business Partner Default Address extractor (0BP_DEF_ADDRESS_ATTR).  Our extract is exceeding 20 hours for about 2 million BP records.  This was loading the data from R/3 to BI -- Full Load to PSA only. 
    We are currently on BI 3.5 with a PI_BASIS level of SAPKIPYJ7E on the R/3 system. 
    We have applied the following notes from later support packs in the hope of resolving the problem, and we have also doubled our data packet MAXSIZE. Both changes had a positive effect on the data load, but not enough to bring the extract down to an acceptable time.
    These are the notes we have applied:
    From Support Pack SAPKIPYJ7F
    Note 1107061     0BP_DEF_ADDRESS_ATTR delivers incorrect Address validities
    Note 1121137     0BP_DEF_ADDRESS_ATTR Returns less records - Extraction RSA3
    From Support Pack SAPKIPYJ7H
    Note 1129755     0BP_DEF_ADDRESS_ATTR Performance Problems
    Note 1156467     BUPTDTRANSMIT not Updating Delta queue for Address Changes
    And the correction noted in:
    SAP Note 1146037 - 0BP_DEF_ADDRESS_ATTR Performance Problems
    We have also executed reorgs on the ADRC and BUT0* tables and verified that the appropriate indexes are in place. However, the data load is still taking many hours. My expectation was that the 2M BP address records would load in an hour or less, which seems reasonable to me.
    If anyone has additional ideas, I would much appreciate it. 
    Thanks.
    Brian
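
    One more thing worth double-checking on the source side is the control table behind the packet size you doubled. A hedged sketch to display the current settings follows; ROIDOCPRMS is the standard control-parameter table maintained via SBIW, but the field names SLOGSYS / MAXSIZE / MAXLINES below are from memory and should be verified in SE11.

    REPORT z_show_transfer_params.

    * Hedged sketch: show the data-transfer control parameters (SBIW ->
    * General Settings -> Maintain Control Parameters for Data Transfer).
    * Field names are assumptions - verify in SE11 before use.
    DATA: lt_prms TYPE STANDARD TABLE OF roidocprms,
          ls_prms TYPE roidocprms.

    SELECT * FROM roidocprms INTO TABLE lt_prms.

    LOOP AT lt_prms INTO ls_prms.
      WRITE: / 'Target BW system:', ls_prms-slogsys,
               'MAXSIZE (kB):',     ls_prms-maxsize,
               'MAXLINES:',         ls_prms-maxlines.
    ENDLOOP.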


  • All records loaded to cube successfully but the request is in red status

    All the records are loaded from R/3 to the target, but the status of the request is still RED. What is the reason, and how do I resolve it?

    Hi,
    If you find the records are loaded into the cube but the request is still red, follow any one of the following options:
    1. Set the request to green manually.
    2. Check the extraction monitor settings in SPRO: SAP Customizing Implementation Guide -> Business Intelligence -> Automated Processes -> Extraction Monitor Settings. Check how the settings are done, if you are authorized.
    3. Go to Manage of the target -> Environment -> Automatic Request Processing -> check the option "Set Quality Status to OK".
    Hope this helps,
    Regards,
    Rama Murthy.
    Edited by: Rama Murthy Pvss on May 11, 2010 7:39 AM

  • Master data loads Request not updated to any data target using delta

    When checking the PSA, the "Request updated" icon has a red triangle and the mouse-over says
    "Request not updated to any data target using delta".
    I am doing full loads for text data and both full and delta loads for attributes. This is a BI 7.0 system but the loads are still 3.x; DTPs have not been implemented yet. The system was just upgraded in July. I am unable to schedule deletes from the PSA for successful loads. However, I think the data is updating to the InfoObjects. My InfoPackage has the selection "PSA and then into the InfoObject, package by package".
    How do I schedule the deletes from the PSA, and why does "Request updated" show red while the monitor for the InfoPackage shows green?
    Edited by: Joe Mallorey on Jan 27, 2009 5:46 PM

    Hi Shikha,
    The load has not failed, but I am unable to delete the load from the PSA.
    If you do a Manage on the DataSource, or go to the PSA from RSA1, the first column shows a red triangle instead of the green check mark (or green gear icon); the mouse-over says "request not updated to any data target using delta". The data has loaded to the InfoObject. I am trying to schedule deletes from the PSA using the option "delete only successfully booked/updated requests". So how do I get the "Request updated" column to show a green check mark so my deletes will process? This is for master data only; my transaction loads run fine and delete properly according to my settings.
    Thanks for the reply.
    Regards,
    JoeM
