Data loading delay

Hi Friends,
Could someone help me with one issue?
The issue is: every day I load data to one InfoCube, and whichever cube it is, each load takes 2 hours. But once it took 5 hours. What might be the reason? I am confused by this; can anybody clarify?
Regards,
Balaji Reddy K.

Reddy,
1. Is the time taken to load to the PSA, or to load from the PSA to the cube? If it is the load to the PSA, then usually the problem lies at the extractor.
2. If it is the load to the cube, check whether statistics are being maintained for the cube; they will give an accurate picture of where the data load spends most of its time.
Do an SQL trace during the data load. If you find a lot of master data lookups, make sure that the master data is loaded first; if there are a lot of lookups to table NRIV, check whether number range buffering is on so that dimension IDs get generated faster.
Check if the data load runs faster when you drop any existing indexes (one way to script this is sketched below).
Are you loading any aggregates after the data load? Check whether the aggregates are necessary, and whether they have been enabled for delta loads.
If you have active indexes and there is a huge data load then, depending on the index, the data load can get delayed.
If the cube is not compressed, the data load can sometimes be delayed as well.
Also, while the data load is running, check in SM50 and SM37 whether the jobs are active - this confirms that the data load is active on both sides.
Always update the statistics for the cube before the load and after the load; this helps in deciphering how long the data load takes. After activating the statistics, check table RSDDSTAT or the standard reports delivered as part of the BW technical content.
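For the index check, a minimal ABAP sketch along these lines drops the cube's secondary indexes before the load and rebuilds them afterwards. It uses the standard BW function modules RSDU_INFOCUBE_INDEXES_DROP and RSDU_INFOCUBE_INDEXES_REPAIR; the cube name ZSALES_C01 is just an example, and the exact parameter names should be verified in SE37 on your release:

REPORT z_cube_index_toggle.

* Example InfoCube name - replace with your own cube
CONSTANTS gc_cube TYPE rsinfocube VALUE 'ZSALES_C01'.

* Drop the secondary indexes before the data load ...
CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
  EXPORTING
    i_infocube = gc_cube.

* ... trigger the InfoPackage / data load here ...

* ... then rebuild the indexes once the load has finished.
CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
  EXPORTING
    i_infocube = gc_cube.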
Hope it helps..
Arun
Assign points if helpful

Similar Messages

  • Data load delay after DB upgrade in source system

We upgraded our source system DB to Oracle 11g over the weekend. In the first loads of our process chain, the master data is taking a huge amount of time, and I am getting the following messages:
    Data Package 000001: sent,not arrived
    Info IDoc 2 : sent , not arrived ; Data passed to port OK
    Info IDoc 1 : sent , not arrived ; Data passed to port OK
    Info IDoc 3 : sent , not arrived ; Data passed to port OK
    Info IDoc 4 : sent , not arrived ; Data passed to port OK
    Request IDoc : Application document posted
    processing (data packet) : No data
Can anyone share ideas on this?
Thanks and regards

    Hi,
Check the SAP Notes below:
Note 580869 (RFC calls can be processed with report RSARFCEX) and Note 530997.
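As a rough illustration of the first note's suggestion, RSARFCEX can be run (or scheduled) to re-process the outstanding tRFC calls. A minimal sketch, assuming a variant ZBW_RETRY has been saved for RSARFCEX with the relevant destination selected (the variant name is made up):

* Re-process outstanding tRFC calls (SAP Note 580869) using a saved variant
SUBMIT rsarfcex USING SELECTION-SET 'ZBW_RETRY' AND RETURN.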
    Regards,
    Durgesh.

  • IDOC RSRQST is getting stuck frequently, it leads to delay in data loading

Hi,
We are using SAP R/3 4.7E and SAP NetWeaver 2004s as the BI server.
On a daily basis we load data from R/3 to the BI system, and we are facing problems in the data loading:
the IDoc RSRQST is getting stuck frequently, which leads to delays in data loading.
Please guide me to resolve this issue.
Thanks in advance.

Thanks for your reply, kumarsen. I have gone through the referenced forum thread.
In my case, I have checked the tRFC and the connectivity between R/3 and BW, and the job is scheduled with priority C.
Still many IDocs get stuck in R/3 (approx. every 5 minutes), and we are releasing the IDocs through BD87 only.
Please suggest a permanent solution (one possible approach is sketched below).
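Instead of releasing the IDocs manually in BD87, one common option is to schedule the standard inbound processing report RBDAPP01 as a periodic background job, so that IDocs stuck in status 64 are posted automatically. A minimal ABAP sketch, assuming a variant ZRSRQST has been saved for RBDAPP01 that selects the relevant message type (the variant and job names here are made up):

DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_REPROCESS_IDOCS',
      lv_jobcount TYPE tbtcjob-jobcount.

* Open a background job ...
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

* ... attach RBDAPP01 with the saved variant ...
SUBMIT rbdapp01 VIA JOB lv_jobname NUMBER lv_jobcount
       USING SELECTION-SET 'ZRSRQST' AND RETURN.

* ... and release the job to run every 5 minutes.
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    prdmins   = 5
    strtimmed = 'X'.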

  • In Log-shipping what is load delay period on secondary server - Skipping log backup file since load delay period has not expired ....

During log shipping, the job on the secondary server runs successfully BUT gives this information:
"Skipping log backup file since load delay period has not expired ...."
What is this "load delay period"? Can we configure this somehow, somewhere?
NOTE: The value on the "Restore Transaction Log" tab, "Delay Restoring backups at least", is the default (zero minutes).
    Thanks
    Think BIG but Positive, may be GLOBAL better UNIVERSAL.

    How to get the LSBackup, LSCopy, and LSRestore jobs back in sync...
    Last I posted the issue was that my trn backups were not being copied from Primary to Secondary. 
    I found upon further inspection of the LS related tables  in MSDB the following likely candidates for adjustment:
1) dbo.log_shipping_monitor_secondary, column last_copied_file
Change the last_copied_file column to something older than the file that is stuck. For example, the value in the table was
E:\SQLLogShip\myDB_20140527150001.trn
I changed last_copied_file to E:\SQLLogShip\myDB_20140525235000.trn. Note that this is just a made-up file name that is a few minutes before the actual file that I would like to restore (myDB_2014525235428.trn). 4 minutes and 28 seconds before, to be exact.
    LSCOPY runs and voila! now it is copied from primary to secondary. That appears to be the only change needed to get the copy going again.
2) For LSRestore, see the MSDB table dbo.log_shipping_monitor_secondary and change
last_restored_file
Again I used the made-up file E:\SQLLogShip\myDB_20140525235000.trn.
LSRESTORE runs, and my just-copied myDB_2014525235428.trn is restored.
** Note that
dbo.log_shipping_secondary_databases also has a last_restored_file column - this did not seem to have any effect, though I see that it updates after completing the above once LSRestore has run successfully, so now it is correct as well.
3) The LSBackup job is still not running; the job still has a last run date in the future. I could just leave it and eventually it would come right, but I made a fairly significant time change, plus it's all an experiment....back to MSDB.
Look at dbo.sysjobs and get the job_id of your LSBackup job.
Edit dbo.sysjobschedules - change next_run_date / next_run_time as needed to a datetime before the current time, or to when you would like the job to start running.
I wouldn't be so cavalier with data that was important, but that's the benefit of being in Test, and it appears that these time comparisons are very rudimentary: a value in the relevant log shipping table versus the name of the trn file. That said, if you were facing a problem of this nature due to lost or corrupted trn files, or some similar scenario, this wouldn't fix your problem, though it _might_ allow you to continue. But in my case I know I have all the trn files; it's just the time that changed (in this case on my Primary server), and thus the names of the trn logs got out of sync.
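For reference, the table edits described above boil down to something like the following T-SQL (a sketch only: the file name is the made-up value from this example, and the WHERE clause assumes the secondary database is named myDB):

-- Point the copy job at an older (made-up) file so the stuck .trn gets copied again
UPDATE msdb.dbo.log_shipping_monitor_secondary
SET    last_copied_file = N'E:\SQLLogShip\myDB_20140525235000.trn'
WHERE  secondary_database = N'myDB';

-- Same idea for the restore side
UPDATE msdb.dbo.log_shipping_monitor_secondary
SET    last_restored_file = N'E:\SQLLogShip\myDB_20140525235000.trn'
WHERE  secondary_database = N'myDB';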

  • Data Load Optimization

    Hi,
I have a cube with the following dimension information, and it needs optimization for data load. Its data is cleared and reloaded every week from a SQL data source using a load rule. It loads 35 million records, and the load is so slow that the data load alone (excluding calculation) takes 10 hrs. Is that common? Is there any change to the structure I should make to speed up the load, like changing Measures to sparse or changing the position of the dimensions? Also, the block size is large, 52,920 B, which seems absurd. I have also listed the cache settings below, so please take a look and give me your suggestions.
MEASURE      Dense    Accounts    245 (No. of Members)
PERIOD       Dense    Time         27
CALC         Sparse   None          1
SCENARIO     Sparse   None          7
GEO_NM       Sparse   None         50
PRODUCT      Sparse   None       8416
CAMPAIGN     Sparse   None         35
SEGMENT      Sparse   None         32
    Cache settings :
    Index Cache setting : 1024
    Index Cache Current Value : 1024
    Data File Cache Setting : 32768
    Data file Cache Current Value : 0
    Data Cache Setting : 3072
    Data Cache Current Value : 3049
    I would appreciate any help on this. Thanks!

10 hrs is not acceptable even for that many rows. For this discussion, I'll assume a BSO cube.
There are a few things to consider.
First, what is the order of the columns in your load rule? Can you post the SQL? Is the SQL sorted as it comes in? Optimal for a load would be to have your sparse dimensions first, followed by the dense dimensions (preferably having one of the dense dimensions as columns instead of rows), for example your periods going across like Jan, Feb, Mar, etc.
Second, do you have parallel data loading turned on? Look in the config for DLTHREADSPREPARE and DLTHREADSWRITE. With multithreading you can get better throughput (see the config sketch below).
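For those parallel-load settings, a minimal essbase.cfg sketch follows. The application/database names and thread counts are illustrative only; DLSINGLETHREADPERSTAGE must be FALSE for the thread settings to take effect, and the Essbase server has to be restarted after changing the config:

; essbase.cfg - illustrative values only
DLSINGLETHREADPERSTAGE FALSE
DLTHREADSPREPARE Sample Basic 4
DLTHREADSWRITE   Sample Basic 4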
Third, how does the data get loaded? Is there any summation of the data before it is loaded, or do you have the load rule set to additive? Doing the summation in SQL would speed things up a lot, since each block would only get hit once.
I have also seen network issues cause this, as transferring this many rows would be slow (as Krishna said), and I have seen cases where the number of joins done in the SQL caused massive delays in preparing the data. Out of interest, how long does the actual query take if you just execute it from a SQL tool?

  • Data load problem - BW and Source System on the same AS

    Hi experts,
I'm starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
BW runs on client 001 while SRM is on client 100, and I want to load data from SRM into BW.
I've configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the DataSources from this source system, and everything worked fine.
Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I created an InfoPackage on one of the standard DataSources (which has data, checked through RSA3 on client 100, the source system). I started the data load process, but the monitor says that "no IDocs arrived from the source system" and keeps the status yellow forever.
    Additional information:
BW Monitor Status:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
BW Monitor Details:
0 from 0 records - but there are 2 records in RSA3 for this data source
Overall status: Missing messages or warnings
- Requests (messages): Everything OK
  - Data request arranged
  - Confirmed with: OK
- Extraction (messages): Missing messages
  - Missing message: Request received
  - Missing message: Number of sent records
  - Missing message: Selection completed
- Transfer (IDocs and TRFC): Missing messages or warnings
  - Request IDoc: sent, not arrived; Data passed to port OK
- Processing (data packet): No data
Transactional RFC (SM58):
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
System Log (SM21):
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
The transaction has been terminated. This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System, due to which it makes no sense to proceed with the transaction. The actual reason for the termination is indicated by the T100 message and the parameters.
Additional documentation for message IDOC_ADAPTER 601, "No service for system &1, client &2 in Integration Directory": no documentation exists for message ID 601.
RFC Destinations (SM59):
    Both RFC destinations look fine, with connection and authorization tests successful.
RFC Users (SU01):
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
Could someone help?
    Thanks,
    Guilherme

    Guilherme
I don't see any reason why it's not bringing the data in. Are you doing a full extraction or a delta? If a delta extraction, please check whether the extractor is delta-enabled; sometimes this causes problems.
Also check this weblog on data load errors and basic checks; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Thanks
    Sat

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
We created a generic DataSource on R/3 and replicated it to our BI system, then created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" also appears under this ODS in the InfoProvider view. In the Data Transfer Process, in the Extraction tab, we pick 'Full' in the Extraction Mode field; in the Execute tab there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) seems to conduct the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R3 table).
Then we tried to create an InfoPackage. In the Processing tab, we find the 'Only PSA' radio button is checked and all others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! In the Data Target tab, the ODS can't be selected as a target! Also there are some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under another column, 'DTP(S) are active and load to this target', there is an inactive picture icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in, but since the 'Only PSA' radio button is checked in the Processing tab with all others dimmed, no data goes to this ODS! Why, in BI 7.0, can the 'Only PSA' radio button be checked with all others dimmed?
Many new features in BI 7.0! Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!

You don't have to select anything.
Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, and DELTA loads only the last load of the PSA.
Go through these links for lucid explanations:
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
Pre-requisite:
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Open HUB ( SAP BW ) to SAP HANA through DB Connection data loading , Delete data from table option is not working Please help any one from this forum

Issue:
I have a SAP BW system and a SAP HANA system.
SAP BW connects to SAP HANA through a DB connection (named HANA).
Whenever I create an Open Hub destination as a DB table with the help of the DB connection, the table is created at the HANA schema level (L_F50800_D).
I executed the Open Hub service without checking the "Deleting Data from Table" option:
the data loaded with 16 records from BW to HANA, the same on both sides.
The second time I executed it again from BW to HANA, 32 records came (it appends).
Then I executed the Open Hub service with the "Deleting Data from Table" option checked:
now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
If I check from SAP BW system to SAP BW system, it works fine.
Does this option work through a DB connection or not?
Please see the attachment along with this discussion and help me resolve this.
From
Santhosh Kumar

Hi Ramanjaneyulu,
First of all, thanks for the reply.
The issue is at the Open Hub level (definition level - DESTINATION tab and FIELD DEFINITION):
there is a check box there which I have already selected, and that is exactly my issue - even though it is selected,
the deletion at the target level is not performed.
SAP BW to SAP HANA via DB connection:
1. First run from BW, say 16 records - DTP executed - loaded up to HANA - 16, the same.
2. Second run executed again from BW - now the HANA side appended, i.e. 16 + 16 = 32.
3. So I selected the check box at the Open Hub level, "Deleting data from table".
4. Now when the DTP is executed, it throws a short dump - DBIF_RSQL_TABLE_KNOWN.
Now please tell me how to resolve this. Is this "deleting data from table" option applicable for HANA at all?
Thanks
Santhosh Kumar

  • Data load times

    Hi,
I have a question regarding data loads. We have a process chain which includes 3 ODSs and a cube.
Basically, ODS A gets loaded from R/3, and then from ODS A it loads into 2 other ODSs (ODS B, ODS C) and CUBE A.
When I went to the monitor screen of this load (ODS A -> ODS B, ODS C, CUBE A), the total time shows as 24 minutes.
We have some other steps in the process chain where ODS B -> ODS C, ODS C -> CUBE 1.
When I go to the monitor screen of these data loads, the total time for the data loads shows as 40 minutes.
I am surprised, because the total run time for the chain itself is 40 minutes, and the chain includes the data extraction from R/3 and the ODS activations, indexes...
Can anybody throw some light on this?
Thank you all

    Hi All,
I am not asking which steps need to be included in which chain.
My question is: when I look at the process chain run time, it says the total time is 40 minutes, and when I go to RSMO to check the time taken for the data load from the ODS to the 3 other data targets, it also shows 40 minutes.
The process chain also includes ODS activation, building indexes, and extracting data from R/3.
So what are the times we see when clicking on a step in the process chain and displaying its messages, and what is the time we see in RSMO?
Let's take an example:
In process chain A there is a step LOAD DATA, from ODS A -> ODS B, ODS C, Cube A.
When I right-click and display the messages for the successful load, it shows all the messages, like:
Job started...
Job ended...
The total time here shows as 15 minutes.
When I go to RSMO for the same step, it shows 30 minutes...
I am confused...
Please help me!

  • Master data loading failed: error "Update mode R is not supported by the extraction API"

    Hello Experts,
I load master data for 0Customer_Attr through a daily process chain, and it was running successfully.
For the last 2 days, the master data load for 0Customer_Attr has failed with the following error message:
"Update mode R is not supported by the extraction API"
Can anyone tell me what that error means and how to resolve this issue?
    Regards,
    Nirav

    Hi
The update mode R error occurs in the following case:
You run a delta (for master data) which fails due to some error. To resolve that error, you set the load to red and try to repeat the load.
This time the load fails with update mode R,
as a repeat delta is not supported.
So now the only thing you can do is re-init the delta (as described in the posts above), and then you can proceed. The earlier problem has nothing to do with update mode R.
For example, your first delta failed with a replication issue:
replicating and repeating alone will not solve the update mode R error.
You will have to do both - replicate the DataSource and re-init - for the update mode R.
One more thing I would like to add:
if the delta that failed with the first error (not update mode R) had already picked records,
you have to do an init with data transfer;
if it failed without picking any records,
then do an init without data transfer.
    Hope this helps
    Regards
    Shilpa

  • CALL_FUNCTION_CONFLICT_TYPE Standard Data loading

    Hi,
I am facing a data loading problem using Business Content on the CPS_DATE InfoCube (0PS_DAT_MLS DataSource).
The R/3 extraction runs without any error, but the problem occurs in the update rules while updating the milestone date. Please find below the log from ST22.
The really weird thing is that the process works perfectly in the development environment and not in the integration one (the patch levels are exactly the same: BW 3.5 Patch #16).
I apologise for the long message below... this is a part of the system log.
For information, the routine_0004 is a standard one.
Thanks a lot in advance!
Cheers.
CALL_FUNCTION_CONFLICT_TYPE
Exception: CX_SY_DYN_CALL_ILLEGAL_TYPE
Symptom: Type conflict when calling a function module
Cause: Error in the ABAP application program.
The current ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" had to be terminated because one of the statements could not be executed.
This is probably due to an error in the ABAP program.
A function module was called incorrectly.
Error analysis:
An exception occurred, which is dealt with in more detail below. The exception, which is assigned to the class 'CX_SY_DYN_CALL_ILLEGAL_TYPE', was neither caught nor passed along using a RAISING clause, in the procedure "ROUTINE_0004" "(FORM)".
Since the caller of the procedure could not have expected this exception to occur, the running program was terminated.
The reason for the exception is:
The call to the function module "RS_BCT_TIMCONV_PS_CONV" is incorrect:
The function module interface allows you to specify only fields of a particular type under "E_FISCPER".
The field "RESULT" specified here is of a different field type.
How to correct the error:
You may be able to find an interim solution to the problem in the SAP note system. If you have access to the note system yourself, use the following search criteria:
"CALL_FUNCTION_CONFLICT_TYPE" CX_SY_DYN_CALL_ILLEGAL_TYPEC
"GP420EQ35FHFOCVEBCR6RWPVQBR" or "GP420EQ35FHFOCVEBCR6RWPVQBR"
"ROUTINE_0004"
If you cannot solve the problem yourself and you wish to send an error message to SAP, include the following documents:
1. A printout of the problem description (short dump). To obtain this, select "System->List->Save->Local File (unconverted)" in the current display.
2. A suitable printout of the system log. To obtain this, call the system log through transaction SM21. Limit the time interval to 10 minutes before and 5 minutes after the short dump. In the display, then select "System->List->Save->Local File (unconverted)".
3. If the programs are your own programs or modified SAP programs, supply the source code. To do this, select the Editor function "Further Utilities->Upload/Download->Download".
4. Details regarding the conditions under which the error occurred, or which actions and input led to the error.
The exception must either be prevented, caught within the procedure "ROUTINE_0004" "(FORM)", or declared in the procedure's RAISING clause.
To prevent the exception, note the following:
Environment:
SAP Release.............. "640"
Operating system......... "SunOS"
Release.................. "5.9"
Hardware type............ "sun4u"
Character length......... 8 bits
Pointer length........... 64 bits
Work process number...... 2
Short dump setting....... "full"
Database type............ "ORACLE"
Database name............ "BWI"
Database owner........... "SAPTB1"
Character set............ "fr"
SAP kernel............... "640"
Created on............... "Jan 15 2006 21:42:36"
Created in............... "SunOS 5.8 Generic_108528-16 sun4u"
Database version......... "OCI_920"
Patch level.............. "109"
Patch text............... " "
Supported environment:
Database................. "ORACLE 9.2.0.., ORACLE 10.1.0.., ORACLE 10.2.0.."
SAP database version..... "640"
Operating system......... "SunOS 5.8, SunOS 5.9, SunOS 5.10"
SAP Release.............. "640"
The termination occurred in the ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" in "ROUTINE_0004".
The main program was "RSMO1_RSM2".
The termination occurred in line 702 of the source code of the (Include) program "GP420EQ35FHFOCVEBCR6RWPVQBR" (when calling the editor 7020).
Processing was terminated because the exception "CX_SY_DYN_CALL_ILLEGAL_TYPE" occurred in the procedure "ROUTINE_0004" "(FORM)", but was neither handled locally nor declared in the RAISING clause of the procedure.
The procedure is in the program "GP420EQ35FHFOCVEBCR6RWPVQBR". Its source code starts in line 685 of the (Include) program "GP420EQ35FHFOCVEBCR6RWPVQBR".
    672      'ROUTINE_0003' g_s_is-recno 
    673      rs_c_false rs_c_false g_s_is-recno  
    674      changing c_abort.   
    675     catch cx_foev_error_in_function. 
    676     perform error_message using 'RSAU' 'E' '510'  
    677             'ROUTINE_0003' g_s_is-recno
    678             rs_c_false rs_c_false g_s_is-recno
    679             changing c_abort.
    680   endtry.              
    681 endform.
    682 ************************************************************************ 
    683 * routine no.: 0004
    684 ************************************************************************ 
    685 form routine_0004 
    686   changing 
    687   result  type g_s_hashed_cube-FISCPER3
    688   returncode     like sy-subrc 
    689     c_t_idocstate  type rsarr_t_idocstate
    690     c_subrc        like sy-subrc 
    691     c_abort        like sy-subrc. "#EC *  
    692   data:
    693     l_t_rsmondata like rsmonview occurs 0 with header line. "#EC * 
    694                    
    695  try.             
696 * init variables
    697   move-corresponding g_s_is to comm_structure.
    698                     
    699 * fill the internal table "MONITOR", to make monitor entries  
    700                     
    701 * result value of the routine
    >>>>    CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'  
    703          EXPORTING      
    704               I_TIMNM_FROM       = '0CALDAY'  
    705               I_TIMNM_TO         = '0FISCPER'  
    706               I_TIMVL            = COMM_STRUCTURE-CALDAY
    707               I_FISCVARNT        = gd_fiscvarnt
    708          IMPORTING 
    709               E_FISCPER          = RESULT.                               
    710 * if the returncode is not equal zero, the result will not be updated 
    711   RETURNCODE = 0. 
    712 * if abort is not equal zero, the update process will be canceled
    713   ABORT = 0.
    714              
    715   catch cx_sy_conversion_error   
    716         cx_sy_arithmetic_error.
    717     perform error_message using 'RSAU' 'E' '507'
    718             'ROUTINE_0004' g_s_is-recno
    719             rs_c_false rs_c_false g_s_is-recno
    720             changing c_abort.
    721   catch cx_foev_error_in_function.
System fields:
Name                Value
    SY-SUBRC           0                                         
    SY-INDEX           2                                         
    SY-TABIX           0                                         
    SY-DBCNT           0                                         
    SY-FDPOS           65                                        
    SY-LSIND           0                                         
    SY-PAGNO           0                                         
    SY-LINNO           1                                         
    SY-COLNO           1                                         
    SY-PFKEY           0400                                      
    SY-UCOMM           OK                                        
    SY-TITLE           Moniteur - Atelier d'administration       
    SY-MSGTY           E                                         
    SY-MSGID           RSAU                                      
    SY-MSGNO           583                                       
    SY-MSGV1           BATVC  0000000000                         
    SY-MSGV2           0PROJECT                                  
    SY-MSGV3                                           
    SY-MSGV4                                           
Selected variables:
No. 23, Type FORM, Name ROUTINE_0004
GD_FISCVARNT = (initial)
RS_C_INFO = I
COMM_STRUCTURE-CALDAY = 20060303
SYST-REPID = GP420EQ35FHFOCVEBCR6RWPVQBR
RESULT = 000
(hexadecimal representations omitted)

You have an update routine in which you are calling FM 'RS_BCT_TIMCONV_PS_CONV'. The parameter E_FISCPER must have the same type as the variable you pass to it (you can see the data type in the FM definition, transaction SE37). You should do something like the following:
DATA: var TYPE <the same type as E_FISCPER in the FM definition>.

CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
  EXPORTING
    I_TIMNM_FROM = '0CALDAY'
    I_TIMNM_TO   = '0FISCPER'
    I_TIMVL      = COMM_STRUCTURE-CALDAY
    I_FISCVARNT  = GD_FISCVARNT
  IMPORTING
    E_FISCPER    = var.

* Move the correctly typed value into the routine's RESULT field
RESULT = var.
Assign points if useful.

  • Data load stuck from DSO to Master data Infoobject

    Hello Experts,
We have an issue where a data load is stuck between a DSO and a master data InfoObject.
The data uploads from the DSO (standard) to the master data InfoObject.
This InfoObject has display and navigational attributes which are mapped from the DSO to the InfoObject.
Now we have added a new InfoObject as an attribute to the master data InfoObject and made it a navigational attribute.
Now when we do a full load via DTP, the load is stuck and is not processing.
Earlier it took only 5 minutes to complete the full load.
Please advise what could be the reason and cause behind this.
Regards,
santhosh.

Hello guys,
Thanks for the quick response.
But it is not proceeding any further.
The request is still running.
Earlier this same data was loaded in 5 minutes.
Please find the screenshot.
Master data for the infoobjects is loaded as well.
In SM50 I can see that the process is sitting at the P table of the InfoObject.
Please advise.
Please find the details:
    Updating attributes for InfoObject YCVGUID
    Start of Master Data Update
    Check Duplicate Key Values
    Check Data Values
Process Time-Dependent Attributes - green
No Message: Process Time-Dependent Attributes - yellow
No Message: Generates Navigation Data - yellow
No Message: Update Master Data Attributes - yellow
No Message: End of Master Data Update - yellow
and nothing is going further in SM37.
    Thanks,
    Santhosh.

  • How to create a report in bex based on last data loaded in cube?

I have to create a query with a predefined filter based upon the "latest SAP date", i.e. the user only wants to see the very latest situation from the last load. The report should only show the latest inventory stock situation from the last load. As I'm new to BEx, I am not able to find a way to achieve this. Is there any time characteristic which holds the last update date of a cube? Please help and suggest how to achieve this.
    Thanks in advance.

Hi Rajesh,
Thanks for your suggestion.
My requirement is a little different. I built the query on a MultiProvider, and I want to see the latest record in the report based upon the latest date (not the system date) on which data was last loaded to the cube. This date (when the cube was last loaded with data) is not populated from any data source. I guess I have to add the "0TCT_VC11" cube to my MultiProvider to fetch the date when my cube was last loaded with data. Please correct me if I'm wrong.
Thanks in advance.

  • Data load from DSO to cube fails

    Hi Gurus,
The data loads failed last night, and when I dig into the process chains I find that the cube which should be loaded from the DSO is not getting loaded.
The DSO has been loaded without errors from ECC.
The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
I looked in the PSA; it has about 50,000 records,
all data packages have a green light, and all amounts have 0CURRENCY assigned.
I went into the DTP and looked at the error stack: it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed; the error stack then showed 101 records.
The ZARAMT field has 0CURRENCY blank for all these records.
I tried to assign USD to them, the changes were saved, and I tried to execute again, but then the message says that the request ID should be repaired or deleted before the execution. I tried to repair it - it says it cannot be repaired - so I deleted it and executed. It fails, and the error stack still shows 101 records. When I look at the records, the changes I made do not exist anymore.
If I delete the request ID before making the changes and then try to save them, they don't get saved.
What should I do to resolve the issue?
thanks
Prasad

Hi Prasad....
The error stack is request-specific.....Once you delete the request from the target, the data in the error stack will also get deleted....
Actually....in this case what you are supposed to do is:
1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done....) and execute the DTP......all the erroneous records will get accumulated in the error stack...
2) Then correct the erroneous records in the error stack..
3) Then in the DTP, in the "Update" tab....you will get the option "Error DTP".......if it is not already created, you will get the option "Create Error DTP".............click there and execute the Error DTP..........The Error DTP will fetch the records from the error stack and will create a new request in the target....
4) Then manually change the status of the original request to green....
But did you check why the value of this field is blank? If these records come again as part of a delta or full load, then your load will fail again.....check in the source system and fix it...for a permanent solution..
Regards,
Debjani......

  • Data loading from DSO to Cube

    Hi,
    I have a question,
In the book TBW10 I read about the data load from a DSO to an InfoCube:
"We feed the change log data to the InfoCube, 10, -10, and 30 add to the correct 30 value"
My question is: the cube already has the value 10, so if we are sending 10, -10 and 30 (the delta), shouldn't the total be 40 instead of 30?
Please can someone explain this to me?
Thanks

No, it will not be 40.
It will be 30 only.
Since the cube already has 10, the before image will nullify it by sending -10, and then the correct value in the after image will be added as 30.
So it will be like this: 10 - 10 + 30 = 30.
Thank you.
Regards,
Vinod
