EC-CS Data Loading Strategy

Hello. We are currently in the process of implementing the Enterprise Consolidation Transaction Data InfoCube (0ECCS_C01).
We are attempting to develop a data loading strategy for this InfoCube. Since it does not have a delta process, reloading the entire cube on a daily basis is not feasible because of the time it takes. We would like to set it up so that each load covers the current month for actuals plus future periods for forecast dollars.
Has anyone established a data loading process for their consolidated accounting InfoCube that works well and keeps data loading time to a minimum?
Best regards,
Lynn

Hi,
You could prepare two InfoPackages:
one with which you upload all data from previous years/months;
a second with an OLAP variable (for example 0DAT) that uploads only data for the present day/month/year, depending on which variable you select (add this package to the chain).
If the second package crashes, you have to repeat the procedure.
Regards,
Dominik
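A variation on Dominik's second package: instead of the 0DAT OLAP variable, you can put an ABAP routine (selection type 6) on the time characteristic in the InfoPackage's data selection. Below is a minimal sketch, assuming the cube is selected by 0CALMONTH and a roughly one-year forecast horizon; only the body of the system-generated routine frame is shown, and the field name and horizon are assumptions to adapt:

  data: l_idx  like sy-tabix,
        l_high type d.

  " locate the generated selection row for the time characteristic
  read table l_t_range with key fieldname = 'CALMONTH'.
  l_idx = sy-tabix.

  " select from the current month up to roughly one year ahead
  l_high           = sy-datum + 365.
  l_t_range-sign   = 'I'.
  l_t_range-option = 'BT'.
  l_t_range-low    = sy-datum(6).    "current month, YYYYMM
  l_t_range-high   = l_high(6).
  modify l_t_range index l_idx.

  p_subrc = 0.

This keeps the daily request small, so if it fails, only the current-month slice has to be deleted and repeated.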

Similar Messages

  • Master data Load - Strategy

    Hi All,
    I would like to know the best master data load strategy.
    With the 3.x master data load strategy I can load directly into the InfoObject without going through an additional DTP process. Is there any specific advantage to the 7.x approach, besides it being the newer technology?
    I have been using NetWeaver 2004s since 2005, but in most implementations we used the 3.x methodology for master data loads and DTPs for transaction loads. I would like to know whether SAP recommends the new DTP methodology for master data loads as well. If I load my master data the 3.x way I avoid one extra step, but will that option be discontinued in the future, so that I have to use a DTP even for this?
    Please advise, if you know, on the best way forward strategically and technically.
    Thanks,
    Alex.

    Alex,
    Please read my answer...
    The new data flow designed by SAP uses the DTP, even for master data. Right now you can still use the 3.x style, which is maintained for backward compatibility, but down the road it will be dropped. So, looking ahead, the right way to go technically and strategically is to use the DTP.
    You can go to http://help.sap.com/saphelp_nw70/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm and check under Data Warehousing, Data Distribution, Data Transfer Process.
    You could also open an OSS note to SAP and ask them directly.
    Thanks,
    Luis

  • Suggest good strategy for data load through standard datasources

    Hi BW Gurus,
    We currently use standard purchasing-related DataSources and foresee new reports coming later based on the standard DataSources.
    Can you please suggest a good general strategy for bringing in the R/3 data? Our concerns are around the data loads (initializations etc.), as some of the standard DataSources are already in production.
    Please advise.

    Hi
    Go through these weblogs from Roberto Negro; they may help you:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    Regards,
    Rajesh.

  • Material master data maintenance strategy

    Hi,
    We are working on a global template and need to suggest a material master data maintenance strategy.
    Please suggest the various options for this.
    Regards,
    Avnish

    Hello,
    In the same situation now:
    1.  Suggestion is to load the Basic Data centrally, say from HQ
    2.  Other data can be loaded via the subsidiary in the different countries.
    3.  If required you can use a block or a material status and have HQ control access to the other parts of the material master.
    For your information, we came across a big issue regarding this, and only now have we begun using the above method, as my client's HQ controls the release of materials into the market.
    Regards
    Waza

  • How to use incremental data loads in OWB? Can CDC be used?

    Hi,
    I am using Oracle 10g Release 2 and OWB 10g Release 1.
    I want to know how I can implement incremental data loads in OWB.
    Does OWB have such a feature built in, the way Informatica does?
    Can I use the CDC concept for this? Is it viable and compatible with my environment?
    What could be other possible ways?

    Hi,
    The current version of OWB does not provide functionality to use the CDC feature directly. You have to come up with your own strategy for incremental loading: for example, use update dates if they are available on your source systems, or use CDC packages to pick up the changed data from the source systems.
    rgds
    mahesh

  • Need suggestions for improving data load performance via SQL*Loader

    Hi,
    Our requirement is to load 512 files (1 GB each) into an Oracle database.
    We are using SQL*Loader to load the files into the DB (a partitioned table) and have tried almost all the options that come with SQL*Loader (direct path load, parallel=true, multithreading=true, unrecoverable).
    As the table grows bigger, each file's load time increases: it started at 5 minutes per file and has now reached 2 hours per 3 files, and it keeps increasing with every batch. (Note: we load 3 files concurrently into the target table using the parallel=true option of SQL*Loader.)
    Question 1:
    Somehow multithreading is not working for us (we have a multi-CPU server and have enabled multithreading=true). Could a DB setting be preventing the data load from running in multiple threads?
    Question 2:
    Would gathering stats on the target table and its partitions help improve load performance? I'm not sure whether stats improve DML; they would definitely improve SQL queries. Any thoughts?
    Question 3:
    What would be the best strategy for gathering stats on this table (which will end up holding 512 GB of data)?
    Question 4:
    Do you think insertions into a partitioned table (of growing size) perform worse than into a non-partitioned table?
    Any other suggestions for improving performance are most welcome!
    Thanks,
    Sachin
    Edited by: Sachin Tiwari on Mar 13, 2013 6:29 AM

    2 hours to load just 3 GB of data seems unreasonable regardless of the SQL Loader settings. It seems likely to me that the problem is not with SQL Loader but somewhere else.
    Have you generated a Statspack/ AWR/ ASH report to see where all that time is being spent? Are there triggers on the table? Are there bitmap indexes?
    Is your table partitioned in a way that is designed to improve the efficiency of loads, so that all the data from one file goes into one partition? Or is the data from each file getting inserted into many different partitions?
    Justin

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I’m starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is the SRM (Supplier Relationship Management) 5.0.
    BW is working on client 001 while SRM is on client 100 and I want to load data from the SRM into BW.
    I’ve configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added a SAP source system (named SRMCLNT100), installed SRM Business Content, replicated the data sources from this source system and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I created an InfoPackage on one standard DataSource (which has data; I checked through RSA3 on client 100, the source system). I started the data load process, but the monitor says "No IDocs arrived from the source system" and keeps the status yellow forever.
    Additional information:
    BW Monitor Status:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
    BW Monitor Details:
    0 from 0 records (but there are 2 records in RSA3 for this DataSource)
    Overall status: Missing messages or warnings
    - Requests (messages): Everything OK
      - Data request arranged
      - Confirmed with: OK
    - Extraction (messages): Missing messages
      - Missing message: Request received
      - Missing message: Number of sent records
      - Missing message: Selection completed
    - Transfer (IDocs and TRFC): Missing messages or warnings
      - Request IDoc: sent, not arrived; data passed to port OK
    - Processing (data packet): No data
    Transactional RFC (SM58):
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    System Log (SM21):
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction.  The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER        601 No service for system &1, client &2 in Integration Directory No documentation exists for message ID601
    RFC Destinations (SM59):
    Both RFC destinations look fine, with connection and authorization tests successful.
    RFC Users (SU01):
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Could someone help?
    Thanks,
    Guilherme

    Guilherme
    I don't see any reason why it wouldn't bring the data. Are you doing a full extraction or a delta? If delta, please check whether the extractor is delta-enabled; sometimes this causes problems (see the sketch below for a quick check).
    Also check this weblog on basic checks for data load errors; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Thanks
    Sat
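    On Sat's point about checking whether the extractor is delta-enabled: you can see this in transaction RSA2 in the source system, or read the DataSource header table directly. A minimal sketch, assuming table ROOSOURCE with fields OLTPSOURCE, OBJVERS and DELTA (verify the names on your release); 3EC_CS_1 is only an example DataSource:

    report z_check_delta.

    parameters: p_ds type roosource-oltpsource default '3EC_CS_1'.

    data: l_delta type roosource-delta.

    " an empty DELTA field means the extractor supports full/init loads only
    select single delta from roosource into l_delta
      where oltpsource = p_ds
        and objvers    = 'A'.

    if sy-subrc <> 0.
      write: / 'DataSource not found:', p_ds.
    elseif l_delta is initial.
      write: / p_ds, 'is not delta-enabled.'.
    else.
      write: / p_ds, 'supports delta process', l_delta.
    endif.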

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic DataSource on R/3 and replicated it to our BI system, then created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
    After the transformation between the InfoSource and the ODS is created on the BI system, a new folder called "Data Transfer Process" also appears under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab, we picked 'Full' in the Extraction Mode field; on the Execute tab we clicked the 'Execute' button (note: we had not created an InfoPackage yet), which seems to conduct the data load, but we find no data available even though all statuses show green (we do have a couple of records in the R/3 table).
    Then we tried to create an InfoPackage. On the Processing tab, the 'Only PSA' radio button is checked, and all others like 'PSA and then into Data Targets (Package by Package)' are dimmed! On the Data Target tab, the ODS cannot be selected as a target! There are also some new columns on this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under the column 'DTP(s) are active and load to this target' there is an inactive icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load with the InfoPackage, and the monitor shows the records being brought in, but since 'Only PSA' is checked, no data goes to the ODS! Why, in BI 7.0, is the 'Only PSA' radio button checked with all the others dimmed?
    Many new features in BI 7.0! Anyone's ideas/experience on how to load data in BI 7.0 are greatly appreciated!

    You don't have to select anything.
    In BI 7.0 the InfoPackage only loads up to the PSA; the DTP then moves the data from the PSA to the target. In the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, while delta loads only the latest requests in the PSA that have not yet been transferred.
    Go through these links for lucid explanations:
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    Prerequisite:
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Open Hub (SAP BW) to SAP HANA through DB Connection data loading: "Delete Data from Table" option is not working

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB connection (named HANA).
    Whenever I create an Open Hub destination of type DB Table with this DB connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the 'Delete Data from Table' option, and 16 records were loaded from BW to HANA.
    Executing it a second time gave 32 records (it appends).
    Then I executed the Open Hub service with the 'Delete Data from Table' option checked, and now I get the short dump DBIF_RSQL_TABLE_KNOWN.
    From SAP BW to SAP BW it works fine.
    Is this option supported through a DB connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the Open Hub level (definition level: Destination tab and Field Definition). There is a check box there which I have already selected; that is exactly my issue: even though it is selected, the deletion is not performed at the target.
    SAP BW to SAP HANA via DB connection:
    1. First execution from BW: 16 records, DTP executed, 16 records loaded to HANA, the same.
    2. Second execution from BW: the HANA side appends, so 16 + 16 = 32.
    3. So I selected the 'Delete Data from Table' check box at the Open Hub level.
    4. Executing the DTP now throws the short dump DBIF_RSQL_TABLE_KNOWN.
    Please tell me how to resolve this. Is the 'Delete Data from Table' option applicable to HANA at all?
    Thanks
    Santhosh Kumar

  • Data load times

    Hi,
    I have a question regarding data loads. We have a process chain which includes 3 ODSs and a cube.
    Basically ODS A gets loaded from R/3, and from ODS A the data then loads into 2 other ODSs (ODS B, ODS C) and Cube A.
    When I went to the monitor screen of this load (ODS A -> ODS B, ODS C, Cube A), the total time shows as 24 minutes.
    We have some other steps in the process chain: ODS B -> ODS C, ODS C -> Cube 1.
    When I go to the monitor screen of those data loads, the total time shows as 40 minutes.
    I am surprised, because the total run time for the chain itself is 40 minutes, and the chain additionally includes data extraction from R/3, ODS activations, indexes, and so on.
    Can anybody throw some light on this?
    Thank you all
    Edited by: amrutha pal on Sep 30, 2008 4:23 PM

    Hi All,
    I am not asking which steps need to be included in which chain.
    My question is: when I look at the process chain run time, it says the total time is 40 minutes, and when I go to RSMO to check the time taken for the data load from the ODS to the 3 other data targets, it also shows 40 minutes.
    But the process chain also includes ODS activation, building indexes, and extracting data from R/3.
    So what are the times we see when clicking on a step in the process chain and displaying its messages, and what is the time we see in RSMO?
    Let's take an example:
    In process chain A there is a step LOAD DATA from ODS A to ODS B, ODS C, and Cube A.
    When I right-click and display the messages for the successful load, it shows messages like:
    Job started...
    Job ended...
    The total time there shows as 15 minutes.
    When I go to RSMO for the same step, it shows 30 minutes.
    I am confused. Please help me.

  • Master data loading failed: error "Update mode R is not supported by the extraction API"

    Hello Experts,
    I load master data for 0Customer_Attr through a daily process chain, and it was running successfully.
    For the last 2 days the master data load for 0Customer_Attr has failed with the following error message:
    "Update mode R is not supported by the extraction API"
    Can anyone tell me what this error means and how to resolve the issue?
    Regards,
    Nirav

    Hi
    An update mode R error occurs in the following case:
    you run a delta (for master data) which fails due to some error; to resolve it, you set the load to red and try to repeat it.
    The repeat then fails with update mode R,
    because repeat delta is not supported.
    So the only thing you can do now is re-init the delta (as described in the posts above) and then proceed. The original problem has nothing to do with update mode R.
    For example, suppose your first delta failed with a replication issue:
    replicating and repeating alone will not solve the update mode R error;
    you will have to do both, replicate the DataSource and re-init, to get past update mode R.
    One more thing I would like to add:
    if the delta that failed the first time (not with update mode R) had already picked up records,
    you have to do the init with data transfer;
    if it failed without picking up any records,
    then an init without data transfer is enough.
    Hope this helps
    Regards
    Shilpa
    Edited by: Shilpa Vinayak on Oct 14, 2008 12:48 PM

  • CALL_FUNCTION_CONFLICT_TYPE during standard data loading

    Hi,
    I am facing a data loading problem using Business Content on the CPS_DATE InfoCube (0PS_DAT_MLS DataSource).
    The R/3 extraction runs without any error, but the problem occurs in the update rules while updating the milestone date. Please find below the log from ST22.
    The really weird thing is that the process works perfectly in the development environment but not in the integration one (the patch levels are exactly the same: BW 3.5 patch #16).
    I apologise for the long message below... this is part of the short dump.
    For information, routine_0004 is a standard routine.
    Thanks a lot in advance!
    Cheers.
    Runtime error: CALL_FUNCTION_CONFLICT_TYPE
    Exception: CX_SY_DYN_CALL_ILLEGAL_TYPE
    Symptom: Type conflict when calling a function module.
    Cause: Error in the ABAP application program. The current ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" had to be terminated because one of the statements could not be executed. This is probably due to an error in the ABAP program: a function module was called incorrectly.
    Error analysis: An exception occurred, which is dealt with in more detail below. The exception, assigned to the class 'CX_SY_DYN_CALL_ILLEGAL_TYPE', was neither caught nor passed along using a RAISING clause in the procedure "ROUTINE_0004" "(FORM)". Since the caller of the procedure could not have expected this exception to occur, the running program was terminated. The reason for the exception: the call to the function module "RS_BCT_TIMCONV_PS_CONV" is incorrect. The function module interface allows you to specify only fields of a particular type under "E_FISCPER"; the field "RESULT" specified here has a different type.
    How to correct the error:
    You may be able to find an interim solution in the SAP Note system. If you have access to the Note system yourself, use the following search criteria:
    "CALL_FUNCTION_CONFLICT_TYPE" "CX_SY_DYN_CALL_ILLEGAL_TYPE"
    "GP420EQ35FHFOCVEBCR6RWPVQBR"
    "ROUTINE_0004"
    If you cannot solve the problem yourself and wish to send an error message to SAP, include the following documents:
    1. A printout of the problem description (short dump). To obtain this, select "System -> List -> Save -> Local File (unconverted)" in the current display.
    2. A suitable printout of the system log. To obtain this, call the system log via transaction SM21, limit the time interval to 10 minutes before and 5 minutes after the short dump, and select "System -> List -> Save -> Local File (unconverted)" in the display.
    3. If the programs are your own programs or modified SAP programs, supply the source code. To do this, select the editor function "Further Utilities -> Upload/Download -> Download".
    4. Details regarding the conditions under which the error occurred, or which actions and input led to the error.
    The exception must either be prevented, caught within the procedure "ROUTINE_0004" "(FORM)", or declared in the procedure's RAISING clause.
    Environment:
    SAP Release.............. "640"
    Operating system......... "SunOS", Release "5.9"
    Hardware type............ "sun4u"
    Character length......... 8 bits
    Pointer length........... 64 bits
    Work process number...... 2
    Short dump setting....... "full"
    Database type............ "ORACLE"
    Database name............ "BWI"
    Database owner........... "SAPTB1"
    Character set............ "fr"
    SAP kernel............... "640"
    Created on............... "Jan 15 2006 21:42:36"
    Created in............... "SunOS 5.8 Generic_108528-16 sun4u"
    Database version......... "OCI_920"
    Patch level.............. "109"
    Supported environment:
    Database................. "ORACLE 9.2.0.., ORACLE 10.1.0.., ORACLE 10.2.0.."
    SAP database version..... "640"
    Operating system......... "SunOS 5.8, SunOS 5.9, SunOS 5.10"
    Termination point:
    The termination occurred in the ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" in "ROUTINE_0004". The main program was "RSMO1_RSM2".
    The termination occurred in line 702 of the source code of the (include) program "GP420EQ35FHFOCVEBCR6RWPVQBR".
    Processing was terminated because the exception "CX_SY_DYN_CALL_ILLEGAL_TYPE" occurred in the procedure "ROUTINE_0004" "(FORM)" but was neither handled locally nor declared in the RAISING clause of the procedure.
    The procedure is in the program "GP420EQ35FHFOCVEBCR6RWPVQBR"; its source code starts in line 685 of the (include) program.
    672      'ROUTINE_0003' g_s_is-recno 
    673      rs_c_false rs_c_false g_s_is-recno  
    674      changing c_abort.   
    675     catch cx_foev_error_in_function. 
    676     perform error_message using 'RSAU' 'E' '510'  
    677             'ROUTINE_0003' g_s_is-recno
    678             rs_c_false rs_c_false g_s_is-recno
    679             changing c_abort.
    680   endtry.              
    681 endform.
    682 ************************************************************************ 
    683 * routine no.: 0004
    684 ************************************************************************ 
    685 form routine_0004 
    686   changing 
    687   result  type g_s_hashed_cube-FISCPER3
    688   returncode     like sy-subrc 
    689     c_t_idocstate  type rsarr_t_idocstate
    690     c_subrc        like sy-subrc 
    691     c_abort        like sy-subrc. "#EC *  
    692   data:
    693     l_t_rsmondata like rsmonview occurs 0 with header line. "#EC * 
    694                    
    695  try.             
    696 * init variables
    697   move-corresponding g_s_is to comm_structure.
    698                     
    699 * fill the internal table "MONITOR", to make monitor entries  
    700                     
    701 * result value of the routine
    >>>>    CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'  
    703          EXPORTING      
    704               I_TIMNM_FROM       = '0CALDAY'  
    705               I_TIMNM_TO         = '0FISCPER'  
    706               I_TIMVL            = COMM_STRUCTURE-CALDAY
    707               I_FISCVARNT        = gd_fiscvarnt
    708          IMPORTING 
    709               E_FISCPER          = RESULT.                               
    710 * if the returncode is not equal zero, the result will not be updated 
    711   RETURNCODE = 0. 
    712 * if abort is not equal zero, the update process will be canceled
    713   ABORT = 0.
    714              
    715   catch cx_sy_conversion_error   
    716         cx_sy_arithmetic_error.
    717     perform error_message using 'RSAU' 'E' '507'
    718             'ROUTINE_0004' g_s_is-recno
    719             rs_c_false rs_c_false g_s_is-recno
    720             changing c_abort.
    721   catch cx_foev_error_in_function.
    System fields
    Name               Value
    SY-SUBRC           0                                         
    SY-INDEX           2                                         
    SY-TABIX           0                                         
    SY-DBCNT           0                                         
    SY-FDPOS           65                                        
    SY-LSIND           0                                         
    SY-PAGNO           0                                         
    SY-LINNO           1                                         
    SY-COLNO           1                                         
    SY-PFKEY           0400                                      
    SY-UCOMM           OK                                        
    SY-TITLE           Moniteur - Atelier d'administration       
    SY-MSGTY           E                                         
    SY-MSGID           RSAU                                      
    SY-MSGNO           583                                       
    SY-MSGV1           BATVC  0000000000                         
    SY-MSGV2           0PROJECT                                  
    SY-MSGV3                                           
    SY-MSGV4                                           
    Selected variables
    No. 23, type FORM, name ROUTINE_0004
    GD_FISCVARNT = '  ' (blank)
    RS_C_INFO = 'I'
    COMM_STRUCTURE-CALDAY = 20060303
    SYST-REPID = GP420EQ35FHFOCVEBCR6RWPVQBR
    RESULT = 000

    You have an update routine in which you are calling FM 'RS_BCT_TIMCONV_PS_CONV'. The parameter E_FISCPER must have the same type as the variable you pass to it (you can see the data type in the FM definition, transaction SE37). You should do something like the following:
    DATA: var TYPE <the same type as E_FISCPER in the FM definition>.
    CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
      EXPORTING
        i_timnm_from = '0CALDAY'
        i_timnm_to   = '0FISCPER'
        i_timvl      = comm_structure-calday
        i_fiscvarnt  = gd_fiscvarnt
      IMPORTING
        e_fiscper    = var.
    result = var.

  • Data load stuck from DSO to Master data Infoobject

    Hello Experts,
    We have an issue where a data load is stuck between a DSO and a master data InfoObject.
    Data is uploaded from a standard DSO to the master data InfoObject.
    This InfoObject has display and navigational attributes, which are mapped from the DSO to the InfoObject.
    We have now added a new InfoObject as an attribute to the master data InfoObject and made it a navigational attribute.
    When we run a full load via DTP, the load is stuck and does not process.
    Earlier the full load took only 5 minutes to complete.
    Please advise what the reason and cause behind this could be.
    Regards,
    santhosh.

    Hello guys,
    Thanks for the quick response.
    But it is not proceeding any further.
    The request is still running.
    Earlier this same data loaded in 5 minutes.
    Please find the screenshot.
    Master data for the InfoObjects is loaded as well.
    In SM50 I can see the process sitting on the P table of the InfoObject.
    Please advise.
    Please find the details:
    Updating attributes for InfoObject YCVGUID
    Start of Master Data Update
    Check Duplicate Key Values
    Check Data Values
    Process Time-Dependent Attributes - green
    No Message: Process Time-Dependent Attributes - yellow
    No Message: Generate Navigation Data - yellow
    No Message: Update Master Data Attributes - yellow
    No Message: End of Master Data Update - yellow
    and nothing proceeds any further in SM37.
    Thanks,
    Santhosh.

  • How to create a report in BEx based on the last data loaded into a cube?

    I have to create a query with a predefined filter based on the "latest SAP date", i.e. the user only wants to see the very latest situation from the last load. The report should show only the latest inventory stock situation from the last load. As I'm new to BEx, I am not able to find a way to achieve this. Is there any time characteristic which holds the last update date of a cube? Please help and suggest how to achieve this.
    Thanks in advance.

    Hi Rajesh,
    Thanks for your suggestion.
    My requirement is a little different. I build the query on a MultiProvider, and I want to see the latest record in the report based on the latest date on which data was loaded into the cube (not the system date). This date (when the cube was last loaded with data) is not populated from any data source. I guess I have to add the 0TCT_VC11 cube to my MultiProvider to fetch the date when my cube was last loaded. Please correct me if I'm wrong.
    Thanks in advance.
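    If extending the MultiProvider with 0TCT_VC11 is too heavy, another common approach is a customer-exit variable that derives the last load date of the cube. Below is a rough sketch for include ZXRSRU01 (enhancement RSR00001, EXIT_SAPLRRS0_001); the variable name ZLASTLOAD, the cube name, and the lookup on table RSICCONT (one row per request loaded into an InfoCube, with a TIMESTAMP field) are all assumptions to verify on your system:

    when 'ZLASTLOAD'.                "hypothetical exit variable
      if i_step = 2.                 "after variable entry
        data: l_ts   type rstimestmp,
              l_date type sy-datum,
              l_time type sy-uzeit.
        " assumption: newest request timestamp = last load of the cube
        select max( timestamp ) from rsiccont into l_ts
          where icube = 'YOUR_CUBE'. "hypothetical cube name
        convert time stamp l_ts time zone sy-zonlo
          into date l_date time l_time.
        clear l_s_range.
        l_s_range-sign = 'I'.
        l_s_range-opt  = 'EQ'.
        l_s_range-low  = l_date.
        append l_s_range to e_t_range.
      endif.

    You would then restrict the date characteristic in the query with this variable, so the report always shows only the last loaded date.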

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads failed last night, and when I dig into the process chains I find that the cube which should be loaded from the DSO is not getting loaded.
    The DSO itself was loaded from ECC without errors.
    The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
    I looked in the PSA: it has about 50,000 records, all data packages have a green light, and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed, and the error stack showed 101 records.
    The ZARAMT field has 0CURRENCY blank for all these records.
    I tried to assign USD to them; the changes were saved, and I tried to execute again, but then a message says the request ID has to be repaired or deleted before execution. I tried to repair it, but it says it cannot be repaired, so I deleted it and executed. It fails, and the error stack still shows 101 records; when I look at the records, the changes I made are gone.
    If I delete the request ID before making the changes and then try to save them, they don't get saved.
    What should I do to resolve the issue?
    thanks
    Prasad

    Hi Prasad,
    The error stack is request-specific. Once you delete the request from the target, the data in the error stack is also deleted.
    What you are supposed to do in this case is:
    1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done) and execute the DTP. All the erroneous records will accumulate in the error stack.
    2) Correct the erroneous records in the error stack.
    3) In the DTP, on the "Update" tab, you will find the option "Error DTP". If it has not been created yet, you will see "Create Error DTP"; click there and execute the error DTP. The error DTP fetches the records from the error stack and creates a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come in again as part of a delta or full load, your load will fail again. Check the source system and fix it there for a permanent solution.
    Regards,
    Debjani
