Load from ODS into InfoCube gives TIME_OUT runtime error after 10 minutes?

Hi all,
   We have a full load from ODS into an InfoCube that was working fine until last week with up to 50,000 records. Now we have around 70,000+ records, and the load started failing with a TIME_OUT runtime error.
   The following is from the Short Dump (ST22):
   The system profile "rdisp/max_wprun_time" contains the maximum runtime of a
program. The current setting is 600 seconds. Once this time limit has been exceeded, the system tries to terminate any SQL statements that are currently being executed and tells the ABAP processor to terminate the current program.
  The following are from ROIDOCPRMS table:
   MAXSIZE (in KB) : 20,000
   Frequency       :  10
   Max Processes : 3
  When I check the Data Packages under the 'Details' tab in the Monitor, there are four Data Packages, and the first three have 24,450 records each. I right-click each Data Package and select 'Manual Update' to load it from the PSA. When this Manual Update takes more than 10 minutes, it fails with TIME_OUT again.
  How can I fix this problem, please?
Thanks,
Venkat.

Hello A.H.P,
The following is the Start Routine:
PROGRAM UPDATE_ROUTINE.
*$*$ begin of global - insert your declaration only below this line  -*$*$
TABLES: /BIC/AZCPR_O0400, /BIC/AZCPR_O0100, /BIC/AZCPR_O0200.
DATA: material(18), plant(4).
DATA: role_assignment like /BIC/AZCPR_O0100-CPR_ROLE, resource like
/BIC/AZCPR_O0200-CPR_BPARTN.
*$*$ end of global - insert your declaration only before this line   -*$*$
* The following definition is new in the BW3.x
TYPES:
  BEGIN OF DATA_PACKAGE_STRUCTURE.
     INCLUDE STRUCTURE /BIC/CS8ZCPR_O03.
TYPES:
     RECNO   LIKE sy-tabix,
  END OF DATA_PACKAGE_STRUCTURE.
DATA:
  DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
       WITH HEADER LINE
       WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
FORM startup
  TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
           MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record n
           DATA_PACKAGE STRUCTURE DATA_PACKAGE
  USING    RECORD_ALL LIKE SY-TABIX
           SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
  CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
*$*$ begin of routine - insert your code only below this line        -*$*$
* fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
* to make monitor entries
   clear DATA_PACKAGE.
   loop at DATA_PACKAGE.
      select single /BIC/ZMATERIAL PLANT
         into (material, plant)
         from /BIC/AZCPR_O0400
         where CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID
         and ( MATL_TYPE = 'ZKIT' OR MATL_TYPE = 'ZSVK' ).
       if sy-subrc = 0.
          DATA_PACKAGE-/BIC/ZMATERIAL = material.
          DATA_PACKAGE-plant = plant.
          modify DATA_PACKAGE.
          commit work.
       endif.
       select single CPR_ROLE into (role_assignment)
                     from /BIC/AZCPR_O0100
                     where CPR_GUID = DATA_PACKAGE-CPR_GUID.
        if sy-subrc = 0.
          select single CPR_BPARTN into (resource)
                     from /BIC/AZCPR_O0200
                     where CPR_ROLE = role_assignment
                     and CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID.
               if sy-subrc = 0.
                  DATA_PACKAGE-CPR_ROLE = role_assignment.
                  DATA_PACKAGE-/BIC/ZRESOURCE = resource.
                  modify DATA_PACKAGE.
                  commit work.
               endif.
          endif.
       clear DATA_PACKAGE.
       endloop.
* if abort is not equal zero, the update process will be canceled
  ABORT = 0.
*$*$ end of routine - insert your code only before this line         -*$*$
Thanks,
Venkat.
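[Editor's note] The per-record SELECT SINGLEs (and the COMMIT WORK after every record) inside the DATA_PACKAGE loop are the usual cause of a TIME_OUT at this data volume. Below is a minimal sketch of the first lookup rewritten as one mass read; the table and field names are copied from the routine above, so verify them in your own system before use:

```abap
* Sketch only: buffer the /BIC/AZCPR_O0400 lookup in an internal
* table instead of one SELECT SINGLE per record. The role/resource
* lookups can be buffered the same way. Note there is no COMMIT
* WORK: the update process manages its own commits, and committing
* per record is itself a large part of the runtime.
DATA: BEGIN OF lt_matl OCCURS 0,
        cpr_ext_id     LIKE /BIC/AZCPR_O0400-CPR_EXT_ID,
        /bic/zmaterial LIKE /BIC/AZCPR_O0400-/BIC/ZMATERIAL,
        plant          LIKE /BIC/AZCPR_O0400-PLANT,
      END OF lt_matl.

IF NOT DATA_PACKAGE[] IS INITIAL.
* one database round trip for the whole package
  SELECT CPR_EXT_ID /BIC/ZMATERIAL PLANT
    INTO TABLE lt_matl
    FROM /BIC/AZCPR_O0400
    FOR ALL ENTRIES IN DATA_PACKAGE
    WHERE CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID
      AND ( MATL_TYPE = 'ZKIT' OR MATL_TYPE = 'ZSVK' ).
  SORT lt_matl BY cpr_ext_id.
ENDIF.

LOOP AT DATA_PACKAGE.
  READ TABLE lt_matl WITH KEY cpr_ext_id = DATA_PACKAGE-CPR_EXT_ID
       BINARY SEARCH.
  IF sy-subrc = 0.
    DATA_PACKAGE-/BIC/ZMATERIAL = lt_matl-/bic/zmaterial.
    DATA_PACKAGE-PLANT          = lt_matl-plant.
    MODIFY DATA_PACKAGE.
  ENDIF.
ENDLOOP.
```

Applying the same pattern (mass SELECT into a sorted table, then READ ... BINARY SEARCH) to the /BIC/AZCPR_O0100 and /BIC/AZCPR_O0200 lookups should bring each package well under the 600-second limit. Raising rdisp/max_wprun_time is also possible, but fixing the routine is the better option.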

Similar Messages

  • Delta loading from ODS to InfoCube

    Hai
    I'm extracting data from a custom table into an ODS and doing a full load every time.
    How can I load only the delta from the ODS to the InfoCube? What are the settings for this?
    Please let me know.
    kumar

    Hi <b>Ravi</b>...
    <u><b>Load the data from ODS to InfoCube</b></u>:
    Right-click the ODS in question.....
    Select the option "Update ODS Data in Data Target"......
    Then you can load the data to any type of data target (either ODS or InfoCube).
    <i><b>removed</b></i>...
    <u><b>BalajeeKannan</b></u>

  • Debugging data load from ODS to InfoCube?

    Hi Expert,
    I got some errors when loading data from ODS to cube. I tried to simulate the update, but got "No data in PSA table". I checked my InfoPackage: it only updates the data target directly, so my only choice is "Data targets only". How can I debug in this situation? How can I make the load update the cube through the PSA? Thanks in advance!
    Weidong

    Hi robert,
    Thanks for your reply! Do I have to delete the old one? It looks like the system generated two InfoPackages: one full update and one init update. I think I have to delete both of them and re-create them, right? Thanks a lot!
    Weidong

  • Maximum Time out  runtime error

    Hi All,
    The following query is giving the error message:
    SELECT mkpf~budat mseg~bwart mseg~werks mseg~matnr mseg~shkzg
           mseg~menge mseg~meins mseg~dmbtr mara~extwg makt~maktx
        INTO TABLE i_src
        FROM mkpf
          JOIN mseg ON mkpf~mblnr = mseg~mblnr
                   AND mkpf~mjahr = mseg~mjahr
          JOIN mara ON mseg~matnr = mara~matnr
          JOIN makt ON mseg~matnr = makt~matnr
        WHERE mkpf~budat IN s_budat AND
              mseg~bwart IN s_bwart AND
              mseg~matnr IN s_matnr AND
              mseg~werks IN s_werks AND
              mara~mtart  = c_mtart AND
              makt~spras  = sy-langu.
    The error message is :
    The program ZMMPR_KEY_MOVEMENT has exceeded the maximum permitted runtime and therefore has been terminated. 
    Can you tell me how we can improve the performance?
    Thanks,
    Sobhan.

    Hi Sobhan,
    I remember this problem (getting MBLNRs for a date range, material range, and movement type range) from the early 90s on an R/3 1.2.
    Try it this way:
    TABLES: mkpf, mseg, mara.
    SELECT-OPTIONS:
      s_budat FOR mkpf-budat,
      s_bwart FOR mseg-bwart,
      s_matnr FOR mseg-matnr,
      s_werks FOR mseg-werks.
    PARAMETERS:
      c_mtart TYPE mtart.
    TYPES:
      BEGIN OF i_scr_t,
        mblnr LIKE mkpf-mblnr,
        mjahr LIKE mkpf-mjahr,
        budat LIKE mkpf-budat,
        bwart LIKE mseg-bwart,
        werks LIKE mseg-werks,
        matnr LIKE mseg-matnr,
        shkzg LIKE mseg-shkzg,
        menge LIKE mseg-menge,
        meins LIKE mseg-meins,
        dmbtr LIKE mseg-dmbtr,
        extwg LIKE mara-extwg,
        maktx LIKE makt-maktx,
      END OF i_scr_t,
      BEGIN OF mblnr_t,
        mblnr LIKE mkpf-mblnr,
        mjahr LIKE mkpf-mjahr,
        budat LIKE mkpf-budat,
      END OF mblnr_t,
      BEGIN OF mat_t,
        matnr LIKE mara-matnr,
        extwg LIKE mara-extwg,
        mtart LIKE mara-mtart,
        maktx LIKE makt-maktx,
      END OF mat_t.
    DATA:
      lt_mblnr TYPE TABLE OF mblnr_t WITH HEADER LINE,
      lt_mat   TYPE HASHED TABLE OF mat_t WITH UNIQUE KEY matnr,
      ls_mat   LIKE LINE OF lt_mat,
      i_scr    TYPE TABLE OF i_scr_t WITH HEADER LINE.
    SELECT budat mjahr mblnr FROM mkpf
      INTO CORRESPONDING FIELDS OF TABLE lt_mblnr
      WHERE mkpf~budat IN s_budat.
    LOOP AT lt_mblnr.
      SELECT bwart werks matnr shkzg menge meins dmbtr
        FROM mseg
        INTO CORRESPONDING FIELDS OF i_scr
        WHERE mblnr = lt_mblnr-mblnr
          AND mjahr = lt_mblnr-mjahr
          AND bwart IN s_bwart
          AND matnr IN s_matnr
          AND werks IN s_werks.
        READ TABLE lt_mat INTO ls_mat
          WITH TABLE KEY matnr = i_scr-matnr.
        IF sy-subrc <> 0.
          ls_mat-matnr = i_scr-matnr.
          SELECT SINGLE extwg mtart FROM mara
            INTO (ls_mat-extwg, ls_mat-mtart)
            WHERE matnr = ls_mat-matnr.
          SELECT SINGLE maktx FROM makt INTO ls_mat-maktx
            WHERE matnr = ls_mat-matnr
              AND spras = sy-langu.
          INSERT ls_mat INTO TABLE lt_mat.
        ENDIF.
        IF ls_mat-mtart = c_mtart.
          i_scr-mblnr = lt_mblnr-mblnr. " carry over the document keys
          i_scr-mjahr = lt_mblnr-mjahr.
          i_scr-budat = lt_mblnr-budat.
          i_scr-extwg = ls_mat-extwg.
          i_scr-maktx = ls_mat-maktx.
          APPEND i_scr.
        ENDIF.
      ENDSELECT.
    ENDLOOP.
    It does not look very high-performing, but in fact it took only half an hour instead of the 3 days needed by the version that corresponds to yours. Yes, I know a lot has changed in the database interface since 1.2, but just try it and you will see. And forget joining MKPF and MSEG; no database will handle that well in a real system.
    If you find my answer useful, please don't forget the reward.
    Regards,
    Juergen
    Message was edited by: Juergen Wurth

  • Data Load from ODS to Infocube

    Hi All,
    I have been trying to load data from a standard ODS (activated from Business Content) to an InfoCube (also activated from Business Content). The records are present in the ODS and it has been activated. However, when the load was scheduled and run for the InfoCube, the data did not show up in the InfoCube content or in the fact table. When I looked at InfoCube->Manage->Requests, it showed 21 transferred records (the same as shown in the monitor for the request) and 0 added records. The other requests in the cube show the same pattern; only the number of records differs (e.g. 178 or 48). The fact table has 0 entries, and the cube is not compressed. The cube is 0BBP_C02. Can any of you suggest a solution to this problem?
    Thanks

    Hi Saura,
    Check start routines in the cube URs.
    For example, in my system for 0BBP_SC ODS object as infosource I see the start routine:
      LOOP AT DATA_PACKAGE.
        IF NOT ( DATA_PACKAGE-BBP_RELSC = 'I1129'
    *    if shopping cart is approved
          AND DATA_PACKAGE-BBP_ORIGIN = 'B' ).
    * and record originates in BBP
          DELETE DATA_PACKAGE.
        ENDIF.
      ENDLOOP.
    As you can see, records that do not meet certain conditions are deleted.
    You may find similar routines in other update rules.
    Best regards,
    Eugene

  • Error While Loading data from ODS to Infocube

    I am trying to load data from ODS to InfoCube through "Update ODS data in data target". My requirement was to take a small subset of fields from the ODS, design the InfoCube, and load the data.
    My load fails at the extraction step: I get 0 records out of the total number of records sent in each package. Please let me know if you need more information.
    Please advise.
    Thanks,
    RR

    In Details tab of monitor, in extraction step,
    Extraction (messages): Errors occurred
    Green Light for Data Request Received
    Green Light for Data Selection Scheduled
    Yellow for 25000 Records sent(0 records received)
    Yellow for 25000 Records sent(0 records received)
    Yellow for 15000 Records sent(0 records received)
    Green Light for Data Selection Ended
    Please let me know, If you need more information.
    Thanks,
    R R

  • Direct vs Indirect data load from ODS - InfoCube

    We know that there are two options to load data from ODS to InfoCube:
    1. Direct load:
    If the ODS option "Update data targets from ODS object automatically" is checked, then when data is loaded to the ODS it is also loaded to the InfoCube. In this way only one InfoPackage is needed: load the data to the ODS, and the InfoCube is filled automatically.
    2. Indirect load:
    If the option "Update data targets from ODS object automatically" is NOT checked, then in addition to the InfoPackage that loads the data to the ODS, a second InfoPackage must be created on the export DataSource of the ODS in order to load the data from the ODS to the InfoCube.
    I wonder what the pros and cons of these two loading methods are.
    Input from every BW expert is welcome!

    HI Kevin,
    Direct loads (or rather automated loads) are usually used where you need to load data automatically into the final data target and there are no dependencies. This process involves less maintenance, and you execute a single InfoPackage to send data to the final data target.
    Indirect loads, on the other hand, are usually used when you have a number of dependencies and the ODS object is one step in a process chain. If you need the ODS data load to wait until some other event has been executed, indirect loads through process chains are used.
    Regards,
    JAsprit

  • Error while loading the data from ODS to InfoCube

    Hi,
    I'm trying to load data from ODS to InfoCube for a particular year,
    but it says there is a source system problem.
    Why is that?
    Please tell me.
    I'll assign the points.
    rizwan

    Hi Rizwan,
    you didn't mention the error message details. There are a few places to check:
    - check if the BW "Myself" source system is active and intact, and reactivate it if necessary
    - check if the update rules are active, and reactivate them if necessary
    - check if the ODS is active, and reactivate it if necessary
    Regards,
    Lilly

  • Short dump problem when loading data from ODS to InfoCube

    Hi,
    I'm trying to load data from ODS to InfoCube, but I got the following error:
    Short dump in the Warehouse
    Diagnosis
    The data update was not completed. A short dump has probably been logged in BW providing information about the error.
    <b>System response
    "Caller 70" is missing.
    Further analysis:
    Search in the BW short dump overview for the short dump belonging to the request. Pay attention to the correct time and date on the selection screen.
    You get a short dump list using the Wizard or via the menu path "Environment -> Short dump -> In the Data Warehouse".
    Error correction:
    Follow the instructions in the short dump.</b>
    I looked for the short dump, but it says there is no short dump for that particular date selection.
    Please tell me what I have to do.
    I'll assign the points.
    Bye,
    rizwan

    Hi Rizwan,
    Why does this error occur?
    • This error normally occurs whenever BW encounters an error it cannot classify. There can be multiple reasons for it:
    o Whenever we load master data for the first time, it creates SIDs. If the system is unable to create SIDs for the records in the data packet, we can get this error message.
    o If the indexes of the cube are not deleted, the system may give the Caller 70 error.
    o Whenever we load transactional data that has master data as one of its characteristics and the value does not exist in the master data table, we get this error. The system can have difficulty creating SIDs for the master data while also loading the transactional data.
    o If an ODS activation is taking place and another ODS activation is running in parallel, the system may classify the error as Caller 70, as there were no processes free for that ODS activation.
    o It also occurs whenever there is a read/write conflict on the active data table of the ODS. For example, if activation is happening for an ODS and data loading to the same ODS is taking place at the same time, the system may classify the error as Caller 70.
    o It is a system error which can be seen under the "Status" tab in the job overview.
    What happens when this error occurs ?
    • The exact error message is “System response "Caller 70" is missing”.
    • It may happen that it may also log a short dump in the system. It can be checked at "Environment -> Short dump -> In the Data Warehouse".
    What can be the possible actions to be carried out ?
    • If the master data is being loaded for the first time, we can reduce the data package size and reload the InfoPackage, since processing sometimes depends on the size of the data package. We can also try to split the data load into several loads.
    • If the error occurs in a cube load, we can try deleting the indexes of the cube and then reloading the data.
    • If we are loading transactional and master data together when this error occurs, we can reduce the size of the data package and try reloading, as the system may find it difficult to create SIDs and load data at the same time. Or we can load the master data first and then the transactional data.
    • If the error happens during ODS activation because no processes are free or available for the activation, we can define the processes in transaction RSCUSTA2.
    • If the error is occurring due to a read/write conflict on the ODS, we need to change the scheduled time of the data loading.
    • Once we are sure that the data has not been extracted completely, we can delete the red request from the Manage tab of the InfoProvider and re-trigger the InfoPackage.
    • Monitor the load for successful completion, and complete the further loads if any in the Process Chain.
    (From Re: caller 70 missing).
    Also check links:
    Caller 70 is missing
    Re: Deadlock - error
    "Caller 70 Missing" Error
    Caller 70 missing.
    Bye
    Dinesh

  • Error when loading from ODS to Cube

    Hello Friends,
    I am having trouble loading data from ODS to InfoCube in my 2004s system. When loading data, I get these messages:
    07/03/2007     13:10:25     Data target 'ODSXYZ ' removed from list of loadable targets; not loadable.
    07/03/2007     13:28:42     Data target 'ODSXYZ ' is not active or is incorrect; no loading allowed
    I checked for ODSXYZ in my data targets, but there is nothing by that name. Even the InfoPackage doesn't have it. What needs to be done? Please help.
    Thanks.

    It's expected behavior. When you migrate your DataSource, the InfoPackages associated with it grey out all the data targets they were feeding before; that applies to any InfoPackage you create even after the migration. You won't be able to delete it.
    Having said this, this shouldn't impact your loads from ODS to cube, as those should be handled by your DTPs rather than your InfoPackages.
    A few questions:
    How are you loading your cube?
    Did the data get through to the PSA fine with the InfoPackage in question?
    How did you load your DSO (assuming the load was successful)?
    Message was edited by:
            voodi

  • Data Extraction from ODS to Infocube

    Hi.
    How will you extract data from ODS to Infocube. what is process ?
    please explain....

    Hi,
    As the number of records is a lot more than your usual load, it will take much more time today, but make sure the job is running fine.
    You could load the data with selections, but at this point I don't think you can do that, because you would have to kill the job. Maybe some changes were made in the InfoPackage, which is why it is picking up more records.
    Hope this helps you.
    Reagds,
    shikha

  • Problem when loading from ODS to the CUBE

    Hi Experts,
    I am facing an unusual problem when loading data from ODS to the cube.
    I have a first-level ODS where the delta postings are updated. I have checked the active table of the ODS, and the data is accurate.
    I have deleted the entire data from the cube and am trying to do a full load from ODS to the cube.
    I am sure that with a full load the data comes from the active table of the ODS.
    After the full load, the key figure values are 0. I tried testing by loading a couple of sales documents, and the key figure values are still 0.
    When I run a full load, the data should be picked up exactly as it is in the active table of the ODS.
    I also don't have any fancy routines in the update rules.
    Please help me in this regard.
    Regards
    Raghu

    Hi,
    Check whether you followed the procedure exactly. Just follow these basic steps and let me know your issue:
    o     First prepare flat files in Microsoft Excel for the master data and the transaction data, save them in .CSV format, and close them.
    o     Under InfoObjects, create an InfoArea, then an InfoObject catalog, then the characteristics and key figures. Create an 'id' characteristic with a name attribute and activate it; create a 'no' key figure and activate it.
    o     Under InfoSources, create an application component, then an InfoSource with direct update for the master data and flexible update for the transaction data. For the master data flat file, create an InfoPackage and execute it.
    o     For the transaction data, go to the InfoProvider view, right-click, and choose 'Create ODS object'. Give the ODS a name, then in the next screen drag the characteristics to the key fields.
    o     Activate the ODS. If the update rules give an error, go to the communication structure, add 0RECORDMODE, activate it, and then activate the update rules.
    o     Go to the InfoSource and create an InfoPackage (external data, processing, data targets, scheduling), click Start, and watch the monitor.
    o     If you cannot see the records, go back to the InfoProvider, right-click the ODS, and choose 'Activate data in ODS'.
    o     In the QM status column (yellow), double-click, set the status to green, click Continue, and save; then close the screen.
    o     Go back to the InfoPackage, right-click it, schedule it, and go directly to the Monitor tab. Refresh until the request turns green.
    o     Once it is green, check the data target and verify that the data was loaded.
    o     Now go to the InfoProvider, right-click, and create an InfoCube. Give it a name, drag the key figures, time characteristics, and characteristics to their respective sides, create and assign the dimensions, and activate the cube. Then right-click the cube, choose 'Create update rules', select the ODS option, give the ODS name, and activate.
    o     Go back to the main screen and refresh. When you see the generated export InfoSource (name starting with 8), you know the data mart concept is in use.
    o     Right-click the ODS and choose 'Update ODS data in data target'. In the popup you will see two options, full update and initial update; select initial update.
    o     In the InfoPackage screen (external data, data targets, scheduler), select the 'Start later in background' option, choose Immediate, save, and click Start.
    o     Then use the monitor and check the cube contents to verify the load.
    regards
    ashwin

  • How to debug start routines of update rules from ODS to InfoCube

    Dear gurus,
      I have an update rule from ODS to InfoCube, and I wrote a start routine in it. Now I want to debug it. I went to the monitor and simulated updating the data package, but only got the prompt "No data exists in the corresponding PSA table". So how can I debug this start routine?
      Thanks in advance.
    Jin Ming

    Jin,
    In order to use PSA between ODS and InfoCube, you may have to use an exclusive InfoPackage and load separately. In that InfoPackage, choose the radio button to use a PSA.
    I think you are currently updating the InfoCube directly without using a separate InfoPackage.
    Look for an InfoSource under DataMarts (search for 8<ODS Technical name>) and create your InfoPackage there.
    Good luck.
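    [Editor's note] Once the load does go through the PSA, a common trick (one option among several, not the official procedure) for stopping inside the start routine of a background load is a guarded endless loop, sketched below; the variable name is illustrative:

```abap
* Debug aid sketch: put this at the top of the start routine, then
* attach to the running work process in SM50 via
* Program/Session -> Program -> Debugging.
* For 'Simulate update' in the monitor, a plain BREAK-POINT or a
* session breakpoint in the routine is usually enough.
DATA: g_stop(1) TYPE c VALUE 'X'.
WHILE g_stop = 'X'.
* in the debugger, set g_stop to space to let the load continue
ENDWHILE.
```

    Remember to remove the loop before transporting the update rules.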

  • Some records not transferred from ODS to InfoCube

    Hello BW folks ,
    We have an ODS which stores the various sales doc. types.
    We are transferring all the data from this ODS to an InfoCube. We do not have any routines or filter conditions in the load from ODS to InfoCube.
    There is no routine in the update rules of the InfoCube.
    There is also no start routine for the InfoCube.
    The data is loaded successfully from ODS to InfoCube.
    But still, one particular sales doc type is not transferred from ODS to InfoCube,
    although this sales doc type is present in the ODS.
    The sales doc type is maintained with respect to the order number. If I check the order numbers of that sales doc type in the ODS, the same order numbers are not present in the InfoCube.
    That means some order numbers are also getting lost while transferring data from ODS to InfoCube.
    We do not have any object in the update rules filled as 'master data attribute of'.
    Please suggest what to do in this case.
    Amol.

    hello VC,
    I have checked that in the ODS update rules the key figures have update type 'Addition', and the load from ODS to InfoCube is init and then delta.
    Is this the reason the particular sales doc type is overwritten?
    I think we should do a full upload only from ODS to InfoCube in this case.
    Regards,
    Amol.

  • Error in Process Chain From ODS to InfoCube

    Hi There
    What variants should I use to load data from ODS to InfoCube in a process chain?
    I created a process chain: first delete the content of the ODS, then data load, activate ODS, further update ODS, delete index, load from ODS to InfoCube, and generate index, and it gives a warning message.
    Advice me regarding this.
    Regards,
    Chandu.

    Hi Chandu,
    Please remove the 'further update' process from the chain.
    The chain should look like...
    start -> delete contents of the ODS -> load to ODS -> activate ODS -> delete index of cube -> load from ODS to cube -> reconstruct indexes.
    If it's only a warning message, you can ignore it. Try activating the process chain and running it with the immediate option.
    By the way, what does the warning message say??
    If it is just something like 'there should be a process...', then ignore it.
    Regards,
    Marc
    Message was edited by: Marc
