COPA delta Request

Hi,
I updated my update rules, and I want them to apply to the last 5 requests.
I am extracting from COPA.
How can I extract the last 5 deltas?
Do I have to set the timestamp in R/3? Where, and how?
I want to know the options available.
Tim

Hi Tim,
Do you still have the data in the PSA? If so, you can reload from the PSA itself and it will be processed according to your new update rules. You could also change the timestamps, but keep that as the very last resort, as it is not a recommended practice.
Hope this helps...

Similar Messages

  • Standard and generic COPA delta

    Hello
I don't understand why many people say "standard COPA delta" and "generic COPA delta".
COPA is a generated application, so the delta should be generic.
What do you mean when you talk (in many posts in this forum) about a standard COPA delta?
    Thanks

    Hi,
COPA extraction is a generated application based on the operating concern configured in the R/3 system, and the same applies to the delta process.
R/3 table TKEBWTS contains information about the COPA load requests that have been posted into BW. The timestamp entries in this table can be maintained to reinitialize the delta. This may be the reason some call it generic or standard delta.
    Dev

  • COPA Delta Mechanism

    Hi All,
Can anyone explain the delta mechanism in COPA, i.e.:
How does delta work in COPA?
From where are the delta records picked up?
When is the delta queue filled?
What is the safety delta?
What is a realignment, and when do we go for it?
    Thanks in Advance,
    Regards
    Ramakrishna Kamurthy

Hi Ramakrishna,
Up to and including Plug-In Release PI2003.1, a DataSource is only defined in the current client of the R/3 System. This means that a DataSource can only be extracted from this client. The DataSource has a timestamp for the delta method, and this timestamp is only valid for the current client. This timestamp is managed by Profitability Analysis.
With Plug-In Release PI2004.1 (Release 4.0 and higher), timestamp management was converted to a new method, called generic delta. This method works in connection with an SAP BW system with Release 2.0 and higher. With this method, timestamp management is no longer performed by Profitability Analysis, but instead by the Service API (the interface on the R/3 side between Profitability Analysis and SAP BW). Only in Release 3.1I does Profitability Analysis continue to support timestamp management.
Compared to timestamp management in Profitability Analysis, the generic delta allows for several enhancements:
o   You can apply the delta method simultaneously using the same DataSource from more than one R/3 System client, because a separate timestamp is saved for each logical system.
o   You can apply the delta method for the same R/3 System client simultaneously using the same DataSource from several SAP BW systems.
o   You can perform several initializations of the delta method with different selections using the same DataSource from a given SAP BW system for the same R/3 System client.
o   The DataSource supports the Delta Init Simulation mode. With timestamp management in Profitability Analysis, this mode had to be implemented using the Simulate Delta Method Initialization function (see SAP Note 408366).
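The first two enhancements above can be sketched roughly as follows. This is a minimal, illustrative Python sketch, not SAP code: the dictionary keyed by (logical system, BW system) merely stands in for the timestamp storage that the Service API maintains per combination of source client and target BW system.

```python
# Sketch of generic-delta timestamp bookkeeping: unlike the old
# COPA-managed single timestamp per client, the generic delta keeps one
# pointer per (source logical system, target BW system) pair, so several
# BW systems can pull deltas from the same client independently.
class GenericDeltaPointers:
    def __init__(self):
        # key: (logical_system, bw_system) -> last extracted timestamp
        self.pointers = {}

    def init_delta(self, logical_system, bw_system, timestamp):
        self.pointers[(logical_system, bw_system)] = timestamp

    def last_timestamp(self, logical_system, bw_system):
        return self.pointers.get((logical_system, bw_system))


ptrs = GenericDeltaPointers()
# Hypothetical system names: two BW systems initialize independently
# against the same R/3 client without disturbing each other's pointer.
ptrs.init_delta("R3CLNT800", "BW_PROD", 20090101000000)
ptrs.init_delta("R3CLNT800", "BW_QA", 20090301000000)
```

With the old COPA-managed timestamp there was effectively only one such pointer per client, which is why parallel extraction into several BW systems was not possible before PI2004.1.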
For more information on the generic delta, see Delta Transfer; the steps of the Specify Generic Delta for a DataSource section are performed automatically for Profitability Analysis when a DataSource is created. For this, the field determining the delta is taken as the timestamp for Profitability Analysis (TIMESTMP), and the timestamp is stored for summarization levels and line item tables. However, in contrast to generic DataSources, the TIMESTMP field is not generated in the extraction structure, because this is not necessary for DataSources in Profitability Analysis. As with timestamp management in Profitability Analysis, an upper limit of 30 minutes is set as the safety interval.
You find the timestamp of a DataSource for the delta method in the current logical system either in the Header Information for the DataSource, using the IMG activity Display Detailed Information on DataSource, or using the IMG activity Check Delta Queue in the Extractor IMG. The timestamp is shown here when you choose the selection button in the Status column for the combination of DataSource and SAP BW system.
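The 30-minute safety interval mentioned above can be illustrated with a small sketch (plain Python, illustrative only; the record layout and function name are invented): a delta run selects records posted after the last stored timestamp but only up to "now minus the safety interval", so that postings still in flight near the upper boundary are picked up by the next delta rather than lost.

```python
from datetime import datetime, timedelta

SAFETY_INTERVAL = timedelta(minutes=30)  # upper-limit safety interval

def select_delta(records, last_timestamp, now):
    """Return records with last_timestamp < TIMESTMP <= now - safety interval,
    plus the new pointer to store for the next delta run."""
    upper = now - SAFETY_INTERVAL
    delta = [r for r in records if last_timestamp < r["TIMESTMP"] <= upper]
    return delta, upper

rows = [
    {"TIMESTMP": datetime(2009, 8, 17, 10, 0)},   # safely inside the window
    {"TIMESTMP": datetime(2009, 8, 17, 11, 45)},  # inside the safety interval:
                                                  # left for the next delta run
]
delta, new_pointer = select_delta(rows, datetime(2009, 8, 17, 9, 0),
                                  datetime(2009, 8, 17, 12, 0))
# delta contains only the 10:00 record; new_pointer is 11:30
```

The next delta run then uses `new_pointer` as its lower bound, so the 11:45 record is extracted exactly once, just later.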
DataSources created after implementing PI2004.1 automatically apply the new method. DataSources that were created in Plug-In releases prior to PI2004.1 continue to use timestamp management in Profitability Analysis, but can be converted to the generic delta. For this, an additional selection option, Convert to Generic Delta, appears in the selection screen of the IMG activity Create Transaction Data DataSource when a DataSource with timestamp management in Profitability Analysis is entered. Conversion from the generic delta back to timestamp management in Profitability Analysis is not supported.
Conversion is only possible for DataSources that are defined in the current client of the R/3 system and for which the delta method has already been successfully initialized, or for which a delta update has successfully been performed. This is the case once the DataSource has the replication status Update successful. Furthermore, no realignments should have been performed since the last delta update.
    For the conversion, the timestamp for the current R/3 System client is
    transferred from Profitability Analysis into the timestamp of the
    generic delta. In this way, the transition is seamless, enabling you to
    continue to perform delta updates after the conversion. If delta updates
    are to be performed from different R/3 System clients for this
    DataSource, you first need to initialize the delta method for these
    clients.
    The conversion must be performed separately in each R/3 System because
    the timestamp information is always dependent on the current R/3 System
    and is reset during the transport. If, however, a converted DataSource
    is inadvertently transported into a system in which it has not yet been
    converted, delta extraction will no longer work in the target system
    because the timestamp information is deleted during the import into the
    target system and is not converted to the timestamp information of the
    generic delta. If in this instance no new delta initialization is going
    to be performed in the target system for the DataSource, you can execute
    program ZZUPD_ROOSGENDLM_FROM_TKEBWTS from SAP Note 776151 for the
    DataSource. This program reconstructs the current time stamp information
    from the information for the data packages transported thus far and
    enters this time stamp information into the time stamp information for
    the generic delta. Once this program has been applied, delta extraction
    should work again. Normally, however, you should ensure during the
    transport that the DataSource uses the same logic in the source system
    and the target system.
    After the conversion, the DataSource must be replicated again from the
    SAP BW system.
    A successful conversion is recorded in the notes on the DataSource,
    which you can view in the IMG activity Display Detailed Information on
    DataSource.
Since the generic delta does not offer any other log functions apart from the timestamp information (status: Plug-In Release PI2004.1), Profitability Analysis still logs the delta initialization requests or delta requests. However, the information logged, in particular the timestamps, only has a statistical character, because the actual timestamp management occurs in the generic delta. Since the delta method can be performed simultaneously for the same R/3 System client using the generic delta from several SAP BW systems, the information logged is stored for each logical system (in R/3) and SAP BW system. When a delta initialization is simulated, only the timestamp of the generic delta is set; Profitability Analysis is not called. Consequently, no information can be logged in this case. Messages concerning a DataSource are only saved for each logical system (in R/3). You can use the IMG activity Display Detailed Information on DataSource to view the information logged.
Another enhancement from Plug-In Release PI2004.1 is that, in the Extractor Checker of the Service API, you are no longer limited to full updates for DataSources that have been newly created or converted to the generic delta. The following update modes are possible:
      o   F - Full update: Transfer of all requested data
      o   D - Delta: Transfer of the delta since the last request
      o   R - Repeat transfer of a data package
      o   C - Initialization of the delta transfer
      o   S - Simulation of the initialization of the delta transfer
      In the case of all update modes other than F, you have to specify an SAP
      BW system as the target system so that the corresponding timestamp
      and/or selection information for reading the data is found. The Read
      only parameter is set automatically and indicates that no timestamp
      information is changed and that Profitability Analysis does not log the
      request.
      Update mode I (transfer of an opening balance for non-cumulative values)
      is not supported by Profitability Analysis.
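The update modes listed above can be captured in a small lookup table (illustrative Python; the one-letter codes are those named in the list, everything else, including the helper function, is a sketch). As the text notes, all modes other than a full update need a target BW system so that the per-system timestamp and selection information can be found.

```python
# Update modes for a COPA DataSource on the generic delta, as listed above.
UPDATE_MODES = {
    "F": "Full update: transfer of all requested data",
    "D": "Delta: transfer of the delta since the last request",
    "R": "Repeat transfer of a data package",
    "C": "Initialization of the delta transfer",
    "S": "Simulation of the initialization of the delta transfer",
}

def requires_bw_target(mode):
    # Everything other than a full update reads per-BW-system
    # timestamp/selection information, so a target system is required.
    return mode in UPDATE_MODES and mode != "F"
```

Mode I (opening balance for non-cumulative values) is deliberately absent from the table, matching the statement above that Profitability Analysis does not support it.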
    Assign Points if useful
    Regards
    Venkata Devaraj

  • Error in COPA Delta

    Hi,
    While I was trying to change the Timestamp of a COPA Delta load to a previous load in transaction KEB5, I got the following Error Message:
"Master Record for DataSource 1_CO_PA800IDEA_AC is defined for generic delta
Read Timestamp could not be set"
What might be the possible reason for such an error message? Could anybody help me in this regard?

Please get me the solution for this particular scenario.
Let us suppose that I created the DataSource about a year ago and have been running the deltas successfully since then. Let us number the requests 1, 2, 3, and so on since 1st Jan 2009, and suppose each request loads about 1000 records every day from ECC to BI.
Now, on 10th August, some change was made in ECC (such as a change in a user exit) that caused a mismatch in the number of data records (loading only 950 records) from the day the change was made. But I identified the problem only on 17th August, with request number 160. Say the request numbers are 153 (10th), 154 (11th), 155 (12th), 156 (13th), 157 (14th), 158 (15th), 159 (16th), and 160 (17th). I fixed the program in ECC on the 17th, and from request number 160 (17th) onwards the requests are fetching the correct number of data records. Now I need to delete the data in BI for request numbers 153 to 159, since they do not reconcile the ECC data with BI, and then reload the deltas for those requests from ECC to BI. How do I reload the deltas in this scenario, given that the ECC system keeps track only of the last delta, i.e. 17th August (request 160)? How do I load the deltas from request numbers 153 to 159 from ECC to BI?
One proposed solution was setting the COPA timestamp back to 10th August. But it is not advised to set the timestamp for COPA DataSources using the KEB5 transaction in a production system.
In this scenario, how do I load request numbers 153 to 159 from ECC to BI to fetch the correct data records for that period? Please propose a solution for this problem.
Thanks in advance

  • DTP Delta request for open hub destination.

    Hi expert,
As you know, in BW 7.0 the open hub destination is used for the open hub service, and you can create a transformation/DTP for the destination; in some cases the DTP delta is enabled.
My questions are:
1) In the context menu of an open hub destination I did not find anything like Manage, which would list all the DTP requests executed for this open hub destination; for a normal InfoCube or ODS you can find such a list.
2) If the DTP has delta enabled, where can I delete the latest delta request and start a new load?
3) I assume the DTP can be executed in batch, integrated with a process chain via the process type 'DTP', right? If the destination type is a third-party tool, can it also be integrated with a process chain?
4) Can you give some obvious differences between an InfoSpoke and an open hub destination in terms of advantages and disadvantages?
Thanks a lot for the feedback.
    Best Regards,
    Bin

    Hi ,
Please look at the links below. Hope it helps.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d063282b-937d-2b10-d9ae-9f93a7932403?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e830a690-0201-0010-ac86-9689620a8bc9?QuickLink=index&overridelayout=true
    Regards
    Santosh

  • Issue with Delta Request

    Hi Friends,
    I have an issue with loading.
1. From source 2LIS_13_VDITM, data loads to 2 targets, ZIC_SEG and 0SD_C03, and from 0SD_C03 it loads again to ZSD_C03W and ZSD_C03M through DTP.
I did a repair full load on 08.08.2011 to the PSA and loaded this request manually to cube ZIC_SEG.
I forgot to delete this request in the PSA, as I did not want to load it to the other targets.
Before I noticed, delta requests had already been loaded and had pulled my repair full request to the other targets as well.
As I have not done any selective deletions on the other cubes, there may be double entries.
I am planning to do the steps below in order to rectify the issue:
1. Do a selective deletion of the delta request in all 3 targets which got loaded on the 8th along with the repair full.
2. Delete the repair full request from the PSA.
So now the delta request which got loaded after the repair full request is left in the PSA.
My question is: if my process chain runs today, will this delta request in the PSA also be pulled again to the cube through the DTP?
Kindly share any other ideas, please.
    Regards,
    Banu

    Hi Banu,
If the data in the cubes is not compressed, then follow the steps below:
1) Delete the latest request in cube ZIC_SEG.
2) Delete the latest request from cubes ZSD_C03W and ZSD_C03M, and then from 0SD_C03.
3) Load the delta request manually using a DTP to cubes ZIC_SEG and 0SD_C03 (in the DTP you can load by giving the request number).
4) Now run the delta DTP to load the delta request from 0SD_C03 to ZSD_C03W and ZSD_C03M.
The next time your process chain runs, it will load only that particular day's delta request.
It will work.
    Regards,
    Venkatesh

  • Is it mandatory to delete all the previous delta requests

    Hi Gurus,
I got stuck with the following issue. Please help.
We have two InfoCubes, a regional cube and a global cube.
We are loading data from the regional InfoCube into the global InfoCube through a delta update.
But the values did not tie up, so I ran the init and loaded the data successfully. Now, is it mandatory to delete all the previous delta requests which were already loaded into the global InfoCube?

    Hi,
Yes, you need to delete the old data from the global cube, then run the init and load the data accordingly.
    Thanks & Regards,
    Ramnaresh.

  • How is it possible that a target has two delta requests with the exact same records

    HI,
How is it possible that a target has two delta requests with the exact same records?

    Hi Mohammed,
There are a couple of reasons behind it.
The main reason is this: take a scenario where we have a DSO as the source and a cube as the target.
The DSO contains only 1 request, which I have loaded to the cube using a delta.
After that, I deleted the DSO contents by right-clicking on it and choosing Delete Data.
(Hope you know that this will not delete the request from the target automatically.)
Now I loaded the same request again from the DataSource to the DSO, and then loaded the delta again from the DSO to the cube.
In this case, for 1 single request in the DSO you can see 2 requests in the cube with the same data.
    Regards,
    Ashish

  • DELETE A DELTA REQUEST THAT WAS COMPRESSED !!

    Dear Experts,
I have some inconsistent data that I'm trying to address.
So, I've deleted a few requests out of ODS_1, which sends deltas to CUBE_1.
After that, I went to Manage for CUBE_1 and deleted a couple of those common requests from CUBE_1.
Except one request, XXXXXX, for 11/1/05. When I try to delete this request it says "This request has been aggregated, compressed or scheduled".
It said the same thing for a few of the previous requests that I deleted, but after a few minutes it allowed me to delete them successfully.
This request is adamantly not going any further. So I went to RSMO and changed the status from green to red on the Status tab, but it's still not letting me delete it.
I'm concerned about two things:
1. How do I delete this delta request XXXXXX?
2. If I resend the data from DataStage to ODS_1 from 11/1 to date, it will then send deltas to CUBE_1. But CUBE_1 already has request XXXXXX for 11/1/05, so that would create duplicate records. What would I do at that point?
Please guide me through this.
Thank you,
Ti
Message was edited by: TI

You need to check whether the request you are trying to delete has been compressed in the base cube. Once a request is compressed in the base cube, you cannot delete it.
As Dinesh mentions, the cause could also be that you have an aggregate that is set to automatically compress after the rollup, and you cannot delete requests from an aggregate that has been compressed either. Now, if the request has NOT been compressed in the base cube and the problem is just due to the aggregate, you can deactivate the aggregate, which should allow you to delete the request from the base cube.
If the request has been compressed in the base cube, I think you really have only a couple of options:
Reload the cube - might be the easiest if the cube is not too big.
Selective deletion might work if you can identify criteria that point only to the data loaded in that request. Possible, but not likely - it depends on the cube design and data.
If you still have the data in the PSA, you can use it to create a flat file in which you reverse all the key figure signs, and then load that to the cube from a flat file DataSource you define, to offset the data from the request that you cannot back out. Another variation would be to try to generate a similar reversing flat file from the source ODS. The PSA is probably the better approach if you still have the data there.
So ...
The first thing is to identify why you cannot delete the request - compression in the base cube, or compression in an aggregate.
This is why you shouldn't compress requests in the base cube until you have verified the data.
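The reversal approach described above (build a flat file from the PSA with all key figure signs flipped) can be sketched like this. This is hypothetical Python with invented field names, not the actual BW flat-file format; which fields are key figures depends on your cube.

```python
# Sketch of building reversal records from PSA data: characteristics are
# kept as-is, key figures are negated, so loading the resulting file
# offsets the compressed request that cannot be deleted.
KEY_FIGURES = {"REVENUE", "QUANTITY"}  # hypothetical key figure fields

def reverse_record(record):
    return {
        field: -value if field in KEY_FIGURES else value
        for field, value in record.items()
    }

psa_rows = [{"CUSTOMER": "C100", "REVENUE": 250.0, "QUANTITY": 3}]
reversal_rows = [reverse_record(r) for r in psa_rows]
# reversal_rows[0] == {"CUSTOMER": "C100", "REVENUE": -250.0, "QUANTITY": -3}
```

After the reversal file is loaded, the net effect of the stuck request is zero, and the corrected data can be reloaded normally.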

  • Aggregated delta request

    Hi Experts,
In my cube, one delta request has been rolled up into the aggregates.
Is it possible to delete that request?
    Thanks
    Ajay

Hi,
Yes, but as Shikha already said, you have to roll back the request, because once rollup is done, data moves from the F fact table to the E fact table. In the E fact table the request number doesn't exist, so we cannot delete the request.
So first deactivate the aggregates. Since it is a delta request, don't forget to set the QM status to red, to set back the init pointer; otherwise data will be lost. Then delete the request. After reloading the request, activate and re-fill the aggregates manually again.
Regards,
Debjani

  • Erroneous delta request

There is a delta request in the InfoPackage that I need to delete. I am getting an "erroneous delta request" message. I want to kill the request. How do I do it? I checked the scheduler options but I do not see an option for deleting the request. When I click on the stop job (recycle bin icon) I keep getting the same message. I would appreciate a response.

    Found out the problem. Delete requests from all data targets first.

  • Delta Request

I have a delta request that failed. Do I need to change the QM status before I can reload it from the PSA?

    Hi,
For delta failures, first you need to change the QM status to red.
Then delete the request from the data targets.
Then request a repeat delta by scheduling the InfoPackage (or the repeat step in the process chain).
It asks whether to repeat the last delta; click Yes.
Now the repeat delta upload will work fine.

  • Delete the delta request

    Hi BW Experts,
I have a doubt; please clear it up.
For the daily delta, data first goes to an ODS and is then updated to a further data target, i.e. an InfoCube.
Suppose today (22-09-2007) the delta ran successfully into the ODS as well as the InfoCube.
If I delete the 15-09-2007 request (which is a delta request) from both the ODS and the InfoCube, what will happen?
Is there any problem in the ODS and InfoCube?
Please give me a reply.
And my second question is:
What is the main purpose of DB statistics?
    Thanks in advance.
    Regards,
    Anjali

    hi,
If you delete a particular delta request from the ODS and the cube, the data updated through that request number will be deleted from both the cube and the ODS.
DB statistics are mainly for OLAP runtime analysis and query runtime analysis.
If helpful, provide points.
Regards,
Harikrishna N

  • Error in loading delta request to ODS

    Dear All,
I am not able to load data from one ODS to another ODS and a cube. It says "the last delta update was incorrect; therefore, no new delta update is possible".
I guess I need to delete the old delta update request.
Can you all help me? It's a bit urgent.

I have done all the possible steps as guided by you all, but I am still not able to load it.
I will tell you my process flow:
I have a process chain which loads data to a source ODS and subsequently to another ODS and a cube.
But there was an error in loading the data to the source ODS today, so I deleted the request from the source ODS and from the other ODS and cube.
Then I loaded data to the source ODS successfully and tried loading data from the source ODS to the other ODS and cube.
But the request ended with errors, so I deleted the request again from the target ODS and cube.
Now when I again load data to the target ODS and cube, it says:
1) "Delta request REQU_CGSYNH9QR02W5WZOQBDPTHP99 is incorrect in the monitor"
2) A repeat needs to be requested
3) Data target 0CRM_C04 still includes delta request REQU_CGSYNH9QR02W5WZOQBDPTHP99
4) Data target 0CRM_OPMO still includes delta request REQU_CGSYNH9QR02W5WZOQBDPTHP99
5) If the repeat is now loaded, then after the load
6) there could be duplicate data in the data targets
7) Request the repeat?
CONTINUE CANCEL
What do I do now?

  • CUBE (Delta Request)

    Hello,
Can we delete a delta request from a cube?
The scenario is:
      Delta Request 1    (loaded on 12/07/06)
      Delta Request 2    (loaded on 11/07/06)
      Delta Request 3    (loaded on 10/07/06)
      Delta Request 4    (loaded on 09/07/06)
      Delta Request 5    (loaded on 08/07/06)
Can we delete Delta Request 4 from the cube without the next delta being affected?
    Thanks

Yes, this is possible, Krishna. The other requests will not be affected, but the data may change.
    Hope this helps...
