Re: Update Cache Objects in Delta Process Doesn't Work

Hi All,
BI 7 - SP 17
This is the scenario I am working on: I am running a BEx query on a cube (via a MultiProvider) with a bunch of aggregates.
The daily extraction and aggregate rollup are correct, but when I run the BEx query it displays incorrect key figure values compared to what we see in LISTCUBE for the InfoCube.
When I ran the same query in RSRT with "Do not use Cache", it gave correct results, and when I then ran the BEx query again it fixed itself and displayed correctly.
InfoCube - standard, no compression of requests
Query Properties are
Read Mode - H
Req Status  - 1
Cache - Main Memory Cache Without Swapping
Update Cache Objects in Delta Process (Flag selected)
SP grouping - 1
This problem occurs once every couple of weeks. My question: is there a permanent fix for it?
Or should we turn the cache off?
Can anyone please help?
Thanking You.
Rao

Hi Kevin/Rao,
We are currently experiencing problems with the 'Update Cache Objects in Delta' process.  Did either of you manage to resolve your issues, and if so, how?

Similar Messages

  • Updating an object in cache without getting it

    Hello
    I need to know how it is possible to update/modify the attributes of a regular cached object.
    I see that the NamedCache.put() method only replaces the old value.
    I would like to avoid the get from cache --> modify attribute locally --> put to cache round trip, since I already know what to change in the objects.
    My topology: no near/local cache is configured; I am working with Coherence*Extend TCP (out of process).
    Thanks

    Hi Reem,
    There are a number of ways to modify cache data within the cluster without doing a get/put from the client. The simplest, and probably what you are looking for, would be to use EntryProcessors.
    http://download.oracle.com/docs/cd/E14526_01/coh.350/e14509/transactionslocks.htm#BEIJCGDF
    You can use an entry processor to update an entry for a given key, or, instead of replacing the whole entry, update individual attributes of it.
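    As an illustration, here is a minimal sketch of the EntryProcessor approach. The Order value class, its status attribute, the cache name "orders" and the key "order-42" are hypothetical stand-ins for your own model:

    import java.io.Serializable;

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.InvocableMap;
    import com.tangosol.util.processor.AbstractProcessor;

    // Hypothetical value class standing in for whatever you keep in the cache.
    class Order implements Serializable {
        private String status;
        public void setStatus(String status) { this.status = status; }
    }

    public class SetStatusProcessor extends AbstractProcessor implements Serializable {
        private final String newStatus;

        public SetStatusProcessor(String newStatus) {
            this.newStatus = newStatus;
        }

        public Object process(InvocableMap.Entry entry) {
            // Runs on the storage node that owns the entry, so the client
            // avoids the get --> modify --> put round trip entirely.
            Order order = (Order) entry.getValue();
            order.setStatus(newStatus);
            entry.setValue(order); // write the mutated value back into the cache
            return null;
        }

        public static void main(String[] args) {
            // Client side: ship the processor to the cluster for one key.
            NamedCache cache = CacheFactory.getCache("orders");
            cache.invoke("order-42", new SetStatusProcessor("SHIPPED"));
        }
    }

    Only the processor (and its return value) travels over the Extend connection; the value itself stays in the cluster.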
    JK

  • Informatica Night Update (Delta) Process - More than one Delta Source

    We are set up to run a nightly process to update our DW with changes that took place that day in our Siebel OLTP environment. We refer to this as our "Delta Process" since it only updates rows that were changed in the OLTP or adds new rows that were added to the OLTP. Here is our design:
    * In our Siebel OLTP we have a view (V table) that contains only the records that have been changed since the last "Delta Process". This way we can identify only those rows that need to be updated. So in these examples, a table prefixed with "S_" references the entire table, and a table prefixed with "V_" references only the changes to the underlying "S_" table.
    Ex 1: The Order Item table (S_ORDER_ITEM) joins to the Account table (S_ORG_EXT). In the Informatica mapping SQ_JOINER we have a query with two SELECT statements whose results are concatenated with a UNION. The first SELECT selects all rows from V_ORDER_ITEM joined to S_ORG_EXT, so that all delta rows on the order item view are updated with the corresponding data from the account table (S_ORG_EXT). The second SELECT selects all rows from S_ORDER_ITEM joined to V_ORG_EXT, so that all order item records containing account information that changed (per the view) are updated. The result is an updated Order Item DW table that contains all updates made to the order items and any associated account information stored on the order item.
    SELECT A.*, B.* FROM V_ORDER_ITEM A, S_ORG_EXT B WHERE A.ORG_ID = B.ROW_ID
    UNION
    SELECT A.*, B.* FROM S_ORDER_ITEM A, V_ORG_EXT B WHERE A.ORG_ID = B.ROW_ID
    The issues:
    This works fine when you have two tables joined together that contain deltas and you need only one UNION statement. However, the issue arises when I have 14 tables joined to S_ORDER_ITEM that contain deltas. This cannot be accomplished (that I can see) with one UNION statement; you need a UNION branch for each delta table.
    Ex 2: This example contains just 3 tables. The Order Item table (S_ORDER_ITEM) joins to the Account table (S_ORG_EXT) and to the Product table (S_PROD_INT). In this example you need one UNION branch for each delta table. If you combine delta tables in the same branch you will ultimately end up missing data in the final result, because the delta tables only contain the rows that have changed: if one delta table contains a change and needs to pull data from another delta table that did not contain a corresponding change, it will not pull the information.
    SELECT A.*, B.*, C.* FROM V_ORDER_ITEM A, S_ORG_EXT B, S_PROD_INT C WHERE A.ORG_ID = B.ROW_ID AND A.PROD_ID = C.ROW_ID
    UNION
    SELECT A.*, B.*, C.* FROM S_ORDER_ITEM A, V_ORG_EXT B, S_PROD_INT C WHERE A.ORG_ID = B.ROW_ID AND A.PROD_ID = C.ROW_ID
    UNION
    SELECT A.*, B.*, C.* FROM S_ORDER_ITEM A, S_ORG_EXT B, V_PROD_INT C WHERE A.ORG_ID = B.ROW_ID AND A.PROD_ID = C.ROW_ID
    The questions:
    1. Is my understanding of how the delta process works correct?
    2. Is my understanding that I will need a UNION branch for each delta table correct?
    3. Is there another way to perform the delta process?
    My issue is that I join roughly 15 delta tables and select about 100 columns to denormalize the data in the DW. If this is the only option I have, this will generate a very large and complex query that would be very difficult to manage and update.
    Thanks...
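    Since maintaining 15-odd hand-written UNION branches is the main pain point above, one option (not from the original post) is to generate the SQL override programmatically instead of maintaining it by hand. A sketch in Java, with hypothetical table and join names; a real mapping would list its ~100 concrete columns instead of T0.*:

    import java.util.List;
    import java.util.StringJoiner;

    public class DeltaUnionBuilder {

        // Builds one UNION branch per table: in branch i, table i is swapped
        // for its V_ delta view while every other table stays as its S_ table.
        static String build(List<String> tables, List<String> joins) {
            StringJoiner union = new StringJoiner("\nUNION\n");
            for (int delta = 0; delta < tables.size(); delta++) {
                StringJoiner cols = new StringJoiner(", ");
                StringJoiner from = new StringJoiner(", ");
                for (int i = 0; i < tables.size(); i++) {
                    cols.add("T" + i + ".*");
                    from.add((i == delta ? "V_" : "S_") + tables.get(i) + " T" + i);
                }
                union.add("SELECT " + cols + " FROM " + from
                        + " WHERE " + String.join(" AND ", joins));
            }
            return union.toString();
        }

        public static void main(String[] args) {
            // Reproduces Ex 2 above: 3 tables, hence 3 UNION branches.
            System.out.println(build(
                    List.of("ORDER_ITEM", "ORG_EXT", "PROD_INT"),
                    List.of("T0.ORG_ID = T1.ROW_ID", "T0.PROD_ID = T2.ROW_ID")));
        }
    }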

    Hi,
    Going through your post, I see that you have both the delta view (V_) and the main table (S_) as drivers, i.e. you have two driver tables, and hence you make outer joins (w.r.t. each other) and take a UNION to get the complete set of data.
    Can you please tell me why both are considered drivers? Is there a possibility that the V_ view may not have some data while the corresponding S_ table has an update?
    Regards,
    Bharadwaj Hari

  • Process Chain problem related to Update ODS Object Data (Further Update)

    In the process chain, the data is loaded to an ODS first and then fed to an InfoCube via a data mart load. But when checking the process chain, we get this yellow warning message:
    A type "Update ODS Object Data (Further Update)" process has to follow process "Activate ODS Object Data" var.our_infopackage_name in the chain
    Message no. RSMPC016
    If we add the variant "Update ODS Object Data (Further Update)" after the activation of the ODS and before the InfoPackage that feeds data to the cube, then another yellow warning shows up for the InfoPackage loading data to the InfoCube, saying that the update variant cannot come before the load to this cube, etc.
    What should we do to resolve the problem?
    Thanks

    hey Dinesh,
    How do we set a wait time for the load to the cube?
    I've given you "Very helpful" points, and once you answer the above I will give you the "Solve problem" points.
    Thanks

  • (Automatic) Refresh of Cached object after insert/update

    Hi,
    (I am using Toplink 9.0.3 against an Oracle Database)
    I am inserting and updating records in the database through objects registered in a TopLink UnitOfWork. I happen to know that certain database columns will get (changed) values because of database triggers.
    Is it true that the only way to get these changed values reflected back into the TopLink cache is to explicitly execute a
    session.refreshObject();
    call for every object changed in the UnitOfWork?
    Is there no way to inform TopLink (for example through the descriptors for the relevant classes) that for certain classes an automatic synchronization with the database must be performed after the insert/update?
    I have not been able to find such a setting, but I may have overlooked it - I hope I did.
    Thanks for your help,
    Lucas Jellema (AMIS)

    In this case use a postMerge event -- it will get called after the merge into the cache, and then you could update the object explicitly.
    Ultimately, the way to achieve the behavior you're looking for is events or refreshing.
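    A sketch of what Don describes: register a descriptor event listener whose postMerge callback refreshes the merged object, so trigger-assigned column values land back in the cache. The package names follow the TopLink 10g-style API and the amendment-method hookup is an assumption; check both against your 9.0.3 install:

    import oracle.toplink.descriptors.ClassDescriptor;
    import oracle.toplink.descriptors.DescriptorEvent;
    import oracle.toplink.descriptors.DescriptorEventAdapter;

    public class RefreshAfterMergeListener extends DescriptorEventAdapter {

        public void postMerge(DescriptorEvent event) {
            // Called after the UnitOfWork changes are merged into the session
            // cache; re-read the row so trigger-populated columns are reflected.
            event.getSession().refreshObject(event.getObject());
        }

        // Descriptor amendment method: reference this from the class descriptor
        // so the listener is attached when the project is loaded.
        public static void addToDescriptor(ClassDescriptor descriptor) {
            descriptor.getEventManager().addListener(new RefreshAfterMergeListener());
        }
    }

    Note that this re-reads every instance of the class after each commit, so reserve it for classes that actually have trigger-assigned columns.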
    - Don

  • SAP BW Purchasing - Scheduling Agreement delta processing

    Hi,
    I have a question about Scheduling Agreements coming from ECC to BW with delta processing.
    When a goods receipt happens for a Scheduling Agreement item, it registers process key 2.
    However, it seems that when a second goods receipt happens for that same scheduling agreement item, the old goods receipt is updated with process key 4 and the new receipt takes process key 2 when we aggregate at the InfoCube level. In the first-level write-optimized DSO, the process keys are intact for every line item & schedule line (all goods receipts have process key 2).
    Can somebody confirm this logic?
    We are having issues because we need all of that goods receipt history stored in our InfoCubes, and we do not know how to resolve. Please help! Points will be assigned...
    Thanks very much,
    Courtney

    Are you saying that the process key is 2 for both goods receipts in the write-optimized DSO, but in the cube the first goods receipt has process key 4 once the second goods receipt is there?
    If yes, it seems the problem is with the InfoObject for the process key - is it a key figure or a characteristic?
    It should be a characteristic so that the value is overwritten rather than added.
    Are you doing a full or a delta load from the DataSource to the DSO and from the DSO to the cube?
    Regards,
    Gaurav

  • Query on delta process

    Hi BW Experts,
    For AP (Accounts Payable) and AR (Accounts Receivable) we can run a delta process to pick up delta records. How?
    Could anyone please let me know?
    Thanks

    FI extractors work on an after-image delta. Delta records are selected directly from the R/3 tables using a time stamp mechanism and transferred directly to BW; nothing is written to the BW delta queue. Please go through the following regarding the FI DataSources 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4.
    0FI_GL_4 (G/L Accounts: Line Items): no redundant fields are transferred into BW - only fields from the FI document tables (BKPF/BSEG) that are relevant to General Ledger Accounting (compare table BSIS), and no customer- or vendor-related fields.
    0FI_AP_4 (AP: Line Items) and 0FI_AR_4 (AR: Line Items): vendor/customer-related information (e.g. payment/dunning data).
    "Coupled", consistent "snapshot" of FI data in BW: the G/L account extraction determines the selection criteria (company code, fiscal period) and the upper time limit of all extracted FI line items; for the AP and AR extraction no further selection criteria are necessary or possible. "Uncoupled" extraction is possible with Plug-In PI 2002.2, see OSS note 551044.
    0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 use an after-image delta of delta type "Extractor": the FI line items are transferred from the source system in their final state (the "after image"). This delta method is not suitable for direct InfoCube update; an ODS object is obligatory to determine the delta for the InfoCube update. The delta queue and the BW scheduler ensure correct serialization of the records (e.g. inserts must not overtake changes) and the distribution of the delta records to multiple BW systems. The selection criteria of the delta init upload are used to "couple" the DataSources logically.
    Time mechanism: new FI documents posted in R/3 since the last line-item extraction are selected based on the field BKPF-CPUDT.
    Hierarchy Extractor for Balance Sheet & P&L Structure
    Technical name: 0GLACCEXT_T011_HIER. Type of DataSource: hierarchies. Application component: FI-IO.
    Use: the extractor loads hierarchies (balance sheet / P&L structures) for the characteristic (InfoObject) 0GLACCEXT.
    Fields of origin in the extract structure (all originating from field CLUSTD of table RFDT):
    Field in Extract Structure   Field in Table of Origin
    .INCLUDE                     ROSHIENODE
    FIELDNM                      RSFIELDNM
    GLACCEXT                     GLACCEXT
    RSIGN                        RR_RSIGN
    PLUMI                        RR_PLUMI
    Features of the extractor: extractor FBIW_HIERARCHY_TRANSFER_TO, extraction structure DTFIGL_HIERNODE_1.
    Financial Accounting: Line Item Extraction Procedure
    BW Release 3.1 makes consistent data extraction in the delta method possible for line items in General Ledger Accounting (FI-GL), selected subsidiary ledgers (Accounts Receivable FI-AR and Accounts Payable FI-AP) and tax reporting. The extraction procedure delivered with BW Release 2.0B, based on DataSources 0FI_AR_3 and 0FI_AP_3, can be replaced; this is described in note 0410797. The decisive advantage of choosing the R/3 line item tables as the data source is that extra fields can be transferred to BW that were not available with the transaction figures from table GLT0, the previous R/3 data source in General Ledger Accounting (FI-GL). This allows more extensive and flexible analysis in BW.
    To enable consistent delta data, four new InfoSources are provided in the OLTP system (with the corresponding DataSources and extractors in the SAP R/3 system):
    Application   InfoSource   Description
    FI-GL         0FI_GL_4     General Ledger: Line Items
    FI-AP         0FI_AP_4     Accounts Payable: Line Items (extraction linked to 0FI_GL_4)
    FI-AR         0FI_AR_4     Accounts Receivable: Line Items (extraction linked to 0FI_GL_4)
    FI-TX         0FI_TAX_4    General Ledger: Data for Tax on Sales/Purchases
    For the General Ledger, selection is made from tables BKPF and BSEG, while selection for the subsidiary accounts is made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable). InfoSource 0FI_GL_4 transfers only those fields of the Financial Accounting document (tables BKPF and BSEG) that are relevant for General Ledger Accounting to the BW system. Consistent recording of data from General Ledger Accounting and subledger accounting is achieved by means of coupled delta extraction in the time stamp procedure: General Ledger Accounting is the main process in delta mode and provides the subsidiary ledger extraction with time stamp information (the time intervals of previously selected general ledger line items). This time stamp information can also be used as a loading history: it shows which line items have previously been extracted from the SAP R/3 system.
    Delta method: delta extraction enables you to load into the BW system only the data that has been added or changed since the last extraction event. Data that is already loaded and has not changed is retained and does not need to be deleted before a new upload. This improves performance compared with periodic extraction of the overall dataset. Financial Accounting line items are read by the extractors directly from the tables in the SAP R/3 system; a time stamp on the line items identifies the status of the delta data, and time stamp intervals that have already been read are stored in a time stamp table. The delta dataset is transferred to the BW system directly, without records being written to the delta queue in the SAP R/3 system (extractor delta method). The Financial Accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method). This delta method is not suitable for filling InfoCubes directly in the BW system: the line items must first be loaded into an ODS object that identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be supplied with data from this ODS object.
    Time stamp method: for Financial Accounting line items that have been posted in the SAP R/3 system since the last data request, the extractors identify the delta dataset using the time stamp in the document header (BKPF-CPUDT). When a delta dataset has been selected successfully, the SAP R/3 system logs two time stamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST:
    Field Name   Key   Description
    MANDT        X     Client
    OLTPSOURCE   X     DataSource
    AEDAT        X     SYSTEM: Date
    AETIM        X     SYSTEM: Time
    UPDMODE            Data update mode (full, delta, delta init)
    TS_LOW             Lower limit of the time selection (time stamp in seconds since 1.1.1990)
    TS_HIGH            Upper limit of the time selection (time stamp in seconds since 1.1.1990)
    LAST_TS            Flag: 'X' = last time stamp interval of the delta extraction
    TZONE              Time zone
    DAYST              Daylight saving time active?
    The time stamps are determined from the system date and time and converted to the format "seconds since 1.1.1990", taking into account the time zone and daylight saving time. To ensure correct and unique reconversion to date and time, the time zone and daylight saving time must be stored in table BWOM2_TIMEST. Table BWOM2_TIMEST therefore documents the loading history of Financial Accounting line items; it also provides defined restart points following incorrect data requests. For a better overview, the time stamps in the example table below are shown in date format, and the columns TZONE and DAYST are left out.
    OLTPSOURCE   AEDAT/AETIM            UPD     DATE_LOW       DATE_HIGH      LAST_TS
    0FI_GL_4     16 May 2000 / 20:15    Init    01 Jan. 1990   15 May 2000
    0FI_GL_4     24 May 2000 / 16:59    Delta   16 May 2000    23 May 2000
    0FI_GL_4     02 June 2000 / 21:45   Delta   24 May 2000    01 June 2000
    0FI_GL_4     15 June 2000 / 12:34   Delta   02 June 2000   14 June 2000
    0FI_GL_4     21 June 2000 / 18:12   Delta   15 June 2000   20 June 2000   X
    0FI_AP_4     18 May 2000 / 21:23    Init    01 Jan. 1990   15 May 2000
    0FI_AP_4     30 May 2000 / 12:48    Delta   16 May 2000    23 May 2000
    0FI_AP_4     10 June 2000 / 13:19   Delta   24 May 2000    01 June 2000   X
    0FI_AR_4     17 May 2000 / 18:45    Init    01 Jan. 1990   15 May 2000
    0FI_AR_4     04 June 2000 / 13:32   Delta   16 May 2000    01 June 2000
    0FI_AR_4     16 June 2000 / 15:41   Delta   02 June 2000   14 June 2000   X
    0FI_TX_4     17 May 2000 / 18:45    Init    01 Jan. 1990   15 May 2000
    0FI_TX_4     04 June 2000 / 13:32   Delta   16 May 2000    01 June 2000
    0FI_TX_4     16 June 2000 / 15:41   Delta   02 June 2000   14 June 2000   X
    Constraints: per day, no more than one delta dataset can be transferred for InfoSource 0FI_GL_4, so the extracted data has the status of the previous day; for further data requests on the same day, the InfoSource does not provide any data. In delta mode, data requests with InfoSource 0FI_AR_4 and InfoSource 0FI_AP_4 do not provide any data if no new extraction has taken place with InfoSource 0FI_GL_4 since the last data transfer. This ensures that the data in the BW system for Accounts Receivable and Accounts Payable Accounting is exactly as up to date as the data for General Ledger Accounting. If you delete the initialization selection in the source system for InfoSource 0FI_GL_4 in the BW system Administrator Workbench, the time stamp entries for InfoSources 0FI_GL_4, 0FI_AP_4, 0FI_AR_4 and 0FI_TX_4 are also removed from table BWOM2_TIMEST.
    Recording changed line items: for Financial Accounting line items that have been changed since the last data request in the SAP R/3 system, there is no reliable time stamp that documents the time of the last change. For this reason, all line items that are changed in a way relevant to BW must be logged in the SAP R/3 system.
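    To make the "seconds since 1.1.1990" convention concrete, here is a small illustration (assuming UTC; the real extractor additionally stores the time zone and DST flag in TZONE/DAYST for correct reconversion):

    import java.time.LocalDateTime;
    import java.time.ZoneOffset;

    public class BwTimestamp {

        // 1.1.1990 00:00:00, the epoch used by TS_LOW / TS_HIGH above
        private static final long EPOCH_1990 =
                LocalDateTime.of(1990, 1, 1, 0, 0).toEpochSecond(ZoneOffset.UTC);

        static long toBwSeconds(LocalDateTime ts) {
            return ts.toEpochSecond(ZoneOffset.UTC) - EPOCH_1990;
        }

        public static void main(String[] args) {
            // First example row above: 16 May 2000 / 20:15
            System.out.println(toBwSeconds(LocalDateTime.of(2000, 5, 16, 20, 15)));
        }
    }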

  • Initialize Delta Process without data transfer

    Dear Gurus,
    I am facing a problem when I try to "Initialize Delta Process without data transfer" for an ODS.
    I first deleted the initialization request in Scheduler -> Initialization Options for Source System and then executed the InfoPackage.
    The status of the new init request is red and I get the following error message:
    Request REQU_DMJI1J66397EM5OZMC1EQJLWZ (192.384) for DataSource 8ZSALETXT from source system PW1CLNT600 has the status green and a lesser SID (and is therefore older) than the request that you currently want to update into the DataStore object.
    This is not possible because the sequence of requests has to be followed.
    Delta- and init requests have to be updated to the DataStore object in the request sequence.
    The problem is that I can't find this request (REQU_DMJ...) anywhere.
    Could you please explain what to do about this issue and tell me whether it is possible to make the system just ignore this request?
    Thank you in advance
    Fabien

    Hi,
    You are not able to find REQU_DMJI1J66397EM5OZMC1EQJLWZ in the Manage screen of the ODS, right?
    Go to RSRQ, enter this request number and check. If the status is green, check in the Header tab whether the load only went as far as the PSA. If it did, go to the Status tab and click "Process Manually" to push the request into the target; otherwise, just set the QM status to red there and then redo the load.
    Regards,
    Debjani

  • ODS Delta Process, DataSource and Delta Queue

    Hi to all,
    loading data from an SAP source system into SAP BW 7.0 via a generic 3.x DataSource causes problems.
    Here is a short description:
    The data flow is from a source table using a generic extractor into a target ODS in full update mode.
    Update rules should move data from table structure into ODS structure in this way:
    Source table structure:
    CustKey1   Key2
    13386      C23
    13386      B14
    13387      A13
    13387      E25
    ODS structure:
    CustKey1   col1   col2
    13387      A13    E25
    This works pretty well - as long as all records with the same CustKey1 are transferred in the same data package. The Data Browser (SE16) shows the situation in the ODS "New data" view (data is not activated):
    Request   Data_packet_number   Data_record_number   CustKey1
    112       000003               1.061                0000013386
    112       000004               1                    0000013386
    112       000004               2                    0000013387
    There are two records for CustKey1 0000013386:
    data record 1.061 in data packet 000003 and
    data record 1 in data packet 000004.
    This constellation causes the errors in the ODS delta queue and in subsequent data processing.
    I think one solution may be to change the delta process of the DataSource.
    The properties are:
    - Record type: Full, delta, before, after images
    - Delta type: Delta from delta queue or from extractor
    - Serialization: No serialization, serialization of request, serialization of data packages
    But how can I change these settings? Transactions RSA2, RSO2, RSO6 and RSO8 don't do this.
    What is the right delta process to use?
    I have tried changing the delta process by generating several 3.x DataSources with the following delta processes (see table RODELTAM):
    - " " (Full-upload)
    - AIE
    - ADD
    Unfortunately with no effect.
    Who can help?
    Regards
    Josef
    Edited by: Josef L. on Mar 20, 2009 7:44 PM

    hi venkat,
    whenever you load a delta from an ODS to a cube, whatever data has changed in the ODS since the last delta will be updated; hence, in your case, both req1 and req2 will be loaded to the cube.
    Assign points if useful
    Ramesh

  • After REFRESH the cached object is not consistent with the database table

    After REFRESH, the cached object is not consistent with the database table. Why?
    I created a JDBC connection to the Oracle database (HR schema) using JDeveloper (10.1.3) and then created an offline database (HR schema)
    in JDeveloper from the existing database tables (HR schema). Then I made some updates to the JOBS database table using SQL*Plus.
    Then I returned to JDeveloper and refreshed the HR connection. But I found that no changes had been made to the offline database table JOBS in
    JDeveloper.
    How can I make JDeveloper's offline tables synchronize with the underlying database tables?

    qkc,
    Once you create an offline table, it's just a copy of the table definition as of the point in time you brought it in from the database. Refreshing the connection, as you describe it, just refreshes the database browser, not any offline objects. If you want to synchronize the offline table, right-click it and choose "Generate or Reconcile Objects" to reconcile the object to the database. I just tried this in 10.1.3.3 (not the latest 10.1.3, I know), and it works properly.
    John

  • How can I update an object and its nested collection - Agentry, SMP 3.0

    Hello Gurus,
    I have two objects: PurchaseOrders and PurchaseItems. One PurchaseOrder has more than one PurchaseItem. In my Java code, I create an Item array in the PurchaseOrder object to store the PurchaseItems.
    What I want to do is update the PurchaseOrder object. There are two things I have to do:
    Update the PurchaseOrder object.
    Update an existing PurchaseItem of the PurchaseOrder and/or create a new PurchaseItem for that PurchaseOrder.
    I already created a BAPI Java class to edit a selected PO and every item of that PO on my device.
    What should I do next in Agentry? Any help is appreciated. Thank you very much.

    The process is virtually identical. The only difference will be in your BAPI logic, which you have already written. You will connect it just as you are doing for the Add scenario. Once you get that working, this should follow a very similar pattern.
    --Bill

  • Flash Player doesn't work in IE 11 even though the Shockwave Flash object is activated

    Hello,
    Flash Player doesn't work in IE 11 even though the Shockwave Flash object is activated.
    I've installed Adobe Flash Player version 16.0.0.287 (today).
    Then I checked on this page:
    http://helpx.adobe.com/de/flash-player.html
    It says Flash Player is not installed or not activated.
    Active X Version 16.0.0.287
    Internet Explorer 11
    Win 7 SP1 32 Bit
    Please help me.
    Thank you in advance
    Paul

    First, confirm that ActiveX Filtering is configured to allow Flash content:
    https://forums.adobe.com/thread/867968
    Internet Explorer 11 introduces a number of changes both to how the browser identifies itself to remote web servers, and to how it processes JavaScript intended to target behaviors specific to Internet Explorer. Unfortunately, this means that content on some sites will be broken until the content provider changes their site to conform to the new development approach required by modern versions of IE.
    You can try to work around these issues by using Compatibility View:
    http://windows.microsoft.com/en-us/internet-explorer/use-compatibility-view#ie=ie-11
    If that is too inconvenient, using Google Chrome may be a preferable alternative.

  • Explanation of the delta process in SAP BI - ROCANCEL and 0RECORDMODE

    Hello,
    I use the delta process with cockpit DataSources (2LIS_17_I3HDR and 2LIS_17_I0NOTIF).
    I have transferred all data from the extraction queue to the BW queue.
    Afterwards, when I launch the delta process in SAP BI (with the InfoPackage), two ODS objects will be updated. I wanted to know: how will SAP know what the delta really is? I have seen that there is a ROCANCEL field in the PSA; how does SAP choose the right row? Does it recognize ROCANCEL and replace the cancelled row with the new one? Do we have to do anything special (like mapping ROCANCEL to 0RECORDMODE)?
    Can you explain a little how SAP handles this (ROCANCEL values, 0RECORDMODE, etc.) and what I have to do?
    Thanks a lot,
    Regards,
    Julien

    Check :
    Re: Indicator: Cancel Data Record
    Re: 0RECORDMODE, 0STORNO, ROCANCEL

  • Need information on 'Repeat Delta' process

    We deleted the failed request from InfoCube 0PY_C02
    and checked the data mart status in ODS ZPY_O50 for the failed load.
    When attempting to push the data from the ODS to the InfoCube, we get this error message:
         Last delta update is not yet completed
         Therefore, no new delta update is possible.
         You can start the request again
         if the last delta request is red or green in the monitor (QM activity).
    To get the load to start, we opened the process chain and selected "Repeat" from the context menu. This enabled the job to start without the above-mentioned error.
    Questions to answer:
    1. BW is expecting fewer than 8019027 records to be uploaded to the InfoCube, but looking in the monitor I see that 44204420 records have been processed. Does this mean that the repeat process is loading all data from the ODS to the InfoCube?
    2. When doing a repeat of the last delta, does the system repeat only the last successful delta?
    3. When doing a repeat of the last delta, does the system read the entire contents of the ODS and then update the InfoCube with the correct records, or does it just read the last request loaded to the ODS?

    Normally the repeat should pull only the failed delta, but I have also experienced the repeat pulling more records than the earlier delta; I would say the delta pointer is not working properly.
    Normally, before deleting the failed request, you need to set the request to red, but it is too late to do that now.
    You need to re-init the delta, making sure that you don't miss the last delta.
    Steps:
    1. Delete the failed request from the cube.
    2. Delete the successful request from the ods
    3. Delete initialization from ods to the cube
    4. Do init without data transfer from ODS to the cube
    5. Reconstruct the last deleted request in the ODS
    6. Do Delta from ods to the cube.
    Now you have reinitialized and got the last delta which is in the ods.
    thanks.
    Wond

  • TopLink cached object changes are not committed to the database

    Hello,
    I'm using TopLink 10 and I have a write issue with a use case:
    1. I read an object using TopLink that is in the identity map.
    2. Using JSF, this object is edited through a web form.
    3. I give the modified object to the data layer and try to modify it inside a unit of work:
    UnitOfWork uow = session.acquireUnitOfWork();
    // laspEtapeDef comes from JSF and has been modified previously
    LaspEtapeDef laspEtapeDefClone = uow.readObject( laspEtapeDef );
    // I update the clone field
    laspEtapeDefClone.setDescription(laspEtapeDef.getDescription());
    uow.commit();
    4. I use the same object again to display it once modified.
    The object is modified in the cache but the modified fields are never committed to the database. This code works only if I disable the cache.
    So I've modified my JSF form to send the fields instead of modifying the object directly.
    My question: is there a way to commit changes made to a cached object?
    I've found the following section in the documentation, which explains the problem but doesn't give the solution:
    http://docs.oracle.com/cd/E14571_01/web.1111/b32441/uowadv.htm#CACGDJJH
    Any idea?

    How are you reading in the object initially? The problem is likely that you are modifying an object from the session cache. When you then read the object in from the uow, it uses the object in the session cache as the backup, so there will not appear to be any changes to persist to the database.
    You will need to make a copy of the object for modification, or use the copy from the UnitOfWork to make the changes instead of working directly on the object in the session. Disabling the cache means there is no copy in the session cache to use as a backup, so the uow read has to build an object from the database.
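    A minimal sketch of the pattern Chris describes, reusing the poster's LaspEtapeDef class; the ExpressionBuilder lookup on an "id" attribute is an assumption about the mapping:

    import oracle.toplink.expressions.ExpressionBuilder;
    import oracle.toplink.sessions.Session;
    import oracle.toplink.sessions.UnitOfWork;

    public class UpdateLaspEtapeDef {

        static void updateDescription(Session session, Object id, String description) {
            UnitOfWork uow = session.acquireUnitOfWork();
            // Read through the UnitOfWork: this working copy is the only
            // instance whose changes are diffed against the backup clone.
            LaspEtapeDef clone = (LaspEtapeDef) uow.readObject(
                    LaspEtapeDef.class,
                    new ExpressionBuilder().get("id").equal(id)); // "id" is assumed
            clone.setDescription(description); // mutate the clone, not the session-cache object
            uow.commit(); // the diff now contains the change, so SQL is issued
        }
    }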
    Best Regards,
    Chris
