Query on delta process

Hi BW Experts,
For AP (Accounts Payable) and AR (Accounts Receivable), we can run a delta process to pick up delta records. How does this work?
Could anyone please let me know?
Thanks

FI extractors work with an after-image delta. Delta records are selected directly from the R/3 tables using a timestamp mechanism and transferred directly to BW; no record is written to the BW delta queue. Please go through the following regarding the FI DataSources 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4.
0FI_GL_4 (G/L Accounts: line items): no redundant fields are transferred into BW - only fields from the FI document tables (BKPF/BSEG) that are relevant to General Ledger Accounting (compare table BSIS), and no customer- or vendor-related fields.
0FI_AP_4 (AP: line items) and 0FI_AR_4 (AR: line items): vendor/customer-related information (e.g. payment/dunning data).
"Coupled" extraction yields a consistent "snapshot" of FI data in BW: the G/L account extraction determines the selection criteria (company code, fiscal period) and the upper time limit of all extracted FI line items; for the AP and AR extraction, no further selection criteria are necessary (or possible). "Uncoupled" extraction is possible with Plug-In PI 2002.2, see OSS note 551044.
0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 use an after-image delta with delta type "Extractor": delta records are selected directly from the R/3 tables using a timestamp mechanism and transferred directly to BW; no record is written to the BW delta queue. After-image delta means that FI line items are transferred from the source system in their final state (the "after image"). This delta method is not suitable for direct InfoCube update; an ODS object is obligatory to determine the delta for the InfoCube update. The delta queue and the BW scheduler ensure correct serialization of the records (e.g. inserts must not overtake changes) and the distribution of delta records to multiple BW systems. The selection criteria of the delta-init upload are used to "couple" the DataSources logically. Timestamp mechanism: new FI documents are those posted in R/3 since the last line-item extraction; selection is based on the field BKPF-CPUDT.
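To make the timestamp mechanism above concrete, here is a minimal sketch in Python. The document list, field layout and function names are illustrative stand-ins for BKPF and BKPF-CPUDT, not the actual extractor logic:

```python
from datetime import date

# Hypothetical FI document headers: (document number, entry date),
# standing in for BKPF rows and the BKPF-CPUDT field.
documents = [
    ("0000000001", date(2024, 1, 10)),
    ("0000000002", date(2024, 1, 12)),
    ("0000000003", date(2024, 1, 15)),
]

def select_delta(docs, last_upper_limit):
    """Select only documents posted after the upper limit of the last
    extraction interval, as the FI timestamp mechanism does with CPUDT."""
    return [d for d in docs if d[1] > last_upper_limit]

# The previous delta run covered everything up to 11 Jan 2024:
delta = select_delta(documents, date(2024, 1, 11))
print([doc_no for doc_no, _ in delta])  # ['0000000002', '0000000003']
```

Because only the after image of each item is transferred, a selection like this alone cannot drive an InfoCube; the ODS layer is what derives the actual changes.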
Hierarchy Extractor for Balance Sheet & P&L Structure
Technical name: 0GLACCEXT_T011_HIER
Technical Data: Type of DataSource: Hierarchies; Application Component: FI-IO
Use: The extractor is used for loading hierarchies (balance sheet/P&L structures) for the characteristic (InfoObject) 0GLACCEXT.
Fields of Origin in the Extract Structure:
Field in Extract Structure | Description | Table of Origin | Field in Table of Origin
.INCLUDE                   | ROSHIENODE | RFDT | CLUSTD
FIELDNM                    | RSFIELDNM  | RFDT | CLUSTD
GLACCEXT                   | GLACCEXT   | RFDT | CLUSTD
RSIGN                      | RR_RSIGN   | RFDT | CLUSTD
PLUMI                      | RR_PLUMI   | RFDT | CLUSTD
Features of the Extractor: Extractor: FBIW_HIERARCHY_TRANSFER_TO; Extraction structure: DTFIGL_HIERNODE_1
Financial Accounting: Line Item Extraction Procedure
General Information About the Line Item Extraction Procedure
BW Release 3.1 makes consistent data extraction in the delta method possible for line items in General Ledger Accounting (FI-GL), selected subsidiary ledgers (Accounts Receivable FI-AR and Accounts Payable FI-AP) and tax reporting. The extraction procedure delivered with BW Release 2.0B, based on DataSources 0FI_AR_3 and 0FI_AP_3, can be replaced. This is described in note 0410797. The decisive advantage of choosing the R/3 line item tables as the data source is that extra fields can be transferred to BW. These were not available with the transaction figures from table GLT0, the previous R/3 data source in General Ledger Accounting (FI-GL). This allows more extensive and flexible analysis in BW.
To enable you to assure consistent delta data, four new InfoSources are provided in the OLTP system (with the corresponding DataSources and extractors in the SAP R/3 system):
Application | InfoSource | Description
FI-GL | 0FI_GL_4  | General Ledger: Line Items
FI-AP | 0FI_AP_4  | Accounts Payable: Line Items (Extraction Linked to 0FI_GL_4)
FI-AR | 0FI_AR_4  | Accounts Receivable: Line Items (Extraction Linked to 0FI_GL_4)
FI-TX | 0FI_TAX_4 | General Ledger: Data for Tax on Sales/Purchases
For the General Ledger, selection is made from tables BKPF and BSEG, while selection for the subsidiary accounts is made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable). InfoSource 0FI_GL_4 transfers only those fields from the Financial Accounting document (tables BKPF and BSEG) that are relevant for General Ledger Accounting to the BW system. The consistent recording of data from General Ledger Accounting and subledger accounting is provided by means of coupled delta extraction using the time stamp procedure. General Ledger Accounting is the main process in delta mode and provides the subsidiary ledger extraction with time stamp information (time intervals of previously selected general ledger line items). This time stamp information can also be used for a loading history: it shows which line items have previously been extracted from the SAP R/3 system.
Delta Method
Delta extraction enables you to load into the BW system only the data that has been added or changed since the last extraction event. Data that is already loaded and has not changed is retained and does not need to be deleted before a new upload. This procedure improves performance compared with periodic extraction of the overall dataset. Financial Accounting line items are read by the extractors directly from the tables in the SAP R/3 system. A time stamp on the line items serves to identify the status of the delta data.
Time stamp intervals that have already been read are stored in a time stamp table. The delta dataset is transferred to the BW system directly, without records being transferred to the delta queue in the SAP R/3 system (extractor delta method). The Financial Accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method). This delta method is not suitable for filling InfoCubes directly in the BW system. Therefore, the line items must first be loaded in the BW system into an ODS object that identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be provided with data from this ODS object.
Time Stamp Method
For Financial Accounting line items that have been posted in the SAP R/3 system since the last data request, the extractors identify the delta dataset using the time stamp in the document header (BKPF-CPUDT). When a delta dataset has been selected successfully, the SAP R/3 system logs two time stamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST:
Field Name | Key | Description
MANDT      | X | Client
OLTPSOURCE | X | DataSource
AEDAT      | X | SYSTEM: Date
AETIM      | X | SYSTEM: Time
UPDMODE    |   | Data update mode (full, delta, delta init)
TS_LOW     |   | Lower limit of the time selection (time stamp in seconds since 1.1.1990)
TS_HIGH    |   | Upper limit of the time selection (time stamp in seconds since 1.1.1990)
LAST_TS    |   | Flag: 'X' = last time stamp interval of the delta extraction
TZONE      |   | Time zone
DAYST      |   | Daylight saving time active?
The time stamps are determined from the system date and time and converted to the format 'seconds since 1.1.1990', taking into account the time zone and daylight saving time. To ensure correct and unique reconversion to date and time, the time zone and daylight saving time must be stored in table BWOM2_TIMEST. Table BWOM2_TIMEST therefore serves to document the loading history of Financial Accounting line items.
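The "seconds since 1.1.1990" format used in TS_LOW/TS_HIGH can be sketched as follows. This is a simplified illustration assuming UTC throughout; in the real table, the TZONE and DAYST columns are what make the reconversion unambiguous:

```python
from datetime import datetime, timedelta, timezone

# Reference epoch used by BWOM2_TIMEST: 1 January 1990.
EPOCH_1990 = datetime(1990, 1, 1, tzinfo=timezone.utc)

def to_ts1990(dt):
    """Convert a timezone-aware datetime to seconds since 1.1.1990."""
    return int((dt - EPOCH_1990).total_seconds())

def from_ts1990(seconds):
    """Convert a TS_LOW/TS_HIGH value back to a datetime (UTC here)."""
    return EPOCH_1990 + timedelta(seconds=seconds)

ts = to_ts1990(datetime(1990, 1, 2, tzinfo=timezone.utc))
print(ts)  # 86400, i.e. exactly one day after the epoch
```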
It also provides defined restart points following incorrect data requests. To provide a better overview, the time stamps in the example table are entered in date format. The columns TZONE and DAYST have been left out.
OLTPSOURCE | AEDAT/AETIM        | UPD   | DATE_LOW     | DATE_HIGH    | LAST_TS
0FI_GL_4   | 16 May 2000/20:15  | Init  | 01 Jan 1990  | 15 May 2000  |
0FI_GL_4   | 24 May 2000/16:59  | Delta | 16 May 2000  | 23 May 2000  |
0FI_GL_4   | 02 June 2000/21:45 | Delta | 24 May 2000  | 01 June 2000 |
0FI_GL_4   | 15 June 2000/12:34 | Delta | 02 June 2000 | 14 June 2000 |
0FI_GL_4   | 21 June 2000/18:12 | Delta | 15 June 2000 | 20 June 2000 | X
0FI_AP_4   | 18 May 2000/21:23  | Init  | 01 Jan 1990  | 15 May 2000  |
0FI_AP_4   | 30 May 2000/12:48  | Delta | 16 May 2000  | 23 May 2000  |
0FI_AP_4   | 10 June 2000/13:19 | Delta | 24 May 2000  | 01 June 2000 | X
0FI_AR_4   | 17 May 2000/18:45  | Init  | 01 Jan 1990  | 15 May 2000  |
0FI_AR_4   | 04 June 2000/13:32 | Delta | 16 May 2000  | 01 June 2000 |
0FI_AR_4   | 16 June 2000/15:41 | Delta | 02 June 2000 | 14 June 2000 | X
0FI_TX_4   | 17 May 2000/18:45  | Init  | 01 Jan 1990  | 15 May 2000  |
0FI_TX_4   | 04 June 2000/13:32 | Delta | 16 May 2000  | 01 June 2000 |
0FI_TX_4   | 16 June 2000/15:41 | Delta | 02 June 2000 | 14 June 2000 | X
Constraints
Per day, no more than one delta dataset can be transferred for InfoSource 0FI_GL_4. The extracted data therefore has the status of the previous day. For further data requests on the same day, the InfoSource does not provide any data. In delta mode, data requests with InfoSource 0FI_AR_4 and InfoSource 0FI_AP_4 do not provide any data if no new extraction has taken place with InfoSource 0FI_GL_4 since the last data transfer. This ensures that the data in the BW system for Accounts Receivable and Accounts Payable Accounting is exactly as up to date as the data for General Ledger Accounting. If you delete the initialization selection in the source system for InfoSource 0FI_GL_4 in the BW system Administrator Workbench, the time stamp entries for InfoSources 0FI_GL_4, 0FI_AP_4, 0FI_AR_4 and 0FI_TX_4 are also removed from table BWOM2_TIMEST.
Recording Changed Line Items
For Financial Accounting line items that have been changed since the last data request in the SAP R/3 system, there is no reliable time stamp that documents the time of the last change. For this reason, all line items that are changed in a way relevant for BW must be logged in the SAP R/3 system.

Similar Messages

  • Informatica Night Update (Delta) Process - More than one Delta Source

    We are set up to run a nightly process to update our DW with changes that took place that day in our Siebel OLTP environment. We refer to this as our "Delta Process" since it only updates rows that were changed in the OLTP or adds new rows that were added to the OLTP. Here is our design:
    * In our Siebel OLTP we have a view (V table) that contains only the records that have been changed since the last "Delta Process". This way we can identify only those rows that need to be updated. So in these examples, when you see a table prefixed with "S_" it references the entire table, and a table prefixed with "V_" references only the changes to the underlying "S_" table.
    Ex 1: The Order Item table (S_ORDER_ITEM) joins to the Account table (S_ORG_EXT). In the Informatica mapping SQ_JOINER we have a query with two SELECT statements whose results are concatenated with a UNION statement. The first SELECT statement selects all rows from V_ORDER_ITEM joined to S_ORG_EXT, so that all delta rows in the order item view are updated with the corresponding data from the account table (S_ORG_EXT). The second SELECT statement selects all rows from S_ORDER_ITEM joined to V_ORG_EXT, so that all order item records containing account information that changed (per the view) are updated. The result is an updated Order Item DW table that contains all updates made to the Order Item and any associated Account information stored on the Order Item.
    SELECT A.*, B.* FROM V_ORDER_ITEM A, S_ORG_EXT B WHERE A.ORG_ID = B.ROW_ID
    UNION
    SELECT A.*, B.* FROM S_ORDER_ITEM A, V_ORG_EXT B WHERE A.ORG_ID = B.ROW_ID
    The issues:
    This works fine when you have two tables joined together that contain deltas and you need only one UNION statement. However, the issue arises when I have 14 tables joined to S_ORDER_ITEM that contain deltas. This cannot be accomplished (that I can see) with one UNION statement; you would need a UNION branch for each delta table.
    Ex 2: This example contains just 3 tables. Order Item table (S_ORDER_ITEM) joins to Account table (S_ORG_EXT) and joins to Product table (S_PROD_INT). In this example you will need to have one UNION for each delta table. If you combine delta tables in the same union you will ultimately end up missing data in the final result. This is because the delta tables will only contain the rows that have changed and if one delta table contains a change and needs to pull data from another delta table that did not contain a corresponding change it will not pull the information.
    SELECT A.*, B.*, C.* FROM V_ORDER_ITEM A, S_ORG_EXT B, S_PROD_INT C WHERE A.ORG_ID = B.ROW_ID AND A.PROD_ID = C.ROW_ID
    UNION
    SELECT A.*, B.*, C.* FROM S_ORDER_ITEM A, V_ORG_EXT B, S_PROD_INT C WHERE A.ORG_ID = B.ROW_ID AND A.PROD_ID = C.ROW_ID
    UNION
    SELECT A.*, B.*, C.* FROM S_ORDER_ITEM A, S_ORG_EXT B, V_PROD_INT C WHERE A.ORG_ID = B.ROW_ID AND A.PROD_ID = C.ROW_ID
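    Since every branch of the UNION differs only in which table uses its V_ view, the statement can be generated instead of maintained by hand. A sketch of that idea (the helper below is hypothetical; table names follow the poster's S_/V_ convention, and the column lists are simplified to A.*, B.*, ...):

```python
def build_delta_union(driver, joins):
    """Build one SELECT per delta source: in branch i, exactly one table
    uses its V_ (changes-only) view while all others use the full S_ table.
    `driver` is the base table name without prefix; `joins` maps each joined
    table name (without prefix) to its join condition."""
    tables = [driver] + list(joins)
    branches = []
    for delta_table in tables:
        froms = []
        for i, table in enumerate(tables):
            prefix = "V_" if table == delta_table else "S_"
            froms.append(f"{prefix}{table} {chr(ord('A') + i)}")
        select_list = ", ".join(f"{chr(ord('A') + i)}.*" for i in range(len(tables)))
        branches.append(
            f"SELECT {select_list} FROM {', '.join(froms)} "
            f"WHERE {' AND '.join(joins.values())}"
        )
    return "\nUNION\n".join(branches)

# Reproduces Ex 2 above: three branches, one per delta table.
sql = build_delta_union(
    "ORDER_ITEM",
    {"ORG_EXT": "A.ORG_ID = B.ROW_ID", "PROD_INT": "A.PROD_ID = C.ROW_ID"},
)
print(sql)
```

    With 15 delta tables this yields 15 branches automatically; the generated query stays large, but only the table list and join conditions have to be maintained.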
    The question:
    1. Is my understanding of how the delta process works correct?
    2. Is my understanding that I will need a UNION for each delta table correct?
    3. Is there another way to perform the delta process?
    My issues stem from the fact that I join roughly 15 delta tables and select about 100 columns to denormalize the data in the DW. If this is the only option I have, it will generate a very large and complex query which would be very difficult to manage and update.
    Thanks...

    Hi,
    Going through your post, I find that you have the delta view (V_) and the main table (S_) as drivers, i.e. you have two driver tables; hence you make outer joins (w.r.t. each other) and a union to get the complete set of data.
    Can you please tell me why both are considered drivers? Is there a possibility that the V_ view may not have some data but the corresponding S_ table might have an update?
    Regards,
    Bharadwaj Hari

  • Re: Update Cache Objects in Delta Process Dosn't work

    Hi All,
    Re: Update Cache Objects in Delta Process doesn't work.
    BI 7 - SP 17
    This is the scenario I am working on: I am running a BEx query on a cube (via a MultiProvider) with a bunch of aggregates.
    The daily extraction and aggregate rollup is correct, but when I run the BEx query it displays incorrect key figure values compared to what we see in LISTCUBE for the InfoCube.
    So when I ran the same query in RSRT with "Do not use Cache", it gave correct results, and when I then ran the BEx query again it fixed itself and displayed correctly.
    InfoCube - standard & No compression for requests
    Query Properties are
    Read Mode - H
    Req Status  - 1
    Cache - Main Memory Cache Without Swapping
    Update Cache Objects in Delta Process (Flag selected)
    SP grouping - 1
    This problem occurs once every couple of weeks, and my question is: is there a permanent fix for it?
    OR should we turn the cache off?
    Can anyone please help.
    Thanking You.
    Rao

    Hi Kevin/Rao,
    We are currently experiencing problems with the 'Update Cache Objects in Delta' process.  Did either of you manage to resolve your issues, and if so, how?

  • In Oracle RAC, if a user runs a SELECT query and the node serving the fetch is evicted while data is being fetched, how does failover to another node happen internally?

    In Oracle RAC, if a user runs a SELECT query and the node serving the fetch is evicted while data is being fetched, how does failover to another node happen internally?

    The query is re-issued as a flashback query and the client process can continue to fetch from the cursor. This is described in the Net Services Administrator's Guide, in the section on Transparent Application Failover.

  • ODS Delta Process, DataSource and Delta Queue

    Hi to all,
    loading data from a SAP source system into SAP BW 7.0 via generic 3.x datasource causes problems.
    Here is a short description:
    The data flow is from a source table using a generic extractor into a target ODS in full update mode.
    Update rules should move data from table structure into ODS structure in this way:
    Source table structure
    CustKey1     Key2     
    13386          C23     
    13386          B14     
    13387          A13
    13387          E25
    ODS structure
    CustKey1     col1     col2     
    13387          A13     E25     
    This works pretty well - as long as all records with the same CustKey1 are transferred in the same data package. The Data Browser (SE16) shows the situation in the ODS "New data" view (data is not activated):
    Request    Data_packet_number     Data_record_number      CustKey1
    112            000003                  1.061              0000013386
    112            000004                      1              0000013386
    112            000004                      2              0000013387
    There are two records for CustKey1 0000013386 with
    data record 1.061 in data packet 000003   and
    data record       1 in data packet 000004.
    The above constellation is the cause of errors in the ODS delta queue and subsequent data processing.
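    To illustrate the requirement - all records with the same CustKey1 must end up in the same data package - here is a sketch of key-aware packaging. This is purely illustrative: in BW the package assembly is controlled by the extractor and transfer process, not by custom code like this:

```python
from itertools import groupby

# Sample rows from the source table: (CustKey1, Key2).
records = [
    ("13386", "C23"), ("13386", "B14"),
    ("13387", "A13"), ("13387", "E25"),
]

def build_packages(rows, size):
    """Split rows into packages of roughly `size` rows, but never split
    one CustKey1 group across two packages."""
    packages, current = [], []
    for _, group in groupby(sorted(rows), key=lambda r: r[0]):
        group = list(group)
        if current and len(current) + len(group) > size:
            packages.append(current)
            current = []
        current.extend(group)
    if current:
        packages.append(current)
    return packages

pkgs = build_packages(records, size=3)
# Each package now contains complete key groups only.
```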
    I think there may be one solution to solve the problem by changing the Delta Process of the Data Source.
    The properties are:
    - Record type: Full, delta, before, after images
    - Delta type: Delta from delta queue or from extractor
    - Serialization: No serialization, serialization of request, serialization of data packages
    But how can I change these settings? Transactions RSA2, RSO2, RSO6 and RSO8 don't do this.
    What is the right delta process to use?
    I have tried changing the delta process generating several 3.x datasources with the following delta processes (see table RODELTAM)
    - " " (Full-upload)
    - AIE
    - ADD
    Unfortunately with no effect.
    Who can help?
    Regards
    Josef
    Edited by: Josef L. on Mar 20, 2009 7:44 PM

    hi venkat,
    whenever you load a delta from an ODS to a cube, whatever data has changed in the ODS since the last delta will be updated; hence in your case, both req1 and req2 will be loaded to the cube.
    Assign Points if useful
    Ramesh

  • Duplication of data in BI with 0FI_GL_10, which has an AIED delta process

    Hi,
    I need some help!!
    In my current project, we are using DataSource 0FI_GL_10, which has an AIED delta process. Does anyone know if this configuration could lead to any type of extraction problem, such as duplication of data in BI? This is what is happening in my project. In the init of 0FI_GL_10, I extract one record, and after a few days, when I run a delta extraction, I have a new identical record in the PSA (with the same key fields and attributes as the other). Also, in my transformation rule that links my PSA with a DSO, the key figure 0SALE is parameterized as summation in the aggregation; is this wrong and do I have to use overwrite aggregation instead? Or is my problem in the type of delta process that the extractor 0FI_GL_10 has?
    Thanks in advance for any kind of help.
    Regards,
    Juan Manuel

    Hi
    There is a chance that this DataSource might send the same records when run in delta mode, and that is the reason a DSO with overwrite mode is needed in between.
    Also, there are a couple of notes describing problems related to this DataSource:
    Notes 1002272, 1153944, 1127034
    Refer this if you are having specific issues with this datasource
    Hope this helps
    Gaurav

  • Reg: SQL select Query in BPEL process flow

    <p>
    Hi,
    I am supposed to execute a SQL SELECT query (in a BPEL process flow), as mentioned below, in JDeveloper using the Database adapter.
    </p>
    <p>
    SELECT LENGTH, WIDTH, HEIGHT, WEIGHT,
    </p>
    <p>
    LENGTH*WIDTH* HEIGHT AS ITEM_CUBE
    </p>
    <p>
    FROM CUBE
    </p>
    <p>
    WHERE ITEM= &lt;xyz&gt;
    </p>
    <p>
    AND OBJECT= (SELECT CASE_NAME FROM CUBE_SUPPLIER WHERE ITEM=&lt;xyz&gt; AND SUPP_IND = &lsquo;Y')
    <strong>Now my question is:
    1.</strong> What does the "*" refer to in the query, and how can I retrieve the value of LENGTH*WIDTH*HEIGHT from the query, where LENGTH, WIDTH and HEIGHT are individual fields in the table?
    2. What does the "AS" refer to? If "ITEM_CUBE" is the alias for the table name "ITEM", then to retrieve the value, the query should be evaluated as
    </p>
    <p>
    SELECT LENGTH, WIDTH, HEIGHT, WEIGHT,
    </p>
    <p>
    LENGTH*WIDTH* HEIGHT AS ITEM_CUBE
    </p>
    <p>
    FROM CUBE
    </p>
    <p>
    WHERE <strong>ITEM_CUBE.ITEM</strong>= &lt;xyz&gt;
    </p>
    <p>
    AND <strong>ITEM_CUBE.OBJECT</strong>= (SELECT CASE_NAME FROM CUBE_SUPPLIER WHERE ITEM=&lt;xyz&gt; AND SUPP_IND = &lsquo;Y')
    Is my assumption correct?
    Please suggest asap.
    Thanks...
    </p>
    <p>
    </p>

    Hi
    Thank for your reply!
    I have a nested select query which performs on two different table as shown below:
    <p>
    SELECT LENGTH, WIDTH, HEIGHT, WEIGHT,
    </p>
    <p>
    LENGTH*WIDTH* HEIGHT AS ITEM_CUBE
    </p>
    <p>
    FROM CUBE
    </p>
    <p>
    WHERE ITEM= &lt;abc&gt;
    </p>
    <p>
    AND OBJECT= (SELECT NAME FROM SUPPLIER WHERE ITEM=&lt;Item&gt; AND SUPP_IND = &lsquo;Y')
    I am using the DB adapter of Oracle JDeveloper in a BPEL process flow, where I am able to select only one master table in the DB adapter, say SUPPLIER, and its attributes at a time. But as per my requirement I need to select both tables (CUBE and SUPPLIER) in a single adapter to execute my query.
    It is achievable by using two DB adapters: one to execute the nested query and another to execute the main query, taking the value of the nested query as a parameter. But I want to achieve it using a single one.
    Am I correct with my concept?
    Please suggest how to get it ?
    </p>
    Edited by: user10259700 on Oct 23, 2008 12:17 AM

  • BW Authorizations - Query variable with processing mode as "customer exit"

    Hi,
    I am new to BW authorizations and have not yet worked with customer exits before. I was going through the documentation at various sites, but I could not find an end-to-end description of how the query process (when using a variable for an InfoObject) works in the case of a customer exit.
    Let's assume that I am using a query variable with processing mode "customer exit", and at the exit I write some code to extract the user's authorizations from a Z table. If this is the case, then when an end user runs a query, how will the system know what value needs to be filled in the variable for the requesting user? Are the user details also sent to the code along with the query variable? If so, how? If I misunderstood the process, then forgive me and let me know the correct process.

    Hi!
    welcome to SDN!
    customer exit variables need programming by the user. So if you create a customer exit variable, you have to write a program which fills values into this variable. We can do whatever we want in the program; SAP itself will not handle anything with customer exits.
    with regards
    ashwin
    PS: Assigning points to helpful answers is the way of saying thanks in SDN. You can assign points by clicking on the appropriate radio button displayed next to the answers to your question: yellow for 2 points, green for 6 points, and blue for 10 points, which also closes the question and marks the problem as solved. Closing threads which have a solution helps members deal with open issues without wasting time on problems which already have a solution, and also helps people who encounter the same problem in the future. This is just for your information as you are a new user.

  • SAP BW Purchasing - Scheduling Agreement delta processing

    Hi,
    I have a question about Scheduling Agreements coming from ECC to BW with delta processing.
    When a goods receipt happens for a Scheduling Agreement item, it registers process key 2.
    However, it seems that when a second goods receipt happens for that same scheduling agreement item, the old goods receipt updates with process key 4, and the new receipt takes process key 2 when we aggregate at the InfoCube level. In the first-level write-optimized DSO, the process keys are intact for every line item and schedule line (all goods receipts have process key 2).
    Can somebody confirm this logic?
    We are having issues because we need all of that goods receipt history stored in our InfoCubes, and we do not know how to resolve. Please help! Points will be assigned...
    Thanks very much,
    Courtney

    Are you saying that the process key is 2 for both goods receipts in the write-optimized DSO, but in the cube the first goods receipt has process key 4 when the second goods receipt is also there?
    If yes, it seems that the problem is with the InfoObject for the process key - is it a key figure or a characteristic?
    It should be a characteristic so that the value is not added but overwritten.
    Are you doing a full or delta load from DataSource to DSO and from DSO to cube?
    Regards,
    Gaurav

  • Explanation of the delta process in SAP BI - ROCANCEL and 0RECORDMODE

    Hello,
    I use the delta process with LO cockpit DataSources (2LIS_17_I3HDR and 2LIS_17_I0NOTIF).
    I have transferred all data from the extraction queue to the BW queue.
    Afterwards, when I launch the delta process in SAP BI (with the InfoPackage), two ODS objects will be updated. I wanted to know how SAP knows what the delta really is. I have seen that there is a ROCANCEL field in the PSA; how does SAP choose the right row? Does it recognize the ROCANCEL and replace the cancelled row with the new one? Do we have to do any special manipulation (like mapping ROCANCEL to 0RECORDMODE)?
    Can you explain a little how SAP works here (ROCANCEL values, 0RECORDMODE, etc.) and what I have to do?
    Thanks a lot,
    Regards,
    Julien

    Check :
    Re: Indicator: Cancel Data Record
    Re: 0RECORDMODE, 0STORNO, ROCANCEL
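    As background for the question above: the correspondence between ROCANCEL and 0RECORDMODE for LO cockpit DataSources is commonly described as a simple lookup. The sketch below is a simplification (in BW the mapping is handled by the transfer rules/transformation, and 0RECORDMODE has further values beyond these two):

```python
# Simplified mapping for LO cockpit extraction:
#   ROCANCEL ' ' (space) -> 0RECORDMODE ' '  (after image / normal record)
#   ROCANCEL 'X'         -> 0RECORDMODE 'R'  (reversal: key figures negated)
ROCANCEL_TO_RECORDMODE = {" ": " ", "X": "R"}

def derive_recordmode(rocancel):
    """Derive 0RECORDMODE from ROCANCEL; unmapped values pass through."""
    return ROCANCEL_TO_RECORDMODE.get(rocancel, rocancel)

print(derive_recordmode("X"))  # R
```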

  • How to include Query extractor in process chain

    Hi All,
    According to the requirement, I have created a query extractor using RSCRM_BAPI to upload query data to the application server.
    Now it has to be included in a PROCESS CHAIN.
    Is there any way to include the query extractor in a process chain?
    Or is there an ABAP program which can be used to trigger my query extractor, so that the program can then be used in the process chain?
    Quick reply is much appreciable.
    Thanks,
    Uday.


  • Query in mapping process....

    can i do the query in mapping process 4 dimensions or cubes??? using which pallete??? n how 2 use it with my own queries???
    thx guyz....

    Perhaps if you rewrite your question in something that resembles English you may get an answer.

  • Can you run a Query in a Process Chain?

    As part of a data validation process chain, I need to run a query and send the results by email. I've created the query (with the exception) and set it up in Information Broadcasting to be sent by email. I thought that I would be able to just drop the "Exception Reporting" process into the process chain and be able to select the query to run. Needless to say, it doesn't work that way.
    If anyone has run a query in a process chain, please let me know how you did it.
    Also if someone knows how the "Exception Reporting" process in RSPC works, please share?
    Thanks

    Patel, we may be able to rethink our approach and use the event data change that was mentioned in the document you sent.
    I was hoping to be able to do everything from within a process chain.  Does anyone know how to use the "Exception Reporting" process that is available in RSPC?  Is it a leftover from the 3.X days that can't really be used in 7.0?

  • COPA Realignment in R/3 -do i need to Re-initialize the Delta Process in BI

    Hi frnds,
    Im facing an issue related COPA realignment,
    1.Once the COPA realignment is done in R/3, Do i need to Re-initialize the Delta Process in BI?
    Edited by: MohanDP on May 4, 2011 8:37 AM

    But I need the historical data too, so how do I avoid re-initialization and still get the historical data? Can you please help?
    If you need the changed historical data, you have no choice but to reinit. You can do this at any time; you don't need any downtime for the COPA init.
    M.

  • How is SQL query parsing processed inside the SQL/PLSQL engine?

    Hi all,
    Can you explain how SQL query parsing is processed inside the SQL/PLSQL engine?
    Thanks,
    Sankar

    Sankar,
    Oracle Database concepts - Chapter 24..
    You will find the explanation required under the heading parsing.
    http://download-west.oracle.com/docs/cd/B19306_01/server.102/b14220/sqlplsql.htm
