Delta in InfoCube

Hi Experts,
How is delta handled in an InfoCube?
Regards,
Lalit

Hi Pali,
1. Regardless of whether the Universe is built on a BEx query or directly on an InfoCube, you do not need to refresh/export the Universe every time there is a delta change. The Universe is a semantic layer that provides an interface to the underlying sources, but NO DATA is stored there (think of it almost like a virtual cube). When a WebI query is executed, the data is fetched from the InfoProvider at runtime.
2. I do not know of any tools at this time that allow conversion from BEx to Crystal. Once Pioneer (the BEx/Voyager hybrid) is rolled out next year, there will possibly be some conversion mechanisms to move BEx reports onto that platform.
Hope this helps.
Cheers!

Similar Messages

  • Delta Upload - InfoCube Then Full Upload ODS

    Hello,
1. I upload into an InfoCube in Delta Mode.
2. I upload from this InfoCube into an ODS in Full Mode.
Example:
    1. InfoCube (Upload in Delta Mode) :
    Request Order Operation WorkCenter  Values
    1       10    1         AAA         +10
    2       10    1         AAA         -10
    2       10    1         BBB         +20
    2. ODS (Upload from the cube above in Full Mode)
    Order(key) Operation(key) WorkCenter Values
    10         1              AAA        -10
The last record uploaded into my ODS is the second one from the InfoCube, so the data in my ODS is incorrect. I don't know if I need to use 0RECORDMODE.
    Thanks in advance for your help.
    Stéphane

    Hello Stephane,
In your scenario, please do one of the following in order to update the data correctly in the ODS from the InfoCube:
1. Change the update type from overwrite to addition at the ODS level.
2. Or use the serialized V3 update on the R/3 side, so that the R/3 system sends changed records in the same sequence in which they were generated. You might be getting +20 first and then -10 while the ODS setting is kept as overwrite; as a result, the ODS data is incorrect (-10).
    Hope it helps you.
    Thanks and Regards,
    ~Ketan Patel.
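The effect of the ODS update type on Stephane's records can be sketched in a few lines of Python (a simplified model, not SAP code; the key and values are taken from the example above):

```python
# Sketch: why the ODS update type matters when the source InfoCube
# delivers additive delta records.
# Key = (order, operation, work_center); value = quantity.

cube_delta = [
    (("10", "1", "AAA"), +10),  # request 1
    (("10", "1", "AAA"), -10),  # request 2
    (("10", "1", "BBB"), +20),  # request 2
]

def load_ods(records, mode):
    ods = {}
    for key, value in records:
        if mode == "overwrite":
            ods[key] = value  # last record wins -> wrong for additive deltas
        else:  # "addition"
            ods[key] = ods.get(key, 0) + value  # cumulate, as the cube does
    return ods

print(load_ods(cube_delta, "overwrite"))  # AAA ends up as -10 (incorrect)
print(load_ods(cube_delta, "addition"))   # AAA nets to 0, BBB to +20 (correct)
```

With overwrite, the second AAA record simply replaces the first, leaving -10; with addition, the two AAA records net out to 0, which matches the cube.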

  • How to load delta for 0PC_C01 cube?

    Hi all,
I am working in the Product Cost Controlling InfoArea, on cube
CO-PC: Cost Object Controlling (0PC_C01), which has InfoSources 0CO_PC_01 and 0CO_PC_02. When I schedule the InfoPackage, the only update option available is "Full Update".
I have done the full update, but now I have to do a "Delta Update".
What do I have to do? How will I load a delta into the InfoCube?
Please suggest.
Points will be rewarded.
Thanks,
Kiran.

    Hi Kiran,
What version of BI are you on?
If it's not 7, you can try this:
Do you have the option to do an Init in your InfoPackage?
If yes, run an init. Only once a target has an init does the new InfoPackage (which you will have to create to do a delta after the init) give you the option of a delta.
Let me know if it worked.
Take care
    Message was edited by:
            Tessy Thomas

  • Query on delta process

    Hi BW Experts,
For AP (Accounts Payable) and AR (Accounts Receivable) we can run a delta process to pick up delta records. How?
    Could anyone please let me know?
    Thanks

FI extractors work on an after-image delta. Delta records are selected directly from the R/3 tables using a timestamp mechanism and are transferred directly to BW; nothing needs to be written to a BW delta queue. Please read up on the FI DataSources 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4:
0FI_GL_4 (G/L Accounts: Line Items): no redundant fields are transferred into BW; only fields from the FI document tables (BKPF/BSEG) that are relevant to General Ledger Accounting (compare table BSIS), with no customer- or vendor-related fields.
0FI_AP_4 (AP: Line Items) and 0FI_AR_4 (AR: Line Items): vendor/customer-related information (e.g. payment/dunning data).
"Coupled" extraction gives a consistent "snapshot" of FI data in BW: the G/L account extraction determines the selection criteria (company code, fiscal period) and the upper time limit of all extracted FI line items. For AP and AR extraction, no further selection criteria are necessary or possible. "Uncoupled" extraction is possible with Plug-In PI 2002.2; see OSS note 551044.
0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 use an after-image delta with delta type "Extractor": delta records are selected directly from the R/3 tables using a timestamp mechanism and transferred directly to BW; no record is written to the BW delta queue. After-image delta means FI line items are transferred from the source system in their final state (the "after image"). This delta method is not suitable for direct InfoCube update; an ODS object is obligatory to determine the delta for the InfoCube update. The delta queue and BW scheduler ensure correct serialization of the records (e.g. inserts must not overtake changes) and the distribution of delta records to multiple BW systems. The selection criteria of the delta-init upload are used to "couple" the DataSources logically.
Timestamp mechanism: new FI documents are those posted in R/3 since the last line-item extraction; selection is based on the field BKPF-CPUDT.
Hierarchy Extractor for Balance Sheet & P&L Structure
Technical name: 0GLACCEXT_T011_HIER
Technical data: DataSource type: Hierarchies; Application component: FI-IO
Use: the extractor is used for loading hierarchies (balance sheet/P&L structures) for the characteristic (InfoObject) 0GLACCEXT.
Fields of origin in the extract structure (all originating from table RFDT, field CLUSTD): .INCLUDE (ROSHIENODE), FIELDNM (RSFIELDNM), GLACCEXT (GLACCEXT), RSIGN (RR_RSIGN), PLUMI (RR_PLUMI/ROSHIENODE).
Features of the extractor: Extractor: FBIW_HIERARCHY_TRANSFER_TO; Extraction structure: DTFIGL_HIERNODE_1.
Financial Accounting: Line Item Extraction Procedure
General information about the line item extraction procedure: BW Release 3.1 makes consistent data extraction in the delta method possible for line items in General Ledger Accounting (FI-GL), selected subsidiary ledgers (Accounts Receivable FI-AR and Accounts Payable FI-AP), and tax reporting. The extraction procedure delivered with BW Release 2.0B, based on DataSources 0FI_AR_3 and 0FI_AP_3, can be replaced; this is described in note 0410797. The decisive advantage of choosing the R/3 line-item tables as the data source is that extra fields can be transferred to BW that were not available with transaction figures from table GLT0, the previous R/3 data source in General Ledger Accounting (FI-GL). This allows more extensive and flexible analysis in BW.
To enable you to assure consistent delta data, four new InfoSources are provided in the OLTP system (with the corresponding DataSources and extractors in the SAP R/3 system):
FI-GL: 0FI_GL_4 - General Ledger: Line Items
FI-AP: 0FI_AP_4 - Accounts Payable: Line Items (extraction linked to 0FI_GL_4)
FI-AR: 0FI_AR_4 - Accounts Receivable: Line Items (extraction linked to 0FI_GL_4)
FI-TX: 0FI_TAX_4 - General Ledger: Data for Tax on Sales/Purchases
For the General Ledger, selection is made from tables BKPF and BSEG, while selection for the subsidiary accounts is made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable). InfoSource 0FI_GL_4 transfers only those fields that are relevant for General Ledger Accounting from the Financial Accounting document (tables BKPF and BSEG) to the BW system. Consistent recording of data from General Ledger Accounting and subledger accounting is achieved by means of coupled delta extraction using the timestamp procedure: General Ledger Accounting is the main process in delta mode and provides the subsidiary ledger extraction with timestamp information (the time intervals of previously selected general ledger line items). This timestamp information can also be used as a loading history: it shows which line items have already been extracted from the SAP R/3 system.
Delta method: delta extraction enables you to load into the BW system only the data that has been added or changed since the last extraction event. Data that is already loaded and has not changed is retained and does not need to be deleted before a new upload. This improves performance compared with periodic extraction of the overall dataset. Financial Accounting line items are read by the extractors directly from the tables in the SAP R/3 system; a timestamp on the line items identifies the status of the delta data.
Timestamp intervals that have already been read are stored in a timestamp table. The delta dataset is transferred to the BW system directly, without records being written to the delta queue in the SAP R/3 system (extractor delta method). The Financial Accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method). This delta method is not suitable for filling InfoCubes directly in the BW system; the line items must therefore first be loaded into an ODS object in the BW system, which identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be supplied with data from this ODS object.
Timestamp method: for Financial Accounting line items that have been posted in the SAP R/3 system since the last data request, the extractors identify the delta dataset using the timestamp in the document header (BKPF-CPUDT). When a delta dataset has been selected successfully, the SAP R/3 system logs two timestamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST:
Field name / Key / Description
MANDT / X / Client
OLTPSOURCE / X / DataSource
AEDAT / X / SYSTEM: Date
AETIM / X / SYSTEM: Time
UPDMODE / / Data update mode (full, delta, delta init)
TS_LOW / / Lower limit of the time selection (timestamp in seconds since 1.1.1990)
TS_HIGH / / Upper limit of the time selection (timestamp in seconds since 1.1.1990)
LAST_TS / / Flag: 'X' = last timestamp interval of the delta extraction
TZONE / / Time zone
DAYST / / Daylight saving time active?
The timestamps are determined from the system date and time and converted to the format "seconds since 1.1.1990", taking into account the time zone and daylight saving time. To ensure correct and unique reconversion to date and time, the time zone and daylight saving time must be stored in table BWOM2_TIMEST. Table BWOM2_TIMEST therefore also documents the loading history of Financial Accounting line items.
It also provides defined restart points following incorrect data requests. For a better overview, the timestamps in the example table below are shown in date format; the columns TZONE and DAYST have been left out.
OLTPSOURCE / AEDAT, AETIM / UPD / DATE_LOW / DATE_HIGH / LAST_TS
0FI_GL_4 / 16 May 2000 20:15 / Init / 01 Jan 1990 / 15 May 2000 /
0FI_GL_4 / 24 May 2000 16:59 / Delta / 16 May 2000 / 23 May 2000 /
0FI_GL_4 / 02 June 2000 21:45 / Delta / 24 May 2000 / 01 June 2000 /
0FI_GL_4 / 15 June 2000 12:34 / Delta / 02 June 2000 / 14 June 2000 /
0FI_GL_4 / 21 June 2000 18:12 / Delta / 15 June 2000 / 20 June 2000 / X
0FI_AP_4 / 18 May 2000 21:23 / Init / 01 Jan 1990 / 15 May 2000 /
0FI_AP_4 / 30 May 2000 12:48 / Delta / 16 May 2000 / 23 May 2000 /
0FI_AP_4 / 10 June 2000 13:19 / Delta / 24 May 2000 / 01 June 2000 / X
0FI_AR_4 / 17 May 2000 18:45 / Init / 01 Jan 1990 / 15 May 2000 /
0FI_AR_4 / 04 June 2000 13:32 / Delta / 16 May 2000 / 01 June 2000 /
0FI_AR_4 / 16 June 2000 15:41 / Delta / 02 June 2000 / 14 June 2000 / X
0FI_TX_4 / 17 May 2000 18:45 / Init / 01 Jan 1990 / 15 May 2000 /
0FI_TX_4 / 04 June 2000 13:32 / Delta / 16 May 2000 / 01 June 2000 /
0FI_TX_4 / 16 June 2000 15:41 / Delta / 02 June 2000 / 14 June 2000 / X
Constraints: no more than one delta dataset can be transferred per day for InfoSource 0FI_GL_4, so the extracted data has the status of the previous day; for further data requests on the same day, the InfoSource does not provide any data. In delta mode, data requests with InfoSources 0FI_AR_4 and 0FI_AP_4 do not provide any data if no new extraction has taken place with InfoSource 0FI_GL_4 since the last data transfer. This ensures that the Accounts Receivable and Accounts Payable data in the BW system is exactly as up to date as the General Ledger data. If you delete the initialization selection for InfoSource 0FI_GL_4 in the BW Administrator Workbench, the timestamp entries for InfoSources 0FI_GL_4, 0FI_AP_4, 0FI_AR_4 and 0FI_TAX_4 are also removed from table BWOM2_TIMEST.
Recording changed line items: for Financial Accounting line items that have been changed since the last data request in the SAP R/3 system, there is no reliable timestamp that documents the time of the last change. For this reason, all line items that are changed in a way relevant to BW must be logged in the SAP R/3 system.
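The timestamp-interval selection described above can be sketched as a small Python model (simplified and hypothetical; the log table and document field only mirror BWOM2_TIMEST and BKPF-CPUDT conceptually, not the real ABAP implementation):

```python
# Sketch: each delta run selects documents posted after the previous
# interval's upper limit, then logs the new interval, as the FI
# extractors do with BWOM2_TIMEST.
from datetime import date

timest_log = {}  # DataSource -> list of (ts_low, ts_high) intervals already read

def run_delta(datasource, documents, today):
    """Select documents posted after the last TS_HIGH, up to (not including) today."""
    last_high = timest_log.get(datasource, [(None, date(1990, 1, 1))])[-1][1]
    delta = [d for d in documents if last_high < d["cpudt"] < today]
    timest_log.setdefault(datasource, []).append((last_high, today))
    return delta

docs = [
    {"belnr": "100", "cpudt": date(2000, 5, 10)},
    {"belnr": "101", "cpudt": date(2000, 5, 20)},
    {"belnr": "102", "cpudt": date(2000, 5, 23)},
]
first = run_delta("0FI_GL_4", docs, date(2000, 5, 16))   # init-like run: doc 100 only
second = run_delta("0FI_GL_4", docs, date(2000, 5, 24))  # next delta: docs 101 and 102
print(len(first), len(second))  # 1 2
```

Because the upper limit excludes the current day, the extracted data always has the status of the previous day, matching the constraint noted above.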

  • Cube Compression and InfoSpoke Delta

    Dear Experts,
I submitted a message ("InfoSpoke Delta Mechanism") the other day about a problem I am having running an InfoSpoke as a delta against a cube, and didn't receive an answer that fixed it. Since then I have learned that we COMPRESS the data in the cube after it is loaded, and it is after the compression that I have been trying to run the delta InfoSpoke. As explained earlier, there have been 18 loads to the cube since the initial (full) run of the InfoSpoke. When I now try to run the InfoSpoke in delta mode I get the "There is no new data" message. Could the compression of the cube be causing this message when I run the InfoSpoke after the cube load and compression? It would also be helpful if someone could explain what happens during a compression.
    Your help is greatly appreciated.
    Thank you,
    Dave

You need uncompressed requests to feed your deltas. InfoCube deltas use request IDs, and compressed requests cannot be used since their request IDs are set to zero during compression.
You need to resequence the events:
    1. Load into infocube.
    2. Run infospoke delta to extract delta requests.
    3. Compress.
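The request-ID mechanism behind this can be sketched in Python (a simplified model with made-up request numbers, not the actual open-hub implementation):

```python
# Sketch: why an InfoSpoke delta finds "no new data" after compression.
# The delta works off request IDs; compression collapses all requests
# into request ID 0, making them invisible to the delta pointer.

requests = [
    {"request_id": 17, "rows": 1000},
    {"request_id": 18, "rows": 500},
]
last_extracted_request = 16  # delta pointer from the previous InfoSpoke run

def compress(reqs):
    """Model of cube compression: requests collapse into request ID 0."""
    total = sum(r["rows"] for r in reqs)
    return [{"request_id": 0, "rows": total}]

def infospoke_delta(reqs, pointer):
    """Extract only requests newer than the delta pointer."""
    return [r for r in reqs if r["request_id"] > pointer]

before = infospoke_delta(requests, last_extracted_request)
after = infospoke_delta(compress(requests), last_extracted_request)
print(before)  # picks up requests 17 and 18
print(after)   # [] -> "There is no new data"
```

This is why the delta extraction must run between the load and the compression, as in the resequenced steps above.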

  • Delta Missing for Billing Cube

Data is missing in the Billing cube only for 25 Dec 08; the rest is available in the cube.
    Data modeling:
    DataSource: 2LIS_13_VDITM & 2LIS_13_VDHDR
    Load type : Delta
    Target : Infocube
    System: BW3.1
A possible solution is to delete the deltas from 25 Dec 08 until today and perform a repair full request for the period concerned (since the target is a cube, we have to delete the previous loads to avoid duplicate records),
then:
delete the init,
initialize without data transfer,
and schedule the load back to delta.
Before going for this method, do we have to delete/fill the setup table in R/3? I don't think so, as we are performing a full repair load in BW for the given date range/period.
Guys, let me know whether I am right or not. Will it be okay to delete the delta loads from the 25th until today?
Do you have any better method to get the missing billing data for the 25th?

Alok,
1. Delete the data in BW for the period from 25 Dec. One doubt: do we have to delete all the requests from the 25th until today, or just the data for the 25th?
2. Don't touch the existing init, as that adds unnecessary work.
3. Delete the setup table in R/3 first (in case there is some old data).
4. Fill the setup table for application 13 for the date range you have deleted; it will be filled for the missing documents.
5. Do a repair full request for this selection. (P.S.: stop the billing delta until you complete this correction.) I think we have to get help from a functional consultant to stop the postings.
6. Restart the billing delta.
--> Alok, if the delta queue is being updated while the setup is running and postings are happening up to the current date, there is a chance the same record ends up in both the setup table and the delta queue.
As you are loading into a cube (additive), after loading the setup table data through the repair full, the same record may come again via delta (due to new postings). That creates redundant data.
So, as your problem is with one day of data (25 Dec), it is better to delete the data for that day or that week and reload selectively. As you are loading historic data, no postings happen to those records, so there is no need to worry about the delta queue. Or:
Find out the missing records, run the setup for just those, and load. A bit tedious, but 100% safe.
P.S.: Deleting the entire data from the 25th onwards and reloading is not a wise decision, as you are loading into a cube.
Hope it's clear.
Srini

  • Delete Request which is already updated to Futher data target

    Hi Experts,
Is there any way we can delete a request from an InfoCube that has already been updated into another data target?
My requirement: I have a PCA InfoCube with 3 full loads and 1 delta. I can delete the full loads using the "delete overlapping requests" process type. But from the PCA InfoCube the data is further updated into another planning cube as a delta request to update the actuals.
This is causing a problem: because the data is further updated to the other cube as a delta request, I am not able to delete the overlapping full-load requests in the PCA cube. Any ideas or suggestions would be a great help.
Please note that I can't change the data flow at this point, as this cube is already in production.
Thanks,
RRRRR.

    Hi R,
    thanks for pointing this out.
I still think the procedure I described would be suitable.
Let me recap: you want to delete the full request in the InfoCube, but you don't want to touch the data from this request that has already been updated into the planning cube.
1. Delete the full request in your InfoCube.
2. Confirm the messages that this will lead to deactivating the delta to the planning cube (as explained previously).
3. Load your new full request into your InfoCube, if you want to.
4. Set up the delta between the InfoCube and the planning cube again by requesting an init without data transfer.
This way you can change the data in the InfoCube without impacting the data that has already been updated into your data target, the planning cube.
If you want the new FULL request to be updated via a DELTA into your planning cube, swap the order of steps 3 and 4 above.
    Hope that clarifies this approach to you.
    kind regards,
    Ralf

  • Functionality of DSO

    Hi Folks,
    Could you please send me the links of the general basic functionality of DSO? All relevant information about DSO.
    Regards,
    Geeta.

    Hi,
    SAP Notes:
    ODS Object
    Definition
    An ODS object acts as a storage location for consolidated and cleaned-up transaction data (transaction data or master data, for example) on the document (atomic) level.
    This data can be evaluated using a BEx query.
    An ODS object contains key fields (for example, document number/item) and data fields that can also contain character fields (for example, order status, customer) as key figures. The data from an ODS object can be updated with a delta update into InfoCubes and/or other ODS objects or master data tables (attributes or texts) in the same system or across different systems.
    Unlike multi-dimensional data storage using InfoCubes, the data in ODS objects is stored in transparent, flat database tables. Fact tables or dimension tables are not created.
The cumulative update of key figures is supported for ODS objects, just as it is with InfoCubes, but with ODS objects it is also possible to overwrite data fields. This is particularly important with document-related structures, since changes made to documents in the source system do not apply solely to numeric fields such as order quantity, but also to non-numeric fields such as goods recipient, status, and delivery date. To map these changes in the BW ODS objects, the relevant fields in the ODS objects must also be overwritten and set to the current value. Furthermore, a source can be made delta-capable by overwriting and using the existing change log. This means that the delta that is further updated into the InfoCubes, for example, is calculated from two successive after-images.
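How an overwrite in the ODS produces delta records for an additive InfoCube can be sketched as follows (a simplified Python model with made-up values; the real mechanism is the 0RECORDMODE-driven change log):

```python
# Sketch: a standard ODS change log makes an overwritten key figure
# delta-capable. On activation, an overwrite emits a before image
# (old value negated, record mode 'X') and an after image (new value,
# record mode ' '); an additive cube sums all delta records.

def activate(active_table, key, new_value):
    """Overwrite a record and emit the change log entries for the delta."""
    change_log = []
    if key in active_table:
        change_log.append({"key": key, "value": -active_table[key], "recordmode": "X"})
    change_log.append({"key": key, "value": new_value, "recordmode": " "})
    active_table[key] = new_value
    return change_log

active = {}
log = activate(active, "111", 50)   # initial load: just an after image (+50)
log += activate(active, "111", 70)  # overwrite: before image (-50) + after image (+70)

cube_value = sum(rec["value"] for rec in log)  # the cube adds all delta records
print(cube_value)  # 50 - 50 + 70 = 70 -> the cube ends up with the overwritten value
```

The before image cancels the previously delivered value, so the cube's additive update still arrives at the current state.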
    Use
    Before you can begin with modeling, you need to consider how the ODS object needs to be set. Accordingly, select your ODS object type and the implementation variant.
    The following ODS objects can be differentiated:
    1. Standard ODS Object
    2. Transactional ODS object: The data is immediately available here for reporting. For implementation, compare with the Transactional ODS Object.
    The following implementation variants are conceivable for standard ODS objects:
    · ODS object with source-system oriented data: Here the data is saved in the form it is delivered from the DataSource of the source system. This ODS type can be used to report on the original data as it comes from the source system. It serves to handle administration more comfortably and to selectively update.
    · Consistent ODS object: Data is stored here in granular form and consolidated. This consolidated data on a document level creates the basis for further processing in BW. To do this, compare level one of the Implementation Scenarios
    · Application-related ODS object: The data is combined here according to the business-related problem and can be used specifically as a basis for operative reporting problems. You can report directly for these ODS objects, or you can continue to update them in InfoCubes. To do this, compare level two of the Implementation Scenarios.
    Structure
    Every ODS object is represented on the database by three transparent tables:
    Active data: A table containing the active data (A table)
    Activation queue: For saving ODS data records that are to be updated but that have not yet been activated. The data is deleted after the records have been activated.
    Change log: Contains the change history for delta updating from the ODS Object into other data targets, such as ODS Objects or InfoCubes for example.
    An exception is the transactional ODS object, which is only made up of the active data table.
The tables containing active data are constructed according to the ODS object definition, meaning that key fields and data fields are specified when the ODS object is defined. The activation queue and the change log have the same table structure: they have the request ID, package ID, and record number as their key.
    This graphic shows how the various tables for the ODS object work together during data loading.
Data can be loaded efficiently from several source systems simultaneously because a queuing mechanism enables parallel inserts. The key allows records to be labeled consistently in the activation queue.
The data reaches the change log via the activation queue and is written to the active data table upon activation. During activation, the requests are sorted according to their logical keys. This ensures that the data is updated into the active data table in the correct request sequence.
    See also the Example of Updating Data to an ODS Object
    Integration
    Integration with data flow
    The diagram below shows the data flow in BW, enhanced with ODS objects. The data flow consists of three steps:
    1. Loading the data from the DataSource into the PSA.
    In this step, the data from a source system DataSource is loaded and stored in the transfer structure format. The PSA is the input storage location for BW. DataSources with the IDoc transfer method are not supported with ODS objects.
    2. Updating the data into the ODS object, and activating the data
    ¡ Updating the data into the ODS object using transfer and update rules.
    The transfer rules transform and clean up the data from the PSA. Generally, the update rules are only used here for one-to-one transfer into the ODS object. This is because transformation has already taken place in the transfer rules. The data arrives in the activation queue.
    ¡ Activating the data in the ODS object.
    When you activate the data, the data necessary for a delta update is sent to the change log. The data arrives in the table of active data.
    3. Updating the data from the ODS object
    ¡ Updating the data in the related data targets such as InfoCubes or ODS objects, for example.
    In this step, the system uses the update rules to update the data that has not yet been processed in the change log (the delta) into InfoCubes, other ODS objects, or master data tables. Only update rules are used here, because the data is already available in a cleansed and consolidated format.
    ¡ Transferring the data from an ODS object into other BW systems.
    In this step, it is possible to use the data mart interface to load data from one BW into another BW. In the BW source system, the corresponding ODS object acts as a DataSource that the BW target system recognizes as a DataSource replica, through the metadata comparison. The BW target system is now able to request data from the source system, just as it does with every other DataSource. The requested data is stored in the input storage PSA for further processing, such as updating into InfoCubes, for example.
    Integration with the Administrator Workbench - Modeling
    Metadata
The ODS objects are fully integrated into the BW metadata. They are transported just like InfoCubes and can be installed from Business Content (for more information, see Installing Business Content). The ODS objects are grouped with the InfoCubes in the InfoProvider view of the Administrator Workbench - Modeling and are displayed in a tree. They also appear in the data flow display.
    Update
    The update rules define the rules that are used to write to an ODS object. They are very similar to the update rules for InfoCubes. The most important difference is how the Data Fields Are Updated. When you update requests in an ODS object, you have an overwrite option as well as an addition option.
The delta process, which is defined for the DataSource, also influences the update. When you are loading flat files, you have to select a suitable delta process in the transfer structure maintenance; this ensures that the correct type of update is used.
    Unit fields and currency fields operate just like normal key figures, meaning that they must be explicitly filled using a rule.
    Scheduling and Monitoring
    The processes for scheduling InfoPackages for updating into InfoCubes and ODS objects are identical. Error handling is an exception here. It cannot be used with ODS objects.
    It is also possible to schedule the activation of ODS object data and the update from the ODS object into the related InfoCubes or ODS objects.
The individual steps, including ODS object processing, are logged in the Monitor. Logs detailing the activation of the new records from the existing request in the ODS object are also stored in the Monitor.
    Loadable DataSources
    In full-update mode, every transactional DataSource contained in an ODS object is updated. In delta-update mode, only those DataSources that are flagged as ODS-delta compatible are updated.
In an ODS there are three types of tables:
1. New data table (activation queue)
2. Active data table
3. Change log table
When new data is loaded into the ODS it first goes into the new data table. When you activate the ODS, this data moves into the active table. If any change occurs in the data, then based on the key fields the modifications are recorded in the change log table, categorized by record mode.
An ODS has 3 tables:
New table
Change log table
Active table
New table:
Whenever new data comes from the source system through scheduling, it sits in the new data table. After scheduling, you will see the status in the Monitor as a red/green circle, which means the new table contains data. To view the data in the new table, right-click the ODS -> Manage -> Requests, select the new table, and execute.
Active table:
To make the data reach the data target, you activate the ODS so that the data is moved from the new table to the active table; the status then becomes green. The data in the new table is deleted, and the active table and change log now have entries.
Change log table:
This table keeps the entries needed for change log processing, such as before image, after image, etc.
You need to use an ODS when you have to maintain the data inside BW in a transactional way.
The ODS works like an R/3 table and maintains the data that way. An ODS does not cumulate data like a cube; it maintains certain fields as the primary key (like R/3 tables). In that way it works as a table and can be the source for an InfoCube.
Normally the data in the ODS is transferred to an InfoCube in order to report on it through a BEx query.
Also, time characteristics normally form part of the ODS key as well.
If you create a generic extractor from a table or view, it's even easier: just use the same key fields you have in your table or view as your ODS key.
Always analyze the specific case and ask "What combination of characteristics makes each record unique?" The answer is your key.
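That uniqueness question can be checked mechanically; here is a small Python sketch with hypothetical document data (the field names are illustrative, not from any real extractor):

```python
# Sketch: test whether a candidate set of key fields makes each
# record unique -- the criterion suggested above for choosing ODS keys.

records = [
    {"doc": "100", "item": "1", "customer": "C1", "amount": 50},
    {"doc": "100", "item": "2", "customer": "C1", "amount": 20},
    {"doc": "101", "item": "1", "customer": "C2", "amount": 30},
]

def is_unique_key(rows, key_fields):
    """True if no two rows share the same values for the candidate key."""
    keys = [tuple(r[f] for f in key_fields) for r in rows]
    return len(keys) == len(set(keys))

print(is_unique_key(records, ["doc"]))          # False: two items share doc 100
print(is_unique_key(records, ["doc", "item"]))  # True: document + item is a valid key
```

If the candidate key is not unique, records would overwrite each other in the ODS and detail would be lost, which is exactly why the level of detail must be decided before the key fields.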
    Check the following links for details.
    http://help.sap.com/saphelp_nw70/helpdata/en/a8/6b023b6069d22ee10000000a11402f/frameset.htm
    http://help.sap.com/saphelp_nw04s/helpdata/en/F9/45503C242B4A67E10000000A114084/content.htm
Key fields are like primary keys -- characteristics are placed there.
Data fields --> key figures and characteristics.
This is because the ODS follows the overwrite property for key figures based on the combination of key fields, so make sure at what level of detail you want the data and select your key fields accordingly.
    ODS and Cube differences
    *Hope info is useful*
    Regards
    CSM Reddy

  • How can I delete all delta request and init from InfoCube?

    Hi,
We are working with an extractor that supports deltas, but we are having problems with it: some changes in the data are not reflected. For this reason we are thinking of changing this type of processing and doing a full load (with the full update the changes are shown).
So we need to delete all the delta requests and their init. But in the manage option of the InfoCube we can't see the init and the first deltas.
If I delete the initialization information in the InfoPackage (menu option Scheduler -> Initialization), that only deletes the initialization. I also need to delete the data in the InfoCube.
    I would appreciate your help.
    Regards,
    Victoria

    Hi Victoria,
    You can either display all the requests in the manage tab and delete from there. Or you can use the Automatic Request Deletion option in the Data Target tab of the InfoPackage. See here for details:
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a65a8e07211d2acb80000e829fbfe/content.htm
    Hope this helps...

  • Delta loading procedure from Write Optimized DSO to Infocube

    Hi All,
    We are using Write optimized DSO in our project to which I am loading data using Standard DSO 0FI_GL_12
    From Write optimized DSO, we are loading delta records into Infocube, please provide your inputs to my following questionnaire:
    1) I am quite interested to know how the delta records gets loaded into infocube whenever we are using 'Write optimized DSO' as we don't have any image concept in Optimized DSO
    Ex: If am using Standard DSO, we have Change log table and image concept will allow to get updated value to Cube
    let us assume
    Active Table
    111            50
    111            70 (overwrite)
    Change Log Table
    111            -50        (X -- Before Image)
    111             70    ( '  ' -- After Image) symbol for after image is 'Space'
    So if we load this record to the target as a delta the above two records from change log table will get loaded to the CUBE and Cube will have 70 as the updated record
    If I am using a write-optimized DSO:
    Active table:
    111            50
    111            70 (overwrite)
    When these records are loaded to the cube, since an InfoCube is always additive, the total value will be 50 + 70 = 120, which is wrong.
    Correct me: what mechanism will deliver the updated value of 70 to the cube from the write-optimized DSO?
    2) As the DataSource is delta-capable with an ADDITIVE delta process, are only the delta records (based on request ID) loaded into the InfoCube with the updated key figure value?
    Thanks for your inputs and much appreciated.
    Regards,
    Madhu

    Hi Madhu,
    As a best practice, we use a write-optimized DSO in the initial layer and then a standard DSO, mainly for mass data load/staging purposes.
    Best practice: DataSource ----> WODSO ----> standard DSO
    Your case: DataSource ----> standard DSO ----> WODSO
    In both cases, if the data load design is not accurate, your cube will end up with incorrect entries.
    For example: today at 9 am the active table contains 111, 50.
    You load to the cube the same day at 11 am; the cube now has 111, 50.
    The same day at 1 pm the value changes in the standard DSO to 111, 70 (overwrite in the active table).
    If you load to the cube again the same day or the next day, it will have two records, one with value 50 and one with value 70. To avoid such scenarios you should plan loads carefully, or else change your DTP settings to use the change log table as the source.
    Coming to your case:
    After the load to the standard DSO, load data to the WODSO with the DTP setting 'Delta Init. Extraction from: Change Log'.
    The data in the WODSO then comes from the change log table, and you can load it to the cube in delta mode.
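    The change-log mechanism discussed above can be sketched in Python (an illustrative simulation only, not SAP code; the table layouts and helper names are simplified assumptions). Overwriting a DSO record emits a before image with reversed sign and an after image, so a purely additive cube still nets to the current value:

    ```python
    def overwrite_dso(active_table, key, value):
        """Overwrite a record in the DSO active table and return the
        change-log records: before image (reversed sign), after image."""
        change_log = []
        if key in active_table:
            change_log.append((key, -active_table[key]))  # before image ('X')
            change_log.append((key, value))               # after image (' ')
        else:
            change_log.append((key, value))               # new record
        active_table[key] = value
        return change_log

    def load_additive_cube(cube, records):
        """An InfoCube only adds key figures -- it never overwrites."""
        for key, value in records:
            cube[key] = cube.get(key, 0) + value

    active, cube = {}, {}
    load_additive_cube(cube, overwrite_dso(active, 111, 50))  # initial load
    load_additive_cube(cube, overwrite_dso(active, 111, 70))  # delta after change
    print(cube[111])  # -50 + 70 nets the cube to the current value, 70
    ```

    Loading the active table twice instead of the change log would have left 50 + 70 = 120 in the cube, which is exactly the duplicate-record problem described above.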

  • Delta Load on DSO and Infocube

    Hi All,
    I would like to know the procedure for the scenario below.
    Example scenario:
    I have created a DSO with 10 characteristics and 3 key figures, and I have 10 customers whose transactions happen every day. A full upload to the DSO was done on 7th October 09. How can I load their changing data from 8th Oct until today into the DSO? And what would the equivalent be for an InfoCube?
    Step-by-step guidance would be a great help.
    Thanks in advance
    Liquid

    Hi,
    The key fields play an important role at the DSO level, since they give you the power of overwriting records. Once that is done at the DSO level, you can simply carry on with a delta load into the cube from your change log table and you don't have to worry about anything. Just to add: in a cube, all characteristics effectively act as key fields, so you get a new record for each distinct combination of characteristic values, and the key figures are summed for records with identical characteristics.
    Thanks,
    Arminder
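    As a rough illustration of this key-field behaviour (a simplified Python sketch, not SAP code; the record layouts and field names are assumptions):

    ```python
    def dso_activate(active, record, key_fields):
        """Standard DSO: records with the same key-field values overwrite."""
        key = tuple(record[f] for f in key_fields)
        active[key] = record

    def cube_load(cube, chars, amount):
        """InfoCube: one row per distinct characteristic combination;
        key figures are summed (additive), never overwritten."""
        cube[chars] = cube.get(chars, 0) + amount

    # Two loads for the same customer/day: the DSO keeps one record.
    active = {}
    dso_activate(active, {"customer": "C1", "day": "20091008", "amount": 50},
                 key_fields=("customer", "day"))
    dso_activate(active, {"customer": "C1", "day": "20091008", "amount": 70},
                 key_fields=("customer", "day"))
    print(len(active))  # 1: same key, the second write overwrote the first

    # The same two values loaded straight into a cube are summed instead.
    cube = {}
    cube_load(cube, ("C1", "20091008"), 50)
    cube_load(cube, ("C1", "20091008"), 70)
    print(cube[("C1", "20091008")])  # 120: identical characteristics, summed
    ```

    This is why the overwrite has to happen at the DSO level first, with only the change-log delta passed on to the cube.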

  • Delta data changes are not reflecting from ODS to another infocube

    ODS1 ---> InfoCube1: field1, field2, field3
    ODS2 ---> InfoCube2: field2-1, field2-2, field2-3, field2-4
    The two DataSources carry completely different information.
    Delta changes are captured in field3 in ODS1, and in InfoCube1 as well.
    field3 and field2-3 are the same field. In InfoCube2 I have included the InfoObject and populate its value from ODS1-field3.
    The question is:
    whenever I capture a change in ODS1-field3, the same change also needs to be updated in InfoCube2, in field2-3.
    Start routine: I have written code like this:
    SELECT field1 field2 field3
      INTO CORRESPONDING FIELDS OF TABLE it_field
      FROM /BIC/aods1.
    Update rule routine: select field2-3 in the characteristic routine:
    READ TABLE it_field INTO ls_field
      WITH KEY field1 = COMM_STRUCTURE-field2-1
               field2 = COMM_STRUCTURE-field2-2.
    RESULT = ls_field-field2-3.
    If I do a full load to InfoCube2 there is no problem.
    Is there any other way I can capture the transactional data changes in InfoCube2?
    Advance thanks

    Thanks for the quick update. You are right.
    There is no problem from ODS1 ---> InfoCube1; the delta is working properly.
    My question is:
    ODS1 has field3. Whenever a change happens to field3, I have to capture the same change in InfoCube2, in field2-3.
    field3 and field2-3 are the same field InfoObject.
    Whenever a change occurs in the transaction, ODS1-field3 captures it. How can I capture the same change in InfoCube2, field2-3?

  • Deltas on ODS and Infocube

    Hi All,
    I need information on how to do deltas on an ODS. I did two full uploads for ODS 0FIAR_O03 and then the InfoCube. Now I need to set up delta for this ODS and cube. Please give me the steps to do delta for the ODS and the InfoCube.
    regards
    raja

    How are you doing the full load to ODS 0FIAR_O03? Are you using any selection criteria?
    To start pulling delta into the ODS you need to do a full repair request, or else delete the full load requests from the ODS and reload using an initialization.
    Since you are loading full requests into the ODS anyway, it would be better to run an init load next time and then start picking up deltas.
    Delta to the InfoCube is easily possible: run an init from the ODS after deleting all the data from the InfoCube. After that, only the delta records from the change log will be taken from the ODS to the cube.
    The delta to the cube can be performed irrespective of whether you perform a full load or a delta load into the ODS.
    Hope it helps!!!

  • Can any body explain me about R/3 delta in to a infocube directly?

    Hi Experts,
    I think it is not possible to do a delta load from R/3 into an InfoCube directly, right? Because an InfoCube is additive and will not overwrite changes the way an ODS does. So if I am loading from R/3 directly into an InfoCube, can I only do a full load, not a delta load?
    Thanks in advance.
    Aslam.

    I am just trying to understand how the cube works with a delta load. For example:
    Record 1 in the InfoCube is $15 for an InfoObject, say 'expense', and this InfoObject is mapped to the expense field in R/3.
    - Now the user changes the value of that record's expense field to $20.
    - When I run a delta into the InfoCube in BW, what happens?
    Does the InfoCube overwrite the value from $15 to $20?
    Or does the InfoCube add $15 and $20 and make it $35?
    If it makes it $35, that is wrong; it should be $20, right?
    How does this work, given that a cube has no overwrite functionality?
    I think 0RECORDMODE will take care of the delta load.
    Thanks in advance.
    Aslam.
    Edited by: Asif Aslam on Jan 23, 2008 10:55 PM
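    For what it's worth, the arithmetic that 0RECORDMODE makes possible can be sketched in Python (a simplified illustration, not SAP code; the queue contents are assumed for the $15 -> $20 example above). The delta mechanism sends a before image that reverses the old value and an after image with the new one, so purely additive cube arithmetic still lands on the correct current value:

    ```python
    # Delta records for a value changed from 15 to 20, tagged with a
    # simplified record mode: 'X' = before image, ' ' = after image.
    delta_queue = [
        ("X", -15),   # before image: cancels the previously loaded value
        (" ", 20),    # after image: the new value
    ]

    cube_expense = 15  # value already in the cube from the initial load
    for recordmode, amount in delta_queue:
        cube_expense += amount  # the cube only ever adds

    print(cube_expense)  # 15 - 15 + 20 = 20, not 35
    ```

    So the cube never overwrites; the extractor's image records make the addition come out right.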

  • "Error occurred in the data selection" on init delta load ODS- InfoCube

    Hi, gurus and experts!
    I'm trying to do an init delta load from the 0FIAR_O03 ODS into the 0FIAR_C03 InfoCube in the PRD environment, using the InfoPackage option "Initialize with delta transfer (with data transfer)". Immediately after the load started I got the error
    "Error occurred in the data selection"
    with the details
    "Job terminated in source system --> Request set to red".
    There are no short dumps. There is one activated init load in the ODS, nearly 6,500,000 records.
    Any ideas about this error, and how I can load the data?

    Hi Gediminas Berzanskis,
    I faced a similar error when I tried to load data from an ODS containing a huge amount of data into a cube. In your case too the volume is large (around 6.5 million records). The error could be due to a tablespace issue; please ask your Basis team to check whether enough tablespace exists for both data targets.
    Meanwhile, you may also check the RFC connection of the Myself source system.
    You can replicate the DSO DataSource once and then reactivate the transfer rules using program RS_TRANSTRU_ACTIVATE_ALL.
    Try a load with a small amount of data using the full load option first, and then with the delta option.
    Hope this helps,
    Prajeevan (XLNC)
