Functionality of DSO

Hi Folks,
Could you please send me links covering the general, basic functionality of DSOs, and any other relevant information about DSOs?
Regards,
Geeta.

Hi,
SAP Notes:
ODS Object
Definition
An ODS object acts as a storage location for consolidated and cleaned-up transaction data (transaction data or master data, for example) on the document (atomic) level.
This data can be evaluated using a BEx query.
An ODS object contains key fields (for example, document number/item) and data fields that can also contain character fields (for example, order status, customer) as key figures. The data from an ODS object can be updated with a delta update into InfoCubes and/or other ODS objects or master data tables (attributes or texts) in the same system or across different systems.
Unlike multi-dimensional data storage using InfoCubes, the data in ODS objects is stored in transparent, flat database tables. Fact tables or dimension tables are not created.
The cumulative update of key figures is supported for ODS objects, just as it is with InfoCubes, but with ODS objects it is also possible to overwrite data fields. This is particularly important with document-related structures, since changes made to documents in the source system do not apply solely to numeric fields such as order quantity, but also to non-numeric fields such as goods receiver, status, and delivery date. To map these changes in the BW ODS objects, the relevant fields in the ODS objects must also be overwritten and set to the current value. Furthermore, by using overwrite together with the existing change log, a source can be made delta-capable. This means that the delta that is updated further into InfoCubes, for example, is calculated from two successive after-images.
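The overwrite-plus-change-log mechanism described above can be sketched in a few lines of Python (a toy illustration only; the record structure and field names are invented, not actual BW tables). The before image is simply the negated previous after image, so the delta is effectively computed from two successive after-images:

```python
# Toy sketch: for an overwritten record, the change log stores a before image
# (key figures negated, record mode 'X') and an after image (record mode ' ').
# Summing the pair yields the net delta that is updated further into InfoCubes.

def change_log_entries(old_record, new_record):
    """Return the before/after image pair for an overwritten record."""
    before = {k: (-v if isinstance(v, (int, float)) else v)
              for k, v in old_record.items()}
    before["recordmode"] = "X"   # before image
    after = dict(new_record)
    after["recordmode"] = " "    # after image
    return before, after

old = {"order": "4711", "qty": 20, "status": "open"}
new = {"order": "4711", "qty": 30, "status": "shipped"}
before, after = change_log_entries(old, new)

# The net delta for the additive key figure 'qty' is -20 + 30 = +10.
delta_qty = before["qty"] + after["qty"]
```

Note how the non-numeric field `status` is simply carried over and overwritten, while the numeric key figure is negated in the before image so the pair sums to the net change.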
Use
Before you can begin with modeling, you need to consider how the ODS object is to be used. Based on this, select your ODS object type and the implementation variant.
The following ODS objects can be differentiated:
1. Standard ODS Object
2. Transactional ODS object: The data is immediately available here for reporting. For implementation, compare with the Transactional ODS Object.
The following implementation variants are conceivable for standard ODS objects:
· ODS object with source-system-oriented data: Here the data is saved in the form in which it is delivered from the DataSource of the source system. This ODS type can be used to report on the original data as it comes from the source system. It also makes administration more convenient and allows selective updating.
· Consistent ODS object: Data is stored here in granular, consolidated form. This consolidated data on document level creates the basis for further processing in BW. For this, compare level one of the Implementation Scenarios.
· Application-related ODS object: The data is combined here according to the business problem at hand and can be used specifically as a basis for operational reporting. You can report directly on these ODS objects, or you can update them further into InfoCubes. For this, compare level two of the Implementation Scenarios.
Structure
Every ODS object is represented on the database by three transparent tables:
Active data: A table containing the active data (A table)
Activation queue: For saving ODS data records that are to be updated but that have not yet been activated. The data is deleted after the records have been activated.
Change log: Contains the change history for delta updating from the ODS Object into other data targets, such as ODS Objects or InfoCubes for example.
An exception is the transactional ODS object, which is only made up of the active data table.
The tables containing active data are constructed according to the ODS object definition, meaning that key fields and data fields are specified when the ODS object is defined. The activation queue and the change log have the same table structure: the request ID, package ID, and record number form their key.
This graphic shows how the various tables for the ODS object work together during data loading.
Data can be loaded efficiently from several source systems simultaneously, because a queuing mechanism enables parallel insertion. The key allows records to be labeled consistently in the activation queue.
The data reaches the change log via the activation queue and is written to the table of active data upon activation. During activation, the requests are sorted according to their logical keys. This ensures that the data is updated in the table of active data in the correct request sequence.
See also the Example of Updating Data to an ODS Object
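How the three tables interact during activation can be sketched as a simplified Python model (all names are invented; the real tables and record modes are generated by BW):

```python
# Simplified model of ODS activation: records wait in the activation queue,
# are applied to the active table in request order, and every change is
# recorded in the change log as before/after images.

def activate(active_table, activation_queue):
    """Activate queued records; return the change log entries written."""
    change_log = []
    # Requests are applied in sequence: request ID, package ID, record number.
    for rec in sorted(activation_queue,
                      key=lambda r: (r["request"], r["package"], r["recno"])):
        key = rec["doc"]                  # logical key of the ODS object
        old = active_table.get(key)
        if old is not None:               # changed record -> before image
            change_log.append({"doc": key, "qty": -old["qty"], "mode": "X"})
        change_log.append({"doc": key, "qty": rec["qty"], "mode": " "})
        active_table[key] = {"qty": rec["qty"]}   # overwrite active data
    activation_queue.clear()              # queue is emptied after activation
    return change_log

active = {}
queue = [{"request": 1, "package": 1, "recno": 1, "doc": "4711", "qty": 20},
         {"request": 2, "package": 1, "recno": 1, "doc": "4711", "qty": 30}]
log = activate(active, queue)
```

After activation, the active table holds only the latest version per key, while the change log holds the full history (+20, then -20/+30) needed for delta updates.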
Integration
Integration with data flow
The diagram below shows the data flow in BW, enhanced with ODS objects. The data flow consists of three steps:
1. Loading the data from the DataSource into the PSA.
In this step, the data from a source system DataSource is loaded and stored in the transfer structure format. The PSA is the input storage location for BW. DataSources with the IDoc transfer method are not supported with ODS objects.
2. Updating the data into the ODS object, and activating the data
· Updating the data into the ODS object using transfer and update rules.
The transfer rules transform and clean up the data from the PSA. Generally, the update rules are only used here for a one-to-one transfer into the ODS object, because the transformation has already taken place in the transfer rules. The data arrives in the activation queue.
· Activating the data in the ODS object.
When you activate the data, the data necessary for a delta update is sent to the change log. The data arrives in the table of active data.
3. Updating the data from the ODS object
· Updating the data into the related data targets, such as InfoCubes or other ODS objects.
In this step, the system uses the update rules to update the data that has not yet been processed in the change log (the delta) into InfoCubes, other ODS objects, or master data tables. Only update rules are used here, because the data is already available in a cleansed and consolidated format.
· Transferring the data from an ODS object into other BW systems.
In this step, it is possible to use the data mart interface to load data from one BW into another BW. In the BW source system, the corresponding ODS object acts as a DataSource that the BW target system recognizes as a DataSource replica, through the metadata comparison. The BW target system is now able to request data from the source system, just as it does with every other DataSource. The requested data is stored in the input storage PSA for further processing, such as updating into InfoCubes, for example.
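Step 3 above, updating the change-log delta into an InfoCube, works because the fact table is additive: the negated before image cancels the value loaded earlier. A minimal sketch (invented structures, not BW code):

```python
# Minimal sketch: applying change-log entries additively to a cube fact table.

def update_cube(fact_table, change_log):
    """Add each change-log key figure to the matching fact table row."""
    for entry in change_log:
        key = entry["doc"]           # characteristic combination of the cube
        fact_table[key] = fact_table.get(key, 0) + entry["qty"]

fact = {}
update_cube(fact, [{"doc": "4711", "qty": 20}])     # initial load: after image
update_cube(fact, [{"doc": "4711", "qty": -20},     # change: before image...
                   {"doc": "4711", "qty": 30}])     # ...and after image
```

The cube ends up showing the current value 30 even though it only ever received additive deltas.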
Integration with the Administrator Workbench - Modeling
Metadata
The ODS objects are fully integrated with BW metadata. They are transported just like InfoCubes, and installed from Business Content (for more information, see Installing Business Content). The ODS objects are grouped with the InfoCubes in the InfoProvider view of the Administrator Workbench - Modeling, and are displayed in a tree. They also appear in the data flow display.
Update
The update rules define the rules that are used to write to an ODS object. They are very similar to the update rules for InfoCubes. The most important difference is how the Data Fields Are Updated. When you update requests in an ODS object, you have an overwrite option as well as an addition option.
The Delta Process, which is defined for the DataSource, also influences the update. When you are loading flat files, you have to select a suitable delta process in the transfer structure maintenance; this ensures that the correct type of update is used.
Unit fields and currency fields operate just like normal key figures, meaning that they must be explicitly filled using a rule.
Scheduling and Monitoring
The processes for scheduling InfoPackages for updating into InfoCubes and ODS objects are identical. Error handling is an exception here: it cannot be used with ODS objects.
It is also possible to schedule the activation of ODS object data and the update from the ODS object into the related InfoCubes or ODS objects.
The individual steps, including the ODS object processing, are logged in the Monitor. Logs detailing the activation of the new records from the existing request in the ODS object, are also stored in the Monitor.
Loadable DataSources
In full-update mode, every transactional DataSource can be updated into an ODS object. In delta-update mode, only those DataSources that are flagged as ODS-delta compatible can be updated.
In an ODS, three types of tables are available:
1. New data table
2. Active table
3. Change log table
When new data is loaded into the ODS, it first goes into the new data table. When you activate the ODS, this data is moved to the active table. If a change occurs in the data for an existing key-field combination, the modification is recorded in the change log table, categorized by the record mode.
ODS have 3 tables .
New Table
Change Log table
Active Table.
New Table :
Whenever new data comes from the source system through scheduling, it is placed in the new data table. Run the scheduling; the Monitor will show the request status as a red/green circle, which means the new table contains data. To view the data in the new table, right-click the ODS -> Manage -> Requests, select the New data table, and execute.
Active Table .
To make the data reach the data target, you activate the ODS, so that the data is moved from the new table to the active table. The status then becomes green, and the data in the new table is deleted. The active table and the change log will then have entries.
Change Log table :
This table keeps the entries needed for change log processing, such as before images, after images, etc.
You need to use an ODS when you have to maintain the data inside BW in a transactional way.
The ODS works like an R/3 table and maintains the data in that way. An ODS does not cumulate data like a cube; it only maintains certain fields as the primary key (like R/3 tables). In that way it works as a table and can be the source for an InfoCube.
Normally the data in an ODS is transferred to an InfoCube in order to report on it through a BEx query.
Also, normally, time characteristics form part of the ODS Key as well.
If you create a Generic Extractor from a Table or View, it's even easier. Just use as your ODS Key the same Key fields you have in your Table or View.
Always analyze the specific case and make this question "What Characteristics combination makes each record unique?" The answer is your key.
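The question "what characteristic combination makes each record unique?" can be checked mechanically against sample data before fixing the key, for instance with a small helper like this (illustrative Python; the field names are invented):

```python
from collections import Counter

def is_unique_key(records, key_fields):
    """True if the chosen key fields identify every record uniquely."""
    keys = [tuple(rec[f] for f in key_fields) for rec in records]
    return max(Counter(keys).values()) <= 1

rows = [{"doc": "1", "item": "10", "qty": 5},
        {"doc": "1", "item": "20", "qty": 7}]
```

Here the document number alone is not a valid key (two items share it), while document number plus item is.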
Check the following links for details.
http://help.sap.com/saphelp_nw70/helpdata/en/a8/6b023b6069d22ee10000000a11402f/frameset.htm
http://help.sap.com/saphelp_nw04s/helpdata/en/F9/45503C242B4A67E10000000A114084/content.htm
Key fields are like primary keys -- only characteristics can be placed here.
Data fields --> key figures and characteristics.
This is because the ODS applies its overwrite behavior to key figures based on the combination of key fields, so be sure what level of detail you want the data at and select your key fields accordingly.
ODS and Cube differences
*Hope info is useful*
Regards
CSM Reddy

Similar Messages

  • Regarding Info cube and DSO

    Hi,
    Generally, reporting will be done on an InfoCube rather than a DSO.
    Suppose we assign the same DataSource to an InfoCube and a DSO; then both contain the same data.
    An InfoCube has additive and aggregated functionality, whereas a DSO has overwrite functionality.
    Are we using the cube for this functionality only?
    What about the dimensions in the cube: how do they differ from the data fields and key fields in the DSO when we develop the same BEx report on both?
    Please advise me.
    Thanks in advance.
    Thanks & Regards,
    Ramnaresh.p

    It is hard to compare a cube and a DSO.
    Both have their own usage.
    1. An InfoCube is always additive, while a DSO supports overwrite functionality.
    2. In an InfoCube, the combination of all the characteristic values is the key in the fact table, while in a DSO you can specify your own key fields, based on which a unique record is generated in the DSO.
    3. A DSO supports many delta modes (D, R, N, X; after image, before image), while a cube does not support all of these modes. You cannot delete a record from a cube based on its key just by loading data, while a DSO automatically deletes the record from the active table and generates the reverse entry for the cube.
    4. A DSO is a flat structure and is therefore used to store information at detail level, while a cube is used to store information at aggregated level.
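    Points 1 and 2 can be illustrated side by side (a toy Python sketch; 'order' and 'amount' are invented fields, not BW structures):

```python
# InfoCube: every characteristic combination is the key; key figures add up.
def load_cube(fact, record, key_fields):
    key = tuple(record[f] for f in key_fields)
    fact[key] = fact.get(key, 0) + record["amount"]

# Standard DSO: user-chosen key fields; the data fields are overwritten.
def load_dso(active, record, key_fields):
    key = tuple(record[f] for f in key_fields)
    active[key] = record["amount"]

fact, active = {}, {}
for rec in ({"order": "4711", "amount": 100},
            {"order": "4711", "amount": 250}):
    load_cube(fact, rec, ["order"])
    load_dso(active, rec, ["order"])
```

    Loading the same two records leaves 350 in the cube (additive) but 250 in the DSO (overwrite).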
    So both structures are very different from each other. One can replace the other in some places, but both objects have their own functionality.
    - Danny

  • Upgrade of Service Pack

    Hi all,
    We need to Upgrade the Service Pack of SAP BW.
    What is the process to do that?
    OR Pre or Post steps that we need to follow.
    Please help.

    You have to do the following testing:
    Activity
    Extraction
    Creation of InfoPackage
    Execution of InfoPackage with filters (dynamic and static) and without filters.
    Deletion of InfoPackage.
    Modify an existing InfoPackage.
    Data extraction from LO DataSources (full, delta, repair full, init, repeat delta).
    Data extraction from generic DataSources (full and delta).
    Modelling
    Creation of DTP
    Execution of DTP with filters (dynamic and static) and without filters.
    Deletion of DTP.
    Modify an existing DTP.
    Check for functioning of interactive DTP Code.
    Functionalities of Semantic groups and various Update modes in DTP settings.
    Creating of Error DTPs.
    Delta initialization in DTP
    Check for functionality of start, end, and transformation routine.
    Aggregation behaviour with and without start routine.
    Update behaviour of end routine.
    Over write and Additive functionality of transformation.
    Delta functionality in DSO and change log entry creation.
    Check Delta Consistency from Write Optimized DSO
    DSO data load and activation of DSO data. (Both 3.5 flow and 7.0 flow)
    Data mart loads (execution of DTP in the 7.0 flow and “further update” for the 3.5 flow).
    Deletion data from DSO and Cube ( Selective, request based and full)
    Delete and create index.
    Cube Compression
    Rollup of Aggregates
    Request deletion in ODS/Cube
    Process Chains
    Run process chains for master data loads (with attribute change run)
    Run a process chain for transaction loads which has the full flow, i.e. source system extraction to aggregate roll-up.
    Check for Metachain execution
    Remotely calling a process chain from a source system
    Event based trigger of process chains
    Deletion of PSA and Changelog using process chains
    Deletion of contents using process chains
    Execution of ABAP pgms using process chains
    Functioning of decision making and interrupts in process chains.
    Front End
    Execution of report on Bex, Portal, RSRT
    Customer exit variable
    Virtual Characteristics/Key figures.
    Text variables with replacement path.
    Check for Formula variables.
    Run time Unit and/or Currency conversions.
    Hierarchy variables or Hierarchy.
    Query execution using RRMX (IN BEX)
    Creation of a work Book and refresh the work book.
    All existing functionalities of the workbook, like copy sheet, global and local variables
    Excel version of queries.
    Broadcasting a workbook and query
    IP
    Create, Change , activate an Aggregation Level,Filter, Planning sequence,Planning Function
    Test the functionality of Planning Functions
    Test the reason entry Functionality
    Check the scroll in IP Screen.
    Others
    Creation and Execution of APD
    Creation and execution of Open Hub
    Data Archiving in Write Optimized DSO and Standard DSO
    Run Native SQL Pgms
    Run SE38 pgms
    Refresh Xcelsius dashboard and check for functionality.
    Regards,
    Sushant

  • Can i  use Two DSO 's in Function module

    hi experts,
    Can i  use Two DSO 's in Function module  .That FM is used for Layout
    I actually  want to  fill one DSO Refering the Data in Another DSO.
    Regard
    Naresh.

    In the first way the DSO's are usually called shared libraries or DSO libraries and named libfoo.so or libfoo.so.1.2. They reside in a system directory (usually /usr/lib) and the link to the executable program is established at build-time by specifying -lfoo to the linker command. In the second way the DSO's are usually called shared objects or DSO files and can be named with an arbitrary extension

  • Function module to update write-optimized DSO

    Hi Experts,
    Is there any standard function module available to insert data into a write-optimized DSO from a transformation routine, or do we need to do this programmatically?
    Thanks in advance.

    Hi,
    Here there are two DSOs used: one to collect the correct records and the other to collect the error records, which will be used later for reconciliation with the business and then loaded again after rectification.
    So from the transformation of the main DSO we need to update the error DSO (whenever any validation fails).
    Let me know of any function module to update the write-optimized DSO.
    Thanks,
    Jugal.
    Edited by: jugal behera on Feb 6, 2008 4:38 AM

  • The function of IP: write back data from layout or query into DSO?

    hi,guys,
    I am wondering whether BI-IP has the function of adding or modifying data directly in a DSO, like the BPS layout function of writing data back to an InfoCube? If it can, how?
    Although I don't think it's possible, I hope someone can give me a definite answer.
    Thanks a lot,
    johnson.

    Hi,
    Please search the forum before posting - this question has been asked quite a few times earlier.
    IP can read data from multiple types of infoproviders (including DSO) but it can write back data only to real-time infocubes.

  • Update DSO with planning function independently from saving data in realtim

    Dear all,
    I have a realtime cube which is loaded via planning functions and/or data entry queries.
    In order to track the entries, I store all companies with a status in a DSO, using a planning function which calls a function module and inserts entries into the DSO table. The planning function is started by clicking a button in a web application.
    For example: A company enters data. The value is stored in the realtime cube, and a DSO entry is created with company xyz and status 1.
    Sometimes it is necessary to create a status entry in the DSO without entering data in the realtime cube (in order to allow another department to enter data during status 1).
    In this case the planning function which calls the function module to insert entries into the DSO does not work, because there is no data exchange with the realtime cube.
    How can I change DSO entries independently of whether data is written to the realtime cube or not?
    Any help would be great.
    Best regards,
    Stefan from Munich/Germany

    Hello Marcel,
    i have one planning function which copies data from one version to another within the cube and another planning function (type fox) which calls up an ABAP function module in order to update my status DSO. see below:
    DATA FISCYEAR TYPE 0FISCYEAR.
    DATA COMPANY TYPE ZMCOMPANY.
    FISCYEAR = OBJV( ).
    COMPANY = OBJV( ).
    CALL FUNCTION Z_FM_SEND_FOR_APP_PLAN_C01
    EXPORTING
    I_COMPANY = COMPANY
    I_FISCYEAR = FISCYEAR.
    normal way:
    User enters data via a query and sends the data to headquarters (the first planning function copies from version 1 to 2, and the second planning function changes the status in the DSO from 1 to 2). This works.
    Not the normal way, but sometimes necessary:
    The user does not make any entries, and headquarters wants to change the status via its own web application. In this case the first planning function runs, but no data is copied because there are no entries. So far that's OK, but at least the second planning function should run and change the status in the DSO from 1 to 2. And exactly this is not working. I suppose the reason is that there is no data in the cube.
    Any ideas would be great.
    Best regards,
    Stefan from Munich/Germany

  • Function module: bpc application to DSO

    Hi friends,
    I'm trying to write BPC application data to a DSO via a BAdI. Is there any function module to update the DSO?
    Thanks,

    Hi Naresh,
    It is important to use the right kind of DSO. Only DSOs for Direct Update support writing with a function module.
    The different types of DSO are discussed here: [http://help.sap.com/saphelp_nw70ehp2/helpdata/en/f9/45503c242b4a67e10000000a114084/frameset.htm]
    DSO for Direct Update is discussed specifically here: [http://help.sap.com/saphelp_nw70ehp2/helpdata/en/c0/99663b3e916a78e10000000a11402f/frameset.htm]
    That second link also contains references to all the function modules that are available for writing to this type of DSO.
    Good luck!
    Ethan

  • Key fields in DSO

    HI experts,
    I have the requirement to create a DSO with Material, Plant, Usage, BOM, Alternative, BOM category, Counter, Valid From, BOM status, Base unit, and Base Quantity fields. I am not involved in the functional part. Could you please tell me which fields to take as the key part?
    Regards,
    KP

    Hi Kundan,
    What is the DataSource that you are using to load the data into the DSO?
    You can find the key fields with the help of the DataSource, or from the source table if you do not have a DataSource (the key fields are marked with a symbol).
    You can use material, plant, and BOM as key fields.
    Regards
    KP

  • Infoset functionality in BI-HR

    HI all,
    I have worked with InfoSets, but the functionality of an InfoSet with time-dependent objects is different. Can anyone share their experience with InfoSet scenarios in HR module implementations?
    0HR_PA_0 and 0HR_PA_1 don't support delta, so how do we capture deltas for these DataSources? Can anyone suggest an approach to achieve a delta capture mechanism in a BI-HR implementation?
    What is the functionality of the key date in HR-BI? I know that the key date is applicable for time-dependent master data objects; I need some more input on this concept.
    Regards,
    rvc

    Since most HR reports require EMPLOYEE, HRPOSITION, JOB and ORGUNIT information, you need to create InfoSets with a couple of these master data objects along with the cube/DSO.
    · Inner join: A record can only be in the selected result set if there are entries in both joined tables.
    · Left outer join: If there is no corresponding record in the right table, the record is still part of the result set (fields belonging to the right table have initial values).
    · Temporal join: A join is called temporal if at least one member is time-dependent.
    If there are InfoSets with time-dependent master data, do not restrict the data by the fields Valid from (0DATEFROM) and Valid to (0DATETO). See also the example in the Data Models using InfoSets section of this document.
    When you implement Time Management & CATS, you need to set up time types, quota types, etc. in the source system. For PA & OM I do not think you need to do any configuration.
    Most HR DataSources (except Time Management and CATS) do not support delta. So a daily load process chain would delete the complete data and do a full load.
    The key date contains the date for which time-dependent master data is selected. You determine the key date either in the query definition (query properties) or supply the value using a variable. The default value for the key date is the date on which the query is executed, that is <today>. If you select 01.01.1999 for example, time-dependent data is read up to 01.01.1999.
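    The key-date selection on time-dependent master data can be sketched as follows (illustrative Python only; the attribute intervals are invented):

```python
from datetime import date

def attribute_on_key_date(intervals, key_date):
    """Return the attribute value whose validity interval covers the key date."""
    for valid_from, valid_to, value in intervals:
        if valid_from <= key_date <= valid_to:
            return value
    return None   # no interval covers the key date

# Time-dependent attribute, e.g. an employee's position over time.
position = [(date(1999, 1, 1), date(2004, 12, 31), "Analyst"),
            (date(2005, 1, 1), date(9999, 12, 31), "Manager")]
```

    A query with key date 01.06.2003 would read "Analyst"; with today's date it would read "Manager".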

  • Unable to consolidate data from two DSOs into an InfoCube for reporting.

    Hello Experts
    I am working on BW 7.0 to create BEx Report as below:
    This report will have data coming from two different sources, some data from COPA DSO [such as Customer Number, Product Hierarchy1, Product Hierarchy2, Product Hierarchy3, Product Hierarchy4. Product Hierarchy5, Product Hierarchy6 and a few other Key Figures] and the rest [such as Product Hierarchy, Reference Document, Condition Type (both Active & Inactive), Condition Value and a few other Key Figures] from another DSO (ZSD_DS18) which is a copy of the BCT DSO (0SD_O06). I've chosen this DSO because this is the BCT DSO which is used to store data from a Standard Extractor 2LIS_13_VDKON.
    Below are the screenshots of these 2 DSOs:
    I have successfully extracted the data from 2LIS_13_VDKON (includes PROD_HIER but not Customer Number) and loaded into a DSO (ZSD_D17).
    All the testing is done using only one Sales Document No (VBELN).
    First test that I tried: I created an InfoCube and loaded data from ZCOPA_01 and ZSD_DS18, and when LISTCUBE was run on this InfoCube, the data came up in two different lines, which is not very meaningful. Screenshot below:
    Therefore, I have created another DSO (ZSD_DS17) to Consolidate the data from ZCOPA_01 & ZSD_DS18 establishing mappings between some of common Chars such as below:
    ZCOPA_01                    ZSD_DS18
    0REFER_DOC  <->        0BILL_NUM
    0REFER_ITM    <->        0BILL_ITEM
    0ME_ORDER    <->        0DOC_NUMBER
    0ME_ITEM        <->        0S_ORD_ITEM
    51 records are loaded from ZSD_DS18 into ZSD_DS17 and 4 records are loaded from ZCOPA_01 into ZSD_DS17 for a particular Sales Document Number.
    When I use a write-optimized DSO, the data comes in just one line, but it shows only 4 lines, aggregated, which is as expected since the W/O DSO aggregates the data. However, when I use a standard DSO, the data again splits up into many lines.
    Is there something that I am missing in here while designing the Data Model or does this call for some ABAP being done, if so, where or should I have to talk to the Functional Lead and then Enhance the Standard Extractor, and even if I do that, I would still have to bring in those Key Figures from ZCOPA_01 for my reporting?
    Thank you very much in advance and your help is appreciated.
    Thanks,
    Chandu

    in your (current) InfoCube setup, you could work with "constant selection" on the key figures
    for the COPA key figures, you'll need to add product hierarchy to the key figures with a selection of # and constant selection flagged
    for the SD key figures, you'll have to do the same for customer & the product hierarchy levels (instead of product hierarchy)

  • Generating DSOs by using a ABAP program

    Hi Experts,
    We have a requirement for mass generation of InfoObjects using an Excel file as input. Similarly, we need to create 20 to 30 DSOs by giving an Excel file as input. In that file we mention the DSO technical name, properties, key fields and data fields. Using this information we have to generate the DSOs automatically in the BI system.
    Has anyone implemented this in their project? I searched Google for this, but found no information.
    Please share me the code/Program/Function Module to achieve it.
    Your help is much appreciated.
    Thanks in advance.
    Regards,
    Vijay.

    Hi Vijay,
    You have to use the BAPIs BAPI_ODSO_CREATE and BAPI_ODSO_ACTIVATE.
    I have used these BAPIs in my approach for generating Reporting InfoObjects based on Business Content. Here you have the option to generate a "template" DSO. Please have a look at my document Implementing Reporting InfoObjects based on Business Content. Here you can find sample coding in Class YCL_BW_REP_IOBJ Method GENERATE_TEMPLATE_DSO.
    Best regards,
    Sander

  • DSO - Key Field doubt?

    Hi All,
    I want to know the functionality of the key field in the DSO.
    I know that data is pulled into the DSO based on the key fields (primary key), but please explain the scenario below.
    E.g
    I have 3 records in the Extractor( Generic) and i am pulling data to DSO - Full load.
    Material                 Qty       Date
    100                       20        1-Jan-2010
    100                       30        5-Jan-2010
    100                       10        10-Jan-2010
    In the DSO I have only one key field, 0MAT_NM (Material Number), but I have 3 records in the extractor, as shown above.
    Now my question is:
    On what basis will data be pulled into the DSO?
    Will the latest date's record be pulled into the DSO, or what is the correct behavior? I get material number 100,
    but the date and qty differ (in one run it takes Qty 30 and date 1 Jan 2010; in another run it takes different values).
    Please clear my doubt.
    Regards,
    Nithi.

    In your DSO you can have one record per key. Key figures are set to either the addition or the overwrite option: with 'addition', the value will be the sum of all the values for a given key; with 'overwrite', you only keep the last record written to the DSO. Characteristics are always in overwrite mode, so for them you always keep the last record written to the DSO.
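    For the exact data in the question, the two options can be traced like this (toy Python sketch; it assumes the records arrive in the listed order -- with a full load, the order within a request is not guaranteed, which is why the result can look different from run to run):

```python
# Three extractor records, but the DSO key is material only.
records = [("100", 20, "2010-01-01"),
           ("100", 30, "2010-01-05"),
           ("100", 10, "2010-01-10")]

def load(records, mode):
    """Load into a DSO keyed on material; 'qty' set to overwrite or addition."""
    active = {}
    for material, qty, day in records:
        if mode == "overwrite" or material not in active:
            active[material] = {"qty": qty, "date": day}
        else:  # addition: the key figure sums up, characteristics overwrite
            active[material]["qty"] += qty
            active[material]["date"] = day
    return active
```

    With overwrite, material 100 ends up with qty 10 and date 10-Jan-2010 (the last record loaded); with addition, qty 60 and again the last date, since the date characteristic is always overwritten.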
    M.

  • DB Connect DataSource PSA records and DSO records are not matching...

    Dear All,
    I'm working with SAP NetWeaver BW 7.3 and, for the first time, I have loaded from a DB Connect source system. I created a DataSource, pulled all records, and found 8,136,559 records in the PSA. When I designed and created a DSO with key fields 0CALDAY, Item No and Company Code, it transferred about 8,136,559 records but added only about 12,534. Similarly, the following InfoCube has about 12,534 records in its fact table. When I tried to reconcile the records with the source DBMS for a month, the records could not be matched.
    1. What could be the reason behind the issue? why I am unable to load the records/data correctly?
    2. Have I not mentioned the Key Fields of DSO in a correct manner?
    3. Is it possible to load the records/data into DSO without giving any field as Key Fields?
    4. How should I resolve this issue?
    5. Is it could be of DSO Overwrite and summation function utilization if yes, then how to utilize it?
    Many thanks,
    Tariq Ashraf

    Dear Tariq,
    1. What could be the reason behind the issue? why I am unable to load the records/data correctly?
    Ans: Check the transformation once. Is there any start routine you have used, or direct assignments? What kind of DTP settings have you made?
    Check the messages in the DTP monitor; you will surely find some clue. Also check whether any duplicate records are being detected, and whether you are using semantic keys in your DTP.
    2. Have I not mentioned the Key Fields of DSO in a correct manner?
    Ans: Are the transformation key and the DSO key the same in your case?
    What kind of DSO is it? For example, for a sales order DSO you take the order number as a key field. So you have to define the key fields according to the business semantics, I suppose. Do you agree?
    3. Is it possible to load the records/data into DSO without giving any field as Key Fields?
    Ans: I don't think so, as the keys you define are what ensure unique data records, aren't they?
    4. How should I resolve this issue?
    Ans: Please check the points in Ans 1 above and share your observations.
    5. Is it could be of DSO Overwrite and summation function utilization if yes, then how to utilize it?
    Ans: DSO overwriting of key figures is useful when you have full loads in the picture. Are you always going to perform full loads?
    For reference, you may like to check this thread: Data fields and key fields in DSO
    Lets see what experts give their inputs.
    Thank You...

  • Data Fields & Key Fields IN DSO

    Hi Guys,
    Can any one tel me what exactly key field holds & Data field hold in DSO.
    Thanks,
    Venkatesh

    Hi,
    A DSO serves as a storage location for consolidated and cleansed transaction data or master data on a document (atomic) level. This data can be evaluated using a BEx query.
    A DSO contains
    Key fields (such as document number, document item) and data fields that, in addition to key figures, can also contain character fields (such as order status, customer). The data from a DataStore object can be updated with a delta update into InfoCubes (standard) and/or other DataStore objects or master data tables (attributes or texts) in the same system or across different systems.
    Key fields: these contain only characteristics (for example, document number and item); the combination of key-field values uniquely identifies a record in the DSO.
    Data fields: these contain key figures and can also contain characteristics (for example, order status, customer).
    The overwriting functionality works based on the key fields: a new record with the same key-field combination overwrites the data fields of the existing record.
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f9/45503c242b4a67e10000000a114084/content.htm
    Thanks
    Reddy
