Write-Optimized DSO Activation Issue

Hi Experts,
Whenever I activate the (write-optimized) DSO, I get an error like
"No PSA for InfoSource and source system in BI 7.0".
Can you please suggest a solution to this issue?
Considerations:
1. This write-optimized DSO does not contain any semantic keys; all fields are taken as data fields.
2. I have tried both checking and unchecking the "Uniqueness of Data" checkbox.
Thank you,
R.Dama

Hi Experts,
I know that only the active data table is available in a write-optimized DSO. I am not trying to add an activation step to a process chain,
and the error does not occur at data load time.
What I am saying is that the problem occurs while creating the write-optimized DSO: we need to activate the DSO after creating it, right?
The error appears at that initial step, while creating the write-optimized DSO.
Please help me and clarify why I am getting that error message.

Similar Messages

  • Write optimized dso activation error

    The dictionary activation module could not activate all objects.
    System Response
    A statement is produced from the activation module log with warnings and error messages:
    "Enhancement category for include or subtype missing."
    Please help me solve this issue.

    Hi,
    We had a similar error during cube activation. We did not change the enhancement category; instead, we deleted the additional indexes that had been created on the dimension tables in SE11.
    Go to SE11 and check the number of indexes created (use the Indexes button on the toolbar). Delete the unwanted indexes and then try to activate again.
    Regards,
    Durgesh.

  • Issue with Write Optimized DSO in BI 7.0

    Hi Guys,
    I have an issue regarding a write-optimized DSO in BI 7.0.
    First, let me explain the scenario.
    We have around 5 master data objects and a source system for loading master data.
    The design is like this:
    There is a DSO which gets the data from 30 master data DataSources and feeds 30 master data objects (InfoObjects).
    The master data load process works as follows: before the new loads run, the previous load requests have to be deleted from this write-optimized DSO, and then the current loads are loaded. This entire process is done through process chains.
    In this regard, everything has been going smoothly. But for the last few days, the deletion of requests from the write-optimized DSO has been showing as failed (red) with an error, even though the previous requests in the DSO were deleted and the new load requests were added in the DSO's Manage tab.
    What I am thinking is that this is merely a front-end problem; if I am wrong, please correct me.
    Please let me know if there are any SAP Notes related to this issue, or some other solution.
    Thanks in advance,
    Peter

    How was this resolved? Could you please share the procedure you followed?

  • Standard DSO - Write Optimized DSO, key violation, same semantic key

    Hello everybody,
    I'm trying to load a write-optimized DSO from another standard DSO, and then the "famous" error is raised:
    During loading, there was a key violation. You tried to save more than
    one data record with the same semantic key.
    The problematic (newly loaded) data record has the following properties:
    o   DataStore object: ZSD_O09
    o   Request: DTPR_D7YTSFRQ9F7JFINY43QSH1FJ1
    o   Data package: 000001
    o   Data record number: 28474
    I've seen many previous posts regarding the same issue, but none is quite the same as mine:
    [During loading, there was a key violation. You tried to save more than]
    [Duplicate data records at dtp]
    ...each of them suggests making some changes to the semantic key. Here's my particular context:
    Dataflow goes: ZSD_o08 (Standard DSO) -> ZSD_o09 (Write-Optimized DSO)
    ZSD_o08 Semantic Keys:
    SK1
    SK2
    SK3
    ZSD_o09 Semantic Keys:
    SK1
    SK2
    SK3
    SK4 (value is taken in a routine as SY-DATUM-1)
    As far as I can see, there are no records with repeated semantic keys in ZSD_o08; this is confirmed by querying the active data table of ZSD_o08. Looking at the temporary storage of the failed DTP, at the specific package in error, I cannot see anything unusual either.
    Let's suppose that the semantic key is crucial as currently set.
    Could you please advise? I look forward to your response. Thank you and best regards,
    Bernardo

    Hi  Bernardo:
    By maintaining the settings on your DTP, you can indicate whether data should be extracted from the active table or the change log table, as described below.
    >Double-click the DTP that transfers the data from the standard DSO to the write-optimized DSO, click the "Extraction" tab, and in the group at the bottom select one of the 4 options:
    >Active Table (With Archive)
    >Active Table (Without Archive)
    >Archive (Full Extraction Only)
    >Change Log
    >Hit the F1 key to access the documentation
    >
    >===================================================================
    >Indicator: Extract from Online Database
    >The settings in the group frame Extraction From... or Delta Extraction From... of the data transfer process maintenance specify the source from which the data of the DTP is extracted. For a full DTP, these settings apply to all requests started by the DTP. For a delta DTP, the settings only apply to the first request (delta initialization), since, because of the delta logic, the following requests must all be extracted from the change log.
    >For Extraction from the DataStore Object, you have the following options:
    >Active Table (with Archive)
    >The data is read from the active table and from the archive or from a near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
    >Active Table (Without Archive)
    >The data is only read from the active table. If there is data in the archive or in a near-line storage at the time of extraction, this data is not extracted.
    >Archive (Only Full Extraction)
    >The data is only read from the archive or from a near-line storage. Data is not extracted from the active table.
    >Change Log
    >The data is read from the change log of the DataStore object.
    >For Extraction from the InfoCube, you have the following options:
    >InfoCube Tables
    >Data is only extracted from the database (E table and F table and aggregates).
    >Archive (Only Full Extraction)
    >The data is only read from the archive or from a near-line storage.
    Have you modified the default settings on the DTP? How is the DTP configured right now? (Or how was it configured before your testing?)
    Hope this helps,
    Francisco Milán.
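    For context, a routine like the one Bernardo describes for SK4 would typically be a transformation field routine along these lines (a minimal sketch; only the routine body is shown, and the DATS target type is an assumption):

        * Field routine body for SK4 (sketch only).
        * sy-datum - 1 yields yesterday's date, so every record in a
        * given load gets the same SK4 value. SK4 therefore adds no
        * uniqueness within one request: two source rows sharing
        * SK1-SK3 (e.g. the before and after image of one change-log
        * entry) still collide on the semantic key.
        RESULT = sy-datum - 1.

    This is also why the extraction source matters here: the change log carries two records per change, while the active table carries only one.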

  • Error while deleting data from a write optimized DSO using a Process Chain?

    Dear Bwers,
    I am facing a strange error while using process chains to delete data from a data target which is a write-optimized DSO. The process shows as failed, but the data does get deleted in the DSO. The error message is below. Did anybody have a similar problem? Any suggestions?
    Thanks
    Raj
    Error while deleting content of InfoCube/DataStore object ZLSD_G03
    Message no. RSDODSO153

    Please check whether you get any short dump in ST22 related to this issue.

  • Write-optimized DSO to Standard DSO loading by delta, request by request, is not working

    Hi All,
    We have a data flow in which the first layer is a write-optimized DSO, from which we then load data to a standard DSO.
    We are using a delta DTP to load the data with the option "Get All New Data Request By Request", but it is not working.
    If I have 2 requests that need to be updated to the standard DSO from the write-optimized DSO, it updates them in a single request. We are on SAP BW 7.01, patch level 005.
    Please suggest.

    Hi Prasanna,
    For reasons of downward compatibility, the system behaves differently for DTPs that were created before SAP NetWeaver 7.0 Support Package Stack 13.
    DTPs for which the indicator is set only get the first new request, even if there is more than one new source request at the time of processing. This restricts the way in which these DTPs can be used in process chains, because requests accumulate in the source and the target may not contain the current data. The "Retrieve Until No More New Data" indicator is therefore displayed for these DTPs.
    I suggest you set this indicator and activate the DTP.
    If you set the Retrieve Until No More New Data indicator and then activate the DTP, once it completes processing, a DTP request checks whether there are any further requests in the source. If the source contains more requests, a new DTP request is automatically generated and processed. Once the DTP is activated, the indicator is no longer visible in DTP maintenance.
    This applies when the DTP is started by a process chain and also when you start the DTP directly from DTP maintenance.
    The indicator is set by default for new DTPs that get data request by request.
    If you do not select the indicator, the label for the Get All New Data Request By Request indicator changes to Get One Request Only. Once the DTP is activated, only this indicator is displayed in DTP maintenance.
    Regards,
    Sudheer.

  • Missing PARTNO field in Write Optimized DSO

    Hi,
    I have a write-optimized DSO for which the partition has been deleted (reason unknown) in the Dev system.
    For the same DSO, the partition parameters exist in QA and production.
    Now, while transporting this DSO to QA, I am getting the error "Old key field PARTNO has been deleted", and the DSO could not be activated in the target system.
    Please let me know how I can re-insert this technical key PARTNO into my DSO.
    I presume it has something to do with the partitioning of the DSO.
    Please help.

    Hi,
    Since the write-optimized DataStore object only consists of the table of active data, you do not have to activate the data, as is necessary with the standard DataStore object. This means that you can process data more quickly.
    The loaded data is not aggregated; the history of the data is retained. If two data records with the same logical key are extracted from the source, both records are saved in the DataStore object. The record mode responsible for aggregation remains, however, so that the aggregation of data can take place later in standard DataStore objects.
    The system generates a unique technical key for the write-optimized DataStore object. The standard key fields are not necessary with this type of DataStore object. If standard key fields exist anyway, they are called semantic keys so that they can be distinguished from the technical keys. The technical key consists of the Request GUID field (0REQUEST), the Data Package field (0DATAPAKID) and the Data Record Number field (0RECORD). Only new data records are loaded to this key.
    You can specify that you do not want to run a check to ensure that the data is unique. If you do not check the uniqueness of the data, the DataStore object table may contain several records with the same key. If you do not set this indicator and you do check the uniqueness of the data, the system generates a unique index on the semantic key of the InfoObjects. This index has the technical name "KEY". Since write-optimized DataStore objects do not have a change log, the system does not create deltas (in the sense of a before image and an after image). When you update data into the connected InfoProviders, the system only updates the requests that have not yet been posted.
    PS: Excerpt from http://help.sap.com/saphelp_nw2004s/helpdata/en/b6/de1c42128a5733e10000000a155106/frameset.htm
    Hope this helps.
    Best Regards,
    Rajani
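    To illustrate the technical key described above, it can be inspected directly on the DSO's active table (a sketch only; the table name /BIC/AZWODSO100 is hypothetical, following the /BIC/A<DSO name>00 pattern, as in the example later in this thread where DSO DTFISO02 has active table /BIC/ADTFISO0200):

        REPORT z_show_wodso_techkey.
        * Sketch: list the generated technical key columns of a
        * write-optimized DSO's active table. The table name below is
        * an assumption for illustration; substitute your own DSO's
        * active table.
        DATA: lt_rows TYPE STANDARD TABLE OF /bic/azwodso100,
              ls_row  LIKE LINE OF lt_rows.

        SELECT * FROM /bic/azwodso100 INTO TABLE lt_rows UP TO 10 ROWS.

        LOOP AT lt_rows INTO ls_row.
          " 0REQUEST / 0DATAPAKID / 0RECORD appear as these columns:
          WRITE: / ls_row-request, ls_row-datapakid, ls_row-record.
        ENDLOOP.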

  • Stack Dump while creating Write Optimized DSO on BW 7.3, MS SQL - any SAP Notes?

    While attempting to create a write-optimized DSO on BW 7.3 on MS SQL, we had a stack dump in ABAP program CL_RSD_ODSO_VERS==============CP with the short text "Access using NULL object reference is not possible."
    Has anyone seen any SAP Notes on this subject?

    Hi John,
    Was any field deleted from the DSO? If yes, the system is probably not able to resolve the reference from the DSO table. Errors like these come up very often when fields are deleted from a DSO. The solution is to remove the field manually from the table as well. The system is supposed to do this automatically, but sometimes such ambiguity causes these issues.

  • Unable to delete request from write-optimized DSO (Error during rollback)

    Hi Gurus,
    I am trying to delete a delta request from a write-optimized DSO. This request was uploaded with a DTP from another write-optimized DSO.
    The current overall status of the request is RED, and the description of that status is now: 'Error during rollback of request DTPR_4JW6NLSVDUNYY3GTD6F4DQJWR; only rollback allowed'.
    I checked the log of all request operations in the DataStore (from the same line where the red request is now), and I can see my several attempts to delete this request under a RED radio button with the title Rollback. The details for this error are the following:
    Could not delete request data from active table
    Message no. RSODSO_ROLLBACK114
    Diagnosis
    The system could not delete the request data from the active table of a write-optimized DataStore object.
    System Response
    Write-optimized DataStore object: DTFISO02
    Active table: /BIC/ADTFISO0200
    Request: DTPR_4JW6NLSVDUNYY3GTD6F4DQJWR
    Procedure
    Search for Notes containing the key words "Delete write-optimized DSO PSA"
    I am relatively new to SAP BI 7.0 and I do not know how to delete this request. Any help will be highly appreciated!
    Leticia

    Hi Leticia:
    Take a look at the SAP Notes below.
    Note 1111065 - "701: Delta consistency check for write-optimized DSOs"
    Note 1263877 - "70SP20: Delta consistency check for write-optimized DSOs"
    Note 1125025 - "P17:PSA:DSO:ODSR missing in PSA process for write-opt. DSO"
    Additionally, some ideas from the alternative presented in the blog post by KMR might help you:
    "How to generate a selective deletion program for info provider"
    Regards,
    Francisco Mílán.

  • Write optimized DSO to standard

    Hello,
    I converted a write-optimized DSO to a standard DSO. As expected, the transformation became inactive; I redid the mapping and tried to activate it again, and it throws an error saying the transformation cannot be activated.
    The error it gives is:
    Syntax error in GP_ERR_RSTRAN_MASTER_TMPL, row 1.181 (-> long text)
    Error during generation
    Error when activating Transformation ........................
    Can anyone please suggest how to activate this transformation?
    Cheers,
    Vikram

    Try the program RSDG_TRFN_ACTIVATE (if it exists in your BW system) to activate the transformation.
    Also see whether you can delete the complete DTP and transformation, log back in to RSA1, and create new ones.
    (If possible, also delete this transformation from table RSTRAN after deleting the DTP.)
    Also check the following OSS Notes on this:
    977922 & 957028
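    If you go the RSDG_TRFN_ACTIVATE route, it can be run directly in SE38, or submitted from a wrapper roughly as below (a sketch; the selection parameter name TRANID is an assumption, so verify the program's actual selection screen on your release first):

        * Sketch only: submit the repair program for one transformation.
        * The parameter name TRANID and the 32-character transformation
        * UUID shown here are placeholders, not verified values.
        SUBMIT rsdg_trfn_activate
          WITH tranid = '0123456789ABCDEF0123456789ABCDEF'
          AND RETURN.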

  • Semantic group for a write-optimized DSO

    Hello,
    I am loading from a DataSource to a write-optimized DSO. While doing so, I am not able to set a semantic key/group for the DTP. I have turned error handling on using the setting 'Valid Records Update, No Reporting (Request Red)'.
    Is this setting not possible for a write-optimized DSO? If not, why not? And if it is, what am I missing?
    Thanks a lot in advance,
    Prakash

    Thank you,
    Here are my findings:
    - If data has been loaded with delta using the DTP, it is no longer possible to change the semantic groups; all requests have to be deleted first.
    - Error handling must be turned on.
    The issue is resolved.
    Marek

  • Delta loading procedure from Write Optimized DSO to Infocube

    Hi All,
    We are using a write-optimized DSO in our project, to which I am loading data from the standard DSO 0FI_GL_12.
    From the write-optimized DSO, we are loading delta records into an InfoCube. Please provide your input on the following questions:
    1) I am quite interested to know how the delta records get loaded into the InfoCube when we are using a write-optimized DSO, as there is no image concept in a write-optimized DSO.
    For example, with a standard DSO we have the change log table, and the image concept allows the updated value to reach the cube.
    Let us assume:
    Active Table
    111            50
    111            70 (overwrite)
    Change Log Table
    111            -50        (X -- before image)
    111             70        (' ' -- after image; the record mode symbol for an after image is space)
    So if we load this to the target as a delta, the above two records from the change log table get loaded to the CUBE, and the cube will have 70 as the updated value.
    If I am using a write-optimized DSO:
    Active Table
    111            50
    111            70 (overwrite)
    When these records are loaded to the cube, since an InfoCube is always additive, the total value will be 50 + 70 = 120, which is wrong.
    Correct me: what mechanism will work here to get the updated value of 70 into the cube from the write-optimized DSO?
    2) Since the DataSource is delta-capable and has an 'ADDITIVE' delta process, are only the delta records (based on request ID) loaded into the InfoCube with the updated key figure value?
    Thanks for your inputs and much appreciated.
    Regards,
    Madhu

    Hi Madhu,
    As a best practice, we use a write-optimized DSO in the initial layer and then a standard DSO, purely for mass data load/staging purposes.
    Best practice: DataSource ----> write-optimized DSO ---> standard DSO.
    In your case: DataSource ----> standard DSO -----> write-optimized DSO.
    In both cases, if the data load design is not accurate, your cube will have incorrect entries.
    For example, today at 9 am the active table holds: 111, 50.
    You load data to the cube the same day at 11 am: the cube will then have 111, 50.
    The same day at 1 pm, the value changes in the standard DSO: 111, 70 (overwrite in the active table).
    If you load data to the cube again the same day or the next day, it will have 2 records, one with value 50 and the other with 70. To avoid such scenarios we should plan the loads carefully; otherwise, change your DTP settings so that the source table is the change log table.
    Coming to your case:
    After the load to the standard DSO, load data to the write-optimized DSO after changing the DTP setting 'Delta Init. Extraction From' to Change Log.
    Now the data is available in the write-optimized DSO from the change log table; then you load it to the cube in delta mode.

  • Write-Optimized DSO Problem

    Hi Friends,
    I have to create a process chain for a write-optimized DSO. I know that every day we have to delete the data from both the data target and the PSA, but I am confused about how to set up the daily deletion of data from the PSA of my DataSource.
    can any body help me in this issue
    Thanks and Regards
    pedamarla

    Hi,
    While creating the process chain, select the 'Delete PSA' process type, give the DataSource name, and schedule it for a daily run. After that, run the load through the process chain.
    Cheers...
    Puneesh

  • Unable to delete data target contents of Write-Optimized DSO in Process Chain

    Hi Experts,
    We are using SAP NetWeaver BW 7.01, and we need to delete the entire data target contents of a write-optimized DSO in the process chain before the next data load.
    I included this step in the process chain, but it fails with the error message "Message not found (in main memory), Drop Cube Failed In Data Target".
    This process type worked in BW 7.0 but does not work in BW 7.01.
    However, I found that we can use the program RSSM_DELETE_WO_DSO_REQUESTS to delete old requests in a write-optimized DSO as of BW 7.01 SP07, per SAP Note 1437407. Still, it does not work even after implementing this program, because the prerequisite for deleting a request is that its data mart status be updated, which does not happen for the program.
    There is a process type to delete the requests from a write-optimized DSO directly in BW 7.3, but it is not available in 7.01.
    Could you please suggest how to resolve this issue in BW 7.01?
    Many thanks for your help in advance.
    Regards,
    Madhu

    Create an ABAP program as in the attached code; a sketch of such a program follows below.
    You can then use that ABAP program in process chains through an ABAP variant.
    The ABAP variant should have the following properties:
    Select call mode 'Synchronous', called from 'Local', and type 'Program'.
    Give your ABAP program name in "program name" and create one program variant for each write-optimized DSO.
    Please refer to "How to use an ABAP program in process chains" for further details.
    Hope this helps
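    The original attachment is not reproduced in this thread. As a rough illustration only, such a program might simply wrap the clean-up report from SAP Note 1437407 (the selection parameter names p_dso and p_days below are assumptions; check the program's selection screen in SE38 on your release, and note the data mart status prerequisite mentioned above):

        REPORT z_del_wodso_requests.
        * Sketch of a wrapper suitable for a process-chain ABAP variant.
        * It submits the clean-up program delivered with SAP Note
        * 1437407; the parameter names and the DSO name are placeholders.
        SUBMIT rssm_delete_wo_dso_requests
          WITH p_dso  = 'ZWODSO01'   " write-optimized DSO (hypothetical)
          WITH p_days = 0            " retention in days; 0 = delete all (assumed)
          AND RETURN.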

  • Disk throughput drops when inserting data packages in write-optimized DSO

    Hi all,
    we are currently testing our freshly installed SAN.
    To see the performance gain in BI, I'm currently doing some test loads,
    and while monitoring those loads I noticed something I'd like someone to explain :-):
    I execute a DTP from the PSA to a write-optimized DSO.
    The n° of parallel processes = 9
    Update method = serial extraction, immediate parallel processing
    N° of records transferred: over 23,000,000
    OK, in the first phase (reading the PSA), only one process is used (serial extraction). When I look in OS07, I notice we have very good throughput: over 66,000 TransfKB/s. Very nice!
    But as soon as BI starts inserting the data packages and parallel processing kicks in, the throughput drops to 4K or so, and sometimes we get 20K at most. That's not too good.
    We have a massive SAN, but the BI system does not seem to use it.
    I was wondering why this is the case. I have already toyed around with the package size, but it's always the same.
    I also noticed that the allocated processes don't all seem to be active. I allocated 9 BTC processes to this load.
    They are all used, but we only see 3 inserts at the same time, at most. In the DTP monitor, too, only 3 packages are processed at the same time. As it's a write-optimized DSO, I presume RSODSO_SETTINGS does not apply.
    Any ideas?
    Any ideas?
    tnx!

    Hi,
    Can you please try applying some filter in the DTP and pulling the data again?
    I am not sure why the first data package takes a long time while the other data packages take less time.
    Do you have any start routine, e.g. with "IF datapak = 1, then do this logic"?
    Please check.
    regards
    Gopal
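    For reference, the start routine pattern Gopal is asking about looks roughly like this (a sketch; the datapakid parameter name follows the usual BW 7.x start routine signature, but your generated routine may differ):

        * Start routine fragment (sketch only): expensive one-off work
        * guarded to run for the first data package alone, which makes
        * package 1 noticeably slower than the following packages.
        IF datapakid = 1.
          " e.g. read a large lookup table into an internal table here
        ENDIF.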
