Transformations de-activating after changing write-optimized to standard DSO

Hi experts,
I changed a write-optimized DSO to a standard DSO because of some required changes, but after the changes were done the transformations into the standard DSO (previously write-optimized) would not activate. After digging into the issue, I found it happened because of 0RECORDMODE, so I deleted the 0RECORDMODE rule and activated.
Now the real issue: when I transport this changed DSO and its transformations from development to acceptance test, I face the issue below. Here are the details:
Start of the after-import method RS_TRFN_AFTER_IMPORT for object type(s) TRFN (Activation Mode)
Transformation 0MX1MHVGBT0A0PTA526YRW3O48Q77GOC deleted in version M of DDIC
Transformation 0RKZY5AKAWMJ2ZRKJMSZO791UP6X1F2N deleted in version M of DDIC
Activation of Objects with Type Transformation
Checking Objects with Type Transformation
Checking Transformation 0MX1MHVGBT0A0PTA526YRW3O48Q77GOC
Rule 10 (target: 0RECORDMODE, group: 02 Technical Group): Constant is initial
Checking Transformation 0RKZY5AKAWMJ2ZRKJMSZO791UP6X1F2N
Rule 10 (target: 0RECORDMODE, group: 02 Technical Group): Constant is initial
Saving Objects with Type Transformation
Internal Activation (Transformation)
Preprocessing / Creation of DDIC Objects for Transformation 0MX1MHVGBT0A0PTA526YRW3O48Q77GOC
Preprocessing / Creation of DDIC Objects for Transformation 0RKZY5AKAWMJ2ZRKJMSZO791UP6X1F2N
Post Processing / Checking the Activation for Transformation 0MX1MHVGBT0A0PTA526YRW3O48Q77GOC
Transformation 0MX1MHVGBT0A0PTA526YRW3O48Q77GOC was activated
Please help me rectify this in the development system so that I can get this transport into acceptance test.
One more thing: even after deleting 0RECORDMODE in the transformation, I can still see it in the technical group.
cheers
vas

Hi
You can change a write-optimized DSO to a standard DSO.
You are supposed to add the 0RECORDMODE field in the DataSource; check the links below for more information:
error while activating transformations
Re: Transformation error.
Ravi

Similar Messages

  • IPhone 4 activating after changing memory

    How can I activate an iPhone 4 after changing memory from 32 GB to 16 GB?

    You can use any IMEI-check service; for this IMEI the result is:
    PHONE 4 32GB BLACK
    SIM Locked : No
    Carrier : US AT&T Puerto Rico and US Virgin Islands Activation Policy

  • Activation after changes in structure with errors.

    hi all,
    I have a Z-structure which is included in a transparent Z-table.
    I made some changes to the data element of one of the structure fields (changed from CHAR to NUMC). Now when I activate the structure it shows an error saying "Alter table not possible on the table...".
    Any idea how to proceed from here? Is there some other way to change and activate all the dependent tables?
    All answers will be appreciated.

    Hi,
    You need to go to the tables which use this structure and then use the Database Utility (transaction SE14) to adjust and activate the database tables.
    Also make sure that the programs which use the dependent tables are changed accordingly, and do this activity at a time when the system is not being used much.
    Hope this helps!!
    Regards,
    Lalit

  • Sim not active after changing to 4G

    This new 4G SIM is active with a temporary number, but I need my old number working. The old SIM still works, so I've been using that. This is taking too long. What should I do, EE?

    I am having the same issue. It should take up to 24 hours, but it's now been 48 hours and I'm still waiting for it to migrate over. Very annoying.

  • Problem in activating after changing structure of BAPI once released

    Hi,
    I have created a BAPI which has 2 import structures. The BAPI was released.
    When I changed the component type of a field in one of the structures, it would not allow me to re-activate it.
    Can we not make any changes to the structures of a BAPI once it is released?
    Thanks in advance !
    Anubha
    Edited by: Anubha Pandey on Aug 14, 2008 6:04 AM

    > I have created a BAPI which has 2 import structures. The BAPI was released.
    > When I changed the component type of a field in one of the structures, it would not allow me to re-activate it.
    >
    > Can we not make any changes to the structures of a BAPI once it is released?
    Change the release status to "Obsolete", then change the structures and generate. Now set the status back to "Released".

  • Why can we not see the New Data Table for a standard DSO in SE16

    Hi,
    A standard DSO is said to have three tables (New Data Table, Active Data Table and Change Log Table), so why can we not see the New Data Table of a standard DSO in SE16?
    Regards,
    Sushant

    Hi Sushant,
    It is possible to see the data of all 3 DSO tables through SE16. Maybe you do not have authorization to see data through SE16.
    Sankar Kumar

  • Changing of Write Optimized DSO to Standard DSO

    Dear Experts,
    I have created a few write-optimized (WO) DSOs based on the requirement, and I have also created a few reports on these WO DSOs.
    The problem is that when I create an InfoSet on the WO DSO, a standard DSO and a cube (3 InfoProviders in total included in this InfoSet), it throws an error when I display the data from that InfoSet.
    I came to know that the problem is with the WO DSO, so I want to change this WO DSO to a standard DSO.
    FYI, we are in the development stage only.
    If I copy the same DSO and make it a standard DSO, the reports I created on these will be disturbed, so I want an alternate solution for this.
    Regards,
    Phani.
    Edited by: Sai Phani on Nov 12, 2010 5:25 PM

    Hi Sai
    Write-optimized DSOs always help with optimal data load performance. I am sure you created the WOD only to take advantage of this point.
    However, WODs are not suitable when it comes to reporting.
    So instead of converting the WOD to a standard DSO (where you do lose the data load performance advantage), why don't you connect the WOD to a standard DSO, extracting data from the WOD into the standard DSO and using the standard DSO for reporting? This would give you the benefit during data load as well as reporting.
    Cheers
    Umesh

  • Error occurs while activating a 'Write Optimized' DSO.

    I am getting the error "There is no PSA for InfoSource 'XXXX' and source system 'XXX'" while activating a newly defined DSO object.
    I am able to activate standard DSOs; however, the error occurs while activating a write-optimized DSO.

    Hi,
    For a write-optimized DSO, check whether you have ticked the uniqueness-of-records setting. If you check that, and two identical records come from the source in one load, then you will get an error.
    From SAP help
    You can specify that you do not want to run a check to ensure that the data is unique. If you do not check the uniqueness of the data, the DataStore object table may contain several records with the same key. If you do not set this indicator, and you do check the uniqueness of the data, the system generates a unique index in the semantic key of the InfoObject. This index has the technical name "KEY". Since write-optimized DataStore objects do not have a change log, the system does not create delta (in the sense of a before image and an after image). When you update data into the connected InfoProviders, the system only updates the requests that have not yet been posted.
    Thanks
    Srikanth
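
    The uniqueness check quoted from SAP help above can be illustrated with a short Python sketch. This is not actual BW code; the field names and the check itself are hypothetical stand-ins for the generated unique "KEY" index on the semantic key:

    ```python
    def check_uniqueness(records, key_fields):
        """Return the semantic keys that occur more than once in a load,
        mimicking the unique "KEY" index on a write-optimized DSO."""
        seen = set()
        duplicates = []
        for rec in records:
            key = tuple(rec[f] for f in key_fields)
            if key in seen:
                duplicates.append(key)
            else:
                seen.add(key)
        return duplicates

    # Two records arriving in one load with the same semantic key:
    load = [
        {"DOC_NO": "4711", "ITEM": "10", "AMOUNT": 100},
        {"DOC_NO": "4711", "ITEM": "10", "AMOUNT": -100},
    ]
    print(check_uniqueness(load, ["DOC_NO", "ITEM"]))
    ```

    With the uniqueness check active, the second record would make the load fail; without it, both rows would simply be stored side by side in the active table.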

  • Standard DSO - Write Optimized DSO, key violation, same semantic key

    Hello everybody,
    I'm trying to load a write-optimized DSO from a standard DSO, and then the "famous" error is raised:
    During loading, there was a key violation. You tried to save more than
    one data record with the same semantic key.
    The problematic (newly loaded) data record has the following properties:
    o   DataStore object: ZSD_O09
    o   Request: DTPR_D7YTSFRQ9F7JFINY43QSH1FJ1
    o   Data package: 000001
    o   Data record number: 28474
    I've seen many previous posts regarding the same issue, but none quite matches mine:
    [During loading, there was a key violation. You tried to save more than]
    [Duplicate data records at dtp]
    ...each of them suggests making some changes to the semantic key. Here's my particular context:
    Dataflow goes: ZSD_o08 (Standard DSO) -> ZSD_o09 (Write-Optimized DSO)
    ZSD_o08 Semantic Keys:
    SK1
    SK2
    SK3
    ZSD_o09 Semantic Keys:
    SK1
    SK2
    SK3
    SK4 (value is taken in a routine as SY-DATUM-1)
    As far as I can see there are no repeated records for the semantic keys in ZSD_o08; this is confirmed by querying the active data table of ZSD_o08. Looking at the temporary storage of the crashed DTP, at the specific package for the error, I can't see anything "weird" either.
    Let's suppose that the semantic key is crucial as currently set.
    Could you please advise? I look forward to your quick response. Thank you and best regards,
    Bernardo
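
    One thing worth noting about the SK4 routine described above: a value derived from SY-DATUM - 1 is identical for every record in the same load, so it cannot break ties between records that already share SK1..SK3. A minimal Python sketch (field names taken from the post; the routine itself is a hypothetical stand-in for the ABAP transformation routine):

    ```python
    from datetime import date, timedelta

    def add_load_date(records):
        """Mimic a transformation routine that fills SK4 with SY-DATUM - 1.
        Every record in the same load gets the identical SK4 value, so the
        extra key field cannot distinguish two records that already share
        SK1..SK3."""
        sk4 = date.today() - timedelta(days=1)
        return [dict(rec, SK4=sk4) for rec in records]

    load = [
        {"SK1": "A", "SK2": "B", "SK3": "C"},
        {"SK1": "A", "SK2": "B", "SK3": "C"},  # duplicate already in the source package
    ]
    out = add_load_date(load)
    keys = [(r["SK1"], r["SK2"], r["SK3"], r["SK4"]) for r in out]
    print(len(keys), len(set(keys)))  # both records still collide on the full key
    ```

    So if the source package itself contains two rows with the same SK1..SK3, the key violation will occur regardless of SK4.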

    Hi  Bernardo:
    By maintaining the settings on your DTP you can indicate whether data should be extracted from the active table or the change log table, as described below.
    >Double-click on the DTP that transfers the data from the standard DSO to the write-optimized DSO, click on the "Extraction" tab, and in the group at the bottom select one of the 4 options:
    >Active Table (With Archive)
    >Active Table (Without Archive)
    >Active Table (Full Extraction Only)
    >Change Log
    >Hit the F1 key to access the documentation
    >
    >===================================================================
    >Indicator: Extract from Online Database
    >The settings in the group frame Extraction From... or Delta Extraction From... of the Data Transfer Process maintenance specify the source from which the data of the DTP is extracted.  For a full DTP, these settings apply to all requests started by the DTP. For a delta DTP, the settings only apply to the first request (delta initialization), since because of the delta logic, the following requests must all be extracted from the change log.
    >For Extraction from the DataStore Object, you have the following options:
    >Active Table (with Archive)
    >The data is read from the active table and from the archive or from a near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
    >Active Table (Without Archive)
    >The data is only read from the active table. If there is data in the archive or in a near-line storage at the time of extraction, this data is not extracted.
    >Archive (Only Full Extraction)
    >The data is only read from the archive or from a near-line storage. Data is not extracted from the active table.
    >Change Log
    >The data is read from the change log of the DataStore object.
    >For Extraction from the InfoCube, you have the following options:
    >InfoCube Tables
    >Data is only extracted from the database (E table and F table and aggregates).
    >Archive (Only Full Extraction)
    >The data is only read from the archive or from a near-line storage.
    Have you modified the default settings on the DTP? How is the DTP configured right now (or how was it configured before your testing)?
    Hope this helps,
    Francisco Milán.

  • Write optimized DSO to standard

    Hello,
    I converted a write-optimized DSO to a standard DSO. As expected, the transformation became inactive; I made the mapping and tried to activate it again, and it throws an error: cannot activate the transformation.
    The error it gives is
    Syntax error in GP_ERR_RSTRAN_MASTER_TMPL, row 1.181 (-> long text)
    Error during generation
    Error when activating Transformation ........................
    Can anyone please suggest how to activate this transformation?
    Cheers,
    Vikram

    Try the program RSDG_TRFN_ACTIVATE (if it exists in your BW system) to activate the transformation.
    Also see whether you can delete the complete DTP and transformation, log in to RSA1 again, and create new ones.
    (If possible, also delete this transformation from table RSTRAN after deleting the DTP.)
    Also check the following OSS notes on this:
    977922 & 957028

  • I changed my hard drive and can no longer activate Creative Suite Design Standard CS6 with the serial number provided by Adobe.

    I changed my hard drive and can no longer activate Creative Suite
    Design Standard CS6 with the serial number provided by Adobe.

    After entering your CS6 serial number, select a previous version you own and then enter the previous version's serial number.

  • CTS+ Automated activation of change lists after import

    Hi,
    Is there a way to have a CTS+ transport automatically activated after import?
    The problem is that change lists of the transport user get overlooked. We would like everything to be activated automatically after the import of a CTS+ transport.
    thx holger

    Hi,
    We created a transport with a right mouse click for a complete namespace and transported it to the target system (ESR/IR and Integration Directory). It had to be activated manually (change list of the CTS+ user).
    That is what we want to skip.
    ---> Now that I write this, I remember that this is a "feature". You guys are right -- sometimes you had to check your communication channels etc. and make these specific changes.
    Right?
    (You cannot always remember everything, true? That is why we have SDN as a collective brain.)
    Edited by: Holger Stumm on May 12, 2010 11:18 AM - remembers now

  • Changes to write optimized DSO containing huge amount of data

    Hi Experts,
    We have appended two new fields to a DSO containing a huge amount of data (the new InfoObjects are amount and currency).
    We were able to make the changes in development (with the DSO containing data). But when we tried to transport the changes to our QA system, the transport hung. The transport triggered a job which filled up the logs, so we had to kill the job, which aborted the transport.
    Has anyone had the same experience? Do we need to empty the DSO so we can transport successfully? We really don't want to empty the DSOs, as it will take time to reload.
    Any help?
    Thank you very much for your help.
    Best regards,
    Rose

    Emptying the DSO should not be necessary, neither for a normal DSO nor for a write-optimized DSO.
    What is in the logs; some sort of conversion for all the records?
    Marco

  • Write -Optimized DSO Activation Issue

    Hi Experts,
    Whenever I activate the write-optimized DSO, I get an error like
    "no PSA for InfoSource and source system in BI 7.0".
    Can you please provide a solution for this issue?
    Considerations:
    1. This write-optimized DSO doesn't contain any semantic keys; all fields are taken as data fields.
    2. I tried both checking and unchecking the uniqueness-of-data checkbox.
    Thanking You,
    R.Dama

    Hi Experts,
    I know that only the active data table is available in a WO DSO. I am not trying to add an activation step in a process chain, and I am not trying this at data load time.
    What I am saying is that while creating the WO DSO, we need to activate the DSO, right?
    The error occurs in the initial step while creating the WO DSO.
    Please help me and clarify why I am getting that error message.

  • How to preserve data when converting a Standard DSO into a Write Optimized

    Hi,
    I'm looking for proven strategies for preserving data when converting a standard DSO into a write-optimized DSO. The data has to be dropped before the new DSO is transported into the environment.
    1. The DSO is currently in sync with a cube.
    2. The PSA does not have all the data which is in the DSO.
    3. The data volume is far too high for a full reload from ECC, so we'd like to avoid that option.
    Appreciate any help!

    Hi Gregg,
    Have you considered just deleting the data? I know that sounds simple, but it might be a valid solution.
    If the DSO is just receiving new data (e.g. FI documents), you can continue to deliver a logically correct delta to the cube.
    Should that not be possible, and you really want all the data that you currently have in your DSO1 in its write-optimized future, then how about this:
    - Create a new DSO2 same structure as DSO1
    - Load all data into that
    - Delete all data from your DSO1 and import the transport to make it write optimized
    - Load all data back into you now write optimized DSO1 from DSO2
    The problem you then have is that all the data you have already loaded into your cube is due to be delivered as a delta from DSO1 again.
    Depending on your transformation / update rules that might or might not be a problem.
    Best,
    Ralf
