Standard DSO
Hi Experts,
Please answer the basic questions below.
1) We have predefined DataSources, InfoObjects, and DSOs.
For example, if I want to use DataSource 2LIS_02_HDR, do we have a predefined DSO for it?
How do I create it in BW with all the data fields in the DSO? Does the transformation map the fields automatically, or are we supposed to map them manually?
If there is no predefined DSO with the data fields, are we supposed to add the predefined InfoObjects manually?
thank you in advance
Hi,
Please check the forum for the same.
-Vikram
Similar Messages
-
Why in SE16 we can not see New Data Table for standard DSO
Hi,
We say that a standard DSO has three tables (New Data Table, Active Data Table, and Change Log Table), so why can we not see the New Data Table of a standard DSO in SE16?
Regards,
Sushant
Hi Sushant,
It is possible to see the data of all three DSO tables through SE16. Maybe you do not have the authorization to view data through SE16.
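The three-table flow can be sketched roughly as follows (illustrative Python, not SAP code; table and field names are made up, not the real generated /BIC/ table names). It also shows why the new table usually looks empty in SE16: records only sit there between load and activation.

```python
# Minimal sketch of a standard DSO's three tables: records land in the new
# (activation queue) table, and activation merges them into the active table
# by semantic key while writing before/after images to the change log.

def activate(new_table, active_table, change_log, key_fields):
    for record in new_table:
        key = tuple(record[k] for k in key_fields)
        before = active_table.get(key)
        if before is not None:
            # simplified before-image of the old record (reversal, 'X')
            change_log.append({**before, "recordmode": "X"})
        active_table[key] = record
        # after-image of the new/changed record
        change_log.append({**record, "recordmode": ""})
    new_table.clear()  # the new table is emptied after activation

active, log = {}, []
# load the same delivery twice, with a changed GI date the second time
activate([{"delivery": "123", "item": "10", "gi_date": "2014.04.05"}],
         active, log, key_fields=("delivery", "item"))
activate([{"delivery": "123", "item": "10", "gi_date": "2014.04.06"}],
         active, log, key_fields=("delivery", "item"))
print(len(active), len(log))  # 1 active record, 3 change log rows
```

Note that this matches the pattern reported further down in this page: one record in the active table, three in the change log.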
Sankar Kumar -
Hi Gurus,
Can anybody tell me whether we can create a standard DSO without key fields? In what situation is it recommended?
Thanks in advance,
shiv
Hi Prasanth,
The active table has the SID key, the package ID, and the record number, and the change log has the request ID, the package ID, and the record number as its key.
But I'm asking about other key fields like material, customer, etc.
Thanks,
Shiv -
Error while Activating the Standard DSO
Hi,
I am getting the below error while Activating the Standard DSO.
Characteristic 0DPM_DCAS__0CSM_SPRO: DS object 0DPM_DCAS__0CSM_STAT (master data check) does not exist
I tried searching the forum, but didn't find an answer.
Any suggestions are welcome.
Thank you,
Adhvi.
Hi,
Are you getting the error while trying to activate the DSO data after loading, or while trying to activate the DSO itself?
If it happens during the activation of a request, please check whether you have loaded data into the DSO that does not have the corresponding master data loaded. To prevent this, you can set "No Update without Master Data" in the InfoPackage/DTP.
You can also go into RSRV and perform elementary tests for the DSO and check the SID table consistency.
Thanks,
Divya. -
Changing of Write Optimized DSO to Standard DSO
Dear Experts,
I have created a few write-optimized (WO) DSOs based on the requirement, and I have also created a few reports on these WO DSOs.
The problem is that when I create an InfoSet on the WO DSO, a standard DSO, and a cube (three InfoProviders in total), it throws an error when I display data from that InfoSet.
I found out that the problem is with the WO DSO, so I want to change this WO DSO to a standard DSO.
FYI, we are in the development stage only.
If I copy the same DSO and make it a standard DSO, the reports I created on the original will be disturbed, so I am looking for an alternative solution.
Regards,
Phani.
Edited by: Sai Phani on Nov 12, 2010 5:25 PM
Hi Sai,
Write-optimized DSOs always help with optimal data load performance. I am sure you created the WO DSO only to take advantage of this point.
However, WO DSOs are not suitable when it comes to reporting.
So instead of converting the WO DSO to a standard DSO (where you do lose the data load performance advantage), why don't you connect the WO DSO to a standard DSO, i.e. extract data from the WO DSO into a standard DSO and use the standard DSO for reporting? This would give you the benefit during data load as well as reporting.
Cheers
Umesh -
Standard DSO - Write Optimized DSO, key violation, same semantic key
Hello everybody,
I'm trying to load a Write-Optimized DSO from another Standard DSO and then is raised the "famous" error:
During loading, there was a key violation. You tried to save more than
one data record with the same semantic key.
The problematic (newly loaded) data record has the following properties:
o DataStore object: ZSD_O09
o Request: DTPR_D7YTSFRQ9F7JFINY43QSH1FJ1
o Data package: 000001
o Data record number: 28474
I've seen many previous posts regarding the same issue, but none quite the same as mine:
[During loading, there was a key violation. You tried to save more than]
[Duplicate data records at dtp]
...each of them suggests to make some changes in the Semantic Key. Here's my particular context:
Dataflow goes: ZSD_o08 (Standard DSO) -> ZSD_o09 (Write-Optimized DSO)
ZSD_o08 Semantic Keys:
SK1
SK2
SK3
ZSD_o09 Semantic Keys:
SK1
SK2
SK3
SK4 (value is taken in a routine as SY-DATUM-1)
As far as I can see there are no repeated records for the semantic keys in ZSD_o08; this is confirmed by querying the active data table of the ZSD_o08 DSO. Looking at the temporary storage of the crashed DTP for the specific package in error, I can't see anything unusual either.
Let's suppose that the semantic key is crucial as it is currently set.
Could you please advise? I look forward to your quick response. Thank you and best regards,
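For context, the check that raises this error can be sketched like this (illustrative Python, not SAP code). One thing worth noticing: since SK4 is derived in a routine as sy-datum - 1, it is identical for every record in the load, so it adds no uniqueness on top of SK1-SK3 — and if the DTP extracts from the change log rather than the active table, one document can legitimately appear more than once per package.

```python
# Sketch of the duplicate check a write-optimized DSO performs on its
# semantic key within one request/data package.

from datetime import date, timedelta

def check_semantic_key(package, semantic_key):
    """Raise if two records in the same package share the semantic key."""
    seen = set()
    for record in package:
        key = tuple(record[k] for k in semantic_key)
        if key in seen:
            raise ValueError(f"Key violation: duplicate semantic key {key}")
        seen.add(key)

# SK4 comes from a routine as sy-datum - 1: the SAME value for all records
sk4 = (date.today() - timedelta(days=1)).isoformat()
package = [
    {"SK1": "A", "SK2": "B", "SK3": "C", "SK4": sk4},
    {"SK1": "A", "SK2": "B", "SK3": "C", "SK4": sk4},  # e.g. before/after image
]
try:
    check_semantic_key(package, semantic_key=("SK1", "SK2", "SK3", "SK4"))
except ValueError as e:
    print(e)  # key violation raised for the second record
```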
Bernardo
Hi Bernardo:
By maintaining the settings on your DTP you can indicate whether data should be extracted from the Active Table or the Change Log Table, as described below.
>-Double click on the DTP that transfers the data from the Standard DSO to the Write Optimized DSO and click on the "Extraction" Tab, on the group at the bottom select one of the 4 options:
>Active Table (With Archive)
>Active Table (Without Archive)
>Archive (Full Extraction Only)
>Change Log
>Hit the F1 key to access the documentation
>
>===================================================================
>Indicator: Extract from Online Database
>The settings in the group frame Extraction From... or Delta Extraction From... of the Data Transfer Process maintenance specify the source from which the data of the DTP is extracted. For a full DTP, these settings apply to all requests started by the DTP. For a delta DTP, the settings only apply to the first request (delta initialization), since because of the delta logic, the following requests must all be extracted from the change log.
>For Extraction from the DataStore Object, you have the following options:
>Active Table (with Archive)
>The data is read from the active table and from the archive or from a near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
>Active Table (Without Archive)
>The data is only read from the active table. If there is data in the archive or in a near-line storage at the time of extraction, this data is not extracted.
>Archive (Only Full Extraction)
>The data is only read from the archive or from a near-line storage. Data is not extracted from the active table.
>Change Log
>The data is read from the change log of the DataStore object.
>For Extraction from the InfoCube, you have the following options:
>InfoCube Tables
>Data is only extracted from the database (E table and F table and aggregates).
>Archive (Only Full Extraction)
>The data is only read from the archive or from a near-line storage.
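The source-selection rule in the quoted documentation can be condensed into a small sketch (illustrative Python, not SAP code): a full DTP always uses the configured source, while a delta DTP uses it only for the first request (delta init) and must read the change log afterwards.

```python
# Sketch of the DTP extraction-source rule quoted above.

def extraction_source(dtp_mode, configured_source, is_first_request):
    """Return the table a DTP request would actually read from."""
    if dtp_mode == "full":
        return configured_source          # applies to every request
    if dtp_mode == "delta":
        # only the delta init honors the configured source;
        # subsequent requests must come from the change log
        return configured_source if is_first_request else "change_log"
    raise ValueError(f"unknown DTP mode: {dtp_mode}")

print(extraction_source("full", "active_table", False))   # active_table
print(extraction_source("delta", "active_table", True))   # active_table (init)
print(extraction_source("delta", "active_table", False))  # change_log
```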
Have you modified the default settings on the DTP? How is the DTP configured right now? (Or how was it configured before your testing?)
Hope this helps,
Francisco Milán. -
How to preserve data when converting a Standard DSO into a Write Optimized
Hi,
I'm looking for proven strategies for preserving data when converting a standard DSO into a write optimized DSO. The data has to be dropped before the new DSO is transported into the environment.
1. The DSO is currently in synch with a cube,
2. The PSA does not have all the data which is in the DSO.
3. Data volume is incredibly high for a full reload from ECC, so we'd like to avoid that option.
Appreciate any help!
Hi Gregg,
have you considered just deleting the data? I know that sounds simple, but it might be a valid solution.
If the DSO is just receiving new data (e.g. FI documents), you can continue to deliver a logically correct delta to the cube.
Should that not be possible and you really want all data that you currently have in your DSO1 in the write optimized future of it, then how about this:
- Create a new DSO2 same structure as DSO1
- Load all data into that
- Delete all data from your DSO1 and import the transport to make it write-optimized
- Load all data back into your now write-optimized DSO1 from DSO2
The problem you have then is that all the data you have already loaded into your cube is due to be delivered as a delta from DSO1 again.
Depending on your transformation / update rules that might or might not be a problem.
Best,
Ralf -
Is Manual loading to Standard DSO possible?
Dear Mates,
Is there any program or function module for writing data into a standard DSO? For a cube we have a program.
Please help to solve this issue.
Thanks,
Ranganath.
Hi Ranganath,
What I did in the past was create an ABAP program and, using the INSERT statement, write some records directly into a direct-update DSO, but I have not tried this with a standard DSO.
Maybe you can try to write data directly into the active table of the standard DSO with an ABAP report.
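A word of caution on that approach (my own illustration, not SAP code): a direct INSERT into the active table bypasses the change log, so any downstream delta DTP will never see those records. A minimal sketch:

```python
# Sketch of why writing straight into a standard DSO's active table is risky:
# the change log (the delta queue for downstream targets) is bypassed.

active_table = {}   # active data table, keyed by semantic key
change_log = []     # what downstream delta DTPs read

def activate_record(record, key):
    """Normal path: activation writes BOTH tables (simplified)."""
    active_table[key] = record
    change_log.append({**record, "recordmode": ""})

def insert_direct(record, key):
    """'Manual' path: a direct INSERT into the active table only."""
    active_table[key] = record  # change log is NOT updated

activate_record({"doc": "1", "amount": 100}, key=("1",))
insert_direct({"doc": "2", "amount": 200}, key=("2",))

delta_to_cube = list(change_log)  # what a delta DTP would pick up
print(len(active_table), len(delta_to_cube))  # 2 active records, 1 delta record
```

Document "2" exists in the active table but will never be delivered as a delta, which is exactly the kind of inconsistency direct writes can cause.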
Regards,
Durgesh. -
Transformations de-activating after changing write-optimized to standard DSO
Hi experts,
I changed a write-optimized DSO to a standard DSO because of some changes. After the changes were done, I tried to activate the changed transformations of the standard DSO (previously write-optimized), but they would not activate. After looking into the issue, I found it happened
because of 0RECORDMODE; I deleted the 0RECORDMODE rule and activated.
Now the real issue:-
When I transport this changed DSO and its transformations from dev to acceptance test, I am facing this issue; below are the details.
Start of the after-import method RS_TRFN_AFTER_IMPORT for object type(s) TRFN (Activation Mode)
Transformation 0MX1MHVGBT0A0PTA526YRW3O48Q77GOC deleted in version M of DDIC
Transformation 0RKZY5AKAWMJ2ZRKJMSZO791UP6X1F2N deleted in version M of DDIC
Activation of Objects with Type Transformation
Checking Objects with Type Transformation
Checking Transformation 0MX1MHVGBT0A0PTA526YRW3O48Q77GOC
Rule 10 (target: 0RECORDMODE group: 02 Technical Group ): Constant is initial
Checking Transformation 0RKZY5AKAWMJ2ZRKJMSZO791UP6X1F2N
Rule 10 (target: 0RECORDMODE group: 02 Technical Group ): Constant is initial
Saving Objects with Type Transformation
Internal Activation (Transformation )
Preprocessing / Creation of DDIC Objects for Transformation 0MX1MHVGBT0A0PTA526YRW3O48Q77GOC
Preprocessing / Creation of DDIC Objects for Transformation 0RKZY5AKAWMJ2ZRKJMSZO791UP6X1F2N
Please help me rectify this in the dev system so that I can send this transport to acceptance test.
One more thing: even after deleting the 0RECORDMODE rule in the transformation, I can still see it in the technical group.
cheers
vas
Post Processing/Checking the Activation for Transformation 0MX1MHVGBT0A0PTA526YRW3O48Q77GOC
Transformation 0MX1MHVGBT0A0PTA526YRW3O48Q77GOC was activated
Hi,
You can change a write-optimized DSO to a standard DSO.
You are supposed to add the 0RECORDMODE field in the DataSource; check the links below for more information:
error while activating transformations
Re: Transformation error.
Ravi -
How to edit data while loading data from W/O to Standard DSO?
Hello,
I am loading data from a write-optimized DSO to a standard DSO. During activation it errored out due to an SID failure for one InfoObject (the error is due to a lowercase letter), but I can't change the InfoObject setting or the transformation.
Is there any way to edit the data (either in the W/O DSO or in the new data of the standard DSO)?
Thanks and regards,
Himanshu.
Hi,
Please check what setting is maintained in transaction RSKC. If it is set to ALL_CAPITAL, then you must at least change the character setting, write a routine in the transformation, load to PSA and modify the data there (not applicable for BI7), or remove the setting in RSKC (not suggested).
Cheers
Vikram -
Cancel Indicator X record is not updating to Standard DSO
Hi All,
I am working on the SD Deliveries extractor 2LIS_12_VAITM; we have set up the solution similar to an LSA model.
I can see a Delivery 123 has been created and changed in the same day.
Delivery  Item  CreatedOn   GI_DATE
123       10    2014.01.01  2014.04.05
123       10    2014.01.01  2014.04.06
We have a mapping from the DataSource to a standard DSO and to a corporate memory DSO.
For the standard DSO, I can see 1 record in the active table and 3 records in the change log, and the corporate memory DSO has 2 records.
Standard DSO, active table:
123  10  2014.01.01  2014.04.06
Standard DSO, change log:
123  10  2014.01.01  2014.04.06  N
123  10  2014.01.01  2014.04.06  X
123  10  2014.01.01  2014.04.06
Corporate memory DSO:
123  10  2014.01.01  2014.04.05
123  10  2014.01.01  2014.04.06
Question:
Why is the standard DSO not bringing the first record's GI_DATE, which is 2014.04.05?
When I look at the PSA, I can see there are two records, one with cancel indicator 'X' and another with a space.
Now I need the initial GI_DATE in the standard DSO as well.
Could you please provide your suggestions.
Regards,
Naresh.
Hi Naresh,
It could be because of duplicate records in the same package: the latest record was updated to the DSO.
Solution: for your scenario, please try making 0GI_DATE part of the key fields of the DSO, so that you will be able to see 2 records as per your example.
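That suggestion can be illustrated with a minimal sketch (plain Python, not SAP code): with only (delivery, item) as the key, the later image overwrites the earlier one; adding GI_DATE to the key preserves both records.

```python
# Sketch of standard-DSO overwrite behavior under two key definitions.

def load(records, key_fields):
    """Build an active table: later records overwrite earlier ones by key."""
    active = {}
    for r in records:
        active[tuple(r[k] for k in key_fields)] = r
    return active

records = [
    {"delivery": "123", "item": "10", "created": "2014.01.01", "gi_date": "2014.04.05"},
    {"delivery": "123", "item": "10", "created": "2014.01.01", "gi_date": "2014.04.06"},
]

print(len(load(records, ("delivery", "item"))))             # 1 active record
print(len(load(records, ("delivery", "item", "gi_date"))))  # 2 active records
```

The trade-off: with GI_DATE in the key, every date change creates a new record instead of overwriting, so the DSO grows and downstream deltas change meaning.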
Regards,
Banu -
Loading of data in the standard dso
hello everyone,
I loaded data into the standard DSO using a flat file.
After that I changed one record in the file and uploaded it with a delta update.
The thing is, I can't see the changed data in the DSO or in the tables.
Hi,
From a flat file the system doesn't understand delta unless some specific code is written.
Every delta load requires a timestamp, and I guess you are just loading one flat file directly into the system by selecting the delta option in the InfoPackage/DTP. Either you need to write code to make the system understand the timestamp, or you should do a full load in the case of a flat file. It will overwrite everything, you will get the changed records as well, and you will find entries in all the tables (activate the data after the load).
You may also try a pseudo delta by loading only the changed records into the DSO. In this case the load will be full, just with a file containing only the changed records, and they will be added to the existing records.
I hope it will help.
Thanks,
S -
How to make standard DSO available for reporting?
In the previous version I had a BEx setting to make the ODS available for reporting, but in the new DSO I don't see this kind of setting. Any idea?
Actually I want the DSO in Query Designer to create an ad-hoc report.
Thanks in advance.
York
Hi,
That option (BEx reporting) has been taken out of the DSO object settings in this version.
By default, all three types of DSO object provide reporting on them.
See the SAP Help documentation at the following link:
http://help.sap.com/saphelp_nw04s/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm
Hope this helps you.
regards
harikrishna N -
How to improve the activation time for a standard DSO
Hi all,
I'm facing an issue related to the activation of a DSO. The SM37 log is as follows....
12:03:16 Job started
12:03:16 Step 001 started (program RSPROCESS, variant &0000000006946, user ID BWREMOTE)
12:03:20 Attivazione is running: Data target ZFIZIASA, from 93,627 to 93,627
12:18:00 Overlapping check with archived data areas for InfoProvider ZFIZIASA
12:18:00 Data to be activated successfully checked against archiving objects
12:18:02 Status transition 2 / 2 to 7 / 7 completed successfully
12:18:13 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4IC45QJA588GKZ0M7JEJ3HCAR with ID 1218130
12:18:19 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4IC45QJA588GKZ0M7JEJ3HCAR with ID 1218190
12:18:20 Parallel processes (for Attivazione); 000003
12:18:20 Timeout for parallel process (for Attivazione): 000300
12:18:20 Package size (for Attivazione): 020000
12:18:20 Task handling (for Attivazione): Processi batch
12:18:20 Server group (for Attivazione): Nessun gruppo di server config
12:18:20 Activation started (process is running under user BWREMOTE)
12:18:20 Not all data fields were updated in mode "overwrite"
12:18:20 Process started
12:18:20 Process completed
12:18:20 Activation ended
Please have a look at the 3rd and 4th lines in bold. I am not able to analyze where the issue is or what to do to minimize the activation time.
It is very challenging; please reply!
Please help.
Thanks in adv.
Ajay
Hi Kundan,
Thanks for the response!
Actually, I have two identical DSOs, with all the same characteristics and key figures, fed from the same DataSource at the same time, but the issue is...
1) For 1st DSO, Activation log....
01.07.2010 02:02:41 Job started
01.07.2010 02:02:41 Step 001 started (program RSPROCESS, variant &0000000006946, user ID BWREMOTE)
01.07.2010 02:02:46 Attivazione is running: Data target ZFIZIASA, from 93,751 to 93,751
01.07.2010 02:19:27 Overlapping check with archived data areas for InfoProvider ZFIZIASA
01.07.2010 02:19:27 Data to be activated successfully checked against archiving objects
01.07.2010 02:19:27 Status transition 2 / 2 to 7 / 7 completed successfully
01.07.2010 02:19:28 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4ICC6UMXYT00Z2DYOQV8QL8CJ with ID 021
01.07.2010 02:19:30 Parallel processes (for Attivazione); 000003
01.07.2010 02:19:30 Timeout for parallel process (for Attivazione): 000600
01.07.2010 02:19:30 Package size (for Attivazione): 020000
01.07.2010 02:19:30 Task handling (for Attivazione): Processi batch
01.07.2010 02:19:30 Server group (for Attivazione): Nessun gruppo di server config
01.07.2010 02:19:30 Activation started (process is running under user BWREMOTE)
01.07.2010 02:19:30 Not all data fields were updated in mode "overwrite"
01.07.2010 02:19:30 Activation ended
2) For 2nd DSO, the activation log is..
01.07.2010 02:01:13 Job started
01.07.2010 02:01:13 Step 001 started (program RSPROCESS, variant &0000000006947, user ID BWREMOTE)
01.07.2010 02:01:35 Attivazione is running: Data target ZFIGL_02, from 93,749 to 93,749
01.07.2010 02:01:43 Overlapping check with archived data areas for InfoProvider ZFIGL_02
01.07.2010 02:01:43 Data to be activated successfully checked against archiving objects
01.07.2010 02:01:53 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4ICCAR91ALHE1VHSB6GV9PAPV with ID 02015300
01.07.2010 02:01:54 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4ICCAR91ALHE1VHSB6GV9PAPV with ID 02015400
01.07.2010 02:01:56 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4ICCAR91ALHE1VHSB6GV9PAPV with ID 02015600
Now my client is asking me to lower the activation time of the 1st one to match the 2nd one.
I'm totally blank now on what to do and how to do it!
The volume of data ...
ZFIZIASA
Active table - Number of entries: 13.960.508
Change Log Table - Number of entries: 13.976.530
ZFIGL_02
Active Table - Number of entries: 21.923.947
Change Log table - Number of entries: 21.938.657
Thanks,
ajay
Edited by: sap.ajaykumar on Jul 6, 2010 1:47 PM -
Write optimized DSO to standard
Hello,
I converted a write-optimized DSO to a standard DSO. As expected, the transformation became inactive; I made the mapping and tried to activate again, and it throws an error: cannot activate the transformation.
The error its giving is
Syntax error in GP_ERR_RSTRAN_MASTER_TMPL, row 1.181 (-> long text)
Error during generation
Error when activating Transformation ........................
Can anyone please suggest how to activate this transformation?
Cheers,
Vikram
Try the program RSDG_TRFN_ACTIVATE (if it exists in your BW system) to activate the transformation.
Also see whether you can delete the complete DTP and transformation, log in to RSA1 again, and create new ones.
(If possible, also delete this transformation from table RSTRAN after deleting the DTP.)
Also check the following OSS Notes on this:
977922 & 957028