Issue in transformation from DSO to Cube
Dear Expert,
I have data in DSO in the following format:-
0EMPLOYEE 0WAGETYPE 0DATEFROM 0DATETO 0AMOUNT
123456 0100 01.01.2010 30.04.2010 5000
246991 0200 01.02.2010 30.05.2010 8000
This data then moves on to the cube through a transformation.
The cube should be populated with the above data in the following way:-
0EMPLOYEE 0WAGETYPE 0CALMONTH 0AMOUNT
123456 0100 01.2010 5000
123456 0100 02.2010 5000
123456 0100 03.2010 5000
123456 0100 04.2010 5000
246991 0200 02.2010 8000
246991 0200 03.2010 8000
246991 0200 04.2010 8000
246991 0200 05.2010 8000
Please suggest how to achieve this. If it can be solved using a start routine, then please provide sample code.
Your help will be appreciated.
Regards,
Prakash
Hi,
Pseudo code for the start routine should look like this:
startcalmonth = source-datefrom(6).   " YYYYMM part of DATEFROM
endcalmonth   = source-dateto(6).     " YYYYMM part of DATETO
while startcalmonth <= endcalmonth.
  result-pernr    = source-pernr.
  result-wagetype = source-wagetype.
  result-calmonth = startcalmonth.
  result-amount   = source-amount.
  append result to the result table.
  " advance one month, rolling over into the next year after month 12
  if startcalmonth+4(2) = '12'.
    startcalmonth(4)   = startcalmonth(4) + 1.
    startcalmonth+4(2) = '01'.
  else.
    startcalmonth+4(2) = startcalmonth+4(2) + 1.
  endif.
endwhile.
Note the 0CALMONTH assignment and the year rollover: simply adding 1 to the month would fail for ranges that cross December.
And it should work as expected.
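For cross-checking, the same month-expansion logic can be sketched in Python (the tuple layout mirrors the DSO row; the function and field names are illustrative, not actual BW structures):

```python
from datetime import date

def expand_to_months(row):
    """Expand one DSO row into one record per calendar month
    between DATEFROM and DATETO (inclusive)."""
    employee, wagetype, datefrom, dateto, amount = row
    year, month = datefrom.year, datefrom.month
    out = []
    while (year, month) <= (dateto.year, dateto.month):
        out.append((employee, wagetype, f"{month:02d}.{year}", amount))
        # advance one month, rolling over into the next year after December
        month += 1
        if month > 12:
            month, year = 1, year + 1
    return out

# first example row from the original post
rows = expand_to_months(("123456", "0100", date(2010, 1, 1), date(2010, 4, 30), 5000))
for r in rows:
    print(r)  # 4 records: 01.2010 through 04.2010, amount 5000 each
```

Running this against the second example row (01.02.2010 to 30.05.2010) likewise yields four records, 02.2010 through 05.2010.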
Hope that helps.
Regards
Mr Kapadia
*Assigning points is the way to say thanks.*
Similar Messages
-
Issue when uploading Sales data from DSO to Cube.
Dear All,
I have an issue when uploading sales data from DSO to cube. I am using BI 7.0 and have uploaded all sales-document-level data to my DSO. I then use a transformation rule to calculate the sales value when running the DTP to the cube. The cube holds customer-wise aggregated data.
In the DSO I have NetPrice (KF) and Delivered_QTY (KF), and I do a simple multiplication routine in the transformation from DSO to cube:
RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY .
At the moment I read from the active table (without archive) of the DSO, since this is my first load.
The issue is that the figure (sales value) in the cube is incorrect: I am getting very large values that are impossible.
Can someone please help me?
Shanka
Hi,
Are you sure that the cube has customer-wise aggregated data? It will always aggregate the key figure values for the same set of characteristics.
Did you check the other key figures as well: are they also inflated, or is the problem with this key figure only?
During the data load, the records may first be aggregated for the same characteristic values, and only then is the multiplication applied. If that is the case, you have to multiply the values before they are aggregated in the data package and then let the result aggregate; this can be achieved through a start routine.
But first verify whether other key figures show the same issue.
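A small Python sketch (with hypothetical document-level records) shows why aggregating before the multiplication inflates the result:

```python
# Two document lines for the same customer: price 10 x qty 2 and price 10 x qty 3.
records = [
    {"customer": "C1", "net_price": 10.0, "dlv_qty": 2.0},
    {"customer": "C1", "net_price": 10.0, "dlv_qty": 3.0},
]

# Correct: multiply per record, then aggregate (what a start routine would do).
value_then_sum = sum(r["net_price"] * r["dlv_qty"] for r in records)

# Wrong: aggregate the key figures first, then multiply once.
sum_price = sum(r["net_price"] for r in records)
sum_qty = sum(r["dlv_qty"] for r in records)
sum_then_value = sum_price * sum_qty

print(value_then_sum)   # 50.0
print(sum_then_value)   # 100.0 -- inflated, like the figure seen in the cube
```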
Thanks
Ajeet -
New field added to cube, delta DTP from DSO to cube is failing
Dear all,
Scenerio in BI 7.0 is;
data source --delta InfoPackages--> DSO --delta DTP--> Cube.
Data is loaded via a daily process chain.
We added a new field to the cube; the transformation from DSO to cube is active and the transport was successful.
Now the delta from DSO to cube is failing.
Error is: Dereferencing of the NULL reference,
Error while extracting from source <DSO name>
Inconsistent input parameter (parameter: Fieldname, value DATAPAKID)
My conclusion: the system is unable to load the delta due to the new field, and it wants us to initialize again (am I right?).
Do I have only the one choice of deleting data from the cube and performing the init DTP again, or is there another way?
Thanks in advance!
Regards,
Akshay Harshe
Hi Durgesh / Murli,
Thanks for quick response.
@Durgesh: we have mapped an existing DSO field to a new field in the cube, so yes, I can see the field in the DTP filter. So I have to do a re-init.
@Murli: everything is active.
Actually there are further complications, as the cube has many more sources, so I wanted to avoid selective deletion.
Regards,
Akshay -
Semantic Partitioning Delta issue while load data from DSO to Cube -BW 7.3
Hi All,
We have created the semantic partitioning with the help of a BAdI to perform the partitions. Everything looks good.
The first time, I loaded the data with a full update.
The second time, I initialized the delta and pulled the delta from DSO to cube. The DSO is standard, whereas the cube is partitioned via semantic partitioning. What I can see is that only the records updated in the latest delta show up in the report; all the rest are ignored by the system.
I tried compression, but it still did not work.
Has anyone faced this kind of issue?
Thanks
Yaseen & Soujanya,
It is very hard to guess the problem with the amount of information you have provided.
- What do you mean by cube is not being loaded? No records are extracted from DSO? Is the load completing with 0 records?
- How is data loaded from DSO to InfoCube? Full? Are you first loading to PSA or not?
- Is there data already in the InfoCube?
- Is there change log data for DSO or did someone delete all the PSA data?
Since there are so many reasons for the behavior you are witnessing, your best option is to approach your resident expert.
Good luck.
Sudhi Karkada
<a href="http://main.nationalmssociety.org/site/TR/Bike/TXHBikeEvents?px=5888378&pg=personal&fr_id=10222">Biking for MS Relief</a> -
Data is not updating from DSO to CUBE
Hi Friends,
Can any one please Guide me.
Data has been updating up to the ODS, but not from this lower-level ODS to the cube on top since last week, and it displays the error: Exception condition "DATE_INVALID" raised.
How can I find out which date field is receiving a wrong date, or no date at all? Can anybody explain how to analyze the possibilities in this case to find the error and a solution? I need to solve this today; please guide me.
BR,
Venkat
Hi,
Check the transformations between the DSO and the cube for any inconsistencies. If you find none, then follow this link:
http://www.consolut.com/en/s/sap-ides-access/d/s/doc/F-DATE_CONVERT_TO_FACTORYDATE
this may help you. -
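As a rough illustration of the kind of check the DATE_INVALID dump calls for, here is a Python sketch that flags date values that do not parse in the internal YYYYMMDD format (the record layout and field names are hypothetical):

```python
from datetime import datetime

def find_invalid_dates(records, date_fields):
    """Return (record_index, field, value) for every date value that
    does not parse as YYYYMMDD, the internal SAP date format."""
    bad = []
    for i, rec in enumerate(records):
        for field in date_fields:
            value = rec.get(field, "")
            try:
                datetime.strptime(value, "%Y%m%d")
            except ValueError:
                bad.append((i, field, value))
    return bad

records = [
    {"DATEFROM": "20100101", "DATETO": "20100430"},
    {"DATEFROM": "20100230", "DATETO": "00000000"},  # Feb 30 and a zero date
]
bad = find_invalid_dates(records, ["DATEFROM", "DATETO"])
print(bad)  # flags both values in the second record
```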
Issue in Data from DSO to DSO Target with different Key
Hello All,
I am having an issue loading data from a DSO to a DSO target with a different key.
The source DSO has Employee + Wage + Post number as its key; the target has Employee + Wage Type as its key.
DSO semantic grouping works like a GROUP BY clause in SQL; is my understanding right?
Also if someone can explain this with a small example, it would be great.
Many Thanks
Krishna
Dear, as explained earlier, your issue has nothing to do with semantic grouping.
Semantic grouping is only useful when you have written a routine in the transformation for calculations, and for error handling.
Please go through this blog, which explains the use of semantic groups very clearly:
http://scn.sap.com/community/data-warehousing/netweaver-bw/blog/2013/06/16/semantic-groups-in-dtp
Now coming to your above question
DSO 1
Employee WageTyp Amount
100 aa 200
200 aa 200
100 bb 400
100 cc 300
If we have a semantic group on Employee, Employee as the key of the target DSO, and update type summation,
then the target DSO will have:
Emp Amount
100 900
200 200
In this case the wage type comes from whichever record arrives last in the data package. If the record 100 cc 300 arrives last, then the wage type will be cc.
2) Case 2
DSO 1
Employee WageTyp Amount
100 aa 200
200 aa 200
100 bb 400
100 aa 300
If we do semantic grouping on Employee and Wage Type, and have Employee and Wage Type as the key of the target DSO with update type summation,
then the target DSO will have:
Emp Wage Amount
100 aa 500
200 aa 200
100 bb 400
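Both cases can be reproduced with a small Python sketch of the load semantics, summation for the key figure and last-record-wins for non-key characteristics (the function and field names are illustrative):

```python
from collections import OrderedDict

def load_with_summation(rows, key_fields):
    """Mimic a DSO load: group by the target key fields, sum AMOUNT,
    and let non-key characteristics be overwritten by the last record."""
    target = OrderedDict()
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key not in target:
            target[key] = dict(row)
        else:
            target[key]["AMOUNT"] += row["AMOUNT"]
            for f in row:
                if f not in key_fields and f != "AMOUNT":
                    target[key][f] = row[f]  # last record wins
    return list(target.values())

rows_case1 = [
    {"EMP": "100", "WAGE": "aa", "AMOUNT": 200},
    {"EMP": "200", "WAGE": "aa", "AMOUNT": 200},
    {"EMP": "100", "WAGE": "bb", "AMOUNT": 400},
    {"EMP": "100", "WAGE": "cc", "AMOUNT": 300},
]
rows_case2 = [
    {"EMP": "100", "WAGE": "aa", "AMOUNT": 200},
    {"EMP": "200", "WAGE": "aa", "AMOUNT": 200},
    {"EMP": "100", "WAGE": "bb", "AMOUNT": 400},
    {"EMP": "100", "WAGE": "aa", "AMOUNT": 300},
]

# Case 1: Employee alone is the target key.
case1 = load_with_summation(rows_case1, ["EMP"])
print(case1)
# Case 2: Employee + Wage Type are the target key.
case2 = load_with_summation(rows_case2, ["EMP", "WAGE"])
print(case2)
```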
Hope this helps . -
Data load from DSO to Cube in BI7?
Hi All,
We just migrated a data flow from 3.x to 7.0 in development and transported it to production. Until now, the data loads in production have happened using InfoPackages:
a. InfoPackage 1 from the DataSource to the ODS, and
b. InfoPackage 2 from the ODS to the cube.
Now after we transported the migrated dataflow to production, to load the same infoproviders I use
1. Infopackage to load PSA.
2. DTP1 to load from PSA to DSO.
3. DTP2 to load from DSO to CUBE.
Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. However, when I try step b (above), the InfoPackage loads the cube fine. So I am unable to understand why the DTP fails while the InfoPackage load succeeds. To use the DTP, do we need to do any cleanup when running it for the first time? Please let me know if you have any suggestions.
Please note that the DSO already has data loaded via InfoPackage. (Is this causing the problem?)
Thanks,
Sirish.
Hi Naveen,
Thanks for the Reply. The creation of DTP is not possible without a transformation.
The transformation has been moved to production successfully. -
When I extracting data from DSO to Cube by using DTP.
When I extract data from a DSO to a cube using a DTP, I get the following errors:
Data package processing terminated (Message no. RSBK229).
Error in BW: error getting datapakid cob_pro (Message no. RS_EXCEPTION105).
Error while extracting from source 0FC_DS08 (type DataStore) - (Message no. RSBK242).
Data package processing terminated. (Message no. RSBK229).
Data package 1 / 10/04/2011 15:49:56 / Status 'Processed with Errors'. (Message no. RSBK257).
This is a brand new BI 7.3 system, implementing PSCD and TRM.
I have used the standard Business Content objects in FI-CA (dunning history header, items, activities) and the standard DataSources (0FC_DUN_HEADER, 0FC_DUN_ITEMS, 0FC_DUN_ACTIVITIES). I have extracted data up to the DSO level, but when I try to pull the data to the InfoProvider level (cube) using a DTP, I get the above errors.
My observation: whenever I use a DSO as the source for any target, another DSO or a cube, the same kind of error is thrown for any flow, including a simple flat-file one.
Please suggest whether I need to maintain any basic settings, since this is a brand new BI 7.3 installation.
Please help me out on this issue; I am not able to move forward and it is very urgent.
Hello,
Have you solved the problem ?
I have the same error. If you have solved it, can you help me please?
yimi castro garcia
[email protected] -
How to load request by request from DSO to Cube
Hi All,
I am loading data from a DSO to a cube. In the DSO, each request has some 50 lakh records. When I execute the cube transformation, after some time it gives a timeout error (Basis has set the limit to 25 minutes). I also tried the DTP option to get all new data request by request, but that option does not get the proper data either; again I get the timeout error.
If anyone knows any other option please let me know.
Thanks for your help in advance
Regards
Sathiya
Hi,
You can load based on the key fields of the DSO.
Or selecting fiscal periods may help.
Hope this helps,
Regards,
anil -
Filter in DTP load from DSO to cube by Forecast version not working ?
Hi All
Can anyone help with the code below? I wrote it to filter the data updated from DSO to cube in the DTP, so that only the next period's forecast version is loaded. The code is not working: it also loads data for the present forecast version. Can anyone help, please?
data: l_idx like sy-tabix.
data: l_date  type sy-datum,
      t_gjahr type t009b-bdatj,
      t_buper type t009b-poper,
      l_period(6) type c.

read table l_t_range with key fieldname = 'ZFCSTVERS'.
l_idx = sy-tabix.

clear: t_buper, t_gjahr.
l_date = sy-datum.

call function 'DATE_TO_PERIOD_CONVERT'
  exporting
    i_date  = l_date
    i_periv = 'Z1'
  importing
    e_buper = t_buper
    e_gjahr = t_gjahr.

*---> If the period is 012, increase the year by 1 and set the period to 001.
if t_buper = '012'.
  t_gjahr = t_gjahr + 1.
  t_buper = '001'.
else.
*---> Otherwise increase just the period by 1.
  t_buper = t_buper + 1.
endif.

concatenate t_gjahr t_buper+1(2) into l_period.

l_t_range-fieldname = 'ZFCSTVERS'.
l_t_range-low       = l_period.
l_t_range-sign      = 'I'.
l_t_range-option    = 'EQ'.
append l_t_range.
p_subrc = 0.
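The period-increment logic in the routine can be cross-checked with a small Python sketch (fiscal-year variant handling omitted; the function name and formats are illustrative):

```python
def next_period(gjahr, buper):
    """Return (year, period) advanced by one posting period,
    rolling from period 012 into period 001 of the next year."""
    if buper == "012":
        return str(int(gjahr) + 1), "001"
    return gjahr, f"{int(buper) + 1:03d}"

year, period = next_period("2012", "001")
print(year + period[1:])           # YYYYPP-style value, e.g. 201202
print(next_period("2011", "012"))  # rollover at year end: ('2012', '001')
```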
sk
Edited by: SK Varma Penmatsa on Jan 23, 2012 2:30 PM
Hi Praveen/Raj,
Basically PCS_PER_PACK is a key figure I have in the DSO, which I use to calculate the total number of pieces, stored in a key figure in the cube. The transformation rule that calculates TOTAL_PCS is a routine.
Within this routine I multiply PACKS * PCS_PER_PACK to calculate the figure. I do not store PCS_PER_PACK in the cube.
Is it this rule you want me to check for whether aggregation is set to SUM or overwrite? This rule should add up the total pieces; if I set it to overwrite, it might not give the desired result.
The thing I cannot figure out is: since the transformation rules go record by record, why would it add up the PCS_PER_PACK figure before calculating?
I cannot access the system at the moment; as soon as I can, I will check it and get back to you.
Thanks once again for your quick response.
regards
dilanke -
Delta records not updating from DSO to CUBE in BI 7
Hi Experts,
Delta records are not updating from DSO to cube.
In the DSO the key figure value shows 0, but in the cube the same record shows -1.
I checked the change log table of the DSO; it has 5 records:
ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - -1
ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - 0
ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 0
ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 1
ODSR_4LH8CXKUJPW2JDS0LC775N4MH - 0
but the active data table has one record with value 0.
How do I correct the delta load?
Regards,
Jai
Hi,
I think initially the value was 0 (ODSR_4LH8CXKUJPW2JDS0LC775N4MH - 0, new image in changelog) and this got loaded to the cube.
Then the value got changed to 1 (ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 0, before image & ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 1, after image). Now this record updates the cube with value 1. The cube has 2 records, one with 0 value and the other with 1.
The value got changed again to 0 (ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - (-1), before image &
ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - 0, after image). Now these records get aggregated and update the cube with (-1).
The cube has 3 records, with values 0, 1 and -1; the effective total is 0, which is correct.
Is this not what you see in the cube? Were the earlier requests deleted from the cube? -
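The change-log arithmetic described above can be sketched in Python (request IDs shortened; the values are taken from the post, and the record-mode letters are just labels for new/before/after images):

```python
from itertools import groupby

# Change-log rows for one key, in load order: (request, image type, delta value).
changelog = [
    ("...N4MH", "N", 0),     # new image: value created as 0
    ("...H0DL", "X", 0),     # before image for change 0 -> 1 (negated old value)
    ("...H0DL", " ", 1),     # after image: new value 1
    ("...MVQ65M", "X", -1),  # before image for change 1 -> 0
    ("...MVQ65M", " ", 0),   # after image: new value 0
]

# Each delta request adds the sum of its before/after images to the cube.
per_request = [(req, sum(v for _, _, v in grp))
               for req, grp in groupby(changelog, key=lambda r: r[0])]
print(per_request)   # [('...N4MH', 0), ('...H0DL', 1), ('...MVQ65M', -1)]
print(sum(v for _, v in per_request))  # 0 -- matches the active table
```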
hi
I am facing a problem with data from DSO to cube.
I am getting data into the DSO:
plant material doc_num itemno qty
p1 m1 1 1 10
p1 m1 1 2 20
In the DSO I am using key fields:
doc_no
item_no
doc_year
and data fields:
material
plant
qty
While loading data into the cube, it shows only
p1 m1 1 2 20
Item 1 is missing.
There is no item number field in my target fields.
Please let me know your ideas.
Hi,
The problem is in your ODS/DSO.
In your ODS you have Doc_num as a key field.
For both items 1 and 2, you have the same key field (Doc_num) value.
So the first record is overwritten by the second one, and from your ODS only the latest record reaches the cube.
If you want to send both doc_nums to the cube, you need to bring doc_num into the data fields area of the ODS.
Hope this helps.
Cheers
Praveen -
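Praveen's point can be sketched in Python, where the DSO active table behaves like a map keyed on the key fields (names are illustrative):

```python
# The two document lines from the post above.
rows = [
    {"doc_no": "1", "item_no": "1", "plant": "p1", "material": "m1", "qty": 10},
    {"doc_no": "1", "item_no": "2", "plant": "p1", "material": "m1", "qty": 20},
]

def activate(rows, key_fields):
    """Overwrite semantics of a standard DSO active table:
    later records with the same key replace earlier ones."""
    active = {}
    for row in rows:
        active[tuple(row[f] for f in key_fields)] = row
    return list(active.values())

print(len(activate(rows, ["doc_no", "item_no"])))  # 2 -- both items survive
print(len(activate(rows, ["doc_no"])))             # 1 -- item 1 is overwritten
```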
Problem while data processing TRANSACTION data from DSO to CUBE
Hi Guru's,
We are facing a problem while processing transaction data from DSO to cube: the data packets are processing and updating very slowly. Please help me with this.
Thanks and regards,
Sridhar
Hi,
I will suggest you to check a few places where you can see the status
1) SM37 job log (give the BI request name): it should give you the details about the request. If it is active, make sure the job log is being updated at frequent intervals.
2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running. Check whether it is accessing/updating some tables or not doing anything at all.
If it is running and you can see it active in SM66, you can wait for some time to let it finish.
3) RSMO: see what is available in the details tab. It may be stuck in the update rules.
4) ST22: check whether any short dump has occurred.
You can also try SM50 / SM51 to see what is happening at the system level, like reading/inserting into tables.
If you feel it is active and running, you can verify by checking whether the number of records in the cube has increased.
Thanks,
JituK -
Error while loading data from DSO to Cube
Dear Experts,
I have changed the update routines of the update rule for the DSO and of the update rule between the DSO and the cube. When I upload data into the DSO, it works fine, but when uploading from the DSO to the cube, it gives an error.
Error Message :
DataSource 8ZDSO_ABB does not have the same status as the source system in the Business Information Warehouse.
The time stamp in the source system is 30.09.2010 14:05:17.
The time stamp in the BW system is 09.05.2008 16:41:10.
I executed the program RS_TRANSTRU_ACTIVATE_ALL, but it still gives the same error.
Regards,
Anand Mehrotra.
Dear Experts,
Thanks for the help. The problem is solved.
Regards,
Anand Mehrotra. -
Unable to load data from DSO to Cube
Good morning all,
I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contain no requests at all. The cube uses delta update. First the DSO was loaded with 138,300 records successfully, and then I activated the DSO. But when I executed the load (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reasons for this?
Thank you so much!
Hi BI User,
For loading a delta into the data target, there must first be an initialization request in the data target with the same selection criteria.
So first do the initialization, then perform the delta upload into the cube.
Regards,
Subhashini.