Look up two ODS and load data to the cube
Hi ,
I am trying to load data into the Billing Item cube. The cube contains some fields which are loaded from the Billing Item ODS (which in turn is loaded directly from the DataSource 2LIS_13_VDITM), some fields which need to be looked up in the Service Order ODS, and some fields in the Service Order Operations ODS. I have written a start routine in the cube's update rules: using a SELECT statement on each ODS, I fetch the required fields into internal tables, and in the update rules I fill the fields with a READ statement.
I get an error when the second SELECT statement (from the second ODS) is executed.
The error message is
You wanted to add an entry to table
"\PROGRAM=GPAZ1GI2DIUZLBD1DKBSTKG94I3\DATA=V_ZCSOD0100[]", which you declared
with a UNIQUE KEY. However, there was already an entry with the
same key.
The error message says that the internal table was declared with a UNIQUE KEY, and the SELECT tried to add an entry whose key already exists in the table.
Can anyone please help me with a solution for this requirement? I would appreciate it if anyone who has written similar code could share it.
Thanks in Advance.
Bobby
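A minimal sketch of how the two lookups could be declared to avoid this short dump. Hedged: the active-table names /bic/azcso_ord00 and /bic/azcso_opr00 and the fields doc_number/zfield below are placeholders, not the actual objects from this scenario. The dump occurs because the internal table was declared with a UNIQUE KEY while the SELECT delivers several rows per key; declaring the table with a NON-UNIQUE key, or removing duplicates after the SELECT, avoids it:

```abap
* Hedged sketch - table and field names are placeholders.
TYPES: BEGIN OF ty_lookup,
         doc_number TYPE c LENGTH 10,  " lookup key
         zfield     TYPE c LENGTH 10,  " field to fill in the cube
       END OF ty_lookup.

* A SORTED table with a NON-UNIQUE key tolerates several rows per
* key; a UNIQUE KEY table dumps on the second row with the same key.
DATA: it_srv_ord TYPE SORTED TABLE OF ty_lookup
                 WITH NON-UNIQUE KEY doc_number.

* Alternative: a STANDARD table, deduplicated after the SELECT.
DATA: it_srv_opr TYPE STANDARD TABLE OF ty_lookup.

IF data_package[] IS NOT INITIAL.
  SELECT doc_number zfield
    INTO TABLE it_srv_ord
    FROM /bic/azcso_ord00            " placeholder active table
    FOR ALL ENTRIES IN data_package
    WHERE doc_number = data_package-doc_number.

  SELECT doc_number zfield
    INTO TABLE it_srv_opr
    FROM /bic/azcso_opr00            " placeholder active table
    FOR ALL ENTRIES IN data_package
    WHERE doc_number = data_package-doc_number.

* Keep only one row per key if the UNIQUE KEY declaration is kept.
  SORT it_srv_opr BY doc_number.
  DELETE ADJACENT DUPLICATES FROM it_srv_opr COMPARING doc_number.
ENDIF.
```

In the update routines, the fields can then be fetched with READ TABLE ... WITH KEY doc_number = ... BINARY SEARCH.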
Hi,
Can you post the SELECT statements you have written in the start routine?
regards,
raju
Similar Messages
-
Input ready query is not showing loaded data in the cube
Dear Experts,
In an input-ready query we have the problem that it does not show the values that were not entered through that query. Is there any setting in the input-ready query so that it shows the data loaded into the cube as well as the data entered through the input-ready query itself?
Thanks,
Gopi R
Hi,
Input-ready queries should always display the most recent data (i.e. all green requests and the yellow request). So check the status of the requests in the real-time InfoCube: there should be only green requests and at most one yellow request.
In addition you can try to delete the OLAP cache for the plan buffer query: Use RSRCACHE to do this. The technical names of the plan buffer query can be found as follows:
1. InfoCube/!!1InfoCube, e.g. ZTSC0T003/!!1ZTSC0T003 if ZTSC0T003 is the technical name of the InfoCube
2. MPRO/!!1MPRO, e.g. ZTSC0M002/!!1ZTSC0M002 if ZTSC0M002 is the technical name of the MultiProvider
If the input ready query is defined on an aggregation level using a real-time InfoCube, the first case is relevant; if the aggregation level is defined on a multiprovider the second case is relevant. If the input-ready query is defined on a multiprovider containing aggregation levels again the first case is relevant (find the real-time InfoCubes used in the aggregation level).
Regards,
Gregor -
Can't view my Cube and Dimension Data with the Cube Viewer
I'm new to OWB; I'm using Oracle 10g Release 1 with OWB R2 and Oracle Workflow 2.6.3.
I followed the steps from the OTN pages (start01, flat-file02, relational-wh-03, etl-mappings, deployingobjects, loading-warehouse and bi-modeling),
and the loading was successful, I guess...
But when I want to see the data in the cube and dimension, an error occurs.
It says
" CubeDV_OLAPSchemaConnectionException_ENT_06952??
CubeDV_OLAPSchemaConnectionException_ENT_06952??
at oracle.wh.ui.owbcommon.dataviewer.dimensional.DataViewerConnection.connect(DataViewerConnection.java:115)
at oracle.wh.ui.owbcommon.dataviewer.dimensional.DimDataViewerMain.BIBeansConnect(DimDataViewerMain.java:433)
at oracle.wh.ui.owbcommon.dataviewer.dimensional.DimDataViewerMain.init(DimDataViewerMain.java:202)
at oracle.wh.ui.owbcommon.dataviewer.dimensional.DimDataViewerEditor._init(DimDataViewerEditor.java:68)
at oracle.wh.ui.editor.Editor.init(Editor.java:1115)
at oracle.wh.ui.editor.Editor.showEditor(Editor.java:1431)
at oracle.wh.ui.owbcommon.IdeUtils._tryLaunchEditorByClass(IdeUtils.java:1431)
at oracle.wh.ui.owbcommon.IdeUtils._doLaunchEditor(IdeUtils.java:1344)
at oracle.wh.ui.owbcommon.IdeUtils._doLaunchEditor(IdeUtils.java:1362)
at oracle.wh.ui.owbcommon.IdeUtils.showDataViewer(IdeUtils.java:864)
at oracle.wh.ui.owbcommon.IdeUtils.showDataViewer(IdeUtils.java:851)
at oracle.wh.ui.console.commands.DataViewerCmd.performAction(DataViewerCmd.java:19)
at oracle.wh.ui.console.commands.TreeMenuHandler$1.run(TreeMenuHandler.java:188)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:178)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:454)
at java.awt.EventDispatchThread.pumpOneEventForHierarchy(EventDispatchThread.java:201)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:151)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:145)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:137)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:100) "
Can somebody explain what is happening? I really don't understand: when the cube viewer window appears, there's no data in it...
I really need help with this... -
Dear all,
I need some help: I'm new to OWB and I'm trying to load a cube with dimension key data. I have successfully deployed the cube and dimensions without any problems. Now when I deploy the mapping to load the cube, I cannot get any data there...
- I have tried mapping using the dimension operator and the cube operator, mapping the dimension business keys to the keys in the cube... and still no data.
- I have also tried joining all the dimensions with the Joiner operator and a join condition, with the output going to the cube... but still no data. I have also disabled the constraints on the cube for the purpose of loading the data, but without much success. What am I doing wrong?
Any help and guidance will be greatly appreciated. Thanks much!
Akym
It sounds like you are not getting any matching keys for loading into the cube.
Do you have a time dimension created by OWB in the cube? The key used by the cube operator for an OWB time dimension is a formatted number. The time dimension keys are stored as follows;
Day Level - YYYYMMDD
Month Level - YYYYMM
Week Level - YYYYWW
Quarter - YYYYQ
Year - YYYY
If you have a source that has a SQL date datatype for example and want to construct the key for a cube's time dimension at the day level something like the following expression can be used to construct the time reference from a SQL date...
to_number(to_char( time_key, 'YYYYMMDD'))
It may not be this but just a thought.
Cheers
David -
Error while loading data into the cube
Hi,
I loaded data on to PSA and when I am loading the data to the cube through DataTransferProcess, I get an error (red color).
Through "Manage", I can see the request in red. How do I find out the exact error, and what could be the possible reason for this?
Also, can someone explain the data transfer process (not in a process chain)?
Regards,
Sam
Hi Sam,
After you load the data through the DTP (after clicking the Execute button), go to the monitor screen and press the Refresh button; you can find the logs there.
Otherwise, on the request screen, beside the request number, you can see the log icon and click on it.
DTP means data transfer process; it is used to transfer data from the PSA to a data target.
Check this link for DTP:
http://help.sap.com/saphelp_nw04s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
It is used to load data into data targets or InfoProviders such as DSOs and cubes.
In your case, the problem may be: check the date formats, special characters... and
REGARDS
@JAY -
Need to DELETE and LOAD data of the Last month to
Hi Experts,
I need to delete the last month's data from the cube because one material is not updated with a required value. We have made a change in the update routine for a value, and the same change must be reflected in the data from last month onwards.
So I need to delete the data based on a selection and reload it. The data flows from
2LIS_13_VDITM (InfoSource) to ZSD_C03 (InfoCube).
I read many SDN threads, but I am getting confused.
How can I proceed with this?
Thanks,
Utpal.
Hi Srikanth,
Thank you for responding.
I had a problem with one material, so I deleted the request and reloaded it from the PSA; that issue is solved.
Now my senior is asking me to delete the data from April-09 until today and reload it. The issue is that my cube (ZSD_C03) is updated from four DataSources: 2LIS_11_V_ITM, 2LIS_13_VDITM, 2LIS_12_VCHDR and 2LIS_11_VAITM.
I need to delete only the data from the 2LIS_13_VDITM DataSource. How can I proceed with this?
Please suggest ...
Thank you ,
Utpal -
Could not see loaded data in the Cube
Hi Gurus,
Please help me with this: I am not able to get the data loaded into the cube. In the PSA it is loaded correctly. After that I created a transformation on the InfoCube and had a proposal generated. Then I created a DTP and executed it, and the monitor shows green. But when I right-click on the cube and look for the data (also maintain master data on the InfoObject), I see only the headers of the table and no data is displayed.
I hope you have seen similar situations. Thanking you in advance.
Check if the request is reportable; sometimes the load can be fine, but the request may not be reportable. Also make sure your start routine does not filter or delete the data before it gets updated into the cube.
thanks.
Wond -
Loaded data amount into cube and data monitor amount
Hi,
when I load data into the cube the inserted data amount in the administrator section shows 650000 data sets. The monitor of that request shows a lot of data packages. When I sum the data packages, the sum is about 700000 data sets.
Where is the difference coming from?
Thanks!
Hi,
If it is a full load to the cube, all the records are updated in it, since in a cube data can be overwritten.
If it is a delta load and you want to see why there is a difference between the records transferred and the records added in the cube,
you can go to the Manage tab of the DSO, go to the Contents tab, and click the Change Log button at the bottom. Check the number of entries in that table: those entries are the records added to the cube, since only those records are new; the other records with the same key are already present in the cube. -
For integration purposes, how can we check the data in the cube and validate it against the query data? I know we can go to the corresponding InfoCube and right-click "Display data", but what is shown in the query is not available when I execute "Display data".
Thanks
You can always use a similar set of restrictions as used in the RKF and get a value very close to what the RKF displays.
But getting the same values for a CKF is a bit tougher. I suggest you take one or two examples, do the calculations manually, and compare the query result with the InfoCube data. -
Delivery date and loading date are the same
Hello ,
Issue decription:
The system calculates the delivery date equal to the loading date, although the route is maintained with 9 transit days.
This problem occurs with only one route, A.
When I create a sales order for material M123 with route B and shipping point 0001, the system considers the transit duration of the route. But when a sales order is placed for material M123, shipping point 0001, and route A, the system does not consider the transit duration.
Kindly let me know if any configuration change is required with respect to the route or the forwarding agent.
Hi,
Check what transit duration is maintained for route A. The system uses the transit time from delivery scheduling as well as other time estimates (for example, the loading time) to determine the dates by which the goods must be available for picking, packing, and loading. Finally, check the factory calendar: if the forwarding agent's factory calendar is not maintained, the system will use the shipping point's calendar.
Regards,
Hari Challa. -
Planning Good Issue date and loading date
Hi All,
How can I get the planned goods issue date? I found that the loading date is the same as the planned goods issue date; what is the loading time?
As I understand it, loading date + loading time = goods issue date, but I am not sure how the planned goods issue date is calculated.
Thanks,
Hello Friend,
It is determined in the shipping point configuration. The path is: SPRO --> Enterprise Structure --> Definition --> Logistics Execution --> Define, copy, delete, check shipping point; then go into the details of the shipping point concerned. -
Hi All,
We have created one cube and loaded data successfully.
But we have one dimension named periodType: the members of the period are:Annual, Quarterly and Monthly.
Client asked to make Monthly as + and remaining ~.
For that we have created a view in which we added a column aggr_cons.
I have defined it as '+' for Monthly and '~' for the rest, and I am using that column as the consolidation operator.
But the data load loads only a few records.
Actually client asked me to create an Attribute dimension but it is not possible in EIS.
We are using EIS 7.1.2 and SQL server 2005.
For PeriodType dimension we have written the query like
select distinct PeriodType,aggr_cons
from ClaimsData_2
PeriodType is a column in table and it contains Annual,Quarterly,Monthly.
Please let me know any ideas to do this.
The only thing is that I have to make Monthly '+' and the remaining members '~'.
Thanks,
prathap
Hi Pratap,
First make the following changes in the data source (i.e. SQL Server, Oracle, or whatever you are using):
1- Create a new table, say 'Population', and add two columns, say id and population, with values like 100, 200, etc. Define id as the primary key.
2- Now, assuming you have a SQL table called 'Product', add a column called 'Attribute' and create a relationship with the column 'id' of the table 'Population' through a foreign key.
Now make the following changes in the OLAP metadata and metaoutline:
1- Suppose you have a Product dimension; enable one of its columns as an attribute.
2- Now open the metaoutline and expand the Product dimension in the left panel. It will show the attributes you associated.
3- Select an attribute and drag it to the right panel. An attribute dimension will be created automatically.
Hope this answers your question.
Atul K, -
Automatically trigger the event to load data from Planning cube to Standard Cube
Hello,
We have a below set up in our system..
1. A Planning BEx query using which user makes certain entries and writes back data to the planning cube.
2. An actual reporting cube which gets data from the planning cube above.
Now, what we want to do is to automate the data load from Planning cube to Reporting cube.
This involves 2 things..
1. Change the setting "Change real-time load behaviour" of the planning cube to Planning.
2. Trigger the DTP which loads data from Planning cube to reporting cube.
We want to automate the above two steps...
I have tried few things to achieve the same..
1. Created an event in SM64,
2. In the planning cube's "Manage" screen, clicked on "Subsequent Processing" and provided the event details (not sure if this is the correct place to provide the event details).
3. Wrote an ABAP program which changes the setting of the planning cube ("Change real-time load behaviour" to Loading).
4. Created a process chain, with the event as the start variant, the ABAP program as the next step, and the DTP run as the last step.
This, I hoped, would trigger the event as soon as a new request arrives in the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube.
This is not working. I don't think the event is being triggered, and even if it is, I am not sure whether it will start the process chain automatically. Any ideas, please?
Hi,
Try to do the transformation directly in the input cube by using a characteristic relationship of type exit; more details:
http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
hope it helps. -
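For step 3 above, the switch of the real-time load behaviour can be done programmatically. A hedged sketch: the function modules RSAPO_SWITCH_TRANS_TO_BATCH / RSAPO_SWITCH_BATCH_TO_TRANS are the ones commonly used in BW 7.x process chains for this, but verify them in SE37 on your release, and ZRT_CUBE is a placeholder cube name:

```abap
* Hedged sketch: switch a real-time InfoCube to "load" mode before
* the DTP runs, then back to "planning" mode afterwards.
* ZRT_CUBE is a placeholder; verify the FM names on your release.
DATA lv_cube TYPE rsinfocube VALUE 'ZRT_CUBE'.

* Allow data loading into the real-time cube.
CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'
  EXPORTING
    i_infocube = lv_cube.

* ... the DTP runs here (as a subsequent process-chain step) ...

* Switch back to planning mode so the planning query can write again.
CALL FUNCTION 'RSAPO_SWITCH_BATCH_TO_TRANS'
  EXPORTING
    i_infocube = lv_cube.
```

In a process chain these two switches are usually separate ABAP-program steps placed before and after the DTP step.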
Loading data from Info cube to PA
Hi All,
I am trying to load the CVCs from an InfoCube into a planning area in the development environment.
I am getting an error message about a mismatch between the periodicity of the InfoCube and the planning area.
How do I resolve this? Where do we define the periodicity of an InfoCube? I don't think we do, or do we?
Please help me
Thanks a lot
Prabhat Sahay
Prabhat,
I understand that you already have the data in the infocube and want to load it into the planning area.
However, it is important to ensure that the data in the cube and the planning area have the same periodicity; otherwise we run into such issues.
So please go back to the step of defining the InfoCube.
In RSA1, if you double-click on the InfoCube name you are using as the source, you can see its details on the right. There, look for the dimensions section and, under that, the time section. If it contains a calweek or a calmonth and you are using a fiscal variant, this issue occurs.
You have to drop your InfoCube data, redefine the cube to use FISCAL PERIOD and FISCAL VARNT instead of calweek or calmonth, and then load the data back again. You need to ensure you populate the fields FISCAL PERIOD and FISCAL VARNT; you might want to do that with routines in your update rules or your InfoSource communication structure. Remember to activate the entire structure from the cube to the DataSource.
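The fiscal period derivation mentioned above could be sketched roughly like this in an update routine. Hedged: 'K4' is only an example fiscal year variant and comm_structure-calday is a placeholder source field; use your company's variant and your actual date field:

```abap
* Hedged sketch of an update routine deriving the fiscal period
* (format YYYYPPP) from a calendar date.
DATA: lv_period TYPE poper,   " posting period, e.g. 004
      lv_year   TYPE gjahr.   " fiscal year,    e.g. 2009

CALL FUNCTION 'DATE_TO_PERIOD_CONVERT'
  EXPORTING
    i_date  = comm_structure-calday   " placeholder date field
    i_periv = 'K4'                    " example fiscal year variant
  IMPORTING
    e_buper = lv_period
    e_gjahr = lv_year.

* FISCAL PERIOD is YYYYPPP, built from year and posting period.
CONCATENATE lv_year lv_period INTO result.
```

The fiscal year variant itself would go into the FISCAL VARNT field via a constant or a similar routine.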
Once you load the data into the cube with the new setting, and then try to copy the data into planning area, it will be successful.
To answer your other question. Yes Time Characteristics is necessary for loading data between cube and planning area.
Hope this helps.
Thanks
Mani Suresh -
Loading Data from one Cube into another Cube
Hi Guys,
I am trying to load data from one cube, A, into another cube, B. Cube A has around 200,000 records. I generated an export DataSource on cube A, replicated the DataSource, and created and activated an InfoSource.
I created update rules for cube B, selecting cube A as the source. I have a start routine to duplicate records from cube A. Now when I schedule the load,
it stops at "Processing Data Packet" and says no data. Is there something wrong with the update routine, or is there a simpler way to load from cube to cube?
Thanks in advance
This is the start routine to duplicate records in two currencies.
DATA: datew    TYPE /bi0/oidateto,
      datew2   TYPE rsgeneral-chavl,
      fweek    TYPE rsgeneral-chavl,
      prodhier TYPE /bi0/oiprod_hier,
      market   TYPE /bic/oima_seg,
      segment  TYPE /bic/oizsegment.

DATA: BEGIN OF s_data_pack OCCURS 0.
        INCLUDE STRUCTURE /bic/cs8zsdrev.
DATA: END OF s_data_pack.

* Keep a copy of the incoming package and rebuild it.
s_data_pack[] = data_package[].
REFRESH data_package.

LOOP AT s_data_pack.
  MOVE-CORRESPONDING s_data_pack TO data_package.
  IF data_package-loc_currcy = 'EUR'.
*   Append the record twice: once in USD, once in EUR.
    data_package-netval_inv = data_package-/bic/zsdvalgrc.
    data_package-currency   = 'USD'.
    APPEND data_package.
    data_package-netval_inv = data_package-/bic/zsdvalloc.
    data_package-currency   = 'EUR'.
    APPEND data_package.
  ELSE.
    data_package-netval_inv = data_package-/bic/zsdvalgrc.
    data_package-currency   = 'USD'.
    APPEND data_package.
  ENDIF.
ENDLOOP.
This is the routine to load the quantity field:
RESULT = COMM_STRUCTURE-bill_qty.
This is the routine to load the value field:
RESULT = COMM_STRUCTURE-netval_inv.
UNIT = COMM_STRUCTURE-currency.