Effects of R3 DS enhancement on remote cube
Hi All,
Is there a direct effect on queries run against a remote cube when its underlying R/3 DataSource has been enhanced?
For example, I'm running queries on a remote cube copied from OL_PCA_1. Now I have enhanced R/3 DataSource 0EC_PCA_1_9 (which feeds InfoSource 0EC_PCA_1, which in turn is the basis of my remote cube). I did not include the new characteristics in the remote cube itself, but rather in the enhanced standard OL_PCA_1, where more granular reporting will be performed.
Will there be query performance effects on the remote cube with this approach? How can I measure the performance impact (if any) on the remote cube's queries?
Thanks.
Similar Messages
-
Remote cube on Data Source 0FI_GL_4
HI,
I have a requirement to reconcile the line items from R/3 to BW.
So I tried to create a remote cube with direct access, but couldn't, because DataSource 0FI_GL_4 does not support direct access and reconciliation.
I can see that DataSource 0FI_GL_4 is fetched by function module BWFID_GET_FIGL_ITEM and that the base table is DTFIGL_4.
My question: can I create a generic DataSource based on a function module, by taking a copy of the existing FM BWFID_GET_FIGL_ITEM and changing the Direct Access option to 2 (supports preaggregation)?
Do I need to change any settings at table level, or any parameters, to get the required solution?
Please suggest how to proceed with the development to reconcile the line items from R/3 to BW.
Regards,
Siva..
Edited by: Siv Kishore on Apr 7, 2011 4:59 PM
Hi Banerjee,
Thanks for the update.
As you said, I can't use DataSource 0FI_GL_1, because we need to reconcile many fields from R/3 to BW that are not available in this DataSource, and it would be complicated to enhance it with all of them.
Can I create a view-based generic DataSource on the BSEK and BSEG tables? Will it work? If yes, do I need to follow any criteria to get the correct line items?
Please suggest whether there is an SAP-recommended solution, or share your experience if you have worked on this kind of development.
Regards,
Siva Thottempudi..
Edited by: Siv Kishore on Apr 12, 2011 4:06 PM -
Remote Cubes - R3 customer exit
Hi,
We have a situation where the enhanced fields on an SPL DataSource, which are filled in the R/3 customer exit, do not return data when queried through a remote cube.
However, when fetching data from the same DataSource using normal data extraction, the enhanced fields are filled.
Is this expected behavior?
Please advise,
Thanks,
Ram
Hello TG,
As the datasource has massive data, better option would be to use navigational attribute.
Advantages:
1. In an enhancement you would have to read fields from a database table that also holds massive data; a SELECT on such a table would be painful.
2. CMOD execution involves passing data between internal tables, filling internal tables, selecting data, and modifying internal tables, all of which takes time.
3. With a navigational attribute, only an SID table needs to be accessed, and a master data table is never as big as transaction data, so it performs better.
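To make point 1 concrete, the costly read in a transaction-data customer exit (CMOD) typically looks like the sketch below. This is a hedged illustration only: the table ZMASTER, the field ZZNEWFIELD, the structure ZSD_EXTRACT_STRUC, and the DataSource name ZSD_DS are hypothetical placeholders, not from the original post.

```abap
* Hedged sketch of a transaction-data customer exit (CMOD, include
* ZXRSAU01). All Z* names below are hypothetical placeholders.
DATA: L_S_DATA TYPE ZSD_EXTRACT_STRUC.  "hypothetical extract structure

CASE I_DATASOURCE.
  WHEN 'ZSD_DS'.
    LOOP AT C_T_DATA INTO L_S_DATA.
*     One database read per record against a large table -- this is
*     the expensive part that a navigational attribute avoids.
      SELECT SINGLE ZZNEWFIELD FROM ZMASTER
        INTO L_S_DATA-ZZNEWFIELD
        WHERE MATNR = L_S_DATA-MATNR.
      MODIFY C_T_DATA FROM L_S_DATA.
    ENDLOOP.
ENDCASE.
```

With a navigational attribute, the value is resolved at query time through the SID and attribute tables of the (much smaller) master data, so no exit code runs during extraction at all.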
Hope this helps.
Regards,
Pankaj -
Report is not getting data from Remote cube through MultiProvider
Hi SAPians
I've run into a problem with a reconciliation report in BW 3.5.
The report was built on a MultiProvider, which was created over a basic cube and a remote cube.
Both cubes have the same DataSource, and all objects are in active version and look good.
When I execute the report, I only get data from the basic cube; no data comes from the remote cube.
I've checked the characteristic 0INFOPROV in the MultiProvider, and it is assigned to both cubes.
What might be the problem?
Please help me in this regard.
Thanks in advance
Regards
Arjun
Hi,
In the reconciliation MultiProvider, include 0INFOPROVIDER = <remote cube>.
If data is still not coming, you can be fairly sure that connectivity with the source system of the remote cube is the issue.
Check with Basis to solve the connectivity issue, and ensure the remote cube is consistent.
Bye -
Adding new field to Remote Cube
Hello,
We have an existing remote cube in 7.0 that we are trying to add a new field to. I've already added the field to the DataSource, unhidden it, and tested it in RSA3. Now I'm trying to bring it into BW and add it to the remote cube. I right-clicked on the source system under the data target and chose Replicate Metadata; however, the new field does not show up anywhere. What am I doing wrong? Does a remote cube work differently? How do I get my new field over from R/3?
Regards,
TMS
That's the problem. The R/3 field does not show up in the DataSource in BW, so it cannot be moved over to the transfer structure. As you know, when you replicate metadata the new DataSource fields normally appear and can be mapped into the transfer structure. That is not the case for this remote cube: the new field is not there.
-
Hi Guru's
I've created a generic DataSource using a function module for direct access, created a virtual cube, and mapped it to the DataSource accordingly.
When I run a test extract on the DataSource for a single posting day, it takes a few seconds to pull through the data, and the number of records is 45 on the dev box. But when I display data on the remote cube, it takes a long time and throws a timed-out error.
I don't have any routines in the transformations.
What could be the problem?
Nagesh.
Hi,
Here's the code in the FM:
* Example: DataSource for table SFLIGHT
  TABLES: BKPF, BSEG.

* Auxiliary selection criteria structure
  DATA: L_S_SELECT TYPE SRSC_S_SELECT.

* Maximum number of lines for DB table
  STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
* counter
           S_COUNTER_DATAPAKID LIKE SY-TABIX,
* starting point for each data package
           READ_NEXT LIKE SY-TABIX,
* total number of extracted records
           NO_RECS LIKE SY-TABIX,
* number read so far
           TEMP_CNTR(9) TYPE N,
* cursor
           S_CURSOR TYPE CURSOR.

* Select ranges
  RANGES: R_BLDAT FOR BKPF-BLDAT,
          R_BUDAT FOR BKPF-BUDAT,
          R_CPUDT FOR BKPF-CPUDT,
          R_BUKRS FOR BKPF-BUKRS,
          R_GJAHR FOR BKPF-GJAHR,
          R_BELNR FOR BKPF-BELNR.

  DATA: GT_T_DATA LIKE ZFI_GL_4 OCCURS 0 WITH HEADER LINE.
  DATA: GT_BKPF TYPE TABLE OF BKPF,
        GS_BKPF TYPE BKPF.
  DATA: COUNTER(9) TYPE N.
  DATA: LV_MONAT(3).

* Initialization mode (first call by SAPI) or data transfer mode
* (following calls)?
  IF I_INITFLAG = SBIWA_C_FLAG_ON.

* Initialization: check input parameters,
* buffer input parameters,
* prepare data selection

* Check DataSource validity
    CASE I_DSOURCE.
      WHEN 'ZFI_GL_4'.
      WHEN OTHERS.
        IF 1 = 2. MESSAGE E009(R3). ENDIF.
* This is a typical log call. Please write every error message like this
        LOG_WRITE 'E'          "message type
                  'R3'         "message class
                  '009'        "message number
                  I_DSOURCE    "message variable 1
                  ' '.         "message variable 2
        RAISE ERROR_PASSED_TO_MESS_HANDLER.
    ENDCASE.

    APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.

* Fill parameter buffer for data extraction calls
    S_S_IF-REQUNR  = I_REQUNR.
    S_S_IF-DSOURCE = I_DSOURCE.
    S_S_IF-MAXSIZE = I_MAXSIZE.

* Fill field list table for an optimized select statement
* (in case there is no 1:1 relation between InfoSource fields
* and database table fields this may be far from being trivial)
    APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.

  ELSE.                 "Initialization mode or data extraction?

* Data transfer: first call OPEN CURSOR + FETCH,
* following calls FETCH only

* First data package -> OPEN CURSOR
    IF S_COUNTER_DATAPAKID = 0.

* Fill range tables. BW will only pass down simple selection criteria
* of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'BLDAT'.
        MOVE-CORRESPONDING L_S_SELECT TO R_BLDAT.
        APPEND R_BLDAT.
      ENDLOOP.
      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'BUDAT'.
        MOVE-CORRESPONDING L_S_SELECT TO R_BUDAT.
        APPEND R_BUDAT.
      ENDLOOP.
      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CPUDT'.
        MOVE-CORRESPONDING L_S_SELECT TO R_CPUDT.
        APPEND R_CPUDT.
      ENDLOOP.
      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'BUKRS'.
        MOVE-CORRESPONDING L_S_SELECT TO R_BUKRS.
        APPEND R_BUKRS.
      ENDLOOP.
      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'GJAHR'.
        MOVE-CORRESPONDING L_S_SELECT TO R_GJAHR.
        APPEND R_GJAHR.
      ENDLOOP.
      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'BELNR'.
        MOVE-CORRESPONDING L_S_SELECT TO R_BELNR.
        APPEND R_BELNR.
      ENDLOOP.

* Determine the number of database records to be read per FETCH from
* input parameter I_MAXSIZE. If there is a one-to-one relation between
* DataSource table lines and database entries, this is trivial. In
* other cases, it may be impossible and some estimated value has to
* be determined.
      SELECT * FROM BKPF INTO TABLE GT_BKPF WHERE BLDAT IN R_BLDAT
                                              AND BUDAT IN R_BUDAT
                                              AND CPUDT IN R_CPUDT
                                              AND BUKRS IN R_BUKRS
                                              AND GJAHR IN R_GJAHR
                                              AND BELNR IN R_BELNR.

      LOOP AT GT_BKPF INTO GS_BKPF.
        SELECT * FROM BSEG WHERE BELNR = GS_BKPF-BELNR
                             AND BUKRS = GS_BKPF-BUKRS
                             AND GJAHR = GS_BKPF-GJAHR.
          MOVE-CORRESPONDING GS_BKPF TO GT_T_DATA.
          MOVE-CORRESPONDING BSEG TO GT_T_DATA.
          CLEAR LV_MONAT.
          CONCATENATE '0' GT_T_DATA-MONAT INTO LV_MONAT.
          CONCATENATE GT_T_DATA-GJAHR LV_MONAT INTO GT_T_DATA-FISCPER.
          CALL FUNCTION 'BWFIU_GET_DOCUMENT_ORIGIN'
            EXPORTING
              I_AWTYP    = GT_T_DATA-AWTYP
              I_AWKEY    = GT_T_DATA-AWKEY
            IMPORTING
              E_REFBELNR = GT_T_DATA-AWREF
              E_REFGJAHR = GT_T_DATA-AWGJA
              E_REFBUKRS = GT_T_DATA-AWBUK
              E_REFKOKRS = GT_T_DATA-AWKOK.
* Local currency calculations
          IF GT_T_DATA-SHKZG EQ 'S'.
            GT_T_DATA-DMSOL = GT_T_DATA-DMBTR.
            GT_T_DATA-DMSHB = GT_T_DATA-DMBTR.
            GT_T_DATA-DMHAB = 0.
          ELSEIF GT_T_DATA-SHKZG EQ 'H'.
            GT_T_DATA-DMHAB = GT_T_DATA-DMBTR.
            GT_T_DATA-DMSHB = GT_T_DATA-DMBTR * -1.
            GT_T_DATA-DMSOL = 0.
          ENDIF.
* Foreign currency calculations
          IF GT_T_DATA-SHKZG EQ 'S'.
            GT_T_DATA-WRSOL = GT_T_DATA-DMBTR.
            GT_T_DATA-WRSHB = GT_T_DATA-DMBTR.
            GT_T_DATA-WRHAB = 0.
          ELSEIF GT_T_DATA-SHKZG EQ 'H'.
            GT_T_DATA-WRHAB = GT_T_DATA-DMBTR.
            GT_T_DATA-WRSHB = GT_T_DATA-DMBTR * -1.
            GT_T_DATA-WRSOL = 0.
          ENDIF.
          APPEND GT_T_DATA.
        ENDSELECT.
      ENDLOOP.

      DESCRIBE TABLE GT_T_DATA LINES NO_RECS.
      READ_NEXT = 0.
      TEMP_CNTR = 0.
    ENDIF.              "First data package?

* Fetch records into the interface table
    IF ( READ_NEXT GT NO_RECS ) OR ( NO_RECS EQ 0 ).
      REFRESH GT_T_DATA.
      CLEAR: S_COUNTER_DATAPAKID, COUNTER, READ_NEXT, S_S_IF, NO_RECS.
      RAISE NO_MORE_DATA.
    ENDIF.

    COUNTER = 0.
    LOOP AT GT_T_DATA FROM READ_NEXT.
      COUNTER = COUNTER + 1.
      IF COUNTER GT S_S_IF-MAXSIZE.
        IF TEMP_CNTR EQ NO_RECS.
          CLEAR: S_COUNTER_DATAPAKID, COUNTER, READ_NEXT, S_S_IF, NO_RECS.
          RAISE NO_MORE_DATA.
        ELSE.
          READ_NEXT = READ_NEXT + COUNTER.
        ENDIF.
        EXIT.
      ENDIF.
      MOVE-CORRESPONDING GT_T_DATA TO E_T_DATA.
      APPEND E_T_DATA.
      TEMP_CNTR = TEMP_CNTR + 1.
      IF TEMP_CNTR GT NO_RECS.
        CLEAR GT_T_DATA.
        CLEAR: S_COUNTER_DATAPAKID, COUNTER, READ_NEXT, S_S_IF, NO_RECS.
        RAISE NO_MORE_DATA.
        EXIT.
      ENDIF.
    ENDLOOP.
    S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.

  ENDIF.                "Initialization mode or data extraction?

ENDFUNCTION.
If it's the code, then why is it quick in RSA3?
Nagesh. -
Deleting DTP based on Open Hub and Remote Cube
Hi All,
I created a delta DTP for my open hub destination to write a flat file from a remote cube.
I ran some test load requests, but it didn't work (no data in my .csv file). I then created another full DTP with other parameters, and it works successfully.
Now I want to delete my delta DTP, but the system returns a message like this:
DTP_4IDT4BXV29E1N0MWINKCWO60B cannot be deleted at the moment (see long text)
and in the long text i have:
Message no. RSBK037
Diagnosis
You want to delete a delta DTP that has been successfully used to load requests from the source into the target. If you delete the DTP, you will also delete the information on the source data that was successfully transferred. As a result the source data would be transferred again, if you create a new delta DTP that links the same source with the same target.
System Response
The system terminated the delete operation.
Procedure
You have two options:
1. Delete all the requests loaded with this DTP from the target and then delete the DTP.
2. Do not delete the DTP and continue to load deltas using this DTP.
I tried to look at the requests loaded with the delta DTP and deleted one, but there are other requests that I can't delete.
I have removed the delta DTP from my transport request.
What should I do to delete my delta DTP for good?
thanks for your help
Bilal
Do not delete entries out of table RSBKREQUEST.
To delete your DTP, you may use program RSBKDTPDELETE. Give the technical id of the DTP as the input parameter. -
Problem with a query on Remote Cube
Hi,
We are working on a remote cube whose source is a view built on an R/3 base table. Due to the huge volume of data in the table (performance reasons), I need to restrict the extraction to the current date. I used an exit on R/3 to restrict to the current date, and the extractor checker shows only the valid data, i.e. only the current date. But when I built a query on the remote cube, the report showed the complete data (the restriction is not working). We even tried using an inversion routine in the transfer rules to pass the selections to the source system, but it still doesn't work. Could you help if you have come across the same kind of situation, or suggest an alternative solution? We have to use a remote cube.
Any suggestions asap would be highly appreciated and rewarded with full points.
Regards,
Raj
Could you check whether the BLOB really contains UTF-8 encoded XML?
What's your database character set?
The BLOB contains UTF-8 encoded XML, and the database I am connected to has the AL32UTF8 character set, but my local instance has AMERICAN_AMERICA.WE8ISO8859P1.
Is that a problem?
How could I change the character set of the local Oracle client to the character set of the remote Oracle database?
Help required for a Remote cube query
Hi,
We are working on a remote cube whose source is a view built on an R/3 base table. Due to the huge volume of data in the table (performance reasons), I need to restrict the extraction to the current date. I used an exit on R/3 to restrict to the current date, and the extractor checker shows only the valid data, i.e. only the current date. But when I built a query on the remote cube, the report showed the complete data (the restriction is not working). We even tried using an inversion routine in the transfer rules to pass the selections to the source system, but it still doesn't work. Could you help if you have come across the same kind of situation, or suggest an alternative solution? We have to use a remote cube.
Any suggestions asap would be highly appreciated and rewarded with full points.
Regards,
Raj
I can think of two ways to do it.
The simple way, with no ABAP coding, is a view:
Create a view between a timestamp table and your big table. Put an entry for the current date into your timestamp table, then use it as a selection inside the view. Unfortunately you cannot use SY fields inside database views (otherwise you could have used those as a selection condition in the view).
The best way is to use a function module and pass the selections from the query into the SQL statement.
I prefer the last way, and I also pass the data through a generic structure; that way you manipulate the data inside the initial loop in the function module and don't need further loops downstream in the transfer rules (to keep the response time down).
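The function-module option amounts to reading the selections BW hands to the extractor and pushing them into the WHERE clause. A minimal hedged sketch, loosely following the generic-extractor template used elsewhere in this thread; the view name ZV_BIGVIEW is a hypothetical placeholder, and the interface parameters (I_T_SELECT, E_T_DATA) are assumed from that template:

```abap
* Minimal sketch: push the BW query selections into the database
* select. ZV_BIGVIEW is a hypothetical view over the big table.
DATA: L_S_SELECT TYPE SRSC_S_SELECT.
RANGES: R_BUDAT FOR SY-DATUM.

* The query selections arrive in I_T_SELECT; copy them into a range...
LOOP AT I_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'BUDAT'.
  MOVE-CORRESPONDING L_S_SELECT TO R_BUDAT.
  APPEND R_BUDAT.
ENDLOOP.

* ...and let the database do the filtering, instead of fetching the
* whole table and filtering afterwards.
SELECT * FROM ZV_BIGVIEW INTO TABLE E_T_DATA
  WHERE BUDAT IN R_BUDAT.
```

This is why the same restriction that works in the extractor checker can fail on a remote cube if the selections are never mapped into the SELECT: the filtering has to happen inside the function module itself.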
Hello BW Experts,
1) Is it advisable to use remote cubes (not for end users, but for internal data reconciliation purposes)?
2) The tables we are looking at are VBAK, VBAP, VBRK and VBRP from SAP, compared against the base sales and billing ODS, to produce a reconciliation report. Since the data in the source tables VBAP and VBRP is huge, we get memory dumps. Is there a better way of doing this? Any suggestion to solve the memory dumps?
3) Are there any measures to improve remote cube performance?
4) Other than a remote cube, is there any other way of doing the reconciliation for the above data?
Any suggestions appreciated.
Thanks,
BWer
Hi BWer,
Remote cube performance is really an issue.
Instead, you can load the data up to the PSA and compare it there.
There is a how-to guide, "How to... Validate InfoCube Data By Comparing it with PSA Data":
http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/biw/g-i/how to validate infocube data by comparing it with psa data
You can also quickly build a query on those tables in R/3 using transaction SQ01 and the QuickViewer button for comparison.
Best regards,
Eugene -
Error reading the data from the remote cube
Hi all,
When we try to get the data for the remote cube from LISTCUBE, we get the following messages:
1) Messages for DataSource 9AUPA_DP_HK_01 from source system AD1CLNT100
2) 224(/SAPAPO/TSM): No planning version selected
3) Errors have occurred while extracting data from DataSource 9AUPA_DP_HK_01
4) Error reading the data of InfoProvider UICHKRMTC.
Any inputs?
Hi,
Check whether the source system is responding.
Also, regarding the error message 224(/SAPAPO/TSM): No planning version selected: it seems you have not selected a planning version. Enter a planning version on the LISTCUBE selection screen and execute.
Regards,
Ravi Kanth -
Error while assigning source system to SAP Remote cube
I need to build a query using a remote cube whose source system is a flat file.
The system won't let me assign the source system to the remote cube.
The R/3 system icon is shown next to the remote cube's technical name,
and I can't see my flat file source system.
Can somebody help me?
Hello Kiran,
I don't think it's possible to use remote cubes with PC_FILE or flat file source systems.
You can use VirtualProviders in these different scenarios:
VirtualProviders based on DTP => used for SAP source systems
VirtualProviders based on BAPI => used for non-SAP or external systems
VirtualProviders based on function modules => use this VirtualProvider if you want to display data from non-BI data sources in BI without having to copy the dataset into the BI structures. The data can be local or remote. You can also use your own calculations to change the data before it is passed to the OLAP processor. This function is used primarily in the SAP Strategic Enterprise Management (SEM) application.
InfoObjects used as VirtualProviders
Your case is none of these.
More details can be found under this link:
http://help.sap.com/saphelp_nw70/helpdata/EN/84/497e4ec079584ca36b8edba0ea9495/frameset.htm -
Creation of Remote Cube on a view
Hi,
How do you create a remote cube on R/3 view?
Thanks
Suchitra
Steps to enhance the existing cube and get the data from R/3
I am at entry level.
If I want to enhance an existing cube and pull data from R/3 into the cube, what are the main steps?
In my scenario (SD module), the InfoCube is a custom cube (a contract cube). I have to enhance this existing custom cube with new InfoObjects. The person who created this cube years ago used a generic DataSource.
Please explain the main steps I need to perform, from R/3 until I get the data into BI.
Check whether you can enhance the existing generic DataSource to bring data for the new InfoObjects, or whether those fields already exist in the DataSource. If they exist, you only have to make changes on the BW side; if not, you will first have to modify the generic DataSource.
If you have modified the DataSource, replicate the DataSource on the BW side.
Create the new InfoObjects and add them to the cube > activate the InfoCube.
Map the InfoObjects to the fields in the DataSource > activate the transformation.
Test it by loading the data.
Now move these changes to production:
Move the DataSource changes to the R/3 production system (if applicable).
Replicate the DataSource in the BW production system.
Move the BW transports to the production system.
You will have to communicate to the users that data will not be available for some time, or reload the data into the cube during off-business hours or on a weekend:
Delete the data from the InfoCube.
Initialize the delta without data transfer.
Do a full repair load to the cube.
Now you can run the delta on a daily basis.
I hope it helps.
Regards,
Gaurav -
Hi,
I am trying to generate a report in BW using remote cubes with an SQL Server source via UD Connect. I am using this process to reconcile some of the values. I get a date from the source and I want to display it as a fiscal period.
Creating a variable is not of much use, because the code executes up front and is passed as selection criteria to UD Connect, which may impact performance. I want the variable/formula to execute after the data arrives in BEx.
I tried creating a text variable with replacement path on the characteristic (date) so that I can pick out MMYYYY, but the text variable is not working.
Can anyone advise me on this?
Regards
Sudheer
Hi,
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/f05d0b81-076a-2c10-44ab-f00b0b90ce38?QuickLink=index&overridelayout=true
This may help you.