Get metadata of the cube or query
Hello all,
is there a function module that, given a query or InfoCube name, returns the metadata of the
cube or the query?
regards
kaushik
Hi there,
The best way to browse the metadata of a cube or a query is transaction RSA1 -> Metadata Repository.
To read the metadata of a cube programmatically, you can use the function module BAPI_CUBE_GETDETAIL.
To read the definition of a query, you can write a program; I use a program from SDN for that.
Diogo.
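As a rough illustration of Diogo's suggestion, the BAPI could also be called remotely, for example from Python with an RFC library such as pyrfc. The sketch below is hedged: the connection setup is only shown in a comment, and the exact export structure contents are assumptions; a stub connection stands in for a real SAP system.

```python
# Hypothetical sketch: calling BAPI_CUBE_GETDETAIL over RFC.
# With the real pyrfc library the connection would be opened like:
#   from pyrfc import Connection
#   conn = Connection(ashost=..., sysnr=..., client=..., user=..., passwd=...)

def get_cube_details(conn, infocube):
    """Return the DETAILS structure for an InfoCube via BAPI_CUBE_GETDETAIL."""
    result = conn.call("BAPI_CUBE_GETDETAIL", INFOCUBE=infocube)
    return result.get("DETAILS", {})

# Offline demonstration with a stub connection (no SAP system needed):
class _StubConn:
    def call(self, fm, **params):
        assert fm == "BAPI_CUBE_GETDETAIL"
        # Field names below are assumptions for illustration only.
        return {"DETAILS": {"INFOCUBE": params["INFOCUBE"], "INFOAREA": "0SD"}}

details = get_cube_details(_StubConn(), "0IC_C03")
print(details["INFOCUBE"])
```

Against a real system the stub would simply be replaced by the pyrfc `Connection` object; the wrapper itself stays the same.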
Similar Messages
-
Master Data Attribute is not getting reflected in the Cube?
Hi All,
I have loaded data from the ODS into the customer master cube, and it contains a lot of navigational attributes,
for instance :
master data for customer in development is
customer customer group 5(nav attributes) Customer pricing group
1001 HAW IE
When I load customer 1001 from the ODS into the cube, I can see that the customer carries the navigational attribute for customer pricing group, but CUSTOMER GROUP5 is not displayed in the cube.
I checked the master data, and it does contain values for customer group 5. How can I rectify this issue?
Please advise me!
Thanks
Pooja

Hi all,
I have explicitly switched on the navigational attributes in the cube, and I can see values for the other navigational attributes of customer master being displayed; only this one navigational attribute is not displayed.
For instance
customer, customer gp2, customer grp5
100, nr, nv
Only customer grp5, which has values in the master data, is not displayed; customer grp2 and customer are displayed.
Thanks
Pooja -
Time-dependent master data in the cube and query
Hello,
I have a time-dependent master data InfoObject with two time-dependent attributes (one of them is a key figure). If I add this InfoObject to the cube, which time period's SID is used during the load? I assume it only matters during the load; if I add the key figure to the query, it gets its value based on the SID in the cube, right?
Thanks,
vamsi.

For a time-dependent master data object, each master data load overwrites any changed attribute values: the new value replaces the old one. When the query is executed, the InfoObject in the InfoCube carries the SID, and what is displayed is whatever value the master data table holds for that SID at that moment.
That is my experience.
Thanks,
Rajendra.A -
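To make the time-dependent lookup concrete: at read time, the attribute value that applies is the one whose validity interval covers the key date. This minimal Python sketch (hypothetical data, not SAP code) shows the interval lookup:

```python
from datetime import date

# Hypothetical time-dependent master data: (valid_from, valid_to, value) rows.
master_data = [
    (date(2023, 1, 1), date(2023, 6, 30), 10.0),
    (date(2023, 7, 1), date(9999, 12, 31), 12.5),  # currently valid interval
]

def attribute_on(key_date, intervals):
    """Return the attribute value whose validity interval covers key_date."""
    for valid_from, valid_to, value in intervals:
        if valid_from <= key_date <= valid_to:
            return value
    return None  # no interval covers the key date

v_early = attribute_on(date(2023, 3, 15), master_data)
v_late = attribute_on(date(2024, 1, 1), master_data)
print(v_early, v_late)
```

The SID stored in the cube plays the role of the key into such a table; which validity interval wins depends on the key date used at lookup time.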
Hi,
In my InfoCube, the material type for one of the materials is not displayed.
When I check the content of the cube for this material, all the fields are displayed except material type.
However, it is present in the material master data, from which it is fed through the update rules into the cube.
It is displayed for other materials, so we cannot say that the mapping is wrong or that there is a problem with the update rules.
Can somebody let me know what the reason could be?
Thanks,
Jeetu

Hi Jeetu,
Can you check whether your cube contains, for this one material, entries with AND entries without MATL_TYPE? If that is the case, then you loaded transactional data before the corresponding material master data existed.
You should adapt your scenario:
- First, do not use the standard attribute derivation in your update rules: its performance is very bad.
- Implement a start routine that fills an internal table with MATERIAL and MATL_TYPE for all materials in your data package.
- Implement an update routine on MATL_TYPE with a READ on this internal table, and raise ABORT = 4 if MATL_TYPE is initial or the material is not found.
To fix your current situation you will have to reload your cube, or alternatively reload just the MATERIAL records with missing MATL_TYPE from the cube itself and selectively delete the empty ones.
hope this helps...
Olivier. -
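Olivier's routine pattern, fill a lookup table once per data package in the start routine, then read it per record in the update routine and abort when the type is missing, can be sketched language-neutrally. This Python version only illustrates the logic (all names hypothetical), it is not the ABAP code itself:

```python
# Hypothetical master data known at load time: material -> material type.
master_data = {"M-01": "FERT", "M-02": "ROH"}

def start_routine(datapackage):
    """Build the lookup table once for all materials in the package."""
    return {rec["MATERIAL"]: master_data.get(rec["MATERIAL"])
            for rec in datapackage}

def update_routine(record, lookup):
    """Return the material type, or raise to abort the package (ABORT = 4)."""
    matl_type = lookup.get(record["MATERIAL"])
    if not matl_type:
        raise ValueError("ABORT: no material type for %s" % record["MATERIAL"])
    return matl_type

package = [{"MATERIAL": "M-01"}, {"MATERIAL": "M-02"}]
lookup = start_routine(package)
types = [update_routine(r, lookup) for r in package]
print(types)
```

The point of the single up-front lookup is performance: one read of master data per package instead of one per record, which is exactly why Olivier advises against the standard attribute derivation.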
Amount not getting displayed in the cube
Hi,
I have a ZTable which has got an Amount field declared like this.
AMOUNT WAERS CUKY 5 0 Currency Key
And I declared a ZAMOUNT as Amount in the cube and provided 0CURRENCY.
Initially there were no values in the amount column of the Z-table, and hence no amount value appeared in the cube. After entering values in the Z-table, I deleted and replicated the DataSource, created transfer rules and update rules, then created an InfoPackage and a Data Transfer Process. But I am still not able to see the amount value in the cube.
Kindly Help
Sam...

> Hi Sam,
>
> Did you assign the AMOUNT field from the Z-table to InfoObject
> ZAMOUNT in the transfer rules? Move AMOUNT from left to
> right on the tab 'DataSource/Transf. Rules'.
> And is ZAMOUNT mapped (to ZAMOUNT)?
>
> hope this helps.
Hi Edwin,
The assigning and mapping you mention is just drawing an arrow from AMOUNT on the left to ZAMOUNT on the right, correct? I did that. But there is an additional 0CURRENCY on the right side that is not mapped. Should I map that to AMOUNT as well? Could there be an issue with the rules?
Regards,
Sam... -
Data is not getting loaded in the Cube
Hi Experts,
I had a cube for which I created aggregates; then I deleted the cube's data, made some changes to it, and am now trying to load the data again from the PSA.
But the data is not getting loaded into the cube.
Can anyone tell me whether I have to delete the aggregates before loading fresh data into my cube? If yes, can you tell me how to delete them and load again?
If it is something else please help me resolve my issue.
Thanks

Hi,
Deactivate the aggregates and then load the data from PSA to Cube.
And then reactivate the aggregates. While reactivating them you can ask to fill them also.
Regards,
Anil Kumar Sharma. P -
Data Not getting transferred in the Cube for 2LIS_03_BF
Hello Gurus,
I installed the cube 0IC_C03 from standard Business Content. When I execute the data load, the data appears in the PSA but is not updated in the cube.
The InfoPackage is set to the mode "In PSA and then at once in InfoCube".
Regards
Lalan

Hi,
It is a problem with process keys. Take a look at these links:
Re: Problem extracting 2LIS_03_BX into 0IC_C03
Re: Records Not Added
Re: Inventory Management
Re: 0PROCESSKEY
With rgds,
Anil Kumar Sharma .P
Message was edited by: Anil Kumar Sharma
Value gets doubled in the cube when doing a data mart
Hi,
I am doing a data mart load from one cube to another for different consolidation units, and the value gets doubled for all the consolidation units except one. There are no duplicate records in the cube.
Can anyone tell me how to debug this issue, or what the reasons could be? This is very urgent.
Thnaks and Regards,
Subha

In the cube that is being loaded, look at its content, restricting on the request ID of the request that loaded the cube.
If you find the consolidation unit values doubled there, check your update rule and transfer rule to see whether the value is being changed anywhere.
Then check the PSA.
Hope that helps.
Regards. -
A keyfigure is not getting displayed in the DSO and query
hi friends,
I have newly developed a DSO with 11 key figures and some 10 characteristics. I created the DTP, transformations etc., loaded data into it and activated it successfully.
Now when I display the data of this DSO, one of my key figures is not displayed.
The same key figure does not appear in the query either.
But when I check the active table of this DSO in SE11, that key figure is displayed with values.
Could anyone help me through this issue?

Hi,
Even I faced such an issue earlier. I resolved it simply by readjusting the DSO, i.e. deleting the key figure and adding it to the structure once again; before this you have to delete the data in the DSO. Also, if you have a MultiProvider on the DSO, make sure that the key figure concerned is identified.
Let us know if this works for you. Thanks. -
Not able to get metadata of blob items using Java or the URI
String metadata = blob.getMetadata().toString();   // NullPointerException thrown here
System.out.println(metadata);
blob.download(fileOutputStream);
metadata = blob.getMetadata().toString();          // works after the download
System.out.println(metadata);
I get a NullPointerException on the first getMetadata() call,
and I am only able to get the metadata of the blob item after I initiate the download.

Hi,
You got a NullPointerException on the first getMetadata() call; what was the blob at that point? I am not very familiar with Java, but from my experience we need to know the details of getMetadata(). About downloading blob files, I suggest you read this article:
http://www.windowsazure.com/en-us/documentation/articles/storage-java-how-to-use-blob-storage/
Best Regards
How to make sure the Live cache information is getting reflect in the bex?
Hi All,
When the user enters some data in the planning book and saves it, it should be reflected in the cube, shouldn't it? But when I run the query, it is not reflected.
Which technical settings do I need to make so that those changes are reflected, in APO version 3.5?
Thanks
Pooja

Hi Pooja, Rafael,
You can actually have a BEx report on live data.
For that you need to use a remote cube, not a basic one.
One comment though: you access only the information stored in liveCache, so for example you will not easily get the projected stock.
By default the projected stock is not stored; and even if you store it in a time series key figure, you still need to run the calculation, so any change in orders will not be reflected until the next stock calculation.
Getting the projected stock in real time is therefore more complex and requires some ABAP coding; getting the orders in real time is quite easy to do with a remote cube.
Good luck
Julien -
Account Dimension not showing all members in the cube in Analysis Services
Hi,
In SAP BPC 5.1, after processing the account dimension, all the members are created under the account dimension in Analysis Services, but the same cannot be found in the data cube. Hence the reports generated through SAP BPC do not show all the members.
The issue looks very strange, as we can see the members being created, but they are not populated in the cubes. I cannot work out what exactly the issue is. Is it with the application or with Analysis Services?
Thanks
Sharath

Your sixth sense is correct: there is definitely support in MSAS and BPC 5.1 for multiple hierarchies in the account dimension.
I'm also referring to parentH1 and parentH2, but perhaps we're still speaking of different things.
In the past, I faced a very similar problem as you, and the root cause was because I had one member, let's call it FancyParent, which, in H1, had children Child1 and Child2.
But in H2 it had children Child1, Child2 and Child3. I forget now whether that was how I wanted the setup to be, or whether it was a mistake on my part, but either way MSAS doesn't allow it. The admin console didn't complain when processing the dimension (this was in Outlooksoft 5.0; perhaps validation has improved since).
But the cube was completely unworkable. Certain things were calculating correctly, but everything in the account dimension in the area around FancyParent (above and below it, in both hierarchies) was quite unpredictable.
By disabling first one, and then the other, of the two hierarchies, and disabling blocks of accounts, I was eventually able to pinpoint the problem. But it took days to figure out what was the problem. (The account dimension had 2500 members and 4 hierarchies, and it was not a pretty sight.)
A parent must have the same definition of children in all hierarchies. It can't, as another example, have children in H1, and be a base member in H2. Each member can have different parents in the two hierarchies, but must always have the same children in both.
To work around this problem, I had to create two separate accounts NetIncomeH1 and NetIncomeH2 (and PretaxIncomeH1, PretaxIncomH2, etc. all the way down to the point where they branched off), to get the two separate aggregations of the P&L in the two hierarchies. Once I did that, it made sense to me why, but I also swore off on frivolous extra hierarchies ever again. -
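The rule described above, that a member must have the same set of children in every hierarchy, can be checked mechanically before processing the dimension. A hypothetical Python sketch over parent-child pairs (member names taken from the example above):

```python
from collections import defaultdict

def children_per_hierarchy(members):
    """members: list of (member, parent_in_H1, parent_in_H2) tuples."""
    kids = defaultdict(lambda: defaultdict(set))
    for member, p1, p2 in members:
        if p1:
            kids[p1]["H1"].add(member)
        if p2:
            kids[p2]["H2"].add(member)
    return kids

def inconsistent_parents(members):
    """Parents whose child sets differ between H1 and H2 (including a
    parent that has children in one hierarchy but is a base member in
    the other, which is also invalid)."""
    kids = children_per_hierarchy(members)
    return sorted(p for p, h in kids.items() if h["H1"] != h["H2"])

# FancyParent has Child3 only in H2; OtherParent is a base member in H2.
dim = [("Child1", "FancyParent", "FancyParent"),
       ("Child2", "FancyParent", "FancyParent"),
       ("Child3", "OtherParent", "FancyParent")]
bad = inconsistent_parents(dim)
print(bad)
```

A check like this, run against the dimension source data, would have pinpointed FancyParent in minutes rather than the days of disabling hierarchies described above.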
Query of the cube: '#' is getting displayed
Hi All,
In the query on the cube, wherever a characteristic is blank in the cube it is displayed as '#'. I want it to display a blank. How can I overcome this problem?
thanks

Hi Pavan,
Please try to search the forums before posting a question. See these threads:
Replace the default '#' in BEx
Re: replacing '#'-sign for 'not assigned' in queries
Hope this helps... -
Data in the Cube not getting aggregated
Hi Friends
We have Cube 1 and Cube 2.
The data flow is represented below:
R/3 DataSource>Cube1>Cube2
In Cube1 data is Stored according to the Calender Day.
Cube2 has Calweek.
In Transformations of Cube 1 and Cube 2 Calday of Cube 1 is mapped to Calweek of Cube 2.
When I upload data from Cube 1 into Cube 2, the key figure values are not summed.
EXAMPLE: Data in Cube 1
MatNo CustNo qty calday
10001 xyz 100 01.01.2010
10001 xyz 100 02.01.2010
10001 xyz 100 03.01.2010
10001 xyz 100 04.01.2010
10001 xyz 100 05.01.2010
10001 xyz 100 06.01.2010
10001 xyz 100 07.01.2010
Data in Cube 2:
MatNo CustNo qty calweek
10001 xyz 100 01.2010
10001 xyz 100 01.2010
10001 xyz 100 01.2010
10001 xyz 100 01.2010
10001 xyz 100 01.2010
10001 xyz 100 01.2010
10001 xyz 100 01.2010
But Expected Output Should be:
MatNo CustNo qty calweek
10001 xyz 700 01.2010
How can I achieve this?
I checked the transformations: all key figures are set to aggregation "Summation".
regards
Preetam

I have just now performed a consistency check on the cube and I am getting the following warnings:
Time characteristic 0CALWEEK value 200915 does not fit with time char 0CALMONTH val 0
Consistency of time dimension of InfoCube &1
Description
This test checks whether or not the time characteristics of the InfoCube used in the time dimension are consistent. The consistency of time characteristics is extremely important for non-cumulative Cubes and partitioned InfoCubes.
Values that do not fit together in the time dimension of an InfoCube result in incorrect results for non-cumulative cubes and InfoCubes that are partitioned according to time characteristics.
For InfoCubes that have been partitioned according to time characteristics, conditions for the partitioning characteristic are derived from restrictions for the time characteristic.
Errors
When an error arises, the InfoCube is marked as a cube with an inconsistent time dimension. This has the following consequences:
The derivation of conditions for partitioning criteria is deactivated on account of the non-fitting time characteristics. This usually has a negative effect on performance.
When the InfoCube contains non-cumulatives, the system generates a warning for each query indicating that the displayed data may be incorrect.
Repair Options
Caution
No action is required if the InfoCube does not contain non-cumulatives or is not partitioned.
If the Infocube is partitioned, an action is only required if the read performance has gotten worse.
You cannot automatically repair the entries of the time dimension table. However, you are able to delete entries that are no longer in use from the time dimension table.
The system displays whether the incorrect dimension entries are still being used in the fact table.
If these entries are no longer being used, you can carry out an automatic repair. In this case, all time dimension entries not being used in the fact table are removed.
After the repair, the system checks whether or not the dimension is correct. If the time dimension is correct again, the InfoCube is marked as an InfoCube with a correct time dimension once again.
If the entries are still being used, use transaction Listcube to check which data packages are affected. You may be able to delete the data packages and then use the repair to remove the time dimension entries no longer being used. You can then reload the deleted data packages. Otherwise the InfoCube has to be built again. -
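To illustrate the week-level roll-up Preetam expects: once CALDAY is mapped to CALWEEK, rows that share the same characteristic combination should collapse into one row, with the key figure summed. A small Python sketch using the thread's example data (all seven calendar days map to week 01.2010):

```python
from collections import defaultdict

# Cube 1 rows after the CALDAY -> CALWEEK mapping: (MatNo, CustNo, qty, calweek).
rows = [("10001", "xyz", 100, "01.2010") for _ in range(7)]

def aggregate(rows):
    """Sum the quantity key figure over identical characteristic combinations."""
    totals = defaultdict(int)
    for matno, custno, qty, calweek in rows:
        totals[(matno, custno, calweek)] += qty
    return dict(totals)

totals = aggregate(rows)
print(totals)  # expect a single row with qty 700 for week 01.2010
```

If the cube instead shows seven separate 100-qty rows for the same week, some hidden characteristic (for example the request ID, or an inconsistent time dimension as the check above warns) is still distinguishing the rows and preventing the collapse.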
java.sql.SQLRecoverableException error while querying the cube
Hi
I get the following error when I try to query the cube from a Java program
java.sql.SQLRecoverableException: No more data to read from socket.
This error occurs sometimes and sometimes it does not occur.
We observed that the error crops up when many people query the cube simultaneously.
Is this a bug, or is there a solution or a method for dealing with this situation?

That sounds like the connection to the database is not there or was dropped. Check that you are not maxed out on the number of processes.
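One defensive pattern for this dropped-connection symptom is to retry the query once on a fresh connection. This is a generic, hedged sketch (all names hypothetical, not Oracle- or JDBC-specific code), demonstrated offline with a fake connection:

```python
class RecoverableError(Exception):
    """Stands in for java.sql.SQLRecoverableException / a dropped connection."""

def query_with_retry(make_connection, sql, retries=1):
    """Run sql, opening a fresh connection and retrying on recoverable errors."""
    attempt = 0
    while True:
        conn = make_connection()
        try:
            return conn.execute(sql)
        except RecoverableError:
            attempt += 1
            if attempt > retries:
                raise  # give up after the allowed number of retries

# Offline demonstration: the first connection fails, the second succeeds.
class _FlakyConn:
    failures = [True, False]
    def execute(self, sql):
        if _FlakyConn.failures.pop(0):
            raise RecoverableError("no more data to read from socket")
        return ["row1", "row2"]

result = query_with_retry(_FlakyConn, "SELECT * FROM cube")
print(result)
```

A retry only papers over the symptom, of course; if the errors correlate with concurrent users, the process/session limit mentioned in the reply is the first thing to check.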