How to make data loaded into cube NOT ready for reporting
Hi Gurus: Is there a way by which data loaded into a cube can be made NOT available for reporting?
Please suggest.
Thanks
See, by default a request that has been loaded into a cube becomes available for reporting. But if you have an aggregate, the system needs this new request to be rolled up into the aggregate as well before it is available for reporting. Why? Because queries are written against the cube, not against the aggregate, so you only know at runtime whether a query will hit a particular aggregate. This means that whether a query reads its data from the aggregate or from the cube, it must ultimately return the same result in both cases. Now, if a request has been added to the cube but not yet to the aggregate, the two objects will contain different data. The system takes the safer route of not making the un-rolled-up data visible at all, rather than serving inconsistent data.
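The consistency rule described above can be illustrated with a small sketch (a deliberately simplified model of request visibility, not SAP code; the request IDs and figures are invented):

```python
# Simplified model (assumption, not SAP code): why BW hides requests that
# are loaded into a cube but not yet rolled up into its aggregate.

# Each request holds (characteristic, key_figure) rows.
cube_requests = {
    "REQU_1": [("A", 10), ("B", 20)],
    "REQU_2": [("A", 5)],          # newly loaded, not yet rolled up
}
rolled_up = {"REQU_1"}             # requests already in the aggregate

def query_total(rows):
    """A trivial 'query': sum the key figure over all rows."""
    return sum(v for _, v in rows)

# If every loaded request were visible, cube and aggregate would disagree:
all_cube_rows = [r for rows in cube_requests.values() for r in rows]
agg_rows = [r for req in rolled_up for r in cube_requests[req]]
assert query_total(all_cube_rows) != query_total(agg_rows)  # inconsistent!

# BW's safe route: only requests present in BOTH cube and aggregate are
# released for reporting, so either access path gives the same answer.
visible_rows = [r for req in rolled_up for r in cube_requests[req]]
print(query_total(visible_rows))  # same total via cube or aggregate path
```

Once REQU_2 is rolled up (added to `rolled_up` in this model), it becomes visible and both paths agree again.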
Hope this helps...
Similar Messages
-
AWM Newbie Question: How to filter data loaded into cubes/dimensions?
Hi,
I am trying to filter the amount of data loaded into my dimensions in AWM (e.g., I only want to load like 1-2 years worth of data for development purposes). I can't seem to find a place in AWM where you can specify a WHERE clause...is there something else I must do to filter data?
Thanks
Hi there,
Which release of Oracle OLAP are you using? 10g? 11g?
You can use database views to filter your dimension and cube data, and then map these views in AWM.
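A filtering view of this kind can be sketched as follows (illustrative only; the table and column names are assumptions, and SQLite stands in for Oracle here):

```python
# Sketch (assumed schema): filter source data with a database view, then
# map the view instead of the base table in AWM.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales_fact (day TEXT, amount REAL)")
con.executemany("INSERT INTO sales_fact VALUES (?, ?)",
                [("2007-06-01", 100.0),
                 ("2009-03-15", 250.0),
                 ("2010-11-30", 75.0)])

# The view restricts the load to roughly two years of data; in AWM you
# would map the cube/dimension to SALES_FACT_DEV rather than SALES_FACT.
con.execute("""CREATE VIEW sales_fact_dev AS
               SELECT * FROM sales_fact
               WHERE day >= '2009-01-01'""")

rows = con.execute("SELECT COUNT(*) FROM sales_fact_dev").fetchone()
print(rows[0])  # only the 2009/2010 rows survive the filter
```

The same WHERE-clause idea applies to dimension tables, so the dimension members stay consistent with the filtered fact data.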
Thanks,
Stuart Bunby
OLAP Blog: http://oracleOLAP.blogspot.com
OLAP Wiki: http://wiki.oracle.com/page/Oracle+OLAP+Option
OLAP on OTN: http://www.oracle.com/technology/products/bi/olap/index.html
DW on OTN : http://www.oracle.com/technology/products/bi/db/11g/index.html -
How to make data in Essbase cube equal to data in DW when drilling through
Are there standard ways in Oracle BI + Essbase to keep the data in the Essbase cube and in the DW equal (corresponding)?
For example, when we drill down from the cube to the DW, additional (new) data may already have been loaded into the DW that is not yet loaded into the cube.
So we have a situation where the data in the cube does not correspond to the data in the DW. I think rebuilding the cube more frequently does not solve the problem: there will still be a significant gap between the moment data is loaded into the DW and the moment the cube is updated.
I thought of creating 2 tables in DW (“new” and “old”) and 2 cubes (“new” and “old”).
So the process of loading data will look like this:
1. We have corresponding data in table “old” and cube “old”. User always works with “old” objects.
2. Load data to table “new”.
3. Load data to cube “new” from table “new”.
4. Rename the tables and cubes, swapping “old” and “new”. At this point users start working with the updated cube and table.
5. Apply the new changes to cube and table “new” (which still contain the old data).
6. Go to step 2.
But this way is too expensive (storage amount doubles).
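Step 4, the swap, can be sketched with standard SQL renames (illustrative table names; the cubes on the Essbase side would be swapped analogously):

```python
# Sketch of the step-4 swap using SQL renames (table names are invented).
# SQLite stands in for the DW here; the same three-way rename works in
# most databases, ideally inside one transaction.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales_old (v INTEGER)")
con.execute("CREATE TABLE sales_new (v INTEGER)")
con.execute("INSERT INTO sales_old VALUES (1)")   # data users query today
con.execute("INSERT INTO sales_new VALUES (2)")   # freshly loaded data

# Three-way rename so users switch to the fresh data in one step:
con.execute("ALTER TABLE sales_old RENAME TO sales_tmp")
con.execute("ALTER TABLE sales_new RENAME TO sales_old")
con.execute("ALTER TABLE sales_tmp RENAME TO sales_new")

row = con.execute("SELECT v FROM sales_old").fetchone()
print(row[0])  # queries against "old" now see the freshly loaded data
```

The rename itself is cheap; the cost objection in the thread is about keeping two full copies of the data, which the rename trick does not avoid.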
Maybe an easier way can be found? -
How to Achieve this Data Load into Cube
Hi Experts,
Could you please update me on how to achieve this
I have history data (25-35 million records) covering about 8 years to be loaded into BW.
What is the best approach:
1) Load everything into one cube and create aggregates, or
2) Create 4 different cubes (with the same data model), load 2 years of data into each cube (2 years x 4 cubes = 8 years of data), and develop a MultiCube on top of the 4 cubes?
If so, how can I load the data into the respective cubes?
Ex: Let's assume I have data from 01.01.2000 to 31.12.2007, which is 8 years of data.
Now I have created 5 cubes: C1, C2, C3, C4 & C5.
How can I specifically:
load data from 01.01.2000 to 31.12.2001 (2 Years) to C1
load data from 01.01.2002 to 31.12.2003 (2 Years) to C2
load data from 01.01.2004 to 31.12.2005 (2 Years) to C3
load data from 01.01.2006 to 31.12.2007 (2 Years) to C4
load data from 01.01.2008 to 31.12.2010 (2 Years) to C5 (Currently Loading)
Please advise the best approach to be followed and why
Thanks
If you already have cube C5 being loaded and the reports are based on this cube, then if you do not want to create additional reports, you can go ahead and load the history data into this cube C5 as well.
What are your source system and DataSource? Are selection conditions available (in your InfoPackage) to specify the selections? If so, you can go ahead and do full loads into the current cube.
For query performance, you can create aggregates on this cube based on the fiscal period / month / year ( whichever infoobject is used in the reports)
If your reports are not based on a time period, then a MultiCube query runs as parallelized sub-queries, so 4 dialog processes will be occupied on your BW system every time the query is run.
Also, any change you make to one cube will have to be copied to all the cubes, so maintenance may be a concern.
If there is enough justification, then approach 2 can be taken up. -
How to check data in the cube ? just for a selected request .
hi,
I am running 5 different DTPs for a cube, and I can see 5 successful request IDs available for reporting (in Manage for the cube).
Now, how do I check the records for a particular request, so that I can verify whether the data is coming in correctly?
If I select the request ID (on the Requests tab) and then switch to the Contents tab, I believe it shows me all the records in the cube from all the DTPs.
please help me out,
thanks
Hi,
Check if you have a loading-date field in your InfoCube; you can reconcile the records based on the loading date.
SujanR -
Mainframe data loaded into Oracle tables - Test for low values using PL/SQL
Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle. Some columns in the mainframe data had 'low values' in them. These columns were defined on the Oracle tables as VARCHAR2 types.
Looking at the data, some of these columns appear to contain little square boxes; I am not sure, but maybe that is how Oracle renders the 'low values' from the original data in a VARCHAR2. When I run a SELECT to find all rows where this column is NOT NULL, it returns these rows. In the result of the SELECT the columns appear blank; however, looking at the data in SQL Developer, I can see the odd 'square boxes'. My guess is that the SELECT is detecting that something exists in the column.
Long story short, I am somehow going to have to test this legacy data in the Oracle table using PL/SQL to check for 'low values'. Does anyone have any suggestions on how I could do this? Help! The mainframe data we are loading into these tables is full of columns with low values.
I am using Oracle 11i.
Thanks
Edited by: ncsthbell on Nov 2, 2009 8:38 AM
ncsthbell wrote:
> Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle.
Not a wise thing to do. Mainframe operating systems typically use EBCDIC, while Unix and Windows servers use ASCII. The endianness is also different (big-endian vs little-endian).
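To see concretely what those 'low values' are, here is a small sketch (the column value is invented; COBOL LOW-VALUES correspond to 0x00 bytes, which is what shows up as square boxes in a GUI):

```python
# Sketch: what 'low values' look like and how to detect them. COBOL
# LOW-VALUES are 0x00 (NUL) bytes; copied into a VARCHAR2 column they
# render as the little square boxes described above, and the column is
# NOT NULL even though it looks blank.
def has_low_values(s: str) -> bool:
    """True if the string contains NUL (0x00) characters."""
    return "\x00" in s

legacy_col = "AB\x00\x00"          # hypothetical column value
print(has_low_values(legacy_col))  # detected despite looking blank/boxy

# Cleaning: strip the NULs. The PL/SQL equivalents would be
# REPLACE(col, CHR(0)) for cleanup and INSTR(col, CHR(0)) > 0 for a test.
cleaned = legacy_col.replace("\x00", "")
print(repr(cleaned))
```
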
> Does anyone have any suggestions on how I could do this?
As suggested, use the SQL function DUMP() to see the actual contents (in hex) of these columns. -
Data Entry Layouts - Rows not ready for inputs - Multiple records
Hi,
I have created a simple layout using a predefined simple structure for rows (FS Items) and a data driven one for columns (for the period value in LC).
Some cells show some data in them and are not available for input because there are multiple transaction records behind the numbers.
The transaction records share the same breakdowns except for the Currency Translation Indicator (it is empty for original records and shows '1' for records created during currency translation) and the currency key for transaction currency (sometimes it is empty).
It would be much appreciated if you could let me know a way to enter new data with layouts for FS Items (accounts) that already have transaction records.
Hi Roberto,
Data may not be ready for input for different reasons. The most common ones I have met are: you didn't expand the row structure down to the leaves (not the nodes), or you marked the column as display-only. -
Hi,
When I run a DTP, I get the error below in SM21. I checked notes 1634716 - SYB: Lock timeout or deadlocks and 1933239 - SYB: Shortdumps with resource shortage and runtime error DBIF_SETG_SQL_ERROR, but they didn't solve the issue. Any help will be appreciated.
Database error 12205 at OPC
> [ASE Error SQL12205]Could not acquire a lock within the
> specified wait period. SERVER level wait period=1800 seconds
> spid=568, lock type=shared intent, dbid=4, objid=838472672,
> pageno=0, rowno=0. Aborting the transaction.#
Runtime error "DBIF_DSQL2_SQL_ERROR" occurred.
Thanks,
I also get the same error when I try to collapse (compress) the cube.
It always fails with the same error in RSDU_PARTITIONS_INFO_GET_SYB.
I found a correction in note 1616762 - SYB: Fix collection for table partitioning, but it didn't solve anything; it gives CX_SY_NATIVE_SQL_ERROR in the same function. The correction deletes the "at isolation 1" line.
Any idea ? -
How to make data of responsibilities read-only for users
hi everyone
I need help finding a way to make all data in the E-Business Suite application read-only. I tried to attach the different responsibilities with profile options, but I can't make this process work.
If anyone knows any way to help make this work, it will be much appreciated.
Thanks in advance
My said ABAROUN
So you'd better ask this in the E-Business forum.
Sybrand Bakker
Senior Oracle DBA -
I want to ask you something. My brother has an iPhone 5s that is disabled for about 23 million minutes; the time changed to 1976. How can he get it back to 2015 without waiting 23 million minutes, and enable it without erasing any data? Please answer me ASAP because I really need it. Thank you for your answer!
Hello theshadowhunters,
There is not much you can do to change the time back, because he will not have access to change it. The only option to get past this is to restore the iPhone. If you have a backup, you can put it back on once the restore is complete. For more information, take a look at the article below.
Forgot passcode for your iPhone, iPad, or iPod touch, or your device is disabled
https://support.apple.com/en-us/HT204306
Regards,
-Norm G. -
How is data loaded into the BCS cubes?
We are on SEM-BW 4.0 package level 13. I'm totally new to BCS from the BW view point. I'm not the SEM person but I support the BW side.
Can anyone explain to me, or point me to documentation that explains, how the data gets loaded into cube 0BCS_C11 Consolidation (Company/Cons Profit Center)? I installed the delivered content and I can see various export DataSources that were generated. However, I do not see the traditional update rules, InfoSources, etc.
The SEM person has test loaded some data to this cube and I can see the request under 'manage' and even display the content. However the status light remains yellow and data is not available for reporting unless I manually set the status to green.
Also, I see on the manage tab under Info-Package this note: Request loaded using the APO interface without monitor log.
Any and all assistance is greatly appreciated.
Thanks
Denny
Hi Dennis,
For reporting the virtual cube 0BCS_VC11 which is fed by 0BCS_C11 is used.
You don't need to worry about the yellow status. The request is closed automatically after it reaches 50,000 records.
About the data stream - you're right - the BW cube is used.
And if your BW has cubes with information for BCS on a monthly basis, you may arrange a load from a data stream.
I make this BW cube as similar in structure to 0BCS_C11 as possible, for a smooth data load. The cube might be fed by another cube that contains the information in another format; in the update rules of the first cube you can transform the data so that the cube structures are compatible.
Best regards,
Eugene -
How to delete the data loaded into MySQL target table using Scripts
Hi Experts
I created a job with a validation transform. The data that passes validation is loaded into the Pass table and the data that fails is loaded into the Fail table.
My requirement is: if data was loaded into the Fail table, then I have to delete the data loaded into the Pass table using a script.
In the script I have written the code as
sql('database','delete from <tablename>');
but the SQL query execution raises an exception for the query.
How can I delete the data loaded into the MySQL target table using scripts?
Please guide me for this error
Thanks in Advance
PrasannaKumar
Hi Dirk Venken,
I got the solution; the mistake I made was that the query was not correct for MySQL.
This works:
sql('MySQL', 'truncate world.customer_salesfact_details')
This was the error query (invalid syntax):
sql('MySQL', 'delete table world.customer_salesfact_details')
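The difference can be reproduced outside Data Services (sketch using SQLite, whose parser rejects the invalid DELETE TABLE form the same way; TRUNCATE is MySQL syntax and is not available in SQLite, so plain DELETE FROM is shown as the working cleanup):

```python
# Sketch: "DELETE TABLE t" is not valid SQL, which is why the job raised
# an exception; "DELETE FROM t" (and, in MySQL, "TRUNCATE [TABLE] t")
# empties the table. Table name taken from the thread above.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customer_salesfact_details (id INTEGER)")
con.execute("INSERT INTO customer_salesfact_details VALUES (1)")

try:
    con.execute("DELETE TABLE customer_salesfact_details")  # the error query
except sqlite3.OperationalError:
    print("syntax error")  # the parser rejects DELETE TABLE

con.execute("DELETE FROM customer_salesfact_details")       # correct form
n = con.execute(
    "SELECT COUNT(*) FROM customer_salesfact_details").fetchone()[0]
print(n)  # table is now empty
```

Note that in MySQL, TRUNCATE is DDL (it implicitly commits and resets auto-increment counters), while DELETE FROM is DML and can be rolled back; either will empty the table.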
Thanks for your concern
PrasannaKumar -
How to extract data from info cube into an internal table using ABAP code
HI
Can anyone please suggest
how to extract data from an InfoCube into an internal table using ABAP code, like BAPIs or function modules.
Thankx in advance
regds
AJAY
Hi Dinesh,
Thank you for your reply,
but I have already tried to use the function module.
When I try to use the function module RSDRI_INFOPROV_READ
I get an information message "ERROR GENERATION TEST FRAME".
Can you please tell me what the problem could be?
Bye
AJAY -
How can I add dimensions and load data into Planning applications?
Now please let me know how I can add dimensions and load data into a Planning application without doing it manually.
You can use tools like ODI, DIM or HAL to load metadata & data into Planning applications.
The data load can be done at the Essbase end using a rules file, but metadata changes should flow from Planning to Essbase through one of the above-mentioned tools; there are also many other ways to achieve the same.
- Krish -
Loaded data not visible for Reporting?
Hi all,
In the InfoCube, the data is not visible for reporting. How can I make it available for reporting?
When I select the Manage option from the context menu of the cube and select the Requests tab, I get a pop-up message that says:
" There is an inconsistency between the load status of the data and the option of reporting on this data.
There is data in the InfoCube/ODS object that is OK from a quality point of view, but is not yet displayed in reporting.
The problem, for example, is to do with request 0000018049, number REQU_F4ZBFRMDGBULE9WCUN3R5UX5X."
How do I find out about the inconsistencies?
PS: All the requests are delta uploads.
Thanks n regards
Girikumar
Hi Girikumar,
Use the RSRV transaction and check the consistency of the cube. If any inconsistency is found, repair it with the repair option in the toolbar.
Let me know if it is not resolved.
Bye
Shu Moh..