OWB 11.1 Cube Operator 'Active Date' column
Hi Experts,
I recently defined a cube in OWB 11.1 using the cube editor. The resulting object contained a column 'Active Date' which was only visible on the cube operator within a mapping. Neither the underlying table nor the cube object editor showed the 'Active Date' column.
Can anyone briefly explain the purpose of that column and how it could / should be used?
Thanks for your help
Regards
Andy
Hi Alex,
thanks for your reply.
Does that mean that I should fill the 'Active Date' column with the date value of my fact row? OWB then joins all type II or type III dimensions like this:
cube.active_date between dimension_start_date and dimension_end_date.
Correct?
The 'Active Date' column will never be visible in the underlying cube table, as it's only used to create the correct join?
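For what it's worth, the join semantics described above can be sketched against a toy Type 2 dimension. All table and column names below are invented for the sketch; they are not the names OWB actually generates:

```python
import sqlite3

# Toy Type 2 dimension and fact; names are assumptions for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer_dim (
    dim_key         INTEGER,
    customer_id     TEXT,
    effective_date  TEXT,
    expiration_date TEXT);
INSERT INTO customer_dim VALUES
    (1, 'C1', '2007-01-01', '2007-06-30'),
    (2, 'C1', '2007-07-01', '9999-12-31');
CREATE TABLE sales_fact (customer_id TEXT, active_date TEXT, amount REAL);
INSERT INTO sales_fact VALUES ('C1', '2007-08-15', 250.0);
""")

# The conceptual join: the fact's active_date picks the dimension row
# that was valid at that point in time.
row = conn.execute("""
    SELECT d.dim_key, f.amount
    FROM sales_fact f
    JOIN customer_dim d
      ON d.customer_id = f.customer_id
     AND f.active_date BETWEEN d.effective_date AND d.expiration_date
""").fetchone()
print(row)  # -> (2, 250.0): the version valid on 2007-08-15
```

(ISO-formatted date strings compare correctly with BETWEEN in SQLite, which keeps the sketch self-contained.)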
Thanks for your help.
Regards
Andy
Similar Messages
-
APEX_UTIL.IR_FILTER with BETWEEN operator for date columns
Hi,
when I run my application I can set a BETWEEN filter for date columns, but I can't find a way to do this with the APEX_UTIL.IR_FILTER function. Maybe I'm missing something?
Documentation: http://download.oracle.com/docs/cd/E17556_01/doc/apirefs.40/e15519/apex_util.htm#CHDDDFBF
Tobias

Tobias,
If you think about it, a "Between" is nothing more than a single line way to say
WHERE :X >= :Y
AND :X <= :Z
So you should be able to apply two filters to the report using the LTE and GTE operators
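As a plain illustration of that equivalence (not the APEX API itself), a BETWEEN filter really is just two stacked comparison filters:

```python
def between_filter(rows, col, lo, hi):
    """BETWEEN lo AND hi expressed as two filters: GTE lo, then LTE hi."""
    gte = [r for r in rows if r[col] >= lo]   # first filter: column >= lo
    return [r for r in gte if r[col] <= hi]   # second filter: column <= hi

rows = [{"d": "2012-01-05"}, {"d": "2012-02-10"}, {"d": "2012-03-20"}]
kept = between_filter(rows, "d", "2012-01-01", "2012-02-28")
print(kept)  # only the first two rows fall inside the range
```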
Hope this helps
Doug Gault
www.sumneva.com -
Date columns in owb from mainframe
I get data in a flat file from the mainframe. On the mainframe, if a date column is a high date it is assigned the default '9999-12-31'. Now when I load it into OWB it converts to 31-dec-99 with mask 'yyyy-mm-dd', so I can't distinguish that it was a high date.
I want to insert a space, or is there any high-date concept in Oracle? How do I set a date column in an Oracle table to no value, like a high date or low date, meaning I don't have a valid date at that point in time?
thanks

Hi,
The mapping through which you are loading can be made to check for a high date and, if one is found in the source data, make it NULL before loading it to the target table field. You need to make the target date field NULLABLE. This NULL can serve as a high date for you.
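A minimal sketch of that cleansing step (the sentinel value and function name are assumptions for illustration; in OWB this would typically live in an expression or transformation operator):

```python
HIGH_DATE = "9999-12-31"  # mainframe sentinel for "no real date"

def clean_date(raw):
    """Map the mainframe high-date sentinel to NULL (None); pass real dates through."""
    value = raw.strip()
    return None if value == HIGH_DATE else value

print(clean_date("9999-12-31"))  # None
print(clean_date("2008-01-15"))  # '2008-01-15'
```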
Regards
-AP -
Ordinal date columns in the activity details area ?
Is it possible to add ordinal date columns in the activity details area of Primavera P6?
It is possible - I just have to do it, I will let you know after I do it how it goes...
-
DSO upload and no data in Active data table
Hi Experts,
I have a strange problem. I have loaded data to a DSO from a DataSource in BI7, with a further upload to a cube. I activated the DSO, the activation was successful, and a Request ID was generated. It shows added and transferred records, around 150,000, as I have done a full upload. Strangely, I cannot see any data in the Active data table.
Please advise how I can check whether I am making some mistake. I have the data mart status for this DSO. Could the deletion of the DSO and reloading cause the data to not be visible in the DSO even after activation?
Pls advise.
Tati

Hi,
I believe this has something to do with the display settings. After displaying the data, go into the Settings menu and look for the Layout option --> Display --> see if there is any default setting applied. Change this setting to something else, or create a setting with all the columns dragged & dropped. These are the options you can try.
If this does not resolve it, please try displaying the data from the SE16 transaction and post us the results.
Thanks,
Naren -
DBIF_REPO_SQL_ERROR short dumps while activating data in ODS
Hi All,
We are using BW3.5 with Oracle 9.2 on AIX 5.3 box, from the past few days we are getting DBIF_REPO_SQL_ERROR short dumps frequently while activating data in ODS.
Runtime Error DBIF_REPO_SQL_ERROR
Date and Time 08.01.2008 13:01:08
1. A printout of the problem description (short dump)
To obtain this, select in the current display "System->List->
Save->Local File (unconverted)".
2. A suitable printout of the system log
To obtain this, call the system log through transaction SM21.
Limit the time interval to 10 minutes before and 5 minutes
after the short dump. In the display, then select the function
"System->List->Save->Local File (unconverted)".
3. If the programs are your own programs or modified SAP programs,
supply the source code.
To do this, select the Editor function "Further Utilities->
Upload/Download->Download".
4. Details regarding the conditions under which the error occurred
or which actions and input led to the error.
Internal call code.........: "[REPO/*/43/LOAD/CX_SY_OPEN_SQL_DB=============CP
Please check the entries in the system log (Transaction SM21).
You may able to find an interim solution to the problem
in the SAP note system. If you have access to the note system yourself,
use the following search criteria:
"DBIF_REPO_SQL_ERROR" C
"SAPLRSSM_PROCESS" or "LRSSM_PROCESSF04"
"CHECK_IF_ANALYZE_IS_NESSESSARY"
System environment
SAP Release.............. "640"
Application server....... "psapdb"
Network address.......... "158.58.65.11"
Operating system......... "AIX"
Release.................. "5.3"
Hardware type............ "00CD615C4C00"
Character length......... 16 Bits
Pointer length........... 64 Bits
Work process number...... 22
Short dump setting....... "full"
Database server.......... "psapdb"
Database type............ "ORACLE"
Database name............ "BWP"
Database owner........... "SAPBWP"
Character set............ "C"
SAP kernel............... "640"
Created on............... "Oct 29 2006 20:49:57"
Created in............... "AIX 1 5 00538A4A4C00"
Database version......... "OCI_920 "
Patch level.............. "155"
Patch text............... " "
Supported environment....
Database................. "ORACLE 9.2.0.., ORACLE 10.1.0.., ORACLE
10.2.0.."
SAP database version..... "640"
Operating system......... "AIX 1 5, AIX 2 5, AIX 3 5"
Memory usage.............
Roll..................... 16192
EM....................... 16759424
Heap..................... 0
Page..................... 24576
MM Used.................. 6604384
MM Free.................. 1772536
SAP Release.............. "640"
User and Transaction
Client.............. 200
User................ "R3REMOTE"
Language key........ "E"
Transaction......... " "
Program............. "SAPLRSSM_PROCESS"
Screen.............. "SAPMSSY0 1000"
Screen line......... 6
Information on where terminated
The termination occurred in the ABAP program "SAPLRSSM_PROCESS" in
"CHECK_IF_ANALYZE_IS_NESSESSARY".
The main program was "RSPROCESS ".
The termination occurred in line 1143 of the source code of the (Include)
program "LRSSM_PROCESSF04"
of the source code of program "LRSSM_PROCESSF04" (when calling the editor
11430).
The program "SAPLRSSM_PROCESS" was started as a background job.
Job name........ "BI_PROCESS_ODSACTIVAT"
Job initiator... "RPRIYANKA"
Job number...... 02010102
Also we have a failed job here. Here is the job log.
Job log overview for job: BI_PROCESS_ODSACTIVAT / 02010102
Date Time Message text
08.01.2008 13:01:00 Job started
08.01.2008 13:01:00 Step 001 started (program RSPROCESS, variant &0000000000188, user ID R3REMOTE)
08.01.2008 13:01:02 Activation is running: Data target HBCS_O25, from 1,758 to 1,767
08.01.2008 13:01:02 Data to be activated successfully checked against archiving objects
08.01.2008 13:01:02 SQL: 01/08/2008 13:01:02 R3REMOTE
08.01.2008 13:01:02 ANALYZE TABLE "/BIC/AHBCS_O2540" DELETE
08.01.2008 13:01:02 STATISTICS
08.01.2008 13:01:02 SQL-END: 01/08/2008 13:01:02 00:00:00
08.01.2008 13:01:02 SQL: 01/08/2008 13:01:02 R3REMOTE
08.01.2008 13:01:02 BEGIN DBMS_STATS.GATHER_TABLE_STATS ( OWNNAME =>
08.01.2008 13:01:02 'SAPBWP', TABNAME => '"/BIC/AHBCS_O2540"',
08.01.2008 13:01:02 ESTIMATE_PERCENT => 1 , METHOD_OPT => 'FOR ALL
08.01.2008 13:01:02 INDEXED COLUMNS SIZE 75', DEGREE => 1 ,
08.01.2008 13:01:02 GRANULARITY => 'ALL', CASCADE => TRUE ); END;
08.01.2008 13:01:05 SQL-END: 01/08/2008 13:01:05 00:00:03
08.01.2008 13:01:05 SQL-ERROR: 603 ORA-00603: ORACLE server session terminated by fat al error
08.01.2008 13:01:05 System error: RSDU_ANALYZE_TABLE_ORA/ ORACLE
08.01.2008 13:01:08 ABAP/4 processor: DBIF_REPO_SQL_ERROR
08.01.2008 13:01:08 Job cancelled
The listener is working fine. I checked the RFC connections, tried restarting the system, and tried adding space to the TEMP tablespace as well as PSAPBWP, but they didn't work.
Please help.

The problem got solved, as there were DB errors like ORA-01114: IO error writing block to file 256 (block # 126218).
Here the point to be noted is file 256. A file number greater than 255 indicates a temp datafile, which points to an issue with the PSAPTEMP tablespace. When I checked the PSAPTEMP tablespace, one of the filesystems holding a temp datafile was 100% full. This prevents the temp datafile from growing and shrinking as and when required.
So, adding more space to the filesystem solved the problem.
Thanks to everyone who have shown interest in solving my problem. -
Using dimension and cube operator
hi
I am facing a problem using the dimension and cube operators.
I want to create sales data with time, customer and product dimensions.
After creating a dimension in OWB I get a table created and attached to it; the same is the case with the cube operator.
So where should I store my measures, in the cube or in the table created? And where do I store the data about the dimensions?
And finally, how do I use them in a mapping to retrieve the stored data?
Thanks

It sounds like you are not getting any matching keys for loading into the cube.
Do you have a time dimension created by OWB in the cube? The key used by the cube operator for an OWB time dimension is a formatted number. The time dimension keys are stored as follows:
Day Level - YYYYMMDD
Month Level - YYYYMM
Week Level - YYYYWW
Quarter - YYYYQ
Year - YYYY
If you have a source with a SQL date datatype, for example, and want to construct the key for a cube's time dimension at the day level, an expression like the following can be used to construct the time reference from a SQL date...
to_number(to_char( time_key, 'YYYYMMDD'))
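That key construction can be checked outside the database too; a small sketch of the day- and month-level keys (the function names here are made up for illustration):

```python
from datetime import date

def day_level_key(d):
    """Equivalent of to_number(to_char(time_key, 'YYYYMMDD'))."""
    return int(d.strftime("%Y%m%d"))

def month_level_key(d):
    """Equivalent of to_number(to_char(time_key, 'YYYYMM'))."""
    return int(d.strftime("%Y%m"))

print(day_level_key(date(2008, 1, 8)))    # 20080108
print(month_level_key(date(2008, 1, 8)))  # 200801
```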
It may not be this but just a thought.
Cheers
David -
How to tune performance of a cube with multiple date dimension?
Hi,
I have a cube with a measure. For a turn-time report I am taking the difference of two dates and then the average, max and min of that difference. The graph is taking a long time to load. I am using Telerik report controls.
Is there any way to tune up the cube performance with multiple date dimension to it? What are the key rules and beset practices for a cube to perform well?
Thanks,
Amit

Hi amit2015,
According to your description, you want to improve the performance of an SSAS cube with multiple date dimensions. Right?
In Analysis Services, there are many tips for improving the performance of a cube. In this scenario, I suggest you keep only one date dimension, and only include the columns which are required for your calculation. Please refer to "dimension design" in
the link below:
http://www.mssqltips.com/sqlservertip/2567/ssas--best-practices-and-performance-optimization--part-3-of-4/
If you have any question, please feel free to ask.
Simon Hou
TechNet Community Support -
How can i change the Format of my DATE column?
I need to change the date format for a whole column. At present it is MM-DD-YYYY; I need to change this to DD-MMM-YYYY.
I know about the to_date function. I tried to apply it to change the format of my whole column by doing the following...
CREATE TABLE "IT220_DATEHOLIDAY"
"DEPARTID" VarChar(2)NOT NULL ENABLE,
"HOLCODE" VARCHAR2(2)NOT NULL ENABLE,
"DEPARDATE" DATE,
to_date('DATE','DD-MMM-YYYY'), <<<change DATE column to DD-MMM-YYYY
CONSTRAINT "DATEHOLIDAY_PK" PRIMARY KEY ("DEPARTID") ENABLE
ORA-00902: invalid datatype <<<<This was the error message i received.
I am aware that the to_date function is supposed to be used to change strings into a certain format. I guess this means you cant do it with columns? Is there anyway i can format the whole column or do i have to do each string of data entered one by one?
Thanks in advance!

Hello Jay,
I'm not sure you hit the right forum, as this doesn't seem to be a problem with APEX.
Anyway:
You can't use that function in a table definition like that, and as you already suggested, the to_date function expects a string value.
It seems you also have a misunderstanding of the basic datatypes in the database. DATE is such a datatype and is stored in an internal format you don't need to care about. Each time you request the value, the database will give you a string representation according to either your locale or a format mask you supply.
The same applies to insert or update operations: you hand in either a variable of type DATE or use a function like to_date to create a value of type DATE.
You may be interested in reading the documentation of the [url http://download.oracle.com/docs/cd/E11882_01/server.112/e16508/toc.htm]Oracle Database Concepts. The section concerning datatype DATE can be found here:
http://download.oracle.com/docs/cd/E11882_01/server.112/e16508/tablecls.htm#CBBGJHJC
An overview of formatting options can be found here:
http://download.oracle.com/docs/cd/E11882_01/server.112/e17118/sql_elements004.htm#SQLRF00212
So to answer your question: you would format it as part of your insert operation, taking a string and converting it using to_date.
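The same parse-on-input / format-on-output idea, sketched with Python's datetime as a stand-in for to_date and to_char:

```python
from datetime import datetime

raw = "09-23-2011"                         # incoming string in MM-DD-YYYY
d = datetime.strptime(raw, "%m-%d-%Y")     # parse, like to_date(:val, 'MM-DD-YYYY')
shown = d.strftime("%d-%b-%Y").upper()     # format, like to_char(col, 'DD-MON-YYYY')
print(shown)  # 23-SEP-2011
```

The point is the same as for DATE columns: the stored value has no display format of its own; the format only exists at the string boundary.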
-Udo -
Need suggestion on Active data guard or Logical Stand by
Hi All,
Need a suggestion of on below scenario.
We have a production database (Oracle version 11g R2) and are planning to have a logical standby or a physical standby (Active Data Guard). Our intended usage of the standby database is below.
1) Planning to run online reports (100+) 24x7, so we might create additional indexes, materialized views etc.
2) Daily data feed (around 300+ data files) to the data warehouse. Every night, jobs will be scheduled to extract data and send it to the warehouse. We might need additional tables for the jobs' usage.
Please suggest which one is good.
Regards,
vara.

Hello,
Active Data Guard is a feature available from 11g onwards.
If you choose Active Data Guard, you have a couple of good options. You can make a highly available copy of your production database, which acts as an image copy of production. As you are on 11g, you have the added advantage that you can open the standby in read-only mode while MRP stays active, so you can redirect users to the standby to perform select operations for reporting purposes. That way you can reduce the load on production.
You can also perform a switchover in case of a role change, perform a failover if your primary is completely lost, convert the physical standby to a logical standby, and configure FSFO.
You have plenty of options with Active Data Guard.
Refer http://www.orafaq.com/node/957
consider closing the thread if answered and keep the forum clean.
Edited by: CKPT on Mar 18, 2012 8:14 PM -
Performance operations based on Column values in SQL server 2008
Hi ,
I have a table which consist of following columns
ID  Formula   Values       DisplayValue
1   a*b/100   100*12/100   null
2   b*c/100   12*4/100     null
I want to perform an operation based on the column "Values" and save the result in a new column named "DisplayValue", i.e. I want to get the result below. Can anyone please help?
ID  Formula   Values       DisplayValue
1   a*b/100   100*12/100   12
2   b*c/100   12*4/100     0.48
Thanks for the help.
Regards, Priti A

Try this,
create table #mytable (ID int,Formula varchar(10), [Values] varchar(10), DisplayValue decimal(10,4))
insert into #mytable values(1 ,'a*b/100','100*12/100',null)
insert into #mytable values(2 ,'b*c/100','12*4/100',null)
declare @rowcount int=1
while @rowcount <= (select max(id) from #mytable)
begin
declare @expression nvarchar(max)
select @expression=[values] from #mytable where id = @rowcount
declare @sql nvarchar(max)
set @sql = 'select @result = ' + @expression
declare @result decimal(10,4)
exec sp_executesql @sql, N'@result decimal(10,4) output', @result = @result out
update #mytable set DisplayValue= @result where id = @rowcount
set @rowcount=@rowcount+1
end
select * from #mytable
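The same evaluate-the-stored-expression idea in plain Python, with eval standing in for sp_executesql (fine for trusted arithmetic strings like these but, as with the dynamic SQL above, never feed it untrusted input):

```python
rows = [
    {"ID": 1, "Formula": "a*b/100", "Values": "100*12/100", "DisplayValue": None},
    {"ID": 2, "Formula": "b*c/100", "Values": "12*4/100",   "DisplayValue": None},
]

for r in rows:
    # Evaluate the arithmetic string and store the result, rounded
    # like the decimal(10,4) column in the T-SQL version.
    r["DisplayValue"] = round(eval(r["Values"], {"__builtins__": {}}), 4)

print([r["DisplayValue"] for r in rows])  # [12.0, 0.48]
```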
Regards, RSingh -
ADC exception=BAM-01262: Active Data Cache server exception in openViewset
Hi All,
I am getting the below-mentioned error when I try to open a report from Active Studio in BAM.
If I give the parameter value 'All' in the prompt, it shows all the values. But if I give specific values, like one parameter value 1007 and the other 'All', it gives the below error. I tried giving the parameter value 'All' in the prompt after getting the error, and it's not showing the values.
The below error is from front end:-
SQL EXCEPTION NULL : Invalid column index
The below error from back end:-
[2012-01-10T06:39:07.916+05:30] [bam_server1] [ERROR] [] [oracle.bam.reportcache] [tid: [ACTIVE].ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: weblogic] [ecid: 83cdf926093045b8:4fde7fa7:134bf505061:-8000-00000000000044f4,0] [APP: oracle-bam#11.1.1] *ReportCache: ReportCacheServer.OpenViewSet: ADC exception=BAM-01262: Active Data Cache server exception in openViewset(). [[*
at oracle.bam.adc.kernel.util.Util.getCacheException(Util.java:101)
at oracle.bam.adc.kernel.util.Util.getCacheException(Util.java:154)
at oracle.bam.adc.kernel.util.Util.getCacheException(Util.java:172)
at oracle.bam.adc.kernel.server.DataStoreServer.openViewset(DataStoreServer.java:1110)
at oracle.bam.adc.ejb.BamAdcServerBean.openViewset(BamAdcServerBean.java:841)
at sun.reflect.GeneratedMethodAccessor702.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.bea.core.repackaged.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:310)
at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
at com.bea.core.repackaged.springframework.jee.intercept.MethodInvocationInvocationContext.proceed(MethodInvocationInvocationContext.java:104)
at oracle.bam.adc.ejb.BamAdcServerBean.interceptor(BamAdcServerBean.java:266)
at sun.reflect.GeneratedMethodAccessor374.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
[2012-01-10T06:39:07.913+05:30] [bam_server1] [ERROR] [] [oracle.bam.adc] [tid: [ACTIVE].ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: weblogic] [ecid: 83cdf926093045b8:4fde7fa7:134bf505061:-8000-00000000000044f4,0] [APP: oracle-bam#11.1.1] ActiveDataCache: Exception occurred in method openViewset(_TPT_PO_HEADER_STG_TBL,0)[[
Exception: java.sql.SQLException: SQLError(17003) SQLState(99999) Invalid column index
at oracle.jdbc.driver.OraclePreparedStatement.setStringInternal(OraclePreparedStatement.java:6336)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectCritical(OraclePreparedStatement.java:10605)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:10518)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:11574)
at oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:11544)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.setObject(OraclePreparedStatementWrapper.java:249)
at oracle.bam.adc.common.externaldatasources.JDBC.getDataReader(JDBC.java:322)
at oracle.bam.adc.kernel.datasets.ExternalStorageEngine.getDataReader(ExternalStorageEngine.java:76)
at oracle.bam.adc.kernel.viewsets.utilities.externaldata.DataImporter.executeQuery(DataImporter.java:95)
at oracle.bam.adc.kernel.viewsets.utilities.externaldata.ExternalDataManager.importExternalData(ExternalDataManager.java:228)
at oracle.bam.adc.kernel.viewsets.utilities.externaldata.ExternalDataManager.importExternalFactData(ExternalDataManager.java:151)
at oracle.bam.adc.kernel.viewsets.utilities.externaldata.ExternalDataManager.getExternalData(ExternalDataManager.java:103)
at oracle.bam.adc.kernel.viewsets.Viewset.loadData(Viewset.java:259)
at oracle.bam.adc.kernel.viewsets.ViewsetBase.initialize(ViewsetBase.java:171)
at oracle.bam.adc.kernel.viewsets.Viewset.initialize(Viewset.java:220)
at oracle.bam.adc.kernel.viewsets.ViewsetBase.open(ViewsetBase.java:154)
Please give me any suggestion on this.
Thanks,
Manikandan

This problem was solved when I installed a new product version.
-
Oracle 10g Query on Date Column
Hello -
What is the most efficient way to query on a date column to get all dates within the last 2 months? I use something like the following in the WHERE clause:
billing_date >= to_date(add_months(sysdate, -2))
However, I can't ever get the index on billing_date to be used.
Any help is greatly appreciated...
Thanks!!

This is a perfectly valid way to query the data. Here's an example from my own production tables. This one has about 300 million rows:
select *
from prod.tran_history
where tran_date >= to_date(add_months(sysdate, -2))
| Id | Operation | Name | Rows | Bytes | Cost |
| 0 | SELECT STATEMENT | | 11M| 863M| 146K|
| 1 | TABLE ACCESS BY INDEX ROWID| TRAN_HISTORY | 11M| 863M| 146K|
|* 2 | INDEX RANGE SCAN | TRAN_HIST_DATE_IDX | 2144K| | 2098 |
-------------------------------------------------------------------------------------------- -
Oracle 11g Active Data Guard help ?
Hi Friends,
I successfully set up an Active Data Guard environment (11g). But I don't know whether, when the PROD database is highly utilized, its read-only tasks like reporting and backup are done on the STANDBY. How can I know which db (prod or standby) is used for these read-only operations?
Regards
Vish

It is not so simple to direct reports to the Physical Standby as you seem to assume.
You need to do some work for the setup.
See here for a description:
http://uhesse.com/downloads/real-time-query-presentation/
Kind regards
Uwe Hesse
"Don't believe it, test it!"
http://uhesse.com -
Using cube operator in mapping
Hi everyone,
I am just building my first owb map with a cube (relational fact table) but I can't find any description anywhere on how to actually use/connect the cube operator in a mapping. I have managed to build a cube, linked to two dimensions, but when I place it on the canvas, I am not sure what it expects.
Does anybody know of any good examples or a bit of documentation anywhere?
Thanks!
Ed
PS In a SQL statement I would populate that fact table by simply joining the external table with the dimension, but the input signature of the cube is just different from what I expected.

hi Patrick
I have already done this but it is not working, and the response is the same: "you are using two different sources in expression".
In fact I am using a few tables from the source schema and another table (the time dimension table) from the target schema. I think this error is either due to using the time dimension's table, or due to using tables from two different schemas.
What do you say? I need your point of view.
regards,
imran