Analytics partitioning
Hello,
I need to partition data for my analytics by week. The problem is that I keep more than 52 weeks in my tables, so the years overlap (which I don't want them to). If I add the year to the partitioning key, I'm splitting the new-year week in two.
Do you guys know of any strategy that would allow me to partition data by week with the above constraints?
Thank you,
Hi,
Exor wrote:
the code is stored in views. If I add something like where my_date > sysdate - 364 it will make the query dependent on the time it is run.
I never told you to say anything like "where my_date > sysdate - 364".
I said to do something like "PARTITION BY TRUNC(my_date, 'IW')".
If you try that and you still have a problem, then post:
(1) The version of Oracle (and any other relevant software) you're using
(2) A little sample data (just enough to show what the problem is) from all the relevant tables
(3) The results you want from that data
(4) Your best attempt so far (formatted)
(5) The full error message (if any), including line number
Executable SQL statements (like "CREATE TABLE AS ..." or "INSERT ..." statements) are best for (2).
If you can present your problem using commonly available tables (for example, tables in scott schema, or views in the data dictionary), then you can omit (2).
Formatted tabular output is okay for (3). Type these 6 characters
{code}
(small letters only) before and after the tabular text, to preserve spacing.
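If it helps, here is a minimal sketch of what the TRUNC(my_date, 'IW') suggestion can look like in practice. This is an illustration only: the table, column, and partition names (weekly_facts, week_start, etc.) are made up, and it assumes Oracle 11g or later (virtual columns as partition keys).

```sql
-- Hypothetical sketch: partition on the Monday of each ISO week.
-- TRUNC(my_date, 'IW') returns the same Monday for every day of a week,
-- so a week that straddles Dec 31 / Jan 1 stays in a single partition.
CREATE TABLE weekly_facts (
  my_date    DATE NOT NULL,
  amount     NUMBER,
  week_start DATE GENERATED ALWAYS AS (TRUNC(my_date, 'IW')) VIRTUAL
)
PARTITION BY RANGE (week_start) (
  PARTITION p1 VALUES LESS THAN (DATE '2009-01-05'),
  PARTITION p2 VALUES LESS THAN (DATE '2009-01-12')
);
```

Each partition then holds exactly one Monday-to-Sunday week, regardless of where the year boundary falls, which sidesteps the split-week problem described above.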
Similar Messages
-
Aggregation with a parent child hierarchy
Hi
I have 2 tables, hierarchy and data
hierarchy has 2 columns parent and child
data contains values for the lowest rungs of the hierarchy described.
What I'd like to do is, using the hierarchy described, sum the lowest-rung values up to their parent values, and so on.
Quick example:
Hierarchy
child parent
1 null
2 1
3 1
4 2
5 2
Data
hiearchyid value
3 10
4 20
5 30
output required
1 60
2 50
3 10
4 20
5 30
The last 3 rows are just the 2nd table, so I can just union those in, but how would I work out the first two values using SQL? I have tried combinations of CONNECT BY, analytics (PARTITION BY etc.) and ROLLUP, but to no avail...
Any help would be appreciated..
Cheers
Alex@Aketi
One way is to derive
"sys_connect_by_path(RowIDToChar(RowID),'.') as RowIDList"
and then use a left join with
"instr(RowIDList, RowIDToChar(RowID))".
Sorry, I didn't catch what you mean.
my site :-)
http://oraclesqlpuzzle.hp.infoseek.co.jp/4-13.html
pity, I don't know Japanese :(
@Rob
your query is good and simple :) probably it's the best alternative for the OP.
But it won't work when the hierarchy isn't a plain tree, e.g.:
1
2 3
4
5 6
SQL> create table hierarchy (child,parent)
2 as
3 select 1, null from dual union all
4 select 2, 1 from dual union all
5 select 3, 1 from dual union all
6 select 4, 3 from dual union all
7 select 4, 2 from dual union all
8 select 5, 4 from dual union all
9 select 6, 4 from dual
10 /
Table created
SQL>
SQL> create table data (hierarchyid,value)
2 as
3 select 5, 10 from dual union all
4 select 6, 25 from dual
5 /
Table created
SQL>
SQL> select h.child
2 , sum(connect_by_root d.value)
3 from hierarchy h
4 , data d
5 where h.child = d.hierarchyid (+)
6 connect by prior h.parent = h.child
7 group by h.child
8 order by h.child
9 /
CHILD SUM(CONNECT_BY_ROOTD.VALUE)
1 70
2 35
3 35
4 70
5 10
6 25
6 rows selected
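On 11gR2 and later, a recursive WITH clause is another way to roll leaf values up to every ancestor, and it handles the multi-parent case above the same way (each distinct path contributes the leaf value once, as in the CONNECT_BY_ROOT query). A sketch only, using the table and column names as posted:

```sql
-- Walk upward from each leaf value to all of its ancestors,
-- then sum per node.
WITH anc (node, value) AS (
  SELECT d.hierarchyid, d.value
  FROM   data d
  UNION ALL
  SELECT h.parent, a.value
  FROM   anc a
  JOIN   hierarchy h ON h.child = a.node
  WHERE  h.parent IS NOT NULL
)
SELECT   node, SUM(value) AS total
FROM     anc
GROUP BY node
ORDER BY node;
```

Against the OP's simple-tree example this produces the required output (1 → 60, 2 → 50, 3 → 10, 4 → 20, 5 → 30).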
SQL> -
Can anyone share their experiences in partitioning tables in HR Analytics? We're using OBIEE 10.1.3.4 and OBI Analytics 7.9.5, Oracle 10.2.0.4 and OEL 4.5.
Specifically I am looking at partitioning the W_EMPLOYEE_DAILY_SNP_F table.
You don't want to partition the daily snapshot; you want to purge it when you don't need it anymore!
HR analytics 795 works with a daily snapshot and a monthly snapshot.
The last day of the daily snapshot will become the monthly snapshot; once the monthly snapshot for that month is built, you can purge the daily snapshot!
BIA 7.9.6 doesn't even have the daily snapshot anymore.
So the question is, do you really need the data of every day from a couple of months back???
Remc0 -
Hi,
I have configured Stream Analytics to read messages from Event Hub and I am getting this error message:
Correlation ID:
e185534f-9e82-4efc-8050-5294796fb275
Error:
Exceeded the maximum number of allowed receivers per partition in a consumer group which is 5 TrackingId:4569eca8-cb70-4c50-96cd-5868bfbc3e37_B45,TimeStamp:1/7/2015 11:40:37 PM TrackingId:f408292e-0dab-40cc-8ae9-2730313c3241_B45, Reference:8fca3c6a-b0d1-41cf-a835-f7c74a2ea8c3,
Timestamp:1/7/2015 11:40:37 PM
Message:
Exceeded the maximum number of allowed receivers per partition in a consumer group which is 5
Message Time:
2015-01-07 23:40:38Z
Microsoft.Resources/EventNameV2:
sharedNode92F920DE-290E-4B4C-861A-F85A4EC01D82.naveenevents_0_c76f7247_25b7_4ca6_a3b6_c7bf192ba44a#0.output
Microsoft.Resources/Operation:
Receive Events
Microsoft.Resources/ResourceUri:
/subscriptions/f122f431-ddc7-4e21-ae3a-f6a9e1273bcb/resourceGroups/StreamAnalytics-Default-Central-US/providers/Microsoft.StreamAnalytics/streamingjobs/Naveenstreamanalytics
Type:
EventHubInputAdapterEventHubFailure
Any help on the same is appreciated.
Thanks
Naveen
Hi Naveen,
Thanks for reporting this issue. We are taking a look at this issue.
Questions:
Do you have multiple readers reading from this Event Hub, or have you set up multiple Stream Analytics jobs reading from it? You are hitting the limit on how many readers are allowed per Event Hub consumer group.
What is the serialization format of your input (JSON/CSV)? Could you please verify the serialization format setup in the streaming job and if it matches your data?
Thanks
Zafar Abbas -
Project Analytics 7.9.6.1 - Error while running a full load
Hi All,
I am performing a full load for Projects Analytics and get the following error,
=====================================
ERROR OUTPUT
=====================================
1103 SEVERE Wed Nov 18 02:49:36 WST 2009 Could not attach to workflow because of errorCode 36331 For workflow SDE_ORA_CodeDimension_Gl_Account
1104 SEVERE Wed Nov 18 02:49:36 WST 2009
ANOMALY INFO::: Error while executing : INFORMATICA TASK:SDE_ORA11510_Adaptor:SDE_ORA_CodeDimension_Gl_Account:(Source : FULL Target : FULL)
MESSAGE:::
Irrecoverable Error
Error while contacting Informatica server for getting workflow status for SDE_ORA_CodeDimension_Gl_Account
Error Code = 36331:Unknown reason for error code 36331
Pmcmd output:
The session log initialises the mapping parameter MPLT_ADI_CODES.$$CATEGORY to a NULL value. This is then used in subsequent SQL and results in an ORA-00936: missing expression error. Following are the initialisation section and the load section containing the error in the log.
Initialisation
DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
DIRECTOR> VAR_27028 Use override value [ORA_11_5_10] for session parameter:[$DBConnection_OLTP].
DIRECTOR> VAR_27028 Use override value [ORA_11_5_10.DATAWAREHOUSE.SDE_ORA11510_Adaptor.SDE_ORA_CodeDimension_Gl_Account_Segments.log] for session parameter:[$PMSessionLogFile].
DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[MPLT_ADI_CODES.$$CATEGORY].
DIRECTOR> VAR_27028 Use override value [4] for mapping parameter:[MPLT_SA_ORA_CODES.$$DATASOURCE_NUM_ID].
DIRECTOR> VAR_27028 Use override value [DEFAULT] for mapping parameter:[MPLT_SA_ORA_CODES.$$TENANT_ID].
DIRECTOR> TM_6014 Initializing session [SDE_ORA_CodeDimension_Gl_Account_Segments] at [Wed Nov 18 02:49:11 2009].
DIRECTOR> TM_6683 Repository Name: [repo_service]
DIRECTOR> TM_6684 Server Name: [int_service]
DIRECTOR> TM_6686 Folder: [SDE_ORA11510_Adaptor]
DIRECTOR> TM_6685 Workflow: [SDE_ORA_CodeDimension_Gl_Account_Segments] Run Instance Name: [] Run Id: [17]
DIRECTOR> TM_6101 Mapping name: SDE_ORA_CodeDimension_GL_Account_Segments [version 1].
DIRECTOR> TM_6963 Pre 85 Timestamp Compatibility is Enabled
DIRECTOR> TM_6964 Date format for the Session is [MM/DD/YYYY HH24:MI:SS]
DIRECTOR> TM_6827 [C:\Informatica\PowerCenter8.6.1\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_CodeDimension_Gl_Account_Segments].
DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
DIRECTOR> TM_6703 Session [SDE_ORA_CodeDimension_Gl_Account_Segments] is run by 32-bit Integration Service [node01_ASG596138], version [8.6.1], build [1218].
MANAGER> PETL_24058 Running Partition Group [1].
MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
MANAGER> PETL_24001 Parallel Pipeline Engine running.
MANAGER> PETL_24003 Initializing session run.
MAPPING> CMN_1569 Server Mode: [UNICODE]
MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
MAPPING> TM_6151 The session sort order is [Binary].
MAPPING> TM_6185 Warning. Code page validation is disabled in this session.
MAPPING> TM_6156 Using low precision processing.
MAPPING> TM_6180 Deadlock retry logic will not be implemented.
MAPPING> TM_6307 DTM error log disabled.
MAPPING> TE_7022 TShmWriter: Initialized
MAPPING> DBG_21075 Connecting to database [orcl], user [DAC_REP]
MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_Master_Map] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_Master_Code] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_W_CODE_D] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_CodeDimension_Gl_Account_Segments]
DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
MANAGER> PETL_24004 Starting pre-session tasks. : (Wed Nov 18 02:49:14 2009)
MANAGER> PETL_24027 Pre-session task completed successfully. : (Wed Nov 18 02:49:14 2009)
DIRECTOR> PETL_24006 Starting data movement.
MAPPING> TM_6660 Total Buffer Pool size is 32000000 bytes and Block size is 128000 bytes.
READER_1_1_1> DBG_21438 Reader: Source is [asgdev], user [APPS]
READER_1_1_1> BLKR_16051 Source database connection [ORA_11_5_10] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
READER_1_1_1> BLKR_16003 Initialization completed successfully.
WRITER_1_*_1> WRT_8147 Writer: Target is database [orcl], user [DAC_REP], bulk mode [OFF]
WRITER_1_*_1> WRT_8221 Target database connection [DataWarehouse] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL INSERT statement:
INSERT INTO W_CODE_D(DATASOURCE_NUM_ID,SOURCE_CODE,SOURCE_CODE_1,SOURCE_CODE_2,SOURCE_CODE_3,SOURCE_NAME_1,SOURCE_NAME_2,CATEGORY,LANGUAGE_CODE,MASTER_DATASOURCE_NUM_ID,MASTER_CODE,MASTER_VALUE,W_INSERT_DT,W_UPDATE_DT,TENANT_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL UPDATE statement:
UPDATE W_CODE_D SET SOURCE_CODE_1 = ?, SOURCE_CODE_2 = ?, SOURCE_CODE_3 = ?, SOURCE_NAME_1 = ?, SOURCE_NAME_2 = ?, MASTER_DATASOURCE_NUM_ID = ?, MASTER_CODE = ?, MASTER_VALUE = ?, W_INSERT_DT = ?, W_UPDATE_DT = ?, TENANT_ID = ? WHERE DATASOURCE_NUM_ID = ? AND SOURCE_CODE = ? AND CATEGORY = ? AND LANGUAGE_CODE = ?
WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL DELETE statement:
DELETE FROM W_CODE_D WHERE DATASOURCE_NUM_ID = ? AND SOURCE_CODE = ? AND CATEGORY = ? AND LANGUAGE_CODE = ?
WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_CODE_D]
WRITER_1_*_1> WRT_8003 Writer initialization complete.
READER_1_1_1> BLKR_16007 Reader run started.
WRITER_1_*_1> WRT_8005 Writer run started.
WRITER_1_*_1> WRT_8158
Load section
*****START LOAD SESSION*****
Load Start Time: Wed Nov 18 02:49:16 2009
Target tables:
W_CODE_D
READER_1_1_1> RR_4029 SQ Instance [mplt_BC_ORA_Codes_GL_Account_Segments.Sq_Fnd_Flex_Values] User specified SQL Query [SELECT
FND_FLEX_VALUES.FLEX_VALUE_SET_ID,
FND_FLEX_VALUES.FLEX_VALUE,
MAX(FND_FLEX_VALUES_TL.DESCRIPTION),
FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM,
FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME
FROM
FND_FLEX_VALUES,
FND_FLEX_VALUES_TL,
FND_ID_FLEX_SEGMENTS,
FND_SEGMENT_ATTRIBUTE_VALUES
WHERE
FND_FLEX_VALUES.FLEX_VALUE_ID = FND_FLEX_VALUES_TL.FLEX_VALUE_ID AND FND_FLEX_VALUES_TL.LANGUAGE ='US' AND
FND_ID_FLEX_SEGMENTS.FLEX_VALUE_SET_ID =FND_FLEX_VALUES.FLEX_VALUE_SET_ID AND
FND_ID_FLEX_SEGMENTS.APPLICATION_ID = 101 AND
FND_ID_FLEX_SEGMENTS.ID_FLEX_CODE ='GL#' AND
FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM =FND_SEGMENT_ATTRIBUTE_VALUES.ID_FLEX_NUM AND
FND_SEGMENT_ATTRIBUTE_VALUES.APPLICATION_ID =101 AND
FND_SEGMENT_ATTRIBUTE_VALUES.ID_FLEX_CODE = 'GL#' AND
FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME=FND_SEGMENT_ATTRIBUTE_VALUES.APPLICATION_COLUMN_NAME AND
FND_SEGMENT_ATTRIBUTE_VALUES.ATTRIBUTE_VALUE ='Y'
GROUP BY
FND_FLEX_VALUES.FLEX_VALUE_SET_ID,
FND_FLEX_VALUES.FLEX_VALUE,
FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM,
FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME]
READER_1_1_1> RR_4049 SQL Query issued to database : (Wed Nov 18 02:49:17 2009)
READER_1_1_1> RR_4050 First row returned from database to reader : (Wed Nov 18 02:49:17 2009)
LKPDP_3> DBG_21312 Lookup Transformation [mplt_ADI_Codes.Lkp_W_CODE_D]: Lookup override sql to create cache: SELECT W_CODE_D.SOURCE_NAME_1 AS SOURCE_NAME_1, W_CODE_D.SOURCE_NAME_2 AS SOURCE_NAME_2, W_CODE_D.MASTER_DATASOURCE_NUM_ID AS MASTER_DATASOURCE_NUM_ID, W_CODE_D.MASTER_CODE AS MASTER_CODE, W_CODE_D.MASTER_VALUE AS MASTER_VALUE, W_CODE_D.W_INSERT_DT AS W_INSERT_DT, W_CODE_D.TENANT_ID AS TENANT_ID, W_CODE_D.DATASOURCE_NUM_ID AS DATASOURCE_NUM_ID, W_CODE_D.SOURCE_CODE AS SOURCE_CODE, W_CODE_D.CATEGORY AS CATEGORY, W_CODE_D.LANGUAGE_CODE AS LANGUAGE_CODE FROM W_CODE_D
WHERE
W_CODE_D.CATEGORY IN () ORDER BY DATASOURCE_NUM_ID,SOURCE_CODE,CATEGORY,LANGUAGE_CODE,SOURCE_NAME_1,SOURCE_NAME_2,MASTER_DATASOURCE_NUM_ID,MASTER_CODE,MASTER_VALUE,W_INSERT_DT,TENANT_ID
LKPDP_3> TE_7212 Increasing [Index Cache] size for transformation [mplt_ADI_Codes.Lkp_W_CODE_D] from [1000000] to [4734976].
LKPDP_3> TE_7212 Increasing [Data Cache] size for transformation [mplt_ADI_Codes.Lkp_W_CODE_D] from [2000000] to [2007040].
READER_1_1_1> BLKR_16019 Read [625] rows, read [0] error rows for source table [FND_ID_FLEX_SEGMENTS] instance name [mplt_BC_ORA_Codes_GL_Account_Segments.FND_ID_FLEX_SEGMENTS]
READER_1_1_1> BLKR_16008 Reader run completed.
LKPDP_3> TM_6660 Total Buffer Pool size is 609824 bytes and Block size is 65536 bytes.
LKPDP_3:READER_1_1> DBG_21438 Reader: Source is [orcl], user [DAC_REP]
LKPDP_3:READER_1_1> BLKR_16051 Source database connection [DataWarehouse] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
LKPDP_3:READER_1_1> BLKR_16003 Initialization completed successfully.
LKPDP_3:READER_1_1> BLKR_16007 Reader run started.
LKPDP_3:READER_1_1> RR_4049 SQL Query issued to database : (Wed Nov 18 02:49:18 2009)
LKPDP_3:READER_1_1> CMN_1761 Timestamp Event: [Wed Nov 18 02:49:18 2009]
LKPDP_3:READER_1_1> RR_4035 SQL Error [
ORA-00936: missing expression
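The lookup override earlier in the log shows the immediate cause: because MPLT_ADI_CODES.$$CATEGORY was initialised to an empty default (the VAR_27027 line), the generated predicate became `W_CODE_D.CATEGORY IN ()`, and an empty IN list is a syntax error in Oracle. A trivial reproduction of just that error, for illustration:

```sql
-- An IN list must contain at least one expression;
-- this statement fails with ORA-00936: missing expression.
SELECT dummy FROM dual WHERE dummy IN ();
```

This points back at the $$CATEGORY parameter, which is what the follow-up thread mentioned below addresses.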
Could you please suggest what the issue might be and how it can be fixed?
Many thanks,
Kiran
I have continued the related details in the following thread,
Mapping Parameter $$CATEGORY not included in the parameter file (7.9.6.1)
Apologies for the inconvenience.
Thanks,
Kiran -
Error while extracting data to replicated partition (EAL)
Hi,
I'm using Essbase Analytics. I want to create a copy of my HFM application in Essbase.
So I've created a new bridge. In the Essbase tab of the bridge I chose "Replicated partition", because I want the data in Essbase directly, not on the EAL server.
So EAL created the application, database and the outline.
In the extract tab I select my created region and click "Extract data". After some time it ends, but no data is stored in the essbase database!
The log file of the EAL server says: Extract region requested (time stamp is the beginning of the process)
Has anyone an idea??
Thanks, Stefan
P.S.: The creation of a transparent partition works fine and I can see data via Smart View.
We use EAL with the transparent partition option without any issues. We write some report scripts to export the data from the app and load it into another Essbase app periodically for reporting. We can also use Smart View or the add-in to pull data directly from the EAL Essbase apps.
What are you trying to do that requires you to use a replicated partition? We were told transparent was the preferred method and so we used that option. -
Pivot not working; Analytics question
I got a lot done this past weekend on a tough query for me, but now I have more questions. You must run the top query to understand the second.
Background:
This is a period to date report which includes amounts from beginning of the year as well as last year.
My example Report is for Period 9 which begins on Aug 24, 2009
This period has 5 weeks (most periods have 4);
Also, if the period is requested in week 3, then only weeks 1 and 2 will be reflected on the report.
The report looks like this: (Data provided should produce these numbers, except for maybe the YTD column.)
WEEK1 WEEK2 WEEK3 WEEK4 WEEK5 Period-to-date Year-to-date
Net - Landau 11,485.79 11,416.60 11,609.01 11,049.76 12,867.10 58,428.00 454,231.37
Net - AW 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Net - LJS 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Net - TB 7,118.17 7,228.13 7,657.94 7,699.53 7,958.53 37,662.00 306,115.59
Total Net 18,603.96 18,644.73 19,266.95 18,749.29 20,825.63 96,091.00 760,346.96
Last Year Sales 23,515.95 24,244.37 23,962.74 23,134.79 24,440.87 119,299.00 856,363.36
Increase (4,911.99) (5,599.64) (4,695.79) (4,385.50) (3,615.24) (23,208.00) (96,016.40)
Last year
Next Week 24,244.37 23,962.74 23,134.79 24,440.87 23,055.87 118,839.00 879,419.23
--===== Current Year Dates
Beginning of period: (BOP) Mon Aug 24
Week 1: Mon Aug 24 - Aug 30
Week 2: Mon Aug 31 - Sept 6
Week 3: Mon Sep 7 - Sept 13
Week 4: Mon Sept 14 - Sept 20
Week 5: Mon Sept 21 - Sept 27
Beginning of fiscal year( BOY) = '28-Dec-08'
--===== Last Year Dates
Beginning of period: (BOP_LY) Mon Aug 25
Week 1: Mon Aug 25 - Aug 31
Week 2: Mon Sept 1 - Sept 7
Week 3: Mon Sep 8 - Sept 14
Week 4: Mon Sept 15 - Sept 21
Week 5: Mon Sept 22 - Sept 28
Beginning of fiscal year( BOY) = '31-Dec-07'
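As an aside, the per-week CASE labels used later in the query can also be derived arithmetically from the BOP date, which avoids hard-coding one branch per week. A sketch only (the BOP date is hard-coded here for illustration):

```sql
-- Generate the five Mondays of the period and label them 1..5.
-- Week number = whole weeks elapsed since the BOP, plus one.
SELECT d AS week_date,
       TRUNC((d - DATE '2009-08-24') / 7) + 1 AS week_nbr
FROM  (SELECT DATE '2009-08-24' + 7 * (LEVEL - 1) AS d
       FROM   dual
       CONNECT BY LEVEL <= 5);
```

The same expression works for the last-year dates by swapping in the BOP_LY date.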
My challenge over the weekend was to get the entire year of data vs. just the period data.
There are 7 columns on this report: 5 weeks of the period, PeriodToDate (total of the weeks) and YearToDate (PeriodToDate + the sum of all data from the beginning of the year to the end of the previous period).
I'm not really concerned with the PTD, the program can handle that. I got the BOY date with the following code:
All data is summed for the week and grouped by storeid and TRUNC(date, 'IW'), which is the Monday of each week. (The resultset contains data for 2 stores instead of 1. This is there only to make sure my query was filtering correctly.)
drop table my_csh_main;
CREATE TABLE MY_CSH_MAIN
( "FK_STR_MAIN_ID" NUMBER,
"BUSI_DATE" DATE,
"CONF_NUMB2" NUMBER,
"CONF_NUMB49" NUMBER,
"CONF_NUMB44" NUMBER,
"CONF_NUMB3" NUMBER,
"CONF_NUMB4" NUMBER,
"CONF_NUMB38" NUMBER,
"CONF_NUMB56" NUMBER
);
REM INSERTING into MY_CSH_MAIN
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('28-AUG-08','DD-MON-RR'),22103.69,0,0,119,0,4605.21,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('27-AUG-09','DD-MON-RR'),18081.37,0,0,0,0,3533.45,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('17-SEP-09','DD-MON-RR'),18211.29,0,0,0,0,3806.32,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('04-SEP-08','DD-MON-RR'),24244.37,0,0,284.94,0,0,9395.63);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('03-SEP-09','DD-MON-RR'),18644.73,0,0,85.48,0,0,7228.13);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('24-SEP-09','DD-MON-RR'),16809.21,0,0,64.99,0,3014.61,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('25-SEP-08','DD-MON-RR'),24440.87,0,0,0,0,0,9469.64);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('28-AUG-08','DD-MON-RR'),23515.95,0,0,0,80.38,0,9379.9);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('24-SEP-09','DD-MON-RR'),20825.63,0,0,73.97,0,0,7958.53);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('17-SEP-09','DD-MON-RR'),18749.29,0,0,0,0,0,7699.53);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('11-SEP-08','DD-MON-RR'),22839.3,0,0,206.39,116.74,4493.28,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('04-SEP-08','DD-MON-RR'),22627.74,0,0,279.98,0,4423.83,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('27-AUG-09','DD-MON-RR'),18603.96,0,0,81.25,0,0,7118.17);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('11-SEP-08','DD-MON-RR'),23962.74,0,0,153.1,0,0,9335.35);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('18-SEP-08','DD-MON-RR'),23134.79,0,0,44.12,0,0,8978.87);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('25-SEP-08','DD-MON-RR'),24950.45,0,0,129.98,0,5330.22,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('10-SEP-09','DD-MON-RR'),19266.95,0,0,0,0,0,7657.94);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('03-SEP-09','DD-MON-RR'),17183.25,0,0,0,0,3487.12,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('18-SEP-08','DD-MON-RR'),21372.82,0,0,0,0,4546.15,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('10-SEP-09','DD-MON-RR'),17688.41,0,0,113.12,0,3424.17,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('31-DEC-08','DD-MON-RR'),611016.24,0,0,1276.62,724.96,122236.02,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('31-DEC-08','DD-MON-RR'),667612.63,0,0,1018.81,0,0,269777.87);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('02-JAN-08','DD-MON-RR'),1676737.13,0,0,5652.47,3850.68,345971.1,500.5);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('02-JAN-08','DD-MON-RR'),1786451.11,0,0,3167.61,175.38,0,788438.73);
CREATE TABLE LANDAU_REPORT_STORES
( "COMPANYNAME" CHAR(40 BYTE),
"DISTRICTNAME" VARCHAR2(50 BYTE),
"STOREID" NUMBER(4,0) NOT NULL ENABLE,
"STORENAME" VARCHAR2(70 BYTE),
"STORENBR" CHAR(4 BYTE) NOT NULL ENABLE
);
REM INSERTING into LANDAU_REPORT_STORES
Insert into LANDAU_REPORT_STORES (COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 10',64,'N_Main','0004');
Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 10',65,'Belvidere','0005');
Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 10',104,'Mulford','0032');
Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 50',141,'Charleston','0043');
Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 10',61,'Kilburn','0002');
Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 10',62,'11th_St','0003');
with WeeklyTotals as
( select StoreId
, week_date
, net_sales
, ljs_food_sales
, total_drink_sales
, net_food
, CASE
WHEN net_food is null then 0
WHEN net_food = 0 then 0
ELSE Ljs_food_sales / net_food
END as LJS_Percent
, CASE
WHEN net_food is null then 0
WHEN net_food = 0 then 0
else total_drink_sales * (ljs_food_sales / net_food)
END as LJS_Drinks
, aw_sales
, tb_sales
FROM (Select fk_str_main_id as StoreId,
trunc(csh.busi_date, 'IW') as week_date,
sum(csh.conf_numb2) As net_sales,
sum(conf_numb49) as ljs_food_sales,
sum(conf_numb44) As total_drink_sales,
sum(csh.conf_numb2) - sum(CONF_NUMB44) - sum(conf_numb3) - sum(conf_numb4) AS net_food,
sum(conf_numb38) As aw_sales,
sum(conf_numb56) as tb_sales
from my_csh_main csh
WHERE BUSI_DATE BETWEEN TO_DATE( '28-Dec-08' ,'DD-MON-YY') AND TO_DATE( '27-SEP-09' ,'DD-MON-YY')
and fk_str_main_id in (141, 221)
GROUP BY CSH.FK_STR_MAIN_ID, trunc(csh.busi_date, 'IW')
       )
)
, WeeklyFoodSalesLY as
( SELECT fk_str_main_id AS storeid
, TRUNC(busi_date, 'iw') week_date
, SUM(csh.conf_numb2) AS net_sales
FROM my_csh_main csh
WHERE busi_date BETWEEN to_date('31-DEc-07', 'dd-Mon-yy') and to_date('28-Sep-08', 'dd-Mon-yy')
GROUP BY fk_str_main_id, TRUNC(busi_date, 'iw')
)
, StoreDetails AS
( select * from LANDAU_REPORT_STORES where STORENAME NOT like '%CLOSED%' order by DistrictName, StoreNbr
)
Select
foods.storeid
, CASE
WHEN to_date(Foods.week_date, 'dd-Mon-yy') = to_date('24-AUG-09', 'dd-Mon-yy') then 1
WHEN to_date(Foods.week_date, 'dd-Mon-yy') = to_date('24-AUG-09', 'dd-Mon-yy') + 7 then 2
WHEN to_date(Foods.week_date, 'dd-Mon-yy') = to_date('24-AUG-09', 'dd-Mon-yy') + 14 then 3
WHEN to_date(Foods.week_date, 'dd-Mon-yy') = to_date('24-AUG-09','dd-Mon-yy') + 21 then 4
WHEN to_date(Foods.week_date, 'dd-Mon-yy') = to_date('24-AUG-09', 'dd-Mon-yy') + 28 then 5
ELSE 0
end as WeekNBr
, foods.week_date as CurrWeekDate
, foodsLY.week_date as LastYearWEekDate
, ROUND(NVL(foods.net_sales - (aw_sales + tb_sales + ljs_drinks + ljs_food_sales ), 0), 2) as Landau_Net_Sales
, ROUND(NVL(aw_sales, 0), 2) as aw_sales
, ROUND(NVL(tb_sales, 0), 2) as tb_sales
, ROUND(NVL(ljs_drinks + ljs_food_sales, 0),2) as ljs_net_sales
, ROUND(NVL(foods.Net_Sales, 0), 2) as net_sales
--==
-- Last Year Sales and Last Year Next Year Sales
--==
, ROUND(NVL(foodsLY.Net_Sales, 0),2) as WTDLYNetSales
, ROUND(NVL(Foods.Net_Sales - FoodsLY.Net_Sales, 0),2) as WTDSalesIncrease
-- , ROUND(NVL(FoodsLY_NextWeek.Net_Sales, 0), 2) as WTDFoodsSalesLY
, stores.*
from WEeklyTotals Foods
LEFT OUTER JOIN Weeklyfoodsalesly foodsly ON foodsly.storeid = foods.storeid
AND foodsly.week_date = DECODE(foods.week_date,
to_date('24-AUG-09', 'dd-Mon-yy') , to_date('25-AUG-08', 'dd-Mon-yy') ,
to_date('24-AUG-09', 'dd-Mon-yy') + 7, to_date('25-AUG-08', 'dd-Mon-yy') + 7,
to_date('24-AUG-09', 'dd-Mon-yy') + 14, to_date('25-AUG-08', 'dd-Mon-yy') + 14,
to_date('24-AUG-09', 'dd-Mon-yy') + 21, to_date('25-AUG-08', 'dd-Mon-yy') + 21,
to_date('24-AUG-09', 'dd-Mon-yy') + 28, to_date('25-AUG-08', 'dd-Mon-yy') + 28)
LEFT OUTER JOIN StoreDetails stores ON stores.storeid = Foods.storeid;
One exception: in a WITH statement, I get a recordset containing all of last year's data. The problem came when I tried to pivot the Last Year sales to get the "Last Year Next Week" value (meaning next week, one year ago). I pulled this snippet out to test with but couldn't make it work:
with WeeklyFoodSalesLY as
( SELECT fk_str_main_id AS storeid
, TRUNC(busi_date, 'iw') week_date
, sum(conf_numb2) conf_numb2
FROM my_csh_main csh
WHERE busi_date BETWEEN to_date('31-dec-07', 'dd-Mon-yy') and to_date('28-Sep-08', 'dd-Mon-yy') + 7
and fk_str_main_id = 141
GROUP BY fk_str_main_id , TRUNC(busi_date, 'iw')
)
, sales_ly as
( SELECT storeid
, week_date
, sum(conf_numb2) as net_sales
from WeeklyFoodSalesLY
WHERE week_date BETWEEN to_date('25-Aug-08', 'dd-Mon-yy') and to_date('28-Sep-08', 'dd-Mon-yy')
GROUP BY storeid , week_date
UNION ALL
SELECT storeid
, week_date
, sum(conf_numb2) as net_sales
from WeeklyFoodSalesLY
WHERE week_date BETWEEN to_date('25-Aug-08', 'dd-Mon-yy') + 7 and to_date('28-Sep-08', 'dd-Mon-yy') + 7
GROUP BY storeid , week_date
UNION ALL
SELECT storeid
, week_date
, sum(conf_numb2) as net_sales
from WeeklyFoodSalesLY
WHERE week_date < to_date('25-Aug-08', 'dd-Mon-yy')
GROUP BY storeid , week_date
)
, pivoted_sales_ly as
( SELECT storeid
week_date
, MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy'), net_sales)) as lastyear
, MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy') + 7, net_sales)) as lastyearnextweek
, MAX(DECODE(week_date, to_date('31-dec-07', 'dd-Mon-yy'), net_sales)) as lastyeartotal
from sales_ly
GROUP BY storeid, week_date
)
Select *
from pivoted_sales_ly;
What am I doing wrong here? Once I get this I can work back into the original query.
Analytics:
Boneist gave me some code to try last week:
Combining refCursors and Summing
that created the PTD column.
I could never get it to work because I was tweaking the query so much, but is there a way to get the PTD without having to add an extra PTD column for each week?
One other thing:
I would like to know how to use analytics to roll up report by division into a different cursor.
Edited by: TheHTMLDJ on Oct 26, 2009 4:50 AM
Edited by: TheHTMLDJ on Oct 26, 2009 4:59 AM
TheHTMLDJ wrote:
The report looks like this: (Data provided should produce these numbers except for maybe the YTD column.)
With the data you provided, I managed to produce the expected output for storeid 221, barring the YTD column, which had the wrong values. I had to make up the "next week" row for the previous year for storeid 221, btw:
WITH weekly_values AS (SELECT fk_str_main_id as storeid,
TRUNC(csh.busi_date, 'IW') AS week_date,
NVL(SUM(csh.conf_numb2), 0) AS net_sales,
NVL(SUM(conf_numb49), 0) AS ljs_food_sales,
NVL(SUM(conf_numb44), 0) AS total_drink_sales,
NVL(SUM(csh.conf_numb2) - SUM(conf_numb44)
- SUM(conf_numb3)
- SUM(conf_numb4), 0) AS net_food,
NVL(SUM(conf_numb38), 0) AS aw_sales,
NVL(SUM(conf_numb56), 0) AS tb_sales
FROM my_csh_main csh
WHERE (busi_date BETWEEN TRUNC(:v_bop, 'iy') AND TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks - 1) + 6
OR
busi_date BETWEEN TRUNC(:v_bop_ly, 'iy') AND TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks + 6)
AND fk_str_main_id IN (141, 221)
GROUP BY csh.fk_str_main_id,
TRUNC(csh.busi_date, 'IW')),
weekly_totals AS (SELECT storeid,
week_date,
net_sales,
ljs_food_sales,
total_drink_sales,
net_food,
CASE WHEN nvl(net_food, 0) = 0 then 0
ELSE ljs_food_sales / net_food
END AS ljs_percent,
CASE WHEN nvl(net_food, 0) = 0 then 0
ELSE total_drink_sales * (ljs_food_sales / net_food)
END AS ljs_drinks,
aw_sales,
tb_sales,
net_sales - aw_sales
- tb_sales
- ljs_food_sales
- CASE WHEN nvl(net_food, 0) = 0 then 0
ELSE total_drink_sales * (ljs_food_sales / net_food)
END AS landau_net_sales
FROM weekly_values),
week_tots_ytd AS (SELECT storeid,
week_date,
net_sales,
ljs_food_sales,
total_drink_sales,
net_food,
ljs_drinks,
aw_sales,
tb_sales,
landau_net_sales,
SUM(decode(week_date, TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks, NULL, landau_net_sales)) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) lns_ytd,
SUM(decode(week_date, TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks, NULL, aw_sales)) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) aws_ytd,
SUM(decode(week_date, TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks, NULL, ljs_food_sales)) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) ljsf_ytd,
SUM(decode(week_date, TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks, NULL, tb_sales)) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) tbs_ytd,
SUM(decode(week_date, TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks, NULL, net_sales)) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) net_ytd,
SUM(net_sales) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) full_net_ytd,
SUM(landau_net_sales) OVER (PARTITION BY storeid,
CASE WHEN week_date between TRUNC(:v_bop, 'iw') and TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks -1) THEN 1
WHEN week_date between TRUNC(:v_bop_ly, 'iw') and TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks -1) THEN 2
END) lns_ptd,
SUM(aw_sales) OVER (PARTITION BY storeid,
CASE WHEN week_date between TRUNC(:v_bop, 'iw') and TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks -1) THEN 1
WHEN week_date between TRUNC(:v_bop_ly, 'iw') and TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks -1) THEN 2
END) aws_ptd,
SUM(ljs_food_sales) OVER (PARTITION BY storeid,
CASE WHEN week_date between TRUNC(:v_bop, 'iw') and TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks -1) THEN 1
WHEN week_date between TRUNC(:v_bop_ly, 'iw') and TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks -1) THEN 2
END) ljsf_ptd,
SUM(tb_sales) OVER (PARTITION BY storeid,
CASE WHEN week_date between TRUNC(:v_bop, 'iw') and TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks -1) THEN 1
WHEN week_date between TRUNC(:v_bop_ly, 'iw') and TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks -1) THEN 2
END) tbs_ptd,
SUM(net_sales) OVER (PARTITION BY storeid,
CASE WHEN week_date between TRUNC(:v_bop, 'iw') and TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks -1) THEN 1
WHEN week_date between TRUNC(:v_bop_ly, 'iw') and TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks-1) THEN 2
END) net_ptd,
LEAD(net_sales) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')
ORDER BY week_date) next_week_net_sales
FROM weekly_totals),
foods AS (Select storeid,
CASE WHEN week_date in (:v_bop, :v_bop_ly) then 1
WHEN week_date in (:v_bop + 7, :v_bop_ly + 7) then 2
WHEN week_date in (:v_bop + 14, :v_bop_ly + 14) then 3
WHEN week_date in (:v_bop + 21, :v_bop_ly + 21) then 4
WHEN week_date in (:v_bop + 28, :v_bop_ly + 28) then 5
END AS week_number,
week_date,
ROUND(net_sales - (aw_sales + tb_sales + ljs_drinks + ljs_food_sales), 2) AS landau_net_sales,
ROUND(aw_sales, 2) AS aw_sales,
ROUND(tb_sales, 2) AS tb_sales,
ROUND(ljs_drinks + ljs_food_sales, 2) AS ljs_net_sales,
ROUND(net_sales, 2) AS net_sales,
ROUND(lns_ytd, 2) AS lns_ytd,
ROUND(aws_ytd, 2) AS aws_ytd,
ROUND(ljsf_ytd, 2) AS ljsf_ytd,
ROUND(tbs_ytd, 2) AS tbs_ytd,
ROUND(net_ytd, 2) AS net_ytd,
ROUND(full_net_ytd, 2) AS full_net_ytd,
ROUND(lns_ptd, 2) AS lns_ptd,
ROUND(aws_ptd, 2) AS aws_ptd,
ROUND(ljsf_ptd, 2) AS ljsf_ptd,
ROUND(tbs_ptd, 2) AS tbs_ptd,
ROUND(net_ptd, 2) AS net_ptd,
ROUND(next_week_net_sales, 2) AS next_week_net_sales,
SUM(ROUND(next_week_net_sales, 2)) OVER (PARTITION BY storeid,
TRUNC(week_date, 'iy')) nwns_ptd
FROM week_tots_ytd
WHERE (week_date BETWEEN TRUNC(:v_bop, 'iw') AND TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks - 1)
OR
week_date BETWEEN TRUNC(:v_bop_ly, 'iw') AND TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks - 1))),
pivoted_foods AS (SELECT storeid,
week_number,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), landau_net_sales)) landau_net_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), aw_sales)) aw_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), tb_sales)) tb_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), ljs_net_sales)) ljs_net_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), net_sales)) net_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), lns_ytd)) lns_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), aws_ytd)) aws_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), ljsf_ytd)) ljsf_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), tbs_ytd)) tbs_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), net_ytd)) net_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), full_net_ytd)) full_net_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), lns_ptd)) lns_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), aws_ptd)) aws_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), ljsf_ptd)) ljsf_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), tbs_ptd)) tbs_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), net_ptd)) net_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop_ly, 'iy'), net_sales)) ly_net_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop_ly, 'iy'), net_ytd)) ly_net_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop_ly, 'iy'), net_ptd)) ly_net_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop_ly, 'iy'), next_week_net_sales)) ly_next_week_net_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop_ly, 'iy'), nwns_ptd)) ly_nwns_ptd
FROM foods
GROUP BY storeid,
week_number),
storedetails AS (SELECT companyname,
districtname,
storeid,
storename,
storenbr
FROM landau_report_stores
WHERE UPPER(storename) NOT LIKE '%CLOSED%'
ORDER BY districtname, storenbr),
dummy AS (SELECT level col1
FROM dual
CONNECT BY level <= 8)
SELECT pf.storeid,
DECODE(dummy.col1, 1, 'Net - Landau',
2, 'Net - AW',
3, 'Net - LJS',
4, 'Net - TB',
5, 'Total Net',
6, 'Last Year Sales',
7, 'Increase',
8, 'Last Year Next Week') category,
SUM(CASE WHEN pf.week_number = 1 AND dummy.col1 = 1 THEN pf.landau_net_sales
WHEN pf.week_number = 1 AND dummy.col1 = 2 THEN pf.aw_sales
WHEN pf.week_number = 1 AND dummy.col1 = 3 THEN pf.ljs_net_sales
WHEN pf.week_number = 1 AND dummy.col1 = 4 THEN pf.tb_sales
WHEN pf.week_number = 1 AND dummy.col1 = 5 THEN pf.net_sales
WHEN pf.week_number = 1 AND dummy.col1 = 6 THEN pf.ly_net_sales
WHEN pf.week_number = 1 AND dummy.col1 = 7 THEN pf.net_sales - pf.ly_net_sales
WHEN pf.week_number = 1 AND dummy.col1 = 8 THEN pf.ly_next_week_net_sales
END) week1,
SUM(CASE WHEN pf.week_number = 2 AND dummy.col1 = 1 THEN pf.landau_net_sales
WHEN pf.week_number = 2 AND dummy.col1 = 2 THEN pf.aw_sales
WHEN pf.week_number = 2 AND dummy.col1 = 3 THEN pf.ljs_net_sales
WHEN pf.week_number = 2 AND dummy.col1 = 4 THEN pf.tb_sales
WHEN pf.week_number = 2 AND dummy.col1 = 5 THEN pf.net_sales
WHEN pf.week_number = 2 AND dummy.col1 = 6 THEN pf.ly_net_sales
WHEN pf.week_number = 2 AND dummy.col1 = 7 THEN pf.net_sales - pf.ly_net_sales
WHEN pf.week_number = 2 AND dummy.col1 = 8 THEN pf.ly_next_week_net_sales
END) week2,
SUM(CASE WHEN pf.week_number = 3 AND dummy.col1 = 1 THEN pf.landau_net_sales
WHEN pf.week_number = 3 AND dummy.col1 = 2 THEN pf.aw_sales
WHEN pf.week_number = 3 AND dummy.col1 = 3 THEN pf.ljs_net_sales
WHEN pf.week_number = 3 AND dummy.col1 = 4 THEN pf.tb_sales
WHEN pf.week_number = 3 AND dummy.col1 = 5 THEN pf.net_sales
WHEN pf.week_number = 3 AND dummy.col1 = 6 THEN pf.ly_net_sales
WHEN pf.week_number = 3 AND dummy.col1 = 7 THEN pf.net_sales - pf.ly_net_sales
WHEN pf.week_number = 3 AND dummy.col1 = 8 THEN pf.ly_next_week_net_sales
END) week3,
SUM(CASE WHEN pf.week_number = 4 AND dummy.col1 = 1 THEN pf.landau_net_sales
WHEN pf.week_number = 4 AND dummy.col1 = 2 THEN pf.aw_sales
WHEN pf.week_number = 4 AND dummy.col1 = 3 THEN pf.ljs_net_sales
WHEN pf.week_number = 4 AND dummy.col1 = 4 THEN pf.tb_sales
WHEN pf.week_number = 4 AND dummy.col1 = 5 THEN pf.net_sales
WHEN pf.week_number = 4 AND dummy.col1 = 6 THEN pf.ly_net_sales
WHEN pf.week_number = 4 AND dummy.col1 = 7 THEN pf.net_sales - pf.ly_net_sales
WHEN pf.week_number = 4 AND dummy.col1 = 8 THEN pf.ly_next_week_net_sales
END) week4,
SUM(CASE WHEN pf.week_number = 5 AND dummy.col1 = 1 THEN pf.landau_net_sales
WHEN pf.week_number = 5 AND dummy.col1 = 2 THEN pf.aw_sales
WHEN pf.week_number = 5 AND dummy.col1 = 3 THEN pf.ljs_net_sales
WHEN pf.week_number = 5 AND dummy.col1 = 4 THEN pf.tb_sales
WHEN pf.week_number = 5 AND dummy.col1 = 5 THEN pf.net_sales
WHEN pf.week_number = 5 AND dummy.col1 = 6 THEN pf.ly_net_sales
WHEN pf.week_number = 5 AND dummy.col1 = 7 THEN pf.net_sales - pf.ly_net_sales
WHEN pf.week_number = 5 AND dummy.col1 = 8 THEN pf.ly_next_week_net_sales
END) week5,
SUM(CASE WHEN pf.week_number = 5 AND dummy.col1 = 1 THEN pf.lns_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 2 THEN pf.aws_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 3 THEN pf.ljsf_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 4 THEN pf.tbs_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 5 THEN pf.net_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 6 THEN pf.ly_net_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 7 THEN pf.net_ptd - pf.ly_net_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 8 THEN pf.ly_nwns_ptd
END) period_to_date,
SUM(CASE WHEN pf.week_number = 5 AND dummy.col1 = 1 THEN pf.lns_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 2 THEN pf.aws_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 3 THEN pf.ljsf_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 4 THEN pf.tbs_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 5 THEN pf.net_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 6 THEN pf.ly_net_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 7 THEN pf.net_ytd - pf.ly_net_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 8 THEN pf.full_net_ytd
END) year_to_date,
stores.companyname,
stores.districtname,
stores.storename,
stores.storenbr
FROM pivoted_foods pf
LEFT OUTER JOIN storedetails stores ON stores.storeid = pf.storeid,
dummy
group by dummy.col1,
DECODE(dummy.col1, 1, 'Net - Landau',
2, 'Net - AW',
3, 'Net - LJS',
4, 'Net - TB',
5, 'Total Net',
6, 'Last Year Sales',
7, 'Increase',
8, 'Last Year Next Week'),
rollup((pf.storeid,
stores.companyname,
stores.districtname,
stores.storename,
stores.storenbr))
order by pf.storeid, dummy.col1;
(Replace all :v_num_weeks, :v_bop and :v_bop_ly references with your variables.) -
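Since the whole report hinges on TRUNC(..., 'IW') bucketing dates into ISO weeks, here is a minimal sanity check of that behavior (the dates are illustrative):

```sql
-- TRUNC(d, 'IW') returns the Monday of d's ISO week, so rows from late
-- December and early January that share an ISO week land in the same
-- bucket instead of being split across calendar years.
SELECT TRUNC(DATE '2009-12-31', 'IW') AS week_start_a,
       TRUNC(DATE '2010-01-01', 'IW') AS week_start_b
FROM dual;
-- Both columns return Monday 2009-12-28, the start of that ISO week.
```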
Pure SQL to partition date-time occurrences into non-overlapping windows?
I've a question that, so far, I've never been able to solve via a pure SQL option.
It's hard to explain in words, but it's something like this:
Given a set of date-times, I would like to partition the data into non-overlapping windows of 30 minutes each.
The data is supposed to be partitioned into windows of 30 minutes, meaning that when a row is within 30 minutes of the first occurrence, only the first occurrence is returned. In the next second after the 30th minute, the record is considered the start of a new window and is also returned; rows that occur within the window period are suppressed. The first occurrence does not necessarily occur on the 00th minute, so the window start will never be constant.
Run the query below to look at the dummy data.
SELECT 'A' AS ID
, TRUNC(SYSDATE) + 7 / 24 + 1 *(ROWNUM - 1) / 1440 AS datetime
FROM DUAL
CONNECT BY ROWNUM <= 50
UNION ALL
SELECT 'A' AS ID
, TRUNC(SYSDATE) + 9 / 24 + 8 / 1440 + 1 *(ROWNUM - 1) / 1440 AS datetime
FROM DUAL
CONNECT BY ROWNUM <= 35
UNION ALL
SELECT 'B' AS ID
, TRUNC(SYSDATE) + 7 / 24 + 5 *(ROWNUM - 1) / 1440 AS datetime
FROM DUAL
CONNECT BY ROWNUM <= 15
This is supposed to be the output:
ID DATETIME
A 5/19/2010 07:00:00
A 5/19/2010 07:30:00
A 5/19/2010 09:08:00
A 5/19/2010 09:38:00
B 5/19/2010 07:00:00
B 5/19/2010 07:30:00
B 5/19/2010 08:00:00
So far, I'm using PL/SQL to pipe the records, but I would like to know whether this is achievable via SQL or not.
I've tried looking at analytics, WIDTH_BUCKET, NTILE and all the options I can think of, but I just can't solve this at all.
Hey Bob,
Your answer is most definitely correct and does what I want. I've verified it against my data set and it returns the results as required!
You've definitely proven me wrong; I was always under the impression that this wasn't possible. Thanks!
Just a small note:
I need the windows to be binned by seconds, so I have changed the NUMTODSINTERVAL to raw numbers.
WITH t AS
(SELECT 'A' AS ID
, TRUNC(SYSDATE) +(6.75 / 24) AS datetime
FROM DUAL
UNION ALL
SELECT 'A' AS ID
, TRUNC(SYSDATE) +(6.75 / 24) AS datetime
FROM DUAL
UNION ALL
SELECT 'A' AS ID
, TRUNC(SYSDATE) + 7 / 24 + 1 *(ROWNUM - 1) / 1440 AS datetime
FROM DUAL
CONNECT BY ROWNUM <= 50
UNION ALL
SELECT 'A' AS ID
, TRUNC(SYSDATE) + 9 / 24 + 8 / 1440 + 1 *(ROWNUM - 1) / 1440 AS datetime
FROM DUAL
CONNECT BY ROWNUM <= 35
UNION ALL
SELECT 'B' AS ID
, TRUNC(SYSDATE) + 7 / 24 + 5 *(ROWNUM - 1) / 1440 AS datetime
FROM DUAL
CONNECT BY ROWNUM <= 15)
,a AS
(SELECT ID
,datetime
,LAG(datetime) OVER(PARTITION BY ID ORDER BY datetime) AS prevtime
,LAST_VALUE(datetime) OVER(PARTITION BY ID ORDER BY datetime RANGE BETWEEN CURRENT ROW AND 30 / 1440 + 1 / 86400 FOLLOWING) AS interval_end
FROM t)
,b AS
(SELECT ID
,datetime
,LEAD(datetime) OVER(PARTITION BY ID ORDER BY datetime) AS nexttime
FROM t)
,ab AS
(SELECT a.ID
,a.datetime
,a.prevtime
,a.interval_end
,b.datetime as b_datetime
,b.nexttime
FROM a JOIN b ON(a.ID = b.ID
AND a.interval_end = b.datetime))
SELECT ID
,datetime
FROM ab
START WITH prevtime IS NULL
CONNECT BY ID = PRIOR ID
AND datetime = PRIOR nexttime
ORDER BY ID
,datetime;
This most definitely proves that I'm still not sure how to use hierarchical queries.
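As an aside for readers on Oracle 12c or later: the same sessionizing can be written more directly with MATCH_RECOGNIZE. This is an untested sketch against the sample CTE t above, not the original poster's solution:

```sql
-- Start a window at the first row per id; suppress rows that fall
-- strictly within 30 minutes of the window start; the next row at or
-- beyond the 30th minute begins a new window and is emitted.
SELECT id, datetime
FROM t
MATCH_RECOGNIZE (
  PARTITION BY id
  ORDER BY datetime
  MEASURES FIRST(strt.datetime) AS datetime
  ONE ROW PER MATCH
  PATTERN (strt within_window*)
  DEFINE within_window AS datetime < FIRST(strt.datetime) + 30/1440
);
```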
Edited by: casey on May 20, 2010 11:20 AM -
How to break a ranked query into subcomponents using analytics
HI,
I'm using Oracle 10.2.0.4. I have a query where I'm listing the top X clients ranked by Total Margin.
The query in SQL looks like this.
Select
accounting_date,
customer_number,
customer_id,
customer_name,
TOT_PROD,
TOT_SVS,
TOT_MGN,
RANK_IN_LINE
from(
Select
RNK.accounting_date,
RNK.customer_number,
RNK.customer_id,
RNK.customer_name,
SUM (RNK.total_prod_margin) TOT_PROD,
SUM (RNK.total_svcs_margin) TOT_SVS,
SUM (RNK.total_margin) TOT_MGN,
RANK() OVER( ORDER BY SUM (RNK.total_margin) desc) RANK_IN_LINE
from
SELECT MGN.accounting_date,
MGN.customer_number,
MGN.customer_id,
MGN.customer_name,
SUM (MGN.prod_margin) total_prod_margin,
SUM (MGN.svcs_margin) total_svcs_margin,
SUM (MGN.prod_margin + MGN.svcs_margin) total_margin
FROM
(( SELECT TRUNC (ba.accounting_date) accounting_date,
ba.CUSTOMER_ACCOUNT_NUMBER customer_number,
to_char(nvl(ba.customer_id, -1)) customer_id,
ba.customer_name,
SUM (ba.book_margin_amount) prod_margin,
0 svcs_margin
FROM ftbv.ftbv_booking_activity_all_1d ba
WHERE ba.order_type_name <> 'Wholesale'
AND ba.SALESREP_NAME <> 'NO SALES CREDIT'
AND ba.product_group_name <> 'BAS New'
AND BA.PARTIAL_ID <> -1
GROUP BY TRUNC (ba.accounting_date),
ba.customer_account_number,
to_char(nvl(ba.customer_id, -1)),
ba.customer_name
HAVING SUM (ba.book_margin_amount) <> 0)
UNION ALL
( SELECT TRUNC (bs.accounting_date) accounting_date,
bs.AMS_CUSTOMER_NUMBER customer_number,
bs.CUSTOMER_ACCOUNT_NUMBER customer_id,
bs.customer_name,
0 prod_margin,
SUM (bs.book_gross_profit_margin_amt) svcs_margin
FROM ftbv.ftbv_book_svc_activity_all_1d bs
WHERE bs.booking_activity_name != 'Undefined'
GROUP BY TRUNC (bs.accounting_date),
bs.ams_customer_number,
bs.CUSTOMER_ACCOUNT_NUMBER,
bs.customer_name
HAVING SUM (bs.book_gross_profit_margin_amt) <> 0)) MGN
GROUP BY MGN.accounting_date,
MGN.customer_number,
MGN.customer_id,
MGN.customer_name) RNK
Group by
RNK.accounting_date,
RNK.customer_number,
RNK.customer_id,
RNK.customer_name,
RNK.total_prod_margin,
RNK.total_svcs_margin,
RNK.total_margin
order by RNK.customer_name , RNK.accounting_date)
where RANK_IN_LINE <=10
order by rank_in_line asc;
Now I need to add fields where this is broken out by REGION and LINE OF BUSINESS, but I want to keep the same number of clients. If I use the same calculation, I get only the top ten CLIENT, REGION, LINE OF BUSINESS values (two extra fields). I don't get the original list of clients; I just get the top X lines.
What do I need to do in order to keep the same top ten (or however many) clients and show those values broken down by their regions and lines of business (LOBs)?
The new fields are shown in this query
SELECT MGN.accounting_date,
MGN.customer_number,
MGN.customer_id,
MGN.customer_name,
MGN.SALES_REGION_NAME,
MGN.DESCRIPTION,
SUM (MGN.prod_margin) total_prod_margin,
SUM (MGN.svcs_margin) total_svcs_margin,
SUM (MGN.prod_margin + MGN.svcs_margin) total_margin
-- RANK() OVER(PARTITION BY MGN.customer_name ORDER BY SUM (MGN.prod_margin + MGN.svcs_margin) desc)
FROM
(( SELECT TRUNC (ba.accounting_date) accounting_date,
ba.CUSTOMER_ACCOUNT_NUMBER customer_number,
to_char(nvl(ba.customer_id, -1)) customer_id,
ba.customer_name,
TERR.SALES_REGION_NAME,
decode(LOB.SALES_CHANNEL_CODE, 'SUN','Oracle',
'EBASOL', 'BAS',
'NBASOL', 'BAS',
'X86', 'X86',
'PC','X86',
'IBM','IBM',
LOB.DESCRIPTION) DESCRIPTION,
SUM (ba.book_margin_amount) prod_margin,
0 svcs_margin
FROM ftbv.ftbv_booking_activity_all_1d ba,
FTBV_DIM_TERRITORY_1D TERR,
FORX_LINES_OF_BUSINESS LOB
WHERE ba.order_type_name <> 'Wholesale'
AND ba.SALESREP_NAME <> 'NO SALES CREDIT'
AND ba.product_group_name <> 'BAS New'
AND BA.PARTIAL_ID <> -1
AND ba.TERRITORY_ID = TERR.TERRITORY_ID
AND BA.PRODUCT_GROUP_CODE = LOB.CODE
GROUP BY TRUNC (ba.accounting_date),
ba.customer_account_number,
to_char(nvl(ba.customer_id, -1)),
ba.customer_name,
TERR.SALES_REGION_NAME,
decode(LOB.SALES_CHANNEL_CODE, 'SUN','Oracle',
'EBASOL', 'BAS',
'NBASOL', 'BAS',
'X86', 'X86',
'PC','X86',
'IBM','IBM',
LOB.DESCRIPTION)
HAVING SUM (ba.book_margin_amount) <> 0)
UNION ALL
( SELECT TRUNC (bs.accounting_date) accounting_date,
bs.AMS_CUSTOMER_NUMBER customer_number,
bs.CUSTOMER_ACCOUNT_NUMBER customer_id,
bs.customer_name,
TERR.SALES_REGION_NAME,
'Service' DESCRIPTION,
0 prod_margin,
SUM (bs.book_gross_profit_margin_amt) svcs_margin
FROM ftbv.ftbv_book_svc_activity_all_1d bs,
FTBV_DIM_TERRITORY_1D TERR
WHERE bs.booking_activity_name != 'Undefined'
AND bs.TERRITORY_ID = TERR.TERRITORY_ID
GROUP BY TRUNC (bs.accounting_date),
bs.ams_customer_number,
bs.CUSTOMER_ACCOUNT_NUMBER,
bs.customer_name,
TERR.SALES_REGION_NAME,
'Service'
HAVING SUM (bs.book_gross_profit_margin_amt) <> 0)) MGN
GROUP BY MGN.accounting_date,
MGN.customer_number,
MGN.customer_id,
MGN.customer_name,
MGN.SALES_REGION_NAME,
MGN.DESCRIPTION
order by MGN.customer_name, MGN.accounting_date;
Also, my output from the original SQL looks like this:
Client Number Client Name Product GPM SUM Services GPM SUM Total GPM SUM
109087 Client 1 1822052.627 0 1822052.627
114386 Client 2 803616 0 803616
101390 Client 3 101339.26 2516 103855.26
102740 Client 4 50000 0 50000
104698 Client 5 48057.83 0 48057.83
100229 Client 6 39555.289 0 39555.2894
106902 Client 7 4148.73 0 4148.73
103565 Client 8 3016.19 0 3016.19
130322 Client 9 1892 0 1892
130241 Client 10 700 0 700
For the second query I removed the Top N constraint and got too large a list. I'm getting this:
Client Number Client Name Region Line of Business Product GPM SUM Services GPM SUM Total GPM SUM
109087 Client 1 Central Storage 11473.9 0 11473.9
109087 Client 1 Central Security 21690.727 0 21690.727
109087 Client 1 Central Network 1788888 0 1788888
120360 Client 2 Central X86 -189.76 0 -189.76
104134 Client 3 Central X86 -412.8 0 -412.8
104698 Client 4 Central X86 48057.83 0 48057.83
103565 Client 4 Central Storage 3016.19 0 3016.19
102740 Client 5 Central Storage 50000 0 50000
108702 Client 6 Central Sun 109.21 0 109.21
108702 Client 6 Central Storage -336 0 -336
110019 Client 7 Central Network 397.5 0 397.5
105096 Client 8 Central Security 198 0 198
130241 Client 9 Central Security 700 0 700
105588 Client 10 Central Sun -7742 0 -7742
106902 Client 11 Central Security 4148.73 0 4148.73
130322 Client 12 Central Network 1892 0 1892
101965 Client 13 Central X86 64.92 0 64.92
100229 Client 14 Central IBM 39555.29 0 39555.29
100229 Client 15 Central Hewlett Packard -0.0006 0 -0.0006
114386 Client 15 Central Storage 803616 0 803616
107554 Client 16 Central Sun -6155 0 -6155
104724 Client 17 Central Network 280.24 0 280.24
101390 Client 18 Central X86 91093.89 0 91093.89
101390 Client 18 Central Service 0 2516 2516
101390 Client 18 Central Network 10245.37 0 10245.37
It's not keeping the original top ten clients, of course; it's giving me too many. What analytics do I need to use to keep the top ten (X) clients, but broken out by each client's respective subgroups of Region and LOB (Line of Business)? -
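One possible answer to the question above, as an untested sketch: compute each client's overall total with an analytic SUM over the detail rows, rank clients on that total, and filter. Here mgn_detail is a placeholder name for the detailed region/LOB aggregate query the poster listed:

```sql
-- Keep region/LOB detail rows while ranking at the client level.
WITH detail AS (
  SELECT d.*,
         -- client-level total, repeated on every detail row
         SUM(total_margin) OVER (PARTITION BY customer_id) AS client_total
  FROM mgn_detail d      -- placeholder for the detailed query above
)
SELECT *
FROM (SELECT detail.*,
             -- rank whole clients by their overall total margin
             DENSE_RANK() OVER (ORDER BY client_total DESC) AS client_rank
      FROM detail)
WHERE client_rank <= 10;
```

This keeps every detail row of the top ten clients, because the rank is computed on the client-level total rather than on each client/region/LOB line.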
Hi,
I am using OWB 9i. I need to implement an SQL statement in a mapping. The statement is:
CASE WHEN (tr_dt=lead(tr_dt) OVER(PARTITION BY ID ORDER BY ID,tr_dt))
THEN lead(ef_dt) OVER(PARTITION BY ID ORDER BY ID,tr_dt)
ELSE tr_dt
END
I am not able to put the CASE statement in an expression, as it has OVER (PARTITION BY), which the expression operator does not accept. Is there any way I can implement this in the mapping without using a table function or any other procedure?
Regards
Bharath
OWB's support for analytics has always been pretty sparse. They did release a doc that showed a way to build some analytics using various workarounds here:
http://www.oracle.com/technology/sample_code/products/warehouse/files/analyticfunctions.pdf
However, my approach has always been to tend towards simplicity. A convoluted "workaround" mapping is harder to tune, troubleshoot, or maintain, and I always worry about how OWB will handle such workarounds over subsequent release upgrades. So, for anything not clearly natively supported by OWB, I build a view with the analytic and use that as my source.
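A sketch of that view-based approach, wrapping the poster's expression in a view so OWB can treat it as a plain source (src_table and derived_dt are hypothetical names):

```sql
-- The analytic lives in the view; the OWB mapping just selects from it.
CREATE OR REPLACE VIEW v_src_with_lead AS
SELECT s.*,
       CASE WHEN tr_dt = LEAD(tr_dt) OVER (PARTITION BY id ORDER BY id, tr_dt)
            THEN LEAD(ef_dt) OVER (PARTITION BY id ORDER BY id, tr_dt)
            ELSE tr_dt
       END AS derived_dt
FROM src_table s;
```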
Mike -
Hi,
I am new to my project where I need some help.
My requirement is like below
In the EMP table I need to group all the employees who belong to one department. I used the GROUP BY clause to fetch the results, but my lead asked me to use the PARTITION BY clause, so I am not sure whether I would get a different result set by using PARTITION BY.
If so, how can I use it?
Thanks,
Grouping reduces the number of result rows; analytics do not. Here is a small emp example:
select ENAME
, job
, count(*) over (partition by job) job_count
from emp
order by ename;
ENAME JOB JOB_COUNT
ADAMS CLERK 4
ALLEN SALESMAN 4
BLAKE MANAGER 3
CLARK MANAGER 3
FORD ANALYST 2
JAMES CLERK 4
JONES MANAGER 3
KING PRESIDENT 1
MARTIN SALESMAN 4
MILLER CLERK 4
SCOTT ANALYST 2
SMITH CLERK 4
TURNER SALESMAN 4
WARD SALESMAN 4
With group by you would only get one line per job group. You could get the same result with a self join with a grouped result:
select e1.ename
, e1.job
, e2.job_count
from emp e1
, (select JOB
, count(*) job_count
from emp
group by job) e2
where e1.job = e2.job
order by e1.ename;
ENAME JOB JOB_COUNT
ADAMS CLERK 4
ALLEN SALESMAN 4
BLAKE MANAGER 3
CLARK MANAGER 3
FORD ANALYST 2
JAMES CLERK 4
JONES MANAGER 3
KING PRESIDENT 1
MARTIN SALESMAN 4
MILLER CLERK 4
SCOTT ANALYST 2
SMITH CLERK 4
TURNER SALESMAN 4
WARD SALESMAN 4 -
Announcement: Oracle BI, Warehousing and Analytics Summit, Dec. 2-3 2008
Oracle BIWA Summit 2008, December 2-3, 2008 (www.oraclebiwa.org)
Oracle World Headquarters, Redwood Shores, CA
IOUG Business Intelligence, Warehousing and Analytics SIG
"Transforming Data To Information For Competitive Advantage"
Oracle BIWA Summit 2008 is a forum for business intelligence (BI), warehousing and analytics professionals to exchange information, experiences and best practices. With over 75 presentations and workshops and the entire event focused on BI, Warehousing and Analytics (BIWA), you will get the knowledge and information critical to be successful in your work. You will hear experts present novel and interesting use cases of Oracle Database-centric BIWA topics through keynotes, technical talks, hands on workshops, discussion panels and more.
Call for Presentations & Technical Workshops
You are invited to submit an abstract for a Presentation (50 minutes) or hands-on Technical Workshop (110 minutes) to the Oracle Business Intelligence, Warehousing, and Analytics (BIWA) Summit 2008 at the Oracle Conference Center in Redwood Shores, CA! See submission instructions and important dates below.
Oracle Business Intelligence, Warehousing, and Analytics (BIWA) SIG
The Oracle BIWA Special Interest Group (BIWA SIG) functions as a special interest group of the Independent Oracle Users Group (IOUG). BIWA SIG represents a worldwide association of persons who share a vested interest in successfully deploying Oracle Database-centric BIWA products, features and Options—that is, the Oracle Database “BIWA” platform. Oracle Database BIWA platform includes, but is not limited to:
* Oracle Database, Oracle OLAP Option, Oracle Data Mining Option, Oracle Partitioning Option, Oracle Spatial Option, Oracle Warehouse Builder, SQL Statistical functions, SQL Analytics, Oracle Text, Oracle Multimedia, regular expression searches, text mining, BI Publisher
* Query and reporting tools, BI tools and BI dashboards including Oracle Business Intelligence SE, Oracle Business Intelligence EE, and similar tools from Oracle partners and ISVs
The Oracle BIWA Summit 2008 is designed to accelerate users' success with Oracle products by providing a forum for sharing information and best practices, discovering novel and innovative practices, and exposing issues and limitations concerning Oracle products.
Keynotes
Come hear from the industry leaders, visionaries and experts including:
* Jeanne Harris, co-author, with Tom Davenport, of the best-selling book “Competing on Analytics: The New Science of Winning” (Harvard Business School Press, 2007) and Executive Research Fellow and Director of Research at the Accenture Institute for High Performance Business
* Ray Roccaforte, Vice President of Data Warehousing and Business Intelligence Platform, Oracle Corporation
* Juan Loaiza, Senior Vice President Systems Technology, Oracle Corporation
* Dr. Usama Fayyad, Chief Data Officer and Executive Vice President, Research & Strategic Data Solutions, Yahoo!
Hands-on Technical Workshops
Oracle BIWA Summit includes hands-on workshops using provided laptops running the latest release of the Oracle Database and BIWA-related software.
-Learn from the experts
-Hands-on opportunities to work with the software
- 2-hour in-depth hands-on workshops
Partners and Sponsors
BIWA partners and sponsors will be exhibiting their products and services at the BIWA Summit 2008. And don’t miss the fast-paced fun Lightning Rounds!
-Partner and Sponsor Exhibits
-Evening Reception for all BIWA attendees on Tuesday Night, December 2 sponsored by Partner and Sponsors
-ISV Partners and Sponsors Lightning Rounds (just before lunch each day)
BIWA themes include: "Transforming data to information for competitive advantage" and "Better Information Better Results".
Hi,
Yes, it changed. I am just working on the BI Composer part: applied the JDev 13952743 patch, then extended the domain by script.
Got some errors, then rolled back the JDev patch file and reapplied it.
FYI:
Applying JDev 13952743 Patch file.
E:\>cd E:\opatch_top\13952743
E:\opatch_top\13952743>set PATH=%PATH%;D:\Oracle\Middleware\Oracle_BI1\OPatch
set ORACLE_HOME=D:\Oracle\Middleware\oracle_common
set PATH=%ORACLE_HOME%\bin;%PATH%
set JAVA_HOME=%ORACLE_HOME%\jdk
set PATH=%JAVA_HOME%\bin;%PATH%
set PATH=%ORACLE_HOME%\OPatch;%PATH%
E:\opatch_top\13952743>opatch version
E:\opatch_top\13952743>opatch apply
used below command for rollback
opatch rollback -id 13952743
Thanks
Deva -
I am not able to see the partition tab in Informatica sessions, though we have a full license of Siebel Analytics 7.8.
Do we need to purchase a license separately for partitioning, or is there an option to enable it that I am missing?
Thanks,
Ravish
You need the [OBIA forum|http://forums.oracle.com/forums/forum.jspa?forumID=410]
-
Is it possible to install OBIEE (Siebel Analytics) & Hyperion on the same machine
Hi All,
In my standalone system I have Win2003 Server and WinXP in separate partitions on separate drives (like C: & D:).
My requirement is to install Hyperion 11.1.1 Essbase and Planning on Win2003 Server and OBIEE (Siebel Analytics) 10x on WinXP.
Is it possible to install these two technologies on the same standalone machine with separate OSes on separate drives? My machine has 1GB of RAM, an 80GB hard disk and a 2.4GHz processor.
Please suggest something regarding this doubt.
Thanks in advance
We will not use the two servers at the same time, and this machine will not be used for any development, just for demo purposes. So we will use only one thing at a time for getting hands-on; that is why we are trying to install two things on the same machine.
What we are planning exactly: the C: drive will have Win2003 Server with 10g or SQL Server as the database and the .NET framework for the Hyperion 11.1.1 Essbase and Planning installations,
and on the other side OBIEE, which is compatible with XP and needs a 10g database and JDK 1.5.0, on the D: drive.
Of these two technologies the Hyperion installations are important to us, and OBIEE only if possible. This is the exact requirement and environment. Let me know if anything is wrong or we are planning over the limits.
The machine has 1.5GB of RAM, an 80GB hard disk and a 2.4GHz processor.
Thanks in Advance -
Does the PARTITION BY clause increase performance compared to a normal GROUP BY clause?
thanks,
Raj.
Analytic queries != aggregate queries
Therefore it depends on what you're doing as to whether you'd want to use Analytic vs Aggregate queries.
Performance depends also on the amount of data in your tables.
Having said that, analytics can mean that a self-join is no longer needed, and this could help if a large table is involved ... on the other hand, it might hinder.
Think of Analytic queries as another tool in your toolbox; sometimes a hammer is the right tool to use, but sometimes you need a spanner instead.