Analytics Question
I am getting ready to release my first app, which allows users to share content through the web viewer. Am I able to view any metrics on web viewer usage with the out-of-the-box DPS analytics?
thanks!
Hank
Hi Hank,
The web viewer is tracked, but the baseline analytics on the DPS Portal don't separate web viewer usage from native viewer usage. If you have access to Adobe Analytics (SiteCatalyst), you can differentiate the web viewer from the native viewers by the Viewer Version.
Shikha
Similar Messages
-
Pivot not working; Analytics question
I got a lot done this past weekend on a query that is tough for me, but now I have more questions. You must run the top query to understand the second.
Background:
This is a period to date report which includes amounts from beginning of the year as well as last year.
My example Report is for Period 9 which begins on Aug 24, 2009
The period has 5 weeks (most periods have 4).
Also, if the period is requested in Week 3, then only weeks 1 and 2 will be reflected on the report.
The report looks like this (the data provided should produce these numbers, except maybe for the YTD column):
WEEK1 WEEK2 WEEK3 WEEK4 WEEK5 Period-to-date Year-to-date
Net - Landau 11,485.79 11,416.60 11,609.01 11,049.76 12,867.10 58,428.00 454,231.37
Net - AW 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Net - LJS 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Net - TB 7,118.17 7,228.13 7,657.94 7,699.53 7,958.53 37,662.00 306,115.59
Total Net 18,603.96 18,644.73 19,266.95 18,749.29 20,825.63 96,091.00 760,346.96
Last Year Sales 23,515.95 24,244.37 23,962.74 23,134.79 24,440.87 119,299.00 856,363.36
Increase (4,911.99) (5,599.64) (4,695.79) (4,385.50) (3,615.24) (23,208.00) (96,016.40)
Last year
Next Week 24,244.37 23,962.74 23,134.79 24,440.87 23,055.87 118,839.00 879,419.23
--===== Current Year Dates
Beginning of period: (BOP) Mon Aug 24
Week 1: Mon Aug 24 - Aug 30
Week 2: Mon Aug 31 - Sept 6
Week 3: Mon Sep 7 - Sept 13
Week 4: Mon Sept 14 - Sept 20
Week 5: Mon Sept 21 - Sept 27
Beginning of fiscal year (BOY) = '28-Dec-08'
--===== Last Year Dates
Beginning of period: (BOP_LY) Mon Aug 25
Week 1: Mon Aug 25 - Aug 31
Week 2: Mon Sep 1 - Sept 7
Week 3: Mon Sep 8 - Sept 14
Week 4: Mon Sept 15 - Sept 21
Week 5: Mon Sept 22 - Sept 28
Beginning of fiscal year (BOY) = '31-Dec-07'
My challenge over the weekend was to get the entire year of data vs. just the period data.
There are 7 columns on this report: 5 weeks of the period, PeriodToDate (total of the weeks), and YearToDate (PeriodToDate + the sum of all data from the beginning of the year to the end of the previous period).
I'm not really concerned with the PTD; the program can handle that. I got the BOY date with the following code:
All data is summed for the week and grouped by storeid and TRUNC(date, 'IW'), which is the Monday of each week. (The resultset contains data for 2 stores instead of 1. This is there only to make sure my query was filtering correctly.)
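For anyone unfamiliar with it, TRUNC(date, 'IW') snaps a date back to the Monday of its ISO week. A one-line Python equivalent (a sketch of the behavior, not Oracle's implementation):

```python
from datetime import date, timedelta

def trunc_iw(d: date) -> date:
    """Return the Monday of d's week, like Oracle's TRUNC(d, 'IW')."""
    return d - timedelta(days=d.weekday())  # weekday(): Monday == 0

# A Thursday busi_date from the sample data collapses to its week's Monday:
print(trunc_iw(date(2009, 8, 27)))  # -> 2009-08-24
```

This is why every busi_date in a given week lands on the same group key.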
drop table my_csh_main;
CREATE TABLE MY_CSH_MAIN
( "FK_STR_MAIN_ID" NUMBER,
"BUSI_DATE" DATE,
"CONF_NUMB2" NUMBER,
"CONF_NUMB49" NUMBER,
"CONF_NUMB44" NUMBER,
"CONF_NUMB3" NUMBER,
"CONF_NUMB4" NUMBER,
"CONF_NUMB38" NUMBER,
"CONF_NUMB56" NUMBER
);
REM INSERTING into MY_CSH_MAIN
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('28-AUG-08','DD-MON-RR'),22103.69,0,0,119,0,4605.21,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('27-AUG-09','DD-MON-RR'),18081.37,0,0,0,0,3533.45,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('17-SEP-09','DD-MON-RR'),18211.29,0,0,0,0,3806.32,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('04-SEP-08','DD-MON-RR'),24244.37,0,0,284.94,0,0,9395.63);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('03-SEP-09','DD-MON-RR'),18644.73,0,0,85.48,0,0,7228.13);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('24-SEP-09','DD-MON-RR'),16809.21,0,0,64.99,0,3014.61,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('25-SEP-08','DD-MON-RR'),24440.87,0,0,0,0,0,9469.64);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('28-AUG-08','DD-MON-RR'),23515.95,0,0,0,80.38,0,9379.9);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('24-SEP-09','DD-MON-RR'),20825.63,0,0,73.97,0,0,7958.53);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('17-SEP-09','DD-MON-RR'),18749.29,0,0,0,0,0,7699.53);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('11-SEP-08','DD-MON-RR'),22839.3,0,0,206.39,116.74,4493.28,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('04-SEP-08','DD-MON-RR'),22627.74,0,0,279.98,0,4423.83,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('27-AUG-09','DD-MON-RR'),18603.96,0,0,81.25,0,0,7118.17);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('11-SEP-08','DD-MON-RR'),23962.74,0,0,153.1,0,0,9335.35);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('18-SEP-08','DD-MON-RR'),23134.79,0,0,44.12,0,0,8978.87);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('25-SEP-08','DD-MON-RR'),24950.45,0,0,129.98,0,5330.22,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('10-SEP-09','DD-MON-RR'),19266.95,0,0,0,0,0,7657.94);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('03-SEP-09','DD-MON-RR'),17183.25,0,0,0,0,3487.12,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('18-SEP-08','DD-MON-RR'),21372.82,0,0,0,0,4546.15,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('10-SEP-09','DD-MON-RR'),17688.41,0,0,113.12,0,3424.17,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('31-DEC-08','DD-MON-RR'),611016.24,0,0,1276.62,724.96,122236.02,0);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('31-DEC-08','DD-MON-RR'),667612.63,0,0,1018.81,0,0,269777.87);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (141,to_date('02-JAN-08','DD-MON-RR'),1676737.13,0,0,5652.47,3850.68,345971.1,500.5);
Insert into MY_CSH_MAIN (FK_STR_MAIN_ID,BUSI_DATE,CONF_NUMB2,CONF_NUMB49,CONF_NUMB44,CONF_NUMB3,CONF_NUMB4,CONF_NUMB38,CONF_NUMB56) values (221,to_date('02-JAN-08','DD-MON-RR'),1786451.11,0,0,3167.61,175.38,0,788438.73);
CREATE TABLE LANDAU_REPORT_STORES
( "COMPANYNAME" CHAR(40 BYTE),
"DISTRICTNAME" VARCHAR2(50 BYTE),
"STOREID" NUMBER(4,0) NOT NULL ENABLE,
"STORENAME" VARCHAR2(70 BYTE),
"STORENBR" CHAR(4 BYTE) NOT NULL ENABLE
);
REM INSERTING into LANDAU_REPORT_STORES
Insert into LANDAU_REPORT_STORES (COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 10',64,'N_Main','0004');
Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 10',65,'Belvidere','0005');
Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 10',104,'Mulford','0032');
Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 50',141,'Charleston','0043');
Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 10',61,'Kilburn','0002');
Insert into LANDAU_REPORT_STORES(COMPANYNAME,DISTRICTNAME,STOREID,STORENAME,STORENBR) values ('Landau ','DIST 10',62,'11th_St','0003');
with WeeklyTotals as
( select StoreId
, week_date
, net_sales
, ljs_food_sales
, total_drink_sales
, net_food
, CASE
WHEN net_food is null then 0
WHEN net_food = 0 then 0
ELSE Ljs_food_sales / net_food
END as LJS_Percent
, CASE
WHEN net_food is null then 0
WHEN net_food = 0 then 0
else total_drink_sales * (ljs_food_sales / net_food)
END as LJS_Drinks
, aw_sales
, tb_sales
FROM (Select fk_str_main_id as StoreId,
trunc(csh.busi_date, 'IW') as week_date,
sum(csh.conf_numb2) As net_sales,
sum(conf_numb49) as ljs_food_sales,
sum(conf_numb44) As total_drink_sales,
sum(csh.conf_numb2) - sum(CONF_NUMB44) - sum(conf_numb3) - sum(conf_numb4) AS net_food,
sum(conf_numb38) As aw_sales,
sum(conf_numb56) as tb_sales
from my_csh_main csh
WHERE BUSI_DATE BETWEEN TO_DATE( '28-Dec-08' ,'DD-MON-YY') AND TO_DATE( '27-SEP-09' ,'DD-MON-YY')
and fk_str_main_id in (141, 221)
GROUP BY CSH.FK_STR_MAIN_ID, trunc(csh.busi_date, 'IW')))
, WeeklyFoodSalesLY as
( SELECT fk_str_main_id AS storeid
, TRUNC(busi_date, 'iw') week_date
, SUM(csh.conf_numb2) AS net_sales
FROM my_csh_main csh
WHERE busi_date BETWEEN to_date('31-DEc-07', 'dd-Mon-yy') and to_date('28-Sep-08', 'dd-Mon-yy')
GROUP BY fk_str_main_id, TRUNC(busi_date, 'iw'))
, StoreDetails AS
( select * from LANDAU_REPORT_STORES where STORENAME NOT like '%CLOSED%' order by DistrictNAme, StoreNbr )
Select
foods.storeid
, CASE
WHEN Foods.week_date = to_date('24-AUG-09', 'dd-Mon-yy') then 1
WHEN Foods.week_date = to_date('24-AUG-09', 'dd-Mon-yy') + 7 then 2
WHEN Foods.week_date = to_date('24-AUG-09', 'dd-Mon-yy') + 14 then 3
WHEN Foods.week_date = to_date('24-AUG-09', 'dd-Mon-yy') + 21 then 4
WHEN Foods.week_date = to_date('24-AUG-09', 'dd-Mon-yy') + 28 then 5
ELSE 0
END as WeekNBr -- week_date is already a DATE; wrapping it in to_date forces an implicit conversion
, foods.week_date as CurrWeekDate
, foodsLY.week_date as LastYearWEekDate
, ROUND(NVL(foods.net_sales - (aw_sales + tb_sales + ljs_drinks + ljs_food_sales ), 0), 2) as Landau_Net_Sales
, ROUND(NVL(aw_sales, 0), 2) as aw_sales
, ROUND(NVL(tb_sales, 0), 2) as tb_sales
, ROUND(NVL(ljs_drinks + ljs_food_sales, 0),2) as ljs_net_sales
, ROUND(NVL(foods.Net_Sales, 0), 2) as net_sales
--==
-- Last Year Sales and Last Year Next Year Sales
--==
, ROUND(NVL(foodsLY.Net_Sales, 0),2) as WTDLYNetSales
, ROUND(NVL(Foods.Net_Sales - FoodsLY.Net_Sales, 0),2) as WTDSalesIncrease
-- , ROUND(NVL(FoodsLY_NextWeek.Net_Sales, 0), 2) as WTDFoodsSalesLY
, stores.*
from WEeklyTotals Foods
LEFT OUTER JOIN Weeklyfoodsalesly foodsly ON foodsly.storeid = foods.storeid
AND foodsly.week_date = DECODE(foods.week_date,
to_date('24-AUG-09', 'dd-Mon-yy') , to_date('25-AUG-08', 'dd-Mon-yy') ,
to_date('24-AUG-09', 'dd-Mon-yy') + 7, to_date('25-AUG-08', 'dd-Mon-yy') + 7,
to_date('24-AUG-09', 'dd-Mon-yy') + 14, to_date('25-AUG-08', 'dd-Mon-yy') + 14,
to_date('24-AUG-09', 'dd-Mon-yy') + 21, to_date('25-AUG-08', 'dd-Mon-yy') + 21,
to_date('24-AUG-09', 'dd-Mon-yy') + 28, to_date('25-AUG-08', 'dd-Mon-yy') + 28)
LEFT OUTER JOIN StoreDetails stores ON stores.storeid = Foods.storeid;
This all works, with one exception. In a WITH clause I get a recordset containing all of last year's data, but I could not figure out the pivot to get the "Last Year Next Week" value; that means next week, one year ago. I pulled this snippet out to test with, but couldn't make it work:
with WeeklyFoodSalesLY as
( SELECT fk_str_main_id AS storeid
, TRUNC(busi_date, 'iw') week_date
, sum(conf_numb2) conf_numb2
FROM my_csh_main csh
WHERE busi_date BETWEEN to_date('31-dec-07', 'dd-Mon-yy') and to_date('28-Sep-08', 'dd-Mon-yy') + 7
and fk_str_main_id = 141
GROUP BY fk_str_main_id , TRUNC(busi_date, 'iw'))
, sales_ly as
( SELECT storeid
, week_date
, sum(conf_numb2) as net_sales
from WeeklyFoodSalesLY
WHERE week_date BETWEEN to_date('25-Aug-08', 'dd-Mon-yy') and to_date('28-Sep-08', 'dd-Mon-yy')
GROUP BY storeid , week_date
UNION ALL
SELECT storeid
, week_date
, sum(conf_numb2) as net_sales
from WeeklyFoodSalesLY
WHERE week_date BETWEEN to_date('25-Aug-08', 'dd-Mon-yy') + 7 and to_date('28-Sep-08', 'dd-Mon-yy') + 7
GROUP BY storeid , week_date
UNION ALL
SELECT storeid
, week_date
, sum(conf_numb2) as net_sales
from WeeklyFoodSalesLY
WHERE week_date < to_date('25-Aug-08', 'dd-Mon-yy')
GROUP BY storeid , week_date )
, pivoted_sales_ly as
( SELECT storeid
, week_date
, MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy'), net_sales)) as lastyear
, MAX(DECODE(week_date, to_date('25-Aug-08', 'dd-Mon-yy') + 7, net_sales)) as lastyearnextweek
, MAX(DECODE(week_date, to_date('31-dec-07', 'dd-Mon-yy'), net_sales)) as lastyeartotal
from sales_ly
GROUP BY storeid, week_date )
Select *
from pivoted_sales_ly;
What am I doing wrong here? Once I get this working, I can fold it back into the original query.
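For what it's worth, the MAX(DECODE(...)) pattern is just conditional aggregation, and one likely culprit in the snippet is the GROUP BY: grouping by storeid and week_date keeps one row per week, so the pivot columns never collapse onto a single row per store. A rough Python sketch of the intended collapse, grouping by storeid only (row values taken from the storeid 221 sample above):

```python
from datetime import date, timedelta

# (storeid, week_date, net_sales) rows, as the sales_ly CTE would return them
rows = [
    (221, date(2008, 8, 25), 23515.95),  # week 1, last year
    (221, date(2008, 9, 1), 24244.37),   # week 2 = "next week", one year ago
]

week1 = date(2008, 8, 25)                # to_date('25-Aug-08', 'dd-Mon-yy')
pivoted = {}                             # keyed by storeid ONLY -> one row per store
for storeid, week_date, net_sales in rows:
    cols = pivoted.setdefault(storeid, {})
    if week_date == week1:               # MAX(DECODE(week_date, week1, net_sales))
        cols["lastyear"] = net_sales
    if week_date == week1 + timedelta(days=7):  # DECODE(..., week1 + 7, ...)
        cols["lastyearnextweek"] = net_sales
```

Grouping by storeid and week_date, as the snippet does, would instead leave one row per week with each pivot column non-null on only one of them.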
Analytics:
Boneist gave me some code to try last week:
Combining refCursors and Summing
that created the PTD column.
I could never get it to work because I was tweaking the query so much. But is there a way to get the PTD without having to add an extra PTD column for each week?
One other thing:
I would like to know how to use analytics to roll up the report by division into a different cursor.
Edited by: TheHTMLDJ on Oct 26, 2009 4:50 AM
Edited by: TheHTMLDJ on Oct 26, 2009 4:59 AM
TheHTMLDJ wrote:
"The report looks like this (the data provided should produce these numbers, except maybe for the YTD column)."
With the data you provided, I managed to produce the expected output for storeid 221, barring the YTD column, which had the wrong values. I had to make up the "next week" row for the previous year for storeid 221, btw:
WITH weekly_values AS (SELECT fk_str_main_id as storeid,
TRUNC(csh.busi_date, 'IW') AS week_date,
NVL(SUM(csh.conf_numb2), 0) AS net_sales,
NVL(SUM(conf_numb49), 0) AS ljs_food_sales,
NVL(SUM(conf_numb44), 0) AS total_drink_sales,
NVL(SUM(csh.conf_numb2) - SUM(conf_numb44)
- SUM(conf_numb3)
- SUM(conf_numb4), 0) AS net_food,
NVL(SUM(conf_numb38), 0) AS aw_sales,
NVL(SUM(conf_numb56), 0) AS tb_sales
FROM my_csh_main csh
WHERE (busi_date BETWEEN TRUNC(:v_bop, 'iy') AND TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks - 1) + 6
OR
busi_date BETWEEN TRUNC(:v_bop_ly, 'iy') AND TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks + 6)
AND fk_str_main_id IN (141, 221)
GROUP BY csh.fk_str_main_id,
TRUNC(csh.busi_date, 'IW')),
weekly_totals AS (SELECT storeid,
week_date,
net_sales,
ljs_food_sales,
total_drink_sales,
net_food,
CASE WHEN nvl(net_food, 0) = 0 then 0
ELSE ljs_food_sales / net_food
END AS ljs_percent,
CASE WHEN nvl(net_food, 0) = 0 then 0
ELSE total_drink_sales * (ljs_food_sales / net_food)
END AS ljs_drinks,
aw_sales,
tb_sales,
net_sales - aw_sales
- tb_sales
- ljs_food_sales
- CASE WHEN nvl(net_food, 0) = 0 then 0
ELSE total_drink_sales * (ljs_food_sales / net_food)
END AS landau_net_sales
FROM weekly_values),
week_tots_ytd AS (SELECT storeid,
week_date,
net_sales,
ljs_food_sales,
total_drink_sales,
net_food,
ljs_drinks,
aw_sales,
tb_sales,
landau_net_sales,
SUM(decode(week_date, TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks, NULL, landau_net_sales)) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) lns_ytd,
SUM(decode(week_date, TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks, NULL, aw_sales)) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) aws_ytd,
SUM(decode(week_date, TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks, NULL, ljs_food_sales)) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) ljsf_ytd,
SUM(decode(week_date, TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks, NULL, tb_sales)) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) tbs_ytd,
SUM(decode(week_date, TRUNC(:v_bop_ly, 'iw') + 7*:v_num_weeks, NULL, net_sales)) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) net_ytd,
SUM(net_sales) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')) full_net_ytd,
SUM(landau_net_sales) OVER (PARTITION BY storeid,
CASE WHEN week_date between TRUNC(:v_bop, 'iw') and TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks -1) THEN 1
WHEN week_date between TRUNC(:v_bop_ly, 'iw') and TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks -1) THEN 2
END) lns_ptd,
SUM(aw_sales) OVER (PARTITION BY storeid,
CASE WHEN week_date between TRUNC(:v_bop, 'iw') and TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks -1) THEN 1
WHEN week_date between TRUNC(:v_bop_ly, 'iw') and TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks -1) THEN 2
END) aws_ptd,
SUM(ljs_food_sales) OVER (PARTITION BY storeid,
CASE WHEN week_date between TRUNC(:v_bop, 'iw') and TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks -1) THEN 1
WHEN week_date between TRUNC(:v_bop_ly, 'iw') and TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks -1) THEN 2
END) ljsf_ptd,
SUM(tb_sales) OVER (PARTITION BY storeid,
CASE WHEN week_date between TRUNC(:v_bop, 'iw') and TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks -1) THEN 1
WHEN week_date between TRUNC(:v_bop_ly, 'iw') and TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks -1) THEN 2
END) tbs_ptd,
SUM(net_sales) OVER (PARTITION BY storeid,
CASE WHEN week_date between TRUNC(:v_bop, 'iw') and TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks -1) THEN 1
WHEN week_date between TRUNC(:v_bop_ly, 'iw') and TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks-1) THEN 2
END) net_ptd,
LEAD(net_sales) OVER (PARTITION BY storeid,
trunc(week_date, 'iy')
ORDER BY week_date) next_week_net_sales
FROM weekly_totals),
foods AS (Select storeid,
CASE WHEN week_date in (:v_bop, :v_bop_ly) then 1
WHEN week_date in (:v_bop + 7, :v_bop_ly + 7) then 2
WHEN week_date in (:v_bop + 14, :v_bop_ly + 14) then 3
WHEN week_date in (:v_bop + 21, :v_bop_ly + 21) then 4
WHEN week_date in (:v_bop + 28, :v_bop_ly + 28) then 5
END AS week_number,
week_date,
ROUND(net_sales - (aw_sales + tb_sales + ljs_drinks + ljs_food_sales), 2) AS landau_net_sales,
ROUND(aw_sales, 2) AS aw_sales,
ROUND(tb_sales, 2) AS tb_sales,
ROUND(ljs_drinks + ljs_food_sales, 2) AS ljs_net_sales,
ROUND(net_sales, 2) AS net_sales,
ROUND(lns_ytd, 2) AS lns_ytd,
ROUND(aws_ytd, 2) AS aws_ytd,
ROUND(ljsf_ytd, 2) AS ljsf_ytd,
ROUND(tbs_ytd, 2) AS tbs_ytd,
ROUND(net_ytd, 2) AS net_ytd,
ROUND(full_net_ytd, 2) AS full_net_ytd,
ROUND(lns_ptd, 2) AS lns_ptd,
ROUND(aws_ptd, 2) AS aws_ptd,
ROUND(ljsf_ptd, 2) AS ljsf_ptd,
ROUND(tbs_ptd, 2) AS tbs_ptd,
ROUND(net_ptd, 2) AS net_ptd,
ROUND(next_week_net_sales, 2) AS next_week_net_sales,
SUM(ROUND(next_week_net_sales, 2)) OVER (PARTITION BY storeid,
TRUNC(week_date, 'iy')) nwns_ptd
FROM week_tots_ytd
WHERE (week_date BETWEEN TRUNC(:v_bop, 'iw') AND TRUNC(:v_bop, 'iw') + 7*(:v_num_weeks - 1)
OR
week_date BETWEEN TRUNC(:v_bop_ly, 'iw') AND TRUNC(:v_bop_ly, 'iw') + 7*(:v_num_weeks - 1))),
pivoted_foods AS (SELECT storeid,
week_number,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), landau_net_sales)) landau_net_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), aw_sales)) aw_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), tb_sales)) tb_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), ljs_net_sales)) ljs_net_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), net_sales)) net_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), lns_ytd)) lns_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), aws_ytd)) aws_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), ljsf_ytd)) ljsf_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), tbs_ytd)) tbs_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), net_ytd)) net_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), full_net_ytd)) full_net_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), lns_ptd)) lns_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), aws_ptd)) aws_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), ljsf_ptd)) ljsf_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), tbs_ptd)) tbs_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop, 'iy'), net_ptd)) net_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop_ly, 'iy'), net_sales)) ly_net_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop_ly, 'iy'), net_ytd)) ly_net_ytd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop_ly, 'iy'), net_ptd)) ly_net_ptd,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop_ly, 'iy'), next_week_net_sales)) ly_next_week_net_sales,
MAX(DECODE(TRUNC(week_date, 'iy'), TRUNC(:v_bop_ly, 'iy'), nwns_ptd)) ly_nwns_ptd
FROM foods
GROUP BY storeid,
week_number),
storedetails AS (SELECT companyname,
districtname,
storeid,
storename,
storenbr
FROM landau_report_stores
WHERE UPPER(storename) NOT LIKE '%CLOSED%'
ORDER BY districtname, storenbr),
dummy AS (SELECT level col1
FROM dual
CONNECT BY level <= 8)
SELECT pf.storeid,
DECODE(dummy.col1, 1, 'Net - Landau',
2, 'Net - AW',
3, 'Net - LJS',
4, 'Net - TB',
5, 'Total Net',
6, 'Last Year Sales',
7, 'Increase',
8, 'Last Year Next Week') category,
SUM(CASE WHEN pf.week_number = 1 AND dummy.col1 = 1 THEN pf.landau_net_sales
WHEN pf.week_number = 1 AND dummy.col1 = 2 THEN pf.aw_sales
WHEN pf.week_number = 1 AND dummy.col1 = 3 THEN pf.ljs_net_sales
WHEN pf.week_number = 1 AND dummy.col1 = 4 THEN pf.tb_sales
WHEN pf.week_number = 1 AND dummy.col1 = 5 THEN pf.net_sales
WHEN pf.week_number = 1 AND dummy.col1 = 6 THEN pf.ly_net_sales
WHEN pf.week_number = 1 AND dummy.col1 = 7 THEN pf.net_sales - pf.ly_net_sales
WHEN pf.week_number = 1 AND dummy.col1 = 8 THEN pf.ly_next_week_net_sales
END) week1,
SUM(CASE WHEN pf.week_number = 2 AND dummy.col1 = 1 THEN pf.landau_net_sales
WHEN pf.week_number = 2 AND dummy.col1 = 2 THEN pf.aw_sales
WHEN pf.week_number = 2 AND dummy.col1 = 3 THEN pf.ljs_net_sales
WHEN pf.week_number = 2 AND dummy.col1 = 4 THEN pf.tb_sales
WHEN pf.week_number = 2 AND dummy.col1 = 5 THEN pf.net_sales
WHEN pf.week_number = 2 AND dummy.col1 = 6 THEN pf.ly_net_sales
WHEN pf.week_number = 2 AND dummy.col1 = 7 THEN pf.net_sales - pf.ly_net_sales
WHEN pf.week_number = 2 AND dummy.col1 = 8 THEN pf.ly_next_week_net_sales
END) week2,
SUM(CASE WHEN pf.week_number = 3 AND dummy.col1 = 1 THEN pf.landau_net_sales
WHEN pf.week_number = 3 AND dummy.col1 = 2 THEN pf.aw_sales
WHEN pf.week_number = 3 AND dummy.col1 = 3 THEN pf.ljs_net_sales
WHEN pf.week_number = 3 AND dummy.col1 = 4 THEN pf.tb_sales
WHEN pf.week_number = 3 AND dummy.col1 = 5 THEN pf.net_sales
WHEN pf.week_number = 3 AND dummy.col1 = 6 THEN pf.ly_net_sales
WHEN pf.week_number = 3 AND dummy.col1 = 7 THEN pf.net_sales - pf.ly_net_sales
WHEN pf.week_number = 3 AND dummy.col1 = 8 THEN pf.ly_next_week_net_sales
END) week3,
SUM(CASE WHEN pf.week_number = 4 AND dummy.col1 = 1 THEN pf.landau_net_sales
WHEN pf.week_number = 4 AND dummy.col1 = 2 THEN pf.aw_sales
WHEN pf.week_number = 4 AND dummy.col1 = 3 THEN pf.ljs_net_sales
WHEN pf.week_number = 4 AND dummy.col1 = 4 THEN pf.tb_sales
WHEN pf.week_number = 4 AND dummy.col1 = 5 THEN pf.net_sales
WHEN pf.week_number = 4 AND dummy.col1 = 6 THEN pf.ly_net_sales
WHEN pf.week_number = 4 AND dummy.col1 = 7 THEN pf.net_sales - pf.ly_net_sales
WHEN pf.week_number = 4 AND dummy.col1 = 8 THEN pf.ly_next_week_net_sales
END) week4,
SUM(CASE WHEN pf.week_number = 5 AND dummy.col1 = 1 THEN pf.landau_net_sales
WHEN pf.week_number = 5 AND dummy.col1 = 2 THEN pf.aw_sales
WHEN pf.week_number = 5 AND dummy.col1 = 3 THEN pf.ljs_net_sales
WHEN pf.week_number = 5 AND dummy.col1 = 4 THEN pf.tb_sales
WHEN pf.week_number = 5 AND dummy.col1 = 5 THEN pf.net_sales
WHEN pf.week_number = 5 AND dummy.col1 = 6 THEN pf.ly_net_sales
WHEN pf.week_number = 5 AND dummy.col1 = 7 THEN pf.net_sales - pf.ly_net_sales
WHEN pf.week_number = 5 AND dummy.col1 = 8 THEN pf.ly_next_week_net_sales
END) week5,
SUM(CASE WHEN pf.week_number = 5 AND dummy.col1 = 1 THEN pf.lns_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 2 THEN pf.aws_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 3 THEN pf.ljsf_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 4 THEN pf.tbs_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 5 THEN pf.net_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 6 THEN pf.ly_net_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 7 THEN pf.net_ptd - pf.ly_net_ptd
WHEN pf.week_number = 5 AND dummy.col1 = 8 THEN pf.ly_nwns_ptd
END) period_to_date,
SUM(CASE WHEN pf.week_number = 5 AND dummy.col1 = 1 THEN pf.lns_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 2 THEN pf.aws_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 3 THEN pf.ljsf_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 4 THEN pf.tbs_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 5 THEN pf.net_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 6 THEN pf.ly_net_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 7 THEN pf.net_ytd - pf.ly_net_ytd
WHEN pf.week_number = 5 AND dummy.col1 = 8 THEN pf.full_net_ytd
END) year_to_date,
stores.companyname,
stores.districtname,
stores.storename,
stores.storenbr
FROM pivoted_foods pf
LEFT OUTER JOIN storedetails stores ON stores.storeid = pf.storeid,
dummy
group by dummy.col1,
DECODE(dummy.col1, 1, 'Net - Landau',
2, 'Net - AW',
3, 'Net - LJS',
4, 'Net - TB',
5, 'Total Net',
6, 'Last Year Sales',
7, 'Increase',
8, 'Last Year Next Week'),
rollup((pf.storeid,
stores.companyname,
stores.districtname,
stores.storename,
stores.storenbr))
order by pf.storeid, dummy.col1;
(Replace all :v_num_weeks, :v_bop and :v_bop_ly references with your variables.) -
I have a Pro account, and have a few questions regarding base analytics:
1) we can see data for which articles are viewed in a folio and how many times, and also data for overlay views... but is there any way in base analytics to see which overlays are being viewed in each article (match them up)?
2) under Articles > Videos, it isn't showing any data for us and yet we have a video in our folio. If you look under All Overlays, it lists Video, and shows data under the Overlay Starts. Why isn't anything showing under Articles > Videos?
3) I assume that 360 refers to all image sequences in the folio? Is there any way to separate image sequences from actual 360s? If not, can we suggest this as a change for the future? I would think a simple checkbox in the Folio Overlay panel in InDesign could designate an actual 360 vs. an image sequence used for something else...
Thanks!
Greg
Hi Greg,
Here are the responses:
1) Currently the base analytics does not provide a way to know which overlays from a specific article are being viewed. If you are a SiteCatalyst customer, you can see a breakdown of overlay starts per article.
2) This is a bug; it should be fixed in the upcoming release.
3) Currently you cannot differentiate 360 from image sequences. Thanks for the feedback, we'll evaluate it. -
Substr/Instr/analytics question
I have a table like this on 9i (soon to be 10g)
CREATE TABLE BASELINE_TESTRUN
( RUNID VARCHAR2(42 BYTE) NOT NULL );
with data like this:
insert into BASELINE_TESTRUN values ('DEV1-XXX-01');
insert into BASELINE_TESTRUN values ('DEV1-XXX-03');
insert into BASELINE_TESTRUN values ('DEV2-XXX-01');
insert into BASELINE_TESTRUN values ('DEV2-XXX-02');
insert into BASELINE_TESTRUN values ('DEV2-XXX-03');
insert into BASELINE_TESTRUN values ('DEV2-XXX-05');
insert into BASELINE_TESTRUN values ('DEV2-XXX-06');
insert into BASELINE_TESTRUN values ('DEV2-XXX-06');
The output should be, per prefix, the next number after the highest trailing number in runid:
DEV1-XXX 04
DEV2-XXX 07
There may be missing numbers and/or duplicates.
This was my query for this. It works, but seems rather clumsy; I would like to have done it in one SQL statement (avoiding the select from (select)):
select runid , max (next_runno) next_runno
from
( select substr(runid,1,instr(runid,'-',1,2)-1) runid,
max(substr(runid,instr(runid,'-',1,2)+1))
over(partition by substr(runid,1,instr(runid,'-',1,2)-1)) + 1 next_runno
from baseline_testrun
where instr(runid,'-',1,2) > 0
---where runid like 'whatever%'
)
group by runid;
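As a cross-check of the expected output, here is the same logic as a short Python sketch. It assumes every runid has the 'PREFIX-XXX-NN' shape of the sample data, so splitting on the last dash matches the instr(runid,'-',1,2) logic; gaps and duplicates fall out naturally because only the maximum per prefix matters.

```python
runids = [
    "DEV1-XXX-01", "DEV1-XXX-03",
    "DEV2-XXX-01", "DEV2-XXX-02", "DEV2-XXX-03",
    "DEV2-XXX-05", "DEV2-XXX-06", "DEV2-XXX-06",
]

next_runno = {}
for rid in runids:
    prefix, _, num = rid.rpartition("-")   # split on the LAST dash
    # keep the largest "next number" seen so far for this prefix
    next_runno[prefix] = max(next_runno.get(prefix, 0), int(num) + 1)

print(next_runno)  # -> {'DEV1-XXX': 4, 'DEV2-XXX': 7}
```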
Alternate solutions would be appreciated... and NO, I did not define that table like this.
I would have preferred a sequence and a key that was not composite (Codd/Date, please forgive the creators of the table).
best regards
Mette
You are grouping your data. In this case an aggregate function could be more helpful than an analytic function.
In 9i you are already able to use the KEEP syntax for aggregates.
Look at this example:
SQL> select substr(runid,1,instr(runid,'-',1,2)-1) group_run,
2 to_number(
3 max(substr(runid, instr(runid,'-',1,2)+1))
4 keep (dense_rank last order by substr(runid, instr(runid,'-',1,2)))
5 )+1 max_run
6 from BASELINE_TESTRUN
7 group by substr(runid,1,instr(runid,'-',1,2)-1);
GROUP_RUN MAX_RUN
DEV1-XXX 4
DEV2-XXX 7
SQL>
Dima's version is better, since it avoids the KEEP syntax, which is not really needed for this special case.
Message was edited by:
Sven W. -
Analytics Report Suite Not working
Hello:
I'd like to set up an analytics report suite for an application account I've created. I don't have Omniture SiteCatalyst, so I want to use the basic analytics suite included in DPS Pro. But when I try to enter a name for the Report Suite, the system asks me to enter a Customer Name to enable the Check Availability button. I've done this before in several other application accounts without any issue.
What could be done in this case?
For analytics-related issues please contact Gold Support directly for assistance. You can find contact information by going to http://digitalpublishing.acrobat.com/ and looking in the bottom middle of the page.
Since this is a very old thread I'm locking it to avoid multiple issues confusing the conversation. If you found this thread via search and have analytics questions either contact Gold Support or start a new thread in the forum for your specific issue.
Neil -
What is the difference between the Analytics and VC forums?
Hi Experts!
I don't know the difference between the Analytics and VC forums on SDN. Of course, Analytics is not Visual Composer.
But I guess that most analytics questions in this Analytics forum will be related to Visual Composer.
What is your opinion?
Hi Young,
Analytics is business-process oriented, and thus its domain is BPX; hence it is placed under the forum for BPX (Business Process Expert).
VC is a tool and it is in technology domain.
First we discuss a scenario based on a particular business process. Then we prepare the detailed analysis, reporting, and actionable analytics this business process would need. Then we use tools like VC to deliver those actionable analytics.
Hope this helps
Regards
Pradip -
JoinFieldValue Between account and opportunity
Hi,
I created two fields with the same name in Account and Opportunity: "number of building".
The need is the following: two fields "number of building" in two objects (Account and Opportunity). The user will fill in "number of building" on the Account, and it must automatically be filled into "number of building" on the Opportunity.
I decided to create a JoinFieldValue in "number of building" opportunity field.
JoinFieldValue('<Account>',[<AccountId>],'<iNombre_btiments_ITAG>')
And I ticked the postdefault box.
The issue is the following: when I fill the "number of building" in the account object, it doesn't fill automatically the "number of building" in the opportunity...
Do I need to create a workflow?
Can someone help me with this issue?
Thank you,
Regards

The solution you describe will work when a new Opportunity is created and associated to an existing Account which already has a value in this field.
Keep in mind that default values only get addressed upon record creation - updating the record (or associated record) will do nothing.
You can use workflow to update the field value later using a Before Modified Record Saved trigger event on Opportunity. In this scenario, every time the Opportunity is updated it could grab the field value from the related Account.
What you CAN'T do is what you describe below: have an update on the Account record trigger a field update on a child record. It just doesn't work that way. Your only option would be web services.
BTW - this is not an Analytics question, so you should post to the Administrator forum. -
7.9.6.1 Financial Analytics - JDE - Extract Dates question
Hi all,
Implementation of 7.9.6.1 Financial Analytics + Procurement and Spend Analytics
OLTP: JDE E9
OLAP DB: Oracle 11g
We were trying to adjust the # of prune days for an incremental load, when we discovered the change was having no effect.
Our situation - JDE
Looking at the parameters in DAC, we found that the incremental condition in the SDEs is:
date >= $$LAST_EXTRACT_JDEDATE
In DAC, this parameter expands to:
TO_NUMBER(TO_CHAR(TO_DATE('@DAC_$$TGT_REFRESH_DATE'), 'DDD'))+(TO_NUMBER(TO_CHAR(TO_DATE('@DAC_$$TGT_REFRESH_DATE'), 'YYYY')) -1900) * 1000
If one keeps digging,
$$TGT_REFRESH_DATE = @DAC_TARGET_REFRESH_TIMESTAMP in custom format, MM/DD/YYYY
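For reference, the JDE "Julian" format that the DAC expression above produces is CYYDDD: the day of the year plus (year - 1900) × 1000. A minimal sketch of the same conversion (illustrative only; `toJdeJulian` is a hypothetical helper name, not part of DAC or JDE):

```javascript
// Convert a JavaScript Date to a JDE "Julian" date (CYYDDD).
// Mirrors the DAC expression: day-of-year + (year - 1900) * 1000.
// toJdeJulian is a hypothetical helper name for illustration.
function toJdeJulian(date) {
  var startOfYear = Date.UTC(date.getUTCFullYear(), 0, 1);
  var dayOfYear = Math.floor((date.getTime() - startOfYear) / 86400000) + 1;
  return dayOfYear + (date.getUTCFullYear() - 1900) * 1000;
}
// e.g. 22 Feb 2011 is day 53 of year 111 (counting from 1900) -> 111053
```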
Compared to EBS
Now, if I look at the $$LAST_EXTRACT_DATE parameter (used in EBS SDEs)
It expands to:
@DAC_SOURCE_PRUNE_REFRESH_TIMESTAMP in custom format, MM/DD/YYYY
Conclusion and question
Obviously the Julian date conversion is required in $$LAST_EXTRACT_JDEDATE, but apparently the prune days are being considered in $$LAST_EXTRACT_DATE and not in $$LAST_EXTRACT_JDEDATE.
An obvious fix is to use @DAC_SOURCE_PRUNE_REFRESH_TIMESTAMP in $$LAST_EXTRACT_JDEDATE, but I don't know if that would have any side effects. I'll test it.
I'll raise a SR with Oracle, but wanted to check if you guys had seen this before.
Thanks, regards.
Alex.
Edited by: Alejandro Rosales on Feb 22, 2011 5:57 PM

Hi Alejandro,
Maybe this has been updated/corrected in 7.9.6.2. Later today I will be in the office and can check in a VM image.
I'll update the thread as soon as I have checked this out.
Regards,
Marco Siliakus -
I'm a Captivate user and I wanted to get a better understanding of the Analytics Dashboard so I've downloaded the Presenter 9 trial. I've enabled analytics as well as reporting for a SCORM 2004 environment, published a file with multiple questions, and uploaded the zip SCO into my LMS as well as into SCORM cloud.
When I launch the module I'm presented with the user login prompt, but when I enter my information and click Sign In I get an error message "Error connecting to server, try again." If I continually try to login it will eventually move into the content.
I've also tried launching the content without the SCORM reporting turned on and running the module from a web server, but I still get at least one error message about being unable to connect to the server.
In all three cases, the data from the attempts isn't updated in the learning dashboard. Since I haven't found any information about how this functions I'm guessing that either the data doesn't get updated dynamically or that the error messages are causing the problem.
Can anyone enlighten me on these issues?

Hello,
Sorry to know that you are facing problems with Analytics dashboard in Presenter.
Can you tell us if you could connect to als.adobe.com when you were getting this message?
Our server was down for a few hours yesterday for maintenance, and all existing customers were informed of this.
Can you please retry your workflow, as all services are up now? Let us know if it works for you.
If it does not, you can email us at mvlele(@)adobe(dot)com and we will do our best to resolve your issue at the earliest.
Regards,
Mukul -
OIC Analytics setup question.
We are implementing OIC Analytics and have some architectural questions; we would appreciate it if someone could answer them. So far, here is what we have done.
1) We are going to have the OIC Datamart in a separate DB, so we have created the master/work repositories & target tables there.
2) Installed ODI Studio and imported all the scenarios (Oracle hasn't provided the interfaces/models, so we can't dig into the details of the process).
3) Established a DB link from the OIC datamart to OIC-EBS.
Now, the thing which is not clear to me is the staging table cn_s_quota_fact_source and how it gets populated. Oracle has given the script and, in the description, has asked to create it in the EBS APPS schema.
1) But what if I want to create this table on the target server? Can I do it, and what scenario populates it?
2) Oracle document E16110-02 says to run the "Incentive Compensation Analytics - ODI" concurrent program, which will push transactional data to the datamart. But what if I want to PULL instead of push? Is it possible, and if yes, then how?
I have gone through the Oracle Incentive Compensation Analytics for Oracle Data Integrator User and Implementation Guide (Part Number E16110-02), but it doesn't answer all the questions. I am looking for someone who has real-world experience implementing OIC Analytics with ODI.
Please share your experience.
Thanks & Regards

Dear Pahadia,
could you please send me the installation document of OIC Analytics? My mail id is [email protected]. Thanks in advance.
Regards -
Adobe Captivate Question Analytics in SCORM ???
I have embedded my captivate module in our LMS. How do I get the analytics from the quiz to sync with SCORM?
When I upload the zipped file for my module it does not work.
Do I need a separate zipped file with all quiz questions?

Is there something I need to fill out or change on this screen? I'm still not sure why it is not reporting the data back to me.
-
Directions Webinar - Followup Questions - Retail Analytics - Oracle Spatial
Thanks to all who joined today's Directions Media webinar, "Retail Analytics in an Enterprise Cloud with Oracle Spatial and Oracle Site Hub". If you have followup questions that we did not have time to address during the live webinar, please post them here. Thanks.
-
Techie Question about iWeb, outbound links, and Google Analytics
Hi,
I have a fairly technical question about using Google Analytics (GA) with iWeb sites to measure clicks on outbound links. Here's the setup:
1) My website's online, and GA is working properly.
2) I don't sell stuff directly, but I do have a "Purchase" page which contains a link to a specific Amazon page.
3) I'm pretty sure I can measure how many people go to the Purchase page using GA, but how do I measure whether visitors click on the Amazon link? It would be interesting to me to be able to gauge what the conversion rate is; that is, how many people actually proceed to purchase the product at Amazon.
I've checked the GA instructions and there's a page about manually tracking clicks on outbound links but it requires some code inputting. I know zero about coding, but I'm pretty sure that in iWeb you can't get access to the source code to input the required code.
Am I wrong about this? And if I am, then does anyone here know how to measure the clickthrough rate?
Anyway, hope you can help. Thanks in advance.

I don't know about the link tracking, but you could have the link take you to a redirect page which automatically sends visitors on to Amazon.com after a preset time, from 0 seconds to whatever you'd like. You can put a counter on that redirect page, like StatCounter. StatCounter can tell you how many visitors you had, where they came from (i.e. country and area), repeat visits, etc.
Tutorials #12 & #13 describe how to put a visible counter on a page but with a redirect page you wouldn't have to bother since it wouldn't be seen. You can get all of your information from your account on the StatCounter site.
The redirect page would be a simple plain text page titled redirect.html and would look like this:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="refresh"
content="0; url=Destination URL">
<title>Redirecting or whatever you'd like to appear at the top of the browser</title>
</head>
<body>
</body>
</html>
0 is the number of seconds to wait before redirecting the visitor. If set to zero, the visitor would not know they were being redirected. Replace Destination URL with the Amazon URL you want them to go to.
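For what it's worth, the manual outbound-link tracking that the GA instructions mention comes down to a small snippet along these lines (a sketch only, assuming the legacy ga.js asynchronous tracker with its `_gaq` queue; `trackOutboundClick` is a made-up helper name):

```javascript
// Sketch of legacy ga.js outbound-link tracking. Assumes the standard
// asynchronous _gaq tracker queue is already loaded on the page.
// trackOutboundClick is a hypothetical helper name for illustration.
function trackOutboundClick(link) {
  // Record the click as an event: category, action, label (the URL).
  _gaq.push(['_trackEvent', 'Outbound', 'Click', link.href]);
  // Follow the link after a short delay so the hit has time to be sent.
  setTimeout(function () { document.location.href = link.href; }, 100);
  return false; // cancel the browser's default navigation
}
```

It would be attached to the link's onclick, e.g. `<a href="..." onclick="return trackOutboundClick(this);">`. Since iWeb doesn't expose the page source, the redirect-page approach above is the more practical route.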
OT -
Post Questions here-Today's webcast- BI & Spatial & Predictive Analytics
If you have followup questions from today's Directions webinar (Learn How to Use Oracle’s Spatial and BI Tools for Location-aware Predictive Analytics), please post here. Thanks.
If you would like to share your OBIEE and OBIA knowledge and experiences, please submit your proposals below:
http://submissions.miracd.com/ioug2010/login.asp
Collaborate 2010 (April 18-22) will have a special focus on BI => "Get Analytical with BIWA Training Days" -
Extended Analytics Flat File extract question
I have modified the Extended Analytics SDK application as such:
HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_FLATFILE
instead of
HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_STANDARD
I am trying to figure out where the Flat file extract is placed once the Extract is complete. I have verified that I am connecting to HFM and the log file indicates all the steps in the application have completed successfully.
Where does the FLATFILE get saved/output when using the API?
Ultimate goal is to create a flat file through an automated process which can be picked up by a third party application.
thanks for your help,
Chris

Never mind. I found the location on the server.