Showing unique dates with multiple events
I have a series of calendar events that I'm pulling from a
database, and each of them has an associated datestamp. Basically,
I want to be able to compact the dates, so that each unique date is
listed only once. I would like to present them in a format that
looks something like this:
Nov. 23: Pumpkin Festival
Comic Book Convention
Nov. 24: Woodworker's Presentation
Gallery Showing
Engineering Conference
Nov. 26: Music Festival
Right now, each event has its date printed alongside it, like so:
Nov. 23: Pumpkin Festival
Nov. 23: Comic Book Convention
As you can see, both are on November 23rd.
I've attached the code I'm using to query the database. It pulls
events within a 31-day time period. I've also got variables that
break the datestamp into individual parts, so I just need to figure
out how to show each unique date only once.
> so that each unique date is listed only once
One way is to use the "group" attribute of cfoutput.
In your SELECT, add a calculated field that extracts the date only.
Then ORDER BY that field and "group" by it in your cfoutput. Since
it contains date/time values, you can use DateFormat() to display
the dates in MMM dd format.
select
convert(datetime, convert(varchar, yourDateColumn, 112), 112)
as SortDate,
othercolumns ...
from your_table
order by SortDate, othercolumns
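The "group" behavior boils down to: sort the rows by date, then emit the date only when it changes. As a language-neutral illustration, here is a minimal Python sketch of that idea (the event rows are hypothetical stand-ins for the query result; groupby only merges adjacent rows, which is why the ORDER BY above matters):

```python
from itertools import groupby

# Hypothetical stand-in for the query result: (date, event) rows,
# already sorted by date, as the ORDER BY guarantees.
events = [
    ("Nov. 23", "Pumpkin Festival"),
    ("Nov. 23", "Comic Book Convention"),
    ("Nov. 24", "Woodworker's Presentation"),
    ("Nov. 24", "Gallery Showing"),
    ("Nov. 24", "Engineering Conference"),
    ("Nov. 26", "Music Festival"),
]

lines = []
for day, rows in groupby(events, key=lambda r: r[0]):
    rows = list(rows)
    # Emit the date once, then indent the remaining events under it.
    lines.append(f"{day}: {rows[0][1]}")
    lines.extend(" " * (len(day) + 2) + r[1] for r in rows[1:])

print("\n".join(lines))
```

In cfoutput terms, the body of the groupby loop is what runs once per unique date, and the inner rows are what a nested cfoutput would emit per event.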
Similar Messages
-
Am I able to tag a data point of a spreadsheet that is being created by a datalogging VI such that at the end I have the data with multiple tags which correlate to events during a measurement cycle
My final goal is to take data from a datalogging VI and store it in a spreadsheet with tags that correspond to events in a subVI that controls motor movement. This will let users view all the data and mark the relevant data for analysis. As usual, users want everything, but with conditions.
Sure. What you do is take the numeric value acquired and the tags you want, and build them into an array. So when you write to the spreadsheet, you'll have a 2D array. One thing you have to keep in mind is that all elements of an array have to be of the same type, so if your tags are strings, you'll have to convert your numeric data into strings as well.
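The answer's key point (all elements of one array share a type, so numbers must become strings before they can sit next to string tags) can be sketched in Python; the sample readings and tag names below are hypothetical:

```python
# Hypothetical acquired samples and the event tags to attach to them.
samples = [1.25, 1.31, 1.28]
tags = ["motor start", "steady", "motor stop"]

# All elements of one array must share a type, so format the numeric
# data as strings before pairing each value with its tag in a 2D row.
rows = [[f"{value:.2f}", tag] for value, tag in zip(samples, tags)]

# Each row becomes one tab-delimited spreadsheet line.
spreadsheet = "\n".join("\t".join(row) for row in rows)
print(spreadsheet)
```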
-
Bug report: Website doesn't show flash data with version 11.4
A website I've used for years no longer shows any data with the new version of Flash Player; I see a spinning "loading" wheel instead, which never stops spinning. Others using the site are having the same problem.
I have removed 11.4 and installed 10.3 and all is well again.
Website: gmws.edupage.org/timetable (the school schedule I administer). A timetable should appear, and clicking on "teacher", "class", etc. should give a dropdown list of options. (Try under 10.3 and you will see the correct behavior.)
I've now fixed the timetable so it is using a non-Flash version. To see the problematic behavior, you will now have to navigate to gmws.edupage.org/timetable, and then click on the icon at the top of the page which says "2012-2013 final schedule easily viewable" (this is the non-Flash version). From the dropdown list that appears, choose the Flash version of the timetable, "2012-2013 final schedule (07/27 -...", and you will see the spinning wheel under 11.4 and a proper school schedule under 10.3.
-
Device Storage Details shows Other data with capacity 11.9 GB after updating to OS10.1
After upgrading my Z10 to OS 10.1, device storage shows 11.9 GB of Other data for no reason. I didn't download a lot of apps or games, only the default apps plus a couple of others. The device is very slow, and I have had to restart it more than three times in the last 2 hours.
SmoothRider wrote:
Hello demo501,
Welcome to the forums.
When you are looking at your device storage do you see a button that says "Device Storage Details" ?
As far as I know that is the button you have to press to determine if you have been hit by the Other Data Gnome.
Do you have any information to offer as a solution to get rid of it permanently without wiping your phone? I'm sick of wiping this phone; I have better things to do than babysit my smartphone.
I am astounded at the inability of the BlackBerry community to address this. I am beside myself that BlackBerry corporation hasn't provided adequate documentation about this issue. -
Job with multiple event schedules
Is it possible to create a job with multiple schedules? Can you have multiple schedule names?
DBMS_SCHEDULER.CREATE_JOB (
job_name => 'my_new_job2',
job_type => 'PLSQL_BLOCK',
job_action => 'BEGIN SALES_PKG.UPDATE_SALES_SUMMARY; END;',
schedule_name => 'my_saved_schedule, my_saved_schedule2'); <------------------ like this?
END;
thanks.
I am using Oracle 10g and have installed the file arrival package. I want my job to run when multiple files arrive. I have created the file arrival event schedules. I know I can create chain event steps to respond, but my chain has to be running for the steps to respond to the events. I want the chain (or rather the job that starts the chain) to kick off when 2 or more files arrive.
thanks. -
Fetch multiple rows in report to insert data with multiple rows
Hi Friends
I want to insert employee attendance.
There are 10 employees in the company; they enter their arrival time in the attendance register.
I have two items:
1--p1_att_date
2--p1_status
ATTEN_STATUS is PRESENT by default in the select list.
When I enter a date in the p1_att_date item and press Submit, I want a report generated with the 10 employees so that I can enter in_time in the report; then, when I press Enter, the data for the 10 employees should be inserted into table ABC.
I don't want to use a tabular form for this; specifically, I don't want to use the ADD_ROW option to add attendance for a second employee. Please give me a solution.
Emp_ID ATTEN_DATE ATTEN_STATUS IN_TIME OUT_TIME
101 22-JAN-2009 PRESENT
102 22-JAN-2009 PRESENT
103 22-JAN-2009 PRESENT
104 22-JAN-2009 PRESENT
105 22-JAN-2009 PRESENT
106 22-JAN-2009 PRESENT
107 22-JAN-2009 PRESENT
108 22-JAN-2009 PRESENT
109 22-JAN-2009 PRESENT
110 22-JAN-2009 PRESENT
My table is :-
table Name --ABC
emp_id number;
atten_date date;
atten_status varchar2(12);
in_time timestamp;
out_time timestamp;
How can I do this?
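The multi-row insert itself is straightforward once the form submits the date; here is a minimal sketch using SQLite as a stand-in for the real database (table and column names taken from the post; the employee IDs are assumed from the sample report):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE abc (
        emp_id       INTEGER,
        atten_date   TEXT,
        atten_status TEXT,
        in_time      TEXT,
        out_time     TEXT
    )
""")

att_date = "22-JAN-2009"       # the value submitted in P1_ATT_DATE
emp_ids = range(101, 111)      # assumed: employees 101..110

# One executemany call inserts a PRESENT row per employee,
# leaving in_time/out_time to be filled in afterwards.
conn.executemany(
    "INSERT INTO abc (emp_id, atten_date, atten_status) VALUES (?, ?, 'PRESENT')",
    [(e, att_date) for e in emp_ids],
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM abc").fetchone()[0]
print(count)
```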
Thanks
Manoj
Hi Manoj,
You can create multiple records easily using a single, simple, form. However, you would surely have to enter in the times individually using a tabular form (otherwise, you would have to use the same form 10 times). You do not have to keep the Add Row option on the page - this functionality can be removed.
Andy -
ST06 showing all data with value Zero
Dear Friends,
In transaction code ST06 or OS06, not all data is shown (CPU utilization, file system, etc.); only the Physical memory data appears, and all other fields are zero. SAPOSCOL is running.
So please help: how can I retrieve these fields in ST06?
Thanks in advance.
Regards,
Sachin Jadhav
BASIS
Replace 0 with an empty string in a derived column transformation.
REPLACE(myColumn,"0","")
MCSE SQL Server 2012 - Please mark posts as answered where appropriate. -
Separating data with multiple headers
Hello,
I have a data set that has 6 columns with headers containing information for one cycle of a part (data file is attached). About 200 rows later there is another set of data with 6 columns for cycle 2, preceded by a blank line and a new header, and about 400 rows after that is the data for cycle 5, again with a blank line and a new header. This pattern repeats throughout the data set.
I need a way to separate this data so that I can plot the individual cycles. When I import this data set into DIAdem with the DataPlugin Wizard, it does not recognize the blank lines and new headers; it labels them as "NO VALUE", so there are discontinuities in the data. Is there a way to separate the cycles in this data set in DIAdem?
For example, I would like to have rows 6 thru 215, rows 220 thru 630, rows 635 thru 1046, rows 1051 thru 1462, etc. This way I can plot the individual cycles against the running time.
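If the DataPlugin route stalls, the file can also be split before import; here is a rough Python sketch under the assumption that each cycle is a header line followed by data rows, with cycles separated by blank lines (the sample text below is made up, not taken from the attached file):

```python
def split_cycles(text):
    """Split raw CSV text into cycles wherever a blank line appears,
    dropping each cycle's header line so only data rows remain."""
    cycles, current = [], []
    for line in text.splitlines():
        if line.strip() == "":
            if current:
                cycles.append(current)
                current = []
        else:
            current.append(line)
    if current:
        cycles.append(current)
    return [c[1:] for c in cycles]

sample = "h1,h2\n1,2\n3,4\n\nh1,h2\n5,6\n"
print(split_cycles(sample))
```

Each returned list of rows can then be written to its own file (or channel group) and plotted as one cycle.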
Solved!
Go to Solution.
Attachments:
Sample_Data2.csv 774 KB
Hi wils01,
Here's a DataPlugin I created that loads your posted data file with each cycle in a separate Data Portal Group.
Brad Turpin
DIAdem Product Support Engineer
National Instruments
Attachments:
wils01_Cycles_CSV.uri 9 KB -
LISTCUBE not showing any data with SAPKW70110
Hi all
we just upgraded our system from SAPKW70107 to SAPKW70110 and now do have the following problem:
All InfoCubes containing non-cumulatives don't show any data anymore in transaction LISTCUBE.
All the cubes did work fine before.
What is curious is that some queries based on these cubes are working and at least showing some data (but not the correct data).
We couldn't find anything in the notes so far.
Anything we could do?
Regards
bivision
Hi,
You can use an improvised LISTCUBE to display the data. It can be created with help from the following link:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/70e092a3-9d8a-2d10-6ea5-8846989ad405?QuickLink=index&overridelayout=true
However, you should raise this issue with SAP.
Navesh
Edited by: navsamol on Oct 11, 2011 8:17 PM -
How to autopopulate a text field with unique data from multiple data sets
Hi,
I'm a laboratory manager in charge of a hospital project which will be using PDF forms to send and receive data from our end users across the city. I need help with the last part of our PDF form, specifically with a JS that will do a bit of text-field autopopulation magic. This, unfortunately, is beyond what I have taught myself about PDF JS functionality.
The problem:
I need to provide my end users with a text field containing a set of data [A, B, C, D, E, F ...] and the total items in this set [tot#]. The end user needs this information as part of the implementation of this particular laboratory machine.
The particulars
When the end user asks for an experiment to be run, we must specify some pieces of data to help them interpret the results. These are constructed as panels which contain discrete data elements.
For example - One experiment may use two panels, Panel#1 and Panel #2. Panel #1 includes the items A, B, D, E, Panel #2 includes the items A, B, C, F, G.
Thus, the panels may share some of the same items, but, I only want the unique members to be displayed in the text field. If I make a drop down box or checkboxes with the panels, I want to be able to select the panels that we ran and (in this example) have the text field display only the unique items among all the panels that were used:
text field output = A, B, C, D, E, F, G - 7 total.
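The wanted behavior is just a set union plus a count; here is a small Python sketch with the example panels from above (the PDF JavaScript would follow the same shape):

```python
panel_1 = ["A", "B", "D", "E"]
panel_2 = ["A", "B", "C", "F", "G"]

# Union the selected panels so each item appears only once.
unique_items = sorted(set(panel_1) | set(panel_2))

# Format exactly like the desired text-field output.
output = f"{', '.join(unique_items)} - {len(unique_items)} total"
print(output)
```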
Any assistance from the pdf community would be very much appreciated.
Thanks!
Thanks for that help.
I should have made it more apparent that I'm very new to scripting and I'm not a programmer by trade. I have a few questions before modifying the code you kindly provided.
1) Where should I embed this script? Within the 'selection change' area of my listbox?
2) Can I replace the term 'arr' with the names of the various items in my listbox, or should I put 'arr#' as the output value for each term?
3) Will this script find and display the unique values when a user selects multiple items in my listbox?
4) How does the script know where to output the unique members of the combined set?
I appreciate your patience with me.
ck -
To compare a single date with multiple dates
Hi All,
I am using a FetchXML report in which there are two tables, Opportunity and Activity.
One Opportunity will have multiple Activities.
Opportunity has a field called new_dateofapplication,
and Activity has a field called StartDateActivity, which has multiple values.
I need to compare these dates somehow, as below, but it didn't work:
=iif((Fields!new_dateofapplicationValue.Value)<=(Fields!StartDateActivityValue.Value),sum(Fields!ActivityDurationValue.Value),0)
for e.g
Any help much appreciated
Thanks
Pradnya07
Hi Simran08,
According to your description, you have two tables in your report. Now you want to compare the date in Opportunity table with the date in Activity table, sum all activity hours which their date are later than the start date. Right?
In Reporting Services, if you want to compare two values in an IIF() function without using Lookup/LookupSet, the two data fields need to be in the same scope. Your dates come from different tables, and it seems you have different datasets for those tables. This may be the reason why you can't make your expression work in your report. For comparing data from different datasets, we only have the Lookup/LookupSet functions. However, these functions only work in conditions where the value of one field equals the value of the other field, so we can never use them to judge whether the value of one field is greater or less than the value of the other. So in your scenario, it's impossible to find a solution if your Activity table only has those two columns.
Also, if you get the total hours that way (as posted in your picture), you can't figure out how many of the total hours are spent on each program. But I guess your Activity table is supposed to have some fields showing clients and program. If so, you just need to set groups in your table and get totals in the group footer. It will look like below:
If possible, could you post the dataset and table structure or any other detailed information about your report? That would help us test your case in our local environment and do further analysis. Thank you.
Reference:
LookupSet Function (Report Builder and SSRS)
Best Regards,
Simon Hou
-
How do I Identify Lead and Lag in consecutive dates with multiple values?
I am using:
Oracle SQL Developer (3.0.04)
Build MAin-04.34
Oracle Database 11g
Enterprise Edition 11.2.0.1.0 - 64bit Production
I would like to identify the Leads and Lags based on two groups.
The grouping is that multiple C_ID can be found in a single W_ID,
and multiple D_VAL can be found in a C_ID.
So I would like to identify and mark "Lead" and "Lag" related to the "consecutivedaysc" (this already matches the D_VAL and C_ID very well). For example, W_ID 2004 has C_ID 2059 with a D_VAL of 44 for April 2 & 3; the consecutive days are 2, and I would like to correctly mark April 2 as the Lead and April 3 as the Lag.
Then I would like to mark the "Lead" and "Lag" independent of whether there are multiple D_VAL on the same W_ID.
Example that I am having trouble with:
W_ID 2285 on April 10 for C_ID 7847: I don't understand why I can't get "Lag" instead of a "Lead" related to the consecutivedaysw.
I would like to eventually have the data summarized based on W_ID and a single (non-repeating) dt with Leads and Lags.
table
with t as (
select 4592 U_KEY,0 D_VAL_PRESENT,2004 W_ID,to_date('4/1/2013','mm-dd-yyyy') dt,2059 C_ID, (null) D_VAL,0 GWID,13 GCID,1 CONSECUTIVEDAYSC,1 CONSECUTIVEDAYSW from dual union all
select 4591,1,2004,to_date('4/2/2013','mm-dd-yyyy'),2059,44,1,11,2,13 from dual union all
select 4590,1,2004,to_date('4/3/2013','mm-dd-yyyy'),2059,44,1,11,2,13 from dual union all
select 4589,1,2004,to_date('4/4/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4588,1,2004,to_date('4/5/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4587,1,2004,to_date('4/6/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4586,1,2004,to_date('4/7/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4585,1,2004,to_date('4/8/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4584,1,2004,to_date('4/9/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4583,1,2004,to_date('4/10/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4582,1,2004,to_date('4/11/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4581,1,2004,to_date('4/12/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4580,1,2004,to_date('4/13/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4579,1,2004,to_date('4/14/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 1092,0,2686,to_date('4/1/2013','mm-dd-yyyy'),7210,(null),0,11,3,3 from dual union all
select 3416,0,2686,to_date('4/1/2013','mm-dd-yyyy'),7211,(null),0,11,3,3 from dual union all
select 18118,0,2686,to_date('4/1/2013','mm-dd-yyyy'),17391,(null),0,11,3,3 from dual union all
select 1091,0,2686,to_date('4/2/2013','mm-dd-yyyy'),7210,(null),0,11,3,3 from dual union all
select 3415,0,2686,to_date('4/2/2013','mm-dd-yyyy'),7211,(null),0,11,3,3 from dual union all
select 18117,0,2686,to_date('4/2/2013','mm-dd-yyyy'),17391,(null),0,11,3,3 from dual union all
select 1090,0,2686,to_date('4/3/2013','mm-dd-yyyy'),7210,(null),0,11,3,3 from dual union all
select 3414,0,2686,to_date('4/3/2013','mm-dd-yyyy'),7211,(null),0,11,3,3 from dual union all
select 18116,0,2686,to_date('4/3/2013','mm-dd-yyyy'),17391,(null),0,11,3,3 from dual union all
select 1089,1,2686,to_date('4/4/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3413,1,2686,to_date('4/4/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18115,1,2686,to_date('4/4/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1088,1,2686,to_date('4/5/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3412,1,2686,to_date('4/5/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18114,1,2686,to_date('4/5/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1087,1,2686,to_date('4/6/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3411,1,2686,to_date('4/6/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18113,1,2686,to_date('4/6/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1086,1,2686,to_date('4/7/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3410,1,2686,to_date('4/7/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18112,1,2686,to_date('4/7/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1085,1,2686,to_date('4/8/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3409,1,2686,to_date('4/8/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18111,1,2686,to_date('4/8/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1084,1,2686,to_date('4/9/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3408,1,2686,to_date('4/9/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18110,1,2686,to_date('4/9/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1083,1,2686,to_date('4/10/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3407,1,2686,to_date('4/10/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18109,1,2686,to_date('4/10/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1082,1,2686,to_date('4/11/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3406,1,2686,to_date('4/11/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18108,1,2686,to_date('4/11/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1081,1,2686,to_date('4/12/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3405,1,2686,to_date('4/12/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18107,1,2686,to_date('4/12/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1080,1,2686,to_date('4/13/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3404,1,2686,to_date('4/13/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18106,1,2686,to_date('4/13/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1079,1,2686,to_date('4/14/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3403,1,2686,to_date('4/14/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18105,1,2686,to_date('4/14/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 17390,1,3034,to_date('4/1/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17389,1,3034,to_date('4/2/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17388,1,3034,to_date('4/3/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17387,1,3034,to_date('4/4/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17386,1,3034,to_date('4/5/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 7305,1,3034,to_date('4/6/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17385,1,3034,to_date('4/6/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14123,1,3034,to_date('4/6/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 17384,1,3034,to_date('4/7/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17383,1,3034,to_date('4/8/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 7302,1,3034,to_date('4/9/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17382,1,3034,to_date('4/9/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14120,1,3034,to_date('4/9/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7301,1,3034,to_date('4/10/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17381,1,3034,to_date('4/10/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14119,1,3034,to_date('4/10/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7300,1,3034,to_date('4/11/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17380,1,3034,to_date('4/11/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14118,1,3034,to_date('4/11/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7299,1,3034,to_date('4/12/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17379,1,3034,to_date('4/12/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14117,1,3034,to_date('4/12/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7298,1,3034,to_date('4/13/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17378,1,3034,to_date('4/13/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14116,1,3034,to_date('4/13/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7297,1,3034,to_date('4/14/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17377,1,3034,to_date('4/14/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14115,1,3034,to_date('4/14/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual
)
The script that I am using:
select
t.*,
case
when lag(dt) over(partition by c_id, d_val order by dt, u_key)+1 = dt
then 'Lag'
when lead(dt) over(partition by c_id, d_val order by dt, u_key)-1 = dt
then 'Lead_1'
when consecutivedaysc = 1
then 'Lead_3'
else 'wrong'
end LeadLagD_VAL,
case
when lag(dt) over(partition by w_id, c_id, d_val_present, gwid order by dt)+1 = dt
then 'Lag'
when lead(dt) over(partition by w_id, c_id, d_val_present, gwid order by dt)-1 = dt
then 'Lead_A'
when consecutivedaysw = 1
then 'Lead_B'
else 'wrong'
end Lead_Lag2
from t
order by
W_ID,
dt asc,
C_ID asc
;
The results should look like this (but I'm having issues):
u_key D_VAL_PRESENT W_ID C_ID DT D_VAL GWID GCID CONSECUTIVEDAYSC CONSECUTIVEDAYSW LEADLAGD_VAL LEAD_LAG2
4592 0 2004 2059 01-APR-13 0 13 1 1 Lead_1 Lead_A
4591 1 2004 2059 02-APR-13 44 1 11 2 13 Lead_1 Lead_A
4590 1 2004 2059 03-APR-13 44 1 11 2 13 Lag Lag
4589 1 2004 2059 04-APR-13 389 1 0 11 13 Lead_1 Lag
4588 1 2004 2059 05-APR-13 389 1 0 11 13 Lag Lag
4587 1 2004 2059 06-APR-13 389 1 0 11 13 Lag Lag
4586 1 2004 2059 07-APR-13 389 1 0 11 13 Lag Lag
4585 1 2004 2059 08-APR-13 389 1 0 11 13 Lag Lag
4584 1 2004 2059 09-APR-13 389 1 0 11 13 Lag Lag
4583 1 2004 2059 10-APR-13 389 1 0 11 13 Lag Lag
4582 1 2004 2059 11-APR-13 389 1 0 11 13 Lag Lag
4581 1 2004 2059 12-APR-13 389 1 0 11 13 Lag Lag
4580 1 2004 2059 13-APR-13 389 1 0 11 13 Lag Lag
4579 1 2004 2059 14-APR-13 389 1 0 11 13 Lag Lag
1092 0 2686 7210 01-APR-13 0 11 3 3 Lead_1 Lead_A
3416 0 2686 7211 01-APR-13 0 11 3 3 Lead_1 Lead_A
18118 0 2686 17391 01-APR-13 0 11 3 3 Lead_1 Lead_A
1091 0 2686 7210 02-APR-13 0 11 3 3 Lag Lag
3415 0 2686 7211 02-APR-13 0 11 3 3 Lag Lag
18117 0 2686 17391 02-APR-13 0 11 3 3 Lag Lag
1090 0 2686 7210 03-APR-13 0 11 3 3 Lag Lag
3414 0 2686 7211 03-APR-13 0 11 3 3 Lag Lag
18116 0 2686 17391 03-APR-13 0 11 3 3 Lag Lag
1089 1 2686 7210 04-APR-13 51 9 0 11 11 Lead_1 Lead_A
3413 1 2686 7211 04-APR-13 51 9 0 11 11 Lead_1 Lead_A
18115 1 2686 17391 04-APR-13 51 9 0 11 11 Lead_1 Lead_A
1088 1 2686 7210 05-APR-13 51 9 0 11 11 Lag Lag
3412 1 2686 7211 05-APR-13 51 9 0 11 11 Lag Lag
18114 1 2686 17391 05-APR-13 51 9 0 11 11 Lag Lag
1087 1 2686 7210 06-APR-13 51 9 0 11 11 Lag Lag
3411 1 2686 7211 06-APR-13 51 9 0 11 11 Lag Lag
18113 1 2686 17391 06-APR-13 51 9 0 11 11 Lag Lag
1086 1 2686 7210 07-APR-13 51 9 0 11 11 Lag Lag
3410 1 2686 7211 07-APR-13 51 9 0 11 11 Lag Lag
18112 1 2686 17391 07-APR-13 51 9 0 11 11 Lag Lag
1085 1 2686 7210 08-APR-13 51 9 0 11 11 Lag Lag
3409 1 2686 7211 08-APR-13 51 9 0 11 11 Lag Lag
18111 1 2686 17391 08-APR-13 51 9 0 11 11 Lag Lag
1084 1 2686 7210 09-APR-13 51 9 0 11 11 Lag Lag
3408 1 2686 7211 09-APR-13 51 9 0 11 11 Lag Lag
18110 1 2686 17391 09-APR-13 51 9 0 11 11 Lag Lag
1083 1 2686 7210 10-APR-13 51 9 0 11 11 Lag Lag
3407 1 2686 7211 10-APR-13 51 9 0 11 11 Lag Lag
18109 1 2686 17391 10-APR-13 51 9 0 11 11 Lag Lag
1082 1 2686 7210 11-APR-13 51 9 0 11 11 Lag Lag
3406 1 2686 7211 11-APR-13 51 9 0 11 11 Lag Lag
18108 1 2686 17391 11-APR-13 51 9 0 11 11 Lag Lag
1081 1 2686 7210 12-APR-13 51 9 0 11 11 Lag Lag
3405 1 2686 7211 12-APR-13 51 9 0 11 11 Lag Lag
18107 1 2686 17391 12-APR-13 51 9 0 11 11 Lag Lag
1080 1 2686 7210 13-APR-13 51 9 0 11 11 Lag Lag
3404 1 2686 7211 13-APR-13 51 9 0 11 11 Lag Lag
18106 1 2686 17391 13-APR-13 51 9 0 11 11 Lag Lag
1079 1 2686 7210 14-APR-13 51 9 0 11 11 Lag Lag
3403 1 2686 7211 14-APR-13 51 9 0 11 11 Lag Lag
18105 1 2686 17391 14-APR-13 51 9 0 11 11 Lag Lag
17390 1 3034 5395 01-APR-13 298 0 0 14 14 Lead_1 Lead_A
17389 1 3034 5395 02-APR-13 298 0 0 14 14 Lag Lag
17388 1 3034 5395 03-APR-13 298 0 0 14 14 Lag Lag
17387 1 3034 5395 04-APR-13 298 0 0 14 14 Lag Lag
17386 1 3034 5395 05-APR-13 298 0 0 14 14 Lag Lag
7305 1 3034 5394 06-APR-13 44 0 0 7 14 Lead_1 Lag
17385 1 3034 5395 06-APR-13 298 0 0 14 14 Lag Lag
14123 1 3034 22421 06-APR-13 44 0 0 7 14 Lead_1 Lag
17384 1 3034 5395 07-APR-13 298 0 0 14 14 Lag Lag
17383 1 3034 5395 08-APR-13 298 0 0 14 14 Lag Lag
7302 1 3034 5394 09-APR-13 44 0 0 7 14 Lead_1 Lag
17382 1 3034 5395 09-APR-13 298 0 0 14 14 Lag Lag
14120 1 3034 22421 09-APR-13 44 0 0 7 14 Lead_1 Lag
7301 1 3034 5394 10-APR-13 44 0 0 7 14 Lag Lag
17381 1 3034 5395 10-APR-13 298 0 0 14 14 Lag Lag
14119 1 3034 22421 10-APR-13 44 0 0 7 14 Lag Lag
7300 1 3034 5394 11-APR-13 44 0 0 7 14 Lag Lag
17380 1 3034 5395 11-APR-13 298 0 0 14 14 Lag Lag
14118 1 3034 22421 11-APR-13 44 0 0 7 14 Lag Lag
7299 1 3034 5394 12-APR-13 44 0 0 7 14 Lag Lag
17379 1 3034 5395 12-APR-13 298 0 0 14 14 Lag Lag
14117 1 3034 22421 12-APR-13 44 0 0 7 14 Lag Lag
7298 1 3034 5394 13-APR-13 44 0 0 7 14 Lag Lag
17378 1 3034 5395 13-APR-13 298 0 0 14 14 Lag Lag
14116 1 3034 22421 13-APR-13 44 0 0 7 14 Lag Lag
7297 1 3034 5394 14-APR-13 44 0 0 7 14 Lag Lag
17377 1 3034 5395 14-APR-13 298 0 0 14 14 Lag Lag
14115 1 3034 22421 14-APR-13 44 0 0 7 14 Lag Lag
I placed the "wrong" label to show where neither of the WHEN conditions was able to match.
any suggestions on a better direction for me to solve this?
Edited by: 1004407 on May 23, 2013 1:16 PM
Then I am trying to get this, not to include C_ID
u_key D_VAL_PRESENT W_ID DT CONSECUTIVEDAYSW LEAD_LAG2
4592 0 2004 01-APR-13 1 Lead_A
4591 1 2004 02-APR-13 13 Lead_A
4590 1 2004 03-APR-13 13 Lag
4589 1 2004 04-APR-13 13 Lag
4588 1 2004 05-APR-13 13 Lag
4587 1 2004 06-APR-13 13 Lag
4586 1 2004 07-APR-13 13 Lag
4585 1 2004 08-APR-13 13 Lag
4584 1 2004 09-APR-13 13 Lag
4583 1 2004 10-APR-13 13 Lag
4582 1 2004 11-APR-13 13 Lag
4581 1 2004 12-APR-13 13 Lag
4580 1 2004 13-APR-13 13 Lag
4579 1 2004 14-APR-13 13 Lag
1092 0 2686 01-APR-13 3 Lead_A
1091 0 2686 02-APR-13 3 Lag
1090 0 2686 03-APR-13 3 Lag
1089 1 2686 04-APR-13 11 Lead_A
1088 1 2686 05-APR-13 11 Lag
1087 1 2686 06-APR-13 11 Lag
1086 1 2686 07-APR-13 11 Lag
1085 1 2686 08-APR-13 11 Lag
1084 1 2686 09-APR-13 11 Lag
1083 1 2686 10-APR-13 11 Lag
1082 1 2686 11-APR-13 11 Lag
1081 1 2686 12-APR-13 11 Lag
1080 1 2686 13-APR-13 11 Lag
1079 1 2686 14-APR-13 11 Lag
17390 1 3034 01-APR-13 14 Lead_A
17389 1 3034 02-APR-13 14 Lag
17388 1 3034 03-APR-13 14 Lag
17387 1 3034 04-APR-13 14 Lag
17386 1 3034 05-APR-13 14 Lag
7305 1 3034 06-APR-13 14 Lag
17384 1 3034 07-APR-13 14 Lag
17383 1 3034 08-APR-13 14 Lag
7302 1 3034 09-APR-13 14 Lag
7301 1 3034 10-APR-13 14 Lag
7300 1 3034 11-APR-13 14 Lag
7299 1 3034 12-APR-13 14 Lag
7298 1 3034 13-APR-13 14 Lag
7297 1 3034 14-APR-13 14 Lag
Then into this (which I would use where Lead_Lag2 = "Lead_A"):
u_key D_VAL_PRESENT W_ID DT CONSECUTIVEDAYSW LEAD_LAG2
4592 0 2004 01-APR-13 1 Lead_A
4591 1 2004 02-APR-13 13 Lead_A
1092 0 2686 01-APR-13 3 Lead_A
1089 1 2686 04-APR-13 11 Lead_A
17390 1 3034 01-APR-13 14 Lead_A
But one thing at a time.
Thanks for point out the errors Frank, always helpful to know what others see.
Edited by: 1004407 on May 23, 2013 2:36 PM
Edited by: 1004407 on May 23, 2013 4:01 PM
Is this the first set you expect?
SQL> with flagged as
(
select w_id,d_val,dt,u_key,
       case when lag(dt) over(partition by w_id,d_val order by dt,u_key)
                 in (dt,dt-1)
            then 0
            else 1
       end flg
from t
),
summed as
(
select w_id,d_val,dt,u_key,
       sum(flg) over(order by w_id,d_val nulls first,dt,u_key) sm
from flagged
),
day_count as
(
select w_id,d_val,dt,u_key,count(distinct dt) over(partition by sm) cnt
from summed
)
select w_id,d_val,dt,u_key,cnt
from day_count
order by w_id,d_val nulls first,dt,u_key;
W_ID D_VAL DT U_KEY CNT
2004 01-APR-13 4592 1
2004 44 02-APR-13 4591 2
2004 44 03-APR-13 4590 2
2004 389 04-APR-13 4589 11
2004 389 05-APR-13 4588 11
2004 389 06-APR-13 4587 11
2004 389 07-APR-13 4586 11
2004 389 08-APR-13 4585 11
2004 389 09-APR-13 4584 11
2004 389 10-APR-13 4583 11
2004 389 11-APR-13 4582 11
2004 389 12-APR-13 4581 11
2004 389 13-APR-13 4580 11
2004 389 14-APR-13 4579 11
2686 01-APR-13 1092 3
2686 01-APR-13 3416 3
2686 01-APR-13 18118 3
2686 02-APR-13 1091 3
2686 02-APR-13 3415 3
2686 02-APR-13 18117 3
2686 03-APR-13 1090 3
2686 03-APR-13 3414 3
2686 03-APR-13 18116 3
..... -
I am playing with the template where an event structure takes note of user inputs and a main state machine is then used to respond to these inputs. When I only monitor one control I use the 'NewVal' output to read out the changed value. But when I monitor multiple objects with a single case I also have to associate the readout with the correct owner. After some tinkering I was able to extract the label property and use a case to assign them. Are there better ways of doing this? For example maybe there is a way to connect the label text directly to the 'bundle by name'?
Also this should be easy to accomplish by simply creating local variables of the objects and read from them, but I got the impression that the use of global and local variables is now strongly discouraged?
Thanks for any suggestions!
Attachments:
Untitled.png 39 KB
Well, I don't really like these theoretical discussions. Can you attach a simplified version of some of your code?
There are many other ways to identify the particular control. You could, for example, search an array of references for the value of the "ctlref" event data node. This would make the code much more robust: your code will fail, for example, if you (or some other programmer updating your code in a few years!) notice a misspelling and edit the label without also changing the case structure cases.
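The reference-array idea can be sketched language-neutrally in Python (the control objects and field names here are hypothetical): identify the event source by identity in a parallel array, so that editing a label cannot break the dispatch:

```python
class Control:
    """Hypothetical stand-in for a front-panel control reference."""
    def __init__(self, label):
        self.label = label

speed = Control("Speed")
direction = Control("Direction")

# Parallel arrays: references and the bundle-by-name fields they feed.
controls = [speed, direction]
field_names = ["speed", "direction"]

def field_for(ctl_ref):
    # Identity lookup, analogous to searching an array of references
    # for the "ctlref" event data; immune to label edits.
    return field_names[controls.index(ctl_ref)]

speed.label = "Velocity"   # a label edit does not break the lookup
print(field_for(speed))
```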
LabVIEW Champion . Do more with less code and in less time . -
HTML Client Entering/Editing Data With Multiple Tables
Hi guys,
I've been stumped on what seems to be a relatively simple problem for quite some time now. I have a bunch of Products - Dairy, Meat, Produce and Candy, for example. Each Product has multiple ProductConfigurations - say, Monday, Tuesday, Wednesday, Thursday,
and Weekend. Each Product also has its own unique set of ProductAttributes; Meat may have Animal, ExpirationDate, CostPerPound, Kosher for example, while Candy may have Type, Manufacturer, Price, Distributor, and Dairy may have Source, ExpirationDate, UnitCost.
These are just examples, but the point is each Product has unique ProductAttributes.
For each ProductConfiguration (which has a Product), I want to be able to go to this ProductConfiguration and see a list of all its ProductAttributes, and provide an option to edit the ProductAttributeValue (a separate table) for each ProductAttribute. A
ProductAttributeValue has a Value, a ProductAttribute (ie, Manufacturer) and a ProductConfiguration (ie, Tuesday).
The way I currently have it set up is with a clickable List of all ProductAttributes for a given Product (which I get by querying the filter 'ProductAttribute.ProductID = Screen.ProductID' from the ProductAttributes tables). When you click, it brings up
an 'AddNewProductAttributeValue' screen, in which the ProductAttribute is already populated. However, this requires a lot of clicks for Product(Configurations) that have many ProductAttributes. I'd love to be able to configure all of the ProductAttributeValues
for a given ProductConfiguration on one screen.
Any ideas how I could do this?
Matt, there are two bits to get the in-built table to play nicely.
// This makes the input textbox focus work better
myapp.AddEditProductConfiguration.created = function (screen) {
    var scr = screen;
    scr.ProductAttributeValues.addChangeListener("selectedItem", function (eventArgs) {
        if (scr.ProductAttributeValues.selectedItem != null) {
            var item = scr.ProductAttributeValues.selectedItem;
            setTimeout(function () {
                $("input", item.details.focusElement).focus();
            }, 100);
        }
    });
};
myapp.AddEditProductConfiguration.Value_postRender = function (element, contentItem) {
    contentItem.details._details.focusElement = element;
};
... the maintenance screen is based on a standard 'AddEdit<entity>' screen tweaked and in msls-2.5.2.js (or earlier) ...
case $.mobile.keyCode.TAB:
//Xpert360: tab
//updateSelectedHeader(table, table._headerRowElement[0]);
break;
... to stop the keyup handler messing up tabbing between input textboxes.
Some code similar to this could pre-populate (or verify) that a configuration has rows of ProductAttributeValues matching ProductAttributes ...
scr.getProductAttributeValues().then(function (values) {
    if (scr.ProductAttributeValues.count <= 0) {
        scr.getProductAttributes().then(function (values) {
            if (scr.ProductAttributes.count > 0) {
                scr.ProductAttributes.data.forEach(function (attr, idx, obj) {
                    var attrvalue = scr.ProductAttributeValues.addNew();
                    attrvalue.ProductAttribute = scr.ProductAttributes.data[idx];
                    attrvalue.ProductConfiguration = scr.ProductConfiguration;
                    attrvalue.Value = "";
                });
            }
        });
    }
});
... which could be used in a variety of places.
I will look to uploading the sample application.
Dave
Dave Baker | AIDE for LightSwitch | Xpert360 blog | twitter : @xpert360 | Xpert360 website | Opinions are my own. For better forums, remember to mark posts as helpful/answer. -
11g Cube not showing any data with no Rejected records
Hi David,
Strangely, one of my 11g cubes is not showing data as of today: the build reports all records rejected. However, when I look in the rejected records table I don't find any records, so I'm not sure what is happening. When I take the queries AWM generates (from CUBE_BUILD_LOG) and run them against the database in the AWM schema, the records are returned perfectly fine. I wonder why the same query fires during the cube load yet loads no data? My cube build script has only LOAD and AGGREGATE.
After maintaining my dimensions the data looks fine, but no data is populated after cube maintenance. MVs are switched off across all dimensions and cubes.
I navigated to CUBE_OPERATION_LOG but am not able to make sense of the content.
Any advice?
Thanks and Regards,
DxP
Hi David,
To be very frank, today is a very bad day ... Please see my observations below.
I executed the queries below to make sure that no key value present in the fact table is missing from a dimension. All of the queries below return no rows.
select distinct owner_postn_wid from w_synm_rx_t_f
minus
select distinct row_wid from postn_dh
select distinct payer_type_Wid from w_synm_rx_t_f
minus
select distinct row_wid from wc_ins_plan_dh
select distinct market_wid from w_synm_rx_t_f
minus
select distinct row_wid from w_product_dh
select distinct period_day_wid from w_synm_rx_t_f
minus
select distinct row_wid from w_daytime_D
select distinct contact_wid from w_synm_rx_t_f
intersect
select distinct row_wid from w_person_d
select distinct X_TERR_TYPE_WID from w_synm_rx_t_f
minus
select distinct row_wid from W_LOV_D
============================
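The MINUS queries above all apply the same referential-integrity check: every distinct foreign key in the fact table must exist in the dimension table. As a hedged illustration of that set-difference idea with made-up data (plain JavaScript, not part of the original post):

```javascript
// Keys present in the fact table but missing from the dimension table.
// Equivalent in spirit to:
//   SELECT DISTINCT fk FROM fact MINUS SELECT row_wid FROM dim
function missingKeys(factKeys, dimKeys) {
    var dims = new Set(dimKeys);
    return [...new Set(factKeys)].filter(function (k) { return !dims.has(k); });
}

// An empty result means every fact row can join to a dimension member,
// so the cube load should not reject rows for this dimension.
var orphans = missingKeys([1, 2, 2, 5], [1, 2, 3, 4]);
// orphans -> [5]
```

An empty result for every dimension is exactly what the queries above confirm — no fact row should be rejected for a missing dimension member.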
The queries below each return a count of 0 rows, ensuring no NULLs are present:
select count(1) from w_synm_rx_t_f where contact_wid is null;
select count(1) from w_synm_rx_t_f where owner_postn_wid is null;
select count(1) from w_synm_rx_t_f where payer_type_Wid is null;
select count(1) from w_synm_rx_t_f where period_day_wid is null;
select count(1) from w_synm_rx_t_f where X_TERR_TYPE_WID is null;
select count(1) from w_synm_rx_t_f where market_wid is null;
+++++++++++++++++++++++++++++++++
Cube Build Log has below entry:
796 0 STARTED CLEAR VALUES MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.267000000 PM +05:30 JAVA 1 C 47141 67 0 1
796 0 COMPLETED CLEAR VALUES MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.267000000 PM +05:30 JAVA 1 C 47141 67 0 2
796 0 STARTED LOAD MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.283000000 PM +05:30 JAVA 1 C 47142 68 0 1
796 0 SQL LOAD MKT_SLS_CUBE CUBE "<SQL>
<![CDATA[
SELECT /*+ bypass_recursive_check cursor_sharing_exact no_expand no_rewrite */
T16_ROW_WID ALIAS_127,
T13_ROW_WID ALIAS_128,
T10_ROW_WID ALIAS_129,
T7_ROW_WID ALIAS_130,
T4_ROW_WID ALIAS_131,
T1_ROW_WID ALIAS_132,
SUM(T20_MKT_TRX) ALIAS_133,
SUM(T20_MKT_NRX) ALIAS_134
FROM
SELECT /*+ no_rewrite */
T1."CONTACT_WID" T20_CONTACT_WID,
T1."MARKET_WID" T20_MARKET_WID,
T1."OWNER_POSTN_WID" T20_OWNER_POSTN_WID,
T1."PAYER_TYPE_WID" T20_PAYER_TYPE_WID,
T1."PERIOD_DAY_WID" T20_PERIOD_DAY_WID,
T1."MKT_NRX" T20_MKT_NRX,
T1."MKT_TRX" T20_MKT_TRX,
T1."X_TERR_TYPE_WID" T20_X_TERR_TYPE_WID
FROM
NN_OLAP_POC."W_SYNM_RX_T_F" T1 )
T20,
SELECT /*+ no_rewrite */
T1."ROW_WID" T16_ROW_WID
FROM
NN_OLAP_POC."W_DAYTIME_D" T1 )
T16,
SELECT /*+ no_rewrite */
T1."ROW_WID" T13_ROW_WID
FROM
NN_OLAP_POC."W_PERSON_D" T1 )
T13,
SELECT /*+ no_rewrite */
T1."ROW_WID" T10_ROW_WID
FROM
NN_OLAP_POC."WC_INS_PLAN_DH" T1 )
T10,
SELECT /*+ no_rewrite */
T1."ROW_WID" T7_ROW_WID
FROM
NN_OLAP_POC."W_LOV_D" T1 )
T7,
SELECT /*+ no_rewrite */
T1."ROW_WID" T4_ROW_WID
FROM
NN_OLAP_POC."POSTN_DH" T1 )
T4,
SELECT /*+ no_rewrite */
T1."ROW_WID" T1_ROW_WID
FROM
NN_OLAP_POC."W_PRODUCT_DH" T1 )
T1
WHERE
((T20_PERIOD_DAY_WID = T16_ROW_WID)
AND (T16_ROW_WID = 20100101)
AND (T20_CONTACT_WID = T13_ROW_WID)
AND (T20_PAYER_TYPE_WID = T10_ROW_WID)
AND (T7_ROW_WID = T20_X_TERR_TYPE_WID)
AND (T20_OWNER_POSTN_WID = T4_ROW_WID)
AND (T20_MARKET_WID = T1_ROW_WID)
AND ((T20_PERIOD_DAY_WID) IN ((20100107.000000) , (20100106.000000) , (20100128.000000) , (20100124.000000) , (20100121.000000) , (20100118.000000) , (20100115.000000) , (20100109.000000) , (20100125.000000) , (20100114.000000) , (20100111.000000) , (20100110.000000) , (20100104.000000) , (20100101.000000) , (20100129.000000) , (20100123.000000) , (20100117.000000) , (20100113.000000) , (20100108.000000) , (20100131.000000) , (20100120.000000) , (20100116.000000) , (20100119.000000) , (20100105.000000) , (20100102.000000) ,
(20100130.000000) , (20100127.000000) , (20100122.000000) , (20100112.000000) , (20100103.000000) , (20100126.000000) ) ) )
GROUP BY
(T1_ROW_WID, T4_ROW_WID, T7_ROW_WID, T10_ROW_WID, T13_ROW_WID, T16_ROW_WID)
ORDER BY
T1_ROW_WID ASC NULLS LAST ,
T4_ROW_WID ASC NULLS LAST ,
T7_ROW_WID ASC NULLS LAST ,
T10_ROW_WID ASC NULLS LAST ,
T13_ROW_WID ASC NULLS LAST ,
T16_ROW_WID ASC NULLS LAST ]]>/>
</SQL>" NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.627000000 PM +05:30 JAVA 1 MAP1 C 47142 68 0 2
796 0 COMPLETED LOAD MKT_SLS_CUBE CUBE "<CubeLoad
LOADED="0"
REJECTED="4148617"/>" NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.40.486000000 PM +05:30 JAVA 1 C 47142 68 0 3
796 0 STARTED UPDATE MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.40.501000000 PM +05:30 JAVA 1 C 47143 69 0 1
796 0 COMPLETED UPDATE MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.40.548000000 PM +05:30 JAVA 1 C 47143 69 0 2
+++++++++++++++++
You can observe the clear rejection of 4 million rows ... I ran the above query and it returns my data successfully.
Looking at the CUBE_REJECTED records, I took a sample record and put its key values into the above query; it returns the data fine, with my measures and dimension WIDs. (PLEASE SEE THE FILTERS ON ROW_WID BELOW.)
=========================
SELECT /*+ bypass_recursive_check cursor_sharing_exact no_expand no_rewrite */
T16_ROW_WID ALIAS_127,
T13_ROW_WID ALIAS_128,
T10_ROW_WID ALIAS_129,
T7_ROW_WID ALIAS_130,
T4_ROW_WID ALIAS_131,
T1_ROW_WID ALIAS_132,
SUM(T20_MKT_TRX) ALIAS_133,
SUM(T20_MKT_NRX) ALIAS_134
FROM
SELECT /*+ no_rewrite */
T1."CONTACT_WID" T20_CONTACT_WID,
T1."MARKET_WID" T20_MARKET_WID,
T1."OWNER_POSTN_WID" T20_OWNER_POSTN_WID,
T1."PAYER_TYPE_WID" T20_PAYER_TYPE_WID,
T1."PERIOD_DAY_WID" T20_PERIOD_DAY_WID,
T1."MKT_NRX" T20_MKT_NRX,
T1."MKT_TRX" T20_MKT_TRX,
T1."X_TERR_TYPE_WID" T20_X_TERR_TYPE_WID
FROM
NN_OLAP_POC."W_SYNM_RX_T_F" T1 )
T20,
SELECT /*+ no_rewrite */
T1."ROW_WID" T16_ROW_WID
FROM
NN_OLAP_POC."W_DAYTIME_D" T1 )
T16,
SELECT /*+ no_rewrite */
T1."ROW_WID" T13_ROW_WID
FROM
NN_OLAP_POC."W_PERSON_D" T1 )
T13,
SELECT /*+ no_rewrite */
T1."ROW_WID" T10_ROW_WID
FROM
NN_OLAP_POC."WC_INS_PLAN_DH" T1 )
T10,
SELECT /*+ no_rewrite */
T1."ROW_WID" T7_ROW_WID
FROM
NN_OLAP_POC."W_LOV_D" T1 )
T7,
SELECT /*+ no_rewrite */
T1."ROW_WID" T4_ROW_WID
FROM
NN_OLAP_POC."POSTN_DH" T1 )
T4,
SELECT /*+ no_rewrite */
T1."ROW_WID" T1_ROW_WID
FROM
NN_OLAP_POC."W_PRODUCT_DH" T1 )
T1
WHERE
((T20_PERIOD_DAY_WID = T16_ROW_WID)
AND (T16_ROW_WID = 20100101)
AND (T20_CONTACT_WID = T13_ROW_WID)
AND (T20_PAYER_TYPE_WID = T10_ROW_WID)
AND (T7_ROW_WID = T20_X_TERR_TYPE_WID)
AND (T20_OWNER_POSTN_WID = T4_ROW_WID)
AND (T20_MARKET_WID = T1_ROW_WID)
AND T13_ROW_WID = 255811
AND T7_ROW_WID = 122
AND T4_ROW_WID =3
AND T1_ROW_WID=230
AND T10_ROW_WID = 26
AND ((T20_PERIOD_DAY_WID) IN ((20100107.000000) , (20100106.000000) , (20100128.000000) , (20100124.000000) , (20100121.000000) , (20100118.000000) , (20100115.000000) , (20100109.000000) , (20100125.000000) , (20100114.000000) , (20100111.000000) , (20100110.000000) , (20100104.000000) , (20100101.000000) , (20100129.000000) , (20100123.000000) , (20100117.000000) , (20100113.000000) , (20100108.000000) , (20100131.000000) , (20100120.000000) , (20100116.000000) , (20100119.000000) , (20100105.000000) , (20100102.000000) ,
(20100130.000000) , (20100127.000000) , (20100122.000000) , (20100112.000000) , (20100103.000000) , (20100126.000000) ) ) )
GROUP BY
(T1_ROW_WID, T4_ROW_WID, T7_ROW_WID, T10_ROW_WID, T13_ROW_WID, T16_ROW_WID)
ORDER BY
T1_ROW_WID ASC NULLS LAST ,
T4_ROW_WID ASC NULLS LAST ,
T7_ROW_WID ASC NULLS LAST ,
T10_ROW_WID ASC NULLS LAST ,
T13_ROW_WID ASC NULLS LAST ,
T16_ROW_WID ASC NULLS LAST
=================================
THE XML export of CUBE as below:
<!DOCTYPE Metadata [
<!ENTITY % BIND_VALUES PUBLIC "OLAP BIND VALUES" "OLAP METADATA">
%BIND_VALUES;
]>
<Metadata
Version="1.2"
MinimumDatabaseVersion="11.2.0.1">
<Cube
ETViewName="MKT_SLS_CUBE_VIEW"
Name="MKT_SLS_CUBE">
<Measure>
<BaseMeasure
SQLDataType="NUMBER"
ETMeasureColumnName="TRX"
Name="TRX">
<Description
Type="LongDescription"
Language="AMERICAN"
Value="TRX">
</Description>
<Description
Type="ShortDescription"
Language="AMERICAN"
Value="TRX">
</Description>
<Description
Type="Description"
Language="AMERICAN"
Value="TRX">
</Description>
</BaseMeasure>
</Measure>
<Measure>
<BaseMeasure
SQLDataType="NUMBER"
ETMeasureColumnName="NRX"
Name="NRX">
<Description
Type="LongDescription"
Language="AMERICAN"
Value="NRX">
</Description>
<Description
Type="ShortDescription"
Language="AMERICAN"
Value="NRX">
</Description>
<Description
Type="Description"
Language="AMERICAN"
Value="NRX">
</Description>
</BaseMeasure>
</Measure>
<CubeMap
Name="MAP1"
IsSolved="False"
Query="W_SYNM_RX_T_F"
WhereClause="W_DAYTIME_D.ROW_WID = 20100101">
<MeasureMap
Name="TRX"
Measure="TRX"
Expression="W_SYNM_RX_T_F.MKT_TRX">
</MeasureMap>
<MeasureMap
Name="NRX"
Measure="NRX"
Expression="W_SYNM_RX_T_F.MKT_NRX">
</MeasureMap>
<CubeDimensionalityMap
Name="TIME"
Dimensionality="TIME"
MappedDimension="TIME.CALENDER.MONTHLY"
JoinCondition="W_SYNM_RX_T_F.PERIOD_DAY_WID = W_DAYTIME_D.ROW_WID"
Expression="W_DAYTIME_D.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="CUSTOMER"
Dimensionality="CUSTOMER"
MappedDimension="CUSTOMER.CUSTOMER_HIERARCHY.DETAIL_LEVEL"
JoinCondition="W_SYNM_RX_T_F.CONTACT_WID = W_PERSON_D.ROW_WID"
Expression="W_PERSON_D.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="INS_PLAN_DH"
Dimensionality="INS_PLAN_DH"
MappedDimension="INS_PLAN_DH.INS_PLAN.DETAIL"
JoinCondition="W_SYNM_RX_T_F.PAYER_TYPE_WID = WC_INS_PLAN_DH.ROW_WID"
Expression="WC_INS_PLAN_DH.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="LIST_OF_VALUES"
Dimensionality="LIST_OF_VALUES"
MappedDimension="LIST_OF_VALUES.LOV_HIERARCHY.DETAIL_LEVEL"
JoinCondition="W_LOV_D.ROW_WID = W_SYNM_RX_T_F.X_TERR_TYPE_WID"
Expression="W_LOV_D.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="POSITIONDH"
Dimensionality="POSITIONDH"
MappedDimension="POSITIONDH.POST_HIER.DETAIL"
JoinCondition="W_SYNM_RX_T_F.OWNER_POSTN_WID = POSTN_DH.ROW_WID"
Expression="POSTN_DH.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="PRODH"
Dimensionality="PRODH"
MappedDimension="PRODH.PRODHIER.DETAILLVL"
JoinCondition="W_SYNM_RX_T_F.MARKET_WID = W_PRODUCT_DH.ROW_WID"
Expression="W_PRODUCT_DH.ROW_WID">
</CubeDimensionalityMap>
</CubeMap>
<Organization>
<AWCubeOrganization
MVOption="NONE"
SparseType="COMPRESSED"
MeasureStorage="SHARED"
NullStorage="MV_READY"
CubeStorageType="NUMBER"
PrecomputePercent="35"
PrecomputePercentTop="0"
PartitionLevel="TIME.CALENDER.MONTHLY"
AW="&AW_NAME;">
<SparseDimension
Name="TIME"/>
<SparseDimension
Name="CUSTOMER"/>
<SparseDimension
Name="INS_PLAN_DH"/>
<SparseDimension
Name="LIST_OF_VALUES"/>
<SparseDimension
Name="POSITIONDH"/>
<SparseDimension
Name="PRODH"/>
<DefaultBuild>
<![CDATA[BUILD SPEC LOAD_AND_AGGREGATE
LOAD NO SYNCH,
SOLVE
)]]>
</DefaultBuild>
</AWCubeOrganization>
</Organization>
<Dimensionality
Name="TIME"
ETKeyColumnName="TIME"
Dimension="TIME">
</Dimensionality>
<Dimensionality
Name="CUSTOMER"
ETKeyColumnName="CUSTOMER"
Dimension="CUSTOMER">
</Dimensionality>
<Dimensionality
Name="INS_PLAN_DH"
ETKeyColumnName="INS_PLAN_DH"
Dimension="INS_PLAN_DH">
</Dimensionality>
<Dimensionality
Name="LIST_OF_VALUES"
ETKeyColumnName="LIST_OF_VALUES"
Dimension="LIST_OF_VALUES">
</Dimensionality>
<Dimensionality
Name="POSITIONDH"
ETKeyColumnName="POSITIONDH"
Dimension="POSITIONDH">
</Dimensionality>
<Dimensionality
Name="PRODH"
ETKeyColumnName="PRODH"
Dimension="PRODH">
</Dimensionality>
<Description
Type="LongDescription"
Language="AMERICAN"
Value="MKT SLS CUBE">
</Description>
<Description
Type="ShortDescription"
Language="AMERICAN"
Value="MKT SLS CUBE">
</Description>
<Description
Type="Description"
Language="AMERICAN"
Value="MKT SLS CUBE">
</Description>
<ConsistentSolve>
<![CDATA[SOLVE
SUM
MAINTAIN COUNT
OVER ALL
)]]>
</ConsistentSolve>
</Cube>
</Metadata>
+++++++++++++++++++++++
I dropped the AW, created a new one from the exported XML, maintained all dimensions, and then rebuilt. I still have the issue :(
Is there anything you can highlight from the above?
Thanks,
DxP
Also, I suspect it may be an issue related to the error below, which occurs when I click on my Position_Hier view from AWM; even when I select that view in SQL Developer it throws the error after displaying the first couple of rows (while paging down):
java.sql.SQLException: ORA-33674: Data block size 63 exceeds the maximum size of 60 bytes.
at oracle.olap.awm.util.jdbc.SQLWrapper.execute(Unknown Source)
at oracle.olap.awm.querydialog.PagedQueryDialog$1.construct(Unknown Source)
at oracle.olap.awm.ui.SwingWorker$2.run(Unknown Source)
at java.lang.Thread.run(Thread.java:595)
Edited by: e_**** on Aug 17, 2011 8:41 PM