Separating data with multiple headers
Hello,
I have a data set that has 6 columns with headers containing information for one cycle of a part (data file is attached). About 200 rows later I have another set of data with 6 columns for cycle 2, with a blank line and a new header, and about another 400 rows later I have the data for cycle 5, with a blank line and a new header. This pattern repeats throughout the data set.
I need a way to separate this data so that I can plot the individual cycles. When I import this data set into DIAdem with the DataPlugin Wizard, it does not recognize the blank lines and new headers. It labels the blank lines and headers as "NO VALUE", so there are discontinuities in the data. Is there a way to separate the cycles in this data set in DIAdem?
For example, I would like to have rows 6 through 215, rows 220 through 630, rows 635 through 1046, rows 1051 through 1462, etc. This way I can plot the individual cycles against the running time.
Solved!
Go to Solution.
Attachments:
Sample_Data2.csv 774 KB
Hi wils01,
Here's a DataPlugin I created that loads your posted data file with each cycle in a separate Data Portal Group.
Brad Turpin
DIAdem Product Support Engineer
National Instruments
Attachments:
wils01_Cycles_CSV.uri 9 KB
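For anyone who can't run the DataPlugin, the same cycle splitting can be sketched in plain Python. This is a minimal sketch under the layout assumed from the description above (each cycle starts with a repeated header row and cycles are separated by a blank line); the column names in the sample are illustrative, not taken from the attached file.

```python
# Sketch: split a multi-cycle CSV into one list of data rows per cycle.
# Assumes each cycle begins with a repeated header row and cycles are
# separated by a blank line, per the description in the question.

def split_cycles(lines):
    cycles, current = [], []
    for line in lines:
        if not line.strip():                  # blank line ends the current cycle
            if current:
                cycles.append(current)
                current = []
        elif line.lstrip()[0].isdigit():      # data row (starts with a number)
            current.append(line.rstrip("\n"))
        # anything else is a header row and is skipped
    if current:
        cycles.append(current)
    return cycles

# Illustrative 6-column sample in the shape the question describes.
sample = [
    "time,ch1,ch2,ch3,ch4,ch5",
    "0.0,1,2,3,4,5",
    "0.1,1,2,3,4,5",
    "",
    "time,ch1,ch2,ch3,ch4,ch5",
    "0.2,1,2,3,4,5",
]
cycles = split_cycles(sample)
print(len(cycles))      # one entry per cycle
```

Each entry of `cycles` can then be parsed and plotted against the running time on its own.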
Similar Messages
-
Revision: 949
Author: [email protected]
Date: 2008-03-27 07:12:59 -0700 (Thu, 27 Mar 2008)
Log Message:
Bug: BLZ-96 - When sending a HttpService request from ActionScript with multiple headers with the same name, it causes a ClassCastException in the server
QA: Yes - try again with legacy-collection true and false.
Doc: No
Checkintests: Pass
Details: Another try in fixing this bug. When legacy-collection is false, Actionscript Array on the client becomes Java Array on the server and my fix yesterday assumed this case. However, when legacy-collection is true, Actionscript Array becomes Java ArrayList on the server. So added code to handle this case.
Ticket Links:
http://bugs.adobe.com/jira/browse/BLZ-96
Modified Paths:
blazeds/branches/3.0.x/modules/proxy/src/java/flex/messaging/services/http/proxy/RequestFilter.java
Hi all!
Just posting the solution in case anyone ever runs across this thread...
For some reason I had it wrong the first time; I don't have time right now to see why, but here is what worked for me:
// Primary file payload: content plus file name
HashMap<String, Object> primaryFile = new HashMap<>();
primaryFile.put("fileContent", bFile);
primaryFile.put("fileName", uploadedFile.getFilename());
operationBinding.getParamsMap().put("primaryFile", primaryFile);
// Custom metadata: the unbounded "property" element maps to an array of maps
HashMap<String, Object> customDocMetadata = new HashMap<>();
HashMap<String, Object>[] properties = new HashMap[1];
HashMap<String, Object> customMetadataPropertyRoom = new HashMap<>();
customMetadataPropertyRoom.put("name", "xRoom");
customMetadataPropertyRoom.put("value", "SOME ROOM");
properties[0] = customMetadataPropertyRoom;
customDocMetadata.put("property", properties);
operationBinding.getParamsMap().put("CustomDocMetaData", customDocMetadata);
Basically, an unbounded WSDL type is an array of objects (HashMaps). That makes sense; I thought I had it like this before, so I must have messed up somewhere...
Good luck all! -
Revision: 931
Author: [email protected]
Date: 2008-03-26 11:31:01 -0700 (Wed, 26 Mar 2008)
Log Message:
Bug: BLZ-96 - When sending a HttpService request from ActionScript with multiple headers with the same name, it causes a ClassCastException in the server
QA: Yes - we need automated tests for this basic case.
Doc: No
Checkintests: Pass
Details: RequestFilter was not handling multiple headers with the same name properly.
Ticket Links:
http://bugs.adobe.com/jira/browse/BLZ-96
Modified Paths:
blazeds/branches/3.0.x/modules/proxy/src/java/flex/messaging/services/http/proxy/RequestFilter.java -
Smartform with multiple Headers & Respective Item Details and Totals?
Dear All,
I would appreciate it if anyone could clarify my doubt.
My requirement is to develop a smartform for billing with multiple headers & respective items and totals. For more clarity I'm explaining below.
Header Section1
variable1 variable2 variable3 variable4 variable5
variable6 variable7 variable8
variable9 variable10 variable11
Item Details1
s no vebeln verpr col4 col5 col6
total 123.21
total in words -
value 321.21
value 982.98
value some value
footer details
Header Section2
variable1 variable2 variable3 variable4 variable5
variable6 variable7 variable8
variable9 variable10 variable11
Item Details2
s no vebeln verpr col4 col5 col6
total 123.21
total in words -
value 321.21
value 982.98
value some value
footer details
Header Section3
variable1 variable2 variable3 variable4 variable5
variable6 variable7 variable8
variable9 variable10 variable11
Item Details3
s no vebeln verpr col4 col5 col6
total 123.21
total in words -
value 321.21
value 982.98
value some value
footer details
Is this possible using smartforms?
Currently they are using a classical report and take printouts on pre-printed stationery.
The requirement is to develop a smartform for this. There is a selection screen for the print program which accepts billing docs.
Thanks, Rajiv
Actually for your case you don't need a table inside the Main Window.
1.) First Populate all the data(header + item) in an internal table in the print program itself.
2.) Pass this internal table to the smartform.
3.) In the smartform give exact dimensions to the main window.
4.) In the Main Window Use a Loop on this internal table.
5.) Create a template for the header data giving the exact dimensions.
6.) Create another template for the item data giving exact dimensions to the template.
7.) If there is pre-printed text between these templates, insert a text element between them. This text element will contain nothing; just assign a paragraph format to it. In Smart Styles, create a style with a paragraph format whose line-spacing parameter is the same size as the pre-printed text.
8.) Only 1 page is required since Loop statement will print the second record on the second page if the dimension of templates are exact.
9.) Dimensions you can easily measure from the pre-printed form.
<removed by moderator>
Edited by: Thomas Zloch on Mar 6, 2012 -
How to export data with column headers in sql server 2008 with bcp command?
Hi all,
I want to know how to export data with column headers in SQL Server 2008 with the bcp command. I know how to do it with the Import and Export Wizard, but when I try to export data with the bcp command, the data is copied and the column names are not included.
I am using the below query:-
EXEC master..xp_cmdshell
'BCP "SELECT * FROM [tempdb].[dbo].[VBAS_ErrorLog] " QUERYOUT "D:\Temp\SQLServer.log" -c -t , -T -S SERVER-A'
Thanks,
SAAD.
Hi All,
I have done as per your suggestion, but I face the problem below: the PRINT statement gives the correct query, but EXEC master..xp_cmdshell @BCPCMD displays the error message shown below.
DECLARE @BCPCMD nvarchar(4000)
DECLARE @BCPCMD1 nvarchar(4000)
DECLARE @BCPCMD2 nvarchar(4000)
DECLARE @SQLEXPRESS varchar(50)
DECLARE @filepath nvarchar(150), @SQLServer varchar(50)
SET @filepath = N'"D:\Temp\LDH_SQLErrorlog_' + CAST(YEAR(GETDATE()) as varchar(4))
    + RIGHT('00' + CAST(MONTH(GETDATE()) as varchar(2)), 2)
    + RIGHT('00' + CAST(DAY(GETDATE()) as varchar(2)), 2) + '.log" '
SET @SQLServer = (SELECT @@SERVERNAME)
SELECT @BCPCMD1 = '''BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT '
SELECT @BCPCMD2 = '-c -t , -T -S ' + @SQLServer
SET @BCPCMD = @BCPCMD1 + @filepath + @BCPCMD2
Print @BCPCMD
-- Print out below
'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT "D:\Temp\LDH_SQLErrorlog_20130313.log" -c -t , -T -S servername'
EXEC master..xp_cmdshell @BCPCMD
''BCP' is not recognized as an internal or external command,
operable program or batch file.
NULL
If I copy the PRINT output as below and execute it in CMD, it works fine. Could you please suggest what the problem is in the above query?
EXEC master..xp_cmdshell
'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT "D:\Temp\LDH_SQLErrorlog_20130313.log" -c -t , -T -S servername '
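On the original header question: bcp QUERYOUT by itself never emits a column-header row. Below is a minimal sketch of the usual workaround, writing a header taken from cursor metadata before the data rows. sqlite3 stands in for SQL Server here (with pyodbc against SQL Server the same cursor.description technique applies), and the table and column names are illustrative.

```python
# Sketch: export query results preceded by a column-header row, which is the
# part bcp QUERYOUT does not produce. The header comes from cursor metadata.
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ErrorLog (ErrorId INTEGER, Message TEXT)")
conn.execute("INSERT INTO ErrorLog VALUES (1, 'disk full'), (2, 'timeout')")

cur = conn.execute("SELECT * FROM ErrorLog")
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(col[0] for col in cur.description)  # header row from metadata
writer.writerows(cur.fetchall())                    # then the data rows

print(out.getvalue().splitlines()[0])  # the header line: ErrorId,Message
```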
Thanks, SAAD. -
ABAP list with multiple headers
Hi experts,
I have to display a table with multiple headers. Example:
Header 1 = Sales order header
Header 2 = Sales order items
Is it possible to create an ABAP list with 2 headers?
Thanks in advance.
Hi Dan,
FYI:
[ALV Grid with Multiple Headers]
Regards
Abhii -
Am I able to tag a data point of a spreadsheet that is being created by a datalogging VI, such that at the end I have the data with multiple tags which correlate to events during a measurement cycle?
My final need is to take data from a datalogging VI and store it in a spreadsheet with tags that correspond to events in a subVI which is controlling motor movement. This will allow users to view all data and mark the relevant data for analysis. As usual, users want everything, but with conditions.
Sure. What you do is take the numeric value acquired, the tags you want, and build them into an array. So now, when you write to the spreadsheet, you'll have a 2D array. One thing you have to keep in mind is that all elements of an array have to be of the same type. So if your tags are strings, you'll have to convert your numeric data into strings as well.
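The answer above (build the values and tags into one all-string 2D array before writing the spreadsheet) can be sketched as follows; the column layout, timestamps, and tag names are illustrative, not taken from the original VI.

```python
# Sketch: numbers and event tags in one 2D array must share a type, so the
# numeric samples are converted to strings alongside the string tags.
samples = [(0.0, 1.25), (0.1, 1.31), (0.2, 1.02)]  # (time, value) pairs
events = {0.1: "motor start"}                       # event tag keyed by time

csv_rows = []
for t, v in samples:
    tag = events.get(t, "")                          # empty when no event
    csv_rows.append([f"{t:.3f}", f"{v:.3f}", tag])   # every cell is a string

for row in csv_rows:
    print("\t".join(row))                            # tab-separated spreadsheet row
```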
-
Hierarchy problem when creating IDoc with multiple headers
Hi,
In general, when you create an inbound IDoc for the IDoc type FIDCCP02, all the data records are arranged in hierarchical order (when viewing them in transaction WE02).
Our requirement is to have m headers (E1FIKPF), each in turn with n items (E1FISEG), inside the same IDoc. We have a problem viewing their hierarchy in the data records tab: everything comes down in a flat order, without hierarchical division.
If we have only one header, we get the hierarchy sorted by default. If we have multiple headers we see the same problem again.
We guess some adjustments need to be made in the EDIDD structure, which contains SEGNUM, HLEVEL, etc.
Let us know the correct method to get this fixed.
Points will be rewarded. Thanks in advance.
Regards,
Reni
Every IDoc segment has a place to put in the parent segment number for that IDoc. When you fill the IDoc, this step might not be executing for some reason.
Is this some customer development?
In EDID4, see these fields:
PSGNUM - number of the hierarchically higher SAP segment
HLEVEL - hierarchy level
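The bookkeeping the reply refers to can be sketched as follows. The segment names E1FIKPF/E1FISEG are the ones from this thread; the dictionary keys loosely mirror the EDID4 fields named above, and the two-level numbering rule is an assumption for illustration.

```python
# Sketch: assign SEGNUM, PSGNUM and HLEVEL so every E1FISEG item points at the
# most recent E1FIKPF header, giving the hierarchy WE02 is expected to show.
def number_segments(segment_names):
    numbered, last_header = [], 0
    for segnum, name in enumerate(segment_names, start=1):
        if name == "E1FIKPF":                # header: top level, no parent
            last_header = segnum
            numbered.append({"SEGNAM": name, "SEGNUM": segnum,
                             "PSGNUM": 0, "HLEVEL": 1})
        else:                                # item: child of the last header
            numbered.append({"SEGNAM": name, "SEGNUM": segnum,
                             "PSGNUM": last_header, "HLEVEL": 2})
    return numbered

doc = number_segments(["E1FIKPF", "E1FISEG", "E1FISEG", "E1FIKPF", "E1FISEG"])
for seg in doc:
    print(seg["SEGNAM"], seg["SEGNUM"], seg["PSGNUM"], seg["HLEVEL"])
```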
Regards -
Posting an Idoc with Multiple Headers and Multiple Lines
Hi all,
I have a scenario like this: file -> XI -> IDoc.
The invoice file comes as a text file with multiple headers and multiple lines, like this:
111 aaa 13214234 US (header)
09082010 ABC 9999 A (Line)
222 ccc 43454543 US (header)
09082010 XYZ 7777 B (Line)
09082010 PQR 8888 C (Line)
I need to post a single IDoc with all the headers and lines.
We are planning to use a custom IDoc. Is this possible by having the header segment as unbounded?
Regards
Edited by: Vamsi Krishna on Sep 8, 2010 8:05 PM
Hi,
let's go again.
You have to edit the custom IDoc and change the occurrence of the header segment. Also modify the details segment, adding an additional field that identifies which header each detail belongs to.
In this case you cannot change the occurrence in SAP PI, because there you cannot identify what I told you before; changing the occurrence in PI would only apply if you could send one IDoc per header.
Is it clearer now?
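The grouping described above (tag every detail with the header it belongs to) can be sketched on the file side like this. The sample records are the ones from the question; the rule used to spot a header row (a 3-digit first field) is an assumption read off those sample lines.

```python
# Sketch: walk the flat invoice file and attach each line to the most recent
# header, i.e. the "additional field identifying the header" idea.
records = [
    "111 aaa 13214234 US",   # header
    "09082010 ABC 9999 A",   # line
    "222 ccc 43454543 US",   # header
    "09082010 XYZ 7777 B",   # line
    "09082010 PQR 8888 C",   # line
]

groups = []                          # one entry per header, with its lines
for rec in records:
    if len(rec.split()[0]) == 3:     # assumed header marker (see sample)
        groups.append({"header": rec, "lines": []})
    else:
        groups[-1]["lines"].append(rec)

print(len(groups))                   # number of headers found
```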
Let us know.
Thanks
Rodrigo P-. -
Hello All,
I have a requirement where I need multiple headers in a single ALV.
The exact requirement is like this:
Header1----
row1
row2 1 unit
row3
row4----
Header2
row1
row2
row3
row4
Header3
row1
row2
row3
row4
and the number of individual units is not fixed and will be determined at runtime.
Can somebody tell me the steps to achieve it.
Actually I tried using block ALV in a loop, but it shows inconsistent data, and moreover it has a limitation of 19 appends.
Maybe what you are looking for is a page break.
check this
http://hosteddocs.ittoolbox.com/kk021909.pdf -
Dear friends,
I am getting problems in one ALV report: I need to display data in the following format (with two header rows).
|       volume stock       |              being stock              |
| matnr | mat.descr        | plant1 | plant2 | plant3 | plant4     |
| 00110 | limston material | 1001   | 1002   | 1004   | 1005       |
| 00111 | himston material | 1111   | 1112   | 1114   | 1115       |
Hi Vasu,
Check out the link.
Re: TOP OF PAGE with CL_GUI_ALV_GRID
data: o_html TYPE REF TO cl_dd_document,
o_event_receiver TYPE REF TO lcl_event_receiver.
CLASS lcl_event_receiver IMPLEMENTATION.
*-- Top of Page
METHOD handle_print_top_of_page.
ENDMETHOD. "handle_print_top_of_page
METHOD handle_top_of_page.
ENDMETHOD. "handle_top_of_page
ENDCLASS. "lcl_event_receiver IMPLEMENTATION
CREATE OBJECT o_dockingcontainer
EXPORTING
ratio = '95'
EXCEPTIONS
cntl_error = 1
cntl_system_error = 2
create_error = 3
lifetime_error = 4
lifetime_dynpro_dynpro_link = 5
OTHERS = 6.
IF sy-subrc NE 0.
MESSAGE i000 WITH text-013. " Error in object creation
LEAVE LIST-PROCESSING.
ENDIF.
*--Create Splitter Container
CREATE OBJECT o_split
EXPORTING
parent = o_dockingcontainer
sash_position = 20
with_border = 0
EXCEPTIONS
cntl_error = 1
cntl_system_error = 2
OTHERS = 3.
*--Get the containers of the splitter control
o_container_top = o_split->top_left_container.   " TOP
o_container_bot = o_split->bottom_right_container. " ALV
ENDIF.
CREATE OBJECT o_alvgrid
EXPORTING
i_parent = o_container_bot.
*-- Print Top of Page
DATA: lws_text TYPE sdydo_text_element.
IF cl_gui_alv_grid=>offline( ) IS INITIAL.
*-- Object for HTML top container
CREATE OBJECT o_html
EXPORTING style = 'ALV_GRID'
background_color = 35.
*-- Top of Page
CALL METHOD o_alvgrid->list_processing_events
EXPORTING
i_event_name = 'TOP_OF_PAGE'
i_dyndoc_id = o_html.
*-- Total Record Text
CALL METHOD o_html->add_text
EXPORTING
text = text-014
sap_emphasis = text-017.
CALL METHOD o_html->add_gap
EXPORTING
width = 8.
**-- Total record Value
lws_text = cnt_total.
CALL METHOD o_html->add_text
EXPORTING
text = lws_text
sap_emphasis = text-017.
CLEAR lws_text.
CALL METHOD o_html->new_line
EXPORTING
repeat = 1.
**-- Total Success text
CALL METHOD o_html->add_text
EXPORTING
text = text-015
sap_emphasis = text-017
fix_lines = c_x.
CALL METHOD o_html->add_gap
EXPORTING
width = 12.
lws_text = cnt_success.
CALL METHOD o_html->add_text
EXPORTING
text = lws_text
sap_emphasis = text-017
fix_lines = c_x.
CLEAR lws_text.
CALL METHOD o_html->new_line
EXPORTING
repeat = 1.
*-- Total Failed text
CALL METHOD o_html->add_text
EXPORTING
text = text-016
sap_emphasis = text-017
fix_lines = c_x.
CALL METHOD o_html->add_gap
EXPORTING
width = 16.
lws_text = cnt_failed.
CALL METHOD o_html->add_text
EXPORTING
text = lws_text
sap_emphasis = text-017
fix_lines = c_x.
CLEAR lws_text.
*-- Display Report Header
CALL METHOD o_html->display_document
EXPORTING
parent = o_container_top.
ENDIF.
http://abap4.tripod.com/download/alvstub.txt
http://www.sapdevelopment.co.uk/reporting/alv/alvgrid_endlist.htm
Reward points if this helps.
Manish
Message was edited by:
Manish Kumar
Message was edited by:
Manish Kumar -
Fetch multiple rows in a report to insert data with multiple rows.
Hi Friends
I want to insert employee attendance.
There are 10 employees in the company. They enter their arrival time in the attendance register.
I have two items:
1--p1_att_date
2--p1_status
ATTEN_STATUS is PRESENT by default in the select list.
I want, when I enter a date in the p1_att_date item and press Submit, to generate a report with 10 employees, enter in_time in the report, and when I press Enter, have the data for all 10 employees inserted into table ABC.
I don't want to use a tabular form for this; actually, I don't want to use the ADD_ROW option to add attendance for the second employee, so please give me some solution.
Emp_ID ATTEN_DATE ATTEN_STATUS IN_TIME OUT_TIME
101 22-JAN-2009 PRESENT
102 22-JAN-2009 PRESENT
103 22-JAN-2009 PRESENT
104 22-JAN-2009 PRESENT
105 22-JAN-2009 PRESENT
106 22-JAN-2009 PRESENT
107 22-JAN-2009 PRESENT
108 22-JAN-2009 PRESENT
109 22-JAN-2009 PRESENT
110 22-JAN-2009 PRESENT
My table is :-
table Name --ABC
emp_id number;
atten_date date;
atten_status varchar2(12);
in_time timestamp;
out_time timestamp;
How can i do this.
Thanks
Manoj
Hi Manoj,
You can create multiple records easily using a single, simple form. However, you would surely have to enter the times individually using a tabular form (otherwise, you would have to use the same form 10 times). You do not have to keep the Add Row option on the page; that functionality can be removed.
Andy -
To compare a single date with multiple dates
Hi All,
I am using a Fetch XML report in which there are two tables, Opportunity and Activity.
One Opportunity can have multiple Activities.
Opportunity has a field called new_dateofapplication,
and Activity has a field called StartDateActivity, which can have multiple values.
I need to compare these dates somehow, as below, but it didn't work:
=iif((Fields!new_dateofapplicationValue.Value)<=(Fields!StartDateActivityValue.Value),sum(Fields!ActivityDurationValue.Value),0)
for e.g.
Any help much appreciated.
Thanks
Pradnya07
Hi Simran08,
According to your description, you have two tables in your report. Now you want to compare the date in the Opportunity table with the date in the Activity table, and sum all activity hours whose date is later than the start date. Right?
In Reporting Services, if you want to compare two values in an IIF() function without using Lookup/LookupSet, the two data fields need to be in the same scope. Your dates come from different tables, and it seems you have different datasets for those tables. This may be the reason why you can't make your expression work in your report. For comparing data from different datasets we only have the Lookup/LookupSet functions, and they only work in conditions where the value of one field equals the value of the other field; we can never use them to judge whether the value of one field is greater or less than the value of the other. So in your scenario, it's impossible to find a solution if your Activity table only has those two columns.
Also, if you get the total hours that way (as you posted in your picture), you can't figure out how many of the total hours are spent on each program. But I guess your Activity table is supposed to have some fields showing client and program. If so, you just need to set groups in your table and get the total in the group footer. It will look like below:
Reference:
LookupSet Function (Report Builder and SSRS)
Best Regards,
Simon Hou
-
How do I Identify Lead and Lag in consecutive dates with multiple values?
I am using:
Oracle SQL Developer (3.0.04)
Build MAin-04.34
Oracle Database 11g
Enterprise Edition 11.2.0.1.0 - 64bit Production
I would like to identify the Leads and Lags based on two groupings.
The grouping is that multiple C_IDs can be found in a single W_ID,
and multiple D_VALs can be found in a C_ID.
So I would like to identify and mark "Lead" and "Lag" related to the "consecutivedaysc" column (this already matches the D_VAL and C_ID very well). For example, W_ID 2004 has C_ID 2059 with a D_VAL of 44 for April 2 & 3; the consecutive days are 2, and I would like to correctly mark April 2 as the Lead and April 3 as the Lag.
Then I would like to mark the "Lead" and "Lag" independent of whether there are multiple D_VALs on the same W_ID.
Example that I am having trouble with:
W_ID 2285 on April 10 for C_ID 7847: I don't understand why I get a "Lead" instead of a "Lag" related to the consecutivedaysw.
I would like to eventually have the data summarized based on W_ID and a single (non-repeating) dt with Leads and Lags.
table
with t as (
select 4592 U_KEY,0 D_VAL_PRESENT,2004 W_ID,to_date('4/1/2013','mm-dd-yyyy') dt,2059 C_ID, (null) D_VAL,0 GWID,13 GCID,1 CONSECUTIVEDAYSC,1 CONSECUTIVEDAYSW from dual union all
select 4591,1,2004,to_date('4/2/2013','mm-dd-yyyy'),2059,44,1,11,2,13 from dual union all
select 4590,1,2004,to_date('4/3/2013','mm-dd-yyyy'),2059,44,1,11,2,13 from dual union all
select 4589,1,2004,to_date('4/4/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4588,1,2004,to_date('4/5/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4587,1,2004,to_date('4/6/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4586,1,2004,to_date('4/7/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4585,1,2004,to_date('4/8/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4584,1,2004,to_date('4/9/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4583,1,2004,to_date('4/10/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4582,1,2004,to_date('4/11/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4581,1,2004,to_date('4/12/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4580,1,2004,to_date('4/13/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 4579,1,2004,to_date('4/14/2013','mm-dd-yyyy'),2059,389,1,0,11,13 from dual union all
select 1092,0,2686,to_date('4/1/2013','mm-dd-yyyy'),7210,(null),0,11,3,3 from dual union all
select 3416,0,2686,to_date('4/1/2013','mm-dd-yyyy'),7211,(null),0,11,3,3 from dual union all
select 18118,0,2686,to_date('4/1/2013','mm-dd-yyyy'),17391,(null),0,11,3,3 from dual union all
select 1091,0,2686,to_date('4/2/2013','mm-dd-yyyy'),7210,(null),0,11,3,3 from dual union all
select 3415,0,2686,to_date('4/2/2013','mm-dd-yyyy'),7211,(null),0,11,3,3 from dual union all
select 18117,0,2686,to_date('4/2/2013','mm-dd-yyyy'),17391,(null),0,11,3,3 from dual union all
select 1090,0,2686,to_date('4/3/2013','mm-dd-yyyy'),7210,(null),0,11,3,3 from dual union all
select 3414,0,2686,to_date('4/3/2013','mm-dd-yyyy'),7211,(null),0,11,3,3 from dual union all
select 18116,0,2686,to_date('4/3/2013','mm-dd-yyyy'),17391,(null),0,11,3,3 from dual union all
select 1089,1,2686,to_date('4/4/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3413,1,2686,to_date('4/4/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18115,1,2686,to_date('4/4/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1088,1,2686,to_date('4/5/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3412,1,2686,to_date('4/5/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18114,1,2686,to_date('4/5/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1087,1,2686,to_date('4/6/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3411,1,2686,to_date('4/6/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18113,1,2686,to_date('4/6/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1086,1,2686,to_date('4/7/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3410,1,2686,to_date('4/7/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18112,1,2686,to_date('4/7/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1085,1,2686,to_date('4/8/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3409,1,2686,to_date('4/8/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18111,1,2686,to_date('4/8/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1084,1,2686,to_date('4/9/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3408,1,2686,to_date('4/9/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18110,1,2686,to_date('4/9/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1083,1,2686,to_date('4/10/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3407,1,2686,to_date('4/10/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18109,1,2686,to_date('4/10/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1082,1,2686,to_date('4/11/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3406,1,2686,to_date('4/11/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18108,1,2686,to_date('4/11/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1081,1,2686,to_date('4/12/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3405,1,2686,to_date('4/12/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18107,1,2686,to_date('4/12/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1080,1,2686,to_date('4/13/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3404,1,2686,to_date('4/13/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18106,1,2686,to_date('4/13/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 1079,1,2686,to_date('4/14/2013','mm-dd-yyyy'),7210,51,9,0,11,11 from dual union all
select 3403,1,2686,to_date('4/14/2013','mm-dd-yyyy'),7211,51,9,0,11,11 from dual union all
select 18105,1,2686,to_date('4/14/2013','mm-dd-yyyy'),17391,51,9,0,11,11 from dual union all
select 17390,1,3034,to_date('4/1/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17389,1,3034,to_date('4/2/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17388,1,3034,to_date('4/3/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17387,1,3034,to_date('4/4/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17386,1,3034,to_date('4/5/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 7305,1,3034,to_date('4/6/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17385,1,3034,to_date('4/6/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14123,1,3034,to_date('4/6/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 17384,1,3034,to_date('4/7/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 17383,1,3034,to_date('4/8/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 7302,1,3034,to_date('4/9/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17382,1,3034,to_date('4/9/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14120,1,3034,to_date('4/9/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7301,1,3034,to_date('4/10/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17381,1,3034,to_date('4/10/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14119,1,3034,to_date('4/10/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7300,1,3034,to_date('4/11/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17380,1,3034,to_date('4/11/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14118,1,3034,to_date('4/11/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7299,1,3034,to_date('4/12/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17379,1,3034,to_date('4/12/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14117,1,3034,to_date('4/12/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7298,1,3034,to_date('4/13/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17378,1,3034,to_date('4/13/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14116,1,3034,to_date('4/13/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual union all
select 7297,1,3034,to_date('4/14/2013','mm-dd-yyyy'),5394,44,0,0,7,14 from dual union all
select 17377,1,3034,to_date('4/14/2013','mm-dd-yyyy'),5395,298,0,0,14,14 from dual union all
select 14115,1,3034,to_date('4/14/2013','mm-dd-yyyy'),22421,44,0,0,7,14 from dual
)
the script that I am using:
select
t.*,
case
  when lag(dt) over(partition by c_id, d_val order by dt, u_key) + 1 = dt
    then 'Lag'
  when lead(dt) over(partition by c_id, d_val order by dt, u_key) - 1 = dt
    then 'Lead_1'
  when consecutivedaysc = 1
    then 'Lead_3'
  else 'wrong'
end LeadLagD_VAL,
case
  when lag(dt) over(partition by w_id, c_id, d_val_present, gwid order by dt) + 1 = dt
    then 'Lag'
  when lead(dt) over(partition by w_id, c_id, d_val_present, gwid order by dt) - 1 = dt
    then 'Lead_A'
  when consecutivedaysw = 1
    then 'Lead_B'
  else 'wrong'
end Lead_Lag2
from t
order by
W_ID,
dt asc,
C_ID asc
;
the results should look like this (but I am having issues):
u_key D_VAL_PRESENT W_ID C_ID DT D_VAL GWID GCID CONSECUTIVEDAYSC CONSECUTIVEDAYSW LEADLAGD_VAL LEAD_LAG2
4592 0 2004 2059 01-APR-13 0 13 1 1 Lead_1 Lead_A
4591 1 2004 2059 02-APR-13 44 1 11 2 13 Lead_1 Lead_A
4590 1 2004 2059 03-APR-13 44 1 11 2 13 Lag Lag
4589 1 2004 2059 04-APR-13 389 1 0 11 13 Lead_1 Lag
4588 1 2004 2059 05-APR-13 389 1 0 11 13 Lag Lag
4587 1 2004 2059 06-APR-13 389 1 0 11 13 Lag Lag
4586 1 2004 2059 07-APR-13 389 1 0 11 13 Lag Lag
4585 1 2004 2059 08-APR-13 389 1 0 11 13 Lag Lag
4584 1 2004 2059 09-APR-13 389 1 0 11 13 Lag Lag
4583 1 2004 2059 10-APR-13 389 1 0 11 13 Lag Lag
4582 1 2004 2059 11-APR-13 389 1 0 11 13 Lag Lag
4581 1 2004 2059 12-APR-13 389 1 0 11 13 Lag Lag
4580 1 2004 2059 13-APR-13 389 1 0 11 13 Lag Lag
4579 1 2004 2059 14-APR-13 389 1 0 11 13 Lag Lag
1092 0 2686 7210 01-APR-13 0 11 3 3 Lead_1 Lead_A
3416 0 2686 7211 01-APR-13 0 11 3 3 Lead_1 Lead_A
18118 0 2686 17391 01-APR-13 0 11 3 3 Lead_1 Lead_A
1091 0 2686 7210 02-APR-13 0 11 3 3 Lag Lag
3415 0 2686 7211 02-APR-13 0 11 3 3 Lag Lag
18117 0 2686 17391 02-APR-13 0 11 3 3 Lag Lag
1090 0 2686 7210 03-APR-13 0 11 3 3 Lag Lag
3414 0 2686 7211 03-APR-13 0 11 3 3 Lag Lag
18116 0 2686 17391 03-APR-13 0 11 3 3 Lag Lag
1089 1 2686 7210 04-APR-13 51 9 0 11 11 Lead_1 Lead_A
3413 1 2686 7211 04-APR-13 51 9 0 11 11 Lead_1 Lead_A
18115 1 2686 17391 04-APR-13 51 9 0 11 11 Lead_1 Lead_A
1088 1 2686 7210 05-APR-13 51 9 0 11 11 Lag Lag
3412 1 2686 7211 05-APR-13 51 9 0 11 11 Lag Lag
18114 1 2686 17391 05-APR-13 51 9 0 11 11 Lag Lag
1087 1 2686 7210 06-APR-13 51 9 0 11 11 Lag Lag
3411 1 2686 7211 06-APR-13 51 9 0 11 11 Lag Lag
18113 1 2686 17391 06-APR-13 51 9 0 11 11 Lag Lag
1086 1 2686 7210 07-APR-13 51 9 0 11 11 Lag Lag
3410 1 2686 7211 07-APR-13 51 9 0 11 11 Lag Lag
18112 1 2686 17391 07-APR-13 51 9 0 11 11 Lag Lag
1085 1 2686 7210 08-APR-13 51 9 0 11 11 Lag Lag
3409 1 2686 7211 08-APR-13 51 9 0 11 11 Lag Lag
18111 1 2686 17391 08-APR-13 51 9 0 11 11 Lag Lag
1084 1 2686 7210 09-APR-13 51 9 0 11 11 Lag Lag
3408 1 2686 7211 09-APR-13 51 9 0 11 11 Lag Lag
18110 1 2686 17391 09-APR-13 51 9 0 11 11 Lag Lag
1083 1 2686 7210 10-APR-13 51 9 0 11 11 Lag Lag
3407 1 2686 7211 10-APR-13 51 9 0 11 11 Lag Lag
18109 1 2686 17391 10-APR-13 51 9 0 11 11 Lag Lag
1082 1 2686 7210 11-APR-13 51 9 0 11 11 Lag Lag
3406 1 2686 7211 11-APR-13 51 9 0 11 11 Lag Lag
18108 1 2686 17391 11-APR-13 51 9 0 11 11 Lag Lag
1081 1 2686 7210 12-APR-13 51 9 0 11 11 Lag Lag
3405 1 2686 7211 12-APR-13 51 9 0 11 11 Lag Lag
18107 1 2686 17391 12-APR-13 51 9 0 11 11 Lag Lag
1080 1 2686 7210 13-APR-13 51 9 0 11 11 Lag Lag
3404 1 2686 7211 13-APR-13 51 9 0 11 11 Lag Lag
18106 1 2686 17391 13-APR-13 51 9 0 11 11 Lag Lag
1079 1 2686 7210 14-APR-13 51 9 0 11 11 Lag Lag
3403 1 2686 7211 14-APR-13 51 9 0 11 11 Lag Lag
18105 1 2686 17391 14-APR-13 51 9 0 11 11 Lag Lag
17390 1 3034 5395 01-APR-13 298 0 0 14 14 Lead_1 Lead_A
17389 1 3034 5395 02-APR-13 298 0 0 14 14 Lag Lag
17388 1 3034 5395 03-APR-13 298 0 0 14 14 Lag Lag
17387 1 3034 5395 04-APR-13 298 0 0 14 14 Lag Lag
17386 1 3034 5395 05-APR-13 298 0 0 14 14 Lag Lag
7305 1 3034 5394 06-APR-13 44 0 0 7 14 Lead_1 Lag
17385 1 3034 5395 06-APR-13 298 0 0 14 14 Lag Lag
14123 1 3034 22421 06-APR-13 44 0 0 7 14 Lead_1 Lag
17384 1 3034 5395 07-APR-13 298 0 0 14 14 Lag Lag
17383 1 3034 5395 08-APR-13 298 0 0 14 14 Lag Lag
7302 1 3034 5394 09-APR-13 44 0 0 7 14 Lead_1 Lag
17382 1 3034 5395 09-APR-13 298 0 0 14 14 Lag Lag
14120 1 3034 22421 09-APR-13 44 0 0 7 14 Lead_1 Lag
7301 1 3034 5394 10-APR-13 44 0 0 7 14 Lag Lag
17381 1 3034 5395 10-APR-13 298 0 0 14 14 Lag Lag
14119 1 3034 22421 10-APR-13 44 0 0 7 14 Lag Lag
7300 1 3034 5394 11-APR-13 44 0 0 7 14 Lag Lag
17380 1 3034 5395 11-APR-13 298 0 0 14 14 Lag Lag
14118 1 3034 22421 11-APR-13 44 0 0 7 14 Lag Lag
7299 1 3034 5394 12-APR-13 44 0 0 7 14 Lag Lag
17379 1 3034 5395 12-APR-13 298 0 0 14 14 Lag Lag
14117 1 3034 22421 12-APR-13 44 0 0 7 14 Lag Lag
7298 1 3034 5394 13-APR-13 44 0 0 7 14 Lag Lag
17378 1 3034 5395 13-APR-13 298 0 0 14 14 Lag Lag
14116 1 3034 22421 13-APR-13 44 0 0 7 14 Lag Lag
7297 1 3034 5394 14-APR-13 44 0 0 7 14 Lag Lag
17377 1 3034 5395 14-APR-13 298 0 0 14 14 Lag Lag
14115 1 3034 22421 14-APR-13 44 0 0 7 14 Lag Lag
I marked the "wrong" rows to show that neither of the WHEN conditions was able to work as intended.
Any suggestions on a better direction to solve this?
Edited by: 1004407 on May 23, 2013 1:16 PM
Then I am trying to get this (not including C_ID):
u_key D_VAL_PRESENT W_ID DT CONSECUTIVEDAYSW LEAD_LAG2
4592 0 2004 01-APR-13 1 Lead_A
4591 1 2004 02-APR-13 13 Lead_A
4590 1 2004 03-APR-13 13 Lag
4589 1 2004 04-APR-13 13 Lag
4588 1 2004 05-APR-13 13 Lag
4587 1 2004 06-APR-13 13 Lag
4586 1 2004 07-APR-13 13 Lag
4585 1 2004 08-APR-13 13 Lag
4584 1 2004 09-APR-13 13 Lag
4583 1 2004 10-APR-13 13 Lag
4582 1 2004 11-APR-13 13 Lag
4581 1 2004 12-APR-13 13 Lag
4580 1 2004 13-APR-13 13 Lag
4579 1 2004 14-APR-13 13 Lag
1092 0 2686 01-APR-13 3 Lead_A
1091 0 2686 02-APR-13 3 Lag
1090 0 2686 03-APR-13 3 Lag
1089 1 2686 04-APR-13 11 Lead_A
1088 1 2686 05-APR-13 11 Lag
1087 1 2686 06-APR-13 11 Lag
1086 1 2686 07-APR-13 11 Lag
1085 1 2686 08-APR-13 11 Lag
1084 1 2686 09-APR-13 11 Lag
1083 1 2686 10-APR-13 11 Lag
1082 1 2686 11-APR-13 11 Lag
1081 1 2686 12-APR-13 11 Lag
1080 1 2686 13-APR-13 11 Lag
1079 1 2686 14-APR-13 11 Lag
17390 1 3034 01-APR-13 14 Lead_A
17389 1 3034 02-APR-13 14 Lag
17388 1 3034 03-APR-13 14 Lag
17387 1 3034 04-APR-13 14 Lag
17386 1 3034 05-APR-13 14 Lag
7305 1 3034 06-APR-13 14 Lag
17384 1 3034 07-APR-13 14 Lag
17383 1 3034 08-APR-13 14 Lag
7302 1 3034 09-APR-13 14 Lag
7301 1 3034 10-APR-13 14 Lag
7300 1 3034 11-APR-13 14 Lag
7299 1 3034 12-APR-13 14 Lag
7298 1 3034 13-APR-13 14 Lag
7297 1 3034 14-APR-13 14 Lag
Then into this (which I would use to filter where LEAD_LAG2 = "Lead_A"):
u_key D_VAL_PRESENT W_ID DT CONSECUTIVEDAYSW LEAD_LAG2
4592 0 2004 01-APR-13 1 Lead_A
4591 1 2004 02-APR-13 13 Lead_A
1092 0 2686 01-APR-13 3 Lead_A
11089 1 2686 04-APR-13 11 Lead_A
17390 1 3034 01-APR-13 14 Lead_A
But one thing at a time.
Thanks for pointing out the errors, Frank; it is always helpful to know what others see.
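The final step described above is just a filter on LEAD_LAG2. As an illustrative sketch (the function name and row objects are hypothetical, not the poster's actual code):

```javascript
// Hypothetical sketch: keep only the "Lead_A" rows, which mark the start of
// each consecutive-day run per W_ID. Row shapes mirror the tables above.
function leadRowsOnly(rows) {
  return rows.filter(function (r) {
    return r.LEAD_LAG2 === "Lead_A";
  });
}
```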
Edited by: 1004407 on May 23, 2013 2:36 PM
Edited by: 1004407 on May 23, 2013 4:01 PM
Is this the first set you expect?
SQL> with flagged as
     (
       select w_id, d_val, dt, u_key,
              case when lag(dt) over (partition by w_id, d_val order by dt, u_key)
                   in (dt, dt - 1)
                   then 0
                   else 1
              end flg
       from t
     ),
     summed as
     (
       select w_id, d_val, dt, u_key,
              sum(flg) over (order by w_id, d_val nulls first, dt, u_key) sm
       from flagged
     ),
     day_count as
     (
       select w_id, d_val, dt, u_key,
              count(distinct dt) over (partition by sm) cnt
       from summed
     )
     select w_id, d_val, dt, u_key, cnt
     from day_count
     order by w_id, d_val nulls first, dt, u_key;
W_ID D_VAL DT U_KEY CNT
2004 01-APR-13 4592 1
2004 44 02-APR-13 4591 2
2004 44 03-APR-13 4590 2
2004 389 04-APR-13 4589 11
2004 389 05-APR-13 4588 11
2004 389 06-APR-13 4587 11
2004 389 07-APR-13 4586 11
2004 389 08-APR-13 4585 11
2004 389 09-APR-13 4584 11
2004 389 10-APR-13 4583 11
2004 389 11-APR-13 4582 11
2004 389 12-APR-13 4581 11
2004 389 13-APR-13 4580 11
2004 389 14-APR-13 4579 11
2686 01-APR-13 1092 3
2686 01-APR-13 3416 3
2686 01-APR-13 18118 3
2686 02-APR-13 1091 3
2686 02-APR-13 3415 3
2686 02-APR-13 18117 3
2686 03-APR-13 1090 3
2686 03-APR-13 3414 3
2686 03-APR-13 18116 3
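The query above is a gaps-and-islands pattern: flag a break whenever the previous date in the (w_id, d_val) partition is neither the same day nor the previous day, running-sum the flags into island ids, then count distinct dates per island. A minimal JavaScript sketch of the same logic (illustrative only; the function name is hypothetical, and rows are assumed pre-sorted by w_id, d_val, dt with ISO date strings):

```javascript
// Re-implementation of the query's grouping logic, for illustration only.
function consecutiveDayCounts(rows) {
  var DAY = 24 * 60 * 60 * 1000;
  var groupId = 0;
  var prev = null;
  // Steps 1 and 2: flag breaks, running-sum them into island ids (sm).
  var grouped = rows.map(function (r) {
    var newPartition = prev === null || prev.w_id !== r.w_id || prev.d_val !== r.d_val;
    var gap = prev !== null &&
              (Date.parse(r.dt) - Date.parse(prev.dt)) > DAY; // LAG(dt) NOT IN (dt, dt-1)
    if (newPartition || gap) { groupId += 1; }
    prev = r;
    return { row: r, sm: groupId };
  });
  // Step 3: COUNT(DISTINCT dt) OVER (PARTITION BY sm).
  var dates = {};
  grouped.forEach(function (g) {
    if (!dates[g.sm]) { dates[g.sm] = {}; }
    dates[g.sm][g.row.dt] = true;
  });
  return grouped.map(function (g) {
    return {
      w_id: g.row.w_id, d_val: g.row.d_val, dt: g.row.dt,
      cnt: Object.keys(dates[g.sm]).length
    };
  });
}
```

The key point is that the running sum only advances at a break, so every row of an unbroken run shares the same island id and therefore the same day count.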
.....
-
HTML Client Entering/Editing Data With Multiple Tables
Hi guys,
I've been stumped on what seems to be a relatively simple problem for quite some time now. I have a bunch of Products - Dairy, Meat, Produce and Candy, for example. Each Product has multiple ProductConfigurations - say, Monday, Tuesday, Wednesday, Thursday,
and Weekend. Each Product also has its own unique set of ProductAttributes; Meat may have Animal, ExpirationDate, CostPerPound, Kosher for example, while Candy may have Type, Manufacturer, Price, Distributor, and Dairy may have Source, ExpirationDate, UnitCost.
These are just examples, but the point is each Product has unique ProductAttributes.
For each ProductConfiguration (which has a Product), I want to be able to go to this ProductConfiguration and see a list of all its ProductAttributes, and provide an option to edit the ProductAttributeValue (a separate table) for each ProductAttribute. A
ProductAttributeValue has a Value, a ProductAttribute (ie, Manufacturer) and a ProductConfiguration (ie, Tuesday).
The way I currently have it set up is with a clickable List of all ProductAttributes for a given Product (which I get by querying the filter 'ProductAttribute.ProductID = Screen.ProductID' from the ProductAttributes tables). When you click, it brings up
an 'AddNewProductAttributeValue' screen, in which the ProductAttribute is already populated. However, this requires a looot of clicks for Product(Configurations) which have many ProductAttributes. I'd love to be able to configure all of the ProductAttributeValues
for a given ProductConfiguration on one screen.
Any ideas how I could do this?
Matt, there are two bits to get the in-built table to play nicely.
// This makes the input textbox focus work better
myapp.AddEditProductConfiguration.created = function (screen) {
    var scr = screen;
    scr.ProductAttributeValues.addChangeListener("selectedItem", function (eventArgs) {
        if (scr.ProductAttributeValues.selectedItem != null) {
            var item = scr.ProductAttributeValues.selectedItem;
            setTimeout(function () {
                $("input", item.details.focusElement).focus();
            }, 100);
        }
    });
};
myapp.AddEditProductConfiguration.Value_postRender = function (element, contentItem) {
    contentItem.details._details.focusElement = element;
};
... the maintenance screen is based on a standard 'AddEdit<entity>' screen tweaked and in msls-2.5.2.js (or earlier) ...
case $.mobile.keyCode.TAB:
//Xpert360: tab
//updateSelectedHeader(table, table._headerRowElement[0]);
break;
... to stop the keyup handler messing up tabbing between input textboxes.
Some code similar to this could pre-populate (or verify) that a configuration has rows of ProductAttributeValues matching ProductAttributes ...
scr.getProductAttributeValues().then(function (values) {
    if (scr.ProductAttributeValues.count <= 0) {
        scr.getProductAttributes().then(function (values) {
            if (scr.ProductAttributes.count > 0) {
                scr.ProductAttributes.data.forEach(function (attr, idx, obj) {
                    var attrvalue = scr.ProductAttributeValues.addNew();
                    attrvalue.ProductAttribute = scr.ProductAttributes.data[idx];
                    attrvalue.ProductConfiguration = scr.ProductConfiguration;
                    attrvalue.Value = "";
                });
            }
        });
    }
});
... which could be used in a variety of places.
I will look to uploading the sample application.
Dave
Dave Baker | AIDE for LightSwitch | Xpert360 blog | twitter : @xpert360 | Xpert360 website | Opinions are my own. For better forums, remember to mark posts as helpful/answer.