Transformation of Rows to Columns in HANA
Hi All,
I have a requirement to convert rows to columns.
For example -
I have data in my view as below -
My requirement is to convert the above into a columnar structure as -
Please suggest how I can achieve this.
Regards,
Nakul Kothari
+9987039379
Hi Nakul,
Here are the points you need to reach your goal; this is what it would look like in SQL:
select
    status_start_date,
    sum(s1) as s_initiated,
    sum(s2) as s_cleared,
    sum(s3) as s_scf_associated,
    sum(s4) as s_scf_supervisor
from (
    select status_start_date, count(*) as s1, 0 as s2, 0 as s3, 0 as s4
    from _sys_bic."Spend/AT_GES_CLAIMS_FLAG_DTLS"
    where status_start_date between '2014-01-24' and '2014-01-30' and status = 'Initiated'
    group by status_start_date
    union all
    select status_start_date, 0 as s1, count(*) as s2, 0 as s3, 0 as s4
    from _sys_bic."Spend/AT_GES_CLAIMS_FLAG_DTLS"
    where status_start_date between '2014-01-24' and '2014-01-30' and status = 'Cleared'
    group by status_start_date
    union all
    select status_start_date, 0 as s1, 0 as s2, count(*) as s3, 0 as s4
    from _sys_bic."Spend/AT_GES_CLAIMS_FLAG_DTLS"
    where status_start_date between '2014-01-24' and '2014-01-30' and status = 'Seek Clarification from Associate'
    group by status_start_date
    union all
    select status_start_date, 0 as s1, 0 as s2, 0 as s3, count(*) as s4
    from _sys_bic."Spend/AT_GES_CLAIMS_FLAG_DTLS"
    where status_start_date between '2014-01-24' and '2014-01-30' and status = 'Seek Clarification from Supervisor'
    group by status_start_date
)
group by status_start_date
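The four UNION ALL branches can also be collapsed into a single conditional-aggregation query, which scans the view once. A minimal sketch in Python, with sqlite3 standing in for HANA; the table name and sample data are invented for illustration:

```python
import sqlite3

# Illustrative stand-in for the HANA view; names and data are assumptions.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE claims (status_start_date TEXT, status TEXT)")
con.executemany(
    "INSERT INTO claims VALUES (?, ?)",
    [
        ("2014-01-24", "Initiated"),
        ("2014-01-24", "Initiated"),
        ("2014-01-24", "Cleared"),
        ("2014-01-25", "Seek Clarification from Associate"),
    ],
)

# One pass with conditional aggregation instead of four UNION ALL branches.
rows = con.execute("""
    SELECT status_start_date,
           SUM(CASE WHEN status = 'Initiated' THEN 1 ELSE 0 END) AS s_initiated,
           SUM(CASE WHEN status = 'Cleared'   THEN 1 ELSE 0 END) AS s_cleared,
           SUM(CASE WHEN status = 'Seek Clarification from Associate'  THEN 1 ELSE 0 END) AS s_scf_associated,
           SUM(CASE WHEN status = 'Seek Clarification from Supervisor' THEN 1 ELSE 0 END) AS s_scf_supervisor
    FROM claims
    WHERE status_start_date BETWEEN '2014-01-24' AND '2014-01-30'
    GROUP BY status_start_date
    ORDER BY status_start_date
""").fetchall()
print(rows)  # [('2014-01-24', 2, 1, 0, 0), ('2014-01-25', 0, 0, 1, 0)]
```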
Regards, Fernando Da Rós
Similar Messages
-
Row to Column XSL Transform in BLS
I have 11.5, sr3.
I was going to use the XSLTransformation action to swap rows and columns of a data set using /Illuminator/stylesheets/RowToColumnTransform.xsl. I cannot get anything but the following as an output:
<?xml version="1.0" encoding="UTF-8"?><Rowsets DateCreated="2007-12-12T13:27:29" EndDate="2007-12-03T08:09:17" StartDate="2007-12-03T08:09:17" Version="11.5.3"><Rowset><Columns/><Row/></Rowset></Rowsets>
There are no errors, I just don't get the result. The input data set is as follows:
<?xml version="1.0" encoding="UTF-8"?><Rowsets DateCreated="2007-12-12T13:27:29" EndDate="2007-12-03T08:09:17" StartDate="2007-12-03T08:09:17" Version="11.5.3"><Rowset><Columns><Column Description="" MaxRange="1" MinRange="0" Name="User_ID" SQLDataType="1" SourceColumn="User_ID"/><Column Description="" MaxRange="1" MinRange="0" Name="User_Name" SQLDataType="1" SourceColumn="User_Name"/><Column Description="" MaxRange="1" MinRange="0" Name="Sample_Number" SQLDataType="4" SourceColumn="Sample_Number"/><Column Description="User Name" MaxRange="1" MinRange="0" Name="Login_By" SQLDataType="1" SourceColumn="Login_By"/><Column Description="Examination Type" MaxRange="1" MinRange="0" Name="Examination" SQLDataType="1" SourceColumn="Examination"/><Column Description="" MaxRange="1" MinRange="0" Name="Examination_Title" SQLDataType="1" SourceColumn="Examination_Title"/><Column Description="" MaxRange="1" MinRange="0" Name="Examination_Desc" SQLDataType="1" SourceColumn="Examination_Desc"/><Column Description="" MaxRange="1" MinRange="0" Name="Test_Number" SQLDataType="4" SourceColumn="Test_Number"/><Column Description="" MaxRange="1" MinRange="0" Name="Sample_Time" SQLDataType="93" SourceColumn="Sample_Time"/><Column Description="" MaxRange="1" MinRange="0" Name="Sample_Status" SQLDataType="1" SourceColumn="Sample_Status"/><Column Description="" MaxRange="1" MinRange="0" Name="Authorize" SQLDataType="1" SourceColumn="Authorize"/><Column Description="" MaxRange="1" MinRange="0" Name="Total_Defects" SQLDataType="4" SourceColumn="Total_Defects"/><Column Description="" MaxRange="1" MinRange="0" Name="Carton_Code_Date" SQLDataType="1" SourceColumn="Carton_Code_Date"/><Column Description="" MaxRange="1" MinRange="0" Name="Package_Code_Date" SQLDataType="1" SourceColumn="Package_Code_Date"/><Column Description="" MaxRange="1" MinRange="0" Name="Sample_1_Container" SQLDataType="1" SourceColumn="Sample_1_Container"/><Column Description="" MaxRange="1" MinRange="0" 
Name="Sample_2_Container" SQLDataType="1" SourceColumn="Sample_2_Container"/><Column Description="" MaxRange="1" MinRange="0" Name="Sample_1_Container_Get" SQLDataType="1" SourceColumn="Sample_1_Container_Get"/><Column Description="" MaxRange="1" MinRange="0" Name="Sample_2_Container_Get" SQLDataType="1" SourceColumn="Sample_2_Container_Get"/><Column Description="" MaxRange="1" MinRange="0" Name="Machine_Scanned" SQLDataType="1" SourceColumn="Machine_Scanned"/><Column Description="" MaxRange="1" MinRange="0" Name="Machine_Shift_Flag" SQLDataType="4" SourceColumn="Machine_Shift_Flag"/><Column Description="" MaxRange="1" MinRange="0" Name="Maker_Name" SQLDataType="1" SourceColumn="Maker_Name"/><Column Description="" MaxRange="1" MinRange="0" Name="Packer_Name" SQLDataType="1" SourceColumn="Packer_Name"/><Column Description="" MaxRange="1" MinRange="0" Name="Sample_Size" SQLDataType="4" SourceColumn="Sample_Size"/><Column Description="SAP Product Code of the associated Cigarette Audit if Linked" MaxRange="1" MinRange="0" Name="Associated_Cig_Audit_SAP_Code" SQLDataType="1" SourceColumn="Associated_Cig_Audit_SAP_Code"/><Column Description="SAP Product Code of the associated Pack Audit if Linked" MaxRange="1" MinRange="0" Name="Associated_Pack_Audit_SAP_Code" SQLDataType="1" SourceColumn="Associated_Pack_Audit_SAP_Code"/><Column Description="Sample Number of the associated CIg Audit if Linked" MaxRange="1" MinRange="0" Name="Associated_Cig_Audit_Sample_Number" SQLDataType="1" SourceColumn="Associated_Cig_Audit_Sample_Number"/><Column Description="Sample Number of the associated Pack Audit if Linked" MaxRange="1" MinRange="0" Name="Associated_Pack_Audit_Sample_Number" SQLDataType="1" SourceColumn="Associated_Pack_Audit_Sample_Number"/><Column Description="" MaxRange="1" MinRange="0" Name="Machine_SAP_Code" SQLDataType="1" SourceColumn="Machine_SAP_Code"/><Column Description="" MaxRange="1" MinRange="0" Name="Machine_SAP_Desc" SQLDataType="1" 
SourceColumn="Machine_SAP_Desc"/><Column Description="" MaxRange="1" MinRange="0" Name="Maker_SAP_Code" SQLDataType="1" SourceColumn="Maker_SAP_Code"/><Column Description="" MaxRange="1" MinRange="0" Name="Maker_SAP_Desc" SQLDataType="1" SourceColumn="Maker_SAP_Desc"/><Column Description="" MaxRange="1" MinRange="0" Name="Packer_SAP_Code" SQLDataType="1" SourceColumn="Packer_SAP_Code"/><Column Description="" MaxRange="1" MinRange="0" Name="Packer_SAP_Desc" SQLDataType="1" SourceColumn="Packer_SAP_Desc"/><Column Description="" MaxRange="1" MinRange="0" Name="Action" SQLDataType="1" SourceColumn="Action"/></Columns><Row><Sample_Number>46</Sample_Number><Examination>MKNG_PQC_PACK</Examination><Examination_Title>PQC Pack Audit</Examination_Title><Examination_Desc>Making & Packing PQC Pack Audit Sample Template</Examination_Desc><User_ID>clmf90</User_ID><User_Name></User_Name><Login_By>SYSTEM</Login_By><Test_Number>63</Test_Number><Sample_Time>2007-12-12T13:46:52</Sample_Time><Sample_Status>Complete</Sample_Status><Authorize>No</Authorize><Total_Defects>1</Total_Defects><Carton_Code_Date>-</Carton_Code_Date><Package_Code_Date>7T28D205 11:30</Package_Code_Date><Sample_1_Container>01-01</Sample_1_Container><Sample_2_Container>-</Sample_2_Container><Sample_1_Container_Get>01-01</Sample_1_Container_Get><Sample_2_Container_Get>-</Sample_2_Container_Get><Machine_Scanned>U-MAKER-205</Machine_Scanned><Maker_Name>0205</Maker_Name><Machine_Shift_Flag>1</Machine_Shift_Flag><Packer_Name>0205</Packer_Name><Sample_Size>2</Sample_Size><Associated_Cig_Audit_SAP_Code>2001155</Associated_Cig_Audit_SAP_Code><Associated_Pack_Audit_SAP_Code>-</Associated_Pack_Audit_SAP_Code><Associated_Cig_Audit_Sample_Number>MKNG-PQC-CIG-20071128-0004</Associated_Cig_Audit_Sample_Number><Associated_Pack_Audit_Sample_Number>---</Associated_Pack_Audit_Sample_Number><Machine_SAP_Code></Machine_SAP_Code><Machine_SAP_Desc>MAVERICK LT MENT 
100</Machine_SAP_Desc><Maker_SAP_Code>2001155</Maker_SAP_Code><Maker_SAP_Desc>MAVERICK LT MENT 100</Maker_SAP_Desc><Packer_SAP_Desc>MAVERICK LT MENT 100</Packer_SAP_Desc><Packer_SAP_Code></Packer_SAP_Code></Row></Rowset></Rowsets>
What am I missing?
Sparks,
Any reason you are not using the VerticalGrid Applet?
Did you specify any of the parameters required by the XSL, such as ColumnID and ValueID?
The XSL appears to only translate a single row node to a column....
Try using this XSL:
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:java="http://xml.apache.org/xslt/java" xmlns:xalan="http://xml.apache.org/xalan" exclude-result-prefixes="xalan java">
<xsl:output encoding="UTF-8" method="xml" media-type="text/xml"/>
<xsl:template match="/">
<Rowsets DateCreated="{Rowsets/@DateCreated}" Version="{Rowsets/@Version}" StartDate="{Rowsets/@StartDate}" EndDate="{Rowsets/@EndDate}">
<xsl:for-each select="Rowsets">
<xsl:copy-of select="FatalError"/>
<xsl:copy-of select="Messages"/>
<xsl:copy-of select="HyperLinks"/>
<xsl:if test="count(/Rowsets/FatalError) = '0'">
<Rowset>
<Columns>
<Column Name="Name" SourceColumn="Name" Description="Name" SQLDataType="1" MinRange="0.0" MaxRange="1.0"/>
<Column Name="Value" SourceColumn="Value" Description="Value" SQLDataType="1" MinRange="0.0" MaxRange="1.0"/>
</Columns>
<xsl:for-each select="/Rowsets/Rowset/Row/*[name()]">
<Row>
<xsl:element name="Name">
<xsl:value-of select="name(.)"/>
</xsl:element>
<xsl:element name="Value">
<xsl:value-of select="."/>
</xsl:element>
</Row>
</xsl:for-each>
</Rowset>
</xsl:if>
</xsl:for-each>
</Rowsets>
</xsl:template>
</xsl:stylesheet>
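What that stylesheet does can be sketched in a few lines of Python: each child element of a Row becomes one (Name, Value) output row. The sample XML below is a trimmed, invented stand-in for the full Rowsets document:

```python
import xml.etree.ElementTree as ET

# Minimal stand-in for the xMII Rowsets document; element names match the
# thread, the values are illustrative.
src = """<Rowsets><Rowset><Row>
<Sample_Number>46</Sample_Number>
<Sample_Status>Complete</Sample_Status>
</Row></Rowset></Rowsets>"""

root = ET.fromstring(src)
# Every child of every Row becomes a (Name, Value) pair, exactly like the
# xsl:for-each over /Rowsets/Rowset/Row/* above.
pairs = [(child.tag, (child.text or "").strip())
         for row in root.iter("Row") for child in row]
print(pairs)  # [('Sample_Number', '46'), ('Sample_Status', 'Complete')]
```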
Sam -
SAP HANA View Row to Column Transpose
Hello Guys
What is the best way to transpose rows to columns within a HANA model? I can find many articles that talk about column-to-row transposes.
Source:
Customer | Dim 1 | Dim 2
1000     | A     | 1100
1000     | B     | 1200
1000     | C     | 1300
Target:
Customer | A    | B    | C
1000     | 1100 | 1200 | 1300
Thanks
Not an answer, but a question: are there always exactly three rows only, i.e. can that be guaranteed? Or do you want the number of columns to be dynamic, i.e. count the rows per customer and then create the required number of columns?
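When the Dim 1 values are known in advance (exactly the limitation the reply above asks about), the usual modelling answer is one conditional aggregate per value. A minimal sketch using Python/sqlite3 as a stand-in for the HANA calculation:

```python
import sqlite3

# Sample data copied from the thread; A/B/C are assumed to be a fixed,
# known set of Dim 1 values.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src (customer INTEGER, dim1 TEXT, dim2 INTEGER)")
con.executemany("INSERT INTO src VALUES (?, ?, ?)",
                [(1000, "A", 1100), (1000, "B", 1200), (1000, "C", 1300)])

# One output column per Dim 1 value, collapsed to a single row per customer.
row = con.execute("""
    SELECT customer,
           MAX(CASE WHEN dim1 = 'A' THEN dim2 END) AS a,
           MAX(CASE WHEN dim1 = 'B' THEN dim2 END) AS b,
           MAX(CASE WHEN dim1 = 'C' THEN dim2 END) AS c
    FROM src
    GROUP BY customer
""").fetchone()
print(row)  # (1000, 1100, 1200, 1300)
```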
-
Map CDC op transform SEQUENCE and ROW operation column properties
I am using the Map_cdc_op transform; for the sequence column and row operation column properties, what fields should I choose?
My source table has the following columns:
RMID - primary key int
RMName - varchar
created_date datetime
Last_updated datetime
The target table also has the above 4 columns, plus one rmLogID column with int data type, which is the key column in the target.
Does the map_cdc_op transform also take care of the surrogate key column in the target, which is "rmlogid"?
Thank you very much for the helpful info.
Suneer, I am using a SQL Server database; this table is dragged to the workspace as the source:
RMID - primary key int
RMName - varchar
created_date datetime
Last_updated datetime
My task is to capture data changes from this table to the target table. For that I am using the Map_op_CDC transform; now, under the properties SEQUENCE and ROW oper column, which fields should I pick? I have never used this transform before.
RMid is the primary key in the source; the rest of the fields are changeable. I track the created_dt and last_updated dates for each row.
Thanks a lot for the helpful info. -
Transpose of columns to rows (unpivoting) and not rows to columns
Hi,
I am right now using Oracle 10g. I have the following specification. Here I specified only 5 columns;
we have up to 200 columns.
TRANS_ID PORTFILIO_NUM TICKER PRICE NUM_SHARES ........................................
2 100 MO 25.00 100 ........................................
3 100 MCD 31.50 100 ........................................
I want the above to be transformed into the following output .
TRANS_ID TYPE VALUE
2 PORTFILIO_NUM 100
2 TICKER MO
2 PRICE 25.00
2 NUM_SHARES 100.
I don't want to use a CASE/DECODE function by hard-coding the 200 columns.
Can anyone provide me a good (preferably dynamic) way of doing this?
I searched this whole forum and also other forums. Everywhere I could only find
rows-to-columns / columns-to-rows solutions where the column names have been hard-coded.
I want a dynamic way of doing it. Let me know if you need any other inputs.
DDL :
CREATE TABLE PORT_TRANS
(
TRANS_ID VARCHAR2(100 BYTE),
PORTFILIO_NUM VARCHAR2(100 BYTE),
TICKER VARCHAR2(100 BYTE),
PRICE VARCHAR2(100 BYTE),
NUM_SHARES VARCHAR2(100 BYTE)
);
INSERT INTO PORT_TRANS (TRANS_ID,PORTFILIO_NUM,TICKER,PRICE,NUM_SHARES)
VALUES('2','100','MO','25.00','100');
INSERT INTO PORT_TRANS (TRANS_ID,PORTFILIO_NUM,TICKER,PRICE,NUM_SHARES)
VALUES('3','100','MCD','31.50','100');
COMMIT;
Thanks,
Priya.
Hi,
What you're trying to write is something like this:
WITH cntr AS
(
SELECT LEVEL AS n
FROM dual
CONNECT BY LEVEL <= 4
)
SELECT p.trans_id
, CASE
WHEN c.n <= 2
THEN
CASE c.n
WHEN 1 THEN 'PORTFILIO_NUM'
WHEN 2 THEN 'TICKER'
END
ELSE
CASE c.n
WHEN 3 THEN 'PRICE'
WHEN 4 THEN 'NUM_SHARES'
END
END AS type
, CASE
WHEN c.n <= 2
THEN
CASE c.n
WHEN 1 THEN p.PORTFILIO_NUM
WHEN 2 THEN p.TICKER
END
ELSE
CASE c.n
WHEN 3 THEN p.PRICE
WHEN 4 THEN p.NUM_SHARES
END
END AS value
FROM port_trans p
CROSS JOIN cntr c
ORDER BY p.trans_id
, c.n
;
I wrote this as if CASE could only handle 2 choices, rather than 128, just to show how to nest CASE expressions.
What you have to do is write the CASE expressions, based on the contents of all_tab_columns.
In your sample data, all of the columns are VARCHAR2 (another design flaw). If you have any columns of other types, use TO_CHAR to convert them to VARCHAR2; that is, the final code to be run will have something like:
... WHEN 4 THEN TO_CHAR (p.NUM_SHARES)
If I had to do this, I might run several queries on all_tab_columns, each producing one script containing just a fragment of the query.
To run the whole thing, I would hard-code a main query like this
WITH cntr AS
(
SELECT LEVEL AS n
FROM dual
CONNECT BY LEVEL <=
@num_columns.sql
)
SELECT p.trans_id
, CASE
@type.sql
END AS type
, CASE
@value.sql
END AS value
FROM port_trans p
CROSS JOIN cntr c
ORDER BY p.trans_id
, c.n
;
As with any coding, start small and take baby steps. Maybe the first step would just be to write num_columns.sql, which just contains the number 4. When you can do that, hard-code the CONNECT BY query, calling num_columns.sql.
Good luck! -
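The script-generation idea above can be sketched end-to-end in a few lines of Python, with sqlite3's PRAGMA table_info standing in for ALL_TAB_COLUMNS; table and column names are taken from the sample DDL:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE port_trans
    (trans_id TEXT, portfilio_num TEXT, ticker TEXT, price TEXT, num_shares TEXT)""")
con.execute("INSERT INTO port_trans VALUES ('2','100','MO','25.00','100')")

# Read the column list from the catalog (stand-in for ALL_TAB_COLUMNS),
# then generate one CASE branch per column -- nothing is hard-coded.
cols = [r[1] for r in con.execute("PRAGMA table_info(port_trans)")][1:]  # skip trans_id
type_case  = " ".join(f"WHEN {i + 1} THEN '{c.upper()}'" for i, c in enumerate(cols))
value_case = " ".join(f"WHEN {i + 1} THEN {c}" for i, c in enumerate(cols))
sql = f"""
    SELECT p.trans_id,
           CASE c.n {type_case} END AS type,
           CASE c.n {value_case} END AS value
    FROM port_trans p
    CROSS JOIN (SELECT 1 AS n UNION SELECT 2 UNION SELECT 3 UNION SELECT 4) c
    ORDER BY p.trans_id, c.n
"""
rows = con.execute(sql).fetchall()
print(rows)
# [('2', 'PORTFILIO_NUM', '100'), ('2', 'TICKER', 'MO'),
#  ('2', 'PRICE', '25.00'), ('2', 'NUM_SHARES', '100')]
```

In Oracle the same generation step would query ALL_TAB_COLUMNS and wrap non-VARCHAR2 columns in TO_CHAR, as described above.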
How to transpose the data records (rows) to column(lists) using apd
Hi,
how do I transpose the data records (rows) to columns (lists) using an APD in SAP BW?
I do not want to use an ABAP routine; only the "transpose rows to list" transformation.
Please provide the step-by-step procedure.
thanks,
Nimai
Save your file to transpose as a CSV, and in the header row of your file, for the columns you want to transpose, you need to put some sort of tag before the column name (i.e. if your column header was for a period budget, it will be something like 2011001:ZFIBDD).
1. You will need to create a new APD process (transaction RSANWB).
2. Insert a "Read from Data File" data source object and map it to the file (.csv).
3. Insert a transpose object into your APD process (middle row, 4th one over in the transformations section).
4. Under the definition tab in the transformation object, select all the columns that are to be transposed into rows and move them to the transformed area; the grouping fields section should contain the rows that you want to now be columns.
5. Under the transformation tab, enter the separator you selected under the Field Name/InfoObject area (i.e. ZFIBDD).
6. Under the details tab, enter all the fields to be transformed and enter the transposition field (i.e. ZFIBDD).
7. Then you can insert a set of transformations and a DSO, and link the newly transposed fields into that.
Hope that helps -
How to convert rows to columns in sql server 2008
How to convert rows to columns in sql server 2008 using the GROUP BY function? (only one query allowed)
Lookup the Pivot transformation. From BOL:
The Pivot transformation makes a normalized data set into a less normalized but more compact version by pivoting the input data on a column value. For example, a normalized Orders data set that lists customer name, product, and quantity purchased typically has multiple rows for any customer who purchased multiple products, with each row for that customer showing order details for a different product. By pivoting the data set on the product column, the Pivot transformation can output a data set with a single row per customer. That single row lists all the purchases by the customer, with the product names shown as column names, and the quantity shown as a value in the product column. Because not every customer purchases every product, many columns may contain null values.
When a dataset is pivoted, input columns perform different roles in the pivoting process. A column can participate in the following ways:
- The column is passed through unchanged to the output. Because many input rows can result in only one output row, the transformation copies only the first input value for the column.
- The column acts as the key, or part of the key, that identifies a set of records.
- The column defines the pivot. The values in this column are associated with columns in the pivoted dataset.
- The column contains values that are placed in the columns that the pivot creates.
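Those four roles can be sketched in plain Python with one dictionary per key value; the order data below is invented for illustration:

```python
# Key column: customer. Pivot column: product (its values become output
# columns). Value column: qty. Sample data is illustrative.
orders = [
    {"customer": "Acme", "product": "Widget", "qty": 3},
    {"customer": "Acme", "product": "Gadget", "qty": 1},
    {"customer": "Beta", "product": "Widget", "qty": 2},
]

pivoted = {}
for row in orders:
    # One output row per key value; each pivot value becomes a column.
    pivoted.setdefault(row["customer"], {})[row["product"]] = row["qty"]

print(pivoted)  # {'Acme': {'Widget': 3, 'Gadget': 1}, 'Beta': {'Widget': 2}}
```

Note that Beta has no Gadget entry: that absent key is the dictionary analogue of the null values the description mentions.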
Paul -
Hi all, I was wondering if there is a way to transform rows to columns using an analytic function. I know you can use the MIN or MAX function with GROUP BY, but I would like to know if the same can be accomplished using an analytic function. I am using Oracle 9i.
sample data
WITH table1 AS
(
SELECT 'AJD' id , 'BUNIT' code, 1000 myvalue FROM dual UNION all
SELECT 'AJD' id , 'BCAT' code, 2000 myvalue FROM dual UNION all
SELECT 'AJD' id , 'BLINE' code, 3000 myvalue FROM dual UNION all
SELECT 'AJD' id , 'BCEN' code, 4000 myvalue FROM dual UNION ALL
SELECT 'AAA' id , 'BUNIT' code, 5000 myvalue FROM dual UNION all
SELECT 'AAA' id , 'BCAT' code, 6000 myvalue FROM dual UNION all
SELECT 'AAA' id , 'BLINE' code, 7000 myvalue FROM dual UNION all
SELECT 'AAA' id , 'BCEN' code, 8000 myvalue FROM dual
)
desire output
ID UNIT CAT LINE CEN
AJD 1000 2000 3000 4000
AAA 5000 6000 7000 8000
If this can be done using an analytic function, please provide the query if possible. Thanks
Thanks for the sample data.
I'm not sure why you want to do this with analytic functions instead of aggregates; it seems like a lot of effort for no gain, but this works on 9.2.0.8.0, and should work on other versions of 9i.
SQL> WITH table1 AS
2 (
3 SELECT 'AJD' id , 'BUNIT' code, 1000 myvalue FROM dual UNION all
4 SELECT 'AJD' id , 'BCAT' code, 2000 myvalue FROM dual UNION all
5 SELECT 'AJD' id , 'BLINE' code, 3000 myvalue FROM dual UNION all
6 SELECT 'AJD' id , 'BCEN' code, 4000 myvalue FROM dual UNION ALL
7 SELECT 'AAA' id , 'BUNIT' code, 5000 myvalue FROM dual UNION all
8 SELECT 'AAA' id , 'BCAT' code, 6000 myvalue FROM dual UNION all
9 SELECT 'AAA' id , 'BLINE' code, 7000 myvalue FROM dual UNION all
10 SELECT 'AAA' id , 'BCEN' code, 8000 myvalue FROM dual)
11 SELECT DISTINCT id, MAX(DECODE(code, 'BUNIT', myvalue)) OVER(PARTITION BY id) unit,
12 MAX(DECODE(code, 'BCAT', myvalue)) OVER(PARTITION BY id) cat,
13 MAX(DECODE(code, 'BLINE', myvalue)) OVER(PARTITION BY id) line,
14 MAX(DECODE(code, 'BCEN', myvalue)) OVER(PARTITION BY id) cen
15 FROM table1;
ID UNIT CAT LINE CEN
AJD 1000 2000 3000 4000
AAA 5000 6000 7000 8000
John -
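The same analytic-function pivot (DISTINCT plus MAX(...) OVER (PARTITION BY id)) can be reproduced outside Oracle; a sketch using Python's sqlite3, which assumes a SQLite build with window-function support (3.25 or later):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t1 (id TEXT, code TEXT, myvalue INTEGER)")
con.executemany("INSERT INTO t1 VALUES (?, ?, ?)", [
    ("AJD", "BUNIT", 1000), ("AJD", "BCAT", 2000),
    ("AJD", "BLINE", 3000), ("AJD", "BCEN", 4000),
    ("AAA", "BUNIT", 5000), ("AAA", "BCAT", 6000),
    ("AAA", "BLINE", 7000), ("AAA", "BCEN", 8000)])

# Analytic MAX over each id partition, then DISTINCT to collapse the
# duplicated rows -- the same shape as the Oracle 9i query above.
rows = con.execute("""
    SELECT DISTINCT id,
           MAX(CASE code WHEN 'BUNIT' THEN myvalue END) OVER (PARTITION BY id) AS unit,
           MAX(CASE code WHEN 'BCAT'  THEN myvalue END) OVER (PARTITION BY id) AS cat,
           MAX(CASE code WHEN 'BLINE' THEN myvalue END) OVER (PARTITION BY id) AS line,
           MAX(CASE code WHEN 'BCEN'  THEN myvalue END) OVER (PARTITION BY id) AS cen
    FROM t1
    ORDER BY id DESC
""").fetchall()
print(rows)  # [('AJD', 1000, 2000, 3000, 4000), ('AAA', 5000, 6000, 7000, 8000)]
```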
how can i open a PDF bank statement in "numbers" so that the rows and columns contain properly aligned data from statement?
Numbers can store pdf pages or clippings, but does not directly open pdf files. To get the bank statement into Numbers as a table, I would open the bank statement in Preview (or Skim) or some pdf viewer.
Then hold the option key while selecting a column of data.
Then copy
Then switch to numbers and paste the column into a table
Then repeat for the other columns in the pdf document
It would be easier (in my opinion) to download the QFX or CSV version from your bank -
How to accessing current row report column value in Lov Query?
Hi,
which access methods (e.g. bind variables, substitutions, ...) can be used in the "Lov Query" property of a report column to get the current row's report column value?
As far as I know, and from what I have read on the forum, there are no bind variables for the report columns. For the "Link Text" property it seems that the column values exist as substitution strings (#COLUMN_NAME#), but they don't work in the Lov Query. => And that would be good to have, because there is a hard parse each time the Lov query is executed.
The following post (Re: Simulating a correlated sub query in lov)
shows a solution that uses package variables for temporary storage of the referenced value. The only problem with that solution is that if a new record is added with the "Add rows to tabular form" process, the package variable still contains the value from the last queried row! Is there a way (variable, APEX package, ...) to determine if the LOV query is executed for a new record, so that the package can return null?
I know that I could write the package in a way that the value is immediately cleared when lov_pkg.keyval is called (one-time read), but then I would have to create several variables if I'm accessing the value multiple times in the query or in another query => I think a one-time-read solution would be very obscure.
Thanks for your help
Patrick
http://inside-apex.blogspot.com
Hi Patrick,
I agree that it's a waste to continually use Ajax to go back to the server to get the contents of a dynamic select list.
There are no bind variables for any row item - but what you do have, as per my previous post, is the value of the data entered by the user in the first row. You can pass this into your application process (using get.add("VARIABLENAME", value)), which can use it to retrieve the correct LOV in your Ajax code - this will give you a "bind variable" that your process can use.
What you could do, however, is generate hidden select lists on your page - one for each possible LOV list and replace the contents of the new row's select list with the contents of the appropriate hidden select list. This is easy to do with javascript (using innerHTML functions). Obviously, though, the usefulness of this depends on the number and size of the select lists.
Even if you don't generate them to start with, you can keep a copy of any select lists returned by Ajax in the DOM for use on new rows. So, if you have retrieved a select list, you will have a copy of it in DOM which you can then copy into the new row. If you don't have the list in DOM, use Ajax to get it, store a copy of it and copy it into the new row.
Which method you use will depend on the number/size of select lists needed. If they are few in number and/or size, I would suggest generating hidden lists. If they are large, use Ajax to get them once, store them and then retrieve them from the DOM when needed.
There is another thread here where Arie recommends going to the server every time to make sure you get the most up-to-date data for the lists. If you want to follow this advice, for this reason, use get.add("VARIABLENAME", value) to pass the value to your process. If this is not an issue, you can use one of the other methods I outlined above.
Regards
Andy -
Row Selector column on Interactive Report
Hi all,
I have an interactive report based on a custom SQL query. How do I add a row selector column so I can select individual rows? I have also tried using a tabular form, which also does not display a row selector column. This is odd, as my other tabular forms display a row selector column.
Many thanks for you help,
Chris
Chris,
Usually when someone wants a row selector, it's to choose a few rows by clicking a checkbox, then doing something with them. I'm guessing that's what you're looking for?
Martin Giffy D'Souza's [blog post|http://apex-smb.blogspot.com/2009/01/apex-report-with-checkboxes-advanced.html] explains one way to do that which should work for you. I haven't tried it, but it looks pretty comprehensive.
Good luck,
Stew -
How to create a report region whose first column is a row selector column
I want to create a report region whose first column is a row selector column. I have used a SELECT statement to select some columns, but I do not know how to set the first column as a row selector column. I mean I want to do the following: when the radio button in the first column is chosen, it will return the value of the column in the chosen row. Please help me! Thanks
Hi unnamed,
Suppose you have an id that identifies your record.
Go to the report definition, tab Report Attributes.
Select the id of your record.
Create a link to the page you want to go to.
Hope this helps.
If not, I suggest you create a form with report, and analyze the way the wizard has generated it.
Leo -
How to get number of rows and columns in a two dimensional array ?
Hello,
What would be the simplest way to get the number of rows and columns in a two-dimensional array represented as integers?
I'm looking for a solution other than a For...Each loop, in case of large arrays.
Regards,
Petri
Hi Petri,
See the attached txt file for obtaining two arrays with the upper and lower index values.
Regards
Ray Farmer
Attachments:
Get2DArrayIndex.txt 2 KB -
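For a rectangular 2D array, the dimensions fall straight out of the lengths, with no per-element loop. A Python sketch of the idea (in LabVIEW itself, the Array Size function returns all dimension sizes natively):

```python
# For a rectangular 2D list, rows and columns come from the lengths alone;
# no element-by-element iteration is needed.
def dimensions(matrix):
    rows = len(matrix)
    cols = len(matrix[0]) if rows else 0
    return rows, cols

grid = [[1, 2, 3], [4, 5, 6]]
print(dimensions(grid))  # (2, 3)
```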
How do I delete multiple rows and columns in an image?
I am looking into how digital SLRs extract video data from a CMOS sensor. To give an example, the GH1 from Panasonic can record video at 1920 x 1080 from a 12 MPixel sensor that, say, is 4000 horizontal x 3000 vertical. How they do that seems to be not in the public domain, but there have been a few guesses (see http://www.dvxuser.com/V6/showthread.php?t=206797 and http://luminous-landscape.com/forum/index.php?showtopic=38713).
One approach would be to simply read every second row of sensor pixels (1500 rows read from the original 3000) and once you have those in memory, delete every second column (2000 columns left). You would end up with a 2000 x 1500 image which could then be resampled to 1920 x 1080.
I'd like to simulate what the camera appears to be doing, by generating a 4000 x 3000 test image and then asking Photoshop CS4 to delete the appropriate rows and columns. It may not necessarily be every second row; the Canon 5DMk11 appears to read every third row, so I may need to delete two out of every three rows.
Can Photoshop do that sort of thing? If so, how?
Thanks for the suggestions. Yes, I did take a detailed look at your images, but they weren't 100% convincing because it wasn't clear just what was happening. And Adobe's explanation, after reading it again, explains nothing at all to someone who doesn't know how Nearest Neighbor works.
But you are correct -- Nearest Neighbor does effectively delete pixels. I proved it with the attached 6 x 6 image of coloured pixels (the tiny midget image right after this full stop -- you'll have to go to maximum zoom in PS to see it).
These are the steps to delete every second row and then every second column.
1. Select Image > Image Size.
2. Set Pixel Dimensions > Height to half the original (in this case set to 3 pixels).
3. Set Resample Image to Nearest Neighbor.
4. Click OK and the image should now consist of three vertical white stripes and three vertical Green-Red-Blue stripes.
5. Repeat steps 1-4, but this time set Pixel Dimensions > Width to half the original (in this case set to 3 pixels). The image should now consist of three horizontal stripes Green-Red-Blue.
Just to make sure the method worked for every third pixel, I repeated the above steps but with 2 pixels instead of 3 and obtained the correct White-Green, White-Green pattern.
I resampled the Height and Width separately for the test so that I could see what was happening. In a real example, the height and width can be changed at the same time in the one step, achieving the same results as above.
Finally, how to explain how Nearest Neighbor works in the simple cases described above?
Division by 2
In the case of an exact division by two (pixels numbered from top leftmost pixel), only the even numbered rows and columns remain. To put it a different way, every odd numbered row and column are deleted.
Division by 3
Only rows and columns 2, 5, 8, 11... remain.
Division by N
Only rows and columns 2, 2+N, 2+2N, 2+3N... remain.
To put it simply, a resample using Nearest Neighbor (using an exact integer multiple) keeps every Nth row and column, starting from number two. The rest are deleted. -
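The keep-every-Nth rule above can be simulated outside Photoshop with plain array slicing; a Python sketch using a labelled 6 x 6 grid in place of an image:

```python
# A 6x6 grid of position labels stands in for the image pixels.
grid = [[f"r{r}c{c}" for c in range(6)] for r in range(6)]

n = 2
# Keep rows/columns 2, 2+N, 2+2N... (1-based), i.e. index 1, 1+N, ...
# (0-based) -- the pattern Nearest Neighbor produces for an exact 1/N resize.
kept = [row[1::n] for row in grid[1::n]]
print(kept[0])                  # ['r1c1', 'r1c3', 'r1c5']
print(len(kept), len(kept[0]))  # 3 3
```

Setting n = 3 keeps rows and columns 2, 5, 8... as in the division-by-3 case described above.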
Lock rows and columns header in a table view report. It is possible?
hi,
I have a Dashboard that displays a report in "Table View" with many rows and columns.
Is it possible to set a lock on the rows and columns like in Excel?
This would lock the headers that contain attributes and measures, so that when browsing the report (e.g. with a scroll bar) the headers always stay in view.
Can you help me?
Thanks
hi,
please go through this discussion
Re: SCROLL BAR to FREZZ HEADERS
thanks,
saichand.v