XML query search by date or number
table: tb, column: pxml
<?xml version="1.0" encoding="GBK"?>
<model id="f379f851-b1f9-48bf-8ac7-dafd2b1cedb0" name="hotel" objid="8873dc02-cf83-4527-95bb-e3025469e4ba">
<property id="3100" name="thedate">
<value>2009-09-07</value>
</property>
<property id="3101" name="thenum">
<value>26</value>
</property>
</model>
sql:
select
extractValue(v.pxml,'//property[@name="thedate"]/value'),
extractValue(v.pxml,'//property[@name="thenum"]/value')
from tb v
where
CONTAINS(v.pxml,'haspath(//property[@name="thenum" and value>="10" and value<="50"])')>0
and to_date(extractValue(v.pxml, '//property[@name="thedate"]/value'),
'yyyy-MM-dd') between to_date('2009-09-01', 'yyyy-MM-dd') and to_date('2009-09-30','yyyy-MM-dd')
The SQL had two problems:
1) The numeric range comparison is wrong: haspath only finds the rows reliably with a direct equality (=), not with the >= / <= range.
2) For the date comparison I could not find an XQuery construct, so I used this alternative; the result is correct, but it is inefficient.
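The first problem is easy to demonstrate outside Oracle: comparing numeric strings lexicographically gives wrong range results. A small Python sketch of the two behaviors:

```python
values = ["9", "26", "100"]

# Lexicographic (string) comparison, as in value >= "10" and value <= "50":
as_strings = [v for v in values if "10" <= v <= "50"]

# Numeric comparison, which is what the predicate actually needs:
as_numbers = [v for v in values if 10 <= int(v) <= 50]

print(as_strings)  # ['26', '100'] -- '100' slips into the "range"
print(as_numbers)  # ['26']
```

This is why a range predicate over the text node behaves unpredictably unless the values are zero-padded or compared numerically.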
Pradeep
You should check the date field mapping and compare it with the other fields whose mapping you have already done correctly; if the mapping is correct, there should be no issue. Did you create this entire custom page yourself, or are you making changes to an existing custom page?
thanks
Pratap
Similar Messages
-
Different row count for select versus insert in XML query
Hi,
I have encountered a situation where a SELECT * returns a different row count than an INSERT INTO ... (SELECT *) using an xmltable join. This makes no sense at all!
In brief, I tried to convert XML data into traditional relational tables. I wrote an XML query to select data from an xmltable and checked the row count. When I used a "create table as select" with the same query, I got the correct row count. However, when I used an "insert into ... select" with the same query, I got the wrong data in the table I had just inserted into.
Does anyone have any idea what caused this issue? Thanks for your help.
DUPLICATE post
Clues about queries - Search for date type at INBOX
Hello Everyone,
In the IC WebClient, has anyone tried to add a new date type to the INBOX "search for date" search parameter?
I added a new date type in Date Management, ZDATA_PR.
When I try to assemble the query parts, I use something like this:
CL CL_CRM_QUERYAUI_RUN_BTIL
METH READ_BUSINESS_TRANSACTIONS
CALL METHOD cl_crm_report_qupart=>get_qupart_by_token
EXPORTING
iv_token = 'DAT'
iv_date_type = 'ZDATA_PAR'
iv_from = ls_query_aui-from
iv_to = ls_query_aui-to
IMPORTING
ev_qupart = ls_qupart_range-querypart.
APPEND ls_qupart_and TO lt_query.
APPEND ls_qupart_range TO lt_query.
However, I'm rewarded with an error message telling me that I have problems with the query.
Can anyone give me clues on how to perform this action? I'm clueless, and there's almost no information about this...
Thanks and Kind Regards
Bruno Garcia
Just to fix a mistake in the example code:
CALL METHOD cl_crm_report_qupart=>get_qupart_by_token
EXPORTING
iv_token = 'DAT'
iv_date_type = 'ZDATA_PR'
iv_from = ls_query_aui-from
iv_to = ls_query_aui-to
IMPORTING
ev_qupart = ls_qupart_range-querypart.
APPEND ls_qupart_and TO lt_query.
APPEND ls_qupart_range TO lt_query.
Does anyone have a suggestion or a clue?
Thanks
Bruno -
I have to store the year and month in a database column like 'yeardate number(6)'.
The year and month are passed to my procedure as a number and a character string respectively.
Month's value is like 'MARCH' i.e. a string, and year value is like 2009 i.e. a number.
Now I need to convert 'MARCH' to 03, and then store the value in the table as a number like 200903.
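The conversion being described is month-name lookup plus arithmetic. A Python sketch of the logic (illustrative only; the real code would be Oracle PL/SQL, e.g. something along the lines of TO_CHAR(TO_DATE(month_name, 'MONTH'), 'MM')):

```python
import datetime

def yeardate(year, month_name):
    """Combine a year and an English month name into a YYYYMM number."""
    month = datetime.datetime.strptime(month_name.capitalize(), "%B").month
    return year * 100 + month

print(yeardate(2009, "MARCH"))  # → 200903
```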
How should I convert the string month 'MARCH' to the number 03 and concatenate it with the year 2009, so that I don't get the error 'ORA-01722: invalid number'?
Nag Swamiyyee wrote:
Please don't preach. We all get into jobs where applications are already written, data models are in place, and we have to live with them. Reading Thomas Kyte is easy, but working within the given scope of already-developed applications is the challenge. With due respect, we all know this truth.
Preach? I don't 'preach'. I pointed you to some links that helped me a lot, helped others a lot, and I hoped they would help you as well.
And this is not about 'reading = easy'; it's about understanding how things work.
If you had read, searched, and understood some more, then you'd know that "applications come and go".
Data models and applications are not static.
It's about the data.
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1545206281987#51741210713900
(sorry for yet another link)
You don't adjust the data to your application, you adjust your application to the data.
I've spent quite some time to find out/understand that.
But, if you/your superiors think that it's better to store a DATE as NUMBER datatype, feel free to do so.
But 'the scope' you're referring to can be MySQL/.NET/whatever tomorrow, and then you're out of luck. No more Oracle, but we still need the data.
Perhaps the query on that DATE stored as a NUMBER crippled the database: it was used frequently by the application you put so much faith in, but over the years it just couldn't perform anymore, because the data was treated wrongly.
work within a given scope of already developed applications is the challenge.
I sincerely wish you the best of luck.
Times change.
Versions change.
So I'm afraid 'your dream' will be shattered within 2 to 5 years (unless you're into mainframe stuff), while the data is still there and still needed. And some other application will be 'sexy' then. For a while...
I'm not preaching, I'm being realistic, since I've been through that myself as well.
It's not about preaching or reading, it's about understanding.
Now, why would I reinvent the wheel if it's already there?
Hence I share links with others, since they really helped me (and many many others).
You should not only read, but also try to understand.
Nothing is static.
Applications are certainly not.
Data is.
You should not try to squash your data into a static application.
Your application should fit your data. -
Corresponding Lists Names/Values XML Query
Let's say there is an XML Schema that has
<element name="mt" minOccurs="0" maxOccurs="unbounded">
<element name="r" minOccurs="0" maxOccurs="unbounded">
And for each measurement type (mt), you have a corresponding measurement value (r). There are additional elements between these two lists. The actual XML data would look similar to:
<mi>
<mts>20061117100000-0800</mts>
<gp>900</gp>
<mt>MeasurementType1</mt>
<mt>MeasurementType2</mt>
<mt>MeasurementType3</mt>
<mt>MeasurementType4</mt>
<mt>MeasurementType5</mt>
<mt>MeasurementType6</mt>
<mt>MeasurementType7</mt>
<mv>
<moid>Identifier</moid>
<r>58</r>
<r>62</r>
<r>43</r>
<r>45</r>
<r>43</r>
<r>14</r>
<r>29</r>
<sf>FALSE</sf>
</mv>
</mi>
The first occurrence of mt corresponds to the first occurrence of r, the second to the second, et cetera.
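The pairing being described is purely positional, which is exactly what zip does; a Python sketch with sample values from the XML above:

```python
# Measurement types and values taken from the sample XML, in document order.
mts = ["MeasurementType1", "MeasurementType2", "MeasurementType3"]
rs = ["58", "62", "43"]

# Pair the n-th type with the n-th value -- no Cartesian product.
pairs = list(zip(mts, rs))
print(pairs)  # [('MeasurementType1', '58'), ('MeasurementType2', '62'), ('MeasurementType3', '43')]
```

In SQL, the same effect requires materializing that position explicitly (e.g. an ordinality column for each list) and joining the two positions, rather than cross-joining the lists.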
The mi element can repeat hundreds of times, and there can be many measurement types. Now I'm trying to figure out how to create an XML query that handles this efficiently. The problem is that the <r> values are inside the complexType <mv>, which is at the same level as <mt>. The XML query would produce a Cartesian product if I were to use something like:
select
extractValue(value(xmldata), '/mi/mts') measurement_time_stamp,
extractValue(value(xmldata), '/mi/gp') granularity_period,
extractValue(value(mt), '/mt') measurement_type,
extractValue(value(mv), '/mv/moid') measured_obj_id,
extractValue(value(r), '/r') measurement_value
from xmltable
, TABLE(XMLSequence(Extract(value(xmltable), '/mi/mv'))) mv
, TABLE(XMLSequence(Extract(value(mi), '/mi/mt'))) mt
, TABLE(XMLSequence(Extract(value(mi), '/mv/r'))) r
This obviously wouldn't work. I could go and store all the types and values into a column using
select
replace(replace(replace(extract(value(mi), '/mi/mt').getStringVal(), '</mt><mt>', ';'), '<mt>'), '</mt>') measurement_type,
extractValue(value(mv), '/mv/moid') measured_obj_id,
replace(replace(replace(extract(value(mv), '/mv/r').getStringVal(), '</r><r>', ';'), '<r>'), '</r>') measurement_value
from xmltable
TABLE(XMLSequence(Extract(value(xmldata), '/md/mi'))) mi
, TABLE(XMLSequence(Extract(value(mi), '/mi/mv'))) mv
But this wouldn't work once the XML grew over 4000 bytes. So I need a way to extract this data while maintaining the correct pairing (avoiding Cartesian products).
One idea is a pipelined table function but I have concerns about scalability with that method. Is there a way to accomplish this optimally? I have solutions for this but none of them is going to deliver the scalability I am seeking.
I expect the method chosen will probably need to handle a few hundred thousand files per day.
Thanks, VJ
I hadn't seen your XML schema when I worked the original example, so I reverse-engineered it from the instance. Unfortunately, when I work with your schema, which contains more levels of nesting, I can't get it to optimize properly.
Here's what should work in theory
SQL> set echo on
SQL> spool testcase.log
SQL> --
SQL> connect sys/ as sysdba
Enter password:
Connected.
SQL> set define on
SQL> --
SQL> define USERNAME = OTNTEST
SQL> --
SQL> def PASSWORD = OTNTEST
SQL> --
SQL> def USER_TABLESPACE = USERS
SQL> --
SQL> def TEMP_TABLESPACE = TEMP
SQL> --
SQL> def LOCAL_FILESYSTEM = 'C:\xdb\otn\457595'
SQL> --
SQL> drop user &USERNAME cascade
2 /
old 1: drop user &USERNAME cascade
new 1: drop user OTNTEST cascade
User dropped.
SQL> grant connect, resource to &USERNAME identified by &PASSWORD
2 /
old 1: grant connect, resource to &USERNAME identified by &PASSWORD
new 1: grant connect, resource to OTNTEST identified by OTNTEST
Grant succeeded.
SQL> grant create any directory, drop any directory to &USERNAME
2 /
old 1: grant create any directory, drop any directory to &USERNAME
new 1: grant create any directory, drop any directory to OTNTEST
Grant succeeded.
SQL> grant alter session, create view to &USERNAME
2 /
old 1: grant alter session, create view to &USERNAME
new 1: grant alter session, create view to OTNTEST
Grant succeeded.
SQL> alter user &USERNAME default tablespace &USER_TABLESPACE temporary tablespace &TEMP_TABLESPACE
2 /
old 1: alter user &USERNAME default tablespace &USER_TABLESPACE temporary tablespace &TEMP_TABLESPACE
new 1: alter user OTNTEST default tablespace USERS temporary tablespace TEMP
User altered.
SQL> connect &USERNAME/&PASSWORD
Connected.
SQL> --
SQL> alter session set events ='19027 trace name context forever, level 0x800'
2 /
Session altered.
SQL> var schemaURL varchar2(256)
SQL> var schemaPath varchar2(256)
SQL> --
SQL> create or replace directory XMLDIR as '&LOCAL_FILESYSTEM'
2 /
old 1: create or replace directory XMLDIR as '&LOCAL_FILESYSTEM'
new 1: create or replace directory XMLDIR as 'C:\xdb\otn\457595'
Directory created.
SQL> begin
2 :schemaURL := 'testcase.xsd';
3 :schemaPath := '/public/testcase.xsd';
4 end;
5 /
PL/SQL procedure successfully completed.
SQL>
SQL> declare
2 res boolean;
3 xmlSchema xmlType := xmlType(
4 '<?xml version="1.0" encoding="UTF-8" standalone="no"?>
5 <!--W3C Schema generated by XMLSpy v2007 (http://www.altova.com)-->
6 <!--Please add namespace attributes, a targetNamespace attribute and import elements according to your requirements-->
7 <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xdb="http://xmlns.oracle.com/xdb" elementFormDefault="qualified" attributeFormDefaul
t="unqualified" xdb:storeVarrayAsTable="true">
8 <xs:import namespace="http://www.w3.org/XML/1998/namespace"/>
9 <xs:element name="mdc" xdb:defaultTable="MDC_TABLE">
10 <xs:complexType xdb:SQLType="MDC_TYPE" xdb:maintainDOM="false">
11 <xs:sequence>
12 <xs:element ref="mfh"/>
13 <xs:element ref="md" minOccurs="0" maxOccurs="unbounded"/>
14 <xs:element ref="mff"/>
15 </xs:sequence>
16 </xs:complexType>
17 </xs:element>
18 <xs:element name="mfh" xdb:defaultTable="">
19 <xs:complexType xdb:SQLType="MFH_TYPE" xdb:maintainDOM="false">
20 <xs:sequence>
21 <xs:element ref="ffv"/>
22 <xs:element ref="sn"/>
23 <xs:element ref="st"/>
24 <xs:element ref="vn"/>
25 <xs:element ref="cbt"/>
26 </xs:sequence>
27 </xs:complexType>
28 </xs:element>
29 <xs:element name="md" xdb:defaultTable="" >
30 <xs:complexType xdb:SQLType="MD_TYPE" xdb:maintainDOM="false">
31 <xs:sequence>
32 <xs:element ref="neid"/>
33 <xs:element ref="mi" minOccurs="0" maxOccurs="unbounded" />
34 </xs:sequence>
35 </xs:complexType>
36 </xs:element>
37 <xs:element name="neid" xdb:defaultTable="" >
38 <xs:complexType xdb:SQLType="NEID_TYPE" xdb:maintainDOM="false">
39 <xs:sequence>
40 <xs:element ref="neun"/>
41 <xs:element ref="nedn"/>
42 </xs:sequence>
43 </xs:complexType>
44 </xs:element>
45 <xs:element name="mi" xdb:defaultTable="" >
46 <xs:complexType xdb:SQLType="MI_TYPE" xdb:maintainDOM="false">
47 <xs:sequence>
48 <xs:element ref="mts"/>
49 <xs:element ref="gp"/>
50 <xs:element ref="mt" minOccurs="0" maxOccurs="unbounded"/>
51 <xs:element ref="mv" minOccurs="0" maxOccurs="unbounded" />
52 </xs:sequence>
53 </xs:complexType>
54 </xs:element>
55 <xs:element name="mv" xdb:defaultTable="" >
56 <xs:complexType xdb:SQLType="MV_TYPE" xdb:maintainDOM="false">
57 <xs:sequence>
58 <xs:element ref="moid"/>
59 <xs:element ref="r" minOccurs="0" maxOccurs="unbounded"/>
60 <xs:element ref="sf" minOccurs="0"/>
61 </xs:sequence>
62 </xs:complexType>
63 </xs:element>
64 <xs:element name="mff" xdb:defaultTable="" >
65 <xs:complexType xdb:maintainDOM="false">
66 <xs:sequence>
67 <xs:element ref="ts"/>
68 </xs:sequence>
69 </xs:complexType>
70 </xs:element>
71 <xs:element name="ts" type="xs:string"/>
72 <xs:element name="sf" type="xs:string"/>
73 <xs:element name="r">
74 <xs:complexType xdb:SQLType="R_TYTPE" xdb:maintainDOM="false">
75 <xs:simpleContent>
76 <xs:extension base="xs:string">
77 <xs:attribute ref="dummy" use="prohibited"/>
78 </xs:extension>
79 </xs:simpleContent>
80 </xs:complexType>
81 </xs:element>
82 <xs:attribute name="dummy" type="xs:boolean"/>
83 <xs:element name="mt">
84 <xs:complexType xdb:SQLType="MT_TYTPE" xdb:maintainDOM="false">
85 <xs:simpleContent>
86 <xs:extension base="xs:string">
87 <xs:attribute ref="dummy" use="prohibited"/>
88 </xs:extension>
89 </xs:simpleContent>
90 </xs:complexType>
91 </xs:element>
92 <xs:element name="moid" type="xs:string"/>
93 <xs:element name="gp" type="xs:string"/>
94 <xs:element name="mts" type="xs:string"/>
95 <xs:element name="nedn" type="xs:string"/>
96 <xs:element name="neun" type="xs:string"/>
97 <xs:element name="cbt" type="xs:string"/>
98 <xs:element name="vn" type="xs:string"/>
99 <xs:element name="st" type="xs:string"/>
100 <xs:element name="sn" type="xs:string"/>
101 <xs:element name="ffv" type="xs:string"/>
102 </xs:schema>');
103 begin
104 if (dbms_xdb.existsResource(:schemaPath)) then
105 dbms_xdb.deleteResource(:schemaPath);
106 end if;
107 res := dbms_xdb.createResource(:schemaPath,xmlSchema);
108 end;
109 /
PL/SQL procedure successfully completed.
SQL> begin
2 dbms_xmlschema.registerSchema
3 (
4 :schemaURL,
5 xdbURIType(:schemaPath).getClob(),
6 TRUE,TRUE,FALSE,TRUE
7 );
8 end;
9 /
PL/SQL procedure successfully completed.
SQL> declare
2 nested_table_name varchar2(256);
3 iot_index_name varchar2(256);
4 begin
5 select table_name
6 into nested_table_name
7 from user_nested_tables
8 where parent_table_column = '"XMLDATA"."md"'
9 and parent_table_name = 'MDC_TABLE';
10
11 execute immediate 'rename "'|| nested_table_name ||'" to MD_TABLE';
12
13 select index_name
14 into iot_index_name
15 from user_indexes
16 where table_name = 'MD_TABLE' and index_type = 'IOT - TOP';
17
18 execute immediate 'alter index "'|| iot_index_name ||'" rename to MD_IOT';
19
20 select table_name
21 into nested_table_name
22 from user_nested_tables
23 where parent_table_column = 'mi'
24 and parent_table_name = 'MD_TABLE';
25
26 execute immediate 'rename "'|| nested_table_name ||'" to MI_TABLE';
27
28 select index_name
29 into iot_index_name
30 from user_indexes
31 where table_name = 'MI_TABLE' and index_type = 'IOT - TOP';
32
33 execute immediate 'alter index "'|| iot_index_name ||'" rename to MI_IOT';
34
35 select table_name
36 into nested_table_name
37 from user_nested_tables
38 where parent_table_column = 'mt'
39 and parent_table_name = 'MI_TABLE';
40
41 execute immediate 'rename "'|| nested_table_name ||'" to MT_TABLE';
42
43 select index_name
44 into iot_index_name
45 from user_indexes
46 where table_name = 'MT_TABLE' and index_type = 'IOT - TOP';
47
48 execute immediate 'alter index "'|| iot_index_name ||'" rename to MT_IOT';
49
50 select table_name
51 into nested_table_name
52 from user_nested_tables
53 where parent_table_column = 'mv'
54 and parent_table_name = 'MI_TABLE';
55
56 execute immediate 'rename "'|| nested_table_name ||'" to MV_TABLE';
57
58 select index_name
59 into iot_index_name
60 from user_indexes
61 where table_name = 'MV_TABLE' and index_type = 'IOT - TOP';
62
63 execute immediate 'alter index "'|| iot_index_name ||'" rename to MV_IOT';
64
65 select table_name
66 into nested_table_name
67 from user_nested_tables
68 where parent_table_column = 'r'
69 and parent_table_name = 'MV_TABLE';
70
71 execute immediate 'rename "'|| nested_table_name ||'" to R_TABLE';
72
73 select index_name
74 into iot_index_name
75 from user_indexes
76 where table_name = 'R_TABLE' and index_type = 'IOT - TOP';
77
78 execute immediate 'alter index "'|| iot_index_name ||'" rename to R_IOT';
79 end;
80 /
PL/SQL procedure successfully completed.
SQL> desc MDC_TABLE
Name Null? Type
TABLE of SYS.XMLTYPE(XMLSchema "testcase.xsd" Element "mdc") STORAGE Object-relational TYPE "MDC_TYPE"
SQL> --
SQL> desc MD_TABLE
Name Null? Type
neid NEID_TYPE
mi mi9495_COLL
SQL> --
SQL> desc MI_TABLE
Name Null? Type
mts VARCHAR2(4000 CHAR)
gp VARCHAR2(4000 CHAR)
mt mt9493_COLL
mv mv9494_COLL
SQL> --
SQL> desc MT_TABLE
Name Null? Type
SYS_XDBBODY$ VARCHAR2(4000 CHAR)
dummy RAW(1)
SQL> --
SQL> desc MV_TABLE
Name Null? Type
moid VARCHAR2(4000 CHAR)
r r9492_COLL
sf VARCHAR2(4000 CHAR)
SQL> --
SQL> desc R_TABLE
Name Null? Type
SYS_XDBBODY$ VARCHAR2(4000 CHAR)
dummy RAW(1)
SQL> --
SQL> set autotrace on explain
SQL> set lines 150 pages 100
SQL> --
SQL> var XMLTEXT varchar2(4000)
SQL> --
SQL> begin
2 :xmlText :=
3 '<mdc>
4 <mfh>
5 <ffv/>
6 <sn/>
7 <st/>
8 <vn/>
9 <cbt/>
10 </mfh>
11 <md>
12 <neid>
13 <neun/>
14 <nedn/>
15 </neid>
16 <mi>
17 <mts>20061117100000-0800</mts>
18 <gp>900</gp>
19 <mt>MeasurementType1</mt>
20 <mt>MeasurementType2</mt>
21 <mt>MeasurementType3</mt>
22 <mt>MeasurementType4</mt>
23 <mt>MeasurementType5</mt>
24 <mt>MeasurementType6</mt>
25 <mt>MeasurementType7</mt>
26 <mv>
27 <moid>Identifier</moid>
28 <r>58</r>
29 <r>62</r>
30 <r>43</r>
31 <r>45</r>
32 <r>43</r>
33 <r>14</r>
34 <r>29</r>
35 <sf>FALSE</sf>
36 </mv>
37 </mi>
38 </md>
39 <mff>
40 <ts/>
41 </mff>
42 </mdc>';
43 end;
44 /
PL/SQL procedure successfully completed.
SQL> insert into MDC_TABLE values ( xmltype ( :xmltext ))
2 /
1 row created.
Execution Plan
Plan hash value: 1621636734
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | INSERT STATEMENT | | 1 | 100 | 1 (0)| 00:00:01 |
| 1 | LOAD TABLE CONVENTIONAL | MDC_TABLE | | | | |
SQL> commit
2 /
Commit complete.
SQL> select MT_INDEX, MT_VALUE, R_VALUE
2 from MDC_TABLE,
3 xmlTable
4 (
5 '/mdc/md/mi'
6 passing object_value
7 columns
8 XML xmltype path '.'
9 ) MI,
10 xmlTable
11 (
12 '/mi/mt'
13 passing MI.XML
14 columns
15 MT_INDEX for ordinality,
16 MT_VALUE varchar2(32) path 'text()'
17 ) MT,
18 xmlTable
19 (
20 '/mi/mv/r'
21 passing MI.XML
22 columns
23 R_INDEX for ordinality,
24 R_VALUE varchar2(32) path 'text()'
25 ) R
26 where MT_INDEX = R_INDEX
27 /
MT_INDEX MT_VALUE R_VALUE
1 MeasurementType1 58
2 MeasurementType2 62
3 MeasurementType3 43
4 MeasurementType4 45
5 MeasurementType5 43
6 MeasurementType6 14
7 MeasurementType7 29
7 rows selected.
Execution Plan
Plan hash value: 2832518671
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 5449M| 19T| 1616M (1)|999:59:59 |
| 1 | NESTED LOOPS | | 5449M| 19T| 1616M (1)|999:59:59 |
| 2 | NESTED LOOPS | | 66M| 237G| 197K (1)| 00:39:36 |
| 3 | NESTED LOOPS | | 8168 | 29M| 27 (0)| 00:00:01 |
|* 4 | TABLE ACCESS FULL | MDC_TABLE | 1 | 3788 | 3 (0)| 00:00:01 |
| 5 | COLLECTION ITERATOR PICKLER FETCH | XMLSEQUENCEFROMXMLTYPE | | | | |
| 6 | VIEW | | 8168 | 247K| 24 (0)| 00:00:01 |
| 7 | COUNT | | | | | |
| 8 | COLLECTION ITERATOR PICKLER FETCH| XMLSEQUENCEFROMXMLTYPE | | | | |
|* 9 | VIEW | | 82 | 2542 | 24 (0)| 00:00:01 |
| 10 | COUNT | | | | | |
| 11 | COLLECTION ITERATOR PICKLER FETCH | XMLSEQUENCEFROMXMLTYPE | | | | |
Predicate Information (identified by operation id):
4 - filter(SYS_CHECKACL("ACLOID","OWNERID",xmltype('<privilege
xmlns="http://xmlns.oracle.com/xdb/acl.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.oracle.com/xdb/acl.xsd http://xmlns.oracle.com/xdb/acl.xsd
DAV:http://xmlns.oracle.com/xdb/dav.xsd"><read-properties/><read-contents/></privilege>'))=1)
9 - filter("MT_INDEX"="R_INDEX")
Note
- dynamic sampling used for this statement
SQL>
As you can see, the rewrite is not working out in this case. I'll ask development to take a look at it and see if they can solve it. I think it's similar to another bug I've filed...
With regard to your question about transforming: one easy transformation would be to number the nodes, e.g. use XSLT to add an index number to each MT node and each R node, and then join on that value. -
Extending SalesOrder Query search on MSR 2.0
Hi all,
I need to extend the query search for sales orders by adding a date range (FROM - TO in place of the single date field DATE_DOC). But the DATE_DOC_HIGH custom field does not have a FieldDescriptor, so it's not possible to call:
searchKeys[i] = getFieldDescriptor("DATE_DOC_HIGH"); // returns an Exception
(where 'i' is the Condition number index)
Is it possible to clone the FieldDescriptor of the standard field DATE_DOC and modify its KEY and VALUE in order to fill it with those of DATE_DOC_HIGH?
The standard code that run the query is this:
conditions[j][0] = queryFactory.createCondition( searchKeys[i],
criteriaTypes[i],
tmpSearchCondition[i].getDateValue());
Does anybody know another way to fulfill this goal?
Thanks in advance,
GB
Hello,
I am not sure I am getting the question. Normally you have to create two conditions on the date field: one with DATE > input date 1 and one with DATE < input date 2.
You create a composite condition and query with it directly. There is no need for a custom field, as custom fields cannot be used for querying (if they are not in the SyncBO definition).
Thank you,
Julien.
msc mobile Canada
http://www.msc-mobile.com -
Simple Transformation to deserialize an XML file into ABAP data structures?
I'm attempting to write my first simple transformation to deserialize
an XML file into ABAP data structures and I have a few questions.
My simple transformation contains code like the following
<tt:transform xmlns:tt="http://www.sap.com/transformation-templates"
xmlns:pp="http://www.sap.com/abapxml/types/defined" >
<tt:type name="REPORT" line-type="?">
<tt:node name="COMPANY_ID" type="C" length="10" />
<tt:node name="JOB_ID" type="C" length="20" />
<tt:node name="TYPE_CSV" type="C" length="1" />
<tt:node name="TYPE_XLS" type="C" length="1" />
<tt:node name="TYPE_PDF" type="C" length="1" />
<tt:node name="IS_NEW" type="C" length="1" />
</tt:type>
<tt:root name="ROOT2" type="pp:REPORT" />
<QueryResponse>
<tt:loop ref="ROOT2" name="line">
<QueryResponseRow>
<CompanyID>
<tt:value ref="$line.COMPANY_ID" />
</CompanyID>
<JobID>
<tt:value ref="$line.JOB_ID" />
</JobID>
<ExportTypes>
<tt:loop>
<ExportType>
I don't know what to do here (see item 3, below)
</ExportType>
</tt:loop>
</ExportTypes>
<IsNew>
<tt:value ref="$line.IS_NEW"
map="val(' ') = xml('false'), val('X') = xml('true')" />
</IsNew>
</QueryResponseRow>
</tt:loop>
</QueryResponse>
</tt:transform>
1. In a DTD, an element can be designated as occurring zero or one
time, zero or more times, or one or more times. How do I write the
simple transformation to accommodate these possibilities?
2. In trying to accommodate the "zero or more times" case, I am trying
to use the <tt:loop> instruction. It occurs several layers deep in the
XML hierarchy, but at the top level of the ABAP table. The internal
table has a structure defined in the ABAP program, not in the data
dictionary. In the simple transformation, I used <tt:type> and
<tt:node> to define the structure of the internal table and then
tried to use <tt:loop ref="ROOT2" name="line"> around the subtree that
can occur zero or more times. But every variation I try seems to get
different errors. Can anyone supply a working example of this?
3. Among the fields in the internal table, I've defined three
one-character fields named TYPE_CSV, TYPE_XLS, and TYPE_PDF. In the
XML file, I expect zero to three elements of the form
<ExportType exporttype='csv' />
<ExportType exporttype='xls' />
<ExportType exporttype='pdf' />
I want to set field TYPE_CSV = 'X' if I find an ExportType element
with its exporttype attribute set to 'csv'. I want to set field
TYPE_XLS = 'X' if I find an ExportType element with its exporttype
attribute set to 'xls'. I want to set field TYPE_PDF = 'X' if I find
an ExportType element with its exporttype attribute set to 'pdf'. How
can I do that?
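Setting the three flags from the ExportType elements is, in essence, just membership testing. A Python sketch of the intended result (field names taken from the question; this is not Simple Transformation syntax):

```python
# exporttype attribute values found in the incoming XML, e.g. from
# <ExportType exporttype='csv'/> and <ExportType exporttype='pdf'/>
found = {"csv", "pdf"}

flags = {
    "TYPE_CSV": "X" if "csv" in found else " ",
    "TYPE_XLS": "X" if "xls" in found else " ",
    "TYPE_PDF": "X" if "pdf" in found else " ",
}
print(flags)  # {'TYPE_CSV': 'X', 'TYPE_XLS': ' ', 'TYPE_PDF': 'X'}
```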
4. For an element that has a value like
<ErrorCode>123</ErrorCode>
in the simple transformation, the sequence
<ErrorCode> <tt:value ref="ROOT1.CODE" /> </ErrorCode>
seems to work just fine.
I have other situations where the XML reads
<IsNew value='true' />
I wanted to write
<IsNew>
<tt:value ref="$line.IS_NEW"
map="val(' ') = xml('false'), val('X') = xml('true')" />
</IsNew>
but I'm afraid that the <tt:value> fails to deal with the fact that in
the XML file the value is being passed as the value of an attribute
(named "value"), rather than the value of the element itself. How do
you handle this?
Try the code below:
data l_xml_table2 type table of xml_line with header line.
* w_filename holds the file path.
if w_filename(02) = '
open dataset w_filename for output in binary mode.
if sy-subrc = 0.
l_xml_table2[] = l_xml_table[].
loop at l_xml_table2.
transfer l_xml_table2 to w_filename.
endloop.
endif.
close dataset w_filename.
else.
call method cl_gui_frontend_services=>gui_download
exporting
bin_filesize = l_xml_size
filename = w_filename
filetype = 'BIN'
changing
data_tab = l_xml_table
exceptions
others = 24.
if sy-subrc <> 0.
message id sy-msgid type sy-msgty number sy-msgno
with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
endif. -
Getting data in a query on last date of all previous months
Dear Experts,
I need your help with a query.
I am trying to create a query which, when run, should pick the number of open sales orders on the last date of each previous month since the start of the company.
I could create the query to fetch the required data for the last day of the previous month only: say today is 25th June 2014, my query fetches data only for May 31st 2014, but I need data for all previous months (May 31st 2014, April 30th 2014, March 31st 2014 and so on).
Please advise how to achieve this in a SQL query.
Thanks & regards,
Nitin
Hi,
Try this:
Select *
from(
SELECT T0.[DocNum] AS #, T0.CardCode AS C, T0.DocDate
FROM ORDR T0
WHERE T0.[DocStatus] = 'O'
  AND ( T0.[DocDate] = CONVERT(VARCHAR(10), DATEADD(s,-1,DATEADD(mm, DATEDIFF(m,0,GETDATE())-1,0)),10)
     OR T0.[DocDate] = CONVERT(VARCHAR(10), DATEADD(s,-1,DATEADD(mm, DATEDIFF(m,0,GETDATE())-2,0)),10)
     OR T0.[DocDate] = CONVERT(VARCHAR(10), DATEADD(s,-1,DATEADD(mm, DATEDIFF(m,0,GETDATE())-3,0)),10)
     OR T0.[DocDate] = CONVERT(VARCHAR(10), DATEADD(s,-1,DATEADD(mm, DATEDIFF(m,0,GETDATE())-4,0)),10)
     OR T0.[DocDate] = CONVERT(VARCHAR(10), DATEADD(s,-1,DATEADD(mm, DATEDIFF(m,0,GETDATE())-5,0)),10)
     OR T0.[DocDate] = CONVERT(VARCHAR(10), DATEADD(s,-1,DATEADD(mm, DATEDIFF(m,0,GETDATE())-6,0)),10)
     OR T0.[DocDate] = CONVERT(VARCHAR(10), DATEADD(s,-1,DATEADD(mm, DATEDIFF(m,0,GETDATE())-7,0)),10) )
GROUP BY T0.CardCode, T0.DocDate, T0.[DocNum]) S
pivot
(count(#) for c in([cvt01],[cvt02],[cvt03],[cvt04],[cvt05],[cvt06],[cvt07],[cvt08],[cvt12])) P
Note: Replace cvt01, cvt02, ... with your customer codes.
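The date arithmetic that the OR chain performs (the last calendar day of each of the preceding months) can be sketched generically. The Python below is only an illustration of the logic, not SQL Server syntax:

```python
import datetime

def last_days_of_previous_months(today, n):
    """Return the last calendar day of each of the n months before today's month."""
    first_of_month = today.replace(day=1)
    result = []
    for _ in range(n):
        last_day = first_of_month - datetime.timedelta(days=1)  # last day of the previous month
        result.append(last_day)
        first_of_month = last_day.replace(day=1)
    return result

# With the question's example date of 25 June 2014:
print(last_days_of_previous_months(datetime.date(2014, 6, 25), 3))
# → [datetime.date(2014, 5, 31), datetime.date(2014, 4, 30), datetime.date(2014, 3, 31)]
```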
Thanks & Regards,
Nagarajan -
Query help please: dates and logic
Hi all gurus
I need to put together a series of past timeline events as one row, keyed by dates.
This is my table:
CREATE TABLE "SORS"."SOR_TRACKING"
( "TRACKING_ID" NUMBER NOT NULL ENABLE,
"LETTER_ID" NUMBER NOT NULL ENABLE,
"OFFENDER_ID" NUMBER NOT NULL ENABLE,
"LOCATION_ID" NUMBER NOT NULL ENABLE,
"OFFICE_ID" NUMBER,
"MAIL_DATE" DATE DEFAULT SYSDATE NOT NULL ENABLE,
"RESPONSE_DATE" DATE,
"STATUS" VARCHAR2(30 BYTE) DEFAULT 'Mailed',
"ENTRY_DATE" DATE DEFAULT SYSDATE NOT NULL ENABLE,
"ENTRY_USER" VARCHAR2(30 BYTE) NOT NULL ENABLE,
"COMMENTS" VARCHAR2(2000 BYTE),
"FIRST_NAME" VARCHAR2(80 BYTE) NOT NULL ENABLE,
"MIDDLE_NAME" VARCHAR2(80 BYTE),
"LAST_NAME" VARCHAR2(80 BYTE) NOT NULL ENABLE,
"SIR_NAME" VARCHAR2(30 BYTE),
"ADDRESS1" VARCHAR2(80 BYTE),
"ADDRESS2" VARCHAR2(80 BYTE),
"CITY" VARCHAR2(50 BYTE),
"STATE" VARCHAR2(30 BYTE),
"ZIP" VARCHAR2(20 BYTE),
"COUNTY" VARCHAR2(80 BYTE),
"OFFENDER_TYPE" VARCHAR2(30 BYTE),
"LAST_MAIL_DATE" DATE,
"BATCH_ID" NUMBER NOT NULL ENABLE,
"JURISDICTION" NUMBER,
"DL_NAME" VARCHAR2(60 BYTE),
"DL_OFFICE" VARCHAR2(60 BYTE),
"DL_ADDRESS" VARCHAR2(60 BYTE),
"DL_MAILING_ADDRESS" VARCHAR2(60 BYTE),
"DL_CITY" VARCHAR2(60 BYTE),
"DL_STATE" VARCHAR2(30 BYTE),
"DL_ZIP" VARCHAR2(20 BYTE),
"TITLE" VARCHAR2(10 BYTE),
"MODIFIED_USER" VARCHAR2(30 BYTE),
"MODIFIED_DATE" DATE,
CONSTRAINT "PK_SOR_TRACKING" PRIMARY KEY ("TRACKING_ID"))
Sample data:
REM INSERTING into SOR_TRACKING
SET DEFINE OFF;
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (781410,1,4557,110207,2809,to_date('05-NOV-13','DD-MON-RR'),null,'Mailed',to_date('05-NOV-13','DD-MON-RR'),'NEEL',null,'BOB','Jolene','Luna',null,'20535 Valleyview Rd ',null,'Boo','OK','74840','potter','STANDARD',to_date('15-FEB-12','DD-MON-RR'),30211,2809,null,null,null,null,null,null,null,'Ms.','NEEL',to_date('05-NOV-13','DD-MON-RR'));
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (190294,1,4557,110207,2809,to_date('08-FEB-12','DD-MON-RR'),to_date('15-FEB-12','DD-MON-RR'),'Verified',to_date('17-FEB-12','DD-MON-RR'),'Casper',null,'BOB','Jolene','Luna',null,'20535 Valleyview Rd ',null,'Boo','OK','74840','potter','STANDARD',to_date('13-DEC-11','DD-MON-RR'),28442,2809,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (184647,1,4557,110207,2809,to_date('05-DEC-11','DD-MON-RR'),to_date('13-DEC-11','DD-MON-RR'),'Verified',to_date('15-DEC-11','DD-MON-RR'),'Casper',null,'BOB','Jolene','Luna',null,'20535 Valleyview Rd ',null,'Boo','OK','74840','potter','STANDARD',to_date('09-NOV-11','DD-MON-RR'),27985,2809,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (157253,1,4557,102288,2809,to_date('03-FEB-11','DD-MON-RR'),null,'Mailed',to_date('03-FEB-11','DD-MON-RR'),'Casper',null,'BOB','Jolene','Luna',null,'PO Box 146',null,'Boo','OK','74840','potter','STANDARD',to_date('08-NOV-10','DD-MON-RR'),25613,2809,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (149710,1,4557,102288,2809,to_date('01-NOV-10','DD-MON-RR'),to_date('08-NOV-10','DD-MON-RR'),'Verified',to_date('16-NOV-10','DD-MON-RR'),'Casper',null,'BOB','Jolene','Luna',null,'PO Box 146',null,'Boo','OK','74840','potter','STANDARD',to_date('18-OCT-10','DD-MON-RR'),24939,2809,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (138268,1,4557,99564,2809,to_date('03-JUN-10','DD-MON-RR'),to_date('17-JUN-10','DD-MON-RR'),'Letter Returned',to_date('17-JUN-10','DD-MON-RR'),'Boo','Post office: No mail receptacle','BOB','Jolene','Luna',null,'20535 Valley View Rd.',null,'Boo','OK','74840','potter','STANDARD',to_date('19-MAY-10','DD-MON-RR'),24216,2809,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (128798,1,4557,91503,2809,to_date('04-FEB-10','DD-MON-RR'),null,'Mailed',to_date('04-FEB-10','DD-MON-RR'),'Casper',null,'BOB','Jolene','Luna',null,'PO Box 146',null,'Boo','OK','74840','potter','STANDARD',to_date('10-NOV-09','DD-MON-RR'),23369,2809,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (115073,1,4557,91503,2809,to_date('07-AUG-09','DD-MON-RR'),to_date('10-NOV-09','DD-MON-RR'),'Verified',to_date('10-NOV-09','DD-MON-RR'),'Boo',null,'BOB','Jolene','Luna',null,'PO Box 146',null,'Boo','OK','74840','potter','STANDARD',to_date('06-MAY-09','DD-MON-RR'),21926,2809,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (108510,1,4557,91503,2809,to_date('05-MAY-09','DD-MON-RR'),to_date('06-MAY-09','DD-MON-RR'),'Verified',to_date('07-MAY-09','DD-MON-RR'),'Casper',null,'BOB','Jolene','Luna',null,'PO Box 146',null,'Boo','OK','74840','potter','STANDARD',to_date('01-MAY-09','DD-MON-RR'),21316,2809,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (101463,1,4557,88052,2809,to_date('05-FEB-09','DD-MON-RR'),null,'Mailed',to_date('05-FEB-09','DD-MON-RR'),'Boo',null,'BOB','Jolene','Luna',null,'PO Box 146',null,'Boo','OK','74840','potter','STANDARD',to_date('15-DEC-08','DD-MON-RR'),20525,2809,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (97272,1,4557,88052,2809,to_date('05-DEC-08','DD-MON-RR'),to_date('15-DEC-08','DD-MON-RR'),'Verified',to_date('17-DEC-08','DD-MON-RR'),'Casper',null,'BOB','Jolene','Luna',null,'PO Box 146',null,'Boo','OK','74840','potter','STANDARD',to_date('12-NOV-08','DD-MON-RR'),20018,2809,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (68398,1,4557,76399,1070,to_date('10-OCT-07','DD-MON-RR'),to_date('15-NOV-07','DD-MON-RR'),'Letter Returned',to_date('15-NOV-07','DD-MON-RR'),'KERRYMIN','post office: not delivaerable as addressed ','BOB','Jolene','Luna',null,'752 Boo',null,'Norman','OK','73072','Cleveland','STANDARD',to_date('17-MAY-07','DD-MON-RR'),16596,1070,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (63767,1,4557,76399,1070,to_date('09-AUG-07','DD-MON-RR'),to_date('20-AUG-07','DD-MON-RR'),'Letter Returned',to_date('20-AUG-07','DD-MON-RR'),'KERRYMIN','post office: not deliverable as addressed ','BOB','Jolene','Luna',null,'752 Boo',null,'Norman','OK','73072','Cleveland','STANDARD',to_date('17-MAY-07','DD-MON-RR'),16077,1070,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (59499,1,4557,76399,1070,to_date('14-MAY-07','DD-MON-RR'),to_date('17-MAY-07','DD-MON-RR'),'Verified',to_date('17-MAY-07','DD-MON-RR'),'LAWANHAM',null,'BOB','Jolene','Luna',null,'752 Boo',null,'Norman','OK','73072','Cleveland','STANDARD',to_date('14-MAY-07','DD-MON-RR'),15569,1070,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (54783,1,4557,74693,1070,to_date('01-MAR-07','DD-MON-RR'),to_date('06-MAR-07','DD-MON-RR'),'Verified',to_date('13-MAR-07','DD-MON-RR'),'Boo',null,'BOB','Jolene','Luna',null,'1203 Rebecca Lane',null,'Norman','OK','73072','Cleveland','STANDARD',to_date('01-MAR-07','DD-MON-RR'),14994,1070,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (34100,1,4557,58761,1094,to_date('08-FEB-06','DD-MON-RR'),null,'Mailed',to_date('31-MAR-06','DD-MON-RR'),'LAWANHAM','per M. Splawn','BOB','Jolene','Luna',null,'Rt. 1 Box 42',null,'Wewoka','OK','74884','Seminole','STANDARD',to_date('03-FEB-06','DD-MON-RR'),9560,1094,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (27562,1,4557,52781,1115,to_date('08-NOV-05','DD-MON-RR'),null,'Mailed',to_date('08-NOV-05','DD-MON-RR'),'Boo',null,'BOB','Jolene','Luna',null,'5590 Rebel Ridge Rd',null,'El Reno','OK','73036','Canadian','STANDARD',to_date('16-OCT-05','DD-MON-RR'),8034,1115,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (11822,1,4557,39238,1094,to_date('03-MAR-05','DD-MON-RR'),null,'Mailed',to_date('03-MAR-05','DD-MON-RR'),'Boo',null,'BOB','Jolene','Luna',null,'Rt. 1, Box 42',null,'Wewoka','OK','74884','Seminole','STANDARD',to_date('17-FEB-05','DD-MON-RR'),306,null,null,null,null,null,null,null,null,'Ms.',null,null);
Insert into SOR_TRACKING (TRACKING_ID,LETTER_ID,OFFENDER_ID,LOCATION_ID,OFFICE_ID,MAIL_DATE,RESPONSE_DATE,STATUS,ENTRY_DATE,ENTRY_USER,COMMENTS,FIRST_NAME,MIDDLE_NAME,LAST_NAME,SIR_NAME,ADDRESS1,ADDRESS2,CITY,STATE,ZIP,COUNTY,OFFENDER_TYPE,LAST_MAIL_DATE,BATCH_ID,JURISDICTION,DL_NAME,DL_OFFICE,DL_ADDRESS,DL_MAILING_ADDRESS,DL_CITY,DL_STATE,DL_ZIP,TITLE,MODIFIED_USER,MODIFIED_DATE) values (2257,1,4557,18958,1087,to_date('03-AUG-04','DD-MON-RR'),null,'Mailed',to_date('29-JAN-05','DD-MON-RR'),'SYSTEM',null,'BOB','Jolene','Luna',null,'1037 Tabor Drive',null,'Oklahoma City','OK','73107','Oklahoma',null,null,178,null,null,null,null,null,null,null,null,null,null,null);
I want to find all the records whose mail date is the earliest with no reply (no response date), then find the earliest later record with a response date, and display them as one row.
For example
mail_date+45   mail_date in table   earliest date of response
BEGIN_DATE     DEL_DATES            END_DATE    TRACKING_ID
17-SEP-04      03-AUG-04            06-MAR-07   2257
Other dates and their response dates should be shown the same way.
The issue is that I am not able to retrieve the other records in the table because I am using the MIN function.
here is my query
SELECT t.MAIL_DATE + 45 begin_Date,
del_date.del_dates,
(SELECT MIN(st2.RESPONSE_DATE) Del_end_date
FROM sor_tracking st2
WHERE t.OFFENDER_ID = st2.OFFENDER_ID
AND st2.RESPONSE_DATE > del_date.del_dates
) End_date,
t.OFFENDER_ID,
t.TRACKING_ID
FROM
(SELECT MIN(st1.MAIL_DATE) del_dates
FROM sor_tracking st1
WHERE st1.OFFENDER_ID = 4557
AND st1.RESPONSE_DATE IS NULL
AND st1.MAIL_DATE < SysDate - 45
AND st1.STATUS IN ('Letter Returned', 'Mailed')
AND st1.LETTER_ID IN (1, 4)
) del_date,
sor_tracking t
WHERE t.MAIL_DATE <= del_date.del_dates
AND (t.OFFENDER_ID = 4557
AND t.RESPONSE_DATE IS NULL)
ORDER BY 1 desc
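One way around the MIN limitation is to compute the earliest later response with an analytic MIN instead of a scalar subquery. This is only a minimal sketch, assuming the posted sor_tracking columns and that responses can be paired with mailings by ordering on mail_date; it is not a tested solution:

```sql
select begin_date, del_dates, end_date, offender_id, tracking_id
from (select t.mail_date + 45 as begin_date,
             t.mail_date      as del_dates,
             -- earliest response among later-mailed rows (MIN skips NULLs)
             min(t.response_date) over (partition by t.offender_id
                                        order by t.mail_date
                                        rows between 1 following
                                                 and unbounded following) as end_date,
             t.response_date,
             t.offender_id,
             t.tracking_id
      from sor_tracking t
      where t.offender_id = 4557)
where response_date is null
order by del_dates desc;
```

Because the window keeps all rows of the offender while the outer query filters to the unanswered mailings, every unanswered mailing is returned, not just the MIN one.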
Thanks for your time and help.
with
sor_tracking as
(select 4557 offender_id,
1 letter_id,
to_date('05-NOV-13','dd-MON-yy') mail_date,
781410 tracking_id,
110207 location_id,
null response_date,
'Mailed' status
from dual
union all
select 4557,1,to_date('08-FEB-12','dd-MON-yy'),190294,110207,to_date('15-FEB-12','dd-MON-yy'),'Verified' from dual union all
select 4557,1,to_date('05-DEC-11','dd-MON-yy'),184647,110207,to_date('13-DEC-11','dd-MON-yy'),'Verified' from dual union all
select 4557,1,to_date('03-FEB-11','dd-MON-yy'),157253,102288,null,'Mailed' from dual union all
select 4557,1,to_date('01-NOV-10','dd-MON-yy'),149710,102288,to_date('08-NOV-10','dd-MON-yy'),'Verified' from dual union all
select 4557,1,to_date('03-JUN-10','dd-MON-yy'),138268,99564,to_date('17-JUN-10','dd-MON-yy'),'Letter Returned' from dual union all
select 4557,1,to_date('04-FEB-10','dd-MON-yy'),128798,91503,null,'Mailed' from dual union all
select 4557,1,to_date('07-AUG-09','dd-MON-yy'),115073,91503,to_date('10-NOV-09','dd-MON-yy'),'Verified' from dual union all
select 4557,1,to_date('05-MAY-09','dd-MON-yy'),108510,91503,to_date('06-MAY-09','dd-MON-yy'),'Verified' from dual union all
select 4557,1,to_date('05-FEB-09','dd-MON-yy'),101463,88052,null,'Mailed' from dual union all
select 4557,1,to_date('05-DEC-08','dd-MON-yy'),97272,88052,to_date('15-DEC-08','dd-MON-yy'),'Verified' from dual union all
select 4557,1,to_date('10-OCT-07','dd-MON-yy'),68398,76399,to_date('15-NOV-07','dd-MON-yy'),'Letter Returned' from dual union all
select 4557,1,to_date('09-AUG-07','dd-MON-yy'),63767,76399,to_date('20-AUG-07','dd-MON-yy'),'Letter Returned' from dual union all
select 4557,1,to_date('14-MAY-07','dd-MON-yy'),59499,76399,to_date('17-MAY-07','dd-MON-yy'),'Verified' from dual union all
select 4557,1,to_date('01-MAR-07','dd-MON-yy'),54783,74693,to_date('06-MAR-07','dd-MON-yy'),'Verified' from dual union all
select 4557,1,to_date('08-FEB-06','dd-MON-yy'),34100,58761,null,'Mailed' from dual union all
select 4557,1,to_date('08-NOV-05','dd-MON-yy'),27562,52781,null,'Mailed' from dual union all
select 4557,1,to_date('03-MAR-05','dd-MON-yy'),11822,39238,null,'Mailed' from dual union all
select 4557,1,to_date('03-AUG-04','dd-MON-yy'),2257,18958,null,'Mailed' from dual
)
select offender_id,
letter_id,
to_char(event_date,'dd-MON-yy') mail_date,
tracking_id,
location_id,
case when last_response_verified is null
then status
end status,
to_char(last_response_verified,'dd-MON-yy') response_date,
case when last_response_verified is not null
then 'Verified'
end response_status
from (select offender_id,
letter_id,
event_date,
tracking_id,
location_id,
status,
to_find,
verified_response,
last_value(verified_response ignore nulls) over (partition by offender_id,letter_id
order by event_date desc
) last_response_verified
from (select offender_id,
letter_id,
event_date,
tracking_id,
location_id,
status,
case when event_date < lead(event_date) over (partition by offender_id,letter_id,tracking_id
order by event_date
) - 45
then 'THIS'
when status = 'Mailed'
and lead(tracking_id) over (partition by offender_id,letter_id,tracking_id
order by event_date
) is null
and direction != lag(direction,1,'null') over (partition by offender_id,letter_id
order by event_date
)
then 'this'
end to_find,
case when status = 'Verified'
then event_date
end verified_response
from (select offender_id,
letter_id,
mail_date event_date,
tracking_id,
location_id,
'Mailed' status,
'sent' direction
from sor_tracking
union all
select offender_id,
letter_id,
response_date,
tracking_id,
location_id,
status,
'rcvd' direction
from sor_tracking
where response_date is not null
)
)
)
where to_find is not null
order by event_date desc
OFFENDER_ID  LETTER_ID  MAIL_DATE  TRACKING_ID  LOCATION_ID  STATUS  RESPONSE_DATE  RESPONSE_STATUS
4557         1          05-NOV-13  781410       110207       Mailed
4557         1          03-FEB-11  157253       102288               13-DEC-11      Verified
4557         1          04-FEB-10  128798       91503                08-NOV-10      Verified
4557         1          07-AUG-09  115073       91503                10-NOV-09      Verified
4557         1          05-FEB-09  101463       88052                06-MAY-09      Verified
4557         1          03-AUG-04  2257         18958                06-MAR-07      Verified
Regards
Etbin -
Web Analysis Error -- Error while executing query and retrieving data
Regarding Web Analysis:
We have a number of reports that we created. Yesterday they were all working fine; today we try to open them and most are generating an error.
The error is:
Error while executing query and retrieving data.; nested exception is:
com.hyperion.ap.APException: [1033] Native:
1013033[Thu Oct 22 09:08:17
2009]server name/application name/database name/user name/Error(1013033)
ReportWriter exit abnormally
Does anyone have any insight into what is going on?
My version information is:
Hyperion System 9 BI+ Analytic Administration Services 9.3.0.1.1 Build 2
Web Analysis 9.3.0.0.0.286
Thanks in advance for your help.
DaveW
Hi,
Also, open the query in Query Designer and click the Check option, as Mr. Arun suggested.
If you get any error when checking, see the long message (detail).
With rgds,
Anil Kumar Sharma .P -
How to use ADF Query search with EJB 3.0
Hi,
In ADF guide http://download.oracle.com/docs/cd/E12839_01/web.1111/b31974/web_search_bc.htm#CIHIJABA
The steps to create query search with ADF Business Components says:
"From the Data Controls panel, select the data collection and expand the Named Criteria node to display a list of named view criteria."
But with EJB, I'm not able to find Named Criteria node. Can we use ADF query search component with EJB? If yes, can you please show me some example, tutorial etc.?
Thanks
BJ
For EJBs you'll need to implement the query model on your own.
An example of how the model should look like is in the ADF Faces components demo.
http://jdevadf.oracle.com/adf-richclient-demo/faces/components/query.jspx
Code here:
http://www.oracle.com/technology/products/adf/adffaces/11/doc/demo/adf_faces_rc_demo.html -
Issue with the Data Type 'Number' in Business Objects
Hi,
I have an object in the Universe whose data type is Number. The column pertaining to this object has certain values in the database, among which is a 17-digit value: 00000000031101165.
Now, when retrieving the same value through Business Objects, it gets rounded off to 00000000031101200 automatically when viewed in WebI, and when retrieved in Designer/Deski it displays as 0.000000003110116E+16.
So, I would like to know whether there is any alternative for retrieving the original value without rounding. Also, is there any limitation on the Number data type in Business Objects? The version we are on is XI 3.1.
Note: There are no functions used on this object at the universe level, and I would not like to use any functions here.
What is the underlying database?
It looks like the data is considered to have two decimals, but is rounded to zero decimals.
Only you don't see the number formatting.
Is this a BW query?
Is this a calculated keyfigure?
In the query you can specify the rounding you want and it is also possible to specify it on an infoobject level.
Check those settings...
Hope this helps,
Marianne
PS. Oh, and about the formatting, you can specify a default object format in the universe and override it on the final client (WebI, Crystal) -
Help needed in converting date to number
Hi,
I am trying to execute the below query, its failing with "invalid number" error.
select * from process_status
where time_process > to_char('&3','yyyymmdd')
time_process is the "number" datatype.
Details:
SQL> select * from process_status
2 where time_process > to_char('&3','yyyymmdd');
Enter value for 3: 01-MAY-09
old 2: where time_process > to_char('&3','yyyymmdd')
new 2: where time_process > to_char('01-MAY-09','yyyymmdd')
where time_process > to_char('01-MAY-09','yyyymmdd')
ERROR at line 2:
ORA-01722: invalid number
If I execute the below query it works fine; I face the problem only when I pass the date explicitly to the query. Please help me with converting the date to number format.
select * from etl_process_status
where time_process > '20090501'
and rownum < 3;
Thanks
As others have said, it would be best to have your time_process column stored as a date in Oracle, if it contains dates.
The reason for this is because if you store your dates as a number, you have removed information from Oracle that says "This is a date", so when Oracle comes to try and estimate the number of rows greater than a specific value, it can no longer guess correctly, since it's basing its guess on that column being a number.
There are lots more numbers between '20091201' and '20091101' than there are days between 1st November and 1st December.
By hiding the fact that your data is a date, you could throw your execution plan off and end up with sub-optimal performance.
In short, store your data in the correct column format! -
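Until the column can be converted, the usual workaround for the original question is to turn the date parameter into a matching YYYYMMDD number rather than comparing a string to the NUMBER column. A minimal sketch, assuming time_process stores dates as YYYYMMDD numbers as shown in the thread:

```sql
-- Convert the date parameter to a YYYYMMDD number so the comparison is
-- number-to-number (assumes time_process holds dates encoded as YYYYMMDD)
select *
from process_status
where time_process > to_number(to_char(to_date('&3', 'DD-MON-RR'), 'YYYYMMDD'));
```

With '01-MAY-09' supplied for &3, the right-hand side becomes the number 20090501, matching the literal that already worked in the second query.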
Hi
I am working on BLS and having a problem with an XML query. I want to perform some calculations over XML columns, then total these as a new column. I can do this part in the Logic Editor itself, but can I do both tasks with XSLT?
Can we write our own XSLT for this?
I am somewhat intimidated by XSLT. Can anybody help me with this?
Thanks a lot in advance
thomas
Ram,
In xMII there is a list of predefined XSLT transforms that do something similar to what you are explaining. The three that I think may be what you are looking for are under Calculation Transformations and the Subtotal Transformation; take a look at these and tell me if they do what you want to accomplish. In the xMII help file, do a search on Inline Transforms, or navigate to Advanced Topics -> Inline Transforms -> Predefined Inline Transforms. That section has examples of how to use these transforms and apply them in query templates. If this is not what you are looking for, can you explain in a little more detail, along with a simple example of how you want this transform to work? Also, why do you want to use XSLT if you can already accomplish this in BLS?
Regards,
Erik -
Can't i use xml schema and oledb data connection at the same time?
Hello to all, and thanks in advance. I use an XML schema and an OLE DB data connection at the same time, and the problem is that when I try to export the XML, the outcome is not what I expect. Without the OLE DB connection everything is OK (just the schema) and the XML complies with the schema.
Can't I have both the schema and OLE DB, with the exported XML as I want it?
You can use both at the same time, but not for Internet access, if that's what you're asking.
Now there is a thing called Link Aggregation, which combines a number of interfaces for speed/redundancy, but it really only works locally, and then only with ALL special equipment in the route, and most likely OSX Server involved.
Sorry.