Executing spatial queries on someone else's tables
I am trying to execute a spatial
query using a table owned by
someone else.
The SQL is...
select a.plancoupe, a.coupecrop, a.ha
from dba1.coupes a, dba1.coupes b
where b.plancoupe = 'CM004H' and
mdsys.sdo_within_distance(a.geom,b.geom,
'distance=1000 querytype=WINDOW')
= 'TRUE'
The query works fine when I am connected
as dba1, but when I connect as someone
else I get...
ERROR at line 1:
ORA-29902: error in executing ODCIIndexStart() routine
ORA-13211: failed to tessellate the window object
ORA-13209: internal error while reading SDO_INDEX_METADATA table
ORA-06512: at "MDSYS.SDO_INDEX_METHOD", line 73
ORA-06512: at line 1
I have granted SELECT permissions on the
table in question to PUBLIC, and also on
dba1.sdo_geom_metadata and the spatial
index table generated by the spatial
index creation routines...
Any ideas?
regards
Simon
Hi Simon,
From your e-mail below, I assume you are
running Oracle 8.1.5.
Here is what you need to do for 8.1.5.
Hope this helps.
Please let me know if you have any questions. Thanks.
Dan
======
In Oracle 8.1.6, your query will work
if you do the following:
1) Grant select on dba1.coupes
2) Grant select on the spatial index
table created on dba1.coupes
In Oracle 8.1.5 ONLY, do steps 3 and 4:
3) In addition to steps 1 and 2 above,
grant select on sdo_geom_metadata
4) As the Oracle MDSYS user, you must run
the following:
connect mdsys/mdsys
drop view sdo_index_metadata;
create view sdo_index_metadata as
select SDO_INDEX_OWNER, SDO_INDEX_NAME, SDO_INDEX_TABLE, SDO_INDEX_PRIMARY,
SDO_TSNAME, SDO_COLUMN_NAME,
SDO_LEVEL, SDO_NUMTILES,
SDO_MAXLEVEL, SDO_COMMIT_INTERVAL,
SDO_FIXED_META,
SDO_TABLESPACE,
SDO_INITIAL_EXTENT,
SDO_NEXT_EXTENT,
SDO_PCTINCREASE,
SDO_MIN_EXTENTS,
SDO_MAX_EXTENTS
from SDO_INDEX_METADATA_TABLE
where
(exists
(select table_name from all_tables
where table_name=sdo_index_table and owner=sdo_index_owner)
or
exists
(select view_name from all_views
where view_name=sdo_index_table and owner=sdo_index_owner)
or
exists
(select table_name from all_object_tables
where table_name=sdo_index_table and owner=sdo_index_owner));
grant select on sdo_index_metadata to public;
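To make steps 1-3 concrete, the grants might look like this when run as dba1. This is a sketch only: the grantee (simon) and the spatial index table name (coupes_spind_fl7$) are placeholders -- the real index table name is generated by the spatial index creation routines and can be looked up in the SDO_INDEX_METADATA view.

```sql
-- Run as dba1. Grantee and index-table name are hypothetical examples.
grant select on dba1.coupes to simon;              -- step 1
grant select on dba1.coupes_spind_fl7$ to simon;   -- step 2: spatial index table
grant select on dba1.sdo_geom_metadata to simon;   -- step 3: 8.1.5 only
```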
Similar Messages
-
How would I go about dropping a table that belongs to another user? I can see the table in 'all_tables', but have no idea how to drop it. I've got DBA privileges on my account as well.
Up until now I've only been dealing with tables under my own username, so this is new to me.
Hi,
If the table is not in your schema, then you have to say schema_name.table_name instead of just table_name.
For example, if the table is named bar, and the owner of the table is foo:
DROP TABLE foo.bar;
To do this, you need the DROP ANY TABLE system privilege. If you're not trying to do this from a stored procedure, then the privilege may be granted to some role that you have. If you are trying to do this from within a stored procedure, then you may need the privilege granted directly to you.
You usually don't want to grant the DROP ANY TABLE system privilege to many users. An alternative could be to create a stored procedure in the foo schema that drops tables (perhaps only certain designated tables) in the foo schema. Foo could grant EXECUTE privileges on this procedure to other users, and they would need no special system privileges. -
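A minimal sketch of the stored-procedure alternative described above, with all table and procedure names invented for illustration:

```sql
-- Definer-rights procedure in the FOO schema; callers need only EXECUTE,
-- not DROP ANY TABLE. The list of droppable tables is hypothetical.
CREATE OR REPLACE PROCEDURE foo.drop_designated_table (p_table IN VARCHAR2)
AS
BEGIN
  -- only allow certain designated tables to be dropped
  IF UPPER(p_table) NOT IN ('BAR', 'SCRATCH_DATA') THEN
    RAISE_APPLICATION_ERROR(-20001, 'Table may not be dropped: ' || p_table);
  END IF;
  EXECUTE IMMEDIATE 'DROP TABLE foo.' || UPPER(p_table);
END;
/
GRANT EXECUTE ON foo.drop_designated_table TO some_user;
```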
Spatial Queries are CPU bound and show very heavy use of query buffers
Hi,
Spatial Queries:
When using tkprof to analyse spatial queries it is clear that
there are implicit queries being done by Oracle Spatial which
use vast amounts of buffers, and seem unable to cache basic
information from query to query - thus leaving our machine
CPU bound when stress testing Oracle Spatial. For example,
the trace below shows how information which is fixed for a
table, and not likely to change very often, is being retrieved
inefficiently (note the 26729 query buffers used to do 6
executions of what should be immediately available!!!):
TKPROF: Release 8.1.7.0.0 - Production on Tue Oct 16 09:43:38
2001
(c) Copyright 2000 Oracle Corporation. All rights reserved.
SELECT ATTR_NO, ATTR_NAME, ATTR_TYPE_NAME, ATTR_TYPE_OWNER
FROM
ALL_TYPE_ATTRS WHERE OWNER = :1 AND TYPE_NAME = :2 ORDER BY
ATTR_NO
call count cpu elapsed disk query rows
Parse 6 0.00 0.01 0 0 0
Execute 6 0.00 0.01 0 0 0
Fetch 6 0.23 0.41 0 26729 5
total 18 0.23 0.43 0 26729 5
Misses in library cache during parse: 0
Optimizer goal: CHOOSE
Parsing user id: 37 (NAGYE)
Rows Row Source Operation
0 SORT ORDER BY
0 FILTER
1 NESTED LOOPS
1 NESTED LOOPS
290 NESTED LOOPS
290 NESTED LOOPS
290 NESTED LOOPS
290 NESTED LOOPS
290 TABLE ACCESS FULL ATTRIBUTE$
578 TABLE ACCESS CLUSTER TYPE$
578 TABLE ACCESS CLUSTER TYPE$
578 INDEX UNIQUE SCAN (object id 255)
578 TABLE ACCESS BY INDEX ROWID OBJ$
578 INDEX RANGE SCAN (object id 35)
578 TABLE ACCESS CLUSTER USER$
578 INDEX UNIQUE SCAN (object id 11)
289 TABLE ACCESS BY INDEX ROWID OBJ$
578 INDEX RANGE SCAN (object id 35)
0 TABLE ACCESS CLUSTER USER$
0 INDEX UNIQUE SCAN (object id 11)
0 FIXED TABLE FULL X$KZSPR
0 NESTED LOOPS
0 FIXED TABLE FULL X$KZSRO
0 INDEX RANGE SCAN (object id 101)
error during parse of EXPLAIN PLAN statement
ORA-01039: insufficient privileges on underlying objects of the
view
and again:
SELECT diminfo, nvl(srid,0)
FROM
ALL_SDO_GEOM_METADATA WHERE OWNER = 'NAGYE' AND TABLE_NAME =
NLS_UPPER('TILE_MED_LINES_MBR') AND '"'||COLUMN_NAME||'"'
= '"GEOM"'
call count cpu elapsed disk query current rows
Parse 20 0.00 0.04 0 0 0 0
Execute 20 0.00 0.00 0 0 0 0
Fetch 20 0.50 0.50 0 5960 100 20
total 60 0.50 0.54 0 5960 100 20
Misses in library cache during parse: 0
Optimizer goal: CHOOSE
Parsing user id: 37 (NAGYE) (recursive depth: 1)
Rows Row Source Operation
1 FILTER
2 TABLE ACCESS BY INDEX ROWID SDO_GEOM_METADATA_TABLE
2 INDEX RANGE SCAN (object id 24672)
1 UNION-ALL
1 FILTER
1 NESTED LOOPS
1 NESTED LOOPS
1 NESTED LOOPS OUTER
1 NESTED LOOPS OUTER
1 NESTED LOOPS OUTER
1 NESTED LOOPS OUTER
1 NESTED LOOPS
1 TABLE ACCESS FULL OBJ$
1 TABLE ACCESS CLUSTER TAB$
1 INDEX UNIQUE SCAN (object id 3)
0 TABLE ACCESS BY INDEX ROWID OBJ$
1 INDEX UNIQUE SCAN (object id 33)
0 INDEX UNIQUE SCAN (object id 33)
0 TABLE ACCESS CLUSTER USER$
1 INDEX UNIQUE SCAN (object id 11)
1 TABLE ACCESS CLUSTER SEG$
1 INDEX UNIQUE SCAN (object id 9)
1 TABLE ACCESS CLUSTER TS$
1 INDEX UNIQUE SCAN (object id 7)
1 TABLE ACCESS CLUSTER USER$
1 INDEX UNIQUE SCAN (object id 11)
0 FILTER
0 NESTED LOOPS
0 NESTED LOOPS OUTER
0 NESTED LOOPS
0 TABLE ACCESS FULL USER$
0 TABLE ACCESS BY INDEX ROWID OBJ$
0 INDEX RANGE SCAN (object id 34)
0 INDEX UNIQUE SCAN (object id 97)
0 INDEX UNIQUE SCAN (object id 96)
0 FIXED TABLE FULL X$KZSPR
0 NESTED LOOPS
0 FIXED TABLE FULL X$KZSRO
0 INDEX RANGE SCAN (object id 101)
0 FIXED TABLE FULL X$KZSPR
0 NESTED LOOPS
0 FIXED TABLE FULL X$KZSRO
0 INDEX RANGE SCAN (object id 101)
error during parse of EXPLAIN PLAN statement
ORA-01039: insufficient privileges on underlying objects of the
view
Note: The actual query being performed is:
select a.id, a.geom
from
tile_med_lines_mbr a where sdo_relate(a.geom,mdsys.sdo_geometry
(2003,NULL,
NULL,mdsys.sdo_elem_info_array
(1,1003,3),mdsys.sdo_ordinate_array(151.21121,
-33.86325,151.21132,-33.863136)), 'mask=anyinteract
querytype=WINDOW') =
'TRUE'
call count cpu elapsed disk query current rows
Parse 1 0.00 0.00 0 0 0 0
Execute 1 0.08 0.08 0 4 0 0
Fetch 5 1.62 21.70 0 56 0 827
total 7 1.70 21.78 0 60 0 827
Misses in library cache during parse: 0
Optimizer goal: CHOOSE
Parsing user id: 37 (NAGYE)
Rows Row Source Operation
827 TABLE ACCESS BY INDEX ROWID TILE_MED_LINES_MBR
828 DOMAIN INDEX
Rows Execution Plan
0 SELECT STATEMENT GOAL: CHOOSE
827 TABLE ACCESS GOAL: ANALYZED (BY INDEX ROWID) OF
'TILE_MED_LINES_MBR'
828 DOMAIN INDEX OF 'TILE_MLINES_SPIND'
CPU: none, I/O: none
call count cpu elapsed disk query current rows
Parse 1 0.00 0.00 0 92
Execute 1 0.00 0.00 0 22
Fetch 1 0.00 0.00 38 236
total 3 0.00 0.00 38 350
Misses in library cache during parse: 1
Optimizer goal: CHOOSE
Parsing user id: 37 (NAGYE)
Rows Row Source Operation
12 TABLE ACCESS BY INDEX ROWID ROADELEMENT_MBR
178 DOMAIN INDEX
Rows Execution Plan
0 SELECT STATEMENT GOAL: CHOOSE
12 TABLE ACCESS GOAL: ANALYZED (BY INDEX ROWID) OF
'ROADELEMENT_MBR'
178 DOMAIN INDEX OF 'RE_MBR_SPIND'
CPU: none, I/O: none
Can Oracle improve the performance of Oracle Spatial by
changing the implementation to use alternative implicit
queries that do not consume these vast amounts of memory?
Cheers
Alex Eadie
Hi Ravi,
Thank you for your reply.
Here are some more details for you:
Yes, the queries are cached in the sense that they get their
data from RAM and not from disk; however, the number of buffers
used internally by the Oracle RDBMS/Spatial is rather large
(namely >5000 per query, or >40MByte) and results in significant
CPU usage - as I'm sure you'd agree. Those numerous internal
queries take >10ms of CPU time each, which is cumulative.
A single real query of ours will take between 180ms and 580ms
depending on the number of results returned.
An example query is:
select a.id, a.geom
from tile_med_lines_mbr a where sdo_relate
(a.geom,mdsys.sdo_geometry
(2003,NULL, NULL,mdsys.sdo_elem_info_array
(1,1003,3),mdsys.sdo_ordinate_array(151.21121,
-33.86325,151.21132,-33.863136)), 'mask=anyinteract
querytype=WINDOW') = 'TRUE'
Our 500MHz PC server database needs only 3 processes running
these queries simultaneously to reach 100% CPU load.
The disk is hardly utilized.
The data is the main roads in Sydney, Australia.
The tables, data and indexes were created as shown below:
1. Create the Oracle tables:
create table tile_med_nodes_mbr (
id number not null,
geom mdsys.sdo_geometry not null,
xl number not null,
yl number not null,
xh number not null,
yh number not null);
create table tile_med_lines_mbr (
id number not null,
fromid number not null,
toid number not null,
geom mdsys.sdo_geometry not null,
xl number not null,
yl number not null,
xh number not null,
yh number not null);
2. Use the sqlldr Oracle loader utility to load the data
into Oracle.
% sqlldr userid=csiro_scats/demo control=nodes.ctl
% sqlldr userid=csiro_scats/demo control=lines.ctl
3. Determine the covering spatial extent for the tile
mosaic and use this to create the geometry metadata.
% sqlplus
SQLPLUS> set numw 12
SQLPLUS> select min(xl), min(yl), max(xh), max(yh)
from (select xl, yl, xh, yh
from tile_med_nodes_mbr union
select xl, yl, xh, yh
from tile_med_lines_mbr);
insert into USER_SDO_GEOM_METADATA
(TABLE_NAME, COLUMN_NAME, DIMINFO)
VALUES ('TILE_MED_NODES_MBR', 'GEOM',
MDSYS.SDO_DIM_ARRAY
(MDSYS.SDO_DIM_ELEMENT('X', 151.21093421,
151.21205421, 0.000000050),
MDSYS.SDO_DIM_ELEMENT('Y', -33.86347146,
-33.86234146, 0.000000050)));
insert into USER_SDO_GEOM_METADATA
(TABLE_NAME, COLUMN_NAME, DIMINFO)
VALUES ('TILE_MED_LINES_MBR', 'GEOM',
MDSYS.SDO_DIM_ARRAY
(MDSYS.SDO_DIM_ELEMENT('X', 151.21093421,
151.21205421, 0.000000050),
MDSYS.SDO_DIM_ELEMENT('Y', -33.86347146,
-33.86234146, 0.000000050)));
4. Validate the data loaded:
create table result
(UNIQ_ID number, result varchar2(10));
execute sdo_geom.validate_layer
('TILE_MED_NODES_MBR','GEOM','ID','RESULT');
select result, count(result)
from RESULT
group by result;
truncate table result;
execute sdo_geom.validate_layer
('TILE_MED_LINES_MBR','GEOM','ID','RESULT');
select result, count(result)
from RESULT
group by result;
drop table result;
5. Fix any problems reported in the result table.
6. Create a spatial index, use the spatial index advisor to
determine the sdo_level.
create index tile_mlines_spind on
tile_med_lines_mbr (geom) indextype is
mdsys.spatial_index parameters
( 'sdo_level=7,initial=1M,next=1M,pctincrease=0');
7. Analyse table:
analyze table TILE_MED_LINES_MBR compute statistics;
8. Find the spatial index table name:
select sdo_index_table, sdo_column_name
from user_sdo_index_metadata
where sdo_index_name in
(select index_name
from user_indexes
where ityp_name = 'SPATIAL_INDEX'
and table_name = 'TILE_MED_LINES_MBR');
9. Analyse spatial index table:
analyze table TILE_MLINES_SPIND_FL7$
compute statistics;
I hope this helps.
Cheers
Alex Eadie -
How to execute multiple queries in one stored procedure.
Hi,
I am Kumar,
How to execute multiple queries in one stored procedure.
here is the my requirements,
1. get the max value from one table and sum of the that value.
2. insert the values and also sum of the max value.
using stored procedure
I am using SQL server 2000 database.
Please help me.
Advance thanks
by,
Kumar
This is not a Java question and it is not even a problem: your only problem is
1) lack of knowledge
2) lack of interest to find a manual
But you are going to have to change both by actually reading a book or a manual that explains the stored procedure language of SQL Server. It is the same as Sybase I think, so you could also look for a manual for that DBMS. -
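For what it's worth, a minimal SQL Server 2000 sketch of what Kumar describes might look like the following; all table and column names are invented, since the requirement as stated is vague:

```sql
-- Hypothetical T-SQL sketch: read the max of one table, then insert it
-- together with a sum into another table, in a single stored procedure.
CREATE PROCEDURE dbo.InsertMaxAndSum
AS
BEGIN
    DECLARE @maxval INT
    SELECT @maxval = MAX(amount) FROM dbo.source_table
    INSERT INTO dbo.target_table (max_amount, total_amount)
    SELECT @maxval, SUM(amount) FROM dbo.source_table
END
```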
Start a workflow for a list item that was created by someone else
What settings do I need to change so that I can start a workflow for a list item created by another user?
I have a SharePoint 2013 workflow (let's call it LSR Status Workflow) that is associated with a list (called
LSR List). When a user creates an item in the LSR List, it automatically starts the
LSR Status Workflow. That is what I wanted, but sometimes I make changes to the workflow (via SharePoint designer) and then I would want to terminate the existing workflows that are running and restart them.
When I try to start a workflow for a list item other than one I created myself, I get the following error:
Retrying last request. Next attempt scheduled in less than one minute. Details of last request: HTTP NotFound to https://publishing.web.company.com/sites/mysite/_vti_bin/client.svc/web/lists/getbyid(guid'1f844b8f-19aa-4587-bcc2-dfb7085f36b5')/Items(31)
Correlation Id: 8efc5304-f0a3-90f6-8ece-6875bf811869 Instance Id:
60c83aae-5c25-4ee8-9c85-c64958ba701e
Then when the workflow is finally suspended after it keeps retrying, it reports the following error:
RequestorId: 8efc5304-f0a3-90f6-0000-000000000000. Details: An unhandled
exception occurred during the execution of the workflow instance.
Exception details: System.ApplicationException: HTTP 404
{"Transfer-Encoding":["chunked"],"X-SharePointHealthScore":["0"],"SPClientServiceRequestDuration":["36"],"SPRequestGuid":["8efc5304-f0a3-90f6-9bbd-d18d4d90af1b"],"request-id":["8efc5304-f0a3-90f6-9bbd-d18d4d90af1b"],"X-FRAME-OPTIONS":["SAMEORIGIN"],"MicrosoftSharePointTeamServices":["15.0.0.4551"],"X-Content-Type-Options":["nosniff"],"X-MS-InvokeApp":["1;
RequireReadOnly"],"Cache-Control":["max-age=0, private"],"Date":["Tue,
10 Feb 2015 22:36:44
GMT"],"Set-Cookie":["BIGipServerpublishing-blv-80-pool=2825582466.20480.0000;
path=/"],"Server":["Microsoft-IIS/7.5"],"X-AspNet-Version":["4.0.30319"],"X-Powered-By":["ASP.NET"]}
at
Microsoft.Activities.Hosting.Runtime.Subroutine.SubroutineChild.Execute(CodeActivityContext
context) at
System.Activities.CodeActivity.InternalExecute(ActivityInstance
instance, ActivityExecutor executor, BookmarkManager bookmarkManager) at
System.Activities.Runtime.ActivityExecutor.ExecuteActivityWorkItem.ExecuteBody(ActivityExecutor
executor, BookmarkManager bookmarkManager, Location resultLocation)
If I created the list item, I can stop it and restart it without any problems, but this is not the case for list items created by someone else.
What settings do I need to change so that I can start a workflow for a list item created by another user? I am the owner of the SharePoint site and am able to make changes to permissions if needed.
You don't need to re-do the fields. If you create a new version of the PDF
file just open the old one and use the Replace Pages command to insert the
pages from the new version over the old ones. This will give you a new
version of the file, with the existing form fields still intact. Of
course, you might need to adjust their location and/or size, but at least
you won't have to start all over again...
On Thu, Jan 22, 2015 at 11:59 PM, Laura Holancin <[email protected]> -
Executing Abap Queries in Abap Code and processing the result
Hi,
I want to execute ABAP Queries (designed by sq01) in an abap report and processing the result in an internal table.
How could it be work?
Thanks a lot for your responses,
with kind Regards
Reinhold Strobl
Hello,
Go to SQ01 and select your query. Go to Menu QUERY-->More Functions->Display Report Name.
You can then take that report name and go to SE38. Copy the code before END-OF-SELECTION and then modify as per your own requirements.
Regards
Saket Sharma -
Spatial Queries Not Always Producing Accurate Results
Hi,
Spatial queries are not always producing accurate results. Here are the issues. We would appreciate any clarification you could provide to resolve these issues.
1. When querying for points inside a polygon that is not an MBR (minimum bounding rectangle), some of the coordinates returned are not inside the polygon. It is as though the primary filter is working, but not the secondary filter, when using sdo_relate. How can we verify that the spatial query using sdo_relate is applying the secondary filter?
2. SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT returns true when validating geometries even though we find results that are invalid.
3. Illegal geodetic coordinates can be inserted into a table: latitude > 90.0, latitude < -90.0, longitude > 180.0 or longitude < -180.0.
4. Querying for coordinates outside the MBR for the world where illegal coordinates existed did NOT return any rows, yet there were coordinates of long, lat: 181,91.
The following are examples and information relating to the above-referenced points.
select * from USER_SDO_GEOM_METADATA
TABLE_NAME COLUMN_NAME DIMINFO(SDO_DIMNAME, SDO_LB, SDO_UB, SDO_TOLERANCE) SRID
LASTKNOWNPOSITIONS THE_GEOM SDO_DIM_ARRAY(SDO_DIM_ELEMENT('X', -180, 180, .05), SDO_DIM_ELEMENT('Y', -90, 90, .05)) 8307
POSITIONS THE_GEOM SDO_DIM_ARRAY(SDO_DIM_ELEMENT('X', -180, 180, .05), SDO_DIM_ELEMENT('Y', -90, 90, .05)) 8307
Example 1: Query for coordinates inside NON-rectangular polygon includes points outside of polygon.
SELECT l.vesselid, l.latitude, l.longitude, TO_CHAR(l.observationtime,
'YYYY-MM-DD HH24:MI:SS') as obstime FROM lastknownpositions l where
SDO_RELATE(l.the_geom,SDO_GEOMETRY(2003, 8307, NULL,
SDO_ELEM_INFO_ARRAY(1, 1003, 1),
SDO_ORDINATE_ARRAY(-98.20268,18.05079,-57.30101,18.00705,-57.08229,
54.66061,-98.59638,32.87842,-98.20268,18.05079)),'mask=inside')='TRUE'
This query returns the following coordinates that are outside of the polygon:
vesselid : 1152 obstime : 2005-08-24 06:00:00 long : -82.1 lat : 45.3
vesselid : 3140 obstime : 2005-08-28 12:00:00 long : -80.6 lat : 44.6
vesselid : 1253 obstime : 2005-08-22 09:00:00 long : -80.0 lat : 45.3
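One way to check whether the secondary filter is being applied (a sketch, reusing the polygon and the suspect vessel ids from Example 1): SDO_GEOM.RELATE computes the exact relationship without going through the spatial index, so any row the sdo_relate query returned that RELATE does not classify as INSIDE points at a primary-filter-only result.

```sql
-- Re-evaluate the suspect rows exactly, bypassing the spatial index.
SELECT l.vesselid,
       SDO_GEOM.RELATE(l.the_geom, 'determine',
         SDO_GEOMETRY(2003, 8307, NULL,
           SDO_ELEM_INFO_ARRAY(1, 1003, 1),
           SDO_ORDINATE_ARRAY(-98.20268,18.05079,-57.30101,18.00705,
             -57.08229,54.66061,-98.59638,32.87842,-98.20268,18.05079)),
         0.05) AS exact_relation
  FROM lastknownpositions l
 WHERE l.vesselid IN (1152, 3140, 1253);
```

Bear in mind that with a geodetic SRID like 8307 the polygon edges are geodesics, so a point that looks outside by planar intuition can legitimately be inside.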
Example 2a: Using SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT
Select areaid, the_geom,
SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(the_geom, 0.005) from area where
areaid=24
ResultSet:
AREAID THE_GEOM(SDO_GTYPE, SDO_SRID, SDO_POINT(X, Y, Z), SDO_ELEM_INFO,
SDO_ORDINATES) SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(THE_GEOM,0.005)
24 SDO_GEOMETRY(2003, 8307, NULL, SDO_ELEM_INFO_ARRAY(1, 1003, 1), SDO_ORDINATE_ARRAY(-98.20268, 18.05079, -57.30101, 18.00705, -57.08229, 54.66061, -98.59638, 32.87842, -98.20268, 18.05079)) TRUE
Example 2b: Using SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT
Select positionid, vesselid, the_geom,
SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(the_geom, 0.005) from positions where vesselid=1152
ResultSet:
POSITIONID VESSELID THE_GEOM(SDO_GTYPE, SDO_SRID, SDO_POINT(X, Y, Z),
SDO_ELEM_INFO, SDO_ORDINATES) DO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(THE_GEOM,0.005)
743811 1152 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-82.1, 45.3, NULL), NULL, NULL) TRUE
743812 1152 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-82.1, 45.3, NULL), NULL, NULL) TRUE
743813 1152 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-80.2, 42.5, NULL), NULL, NULL) TRUE
743814 1152 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-80.2, 42.5, NULL), NULL, NULL) TRUE
Example 3: Invalid Coordinate values found in POSITIONS table.
SELECT p.positionid, p.latitude, p.longitude, p.the_geom FROM positions p
WHERE p.latitude < -180.0
2 lines from ResultSet:
POSITIONID LATITUDE LONGITUDE THE_GEOM(SDO_GTYPE, SDO_SRID, SDO_POINT(X, Y, Z), SDO_ELEM_INFO, SDO_ORDINATES)
714915 -210.85408 -79.74449 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-79.74449, -210.85408, NULL), NULL, NULL)
714938 -211.13632 -79.951256 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-79.951256, -211.13632, NULL), NULL, NULL)
SELECT p.positionid, p.latitude, p.longitude, p.the_geom FROM positions p
WHERE p.longitude > 180.0
3 lines from ResultSet:
POSITIONID LATITUDE LONGITUDE THE_GEOM(SDO_GTYPE, SDO_SRID, SDO_POINT(X, Y, Z), SDO_ELEM_INFO, SDO_ORDINATES)
588434 91 181 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(181, 91, NULL), NULL, NULL)
589493 91 181 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(181, 91, NULL), NULL, NULL)
589494 91 181 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(181, 91, NULL), NULL, NULL)
Example 4: Failure to locate illegal coordinates by querying for disjoint coordinates outside of MBR for the world:
SELECT p.vesselid, p.latitude, p.longitude, p.the_geom,
TO_CHAR(p.observationtime, 'YYYY-MM-DD HH24:MI:SS') as obstime,
SDO_GEOM.RELATE(p.the_geom, 'determine',
SDO_GEOMETRY(2003, 8307, NULL,SDO_ELEM_INFO_ARRAY(1, 1003, 1),
SDO_ORDINATE_ARRAY(-180.0,-90.0,180.0,-90.0,180.0,90.0,
-180.0,90.0,-180.0,-90.0)), .005) relationship FROM positions p where
SDO_GEOM.RELATE(p.the_geom, 'disjoint', SDO_GEOMETRY(2003, 8307,
NULL,SDO_ELEM_INFO_ARRAY(1, 1003, 1),
SDO_ORDINATE_ARRAY(-180.0,-90.0,180.0,-90.0,180.0,90.0,-80.0,90.0,
-180.0,-90.0)),.005)='TRUE'
no rows selected
Carol Saah
Hi Carol,
1) I think the results are correct. Note in a geodetic coordinate system adjacent points in a linestring or polygon are connected via geodesics. You are probably applying planar thinking to an ellipsoidal problem! I don't have time to do the full analysis right now, but a first guess is that is what is happening.
2) The query window seems to be valid. I don't think this is a problem.
3) Oracle will let you insert almost anything into a table. In the index, it probably wraps. If you validate, I think the validation routines will tell you it is illegal if you use the signature with diminfo, where the coordinate system bounds are included in the validation.
4) Your query window is not valid. Your data is not valid. As the previous reply stated, you need to have valid data. If you think in terms of a geodetic coordinate system, you will realize that -180.0,-90.0 and 180.0,-90.0 are really the same point. Also, Oracle has a rule that polygon geometries cannot be greater than half the surface of the Earth.
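The diminfo-based validation mentioned in point 3 could be sketched like this, reusing the POSITIONS table and THE_GEOM column from the examples above:

```sql
-- Validate against the layer's DIMINFO so the coordinate-system bounds
-- (-180..180, -90..90) are part of the check.
SELECT p.positionid,
       SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(p.the_geom, m.diminfo)
         AS validation_result
  FROM positions p, user_sdo_geom_metadata m
 WHERE m.table_name  = 'POSITIONS'
   AND m.column_name = 'THE_GEOM'
   AND (p.latitude  NOT BETWEEN  -90 AND  90
     OR p.longitude NOT BETWEEN -180 AND 180);
```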
Hope this helps. -
SQL Developer cannot execute multiple queries in one connection
Hi,
Using : SQL Developer 1.2.29.98, Oracle Database 9i on Windows XP.
Currently I have TOAD and PL/SQL Developer to handle things related to Oracle Database.
I found this SQL Developer tools as an interesting tool which might replace PL/SQL Developer (I assume).
I opened two SQL Worksheets and tried to execute 2 time-consuming queries simultaneously, and suddenly it froze (the window became grayed). Just like TOAD, we cannot smoothly execute multiple queries on a single connection in SQL Developer.
I can monitor the query process using TOAD (Session Browser) and found both queries were running while the SQL Developer window was grayed and seemed not to be functioning. After both queries finished, the SQL Developer window became normal and "alive" again, showing the expected results.
However, PL/SQL Developer can handle this. Executing multiple queries in a single application is one of its unbeatable features.
We can view each query's result as soon as it finishes, instead of waiting until the others have been processed, just by switching between the SQL Windows.
Is it true that SQL Developer doesn't support executing multiple queries?
Or is it a feature which we should request?
Or we have to activate this feature by doing some changes on configuration / preferences?
(Do we need to open two SQL Developer instances? What a memory-consuming solution.)
Regards,
Buntoro
SQL Developer connections are single threaded and also rather single minded (in that you can't do much else while a long query is running.)
There is an existing feature request http://apex.oracle.com/pls/otn/f?p=42626:39:3685254426061901::NO::P39_ID:4202 for which you can vote.
The workaround is to have multiple connections. Not brilliant but it works. -
How to use database control to execute sql queries which change at run time
Hi all,
I need to execute sql queries using database controls , where the sql changes
at run time
based on some condition. For eg. based on the condition , I can add some where
condition.
Eg. sql = select id,name from emp where id = ?.
based on some condition , I can add the following condition .
and location = ?.
Have anybody had this kind of situation.
thanks,
sathish
From the perspective of the database control, you've got two options:
1) use the sql: keyword to do parameter substitution. Your observation
about {foo} style substitution is correct -- this is like using a
PreparedStatement. To do substitution into the rest of the SQL
statement, you can use the {sql: foo} substitution syntax which was
undocumented in GA but is documented in SP2. Then, you can build up
the filter clause String yourself in a JPF / JWS / etc and pass it into
the DB control.
For example:
* @jc:sql statement="select * from product {sql: filter}"
public Product[] getProducts(String filter) throws SQLException;
This will substitute the String filter directly into the statement that
is executed. The filter string could be null, "", "WHERE ID=12345", etc.
2) you can use the DatabaseFilter object to build up a set of custom
sorts and filters and pass that object into the DB control method.
There have been other posts here about doing this, look for the subject
"DatabaseFilter example".
Hope that helps...
Eddie
Dan Hayes wrote:
"Sathish Venkatesan" <[email protected]> wrote:
Hi Maruthi,
The parameter substitution, I guess, is used like setting the values for
prepared
statements.
What I'm trying to do , is change the sql at run time based on some condition.
For example ,
consider the following query :
select col1,col2 from table t where t.col3 > 1
At run time , based on some condition , I need to add one more and condition.
i.e. select col1,col2 from table t where t.col3 > 1 and t.col4 < 10.
This MAY not address your issue but if you are trying to add "optional" parameters
you may try including ALL the possible parameters in the SQL but send in null
for those params that you don't want to filter on in any particular case. Then,
if you word your query
as follows:
select col1, col2 from table t where t.col3 > 1 and (t.col4 = {col4param} or
{col4param} is null) and (t.col5 = {col5param} or {col5param} is null) ...
you will get "dynamic" filters. In other words, col4 and col5 will only be
filtered if you send in non-null parameters for those arguments.
I have not tried this in a WL Workshop database control but I've used
this strategy dozens of times in stored procedures or jdbc prepared statements.
Good luck,
Dan -
Executing SQL queries in SAP-GUI e.g. select * from but000
Hello,
I am a newbie in the SAP world. Is there a way to run select statements from the SAP GUI? For example, I want to know how many rows are returned from a join such as:
select count(*) from tabA, tabB where tabA.id = tabB.id and tabA.Name is not null
Is it possible with SQVI (SQ01)?
Please help.
Testcase:
SQL> create table scott.testit
( id number not null,
value1 varchar2(10) not null )
tablespace DATA;
Table created.
SQL> desc scott.testit;
Name Null? Type
ID NOT NULL NUMBER
VALUE1 NOT NULL VARCHAR2(10)
SQL> insert into scott.testit (id,value1) values ( 1, 'Hello' );
1 row created.
SQL> commit;
Commit complete.
SQL> select * from scott.testit;
ID VALUE1
1 Hello
ADD COLUMN, the old fashioned way
SQL> alter table scott.testit add ( ADDFIELD1 varchar2(5) );
Table altered.
SQL> desc scott.testit;
Name Null? Type
ID NOT NULL NUMBER
VALUE1 NOT NULL VARCHAR2(10)
ADDFIELD1 VARCHAR2(5)
SQL> select * from scott.testit where ADDFIELD1 is null;
ID VALUE1 ADDFI
1 Hello
Works as expected
Try to get NOT NULL and DEFAULT to work
SQL> alter table scott.testit modify ( ADDFIELD1 NOT NULL );
alter table scott.testit modify ( ADDFIELD1 NOT NULL )
ERROR at line 1:
ORA-02296: cannot enable (SCOTT.) - null values found
SQL> alter table scott.testit modify ADDFIELD1 default '000';
Table altered.
SQL> alter table scott.testit modify ( ADDFIELD1 NOT NULL );
alter table scott.testit modify ( ADDFIELD1 NOT NULL )
ERROR at line 1:
ORA-02296: cannot enable (SCOTT.) - null values found
No surprise so far. You would usually need to update all NULL
values to some value, and then you would be able to enable the NOT NULL constraint,
although this may run for quite a while on big tables.
Now lets try the new stuff
SQL> alter table scott.testit drop column ADDFIELD1;
Table altered.
SQL> alter table scott.testit ADD ADDFIELD1 varchar2(3) DEFAULT '000' not null;
Table altered.
SQL> desc scott.testit
Name Null? Type
ID NOT NULL NUMBER
VALUE1 NOT NULL VARCHAR2(10)
ADDFIELD1 NOT NULL VARCHAR2(3) <<<< BING !!!
SQL> select * from scott.testit;
ID VALUE1 ADD
1 Hello 000 <<<< Default '000' is working
SQL> select * from scott.testit where ADDFIELD1 is NULL;
no rows selected <<<< NOW this might be surprising
SQL> insert into scott.testit (id,value1,addfield1) values (2,'Bye', '000');
1 row created.
SQL> commit; <<<< Trying to compare "real" '000' with DEFAULT '000'
Commit complete.
SQL> select * from scott.testit;
ID VALUE1 ADD
1 Hello 000 <<<< Added with default
2 Bye 000 <<<< inserted as '000'
SQL> alter table scott.testit modify ADDFIELD1 default '111';
Table altered.
SQL> select * from scott.testit; <<<< Now it gets exciting
ID VALUE1 ADD
1 Hello 000 <<<< WOA... How does this work?
2 Bye 000
SQL> set longC 20000 long 20000
SQL> select dbms_metadata.get_ddl('TABLE','TESTIT','SCOTT') from dual;
DBMS_METADATA.GET_DDL('TABLE','TESTIT','SCOTT')
CREATE TABLE "SCOTT"."TESTIT"
( "ID" NUMBER NOT NULL ENABLE,
"VALUE1" VARCHAR2(10) NOT NULL ENABLE,
"ADDFIELD1" VARCHAR2(3) DEFAULT '111' NOT NULL ENABLE <<<< No '000' DEFAULT
) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "DATA"
SQL>
Looks like Oracle is at least a whole lot more clever than I expected.
It must have stored the first default value somewhere else, as the documentation
says that the affected rows will NOT be updated (otherwise it would never work so fast).
I need to dig into how datablocks are dumped and read.
Just to finalize this:
SQL> alter table scott.testit modify ADDFIELD1 NULL;
Table altered.
SQL> select * from scott.testit;
ID VALUE1 ADD
1 Hello 000
2 Bye 000
SQL> select * from scott.testit where addfield1 is null;
no rows selected
SQL>
So the change persists even if you revert the constraint, although the data
should not have been changed. Surely need to do a datablock dump of this.
Need to do additional tests with indexes.
But right now I am running out of time.
May be someone else likes to join the expedition.
Volker -
ADF how to display a processing page when executing large queries
ADF how to display a processing page when executing large queries
The ADF application that I have written currently has the following structure:
DataPage (search.jsp) that contains a form that the user enters their search criteria --> forward action (doSearch) --> DataAction (validate) that validates the inputted values --> forward action (success) --> DataAction (performSearch) that has a refresh method dragged on it, and an action that manually sets the itterator for the collection to -1 --> forward action (success) --> DataPage (results.jsp) that displays the results of the then (hopefully) populated collection.
I am not using a database, I am using a java collection to hold the data and the refresh method executes a query against an Autonomy Server that retrieves results in XML format.
The problem I am experiencing is that sometimes a user may submit a very large query, and the browser times out whilst waiting for the results to be displayed; as a result a JBO-29000 null pointer error is shown.
I have previously got around this using Java servlets: when the processing servlet is called, it immediately redirects the browser to a processing page with an animation on it, so the user knows something is being processed. The processing page then re-calls the servlet every 3 seconds to check whether processing has completed, and if it has, forwards to the appropriate results page.
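The poll-while-processing pattern described above can be sketched roughly as follows; the class and method names are invented for illustration, and in the real application the Future would live in the HTTP session rather than in a field:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

/**
 * Sketch of the poll-while-processing pattern: the search runs in the
 * background while the client polls a status flag every few seconds.
 */
public class SearchPoller {
    private final ExecutorService executor = Executors.newSingleThreadExecutor();
    private Future<String> pendingSearch;

    /** Kick off the long-running search and return immediately. */
    public void startSearch(String query) {
        pendingSearch = executor.submit(() -> {
            Thread.sleep(100); // stand-in for the slow Autonomy/XML query
            return "results for " + query;
        });
    }

    /** What the processing page asks on every poll interval. */
    public boolean isDone() {
        return pendingSearch != null && pendingSearch.isDone();
    }

    /** Called once the search is done, i.e. forward to the results page. */
    public String results() {
        try {
            return pendingSearch.get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public void shutdown() {
        executor.shutdown();
    }
}
```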
Unfortunately I cannot stop users from entering large queries, as the system requires users to be able to search in excess of 5 million documents on a regular basis.
I'd appreciate any help/suggestions you may have regarding this matter as soon as possible, so I can make the necessary amendments to the application prior to its pilot in a few weeks' time.
Hi Steve,
After a few attempts - yes, I have hit a few snags.
I'll send you a copy of the example application that I am working on but this is what I have done so far.
I've taken a standard application that populates a simple java collection (not database driven) with the following structure:
DataPage --> DataAction (refresh Collection) -->DataPage
I have then added this code to the (refreshCollectionAction) DataAction
protected void invokeCustomMethod(DataActionContext ctx) {
    super.invokeCustomMethod(ctx);
    HttpSession session = ctx.getHttpServletRequest().getSession();
    Thread nominalSearch = (Thread) session.getAttribute("nominalSearch");
    if (nominalSearch == null) {
        synchronized (this) {
            // create a new instance of the search thread
            nominalSearch = new ns(ctx);
        } // end of synchronized wrapper
        session.setAttribute("nominalSearch", nominalSearch);
        session.setAttribute("action", "nominalSearch");
        nominalSearch.start();
        System.err.println("started thread, calling loading page");
        ctx.setActionForward("loading.jsp");
    } else {
        if (nominalSearch.isAlive()) {
            System.err.println("trying to call loading page");
            ctx.setActionForward("loading.jsp");
        } else {
            System.err.println("trying to call results page");
            ctx.setActionForward("success");
        }
    }
}
Created another class called ns.java:
package view;
import oracle.adf.controller.struts.actions.DataActionContext;
import oracle.adf.model.binding.DCIteratorBinding;
import oracle.adf.model.generic.DCRowSetIteratorImpl;

public class ns extends Thread {
    private DataActionContext ctx;

    public ns(DataActionContext ctx) {
        this.ctx = ctx;
    }

    public void run() {
        System.err.println("START");
        DCIteratorBinding b = ctx.getBindingContainer().findIteratorBinding("currentNominalCollectionIterator");
        ((DCRowSetIteratorImpl) b.getRowSetIterator()).rebuildIteratorUpto(-1);
        //b.executeQuery();
        System.err.println("END");
    }
}
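One caveat with the ns class above (a general servlet-threading concern, not ADF-specific advice): the DataActionContext wraps the current request, which the container may recycle once the request completes, so a background thread holding it can see inconsistent state. A safer pattern is to copy out the plain values the thread needs before it starts; the names below are invented for illustration:

```java
/**
 * Sketch: instead of handing a request-scoped context to the thread,
 * copy the plain values the search needs up front. The worker then
 * holds no reference to anything the container might recycle.
 */
public class SafeSearchWorker extends Thread {
    private final String searchCriteria;   // copied from the request
    private volatile String result;        // read by the polling action

    public SafeSearchWorker(String searchCriteria) {
        this.searchCriteria = searchCriteria;
    }

    @Override
    public void run() {
        // stand-in for the real Autonomy/XML search
        result = "hits for " + searchCriteria;
    }

    /** Blocks until the worker finishes, then returns its result. */
    public String awaitResult() {
        try {
            join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return result;
    }
}
```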
and added a loading.jsp page that polls a new DataAction called processing every second. The processing DataAction has the following code within it:
package view;
import javax.servlet.http.HttpSession;
import oracle.adf.controller.struts.actions.DataForwardAction;
import oracle.adf.controller.struts.actions.DataActionContext;

public class ProcessingAction extends DataForwardAction {
    protected void invokeCustomMethod(DataActionContext actionContext) {
        super.invokeCustomMethod(actionContext);
        HttpSession session = actionContext.getHttpServletRequest().getSession();
        String action = (String) session.getAttribute("action");
        // constant-first comparison guards against a null session attribute
        if ("nominalSearch".equalsIgnoreCase(action)) {
            actionContext.setActionForward("refreshCollection.do");
        }
    }
}
I'd appreciate any help or guidance that you may have on this as I really need to implement a generic loading page that can be called by a number of actions within my application as soon as possible.
Thanks in advance for your help
David. -
Executing BW Queries in BSP Page
Hi,
Can someone provide source code taking one query and accessing it through the BSP event handler page?
Can the BSP page replicate the same functionality as the BEx web application?
Can I have drilldown, graphics and everything else in a BSP application?
Thank you
Arun
Hi,
Answer to Arun:
<i>Can someone provide source code taking one query and accessing it through the BSP event handler page?</i>
You have to use the CL_RSR_REQUEST/CL_RSR_DATA_SET classes (the other option is the OLAP BAPIs, where you have to use MDX statements) to execute BW queries through ABAP. (Sample code is available in the ABAP forum.)
<i>Will the BSP Page can replicate the same functionality of the BEX web application?</i>
No, you have to code everything yourself, which will be a tedious job.
<i>Can i have drilldown,graphics everything in BSP application?</i>
Yes you can, but again, as I said, you have to code everything; it doesn't come as easily as in the case of WAD.
Answer to LUCA:
One possible way is to use a redirect URL (a URL pointing to the BW query URL) in the BSP page.
You can also pass query variable values via the URL.
example:
TEMPLATEID=<template name>&var_name_1=<variable name>&var_value_ext_1=<var value>&var_name_2=<var name 2>&var_value_ext_2=<variable value 2>
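As a sketch of assembling such a query string from Java (the template ID and variable names below are placeholders, not real BW objects), URL-encoding each value:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.util.LinkedHashMap;
import java.util.Map;

/** Builds a BW web template query string with numbered variable pairs. */
public class BwUrlBuilder {
    public static String build(String templateId, Map<String, String> vars) {
        try {
            StringBuilder sb = new StringBuilder("TEMPLATEID=")
                    .append(URLEncoder.encode(templateId, "UTF-8"));
            int i = 1;
            for (Map.Entry<String, String> e : vars.entrySet()) {
                sb.append("&var_name_").append(i)
                  .append('=').append(URLEncoder.encode(e.getKey(), "UTF-8"));
                sb.append("&var_value_ext_").append(i)
                  .append('=').append(URLEncoder.encode(e.getValue(), "UTF-8"));
                i++;
            }
            return sb.toString();
        } catch (UnsupportedEncodingException e) {
            throw new IllegalStateException(e); // UTF-8 is always available
        }
    }
}
```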
Hope it helps.
Regards
Raja
Note: Better place for this question would be BSP forum
Message was edited by: Durairaj Athavan Raja -
Hello,
Trying to execute BW queries against BW cube data via BAPI calls over SAPJCo. I use the BAPIs BAPI_MDDATASET_CREATE_OBJECT (to construct the MDX-based query) and BAPI_MDDATASET_GET_DATA_XMLA to return query results in XML.
The XML from BAPI_MDDATASET_GET_DATA_XMLA is being returned in a non-English character set; it appears to be Mandarin Chinese. I don't set the character set to use, and no other BAPI call returns results in this character set. Is this a known issue? What can I do about it?
-
Hi guys,
Can the 3 dots in a bubble mean the recipient is reading or typing a message to someone else?
"And next time you post a question, please don't place demands on the answer."
This is what you replied. By the way, who are you to say such a sentence to me? Also, would you tell me why such forums are run? If I have to buy a book and research it myself, then why should I be here... just to listen to such sentences? I don't think you show much in the way of manners for the JAVA DEVELOPERS COMMUNITY.
Next time you post such replies, I'll be asking the webmasters to handle it!!!
After all, thanks for your time to write as such. -
Sending custom colors/gradients to someone else
Can a color and gradient saved to the swatches palette be
saved to a
.fla and be given to someone else with the color and
gradients available
on their own computer?
Thanks.
I want to know if you can import an .aco file into Flash 8.
I want to know, as I found a Firefox add-on that will save
colour palettes to .aco files for use with Flash and Photoshop.
The program/app is called Palette Grabber.
Does anyone know if this can be done, and how? Because I can't
see a way to do it!
Found it... Looks like it works but in the help it says the
following...
Importing and exporting color palettes
To import and export both RGB colors and gradients between
Flash files, you use Flash Color Set files (CLR files). You can
import and export RGB color palettes using Color Table files (ACT
files) that can be used with Macromedia Fireworks and Adobe
Photoshop. You can also import color palettes, but not gradients,
from GIF files. You cannot import or export gradients from ACT
files.
So how come my .aco file works?
I am using Flash 8.
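For the curious, the version-1 .aco (Adobe color swatch) layout is simple, which may be why third-party tools shuttle it around even where the Flash help doesn't mention it: a 2-byte version, a 2-byte color count, then 10 bytes per color (a 2-byte color-space id followed by four 16-bit component values; for RGB, each component runs 0-65535). A minimal reader sketch, assuming version-1 RGB data:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

/**
 * Minimal reader for version-1 .aco (Adobe color swatch) data.
 * Layout: version (2 bytes), count (2 bytes), then per color:
 * color-space id (2 bytes) + four 16-bit components (RGB uses 0..65535).
 */
public class AcoReader {
    /** Returns colors as "#RRGGBB" strings; skips non-RGB entries. */
    public static List<String> readRgb(byte[] aco) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(aco));
        int version = in.readUnsignedShort();
        if (version != 1) {
            throw new IOException("expected .aco version 1, got " + version);
        }
        int count = in.readUnsignedShort();
        List<String> colors = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            int space = in.readUnsignedShort();
            int c1 = in.readUnsignedShort();
            int c2 = in.readUnsignedShort();
            int c3 = in.readUnsignedShort();
            in.readUnsignedShort(); // 4th component, unused for RGB
            if (space == 0) { // 0 = RGB color space
                colors.add(String.format("#%02X%02X%02X",
                        c1 >> 8, c2 >> 8, c3 >> 8));
            }
        }
        return colors;
    }
}
```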