Query for spatial data with a GeometryCollection fails
There are exactly 538 CurvePolygons (only exterior rings in this
sample). All of them are valid geometries, equal in dimension
and so on. Now I combine them into a GeometryCollection and query
for related spatial data in some tables. It seems that
using around (not exactly!) 200 CurvePolygons in one
GeometryCollection works fine, but adding more
CurvePolygons results in an error from the spatial index (I can
add the ORA- error numbers once I have data in my test tables
again in the next few days).
Is anybody else having trouble with this mysterious
problem? Maybe there is a limit on the number of points in a
GeometryCollection?
(More details and programming code can be provided.)
(Working with Java 1.3.1, oracle.sdoapi.*, Oracle 8.1.7.)
Hi Lutz,
Could you provide more info or samples of what is going wrong?
Also, could you try making sure the geometry you are passing in
as the query window is valid (i.e. instead of passing it in as a
query window, pass it into sdo_geom.validate_geometry).
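As a sketch of that check (table name, ID column, and geometry column here are placeholders; the diminfo comes from the user's own USER_SDO_GEOM_METADATA entry):

```sql
-- Validate every CurvePolygon individually before building the
-- GeometryCollection; any result other than 'TRUE' flags a bad geometry.
SELECT p.id,
       SDO_GEOM.VALIDATE_GEOMETRY(p.geom, m.diminfo) AS result
FROM   my_polygons p,
       user_sdo_geom_metadata m
WHERE  m.table_name  = 'MY_POLYGONS'
AND    m.column_name = 'GEOM'
AND    SDO_GEOM.VALIDATE_GEOMETRY(p.geom, m.diminfo) <> 'TRUE';
```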
Thanks,
Dan
Similar Messages
-
Query For Retrieving Data With Date and grouping
Hi Guys
I am having a hard time figuring out the SQL query.
I have a table with data. Below is the illustration.
date location item description
1 jan 14 A apple desc1
2 jan 14 A apple desc2
3 jan 14 B apple desc1
4 jan 14 B apple desc2
1 jan 14 A orange desc1
2 jan 14 A orange desc2
3 jan 14 B orange desc1
4 jan 14 B orange desc2
My question is how to get the latest date for each location and item, along with the description field.
This is the result I want
date location item description
2 jan 14 A apple desc2
4 jan 14 B apple desc2
2 jan 14 A orange desc2
4 jan 14 B orange desc2
Thanks.
Provided the date column is a datetime/date field, you can do this:
SELECT [date], location, item, description
FROM
(
SELECT [date], location, item, description,
ROW_NUMBER() OVER (PARTITION BY location, item ORDER BY [date] DESC) AS Rn
FROM [table]
) t
WHERE Rn = 1
Please Mark This As Answer if it helps to solve the issue. Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
Query For Retrieving Data With Date
Hi Guys
I am having a hard time figuring out the SQL query.
I have a table with data. Below is the illustration.
date      location  item
1 jan 14  A         apple
2 jan 14  A         apple
3 jan 14  B         apple
4 jan 14  B         apple
1 jan 14  A         orange
2 jan 14  A         orange
3 jan 14  B         orange
4 jan 14  B         orange
My question is how to get the latest date for each location and item.
This is the result I want:
date      location  item
2 jan 14  A         apple
4 jan 14  B         apple
2 jan 14  A         orange
4 jan 14  B         orange
Thanks.
Try the below:
create table #temp (sdate datetime, location char(1), Item varchar(20))
insert into #temp values('1 jan 14','A','Apple')
insert into #temp values('2 jan 14','A','Apple')
insert into #temp values('3 jan 14','B','Apple')
insert into #temp values('4 jan 14','B','Apple')
insert into #temp values('1 jan 14','A','Orange')
insert into #temp values('2 jan 14','A','Orange')
insert into #temp values('3 jan 14','B','Orange')
insert into #temp values('4 jan 14','B','Orange')
;With cte as
(
Select *, Row_Number() Over(partition by location, item order by sdate desc) Rn
From #Temp
)
Select * From cte Where rn = 1 Order by Item asc, location asc
drop table #temp -
Error handling for master data with direct update
Hi guys,
For master data with flexible update, error handling can be defined in the InfoPackage, and if the load is performed via PSA there are several options - that much is clear. But what about direct update?
My specific question is: if an erroneous record (e.g. invalid characters) occurs in a master data load using direct update, the request is set to red. But what does this mean for the other (correct) records of the request - are they written to the master data tables, so that they are present once the master data is activated, or is nothing written to the master data tables if a single record is erroneous?
Many thanks,
/ Christian
Hi Christian -
The difference between flexible upload and direct upload is that direct upload does not have update rules; direct upload still has the PSA as usual, and you can do testing in the PSA.
For the second part: when you load master data and an error occurs, all the records for that request number get error status, so activation will have no effect on them, i.e. no new records from the failed load will be available.
Hope it helps.
Regards
Vikash -
Best practice for sharing data with model window
Hi team,
what would be the best practice for sharing data with a modal
window? I use a modal window to display record details from a
record list, but I am not quite sure how to access the data from
the components in the main application inside the modal window.
Any hints would be welcome.
Best
Frank
Pass a reference to the parent into the modal popup. Then you
can reference anything in the parent scope.
I haven't done this in 2.0 yet so I can't give you code. I'll
post if I do.
Oh, also, you can reference the parent using parentDocument.
So in the popup you could do:
parentDocument.myPublicVariable = "whatever";
Tracy -
How to create sql query for item master with operator LIKE with variables?
hi all,
How to create an SQL query for the item master with the
operator LIKE (Contains, Starts With, Ends With) with variables using the query generator in SAP B1?
Jeyakanthan
Hi Jeyakanthan,
here is an example (put the LIKE statement into the where field):
SELECT T0.CardCode, T0.CardName FROM OITM T0 WHERE T0.CardName Like '%%test%%'
The %% sign is a wildcard. If you need "starts with", write 'test%%'; for "ends with", '%%test'; for "contains", '%%test%%'. You can also combine these statements, like 'test%%abc%%', which means starts with test and contains abc.
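The three variants side by side (a sketch reusing the table and columns from the example above; the doubled %% is the query-generator escaping - plain SQL uses a single %):

```sql
SELECT T0.CardCode, T0.CardName FROM OITM T0 WHERE T0.CardName LIKE '%%test%%'  -- contains
SELECT T0.CardCode, T0.CardName FROM OITM T0 WHERE T0.CardName LIKE 'test%%'    -- starts with
SELECT T0.CardCode, T0.CardName FROM OITM T0 WHERE T0.CardName LIKE '%%test'    -- ends with
```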
Regards Steffen -
VStest.console.exe Query for a Data Row Result DURING Test Execution Status
I started with the following thread and was asked to create a new thread.
I have the following test that reads a .csv as a data source.
/// <summary>
/// Summary description for Test
/// </summary>
[TestCategory("LongTest"), TestCategory("Visual Studio 2013"), TestMethod]
[DeploymentItem("data.csv")]
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
    "|DataDirectory|\\data.csv", "data#csv", DataAccessMethod.Sequential),
 DeploymentItem("data.csv")]
public void Create_And_Build_All_Templates()
{
    testmethodname = "Create And Build All Templates ";
    LaunchVisualStudio2013();
    // ...
}
When I run the test from VStest.console.exe, I see the following:
vstest.console.exe /testcasefilter:"TestCategory=LongTest" /settings:"C:\testing\CodedUI.testsettings" /logger:TRX /logger:CodedUITestLogger C:\AppTests\CodedUITest.dll
Microsoft (R) Test Execution Command Line Tool Version 12.0.31101.0
Copyright (c) Microsoft Corporation. All rights reserved.
Running tests in C:\testing\bin\debug\TestResults
Starting test execution, please wait...
I want to report on the status of the iterations DURING the test run from VStest.console.exe, like how the test explorer window does this.
How can I achieve the output below (notice the (Data Row ) values) ?
vstest.console.exe
/testcasefilter:"TestCategory=LongTest"
/settings:"C:\testing\CodedUI.testsettings"
/logger:TRX
/logger:CodedUITestLogger
C:\AppTests\CodedUITest.dll
Microsoft (R) Test Execution Command Line Tool Version 12.0.31101.0
Copyright (c) Microsoft Corporation. All rights reserved.
Running tests in C:\testing\bin\debug\TestResults
Starting test execution, please wait...
Test Passed - Create_And_Build_All_Templates (Data Row 1)
Test Passed - Create_And_Build_All_Templates (Data Row 2)
Test Failed - Create_And_Build_All_Templates (Data Row 3)
Test Passed - Create_And_Build_All_Templates (Data Row 4)
Ian Ceicys
Jack, again, the results are printed to the std. out console AFTER the test data row has been completed. Is there a way to query VSTest.console and find out which test/data row is being executed, so it can be written out to the console DURING the test run?
I put together the following screencast showing the issue:
http://www.screencast.com/t/IrxxfhGlzD
Also here is the github repo with the source code that I included in the screen cast:
https://github.com/ianceicys/VisualStudioSamples2015
Take a look at LongRunningDataDrivenTest.sln
Unit Test
[TestClass]
public class UnitTest
{
    public static int _executedTests = 0;
    public static int _passedTests = 0;

    public void IncrementTests()
    {
        _executedTests++;
    }

    public void IncrementPassedTests()
    {
        _passedTests++;
    }

    [TestInitialize]
    public void TestInitialize()
    {
        IncrementTests();
        Console.WriteLine("Total tests Row executed: {0}", _executedTests);
    }

    [TestCleanup]
    public void TestCleanup()
    {
        if (TestContext.CurrentTestOutcome == UnitTestOutcome.Passed)
        {
            IncrementPassedTests();
            Console.WriteLine("Total passed tests: {0}", _passedTests);
        }
    }

    private TestContext testContextInstance;

    /// <summary>
    /// Long Running Test
    /// </summary>
    [TestCategory("Long Running Test Example"), TestMethod]
    [DeploymentItem("data.csv")]
    [DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
        "|DataDirectory|\\data.csv", "data#csv", DataAccessMethod.Sequential),
     DeploymentItem("data.csv")]
    public void LongRunning_TestMethod_Takes_Over_30_Seconds_to_Run()
    {
        WaitTime(Convert.ToInt32(TestContext.DataRow["WaitTime"].ToString()));
    }

    public void WaitTime(int waittime)
    {
        Thread.Sleep(waittime);
    }

    public TestContext TestContext
    {
        get { return testContextInstance; }
        set { testContextInstance = value; }
    }
}

data.csv
WaitTime,
13000
19000
10000
11000
15000
Ian Ceicys -
Hi, how can I get the max(GST_Date) record per GRD? If a GRD has more than one record, then I need the single max(GST_Date) record for every GRD. Thanks.
create table #CRT (CRT numeric, GRD numeric, GST_Date datetime)
insert into #CRT values (7 ,1900,'01-01-2000')
insert into #CRT values (19,1900,'01-01-2002')
insert into #CRT values (24,1900,'01-01-2013')
insert into #CRT values (7 ,2100,'01-01-2010')
insert into #CRT values (19,2100,'01-01-2012')
insert into #CRT values (7 ,2200,'01-01-2012')
insert into #CRT values (19,2200,'02-02-2012')
I would like the following output from query. The following record is the max(GST_Date ) of every '''GRD'".
CRT GRD GST_Date
24 1900 01-01-2013
19 2100 01-01-2012
19 2200 02-02-2012
Please follow basic netiquette and post the DDL we need to answer this. Follow industry and ANSI/ISO standards in your data. You should follow ISO-11179 rules for naming data elements. Everything you posted is wrong. You should follow ISO-8601 rules for displaying temporal data. In fact it is required by ANSI/ISO Standards. You failed again. We need to know the data types, keys and constraints on the table. Avoid dialect in favor of ANSI/ISO Standard SQL. You need to read and download the PDF for:
https://www.simple-talk.com/books/sql-books/119-sql-code-smells/
Please, please learn why rows are not records. You do not even know what a key is or how to write an INTEGER NOT NULL(s, p) declaration. Allowing a table and column to have the same name is legal syntax and we regret not taking it out of the Standard. It is stupid! How can a set also be an attribute of itself? But your DDL gave no attribute properties, as per ISO-11179.
Here is my attempt at repairs
CREATE TABLE CRT
(crt_something INTEGER NOT NULL,
grd_something INTEGER NOT NULL,
PRIMARY KEY (crt_something, grd_something),
gst_date DATE NOT NULL);
Here is how we write an insertion statement; you are still using Sybase syntax! Why did you pick the most ambiguous, non-ANSI date display format?
INSERT INTO CRT
VALUES
( 7, 1900, '2000-01-01'),
( 7, 2100, '2010-01-01'),
( 7, 2200, '2012-01-01'),
(19, 1900, '2002-01-01'),
(19, 2100, '2012-01-01'),
(19, 2200, '2012-02-02'),
(24, 1900, '2013-01-01');
>> How I can get the MAX(gst_date) record [sic] of grd_something. If record [sic] of grd_something has more than one record [sic] then I need the one MAX(gst_date) record [sic] of every grd_something. <<
WITH X AS
(SELECT crt_something, grd_something, gst_date,
MAX(gst_date)
OVER (PARTITION BY grd_something) AS gst_date_max
FROM CRT)
SELECT *
FROM X
WHERE gst_date_max = gst_date;
--CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice / Measurements and Standards in SQL / SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking in Sets / Trees and Hierarchies in SQL -
Query for finding data during special holidays
Hi, i have a table for special holidays that looks like this:
PROFILE_DAY       VAR_DATE
REGULAR_HOLIDAY   1/1/2013
H_WEEK_THURSDAY   3/28/2013
H_WEEK_FRIDAY     3/29/2013
REGULAR_HOLIDAY   12/24/2013
REGULAR_HOLIDAY   12/25/2013
REGULAR_HOLIDAY   12/31/2013
And another table (LOAD_PROFILE_TEST), which contains LOAD_PROF1 values from (TIME_EQ) 0 to 24 at intervals 0.25 for (PROFILE_DAY) MONDAY to SUNDAY including REGULAR_HOLIDAY, H_WEEK_THURSDAY and H_WEEK_FRIDAY. All in all, this table contains 970 records (97 records between 0 to 24 with interval of 0.25 per PROFILE_DAY, with 10 distinct PROFILE_DAY).
TIME_EQ  PROFILE_DAY      LOAD_PROF1  LOAD_PROF2
0        REGULAR_HOLIDAY  11.47
0.25     REGULAR_HOLIDAY  11.27
0.5      REGULAR_HOLIDAY  11.3
0.75     REGULAR_HOLIDAY  11.08
0        MONDAY           11.27
0.25     MONDAY           11.33
0.5      MONDAY           11.18
Now, I have this query to update the value of LOAD_PROF2 in the said table whenever the parameters V_DATE_OUT & V_DATE_IN are entered:
UPDATE LOAD_PROFILE_TEST
SET LOAD_PROF2 = LOAD_PROF1 + :LOAD_DIFF
WHERE UPPER(PROFILE_DAY) IN (select UPPER(to_char(:V_DATE_OUT + (level-1), 'fmDAY'))
from dual
connect by level <= :V_DATE_IN - :V_DATE_OUT + 1);
where :LOAD_DIFF is a certain pre-determined value.
This query works fine if I am trying to update regular days from MONDAY to SUNDAY. What I would like to do is determine whether any of the dates between the two parameters, V_DATE_OUT and V_DATE_IN, fall on any of the holidays in the first table, and then update only those rows. For example, with V_DATE_OUT = 12/02/2013 (Monday) and V_DATE_IN = 12/06/2013 (Friday), the query above updates the values of LOAD_PROF2 for PROFILE_DAY MONDAY to FRIDAY, corresponding to the dates 12/02/2013 to 12/06/2013. If however V_DATE_OUT = 12/23/2013 (Monday) and V_DATE_IN = 12/27/2013 (Friday), this should update the rows corresponding to PROFILE_DAY MONDAY for 12/23/2013, THURSDAY for 12/26/2013, FRIDAY for 12/27/2013, and REGULAR_HOLIDAY for the dates 12/24/2013 and 12/25/2013, since those two are included in the first table (the table of holidays). The same scenario applies when V_DATE_OUT and/or V_DATE_IN fall on the dates 3/28/2013 and 3/29/2013. All other dates not included in the table of special holidays are treated according to the day they fall on (Monday through Sunday). I hope my point is clear. Thank you in advance.
Thanks for your reply. Firstly, I am using Forms [32 Bit] Version 10.1.2.0.2 (Production). I will try to explain a little further, but I don't know if I can give the exact details of the tables since this would be voluminous. First I have a table REG_HOLIDAYS, which contains the following:
PROFILE_DAY       VAR_DATE
REGULAR_HOLIDAY   1/1/2013
H_WEEK_THURSDAY   3/28/2013
H_WEEK_FRIDAY     3/29/2013
REGULAR_HOLIDAY   12/24/2013
REGULAR_HOLIDAY   12/31/2013
REGULAR_HOLIDAY   12/25/2013
Now I have another table, LOAD_PROFILE_TEST, which I will simplify just to show what I wanted to do:
CREATE TABLE LOAD_PROFILE_TEST
(
  TIME_EQ     NUMBER (5, 2),
  PROFILE_DAY VARCHAR2 (15 BYTE),
  LOAD_PROF1  NUMBER (6, 2),
  LOAD_PROF2  NUMBER (6, 2)
);
Here are the sample values of this table:
TIME_EQ  PROFILE_DAY      LOAD_PROF1  LOAD_PROF2
0        Monday           2
0        Tuesday          2.1
0        Wednesday        2.3
0        Thursday         2.5
0        Friday           2.2
0        Saturday         2.4
0        Sunday           2.3
0        Regular_holiday  1.9
0.25     Monday           2.1
0.25     Tuesday          2.1
0.25     Wednesday        2.4
0.25     Thursday         2.2
0.25     Friday           2.5
0.25     Saturday         2.3
0.25     Sunday           2.3
0.25     Regular_holiday  2.5
However, in the actual table, TIME_EQ runs from 0 to 24 at intervals of 0.25 (i.e. 0, 0.25, 0.5, 0.75, 1, 1.25, 1.5, ... 23.5, 23.75, 24). So for a PROFILE_DAY of 'Monday' there will be 97 rows corresponding to TIME_EQ 0 to 24. The same is true for Tuesday, Wednesday through Sunday, including Regular_holiday. All in all, this table would contain 776 rows. The LOAD_PROF1 values are random values initially entered along with PROFILE_DAY and TIME_EQ.
The first goal is to update this table (LOAD_PROFILE_TEST) by updating the column LOAD_PROF2, adding a certain parameter value, :LOAD_DIFF, say 0.1. I will show two scenarios to illustrate what I would like to happen. I have two date parameters, :V_DATE_OUT and :V_DATE_IN.
First Case, V_DATE_OUT = 12/15/2013 (Monday); V_DATE_IN = 12/22/2013 (Sunday)
I will have this query to update said table:
UPDATE LOAD_PROFILE_TEST
SET LOAD_PROF2 = LOAD_PROF1 + :LOAD_DIFF
WHERE UPPER(PROFILE_DAY) IN (select UPPER(
to_char(
:V_DATE_OUT + (level-1),
'fmDAY'))
from dual
connect by level <=
:V_DATE_IN - :V_DATE_OUT + 1);
The output of this query would be:
TIME_EQ  PROFILE_DAY      LOAD_PROF1  LOAD_PROF2
0        Monday           2           2.1
0        Tuesday          2.1         2.2
0        Wednesday        2.3         2.4
0        Thursday         2.5         2.6
0        Friday           2.2         2.3
0        Saturday         2.4         2.5
0        Sunday           2.3         2.4
0        Regular_holiday  1.9
0.25     Monday           2.1         2.2
0.25     Tuesday          2.1         2.2
0.25     Wednesday        2.4         2.5
0.25     Thursday         2.2         2.3
0.25     Friday           2.5         2.6
0.25     Saturday         2.3         2.4
0.25     Sunday           2.3         2.4
0.25     Regular_holiday  2.5
Since 12/15/2013 through 12/22/2013 runs from Monday to Sunday without any of those days appearing in the first table, REG_HOLIDAYS, all the rows with Monday to Sunday are updated.
Second Case, V_DATE_OUT = 12/23/2013 (Monday); V_DATE_IN = 12/29/2013 (Sunday)
Take note that 12/24 and 12/25 are included in the first table, REG_HOLIDAYS; therefore I need a query so that this would be my output afterwards:
TIME_EQ  PROFILE_DAY      LOAD_PROF1  LOAD_PROF2
0        Monday           2           2.1
0        Tuesday          2.1
0        Wednesday        2.3
0        Thursday         2.5         2.6
0        Friday           2.2         2.3
0        Saturday         2.4         2.5
0        Sunday           2.3         2.4
0        Regular_holiday  1.9         2.0
0.25     Monday           2.1         2.2
0.25     Tuesday          2.1
0.25     Wednesday        2.4
0.25     Thursday         2.2         2.3
0.25     Friday           2.5         2.6
0.25     Saturday         2.3         2.4
0.25     Sunday           2.3         2.4
0.25     Regular_holiday  2.5         2.6
As can be seen, since 12/23/2013 to 12/29/2013 runs from Monday to Sunday, the rows with Monday up to Sunday should be updated. HOWEVER, since 12/24 and 12/25, which fall on a Tuesday and a Wednesday, were also declared as holidays (included in the REG_HOLIDAYS table), instead of updating rows with PROFILE_DAY of Tuesday and Wednesday, the query should update the rows with Regular_holiday as PROFILE_DAY and leave the Tuesday and Wednesday rows as they are. Additional note: days declared in the REG_HOLIDAYS table with PROFILE_DAY of Regular_holiday are treated as distinct, as in the case of 12/24 and 12/25.
The query you gave me returns the correct value if the days between V_DATE_OUT and V_DATE_IN are Monday to Sunday only; otherwise it always returns Regular_holiday alone, regardless of the other dates queried. As in the first case above, it gives Monday to Sunday (correct), but in the second case it only gives Regular_holiday; the other days (Monday, Thursday, Friday, Saturday and Sunday) are not output.
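One way to fold the REG_HOLIDAYS override into the day-name subquery is an outer join from the generated dates to the holiday table (a sketch, untested; it assumes VAR_DATE is a DATE column with no time-of-day part):

```sql
UPDATE LOAD_PROFILE_TEST
SET    LOAD_PROF2 = LOAD_PROF1 + :LOAD_DIFF
WHERE  UPPER(PROFILE_DAY) IN
       (-- For each date in the range, use the holiday label when the
        -- date appears in REG_HOLIDAYS, otherwise the weekday name.
        SELECT UPPER(NVL(h.PROFILE_DAY, TO_CHAR(d.dt, 'fmDAY')))
        FROM   (SELECT :V_DATE_OUT + (LEVEL - 1) AS dt
                FROM   dual
                CONNECT BY LEVEL <= :V_DATE_IN - :V_DATE_OUT + 1) d,
               REG_HOLIDAYS h
        WHERE  h.VAR_DATE (+) = d.dt);
```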
I hope this is clearer now, since I will still need this for another query, which I will ask about after resolving this issue. Thanks a lot for your help. -
Conversion of .dgn file to .shp files for spatial data processing
Hi All,
Presently I am working with an Oracle Spatial database (10gR2). I have the following task:
Task: Spatial data loading into Oracle Database 10g Release 2.
Task process description: The client has given us an input file with a .dgn extension, so we have to convert it to a .shp file and load it into the spatial database.
There is a utility on the OTN site for loading .shp file data into a spatial database; this we are able to do.
Now I am looking for how to convert the .dgn file to a .shp file - I mean, whether there is any tool for it, or any commands to do this in a specific environment.
Any help would be appreciated.
Thanks,
[email protected]
Hi,
There are several ways to do this.
Get a general GIS format translator. I use FME. This can be obtained from SAFE at safe.com. There is a free 30 day evaluation.
Get GIS software which has a data translation facility. Take your pick of the systems.
Write some code to read the DGN into Oracle. The DGN format can be found at http://www.bentley.com/en-US/Products/MicroStation/OpenDGN/. You need to know the version of DGN.
Ask your client to supply data in SHP format.
Ivan -
Trouble writing Query for Pivoting data from a table
I am having a little trouble writing a query to convert the table data below into pivoted data. I am trying to write a query to which, given a single valid report_week date as input, gives me the data for that week plus two extra columns: one with the previous week's data for the same countries, and a second with the difference of the two counts (i.e. COUNT - COUNT_LAST_WEEK).
REPORT_WEEK DIVISION COUNT
9/26/2009 country1 81
9/26/2009 country2 97
9/26/2009 country3 12
9/26/2009 country4 26
9/26/2009 country5 101
10/3/2009 country1 85
10/3/2009 country2 98
10/3/2009 country3 10
10/3/2009 country4 24
10/3/2009 country5 101
10/10/2009 country1 84
10/10/2009 country2 98
10/10/2009 country3 10
10/10/2009 country4 25
10/10/2009 country5 102
For example, if I give input as 10/10/2009, the output should be as given below.
REPORT_WEEK DIVISION COUNT COUNT_LAST_WEEK DIFFERENCE
10/10/2009 country1 84 85 -1
10/10/2009 country2 98 98 0
10/10/2009 country3 10 10 0
10/10/2009 country4 25 24 1
10/10/2009 country5 102 101 1
For example, if I give input as 10/3/2009, the output should be as given below.
REPORT_WEEK DIVISION COUNT COUNT_LAST_WEEK DIFFERENCE
10/3/2009 country1 85 81 4
10/3/2009 country2 98 97 1
10/3/2009 country3 10 12 -2
10/3/2009 country4 24 26 -2
10/3/2009 country5 101 101 0
Can anyone please shed some light on Query building for the above scenarios.
Thank you
SKP
Edited by: user11343284 on Oct 10, 2009 7:53 AM
Edited by: user11343284 on Oct 10, 2009 8:28 AM
I assume there is no gap in report weeks. If so:
SQL> variable report_week varchar2(10)
SQL> exec :report_week := '10/10/2009'
PL/SQL procedure successfully completed.
with t as (
select to_date('9/26/2009','mm/dd/yyyy') report_week,'country1' division,81 cnt from dual union all
select to_date('9/26/2009','mm/dd/yyyy'),'country2',97 from dual union all
select to_date('9/26/2009','mm/dd/yyyy'),'country3',12 from dual union all
select to_date('9/26/2009','mm/dd/yyyy'),'country4',26 from dual union all
select to_date('9/26/2009','mm/dd/yyyy'),'country5',101 from dual union all
select to_date('10/3/2009','mm/dd/yyyy'),'country1',85 from dual union all
select to_date('10/3/2009','mm/dd/yyyy'),'country2',98 from dual union all
select to_date('10/3/2009','mm/dd/yyyy'),'country3',10 from dual union all
select to_date('10/3/2009','mm/dd/yyyy'),'country4',24 from dual union all
select to_date('10/3/2009','mm/dd/yyyy'),'country5',101 from dual union all
select to_date('10/10/2009','mm/dd/yyyy'),'country1',84 from dual union all
select to_date('10/10/2009','mm/dd/yyyy'),'country2',98 from dual union all
select to_date('10/10/2009','mm/dd/yyyy'),'country3',10 from dual union all
select to_date('10/10/2009','mm/dd/yyyy'),'country4',25 from dual union all
select to_date('10/10/2009','mm/dd/yyyy'),'country5',102 from dual
)
select max(report_week) report_week,
division,
max(cnt) keep(dense_rank last order by report_week) cnt_this_week,
max(cnt) keep(dense_rank first order by report_week) cnt_last_week,
max(cnt) keep(dense_rank last order by report_week) - max(cnt) keep(dense_rank first order by report_week) difference
from t
where report_week in (to_date(:report_week,'mm/dd/yyyy'),to_date(:report_week,'mm/dd/yyyy') - 7)
group by division
order by division
REPORT_WE DIVISION CNT_THIS_WEEK CNT_LAST_WEEK DIFFERENCE
10-OCT-09 country1 84 85 -1
10-OCT-09 country2 98 98 0
10-OCT-09 country3 10 10 0
10-OCT-09 country4 25 24 1
10-OCT-09 country5 102 101 1
SQL> exec :report_week := '10/3/2009'
PL/SQL procedure successfully completed.
SQL> /
REPORT_WE DIVISION CNT_THIS_WEEK CNT_LAST_WEEK DIFFERENCE
03-OCT-09 country1 85 81 4
03-OCT-09 country2 98 97 1
03-OCT-09 country3 10 12 -2
03-OCT-09 country4 24 26 -2
03-OCT-09 country5 101 101 0
SQL>
SY. -
Oracle Content Management for spatial data
Hi,
Is it possible to use Oracle Universal Content Management software for managing GIS files/spatial data?
Can we integrate Oracle Universal Content Management or any other tool to manage spatial files? I am looking for standard content management features for spatial files, such as login/check-in/check-out/security/previews, etc.
Thanks
Hi,
- Does this product require additional licensing cost, or does it come with the bundle?
Please contact your Oracle Sales representative for license queries and questions.
Global Pricing and Licensing
http://www.oracle.com/us/corporate/pricing/index.htm
- Does this product require an additional server, or can it run on the application server?
- What are the key features we can get from Oracle Content Management Server?
- What kinds of documents can we store in Oracle Content Management Server?
- How many additional resources are required if we deploy this product?
Please refer to the following links.
Note: 305373.1 - Oracle Document Management in Release 11i through 12 - Installation and Configuration
https://metalink2.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=305373.1
Oracle Universal Content Management
http://www.oracle.com/products/middleware/content-management/universal-content-management.html
Oracle Content Management 10gR3 (10.1.3.3)
http://download.oracle.com/docs/cd/E10316_01/ouc.htm
Regards,
Hussein -
MDSYS Schema in the Database for spatial data
Hi
As the MDSYS schema is required in the Oracle database to hold spatial data, and the client database server has no MDSYS schema installed,
when I import the mvdemo.dmp file into the database it errors out since there is no spatial data. Please let me know how to get the MDSYS schema into the database.
Thank You
Hi, thanks for your response. I found this on the Oracle site; the DBA ran this script to create MDSYS, but there was no data in some of the tables which are needed for development.
--- COMPATIBLE init.ora parameter is set to 9.0.0.0.0 or higher
show parameter compat
NAME TYPE VALUE
compatible string 10.2.0.3.0
If you create an Oracle database using the Database Configuration Assistant (DBCA), Spatial is installed by default and you do not need to perform the installation steps described in this section.
Steps to create the MDSYS schema and objects
1) Connect to the database instance specifying AS SYSDBA.
2) Create the MDSYS user with a command in the following format:
SQL> CREATE USER MDSYS IDENTIFIED BY <password>;
3) Grant the required privileges to the MDSYS user by running the following procedure:
SQL> @ORACLE_HOME/md/admin/mdprivs.sql
4) Connect as MDSYS.
5) Install Spatial by running the following procedure:
SQL> @ORACLE_HOME/md/admin/catmd.sql
SQL> ALTER USER MDSYS ACCOUNT LOCK;
So this solution didn't work out... -
Can't not Query the Spatial Data In VS2005
I use Oracle 10 g, Win XP service pack 2, VS2005 and ODT.
And my problem is: when I try to select a column that is a spatial column, I receive an error message: "Unsupported Data Type". I use the Query Window of ODT within VS2005.
Please help me fix this bug.
It's always worth searching the forums before posting...
Re: ODP.Net + Spatial
Any support for spatial and .NET? -
Query for inserting data into table and incrementing the PK.. pls help
I have one table dd_prohibited_country. prohibit_country_key is the primary key column.
I have to insert data into dd_prohibited_country based on records already present.
The scenario I should follow is:
For Level_id 'EA' and prohibited_level_id 'EA' I should retrieve the
max(prohibit_country_key), and starting from that maximum number
I have to insert the records into dd_prohibited_country again. While inserting
I have to increment the prohibit_country_key and
replace the values of level_id and prohibited_level_id
(where 'EA' occurs, I have to replace it with 'EUR').
For instance,
if there are 15 records in dd_prohibited_country with Level_id 'EA' and prohibited_level_id 'EA', then
I have to insert these 15 records starting with prohibit_country_key 16 (after 15 I should start inserting with number 16).
I have written the following query for this:
insert into dd_prohibited_country
select
a.pkey,
b.levelid,
b.ieflag,
b.plevelid
from
(select
max(prohibit_country_key) pkey
from
dd_prohibited_country) a,
(select
prohibit_country_key pkey,
replace(level_id,'EA','EUR') levelid,
level_id_flg as ieflag,
replace(prohibited_level_id,'EA','EUR') plevelid
from
dd_prohibited_country
where
level_id = 'EA' or prohibited_level_id = 'EA') b
My problem here is that I always get a.pkey as 15, because I am not incrementing it.
I tried incrementing it too, but I was unable to achieve it.
Can anyone please help me in writing this query?
Thanks in advance
Regards
Raghu
Because you are not incrementing your pkey. Try like this:
insert
into dd_prohibited_country
select a.pkey+b.pkey,
b.levelid,
b.ieflag,
b.plevelid
from (select max(prohibit_country_key) pkey
from dd_prohibited_country) a,
(select row_number() over (order by prohibit_country_key) pkey,
replace(level_id,'EA','EUR') levelid,
level_id_flg as ieflag,
replace(prohibited_level_id,'EA','EUR') plevelid
from dd_prohibited_country
where level_id = 'EA' or prohibited_level_id = 'EA') b
Note: if you are in a multi-user environment you can get into trouble incrementing your PKey like this.
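A common remedy for that concurrency risk is to let an Oracle sequence hand out the keys instead of max()+offset (a sketch; the sequence name and starting value are placeholders):

```sql
-- Create once, starting above the current maximum key:
CREATE SEQUENCE dd_prohibited_country_seq START WITH 16;

-- Then use it in the insert; NEXTVAL is safe under concurrent sessions:
INSERT INTO dd_prohibited_country
SELECT dd_prohibited_country_seq.NEXTVAL,
       REPLACE(level_id, 'EA', 'EUR'),
       level_id_flg,
       REPLACE(prohibited_level_id, 'EA', 'EUR')
FROM   dd_prohibited_country
WHERE  level_id = 'EA' OR prohibited_level_id = 'EA';
```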