SQL string for blank or populated date
I need to build an Oracle SQL string where, if a datetime column is not blank, its value must be greater than current_date, and I am failing.
SELECT ColumnData FROM MyTable WHERE STARTDATE > current_date AND (ENDDATE !="" and ENDATE > current_date)
Please tell me how I should do this.
In Oracle a zero-length string is stored as NULL, so you cannot compare a date column to "". Test for NULL explicitly:
SELECT ColumnData FROM MyTable WHERE STARTDATE > current_date AND (ENDDATE IS NULL OR ENDDATE > current_date)
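A "blank" date in Oracle is simply NULL, so the test must be IS NULL rather than a comparison to an empty string. A minimal sketch of the same predicate using Python's stdlib sqlite3, whose NULL comparison semantics match here (table and column names are illustrative):

```python
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyTable (ColumnData TEXT, StartDate TEXT, EndDate TEXT)")

today = date.today()
future = (today + timedelta(days=30)).isoformat()
past = (today - timedelta(days=30)).isoformat()
rows = [
    ("open-ended", future, None),   # blank ENDDATE -> keep
    ("valid", future, future),      # ENDDATE in the future -> keep
    ("expired", future, past),      # ENDDATE in the past -> drop
    ("not-started", past, None),    # STARTDATE in the past -> drop
]
conn.executemany("INSERT INTO MyTable VALUES (?, ?, ?)", rows)

# Blank means NULL, so test it with IS NULL, not a comparison to ''
result = [r[0] for r in conn.execute(
    "SELECT ColumnData FROM MyTable "
    "WHERE StartDate > ? AND (EndDate IS NULL OR EndDate > ?)",
    (today.isoformat(), today.isoformat()),
)]
```

ISO-formatted date strings compare correctly as text, which is why the `>` comparisons work in this sketch.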
Similar Messages
-
How to encode sql string for SQL Server when using JDBC?
in code, dynamically generate a SQL string like:
String sqlstring = "select column from table where column=' " + var + " ' ";
Question is: if var includes the char ' , it will cause an error because ' is reserved by SQL Server for string delimiting.
So how to encode the string for a dynamic SQL string? For example, the following SQL (when var = "I'm tester"):
select column from table where column like ' I'm tester '
Use PreparedStatement. Use it all the way. It not only saves you from SQL injection, but also eases setting non-standard Java objects like Date and InputStream in a SQL statement.
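The bind-variable advice above is not Java-specific; any database API with parameter binding handles embedded quotes the same way a PreparedStatement does. A minimal Python sqlite3 sketch (table and data are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (col TEXT)")
conn.execute("INSERT INTO t VALUES (?)", ("I'm tester",))

var = "I'm tester"  # embeds a quote; naive string concatenation would break
# The driver binds the value, so no manual escaping is needed
found = conn.execute("SELECT col FROM t WHERE col = ?", (var,)).fetchall()
```

The `?` placeholder plays the role of the `?` in a JDBC PreparedStatement: the value never passes through the SQL parser as text.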
Prepare here: [http://java.sun.com/docs/books/tutorial/jdbc/basics/prepared.html]. -
Using FIFO as sql code for assigning indicator to data items
Hi All,
We are looking for a solution based on FIFO algorithm. We have a table having following data:
We need to perform FIFO on this table, and assign "object" as data items to other rows based on following conditions:
1. first we have to group the rows based on "object" column
2. then we traverse each group starting from the row having the minimum start time of the object group, e.g. row id 1 for the object group "19O"
2.1 Assign an "EqpRef" as "object" + <an integer value>, where the integer value changes when a start-end chain finishes. A start-end chain is explained in step 2.2.
2.2 Then we pick the "nextstarttime" of the current row and compare it against the closest "starttime" among the rows whose "start" is the same as the "end" of the current row, e.g. row id 2 of object 19O has "nextstarttime" 0310 closest to "starttime" 0355 of row id 2 of object 19O, and row id 2 has "start" AAL which matches the "end" of row id 1.
2.3 We follow this chain until we find its end, and allocate every row in the chain the same "EqpRef".
hence the output we need to generate will come as:
Kindly help on same.
Thanks in advance
-Regards
Kumud
Hi,
Please find the following code block for what is input data and what should be output.
--The input data
create table temp_table (
row_id int,
engine_no varchar(20),
schedule_no varchar(20),
start_station varchar(20),
end_station varchar(20),
startdate datetime,
enddate datetime,
starttime datetime,
endtime datetime,
record_id int,
engine_id int,
Mgt int,
nextstarttime datetime,
Schedule_ref varchar(20),
Engine_Ref varchar(20)
)
GO
insert into temp_table values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00',null,null)
insert into temp_table values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00',null,null)
insert into temp_table values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00',null,null)
insert into temp_table values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00',null,null)
insert into temp_table values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00',null,null)
insert into temp_table values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00',null,null)
insert into temp_table values(7,'19O','107','DFW','ABC','2015/01/01','2015/01/01','06:00:00','07:00:00',7,10,60,'08:00:00',null,null)
insert into temp_table values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00',null,null)
insert into temp_table values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00',null,null)
insert into temp_table values(10,'19O','110','XYZ','BDW','2015/01/01','2015/01/01','13:00:00','15:00:00',10,10,60,'16:00:00',null,null)
insert into temp_table values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00',null,null)
insert into temp_table values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00',null,null)
insert into temp_table values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00',null,null)
insert into temp_table values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00',null,null)
insert into temp_table values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00',null,null)
insert into temp_table values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00',null,null)
insert into temp_table values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00',null,null)
insert into temp_table values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00',null,null)
insert into temp_table values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00',null,null)
insert into temp_table values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00',null,null)
insert into temp_table values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00',null,null)
insert into temp_table values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00',null,null)
insert into temp_table values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00',null,null)
insert into temp_table values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00',null,null)
insert into temp_table values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00',null,null)
insert into temp_table values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00',null,null)
insert into temp_table values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00',null,null)
insert into temp_table values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00',null,null)
insert into temp_table values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00',null,null)
insert into temp_table values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00',null,null)
insert into temp_table values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00',null,null)
insert into temp_table values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00',null,null)
insert into temp_table values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00',null,null)
insert into temp_table values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00',null,null)
--output should come as the data in temp_table_final
create table temp_table_final (
row_id int,
engine_no varchar(20),
schedule_no varchar(20),
start_station varchar(20),
end_station varchar(20),
startdate datetime,
enddate datetime,
starttime datetime,
endtime datetime,
record_id int,
engine_id int,
Mgt int,
nextstarttime datetime,
Schedule_ref varchar(20),
Engine_Ref varchar(20)
)
GO
insert into temp_table_final values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00','101','19O-1')
insert into temp_table_final values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00','102','19O-2')
insert into temp_table_final values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00','103','19O-3')
insert into temp_table_final values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00','101','19O-1')
insert into temp_table_final values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00','102','19O-2')
insert into temp_table_final values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00','104','19O-1')
insert into temp_table_final values(7,'19O','107','DFW','ABC','2015/01/01','2015/01/01','06:00:00','07:00:00',7,10,60,'08:00:00','107','19O-4')
insert into temp_table_final values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00','105','19O-2')
insert into temp_table_final values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00','106','19O-1')
insert into temp_table_final values(10,'19O','110','XYZ','BDW','2015/01/01','2015/01/01','13:00:00','15:00:00',10,10,60,'16:00:00','110','19O-5')
insert into temp_table_final values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00','111','319-1')
insert into temp_table_final values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00','211','319-2')
insert into temp_table_final values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00','111','319-1')
insert into temp_table_final values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00','212','319-3')
insert into temp_table_final values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00','213','319-4')
insert into temp_table_final values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00','118','319-5')
insert into temp_table_final values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00','212','319-3')
insert into temp_table_final values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00','112','319-1')
insert into temp_table_final values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00','101','19O-1')
insert into temp_table_final values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00','102','19O-2')
insert into temp_table_final values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00','103','19O-3')
insert into temp_table_final values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00','101','19O-1')
insert into temp_table_final values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00','102','19O-2')
insert into temp_table_final values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00','104','19O-1')
insert into temp_table_final values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00','105','19O-2')
insert into temp_table_final values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00','106','19O-1')
insert into temp_table_final values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00','111','319-1')
insert into temp_table_final values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00','211','319-2')
insert into temp_table_final values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00','111','319-1')
insert into temp_table_final values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00','212','319-3')
insert into temp_table_final values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00','213','319-4')
insert into temp_table_final values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00','118','319-5')
insert into temp_table_final values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00','212','319-3')
insert into temp_table_final values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00','112','319-1')
What we are doing here is generating a schedule for train departures.
We identify train schedules by making a chain of stations: the end station of one record for an engine no should be the start station of another record for the same engine no, and that record's start time should be the one nearest to the nextstarttime at the same station.
For example: if we pick the 1st row "SGC-IXP", its nextstarttime is "02:00:00 am". This means the train departing from SGC reaches IXP and is available for departure from IXP after "02:00:00 am". So we consider the records having start station IXP and take the one with the start time nearest to that nextstarttime ("02:00:00"): the next departure in the chain is IXP-DFW with start time "03:30:00 am".
As you can see, we assign the schedule no of the previously considered record to the chained schedule, so we update "schedule_ref" to 101. We also assign "engine no - <counter>" to each single chain, in engine_ref.
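A rough sketch of the chaining rule described above, in Python over a hand-picked subset of the posted 19O rows. It uses a greedy FIFO rule: each row, in start-time order, reuses the longest-waiting chain that is already available at its start station, otherwise it opens a new chain (the EqpRef naming follows the expected output; this is an illustration, not a full solution):

```python
from datetime import time

# Illustrative subset of the posted data:
# (row_id, start_station, end_station, starttime, nextstarttime)
rows = [
    (1, "SGC", "IXP", time(0, 0),  time(2, 0)),
    (2, "SGC", "IXP", time(0, 30), time(2, 30)),
    (3, "SGC", "IXP", time(2, 30), time(4, 30)),
    (4, "IXP", "DFW", time(3, 30), time(5, 0)),
    (5, "IXP", "DFW", time(4, 30), time(6, 30)),
]

engine = "19O"
chains = []    # one entry per open chain: {"id": ..., "station": ..., "ready": ...}
eqp_ref = {}   # row_id -> assigned EqpRef

for row_id, start, end, starttime, nextstart in sorted(rows, key=lambda r: r[3]):
    # FIFO: among chains waiting at this row's start station that are already
    # available, reuse the one that has been ready the longest
    candidates = [c for c in chains
                  if c["station"] == start and c["ready"] <= starttime]
    if candidates:
        chain = min(candidates, key=lambda c: c["ready"])
    else:
        chain = {"id": len(chains) + 1}
        chains.append(chain)
    # After this departure the chain sits at the end station,
    # available again from nextstarttime
    chain["station"], chain["ready"] = end, nextstart
    eqp_ref[row_id] = f"{engine}-{chain['id']}"
```

On this subset the assignment reproduces the expected output: rows 1-3 open chains 19O-1..19O-3, row 4 continues 19O-1 and row 5 continues 19O-2.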
Regards
Kumud -
Xquery string for taking out repeated data
I receive an xml like this from a service:
<mov>
<id>123</id>
<timestamp>2009-06-15-13.38.50.148244</timestamp>
<data>some data for id 123</data>
</mov>
<mov>
<id>456</id>
<timestamp>2009-06-15-15.00.50.148244</timestamp>
<data>some data</data>
</mov>
<mov>
<id>123</id>
<timestamp>2009-06-15-21.30.50.123240</timestamp>
<data>updated data for id 123</data>
</mov>
(well, there are around 100 movs in a 300kb xml file)
I have to update a database table (whose primary key is id) with the received data.
They aren't supposed to send me repeated <id> values in one xml, but it is happening again and again, as in the sample above with id=123, so my merge obviously fails with an Oracle error because it is unable to get a consistent set of records for the update.
Is there any XQuery expression to eliminate in the xmltable the <mov> with the repeated id value?
Specifically, I want to get rid of the first mov (the oldest one based on timestamp).
If the timestamp comparison is hard to achieve, getting rid of the prior movs for the same id would do.
Thanks.
Hi,
You could do it in SQL :
WITH t AS (
SELECT xmltype('<root>
<mov>
<id>123</id>
<timestamp>2009-06-15-13.38.50.148244</timestamp>
<data>some data for id 123</data>
</mov>
<mov>
<id>456</id>
<timestamp>2009-06-15-15.00.50.148244</timestamp>
<data>some data</data>
</mov>
<mov>
<id>123</id>
<timestamp>2009-06-15-21.30.50.123240</timestamp>
<data>updated data for id 123</data>
</mov>
</root>') doc
FROM dual
)
SELECT id, tstamp, data
FROM (
SELECT id,
to_timestamp(tstamp,'YYYY-MM-DD-HH24.MI.SS.FF') tstamp,
data,
row_number() over( partition by id
order by to_timestamp(tstamp,'YYYY-MM-DD-HH24.MI.SS.FF') desc ) rn
FROM t, XMLTable(
'/root/mov'
passing t.doc
columns
id number path 'id',
tstamp varchar2(30) path 'timestamp',
data varchar2(4000) path 'data'
) x
)
WHERE rn = 1;
With XQuery only, as you've foreseen, it's not easy to deal with timestamps (at least in the format you get them).
Below, an example retrieving only the last occurrence of each id (your 2nd option) :
WITH t AS (
SELECT xmltype('<root>
<mov>
<id>123</id>
<timestamp>2009-06-15-13.38.50.148244</timestamp>
<data>some data for id 123</data>
</mov>
<mov>
<id>456</id>
<timestamp>2009-06-15-15.00.50.148244</timestamp>
<data>some data</data>
</mov>
<mov>
<id>123</id>
<timestamp>2009-06-15-21.30.50.123240</timestamp>
<data>updated data for id 123</data>
</mov>
</root>') doc
FROM dual
)
SELECT x.id,
to_timestamp(x.tstamp,'YYYY-MM-DD-HH24.MI.SS.FF') tstamp,
x.data
FROM t, XMLTable(
'for $i in distinct-values($d//mov/id)
return $d//mov[id=$i][last()]'
passing t.doc as "d"
columns
id number path 'id',
tstamp varchar2(30) path 'timestamp',
data varchar2(4000) path 'data'
) x;
-
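The same keep-the-latest-per-id rule can be checked outside the database; a small Python sketch using the stdlib XML parser (the posted timestamp format happens to sort lexicographically, so plain string comparison suffices):

```python
import xml.etree.ElementTree as ET

# The sample document from the question, wrapped in a root element
xml = """<root>
  <mov><id>123</id><timestamp>2009-06-15-13.38.50.148244</timestamp><data>some data for id 123</data></mov>
  <mov><id>456</id><timestamp>2009-06-15-15.00.50.148244</timestamp><data>some data</data></mov>
  <mov><id>123</id><timestamp>2009-06-15-21.30.50.123240</timestamp><data>updated data for id 123</data></mov>
</root>"""

latest = {}  # id -> (timestamp, data), keeping only the newest mov per id
for mov in ET.fromstring(xml).findall("mov"):
    mov_id = mov.findtext("id")
    ts = mov.findtext("timestamp")  # fixed-width format: string compare == time compare
    if mov_id not in latest or ts > latest[mov_id][0]:
        latest[mov_id] = (ts, mov.findtext("data"))
```

This mirrors the `row_number() ... order by to_timestamp(...) desc` / `rn = 1` trick in the SQL answer above.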
Problem with SQL Statement for Result Filtering
Dear Visual Composer Experts,
Here is another Question from me: I have a SQL Query that is working as the data service
Select AB.AgingBandID, AB.AgingBand,
Sum(Case when priority='Emergency' then '1' Else 0 End) as [Emergency],
Sum(Case when priority='Ugent' then '1' Else 0 End) as Ugent,
Sum(Case when priority='High' then '1' Else 0 End) as High,
Sum(Case when priority='Medium' then '1' Else 0 End) as Medium,
Sum(Case when priority='Low' then '1' Else 0 End) as Low
from DimAgingBand AB left outer join
(Select AgingBandID , priority , yeardesc
from vNotifications where YearDesc = (select year(getdate())-1)) as vN
on AB.AgingBandID=vN.AgingBandID
where AB.AgingBandID<>'1'
Group by AB.AgingBandID, AB.AgingBand
Order by AB.AgingBandID
That would return me a table as in the following:
Agingband E U H M L
< 1week 0 0 0 0 1
1 - 2 weeks 0 0 0 0 0
2 - 4weeks 0 0 0 0 1
> 1month 8 2 1 1 6
Now that I would like to add some parameters to filter the result, so I modify the query and put it in the SQL Statement input port of the same data service. The query is like this:
"Select AB.AgingBandID, AB.AgingBand,Sum(Case when priority='Emergency' then '1' Else 0 End) as [Emergency],Sum(Case when priority='Ugent' then '1' Else 0 End) as Ugent,Sum(Case when priority='High' then '1' Else 0 End) as High,Sum(Case when priority='Medium' then '1' Else 0 End) as Medium,Sum(Case when priority='Low' then '1' Else 0 End) as Low from DimAgingBand AB left outer join (Select AgingBandID , priority , yeardesc from vNotifications where YearDesc like '2009%' and Branch like '" & if(STORE@selectedBranch=='ALL', '%', STORE@selectedBranch) & "' and MainWorkCentre like '%') as vN on AB.AgingBandID=vN.AgingBandID where AB.AgingBandID<>'1' Group by AB.AgingBandID, AB.AgingBand Order by AB.AgingBandID"
However this input port query keeps giving me an error, NullPointerException. I have actually specified a condition so that the query will run only if STORE@selectedBranch != ''.
I have other filtering queries working but they are not as complicated query as this one. Could it be possible that query in the input port cannot handle left outer join?
Could it be anything else?
Help is very much appreciated.
Thanks & Regard,
Sarah
Hi,
Thank you very much for your replies. I've tested that if the dynamic value of the condition is an integer, it's OK.
But if the ClassID type is not integer, it's string, I write a SQL Statement like:
"Select DBADMIN.Class.ClassName from DBADMIN.Class where DBADMIN.Class.ClassID = '1' "
or with dynamic condition:
"Select DBADMIN.Class.ClassName from DBADMIN.Class where DBADMIN.Class.ClassID = '"&@ClassID&"'"
or I write the SQL Statement for insert/update/delete data,
I always have errors.
Do you know this problem ?
Thank you very much & kindly regards,
Tweety -
I'm using NetBeans IDE 3.4 and trying to build SQL statements for a MySQL DB.
For that reason I need to concatenate the char " or ' into my SQL string. For example I need to create the string:
SELECT * FROM user WHERE username = "test".
I use variables to get the username from other function and I need to create that statment, I tried to run the following code:
SELECT * FROM user WHERE username =" + '\"' + Username + '\"'
and also tried to run:
SELECT * FROM user WHERE username =" + "\"" + Username + "\""
but I get always the following string:
SELECT * FROM user WHERE username =\"test\" , which of course is not OK.
I tried to replace the " with ' but with no luck.
Can someone help ?
Many Thanks...
String sql = "SELECT * FROM user WHERE username = '" + username + "'"
single/double double/single/double
You will have problems where your username is something like 'O'Neil'; there you will have to escape the single quote within the username by adding an additional single quote: 'O''Neil'
single/single -
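To see both approaches side by side, here is a small Python sqlite3 sketch (table and value are illustrative): doubling the embedded quote by hand, versus letting parameter binding handle it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user (username TEXT)")
conn.execute("INSERT INTO user VALUES (?)", ("O'Neil",))

username = "O'Neil"
# Manual escaping: double the embedded single quote, as the answer describes
escaped = username.replace("'", "''")
manual = conn.execute(
    "SELECT username FROM user WHERE username = '" + escaped + "'").fetchall()
# Parameter binding achieves the same result without string surgery
bound = conn.execute(
    "SELECT username FROM user WHERE username = ?", (username,)).fetchall()
```

Binding is the safer habit: it also covers values you did not anticipate, not just the quote characters you remembered to escape.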
In the JDBC sender adapter, the server is Microsoft SQL Server. I need to pass the current date as the input column while executing a stored procedure, which will return 10 output columns. Kindly suggest the SQL query string for executing the stored procedure with the current date as the input.
Hi Srinath,
The below blog might be useful
http://scn.sap.com/community/pi-and-soa-middleware/blog/2013/03/06/executing-stored-procedure-from-sender-adapter-in-sap-pi-71
PI/XI: Sender JDBC adapter for Oracle stored procedures in 5 days
regards,
Harish -
How to Search for a Date in a string having both characters and date
Hi ,
I have a requirement to find a DATE inside a string that contains both characters and the date. Please let me know as early as possible.
Regards
Anil Kumar K
Try to use the SEARCH command with a pattern, making the pattern as fine as it can be; e.g. if you have your date in the format 02/23/2007, you can give
SEARCH STRING FOR ' //'. (assuming the date has at least a single blank before it)
WRITE: SY-SUBRC UNDER 'SY-SUBRC',
SY-FDPOS UNDER 'SY-FDPOS'.
SY-FDPOS is set to offset of the string from which you can get the date value
Refer here
http://help.sap.com/saphelp_47x200/helpdata/en/fc/eb33cc358411d1829f0000e829fbfe/frameset.htm
It would have been better if there were something like UNIX "regular expression" matching in ABAP -
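For comparison, the regular-expression matching the answer wishes for is a one-liner in most other languages. A Python sketch, assuming the MM/DD/YYYY format mentioned above (the sample text is made up):

```python
import re

text = "Invoice approved by J.Smith on 02/23/2007 at HQ"
# Match a date of the form MM/DD/YYYY anywhere in the string
m = re.search(r"\b(\d{2})/(\d{2})/(\d{4})\b", text)
offset = m.start()   # offset into the string, the analogue of SY-FDPOS
found = m.group(0)   # the matched date text
```

(Newer ABAP releases do offer FIND ... REGEX, which works the same way.)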
Using CVS in SQL Developer for Data Modeler changes.
Hi,
I am fairly new to SQL Developer Data Modeler and associated version control mechanisms.
I am prototyping the storage of database designs and version control for the same, using the Data Modeler within SQL Developer. I have SQL Developer version 3.1.07.42 and I have also installed the CVS extension.
I can connect to our CVS server through sspi protocol and external CVS executable and am able to check out modules.
Below is the scenario where I am facing some issue:
I open the design from the checked out module and make changes and save it. In the File navigator, I look for the files that have been modified or added newly.
This behaves rather inconsistently: even after clicking the refresh button, sometimes it does not get refreshed.
Next I try to look for the changes in the Pending Changes (CVS) window. According to the other posts, I am supposed to look at the View - Data Modeler - Pending Changes window for data modeler changes, but that always shows up empty (I am not sure if it is only tied to Subversion). I do see the modified files / files to be added to CVS under the Versioning - CVS - Pending Changes window.
The issue is that when I click the refresh button in that window, all the files just vanish and all the counts show 0. Strangely, if I go to Tools - Preferences - Versioning - CVS and just click OK, the Pending Changes window gets populated again (the counts are inconsistent at times).
I believe this issue is fixed and should work correctly in 3.1.07.42 but it does not seem to be case.
Also, I'm not sure if I can use the CVS functionality available in SQL Developer for the data modeler, or whether I should be using an external client such as WinCvs for check in / check out.
Please help.
ThanksHi Joop,
I think you will find that in Data Modeler's Physical Model tree the same icons are used for temporary Tables and Materialized Views as in SQL Developer.
David -
Store Date as Oracle Date as string for simplicity?
I got this table (scaled-down):
CREATE TABLE TBLSERVICES (
STORENBR CHAR(4 byte) NOT NULL,
SERVICEDATE CHAR(10 byte) NOT NULL,
CHECKNBR CHAR(6 byte),
ORDER_START_TIME DATE NOT NULL,
ITEMID CHAR(4 byte),
SERVERID NUMBER(18),
SERVERNAME VARCHAR2(50 byte),
ITEMQUANTITY NUMBER
)
Function MakeServicesDatatable(ByVal onodesConfig As XmlNodeList) As DataTable
dim objDT as new datatable
With objDt
.Columns.Add("StoreNbr", GetType(String))
.Columns.Add("Order_Start_Time", GetType(String))
End With
dim dte as string
For Each onode As XmlNode In onodesConfig
If onode.InnerText = "5" Or onode.InnerText = "6" Then
objdr = objDt.NewRow
objdr("StoreNbr") = storenbr
dteYear = parentNode("Order_Start_Time").SelectSingleNode("Year").InnerText
dteMonth = parentNode("Order_Start_Time").SelectSingleNode("Month").InnerText
dteDay = parentNode("Order_Start_Time").SelectSingleNode("Day").InnerText
DteHour = parentNode("Order_Start_Time").SelectSingleNode("Hour").InnerText
dteMin = parentNode("Order_Start_Time").SelectSingleNode("Minute").InnerText
dteSecs = parentNode("Order_Start_Time").SelectSingleNode("Second").InnerText
dte = String.Format("{0}/{1}/{2} {3}:{4}:{5}", dteMonth, dteDay, dteYear, DteHour, dteMin, dteSecs)
objDR("order_start_time") = dte
objDt.rows.add(objdr)
end if
next onode
return objDT
end function
I call following:
dt = MakeServicesDatatable(NodesWithData)
WriteDataToDB(dt)
Public Sub WriteDataToDB(ByVal dtServices As DataTable)
Dim altConnString As String = connString
Dim queryString As String = "Select * from tblServices"
Using connection As New OracleConnection(altConnString)
Dim adapter As New OracleDataAdapter()
adapter.SelectCommand = New OracleCommand(queryString, connection)
Dim builder As OracleCommandBuilder = New OracleCommandBuilder(adapter)
connection.Open()
Dim ds As DataSet = New DataSet
adapter.Fill(ds, "tblServices")
For Each drService As DataRow In dtServices.Rows
ds.Tables("tblServices").ImportRow(drService)
Next drService
adapter.Update(ds, "tblServices")
End Using
End Sub
In a PL/SQL query, I need to say something like this:
- Lunch is anything before 3:00pm
- Assume date is same as date of order start time
select storenbr,
itemquantity
from tblServices
where order_start_time < DatePart(date of order_start_time) || 15:59:00
I don't know the syntax for getting the date portion of an Oracle date.
Problem is the order_start_time is stored as dd-mmm-yyyy.
I need seconds also. Should I make order_start_time a string
and convert it to a date in the SQL statement?
Hi Vinod,
Create a variable of type customer exit for RUNDATE.
IF i_step = 1.
SELECT MAX( rundate ) FROM /bic/prundate INTO zrundate.
IF i_vnam = 'RUNDATE'.
l_s_range-low = zrundate.
l_s_range-opt = 'EQ'.
APPEND l_s_range TO e_t_range.
ENDIF.
ENDIF.
hope it helps
bhaskar -
RFC WHICH CAN USE DYNAMIC SQL AS INPUT AND SHOW COMPLETE DATA FOR TABLE
Hi Expert,
I am trying to create an FM like RFC_READ_TABLE: we pass a table name and the field names, write a query in the options, and get the output only for those fields.
My requirement is very similar to this. But here I want to enter any table name, write a dynamic SQL condition in the options for any field of that table, and get the data based on it, so that it displays the matching table entries in full.
Like TABNAMELIKE EKKO
OTHERCON bukrs_k = 3000.
Based on this selection it has to show the entire table fields.
To make this easy to understand i made a custom FM which are getting data from table or view and i select any field and put query it will show the result.
FUNCTION ZDYNSQL_EKKO_EKPO.
""Local Interface:
*" IMPORTING
*" VALUE(ERNAMLIKE) TYPE CHAR15 OPTIONAL
*" VALUE(OTHERCON) TYPE CHAR50 OPTIONAL
*" TABLES
*" VALUE STRUCTURE V_EKKO_EKPO
*TABLES : V_EKKO_EKPO, EKKO, EKPO.
DATA: STR_WHERE TYPE TABLE OF EDPLINE.
DATA: STR_LINE TYPE EDPLINE.
*CONCATENATE 'EBELN LIKE''' EBELNLIKE '%''' INTO STR_LINE.
CONCATENATE 'ERNAM LIKE ''' ERNAMLIKE '%''' INTO STR_LINE.
IF OTHERCON <> ' '.
CONCATENATE STR_LINE 'AND' OTHERCON ' ' INTO STR_LINE SEPARATED BY SPACE.
ENDIF.
APPEND STR_LINE TO STR_WHERE.
SELECT * FROM V_EKKO_EKPO INTO CORRESPONDING FIELDS OF TABLE VALUE WHERE (STR_WHERE).
ENDFUNCTION.
Now here is sample code of exact requirement.
FUNCTION ZDYNSQL_TABLE_READ.
""Local Interface:
*" IMPORTING
*" VALUE(TABNAMELIKE) TYPE DD02L-TABNAME
*" VALUE(OTHERCON) TYPE CHAR80 OPTIONAL
*" TABLES
*" VALUE STRUCTURE DD02L
DATA: STR_WHERE TYPE TABLE OF EDPLINE.
DATA: STR_LINE TYPE STRING.
CONCATENATE 'TABNAME LIKE ''' TABNAMELIKE '%' 'DD02L' 'TABNAME' INTO STR_LINE.
IF OTHERCON <> ' '.
CONCATENATE STR_LINE 'AND' OTHERCON ' ' INTO STR_LINE SEPARATED BY SPACE.
ENDIF.
APPEND STR_LINE TO STR_WHERE.
SELECT * FROM DD02L INTO CORRESPONDING FIELDS OF TABLE VALUE WHERE (STR_WHERE).
ENDFUNCTION.
In this, when I put the table name as EKKO and the SQL query as bukrs_k = 3000, it produces a short dump.
How can I solve this problem? Please provide some input or modification.
Thanks And Regards
Ranjeet Singh
Hi Kris,
I tried to make a sample using the link you provided to me. How can I declare a Global Interface in the FM, and import parameter references like "REFERENCE(I_INTERFACE_CHECK) DEFAULT SPACE"?
Also it uses a function-pool.
Let me tell you about my exact requirement about FM.
I want in import parameter input as any SAP Table name like
TABNAME TYPE EKKO
OPTIONS TYPE CHAR80
I want my output to be stored in TABLES attributes as per the table name entered in import parameter. In import parameter Table name can be any one of SAP tables and Option based on that particular table. Like if i go with table EKKO and put OPTIONS as
ebelp = 4 then TABLES attributes Tab contains all the relevant data for input.
Is there any way with the help of that i can put my data into internal tables. I tried to put in TABLES as VALUE LIKE ANY but it shows that generic are not allowed. Can you provide some sample on this.
I also getting exceptions like CX_SY_DYNAMIC_OSQL_SEMANTICS, SAPSQL_INVALID_FIELDNAME.
Waiting for your valuable reply.
Thanks And Regards
Ranjeet Singh -
For loops and dynamic sql string syntax
Hi
is there a way to loop through a dynamic SQL string?
normally you would have
FOR cur IN (select * from emp) LOOP
but I have a dynamic sql string called l_sql
I have tried the following
FOR cur IN l_sql LOOP
but I get
PLS-00456: item 'L_SQL' is not a cursor
Compilation failed
Any ideas?
You will need to use ref cursors: OPEN v_rc FOR '<your sql string>' and then loop through it as you would with any other cursor, using the usual OPEN, LOOP, EXIT WHEN, END LOOP syntax
-
Date fields are showing as / / in DSO for blank dates or no dates
We are loading flat file data to a DSO, and the date fields are showing as / / in the DSO for blank or missing dates in the flat file source system. We don't want to see this / / ; instead, we want this field in the DSO to be blank. Please help on how to achieve this. Is there any way to do this in the transformation? If yes, can you please provide sample coding?
Advance Thanks,
Christy
I have added the code and the data loading is successful, but the DSO activation is failing. The error message is:
Value '#' of characteristic 0DATE is not a number with 000008 spaces.
It seems that we need to change the code. Can you provide me the corrected code please. Thanks.
Christy -
Hi Friends,
I have a view named item_sales with the following columns:
Item code
Item name
Transaction_YYYYMM (Date stored in YYYYMM format )
QTY_RECEIVED
QTY_SOLD
Sample data is
ITEM_CODE ITEM NAME TRANSACTION_YYYYMM QTY_RECD QTY_SOLD
AX TSHIRT 201307 3000 2000
AX TSHIRT 201308 2000 500
AX TSHIRT 201309 1000 3000
CX XLSHIRT 201307 3000 2000
CX XLSHIRT 201308 3000 2500
CX XLSHIRT 201309 3000 2500
EVERY MONTH END I WILL RUN THIS QUERY TO FIND OUT THE BELOW DETAILS
1. TO FIND ITEM_NAME WISE - QTY_RECEIVED AND QTY_SOLD ( FOR CURRENT MONTH - EXAMPLE SEP )
2. TO FIND ITEM_NAME WISE - QTY_RECEIVED AND QTY_SOLD (FOR CURRENT YEAR EXAMPLE FROM JAN TO SEP )
OUTPUT FOR SEPTEMBER MONTH LOOK LIKE THIS
SEP-MONTH JAN TO SEP
ITEM_CODE ITEM_NAME QTY_RECEIVED QTY_SOLD QTY_RECEIVED QTY_SOLD
AX TSHIRT 1000 3000 6000 5500
CX XLSHIRT 3000 2000 9000 7000
Pls advise me how to write queries for this
Rdk
Just FYI, you *can* edit your own posts, you know.
Rdk wrote:
Transaction_YYYYMM (Date stored in YYYYMM format )
First "problem". Don't store dates as strings. Store them as dates. It will save you so much headache down the road you won't believe it.
True, this is a view, so maybe not as critical - assuming the underlying *DATA* is actually a date.
1. TO FIND ITEM_NAME WISE - QTY_RECEIVED AND QTY_SOLD ( FOR CURRENT MONTH - EXAMPLE SEP )
2. TO FIND ITEM_NAME WISE - QTY_RECEIVED AND QTY_SOLD (FOR CURRENT YEAR EXAMPLE FROM JAN TO SEP )
So yeah, based on these requirements, I'd recommend you make that column a DATE, not a string. Dates are easier to parse for date-related logic - such as month by month as you need here.
Using that, here's one way to do it:
with w_data as (
select 'AX' item_code, 'TSHIRT ' item_name, to_date('20130701','yyyymmdd') trans_dt, 3000 qty_recd, 2000 qty_sold from dual union all
select 'AX' , 'TSHIRT ' , to_date('20130801','yyyymmdd') , 2000 , 500 from dual union all
select 'AX' , 'TSHIRT ' , to_date('20130901','yyyymmdd') , 1000 , 3000 from dual union all
select 'CX' , 'XLSHIRT' , to_date('20130701','yyyymmdd') , 3000 , 2000 from dual union all
select 'CX' , 'XLSHIRT' , to_date('20130801','yyyymmdd') , 3000 , 2500 from dual union all
select 'CX' , 'XLSHIRT' , to_date('20130901','yyyymmdd') , 3000 , 2500 from dual
),
w_base as (
select item_code, item_name, trans_dt, qty_recd, qty_sold,
sum(qty_recd) over (partition by item_code, trunc(trans_dt, 'MM')) mm_recd,
sum(qty_sold) over (partition by item_code, trunc(trans_dt, 'MM')) mm_sold,
sum(qty_recd) over (partition by item_code, trunc(trans_dt, 'YY')) yy_recd,
sum(qty_sold) over (partition by item_code, trunc(trans_dt, 'YY')) yy_sold,
row_number() over (partition by item_code order by trans_dt desc) rnum
from w_data d
)
select item_code, item_name, mm_recd, mm_sold, yy_recd, yy_sold
from w_base
where rnum = 1
IT ITEM_NA MM_RECD MM_SOLD YY_RECD YY_SOLD
AX TSHIRT 1000 3000 6000 5500
CX XLSHIRT 3000 2500 9000 7000 -
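The same month/year roll-up can be prototyped outside Oracle; here is a sketch using SQLite's window functions through Python's sqlite3 module (requires SQLite 3.25+; the table and data mirror the thread, but dates are stored as ISO strings since SQLite has no DATE type):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE item_sales (item_code TEXT, item_name TEXT, trans_dt TEXT,
                         qty_recd INTEGER, qty_sold INTEGER);
INSERT INTO item_sales VALUES
 ('AX','TSHIRT' ,'2013-07-01',3000,2000),
 ('AX','TSHIRT' ,'2013-08-01',2000, 500),
 ('AX','TSHIRT' ,'2013-09-01',1000,3000),
 ('CX','XLSHIRT','2013-07-01',3000,2000),
 ('CX','XLSHIRT','2013-08-01',3000,2500),
 ('CX','XLSHIRT','2013-09-01',3000,2500);
""")

# The latest row per item carries the current-month window sums; the
# year-to-date sums come from a window partitioned on the year alone.
rows = conn.execute("""
WITH w_base AS (
  SELECT item_code, item_name,
         SUM(qty_recd) OVER (PARTITION BY item_code, strftime('%Y-%m', trans_dt)) AS mm_recd,
         SUM(qty_sold) OVER (PARTITION BY item_code, strftime('%Y-%m', trans_dt)) AS mm_sold,
         SUM(qty_recd) OVER (PARTITION BY item_code, strftime('%Y',    trans_dt)) AS yy_recd,
         SUM(qty_sold) OVER (PARTITION BY item_code, strftime('%Y',    trans_dt)) AS yy_sold,
         ROW_NUMBER()  OVER (PARTITION BY item_code ORDER BY trans_dt DESC)       AS rnum
  FROM item_sales
)
SELECT item_code, item_name, mm_recd, mm_sold, yy_recd, yy_sold
FROM w_base WHERE rnum = 1 ORDER BY item_code
""").fetchall()
print(rows)
# [('AX', 'TSHIRT', 1000, 3000, 6000, 5500), ('CX', 'XLSHIRT', 3000, 2500, 9000, 7000)]
```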
Displaying diff dates using PL/SQL expression for 'display only' item ?
Hi ,
I have a display-only item -- :P2_FROM_Date. If it's Thu, Fri, Sat or Sun, I want to set the date to the previous Monday's date. If it's Mon, Tue or Wed, then it should be the current Monday's date.
E.g: Today is Friday and the last Monday was on 18th .
So for yesterday , today,tomorrow and Sunday , the date should be displayed as 18-JUN-2012.
From the coming Monday to Wednesday, the date should be the coming Monday, i.e., 24-JUN-2012.
I tried doing it under 'Source' of the item using a PL/SQL expression and a PL/SQL function body. Not working.
Can someone help ?
Thanks & Regards
Umer
Nice1 wrote:
declare
lv_date number;
begin
select to_char(sysdate,'D') into lv_date from dual;
if lv_date=2 then
:P2_FROM_DATE := to_char(sysdate-1);
end if;
end;
I tried this under "PL/SQL function body" in the "Source" tab of the item P2_FROM_DATE.
When I run this, nothing is displayed for the item P2_FROM_DATE.
Exactly as expected. This code will only set a value for <tt>P2_FROM_DATE</tt> when run on Mondays in territories where the first day of the week is Sunday, and when run on Tuesdays where Monday is the first day of the week:
SQL> var P2_FROM_DATE varchar2(30)
SQL> alter session set nls_date_format='Dy DD-MON-YYYY';
Session altered.
SQL> select sysdate from dual
SYSDATE
Mon 25-JUN-2012
SQL> alter session set nls_territory='AMERICA';
Session altered.
SQL> declare
2 lv_date number;
3 begin
4 select to_char(sysdate,'D') into lv_date from dual;
5 if lv_date=2 then
6 :P2_FROM_DATE := to_char(sysdate-1);
7 end if;
8 end;
9 /
PL/SQL procedure successfully completed.
SQL> print p2_from_date
P2_FROM_DATE
Sun 24-JUN-2012
SQL> alter session set nls_territory='UNITED KINGDOM';
Session altered.
SQL> exec :p2_from_date := null
SQL> declare
2 lv_date number;
3 begin
4 select to_char(sysdate,'D') into lv_date from dual;
5 if lv_date=2 then
6 :P2_FROM_DATE := to_char(sysdate-1);
7 end if;
8 end;
9 /
PL/SQL procedure successfully completed.
SQL> print p2_from_date
P2_FROM_DATE
SQL>
Hence the questions about language above.
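One NLS-independent way to get the day of the week is to measure the offset from the ISO week start, which always falls on Monday regardless of NLS_TERRITORY (a sketch, not from the thread):

```sql
-- 0 = Monday, 1 = Tuesday, ... 6 = Sunday, in every territory
select trunc(sysdate) - trunc(sysdate, 'IW') as iso_day_offset from dual;
```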
>
I am having a display only item -- :P2_FROM_Date . If its Thu,Fri,Sat or Sun I want to set the date as the last Monday's date . If its Mon,Tue or Wed then it should be the present Monday's date .
E.g: Today is Friday and the last Monday was on 18th .
So for yesterday , today,tomorrow and Sunday , the date should be displayed as 18-JUN-2012.
From the coming Monday to Wednesday, the date should be the coming Monday, i.e., 24-JUN-2012.
>
The coming Monday is 25-JUN-2012.
Aren't these rules equivalent to "Monday this week, where Monday is the first day of the week"? In which case the PL/SQL Expression you require is:
trunc(sysdate, 'iw')
For example:
SQL> with t as (
2 select date '2012-06-21' + level d from dual connect by level <= 17)
3 select
4 d
5 , trunc(d, 'iw') monday
6 from
7 t;
D MONDAY
Fri 22-JUN-2012 Mon 18-JUN-2012
Sat 23-JUN-2012 Mon 18-JUN-2012
Sun 24-JUN-2012 Mon 18-JUN-2012
Mon 25-JUN-2012 Mon 25-JUN-2012
Tue 26-JUN-2012 Mon 25-JUN-2012
Wed 27-JUN-2012 Mon 25-JUN-2012
Thu 28-JUN-2012 Mon 25-JUN-2012
Fri 29-JUN-2012 Mon 25-JUN-2012
Sat 30-JUN-2012 Mon 25-JUN-2012
Sun 01-JUL-2012 Mon 25-JUN-2012
Mon 02-JUL-2012 Mon 02-JUL-2012
Tue 03-JUL-2012 Mon 02-JUL-2012
Wed 04-JUL-2012 Mon 02-JUL-2012
Thu 05-JUL-2012 Mon 02-JUL-2012
Fri 06-JUL-2012 Mon 02-JUL-2012
Sat 07-JUL-2012 Mon 02-JUL-2012
Sun 08-JUL-2012 Mon 02-JUL-2012
17 rows selected.
SQL> alter session set nls_territory='AMERICA';
Session altered.
SQL> alter session set nls_date_format='Dy DD-MON-YYYY';
Session altered.
SQL> with t as (
2 select date '2012-06-21' + level d from dual connect by level <= 17)
3 select
4 d
5 , trunc(d, 'iw') monday
6 from
7 t;
D MONDAY
Fri 22-JUN-2012 Mon 18-JUN-2012
Sat 23-JUN-2012 Mon 18-JUN-2012
Sun 24-JUN-2012 Mon 18-JUN-2012
Mon 25-JUN-2012 Mon 25-JUN-2012
Tue 26-JUN-2012 Mon 25-JUN-2012
Wed 27-JUN-2012 Mon 25-JUN-2012
Thu 28-JUN-2012 Mon 25-JUN-2012
Fri 29-JUN-2012 Mon 25-JUN-2012
Sat 30-JUN-2012 Mon 25-JUN-2012
Sun 01-JUL-2012 Mon 25-JUN-2012
Mon 02-JUL-2012 Mon 02-JUL-2012
Tue 03-JUL-2012 Mon 02-JUL-2012
Wed 04-JUL-2012 Mon 02-JUL-2012
Thu 05-JUL-2012 Mon 02-JUL-2012
Fri 06-JUL-2012 Mon 02-JUL-2012
Sat 07-JUL-2012 Mon 02-JUL-2012
Sun 08-JUL-2012 Mon 02-JUL-2012
17 rows selected.
Also note that using the item source properties will only set <tt>P2_FROM_DATE</tt> in the rendered page, not in session state.
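For anyone prototyping the same rule outside the database: TRUNC(d, 'IW') means "the Monday of d's ISO week", which can be sketched in Python (a hypothetical helper, not part of the thread):

```python
from datetime import date, timedelta

def iso_week_monday(d: date) -> date:
    """Monday of d's ISO week, mirroring Oracle's TRUNC(d, 'IW')."""
    # weekday() is locale-independent: Monday = 0 ... Sunday = 6
    return d - timedelta(days=d.weekday())

print(iso_week_monday(date(2012, 6, 22)))  # 2012-06-18 (Fri maps back to Mon 18th)
print(iso_week_monday(date(2012, 6, 25)))  # 2012-06-25 (Mon maps to itself)
```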