Query needed for Cumulative data
Hi Friends,
I need output like this:
Frequency  Percent (%)  Cumulative Frequency  Cumulative Percent
4468       0.91         4468                  0.91
21092      4.31         25560                 5.23
57818      11.82        83378                 17.05
6274       1.28         89652                 18.33
I am using Oracle 9i.
My output data looks like the above, and I need to write the query for the three derived columns (Percent, Cumulative Frequency and Cumulative Percent):
1: The formula for the Frequency column is the sum of (dd + cc + mc + cc_mc).
2: The formula for the Percent column is (Frequency / total Frequency over all rows) * 100.
3: The formula for the Cumulative Frequency column is the running total of the Frequency column.
4: The formula for the Cumulative Percent column is the running total of the Percent column.
What should the analytic function be, and how do I write the query? Please find the sample data and table script below.
Thanks,
Lony
CREATE TABLE all_lony (
   campno VARCHAR2(20),
   dd     INTEGER,
   cc     INTEGER,
   mc     INTEGER,
   cc_mc  INTEGER
);
insert into all_lony (campno, dd, cc, mc, cc_mc)
values ('36', 156, 1320, 445, 2547);
insert into all_lony (campno, dd, cc, mc, cc_mc)
values ('40', 233, 19711, 263, 885);
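As a quick sanity check of the Frequency definition against these sample rows (a sketch):

```sql
SELECT campno,
       NVL (dd, 0) + NVL (cc, 0) + NVL (mc, 0) + NVL (cc_mc, 0) AS frequency
  FROM all_lony;
-- campno 36: 156 + 1320 + 445  + 2547 = 4468
-- campno 40: 233 + 19711 + 263 + 885  = 21092
```

The totals 4468 and 21092 match the first two Frequency values in the desired output above.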
=============
Please find my query below
SELECT campno
       || ',' || dd
       || ',' || cc
       || ',' || mc
       || ',' || cc_mc
       || ',' || frequency
       || ',' || per
       || ',' || cumulative_fr
       || ',' || SUM (per) OVER (ORDER BY frequency ROWS UNBOUNDED PRECEDING)
  FROM (SELECT q2.campno, q2.dd, q2.cc, q2.mc, q2.cc_mc,
               q2.frequency,
               SUM (q2.frequency) OVER (ORDER BY q2.frequency
                                        ROWS UNBOUNDED PRECEDING) cumulative_fr,
               q2.frequency / SUM (q2.frequency) OVER () * 100 per
          FROM (SELECT q1.campno,
                       SUM (q1.dd) dd,
                       SUM (q1.cc) cc,
                       SUM (q1.mc) mc,
                       SUM (q1.cc_mc) cc_mc,
                       SUM (  NVL (q1.dd, 0)
                            + NVL (q1.cc, 0)
                            + NVL (q1.mc, 0)
                            + NVL (q1.cc_mc, 0)) frequency
                  FROM all_lony q1
                 GROUP BY q1.campno) q2)
Can anybody just verify the query and let me know.
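One possible shape for the query, using analytic functions available in 9i (a sketch against the all_lony script above; the accumulation order is assumed to be campno, since the sample output is not sorted by frequency):

```sql
SELECT campno,
       frequency,
       ROUND (pct, 2)                                                     AS percent,
       SUM (frequency) OVER (ORDER BY campno ROWS UNBOUNDED PRECEDING)    AS cumulative_frequency,
       ROUND (SUM (pct) OVER (ORDER BY campno ROWS UNBOUNDED PRECEDING),
              2)                                                          AS cumulative_percent
  FROM (SELECT campno,
               frequency,
               -- each row's share of the grand total, as a percentage
               RATIO_TO_REPORT (frequency) OVER () * 100 AS pct
          FROM (SELECT campno,
                       SUM (  NVL (dd, 0) + NVL (cc, 0)
                            + NVL (mc, 0) + NVL (cc_mc, 0)) AS frequency
                  FROM all_lony
                 GROUP BY campno))
 ORDER BY campno;
```

RATIO_TO_REPORT(frequency) OVER () gives each row's share of the grand total; its running SUM has to sit in an outer query block because analytic functions cannot be nested directly.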
Similar Messages
-
Query need for transpose data output
Dear All,
I have a query regarding transpose output.
For example, I created one employee table in the database:
create table emp(emp_no number, dept_id number(10));
It has data like:
emp_no dept_id
101 10
102 20
103 10
104 10
105 20
so I want the output in transpose format like
dept_id emp_no1 emp_no2 emp_no3
10 101 103 104
20 102 105
can anybody suggest me any query for the above output.
Thanks...
Prashant....
select dept_id
, max (decode (emp_no, 101, emp_no))
, max (decode (emp_no, 102, emp_no))
, max (decode (emp_no, 103, emp_no))
, max (decode (emp_no, 104, emp_no))
, max (decode (emp_no, 105, emp_no))
from test
group by dept_id
as in
SQL> with test as
2 (
3 select 101 emp_no, 10 dept_id from dual union all
4 select 102 emp_no, 20 dept_id from dual union all
5 select 103 emp_no, 10 dept_id from dual union all
6 select 104 emp_no, 10 dept_id from dual union all
7 select 105 emp_no, 20 dept_id from dual
8 )
9 select dept_id
10 , max (decode (emp_no, 101, emp_no)) emp1
11 , max (decode (emp_no, 102, emp_no)) emp2
12 , max (decode (emp_no, 103, emp_no)) emp3
13 , max (decode (emp_no, 104, emp_no)) emp4
14 , max (decode (emp_no, 105, emp_no)) emp5
15 from test
16 group by dept_id
17 ;
DEPT_ID EMP1 EMP2 EMP3 EMP4 EMP5
20 102 105
10 101 103 104
-
How to only show data for a certain period of time for cumulative data?
I need to show cumulative data for the past 12 months on a chart (e.g. # of accounts). But in order to get the cumulative data for Month 1, I need to cum the data starting from the very first month when we started to add new account which can be Jan 2006. If I put a filter to limit the data to only last 12 months, then my Month 1's cum data will not include any account prior to Month 1. So how do I do this?
Edited by: user637947 on Feb 5, 2009 2:02 PM
Hi,
Try this Filter....
Opportunity."Close Date" >= TIMESTAMPADD(SQL_TSI_MONTH, -11, TIMESTAMPADD(SQL_TSI_DAY, -(DAY(CURRENT_DATE) - 1), CURRENT_DATE)) AND Opportunity."Close Date" <= TIMESTAMPADD(SQL_TSI_MONTH, 1, TIMESTAMPADD(SQL_TSI_DAY, -(DAY(CURRENT_DATE) + 0), CURRENT_DATE))
Thanks and Regards,
Amit Koul
-
Hello,
There is Selection Screen where it allows to enter KUNNR, LAND1 and VBELN.
I want to display below fields from table specified.
VBELN, FKART, FKDAT, WAERK, KUNRG, KUNWE from VBRK Table, BSTNK, VBELN, AUART from VBAK Table, POSNR, MATNR, FKIMG, WAVWR from VBRP Table, KBETR from KOMV and MWSBP from KOMP.
So I need a query to display the above fields as output, according to the input in the selection screen; if nothing is entered in the selection screen, default to all data.
Please help me with Program or Query to retrieve above data.
Thanks,
Well... you seem to be new to ABAP.. hope this helps.....
select VBELN, FKART, FKDAT, WAERK, KUNRG, KUNWE
from vbrk
into table t_vbrk
where vbeln = s_vbeln. " given in selection screen
if sy-subrc = 0.
endif.
select BSTNK, VBELN, AUART
from vbak
into table t_vbak
where vbeln = s_vbeln.
if sy-subrc = 0.
endif.
Similarly write the other select queries as well.
Its very useful if you pass the key fields while selecting from the database. Helps in faster processing.
Regards.
-
Blank needed for cumulative Key Figure
Hi All,
here's what my BEx report looks like when I run it (after 10 AM and before 11 AM):
Hour Qty CumulativeQty
7AM 10 10
8AM 20 30
9AM 15 45
10AM 13 58
11AM 58
12PM 58
But here's what my BEx report should look like when I run it (after 10 AM and before 11 AM):
Hour Qty CumulativeQty
7AM 10 10
8AM 20 30
9AM 15 45
10AM 13 58
11AM
12PM
Any suggestions?
Try this If..Else formula for the Cumulative Qty:
( Qty KF < > 0 ) * Cumulative Qty KF + ( Qty KF = = 0 ) * 0
Finally, set "Zero as space" in the query's global properties.
-
Input needed for writing data back into BW/ECC data
Hello Everyone,
Can anyone please let me know an example or process you have used for writing data back into BW (DSO/Cubes) or ECC systems.
We do not have integrated planning in our current system configuration currently.
So, any sample code for using any of the BAPI/RFC for writing data would be appreciated.
Also, I am trying to find out if there is any way to schedule the models developed in VC 7.1 in background to automate certain type of data processing which goes across multiple systems.
Any help would be appreciated.
Thanks.
Hello,
Can anyone please help me out on this one....
I am aware of a few BAPIs such as RSDRI_ODSO_INSERT_RFC, but I am not aware of which action has to be used to transfer the data from the table within VC to this BAPI, nor how to define the parameters, as the ones available for the BAPI I mentioned do not fit the data I have in the table within VC.
The following are the columns I have in the table within VC,
1. GL Account
2. Credit Amount
3. Debit Amount
4. Sales Amount
I have defined the DSO with the same columns, where G/L Account is the key field and the rest are data fields.
Any help would be really appreciated.
Thanks.
-
Libraries need for ADF Data visualization
Hi all,
I want to ask list of libraries which needed by ADF Data visualization.
Thanks
Hadi Wijaya
Hi,
since you ask this question on the JDeveloper 10.1.3 forum, just be aware that the DVT components work with JSF 1.2 only, and not in JDeveloper 10.1.3.
Frank
-
SQL query needed for date format 'Mon 23 Apr 16:45 pm'
I need a SQL query to print a date in the 'Mon 23 Apr 16:45 pm' format.
SQL> select to_char(sysdate,'Dy fmddfm Mon hh24:mi pm','nls_date_language=american') from dual
2 /
TO_CHAR(SYSDATE,'DY
Fri 27 Apr 13:04 pm
1 row selected.
Regards,
Rob.
-
Query Needed for Partitioning table
Hi,
I have created a table called Test. There is a column named business_name.
There are several businesses like ABC,BCD,ADE....
There will be lakhs of rows corresponding to each business; I mean there will be lakhs of entries corresponding to ABC, BCD, ...
So I would like to partition the table by business_name so that searches will be faster. As we have partitioned by business_name, I hope we need to search only the partition corresponding to the particular business.
Can anyone provide the query to partition the table TEST according to the column business_name?
Also, can anyone provide the query to modify the already existing table TEST to incorporate partitioning for the column business_name?
We can partition a table as follows:
create table Generalledger (
  record_id     number,
  business_name varchar2(3),
  sales_dt      date,
  amount        number(10)
)
partition by list (business_name)
( partition ct  values ('ABC'),
  partition ca  values ('BCD'),
  partition def values (default)
);
But if we don't know the values like 'ABC', 'BCD', ... how can we do the partitioning?
Use SQL to generate part (or all) of your DDL statement. The following will output one partition clause for each business_name:
SELECT DISTINCT 'partition p_' || BUSINESS_NAME || ' values (''' ||
BUSINESS_NAME || '''),'
FROM GENERALLEDGER;
-
How can I make CONTAINS query work for a date range
In either 9i or 10g (eventual). I have a CONTEXT index that contains multiple columns from multiple tables and using a USER_DATASTORE. E.g., I have names that come from 3 different table locations and dates that come from 4. I can index them fine but how can I perform a single consolidated CONTAINS query against the single CONTEXT index to do the following:
smith WITHIN lname AND john WITHIN fname AND dob BETWEEN '19870315' and '19970315'
I know that I can use a mixed query, but this is inefficient (especially since I have birth dates in multiple tables). Is there any algorithm for a range operator (>, <, between?) within the CONTAINS operator?
CTXCAT index is not an option, as I have many text columns I am searching.
Thanks!
When you run cdstore.sql, in addition to creating the ctx_cd package, it also creates the friedman package that contains the algorithm that the ctx_cd package uses. You could use the functions from that friedman package in your procedure for your user_datastore and in the creation of your query string, as demonstrated below.
SCOTT@orcl_11g> CREATE OR REPLACE PROCEDURE my_proc
2 (p_rowid IN ROWID,
3 p_clob IN OUT NOCOPY CLOB)
4 AS
5 BEGIN
6 FOR r IN
7 (SELECT emp.ename, emp.job, emp.hiredate, dept.dname
8 FROM emp, dept
9 WHERE emp.deptno = dept.deptno
10 AND emp.ROWID = p_rowid)
11 LOOP
12 DBMS_LOB.WRITEAPPEND (p_clob, 7, '<ename>');
13 DBMS_LOB.WRITEAPPEND (p_clob, LENGTH (r.ename), r.ename);
14 DBMS_LOB.WRITEAPPEND (p_clob, 8, '</ename>');
15 DBMS_LOB.WRITEAPPEND (p_clob, 5, '<job>');
16 DBMS_LOB.WRITEAPPEND (p_clob, LENGTH (r.job), r.job);
17 DBMS_LOB.WRITEAPPEND (p_clob, 6, '</job>');
18 DBMS_LOB.WRITEAPPEND (p_clob, 7, '<dname>');
19 DBMS_LOB.WRITEAPPEND (p_clob, LENGTH (r.dname), r.dname);
20 DBMS_LOB.WRITEAPPEND (p_clob, 8, '</dname>');
21 DBMS_LOB.WRITEAPPEND (p_clob, 10, '<hiredate>');
22 -- apply friedman algorithm to date column ------------------
23 friedman.init
24 (TO_NUMBER (TO_CHAR (TO_DATE (19000101, 'YYYYMMDD'), 'J')),
25 TO_NUMBER (TO_CHAR (TO_DATE (21001231, 'YYYYMMDD'), 'J')));
26 DBMS_LOB.WRITEAPPEND
27 (p_clob,
28 LENGTH (friedman.encodedate (r.hiredate)),
29 friedman.encodedate (r.hiredate));
30 --------------------------------------------------------------
31 DBMS_LOB.WRITEAPPEND (p_clob, 11, '</hiredate>');
32 END LOOP;
33 END my_proc;
34 /
Procedure created.
SCOTT@orcl_11g> SHOW ERRORS
No errors.
SCOTT@orcl_11g> BEGIN
2 CTX_DDL.CREATE_PREFERENCE ('my_datastore', 'USER_DATASTORE');
3 CTX_DDL.SET_ATTRIBUTE ('my_datastore', 'PROCEDURE', 'my_proc');
4 END;
5 /
PL/SQL procedure successfully completed.
SCOTT@orcl_11g> CREATE INDEX my_index ON emp (ename)
2 INDEXTYPE IS CTXSYS.CONTEXT
3 PARAMETERS
4 ('DATASTORE my_datastore
5 SECTION GROUP CTXSYS.AUTO_SECTION_GROUP')
6 /
Index created.
SCOTT@orcl_11g> EXEC DBMS_STATS.GATHER_TABLE_STATS (USER, 'DEPT')
PL/SQL procedure successfully completed.
SCOTT@orcl_11g> EXEC DBMS_STATS.GATHER_TABLE_STATS (USER, 'EMP')
PL/SQL procedure successfully completed.
SCOTT@orcl_11g> VARIABLE cstring VARCHAR2(4000)
SCOTT@orcl_11g> BEGIN
2 :cstring := 'smith WITHIN ename';
3 :cstring := :cstring || ' AND ' || 'clerk WITHIN job';
4 :cstring := :cstring || ' AND ' || 'research WITHIN dname';
5 -- apply friedman algorithm to search criteria ---------------------------
6 friedman.init
7 (TO_NUMBER (TO_CHAR (TO_DATE (19000101, 'YYYYMMDD'), 'J')),
8 TO_NUMBER (TO_CHAR (TO_DATE (21001231, 'YYYYMMDD'), 'J')));
9 :cstring := :cstring || ' AND ((' ||
10 friedman.integercontainscriteria
11 (TO_NUMBER (TO_CHAR (TO_DATE ('19800315', 'YYYYMMDD'), 'J')),
12 TO_NUMBER (TO_CHAR (TO_DATE ('19810315', 'YYYYMMDD'), 'J')),
13 'B')
14 || ') WITHIN hiredate)';
15 ---------------------------------------------------------------------------
16 END;
17 /
PL/SQL procedure successfully completed.
SCOTT@orcl_11g> SET AUTOTRACE ON EXPLAIN
SCOTT@orcl_11g> SELECT *
2 FROM emp
3 WHERE CONTAINS (ename, :cstring) > 0
4 /
EMPNO ENAME JOB MGR HIREDATE SAL COMM DEPTNO
7369 SMITH CLERK 7902 17-DEC-80 800 20
Execution Plan
Plan hash value: 1887222286
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1 | 37 | 4 (0)| 00:00:01 |
| 1 | TABLE ACCESS BY INDEX ROWID| EMP | 1 | 37 | 4 (0)| 00:00:01 |
|* 2 | DOMAIN INDEX | MY_INDEX | | | 4 (0)| 00:00:01 |
Predicate Information (identified by operation id):
2 - access("CTXSYS"."CONTAINS"("ENAME",:CSTRING)>0)
SCOTT@orcl_11g>
-
Hi all
The current requirement we have is to load data from an Excel file into Oracle databases.
The Excel file has around 30 sheets, each corresponding to a table (meaning 30 tables in the database).
Currently we are using sqlldr commands to load data from CSV files into the Oracle database. The only problem is that sometimes the DBA has to go out or is busy working on something else, so may I kindly get some expert suggestions on how to automate this data-loading task?
Somebody has suggested to my manager (who is not aware of Oracle and IT at all) that data can be loaded via ODBC. It was suggested to him that all we need to do is place the CSV files in a particular folder on the server and Oracle will do the rest.
I am not that proficient in data-loading methods, so may I know any technique which will simplify/automate the task?
I mean, how can we automate it so that the SQL*Loader scripts run at the command prompt every time? I think the database (Oracle) has nothing to do with the command prompt, I feel. Isn't it so?
Kindly give your expert/experienced suggestions.
I would be highly grateful to all.
Regards
To automate sqlldr scripts, you would usually write an OS script file that periodically looks for files in a given directory and runs the sqlldr commands:
- on Windows: .bat or .wsh files to be scheduled with "at" command or windows scheduled tasks tool
- on Unix: a script shell in the crontab.
Much more complicated, but without any OS script:
- write a PL/SQL code to read, parse the file to be loaded using UTL_FILE
package
- this PL/SQL code must also generate INSERT statements and process errors ...
- this PL/SQL code can be scheduled with the DBMS_JOB package to run at periodic intervals.
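A minimal sketch of the DBMS_JOB option (load_csv_files is a hypothetical PL/SQL procedure that would contain the UTL_FILE parsing and INSERT logic described above):

```sql
VARIABLE jobno NUMBER
BEGIN
   DBMS_JOB.SUBMIT (
      job       => :jobno,
      what      => 'load_csv_files;',   -- PL/SQL block to execute; load_csv_files is hypothetical
      next_date => SYSDATE,             -- first run: immediately
      interval  => 'SYSDATE + 1/24'     -- re-evaluated after each run: hourly
   );
   COMMIT;                              -- the job queue only sees the job after commit
END;
/
PRINT jobno
```

On releases later than 9i, DBMS_SCHEDULER would be the usual choice instead.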
Message was edited by:
Pierre Forstmann
-
Looking for a proper analytical query/solution for below data
I have data as shown below:
mob_id status_code status_text media_date
39585 600 Online 04-Aug-09
54988 600 Online 05-Aug-09
39585 600 Online 05-Aug-09
39585 600 Online 06-Aug-09
39585 600 Online 07-Aug-09
39585 600 Online 08-Aug-09
39585 600 Online 09-Aug-09
39585 600 Online 10-Aug-09
39585 600 Online 11-Aug-09
39585 600 Online 12-Aug-09
39585 600 Online 13-Aug-09
39585 600 Online 14-Aug-09
39585 600 Online 15-Aug-09
39585 600 Online 16-Aug-09
39585 700 Deinstall 17-Aug-09
54988 600 Online 06-Aug-09
54988 600 Online 07-Aug-09
54988 600 Online 08-Aug-09
54988 600 Online 09-Aug-09
54988 600 Online 10-Aug-09
54988 600 Online 11-Aug-09
54988 600 Online 12-Aug-09
54988 600 Online 13-Aug-09
54988 600 Online 14-Aug-09
54988 600 Online 15-Aug-09
54988 600 Online 16-Aug-09
39585 600 Online 20-Aug-09
39585 600 Online 21-Aug-09
39585 600 Online 22-Aug-09
39585 600 Online 23-Aug-09
39585 600 Online 24-Aug-09
39585 600 Online 25-Aug-09
39585 700 Deinstall 26-Aug-09
39585 600 Online 27-Aug-09
39585 600 Online 28-Aug-09
39585 600 Online 29-Aug-09
39585 600 Online 30-Aug-09
39585 600 Online 31-Aug-09
39585 600 Online 01-Sep-09
39585 700 Deinstall 02-Sep-09
54988 600 Online 17-Aug-09
54988 600 Online 18-Aug-09
54988 600 Online 19-Aug-09
54988 600 Online 20-Aug-09
54988 600 Online 21-Aug-09
54988 600 Online 22-Aug-09
54988 600 Online 23-Aug-09
54988 600 Online 24-Aug-09
54988 600 Online 25-Aug-09
54988 700 Deinstall 26-Aug-09
69875 600 Online 20-Aug-09
69875 600 Online 21-Aug-09
69875 600 Online 22-Aug-09
69875 600 Online 23-Aug-09
69875 600 Online 24-Aug-09
69875 600 Online 25-Aug-09
69875 600 Online 26-Aug-09
Using the above data I need to produce the result set below. Can anyone help with this?
occurrnace_seq mob_id start_media_date end_media_date no_of_days
1 39585 04-Aug-09 17-Aug-09 13
2 39585 20-Aug-09 26-Aug-09 6
3 39585 27-Aug-09 02-Sep-09 6
1 54988 05-Aug-09 26-Aug-09 21
1 69875 20-Aug-09 null null
Here start_media_date can be found with status_code=600 & end_media_date can be found with status_code=700.
Please note that the mob_id starts multiple times.
Can anyone help me produce this result using SQL or PL/SQL?
Many thanks in advance.
Thanks
Guttiwas
guttis wrote:
Can I run this query on 70 million records? Does it raise any performance problems? If you have any idea, just throw out some possible suggestions to protect against such issues.
Well, you can certainly run it on 70 million records. How long it will run depends on your hardware, Oracle and OS settings. That said, there is a simpler solution:
select occurrenace_seq,
mob_id,
min(case grp when 'start-of-group' then media_date end) start_media_date,
max(case grp when 'end-of-group' then media_date end) end_media_date,
max(case grp when 'end-of-group' then media_date end) - min(case grp when 'start-of-group' then media_date end) no_of_days
from (
select t.*,
case
when status_text = 'Deinstall' then 'end-of-group'
when lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) = 'Deinstall' then 'start-of-group'
end grp,
sum(case status_text when 'Deinstall' then 1 else 0 end) over(partition by mob_id order by media_date) +
case lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) when 'Deinstall' then 1 else 0 end occurrenace_seq
from your_table t
)
where grp in ('start-of-group','end-of-group')
group by mob_id,
occurrenace_seq
order by mob_id,
occurrenace_seq
/
With your sample:
with t as (
select 39585 mob_id,600 status_code,'Online' status_text, to_date('04-Aug-09','dd-mon-yy') media_date from dual union all
select 54988,600,'Online',to_date('05-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('05-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('06-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('07-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('08-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('09-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('10-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('11-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('12-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('13-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('14-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('15-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('16-Aug-09','dd-mon-yy') from dual union all
select 39585,700,'Deinstall', to_date('17-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('06-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('07-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('08-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('09-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('10-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('11-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('12-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('13-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('14-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('15-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('16-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
select 39585,700,'Deinstall', to_date('26-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('27-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('28-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('29-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('30-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('31-Aug-09','dd-mon-yy') from dual union all
select 39585,600,'Online',to_date('01-Sep-09','dd-mon-yy') from dual union all
select 39585,700,'Deinstall', to_date('02-Sep-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('17-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('18-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('19-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
select 54988,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
select 54988,700,'Deinstall', to_date('26-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
select 69875,600,'Online',to_date('26-Aug-09','dd-mon-yy') from dual
)
select occurrenace_seq,
mob_id,
min(case grp when 'start-of-group' then media_date end) start_media_date,
max(case grp when 'end-of-group' then media_date end) end_media_date,
max(case grp when 'end-of-group' then media_date end) - min(case grp when 'start-of-group' then media_date end) no_of_days
from (
select t.*,
case
when status_text = 'Deinstall' then 'end-of-group'
when lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) = 'Deinstall' then 'start-of-group'
end grp,
sum(case status_text when 'Deinstall' then 1 else 0 end) over(partition by mob_id order by media_date) +
case lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) when 'Deinstall' then 1 else 0 end occurrenace_seq
from t
)
where grp in ('start-of-group','end-of-group')
group by mob_id,
occurrenace_seq
order by mob_id,
occurrenace_seq
OCCURRENACE_SEQ MOB_ID START_MED END_MEDIA NO_OF_DAYS
1 39585 04-AUG-09 17-AUG-09 13
2 39585 20-AUG-09 26-AUG-09 6
3 39585 27-AUG-09 02-SEP-09 6
1 54988 05-AUG-09 26-AUG-09 21
1 69875 20-AUG-09
SQL>
SY.
-
Hi All,
Following is the query that works for the aging condition round(st.close_dt - st.open_dt) > 60, producing the output column count(round(st.close_dt - st.open_dt)) ">60". But we need the same report to also have the columns:
count(round(st.close_dt - st.open_dt)) "<15"
count(round(st.close_dt - st.open_dt)) "16 to 30"
count(round(st.close_dt - st.open_dt)) "31 to 45"
count(round(st.close_dt - st.open_dt)) "45 to 60"
which is based on condition round(st.close_dt - st.open_dt) <15
round(st.close_dt - st.open_dt) between 16 and 31
round(st.close_dt - st.open_dt) between 31 and 45
round(st.close_dt - st.open_dt) between 46 and 60
But we can't use all these conditions at the same time; individually each one works fine.
Could you please let me know how we can use them all in one query, so that I can get the output for <15, 16 to 30, 31 to 45, 46 to 60 and >60 at the same time.
select count(round(st.close_dt - st.open_dt)) ">60"
from stat.csr_master_tbl m,
stat.customers c,
stat.proj_csrs p,
stat.appl_type ap,
stat.csr_type_cd tp,
stat.csr_status st,
stat.csr_status_cd sc,
stat.wrkflw_defn w
where m.cust_id = c.cust_id
and m.csr_id = p.csr_id (+)
and m.sd_cd = p.sd_cd (+)
and m.appl_cd = ap.appl_cd
and m.sd_cd = ap.sd_cd
and m.sd_cd = tp.sd_cd
and m.csr_type = tp.csr_type_cd
and m.sd_cd = st.sd_cd
and m.csr_id = st.csr_id
and st.csr_status_cd = sc.csr_status_cd
and st.sd_cd = sc.sd_cd
and tp.csr_type_cd = sc.csr_type_cd
and m.wrkflw_id = w.wrkflw_id
and (m.sd_cd in ('11i','AVA') or (m.sd_cd in ('STR') ))
and ( st.close_dt between to_date(:beg_date,'MM/DD/YYYY') and to_date(:end_date,'MM/DD/YYYY')
or st.closed_status_flag <> 'Y')
--and sc.descr <> 'Cancel'
AND sc.descr IN ('Close')
and p.proj_cd='OAPS'
and round(st.close_dt - st.open_dt) >60
and tp.descr in ('Break-Fix Datafix'
,'Break-Fix Configuration'
,'Break-Fix Development'
,'Vendor Patch')
and c.first_name in ('Springville'
,'Cookeville'
,'Bangalore'
,'Deer Park'
,'Sulphur Springs'
,'P11i'
,'Singapore'
,'Bangalore Chemical'
,'Baton Rouge'
,'Kaohsiung'
,'Bangalore Controls'
,'Damam'
,'Dayton Foundry'
,'Edmonton'
,'Jebel Ali'
,'Melbourne'
,'Philadelphia')
You might try something like
select
sum(case when round(st.close_dt - st.open_dt) < 15 then 1 else 0 end) "< 15", -- shouldn't it be <=15?
sum(case when round(st.close_dt - st.open_dt) between 16 and 30 then 1 else 0 end) "16 to 30",
...
and you should have posted this question in the PL/SQL forum.
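Spelled out, that suggestion becomes one query with a conditional count per bucket (a sketch only: reuse the table list, joins and filters from the posted query, minus the round(...) > 60 predicate; the bucket edges assume <=15 and 16 to 30 were intended):

```sql
select sum(case when round(st.close_dt - st.open_dt) <= 15             then 1 else 0 end) "<=15",
       sum(case when round(st.close_dt - st.open_dt) between 16 and 30 then 1 else 0 end) "16 to 30",
       sum(case when round(st.close_dt - st.open_dt) between 31 and 45 then 1 else 0 end) "31 to 45",
       sum(case when round(st.close_dt - st.open_dt) between 46 and 60 then 1 else 0 end) "46 to 60",
       sum(case when round(st.close_dt - st.open_dt) > 60              then 1 else 0 end) ">60"
  from stat.csr_status st   -- plus the other tables from the original FROM list
 where ...                  -- same join conditions and filters as the original query,
                            -- without the aging predicate
```

Because each CASE contributes to exactly one SUM, a single pass over the rows fills all five buckets at once.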
Edited by: UW (Germany) on 17.08.2012 12:15
-
QUERY NEED FOR TIME DIFFERENCE
Hi
I need a query to find the difference between time
for example
5.00 hours - 0.30 minutes = 4.30 hours
Thanks in Advance
Adina
Is this what you are looking for?
1 select to_date('05:00', 'hh24:mi') - 0.5/24 --Subtract ½ Hr from time
2* from dual
SQL> /
TO_DATE('05:
200804010430
1 select to_date('05:00', 'hh24:mi') - 3/24 --Subtract 3 Hrs from time
2* from dual
SQL> /
TO_DATE('05:
200804010200
1 select to_date('05:30', 'hh24:mi') + 0.5/24 --Adds ½ Hr to the time
2* from dual
SQL> /
TO_DATE('05:
200804010600
1 select to_date('20080416 00:02', 'yyyymmdd hh24:mi') - 0.5/24 --Subtract ½ Hr from time
2* from dual
SQL> /
TO_DATE('200
200804152332
Shailender Mehta
-
Time needed for automated data transfer - hypothetic
An EBS client interested in maintaining integration between their large customer-base back office and CRMOD has had past challenges with the overall size of the data they wish to integrate and the time it takes to do so. As such, the client wishes to understand, for a given data size (X GIGS), how much time it might take to, for instance, synchronize their data. Since they would not quantify their data size, I can only ask: how much data can be moved, and how fast? This is urgent. Any experienced help here using strategies such as batch integration leveraging web services, Cast Iron systems or PIP is appreciated!
Disclaimer
The Author of this posting offers the information contained within this posting without consideration and with the reader's understanding that there's no implied or expressed suitability or fitness for any purpose. Information provided is for informational purposes only and should not be construed as rendering professional advice of any kind. Usage of this posting's information is solely at reader's own risk.
Liability Disclaimer
In no event shall Author be liable for any damages whatsoever (including, without limitation, damages for loss of use, data or profit) arising out of the use or inability to use the posting's information even if Author has been advised of the possibility of such damage.
Posting
"And the solution was using an FTP client which was able to send files using multiple parallel threads."
Yep, although that assumes you have multiple files (for which it can work very well).
For a single large file, a variation of what Milan described, is using a utility that can slice and dice (and reassemble) a single file across multiple TCP flows.
"Another possibility would be using some WAN accelerators (like Riverbeds, e.g.) which would sent "local ACKs" to the clients on both sides and improve the efficiency of the WAN file transfer."
Yep, again, and such products can even "transfer" faster than the line rate. The latter is possible because they generally compress in-line and also use local caching (negating the need to send some data). One of their issues, their transfer rates can be very variable and inconsistent.