Aging Query - suppressing values across subsequent columns
Oracle 11.2.0.3...
I have this query... working on it in SQL*Navigator.
We pay all vendors with Net 20, Net 25, and Net 30 terms at due date + 15 days. I'm trying to develop a query that tells me who is due now, and who will be due in the next 7, next 10, next 14, and next 21 days.
The query below returns Not Due, Due, Due +7, etc., but anything that is due now also shows up in the +7, +10, etc. columns - because if it's due now, it will still be due in a week!
How can I suppress those values if the invoice is already due?
(Or, if it's due in +7 days, how can I suppress it in +10, +14, etc.?)
SELECT
pv.vendor_name "Vendor",
nps.invoice_number "Invoice",
nps.invoice_date "Invoice Date",
nps.due_date "Due Date",
inv.terms "Terms",
(TO_DATE (sysdate) - nps.invoice_date) "Days Entered",
(CASE
when to_date(nps.due_date) < to_date(sysdate)
then (to_date(sysdate) - to_date(nps.due_date))
else NULL
end) "Days Overdue",
nps.amount_remaining "Amount Remaining",
-- 0-15
(CASE
WHEN (TO_DATE (sysdate) - nps.invoice_date) < 45
THEN nps.amount_remaining
ELSE NULL
END
) "Not Due",
(CASE
WHEN (TO_DATE (sysdate) - nps.invoice_date) >= 45
THEN nps.amount_remaining
ELSE NULL
END
) "Due" ,
(CASE
WHEN (TO_DATE (sysdate+7) - nps.invoice_date) >= 45
THEN nps.amount_remaining
ELSE NULL
END
) "+7 Days" ,
(CASE
WHEN (TO_DATE (sysdate+10) - nps.invoice_date) >= 45
THEN nps.amount_remaining
ELSE NULL
END
) "+10 Days" ,
(CASE
WHEN (TO_DATE (sysdate+14) - nps.invoice_date) >= 45
THEN nps.amount_remaining
ELSE NULL
END
) "+14 Days" ,
(CASE
WHEN (TO_DATE (sysdate+21) - nps.invoice_date) >= 45
THEN nps.amount_remaining
ELSE NULL
END
) "+21 Days"
FROM
inv,
pv,
nps
WHERE ...
and nps.amount_remaining <> 0
and inv.terms in ('Net 20', 'Net 25', 'Net 30')
Sample output:
Vendor Invoice Invoice Date Due Date Terms Days Entered Days Overdue Amount Remaining Not Due Due +7 Days +10 Days +14 Days +21 Days
vendor 1 00470871 12/27/2012 1/26/2013 Net 30 12 126.62 126.62
vendor 2 59355648 11/28/2012 12/28/2012 Net 30 41 11 538.75 538.75 538.75 538.75 538.75 538.75
vendor 3 75793062 12/4/2012 1/3/2013 Net 30 35 5 950 950 950 950 950
vendor 4 52835 12/13/2012 1/13/2013 Net 30 26 298.92 298.92 298.92
vendor 4 52814 12/4/2012 1/3/2013 Net 30 35 5 330 330 330 330 330
any sql gurus have ideas?
Edited by: camforbes on Jan 8, 2013 1:24 PM
Hi,
It's very unclear what you want, but I think something more or less like this will do it:
WITH got_grp AS
(
    SELECT pv.vendor_name
    , nps.invoice_number
    , nps.invoice_date
    , nps.due_date
    , inv.terms
    , SYSDATE - nps.invoice_date AS days_entered
    , CASE
          WHEN nps.due_date < SYSDATE
          THEN SYSDATE - nps.due_date
      END AS days_overdue
    , nps.amount_remaining
    , CASE  -- each invoice becomes due 45 days after invoice_date (Net 30 + 15)
          WHEN nps.invoice_date <= SYSDATE - 45 THEN 'Due'
          WHEN nps.invoice_date <= SYSDATE - 38 THEN '+7 Days'
          WHEN nps.invoice_date <= SYSDATE - 35 THEN '+10 Days'
          WHEN nps.invoice_date <= SYSDATE - 31 THEN '+14 Days'
          WHEN nps.invoice_date <= SYSDATE - 24 THEN '+21 Days'
          ELSE 'Not Due'
      END AS grp
    FROM inv
    JOIN pv  ON ...
    JOIN nps ON ...
    WHERE nps.amount_remaining != 0
    AND   inv.terms IN ('Net 20', 'Net 25', 'Net 30')
)
SELECT vendor_name
, invoice_number
, invoice_date
, due_date
, terms
, days_entered
, days_overdue
, CASE WHEN grp = 'Not Due'  THEN amount_remaining END AS not_due
, CASE WHEN grp = 'Due'      THEN amount_remaining END AS due
, CASE WHEN grp = '+7 Days'  THEN amount_remaining END AS plus_7_days
, CASE WHEN grp = '+10 Days' THEN amount_remaining END AS plus_10_days
, CASE WHEN grp = '+14 Days' THEN amount_remaining END AS plus_14_days
, CASE WHEN grp = '+21 Days' THEN amount_remaining END AS plus_21_days
FROM got_grp
;
Because grp assigns each invoice to exactly one bucket, an amount can appear in only one of the six columns.
Whenever you have a question, please post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all the tables involved, and the results you want from that data.
If the results depend on when the query is run, give an exact run time, or even better, give a couple of different run times and the results you want from the same sample data for each one.
Explain, using specific examples, how you get those results from that data.
Always say what version of Oracle you're using (e.g. 11.2.0.2.0).
See the forum FAQ {message:id=9360002}
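As a sanity check on the bucket arithmetic (assuming, as above, that every invoice becomes due 45 days after its invoice date, i.e. Net 30 terms plus the 15-day grace period), the cut-offs can be sketched in Python with hypothetical dates:

```python
from datetime import date

def aging_bucket(invoice_date: date, today: date) -> str:
    """Mirror of the CASE expression: the due date is invoice_date + 45 days."""
    age = (today - invoice_date).days
    if age >= 45:
        return 'Due'        # due date has already passed
    if age >= 38:
        return '+7 Days'    # becomes due within the next 7 days
    if age >= 35:
        return '+10 Days'
    if age >= 31:
        return '+14 Days'
    if age >= 24:
        return '+21 Days'
    return 'Not Due'

today = date(2013, 1, 8)
print(aging_bucket(date(2012, 11, 28), today))  # 41 days old -> '+7 Days'
print(aging_bucket(date(2012, 12, 27), today))  # 12 days old -> 'Not Due'
```

An invoice lands in exactly one bucket, which is what keeps already-due amounts out of the later columns.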
Similar Messages
-
How do I search for common values across multiple columns?
I am coordinating a schedule with 5 people across hundreds of dates, and have columns A-E filled with many rows of dates. How can I make a new column that displays all the dates (values) that each person (column) has in common with all the others?
Is there a simple formula for this?
Thanks!
Scarampella,
A second table can be used to find your matching dates.
Here's an example:
The formula in Matching Dates is:
=IF(ISERROR(MATCH(A,Table 1 :: A, 0)+MATCH(A,Table 1 :: B, 0)+MATCH(A,Table 1 :: C, 0)+MATCH(A,Table 1 :: D, 0)+MATCH(A,Table 1 :: E, 0)), "", A)
Basically, I look for matches in each person's list of dates; if any list fails to produce a match for the date being examined, the result is a miss, and if all of them match, it's a hit. You can sort the result to get a short list of matches without gaps.
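Outside of Numbers, the underlying idea is a plain set intersection; a sketch with made-up dates:

```python
# Each person's scheduled dates (hypothetical sample data).
schedules = {
    "Ann":  {"2013-01-10", "2013-01-12", "2013-01-15"},
    "Bob":  {"2013-01-10", "2013-01-15", "2013-01-20"},
    "Cara": {"2013-01-10", "2013-01-15"},
}

# Dates every person has in common: intersect all the sets at once.
common = set.intersection(*schedules.values())
print(sorted(common))  # ['2013-01-10', '2013-01-15']
```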
Regards,
Jerry -
How to split values across the columns
Hi,
I have a dataset as follow
<?xml version="1.0" encoding="UTF-8" ?>
- <ROWSET>
- <ROW>
<fm_firmnum>0054</fm_firmnum>
</ROW>
- <ROW>
<fm_firmnum>0053</fm_firmnum>
</ROW>
- <ROW>
<fm_firmnum>0055</fm_firmnum>
</ROW>
</ROWSET>
Now, I have created an RTF template and would like to make the table look like the following under these circumstances:
- If the row count mod 2 = 1 (I don't know how to do a row count in BIP), put the value in the left column of the table.
- If the row count mod 2 = 0, put it in the right column of the table.
Hence the table should look like this as the outcome
fm_firmnum | fm_firmnum
0054 | 0053
0055 |
Any idea on how to do it? Many thx
Rgds,
Raymond
Hi Raymond,
The row count can be obtained with the position() function, so your logic can be implemented as follows:
Col1 Col2
<?if:position() mod 2 =1?><fm_firmnum><?end if?> <?if:position() mod 2 =0?><fm_firmnum><?end if?>
Keep this on one form field <?if:position() mod 2 =1?><fm_firmnum><?end if?>
and keep this in 2nd form field <?if:position() mod 2 =0?><fm_firmnum><?end if?>
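The effect of the two position() tests can be sketched in Python (1-based positions, as in XPath):

```python
rows = ["0054", "0053", "0055"]  # fm_firmnum values in document order

# Odd positions go to the left column, even positions to the right.
left  = [v for i, v in enumerate(rows, start=1) if i % 2 == 1]
right = [v for i, v in enumerate(rows, start=1) if i % 2 == 0]

# Pair them up row by row; the last left cell may have no partner.
for l, r in zip(left, right + [""] * (len(left) - len(right))):
    print(f"{l} | {r}".rstrip())
# 0054 | 0053
# 0055 |
```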
Best Regards,
Mahi
Edited by: BIP Learner on Jul 19, 2009 10:44 AM -
SAP Web Intelligence -Comparing subsequent columns in cross tab
Hello,
I am new in SAP web intelligence , I need formula (Not a SQL Query) to find difference of values between subsequent columns in cross tab. Please find attached excel for more details
Thanks & Regards
Try something like in the screenshot below.
Create a variable:
Diff Amount2=Previous([Amount2])-[Amount2]
~Anuj -
How to return Parameters values as a column in a query?
Hi All,
I have number of parameters in a report.
I need to return its interred values as a column in a query to use this column in a chart.
I want this column to be the second one in this query:
SELECT ROWNUM
FROM ALL_OBJECTS
WHERE ROWNUM <= 10
That is if there is any way to use these parameters directly as a column in the chart no need for the previous statement.
Note: I am using Reports 6i.
Dear sir,
You can include a parameter as a column in the query, like this:
select :parameter p1, &lexical_parameter p2, empcode
from emps;
That query makes the parameters into query columns.
The &lexical_parameter can refer to a database column of the emps table,
and :parameter can refer to a static string.
Query to get combinations of column values
-- Input Data
with t as (select 'H' as symbol, 1 as id from dual union all
           select 'H' as symbol, 2 as id from dual union all
           select 'H' as symbol, 3 as id from dual union all
           select 'H' as symbol, 4 as id from dual)
,    s as (select 'L' as symbol, 1 as id from dual union all
           select 'L' as symbol, 2 as id from dual union all
           select 'L' as symbol, 3 as id from dual union all
           select 'L' as symbol, 4 as id from dual)
--select * from t,s
-- Required output is four columns with the values below. The column headings are the ids from the original query, so I suspect the solution involves some sort of crosstab.
id1---id2---id3---id4
HHHH
HHHL
HHLH
HHLL
HLHH
HLHL
HLLH
HLLL
LHHH
LHHL
LHLH
LHLL
LLHH
LLHL
LLLH
LLLL

select REPLACE(sys_connect_by_path(symbol, '-'), '-', '') "id1---id2---id3---id4"
from (select * from t union all
      select * from s)
where connect_by_isleaf = 1
start with id = 1
connect by prior id = id - 1
ORDER BY 1
I think you want this:
select substr(regexp_substr(scbp, '-[^-]*', 1, 1), 2) as ID1,
       substr(regexp_substr(scbp, '-[^-]*', 1, 2), 2) as ID2,
       substr(regexp_substr(scbp, '-[^-]*', 1, 3), 2) as ID3,
       substr(regexp_substr(scbp, '-[^-]*', 1, 4), 2) as ID4
from (
      select sys_connect_by_path(symbol, '-') scbp
      from (select * from t union all
            select * from s)
      where connect_by_isleaf = 1
      start with id = 1
      connect by prior id = id - 1
     )
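For reference, the sixteen requested rows are simply the Cartesian product of {H, L} taken four times, which is a two-liner in Python:

```python
from itertools import product

# All 4-character combinations of the symbols H and L (2**4 = 16 rows).
combos = [''.join(p) for p in product('HL', repeat=4)]
print(len(combos))            # 16
print(combos[0], combos[-1])  # HHHH LLLL
```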
Edited by: user12108669 on Nov 12, 2009 9:42 PM
Edited by: user12108669 on Nov 12, 2009 9:43 PM -
How to show the VALUE as the Column Header using SQL query?
Hi
I have a requirement to show the picked value as the column header using SQL query.
Example:
======
SELECT EMPNO FROM EMP
WHERE EMPNO=7934;
Result Should be:
7934
7934
In SQL*Plus you can do:
SQL> set verify on
SQL> def e = 7934
old: SELECT empno "&&e" FROM emp WHERE empno = &&e
new: SELECT empno "7934" FROM emp WHERE empno = 7934
SQL> SELECT empno "7934" FROM emp WHERE empno = 7934
7934
7934
1 row selected. -
Query to delete row where column value contains alphabets
Hi,
Could anyone please help me to get this query work.
Query to delete row where column value contains alphabets.
DELETE FROM BIN_ITEM WHERE order_nmb LIKE '%[A-Z]%' || LIKE '%[a-z]%'
Thanks and Regards,
Deekay.
RaminHashimzadeh wrote:
SELECT order_nmb FROM BIN_ITEM WHERE regexp_count(order_nmb,'[0-9]') = 0
Ramin Hashimzade
But that won't reject strings like 'gfgG%dgh', which aren't purely alphabetic.
Try:
with test_data as (
    select 'ghTYJYEhdfe' str from dual union all
    select 'dfF5ssd'         from dual union all
    select 'rgth*dgheh'      from dual union all
    select 'ggf{'            from dual union all
    select 'rwhrhrh'         from dual
)
select *
from test_data
where regexp_instr(str, '[^[:alpha:]]') = 0;
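The same purely-alphabetic test, sketched in Python over the sample strings (str.isalpha() would also work, though it accepts any Unicode letter, while [^A-Za-z] mirrors the ASCII range):

```python
import re

def is_pure_alpha(s: str) -> bool:
    """True when s contains no character outside A-Z / a-z."""
    return re.search(r'[^A-Za-z]', s) is None

rows = ['ghTYJYEhdfe', 'dfF5ssd', 'rgth*dgheh', 'ggf{', 'rwhrhrh']
print([s for s in rows if is_pure_alpha(s)])  # ['ghTYJYEhdfe', 'rwhrhrh']
```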
Concatenate a column value across multiple rows - PDW
We are using PDW based on SQL2014. We require an efficient logic on how to concatenate a column value across multiple rows. We have the following table
T1
(CompanyID, StateCD)
Having following rows:
1 NY
1 NJ
1 CT
2 MA
2 NJ
2 VA
3 FL
3 CA
We need a code snippet which will return the following result set:
1   CT,NJ,NY
2   MA,NJ,VA
3   CA,FL
We have tried the built-in function STUFF with the FOR XML PATH clause, but it is not supported in PDW, so we need a fast alternative.
Hi, try this:
SELECT * INTO #ABC
FROM
(
SELECT 1 AS ID,'NY' AS NAME
UNION
SELECT 1 AS ID,'NJ' AS NAME
UNION
SELECT 1 AS ID,'CT' AS NAME
UNION
SELECT 2 AS ID,'MA' AS NAME
UNION
SELECT 2 AS ID,'NJ' AS NAME
UNION
SELECT 2 AS ID,'VA' AS NAME
UNION
SELECT 3 AS ID,'FL' AS NAME
UNION
SELECT 3 AS ID,'CA' AS NAME
)A
CREATE TABLE ##CDB (ID INT, NAME NVARCHAR(800))
DECLARE @TMP VARCHAR(MAX),
@V_MIN INT,
@V_MAX INT,
@V_COUNT INT
SELECT @V_MIN=MIN(ID),@V_MAX=MAX(ID) FROM #ABC
SET @V_COUNT=@V_MIN
WHILE @V_COUNT<=@V_MAX
BEGIN
SET @TMP = '' SELECT @TMP = @TMP + CONVERT(VARCHAR,NAME) + ', ' FROM #ABC
WHERE ID=@V_COUNT
INSERT INTO ##CDB (ID, NAME) SELECT @V_COUNT AS ID ,CAST(SUBSTRING(@TMP, 0, LEN(@TMP)) AS VARCHAR(8000)) AS NAME
SET @V_COUNT=@V_COUNT+1
END
SELECT * FROM ##CDB
OR
SELECT * INTO #ABC
FROM
(
SELECT 1 AS ID,'NY' AS NAME
UNION
SELECT 1 AS ID,'NJ' AS NAME
UNION
SELECT 1 AS ID,'CT' AS NAME
UNION
SELECT 2 AS ID,'MA' AS NAME
UNION
SELECT 2 AS ID,'NJ' AS NAME
UNION
SELECT 2 AS ID,'VA' AS NAME
UNION
SELECT 3 AS ID,'FL' AS NAME
UNION
SELECT 3 AS ID,'CA' AS NAME
UNION
SELECT 5 AS ID,'LG' AS NAME
UNION
SELECT 5 AS ID,'AP' AS NAME
)A
CREATE TABLE ##CDB (ID INT, NAME NVARCHAR(800))
DECLARE @TMP VARCHAR(MAX),
@V_MIN INT,
@V_MAX INT,
@V_COUNT INT
SELECT @V_MIN=MIN(ID),@V_MAX=MAX(ID) FROM #ABC
SET @V_COUNT=@V_MIN
WHILE @V_COUNT<=@V_MAX
BEGIN
SET @TMP = '' SELECT @TMP = @TMP + CONVERT(VARCHAR,NAME) + ', ' FROM #ABC
WHERE ID=@V_COUNT
SELECT @V_COUNT AS ID ,CAST(SUBSTRING(@TMP, 0, LEN(@TMP)) AS VARCHAR(8000)) AS NAME INTO #TEMP
INSERT INTO ##CDB (ID, NAME) SELECT ID, NAME FROM #TEMP WHERE NAME<>''
DROP TABLE #TEMP
SET @V_COUNT=@V_COUNT+1
END
SELECT * FROM ##CDB
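For comparison, the intended aggregation (group by ID and comma-join the sorted values) takes only a few lines in Python:

```python
from itertools import groupby

rows = [(1, 'NY'), (1, 'NJ'), (1, 'CT'),
        (2, 'MA'), (2, 'NJ'), (2, 'VA'),
        (3, 'FL'), (3, 'CA')]

# Sort by (id, state) so groupby sees contiguous groups and the states
# come out alphabetically, matching the requested output.
merged = {cid: ','.join(state for _, state in grp)
          for cid, grp in groupby(sorted(rows), key=lambda r: r[0])}

for cid, states in merged.items():
    print(cid, states)
# 1 CT,NJ,NY
# 2 MA,NJ,VA
# 3 CA,FL
```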
Thanks Shiven:) If Answer is Helpful, Please Vote -
Oracle SP for comparing 80 column values across 8 table pairs in 2 diff DBs
Hi All,
I have an Oracle SP for comparing 80 column values across 8 table pairs in 2 diff DBs.
However, it is taking a very long time, around 6 hours, to process 10,000 records.
Can anyone suggest how to fine-tune this?
Thanks guys.
Tables prefixed with X are the temp tables that store data from DB-A.
The report will be originally based on DB-B, so DB Links will not be required for @PROD1.WORLD tables.
This is a test region, so I have pointed to @PROD1.WORLD to test with Prod Data.
SEC_COMPARE_CONFIG is the config table containing the table_name to be reported, corresponding temp tables to store the data and the columns on which it is to be reported.
There are 8 tables in total, 90 config rows, and 8 temp tables.
SPOKE_TO_HUB_SEC_MTCH_TBL records the securities on which it is to be reported.
HIST_DATA_COMPARE_TBL is the final results table.
Here is the entire code:
CREATE OR REPLACE PACKAGE SECURITY_COMPARE AS
PROCEDURE PROCESS_RECORDS (IN_EFFECTIVE_DATE IN DATE,
IN_PRIMARY_ASSET_ID IN VARCHAR2 DEFAULT NULL);
PROCEDURE IDENTIFY_SECURITIES ( P_EFFECTIVE_DATE IN DATE,
P_PRIMARY_ASSET_ID IN VARCHAR2 DEFAULT NULL);
PROCEDURE RETREIVE_RECORDS_FROM_SPOKE;
PROCEDURE COMPARE_RECORDS(p_err_msg OUT VARCHAR2);
PROCEDURE INSERT_DATA_TO_TABLE ( v_target_table VARCHAR2, v_sql_to_run VARCHAR2, v_commit_after NUMBER);
END SECURITY_COMPARE;
CREATE OR REPLACE PACKAGE BODY SECURITY_COMPARE AS
/*Declared String for recording Dynamic SQL's*/
LC_SQL VARCHAR2 (20000);
PROCEDURE PROCESS_RECORDS(IN_EFFECTIVE_DATE IN DATE,
IN_PRIMARY_ASSET_ID IN VARCHAR2 DEFAULT NULL)
AS
L_EFF_DATE DATE;
L_PRIMARY_ASSET_ID VARCHAR2(100);
k_err_msg VARCHAR2(100); --Error message displayed in case of NO discretionary records found.
BEGIN
L_EFF_DATE := IN_EFFECTIVE_DATE;
L_PRIMARY_ASSET_ID := IN_PRIMARY_ASSET_ID;
IDENTIFY_SECURITIES(L_EFF_DATE,L_PRIMARY_ASSET_ID); --Calling the Identify_Securities procedure to identify the securities older by 90 days from report effective date
RETREIVE_RECORDS_FROM_SPOKE(); --Retreiving the historic records from the security tables into temporary tables.
COMPARE_RECORDS(p_err_msg=>k_err_msg); --Compare the records and report the discrepancies into the HIST_DATA_COMPARE_TBL table
END PROCESS_RECORDS;
PROCEDURE IDENTIFY_SECURITIES(P_EFFECTIVE_DATE IN DATE,
P_PRIMARY_ASSET_ID IN VARCHAR2 DEFAULT NULL)
AS
P_EFF_DATE DATE; --Effective Date of the report
P_PRIMARY_ID VARCHAR2(100); --Primary AssetID which is used to search based on specific security
v_target_table VARCHAR2(500); --Variable indicating the Target table for inserting the data
v_sql_to_run VARCHAR2(5000); --Variable to store the Dynamic SQL to be executed
v_commit_after NUMBER; --Variable to define after how many records is COMMIT to be done
BEGIN
LC_SQL :='';
P_EFF_DATE := P_EFFECTIVE_DATE;
P_PRIMARY_ID := P_PRIMARY_ASSET_ID;
/*Deleting Old Entries from SPOKE_TO_HUB_SEC_MTCH_TBL table*/
LC_SQL := 'TRUNCATE TABLE SPOKE_TO_HUB_SEC_MTCH_TBL';
EXECUTE IMMEDIATE LC_SQL;
IF(P_PRIMARY_ID is NULL) --In case records do not need to be identified on basis of specific security
THEN
/*Identify Securities older by 90days from report effective date*/
v_target_table := ' SPOKE_TO_HUB_SEC_MTCH_TBL';
v_sql_to_run := 'WITH T AS ('||
' SELECT R.PRIMARY_ASSET_ID PRIMARY_ASSET_ID_R,'||
' R.SECURITY_ALIAS SECURITY_ALIAS_R,'||
' R.LAST_HELD_DATE LAST_HELD_DATE_R,'||
' R.PREV_HELD_DATE PREV_HELD_DATE_R,'||
' Q.PRIMARY_ASSET_ID PRIMARY_ASSET_ID_Q,'||
' Q.SECURITY_ALIAS SECURITY_ALIAS_Q,'||
' COUNT(*) OVER(PARTITION BY Q.PRIMARY_ASSET_ID) CNT'||
' FROM [email protected] R,'||
' [email protected] Q'||
' WHERE SYS_OP_MAP_NONNULL(R.last_held_date) <> '||q'!'FF'!'||
' and ceil(R.last_held_date-to_date('||''''||P_EFF_DATE||''''||')) >= 0'||
' and ceil(R.last_held_date-to_date('||''''||P_EFF_DATE||''''||')) <= 60'||
' and R.PRIMARY_ASSET_ID=Q.PRIMARY_ASSET_ID'||
' )'||
' SELECT PRIMARY_ASSET_ID_R,'||
' SECURITY_ALIAS_R,'||
' LAST_HELD_DATE_R,'||
' PREV_HELD_DATE_R,'||
' PRIMARY_ASSET_ID_Q,'||
' SECURITY_ALIAS_Q'||
' FROM T'||
' WHERE CNT =1';
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
ELSE
v_target_table := ' SPOKE_TO_HUB_SEC_MTCH_TBL';
v_sql_to_run := 'WITH T AS ('||
' SELECT R.PRIMARY_ASSET_ID PRIMARY_ASSET_ID_R,'||
' R.SECURITY_ALIAS SECURITY_ALIAS_R,'||
' R.LAST_HELD_DATE LAST_HELD_DATE_R,'||
' R.PREV_HELD_DATE PREV_HELD_DATE_R,'||
' Q.PRIMARY_ASSET_ID PRIMARY_ASSET_ID_Q,'||
' Q.SECURITY_ALIAS SECURITY_ALIAS_Q,'||
' COUNT(*) OVER(PARTITION BY Q.PRIMARY_ASSET_ID) CNT'||
' FROM [email protected] R,'||
' [email protected] Q'||
' where R.PRIMARY_ASSET_ID='||''''||P_PRIMARY_ID||''''||
' and R.PRIMARY_ASSET_ID=Q.PRIMARY_ASSET_ID'||
' )'||
' SELECT PRIMARY_ASSET_ID_R,'||
' SECURITY_ALIAS_R,'||
' LAST_HELD_DATE_R,'||
' PREV_HELD_DATE_R,'||
' PRIMARY_ASSET_ID_Q,'||
' SECURITY_ALIAS_Q'||
' FROM T'||
' WHERE CNT =1';
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
END IF;
LC_SQL := 'TRUNCATE TABLE HIST_DATA_COMPARE_TBL';
EXECUTE IMMEDIATE LC_SQL;
END IDENTIFY_SECURITIES;
PROCEDURE RETREIVE_RECORDS_FROM_SPOKE
AS
v_target_table VARCHAR2(500);
v_sql_to_run VARCHAR2(5000);
v_commit_after NUMBER;
BEGIN
LC_SQL :='';
LC_SQL:= 'TRUNCATE TABLE X_SECMASTER_HISTORY_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_SEC_MASTER_DTL_HIST_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_SECMASTER_DTL_EXT_HST_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_EQUITY_HIST_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_EQUITY_DETAIL_HIST_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_FIXED_INCOME_HIST_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_FIXED_INCOME_DTL_EXT_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_DERIVATIVES_HIST_TBL';
EXECUTE IMMEDIATE LC_SQL;
/*SECMASTER_HISTORY*/
v_target_table := 'X_SECMASTER_HISTORY_TBL';
v_sql_to_run := ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*SECURITY_MASTER_DETAIL_HIST*/
v_target_table := 'X_SEC_MASTER_DTL_HIST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*SECMASTER_DETAIL_EXT_HIST*/
v_target_table := 'X_SECMASTER_DTL_EXT_HST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*EQUITY_HIST*/
v_target_table := 'X_EQUITY_HIST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*EQUITY_DETAIL_HIST*/
v_target_table := 'X_EQUITY_DETAIL_HIST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*FIXED_INCOME_HIST*/
v_target_table := 'X_FIXED_INCOME_HIST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*FIXED_INCOME_DETAIL_EXT_HIST*/
v_target_table := 'X_FIXED_INCOME_DTL_EXT_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*DERIVATIVES_HIST*/
v_target_table := 'X_DERIVATIVES_HIST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
END RETREIVE_RECORDS_FROM_SPOKE;
PROCEDURE COMPARE_RECORDS(p_err_msg OUT VARCHAR2)
AS
l_count NUMBER;
l_err_msg VARCHAR2(100);
TYPE T_SECURITIES is TABLE of HIST_DATA_COMPARE_TBL%rowtype;
ttype T_SECURITIES;
CURSOR C1
IS
SELECT TABLE_NAME, TEMP_TABLE, COLUMN_NAME from SEC_COMPARE_CONFIG
where column_name='EFFECTIVE_DATE';
CURSOR C2
IS
SELECT * FROM SEC_COMPARE_CONFIG where id <=82;
C_REC SEC_COMPARE_CONFIG%rowtype;
BEGIN
LC_SQL :='';
p_err_msg :='';
FOR C_REC in C1
loop
LC_SQL:= ' SELECT /*+DRIVING_SITE(B)*/ /*+PARALLEL(A,100)*/ B.SECURITY_ALIAS, to_char(C.SPOKE_PAID), A.SECURITY_ALIAS,to_char(C.HUB_PAID),'||''''||C_REC.TABLE_NAME||''''||','||q'!'EFFECTIVE_DATE'!'||','||
' NVL((cast(B.'||C_REC.COLUMN_NAME||' as VARCHAR2(100))),'||q'!'No Records Found'!'||'),'||
' NVL((cast(A.'||C_REC.COLUMN_NAME||' as VARCHAR2(100))),'||q'!'No Records Found'!'||')'||
' FROM '||C_REC.TEMP_TABLE||' A, SECURITYDBO.'||C_REC.TABLE_NAME ||'@PROD1.WORLD B,'||
' SPOKE_TO_HUB_SEC_MTCH_TBL C'||
' WHERE A.SRC_INTFC_INST=140'||
' AND B.SRC_INTFC_INST=140'||
' AND A.SECURITY_ALIAS=C.spoke_sec'||
' and b.security_alias=C.HUB_SEC'||
' AND a.effective_date <> (select max(h.effective_date) from SECURITYDBO.'||C_REC.TABLE_NAME||'@PROD1.WORLD H'||
' where h.security_alias=c.hub_sec and h.src_intfc_inst=140 )';
EXECUTE IMMEDIATE LC_SQL BULK COLLECT into ttype;
FORALL x in ttype.First..ttype.Last
insert into HIST_DATA_COMPARE_TBL values ttype(x);
commit;
end loop;
For C_REC in C2
loop
LC_SQL:= ' SELECT /*+DRIVING_SITE(B)*/ /*+PARALLEL(A,100)*/ B.SECURITY_ALIAS, to_char(C.SPOKE_PAID), A.SECURITY_ALIAS,to_char(C.HUB_PAID),'||''''||C_REC.TABLE_NAME||''''||','||''''||C_REC.COLUMN_NAME||''''||','||
' NVL((cast(B.'||C_REC.COLUMN_NAME||' as VARCHAR2(100))),'||q'!'No Records Found'!'||'),'||
' NVL((cast(A.'||C_REC.COLUMN_NAME||' as VARCHAR2(100))),'||q'!'No Records Found'!'||')'||
' FROM '||C_REC.TEMP_TABLE||' A, SECURITYDBO.'||C_REC.TABLE_NAME ||'@PROD1.WORLD B,'||
' SPOKE_TO_HUB_SEC_MTCH_TBL C'||
' WHERE A.SRC_INTFC_INST=140'||
' AND B.SRC_INTFC_INST=140'||
' AND A.SECURITY_ALIAS=C.spoke_sec'||
' and b.security_alias=C.HUB_SEC'||
' and b.effective_date=a.effective_date'||
' AND NVL((cast(A.'||C_REC.column_name||' as VARCHAR2(100))),'||q'!'No Records Found'!'||') <>'||
' NVL((cast(B.'||C_REC.column_name||' as VARCHAR2(100))),'||q'!'No Records Found'!'||')';
EXECUTE IMMEDIATE LC_SQL BULK COLLECT into ttype;
FORALL x in ttype.First..ttype.Last
insert into HIST_DATA_COMPARE_TBL values ttype(x);
commit;
end loop;
BEGIN
select count(*) into l_count from HIST_DATA_COMPARE_TBL;
if (l_count = 0) then
p_err_msg := 'No records found';
end if;
END;
END COMPARE_RECORDS;
/*
NAME: INSERT_DATA_TO_TABLE
DESCRIPTION: This procedure inserts records into the target table based on the data fetched by the SQL in v_sql_to_run.
It also uses v_commit_after, which defines after how many records the inserts are committed.
*/
PROCEDURE INSERT_DATA_TO_TABLE ( v_target_table VARCHAR2,
v_sql_to_run VARCHAR2,
v_commit_after NUMBER) IS
v_limit_sql1 VARCHAR2(300) := ' ';
v_limit_sql2 VARCHAR2(900) := ' ';
v_plsql_to_run VARCHAR2(32767);
BEGIN
IF NVL(v_commit_after,0) <> 0 THEN
v_limit_sql1:= ' LIMIT ' || TO_CHAR(v_commit_after) ;
v_limit_sql2:= ' IF MOD(v_number_of_rows, ' || TO_CHAR(v_commit_after) || ' ) = 0 THEN ' ||
' COMMIT; ' ||
' END IF; ' ;
END IF;
v_plsql_to_run:= ' ' ||
'DECLARE ' ||
' v_number_of_rows number:=0; ' ||
' ' ||
' TYPE MyType IS REF CURSOR; ' ||
' CV MyType; ' ||
' TYPE RecTyp IS TABLE OF ' || v_target_table || '%ROWTYPE; ' ||
' rec RecTyp; ' ||
' ' ||
'BEGIN ' ||
' ' ||
'OPEN CV FOR ' ||
' ' || REPLACE( v_sql_to_run, ';', ' ' ) || ' ; ' ||
' LOOP ' ||
' FETCH CV BULK COLLECT INTO rec ' || v_limit_sql1 || '; ' ||
' FORALL i IN 1..rec.COUNT ' ||
' INSERT /*+ APPEND */ INTO ' || v_target_table || ' VALUES rec(i); ' ||
' v_number_of_rows := v_number_of_rows + SQL%ROWCOUNT; ' ||
' ' || v_limit_sql2 || ' ' ||
' EXIT WHEN CV%NOTFOUND; ' ||
' ' ||
' END LOOP; ' ||
' COMMIT; ' ||
' CLOSE CV; ' ||
'END; ';
EXECUTE IMMEDIATE v_plsql_to_run;
COMMIT;
END INSERT_DATA_TO_TABLE;
END SECURITY_COMPARE; -
DAX: Sum unique values across columns
Hello,
How do I sum unique values of "Yes" across my columns in DAX?
It should look like this, with [Total Yes] being my calculated column:
Name   January   February   March   Total Yes
Bob    Yes       Yes        No      2
Jim    No        Yes        No      1
Mark   No        No         Yes     1
Thanks!
~UG
Simply this:
=IF([January]="Yes",1,0)+IF([February]="Yes",1,0)+IF([March]="Yes",1,0)
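The same row-wise count, sketched in Python over the sample table:

```python
# Hypothetical rows mirroring the sample table.
rows = [
    {"Name": "Bob",  "January": "Yes", "February": "Yes", "March": "No"},
    {"Name": "Jim",  "January": "No",  "February": "Yes", "March": "No"},
    {"Name": "Mark", "January": "No",  "February": "No",  "March": "Yes"},
]

months = ["January", "February", "March"]
for r in rows:
    # Count how many month columns hold "Yes" for this row.
    r["Total Yes"] = sum(1 for m in months if r[m] == "Yes")
    print(r["Name"], r["Total Yes"])
# Bob 2
# Jim 1
# Mark 1
```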
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
Function o know the value of nth column of a query
Hi,
I want to create a function whose input is some SQL statement (a SELECT) and whose output is the value of the 2nd column (or, generally, the nth column), knowing only the column position (say 2, 3, or n).
I know this is possible through DBMS_SQL but do not know how.
Please let me know how to do this.
Regards
Surendra
I'm not sure I fully understand, but you could read the following doc, especially example 8, which seems close to what you want to do:
http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_sql.htm#i997238
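For comparison, the generic idea (run an arbitrary SELECT and pull the nth column by position) looks like this in Python's DB-API; sqlite3 stands in for the database here, and cx_Oracle cursors follow the same fetch protocol:

```python
import sqlite3

def nth_column_values(conn, sql: str, n: int):
    """Run an arbitrary SELECT and return the values of the nth column (1-based)."""
    cur = conn.execute(sql)
    return [row[n - 1] for row in cur.fetchall()]

# Hypothetical sample table standing in for EMP.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (empno INTEGER, ename TEXT)")
conn.executemany("INSERT INTO emp VALUES (?, ?)", [(7369, 'SMITH'), (7499, 'ALLEN')])
print(nth_column_values(conn, "SELECT empno, ename FROM emp", 2))
```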
Nicolas. -
Full text query across multiple columns
In the SQL Server, when you do the full text query, you can specify multiple columns, e.g.
FREETEXT ( { column_name | [b](column_list) | * } , 'freetext_string' [ , LANGUAGE language_term ] )
CONTAINS ( { column_name | [b](column_list) | * } , '< contains_search_condition>' [ , LANGUAGE language_term ])
Where,
column_list Indicates that several columns, separated by a comma, can be specified...
* Specifies that all columns in the table registered for full-text searching should be used to search for the given contains search condition. The columns in the CONTAINS clause must come from a single table...
That makes full text query cross multiple columns very convenient. Are there any mechnisms in Oracle to do the same thing?
Thanks in advance.
Thanks for your reply.
I knew that you could build a full-text index over multiple columns using Oracle Text. But that does not solve my problem, which is how to build a query that searches multiple columns at once. Say I have the columns firstname, lastname, address, and email in the table customers, and I want the rows where ANY column contains 'bob'. In SQL Server, I can do
select * from customers where contains(*, 'bob')
and that is it. But in Oracle, I have to do
select * from customers where contains(firstname, 'bob') > 0 or contains(lastname, 'bob') > 0 or contains(address, 'bob') > 0 or contains(email, 'bob') > 0
Can you imagine if I have many columns in many tables and I have to do the query against all columns in all tables? I have to dynamically get all the columns and then build the query string.
So, any better solutions? -
Hi,
I am working on an aging query. This query must have the sales org in the column and the quantity and SO count should be bucketed depending on difference between the current date and another date.
Output should resemble as below:
Sorg <30days >31<60 >60<90 >90
count | Qty Qty Qty Qty
1111 xxx xxx xxx xxx
I created a structure in the Rows and then created formula variables "Current date-given date < 30" and another variable " Current date-given date >31<60" etc
I get correct values if I include the given date and the Sdoc number into my query. If I move the Sdoc number to free char, my output values are zeros.
How can I get a summarized view of my output without including the SDoc number in the query? I did try searching the forum for a similar issue but didn't find anything helpful, so any feedback is appreciated.
Thanks,
RadHi Rad,
When you include sales organization alone in the column, you get wrong result...that happens coz several document numbers are falling under 1 organization, when you run query it accumulates the data based on query design. As you have only sales org which in turn contains many sales document's so it aggregates all of them and dont able to identify what date it should pick up for calculation's hence output is not fine.
So any how u have to go to lowest level for date calculation but you might not have to show them in your query, then please include the sales document number and in display please select hide and supress result display for document number.
Hope its clear to you.
Thanks
dipika -
Inventory Ageing query performance
Hi All,
I have created an inventory ageing query on our custom cube, which is a replica of 0IC_C03. We have data from 2003 onwards. The performance of the query is very poor; the system almost hangs. I tried to create aggregates to improve performance, but that failed. What should I do to improve performance, and why did the aggregate filling fail? The cube has compressed data. Please guide.
Regards:
Jitendra
In addition to the above posts,
check the points below and take action accordingly to increase query performance.
Mainly, check whether the cube data is compressed; compression will improve query performance.
1)If exclusions exist, make sure they exist in the global filter area. Try to remove exclusions by subtracting out inclusions.
2)Check code for all exit variables used in a report.
3)Check the read mode for the query. recommended is H.
4)If Alternative UOM solution is used, turn off query cache.
5)Use Constant Selection instead of SUMCT and SUMGT within formulas.
6)Check aggregation and exception aggregation on calculated key figures. Before aggregation is generally slower and should not be used unless explicitly needed.
7)Check if large hierarchies are used and the entry hierarchy level is as deep as possible. This limits the levels of the hierarchy that must be processed.
Use SE16 on the inclusion tables and use the List of Value feature on the column successor and predecessor to see which entry level of the hierarchy is used.
8)Within the free characteristics, filter on the least granular objects first and make sure those come first in the order.
9)If hierarchies are used, minimize the number of nodes to include in the query results. Including all nodes in the query results (even the ones that are not needed or blank) slows down the query processing.
10)Check the user exits usage involved in OLAP run time?
11) Turn on the BW Statistics: RSA1, choose Tools -> BW Statistics for InfoCubes (choose OLAP and WHM for your relevant cubes).
To check the Query Performance problem
Use ST03N -> BW System load values to recognize the problem. Use the number given in table 'Reporting - InfoCubes:Share of total time (s)' to check if one of the columns %OLAP, %DB, %Frontend shows a high number in all InfoCubes.
You need to run ST03N in expert mode to get these values
based on the analysis and the values taken from the above - Check if an aggregate is suitable or setting OLAP etc.
Edited by: prashanthk on Nov 26, 2010 9:17 AM