Table necessary for join - no columns wanted
I have three tables: A, B, and C.
A joins to B, and B joins to C.
I want to see information from A and C, but not B.
Discoverer workbooks will not let me select just tables A and C; I must select table B for the join condition to work.
How can I "hide" the columns from table B? Or do I need to create a view in the Admin user over tables A and B, or B and C, to do this?
Adrienne,
You have a few options here, none of which you are currently attempting.
1. Create a view on the database and import it as a folder.
2. Create a custom folder with SQL that is passed from Discoverer to the database.
3. Create a folder, move in the items from the three tables, and then make the items from the second folder invisible.
CHW
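Option 1 is often the cleanest. As a hedged sketch (the schema below is invented for illustration; substitute your real tables and keys), the view performs the A-B-C join but projects only A's and C's columns, so the imported folder exposes nothing from B:

```python
import sqlite3

# Hypothetical schema: A(a_id, a_val), B(b_id, a_id, c_id), C(c_id, c_val).
# The view routes the join through B but projects only A and C columns.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE a (a_id INTEGER, a_val TEXT);
    CREATE TABLE b (b_id INTEGER, a_id INTEGER, c_id INTEGER);
    CREATE TABLE c (c_id INTEGER, c_val TEXT);
    INSERT INTO a VALUES (1, 'alpha');
    INSERT INTO b VALUES (100, 1, 7);
    INSERT INTO c VALUES (7, 'gamma');
    CREATE VIEW a_to_c AS
        SELECT a.a_val, c.c_val
        FROM a
        JOIN b ON b.a_id = a.a_id
        JOIN c ON c.c_id = b.c_id;
""")
rows = conn.execute("SELECT * FROM a_to_c").fetchall()
print(rows)  # [('alpha', 'gamma')]
```

Importing such a view as a single folder means end users never see (or need to select) table B at all.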
Similar Messages
-
Can display:table be used for dynamically varying columns?
Dear All,
I am new to JSP/Struts development. Can anyone help me with the following issue?
I would like to show the records of a table (fetched from the database into an ArrayList, where the columns vary from table to table) on a JSP page.
Can anyone tell me how to implement this, or whether there is another way I should follow?
Since I cannot show too many records on the JSP at a time, pagination is required, so I would like to use the display tag.
Thanks in advance,
ajju
Please help me, I am stuck on this issue.
-
Update columns in Table A based on columns in Table B for more than 500K rows
Guys,
I need to update 9 columns in table A based on values from table B, for more than 500K rows.
What is the best way to achieve this? I am thinking of writing a procedure with a cursor to update the rows of table A.
When I googled it, people say a cursor will hurt performance, so I have no clue how to approach this.
Rough plan I thought of:
1) Procedure with no parameters
2) Declare 9 variables to store values from the cursor
3) Cursor fetches row by row based on a join condition between table A and table B
4) Pass the column values from table B into the variables
5) Run an UPDATE statement against table A
Please let me know if the above method is correct, or whether there is another way to do this without using a cursor.

Guys,
Below is the rough code I wrote for my requirement. Does it look correct? As of now I don't have any platform to test it, so any help with the code below is highly appreciated. As I said, I need to update more than 500K rows by matching table A and table B. One more thing I would like to add to the code: a log of all the rows that are in table B but do not exist in table A. Table A already has more than a million rows in it.
Also, I'm not sure how the loop in the code below behaves once @@ROWCOUNT reaches zero.
Please let me know if I need to consider any performance impact while running the script.
GO
CREATE PROCEDURE ONETIMEUPDATE
AS
BEGIN
DECLARE @cnt INT = 1;
DECLARE @MSG varchar(255);
DECLARE @COUNT_VAR INT = 0;
WHILE @cnt > 0
BEGIN
Update TOP (50000) A
Set A.Col1 = B.Col1,
A.COL2 = B.COL2,
A.COL3 = B.COL3,
A.COL4 = B.COL4,
A.COL5 = B.COL5,
A.COL6 = B.COL6,
A.COL7 = B.COL7
From TableA A
Inner Join TableB B
on A.ID = B.ID
WHERE A.Col1 <> B.Col1
OR A.Col2 <> B.Col2;
SET @cnt = @@ROWCOUNT;  -- capture immediately; any later statement resets @@ROWCOUNT
SET @COUNT_VAR = @COUNT_VAR + @cnt;
SET @MSG = CONVERT(varchar, @COUNT_VAR) + ' rows updated';  -- progress message after each batch (original intent: every 25,000 rows)
PRINT @MSG;
WAITFOR DELAY '00:00:01'; -- wait for a second before the next batch
END
END; -
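For what it's worth, the loop shape above (update a capped batch, capture the row count immediately, stop when it reaches zero, pause between batches) can be sketched in miniature. This sqlite3 version uses invented tables and a LIMIT subquery in place of UPDATE TOP, so it is only an illustration of the pattern, not of the real schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE table_a (id INTEGER PRIMARY KEY, col1 TEXT);
    CREATE TABLE table_b (id INTEGER PRIMARY KEY, col1 TEXT);
""")
conn.executemany("INSERT INTO table_a VALUES (?, 'old')", [(i,) for i in range(10)])
conn.executemany("INSERT INTO table_b VALUES (?, 'new')", [(i,) for i in range(10)])

BATCH = 3   # stands in for TOP (50000)
total = 0
while True:
    # Touch only rows that still differ, one batch at a time.
    cur = conn.execute("""
        UPDATE table_a
        SET col1 = (SELECT b.col1 FROM table_b b WHERE b.id = table_a.id)
        WHERE id IN (
            SELECT a.id FROM table_a a JOIN table_b b ON a.id = b.id
            WHERE a.col1 <> b.col1 LIMIT ?)""", (BATCH,))
    conn.commit()              # commit each batch to keep the transaction small
    if cur.rowcount == 0:      # same role as SET @cnt = @@ROWCOUNT
        break
    total += cur.rowcount      # running count for the progress message
print(total)  # 10
```

A real job would also pause between batches (the WAITFOR DELAY above) and could log the running total every so many rows.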
How to join two columns between two tables with different column names
Hi,
How can I join 2 columns with different names between the 2 tables?
Can anyone please give a solution for this?
Thanks,
Manu

Hi,
Basic understanding of joins: if you want to join 2 tables, there should be a matching column with respect to the data and its data type. You do not need the same column name in both tables...
So, find the columns which have the same values. -
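In other words, only the values and types must line up, not the names. A minimal sketch (table and column names invented for illustration):

```python
import sqlite3

# Hypothetical tables: employees(dept_id) joins departments(id) even though
# the column names differ -- only the data and types must match.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE employees (emp_id INTEGER, name TEXT, dept_id INTEGER);
    CREATE TABLE departments (id INTEGER, dept_name TEXT);
    INSERT INTO employees VALUES (1, 'Asha', 10), (2, 'Ravi', 20);
    INSERT INTO departments VALUES (10, 'Sales'), (20, 'HR');
""")
rows = conn.execute("""
    SELECT e.name, d.dept_name
    FROM employees e
    JOIN departments d ON e.dept_id = d.id   -- different names, same values
    ORDER BY e.name
""").fetchall()
print(rows)  # [('Asha', 'Sales'), ('Ravi', 'HR')]
```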
Join a row to a table name stored within a column ?
Hi guys - can you tell me if this is doable at all ?
I've an existing table with a lot of data in it that I wish to query on.
This table has a text column and then 4 more columns: join table name, join column name, join value, and table value column.
The 3 join columns allow me to join each piece of text to a row in another table, identified by those join columns.
Now, I appreciate this is a lousy design, but it's in place and there's nothing I can do about it.
I do, however, need a row-by-row report of the text, followed by the value stored in the column named by the table value column.
So lets say I have
Table_A
Pers_ID
Name
Age
Sex
Table_B
Col_ID
Colour
Hue
My text table will have rows like this
Text
text = 'Person A Age'
join tablename = 'Table_A'
join columnname = 'Pers_ID'
join value = 1
table value column = 'Age'
text = 'Person B Sex'
join tablename = 'Table_A'
join columnname = 'Pers_ID'
join value = 2
table value column = 'Sex'
text = 'Pers A Fave Colour'
join tablename = 'Table_B'
join columnname = 'Col_ID'
join value = 1
table value column = 'Colour'
So what I should be able to do is go to the named table and select the specified column where the ID column equals the supplied value.
The only way I can think of doing this is to create a temporary table, go through the text table row by row using PL/SQL, select from whatever table it names using dynamic SQL, and populate the temporary table.
Can this be done using SQL?

Hi,
ScottyH wrote:
Hi guys - can you tell me if this is doable at all ?
I've an existing table with a lot of data in it that I wish to query on.
This table has a text column, and then 4 more columns - Join Table name, Join column name, Join value and table value column.
The 3 join 3 columns allow me to join this piece of text to a row in another table, identified by the 3 join columns listed above.
Now, I appreciate this is a lousy design, ...

I agree. It's a variation of the Entity-Attribute-Value (EAV) model, which might be a good thing if relational databases didn't exist.
I think your present approach, using dynamic SQL, is the best that you can do.
One possible alternative is to unpivot table_a, table_b, and all the other tables like them into an EAV form, then do a UNION of all those unpivoted results. (You could do this in a view, if you wanted to.) Then your query would simply be a join between your text table and the EAV table (or view).
If you'd care to post CREATE TABLE and INSERT statements for a little sample data, and the results you want from that data, then I could demonstrate what I mean.
What version of Oracle are you using? You should always include this when you have a question. It might not matter in this case, but why take a chance?
Here's another approach that would work in very special circumstances.
If there are only a couple of tables like table_a and table_b, and they only have a couple of columns that you might need to reference, then you could do all possible joins, and use CASE expressions to get the results from the one you actually want. All the joins would have to be outer joins, so I expect this approach would be very slow. -
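The unpivot-and-UNION idea can be sketched concretely. Everything below is invented sample data following the post's Table_A/Table_B shape; each real table is flattened into (table, key column, key value, value column, value) rows so the text table joins against one uniform view:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE table_a (pers_id INTEGER, name TEXT, age INTEGER, sex TEXT);
CREATE TABLE table_b (col_id INTEGER, colour TEXT, hue TEXT);
CREATE TABLE text_table (text TEXT, join_tablename TEXT, join_columnname TEXT,
                         join_value INTEGER, table_value_column TEXT);
INSERT INTO table_a VALUES (1, 'Ann', 34, 'F'), (2, 'Bob', 40, 'M');
INSERT INTO table_b VALUES (1, 'Red', 'Dark');
INSERT INTO text_table VALUES
  ('Person A Age',       'TABLE_A', 'PERS_ID', 1, 'AGE'),
  ('Person B Sex',       'TABLE_A', 'PERS_ID', 2, 'SEX'),
  ('Pers A Fave Colour', 'TABLE_B', 'COL_ID',  1, 'COLOUR');
-- One branch per (table, value column); all values cast to text.
CREATE VIEW eav AS
  SELECT 'TABLE_A' tab, 'PERS_ID' keycol, pers_id keyval, 'AGE' valcol,
         CAST(age AS TEXT) val FROM table_a
  UNION ALL
  SELECT 'TABLE_A', 'PERS_ID', pers_id, 'SEX', sex FROM table_a
  UNION ALL
  SELECT 'TABLE_B', 'COL_ID', col_id, 'COLOUR', colour FROM table_b;
""")
report = conn.execute("""
  SELECT t.text, e.val
  FROM text_table t
  JOIN eav e ON e.tab = t.join_tablename
            AND e.keycol = t.join_columnname
            AND e.keyval = t.join_value
            AND e.valcol = t.table_value_column
""").fetchall()
print(report)
```

The join is now static SQL against one view, at the cost of scanning every unpivoted branch.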
Is this a bug in OWB 11.2 - importing table metadata for character columns
The Oracle® Warehouse Builder Data Modeling, ETL, and Data Quality Guide provides an overview of the data types supported.
http://docs.oracle.com/cd/E11882_01/owb.112/e10935/orcl_data_objx.htm
For the VARCHAR2 data type it says (http://docs.oracle.com/cd/E11882_01/owb.112/e10935/orcl_data_objx.htm#CHDFIADI):
"Stores variable-length character data. How the data is represented internally depends on the database character set. The VARCHAR2 data type takes a required parameter that specifies a maximum size up to 4,000 characters"
That means, I guess, that when I import a table, any column of type VARCHAR2(10) in the database should have its length shown in characters in OWB, so a column of type VARCHAR2(10) in the Oracle database should be shown as VARCHAR2(10) when imported into OWB table metadata via the OWB import function.
However, if I have a database set up as single-byte and import a table using the OWB import function, a column that has a size of e.g. 10 in the database is imported into OWB table metadata with size 10. Correct, I am happy.
However, if the database is modified to support multi-byte characters (ALTUF16 encoding with the length semantics set to CHAR), then when I import the same table into OWB, OWB reports the size as 40; I guess that's 40 bytes, as in 10 characters at 4 bytes per character.
Is this a bug in OWB? As the data type in the Oracle DB is VARCHAR2(10), should OWB not also report the column as VARCHAR2(10) after importing the table? Currently, it shows the column as VARCHAR2(40).

I noticed that myself in our project.
Our VARCHAR2s are defined as VARCHAR2(xxx CHAR), and OWB puts size*4.
In fact, if you have special characters like umlauts (ü, ä, ö, ...), a character can take up to 4 bytes.
You can try it yourself: define a VARCHAR2(1 CHAR) and manually change the size of the column in your mapping inside OWB (in filters, joins, or your target table).
Then push an umlaut through the mapping and you will end up with a "too small" error.
Don't mind the size*4 issue - we ignored it completely and have run without errors for 4 years now. -
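The factor of 4 is just the worst-case byte width of one character in a multi-byte Unicode character set, which is easy to verify for UTF-8 (a general Unicode illustration, not OWB-specific code):

```python
# Bytes per character in UTF-8: ASCII needs 1, umlauts need 2, and the widest
# code points need 4 -- so a CHAR-semantics column of n characters must
# reserve n*4 bytes to be safe.
samples = {"a": 1, "ü": 2, "€": 3, "𝄞": 4}
widths = {ch: len(ch.encode("utf-8")) for ch in samples}
print(widths)  # {'a': 1, 'ü': 2, '€': 3, '𝄞': 4}
```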
Want field and table name for the RESERVED column in MMBE
Hi,
There is a RESERVED column in MMBE, and I want this RESERVED column in MB52.
I got to know that the field related to this RESERVED column is BDMNG, coming from table RESB.
To confirm the field and table, I checked the value in BDMNG by joining MARA and RESB where MATNR = BAUGR.
But the value I got in BDMNG is 50, while in MMBE it is .228 for that material.
Please tell me the field and table name for the RESERVED column, and how to join MARA and RESB to get the exact result.
Thanks and regards,
Ranu

Hi,
MARA-MATNR is not equal to RESB-MATNR, and it shows no records if I join like that.
But MARA-MATNR is equal to RESB-BAUGR, and that gives me 1 record. The problem is that the BDMNG field does not match the MMBE RESERVED column. -
Can we implement a custom SQL query in CR for joining two tables
Hi All,
Is there any way to implement a custom SQL query in CR for joining the two tables?
My requirement is that I need to write SQL logic for joining the two tables...
Thanks,
Gana

In the Database Expert, expand the Create New Connection folder and browse the subfolders to locate your data source.
Log on to your data source if necessary.
Under your data source, double-click the Add Command node.
In the Add Command to Report dialog box, enter an appropriate query/command for the data source you have opened.
For example:
SELECT
Customer.`Customer ID`,
Customer.`Customer Name`,
Customer.`Last Year's Sales`,
Customer.`Region`,
Customer.`Country`,
Orders.`Order Amount`,
Orders.`Customer ID`,
Orders.`Order Date`
FROM
Customer Customer INNER JOIN Orders Orders ON
Customer.`Customer ID` = Orders.`Customer ID`
WHERE
(Customer.`Country` = 'USA' OR
Customer.`Country` = 'Canada') AND
Customer.`Last Year's Sales` < 10000
ORDER BY
Customer.`Country` ASC,
Customer.`Region` ASC
Note: The use of double or single quotes (and other SQL syntax) is determined by the database driver used by your report. You must, however, manually add the quotes and other elements of the syntax as you create the command.
Optionally, you can create a parameter for your command by clicking Create and entering information in the Command Parameter dialog box.
For more information about creating parameters, see To create a parameter for a command object.
Click OK.
You are returned to the Report Designer. In the Field Explorer, under Database Fields, a Command table appears listing the database fields you specified.
Note:
To construct the virtual table from your Command, the command must be executed once. If the command has parameters, you will be prompted to enter values for each one.
By default, your command is called Command. You can change its alias by selecting it and pressing F2. -
Stored Procedure for updating dynamic columns in a table
I have a table with a lot of columns, around 30.
Now I want to create a stored procedure that will update a specific column to a new value.
I want @ColumnName and @newValue to be the parameters, plus an additional @itemID for the WHERE clause so it updates that specific row only.
here's my Stored Procedure
USE [db]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[usp_UpdateData]
@itemID int,
@ColumnName varchar(50),
@newValue nvarchar(50)
AS
BEGIN
Update dbo.ProjectAllocation
Set @ColumnName = @newValue
Where itemID = @itemID
END
When I pass the following
@ColumnName: UserName
@newValue: NewUserName
@itemID: 1
it doesn't update the row with itemID = 1.
How can I do this?
----------------------- Sharepoint Newbie

Did you try the method from my last post? It's better to use a parameterized approach.
Anyway, if you want to change your code, you need to change it as below:
SET @sql = 'UPDATE dbo.table SET ' + @ColumnName + '=''' + @newValue + ''' WHERE [itemID] = ' + cast(@itemID as varchar(50)); -
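Since a column name cannot be passed as a bind parameter, the usual defence is to validate it against an allow-list and parameterize only the values. A hedged sqlite3 sketch (the table name and allowed columns here are assumptions for illustration):

```python
import sqlite3

# Dynamic column names can't be bound, so validate them against a whitelist
# and keep the values as real bind parameters.
ALLOWED = {"UserName", "Role"}          # assumed set of updatable columns

def update_column(conn, column, new_value, item_id):
    if column not in ALLOWED:
        raise ValueError(f"column {column!r} not updatable")
    # The column name is interpolated only after validation; values stay bound.
    conn.execute(f'UPDATE ProjectAllocation SET "{column}" = ? WHERE itemID = ?',
                 (new_value, item_id))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ProjectAllocation (itemID INTEGER, UserName TEXT, Role TEXT)")
conn.execute("INSERT INTO ProjectAllocation VALUES (1, 'OldUser', 'dev')")
update_column(conn, "UserName", "NewUserName", 1)
print(conn.execute("SELECT UserName FROM ProjectAllocation WHERE itemID = 1").fetchone())
```

In T-SQL the equivalent discipline is to quote the validated column name and pass @newValue and @itemID as parameters of sp_executesql rather than concatenating them into the string.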
Join 3 tables using FOR ALL ENTRIES
Hi,
I want to join 3 tables and show the result in a text file delimited by commas.
It's a download program. Can someone tell me how to join the three tables using FOR ALL ENTRIES instead of an inner join? Please give a step-by-step illustration.

Hi,
Please check the code below, which downloads records from 3 tables into an Excel file.
If you want another file type, you can specify it in the FM parameter.
REPORT zstemp_qty2_ LINE-SIZE 255 .
DATA:it_vbak LIKE vbak OCCURS 0 WITH HEADER LINE.
DATA:v_file1 LIKE rlgrap-filename.
DATA:v_file2(80) TYPE c.
DATA:it_vbap LIKE vbap OCCURS 0 WITH HEADER LINE.
DATA:it_mara LIKE mara OCCURS 0 WITH HEADER LINE.
START-OF-SELECTION.
SELECT * FROM vbak INTO TABLE it_vbak UP TO 100 ROWS.
IF NOT it_vbak[] IS INITIAL.
SELECT * FROM vbap INTO TABLE it_vbap
FOR ALL ENTRIES IN it_vbak
WHERE vbeln = it_vbak-vbeln.
ENDIF.
LOOP AT it_vbap.
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
EXPORTING
input = it_vbap-matnr
IMPORTING
output = it_vbap-matnr.
MODIFY it_vbap TRANSPORTING matnr.
CLEAR it_vbap.
ENDLOOP.
IF NOT it_vbap[] IS INITIAL.
SELECT * FROM mara INTO TABLE it_mara
FOR ALL ENTRIES IN it_vbap
WHERE matnr = it_vbap-matnr.
ENDIF.
CALL FUNCTION 'F4_FILENAME'
EXPORTING
program_name = sy-cprog
dynpro_number = sy-dynnr
FIELD_NAME = ' '
IMPORTING
file_name = v_file1 .
v_file2 = v_file1.
CALL FUNCTION 'WS_DOWNLOAD'
EXPORTING
filename = v_file2
filetype = 'WK1'
col_select = '1'
TABLES
data_tab = it_mara
EXCEPTIONS
file_open_error = 1
file_write_error = 2
invalid_filesize = 3
invalid_type = 4
no_batch = 5
unknown_error = 6
invalid_table_width = 7
gui_refuse_filetransfer = 8
customer_error = 9
OTHERS = 10.
IF sy-subrc <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
LOOP AT it_mara.
WRITE:/ it_mara-matnr.
ENDLOOP.
Regds
Sivaparvathi
Please reward points if helpful..... -
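Conceptually, FOR ALL ENTRIES turns the contents of one internal table into a filter on the next SELECT, much like a chain of IN-lists in plain SQL. A rough analog of the vbak → vbap → mara lookup above, with invented minimal columns:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE vbak (vbeln TEXT);
    CREATE TABLE vbap (vbeln TEXT, matnr TEXT);
    CREATE TABLE mara (matnr TEXT, mtart TEXT);
    INSERT INTO vbak VALUES ('0001');
    INSERT INTO vbap VALUES ('0001', 'M1'), ('0002', 'M2');
    INSERT INTO mara VALUES ('M1', 'FERT'), ('M2', 'ROH');
""")
# Step 1: header keys; step 2: items for those headers; step 3: materials
# for those items -- the same chaining FOR ALL ENTRIES performs.
vbeln_list = [r[0] for r in conn.execute("SELECT vbeln FROM vbak")]
items = conn.execute(
    f"SELECT vbeln, matnr FROM vbap WHERE vbeln IN ({','.join('?' * len(vbeln_list))})",
    vbeln_list).fetchall()
matnr_list = [m for _, m in items]
materials = conn.execute(
    f"SELECT matnr, mtart FROM mara WHERE matnr IN ({','.join('?' * len(matnr_list))})",
    matnr_list).fetchall()
print(materials)  # [('M1', 'FERT')]
```

Note the classic caveat the ABAP code guards with IF NOT ... IS INITIAL: FOR ALL ENTRIES against an empty driver table selects everything, so the emptiness check before each SELECT is essential.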
Forming generic sql query for joining multiple sap tables in ABAp
Hi,
I am new to the ABAP field and facing an issue on an SAP-JCo project. I have used the RFC_READ_TABLE FM and customized it, but I am facing an issue with how to write a generic open SQL SELECT statement for joining multiple tables using RFC_READ_TABLE. Kindly help with this issue.
Thanks.

Something like this? If your tuples are not single columns, then you'll have to use dynamic SQL to achieve the same result.

with
table_1 as
(select '|Xyz|Abc|Def|' tuple from dual),
table_2 as
(select '|Data1|Data21|Data31|Data41|Data51|' tuple from dual union all
select '|Data2|Data22|Data32|Data42|Data52|' tuple from dual union all
select '|Data3|Data23|Data33|Data43|Data53|' tuple from dual union all
select '|Data4|Data24|Data34|Data44|Data54|' tuple from dual union all
select '|Data5|Data25|Data35|Data45|Data55|' tuple from dual)
select case the_row when 1
then tuple
else '|---|---|' || substr(tuple,instr(tuple,'|',1,3) + 1)
end tuple
from (select substr(a.tuple,instr(a.tuple,'|',:one_one),instr(a.tuple,'|',:one_one + 1)) ||
substr(a.tuple,instr(a.tuple,'|',1,:one_two) + 1,instr(a.tuple,'|',1,:one_two + 1) - instr(a.tuple,'|',1,:one_two)) ||
substr(b.tuple,instr(b.tuple,'|',1,:two_one) + 1,instr(b.tuple,'|',1,:two_one + 1) - instr(b.tuple,'|',1,:two_one)) ||
substr(b.tuple,instr(b.tuple,'|',1,:two_two) + 1,instr(b.tuple,'|',1,:two_two + 1) - instr(b.tuple,'|',1,:two_two)) tuple,
rownum the_row
from table_1 a,table_2 b)
order by the_row

Regards
Etbin
Message was edited by: Etbin
user596003 -
Rules for joining tables in SQVI
Hi all,
I'm new to this community and this is my first post on this forum; I hope I haven't done anything wrong.
I just want to know how I can tell which tables I should join. I'm trying to create a report that shows me all the open items from a GL account (Construction in Progress) and, beside the AUC, the AUC description, order, PO, vendor, vendor name, plant, and material (at least).
I tried to join BSIS and ANLA for starters, but whatever I select it just gives me the same error, "No data was selected". Is there any rule, like only key fields should be used as selection fields? Anything you could advise would be a blessing.
Thank you
Edited by: Viorel_Petre on Jan 18, 2011 10:52 AM

Hi,
Welcome to SDN!!!
Since you are new to SAP, you should ask the functional consultant to provide you with the names of the tables and fields.
The transaction SQVI is used to create queries, and you can also use it to see the relations between tables for applying join conditions.
For join conditions:
1. Create a view.
2. Choose the data source "table join".
3. Insert 2 tables to see the relation between them.
Search Google/SDN for more information. -
Issues while joining two tables as the joining column has duplicate values - Please help!
Hi,
I have a table A, which has a few columns including an Amount column. I am joining this table A to table B. The joining column in B has duplicates, so the number of records increases after the join. As per the requirement, when I create a table after joining the tables and sum the salary column, there is a difference in the total. How can I solve this? Can you please help me?
Here is the DDL and sample values
create table #student (sid int, name varchar(10),salary int)
create table [#address] (sid int, city varchar(10),grade char(1),linenumber int)
insert into #student values (1,'sachin',8000)
insert into #student values (2,'Dhoni',2000)
insert into #student values (3,'Ganguly',7000)
insert into #student values (4,'Kohli',1000)
insert into [#address] values(1,'mumbai','A',1)
insert into [#address] values(1,'mumbai','B',2)
insert into [#address] values(1,'mumbai','C',3)
insert into [#address] values(1,'mumbai','D',4)
insert into [#address] values(2,'JARKHAND','D',3)
insert into [#address] values(2,'JARKHAND','D',4)
SELECT S.SID,NAME,salary,CITY ,grade,linenumber
into #FINAL
FROM #STUDENT S
LEFT JOIN #ADDRESS A
ON S.SID=A.SID
SELECT SUM(salary) FROM #FINAL
--44000
The final result should be 18000, but it is coming out as 44000. Can you please help me get the correct result? What do I do in the join?
In my real project I have 5 tables joining, each table has more than 30 columns, and a few of the joining tables' join columns have duplicates. I have simplified the issue so that I can ask the question clearly, so please keep that in mind while answering.
Thanks in advance for your help!

SELECT S.SID,NAME,salary,CITY
into #FINAL
FROM #STUDENT S
LEFT JOIN (SELECT DISTINCT sid,city
FROM #Address) A
ON S.SID=A.SID

This will do a join between the student table and the de-duplicated address rows (unique sid and city), so the address selection will be:
sid  city
1    mumbai
2    JARKHAND -
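The effect is easy to reproduce with the sample DDL from the question; joining against a de-duplicated derived table keeps one address row per student, so the sum comes back to 18000:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE student (sid INTEGER, name TEXT, salary INTEGER);
    CREATE TABLE address (sid INTEGER, city TEXT, grade TEXT, linenumber INTEGER);
    INSERT INTO student VALUES (1,'sachin',8000),(2,'Dhoni',2000),
                               (3,'Ganguly',7000),(4,'Kohli',1000);
    INSERT INTO address VALUES (1,'mumbai','A',1),(1,'mumbai','B',2),
                               (1,'mumbai','C',3),(1,'mumbai','D',4),
                               (2,'JARKHAND','D',3),(2,'JARKHAND','D',4);
""")
# Naive join: sachin matches 4 address rows and Dhoni 2, inflating the sum.
naive = conn.execute("""
    SELECT SUM(salary) FROM student s LEFT JOIN address a ON s.sid = a.sid
""").fetchone()[0]
# De-duplicated join: one (sid, city) row per student before joining.
fixed = conn.execute("""
    SELECT SUM(salary) FROM student s
    LEFT JOIN (SELECT DISTINCT sid, city FROM address) a ON s.sid = a.sid
""").fetchone()[0]
print(naive, fixed)  # 44000 18000
```

The same rule scales to the 5-table case: de-duplicate (or pre-aggregate) every many-row side down to the join key before summing.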
Oracle SP for comparing 80 column values across 8 table pairs in 2 diff DBs
Hi All,
I have an Oracle SP for comparing 80 column values across 8 table pairs in 2 diff DBs.
However, it is taking a hell of a lot of time: around 6 hours to process 10,000 records.
Can anyone suggest how to fine-tune this?
Thanks, guys.

Tables prefixed with X are the temp tables that store the data of DB-A.
The report will be originally based on DB-B, so DB Links will not be required for @PROD1.WORLD tables.
This is a test region, so I have pointed to @PROD1.WORLD to test with Prod Data.
SEC_COMPARE_CONFIG is the config table containing the table_name to be reported, corresponding temp tables to store the data and the columns on which it is to be reported.
There are in total 8 tables- 90 rows and 8 temp tables.
SPOKE_TO_HUB_SEC_MTCH_TBL records the securities on which it is to be reported.
HIST_DATA_COMPARE_TBL is the final results table.
Here is the entire code:
CREATE OR REPLACE PACKAGE SECURITY_COMPARE AS
PROCEDURE PROCESS_RECORDS (IN_EFFECTIVE_DATE IN DATE,
IN_PRIMARY_ASSET_ID IN VARCHAR2 DEFAULT NULL);
PROCEDURE IDENTIFY_SECURITIES ( P_EFFECTIVE_DATE IN DATE,
P_PRIMARY_ASSET_ID IN VARCHAR2 DEFAULT NULL);
PROCEDURE RETREIVE_RECORDS_FROM_SPOKE;
PROCEDURE COMPARE_RECORDS(p_err_msg OUT VARCHAR2);
PROCEDURE INSERT_DATA_TO_TABLE ( v_target_table VARCHAR2, v_sql_to_run VARCHAR2, v_commit_after NUMBER);
END SECURITY_COMPARE;
CREATE OR REPLACE PACKAGE BODY SECURITY_COMPARE AS
/*Declared String for recording Dynamic SQL's*/
LC_SQL VARCHAR2 (20000);
PROCEDURE PROCESS_RECORDS(IN_EFFECTIVE_DATE IN DATE,
IN_PRIMARY_ASSET_ID IN VARCHAR2 DEFAULT NULL)
AS
L_EFF_DATE DATE;
L_PRIMARY_ASSET_ID VARCHAR2(100);
k_err_msg VARCHAR2(100); --Error message displayed in case of NO discretionary records found.
BEGIN
L_EFF_DATE := IN_EFFECTIVE_DATE;
L_PRIMARY_ASSET_ID := IN_PRIMARY_ASSET_ID;
IDENTIFY_SECURITIES(L_EFF_DATE,L_PRIMARY_ASSET_ID); --Calling the Identify_Securities procedure to identify the securities older by 90 days from report effective date
RETREIVE_RECORDS_FROM_SPOKE(); --Retreiving the historic records from the security tables into temporary tables.
COMPARE_RECORDS(p_err_msg=>k_err_msg); --Compare the records and report the discrepencies into HIST_DATA_COMPARE_TBL table
END PROCESS_RECORDS;
PROCEDURE IDENTIFY_SECURITIES(P_EFFECTIVE_DATE IN DATE,
P_PRIMARY_ASSET_ID IN VARCHAR2 DEFAULT NULL)
AS
P_EFF_DATE DATE; --Effective Date of the report
P_PRIMARY_ID VARCHAR2(100); --Primary AssetID which is used to search based on specific security
v_target_table VARCHAR2(500); --Variable indicating the Target table for inserting the data
v_sql_to_run VARCHAR2(5000); --Variable to store the Dynamic SQL to be executed
v_commit_after NUMBER; --Variable to define after how many records is COMMIT to be done
BEGIN
LC_SQL :='';
P_EFF_DATE := P_EFFECTIVE_DATE;
P_PRIMARY_ID := P_PRIMARY_ASSET_ID;
/*Deleting Old Entries from SPOKE_TO_HUB_SEC_MTCH_TBL table*/
LC_SQL := 'TRUNCATE TABLE SPOKE_TO_HUB_SEC_MTCH_TBL';
EXECUTE IMMEDIATE LC_SQL;
IF(P_PRIMARY_ID is NULL) --In case records do not need to be identified on basis of specific security
THEN
/*Identify Securities older by 90days from report effective date*/
v_target_table := ' SPOKE_TO_HUB_SEC_MTCH_TBL';
v_sql_to_run := 'WITH T AS ('||
' SELECT R.PRIMARY_ASSET_ID PRIMARY_ASSET_ID_R,'||
' R.SECURITY_ALIAS SECURITY_ALIAS_R,'||
' R.LAST_HELD_DATE LAST_HELD_DATE_R,'||
' R.PREV_HELD_DATE PREV_HELD_DATE_R,'||
' Q.PRIMARY_ASSET_ID PRIMARY_ASSET_ID_Q,'||
' Q.SECURITY_ALIAS SECURITY_ALIAS_Q,'||
' COUNT(*) OVER(PARTITION BY Q.PRIMARY_ASSET_ID) CNT'||
' FROM [email protected] R,'||
' [email protected] Q'||
' WHERE SYS_OP_MAP_NONNULL(R.last_held_date) <> '||q'!'FF'!'||
' and ceil(R.last_held_date-to_date('||''''||P_EFF_DATE||''''||')) >= 0'||
' and ceil(R.last_held_date-to_date('||''''||P_EFF_DATE||''''||')) <= 60'||
' and R.PRIMARY_ASSET_ID=Q.PRIMARY_ASSET_ID'||
' )'||
' SELECT PRIMARY_ASSET_ID_R,'||
' SECURITY_ALIAS_R,'||
' LAST_HELD_DATE_R,'||
' PREV_HELD_DATE_R,'||
' PRIMARY_ASSET_ID_Q,'||
' SECURITY_ALIAS_Q'||
' FROM T'||
' WHERE CNT =1';
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
ELSE
v_target_table := ' SPOKE_TO_HUB_SEC_MTCH_TBL';
v_sql_to_run := 'WITH T AS ('||
' SELECT R.PRIMARY_ASSET_ID PRIMARY_ASSET_ID_R,'||
' R.SECURITY_ALIAS SECURITY_ALIAS_R,'||
' R.LAST_HELD_DATE LAST_HELD_DATE_R,'||
' R.PREV_HELD_DATE PREV_HELD_DATE_R,'||
' Q.PRIMARY_ASSET_ID PRIMARY_ASSET_ID_Q,'||
' Q.SECURITY_ALIAS SECURITY_ALIAS_Q,'||
' COUNT(*) OVER(PARTITION BY Q.PRIMARY_ASSET_ID) CNT'||
' FROM [email protected] R,'||
' [email protected] Q'||
' where R.PRIMARY_ASSET_ID='||''''||P_PRIMARY_ID||''''||
' and R.PRIMARY_ASSET_ID=Q.PRIMARY_ASSET_ID'||
' )'||
' SELECT PRIMARY_ASSET_ID_R,'||
' SECURITY_ALIAS_R,'||
' LAST_HELD_DATE_R,'||
' PREV_HELD_DATE_R,'||
' PRIMARY_ASSET_ID_Q,'||
' SECURITY_ALIAS_Q'||
' FROM T'||
' WHERE CNT =1';
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
END IF;
LC_SQL := 'TRUNCATE TABLE HIST_DATA_COMPARE_TBL';
EXECUTE IMMEDIATE LC_SQL;
END IDENTIFY_SECURITIES;
PROCEDURE RETREIVE_RECORDS_FROM_SPOKE
AS
v_target_table VARCHAR2(500);
v_sql_to_run VARCHAR2(5000);
v_commit_after NUMBER;
BEGIN
LC_SQL :='';
LC_SQL:= 'TRUNCATE TABLE X_SECMASTER_HISTORY_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_SEC_MASTER_DTL_HIST_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_SECMASTER_DTL_EXT_HST_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_EQUITY_HIST_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_EQUITY_DETAIL_HIST_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_FIXED_INCOME_HIST_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_FIXED_INCOME_DTL_EXT_TBL';
EXECUTE IMMEDIATE LC_SQL;
LC_SQL:= 'TRUNCATE TABLE X_DERIVATIVES_HIST_TBL';
EXECUTE IMMEDIATE LC_SQL;
/*SECMASTER_HISTORY*/
v_target_table := 'X_SECMASTER_HISTORY_TBL';
v_sql_to_run := ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*SECURITY_MASTER_DETAIL_HIST*/
v_target_table := 'X_SEC_MASTER_DTL_HIST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*SECMASTER_DETAIL_EXT_HIST*/
v_target_table := 'X_SECMASTER_DTL_EXT_HST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*EQUITY_HIST*/
v_target_table := 'X_EQUITY_HIST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*EQUITY_DETAIL_HIST*/
v_target_table := 'X_EQUITY_DETAIL_HIST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*FIXED_INCOME_HIST*/
v_target_table := 'X_FIXED_INCOME_HIST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*FIXED_INCOME_DETAIL_EXT_HIST*/
v_target_table := 'X_FIXED_INCOME_DTL_EXT_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
/*DERIVATIVES_HIST*/
v_target_table := 'X_DERIVATIVES_HIST_TBL';
v_sql_to_run:= ' SELECT /*+DRIVING_SITE(K)*/ K.* FROM [email protected] K '||
' INNER JOIN SPOKE_TO_HUB_SEC_MTCH_TBL I'||
' ON K.SECURITY_ALIAS = I.SPOKE_SEC'||
' AND K.SRC_INTFC_INST = 140 '||
' and K.EFFECTIVE_DATE =(SELECT /*+DRIVING_SITE(H)*/ MAX (H.EFFECTIVE_DATE) FROM [email protected] H WHERE'||
' H.SECURITY_ALIAS = K.SECURITY_ALIAS AND H.SRC_INTFC_INST = K.SRC_INTFC_INST)' ;
v_commit_after := 0;
INSERT_DATA_TO_TABLE(v_target_table,v_sql_to_run,v_commit_after);
END RETREIVE_RECORDS_FROM_SPOKE;
PROCEDURE COMPARE_RECORDS(p_err_msg OUT VARCHAR2)
AS
l_count NUMBER;
l_err_msg VARCHAR2(100);
TYPE T_SECURITIES is TABLE of HIST_DATA_COMPARE_TBL%rowtype;
ttype T_SECURITIES;
CURSOR C1
IS
SELECT TABLE_NAME, TEMP_TABLE, COLUMN_NAME from SEC_COMPARE_CONFIG
where column_name='EFFECTIVE_DATE';
CURSOR C2
IS
SELECT * FROM SEC_COMPARE_CONFIG where id <=82;
C_REC SEC_COMPARE_CONFIG%rowtype;
BEGIN
LC_SQL :='';
p_err_msg :='';
FOR C_REC in C1
loop
LC_SQL:= ' SELECT /*+DRIVING_SITE(B)*/ /*+PARALLEL(A,100)*/ B.SECURITY_ALIAS, to_char(C.SPOKE_PAID), A.SECURITY_ALIAS,to_char(C.HUB_PAID),'||''''||C_REC.TABLE_NAME||''''||','||q'!'EFFECTIVE_DATE'!'||','||
' NVL((cast(B.'||C_REC.COLUMN_NAME||' as VARCHAR2(100))),'||q'!'No Records Found'!'||'),'||
' NVL((cast(A.'||C_REC.COLUMN_NAME||' as VARCHAR2(100))),'||q'!'No Records Found'!'||')'||
' FROM '||C_REC.TEMP_TABLE||' A, SECURITYDBO.'||C_REC.TABLE_NAME ||'@PROD1.WORLD B,'||
' SPOKE_TO_HUB_SEC_MTCH_TBL C'||
' WHERE A.SRC_INTFC_INST=140'||
' AND B.SRC_INTFC_INST=140'||
' AND A.SECURITY_ALIAS=C.spoke_sec'||
' and b.security_alias=C.HUB_SEC'||
' AND a.effective_date <> (select max(h.effective_date) from SECURITYDBO.'||C_REC.TABLE_NAME||'@PROD1.WORLD H'||
' where h.security_alias=c.hub_sec and h.src_intfc_inst=140 )';
EXECUTE IMMEDIATE LC_SQL BULK COLLECT into ttype;
FORALL x in ttype.First..ttype.Last
insert into HIST_DATA_COMPARE_TBL values ttype(x);
commit;
end loop;
For C_REC in C2
loop
LC_SQL:= ' SELECT /*+DRIVING_SITE(B)*/ /*+PARALLEL(A,100)*/ B.SECURITY_ALIAS, to_char(C.SPOKE_PAID), A.SECURITY_ALIAS,to_char(C.HUB_PAID),'||''''||C_REC.TABLE_NAME||''''||','||''''||C_REC.COLUMN_NAME||''''||','||
' NVL((cast(B.'||C_REC.COLUMN_NAME||' as VARCHAR2(100))),'||q'!'No Records Found'!'||'),'||
' NVL((cast(A.'||C_REC.COLUMN_NAME||' as VARCHAR2(100))),'||q'!'No Records Found'!'||')'||
' FROM '||C_REC.TEMP_TABLE||' A, SECURITYDBO.'||C_REC.TABLE_NAME ||'@PROD1.WORLD B,'||
' SPOKE_TO_HUB_SEC_MTCH_TBL C'||
' WHERE A.SRC_INTFC_INST=140'||
' AND B.SRC_INTFC_INST=140'||
' AND A.SECURITY_ALIAS=C.spoke_sec'||
' and b.security_alias=C.HUB_SEC'||
' and b.effective_date=a.effective_date'||
' AND NVL((cast(A.'||C_REC.column_name||' as VARCHAR2(100))),'||q'!'No Records Found'!'||') <>'||
' NVL((cast(B.'||C_REC.column_name||' as VARCHAR2(100))),'||q'!'No Records Found'!'||')';
EXECUTE IMMEDIATE LC_SQL BULK COLLECT into ttype;
FORALL x in ttype.First..ttype.Last
insert into HIST_DATA_COMPARE_TBL values ttype(x);
commit;
end loop;
BEGIN
select count(*) into l_count from HIST_DATA_COMPARE_TBL;
if(l_count=0) then
l_err_msg :='No records found';
end if;
END;
END COMPARE_RECORDS;
/*
NAME: INSERT_DATA_TO_TABLE
DESCRIPTION: This procedure inserts the records into the target table based on the data fetched using the sql-to-run variable.
It also honors the commit_after variable, which defines after how many records the insert is committed.
*/
PROCEDURE INSERT_DATA_TO_TABLE ( v_target_table VARCHAR2,
v_sql_to_run VARCHAR2,
v_commit_after NUMBER) IS
v_limit_sql1 VARCHAR2(300) := ' ';
v_limit_sql2 VARCHAR2(900) := ' ';
v_plsql_to_run VARCHAR2(32767);
BEGIN
IF NVL(v_commit_after,0) <> 0 THEN
v_limit_sql1:= ' LIMIT ' || TO_CHAR(v_commit_after) ;
v_limit_sql2:= ' IF MOD(v_number_of_rows, ' || TO_CHAR(v_commit_after) || ' ) = 0 THEN ' ||
' COMMIT; ' ||
' END IF; ' ;
END IF;
v_plsql_to_run:= ' ' ||
'DECLARE ' ||
' v_number_of_rows number:=0; ' ||
' ' ||
' TYPE MyType IS REF CURSOR; ' ||
' CV MyType; ' ||
' TYPE RecTyp IS TABLE OF ' || v_target_table || '%ROWTYPE; ' ||
' rec RecTyp; ' ||
' ' ||
'BEGIN ' ||
' ' ||
'OPEN CV FOR ' ||
' ' || REPLACE( v_sql_to_run, ';', ' ' ) || ' ; ' ||
' LOOP ' ||
' FETCH CV BULK COLLECT INTO rec ' || v_limit_sql1 || '; ' ||
' FORALL i IN 1..rec.COUNT ' ||
' INSERT /*+ APPEND */ INTO ' || v_target_table || ' VALUES rec(i); ' ||
' v_number_of_rows := v_number_of_rows + SQL%ROWCOUNT; ' ||
' ' || v_limit_sql2 || ' ' ||
' EXIT WHEN CV%NOTFOUND; ' ||
' ' ||
' END LOOP; ' ||
' COMMIT; ' ||
' CLOSE CV; ' ||
'END; ';
EXECUTE IMMEDIATE v_plsql_to_run;
COMMIT;
END INSERT_DATA_TO_TABLE;
END SECURITY_COMPARE; -
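Stripped of the dynamic-SQL plumbing, INSERT_DATA_TO_TABLE is a standard fetch/insert loop: fetch a LIMIT-sized batch, bulk-insert it, commit every so often. The loop shape can be sketched generically (invented tables; sqlite3 standing in for the ref cursor):

```python
import sqlite3

def copy_in_batches(conn, select_sql, insert_sql, commit_after=2):
    """Fetch rows in fixed-size batches and commit periodically,
    mirroring the BULK COLLECT ... LIMIT / FORALL pattern."""
    cur = conn.execute(select_sql)
    total = 0
    while True:
        batch = cur.fetchmany(commit_after)    # LIMIT-style fetch
        if not batch:                          # EXIT WHEN cursor%NOTFOUND
            break
        conn.executemany(insert_sql, batch)    # FORALL-style bulk insert
        total += len(batch)
        conn.commit()                          # commit after each batch
    return total

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (x INTEGER);
    CREATE TABLE dst (x INTEGER);
    INSERT INTO src VALUES (1),(2),(3),(4),(5);
""")
n = copy_in_batches(conn, "SELECT x FROM src", "INSERT INTO dst VALUES (?)")
print(n)  # 5
```

For the 6-hour runtime, the batching itself is rarely the bottleneck; the per-row correlated MAX(effective_date) subqueries over the database link in the dynamic SQL are the more likely place to look.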
How to add column in table control for transaction APPCREATE
Hi All,
How can I add an additional column in the table control for transaction APPCREATE?
There is a structure PT1045_EXT behind the table control, but I have found no customer exit or BAdI to display it on the screen.
Please help...

You can add new columns in transaction PHAP_CATALOG.