Issue with determining table name at runtime
Hi All,
I have a problem determining the table name of an internal table at runtime.
TABLES : ekko.
DATA : test TYPE REF TO lcl_test.
DATA : itab TYPE STANDARD TABLE OF ekko.
IF test IS INITIAL.
CREATE OBJECT test.
ENDIF.
test->cmeth( EXPORTING itab1 = itab ).
CLASS lcl_test DEFINITION.
PUBLIC SECTION.
METHODS : cmeth IMPORTING itab1 TYPE STANDARD TABLE.
ENDCLASS.
* CLASS lcl_test IMPLEMENTATION
CLASS lcl_test IMPLEMENTATION.
METHOD cmeth.
*Here I have to know the table name of the imported internal table itab1 (in this particular case it's EKKO).
*In general it can be any table.
*Is there a way to determine the table name (e.g. EKKO) in this method?
*My problem is that I need to find out the fields of that internal table.
*To find out the fields of the table I'm using:
CALL FUNCTION 'GET_COMPONENT_LIST'
EXPORTING
program = sy-repid
fieldname = 'I need to pass header of the internal table something like wa_ekko'
TABLES
components = icomp.
** So I have to find out the table name, declare a work area, and then pass that to the GET_COMPONENT_LIST FM *
Or I can use the code below as well:
data:
wa_ref type ref to data,
desc_table type ref to cl_abap_tabledescr,
desc_struc type ref to cl_abap_structdescr.
field-symbols:
<p_data> type any,
<p_field> type any,
<p_component> type abap_compdescr.
** The problem here is that it_data has a structure defined in the class, unlike mine, which is TYPE STANDARD TABLE *
create data wa_ref like line of it_data.
assign wa_ref->* to <p_data>.
desc_table ?= cl_abap_tabledescr=>describe_by_data( it_data ).
desc_struc ?= desc_table->get_table_line_type( ).
loop at it_data assigning <p_data>.
loop at desc_struc->components assigning <p_component>.
assign component <p_component>-name of structure <p_data> to <p_field>.
endloop.
endloop.
endmethod.
Hope I'm clear.
Thanks
David
Hi
Perhaps something like this can help you:
TABLES : EKKO.
DATA : ITAB TYPE STANDARD TABLE OF EKKO.
CLASS LCL_TEST DEFINITION.
PUBLIC SECTION.
METHODS : CMETH IMPORTING ITAB1 TYPE STANDARD TABLE.
ENDCLASS. "lcl_test DEFINITION
CLASS LCL_TEST IMPLEMENTATION.
METHOD CMETH.
DATA: MY_WA TYPE REF TO DATA.
DATA: DESC_TABLE TYPE REF TO CL_ABAP_TABLEDESCR,
DESC_STRUC TYPE REF TO CL_ABAP_STRUCTDESCR.
FIELD-SYMBOLS:
<P_DATA> TYPE ANY,
<P_FIELD> TYPE ANY,
<P_COMPONENT> TYPE ABAP_COMPDESCR.
CREATE DATA MY_WA LIKE LINE OF ITAB1.
ASSIGN MY_WA->* TO <P_DATA>.
DESC_STRUC ?= CL_ABAP_TYPEDESCR=>DESCRIBE_BY_DATA( <P_DATA> ).
LOOP AT DESC_STRUC->COMPONENTS ASSIGNING <P_COMPONENT>.
WRITE: / <P_COMPONENT>-NAME.
ENDLOOP.
ENDMETHOD. "cmeth
ENDCLASS. "lcl_test IMPLEMENTATION
DATA : TEST TYPE REF TO LCL_TEST.
START-OF-SELECTION.
IF TEST IS INITIAL.
CREATE OBJECT TEST.
ENDIF.
TEST->CMETH( EXPORTING ITAB1 = ITAB ).
Max
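For readers comparing this to reflection outside ABAP: the RTTS idea in Max's answer (describe a row at runtime and enumerate its components) has a rough Python analogue. The Ekko dataclass below is a hypothetical stand-in for the DDIC structure, purely for illustration:

```python
from dataclasses import dataclass, fields

# Hypothetical stand-in for an EKKO-like row type.
@dataclass
class Ekko:
    ebeln: str
    bukrs: str
    lifnr: str

def component_names(row) -> list:
    """Enumerate the field (component) names of a row at runtime,
    roughly what cl_abap_structdescr->components gives you in ABAP."""
    return [f.name for f in fields(row)]

names = component_names(Ekko("4500000001", "1000", "100042"))
```

The point is the same as in the ABAP version: the caller never has to know the concrete row type, only that it can be described.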
Similar Messages
-
Issue with attachment file name
Hi All,
This is about an issue with attachment file name:
We have a scenario wherein we have a payload with attachments (the attachments can be any .doc, .pdf, etc.). The problem is that the main document is coming with messageid.sap.com, and that's normal, but the attachments are coming with file names such as something.pdf, something.doc, or something.txt. This is failing in the adapter, as it expects the same name as in the main document. Does anybody have any idea how to get through this issue?
Regards
Kiran
- <SAP:Payload xlink:href="cid:payload-4CED452F17C601BDE10080000A492050---sap.com">
<SAP:Name>1 .Header1.txt</SAP:Name>
Error we are getting is
Cannot cast 'Header' to boolean] in class com.sap.aii.mappingtool.flib7.NodeFunctions method createIf[Header, com.sap.aii.mappingtool.tf7.rt.Context---27a73bfa]
So we have to change the file name Header1.txt to something which we can cast in createIf. (We cannot tell the sender to change the file name, as it is already set.)
Thanks for your interest and assistance.
Regards
Kiran -
Issue with Data Provider name in variable screen for BEx Analyzer
Hello all,
We've got an issue with the Data Provider name in the variable screen in BEx Analyzer.
We want to change the DataProvider name there to the description of the report instead of its technical name.
Any inputs are appreciated.
Thanks
Kumar
You have to create a workbook to do this.
Refresh your query/report. In BEx Analyzer there is a toolbar named BEx design toolbox; if you are not able to see it, right-click on the toolbar space of BEx Analyzer and click on BEx design toolbox. There, go to design mode by clicking on a symbol like 'A'. After that, place the cursor where you want to see the query description and click on Insert Text (T) in the BEx toolbox. Click on it and check "Query description" in the Constant tab. In the General tab you need to assign a DataProvider; for that, assign your query name in the workbook settings (in the BEx design toolbox). Also check "Display caption" in the General tab.
Pravender -
Revision: 12982
Revision: 12982
Author: [email protected]
Date: 2009-12-15 20:44:23 -0800 (Tue, 15 Dec 2009)
Log Message:
Fix for issue with exposing accessible names for combobox list items
QE notes: none
Doc notes: none
Bugs: n/a
Reviewer: Gordon
Tests run: checkintests
Is noteworthy for integration: no
Modified Paths:
flex/sdk/trunk/frameworks/projects/spark/src/spark/accessibility/ComboBoxAccImpl.as
flex/sdk/trunk/frameworks/projects/spark/src/spark/accessibility/ListBaseAccImpl.as
Add this to the end of your nav p CSS selector at Line 209 of your HTML file, after 'background-repeat...':
margin-bottom: -2px;
Your nav p will then look like this:
nav p {
font-size: 90%;
font-weight: bold;
color: #FFC;
background-color: #090;
text-align: right;
padding-top: 5px;
padding-right: 20px;
padding-bottom: 5px;
border-bottom-width: 2px;
border-bottom-style: solid;
border-bottom-color: #060;
background-image: url(images/background.png);
background-repeat: repeat-x;
margin-bottom: -2px;
} -
Hello Gurus..... ISSUE with child Table update
I have an issue with child table update
I have created a GTC with one parent table and two child tables. I'm able to update the parent table and the values are found in the DB, but the issue is that the child table values are not updating in the DB.
please give me a solution
regards
Srikanth
If you are keeping referential integrity in the database, not in the application, it is easy to find the child and parent tables. Here is a quick and dirty query. You can join this to dba_cons_columns to find out on which columns the referential constraints are defined. This lists all child-parent tables, including the SYS and SYSTEM users. You can run this for specific users, of course.
select cons1.owner child_owner,cons1.table_name child_table,
cons2.owner parent_owner,cons2.table_name parent_table
from dba_constraints cons1,dba_constraints cons2
where cons1.constraint_type='R'
and cons1.r_constraint_name=cons2.constraint_name; -
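The dictionary views above are Oracle-specific. As an illustration of the same child-to-parent discovery in a self-contained setting, here is a sketch in Python with sqlite3 (the table names are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE parent_tab (id INTEGER PRIMARY KEY);
CREATE TABLE child_tab (
    id INTEGER PRIMARY KEY,
    parent_id INTEGER REFERENCES parent_tab(id)
);
""")

def parents_of(table: str):
    """List (parent_table, child_column, parent_column) for a child table,
    analogous to joining dba_constraints to dba_cons_columns in Oracle."""
    # foreign_key_list rows: (id, seq, table, from, to, on_update, on_delete, match)
    return [(row[2], row[3], row[4])
            for row in con.execute(f"PRAGMA foreign_key_list({table})")]

links = parents_of("child_tab")
```

Same principle either way: ask the data dictionary, don't guess from naming conventions.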
Copy the structure of a table to another with another table name.
How do I copy the structure of a table to another table with another name?
i.e., I want the emp table with the same values/structure to be copied to another table called my_employee.
How can this be done?
create table my_emp as select * from emp;
If you do not want the data to be copied then do the following:
create table my_emp as select * from emp
where 1=2;
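A quick way to convince yourself what the WHERE 1=2 trick does (and what it does not do: constraints and indexes are not copied by a CTAS) is a sketch like this, using Python's sqlite3 purely for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE emp (empno INTEGER, ename TEXT, sal REAL)")
con.execute("INSERT INTO emp VALUES (1, 'SMITH', 800.0)")

# Structure-only copy: WHERE 1=2 matches no rows, so only columns carry over.
con.execute("CREATE TABLE my_emp AS SELECT * FROM emp WHERE 1=2")

cols = [r[1] for r in con.execute("PRAGMA table_info(my_emp)")]
rowcount = con.execute("SELECT COUNT(*) FROM my_emp").fetchone()[0]
```

The new table has the same column list as emp but zero rows.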
Avanti. -
Issues with Advance Table Add Row New Row not work in some scenarios.
Hi,
Wondering if there's any issue with advanced tables where they do not create any rows. I don't know if anyone has tried this or not. I have one OA page with an advanced table and a button that, when clicked, opens a new OA page in a pop-up window. The pop-up page contains one text box where you enter data, and this gets saved in one of the VO's transient attributes. Now, on the base page, if you don't click the button to open the pop-up page, you can add new rows in the advanced table by clicking the Add Row button. But as soon as you open a pop-up window and close it, the Add New Rows button doesn't work and does not create any new rows. Basically, the page stops working. Both the pop-up and the base page share the same AM but have different controllers.
POP-UP page is a custom page that I open giving the Destination URI value in the button item and target frame _blank.
I even tried creating rows programmatically for the advanced table, but this too doesn't work once you open a pop-up. Also, I have used pageContext.putTransactionValue in the pop-up page and am checking and removing this in the base page.
Any help is appreciated.
Thanks
Anyone?
-
Problem with Dynamic Table Name
Hello all,
I am having trouble using a dynamic table name. I have the following code.....
declare l_cur sys_refcursor;
l_ID int;
l_tableName varchar(30);
BEGIN
open l_cur for
select hkc.ColumnID, mapping from &HKAPPDB_Schema_Name..doctablemapping ddm
inner join &HKDB_Schema_Name..HKColumns hkc on hkc.doctablemappingid = ddm.id
where ddm.id > 0;
LOOP
FETCH l_cur into l_ID, l_tableName;
EXIT WHEN l_cur%notfound;
-- update missing VerbID in DocumentDocMapping table
UPDATE &HKAPPDB_Schema_Name..IndexedDocument
SET VerbID = (SELECT t.VerbID
FROM (SELECT DocRef, VerbID, DateUpdated
FROM &HKAPPDB_Schema_Name..l_tableName dd -- this is where the dynamic table name is used
WHERE dd.VerbID is not NULL))
WHERE HKColumnID = l_ID AND VerbID is NULL;
END loop;
end;
/
When I try to execute this I get an error:
ORA-00942: table or view does not exist
What am I doing wrong?
Regards,
Toby
redeye wrote:
I only started about 6 weeks ago, with no tutorials, learning it on the fly;
Same here... only my introduction was to a 12-node Oracle OPS cluster all those years ago, and it required a whole new mindset after using SQL Server extensively. But it was fun. Still is. :-)
but that's what you get when a company throws you in at the deep end with a ridiculous time constraint to migrate a whole MSSQL DB.
Migrating SQL Server to Oracle is not a simple thing. A lot of best practices in SQL Server are absolutely worst practices in Oracle - they are that different. A simple example is lock escalation - an issue in SQL Server. In Oracle, the concept of a lock being escalated into a page lock simply does not exist.
In terms of getting the migration done as quickly and painlessly as possible, I try to reuse all the logic as it appears in the MSSQL code - in this case it was using dynamic table names. I do not doubt that I am probably shooting myself in the foot in the long run...
As long as you do not splatter too much blood on us here... not a problem :D
Seriously though - just keep in mind that what works in SQL-Server may not work as well (or even at all) in Oracle. So do not hesitate to refactor (from design to code to SQL) mercilessly when you think it is warranted. -
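On the original ORA-00942: static SQL cannot take a variable as a table name, so l_tableName inside the UPDATE is parsed as a literal (nonexistent) table. In PL/SQL the usual route is native dynamic SQL (EXECUTE IMMEDIATE with the table name concatenated into the statement text). A rough illustration of the pattern, in Python with sqlite3 and made-up table names, including a whitelist check since identifiers cannot be bind variables:

```python
import sqlite3

con = sqlite3.connect(":memory:")
for name in ("doc_map_a", "doc_map_b"):
    con.execute(f"CREATE TABLE {name} (docref TEXT, verbid INTEGER)")
con.execute("INSERT INTO doc_map_a VALUES ('D1', 7)")

# Identifiers cannot be bound like values, so validate them explicitly.
ALLOWED = {"doc_map_a", "doc_map_b"}

def verbids(table_name: str):
    if table_name not in ALLOWED:
        raise ValueError(f"unexpected table name: {table_name}")
    # PL/SQL equivalent: EXECUTE IMMEDIATE
    #   'SELECT verbid FROM ' || table_name || ' WHERE verbid IS NOT NULL'
    sql = f"SELECT verbid FROM {table_name} WHERE verbid IS NOT NULL"
    return [row[0] for row in con.execute(sql)]

results = verbids("doc_map_a")
```

The whitelist matters in any language: a table name spliced into SQL text is an injection point unless it is validated against a known list.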
Issue with Temp tables in SSIS 2012 with RetainSameConnection=true
Hello,
We have a few packages written in 2008 that are being upgraded to 2012. Our packages mostly use temp tables during the process. During the initial migration, we faced an issue with handling temp tables in the OLE DB destination provider and found a solution for
the same under
usage of Temp tables in SSIS 2012
Most of our packages execute fine now.
We came across a different issue recently. One of our packages, which merges 3 feeds into a temp table and then executes a stored procedure for processing, fails intermittently.
Below are the properties of SSIS and its components, in which you might be interested:
* Retainsameconnection for the OLE Db connection manager set to True
* properties of OLEDB Destination
AccessMode : SQL Command
CommandTimeOut : 0
SQLCommand : Select * from #tmp
* using SSIS 2012 and SQL OLEDB Native Provider 11 (Provider=SQLNCLI11.1)
* one of the feeds is 10MB
During investigation using Profiler, I found that though I use RetainSameConnection, I often see that more than one SPID is used during the scope of SSIS execution, and whenever this happens, the package fails with the below error message:
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80040E14 Description: "Statement(s) could not be prepared.".
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80040E14 Description: "Invalid object name '#tmp'."
Now, why does SSIS use a different SPID during its execution when RetainSameConnection is set to True? (Note: I have only one OLEDB connection in that package.)
To simulate the issue, instead of the 10MB file I used a 500KB file, executed the package twice, and all went fine.
Is it because the 10MB file takes a long time to process, causing a timeout of that OLEDB destination and forcing SSIS to go for another connection? But remember, CommandTimeout is set to infinite (0) for that OLEDB destination.
Much appreciate your response.
Hey,
I understand you set the RetainSameConnection property to true for all the OLEDB connections you used in the package; if not, make sure it is set for all the connections, including file connections as well.
Additionally, you can try setting the DelayValidation property to true for all the data flows and control flows and try running the package for the 10MB file.
I hope this will fix the intermittent failure issue you are facing with SSIS.
(Please mark as solved if I've answered your question, and vote for it as helpful to help other users find a solution quicker.)
Thanks,
Atul Gaikwad. -
SQL Query : Order By issue with HUGE Table
Hello friends,
I have run into a terrible issue with ORDER BY. I would appreciate your help. Please let me know your input for my case:
=> If I run the select query, it returns results quickly, in some milliseconds (SQL Developer fetches 50 rows at a time).
=> If I run the select query with a where condition, the column (say A) in the where condition is indexed, and I have an order by whose column (say B) is also indexed.
Now, here is the issue:
1. If the where condition filters down to a small result set, then the order by works fine: 1-5 sec, which is good.
2. *If the where condition yields a large result set, say more than 50,000 rows, then with the order by the wait time is enormous. I have even waited 10+ minutes to get the result back for 120,000 records.*
Does the order by take that long for 100K records? I think something else is wrong. Your pointers will really be helpful; I am very new to SQL and even newer to the large-table case.
I am using SQL Developer Version 2.1.1.64
and Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
Thank you so much.
Edited by: 896719 on Jan 11, 2013 8:38 AM
Yes, you are correct, but my concentration was on the order-by thing, so it will do a full scan of the table, so I was putting that, and I was also wondering if millions of records in a table should not be an issue???
Anyway, for the explain plan: when just a value in the where clause changes there is a huge difference, which I want to point out too, as below:
SELECT *
FROM EES_EVT EES_EVT
WHERE APLC_EVT_CD = 'ABC'
ORDER BY CRE_DTTM DESC
execution time : 0.047 sec
Plan hash value: 290548126
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 27 | 14688 | 25 (4)| 00:00:01 |
| 1 | SORT ORDER BY | | 27 | 14688 | 25 (4)| 00:00:01 |
| 2 | TABLE ACCESS BY INDEX ROWID| EES_EVT | 27 | 14688 | 24 (0)| 00:00:01 |
|* 3 | INDEX RANGE SCAN | XIE1EES_EVT | 27 | | 4 (0)| 00:00:01 |
Predicate Information (identified by operation id):
3 - access("APLC_EVT_CD"='ABC')
Note
- SQL plan baseline "SYS_SQL_PLAN_6d41e6b91925c463" used for this statement
=============================================================================================
SELECT *
FROM EES_EVT EES_EVT
WHERE APLC_EVT_CD = 'XYZ'
ORDER BY CRE_DTTM DESC
execution time : 898.672 sec.
Plan hash value: 290548126
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 121K| 62M| | 102K (1)| 00:11:02 |
| 1 | SORT ORDER BY | | 121K| 62M| 72M| 102K (1)| 00:11:02 |
| 2 | TABLE ACCESS BY INDEX ROWID| EES_EVT | 121K| 62M| | 88028 (1)| 00:09:27 |
|* 3 | INDEX RANGE SCAN | XIE1EES_EVT | 121K| | | 689 (1)| 00:00:05 |
Predicate Information (identified by operation id):
3 - access("APLC_EVT_CD"='XYZ')
Note
- SQL plan baseline "SYS_SQL_PLAN_ef5709641925c463" used for this statement
Also note this table contains 74328 MB of data.
Thanks -
Insert performance issue with Partitioned Table.....
Hi All,
I have a performance issue with an insert into a table which is partitioned. Without the table being partitioned it ran in less time, but after partitioning it took more than double.
1) The table was created initially without any partitioning, and the below insert took only 27 minutes.
Total Rec Inserted :- 2424233
PL/SQL procedure successfully completed.
Elapsed: 00:27:35.20
2) Now I re-created the table with partitioning (range, yearly - below) and the same insert took 59 minutes.
Is there any way I can achieve better performance during inserts on this partitioned table?
[Similarly, I have another table with 50 million records, and the insert took 10 hrs without partitioning; with the table partitioned, it took 18 hours.]
SQL> select * from table(dbms_xplan.display);
PLAN_TABLE_OUTPUT
Plan hash value: 4195045590
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 643K| 34M| | 12917 (3)| 00:02:36 |
|* 1 | HASH JOIN | | 643K| 34M| 2112K| 12917 (3)| 00:02:36 |
| 2 | VIEW | index$_join$_001 | 69534 | 1290K| | 529 (3)| 00:00:07 |
|* 3 | HASH JOIN | | | | | | |
| 4 | INDEX FAST FULL SCAN| PK_ACCOUNT_MASTER_BASE | 69534 | 1290K| | 181 (3)| 00:00
| 5 | INDEX FAST FULL SCAN| ACCOUNT_MASTER_BASE_IDX2 | 69534 | 1290K| | 474 (2)| 00:00
PLAN_TABLE_OUTPUT
| 6 | TABLE ACCESS FULL | TB_SISADMIN_BALANCE | 2424K| 87M| | 6413 (4)| 00:01:17 |
Predicate Information (identified by operation id):
1 - access("A"."VENDOR_ACCT_NBR"=SUBSTR("B"."ACCOUNT_NO",1,8) AND
"A"."VENDOR_CD"="B"."COMPANY_NO")
3 - access(ROWID=ROWID)
Open C1;
Loop
Fetch C1 Bulk Collect Into C_Rectype Limit 10000;
Forall I In 1..C_Rectype.Count
Insert Into test
(col1, col2, col3)
Values
(val1, val2, val3);
V_Rec := V_Rec + Nvl(C_Rectype.Count,0);
Commit;
Exit When C_Rectype.Count = 0;
C_Rectype.delete;
End Loop;
End;
Total Rec Inserted :- 2424233
PL/SQL procedure successfully completed.
Elapsed: 00:51:01.22
Edited by: user520824 on Jul 16, 2010 9:16 AM
I'm concerned about the view in step 2 and the index join in step 3. A composite index with both columns might eliminate the index join and result in fewer read operations.
If you know which partition the data is going into beforehand you can save a little bit of processing by specifying the partition (which may not be a scalable long-term solution) in the insert - I'm not 100% sure you can do this on inserts but I know you can on selects.
The APPEND hint won't help the way you are using it - the VALUES clause in an insert makes it be ignored. Where it is effective and should help you is if you can do the insert in one query - insert into/select from. If you are using the loop to avoid filling up undo/rollback you can use a bulk collect to batch the selects and commit accordingly - but don't commit more often than you have to because more frequent commits slow transactions down.
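The insert-into/select point can be sketched with toy tables (Python/sqlite3 here purely for illustration): the whole transfer becomes one set-based statement and one commit instead of a fetch/insert loop with per-batch round trips:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src (acct TEXT, bal REAL)")
con.execute("CREATE TABLE tgt (acct TEXT, bal REAL)")
con.executemany("INSERT INTO src VALUES (?, ?)",
                [(f"A{i:04d}", float(i)) for i in range(1000)])

# Set-based load: one statement moves every qualifying row,
# instead of open cursor / fetch batch / insert batch / repeat.
con.execute("INSERT INTO tgt SELECT acct, bal FROM src WHERE bal >= 0")
con.commit()

moved = con.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
```

The database engine then gets one shot at optimizing the whole transfer, which is the reason insert-into/select usually beats a PL/SQL loop.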
I don't think there is a nologging hint :)
So, try something like
insert /*+ hints */ into ...
Select
A.Ing_Acct_Nbr, currency_Symbol,
Balance_Date, Company_No,
Substr(Account_No,1,8) Account_No,
Substr(Account_No,9,1) Typ_Cd ,
Substr(Account_No,10,1) Chk_Cd,
Td_Balance, Sd_Balance,
Sysdate, 'Sisadmin'
From Ideaal_Cons.Tb_Account_Master_Base A,
Ideaal_Staging.Tb_Sisadmin_Balance B
Where A.Vendor_Acct_Nbr = Substr(B.Account_No,1,8)
And A.Vendor_Cd = b.company_no
;
Edited by: riedelme on Jul 16, 2010 7:42 AM
Creating a form with variable table name(s)
Hi,
I am building a form that will allow user(s) switch to other user
(s) dynamically. I have been able able to do this by creating a
non-database table and use record group to read from the base
table of each user.(All the tables in question have the same
structure).
Record group is a read only, and I want to be able to do data
maninpulation with this form.
The questions are
(1) It possible to do data manipulation with record group and if
it is, how?
(2) Is it possible to build a form with a variable table name?
if possible, how?
Somebody pls help.
Thanks,
Olutunde.
Olutunde Babarinsa (guest) wrote:
: Hi,
: I am building a form that will allow user(s) switch to other
user
: (s) dynamically. I have been able able to do this by creating a
: non-database table and use record group to read from the base
: table of each user.(All the tables in question have the same
: structure).
: Record group is a read only, and I want to be able to do data
: maninpulation with this form.
: The questions are
: (1) It possible to do data manipulation with record group and
if
: it is, how?
: (2) Is it possible to build a form with a variable table name?
: if possible, how?
: Somebody pls help.
: Thanks,
: Olutunde.
Hi,
You can create and manipulate record groups at runtime (see Forms Reference, 'Create_Group' and 'Add_Group_Row'); don't use Create_Group_From_Query. For your purpose it's better to build a cursor loop on your query and add your data, after manipulating it, to your record group with 'Add_Group_Row'.
It's possible to SET_BLOCK_PROPERTY(QUERY_DATA_SOURCE_NAME), but it's not possible to change the item property 'Column Name'. Therefore I would suggest building a non-database block and populating that block with a program unit which works with a PL/SQL cursor loop and 'create record'.
Performance issues with pipelined table functions
I am testing pipelined table functions to be able to re-use the base_query function. Contrary to my understanding, the with_pipeline procedure runs 6 times slower than the legacy no_pipeline procedure. Am I missing something? The processor function is from "Improving performance with pipelined table functions" (http://www.oracle-developer.net/display.php?id=429).
Edit: The underlying query returns 500,000 rows in about 3 minutes, so there are no performance issues with the query itself.
Many thanks in advance.
CREATE OR REPLACE PACKAGE pipeline_example
IS
TYPE resultset_typ IS REF CURSOR;
TYPE row_typ IS RECORD (colC VARCHAR2(200), colD VARCHAR2(200), colE VARCHAR2(200));
TYPE table_typ IS TABLE OF row_typ;
FUNCTION base_query (argA IN VARCHAR2, argB IN VARCHAR2)
RETURN resultset_typ;
c_default_limit CONSTANT PLS_INTEGER := 100;
FUNCTION processor (
p_source_data IN resultset_typ,
p_limit_size IN PLS_INTEGER DEFAULT c_default_limit)
RETURN table_typ
PIPELINED
PARALLEL_ENABLE(PARTITION p_source_data BY ANY);
PROCEDURE with_pipeline (argA IN VARCHAR2,
argB IN VARCHAR2,
o_resultset OUT resultset_typ);
PROCEDURE no_pipeline (argA IN VARCHAR2,
argB IN VARCHAR2,
o_resultset OUT resultset_typ);
END pipeline_example;
CREATE OR REPLACE PACKAGE BODY pipeline_example
IS
FUNCTION base_query (argA IN VARCHAR2, argB IN VARCHAR2)
RETURN resultset_typ
IS
o_resultset resultset_typ;
BEGIN
OPEN o_resultset FOR
SELECT colC, colD, colE
FROM some_table
WHERE colA = ArgA AND colB = argB;
RETURN o_resultset;
END base_query;
FUNCTION processor (
p_source_data IN resultset_typ,
p_limit_size IN PLS_INTEGER DEFAULT c_default_limit)
RETURN table_typ
PIPELINED
PARALLEL_ENABLE(PARTITION p_source_data BY ANY)
IS
aa_source_data table_typ;-- := table_typ ();
BEGIN
LOOP
FETCH p_source_data
BULK COLLECT INTO aa_source_data
LIMIT p_limit_size;
EXIT WHEN aa_source_data.COUNT = 0;
/* Process the batch of (p_limit_size) records... */
FOR i IN 1 .. aa_source_data.COUNT
LOOP
PIPE ROW (aa_source_data (i));
END LOOP;
END LOOP;
CLOSE p_source_data;
RETURN;
END processor;
PROCEDURE with_pipeline (argA IN VARCHAR2,
argB IN VARCHAR2,
o_resultset OUT resultset_typ)
IS
BEGIN
OPEN o_resultset FOR
SELECT /*+ PARALLEL(t, 5) */ colC,
SUM (CASE WHEN colD > colE AND colE != '0' THEN colD / ColE END)de,
SUM (CASE WHEN colE > colD AND colD != '0' THEN colE / ColD END)ed,
SUM (CASE WHEN colD = colE AND colD != '0' THEN '1' END) de_one,
SUM (CASE WHEN colD = '0' OR colE = '0' THEN '0' END) de_zero
FROM TABLE (processor (base_query (argA, argB),100)) t
GROUP BY colC
ORDER BY colC;
END with_pipeline;
PROCEDURE no_pipeline (argA IN VARCHAR2,
argB IN VARCHAR2,
o_resultset OUT resultset_typ)
IS
BEGIN
OPEN o_resultset FOR
SELECT colC,
SUM (CASE WHEN colD > colE AND colE != '0' THEN colD / ColE END)de,
SUM (CASE WHEN colE > colD AND colD != '0' THEN colE / ColD END)ed,
SUM (CASE WHEN colD = colE AND colD != '0' THEN 1 END) de_one,
SUM (CASE WHEN colD = '0' OR colE = '0' THEN '0' END) de_zero
FROM (SELECT colC, colD, colE
FROM some_table
WHERE colA = ArgA AND colB = argB)
GROUP BY colC
ORDER BY colC;
END no_pipeline;
END pipeline_example;
ALTER PACKAGE pipeline_example COMPILE;
Edited by: Earthlink on Nov 14, 2010 9:47 AM
Edited by: Earthlink on Nov 14, 2010 11:31 AM
Edited by: Earthlink on Nov 14, 2010 11:32 AM
Edited by: Earthlink on Nov 20, 2010 12:04 PM
Edited by: Earthlink on Nov 20, 2010 12:54 PM
Earthlink wrote:
Contrary to my understanding, the with_pipeline procedure runs 6 times slower than the legacy no_pipeline procedure. Am I missing something?
Well, we're missing a lot here.
Like:
- a database version
- how did you test
- what data do you have, how is it distributed, indexed
and so on.
If you want to find out what's going on then use a TRACE with wait events.
All necessary steps are explained in these threads:
HOW TO: Post a SQL statement tuning request - template posting
http://oracle-randolf.blogspot.com/2009/02/basic-sql-statement-performance.html
Another nice one is RUNSTATS:
http://asktom.oracle.com/pls/asktom/ASKTOM.download_file?p_file=6551378329289980701 -
Issue with Multiple Tables in Report
Post Author: dwessell
CA Forum: General
Hi,
I'm using Crystal Reports 2k8.
I'm doing a report with three tables, CQ_HEADER, SO_HEADER and SALESPERSON. Both the CQ_HEADER and the SO_HEADER tables link to the SALESPERSON table via a SPN_AUTO_KEY field.
However, I always receive duplicates in my result set due to the joins made, and I don't receive results that are valid in one table and empty in another (such that it only counts a CQ if there is an SO associated with it). Here's the query produced by CR:
SELECT "CQ_HEADER"."CQ_NUMBER", "CQ_HEADER"."ENTRY_DATE", "CQ_HEADER"."TOTAL_PRICE", "SALESPERSON"."SALESPERSON_NAME", "SO_HEADER"."ENTRY_DATE", "SO_HEADER"."TOTAL_PRICE"
FROM "CQ_HEADER" "CQ_HEADER" INNER JOIN ("SO_HEADER" "SO_HEADER" INNER JOIN "SALESPERSON" "SALESPERSON" ON "SO_HEADER"."SPN_AUTO_KEY"="SALESPERSON"."SPN_AUTO_KEY") ON "CQ_HEADER"."SPN_AUTO_KEY"="SALESPERSON"."SPN_AUTO_KEY"
WHERE ("CQ_HEADER"."ENTRY_DATE">={ts '2007-12-01 00:00:00'} AND "CQ_HEADER"."ENTRY_DATE"<{ts '2007-12-18 00:00:00'}) AND ("SO_HEADER"."ENTRY_DATE">={ts '2007-12-01 00:00:00'} AND "SO_HEADER"."ENTRY_DATE"<{ts '2007-12-18 00:00:00'})
ORDER BY "SALESPERSON"."SALESPERSON_NAME"
There is no link between the SO_HEADER and the CQ_HEADER. Can anyone make a suggestion as to how I could go about structuring this such that it doesn't return duplicate values?
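For what it's worth, the duplicates are the expected fan-out of joining two unrelated child tables through the same parent: every CQ row pairs with every SO row for that salesperson. A minimal sketch of the effect (Python/sqlite3, made-up data):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE salesperson (spn_auto_key INTEGER, salesperson_name TEXT);
CREATE TABLE cq_header (cq_number TEXT, spn_auto_key INTEGER);
CREATE TABLE so_header (so_number TEXT, spn_auto_key INTEGER);
INSERT INTO salesperson VALUES (1, 'Smith');
INSERT INTO cq_header VALUES ('CQ1', 1), ('CQ2', 1);
INSERT INTO so_header VALUES ('SO1', 1), ('SO2', 1), ('SO3', 1);
""")

# Two unrelated child tables joined through the same parent fan out:
# 2 CQ rows x 3 SO rows for salesperson 1 gives 6 result rows.
rows = con.execute("""
    SELECT cq.cq_number, so.so_number
    FROM cq_header cq
    JOIN salesperson sp ON cq.spn_auto_key = sp.spn_auto_key
    JOIN so_header so ON so.spn_auto_key = sp.spn_auto_key
""").fetchall()
```

That cross-multiplication is why aggregates double-count; the usual remedies are separate queries (or subreports) per child table, or pre-aggregating each child before joining.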
Thanks
David -
Rebate related issue with database table VKDFS & VBAK
Hi everybody,
I am facing a problem with the tables VKDFS and VBAK.
In my program, the report has to display the details of the agreement numbers concerning the sales or billing documents; later on, it has to create a credit memo for that particular customer.
In the code, at the very beginning, the program fetches all sales documents from VKDFS as per selections like the following.
select * from vkdfs into table ivkdfs
where fktyp in r_fktyp
and vkorg in s_vkorg
and fkdat in s_fkdat
and kunnr in s_kunnr
and fkart in s_fkart
and vbeln in s_vbeln
and faksk in s_faksk
and vtweg in s_vtweg
and spart in s_spart
and netwr in s_netwr
and waerk in s_waerk.
After this, for all the sales orders fetched here, it again fetches from the VBAK table as follows.
SVBAK[] = IVKDFS[]
select * from vbak into table ivbak
for all entries in svbak
where vbeln = svbak-vbeln
and knuma in s_knuma
and auart in s_auart
and submi in s_submi
and (vbak_wtab).
So, it is filtering from VBAK.
But the exact issue is that there is one sales order which is available in VBAK but not available in the VKDFS table.
So, my program fails to display the report for that agreement number.
As per my analysis, I found that there are no entries in the VKDFS table for the sales orders in VBAK concerning agreement numbers.
VKDFS - SD index: billing initiator table.
I want to know how this VKDFS table is updated against the VBAK table, and if possible, how to make this entry in that table for the values in VBAK. But it should not affect other tables.
Please let us know if you have a solution.
It's an urgent sev 1 ticket;
eagerly waiting for a solution or some information.
Thanks & Regards.
J.