Taking too much time to save a PO with more than 600 line items
Hi,
We are trying to save a PO with more than 600 line items, but it is taking too much time to save: more than 1 hour.
Kindly let me know whether there is any restriction on the number of line items in a PO. Please guide.
Regards,
Sanjay
Hi,
I suggest you run a trace (transaction ST05) to identify the bottleneck.
You can find some possible causes in Note 205005 - Performance composite note: Purchase order.
I hope this helps.
Regards,
Eduardo
Similar Messages
-
JDBC adapter taking too much time for inserting
We have configured an "UPDATE_INSERT" query in the JDBC adapter to insert these records into the database. The payload consists of 3 tables, with 29 fields in the first and 4 fields in each of the other two.
While the message in XI gets processed in just 1-3 secs, the database processing time goes up to as much as 8 minutes for one record to get inserted.
As an immediate solution, is there any way we can have the JDBC adapter process these messages faster? These messages get queued up, and hence all the other messages also get queued up, delaying the other interfaces. We have a central adapter engine...
Also, is there any way we can get an alert when the status is "Processing/To be delivered/Delivering" and the message count exceeds a certain number, say 1000? I am using only one receiver JDBC channel.
We have been inserting into three different tables by using 3 different statement tags, i.e. statement1 (for table1), statement2 (for table2), statement3 (for table3).
My structure is:

<MessageTypeName>
  <Statement1>
    <Tag1>
      <action>UPDATE_INSERT</action>
      <table>Table1</table>
      <access>
        <Field1/>
        <Field2/>
        <!-- ... up to Field28 -->
      </access>
      <key>
        <MatNumber/>
      </key>
    </Tag1>
  </Statement1>
  <Statement2>
    <Tag2>
      <action>UPDATE_INSERT</action>
      <table>Table2</table>
      <access>
        <Field1/>
        <Field2/>
        <Field3/>
        <Field4/>
      </access>
      <key>
        <MatNumber/>
      </key>
    </Tag2>
  </Statement2>
  <Statement3>
    <Tag3>
      <action>UPDATE_INSERT</action>
      <table>Table3</table>
      <access>
        <Field1/>
        <Field2/>
        <Field3/>
        <Field4/>
      </access>
      <key>
        <MatNumber/>
      </key>
    </Tag3>
  </Statement3>
</MessageTypeName>

You can see we are also using a key. The first table has 28 fields; the second and third have 4 each.
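For readers wondering why per-record upserts are so much slower than batched ones, here is a minimal, generic sketch (Python with sqlite3 standing in for the target database; the table, column, and value names are made up and are not the actual XI configuration):

```python
import sqlite3

# Hypothetical stand-in for Table1: the same UPDATE_INSERT ("upsert")
# pattern the adapter performs, done row-by-row vs. in one batch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table1 (mat_number TEXT PRIMARY KEY, field1 TEXT)")

rows = [(f"MAT{i:04d}", f"value{i}") for i in range(1000)]

# Row-by-row: one statement execution (and, over a network, one round trip)
# per record, which is where the time goes.
for mat, val in rows:
    conn.execute(
        "INSERT OR REPLACE INTO table1 (mat_number, field1) VALUES (?, ?)",
        (mat, val),
    )
conn.commit()

# Batched: one prepared statement executed over the whole set, one commit.
conn.executemany(
    "INSERT OR REPLACE INTO table1 (mat_number, field1) VALUES (?, ?)",
    rows,
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM table1").fetchone()[0]
print(count)  # 1000 rows either way; the batched form avoids per-row overhead
```

The same idea applied to the JDBC adapter is to let the database receive one batched statement per message rather than one statement per record; the exact batching settings depend on the adapter configuration.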
Edited by: rajesh shanmugasundaram on Jul 31, 2008 11:08 AM -
Variable screen is taking too much time for some reports
Hi all,
We are using a sales billing cube for our billing reports. For one query, the variable screen takes a lot of time to appear. In RSRT I have generated the report and also executed it there, and it is fine there. Moreover, I am not using any customer exit variable for this.
Can anybody suggest what the exact problem is and how I can improve further? Will it be good enough if I create indexes on the columns of the temp table? Or what if I use derived tables?
I suggest you start with index tuning. Specifically, make sure the columns used in the WHERE and JOIN clauses are properly indexed (ideally clustered or covering, and unique when possible). Changing outer joins to inner joins is appropriate if you don't need the outer joins in the first place.
Dan Guzman, SQL Server MVP, http://www.dbdelta.com -
Query is taking too much time for inserting into a temp table and for spooling
Hi,
I am working on a query optimization project where I have found a query which takes an extremely long time to execute.
Temp table is defined as follows:
DECLARE @CastSummary TABLE (CastID INT, SalesOrderID INT, ProductionOrderID INT, Actual FLOAT,
ProductionOrderNo NVARCHAR(50), SalesOrderNo NVARCHAR(50), Customer NVARCHAR(MAX), Targets FLOAT)
SELECT
C.CastID,
SO.SalesOrderID,
PO.ProductionOrderID,
F.CalculatedWeight,
PO.ProductionOrderNo,
SO.SalesOrderNo,
SC.Name,
SO.OrderQty
FROM
CastCast C
JOIN Sales.Production PO ON PO.ProductionOrderID = C.ProductionOrderID
JOIN Sales.ProductionDetail d ON d.ProductionOrderID = PO.ProductionOrderID
LEFT JOIN Sales.SalesOrder SO ON d.SalesOrderID = SO.SalesOrderID
LEFT JOIN FinishedGoods.Equipment F ON F.CastID = C.CastID
JOIN Sales.Customer SC ON SC.CustomerID = SO.CustomerID
WHERE
(C.CreatedDate >= @StartDate AND C.CreatedDate < @EndDate)
It takes almost 33% of the cost for the Table Insert when I insert the data into the temp table, and then 67% for Spooling. I removed 2 LEFT JOINs from the above query, changing them to plain JOINs, and tried again. Query execution became a bit faster, but it still needs improvement.
How can I improve further? Will it be good enough if I create indexes on the columns of the temp table? Or what if I use derived tables? Please suggest.
-Pep
"How can I improve further? Will it be good enough if I create indexes on the columns of the temp table? Or what if I use derived tables?"
I suggest you start with index tuning. Specifically, make sure the columns used in the WHERE and JOIN clauses are properly indexed (ideally clustered or covering, and unique when possible). Changing outer joins to inner joins is appropriate if you don't need the outer joins in the first place.
Dan Guzman, SQL Server MVP, http://www.dbdelta.com -
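To illustrate the index-tuning advice with something runnable, here is a small sketch (SQLite via Python rather than the poster's SQL Server; the simplified table and index names are made up) showing how to confirm that a WHERE-clause column actually gets an index seek instead of a full scan:

```python
import sqlite3

# Index the columns used in JOIN/WHERE clauses and confirm the planner
# actually uses them. Table and column names are simplified stand-ins
# for the query in the thread above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE cast_cast (cast_id INTEGER PRIMARY KEY,
                            production_order_id INTEGER,
                            created_date TEXT);
""")

query = ("SELECT * FROM cast_cast "
         "WHERE created_date >= '2023-01-01' AND created_date < '2023-02-01'")

# Without an index, the date-range filter forces a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Index the WHERE column (and the JOIN column) the query filters on.
conn.execute("CREATE INDEX ix_cast_created ON cast_cast (created_date)")
conn.execute("CREATE INDEX ix_cast_po ON cast_cast (production_order_id)")

plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][-1])  # SCAN ...
print(plan_after[0][-1])   # SEARCH ... USING INDEX ix_cast_created ...
```

On SQL Server the equivalent check is the actual execution plan: look for an Index Seek replacing a Table Scan or Clustered Index Scan.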
UTL_FILE request taking too much time to complete
Hi,
We are running a UTL_FILE request.
The SELECT statement fetches 6000 records, and it is taking 22 minutes to complete.
The code is:
CREATE OR REPLACE PACKAGE BODY "XXC"."XXC_MOD_IN_068_AP_TO_FIS_PKG"
AS
/*
* Module Type : PL/SQL
* Module Name : XXC_MOD_IN_068_AP_FIS_PKG
* Description : This package is used for AP to Fiscal Interface.
* Run Env. : SQL*Plus
* Procedure Name     Description
* XXC_MOD_068_AP_PR  XXC_MOD_068_AP_PR Procedure is used to insert transactions
*                    into CSV OutPut File from Oracle Account Payables.
* Calling Module: None
* Modules Called: XXC_COMMON_INT_PK.INSERT_AUDIT
* Module Number : MOD_IN_068
* Known Bugs and Restrictions: none
* History
* =======
* Version  Name             Date         Description of Change
* 0.1      Sanjeev Khurana  25-JULY-2011 Initial Creation.
* 0.2      Rohit            09-DEC-2011  Updated header details for the file
* 0.3      Amit Kulwal      28-AUG-2012  Updated the cursor query for incident 671520
* 0.4      Swaraj Goud      20-Nov-2012  Updated as per the CR 671520
*/
-- Actual Code Start Here
-- Procedure : XXC_MOD_068_AP_PR
-- Description : XXC_MOD_068_AP_PR Procedure is used to insert transactions
-- into CSV OUTPUT File from Oracle Account Payables.
-- Parameters:
-- Parm Name I/O Description
-- p_errbuf OUT Error message.
-- p_retcode OUT Error code. Returns 0 if no errors otherwise returns 1.
-- p_start_date IN Start Date
-- p_end_date IN End Date
PROCEDURE xxc_mod_068_ap_pr (
p_errbuf OUT VARCHAR2,
p_retcode OUT NUMBER,
p_start_date IN VARCHAR2,
p_end_date IN VARCHAR2
)
IS
-- Define variables and assign default values
l_sucess_count NUMBER := 0;
l_error_count NUMBER := 0;
-- Standard declaration
l_source VARCHAR2 (10);
l_target VARCHAR2 (10);
lc_module_description VARCHAR2 (50)
:= 'MOD_IN_068 - AP to Fiscal';
l_status CONSTANT VARCHAR2 (50) := 'NEW';
p_status NUMBER;
l_batch_id NUMBER;
l_batch_id_next NUMBER
:= apps_common_out_batch_id_s1.NEXTVAL;
l_proc_name VARCHAR2 (100) := 'XXC_MOD_IN_068';
l_request_id NUMBER
:= fnd_global.conc_request_id;
l_audit_master_id NUMBER := NULL;
l_mod_code VARCHAR2 (100);
l_log_type NUMBER := 1; --INFORMATION
l_det_status_success NUMBER := 0; --SUCCESS
l_det_status_inprocess NUMBER := 3; --INPROCESS
l_det_status_rejected NUMBER := 4; --REJECTED
l_det_status_err NUMBER := 3; --Error
l_det_status_complete NUMBER := 9; --COMPLETE
-- Standard who Columns
l_created_by NUMBER := fnd_global.user_id;
l_creation_date DATE := SYSDATE;
l_last_update_date DATE := SYSDATE;
l_last_update_login NUMBER := fnd_global.user_id;
v_file UTL_FILE.file_type;
l_location VARCHAR2 (150);
l_archive_location VARCHAR2 (150);
l_date VARCHAR2 (50);
l_filename VARCHAR2 (50);
l_open_mode VARCHAR2 (1) := 'W';
--- l_max_linesize NUMBER := 32767;
l_max_linesize NUMBER := 32767; -- must be NUMBER; declaring it VARCHAR2 (150) as on 09-Nov-2012 leaves it NULL in FOPEN
--Cursor is used to fetch valid records for the interface
CURSOR get_ap_data_inv
IS
SELECT asp.segment1 supplier_ref,
-- asp.vendor_name supplier_name,
replace(asp.vendor_name, ',', ' ') supplier_name,
--aia.invoice_num invoice_number,
replace(aia.invoice_num, ',','') invoice_number,
aia.invoice_date,
aia.invoice_amount amount,
aia.doc_sequence_value unique_id,
aia.creation_date date_invoice_entered,
apsa.due_date date_invoice_paid,
aia.SOURCE user_id,
aia.payment_status_flag,
aia.invoice_type_lookup_code doc_type,
--aia.description,
replace(aia.description, ',' , ' ') description,
apsa.gross_amount
FROM ap_invoices_all aia,
ap_suppliers asp,
apps.ap_payment_schedules_all apsa
-- apps.iby_payments_all iba,
-- apps.iby_docs_payable_all dp
WHERE aia.invoice_id = apsa.invoice_id
AND aia.vendor_id = asp.vendor_id
AND aia.org_id = apsa.org_id
-- AND apsa.payment_status_flag != 'Y' -- commented for CR
-- AND dp.payment_id = iba.payment_id(+)
-- AND aia.invoice_id = dp.calling_app_doc_unique_ref2(+)
-- AND apsa.due_date <= (SYSDATE + 1)
AND TRUNC (aia.creation_date)
BETWEEN NVL (fnd_date.canonical_to_date (p_start_date),
TRUNC (aia.creation_date))
AND NVL (fnd_date.canonical_to_date (p_end_date),
TRUNC (aia.creation_date));
TYPE xxc_tbl IS TABLE OF get_ap_data_inv%ROWTYPE;
xxc_tbl1 xxc_tbl;
BEGIN
l_batch_id := apps_common_out_batch_id_s1.CURRVAL;
xxc_common_int_pk.insert_audit (p_batch_id => l_batch_id,
p_request_id => l_request_id,
p_source_system => l_source,
p_proc_name => l_proc_name,
p_log_type => l_log_type,
p_det_status => l_det_status_inprocess,
p_msg_code => NULL,
p_entity => NULL,
p_msg_desc => 'Process Starts',
p_mast_request_id => l_request_id,
p_record_id => NULL,
p_source => l_source,
p_target => l_target,
p_email => NULL,
p_mod_code => l_mod_code,
p_audit_master_id => l_audit_master_id
);
-- Get Module Code
BEGIN
SELECT TRIM (fval.flex_value),
TRIM (SUBSTR (fval.description,
1,
INSTR (fval.description, ' -')))
INTO l_source,
l_mod_code
FROM fnd_flex_values_vl fval, fnd_flex_value_sets vset
WHERE vset.flex_value_set_id = fval.flex_value_set_id
AND vset.flex_value_set_name IN ('XXC_COMM_INT_CONFIG')
AND fval.enabled_flag = 'Y'
AND fval.description = lc_module_description;
EXCEPTION
WHEN OTHERS
THEN
xxc_common_int_pk.insert_audit (p_batch_id => l_batch_id,
p_request_id => l_request_id,
p_source_system => l_source,
p_proc_name => l_proc_name,
p_log_type => l_log_type,
p_det_status => l_det_status_err,
p_msg_code => NULL,
p_entity => NULL,
p_msg_desc => 'Error Mode Code',
p_mast_request_id => l_request_id,
p_record_id => NULL,
p_source => l_source,
p_target => l_target,
p_email => NULL,
p_mod_code => l_mod_code,
p_audit_master_id => l_audit_master_id
);
raise_application_error (-20045, SQLERRM);
END;
--File Location Path for OutPut File
BEGIN
SELECT fnd_profile.VALUE ('XXC_MOD_IN_068_AP_OUTBOUND'),
fnd_profile.VALUE ('XXC_MOD_IN_068_AP_ARCHIVE')
INTO l_location,
l_archive_location
FROM DUAL;
EXCEPTION
WHEN OTHERS
THEN
xxc_common_int_pk.insert_audit
(p_batch_id => l_batch_id,
p_request_id => l_request_id,
p_source_system => l_source,
p_proc_name => l_proc_name,
p_log_type => l_log_type,
p_det_status => l_det_status_rejected,
p_msg_code => NULL,
p_entity => NULL,
p_msg_desc => 'Profile Value not found',
p_mast_request_id => l_request_id,
p_record_id => NULL,
p_source => l_source,
p_target => l_target,
p_email => NULL,
p_mod_code => l_mod_code,
p_audit_master_id => l_audit_master_id
);
END;
BEGIN
SELECT TO_CHAR (SYSDATE, 'YYYYMMDDhh24miss')
INTO l_date
FROM DUAL;
EXCEPTION
WHEN OTHERS
THEN
xxc_common_int_pk.insert_audit (p_batch_id => l_batch_id,
p_request_id => l_request_id,
p_source_system => l_source,
p_proc_name => l_proc_name,
p_log_type => l_log_type,
p_det_status => l_det_status_rejected,
p_msg_code => NULL,
p_entity => NULL,
p_msg_desc => 'status not found',
p_mast_request_id => l_request_id,
p_record_id => NULL,
p_source => l_source,
p_target => l_target,
p_email => NULL,
p_mod_code => l_mod_code,
p_audit_master_id => l_audit_master_id
);
END;
l_filename := 'AP_Fiscal_' || l_date || '.csv';
v_file :=
UTL_FILE.fopen (LOCATION => l_location,
filename => l_filename,
open_mode => l_open_mode,
max_linesize => l_max_linesize);
-- Changed as per Sarah's email on 9th December
/* UTL_FILE.put_line (v_file,
'SUPPLIER_REF'
|| ','
|| 'SUPPLIER_NAME'
|| ','
|| 'INVOICE_NUMBER'
|| ','
|| 'INVOICE_DATE'
|| ','
|| 'AMOUNT'
|| ','
|| 'UNIQUE_ID'
|| ','
|| 'DATE_INVOICE_ENTERED'
|| ','
|| 'DATE_INVOICE_PAID'
|| ','
|| 'USER_ID'
|| ','
|| 'PAYMENT_STATUS_FLAG'
|| ','
|| 'DOC_TYPE'
|| ','
|| 'DESCRIPTION'
|| ','
|| 'PAYMENT_AMOUNT');
*/
UTL_FILE.put_line (v_file,
'SUPPLIERREF'
|| ','
|| 'SUPPLIERNAME'
|| ','
|| 'INVOICENUMBER'
|| ','
|| 'DATE'
|| ','
|| 'AMOUNT'
|| ','
|| 'UNIQUEID'
|| ','
|| 'DATEINVOICEENTERED'
|| ','
|| 'DATEINVOICEPAID'
|| ','
|| 'USERID'
|| ','
|| 'PAYMENTSTATUS'
|| ','
|| 'DOCTYPE'
|| ','
|| 'DESCRIPTION'
|| ','
|| 'PAYMENTAMOUNT');
UTL_FILE.put_line (v_file,
'XX'
|| ','
|| 'XX'
|| ','
|| 'XX'
|| ','
|| 'XX'
|| ','
|| 'XX'
|| ','
|| 'XX'
|| ','
|| 'XX'
|| ','
|| 'XX'
|| ','
|| 'XX'
|| ','
|| 'XX'
|| ','
|| 'XX'
|| ','
|| 'XX'
|| ','
|| 'XX');
open get_ap_data_inv;
loop
fetch get_ap_data_inv bulk collect into xxc_tbl1 limit 6000;
fnd_file.put_line(fnd_file.log, 'Cursor Count is : '||xxc_tbl1.count);
for i in 1 .. xxc_tbl1.count -- 1..COUNT is safe when the fetch returns no rows (FIRST would be NULL)
--FOR cur_rec IN get_ap_data_inv
LOOP
BEGIN
--Common package used for proper sequence for Record_id and Bath_id
l_sucess_count := l_sucess_count + 1;
--Insert into CSV file
fnd_file.put_line (fnd_file.LOG, 'Before Utl file');
UTL_FILE.put_line (v_file,
xxc_tbl1(i).supplier_ref
|| ','
|| xxc_tbl1(i).supplier_name
|| ','
|| xxc_tbl1(i).invoice_number
|| ','
|| xxc_tbl1(i).invoice_date
|| ','
|| xxc_tbl1(i).amount
|| ','
|| xxc_tbl1(i).unique_id
|| ','
|| xxc_tbl1(i).date_invoice_entered
|| ','
|| xxc_tbl1(i).date_invoice_paid
|| ','
|| xxc_tbl1(i).user_id
|| ','
|| xxc_tbl1(i).payment_status_flag
|| ','
|| xxc_tbl1(i).doc_type
|| ','
|| xxc_tbl1(i).description
|| ','
|| xxc_tbl1(i).gross_amount);
fnd_file.put_line (fnd_file.LOG,
'Supplier Reference : ' || xxc_tbl1(i).supplier_ref);
xxc_common_int_pk.insert_audit
(p_batch_id => l_batch_id,
p_request_id => l_request_id,
p_source_system => l_source,
p_proc_name => l_proc_name,
p_log_type => l_log_type,
p_det_status => l_det_status_complete,
p_msg_code => NULL,
p_entity => NULL,
p_msg_desc => 'Inserting records from AP to Fiscal Successfully',
p_mast_request_id => l_request_id,
p_record_id => NULL,
p_source => l_source,
p_target => l_target,
p_email => NULL,
p_mod_code => l_mod_code,
p_audit_master_id => l_audit_master_id
);
EXCEPTION
WHEN OTHERS
THEN
l_error_count := l_error_count + 1;
fnd_file.put_line (fnd_file.LOG,
'Error While Inserting from AP to Fiscal '
|| SQLERRM);
-- Create audit log for AP Inv records insert Exception
--Insert into the Audit table XXC_COMM_AUDIT_DETAIL_LOG and XXC_COMM_AUDIT_MASTER_LOG
xxc_common_int_pk.insert_audit
(p_batch_id => l_batch_id,
p_request_id => l_request_id,
p_source_system => l_source,
p_proc_name => l_proc_name,
p_log_type => l_log_type,
p_det_status => l_det_status_rejected,
p_msg_code => NULL,
p_entity => NULL,
p_msg_desc => 'Error While Inserting from AP to Fiscal',
p_mast_request_id => l_request_id,
p_record_id => NULL,
p_source => l_source,
p_target => l_target,
p_email => NULL,
p_mod_code => l_mod_code,
p_audit_master_id => l_audit_master_id
);
END;
END LOOP;
exit when get_ap_data_inv%NOTFOUND;
end loop;
close get_ap_data_inv;
UTL_FILE.fclose (v_file);
UTL_FILE.fcopy (l_location,
l_filename,
l_archive_location,
l_filename);
-- Create audit log for Successfully processed records
-- Procedure call to insert in Audit tables
xxc_common_int_pk.insert_audit
(p_batch_id => l_batch_id,
p_request_id => l_request_id,
p_source_system => l_source,
p_proc_name => l_proc_name,
p_log_type => l_log_type,
p_det_status => l_det_status_complete,
p_msg_code => NULL,
p_entity => NULL,
p_msg_desc => 'Compeleted Sucessfully AP to Fiscal',
p_mast_request_id => l_request_id,
p_record_id => NULL,
p_source => l_source,
p_target => l_target,
p_email => NULL,
p_mod_code => l_mod_code,
p_audit_master_id => l_audit_master_id
);
--Insert into the Audit table XXC_COMM_AUDIT_DETAIL_LOG and XXC_COMM_AUDIT_MASTER_LOG for populating email drop table
BEGIN
SELECT transaction_status
INTO p_status
FROM xxc_comm_audit_master_log
WHERE audit_master_id = l_audit_master_id;
EXCEPTION
WHEN OTHERS
THEN
xxc_common_int_pk.insert_audit (p_batch_id => l_batch_id,
p_request_id => l_request_id,
p_source_system => l_source,
p_proc_name => l_proc_name,
p_log_type => l_log_type,
p_det_status => l_det_status_err,
p_msg_code => NULL,
p_entity => NULL,
p_msg_desc => 'Status Error',
p_mast_request_id => l_request_id,
p_record_id => NULL,
p_source => l_source,
p_target => l_target,
p_email => NULL,
p_mod_code => l_mod_code,
p_audit_master_id => l_audit_master_id
);
END;
IF p_status <> 0
THEN
xxc_comm_audit_log_pk.populate_email_drop_table (l_audit_master_id,
l_batch_id);
END IF;
EXCEPTION
WHEN UTL_FILE.invalid_path
THEN
UTL_FILE.fclose (v_file);
raise_application_error (-20000, 'File location is invalid.');
WHEN UTL_FILE.invalid_mode
THEN
UTL_FILE.fclose (v_file);
raise_application_error (-20001,
'The open_mode parameter in FOPEN is invalid.');
WHEN UTL_FILE.invalid_filehandle
THEN
UTL_FILE.fclose (v_file);
raise_application_error (-20002, 'File handle is invalid.');
WHEN UTL_FILE.invalid_operation
THEN
UTL_FILE.fclose (v_file);
raise_application_error
(-20003,
'File could not be opened or operated on as requested.');
WHEN UTL_FILE.write_error
THEN
UTL_FILE.fclose (v_file);
raise_application_error
(-20005,
'Operating system error occurred during the write operation.');
WHEN UTL_FILE.file_open
THEN
UTL_FILE.fclose (v_file);
raise_application_error
(-20008,
'The requested operation failed because the file is open.');
WHEN UTL_FILE.invalid_maxlinesize
THEN
UTL_FILE.fclose (v_file);
raise_application_error
(-20009,
'The MAX_LINESIZE value for FOPEN() is invalid; it should '
|| 'be within the range 1 to 32767.');
WHEN OTHERS
THEN
UTL_FILE.fclose (v_file); -- close the file first; nothing after the raise would run
raise_application_error (-20045, SQLERRM);
END xxc_mod_068_ap_pr;
END xxc_mod_in_068_ap_to_fis_pkg;
Show Errors
I have implemented the BULK COLLECT concept in the program; can anyone please suggest how I can improve performance further?
Thanks,
Rakesh
Please verify... -
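For comparison, here is a generic sketch of the batching pattern the code above is reaching for (plain Python, with sqlite3 and an in-memory buffer standing in for Oracle, UTL_FILE, and the audit tables; all names are illustrative). The key point is that per-row work inside the loop, such as the per-record insert_audit call and log line in the PL/SQL above, often dominates the runtime, so doing that work once per batch is where the time is likely going:

```python
import csv
import io
import sqlite3

# Fetch rows in batches (like BULK COLLECT ... LIMIT) and write each batch
# to the CSV in one buffered pass, keeping logging out of the per-row loop.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ap_invoices (supplier_ref TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO ap_invoices VALUES (?, ?)",
    [(f"SUP{i:05d}", i * 1.5) for i in range(6000)],
)

out = io.StringIO()       # stands in for the UTL_FILE handle
writer = csv.writer(out)  # the csv module also quotes embedded commas properly
writer.writerow(["SUPPLIERREF", "AMOUNT"])

cur = conn.execute("SELECT supplier_ref, amount FROM ap_invoices")
rows_written = 0
while True:
    batch = cur.fetchmany(1000)  # like BULK COLLECT ... LIMIT 1000
    if not batch:
        break
    writer.writerows(batch)      # one buffered write per batch, not per row
    rows_written += len(batch)
    # write one audit/log entry per batch here, not per record

print(rows_written)  # 6000
```

In the PL/SQL itself, the analogous change is to move the insert_audit call and the fnd_file log line out of the per-record loop (or do them once per batch).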
0BBP_BPART_BUYID_ATTR Datasource taking too much time for full load
Hello Gurus,
This DataSource fetches 95,000 records in five hours on a full load. There is no delta capability. When I run a trace and try to look at the code, it has Oracle and MS SQL indexes but not DB2, and we are on a DB2 database. Also, all the fields are hidden except Business Partner and Buyer ID. I did not find any notes on this DataSource.
Did anyone face this issue? Please help me make this loading faster in daily full loads.
Hi,
Divide the load by using selections in different InfoPackages.
Thanks
HP -
Spatial query with sdo_aggregate_union taking too much time
Hello friends,
The following query is taking too much time to execute.
table1 contains around 2000 records.
table2 contains 124 rows.
SELECT
table1.id
, table1.txt
, table1.id2
, table1.acti
, table1.acti
, table1.geom as geom
FROM
table1
WHERE
sdo_relate(
table1.geom,
(select sdo_aggr_union(sdoaggrtype(geom, 0.0005)) from table2),
'mask=(ANYINTERACT) querytype=window'
) = 'TRUE'
I am new to spatial. I am trying to find the list of geometries that fall within the geometries stored in table2.
Thanks
Hi, thanks a lot for your reply.
But it should not be required to use the sdo_aggr_union function to find out whether a geometry in one table is within another geometry.
Let me give you a clearer picture...
What I am trying to do is: table1 contains the list of all stations (station information) of the state, and table2 contains the list of city areas. So I want to find the stations that belong to a city.
For this I thought to get the aggregated union of the city areas and then check for any interaction of that final aggregated result with the station geometry, to determine whether the station is in the city or not.
I hope this helps you understand my query.
Thanks
I appreciate your efforts. -
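A runnable sketch of that suggestion (plain Python; the ray-casting test and the sample polygons and points are illustrative stand-ins, not Oracle Spatial): test each station against the individual city areas and short-circuit on the first match, instead of building the union of all areas first.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: count edge crossings of a ray going right from (x, y)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal line at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# table2 stand-in: two city areas as coordinate rings
city_areas = [
    [(0, 0), (4, 0), (4, 4), (0, 4)],          # city A
    [(10, 10), (14, 10), (14, 14), (10, 14)],  # city B
]

# table1 stand-in: stations as points
stations = {"S1": (2, 2), "S2": (12, 11), "S3": (7, 7)}

# any() short-circuits, so there is no need to union all areas first
in_city = {
    name: any(point_in_polygon(x, y, area) for area in city_areas)
    for name, (x, y) in stations.items()
}
print(in_city)  # {'S1': True, 'S2': True, 'S3': False}
```

In Oracle Spatial, the analogous approach is to relate each station to the city-area rows directly (e.g. an ANYINTERACT check per row) rather than aggregating table2 first, assuming both geometry columns are spatially indexed.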
ACCTIT table Taking too much time
Hi,
In SE16, for the ACCTIT table I entered the G/L account number; after executing it in production, it is taking too much time to show the result.
Thank you.
Hi,
Here I am sending the details of the technical settings:
Name ACCTIT Transparent Table
Short text Compressed Data from FI/CO Document
Last Change SAP 10.02.2005
Status Active Saved
Data class APPL1 Transaction data, transparent tables
Size category 4 Data records expected: 24,000 to 89,000
Thank you -
SAP GUI taking too much time to open transactions
Hi guys,
I have completed a system copy from the Production to the Quality server.
After that I started SAP on the Quality server, and it is taking too much time to open SAP transactions (they go into compilation mode).
I started SGEN, but it is giving TIME_OUT errors. Please help me with this issue.
MY hardware details on quality server
operating system : SuSE Linux 10 SP2
Database : 10.2.0.2
SAP : ECC 6.0 SR2
RAM size : 8 GB
Hard disk space : 500 GB
swap space : 16 GB.
regards
Ramesh
Hi,
> I started SGEN, but it is giving TIME_OUT errors. Please help me with this issue.
You are supposed to run SGEN as a batch job, so it should not be possible to get dialog time-out errors there.
I've seen a full SGEN last anywhere from 3 hours on high-end systems to 8 full days on PC hardware...
Regards,
Olivier -
SetDataSource function taking too much time !
Hi,
We are using Crystal Reports 2008 to develop our applications as Visual Studio .NET 2005 Windows applications. We are using the Crystal viewer ActiveX control in our Windows application to display the reports. The data source we are using is strongly typed XSD files, and we have multiple XSD files for each report, with multiple tables in each.
We are using the SetDataSource function to push the data for each table from .NET code into the report. But this part takes a lot of time.
Kindly help.
Thanks,
Shyam S
Hi Johnathan,
We are passing the data from the database to Crystal Reports using XSD files.
These XSD dataset files are included in our project. First we populate the data into objects of these dataset files, then we use the SetDataSource function of the CrystalDocument class to assign the data from the XSD tables to the respective tables in the report.
The report database is constructed using an ADO.NET (XML) connection pointing to the XSD files, and the tables that show up in the report database are used in the report.
These SetDataSource calls take around 20 seconds to process, even though the dataset objects we have in the code are the same objects of the XSD files used in the report. So it is just an assignment of data from code to the report, which should not take so much time. Also, the datasets don't return more than 20 rows.
We are using version 12.1.0.892 of Crystal Reports 2008. Yes, I have installed SP1 for Crystal Reports 2008.
One more observation: the more tables I have in an XSD file, the more time it takes, even though I may be using only one table from that particular XSD file. I checked this by noting the time differences after adding unused tables to the existing XSD files.
So I think that even though I am assigning only one table with this statement, it loads the whole XSD file before loading the data:
CrystalDocumentObj.Database.Tables["CON_InvoiceSummary"].SetDataSource(reportConfigurationDS.Tables["CON_InvoiceSummary"]);
Is there anything that can be done?
Thanks,
Shyam S -
Filling aggregates is taking too much time
Hello,
Filling aggregates is taking too much time for around 17,000 records, and we have checked the cube consistency in RSRV; everything is fine.
How can I trace the problem?
Thanks in advance.
Ravinder
Hi Ravinder,
Can you tell me how many records there are in the cube at line-item level?
Regards
Amar. -
Auto Invoice Program taking too much time : problem with update sql
Hi ,
Oracle db version 11.2.0.3
Oracle EBS version : 12.1.3
Though we have a Sev-1 SR open with Oracle, we have not had much success.
We have an Auto Invoice program that runs many times a day, and it has been taking too much time since the beginning. While troubleshooting we found one query that takes most of the time, and we seek suggestions on how to tune it. I am attaching the explain plan for the same. It is an update query. Please guide.
Plan
UPDATE STATEMENT ALL_ROWSCost: 0 Bytes: 124 Cardinality: 1
50 UPDATE AR.RA_CUST_TRX_LINE_GL_DIST_ALL
27 FILTER
26 HASH JOIN Cost: 8,937,633 Bytes: 4,261,258,760 Cardinality: 34,364,990
24 VIEW VIEW SYS.VW_NSO_1 Cost: 8,618,413 Bytes: 446,744,870 Cardinality: 34,364,990
23 SORT UNIQUE Cost: 8,618,413 Bytes: 4,042,339,978 Cardinality: 34,364,990
22 UNION-ALL
9 FILTER
8 SORT GROUP BY Cost: 5,643,052 Bytes: 3,164,892,625 Cardinality: 25,319,141
7 HASH JOIN Cost: 1,640,602 Bytes: 32,460,436,875 Cardinality: 259,683,495
1 TABLE ACCESS FULL TABLE AR.RA_CUST_TRX_LINE_GL_DIST_ALL Cost: 154,993 Bytes: 402,499,500 Cardinality: 20,124,975
6 HASH JOIN Cost: 853,567 Bytes: 22,544,143,440 Cardinality: 214,706,128
4 HASH JOIN Cost: 536,708 Bytes: 2,357,000,550 Cardinality: 29,835,450
2 TABLE ACCESS FULL TABLE AR.RA_CUST_TRX_LINE_GL_DIST_ALL Cost: 153,008 Bytes: 1,163,582,550 Cardinality: 29,835,450
3 TABLE ACCESS FULL TABLE AR.RA_CUSTOMER_TRX_LINES_ALL Cost: 307,314 Bytes: 1,193,526,000 Cardinality: 29,838,150
5 TABLE ACCESS FULL TABLE AR.RA_CUSTOMER_TRX_ALL Cost: 132,951 Bytes: 3,123,197,116 Cardinality: 120,122,966
21 FILTER
20 SORT GROUP BY Cost: 2,975,360 Bytes: 877,447,353 Cardinality: 9,045,849
19 HASH JOIN Cost: 998,323 Bytes: 17,548,946,769 Cardinality: 180,916,977
13 VIEW VIEW AR.index$_join$_027 Cost: 108,438 Bytes: 867,771,256 Cardinality: 78,888,296
12 HASH JOIN
10 INDEX RANGE SCAN INDEX AR.RA_CUSTOMER_TRX_N15 Cost: 58,206 Bytes: 867,771,256 Cardinality: 78,888,296
11 INDEX FAST FULL SCAN INDEX (UNIQUE) AR.RA_CUSTOMER_TRX_U1 Cost: 62,322 Bytes: 867,771,256 Cardinality: 78,888,296
18 HASH JOIN Cost: 748,497 Bytes: 3,281,713,302 Cardinality: 38,159,457
14 TABLE ACCESS FULL TABLE AR.RA_CUST_TRX_LINE_GL_DIST_ALL Cost: 154,993 Bytes: 402,499,500 Cardinality: 20,124,975
17 HASH JOIN Cost: 519,713 Bytes: 1,969,317,900 Cardinality: 29,838,150
15 TABLE ACCESS FULL TABLE AR.RA_CUSTOMER_TRX_LINES_ALL Cost: 302,822 Bytes: 716,115,600 Cardinality: 29,838,150
16 TABLE ACCESS FULL TABLE AR.RA_CUST_TRX_LINE_GL_DIST_ALL Cost: 149,847 Bytes: 1,253,202,300 Cardinality: 29,838,150
25 TABLE ACCESS FULL TABLE AR.RA_CUST_TRX_LINE_GL_DIST_ALL Cost: 157,552 Bytes: 5,158,998,615 Cardinality: 46,477,465
41 SORT GROUP BY Bytes: 75 Cardinality: 1
40 FILTER
39 MERGE JOIN CARTESIAN Cost: 11 Bytes: 75 Cardinality: 1
35 NESTED LOOPS Cost: 8 Bytes: 50 Cardinality: 1
32 NESTED LOOPS Cost: 5 Bytes: 30 Cardinality: 1
29 TABLE ACCESS BY INDEX ROWID TABLE AR.RA_CUSTOMER_TRX_LINES_ALL Cost: 3 Bytes: 22 Cardinality: 1
28 INDEX UNIQUE SCAN INDEX (UNIQUE) AR.RA_CUSTOMER_TRX_LINES_U1 Cost: 2 Cardinality: 1
31 TABLE ACCESS BY INDEX ROWID TABLE AR.RA_CUSTOMER_TRX_ALL Cost: 2 Bytes: 133,114,520 Cardinality: 16,639,315
30 INDEX UNIQUE SCAN INDEX (UNIQUE) AR.RA_CUSTOMER_TRX_U1 Cost: 1 Cardinality: 1
34 TABLE ACCESS BY INDEX ROWID TABLE AR.RA_CUST_TRX_LINE_GL_DIST_ALL Cost: 3 Bytes: 20 Cardinality: 1
33 INDEX RANGE SCAN INDEX AR.RA_CUST_TRX_LINE_GL_DIST_N6 Cost: 2 Cardinality: 1
38 BUFFER SORT Cost: 9 Bytes: 25 Cardinality: 1
37 TABLE ACCESS BY INDEX ROWID TABLE AR.RA_CUST_TRX_LINE_GL_DIST_ALL Cost: 3 Bytes: 25 Cardinality: 1
36 INDEX RANGE SCAN INDEX AR.RA_CUST_TRX_LINE_GL_DIST_N1 Cost: 2 Cardinality: 1
49 SORT GROUP BY Bytes: 48 Cardinality: 1
48 FILTER
47 NESTED LOOPS
45 NESTED LOOPS Cost: 7 Bytes: 48 Cardinality: 1
43 TABLE ACCESS BY INDEX ROWID TABLE AR.RA_CUST_TRX_LINE_GL_DIST_ALL Cost: 4 Bytes: 20 Cardinality: 1
42 INDEX RANGE SCAN INDEX AR.RA_CUST_TRX_LINE_GL_DIST_N6 Cost: 3 Cardinality: 1
44 INDEX RANGE SCAN INDEX AR.RA_CUST_TRX_LINE_GL_DIST_N1 Cost: 2 Cardinality: 1
46 TABLE ACCESS BY INDEX ROWID TABLE AR.RA_CUST_TRX_LINE_GL_DIST_ALL Cost: 3 Bytes: 28 Cardinality: 1
As per Oracle, they suggested multiple patches, but that has not been helpful. Please suggest how I should tune this query. I don't have much of a clue about query tuning.
Regards
Hi Paul, my bad. I am sorry I missed it.
Query as below :
UPDATE RA_CUST_TRX_LINE_GL_DIST LGD SET (AMOUNT, ACCTD_AMOUNT) = (SELECT /*+ index(rec1 RA_CUST_TRX_LINE_GL_DIST_N6) ordered */ NVL(LGD.AMOUNT, 0) - ( SUM(LGD2.AMOUNT) - ( DECODE(LGD.GL_DATE, REC1.GL_DATE, 1, 0) * CTL.EXTENDED_AMOUNT ) ), NVL(LGD.ACCTD_AMOUNT, 0) - ( SUM(LGD2.ACCTD_AMOUNT) - ( DECODE(LGD.GL_DATE, REC1.GL_DATE, 1, 0) * DECODE(:B2 , NULL, ROUND( CTL.EXTENDED_AMOUNT * NVL(CT.EXCHANGE_RATE,1), :B3 ), ROUND( ( CTL.EXTENDED_AMOUNT * NVL(CT.EXCHANGE_RATE,1) ) / :B2 ) * :B2 ) ) ) FROM RA_CUSTOMER_TRX_LINES CTL, RA_CUSTOMER_TRX CT, RA_CUST_TRX_LINE_GL_DIST LGD2, RA_CUST_TRX_LINE_GL_DIST REC1 WHERE CTL.CUSTOMER_TRX_LINE_ID = LGD2.CUSTOMER_TRX_LINE_ID AND CTL.CUSTOMER_TRX_ID = CT.CUSTOMER_TRX_ID AND LGD.CUSTOMER_TRX_LINE_ID = CTL.CUSTOMER_TRX_LINE_ID AND LGD2.ACCOUNT_SET_FLAG = 'N' AND REC1.CUSTOMER_TRX_ID = CT.CUSTOMER_TRX_ID AND REC1.ACCOUNT_CLASS = 'REC' AND REC1.LATEST_REC_FLAG = 'Y' AND NVL(LGD.GL_DATE, TO_DATE( 2415021, 'J') ) = NVL(LGD2.GL_DATE, TO_DATE( 2415021, 'J') ) GROUP BY CTL.CUSTOMER_TRX_LINE_ID, REC1.GL_DATE, CTL.EXTENDED_AMOUNT, CTL.REVENUE_AMOUNT, CT.EXCHANGE_RATE ), PERCENT = (SELECT /*+ index(rec2 RA_CUST_TRX_LINE_GL_DIST_N6) */ DECODE(LGD.ACCOUNT_CLASS || LGD.ACCOUNT_SET_FLAG, 'SUSPENSEN', LGD.PERCENT, 'UNBILLN', LGD.PERCENT, 'UNEARNN', LGD.PERCENT, NVL(LGD.PERCENT, 0) - ( SUM(NVL(LGD4.PERCENT, 0)) - DECODE(REC2.GL_DATE, NVL(LGD.GL_DATE, REC2.GL_DATE), 100, 0) ) ) FROM RA_CUST_TRX_LINE_GL_DIST LGD4, RA_CUST_TRX_LINE_GL_DIST REC2 WHERE LGD.CUSTOMER_TRX_LINE_ID = LGD4.CUSTOMER_TRX_LINE_ID AND REC2.CUSTOMER_TRX_ID = LGD.CUSTOMER_TRX_ID AND REC2.CUSTOMER_TRX_ID = LGD4.CUSTOMER_TRX_ID AND REC2.ACCOUNT_CLASS = 'REC' AND REC2.LATEST_REC_FLAG = 'Y' AND LGD4.ACCOUNT_SET_FLAG = LGD.ACCOUNT_SET_FLAG AND DECODE(LGD4.ACCOUNT_SET_FLAG, 'Y', LGD4.ACCOUNT_CLASS, LGD.ACCOUNT_CLASS) = LGD.ACCOUNT_CLASS AND NVL(LGD.GL_DATE, TO_DATE( 2415021, 'J') ) = NVL(LGD4.GL_DATE, TO_DATE( 2415021, 'J') ) GROUP BY REC2.GL_DATE, LGD.GL_DATE ), LAST_UPDATED_BY = :B1 , 
LAST_UPDATE_DATE = SYSDATE WHERE CUST_TRX_LINE_GL_DIST_ID IN (SELECT /*+ index(rec3 RA_CUST_TRX_LINE_GL_DIST_N6) */ MIN(DECODE(LGD3.GL_POSTED_DATE, NULL, LGD3.CUST_TRX_LINE_GL_DIST_ID, NULL) ) FROM RA_CUSTOMER_TRX_LINES CTL, RA_CUSTOMER_TRX T, RA_CUST_TRX_LINE_GL_DIST LGD3, RA_CUST_TRX_LINE_GL_DIST REC3 WHERE T.REQUEST_ID = :B5 AND T.CUSTOMER_TRX_ID = CTL.CUSTOMER_TRX_ID AND (CTL.LINE_TYPE IN ( 'TAX','FREIGHT','CHARGES','SUSPENSE' ) OR (CTL.LINE_TYPE = 'LINE' AND CTL.ACCOUNTING_RULE_ID IS NULL )) AND LGD3.CUSTOMER_TRX_LINE_ID = CTL.CUSTOMER_TRX_LINE_ID AND LGD3.ACCOUNT_SET_FLAG = 'N' AND REC3.CUSTOMER_TRX_ID = T.CUSTOMER_TRX_ID AND REC3.ACCOUNT_CLASS = 'REC' AND REC3.LATEST_REC_FLAG = 'Y' AND NVL(T.PREVIOUS_CUSTOMER_TRX_ID, -1) = DECODE(:B4 , 'INV', -1, 'REGULAR_CM', T.PREVIOUS_CUSTOMER_TRX_ID, NVL(T.PREVIOUS_CUSTOMER_TRX_ID, -1) ) GROUP BY CTL.CUSTOMER_TRX_LINE_ID, LGD3.GL_DATE, REC3.GL_DATE, CTL.EXTENDED_AMOUNT, CTL.REVENUE_AMOUNT, T.EXCHANGE_RATE HAVING ( SUM(NVL(LGD3.AMOUNT, 0)) <> CTL.EXTENDED_AMOUNT * DECODE(LGD3.GL_DATE, REC3.GL_DATE, 1, 0) OR SUM(NVL(LGD3.ACCTD_AMOUNT, 0)) <> DECODE(LGD3.GL_DATE, REC3.GL_DATE, 1, 0) * DECODE(:B2 , NULL, ROUND( CTL.EXTENDED_AMOUNT * NVL(T.EXCHANGE_RATE,1), :B3 ), ROUND( ( CTL.EXTENDED_AMOUNT * NVL(T.EXCHANGE_RATE,1) ) / :B2 ) * :B2 ) ) UNION SELECT /*+ index(rec5 RA_CUST_TRX_LINE_GL_DIST_N6) INDEX (lgd5 ra_cust_trx_line_gl_dist_n6) index(ctl2 ra_customer_trx_lines_u1) */ TO_NUMBER( MIN(DECODE(LGD5.GL_POSTED_DATE||LGD5.ACCOUNT_CLASS|| LGD5.ACCOUNT_SET_FLAG, 'REVN', LGD5.CUST_TRX_LINE_GL_DIST_ID, 'REVY', LGD5.CUST_TRX_LINE_GL_DIST_ID, 'TAXN', LGD5.CUST_TRX_LINE_GL_DIST_ID, 'TAXY', LGD5.CUST_TRX_LINE_GL_DIST_ID, 'FREIGHTN', LGD5.CUST_TRX_LINE_GL_DIST_ID, 'FREIGHTY', LGD5.CUST_TRX_LINE_GL_DIST_ID, 'CHARGESN', LGD5.CUST_TRX_LINE_GL_DIST_ID, 'CHARGESY', LGD5.CUST_TRX_LINE_GL_DIST_ID, 'UNEARNY', LGD5.CUST_TRX_LINE_GL_DIST_ID, 'UNBILLY', LGD5.CUST_TRX_LINE_GL_DIST_ID, NULL ) ) ) FROM RA_CUST_TRX_LINE_GL_DIST LGD5, 
RA_CUST_TRX_LINE_GL_DIST REC5, RA_CUSTOMER_TRX_LINES CTL2, RA_CUSTOMER_TRX T WHERE T.REQUEST_ID = :B5 AND T.CUSTOMER_TRX_ID = REC5.CUSTOMER_TRX_ID AND CTL2.CUSTOMER_TRX_LINE_ID = LGD5.CUSTOMER_TRX_LINE_ID AND REC5.CUSTOMER_TRX_ID = LGD5.CUSTOMER_TRX_ID AND REC5.ACCOUNT_CLASS = 'REC' AND REC5.LATEST_REC_FLAG = 'Y' AND (CTL2.LINE_TYPE IN ( 'TAX','FREIGHT','CHARGES','SUSPENSE') OR (CTL2.LINE_TYPE = 'LINE' AND (CTL2.ACCOUNTING_RULE_ID IS NULL OR LGD5.ACCOUNT_SET_FLAG = 'Y' ))) GROUP BY LGD5.CUSTOMER_TRX_LINE_ID, LGD5.GL_DATE, REC5.GL_DATE, LGD5.ACCOUNT_SET_FLAG, DECODE(LGD5.ACCOUNT_SET_FLAG, 'N', NULL, LGD5.ACCOUNT_CLASS) HAVING SUM(NVL(LGD5.PERCENT, 0)) <> DECODE( NVL(LGD5.GL_DATE, REC5.GL_DATE), REC5.GL_DATE, 100, 0) )
I understand that this could be a seeded query but my attempt is to tune it.
Regards -
Hi Experts,
I am looking to pull VBRP-VBELN, i.e. the billing document number, based on VGBEL, i.e. the sales document number.
i.e.
select single * from vbrp into wa_vbrp
where vgbel = wa_vbap-vbeln
and posnr = wa_vbap-posnr.
But as there is no secondary index on VBRP for VGBEL, and there are tons of records in VBRP, it is taking too much time.
So what is the alternative way to find the billing document number from my sales document number?
Thanks
Mr. Srinivas,
Just a suggestion: if you need only the header details, then why not extract the data from VBRK (the billing document header) and VBAK (the sales document header)? These two tables contain only a single line per billing or sales document, and hence the performance should be better.
If my suggestion is not what you are looking for, then apologies for the same.
Regards,
Vivek
Alternatively as Mr. Eric suggests, you can use VBFA
VBFA-VBELN = VBRK-VBELN
VBFA-VBELV = VBAK-VBELN
Logic is VBFA-VBELN is the subsequent document & VBFA-VBELV is the preceding document.
Hope it helps. (But be careful: the document created after the sales order is not necessarily the billing document; there might be cases with delivery documents between the sales order and the billing document.)
Edited by: Vivek on Jan 29, 2008 11:11 PM -
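Vivek's VBFA route, including the caveat about intermediate delivery documents, can be sketched with toy data (the records below are made up; this assumes VBFA's subsequent-document category field VBTYP_N, where 'M' denotes an invoice):

```python
# minimal stand-in for VBFA rows: preceding doc -> subsequent doc + category
vbfa = [
    {"VBELV": "SO1000", "VBELN": "DL2000", "VBTYP_N": "J"},  # delivery
    {"VBELV": "SO1000", "VBELN": "BI3000", "VBTYP_N": "M"},  # invoice
    {"VBELV": "SO1001", "VBELN": "BI3001", "VBTYP_N": "M"},  # invoice
]

def billing_docs_for_sales_order(sales_doc, flow):
    """Return only subsequent billing documents (VBTYP_N = 'M'),
    skipping deliveries and other intermediate documents."""
    return [r["VBELN"] for r in flow
            if r["VBELV"] == sales_doc and r["VBTYP_N"] == "M"]

print(billing_docs_for_sales_order("SO1000", vbfa))  # → ['BI3000']
```

Filtering on the subsequent-document category is what protects against the delivery-in-between case described above.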
Creative Cloud is taking too much time to load and is not downloading the one month trial for Photoshop I just paid money for.
Stop the download if it's stalled, and restart your download.
-
I am having an issue with my iPhone 4: while playing music, it takes too much time to start playing.
I am using an iPhone which is taking too much time to play music, and sometimes it shows one album's cover while playing another's song. Please help and let me know what the issue is.
Hello Sanjay,
I would recommend steps 1, 3, and 5 from our iPhone Troubleshooting Assistant found here: http://www.apple.com/support/iphone/assistant/phone/#section_1
Here is step 1 to get you started.
Restart iPhone
To restart iPhone, first turn iPhone off by pressing and holding the Sleep/Wake button until a red slider appears. Slide your finger across the slider and iPhone will turn off after a few moments.
Next, turn iPhone on by pressing and holding the Sleep/Wake button until the Apple logo appears.
Is iPhone not responding? To reset iPhone, press and hold the Sleep/Wake button and the Home button at the same time for at least 10 seconds, until the Apple logo appears.
If your device does not turn on or displays a red battery icon, try recharging next.
Take care,
Sterling