Trans_amount
While I am creating a table with the field trans_amount, when I activate the table it asks for a reference table and field.
You need to provide the name of the table and field whose format you want this field to take, e.g.
Number type VBAK-VBELN  " VBAK is the table, VBELN is the field
In your case, if you are not sure of the table name and field, you can declare it as:
DATA: trans_amount TYPE p DECIMALS 2.
Similar Messages
-
Problem while having a large set of data to work on!
Hi,
I am facing a serious problem processing a large set of data. I have a requirement to generate a report.
I have a table and an MView, which I have joined to reduce the number of records to process. The MView holds 20,000,000 records and the table 1,800,000. Based on the join conditions and WHERE clause I am able to narrow the useful data down to approximately 450,000 rows, and I get 8 of my report columns from this join. I dump these records into the table from which I will generate the report by spooling.
Below is the block, which takes 12 minutes to insert into the report table MY_ACCOUNT_PHOTON_DUMP:
begin
dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
insert into MY_ACCOUNT_PHOTON_DUMP --- Report table
(SUBSCR_NO, ACCOUNT_NO, AREA_CODE, DEL_NO, CIRCLE, REGISTRATION_DT, EMAIL_ID, ALT_CNTCT_NO)
select crm.SUBSCR_NO, crm.ACCOUNT_NO, crm.AREA_CODE, crm.DEL_NO, crm.CIRCLE_ID,
aa.CREATED_DATE, aa.EMAIL_ID, aa.ALTERNATE_CONTACT
from MV_CRM_SUBS_DTLS crm, --- MView
(select /*+ ALL_ROWS */ A.ALTERNATE_CONTACT, A.CREATED_DATE, A.EMAIL_ID, B.SUBSCR_NO
from MCCI_PROFILE_DTLS a, MCCI_PROFILE_SUBSCR_DTLS b
where A.PROFILE_ID = B.PROFILE_ID
and B.ACE_STATUS = 'N'
) aa --- join of two tables giving me 1,800,000 recs
where crm.SUBSCR_NO = aa.SUBSCR_NO
and crm.SRVC_TYPE_ID = '125'
and crm.END_DT IS NULL;
INTERNET_METER_TABLE_PROC_1('MCCIPRD','MY_ACCOUNT_PHOTON_DUMP'); --- calling procedure to analyze the report table
COMMIT;
dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
end; --- 12 min 04 sec
For the remaining 13 columns required, I run a block with a FOR UPDATE cursor on the report table:
declare
cursor cur is
select SUBSCR_NO, ACCOUNT_NO, AREA_CODE, DEL_NO,
CIRCLE, REGISTRATION_DT, EMAIL_ID, ALT_CNTCT_NO
from MCCIPRD.MY_ACCOUNT_PHOTON_DUMP --where ACCOUNT_NO = 901237064
for update of
MRKT_SEGMNT, AON, ONLINE_PAY, PAID_AMNT, E_BILL, ECS, BILLED_AMNT,
SRVC_TAX, BILL_PLAN, USAGE_IN_MB, USAGE_IN_MIN, NO_OF_LOGIN, PHOTON_TYPE;
v_aon VARCHAR2(10) := NULL;
v_online_pay VARCHAR2(10) := NULL;
v_ebill VARCHAR2(10) := NULL;
v_mkt_sgmnt VARCHAR2(50) := NULL;
v_phtn_type VARCHAR2(50) := NULL;
v_login NUMBER(10) := 0;
v_paid_amnt VARCHAR2(50) := NULL;
v_ecs VARCHAR2(10) := NULL;
v_bill_plan VARCHAR2(100):= NULL;
v_billed_amnt VARCHAR2(10) := NULL;
v_srvc_tx_amnt VARCHAR2(10) := NULL;
v_usg_mb NUMBER(10) := NULL;
v_usg_min NUMBER(10) := NULL;
begin
dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
for rec in cur loop
begin
select apps.TTL_GET_DEL_AON@MCCI_TO_PRD591(rec.ACCOUNT_NO, rec.DEL_NO, rec.CIRCLE)
into v_aon from dual;
exception
when others then
v_aon := 'NA';
end;
SELECT DECODE(COUNT(*),0,'NO','YES') into v_online_pay
FROM TTL_DESCRIPTIONS@MCCI_TO_PRD591
WHERE DESCRIPTION_CODE IN (SELECT DESCRIPTION_CODE FROM TTL_BMF_TRANS_DESCR@MCCI_TO_PRD591
WHERE BMF_TRANS_TYPE IN (SELECT BMF_TRANS_TYPE FROM TTL_BMF@MCCI_TO_PRD591
WHERE ACCOUNT_NO = rec.ACCOUNT_NO
AND POST_DATE BETWEEN
TO_DATE('01-'||TO_CHAR(SYSDATE,'MM-YYYY'),'DD-MM-YYYY') AND SYSDATE))
AND DESCRIPTION_TEXT IN (select DESCRIPTION from fnd_lookup_values@MCCI_TO_PRD591
where LOOKUP_TYPE='TTL_ONLINE_PAYMENT');
SELECT decode(count( *),0,'NO','YES') into v_ebill
FROM TTL_CUST_ADD_DTLS@MCCI_TO_PRD591
WHERE CUST_ACCT_NBR = rec.ACCOUNT_NO
AND UPPER(CUSTOMER_PREF_MODE) ='EMAIL';
begin
select ACC_SUB_CAT_DESC into v_mkt_sgmnt
from ttl_cust_dtls@MCCI_TO_PRD591 a, TTL_ACCOUNT_CATEGORIES@MCCI_TO_PRD591 b
where a.CUST_ACCT_NBR = rec.ACCOUNT_NO
and a.market_code = b.ACC_SUB_CAT;
exception
when others then
v_mkt_sgmnt := 'NA';
end;
begin
select nvl(sum(TRANS_AMOUNT),0) into v_paid_amnt
from ttl_bmf@MCCI_TO_PRD591
where account_no = rec.ACCOUNT_NO
AND POST_DATE
BETWEEN TO_DATE('01-'||TO_CHAR(SYSDATE,'MM-YYYY'),'DD-MM-YYYY')
AND SYSDATE;
exception
when others then
v_paid_amnt := 'NA';
end;
SELECT decode(count(1),0,'NO','YES') into v_ecs
from ts.Billdesk_Registration_MV@MCCI_TO_PRD591 where ACCOUNT_NO = rec.ACCOUNT_NO
and UPPER(REGISTRATION_TYPE ) = 'ECS';
SELECT decode(COUNT(*),0,'PHOTON WHIZ','PHOTON PLUS') into v_phtn_type
FROM ts.ttl_cust_ord_prdt_dtls@MCCI_TO_PRD591 A, ttl_product_mstr@MCCI_TO_PRD591 b
WHERE A.SUBSCRIBER_NBR = rec.SUBSCR_NO
and (A.prdt_disconnection_date IS NULL OR A.prdt_disconnection_date > SYSDATE )
AND A.prdt_disc_flag = 'N'
AND A.prdt_nbr = b.product_number
AND A.prdt_type_id = b.product_type_id
AND b.first_level LIKE 'Feature%'
AND UPPER (b.product_desc) LIKE '%HSIA%';
SELECT count(1) into v_login
FROM MCCIPRD.MYACCOUNT_SESSION_INFO a
WHERE (A.DEL_NO = rec.DEL_NO or A.DEL_NO = ltrim(rec.AREA_CODE,'0')||rec.DEL_NO)
AND to_char(A.LOGIN_TIME,'Mon-YYYY') = to_char(sysdate-5,'Mon-YYYY');
begin
select PACKAGE_NAME, BILLED_AMOUNT, SERVICE_TAX_AMOUNT, USAGE_IN_MB, USAGE_IN_MIN
into v_bill_plan, v_billed_amnt, v_srvc_tx_amnt, v_usg_mb, v_usg_min from
(select rank() over(order by STATEMENT_DATE desc) rk,
PACKAGE_NAME, USAGE_IN_MB, USAGE_IN_MIN,
nvl(BILLED_AMOUNT,'0') BILLED_AMOUNT, NVL(SRVC_TAX_AMNT,'0') SERVICE_TAX_AMOUNT
from MCCIPRD.MCCI_IM_BILLED_DATA
where (DEL_NUM = rec.DEL_NO or DEL_NUM = ltrim(rec.AREA_CODE,'0')||rec.DEL_NO)
and STATEMENT_DATE like '%'||to_char(SYSDATE,'Mon-YY')||'%')
where rk = 1;
exception
when others then
v_bill_plan := 'NA';
v_billed_amnt := '0';
v_srvc_tx_amnt := '0';
v_usg_mb := 0;
v_usg_min := 0;
end;
-- UPDATE THE DUMP TABLE --
update MCCIPRD.MY_ACCOUNT_PHOTON_DUMP
set MRKT_SEGMNT = v_mkt_sgmnt, AON = v_aon, ONLINE_PAY = v_online_pay, PAID_AMNT = v_paid_amnt,
E_BILL = v_ebill, ECS = v_ecs, BILLED_AMNT = v_billed_amnt, SRVC_TAX = v_srvc_tx_amnt,
BILL_PLAN = v_bill_plan, USAGE_IN_MB = v_usg_mb, USAGE_IN_MIN = v_usg_min, NO_OF_LOGIN = v_login,
PHOTON_TYPE = v_phtn_type
where current of cur;
end loop;
COMMIT;
dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
exception when others then
dbms_output.put_line(SQLCODE||'::'||SQLERRM);
end;
The report takes more than 6 hours. I know that most of the SELECT queries filter on ACCOUNT_NO in the WHERE clause and could be joined, but when I joined a few of these blocks into the initial INSERT query it was no better.
The individual queries within the cursor loop don't take more than 0.3 seconds to execute.
I'm using FOR UPDATE because I know the report table is used solely for this purpose.
Can somebody please help me with this? I'm in desperate need of good advice here.
Thanks!!
Edited by: user11089213 on Aug 30, 2011 12:01 AM
Hi,
Below is the explain plan for the original query:
select /*+ ALL_ROWS */ crm.SUBSCR_NO, crm.ACCOUNT_NO, ltrim(crm.AREA_CODE,'0'), crm.DEL_NO, crm.CIRCLE_ID
from MV_CRM_SUBS_DTLS crm,
(select /*+ ALL_ROWS */ A.ALTERNATE_CONTACT, A.CREATED_DATE, A.EMAIL_ID, B.SUBSCR_NO
from MCCIPRD.MCCI_PROFILE_DTLS a, MCCIPRD.MCCI_PROFILE_SUBSCR_DTLS b
where A.PROFILE_ID = B.PROFILE_ID
and B.ACE_STATUS = 'N'
) aa
where crm.SUBSCR_NO = aa.SUBSCR_NO
and crm.SRVC_TYPE_ID = '125'
and crm.END_DT IS NULL
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1481K| 100M| | 245K (5)| 00:49:09 |
|* 1 | HASH JOIN | | 1481K| 100M| 46M| 245K (5)| 00:49:09 |
|* 2 | HASH JOIN | | 1480K| 29M| 38M| 13884 (9)| 00:02:47 |
|* 3 | TABLE ACCESS FULL | MCCI_PROFILE_SUBSCR_DTLS | 1480K| 21M| | 3383 (13)| 00:00:41 |
| 4 | INDEX FAST FULL SCAN| SYS_C002680 | 2513K| 14M| | 6024 (5)| 00:01:13 |
|* 5 | MAT_VIEW ACCESS FULL | MV_CRM_SUBS_DTLS_08AUG | 1740K| 82M| | 224K (5)| 00:44:49 |
Predicate Information (identified by operation id):
1 - access("CRM"."SUBSCR_NO"="B"."SUBSCR_NO")
2 - access("A"."PROFILE_ID"="B"."PROFILE_ID")
3 - filter("B"."ACE_STATUS"='N')
5 - filter("CRM"."END_DT" IS NULL AND "CRM"."SRVC_TYPE_ID"='125')
Whereas for the modified MView query, the plan remains the same:
select /*+ ALL_ROWS */ crm.SUBSCR_NO, crm.ACCOUNT_NO, ltrim(crm.AREA_CODE,'0'), crm.DEL_NO, crm.CIRCLE_ID
from (select * from MV_CRM_SUBS_DTLS
where SRVC_TYPE_ID = '125'
and END_DT IS NULL) crm,
(select /*+ ALL_ROWS */ A.ALTERNATE_CONTACT, A.CREATED_DATE, A.EMAIL_ID, B.SUBSCR_NO
from MCCIPRD.MCCI_PROFILE_DTLS a, MCCIPRD.MCCI_PROFILE_SUBSCR_DTLS b
where A.PROFILE_ID = B.PROFILE_ID
and B.ACE_STATUS = 'N'
) aa
where crm.SUBSCR_NO = aa.SUBSCR_NO
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1481K| 100M| | 245K (5)| 00:49:09 |
|* 1 | HASH JOIN | | 1481K| 100M| 46M| 245K (5)| 00:49:09 |
|* 2 | HASH JOIN | | 1480K| 29M| 38M| 13884 (9)| 00:02:47 |
|* 3 | TABLE ACCESS FULL | MCCI_PROFILE_SUBSCR_DTLS | 1480K| 21M| | 3383 (13)| 00:00:41 |
| 4 | INDEX FAST FULL SCAN| SYS_C002680 | 2513K| 14M| | 6024 (5)| 00:01:13 |
|* 5 | MAT_VIEW ACCESS FULL | MV_CRM_SUBS_DTLS_08AUG | 1740K| 82M| | 224K (5)| 00:44:49 |
Predicate Information (identified by operation id):
1 - access("CRM"."SUBSCR_NO"="B"."SUBSCR_NO")
2 - access("A"."PROFILE_ID"="B"."PROFILE_ID")
3 - filter("B"."ACE_STATUS"='N')
5 - filter("CRM"."END_DT" IS NULL AND "CRM"."SRVC_TYPE_ID"='125')
Also took your advice and tried to merge all the queries into a single INSERT statement; will post the results shortly.
Edited by: BluShadow on 30-Aug-2011 10:21
added {noformat}{noformat} tags. Please read {message:id=9360002} to learn to do this yourself -
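A common way to attack the >6-hour runtime in the thread above is to replace the row-by-row FOR UPDATE loop with one set-based statement per source query. A minimal sketch for the E_BILL flag alone, using the table, column, and DB-link names from the post (access rights over the link are assumed; the other 12 columns would get similar statements):

```sql
-- Sketch: derive E_BILL for all report rows in one pass
-- instead of one SELECT per cursor iteration.
MERGE INTO mcciprd.my_account_photon_dump d
USING (SELECT cust_acct_nbr
       FROM   ttl_cust_add_dtls@mcci_to_prd591
       WHERE  UPPER (customer_pref_mode) = 'EMAIL'
       GROUP  BY cust_acct_nbr) s
ON (d.account_no = s.cust_acct_nbr)
WHEN MATCHED THEN
  UPDATE SET d.e_bill = 'YES';

-- Rows with no matching e-mail preference default to 'NO'.
UPDATE mcciprd.my_account_photon_dump
SET    e_bill = 'NO'
WHERE  e_bill IS NULL;
```

This trades 450,000 round trips over the DB link for two statements; whether it wins in practice depends on the remote table sizes, so it should be benchmarked against the loop.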
Hi All,
I have already calculated a row (Add Total row), such as Gross Profit, within Group1 (a, b, c, d, e are sub-groups) using the previous and current scope column-group values, e.g. Gross Profit = a - b (within Group1):
Gross Profit = Previous(Sum(Fields!Trans_amount.Value)) - Sum(Fields!Trans_amount.Value) in Group1
Now I want to get values for Net Income, i.e. Net Income = a - b - c in Group1, or
Net Income = Gross Profit - c in Group1 (using the Gross Profit textbox value via ReportItems!textbox.Value), but the values are different, since the expression triggers a new calculation.
Please help me.
Thanks in advance.
- Prem Kumar T D http://www.sharepointbasic.com/
Hi Premtd,
As per my understanding, there are a group and subgroups in the report, and you added a total to the group with the expression: Previous(Sum(Fields!Trans_amount.Value)) - Sum(Fields!Trans_amount.Value). You want to add a text box to the report to calculate Net income
with the expression: Previous(Sum(Fields!Trans_amount.Value)) - Sum(Fields!Trans_amount.Value) - Sum(Fields!Trans_amount.Value). In order to improve the efficiency of troubleshooting, I need to ask several questions:
• “I already calculated a row(Add Total row) like Gross profit within Group1 (a,b,c,d,e which are sub groups)” What’s the meaning of Group1 and subgroup a, b, c, d, e ? Could you please mark Gross profit and the groups in the screenshot?
• How is Net income calculated? Please provide some more detailed information about your requirements. I would appreciate it if you could provide sample data and a clear screenshot of the report.
This may be a lot of information to ask for at one time; however, collecting it now will help us move more quickly toward a solution.
If you have any more questions, please feel free to ask.
Thanks,
Wendy Fu -
Query to find revenue of a bank
We have a loan datamart and an account datamart. To join the two fact tables (loan and account) we use the customer table as a conformed dimension. We need to calculate the revenue, assets and liabilities of the bank. Can anyone help us with this?
The loan datamart contains the following tables:
loan_fact: loan_balance, trans_amount, loan_amount, EMI
customer: cust_id, cust_name
time: time_id, year, month, day
branch: branch_id, branch location
loan: loan_id, loantype, no_of_installments, no_of_installments paid
interest: interest_id, interest_rate
The account datamart has the following tables:
account_fact: deposit_amount, withdraw amount, service_charge, insufficient fund fee
account: account_id, account-type
customer, time, branch, interest: as in the loan datamart
Edited by: 915235 on Mar 7, 2012 1:01 AM
I see you edited your original post.
To answer your question about whether we can help you: yes, but not with the information you've given us. You've answered one of my questions (table definitions), but:
what have you got so far?
why hasn't it worked for you? (what error messages are you getting?)
what is your oracle version?
where is the sample data?
where is the sample expected output? -
Update Millions of records in a table
Hi ,
My DB is 10.2.0
This is a very generic question, hence I am not adding the Plan.
I have to update a column in a table with value of sum of another column from the same table.
The table has 19 million records.
UPDATE transactions a SET rest_amount = NVL((SELECT SUM(trans_amount) FROM transactions b
WHERE b.trans_date = a.trans_date AND trans_ind < 2), trans_amount)
WHERE trans_grp < 6999 AND trans_ind < 2;
This query takes 10 hours to run. There are indexes on trans_date and trans_grp; there is no index on the rest_amount column.
As per tom in
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:6407993912330
He suggested disabling all indexes and constraints and then re-enabling them. Is that applicable only to INSERT, or to all DML?
Only during an insert are the constraints validated and the indexes refreshed with the new rows. But in my case I am updating a column that is not part of any index. Should I follow Tom's suggestion?
"The table has 19 million records." - How many rows (that's rows) are actually updated?
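For reference, the correlated UPDATE above can often be rewritten as a MERGE against a pre-aggregated inline view, so the per-date sum is computed once per trans_date rather than once per row. A sketch using the same (hypothetical) table and columns as the post; note the original NVL fallback is redundant here, because every updated row has trans_ind < 2 and so contributes to its own date's sum:

```sql
-- Sketch: aggregate once per trans_date, then join back for the update.
MERGE INTO transactions a
USING (SELECT trans_date, SUM (trans_amount) AS day_total
       FROM   transactions
       WHERE  trans_ind < 2
       GROUP  BY trans_date) b
ON (a.trans_date = b.trans_date)
WHEN MATCHED THEN
  UPDATE SET a.rest_amount = b.day_total
  WHERE  a.trans_grp < 6999
  AND    a.trans_ind < 2;
```

Test on a copy first and compare row counts against the original statement before trusting it on 19 million rows.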
-
Inventory management process key values, please
Hi gurus,
Can anyone help me with the process key values?
In my update rules I have, for issues,
process keys = 100, 101, 104, 105, 106, 110
and for receipts 000, 001, 004, 005, 006, 010.
What do they mean? I am using the standard update rules.
It is very urgent because I am getting the stock values wrong.
Thanks and regards,
Neelu
Hi Neel,
check OSS Note 352344 - Process key + reversals in Inventory Management:
Symptom
This note is a consulting note and describes the use of the process key (0PROCESSKEY) in Inventory Management (MSEG). It focuses on the way the system deals with reversed transactions for DataSources 2LIS_40_S279 and 2LIS_03_BF.
To be able to use these DataSources, you ABSOLUTELY MUST activate the transaction key generation (process key, PROCESSKEY) using transaction MCB_ (from the OLTP IMG for BW: SBIW) (standard, retail or consumption goods).
The following transaction keys are available
(PROCESSKEY/Appl. Component/Description):
000/MM Misc. receipts
001/MM Goods receipt / vendor
004/MM Article transfer posting receipt
005/MM Stock correction inventory +
006/MM Stock correction other +
007/IS-R Receipt value-only article (single article
posting)
010/MM Receipt from stock transfer
002/IS-R Merchandise clearing receipt
003/IS-R GR from DC
100/MM Misc. issues
101/MM Returns / Vendor
104/MM Article transfer posting issue
105/MM Stock correction inventory -
106/MM Stock correction other -
107/IS-R Issue value-only article (single article
posting)
110/MM Issue from stock transfer
102/IS-R Merchandise clearing issue
103/IS-R GI from DC
450/IS-R Generic Article (not relevant)
Remark: Transaction keys 002/003, 102/103 break down the core keys 010/110 in more detail with respect to retail processes. They are only available in an R/3 Retail.
As you can see in the overview, the transaction keys can be divided according to receipts and issues in Inventory Management. Furthermore, the transaction keys are displayed according to reversed and regular transactions.
A regular receipt has a debit/credit indicator "S" (SHKZG, 0DCINDIC), whereas a regular issue has a debit/credit indicator "H".
For reverse transactions the opposite is true.
Transaction                              D/C ind. S    D/C ind. H
RECEIPTS
0 Misc. receipts regular reversed
1 Goods receipt / vendor regular reversed
2 Merchandise clearing receipt regular reversed
3 GR from DC regular reversed
4 Article transfer posting receipt regular reversed
5 Stock correction inventory + regular reversed
6 Stock correction other + regular reversed
7 Receipt value-only article regular reversed
10 Receipt from stock transfer regular reversed
ISSUES
100 Misc. issues reversed regular
101 Returns / vendor reversed regular
102 Merchandise clearing issue reversed regular
103 GI from DC reversed regular
104 Article transfer posting issue reversed regular
105 Stock correction inventory - reversed regular
106 Stock correction other - reversed regular
107 Issue value-only article reversed regular
110 Issue from stock transfer reversed regular
Note: You can also recognize a reversal for DataSource 2LIS_03_BF by means of the entry 0STORNO = 'X'. The fields that are marked with X in the table are then transferred with a negative +/- sign. This was not the case with DataSource 2LIS_40_S279! In the case of DataSource 2LIS_40_S279, more logic was required in the BW update rules to make sure that key figures were updated correctly.
Example:
In the delivered InfoCubes 0CP_IC_C1 (CP) and 0RT_C01 (Retail), for example in key "Stock correction +", transaction keys 5 and 6 were grouped together. Furthermore, distinction is to be made between the different stock types. Depending on which stock types you want to distinguish between in different key figures, you must use a corresponding condition (IF statement) in the update rules in the BW.
Example (pseudo source code):
Update routine "stock adjustment +" for 2LIS_03_BF:
IF ( STOCKCAT IS INITIAL ) AND            " evaluated stocks
   ( PROCESSKEY = 5 OR PROCESSKEY = 6 ).
  RESULT     = TRANS_AMOUNT.
  RETURNCODE = 0.                         " update the key figure
ELSE.
  RETURNCODE = 4.                         " no update of the key figure
ENDIF.
The pseudo source code for 2LIS_40_S279 read as follows:
Update routine "stock adjustment +" for 2LIS_40_S279:
IF ( STOCKCAT IS INITIAL ) AND            " evaluated stocks
   ( PROCESSKEY = 5 OR PROCESSKEY = 6 ).
  IF DCINDIC = 'S'.
    RESULT = TRANS_AMOUNT.                " regular
  ELSE.
    RESULT = -1 * TRANS_AMOUNT.
  ENDIF.
  RETURNCODE = 0.                         " update the key figure
ELSE.
  RETURNCODE = 4.                         " no update of the key figure
ENDIF.
Here, the debit/credit indicator must be checked in accordance with the table above. Transactions 5 and 6 are receipts in Inventory Management. As the debit/credit indicator is set to "S", it is a regular transaction whose value(TRANS_AMOUNT) is assigned to the key figure. In the other case (debit/credit indicator = "H") it is a reversal, that is, the transaction should reverse a transaction that has already been updated. For this, the value is multiplied by -1 so that a corresponding decrease/reduction of this key figure is achieved during the update of the key figure in the InfoCube.
This logic is no longer required for the 2LIS_03_BF (see first pseudo source code), because the reversed quantity and values are automatically provided as negative with the help of the 0STORNO field.
Using this DataSource 2LIS_03_BF, it is for example possible to create a key figure such as "Reversed receipts", which is not a part of the Business Content delivered. The following pseudo source code of an update routine makes this clear:
Update routine "Reversed receipts"
IF ( PROCESSKEY = 1 ) AND ( STORNO = 'X' ).   " reversal
  RESULT     = -1 * TRANS_AMOUNT.
  RETURNCODE = 0.
ELSE.
  RETURNCODE = 4.                             " no update of the key figure
ENDIF.
Note: For DataSource 2LIS_40_S279 the pseudo source code read as follows:
Update routine "Reversed receipts" for 2LIS_40_S279:
IF ( PROCESSKEY = 1 ) AND ( DCINDIC = 'H' ).  " reversal
  RESULT     = TRANS_AMOUNT.
  RETURNCODE = 0.
ELSE.
  RETURNCODE = 4.                             " no update of the key figure
ENDIF.
To be able to understand the overall scheme more comprehensively, you should have a look at the update rules of the Standard Business Content for retail or consumption goods (for example InfoCubes 0RT_C01 or 0CP_IC_C1). -
Please help me tuning this query...
Hi all,
Can anybody suggest some (syntactic) changes to improve its performance? It is taking too much time.
This is a query used in an Oracle report.
SELECT distinct cc.seg1,cc.seg2,cc.seg3,cc.seg4,cc.seg5,cc.seg6,cc.seg7,cc.seg8,cc.code_combination_id code_comb_id,
r.flex_validation_rule_name cv_rule,gl.currency_code currency1,
(NVL (gjl.accounted_dr, 0) - NVL (gjl.accounted_cr, 0)) trans_amount,
gjh.doc_sequence_value doc_num,
gjs.user_je_source_name je_source, gjc.user_je_category_name je_category,
fu.user_name last_updated_by,(NVL (gb.begin_balance_dr, 0)
- NVL (gb.begin_balance_cr, 0))
+ (NVL (gb.period_net_dr_beq, 0)
- NVL (gb.period_net_cr_beq, 0)) balance,
gjh.last_update_date last_updated_date,gjl.effective_date
FROM gl_code_combinations cc,
fnd_flex_vdation_rules_vl r,
fnd_flex_validation_rule_lines l,
gl_je_headers gjh,
gl_je_lines gjl,
gl_balances gb,
fnd_user fu,
gl_je_categories gjc,
gl_je_sources gjs,
gl_ledgers gl
WHERE cc.enabled_flag = 'Y'
AND NVL (cc.end_date_active, SYSDATE + 1) > SYSDATE
AND cc.chart_of_accounts_id = :p_chart_of_accounts_id
AND r.id_flex_num = cc.chart_of_accounts_id
AND r.id_flex_code = 'GL#'
AND r.application_id = 101
AND gjs.je_source_name=gjh.je_source
AND gjc.je_category_name=gjh.je_category
AND l.application_id = r.application_id
AND gb.code_combination_id=cc.code_combination_id
AND gjh.period_name=gb.period_name
AND r.enabled_flag = 'Y'
AND NVL (r.end_date_active, SYSDATE + 1) > SYSDATE
AND l.id_flex_code = r.id_flex_code
AND l.id_flex_num = r.id_flex_num
AND l.flex_validation_rule_name = r.flex_validation_rule_name
AND l.include_exclude_indicator = 'E'
AND gjl.code_combination_id = gb.code_combination_id
AND gjh.je_header_id = gjl.je_header_id
AND gjh.status = 'P'
AND gl.ledger_id=gjh.ledger_id
and gl.ledger_id=:p_ledger_id
AND gjh.last_updated_by = fu.user_id
and (gjh.last_update_date,gjl.last_update_date)=(
SELECT MAX (a.last_update_date) ,MAX(b.last_update_date)
FROM gl_je_headers a, gl_je_lines b
WHERE b.code_combination_id = cc.code_combination_id
AND a.je_header_id = b.je_header_id
AND a.status = 'P'
AND ROWNUM = 1)
and (NVL (gjl.accounted_dr, 0) - NVL (gjl.accounted_cr, 0))=(select (NVL (accounted_dr, 0) - NVL (accounted_cr, 0)) from gl_je_lines
where je_header_id=gjh.je_header_id and code_combination_id=cc.code_combination_id and period_name=gjh.period_name and
last_update_date=gjh.last_update_date and rownum=1)
and (NVL (gb.begin_balance_dr, 0)
- NVL (gb.begin_balance_cr, 0))
+ (NVL (gb.period_net_dr_beq, 0)
- NVL (gb.period_net_cr_beq, 0))=(select max((NVL (begin_balance_dr, 0)
- NVL (begin_balance_cr, 0))
+ (NVL (period_net_dr_beq, 0)
- NVL (period_net_cr_beq, 0))) from gl_balances where code_combination_id=cc.code_combination_id
and period_name=gb.period_name)
and gjl.description=(select description from gl_je_lines where code_combination_id=cc.code_combination_id and
je_header_id=gjh.je_header_id and period_name=gb.period_name and rownum=1)
AND cc.seg1 BETWEEN NVL (SUBSTR (l.concatenated_segments_low, 1, 3), cc.seg1)
AND NVL (SUBSTR (l.concatenated_segments_high, 1, 3), cc.seg1)
AND cc.seg2 BETWEEN NVL (SUBSTR (l.concatenated_segments_low, 5, 4), cc.seg2)
AND NVL (SUBSTR (l.concatenated_segments_high, 5, 4), cc.seg2)
AND cc.seg3 BETWEEN NVL (SUBSTR (l.concatenated_segments_low, 10, 4), cc.seg3)
AND NVL (SUBSTR (l.concatenated_segments_high, 10, 4), cc.seg3)
AND cc.seg4 BETWEEN NVL (SUBSTR (l.concatenated_segments_low, 15, 3), cc.seg4)
AND NVL (SUBSTR (l.concatenated_segments_high, 15, 3), cc.seg4)
AND cc.seg5 BETWEEN NVL (SUBSTR (l.concatenated_segments_low, 19, 6), cc.seg5)
AND NVL (SUBSTR (l.concatenated_segments_high, 19, 6), cc.seg5)
AND cc.seg6 BETWEEN NVL (SUBSTR (l.concatenated_segments_low, 26, 4), cc.seg6)
AND NVL (SUBSTR (l.concatenated_segments_high, 26, 4), cc.seg6)
AND cc.seg7 BETWEEN NVL (SUBSTR (l.concatenated_segments_low, 31, 6), cc.seg7)
AND NVL (SUBSTR (l.concatenated_segments_high, 31, 6), cc.seg7)
AND cc.seg8 BETWEEN NVL (SUBSTR (l.concatenated_segments_low, 38, 6), cc.seg8)
AND NVL (SUBSTR (l.concatenated_segments_high, 38, 6), cc.seg8)
Thanks for the help
asp
Dear all,
the following is the explain plan for the above query:
<ExplainPlan>
- <PlanElement object_ID="0" id="0" operation="SELECT STATEMENT" optimizer="ALL_ROWS" cost="1,417" cardinality="1" bytes="477" cpu_cost="263,995,086" io_cost="1,385" time="18">
- <PlanElements>
- <PlanElement object_ID="0" id="1" operation="HASH" option="UNIQUE" cost="1,417" cardinality="1" bytes="477" cpu_cost="263,995,086" io_cost="1,385" time="18">
- <PlanElements>
- <PlanElement object_ID="0" id="2" operation="FILTER" filter_predicates="("GJH"."LAST_UPDATE_DATE","GJL"."LAST_UPDATE_DATE")= (SELECT MAX("A"."LAST_UPDATE_DATE"),MAX("B"."LAST_UPDATE_DATE") FROM "GL"."GL_JE_LINES" "B","GL"."GL_JE_HEADERS" "A" WHERE ROWNUM=1 AND "A"."JE_HEADER_ID"="B"."JE_HEADER_ID" AND "A"."STATUS"='P' AND "B"."CODE_COMBINATION_ID"=:B1) AND NVL("GJL"."ACCOUNTED_DR",0)-NVL("GJL"."ACCOUNTED_CR",0)= (SELECT NVL("ACCOUNTED_DR",0)-NVL("ACCOUNTED_CR",0) FROM "GL"."GL_JE_LINES" "GL_JE_LINES" WHERE ROWNUM=1 AND "PERIOD_NAME"=:B2 AND "CODE_COMBINATION_ID"=:B3 AND "JE_HEADER_ID"=:B4 AND "LAST_UPDATE_DATE"=:B5) AND NVL("GB"."BEGIN_BALANCE_DR",0)-NVL("GB"."BEGIN_BALANCE_CR",0)+(NVL("GB"."PERIOD_NET_DR_BEQ",0)-NVL("GB"."PERIOD_NET_CR_BEQ",0))= (SELECT MAX(NVL("BEGIN_BALANCE_DR",0)-NVL("BEGIN_BALANCE_CR",0)+(NVL("PERIOD_NET_DR_BEQ",0)-NVL("PERIOD_NET_CR_BEQ",0))) FROM "GL"."GL_BALANCES" "GL_BALANCES" WHERE "PERIOD_NAME"=:B6 AND "CODE_COMBINATION_ID"=:B7)">
- <PlanElements>
- <PlanElement object_ID="0" id="3" operation="NESTED LOOPS" cost="1,262" cardinality="1" bytes="477" cpu_cost="254,376,045" io_cost="1,231" time="16">
- <PlanElements>
- <PlanElement object_ID="0" id="4" operation="NESTED LOOPS" cost="1,261" cardinality="1" bytes="464" cpu_cost="254,366,853" io_cost="1,230" time="16">
- <PlanElements>
- <PlanElement object_ID="0" id="5" operation="NESTED LOOPS" cost="1,260" cardinality="1" bytes="433" cpu_cost="254,357,622" io_cost="1,229" time="16">
- <PlanElements>
- <PlanElement object_ID="0" id="6" operation="NESTED LOOPS" cost="1,259" cardinality="1" bytes="402" cpu_cost="254,348,331" io_cost="1,228" time="16">
- <PlanElements>
- <PlanElement object_ID="0" id="7" operation="NESTED LOOPS" cost="1,258" cardinality="1" bytes="350" cpu_cost="254,337,522" io_cost="1,227" time="16">
- <PlanElements>
- <PlanElement object_ID="0" id="8" operation="NESTED LOOPS" cost="1,239" cardinality="1" bytes="273" cpu_cost="254,166,811" io_cost="1,208" time="15">
- <PlanElements>
- <PlanElement object_ID="0" id="9" operation="HASH JOIN" cost="1,184" cardinality="1" bytes="248" cpu_cost="253,727,112" io_cost="1,153" access_predicates=""B"."ID_FLEX_NUM"="CC"."CHART_OF_ACCOUNTS_ID"" filter_predicates=""CC"."SEGMENT1">=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_LOW",1,3),"CC"."SEGMENT1") AND "CC"."SEGMENT1"<=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_HIGH",1,3),"CC"."SEGMENT1") AND "CC"."SEGMENT2">=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_LOW",5,4),"CC"."SEGMENT2") AND "CC"."SEGMENT2"<=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_HIGH",5,4),"CC"."SEGMENT2") AND "CC"."SEGMENT3">=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_LOW",10,4),"CC"."SEGMENT3") AND "CC"."SEGMENT3"<=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_HIGH",10,4),"CC"."SEGMENT3") AND "CC"."SEGMENT4">=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_LOW",15,3),"CC"."SEGMENT4") AND "CC"."SEGMENT4"<=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_HIGH",15,3),"CC"."SEGMENT4") AND "CC"."SEGMENT5">=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_LOW",19,6),"CC"."SEGMENT5") AND "CC"."SEGMENT5"<=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_HIGH",19,6),"CC"."SEGMENT5") AND "CC"."SEGMENT6">=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_LOW",26,4),"CC"."SEGMENT6") AND "CC"."SEGMENT6"<=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_HIGH",26,4),"CC"."SEGMENT6") AND "CC"."SEGMENT7">=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_LOW",31,6),"CC"."SEGMENT7") AND "CC"."SEGMENT7"<=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_HIGH",31,6),"CC"."SEGMENT7") AND "CC"."SEGMENT8">=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_LOW",38,6),"CC"."SEGMENT8") AND "CC"."SEGMENT8"<=NVL(SUBSTR("L"."CONCATENATED_SEGMENTS_HIGH",38,6),"CC"."SEGMENT8")" time="15">
- <PlanElements>
- <PlanElement object_ID="1" id="10" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="APPLSYS" object_name="FND_FLEX_VALIDATION_RULE_LINES" object_type="TABLE" object_instance="3" cost="3" cardinality="1" bytes="119" cpu_cost="49,909" io_cost="3" filter_predicates=""L"."INCLUDE_EXCLUDE_INDICATOR"='E'" time="1">
- <PlanElements>
- <PlanElement object_ID="0" id="11" operation="NESTED LOOPS" cost="6" cardinality="1" bytes="189" cpu_cost="75,588" io_cost="6" time="1">
- <PlanElements>
- <PlanElement object_ID="0" id="12" operation="NESTED LOOPS" cost="3" cardinality="1" bytes="70" cpu_cost="25,679" io_cost="3" time="1">
- <PlanElements>
- <PlanElement object_ID="0" id="13" operation="NESTED LOOPS" cost="3" cardinality="1" bytes="39" cpu_cost="23,779" io_cost="3" time="1">
- <PlanElements>
- <PlanElement object_ID="2" id="14" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_LEDGERS" object_type="TABLE" object_instance="10" cost="1" cardinality="1" bytes="8" cpu_cost="8,541" io_cost="1" time="1">
- <PlanElements>
<PlanElement object_ID="3" id="15" operation="INDEX" option="UNIQUE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_LEDGERS_U2" object_type="INDEX (UNIQUE)" search_columns="1" cost="0" cardinality="1" cpu_cost="1,050" io_cost="0" access_predicates=""GL"."LEDGER_ID"=TO_NUMBER(:P_LEDGER_ID)" time="1" />
</PlanElements>
</PlanElement>
- <PlanElement object_ID="4" id="16" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="APPLSYS" object_name="FND_FLEX_VALIDATION_RULES" object_type="TABLE" object_instance="19" cost="2" cardinality="1" bytes="31" cpu_cost="15,238" io_cost="2" filter_predicates=""B"."ENABLED_FLAG"='Y' AND NVL("B"."END_DATE_ACTIVE",SYSDATE@!+1)>SYSDATE@!" time="1">
- <PlanElements>
<PlanElement object_ID="5" id="17" operation="INDEX" option="RANGE SCAN" optimizer="ANALYZED" object_owner="APPLSYS" object_name="FND_FLEX_VALIDATION_RULES_U1" object_type="INDEX (UNIQUE)" search_columns="3" cost="1" cardinality="1" cpu_cost="7,321" io_cost="1" access_predicates=""B"."APPLICATION_ID"=101 AND "B"."ID_FLEX_CODE"='GL#' AND "B"."ID_FLEX_NUM"=TO_NUMBER(:P_CHART_OF_ACCOUNTS_ID)" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
<PlanElement object_ID="6" id="18" operation="INDEX" option="UNIQUE SCAN" optimizer="ANALYZED" object_owner="APPLSYS" object_name="FND_FLEX_VDATION_RULES_TL_U1" object_type="INDEX (UNIQUE)" search_columns="5" cost="0" cardinality="1" bytes="31" cpu_cost="1,900" io_cost="0" access_predicates=""T"."APPLICATION_ID"=101 AND "T"."ID_FLEX_CODE"='GL#' AND "T"."ID_FLEX_NUM"=TO_NUMBER(:P_CHART_OF_ACCOUNTS_ID) AND "B"."FLEX_VALIDATION_RULE_NAME"="T"."FLEX_VALIDATION_RULE_NAME" AND "T"."LANGUAGE"=USERENV('LANG')" time="1" />
</PlanElements>
</PlanElement>
<PlanElement object_ID="7" id="19" operation="INDEX" option="RANGE SCAN" optimizer="ANALYZED" object_owner="APPLSYS" object_name="FND_FLEX_VAL_RULE_LINES_N1" object_type="INDEX" search_columns="4" cost="1" cardinality="41" cpu_cost="16,371" io_cost="1" access_predicates=""L"."APPLICATION_ID"=101 AND "L"."ID_FLEX_CODE"='GL#' AND "L"."ID_FLEX_NUM"=TO_NUMBER(:P_CHART_OF_ACCOUNTS_ID) AND "L"."FLEX_VALIDATION_RULE_NAME"="B"."FLEX_VALIDATION_RULE_NAME"" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
<PlanElement object_ID="8" id="20" operation="TABLE ACCESS" option="FULL" optimizer="ANALYZED" object_owner="GL" object_name="GL_CODE_COMBINATIONS" object_type="TABLE" object_instance="1" cost="1,177" cardinality="1,088" bytes="64,192" cpu_cost="249,419,570" io_cost="1,147" filter_predicates=""CC"."ENABLED_FLAG"='Y' AND "CC"."CHART_OF_ACCOUNTS_ID"=TO_NUMBER(:P_CHART_OF_ACCOUNTS_ID) AND NVL("CC"."END_DATE_ACTIVE",SYSDATE@!+1)>SYSDATE@!" time="15" />
</PlanElements>
</PlanElement>
- <PlanElement object_ID="9" id="21" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_BALANCES" object_type="TABLE" object_instance="6" cost="55" cardinality="52" bytes="1,300" cpu_cost="439,699" io_cost="55" time="1">
- <PlanElements>
<PlanElement object_ID="10" id="22" operation="INDEX" option="RANGE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_BALANCES_N1" object_type="INDEX" search_columns="1" cost="2" cardinality="52" cpu_cost="25,693" io_cost="2" access_predicates=""GB"."CODE_COMBINATION_ID"="CC"."CODE_COMBINATION_ID"" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
- <PlanElement object_ID="11" id="23" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_LINES" object_type="TABLE" object_instance="5" cost="19" cardinality="40" bytes="3,080" cpu_cost="170,710" io_cost="19" time="1">
- <PlanElements>
<PlanElement object_ID="12" id="24" operation="INDEX" option="RANGE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_LINES_N1" object_type="INDEX" search_columns="1" cost="2" cardinality="49" cpu_cost="24,693" io_cost="2" access_predicates=""GJL"."CODE_COMBINATION_ID"="GB"."CODE_COMBINATION_ID"" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
- <PlanElement object_ID="13" id="25" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_HEADERS" object_type="TABLE" object_instance="4" cost="1" cardinality="1" bytes="52" cpu_cost="10,809" io_cost="1" filter_predicates=""GJH"."STATUS"='P' AND "GJH"."LEDGER_ID"=TO_NUMBER(:P_LEDGER_ID) AND "GJH"."PERIOD_NAME"="GB"."PERIOD_NAME"" time="1">
- <PlanElements>
- <PlanElement object_ID="14" id="26" operation="INDEX" option="UNIQUE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_HEADERS_U1" object_type="INDEX (UNIQUE)" search_columns="1" cost="0" cardinality="1" cpu_cost="1,950" io_cost="0" access_predicates=""GJH"."JE_HEADER_ID"="GJL"."JE_HEADER_ID"" filter_predicates=""GJL"."DESCRIPTION"= (SELECT "DESCRIPTION" FROM "GL"."GL_JE_LINES" "GL_JE_LINES" WHERE ROWNUM=1 AND "PERIOD_NAME"=:B1 AND "CODE_COMBINATION_ID"=:B2 AND "JE_HEADER_ID"=:B3)" time="1">
- <PlanElements>
- <PlanElement object_ID="0" id="27" operation="COUNT" option="STOPKEY" filter_predicates="ROWNUM=1">
- <PlanElements>
- <PlanElement object_ID="11" id="28" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_LINES" object_type="TABLE" object_instance="15" cost="5" cardinality="1" bytes="62" cpu_cost="39,168" io_cost="5" filter_predicates=""JE_HEADER_ID"=:B1" time="1">
- <PlanElements>
<PlanElement object_ID="12" id="29" operation="INDEX" option="RANGE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_LINES_N1" object_type="INDEX" search_columns="2" cost="3" cardinality="4" cpu_cost="22,364" io_cost="3" access_predicates=""CODE_COMBINATION_ID"=:B1 AND "PERIOD_NAME"=:B2" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
- <PlanElement object_ID="15" id="30" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_SOURCES_TL" object_type="TABLE" object_instance="16" cost="1" cardinality="2" bytes="62" cpu_cost="9,291" io_cost="1" time="1">
- <PlanElements>
<PlanElement object_ID="16" id="31" operation="INDEX" option="UNIQUE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_SOURCES_TL_U1" object_type="INDEX (UNIQUE)" search_columns="2" cost="0" cardinality="1" cpu_cost="1,900" io_cost="0" access_predicates=""JE_SOURCE_NAME"="GJH"."JE_SOURCE" AND "LANGUAGE"=USERENV('LANG')" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
- <PlanElement object_ID="17" id="32" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_CATEGORIES_TL" object_type="TABLE" object_instance="17" cost="1" cardinality="2" bytes="62" cpu_cost="9,231" io_cost="1" time="1">
- <PlanElements>
<PlanElement object_ID="18" id="33" operation="INDEX" option="UNIQUE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_CATEGORIES_TL_U1" object_type="INDEX (UNIQUE)" search_columns="2" cost="0" cardinality="1" cpu_cost="1,900" io_cost="0" access_predicates=""JE_CATEGORY_NAME"="GJH"."JE_CATEGORY" AND "LANGUAGE"=USERENV('LANG')" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
- <PlanElement object_ID="19" id="34" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="APPLSYS" object_name="FND_USER" object_type="TABLE" object_instance="7" cost="1" cardinality="1" bytes="13" cpu_cost="9,191" io_cost="1" time="1">
- <PlanElements>
<PlanElement object_ID="20" id="35" operation="INDEX" option="UNIQUE SCAN" optimizer="ANALYZED" object_owner="APPLSYS" object_name="FND_USER_U1" object_type="INDEX (UNIQUE)" search_columns="1" cost="0" cardinality="1" cpu_cost="1,900" io_cost="0" access_predicates=""GJH"."LAST_UPDATED_BY"="FU"."USER_ID"" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
- <PlanElement object_ID="0" id="36" operation="SORT" option="AGGREGATE" cardinality="1" bytes="33">
- <PlanElements>
- <PlanElement object_ID="0" id="37" operation="COUNT" option="STOPKEY" filter_predicates="ROWNUM=1">
- <PlanElements>
- <PlanElement object_ID="0" id="38" operation="NESTED LOOPS" cost="69" cardinality="49" bytes="1,617" cpu_cost="624,699" io_cost="69" time="1">
- <PlanElements>
- <PlanElement object_ID="11" id="39" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_LINES" object_type="TABLE" object_instance="12" cost="20" cardinality="49" bytes="882" cpu_cost="164,029" io_cost="20" time="1">
- <PlanElements>
<PlanElement object_ID="12" id="40" operation="INDEX" option="RANGE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_LINES_N1" object_type="INDEX" search_columns="1" cost="3" cardinality="49" cpu_cost="30,964" io_cost="3" access_predicates=""B"."CODE_COMBINATION_ID"=:B1" time="1" />
</PlanElements>
</PlanElement>
- <PlanElement object_ID="13" id="41" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_HEADERS" object_type="TABLE" object_instance="11" cost="1" cardinality="1" bytes="15" cpu_cost="9,401" io_cost="1" filter_predicates=""A"."STATUS"='P'" time="1">
- <PlanElements>
<PlanElement object_ID="14" id="42" operation="INDEX" option="UNIQUE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_HEADERS_U1" object_type="INDEX (UNIQUE)" search_columns="1" cost="0" cardinality="1" cpu_cost="1,900" io_cost="0" access_predicates=""A"."JE_HEADER_ID"="B"."JE_HEADER_ID"" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
- <PlanElement object_ID="0" id="43" operation="COUNT" option="STOPKEY" filter_predicates="ROWNUM=1">
- <PlanElements>
- <PlanElement object_ID="11" id="44" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_LINES" object_type="TABLE" object_instance="13" cost="5" cardinality="1" bytes="33" cpu_cost="39,068" io_cost="5" filter_predicates=""JE_HEADER_ID"=:B1 AND "LAST_UPDATE_DATE"=:B2" time="1">
- <PlanElements>
<PlanElement object_ID="12" id="45" operation="INDEX" option="RANGE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_JE_LINES_N1" object_type="INDEX" search_columns="2" cost="3" cardinality="4" cpu_cost="22,364" io_cost="3" access_predicates=""CODE_COMBINATION_ID"=:B1 AND "PERIOD_NAME"=:B2" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
- <PlanElement object_ID="0" id="46" operation="SORT" option="AGGREGATE" cardinality="1" bytes="25">
- <PlanElements>
- <PlanElement object_ID="9" id="47" operation="TABLE ACCESS" option="BY INDEX ROWID" optimizer="ANALYZED" object_owner="GL" object_name="GL_BALANCES" object_type="TABLE" object_instance="14" cost="6" cardinality="1" bytes="25" cpu_cost="45,399" io_cost="6" time="1">
- <PlanElements>
<PlanElement object_ID="10" id="48" operation="INDEX" option="RANGE SCAN" optimizer="ANALYZED" object_owner="GL" object_name="GL_BALANCES_N1" object_type="INDEX" search_columns="2" cost="3" cardinality="2" cpu_cost="21,964" io_cost="3" access_predicates=""CODE_COMBINATION_ID"=:B1 AND "PERIOD_NAME"=:B2" time="1" />
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
</PlanElements>
</PlanElement>
</ExplainPlan>
Sorry, this XML was generated...
Regards
asp -
My SQL query is taking hours to run from Toad
SELECT * from (select hr.NAME org_name, ractla.org_id, acct_site.attribute_category,
acct_site.attribute10 general_agent, arc.reversal_category,arr.reversal_gl_date,ard.reversed_source_id,
DECODE (acct_site.attribute_category,
'COMM', acct_site.attribute9,
'IPLAN', acct_site.attribute11,
NULL
) broker_number, -- changed by Michelle
SUBSTR (hca.account_number, 1, 12) customer_number,
ractla.attribute3 plan_type,
SUBSTR (hp.party_name, 1, 15) customer_name,
DECODE (ractla.attribute_category,
'Commercial Receivables', ractla.attribute1,
'Subscriber Receivables', ractla.attribute1,
NULL
) product, --x Product
rax.trx_number trx_number, rax.trx_date transaction_date,
DECODE (SIGN (INSTR (rax.attribute4, '/')),
        1, TO_DATE ('01'
                    || TO_CHAR (DECODE (SIGN (INSTR (rax.attribute4, '/')),
                                        1, TO_DATE (SUBSTR (rax.attribute4,
                                                            1,
                                                            10),
                                                    'rrrr/mm/dd'),
                                        NULL),
                                'MON-RRRR'),
                    'DD-MON-RRRR'),
        NULL
       ) coverage_month,
DECODE (SIGN (INSTR (rax.attribute4, '/')),
1, TO_DATE (SUBSTR (rax.attribute4, 1, 10), 'rrrr/mm/dd'),
NULL
) due_date,
DECODE (acct_site.attribute_category,
'COMM', SUBSTR (hca.account_number, 1, 6),
acct_site.attribute10
) employer_group,
DECODE (arc.reversal_category,
        'NSF', TRUNC (arr.creation_date),
        'REV', TRUNC (arr.creation_date),
        DECODE (arr.reversal_gl_date,
                NULL, TRUNC (arr.apply_date),
                DECODE (ard.reversed_source_id,
                        NULL, TRUNC (arr.apply_date),
                        TRUNC (arr.creation_date)
                       )
               )
       ) as application_date,
arr.apply_date,
arr.creation_date, --9/8/03
arc.receipt_number receipt_number, arc.receipt_date receipt_date,
arr.amount_applied applied_amount, rax.attribute1 company,
rax.attribute2 division, ractla.attribute2 coverage_plan, -- (Plan Code)
DECODE (acct_site.attribute_category,
'IPLAN', acct_site.attribute13,
'SH', acct_site.attribute8,
NULL
) coverage_type,
ps.amount_due_original trans_amount,
NVL (ps.amount_due_remaining, 0) trans_amount_remaining,
SUBSTR (hre.first_name || ' ' || hre.middle_name || ' '
|| hre.last_name,
1,
25
) salesrep_name,
SUBSTR (acct_site.attribute5, 1, 5) salesrep_extension,
hca.attribute4 SOURCE
FROM apps.ra_customer_trx_all rax, -- invoice info
apps.ar_payment_schedules_all ps,
apps.ra_cust_trx_types_all rat,
apps.hz_cust_accounts_all hca,
apps.hz_parties hp,
apps.ar_receivable_applications_all arr,
apps.ar_cash_receipts_all arc, --- receipt info
apps.hr_operating_units hr,
apps.hr_employees hre,
apps.hz_cust_acct_sites_all acct_site,
apps.hz_cust_site_uses_all acct_site_use, --added by tapas on 7/16
apps.ra_customer_trx_lines_all ractla,
apps.ar_distributions_all ard
WHERE
hca.cust_account_id = rax.bill_to_customer_id
AND ps.customer_trx_id = rax.customer_trx_id
AND rat.cust_trx_type_id = rax.cust_trx_type_id
AND rat.org_id = rax.org_id
AND arr.applied_customer_trx_id = rax.customer_trx_id
AND arr.status = 'APP'
AND arr.cash_receipt_id = arc.cash_receipt_id
AND hr.organization_id = rax.org_id
AND rax.customer_trx_id = ractla.customer_trx_id
AND ractla.line_number = 1
AND rax.bill_to_site_use_id = acct_site_use.site_use_id
AND hca.cust_account_id = acct_site.cust_account_id
AND acct_site.cust_acct_site_id = acct_site_use.cust_acct_site_id
AND hca.party_id = hp.party_id
AND acct_site.attribute4 = hre.employee_id(+)
AND hr.NAME = 'Commercial Group'
AND ard.source_table = 'RA'
AND ard.source_id = arr.receivable_application_id) app
where app.application_date between :start_date and :end_Date
and app.application_date <> app.apply_date
and app.applied_amount>0;
This is my query. Due to a requirement I have written an inline view. The query executes without the inline view, and even with the inline view included, but once I add the condition "app.applied_amount > 0" it runs for hours and never completes. Please advise; this is urgent, as a Discoverer report needs to be built on top of it.
Thanks.
Edited by: 958913 on Oct 8, 2012 3:52 PM
Edited by: 958913 on Oct 8, 2012 4:19 PM
Edited by: 958913 on Oct 8, 2012 4:33 PM
Hi,
The next step I have taken: since the condition I need is based on application_date (which is not a database column but is derived by a DECODE in the select list), I removed that DECODE from the select, moved it into an inline view, and added the condition to the WHERE clause there. Here is the query, and it is still slow to run.
/* Formatted on 2012/10/09 13:51 (Formatter Plus v4.8.8) */
SELECT hr.NAME org_name, ractla.org_id, acct_site.attribute_category,
acct_site.attribute10 general_agent, arc.reversal_category,
arr.reversal_gl_date, ard.reversed_source_id,
DECODE (acct_site.attribute_category,
'COMM', acct_site.attribute9,
'IPLAN', acct_site.attribute11,
NULL
) broker_number, -- changed by Michelle
SUBSTR (hca.account_number, 1, 12) customer_number,
ractla.attribute3 plan_type,
SUBSTR (hp.party_name, 1, 15) customer_name,
DECODE (ractla.attribute_category,
'Commercial Receivables', ractla.attribute1,
'Subscriber Receivables', ractla.attribute1,
NULL
) product, --x Product
rax.trx_number trx_number, rax.trx_date transaction_date,
DECODE (SIGN (INSTR (rax.attribute4, '/')),
        1, TO_DATE ('01'
                    || TO_CHAR (DECODE (SIGN (INSTR (rax.attribute4, '/')),
                                        1, TO_DATE (SUBSTR (rax.attribute4,
                                                            1,
                                                            10),
                                                    'rrrr/mm/dd'),
                                        NULL),
                                'MON-RRRR'),
                    'DD-MON-RRRR'),
        NULL
       ) coverage_month,
DECODE (SIGN (INSTR (rax.attribute4, '/')),
1, TO_DATE (SUBSTR (rax.attribute4, 1, 10), 'rrrr/mm/dd'),
NULL
) due_date,
DECODE (acct_site.attribute_category,
'COMM', SUBSTR (hca.account_number, 1, 6),
acct_site.attribute10
) employer_group,
app.application_date,
arr.apply_date, arr.creation_date, --9/8/03
arc.receipt_number receipt_number,
arc.receipt_date receipt_date, arr.amount_applied applied_amount,
rax.attribute1 company, rax.attribute2 division,
ractla.attribute2 coverage_plan, -- (Plan Code)
DECODE (acct_site.attribute_category,
'IPLAN', acct_site.attribute13,
'SH', acct_site.attribute8,
NULL
) coverage_type,
ps.amount_due_original trans_amount,
NVL (ps.amount_due_remaining, 0) trans_amount_remaining,
SUBSTR (hre.first_name || ' ' || hre.middle_name || ' '
|| hre.last_name,
1,
25
) salesrep_name,
SUBSTR (acct_site.attribute5, 1, 5) salesrep_extension,
hca.attribute4 SOURCE
FROM apps.ra_customer_trx_all rax, -- invoice info
apps.ar_payment_schedules_all ps,
apps.ra_cust_trx_types_all rat,
apps.hz_cust_accounts_all hca,
apps.hz_parties hp,
apps.ar_receivable_applications_all arr,
apps.ar_cash_receipts_all arc, --- receipt info
apps.hr_operating_units hr,
apps.hr_employees hre,
apps.hz_cust_acct_sites_all acct_site,
apps.hz_cust_site_uses_all acct_site_use, --added by tapas on 7/16
apps.ra_customer_trx_lines_all ractla,
apps.ar_distributions_all ard,
(select rax1.customer_trx_id,
        DECODE (arc1.reversal_category,
                'NSF', TRUNC (arr1.creation_date),
                'REV', TRUNC (arr1.creation_date),
                DECODE (arr1.reversal_gl_date,
                        NULL, TRUNC (arr1.apply_date),
                        DECODE (ard1.reversed_source_id,
                                NULL, TRUNC (arr1.apply_date),
                                TRUNC (arr1.creation_date)
                               )
                       )
               ) as application_date
from apps.ar_receivable_applications_all arr1,
apps.ar_cash_receipts_all arc1,
apps.ra_customer_trx_all rax1,
apps.ar_distributions_all ard1
where arr1.applied_customer_trx_id = rax1.customer_trx_id
AND arr1.status = 'APP'
AND arr1.cash_receipt_id = arc1.cash_receipt_id
AND ard1.source_table = 'RA'
AND ard1.source_id = arr1.receivable_application_id
) app
WHERE hca.cust_account_id = rax.bill_to_customer_id
AND app.customer_trx_id = rax.customer_trx_id
AND arr.apply_date <> app.application_date
and app.application_date between :start_date and :end_date
--AND rax.trx_number IN ('52715888', '52689753')
AND ps.customer_trx_id = rax.customer_trx_id
AND rat.cust_trx_type_id = rax.cust_trx_type_id
AND rat.org_id = rax.org_id
AND arr.applied_customer_trx_id = rax.customer_trx_id
AND arr.status = 'APP'
AND arr.cash_receipt_id = arc.cash_receipt_id
AND hr.organization_id = rax.org_id
AND rax.customer_trx_id = ractla.customer_trx_id
AND ractla.line_number = 1
AND rax.bill_to_site_use_id = acct_site_use.site_use_id
AND hca.cust_account_id = acct_site.cust_account_id
AND acct_site.cust_acct_site_id = acct_site_use.cust_acct_site_id
AND hca.party_id = hp.party_id
AND acct_site.attribute4 = hre.employee_id(+)
AND hr.NAME = 'Commercial Group'
AND ard.source_table = 'RA'
AND ard.source_id = arr.receivable_application_id -
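For reference, the restructuring above (compute the DECODE-derived application_date inside an inline view, then filter on it in the outer query) can be sketched on a toy schema. This is a minimal illustration in Python/SQLite with invented table and column names, not the actual e-Business Suite tables:

```python
import sqlite3

# Toy stand-in for the receipts data; the derived column is computed in the
# inline view, then filtered in the outer query.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE receipts (id INTEGER, reversal_category TEXT,
                           creation_date TEXT, apply_date TEXT);
    INSERT INTO receipts VALUES
        (1, 'NSF', '2012-09-18', '2012-09-15'),
        (2, NULL,  '2012-10-02', '2012-09-20'),
        (3, 'REV', '2012-10-03', '2012-10-03');
""")
rows = conn.execute("""
    SELECT id, application_date
    FROM (SELECT id,
                 apply_date,
                 CASE WHEN reversal_category IN ('NSF', 'REV')
                      THEN creation_date     -- reversal: use creation date
                      ELSE apply_date        -- otherwise: use apply date
                 END AS application_date     -- derived column, now filterable
          FROM receipts) app
    WHERE app.application_date BETWEEN '2012-09-01' AND '2012-09-30'
      AND app.application_date <> app.apply_date
""").fetchall()
print(rows)
```

Only row 1 survives: its derived date falls in September and differs from the apply date. The structure is the same; whether it is fast on the real tables depends on indexes and statistics, not just this shape.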
Query problem in viewobject wizard
I have a problem with view object creation.
In the SQL query of the view object wizard, I cannot apply built-in SQL functions such as TRUNC or NVL to an attribute.
If I use a function, the query test reports no problem (the query is valid). However, when I click the OK or Apply button,
a message pops up:
"the following attributes will be set as transients. Derived expression will be lost"
Does this mean I cannot use any built-in function in the select list of the view object wizard?
Another question is how can I nest sql in the select list?
for instance:
in my select list in Query(viewobject wizard)
I have Query:
select Customer.CUSTOMER_ID,
Customer.BANK_NUMBER,
Customer.TRANS_AMOUNT
from CUSTOMER Customer
where Customer.CUSTOMER_ID =:0
it works without problem
But if add nested select query:
select Customer.CUSTOMER_ID,
Customer.BANK_NUMBER,
(select sum(c.TRANS_AMOUNT)
from CUSTOMER c
where......
) as "Amount",
Account.FILE_NAME
from CUSTOMER Customer, ACCOUNT Account
where Customer.CUSTOMER_ID =:0
It did not work.
Can somebody tell me how to solve this problem?
Or is there another way to achieve this?
Thanks.
Sorry, I wrote the wrong one:
function CF_ITEMFormula return number is
v_item number(30);
begin
IF :P_STATE <> 'ALL' THEN
IF :P_STATE = 'SMALL' THEN
SELECT aa.item
INTO v_item
FROM tableA aa,
tableb bb
WHERE aa.code = bb.code
AND aa.date = :P_date
AND aa.name = :P_name
AND bb.state LIKE '%et%'
AND (case :P_DIV
when ('west') then aa.division = 1
when ('east') then aa.division = 2
when ('south') then aa.division = 3
when ('south & west') then aa.division in (1,3)
else aa.division in (1,2,3)
end);
end if;
IF :P_STATE = 'BIG' THEN
SELECT aa.item
INTO v_item
FROM tableA aa,
tableb bb
WHERE aa.code = bb.code
AND aa.date = :P_date
AND aa.name = :P_name
AND bb.state LIKE '%wa%'
AND (case :P_DIV
when ('west') then aa.division = 1
when ('east') then aa.division = 2
when ('south') then aa.division = 3
when ('south & west') then aa.division in (1,3)
else aa.division in (1,2,3)
end);
end if;
end if;
EXCEPTION
WHEN NO_DATA_FOUND THEN RETURN (0);
end; -
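Note that a CASE expression cannot serve as a boolean predicate in Oracle SQL the way the function above attempts (`when ('west') then aa.division = 1` is not valid). A common workaround is to resolve the parameter to a set of divisions first and filter with IN. A minimal sketch of that idea in Python/SQLite, with an invented table:

```python
import sqlite3

# Map each :P_DIV value to the divisions it covers; anything else means "all".
DIVISIONS = {
    'west': (1,), 'east': (2,), 'south': (3,), 'south & west': (1, 3),
}

def divisions_for(p_div):
    # fall back to every division for unrecognized / "all" values
    return DIVISIONS.get(p_div, (1, 2, 3))

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE items (item INTEGER, division INTEGER);
    INSERT INTO items VALUES (10, 1), (20, 2), (30, 3);
""")
divs = divisions_for('south & west')
placeholders = ",".join("?" * len(divs))
rows = conn.execute(
    f"SELECT item FROM items WHERE division IN ({placeholders}) ORDER BY item",
    divs).fetchall()
print(rows)
```

In pure SQL the same effect can be had with OR-ed conditions, e.g. `AND ((:P_DIV = 'west' AND aa.division = 1) OR ...)`.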
select distinct TRANS_counselid,
Decode(upper(months),'APR', total_qty, 0) As April,
Decode(upper(months),'MAY', total_qty, 0) As May,
Decode(upper(months),'JUN', total_qty, 0) As June,
Decode(upper(months),'JUL', total_qty, 0) As July,
Decode(upper(months),'AUG', total_qty, 0) As August,
Decode(upper(months),'SEP', total_qty, 0) As Sept,
Decode(upper(months),'OCT', total_qty, 0) As October,
Decode(upper(months),'NOV', total_qty, 0) As November,
Decode(upper(months),'DEC', total_qty, 0) As December,
Decode(upper(months),'JAN', total_qty, 0) As January,
Decode(upper(months),'FEB', total_qty, 0) As February,
Decode(upper(months),'MAR', total_qty, 0) As March,total_qty
from (select TRANS_counselid,
to_char(TRANS_DATE,'MON') months,
sum(TRANS_AMOUNT) total_qty
from LEGAL_TRANS
group by TRANS_counselid, to_char(TRANS_DATE,'MON'))
This gives me a monthly report in matrix format, but although it gives a monthly figure for each trans_counselid, it creates a new row for each month. Is there any way to show each trans_counselid on only one row?
Thanks in advance.
I don't quite understand what you mean; some sample data might help.
Anyway, this is my guess:
WITH legal_trans as (
select 1 trans_counselid, to_date('01-01-06','DD-MM-YY') trans_date, 10 trans_amount from dual union all
select 1 trans_counselid, to_date('01-02-06','DD-MM-YY') trans_date, 20 trans_amount from dual union all
select 2 trans_counselid, to_date('01-02-06','DD-MM-YY') trans_date, 20 trans_amount from dual union all
select 3 trans_counselid, to_date('01-03-06','DD-MM-YY') trans_date, 30 trans_amount from dual union all
select 4 trans_counselid, to_date('01-04-06','DD-MM-YY') trans_date, 40 trans_amount from dual union all
select 5 trans_counselid, to_date('01-05-06','DD-MM-YY') trans_date, 50 trans_amount from dual union all
select 6 trans_counselid, to_date('01-06-06','DD-MM-YY') trans_date, 60 trans_amount from dual union all
select 7 trans_counselid, to_date('01-07-06','DD-MM-YY') trans_date, 70 trans_amount from dual union all
select 8 trans_counselid, to_date('01-08-06','DD-MM-YY') trans_date, 80 trans_amount from dual union all
select 9 trans_counselid, to_date('01-09-06','DD-MM-YY') trans_date, 90 trans_amount from dual union all
select 9 trans_counselid, to_date('01-10-06','DD-MM-YY') trans_date, 90 trans_amount from dual
)
SELECT DISTINCT trans_counselid,
SUM(decode(UPPER(MONTHS), '04', total_qty, 0)) AS april,
SUM(decode(UPPER(MONTHS), '05', total_qty, 0)) AS may,
SUM(decode(UPPER(MONTHS), '06', total_qty, 0)) AS june,
SUM(decode(UPPER(MONTHS), '07', total_qty, 0)) AS july,
SUM(decode(UPPER(MONTHS), '08', total_qty, 0)) AS august,
SUM(decode(UPPER(MONTHS), '09', total_qty, 0)) AS sept,
SUM(decode(UPPER(MONTHS), '10', total_qty, 0)) AS october,
SUM(decode(UPPER(MONTHS), '11', total_qty, 0)) AS november,
SUM(decode(UPPER(MONTHS), '12', total_qty, 0)) AS december,
SUM(decode(UPPER(MONTHS), '01', total_qty, 0)) AS january,
SUM(decode(UPPER(MONTHS), '02', total_qty, 0)) AS february,
SUM(decode(UPPER(MONTHS), '03', total_qty, 0)) AS march,
SUM(total_qty) total_qty
FROM
( SELECT trans_counselid,
to_char(trans_date, 'MM') MONTHS,
SUM(trans_amount) total_qty
FROM legal_trans
GROUP BY trans_counselid,
to_char(trans_date, 'MM') )
GROUP BY trans_counselid
ORDER BY trans_counselid
TRANS_COUNSELID APRIL MAY JUNE JULY AUGUST SEPT OCTOBER NOVEMBER DECEMBER JANUARY FEBRUARY MARCH TOTAL_QTY
1 0 0 0 0 0 0 0 0 0 10 20 0 30
2 0 0 0 0 0 0 0 0 0 0 20 0 20
3 0 0 0 0 0 0 0 0 0 0 0 30 30
4 40 0 0 0 0 0 0 0 0 0 0 0 40
5 0 50 0 0 0 0 0 0 0 0 0 0 50
6 0 0 60 0 0 0 0 0 0 0 0 0 60
7 0 0 0 70 0 0 0 0 0 0 0 0 70
8 0 0 0 0 80 0 0 0 0 0 0 0 80
9 0 0 0 0 0 90 90 0 0 0 0 0 180
9 rows selected -
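The reason the result above collapses to one row per trans_counselid is the combination of the conditional aggregate (SUM wrapped around each DECODE) and the GROUP BY. A minimal, portable sketch of the same pivot in Python/SQLite, using CASE in place of Oracle's DECODE and made-up data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE legal_trans (trans_counselid INTEGER, trans_date TEXT,
                              trans_amount INTEGER);
    INSERT INTO legal_trans VALUES
        (1, '2006-01-01', 10), (1, '2006-02-01', 20), (2, '2006-02-01', 20);
""")
# SUM(CASE ...) per month + GROUP BY id = one pivoted row per id
rows = conn.execute("""
    SELECT trans_counselid,
           SUM(CASE strftime('%m', trans_date) WHEN '01' THEN trans_amount
               ELSE 0 END) AS january,
           SUM(CASE strftime('%m', trans_date) WHEN '02' THEN trans_amount
               ELSE 0 END) AS february,
           SUM(trans_amount) AS total_qty
    FROM legal_trans
    GROUP BY trans_counselid
    ORDER BY trans_counselid
""").fetchall()
print(rows)
```

Id 1 gets both its January and February amounts on a single row; without the SUM each month would land on its own row, which is exactly the problem described.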
Finding Duplicate Entries in a table
I have a table with the following fields in Oracle 10g.
TABLE_1
account_no | tracking_id | trans_amount
Each account_no can have multiple tracking Id's.
How do I query out the duplicate entries of account_no where tracking_id lies between 1 and 1000?
Many thanks in advance for your help,
novice.
Hi,
Whenever you have a question, it helps to post a little sample data, and the results you want from that data.
You may think you're explaining clearly, but it won't be nearly as clear to someone who isn't familiar with your tables, your data, and your requirements. Sanjay and I apparently have different ideas about what you requested, and perhaps neither is what you really want. It would be much harder for us to make mistakes and give you bad advice if you posted the data and results.
Here's my guess:
WITH got_cnt AS
(
    SELECT table_1.*
         , COUNT (*) OVER (PARTITION BY account_no) AS cnt
    FROM table_1
    WHERE tracking_id BETWEEN 1
                          AND 1000
)
SELECT account_no, tracking_id, trans_amount
FROM got_cnt
WHERE cnt > 1
;If there are two rows with the same account_no, and both rows have tracking_id in the given range, then both rows will be displayed. -
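A portable variant of the same idea, for databases without analytic functions: count rows per account in a subquery and join back, so every duplicate row is returned rather than one row per account. Sketch in Python/SQLite with invented data; the column names follow the question:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE table_1 (account_no INTEGER, tracking_id INTEGER,
                          trans_amount INTEGER);
    INSERT INTO table_1 VALUES
        (100, 5, 50), (100, 7, 70), (200, 9, 90), (300, 2000, 10);
""")
# the subquery finds accounts with >1 row in the tracking_id range;
# joining back returns all the duplicate rows themselves
rows = conn.execute("""
    SELECT t.account_no, t.tracking_id, t.trans_amount
    FROM table_1 t
    JOIN (SELECT account_no
          FROM table_1
          WHERE tracking_id BETWEEN 1 AND 1000
          GROUP BY account_no
          HAVING COUNT(*) > 1) d ON d.account_no = t.account_no
    WHERE t.tracking_id BETWEEN 1 AND 1000
    ORDER BY t.account_no, t.tracking_id
""").fetchall()
print(rows)
```

Account 100 appears twice in range and both of its rows come back; account 300's row is excluded because its tracking_id (2000) is outside 1..1000.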
What is maintaining process keys in mm extraction
hi friends,
What is maintaining process keys in MM extraction?
Thanking you,
Suneel.
Hi Suneel,
The process keys basically represent the type of transaction which has happened in the source system.
If you notice the update rules for the Purchasing cubes like 0PUR_C01, etc. You will see some routines like this:
IF COMM_STRUCTURE-PROCESSKEY = '001'.
  RESULT = COMM_STRUCTURE-TRANS_AMOUNT.
ELSE.
  CLEAR RESULT. " no update of key figure
ENDIF.
<b>This means that the process key determines which key figure is populated in the cube.</b>
Some of the common process keys for Purchasing are:
Event MA (purchase order) and purchase order category 1 yield process key 001.
Event MA (purchase order) and purchase order category 2 yield process key 011.
Event MA (purchase order) and purchase order category 3 yield process key 021.
Event MB (goods receipt) and purchase order category 1 yield process key 002.
Event MB (goods receipt) and purchase order category 2 yield process key 012.
Event MB (goods receipt) and purchase order category 3 yield process key 022.
Event MC (invoice receipt) and purchase order category 1 yield process key 003.
Event MC (invoice receipt) and purchase order category 3 yield process key 023.
Event MD (scheduling agreement) and purchase order category 1 yield process key 004.
Event MD (scheduling agreement) and purchase order category 2 yield process key 014.
Event MD (scheduling agreement) and purchase order category 3 yield process key 024.
Event ME (contracts) and purchase order category 1 yield process key 008.
Event ME (contracts) and purchase order category 2 yield process key 028.
Event ME (contracts) and purchase order category 3 yield process key 028.
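The event / purchase-order-category pairs above can be read as a simple lookup table. A small illustrative sketch in Python (the mapping is copied from the list above as given; the lookup function itself is just an illustration, not SAP code):

```python
# (event, purchase order category) -> process key, taken from the list above.
# Note the source lists no entry for ('MC', 2), so none is invented here.
PROCESS_KEYS = {
    ('MA', 1): '001', ('MA', 2): '011', ('MA', 3): '021',  # purchase order
    ('MB', 1): '002', ('MB', 2): '012', ('MB', 3): '022',  # goods receipt
    ('MC', 1): '003', ('MC', 3): '023',                    # invoice receipt
    ('MD', 1): '004', ('MD', 2): '014', ('MD', 3): '024',  # scheduling agreement
    ('ME', 1): '008', ('ME', 2): '028', ('ME', 3): '028',  # contracts
}

def process_key(event, po_category):
    """Return the process key for an event / PO-category pair, or None."""
    return PROCESS_KEYS.get((event, po_category))

print(process_key('MB', 2))  # goods receipt, PO category 2
```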
You can find more information if you search across the forum.
Thanks,
Amol -
Can anyone please tell me the functionality of process keys and how to use them in Inventory Management?
Thanks.
Hi Akash,
Check OSS Note 352344 (Process key + reversals in Inventory Management):
Symptom
This note is a consulting note and describes the use of the process key (0PROCESSKEY) in Inventory Management (MSEG). It focuses on the way the system deals with reversed transactions for DataSources 2LIS_40_S279 and 2LIS_03_BF.
To be able to use these DataSources, you ABSOLUTELY MUST activate the transaction key generation (process key, PROCESSKEY) using Transaction MCB_ (from the OLTP IMG for BW: SBIW) (standard, retail or consumption goods).
The following transaction keys are available
(PROCESSKEY/Appl. Component/Description):
000/MM Misc. receipts
001/MM Goods receipt / vendor
004/MM Article transfer posting receipt
005/MM Stock correction inventory +
006/MM Stock correction other +
007/IS-R Receipt value-only article (single article
posting)
010/MM Receipt from stock transfer
002/IS-R Merchandise clearing receipt
003/IS-R GR from DC
100/MM Misc. issues
101/MM Returns / Vendor
104/MM Article transfer posting issue
105/MM Stock correction inventory -
106/MM Stock correction other -
107/IS-R Issue value-only article (single article
posting)
110/MM Issue from stock transfer
102/IS-R Merchandise clearing issue
103/IS-R GI from DC
450/IS-R Generic Article (not relevant)
Remark: Transaction keys 002/003, 102/103 break down the core keys 010/110 in more detail with respect to retail processes. They are only available in an R/3 Retail.
As you can see in the overview, the transaction keys can be divided according to receipts and issues in Inventory Management. Furthermore, the transaction keys are displayed according to reversed and regular transactions.
A regular receipt has a debit/credit indicator "S" (SHKZG, 0DCINDIC), whereas a regular issue has a debit/credit indicator "H".
For reverse transactions the opposite is true.
Transaction                              D/C ind. S      D/C ind. H
RECEIPTS
0 Misc. receipts regular reversed
1 Goods receipt / vendor regular reversed
2 Merchandise clearing receipt regular reversed
3 GR from DC regular reversed
4 Article transfer posting receipt regular reversed
5 Stock correction inventory + regular reversed
6 Stock correction other + regular reversed
7 Receipt value-only article regular reversed
10 Receipt from stock transfer regular reversed
ISSUES
100 Misc. issues reversed regular
101 Returns / vendor reversed regular
102 Merchandise clearing issue reversed regular
103 GI from DC reversed regular
104 Article transfer posting issue reversed regular
105 Stock correction inventory - reversed regular
106 Stock correction other - reversed regular
107 Issue value-only article reversed regular
110 Issue from stock transfer reversed regular
Note: You can also recognize a reversal for DataSource 2LIS_03_BF by means of the entry 0STORNO = 'X'. The fields that are marked with X in the table are then transferred with a negative sign. This was not the case with DataSource 2LIS_40_S279! With DataSource 2LIS_40_S279, more logic was required in the BW update rules to make sure that key figures were updated correctly.
Example:
In the delivered InfoCubes 0CP_IC_C1 (CP) and 0RT_C01 (Retail), for example in key "Stock correction +", transaction keys 5 and 6 were grouped together. Furthermore, distinction is to be made between the different stock types. Depending on which stock types you want to distinguish between in different key figures, you must use a corresponding condition (IF statement) in the update rules in the BW.
Example (pseudo source code):
* Update routine "stock adjustment +" for 2LIS_03_BF
IF ( STOCKCAT IS INITIAL ) AND                 " evaluated stocks
   ( PROCESSKEY = 5 OR PROCESSKEY = 6 ).
  RESULT = TRANS_AMOUNT.
  RETURNCODE = 0.                              " update key figure
ELSE.
  RETURNCODE = 4.                              " no update of key figure
ENDIF.
The pseudo source code for 2LIS_40_S279 read as follows:
* Update routine "stock adjustment +" for 2LIS_40_S279
IF ( STOCKCAT IS INITIAL ) AND                 " evaluated stocks
   ( PROCESSKEY = 5 OR PROCESSKEY = 6 ).
  IF DCINDIC = 'S'.
    RESULT = TRANS_AMOUNT.                     " regular
  ELSE.
    RESULT = -1 * TRANS_AMOUNT.                " reversal
  ENDIF.
  RETURNCODE = 0.                              " update key figure
ELSE.
  RETURNCODE = 4.                              " no update of key figure
ENDIF.
Here, the debit/credit indicator must be checked in accordance with the table above. Transactions 5 and 6 are receipts in Inventory Management. As the debit/credit indicator is set to "S", it is a regular transaction whose value(TRANS_AMOUNT) is assigned to the key figure. In the other case (debit/credit indicator = "H") it is a reversal, that is, the transaction should reverse a transaction that has already been updated. For this, the value is multiplied by -1 so that a corresponding decrease/reduction of this key figure is achieved during the update of the key figure in the InfoCube.
This logic is no longer required for the 2LIS_03_BF (see first pseudo source code), because the reversed quantity and values are automatically provided as negative with the help of the 0STORNO field.
Using this DataSource 2LIS_03_BF, it is for example possible to create a key figure such as "Reversed receipts", which is not a part of the Business Content delivered. The following pseudo source code of an update routine makes this clear:
Update routine "Reversed receipts"
IF ( PROCESSKEY = 1 ) AND ( STORNO = 'X' ).    " reversal
  RESULT = -1 * TRANS_AMOUNT.
  RETURNCODE = 0.
ELSE.
  RETURNCODE = 4.                              " no update of key figure!
ENDIF.
Note: For DataSource 2LIS_40_S279 the pseudo source code read as follows:
Update routine "Reversed receipts"
for 2LIS_40_S279:

IF ( PROCESSKEY = 1 ) AND ( DCINDIC = 'H' ).  "reversal
  RESULT = TRANS_AMOUNT.
  RETURNCODE = 0.
ELSE.
  RETURNCODE = 4.                             "no update of key figure
ENDIF.
To be able to understand the overall scheme more comprehensively, you should have a look at the update rules of the Standard Business Content for retail or consumption goods (for example InfoCubes 0RT_C01 or 0CP_IC_C1).
SANJEEV KUMAR HAMSALA
Hi
I'm using RFBIBL00 prg for doing FB01 postings.
I use structures BBKPF,BBSEG. Actually in the source file i have 50 line items.
All the 50 line items should get posted as one Document. Now what happens is for each source record..a separate document is created.
Pls help... how to achieve this?

I have done a similar thing using RFBIBL00.
I upload transactions via an excel spreadsheet and submit these to RFBIBL00.
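In short: RFBIBL00 opens a new FI document for every BBKPF record it reads from the transfer file, so writing one BBKPF per source record produces 50 separate documents. To post all 50 line items as one document, write the BBKPF once and follow it with all the BBSEG lines. A minimal sketch only (the item type, the session name, the file path, and the field values filled here are illustrative assumptions; in a real file every unused field must carry the no-data character from BGR00-NODATA):

```abap
* Hypothetical item table - structure and values are assumptions
TYPES: BEGIN OF ty_item,
         wrbtr TYPE c LENGTH 16,       " amount, character format
       END OF ty_item.
DATA: it_items TYPE STANDARD TABLE OF ty_item,
      gv_file  TYPE string VALUE '/tmp/rfbibl00_in.txt',
      gs_bgr00 TYPE bgr00,
      gs_bbkpf TYPE bbkpf,
      gs_bbseg TYPE bbseg.

OPEN DATASET gv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

* 1) One session record (record type 0) for the whole file
gs_bgr00-stype = '0'.
gs_bgr00-group = 'ZFB01LOAD'.
TRANSFER gs_bgr00 TO gv_file.

* 2) ONE document header (record type 1) - this is what opens a document
gs_bbkpf-stype = '1'.
gs_bbkpf-tcode = 'FB01'.
TRANSFER gs_bbkpf TO gv_file.

* 3) All 50 items (record type 2) follow the single header,
*    so they all land in the same document
LOOP AT it_items INTO DATA(ls_item).
  gs_bbseg-stype = '2'.
  gs_bbseg-tbnam = 'BBSEG'.
  gs_bbseg-wrbtr = ls_item-wrbtr.
  TRANSFER gs_bbseg TO gv_file.
ENDLOOP.

CLOSE DATASET gv_file.
```

Writing a second BBKPF record anywhere in the file is what triggers the next document, which is why one header per source record produced 50 documents.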
Have a look at the code below:
*======================================================================
* Report  : Purchase Card Interface
*======================================================================
* Title   : General Ledger Posting Interface
* Author  : Andy Scott
* Date    : 15/12/2006
* Purpose : To process an input file containing G/L information
*           from the Purchase Cards and to create a batch session.
*           This batch session will then replicate the G/L Account
*           Posting Transaction F-02 (FB01).
*           An Excel spreadsheet from RBS is loaded and validated
*           against. It is then reformatted and transferred into
*           the sap/usr/interfaces directory on to the application
*           server. A batch session is then created via RFBIBL00
*           and processed as normal via SM35.
*           There are various types of card transactions that can
*           be loaded. These are described below:
*             Level 1 transaction without split
*               - (one transaction line and one VAT line)
*             Level 1 transaction with split
*               - (multiple transaction lines and VAT lines)
*             Level 2 transaction without split
*               - (one transaction line and one VAT line)
*             Level 2 transaction with split
*               - (multiple transaction lines and VAT lines)
*             Level 3 transaction without split
*               - (multiple transaction lines and VAT lines)
*             Level 3 transaction with split
*               - (multiple transaction lines and VAT lines)
*======================================================================
* Amendment History:
* Date     User ID   Request Number  Brief Description
* 15/12/06 CSSCOTTA  DV1K914412      Initial development
*======================================================================
*EJECT
REPORT zfii001 MESSAGE-ID z_msszfii.
* Tables
TABLES: csks, " Cost Centre
skb1, " G/L Account Master
aufk. " Order Master (Internal Order)
* Internal table to hold original xls values i.e. Row, Column etc.
DATA: excel_itab LIKE alsmex_tabline OCCURS 0 WITH HEADER LINE.
*EJECT
* Internal table containing data loaded from the Excel spreadsheet
TYPES: BEGIN OF purchase_card_record,
bbkpf_ind(1) TYPE c, " Indicates BBKPF (header) record needed
exp_financial(1) TYPE c, " FIN. Exported Financial
trans_date(10) TYPE c, " FIN. Transaction Date
mast_txn_id(10) TYPE c, " FIN. Mastercard Txn ID
merchant_name(30) TYPE c, " MCH. Merchant Name
db_cr_code(1) TYPE c, " FIN. Debit Credit Code
cust_code(20) TYPE c, " FIN. Customer Code
trans_amount(15) TYPE c, " FIN. Primary Trans Amount
vat_amount(10) TYPE c, " FIN. VAT Amount
tax_amount(10) TYPE c, " PUR. Tax Amount
net_trans_amount(15) TYPE c, " FIN. Net Transaction Amount
ext_item_amount(15) TYPE c, " PUR. Extended Item Amount
cost_centre(10) TYPE c, " FIN. Cost Centre
gl_code(10) TYPE c, " FIN. GL Code
internal_order(12) TYPE c, " FIN. Internal Order
account_name(30) TYPE c, " Account Name
line_item_info(30) TYPE c, " PUR. Line Item Information
expense_descr(30) TYPE c, " FIN. Expense Description
vat_indicator(10) TYPE c, " FIN. VAT Eligibility Ind.
split_sequence(10) TYPE c, " FIN. Split Sequence
country(10) TYPE c, " MCH. Country
acc_address(30) TYPE c, " ACC. Address
company_code(4) TYPE c, " Company Code
split_ind(1) TYPE c, " Indicates Multiple Split Line Items
vat_type(20) TYPE c, " Type of VAT i.e. Normal or Level 3
posting_key(2) TYPE c, " Posting key i.e. 40, 50
holding_gl_account(1) TYPE c, " Post To Holding Account
error_message(50) TYPE c, " Error Message For Holding Account
row(4) TYPE c, " Excel Spreadsheet Row No
END OF purchase_card_record.
DATA:
purchase_card_wa TYPE purchase_card_record,
purchase_card_tab TYPE TABLE OF purchase_card_record,
last_purchase_card_wa TYPE purchase_card_record.
* Internal table containing possible errors within the Excel
* spreadsheet
TYPES: BEGIN OF error_type,
row_num(6) TYPE c,
message(80) TYPE c,
line(200) TYPE c,
END OF error_type.
DATA: error_wa TYPE error_type,
error_tab TYPE STANDARD TABLE OF error_type.
*EJECT
* Document record structures containing RFBIBL00 data.
* There will be one header BBKPF record with one or more
* associated BBSEG record(s) for each transaction, i.e.:
*
*   BGR00 ( File Session Header )
*     |
*     BBKPF
*       |
*       BBSEG ....
*       BBSEG ....
*       etc  ....
*     BBKPF
*       |
*       BBSEG ....
*       BBSEG ....
*       etc  ....
DATA:
bgr00_record LIKE bgr00, " Batch Input Structure for Session Data
bbkpf_record LIKE bbkpf, " Document Header (Batch Input Structure)
bbseg_record LIKE bbseg. " Document Segment (Batch Input Structure)
DATA: bbseg_tab TYPE STANDARD TABLE OF bbseg.
DATA: empty_bbseg_record LIKE bbseg,
empty_bgr00_record LIKE bgr00,
empty_bbkpf_record LIKE bbkpf.
*EJECT
* Work area
DATA: wa_num_rows_loaded TYPE i,
wa_mod TYPE n,
wa_num_of_invalid_rows TYPE i,
wa_num_of_transactions TYPE i,
wa_num_of_transaction_lines TYPE i,
wa_num_of_vat_lines TYPE i,
wa_error_count TYPE i,
wa_error_exists(1) TYPE c,
wa_company_code LIKE skb1-bukrs,
wa_gl_code(10) TYPE n,
wa_aufnr(12) TYPE n,
wa_counter TYPE i,
wa_output_file(75) TYPE c, " Output file name
wa_period TYPE i, " Payroll Period
wa_first_time(1) TYPE c,
wa_error(200) TYPE c,
wa_unix_command(100) TYPE c,
wa_todays_date(8) TYPE c,
wa_wrbtr(16) TYPE c,
wa_format(3) TYPE c.
* Batch totals
DATA:
wa_total_amount LIKE bbseg-wrbtr,
wa_total_vat LIKE bbseg-wrbtr.
* Internal table to hold UNIX command
DATA: BEGIN OF unix_itab OCCURS 10,
text(80),
END OF unix_itab.
* Define constants
CONSTANTS:
c_filename(75) VALUE 'Purchase_Cards',
c_data_directory(75) VALUE '/sapinterfaces/data/',
c_archive_directory(75) VALUE '/sapinterfaces/archive/',
c_holding_gl_account(4) VALUE '7888'.
* Excel spreadsheet filename / location
SELECTION-SCREEN BEGIN OF BLOCK b1 WITH FRAME TITLE text-001.
SELECTION-SCREEN COMMENT /30(70) TEXT-002.
SELECTION-SCREEN COMMENT /40(40) TEXT-003.
PARAMETERS: pa_file LIKE rlgrap-filename OBLIGATORY MEMORY ID m01.
SELECTION-SCREEN END OF BLOCK b1.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR pa_file.
CALL FUNCTION 'KD_GET_FILENAME_ON_F4'
EXPORTING
mask = ',.XLS,.XLS'
static = 'X'
CHANGING
file_name = pa_file.
*EJECT
START-OF-SELECTION.
* Upload the Excel spreadsheet into the internal table
PERFORM upload_xls_spreadsheet.
* Any records to process?
IF wa_num_rows_loaded > 0.
PERFORM process_file.
ELSE.
CONCATENATE 'No Data Found In The Excel Spreadsheet ' pa_file
INTO wa_error SEPARATED BY space.
CALL FUNCTION 'POPUP_TO_DISPLAY_TEXT'
EXPORTING
titel = 'File Error'
textline1 = wa_error
textline2 = 'Does File exist ? '
start_column = 25
start_row = 6.
ENDIF.
END-OF-SELECTION.
*EJECT
* Form UPLOAD_XLS_SPREADSHEET
* Subroutine to upload the Excel file
FORM upload_xls_spreadsheet .
* Call function to upload spreadsheet into an internal table
REFRESH : excel_itab,
purchase_card_tab.
CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
EXPORTING
filename = pa_file
i_begin_col = '1'
i_begin_row = '2' "Do not require headings
i_end_col = '21'
i_end_row = '10000'
TABLES
intern = excel_itab
EXCEPTIONS
inconsistent_parameters = 1
upload_ole = 2
OTHERS = 3.
IF sy-subrc <> 0.
CONCATENATE 'Problem Trying To Read File' pa_file
INTO wa_error SEPARATED BY space.
CALL FUNCTION 'POPUP_TO_DISPLAY_TEXT'
EXPORTING
titel = 'File Error'
textline1 = wa_error
textline2 = 'Does File exist ? '
start_column = 25
start_row = 6.
ELSE.
PERFORM convert_table.
ENDIF.
ENDFORM. " upload_xls_spreadsheet
*EJECT
* Form PROCESS_FILE
* This form will control the flow of the program
FORM process_file.
* Validate the input file
PERFORM validate_input_file.
* Insert document header records
PERFORM insert_bbkpf_indicators.
* Display any invalid rows from the imported file
IF wa_error_count > 0.
PERFORM display_invalid_records.
ENDIF.
* Create the text batch file which will be submitted to RFBIBL00
PERFORM create_batch_file.
* Submit file - call RFBIBL00 ( batch input program )
PERFORM submit_rfbibl00.
* Move the text batch file into the SAP archive directory
PERFORM archive_file.
* Display batch total amounts
PERFORM batch_totals.
ENDFORM. " process_file.
*EJECT
* Form VALIDATE_INPUT_FILE
* This subroutine will validate the input file against the business
* rules that have been created by the E-Trading and Accounts teams.
FORM validate_input_file.
CLEAR wa_counter.
* Only select transactions which have not previously been posted i.e. = 0
LOOP AT purchase_card_tab INTO purchase_card_wa WHERE exp_financial = '0'.
wa_counter = wa_counter + 1.
* Need to determine the company code
CASE: purchase_card_wa-acc_address.
WHEN 'SALFORD CITY COUNCIL'.
wa_company_code = '1000'.
WHEN 'SALFORD COMM LEISURE'.
wa_company_code = '8000'.
WHEN 'NEW PROSPECT HOUSING LTD'.
wa_company_code = 'NPHL'.
ENDCASE.
*EJECT
* Cost centre
IF purchase_card_wa-cost_centre IS NOT INITIAL.
* Valid cost centre?
CLEAR csks.
SELECT SINGLE *
FROM csks
WHERE kokrs = wa_company_code
AND kostl = purchase_card_wa-cost_centre
AND datbi = '99991231'.
IF sy-subrc <> 0.
PERFORM create_error_message USING 'Invalid Cost Centre'
purchase_card_wa-cost_centre
purchase_card_wa-row
'Y'.
ELSE.
IF csks-pkzkp = 'X'.
PERFORM create_error_message USING 'CC Blocked For Posting'
purchase_card_wa-cost_centre
purchase_card_wa-row
'Y'.
ENDIF.
ENDIF.
* Verify correct cost centre for company
CASE: wa_company_code.
WHEN '1000'. " Salford City Council
IF purchase_card_wa-cost_centre(1) CA 'JT'.
PERFORM create_error_message USING 'Must Be SCC Cost Centre'
purchase_card_wa-cost_centre
purchase_card_wa-row
'Y'.
ENDIF.
WHEN '8000'. " Salford Community Leisure
IF purchase_card_wa-cost_centre(1) <> 'T'.
PERFORM create_error_message USING 'Must Be SCL Cost Centre'
purchase_card_wa-cost_centre
purchase_card_wa-row
'Y'.
ENDIF.
WHEN 'NPHL'.
IF purchase_card_wa-cost_centre(1) <> 'J'. " New Prospect Housing
PERFORM create_error_message USING 'Must Be NPHL Cost Centre'
purchase_card_wa-cost_centre
purchase_card_wa-row
'Y'.
ENDIF.
ENDCASE.
ENDIF.
*EJECT
* G/L account code
* Insert leading zeroes
wa_gl_code = purchase_card_wa-gl_code.
IF purchase_card_wa-gl_code IS INITIAL.
PERFORM create_error_message USING 'G/L Account Code Required'
purchase_card_wa-cost_centre
purchase_card_wa-row
'Y'.
ELSE.
* Valid G/L account?
CLEAR skb1.
SELECT SINGLE *
FROM skb1
WHERE bukrs = wa_company_code
AND saknr = wa_gl_code.
IF sy-subrc <> 0.
PERFORM create_error_message USING 'Invalid G/L Account Code'
purchase_card_wa-gl_code
purchase_card_wa-row
'Y'.
ELSE.
IF skb1-xspeb = 'X'.
PERFORM create_error_message USING 'G/L Blocked For Posting'
purchase_card_wa-gl_code
purchase_card_wa-row
'Y'.
ENDIF.
ENDIF.
ENDIF.
*EJECT
* Internal order no.
* There can be two types of internal order numbers:
*   1) RIO - Real Internal Order in format A12345
*   2) SIO - Statistical Internal Order in format 12345
IF purchase_card_wa-internal_order IS NOT INITIAL.
* Insert leading zeroes
wa_aufnr = purchase_card_wa-internal_order.
* Decide which format - check the first character
IF purchase_card_wa-internal_order(1) CA 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'.
wa_format = 'RIO'.
ELSE.
wa_format = 'SIO'.
ENDIF.
* Valid internal order - RIO only?
IF wa_format = 'RIO'.
SELECT SINGLE *
FROM aufk
WHERE aufnr = wa_aufnr.
IF sy-subrc <> 0.
PERFORM create_error_message USING 'Invalid Internal Order'
purchase_card_wa-internal_order
purchase_card_wa-row
'Y'.
ENDIF.
IF purchase_card_wa-cost_centre IS NOT INITIAL
OR purchase_card_wa-gl_code(1) <> '7'.
PERFORM create_error_message USING 'Incorrect GL/Int Ord Combo'
purchase_card_wa-internal_order
purchase_card_wa-row
'Y'.
ENDIF.
ELSE.
IF purchase_card_wa-cost_centre IS INITIAL.
PERFORM create_error_message USING 'Cost Centre Required'
purchase_card_wa-internal_order
purchase_card_wa-row
'Y'.
ENDIF.
IF purchase_card_wa-gl_code(1) NA '45'.
PERFORM create_error_message USING 'Incorrect GL Code for CC'
purchase_card_wa-internal_order
purchase_card_wa-row
'Y'.
ENDIF.
ENDIF.
ELSE.
IF purchase_card_wa-cost_centre IS INITIAL
OR purchase_card_wa-gl_code(1) NA '45'.
PERFORM create_error_message USING 'Incorrect GL Code for CC'
purchase_card_wa-gl_code
purchase_card_wa-row
'Y'.
ENDIF.
ENDIF.
*EJECT
* VAT amount
IF purchase_card_wa-vat_amount > 0
AND purchase_card_wa-country <> 'GBR'.
PERFORM create_error_message USING 'VAT Must Be 0 For Non UK Trans'
purchase_card_wa-gl_code
purchase_card_wa-row
'Y'.
ENDIF.
* Keep a count of how many rows in the spreadsheet are invalid
IF wa_error_exists = 'Y'.
wa_num_of_invalid_rows = wa_num_of_invalid_rows + 1.
ENDIF.
CLEAR: wa_error_exists,
purchase_card_wa.
ENDLOOP.
ENDFORM. " Validate_Input_File
*EJECT
* Form INSERT_BBKPF_INDICATORS
FORM insert_bbkpf_indicators.
DATA: wa_temp_date(12) TYPE c.
* This routine will make sure that the BBKPF indicator in the table is
* set to 'Y' when a new unique FIN. Mastercard Txn ID is reached. This
* is so that the correct document header records (BBKPF) can be created
* in the text file.
DATA: previous_mast_txn_id(10) TYPE c,
previous_split_sequence(10) TYPE c.
* First we need to delete transactions that may have been previously
* posted.
* N.B. This is done after the validation process in order to keep
* the row number in sync with the Excel spreadsheet.
DELETE purchase_card_tab WHERE exp_financial = '1'.
* Sort data into FIN. Mastercard transaction no. and split sequence order
SORT purchase_card_tab BY mast_txn_id split_sequence.
* Only select transactions which have not previously been posted
LOOP AT purchase_card_tab INTO purchase_card_wa WHERE exp_financial = '0'.
* Keep a note of where we will need to write a new document header
* record, i.e. BBKPF. A new document header is needed for each unique
* Mastercard transaction ID.
IF purchase_card_wa-mast_txn_id <> previous_mast_txn_id.
purchase_card_wa-bbkpf_ind = 'Y'.
previous_mast_txn_id = purchase_card_wa-mast_txn_id.
ENDIF.
* We also need to keep a note of the split sequence for multiple line
* items. Basically we will not post the lines where the split sequence
* held in the Split Sequence field (e.g. Split 1) is duplicated.
IF purchase_card_wa-split_sequence = previous_split_sequence
AND purchase_card_wa-mast_txn_id = previous_mast_txn_id
AND purchase_card_wa-split_sequence <> '0'.
purchase_card_wa-split_ind = 'Y'.
ENDIF.
previous_split_sequence = purchase_card_wa-split_sequence.
* Need to determine the company code
CASE: purchase_card_wa-acc_address.
WHEN 'SALFORD CITY COUNCIL'.
wa_company_code = '1000'.
WHEN 'SALFORD COMM LEISURE LTD'.
wa_company_code = '8000'.
WHEN 'NEW PROSPECT HOUSING LTD'.
wa_company_code = 'NPHL'.
ENDCASE.
purchase_card_wa-company_code = wa_company_code.
* Convert transaction date from DD/MM/YY to DDMMYYYY
* or DD/MM/YYYY to DDMMYYYY
IF STRLEN( purchase_card_wa-trans_date ) = 8.
CONCATENATE purchase_card_wa-trans_date+0(2)
purchase_card_wa-trans_date+3(2)
sy-datum+0(2)
purchase_card_wa-trans_date+6(2)
INTO wa_temp_date.
ELSE.
CONCATENATE purchase_card_wa-trans_date+0(2)
purchase_card_wa-trans_date+3(2)
purchase_card_wa-trans_date+6(4)
INTO wa_temp_date.
ENDIF.
purchase_card_wa-trans_date = wa_temp_date.
* Each transaction line will normally be subject to VAT unless the
* transaction line is for a Level 3 with no splits. If this is the case
* only ONE VAT line per transaction will be needed, whereas the other
* transactions will have a VAT line posted for each individual line item.
IF purchase_card_wa-country = 'GBR'
AND ( purchase_card_wa-vat_amount > '0.00'
OR purchase_card_wa-tax_amount > '0.00' ).
purchase_card_wa-vat_type = 'Normal'.
ENDIF.
IF ( purchase_card_wa-vat_indicator = 'GBR001'
AND purchase_card_wa-split_sequence = '0' ).
purchase_card_wa-vat_type = 'Level 3 No Splits'.
ENDIF.
* Posting key - 40 ( debit = '-' ), 50 ( credit = '+' )
IF purchase_card_wa-db_cr_code = '-'.
purchase_card_wa-posting_key = '40'.
ELSE.
purchase_card_wa-posting_key = '50'.
ENDIF.
* Update internal table
MODIFY purchase_card_tab FROM purchase_card_wa.
ENDLOOP.
* Now need to remove all the duplicated split sequences,
* i.e. where split_ind = 'Y'.
DELETE purchase_card_tab WHERE split_ind = 'Y'.
ENDFORM. " insert_bbkpf_indicators
*EJECT
* Form DISPLAY_INVALID_RECORDS
FORM display_invalid_records.
WRITE: ' Total No Of Rows Uploaded From Spreadsheet:',
wa_num_rows_loaded,' From File : ' ,pa_file.
WRITE sy-uline.
SKIP 1.
FORMAT INTENSIFIED.
SKIP 1.
WRITE: /40 'I N V A L I D D A T A '.
SKIP 1.
WRITE: /3 'Row',
15 'Error Message',
100 'Original Data'.
WRITE: /2 '------',
15 '----------------------------------------',
100 '----------------------------------------'.
FORMAT INTENSIFIED OFF.
* Display contents of all invalid records from the Excel spreadsheet
LOOP AT error_tab INTO error_wa.
COMPUTE wa_mod = sy-tabix MOD 2.
* Use alternative colours
IF wa_mod = 0.
FORMAT COLOR COL_KEY.
ELSE.
FORMAT COLOR COL_HEADING.
ENDIF.
WRITE: /3 error_wa-row_num,
15 error_wa-message,
100 error_wa-line.
ENDLOOP.
SKIP 1.
FORMAT COLOR COL_NEGATIVE.
WRITE: ' No. Of Invalid Rows ', wa_num_of_invalid_rows,
' Total No. Errors In Spreadsheet ' ,wa_error_count.
ULINE.
ENDFORM. "display_invalid_records
*EJECT
* Form CREATE_BATCH_FILE
* This form will create the text file containing all of the transaction
* line data, i.e. BBKPF, BBSEG etc. This text file is then passed to the
* SAP standard program RFBIBL00 which will create a batch session.
FORM create_batch_file .
* Calculate payroll period
IF sy-datum+4(2) >= 4.
wa_period = sy-datum+4(2) - 3.
ELSE.
wa_period = sy-datum+4(2) + 9.
ENDIF.
* Today's date
CONCATENATE sy-datum+6(2)
sy-datum+4(2)
sy-datum+0(4)
INTO wa_todays_date.
* Append date to file name, i.e. filenameDDMMYYYY
CONCATENATE c_data_directory
c_filename
wa_todays_date
INTO wa_output_file.
* Initialise accounting document work areas
PERFORM initialise_structure USING 'BGR00'
'EMPTY_BGR00_RECORD'.
PERFORM initialise_structure USING 'BBKPF'
'EMPTY_BBKPF_RECORD'.
PERFORM initialise_structure USING 'BBSEG'
'EMPTY_BBSEG_RECORD'.
Open t -
Hi gurus
i need the process keys related to issues and receipts for the below mentioned stocks
I also need to stock types and the stock categories for the same.
I can get the process keys data in SBIW, but i am unable to find the exact keys related to the below stocks.
The below fields are from MARD table
LABST - Unrestricted
UMLME - Stock in Transfer
EINME - Restricted Use
SPEME - Blocked
RETME - Return
Please help me...
Thanks & Regards
Seshu

Hi Bhima,
look at OSS Note 353042 'How to Activate transaction key PROCESSKEY'...
However, Logistic Cockpit extractors represent, in principle, structures that collect document-specific information from the logistics application and transfer it to the SAP BW.
Many key figures are yielded from the application. These must be transferred to the SAP BW with the help of the extractors. SAP ensures that document information is interpreted in advance with the help of a 'process key' so that key figures, which describe and measure the particular processes, can be more easily derived for customers.
In the 2LIS_02_SCL, for example, this means that the rows to be transferred only contain transfer quantities and value fields. The actual key figure is hidden behind the process key. There are generic key figures (transfer quantities and transfer values) that have a significance when combined with the process keys.
Process keys must be resolved into the corresponding key figures during the update, or at the latest at query creation; you must therefore have precise knowledge of the process keys when creating customer-specific InfoCube update rules or queries.
Example: update routine for the key figure 'Order Volume Quantity (External Vendor)' (pseudo-coding):

IF COMM_STRUCTURE-PROCESSKEY = '001'.
  RESULT = COMM_STRUCTURE-TRANS_AMOUNT.
ELSE.
  CLEAR RESULT.               "no update of key figure
ENDIF.
Look at MCB0 transaction !!!
Hope it helps!
(Bhima, don't forget to reward the answers !!!)
Bye,
Roberto