Generic extractor FM: taking 5-6 hours for 3 months of data to BW - urgent
Dear experts,
I have designed an FM for generic extraction which is taking 5-6 hours for 3 months of data, i.e. 24 lakh (2.4 million) records, to BW up to PSA.
I have given the code below; please suggest any modifications to improve the performance.
FUNCTION zhr_att_analysis.
*"  Local Interface:
*" IMPORTING
*" VALUE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
*" VALUE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
*" VALUE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
*" VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
*" VALUE(I_REMOTE_CALL) TYPE SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
*" TABLES
*" I_T_SELECT TYPE SBIWA_T_SELECT OPTIONAL
*" I_T_FIELDS TYPE SBIWA_T_FIELDS OPTIONAL
*" E_T_DATA STRUCTURE ZHR_ATT_MAIN OPTIONAL
*" EXCEPTIONS
*" NO_MORE_DATA
*" ERROR_PASSED_TO_MESS_HANDLER
* Auxiliary selection criteria structure
  DATA: l_s_select TYPE sbiwa_s_select.

* Maximum number of lines for DB table
  STATICS: l_maxsize TYPE sbiwa_s_interface-maxsize.

* Select ranges
  RANGES: l_r_pernr FOR pa9004-pernr,
          l_r_bukrs FOR pa0001-bukrs,
          l_r_persg FOR pa0001-persg,
          l_r_begda FOR pa9004-begda,
          l_r_persk FOR pa0001-persk.

* Buffered interface parameters
  STATICS: s_s_if TYPE srsc_s_if_simple,
* counter
           s_counter_datapakid LIKE sy-tabix,
* cursor
           s_cursor TYPE cursor.
*"Declaration of store data
TYPES : BEGIN OF ty_9004,
pernr TYPE persno,
endda TYPE endda,
begda TYPE begda,
zrs TYPE zrs,
zstorecode TYPE zstorecode,
END OF ty_9004.
*"Declaration of employee data
TYPES : BEGIN OF ty_0001,
pernr TYPE pernr_d,
endda TYPE endda,
begda TYPE begda,
aedtm TYPE aedat,
bukrs TYPE bukrs,
persg TYPE persg,
persk TYPE persk,
END OF ty_0001.
*"Declaration of expected mandays
TYPES : BEGIN OF ty_0000,
pernr TYPE persno,
endda TYPE endda,
begda TYPE begda,
aedtm TYPE aedat,
stat2 TYPE stat2,
massn TYPE massn,
END OF ty_0000.
*"Declaration of man days swiped
TYPES : BEGIN OF ty_teven,
pernr TYPE pernr_d,
ldate TYPE ldate,
satza TYPE retyp,
aedtm TYPE aedat,
counter_swiped TYPE i,
END OF ty_teven.
*"Declaration of Man days regularized
TYPES : BEGIN OF ty_2002,
pernr TYPE pernr_d,
subty TYPE subty,
endda TYPE endda,
begda TYPE begda,
aedtm TYPE aedat,
END OF ty_2002.
*"Declaration of Man days lostdue to leave
TYPES : BEGIN OF ty_2001,
pernr TYPE pernr_d,
subty TYPE subty,
endda TYPE endda,
begda TYPE begda,
aedtm TYPE aedat,
END OF ty_2001.
*****Declaration of weekly off
TYPES : BEGIN OF ty_2003,
pernr TYPE pernr_d,
subty TYPE subty,
endda TYPE endda,
begda TYPE begda,
aedtm TYPE aedat,
tprog TYPE tprog,
END OF ty_2003.
* Internal tables and work areas
DATA :
it_0001 TYPE TABLE OF ty_0001,
wa_0001 TYPE ty_0001,
it_0000 TYPE TABLE OF ty_0000,
wa_0000 TYPE ty_0000,
it_teven TYPE TABLE OF ty_teven,
wa_teven TYPE ty_teven,
it_2002 TYPE TABLE OF ty_2002 ,
wa_2002 TYPE ty_2002,
it_2001 TYPE TABLE OF ty_2001,
wa_2001 TYPE ty_2001,
it_2003 TYPE TABLE OF ty_2003,
wa_2003 TYPE ty_2003,
wa_target TYPE zhr_att_main.
DATA : date TYPE dats,
doj TYPE dats,
dol TYPE dats,
date1 TYPE dats,
date2 TYPE dats,
counter(9) TYPE n.
* Initialization mode (first call by SAPI) or data transfer mode
* (following calls)?
  IF i_initflag = sbiwa_c_flag_on.

* Initialization: check input parameters
*                 buffer input parameters
*                 prepare data selection

* Check DataSource validity
CASE i_dsource.
WHEN 'ZHR_ATT_ANALYSIS'.
WHEN OTHERS.
IF 1 = 2. MESSAGE e009(r3). ENDIF.
log_write 'E' "message type
'R3' "message class
'009' "message number
i_dsource "message variable 1
' '. "message variable 2
RAISE error_passed_to_mess_handler.
ENDCASE.
APPEND LINES OF i_t_select TO s_s_if-t_select.
* Fill parameter buffer for data extraction calls
s_s_if-requnr = i_requnr.
s_s_if-dsource = i_dsource.
s_s_if-maxsize = i_maxsize.
* Fill field list table for an optimized select statement
* (in case that there is no 1:1 relation between InfoSource fields
* and database table fields this may be far from being trivial)
APPEND LINES OF i_t_fields TO s_s_if-t_fields.
ELSE. "Initialization mode or data extraction ?
* Data transfer: First call      OPEN CURSOR + FETCH
*                Following calls FETCH only

* First data package -> OPEN CURSOR
IF s_counter_datapakid = 0.
LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'PERNR'.
MOVE-CORRESPONDING l_s_select TO l_r_pernr.
APPEND l_r_pernr.
ENDLOOP.
LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'BUKRS'.
MOVE-CORRESPONDING l_s_select TO l_r_bukrs.
APPEND l_r_bukrs.
ENDLOOP.
LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'PERSG'.
MOVE-CORRESPONDING l_s_select TO l_r_persg.
APPEND l_r_persg.
ENDLOOP.
LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'BEGDA'.
MOVE-CORRESPONDING l_s_select TO l_r_begda.
APPEND l_r_begda.
ENDLOOP.
LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'PERSK'.
MOVE-CORRESPONDING l_s_select TO l_r_persk.
APPEND l_r_persk.
ENDLOOP.
      OPEN CURSOR WITH HOLD s_cursor FOR
* Select only employees that have a non-empty store code
        SELECT a~pernr b~pernr b~endda b~begda b~bukrs b~persg b~persk
          FROM pa9004 AS a INNER JOIN pa0001 AS b
            ON a~pernr = b~pernr
          WHERE a~pernr IN l_r_pernr
            AND a~zstorecode <> ''
            AND b~bukrs IN l_r_bukrs
            AND b~persg IN l_r_persg
            AND b~persk IN l_r_persk.
ENDIF.
* Fetch records into interface table,
* named E_T_'Name of extract structure'.
FETCH NEXT CURSOR s_cursor
APPENDING CORRESPONDING FIELDS
OF TABLE it_0001
PACKAGE SIZE s_s_if-maxsize.
IF sy-subrc <> 0.
CLOSE CURSOR s_cursor.
RAISE no_more_data.
ELSE.
IF l_r_begda-high = '00000000' AND l_r_begda-low = '00000000'.
date1 = sy-datum - 1.
date2 = sy-datum - 1.
ELSE.
date1 = l_r_begda-low .
date2 = l_r_begda-high.
ENDIF.
SORT it_0001 BY pernr persg begda endda bukrs.
DELETE it_0001 WHERE persg NE 'T' AND
persg NE 'K' AND
persg NE 'P' AND
persg NE 'W'.
DELETE ADJACENT DUPLICATES FROM it_0001 COMPARING pernr begda endda bukrs.
* Populate action/status records for the selected employees
IF NOT it_0001[] IS INITIAL.
SELECT pernr endda begda aedtm massn FROM pa0000
INTO CORRESPONDING FIELDS OF TABLE it_0000
FOR ALL ENTRIES IN it_0001
WHERE pernr = it_0001-pernr
AND ( massn = 'A1' OR massn = '00' OR massn = 'A6' OR massn = 'A3' ).
SORT it_0000 BY pernr begda DESCENDING.
ENDIF.
* Populate SWIPED MAN DAYS data
IF NOT it_0001[] IS INITIAL.
SELECT pernr ldate satza aedtm FROM teven
INTO CORRESPONDING FIELDS OF TABLE it_teven
FOR ALL ENTRIES IN it_0001
WHERE pernr = it_0001-pernr AND
satza = 'P01'
AND ldate IN l_r_begda.
SORT it_teven BY pernr ldate.
ENDIF.
**populate REGULARIZATION DAYS data
IF NOT it_0001[] IS INITIAL.
SELECT pernr subty endda begda aedtm FROM pa2002
INTO CORRESPONDING FIELDS OF TABLE it_2002
FOR ALL ENTRIES IN it_0001
WHERE pernr = it_0001-pernr
AND begda >= date1
AND endda <= date2 .
SORT it_2002 BY pernr begda endda.
ENDIF.
**populate LEAVE DAYS data
IF NOT it_0001[] IS INITIAL.
SELECT pernr subty endda begda aedtm FROM pa2001
INTO CORRESPONDING FIELDS OF TABLE it_2001
FOR ALL ENTRIES IN it_0001
WHERE pernr = it_0001-pernr
AND begda >= date1
AND endda <= date2 .
SORT it_2001 BY pernr begda endda .
ENDIF.
**populate WEEKLY OFF data
IF NOT it_0001[] IS INITIAL.
SELECT pernr subty endda begda aedtm tprog FROM pa2003
INTO CORRESPONDING FIELDS OF TABLE it_2003
FOR ALL ENTRIES IN it_0001
WHERE pernr = it_0001-pernr AND
tprog = 'OFF'
AND begda >= date1
AND endda <= date2 .
SORT it_2003 BY pernr begda endda.
ENDIF.
date = sy-datum.
********added changes on 06.04.2008**************action type & date dependent extraction****
* Loop over it_0001 table
      LOOP AT it_0001 INTO wa_0001.
        CLEAR counter.
* Determine date of joining / leaving for expected mandays
        LOOP AT it_0000 INTO wa_0000 WHERE pernr = wa_0001-pernr.
          IF wa_0000-massn = 'A1' OR wa_0000-massn = '00' OR wa_0000-massn = 'A3'.
            doj = wa_0000-begda.
            IF wa_0000-endda = '99991231'.
              date2 = sy-datum.
            ELSE.
              dol = date2.
            ENDIF.
          ELSEIF wa_0000-massn = 'A6'.
            dol = wa_0000-begda.
          ENDIF.
        ENDLOOP.
IF date1 <= wa_0001-begda AND date2 <= wa_0001-endda AND date2 >= wa_0001-begda AND date1 <= wa_0001-endda.
counter = date2 - wa_0001-begda .
counter = counter + 1.
date = wa_0001-begda - 1.
ELSEIF date1 >= wa_0001-begda AND date2 >= wa_0001-endda AND date2 >= wa_0001-begda AND date1 <= wa_0001-endda.
counter = wa_0001-endda - date1.
counter = counter + 1.
date = date1 - 1.
ELSEIF date1 >= wa_0001-begda AND date2 <= wa_0001-endda AND date2 >= wa_0001-begda AND date1 <= wa_0001-endda.
counter = date2 - date1.
counter = counter + 1.
date = date1 - 1.
ELSEIF date1 <= wa_0001-begda AND date2 >= wa_0001-endda AND date2 >= wa_0001-begda AND date1 <= wa_0001-endda.
counter = wa_0001-endda - wa_0001-begda.
counter = counter + 1.
date = wa_0001-begda - 1.
ELSE.
CONTINUE.
ENDIF.
********completed changes on 06.04.2008**************action type & date dependent extraction**
* Split records from the date of joining up to the current date
DO counter TIMES.
CLEAR : wa_teven , wa_target.
date = date + 1.
wa_target-date1 = date.
wa_target-pernr = wa_0001-pernr.
wa_target-bukrs = wa_0001-bukrs.
wa_target-persg = wa_0001-persg.
wa_target-persk = wa_0001-persk.
* Expected mandays count
IF wa_target-date1 >= doj AND wa_target-date1 <= dol.
wa_target-expectedmandays = 1.
wa_target-aedtm = wa_0000-aedtm.
* Swiped mandays
READ TABLE it_teven INTO wa_teven WITH KEY pernr = wa_target-pernr
ldate = wa_target-date1 BINARY SEARCH.
IF sy-subrc = 0.
wa_target-swiped_days = 1.
wa_target-aedtm = wa_teven-aedtm.
ENDIF.
* Regularized days
LOOP AT it_2002 INTO wa_2002 WHERE pernr = wa_target-pernr
AND ( endda GE wa_target-date1 AND begda LE wa_target-date1 ).
wa_target-reg_days = 1.
wa_target-subty2 = wa_2002-subty.
wa_target-aedtm = wa_2002-aedtm.
ENDLOOP.
* Leave days
LOOP AT it_2001 INTO wa_2001 WHERE pernr = wa_target-pernr
AND ( endda GE wa_target-date1 AND begda LE wa_target-date1 ).
wa_target-leave_days = 1.
wa_target-subty1 = wa_2001-subty.
wa_target-aedtm = wa_2001-aedtm.
ENDLOOP.
* Weekly off days
LOOP AT it_2003 INTO wa_2003 WHERE pernr = wa_target-pernr
AND ( endda GE wa_target-date1 AND begda LE wa_target-date1 ).
wa_target-off_days = 1.
wa_target-aedtm = wa_2003-aedtm.
ENDLOOP.
* Append work area to e_t_data
APPEND wa_target TO e_t_data.
ENDIF.
ENDDO.
ENDLOOP.
* Clear internal tables
CLEAR : it_0000 , it_0001 , it_2001 , it_2002 , it_2003 , it_teven.
ENDIF.
s_counter_datapakid = s_counter_datapakid + 1.
ENDIF. "Initialization mode or data extraction ?
ENDFUNCTION.
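One likely hotspot in the code above is the set of inner `LOOP AT ... WHERE` statements executed once per employee per day: each one scans the whole internal table. Since the tables are already sorted (e.g. `SORT it_2002 BY pernr begda endda`), a binary search plus a bounded loop avoids the repeated full scans. This is only a sketch against the names used in the post, shown for it_2002; the same pattern would apply to it_2001 and it_2003:

```abap
* Sketch only: position on the first row for the employee via binary
* search, then loop only while rows can still match, instead of
* scanning all of it_2002 for every single day.
READ TABLE it_2002 INTO wa_2002
     WITH KEY pernr = wa_target-pernr BINARY SEARCH.
IF sy-subrc = 0.
  LOOP AT it_2002 INTO wa_2002 FROM sy-tabix.
    IF wa_2002-pernr <> wa_target-pernr
       OR wa_2002-begda > wa_target-date1.
      EXIT.   " sorted by pernr begda: no later row can match
    ENDIF.
    IF wa_2002-endda >= wa_target-date1.
      wa_target-reg_days = 1.
      wa_target-subty2   = wa_2002-subty.
      wa_target-aedtm    = wa_2002-aedtm.
    ENDIF.
  ENDLOOP.
ENDIF.
```

Alternatively, declaring it_teven, it_2001, it_2002 and it_2003 as SORTED tables keyed on pernr achieves the same effect without explicit binary searches.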
Similar Messages
-
Generic Extractor on DB Table without Date, Time stamp
Hi all,
We have a requirement to create a generic extractor on a table which doesn't have a date or time stamp in the data fields. The only option available is to extract by document number.
And we can't set the extraction mode to "Read from view" because the table contains a currency field which refers to an external table for the currency key.
The moment we change the extraction mode from "Extraction from view" to "Extraction from FM", the "Numeric pointer" option gets hidden.
What shall we do in this scenario? Please advise.
Thanks.
Regards
Nimesh
Hello Tapan, Prakash,
Prakash: The currency key is needed in BW.
Tapan: I was just trying different options, i.e. extraction from view and from FM. The problem with the generic extractor is that I don't have any date fields getting updated in the table. The only field that can be used for delta is the document number, and the generic extractor only allows the date or time stamp option; it does not allow a numeric pointer if one reads from an FM.
Regards
Nimesh -
Estimated time for a query execution -- URGENT PLEASE
Hi,
I am interested in knowing the time of execution of a query before actually running it. Is there any stored procedures/methods that gives us this feature.
Thanks.
Without actually running the query, there's no way to find out how long it will take to run. I'm not even sure that there's a theoretical way to do this.
You can use the v$session_longops view to see roughly how far along a long-running query is while it's running.
Justin -
Discoverer report is taking a very long time
Hi All,
I need help on below discoverer issue.
discoverer report is taking a very long time for rows to be retrieved on export when it is run for India and it is required for month end. For some reason only 250 rows are retrieved at a time and retrieval is slow so it is taking 10 minutes to bring back 10,000 rows.
Regards
Kumar
Please post the details of the application release, database version and OS along with the Discoverer version.
I need help on below discoverer issue.
discoverer report is taking a very long time for rows to be retrieved on export when it is run for India and it is required for month end. For some reason only 250 rows are retrieved at a time and retrieval is slow so it is taking 10 minutes to bring back 10,000 rows.
Please see these links.
https://forums.oracle.com/forums/search.jspa?threadID=&q=Discoverer+AND+Long+AND+Time&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
https://forums.oracle.com/forums/search.jspa?threadID=&q=Discoverer+AND+Performance&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
https://forums.oracle.com/forums/search.jspa?threadID=&q=Discoverer+AND+Slow&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
Thanks,
Hussein -
Initial download taking time for CTParts in syclo inventory manager 3.2
Hi All,
While doing the initial download in Syclo Inventory Manager 3.2, we have observed that it is taking a lot of time to fetch the data from the complex table CTParts.
In the Agentry diagram the CTParts complex table shows nine fields; a few of these fields, like UOM, BatchIndicator etc., do not have any dependency. So can I delete those fields?
If yes, what will be the impact on the application after deleting those fields?
Thanks for your help
-Garima
Tags edited by: Michael Appleby
Garima,
You need to analyze a couple of things before making any program changes:
a) Check whether you have set a filter for the CTParts MDO object in SAP. If the MDO filter for plant points to user parameter 'WRK', look at the value of WRK in SU3. Make sure that you have a plant value maintained for the WRK parameter.
b) If a WRK value is indeed maintained, then go to MARC and check the number of materials that exist for the WRK plant. If there are too many, do you really need all those materials downloaded to the mobile device? Check whether you can maintain other filter values to restrict the material records downloaded, such as material type, material group etc.
c) Find the bottleneck: a) whether it takes more time to execute the query in SAP; b) whether it takes time to transfer data from SAP to the Java layer. If so, try increasing the Java heap size.
d) Also look at the MDO field selections for CTParts in SAP. Only select the fields that you need.
e) Did you create additional indexes for the CTParts complex table?
f) Finally, if nothing works, look at the option of replacing the output structure in the BAPI that returns CTParts with a Z structure containing only the required nine fields, which also requires Z Java code changes for the CTParts complex table.
Thanks
Manju. -
hi,
I have a small schema in a database housing lots of users.
The expdp is taking a lot of time for this schema even though the old exp takes 1/10th the time.
When I switched on the trace I see it's stuck at the below in the <SID>dw0115741.trc trace file:
KUPW: 03:40:41.551: 1: DBMS_LOB.TRIM returned
KUPW: 03:40:41.551: 1: DBMS_METADATA.FETCH_XML_CLOB called
It's stuck at:
Starting "USER"."USER_JOB": USER/******** dumpfile=USER.dp logfile=USER.log job_name=USER_job parallel=4 exclude=grant,index,trigger,constraint,statistics,index_statistics trace=1FF0300 directory=EXPDP_BKP
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
I have replaced the actual username with USER.
sga_target 2147483648
sga_max_size 2634022912
streams_pool_size 0
Does anyone have any suggestions?
I tried Metalink as well but didn't find anything useful.
Thanks
Shipra
Pl post details of OS and database versions.
Pl see if MOS Doc 453895.1 (Checklist for Slow Performance of Export Data Pump (expdp) and Import DataPump (impdp)) - "Defects 10/11" - can help.
HTH
Srini -
Generic Extraction : Taking a lot of time
HI Experts
I have created ZKONV, a generic extractor which is a copy of the KONV table. It has around 16,00,000 (1.6 million) records. When I'm pulling the data into the ODS, it is taking around 5+ hours to load. Is there anything wrong with my DataSource; why is it taking so much time?
Kindly provide some inputs
Thanks
NLN
Hi Lakshminarayana,
You have got to check a couple of things.
First, go to the source system and see how long the extract program is running. The long runtime may be due to poor source system performance or heavy processing on the BW side. If the job in the source system is running for a long time, then check the source system resources. Build a proper index on the mentioned table and see whether it improves the performance. You can ask the Basis people for an SQL trace, and they will let you know what kind of indexes you can build on the tables.
If the processing is taking more time, then you have to improve the start/update routine on the BW side. Please let us know where exactly you have the problem; then we can think about resolving it.
Sriram -
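To illustrate Sriram's point about the extract program: for a large table like KONV, a function-module extractor should read package-wise through a database cursor instead of selecting all records at once. A hedged sketch following the standard SAPI extractor template; `l_r_knumv` is a hypothetical selection range filled from I_T_SELECT, and the other variables are as declared in the template:

```abap
* Sketch only: package-wise extraction keeps memory usage flat and
* lets BW process data packets as they arrive.
* l_r_knumv is a hypothetical range filled from I_T_SELECT.
IF s_counter_datapakid = 0.
  OPEN CURSOR WITH HOLD s_cursor FOR
    SELECT * FROM konv
      WHERE knumv IN l_r_knumv.
ENDIF.

FETCH NEXT CURSOR s_cursor
      APPENDING CORRESPONDING FIELDS
      OF TABLE e_t_data
      PACKAGE SIZE s_s_if-maxsize.
IF sy-subrc <> 0.
  CLOSE CURSOR s_cursor.
  RAISE no_more_data.
ENDIF.
s_counter_datapakid = s_counter_datapakid + 1.
```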
Query taking a long time (more than 24 hours) to EXTRACT the data
Hi,
The query is taking more than 24 hours to extract the data. Please find the query and explain plan details below; even though indexes are available on the tables, it goes for a FULL TABLE SCAN. Please suggest.
SQL> explain plan for select a.account_id,round(a.account_balance,2) account_balance,
2 nvl(ah.invoice_id,ah.adjustment_id) transaction_id,
to_char(ah.effective_start_date,'DD-MON-YYYY') transaction_date,
to_char(nvl(i.payment_due_date,
to_date('30-12-9999','dd-mm-yyyy')),'DD-MON-YYYY')
due_date, ah.current_balance-ah.previous_balance amount,
decode(ah.invoice_id,null,'A','I') transaction_type
3 4 5 6 7 8 from account a,account_history ah,invoice i
where a.account_id=ah.account_id
and a.account_type_id=1000002
and round(a.account_balance,2) > 0
and (ah.invoice_id is not null or ah.adjustment_id is not null)
and ah.CURRENT_BALANCE > ah.previous_balance
and ah.invoice_id=i.invoice_id(+)
AND a.account_balance > 0
order by a.account_id,ah.effective_start_date desc; 9 10 11 12 13 14 15 16
Explained.
SQL> select * from table(dbms_xplan.display);
PLAN_TABLE_OUTPUT
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)|
| 0 | SELECT STATEMENT | | 544K| 30M| | 693K (20)|
| 1 | SORT ORDER BY | | 544K| 30M| 75M| 693K (20)|
|* 2 | HASH JOIN | | 544K| 30M| | 689K (20)|
|* 3 | TABLE ACCESS FULL | ACCOUNT | 20080 | 294K| | 6220 (18)|
|* 4 | HASH JOIN OUTER | | 131M| 5532M| 5155M| 678K (20)|
|* 5 | TABLE ACCESS FULL| ACCOUNT_HISTORY | 131M| 3646M| | 197K (25)|
| 6 | TABLE ACCESS FULL| INVOICE | 262M| 3758M| | 306K (18)|
Predicate Information (identified by operation id):
2 - access("A"."ACCOUNT_ID"="AH"."ACCOUNT_ID")
3 - filter("A"."ACCOUNT_TYPE_ID"=1000002 AND "A"."ACCOUNT_BALANCE">0 AND
ROUND("A"."ACCOUNT_BALANCE",2)>0)
4 - access("AH"."INVOICE_ID"="I"."INVOICE_ID"(+))
5 - filter("AH"."CURRENT_BALANCE">"AH"."PREVIOUS_BALANCE" AND ("AH"."INVOICE_ID"
IS NOT NULL OR "AH"."ADJUSTMENT_ID" IS NOT NULL))
22 rows selected.
Index Details:
SQL> select INDEX_OWNER,INDEX_NAME,COLUMN_NAME,TABLE_NAME from dba_ind_columns where
2 table_name in ('INVOICE','ACCOUNT','ACCOUNT_HISTORY') order by 4;
INDEX_OWNER INDEX_NAME COLUMN_NAME TABLE_NAME
OPS$SVM_SRV4 P_ACCOUNT ACCOUNT_ID ACCOUNT
OPS$SVM_SRV4 U_ACCOUNT_NAME ACCOUNT_NAME ACCOUNT
OPS$SVM_SRV4 U_ACCOUNT CUSTOMER_NODE_ID ACCOUNT
OPS$SVM_SRV4 U_ACCOUNT ACCOUNT_TYPE_ID ACCOUNT
OPS$SVM_SRV4 I_ACCOUNT_ACCOUNT_TYPE ACCOUNT_TYPE_ID ACCOUNT
OPS$SVM_SRV4 I_ACCOUNT_INVOICE INVOICE_ID ACCOUNT
OPS$SVM_SRV4 I_ACCOUNT_PREVIOUS_INVOICE PREVIOUS_INVOICE_ID ACCOUNT
OPS$SVM_SRV4 U_ACCOUNT_NAME_ID ACCOUNT_NAME ACCOUNT
OPS$SVM_SRV4 U_ACCOUNT_NAME_ID ACCOUNT_ID ACCOUNT
OPS$SVM_SRV4 I_LAST_MODIFIED_ACCOUNT LAST_MODIFIED ACCOUNT
OPS$SVM_SRV4 I_ACCOUNT_INVOICE_ACCOUNT INVOICE_ACCOUNT_ID ACCOUNT
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_ACCOUNT ACCOUNT_ID ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_ACCOUNT SEQNR ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_INVOICE INVOICE_ID ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_ADINV INVOICE_ID ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_CIA CURRENT_BALANCE ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_CIA INVOICE_ID ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_CIA ADJUSTMENT_ID ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_CIA ACCOUNT_ID ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_LMOD LAST_MODIFIED ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_ADINV ADJUSTMENT_ID ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_PAYMENT PAYMENT_ID ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_ADJUSTMENT ADJUSTMENT_ID ACCOUNT_HISTORY
OPS$SVM_SRV4 I_ACCOUNT_HISTORY_APPLIED_DT APPLIED_DATE ACCOUNT_HISTORY
OPS$SVM_SRV4 P_INVOICE INVOICE_ID INVOICE
OPS$SVM_SRV4 U_INVOICE CUSTOMER_INVOICE_STR INVOICE
OPS$SVM_SRV4 I_LAST_MODIFIED_INVOICE LAST_MODIFIED INVOICE
OPS$SVM_SRV4 U_INVOICE_ACCOUNT ACCOUNT_ID INVOICE
OPS$SVM_SRV4 U_INVOICE_ACCOUNT BILL_RUN_ID INVOICE
OPS$SVM_SRV4 I_INVOICE_BILL_RUN BILL_RUN_ID INVOICE
OPS$SVM_SRV4 I_INVOICE_INVOICE_TYPE INVOICE_TYPE_ID INVOICE
OPS$SVM_SRV4 I_INVOICE_CUSTOMER_NODE CUSTOMER_NODE_ID INVOICE
32 rows selected.
Regards,
Bathula
Oracle-DBA
I have some suggestions. But first, you realize that you have some redundant indexes, right? You have an index on account(account_name) and also account(account_name, account_id), and also account_history(invoice_id) and account_history(invoice_id, adjustment_id). No matter, I will suggest some new composite indexes.
Also, you do not need two lines for these conditions:
and round(a.account_balance, 2) > 0
AND a.account_balance > 0
You can just use: and a.account_balance >= 0.005
So the formatted query isselect a.account_id,
round(a.account_balance, 2) account_balance,
nvl(ah.invoice_id, ah.adjustment_id) transaction_id,
to_char(ah.effective_start_date, 'DD-MON-YYYY') transaction_date,
to_char(nvl(i.payment_due_date, to_date('30-12-9999', 'dd-mm-yyyy')),
'DD-MON-YYYY') due_date,
ah.current_balance - ah.previous_balance amount,
decode(ah.invoice_id, null, 'A', 'I') transaction_type
from account a, account_history ah, invoice i
where a.account_id = ah.account_id
and a.account_type_id = 1000002
and (ah.invoice_id is not null or ah.adjustment_id is not null)
and ah.CURRENT_BALANCE > ah.previous_balance
and ah.invoice_id = i.invoice_id(+)
AND a.account_balance >= .005
order by a.account_id, ah.effective_start_date desc;
You will probably want to select:
1. From ACCOUNT first (your smaller table), for which you supply a literal on account_type_id. That should limit the accounts retrieved from ACCOUNT_HISTORY
2. From ACCOUNT_HISTORY. We want to limit the records as much as possible on this table because of the outer join.
3. INVOICE we want to access last because it seems to be least restricted, it is the biggest, and it has the outer join condition so it will manufacture rows to match as many rows as come back from account_history.
Try the query above after creating the following composite indexes. The order of the columns is important:
create index account_composite_i on account(account_type_id, account_balance, account_id);
create index acct_history_comp_i on account_history(account_id, invoice_id, adjustment_id, current_balance, previous_balance, effective_start_date);
create index invoice_composite_i on invoice(invoice_id, payment_due_date);
All the columns used in the where clause will be indexed, in a logical order suited to the needs of the query. Plus each selected column is indexed as well, so we should not need to touch the tables at all to satisfy the query.
Try the query after creating these indexes.
A final suggestion is to try larger sort and hash area sizes and a manual workarea policy:
alter session set workarea_size_policy = manual;
alter session set sort_area_size = 2147483647;
alter session set hash_area_size = 2147483647; -
How to reduce long extraction time - Generic extractor based on view
Hello. The previous thread about this error was closed without any answer.
I'm in the same configuration.
"Using a generic extractor (Delta enabled on confirmation date) based on view to load data from ECC6 to BI.
Now my issue is that the delta extraction is running for a long time (around 1 hr) even if the data volume is very small
While checking the job overview in source system i saw that the job is hanging at the below step for almost 1 hr."
1 LUWs confirmed and 1 LUWs to be deleted with function module MF RSC2_QOUT_CONFIRM_DATA.
I am not understanding why this is happening.
Here is the log of the job:
30.05.2011 03:58:11 Job started
30.05.2011 03:58:11 Step 001 started (program SBIE0001, variant &0000000166864, user ID ALEREMOTE)
30.05.2011 03:58:11 Asynchronous transmission of info IDoc 2 in task 0001 (0 parallel tasks)
30.05.2011 03:58:11 DATASOURCE = 0CO_OM_WBS_6
30.05.2011 03:58:11 *************************************************************************
30.05.2011 03:58:11 * Current Values for Selected Profile Parameters *
30.05.2011 03:58:11 *************************************************************************
30.05.2011 03:58:11 * abap/heap_area_nondia......... 4000000000 *
30.05.2011 03:58:11 * abap/heap_area_total.......... 8000000000 *
30.05.2011 03:58:11 * abap/heaplimit................ 100000000 *
30.05.2011 03:58:11 * zcsa/installed_languages...... DEFS *
30.05.2011 03:58:11 * zcsa/system_language.......... E *
30.05.2011 03:58:11 * ztta/max_memreq_MB............ 2047 *
30.05.2011 03:58:11 * ztta/roll_area................ 6500000 *
30.05.2011 03:58:11 * ztta/roll_extension........... 2000000000 *
30.05.2011 03:58:11 *************************************************************************
30.05.2011 03:58:11 1 LUWs confirmed and 1 LUWs to be deleted with function module RSC2_QOUT_CONFIRM_DATA
30.05.2011 05:02:53 Call customer enhancement BW_BTE_CALL_BW204010_E (BTE) with 171 records
30.05.2011 05:02:53 Result of customer enhancement: 171 records
30.05.2011 05:02:53 Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 171 records
30.05.2011 05:02:53 Result of customer enhancement: 171 records
30.05.2011 05:02:53 Asynchronous send of data package 1 in task 0002 (1 parallel tasks)
30.05.2011 05:02:53 IDOC: Info IDoc 2, IDoc No. 4667050, Duration 00:00:00
30.05.2011 05:02:53 IDoc: Start = 30.05.2011 03:58:11, End = 30.05.2011 03:58:11
30.05.2011 05:02:53 tRFC: Data Package = 1, TID = AC11082D38B44DE308DD028A, Duration = 00:00:00, ARFCSTATE = RECORDED
30.05.2011 05:02:53 tRFC: Start = 30.05.2011 05:02:53, End = 30.05.2011 05:02:53
30.05.2011 05:02:53 Altogether, 0 records were filtered out through selection conditions
30.05.2011 05:02:53 Asynchronous transmission of info IDoc 3 in task 0003 (0 parallel tasks)
30.05.2011 05:02:53 IDOC: Info IDoc 3, IDoc No. 4667051, Duration 00:00:00
30.05.2011 05:02:53 IDoc: Start = 30.05.2011 05:02:53, End = 30.05.2011 05:02:53
30.05.2011 05:02:53 Synchronized transmission of info IDoc 4 (0 parallel tasks)
30.05.2011 05:02:53 IDOC: Info IDoc 4, IDoc No. 4667052, Duration 00:00:00
30.05.2011 05:02:53 IDoc: Start = 30.05.2011 05:02:53, End = 30.05.2011 05:02:53
30.05.2011 05:02:53 Job finished
Thanks for your help.
Yann
Parth Kulkarni,
Following the note and the other thread, I've checked index 4 of table COEP.
Here is what I got:
Index ID COEP - 4
Short text MANDT/TIMESTMP/OBJNR, Index for Delta read method (CO-PA)
Last changed SAP 04.01.2010
Status Active Saved
Does not exist in the database
DB index name
Not defined as DB index in the ABAP Dictionary
Is the fact that it doesn't exist in the database a problem?
Is the fact that it is active good enough?
By the way thanks for your speedy reply !
Yann
Edited by: Yann GOFFIN on May 30, 2011 1:35 PM -