E4200 Serious performance problems.
I am replacing two bad D-Link DIR-655 wireless routers. Hey, I have kids and they spill things. Regardless of the reason, I decided to try the E4200s for their dual-band capability. Wow, I am so disappointed in the performance.
My setup is as follows. I have a 20 Mbps/1.5 Mbps cable connection; the cable modem is an Arris WBM760, direct-connected to one of the E4200s. My desktop PC has a Linksys dual-band WMP600N with the latest drivers. I am running Windows 7.
The other E4200 is linked via a network cable and set up per your documentation on how to bridge the two routers.
If I run my 50' cable to my desktop, or connect my notebook to the cable modem, and run a speed check, I get:
Download Speed: 20874 kbps (2609.3 KB/sec transfer rate)
Upload Speed: 1541 kbps (192.6 KB/sec transfer rate)
Latency: 5 ms
Monday, April 18, 2011 10:16:45 AM
Using the wireless network (note that I have direct line of sight and less than 50' to the wireless router), I am getting downloads of < 10 kbps, while uploads are pretty much on par with the wired connection at 1521 kbps. Latency is 13 ms, which seems high.
I am running the latest firmware.
Jay Conway
Have you tried changing channels to check if the transfer rates would vary?
Similar Messages
-
Serious performance problems on iWeb site
Can anyone help me find the problem with peterlance.com? It looks like author Peter Lance uses iWeb to update his web site. He redesigned it recently (and did not use iWeb for the old design). Anyway, it has serious performance problems in Internet Explorer, which I was hoping you could help pinpoint. Is there something simple I can tell him to try to fix the problem? He is launching a new book on Tuesday, so he will be getting a lot of traffic, but for now the response time is not very good. Any suggestions?
In particular, each new page seems to hang for about ten seconds and freeze using Internet Explorer 6. Any fixes?
I mean, fixes other than: "Don't use Internet Explorer!"
http://www.peterlance.com/Peter%20Lance/Home/Home.html
name="Generator" content="iWeb 1.1.2"
Welcome to the discussions, TopDog08.
There are several plain .jpg's and that's good, but you need to convert every .png on the page (or as many .png's as you can) to a .jpg. For example,
http://www.peterlance.com/Peter%20Lance/Home/Homefiles/shapeimage46.png
Convert that to a .jpg, then replace the graphic and text on the iWeb page with the converted .jpg (some have had success using Downsize http://www.stuntsoftware.com/Downsize/).
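If there are many images to convert, a small batch helper can do the flattening; here is a minimal sketch using Java's standard ImageIO (file names are placeholders, and the white background is an assumption, needed because JPEG has no transparency):

```java
import java.awt.Color;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

// Minimal sketch: flatten a .png onto a white background and save it as .jpg.
// JPEG has no alpha channel, so any transparency must be composited onto a
// solid color first; Color.WHITE here is an assumption about the page design.
public class PngToJpg {
    public static void convert(File pngIn, File jpgOut) throws Exception {
        BufferedImage png = ImageIO.read(pngIn);
        BufferedImage jpg = new BufferedImage(
                png.getWidth(), png.getHeight(), BufferedImage.TYPE_INT_RGB);
        jpg.createGraphics().drawImage(png, 0, 0, Color.WHITE, null);
        ImageIO.write(jpg, "jpg", jpgOut);
    }
}
```

A tool like Downsize does the same job interactively; the point is only that the conversion itself is mechanical.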
The ones you won't be able to do anything about are the ones that are a part of the blog, but that will be only 5 images instead of the many you have now.
This image
http://www.peterlance.com/Peter%20Lance/Home/Homefiles/shapeimage50.png
is covering part of your menu, which is why it appears that only the top half is sensitive to rolling over. You'll have to adjust your layout to account for this.
Another area to look at: there are white-boxed .png's surrounding the Amazon book covers at the top.
http://www.peterlance.com/Peter%20Lance/Home/Homefiles/imageEffectsAboveAli%20Mug%20TPI%20jpg.png
Another set of .png's you'll have to get rid of. How did those get put on the page?
What you see happening in IE is a JavaScript running to work around IE's problem with .png's. The fewer .png's you have, the less of a problem with IE you have. -
Serious performance problems after updating from 10.5.6 to 10.5.7
I've got a Macbook Pro C2D 15" (late 2006). After updating the system from 10.5.6 to 10.5.7 and quicktime to 7.6.2, I experience serious performance problems which I have never seen before:
-From time to time, kernel_task peaks in Activity Monitor (consumes a lot of CPU). At that moment, the mouse hangs briefly and audio playback is distorted or has dropouts.
-Recording audio consumes a lot of CPU (independent of the audio application).
-Video playback consumes a lot of CPU, regardless of whether I use VLC or the QuickTime player.
Any idea what could be done? Thanks in advance.
After a few days of using CoolBook, I can report that my 1st-generation MacBook Air is now reliable and never overheats, regardless of environmental conditions.
I must point out that I do believe that it really is something new in 10.5.7 that is causing this problem as my environmental conditions have not changed and even Summer in Dublin still sees temperatures no more than 72F/22C. I am now in Madrid, where the environment is quite different, and I am still having no problem with kernel_task spinning.
I plan on reporting my problem to Apple anyway and being aggressive about the issue as MacLovinRanga did in this thread (http://discussions.apple.com/thread.jspa?messageID=9647353#9647353). Perhaps many/all of those of us having this problem will obtain new MacBook Airs.
Joe Kiniry -
Hi
I am having some very serious performance problems with Java Web Start. I have a Swing client that transmits XML/HTTP to and from a server, parses it, etc. I am finding that the XML/HTTP data request
is a factor of 10 to 100X slower, at least, using Java Web Start than without.
Any suggestions would be helpful
Thanks
Charles
Charles,
I've also observed some overhead when running an application deployed with JAWS compared to running the application from the jar, or in my dev environment.
After searching these forums a bit, I found this buried nugget of info: Try System.setSecurityManager(null) if you are requesting <all-permissions/> already.
Before the workaround, my application was taking 194 seconds for a single analysis run. After this workaround was applied, the same analysis run took only 82 seconds.
So as you can see, there is considerable overhead associated with performing operations through the layers of security (my guess). My particular application does a lot of disk I/O (which is probably something the security manager watches closely).
I stress the 'workaround' aspect of this because I haven't fully investigated the ramifications of using it, but eventually this might turn into a bug submission (because if I'm already requesting all-permissions, why do I still need to do setSecurityManager(null)?).
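As a sketch of the workaround described above (hedged the same way: it only applies when the JNLP already requests <all-permissions/>, and its ramifications have not been fully investigated):

```java
// Sketch of the Web Start workaround: drop the security manager after launch,
// assuming the JNLP already grants <all-permissions/>. Without those
// permissions, the setSecurityManager call itself throws a SecurityException,
// so this only applies to fully trusted applications.
public class JawsWorkaround {
    public static void main(String[] args) {
        if (System.getSecurityManager() != null) {
            System.setSecurityManager(null);
        }
        System.out.println("security manager: " + System.getSecurityManager());
        // ... continue launching the Swing client as usual ...
    }
}
```

The guard means the call is a no-op outside Web Start, so the same main method works in both environments.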
At any rate, hope this helps!
-jw -
Hi people,
I have a serious performance problem regarding APEX.
Via SQL Developer, queries are fast and feedback is normally quick.
Via browser (IE, FF, or Opera), sometimes it is fast, sometimes images are missing, sometimes white pages are displayed, and the system response is dog slow...
The general configuration is:
- VmWare ESX server 3.5
- Windows 2003 server R2 with 4GB Ram (1 reserved)
- Oracle 11g patch 12
- 2 CPU AMD64
- APEX 3.1.2
- 3 virtual disks: 20G for Windows, 20G for Oracle, 40G for datafiles
Any suggestions on where to start a serious analysis?
I'm desperate :(
Thanks
Claudio
I found these parameters:
audit_file_dest = <omitted>\ADUMP
audit_trail = DB
compatible = 11.1.0.0.0
control_files = ("<omitted>\CONTROL01.CTL", "<omitted>\CONTROL02.CTL", "<omitted>\CONTROL03.CTL")
control_management_pack_access = DIAGNOSTIC+TUNING
cursor_sharing = SIMILAR
db_block_size = 8192
db_cache_advice = ON
db_domain = <omitted>
db_name = silsvil
db_recovery_file_dest = <omitted>\flash_recovery_area
db_recovery_file_dest_size = 2147483648
diagnostic_dest = <omitted>\ORACLE
dispatchers = '(PROTOCOL=TCP) (SERVICE=<omitted>XDB)'
job_queue_processes = 1000
memory_max_target = 2147483648
memory_target = 1073741824
nls_language = ITALIAN
nls_territory = ITALY
open_cursors = 600
optimizer_mode = FIRST_ROWS
plsql_debug = TRUE
processes = 150
query_rewrite_enabled = FALSE
remote_login_passwordfile = EXCLUSIVE
session_cached_cursors = 60
shared_server_sessions = 10
sql_trace = FALSE
statistics_level = ALL
trace_enabled = TRUE
undo_tablespace = UNDOTBS1
According to the Database Configuration Assistant, it is in DEDICATED mode...
Claudio
Edited by: Seek78 on Nov 27, 2008 5:26 PM -
Serious performance problem - SELECT DISTINCT x.JDOCLASSX FROM x
I am noticing a huge performance problem when trying to access a member that
is lazily loaded:
MonitorStatus previousStatus = m.getStatus();
This causes the following query to be executed:
SELECT DISTINCT MONITORSTATUSX.JDOCLASSX FROM MONITORSTATUSX
This table has 3 million records and this SQL statement takes 3 minutes to
execute! Even worse, my app heavily uses threads, so this statement is
executed in each of the 32 threads. As a result the application stops.
Is there any way that I can optimize this? And more importantly, can Kodo
handle a multithreaded app like this with a huge database? I've been having
a lot of performance problems since I've started doing stress & load
testing, and I'm thinking Kodo isn't ready for this type of application.
Thanks,
Michael
You can prevent this from happening by explicitly enumerating the valid
persistent types in a property. See
http://docs.solarmetric.com/manual.html#com.solarmetric.kodo.PersistentTypes
for details.
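For illustration, the property behind that link takes a list of persistent classes; a hypothetical fragment follows (the class names are invented, so check the linked manual for the exact syntax):

```properties
# Hypothetical kodo.properties fragment: enumerating the persistent types
# up front lets Kodo skip the SELECT DISTINCT JDOCLASSX subclass-discovery
# query against the 3-million-row table.
com.solarmetric.kodo.PersistentTypes=com.example.MonitorStatus,com.example.Monitor
```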
Inconveniently, this nugget of performance info is not listed in the optimization guide. I'll add an entry for it.
This setting did in fact prevent the query from running, which fixed the problem. It definitely belongs in the optimization guide.
And more importantly, can Kodo handle a multithreaded app like this with a huge database? I've been having a lot of performance problems since I've started doing stress & load testing, and I'm thinking Kodo isn't ready for this type of application.
I'd like to find out more details about your issues. We
do a decent amount of stress / load testing internally, but there are
always use cases that we don't test. Please send me an email (I'm assuming
that [email protected] is not really your address) and let's
figure out some way to do an analysis of what you're seeing.
This email is just for posting to usenet, to avoid spam. I'm now running my
app through stress/load testing so I hope to discover any remaining issues
before going into production. As of this morning the system seems to be
performing quite well. Now the biggest performance problem for me is the
lack of what I think is called "outer join". I know you'll have this in 3.0
but I'm surprised you don't have this already, because not having it really
affects performance. I already had to code one query by hand with JDBC due
to this. It was taking 15+ minutes with Kodo and with my JDBC version it
only takes a few seconds. There are lots of anti-JDO people and performance
issues like this really give them ammunition. Overall I just have the
impression that Kodo hasn't been used on many really large scale projects
with databases that have millions of records.
Thanks for configuration fix,
Michael -
JSP causes serious performance problem in my EP
I have a native JSP application running on the EP. This application accesses an Oracle DB (using the oracle.jdbc.driver.OracleDriver driver).
When this DB is not running, my EP doesn't open any page. The EP tries to connect to this DB, but does not succeed.
I modified this application with TRY and CATCH to handle the problem inside the application, but the EP problem continues.
In the EP Thread Overview (System Adm -> Monitoring -> Portal -> Thread Overview), I see locked threads for this application. To unlock these threads, I delete the par file from the Portal and re-deploy it.
Once these threads are deleted, the application handles the problem (displays an error message in the iView container) and the EP works normally.
My questions:
1) Is there some way (via code or property) to keep a problem in this application from causing performance problems in the EP?
2) Is there some property in the Config Tool where I can configure the connection timeout for accessing the external DB?
3) Is there some way to release these locked threads in the EP? (I see the page Clearing the Portal Runtime Cache - http://help.sap.com/saphelp_nw04/helpdata/en/d2/a216e1bd7b431c82fa5ff105187112/frameset.htm - but I don't know if I can use it for this.)
I use EP 6.0 SP 15.
Thanks,
Yuri Fiori de Almeida
Hi Umair,
the code is:
<%@ page contentType="text/html; charset=iso-8859-1" language="java" import="java.sql.*" %>
<%@ include file="isa_cad.jsp" %>
<%
Connection Connselect_localidade = null;
try{
  Driver Driverselect_localidade = (Driver)Class.forName(MM_isa_cad_DRIVER).newInstance();
  Connselect_localidade = DriverManager.getConnection(MM_isa_cad_STRING,MM_isa_cad_USERNAME,MM_isa_cad_PASSWORD);
  PreparedStatement Statementselect_localidade = Connselect_localidade.prepareStatement("SELECT DISTINCT X AS Y FROM Z.K WHERE W = 1 ORDER BY X");
  ResultSet select_localidade = Statementselect_localidade.executeQuery();
  boolean select_localidade_isEmpty = !select_localidade.next();
  boolean select_localidade_hasData = !select_localidade_isEmpty;
  Object select_localidade_data;
  int select_localidade_numRows = 0;
%>
<!---- Page Layout --->
<%
  select_localidade_hasData = select_localidade.next();
  select_localidade.close();
  Statementselect_localidade.close();
  //Connselect_localidade.close();
}
catch(Exception e){
%>
<!---- Page Layout --->
<%
}
finally{
  try{
    if(Connselect_localidade != null) Connselect_localidade.close();
  }
  catch(Exception e){
  }
}
%>
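One way to bound how long a page like this can hang on a dead database is JDBC's driver-level login timeout; a sketch, assuming the Oracle driver honors the hint (this is a JVM-level setting, not a Config Tool property):

```java
import java.sql.DriverManager;

// Sketch: cap connection-attempt time for all JDBC drivers in this JVM.
// Set it once, before the first getConnection call; whether the limit is
// honored depends on the driver, so treat this as an assumption to verify.
public class LoginTimeoutSketch {
    public static void main(String[] args) {
        DriverManager.setLoginTimeout(5); // seconds
        System.out.println("login timeout: " + DriverManager.getLoginTimeout());
    }
}
```

With a bounded login timeout, a connection attempt against a down database fails quickly instead of pinning a portal thread.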
Thanks,
Yuri. -
Serious Performance Problems in ABAP Reports
Hi All,
We are developing ABAP reports for the SAP IS-U/CCS modules and are facing performance issues. The whole scenario is given below. Please suggest a solution.
1. Total No. Of Business Partners = 1500000
2. Report selection criteria are of two types:
a) GSBER (Business Area) - selection for at least 200000 Business Partners
b) COKEY (Division) - selection for around 1000 to 50000 Business Partners
3. To implement our report logic we have to access several tables, all of which are very large. They are:
Table Name No. of records (Appx)
DBERCHV 20000000
DBERCHZ1 20000000
DBERDLB 20000000
DFKKKO 20000000
DFKKOP 20000000
EANLH 4000000
ERCH 10000000
ERCHC 10000000
ETTIFN 30000000
EVER 1500000
FKKVKP 1500000
TECOKT 500
TGSBT 12
4. Due to the large no. of records we are facing problems at two levels:
a) The Open SQL statements take too much time for data selection
b) Since a large no. of records is selected, the corresponding loops and data processing also take much time
5. We have tried almost all ABAP performance optimization techniques, such as using indexes, SQL optimization, READ TABLE optimization, LOOP statement optimization, etc., but there is not much improvement.
6. For example, one of our reports, "R15", takes around 1500 seconds for 1000 Business Partners.
Its Code is attached below:
<b>a) ZISU_SCHL_LTR15_BAPI - Program which schedules actual R15 report in background</b>
*& Report ZISU_SCHL_LTR15_BAPI *
*& Developed By : Piyusha Kirwai *
*& Date : 02/12/2005 *
*& Purpose : To Schedule the LT R 15 prog in background and store
*& the File for Manual R-15 into server.
REPORT ZISU_SCHL_LTR15_BAPI NO STANDARD PAGE HEADING.
TABLES ZEVERFKKVKP.
DATA: DYFIELDS LIKE DYNPREAD OCCURS 0 WITH HEADER LINE.
DATA: IT_TAB TYPE FILETABLE,
GD_SUBRC TYPE I.
DATA: LV_GSBER TYPE TGSBT-GSBER.
RANGES R_COKEY FOR ZEVERFKKVKP-COKEY.
DATA: BEGIN OF GT_TECOKT OCCURS 100,
COKEY TYPE TECOKT-COKEY,
LTEXT TYPE TECOKT-LTEXT,
END OF GT_TECOKT,
BEGIN OF GT_TE422 OCCURS 100,
TERMSCHL TYPE TE422-TERMSCHL,
TERMTEXT TYPE TE422-TERMTEXT,
END OF GT_TE422.
DATA: BEGIN OF GWA_MANUAL_DATA,
COL_1(5) TYPE C,
COL_2(10) TYPE C,
COL_3(40) TYPE C,
COL_4(40) TYPE C,
COL_5(10) TYPE C,
COL_6(19) TYPE C,
COL_7(19) TYPE C,
COL_8(19) TYPE C,
COL_9(19) TYPE C,
COL_10(19) TYPE C,
COL_11(19) TYPE C,
COL_12(10) TYPE C,
COL_13(19) TYPE C,
COL_14(19) TYPE C,
COL_15(19) TYPE C,
COL_16(19) TYPE C,
COL_17(19) TYPE C,
COL_18(19) TYPE C,
END OF GWA_MANUAL_DATA,
GT_MANUAL_DATA LIKE GWA_MANUAL_DATA OCCURS 0,
GT_INTERN_DATA TYPE ALSMEX_TABLINE OCCURS 0 WITH HEADER LINE.
*&-----FOR UPLOADING FILE TO SERVER------------------------------------*
DATA: LV_SERVER_FILE_PREFIX(60) TYPE C.
DATA: LV_SERVER_DATAFILE_NAME(60) TYPE C.
DATA: LV_SERVER_ERRFILE_NAME(60) TYPE C.
DATA: ENDT LIKE SY-UZEIT,
ENDD LIKE SY-DATUM,
JOBCOUNT TYPE TBTCJOB-JOBCOUNT.
SELECTION-SCREEN BEGIN OF BLOCK 001 WITH FRAME TITLE TEXT-001.
SELECT-OPTIONS: SO_GSBER FOR ZEVERFKKVKP-GSBER OBLIGATORY,
SO_COKEY FOR ZEVERFKKVKP-COKEY OBLIGATORY,
SO_MRU FOR ZEVERFKKVKP-ABLEINH.
PARAMETERS: P_BMNTH(7) TYPE C OBLIGATORY.
SELECT-OPTIONS: SO_GPART FOR ZEVERFKKVKP-GPART.
SELECTION-SCREEN END OF BLOCK 001.
SELECTION-SCREEN BEGIN OF BLOCK 002 WITH FRAME TITLE TEXT-003.
PARAMETERS: P_COMPUT RADIOBUTTON GROUP R15,
P_INCLUD RADIOBUTTON GROUP R15,
P_FILE TYPE RLGRAP-FILENAME MODIF ID ACT.
SELECTION-SCREEN END OF BLOCK 002.
SELECTION-SCREEN BEGIN OF LINE.
SELECTION-SCREEN COMMENT 2(83) TEXT-004.
SELECTION-SCREEN END OF LINE.
*SELECTION-SCREEN BEGIN OF LINE.
* SELECTION-SCREEN COMMENT 8(40) TEXT-005.
*SELECTION-SCREEN END OF LINE.
SELECTION-SCREEN BEGIN OF BLOCK 003 WITH FRAME TITLE TEXT-002.
PARAMETERS P_SCHL TYPE C AS CHECKBOX.
PARAMETERS : P_IMMED RADIOBUTTON GROUP SCH ,
P_DT_TM RADIOBUTTON GROUP SCH,
P_DATE TYPE SY-DATUM MODIF ID SHL,
P_TIME TYPE SY-UZEIT MODIF ID SHL.
SELECTION-SCREEN END OF BLOCK 003.
AT SELECTION-SCREEN.
AT SELECTION-SCREEN ON SO_GSBER.
IF NOT SO_GSBER-LOW IS INITIAL.
SELECT SINGLE GSBER FROM TGSBT INTO LV_GSBER
WHERE GSBER = SO_GSBER-LOW.
IF SY-SUBRC <> 0.
MESSAGE E007(ZISU).
ENDIF.
ENDIF.
IF NOT SO_GSBER-HIGH IS INITIAL .
SELECT SINGLE GSBER FROM TGSBT INTO LV_GSBER
WHERE GSBER = SO_GSBER-HIGH.
IF SY-SUBRC <> 0.
MESSAGE E007(ZISU).
ENDIF.
ENDIF.
AT SELECTION-SCREEN ON P_BMNTH.
**check the validity of Billing month format and for valid billing month
IF P_BMNTH CO '0123456789/'.
IF P_BMNTH CP '++++/++'.
IF P_BMNTH+5(2) > 12.
MESSAGE E002(ZISU) WITH 'from'.
ENDIF.
** current year is less than year entered
IF SY-DATUM(4) < P_BMNTH(4).
MESSAGE E003(ZISU) WITH 'from'.
ELSEIF SY-DATUM(4) = P_BMNTH(4).
** month in future
IF SY-DATUM+4(2) < P_BMNTH+5(2).
MESSAGE E004(ZISU) WITH 'from'.
ENDIF.
ENDIF.
ELSE.
MESSAGE E001(ZISU) WITH 'from'.
ENDIF.
ELSE.
** entry have some invalid char
MESSAGE E010(ZISU).
ENDIF.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR SO_COKEY-LOW.
**for geting the CO account assigment key of the entered Business area
CLEAR: DYFIELDS[], DYFIELDS.
PERFORM DIVISON_DATA_GET CHANGING SO_COKEY-LOW.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR SO_COKEY-HIGH.
**for geting the CO account assigment key of the entered Business area
CLEAR: DYFIELDS[], DYFIELDS.
PERFORM DIVISON_DATA_GET CHANGING SO_COKEY-HIGH.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR SO_MRU-LOW.
CLEAR: DYFIELDS[], DYFIELDS.
PERFORM MRU_DATA_GET CHANGING SO_MRU-LOW.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR SO_MRU-HIGH.
CLEAR: DYFIELDS[], DYFIELDS.
PERFORM MRU_DATA_GET CHANGING SO_MRU-HIGH.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR P_FILE.
REFRESH: IT_TAB.
**Opens File Open Dialog Box for selecting input file.
CALL METHOD CL_GUI_FRONTEND_SERVICES=>FILE_OPEN_DIALOG
EXPORTING
WINDOW_TITLE = 'Select File'
DEFAULT_FILENAME = '*.xls'
MULTISELECTION = ' '
CHANGING
FILE_TABLE = IT_TAB
RC = GD_SUBRC.
LOOP AT IT_TAB INTO P_FILE.
* so_fpath-sign = 'I'.
* so_fpath-option = 'EQ'.
* append so_fpath.
ENDLOOP.
AT SELECTION-SCREEN OUTPUT.
LOOP AT SCREEN.
IF SCREEN-GROUP1 = 'ACT'.
IF P_INCLUD = ' '.
SCREEN-INPUT = '0'.
ELSE.
SCREEN-INPUT = '1'.
ENDIF.
ENDIF.
IF SCREEN-GROUP1 = 'SHL'.
IF P_DT_TM = 'X'.
SCREEN-INPUT = '1'.
ELSE.
SCREEN-INPUT = '0'.
ENDIF.
ENDIF.
MODIFY SCREEN.
ENDLOOP.
INITIALIZATION.
P_DATE = SY-DATUM.
P_TIME = SY-UZEIT + 180.
START-OF-SELECTION.
IF P_INCLUD = 'X' AND P_FILE IS INITIAL.
MESSAGE 'Enter the Manual R-15 File' TYPE 'E'.
ENDIF.
* IF P_INCLUD = 'X'.
* PERFORM UPLOAD_EXCEL_FILE.
* CONCATENATE 'R15' SY-DATUM '-' SY-UZEIT INTO LV_SERVER_FILE_PREFIX.
* CONCATENATE LV_SERVER_FILE_PREFIX '-' 'LT.TMP' INTO
* LV_SERVER_DATAFILE_NAME.
* OPEN DATASET LV_SERVER_DATAFILE_NAME
* FOR OUTPUT
* IN TEXT MODE
* ENCODING DEFAULT.
* LOOP AT GT_MANUAL_DATA INTO GWA_MANUAL_DATA.
* IF ( GWA_MANUAL_DATA-COL_6 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_7 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_8 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_9 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_10 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_11 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_13 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_14 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_15 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_16 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_17 CN '0123456789 '
* AND GWA_MANUAL_DATA-COL_18 CN '0123456789 ' ).
* MESSAGE 'Character Data in Numerical Field' TYPE 'E'.
* ENDIF.
* TRANSFER GWA_MANUAL_DATA TO LV_SERVER_DATAFILE_NAME.
* ENDLOOP.
* CLOSE DATASET LV_SERVER_DATAFILE_NAME.
* ENDIF. " P_INCLUD = 'X'.
IF P_SCHL = 'X'.
IF P_IMMED = 'X'.
CALL FUNCTION 'C14B_ADD_TIME'
EXPORTING
I_STARTTIME = SY-UZEIT
I_STARTDATE = SY-DATUM
I_ADDTIME = '000010'
IMPORTING
E_ENDTIME = ENDT
E_ENDDATE = ENDD.
ELSEIF P_DT_TM = 'X'.
ENDD = P_DATE.
ENDT = P_TIME.
ENDIF.
CALL FUNCTION 'JOB_OPEN'
EXPORTING
* DELANFREP = ' '
* JOBGROUP = ' '
JOBNAME = 'R15JOB'
* SDLSTRTDT = ENDD
* SDLSTRTTM = ENDT
* JOBCLASS =
IMPORTING
JOBCOUNT = JOBCOUNT.
* CHANGING
* RET =
* EXCEPTIONS
* CANT_CREATE_JOB = 1
* INVALID_JOB_DATA = 2
* JOBNAME_MISSING = 3
* OTHERS = 4
IF SY-SUBRC <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
* WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
IF P_INCLUD = 'X'.
PERFORM UPLOAD_EXCEL_FILE.
CONCATENATE 'R15' SY-DATUM '-' SY-UZEIT INTO LV_SERVER_FILE_PREFIX.
CONCATENATE LV_SERVER_FILE_PREFIX '-' JOBCOUNT 'LT.TMP' INTO
LV_SERVER_DATAFILE_NAME.
CONDENSE LV_SERVER_DATAFILE_NAME.
OPEN DATASET LV_SERVER_DATAFILE_NAME
FOR OUTPUT
IN TEXT MODE
ENCODING DEFAULT.
IF SY-SUBRC NE 0.
MESSAGE 'Error in Creating File on Server' TYPE 'E'.
ENDIF.
LOOP AT GT_MANUAL_DATA INTO GWA_MANUAL_DATA.
IF ( GWA_MANUAL_DATA-COL_6 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_7 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_8 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_9 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_10 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_11 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_13 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_14 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_15 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_16 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_17 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_18 CN '0123456789 ' ).
MESSAGE 'Character Data in Numerical Field' TYPE 'E'.
ENDIF.
TRANSFER GWA_MANUAL_DATA TO LV_SERVER_DATAFILE_NAME.
ENDLOOP.
CLOSE DATASET LV_SERVER_DATAFILE_NAME.
IF SY-SUBRC NE 0.
MESSAGE 'Error in Creating File on Server' TYPE 'E'.
ENDIF.
ENDIF. " P_INCLUD = 'X'.
SUBMIT ZISU_LTR15_TUNE
WITH SO_GSBER IN SO_GSBER
WITH SO_COKEY IN SO_COKEY
WITH SO_MRU IN SO_MRU
WITH SO_GPART IN SO_GPART
WITH P_BMNTH = P_BMNTH
WITH P_COMPUT = P_COMPUT
WITH P_INCLUD = P_INCLUD
WITH P_FILE = LV_SERVER_DATAFILE_NAME
USER SY-UNAME VIA JOB 'R15JOB' NUMBER JOBCOUNT AND RETURN.
CALL FUNCTION 'JOB_CLOSE'
EXPORTING
* AT_OPMODE = ' '
* AT_OPMODE_PERIODIC = ' '
* CALENDAR_ID = ' '
* EVENT_ID = ' '
* EVENT_PARAM = ' '
* EVENT_PERIODIC = ' '
JOBCOUNT = JOBCOUNT
JOBNAME = 'R15JOB'
* LASTSTRTDT = NO_DATE
* LASTSTRTTM = NO_TIME
* PRDDAYS = 0
* PRDHOURS = 0
* PRDMINS = 0
* PRDMONTHS = 0
* PRDWEEKS = 0
* PREDJOB_CHECKSTAT = ' '
* PRED_JOBCOUNT = ' '
* PRED_JOBNAME = ' '
SDLSTRTDT = ENDD
SDLSTRTTM = ENDT
* STARTDATE_RESTRICTION = BTC_PROCESS_ALWAYS
STRTIMMED = P_IMMED.
* TARGETSYSTEM = ' '
* START_ON_WORKDAY_NOT_BEFORE = SY-DATUM
* START_ON_WORKDAY_NR = 0
* WORKDAY_COUNT_DIRECTION = 0
* RECIPIENT_OBJ =
* TARGETSERVER = ' '
* DONT_RELEASE = ' '
* TARGETGROUP = ' '
* DIRECT_START =
* IMPORTING
* JOB_WAS_RELEASED =
* CHANGING
* RET =
* EXCEPTIONS
* CANT_START_IMMEDIATE = 1
* INVALID_STARTDATE = 2
* JOBNAME_MISSING = 3
* JOB_CLOSE_FAILED = 4
* JOB_NOSTEPS = 5
* JOB_NOTEX = 6
* LOCK_FAILED = 7
* INVALID_TARGET = 8
* OTHERS = 9
IF SY-SUBRC <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
* WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
ELSE. " IF NOT TO RUN IN BACKGROUND
IF P_INCLUD = 'X'.
PERFORM UPLOAD_EXCEL_FILE.
CONCATENATE 'R15' SY-DATUM '-' SY-UZEIT INTO LV_SERVER_FILE_PREFIX.
CONCATENATE LV_SERVER_FILE_PREFIX '-' 'LT.TMP' INTO
LV_SERVER_DATAFILE_NAME.
CONDENSE LV_SERVER_DATAFILE_NAME.
OPEN DATASET LV_SERVER_DATAFILE_NAME
FOR OUTPUT
IN TEXT MODE
ENCODING DEFAULT.
IF SY-SUBRC NE 0.
MESSAGE 'Error in Creating File on Server' TYPE 'E'.
ENDIF.
LOOP AT GT_MANUAL_DATA INTO GWA_MANUAL_DATA.
IF ( GWA_MANUAL_DATA-COL_6 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_7 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_8 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_9 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_10 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_11 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_13 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_14 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_15 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_16 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_17 CN '0123456789 '
AND GWA_MANUAL_DATA-COL_18 CN '0123456789 ' ).
MESSAGE 'Character Data in Numerical Field' TYPE 'E'.
ENDIF.
TRANSFER GWA_MANUAL_DATA TO LV_SERVER_DATAFILE_NAME.
ENDLOOP.
CLOSE DATASET LV_SERVER_DATAFILE_NAME.
IF SY-SUBRC NE 0.
MESSAGE 'Error in Creating File on Server' TYPE 'E'.
ENDIF.
ENDIF. " P_INCLUD = 'X'.
IF NOT SY-BATCH IS INITIAL.
SUBMIT ZISU_LTR15_TUNE
WITH SO_GSBER IN SO_GSBER
WITH SO_COKEY IN SO_COKEY
WITH SO_MRU IN SO_MRU
WITH SO_GPART IN SO_GPART
WITH P_BMNTH = P_BMNTH
WITH P_COMPUT = P_COMPUT
WITH P_INCLUD = P_INCLUD
WITH P_FILE = LV_SERVER_DATAFILE_NAME
TO SAP-SPOOL WITHOUT SPOOL DYNPRO.
ELSE.
SUBMIT ZISU_LTR15_TUNE
WITH SO_GSBER IN SO_GSBER
WITH SO_COKEY IN SO_COKEY
WITH SO_MRU IN SO_MRU
WITH SO_GPART IN SO_GPART
WITH P_BMNTH = P_BMNTH
WITH P_COMPUT = P_COMPUT
WITH P_INCLUD = P_INCLUD
WITH P_FILE = LV_SERVER_DATAFILE_NAME.
ENDIF.
ENDIF.
*& Form divison_data_get
* text
* <--P_SO_COKEY_LOW text
FORM DIVISON_DATA_GET CHANGING P_SO_COKEY_LOW.
**to get the search help for division
DATA : LV_LINES TYPE SY-TFILL,
LT_RETURN_TAB LIKE DDSHRETVAL OCCURS 0, " with header line.
LWA_RETURN_TAB LIKE DDSHRETVAL.
DATA: LV_COKEY TYPE ZEVERFKKVKP-COKEY.
**according to the Business area entered]
* break csebdev1.
REFRESH R_COKEY[].
DESCRIBE TABLE SO_GSBER LINES LV_LINES.
** when user has neither pressed the enter key nor selected the values
**using search help
IF LV_LINES = 0.
* loop at so_gsber.
CLEAR R_COKEY.
* if so_gsber-high is initial.
* break csebdev1.
DYFIELDS-FIELDNAME = 'SO_GSBER-LOW'.
APPEND DYFIELDS.
CALL FUNCTION 'DYNP_VALUES_READ'
EXPORTING
DYNAME = SY-CPROG
DYNUMB = SY-DYNNR
TABLES
DYNPFIELDS = DYFIELDS.
* r_cokey-sign = 'I'.
* r_cokey-option = 'CP'.
*r_cokey-low = so_gsber-low+0(2).
CONCATENATE DYFIELDS-FIELDVALUE+0(2) '*' INTO R_COKEY-LOW.
* elseif not so_gsber-high is initial.
DYFIELDS-FIELDNAME = 'SO_GSBER-HIGH'.
APPEND DYFIELDS.
CALL FUNCTION 'DYNP_VALUES_READ'
EXPORTING
DYNAME = SY-CPROG
DYNUMB = SY-DYNNR
TABLES
DYNPFIELDS = DYFIELDS.
IF NOT DYFIELDS-FIELDVALUE IS INITIAL.
CONCATENATE DYFIELDS-FIELDVALUE+0(2) '99999999' INTO R_COKEY-HIGH.
ENDIF.
* endif.
IF NOT R_COKEY-HIGH IS INITIAL.
R_COKEY-SIGN = 'I'.
R_COKEY-OPTION = 'BT'.
ELSEIF R_COKEY-HIGH IS INITIAL.
R_COKEY-SIGN = 'I'.
R_COKEY-OPTION = 'CP'.
ENDIF.
APPEND R_COKEY.
ENDIF.
* endloop.
* APPEND DYFIELDS.
IF LV_LINES > 0.
* break-point.
IF NOT SO_GSBER[] IS INITIAL.
LOOP AT SO_GSBER.
* r_cokey-sign = 'I'.
* r_cokey-option = 'CP'.
*r_cokey-low = so_gsber-low+0(2).
CONCATENATE SO_GSBER-LOW+0(2) '*' INTO R_COKEY-LOW.
IF NOT SO_GSBER-HIGH IS INITIAL.
CONCATENATE SO_GSBER-HIGH+0(2) '99999999' INTO R_COKEY-HIGH.
ENDIF.
IF NOT R_COKEY-HIGH IS INITIAL.
R_COKEY-SIGN = 'I'.
R_COKEY-OPTION = 'BT'.
ELSEIF R_COKEY-HIGH IS INITIAL.
R_COKEY-SIGN = 'I'.
R_COKEY-OPTION = 'CP'.
ENDIF.
APPEND R_COKEY.
ENDLOOP.
ENDIF.
ENDIF.
IF NOT R_COKEY[] IS INITIAL.
* break csebdev1.
REFRESH GT_TECOKT[].
SELECT COKEY LTEXT
FROM TECOKT
INTO TABLE GT_TECOKT
WHERE COKEY IN R_COKEY AND SPRAS = 'EN'.
* %_HINTS ORACLE '("TECOKT","TECOKT~1")'.
ENDIF.
**now call the search help
CALL FUNCTION 'F4IF_INT_TABLE_VALUE_REQUEST'
EXPORTING
* DDIC_STRUCTURE = ' '
RETFIELD = 'COKEY'
* PVALKEY = ' '
* DYNPPROG = ' '
* DYNPNR = ' '
* DYNPROFIELD = ' '
* STEPL = 0
WINDOW_TITLE = 'Division'
VALUE = DYFIELDS-FIELDVALUE
VALUE_ORG = 'S'
* MULTIPLE_CHOICE = ' '
* DISPLAY = ' '
* CALLBACK_PROGRAM = ' '
* CALLBACK_FORM = ' '
* MARK_TAB =
* IMPORTING
* USER_RESET =
TABLES
VALUE_TAB = GT_TECOKT
* FIELD_TAB =
RETURN_TAB = LT_RETURN_TAB.
* DYNPFLD_MAPPING =
* EXCEPTIONS
* PARAMETER_ERROR = 1
* NO_VALUES_FOUND = 2
* OTHERS = 3
IF SY-SUBRC <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
* WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
READ TABLE LT_RETURN_TAB INTO LWA_RETURN_TAB INDEX 1.
LV_COKEY = LWA_RETURN_TAB-FIELDVAL.
*break csebdev1.
ENDFORM. " divison_data_get
*& Form upload_excel_file
* text
* --> p1 text
* <-- p2 text
FORM UPLOAD_EXCEL_FILE .
DATA: LV_INDEX TYPE I.
FIELD-SYMBOLS <VAL> TYPE ANY.
CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
EXPORTING
FILENAME = P_FILE
I_BEGIN_COL = 1
I_BEGIN_ROW = 8
I_END_COL = 18
I_END_ROW = 94
TABLES
INTERN = GT_INTERN_DATA
EXCEPTIONS
INCONSISTENT_PARAMETERS = 1
UPLOAD_OLE = 2
OTHERS = 3.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
IF NOT GT_INTERN_DATA[] IS INITIAL.
SORT GT_INTERN_DATA BY ROW COL.
LOOP AT GT_INTERN_DATA.
MOVE GT_INTERN_DATA-COL TO LV_INDEX.
ASSIGN COMPONENT LV_INDEX OF STRUCTURE GWA_MANUAL_DATA TO <VAL>.
MOVE GT_INTERN_DATA-VALUE TO <VAL>.
AT END OF ROW.
APPEND GWA_MANUAL_DATA TO GT_MANUAL_DATA.
CLEAR GWA_MANUAL_DATA.
ENDAT.
ENDLOOP.
ENDIF.
ENDFORM. " upload_excel_file
*& Form MRU_DATA_GET
* text
* <--P_SO_MRU_LOW text
FORM MRU_DATA_GET CHANGING LV_MRU.
**to get the search help for Group
DATA : LV_LINES TYPE SY-TFILL,
LT_RETURN_TAB LIKE DDSHRETVAL OCCURS 0,
LWA_RETURN_TAB LIKE DDSHRETVAL.
* lv_mru_p(3) type c.
RANGES R_MRU FOR EANLH-ABLEINH.
**according to the Business area entered
** break csebdev1.
REFRESH R_MRU[].
DESCRIBE TABLE SO_GSBER LINES LV_LINES.
** when user has neither pressed the enter key nor selected the values
**using search help
IF LV_LINES = 0.
CLEAR R_MRU[].
DYFIELDS-FIELDNAME = 'SO_GSBER-LOW'.
APPEND DYFIELDS.
CALL FUNCTION 'DYNP_VALUES_READ'
EXPORTING
DYNAME = SY-CPROG
DYNUMB = SY-DYNNR
TABLES
DYNPFIELDS = DYFIELDS.
CONCATENATE DYFIELDS-FIELDVALUE+0(2) '*' INTO R_MRU-LOW.
DYFIELDS-FIELDNAME = 'SO_GSBER-HIGH'.
APPEND DYFIELDS.
CALL FUNCTION 'DYNP_VALUES_READ'
EXPORTING
DYNAME = SY-CPROG
DYNUMB = SY-DYNNR
TABLES
DYNPFIELDS = DYFIELDS.
IF NOT DYFIELDS-FIELDVALUE IS INITIAL.
CONCATENATE DYFIELDS-FIELDVALUE+0(2) '99999999' INTO R_MRU-HIGH.
ENDIF.
IF NOT R_MRU-HIGH IS INITIAL.
R_MRU-SIGN = 'I'.
R_MRU-OPTION = 'BT'.
ELSEIF R_MRU-HIGH IS INITIAL.
R_MRU-SIGN = 'I'.
R_MRU-OPTION = 'CP'.
ENDIF.
APPEND R_MRU.
ENDIF. " end lv_lines =0
IF LV_LINES > 0.
IF NOT SO_GSBER[] IS INITIAL.
LOOP AT SO_GSBER.
CONCATENATE SO_GSBER-LOW+0(2) '*' INTO R_MRU-LOW.
IF NOT SO_GSBER-HIGH IS INITIAL.
CONCATENATE SO_GSBER-HIGH+0(2) '99999999' INTO R_MRU-HIGH.
ENDIF.
IF R_MRU-HIGH IS INITIAL.
R_MRU-SIGN = 'I'.
R_MRU-OPTION = 'CP'.
ELSEIF NOT R_MRU-HIGH IS INITIAL.
R_MRU-SIGN = 'I'.
R_MRU-OPTION = 'BT'.
ENDIF.
APPEND R_MRU.
ENDLOOP.
ENDIF. " end so_GSBER[]
ENDIF. " end lv_lines > 0
IF NOT R_MRU[] IS INITIAL.
* break csebdev1.
REFRESH GT_TE422[].
SELECT TERMSCHL TERMTEXT FROM TE422 INTO CORRESPONDING FIELDS OF
TABLE GT_TE422 WHERE TERMSCHL IN R_MRU .
ENDIF.
**now call the search help
CALL FUNCTION 'F4IF_INT_TABLE_VALUE_REQUEST'
EXPORTING
* DDIC_STRUCTURE = ' '
RETFIELD = 'TERMSCHL'
* PVALKEY = ' '
* DYNPPROG = ' '
* DYNPNR = ' '
* DYNPROFIELD = ' '
* STEPL = 0
WINDOW_TITLE = 'Group'
VALUE = DYFIELDS-FIELDVALUE
VALUE_ORG = 'S'
* MULTIPLE_CHOICE = ' '
* DISPLAY = ' '
* CALLBACK_PROGRAM = ' '
* CALLBACK_FORM = ' '
* MARK_TAB =
* IMPORTING
* USER_RESET =
TABLES
VALUE_TAB = GT_TE422
* FIELD_TAB =
RETURN_TAB = LT_RETURN_TAB.
* DYNPFLD_MAPPING =
* EXCEPTIONS
* PARAMETER_ERROR = 1
* NO_VALUES_FOUND = 2
* OTHERS = 3
IF SY-SUBRC <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
* WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
READ TABLE LT_RETURN_TAB INTO LWA_RETURN_TAB INDEX 1.
LV_MRU = LWA_RETURN_TAB-FIELDVAL.
ENDFORM. " MRU_DATA_GET
b) ZISU_LTR15_TUNE - Actual R15 report
*& Report ZISU_LTR15_TUNE *
REPORT ZISU_LTR15_TUNE NO STANDARD PAGE HEADING MESSAGE-ID ZISU
LINE-SIZE 250 LINE-COUNT 65.
**Tables
TABLES : EVER,
FKKVKP,
EANLH.
SELECTION-SCREEN BEGIN OF BLOCK SELECTION
WITH FRAME TITLE TEXT-001 . "no intervals.
SELECT-OPTIONS: SO_GSBER FOR EVER-GSBER OBLIGATORY,
SO_COKEY FOR EVER-COKEY,
SO_MRU FOR EANLH-ABLEINH.
PARAMETERS: P_BMNTH(7) TYPE C OBLIGATORY.
PARAMETERS: P_CONT TYPE EVER-VERTRAG.
SELECT-OPTIONS: SO_GPART FOR FKKVKP-GPART.
SELECTION-SCREEN END OF BLOCK SELECTION.
SELECTION-SCREEN BEGIN OF BLOCK PROCESS WITH FRAME.
PARAMETERS: P_COMPUT RADIOBUTTON GROUP R15.
PARAMETERS: P_INCLUD RADIOBUTTON GROUP R15,
P_FILE(60) TYPE C.
SELECTION-SCREEN END OF BLOCK PROCESS.
* Start of Type declaration
TYPES: BEGIN OF ST_TB024 ,
IND_SECTOR TYPE TB024-IND_SECTOR,
TEXTLONG TYPE TB024-TEXTLONG,
END OF ST_TB024,
BEGIN OF ST_TECOKT,
COKEY TYPE TECOKT-COKEY,
LTEXT TYPE TECOKT-LTEXT,
END OF ST_TECOKT,
BEGIN OF ST_TGSBT,
GSBER TYPE TGSBT-GSBER,
GTEXT TYPE TGSBT-GTEXT,
END OF ST_TGSBT,
BEGIN OF ST_TE422,
TERMSCHL TYPE TE422-TERMSCHL,
TERMTEXT TYPE TE422-TERMTEXT,
END OF ST_TE422.
* Start of data declaration
DATA: IT_TAB TYPE FILETABLE,
GD_SUBRC TYPE I.
DATA : GT_TB024 TYPE STANDARD TABLE OF ST_TB024 WITH HEADER LINE
INITIAL SIZE 0,
GT_TECOKT TYPE STANDARD TABLE OF ST_TECOKT
WITH HEADER LINE,
GT_TGSBT TYPE SORTED TABLE OF ST_TGSBT
WITH UNIQUE KEY GSBER WITH HEADER LINE,
GT_TE422 TYPE TABLE OF ST_TE422 WITH HEADER LINE INITIAL SIZE 0,
BEGIN OF GWA_EVER,
VKONT TYPE EVER-VKONTO,
ANLAGE TYPE EVER-ANLAGE,
ABRSPERR TYPE EVER-ABRSPERR,
END OF GWA_EVER,
GT_EVER LIKE TABLE OF GWA_EVER INITIAL SIZE 0,
BEGIN OF GWA_EANLH,
ANLAGE TYPE EANLH-ANLAGE,
BRANCHE TYPE EANLH-BRANCHE,
END OF GWA_EANLH,
GT_EANLH LIKE TABLE OF GWA_EANLH INITIAL SIZE 0,
BEGIN OF GWA_FKKVKP,
VKONT TYPE FKKVKP-VKONT,
KTOKL TYPE FKKVKP-KTOKL,
END OF GWA_FKKVKP,
GT_FKKVKP LIKE TABLE OF GWA_FKKVKP INITIAL SIZE 0,
BEGIN OF GWA_EVER_EANLH_FKKVKP,
VKONT TYPE FKKVKP-VKONT,
ANLAGE TYPE EVER-ANLAGE,
ABRSPERR TYPE EVER-ABRSPERR,
BRANCHE TYPE EANLH-BRANCHE,
KTOKL TYPE FKKVKP-KTOKL,
END OF GWA_EVER_EANLH_FKKVKP,
GT_EVER_EANLH_FKKVKP LIKE TABLE OF GWA_EVER_EANLH_FKKVKP
INITIAL SIZE 0,
GT_EVER_EANLH_FKKVKP_INACT LIKE STANDARD TABLE OF
GWA_EVER_EANLH_FKKVKP INITIAL SIZE 0,
BEGIN OF GWA_ERCH,
BELNR TYPE ERCH-BELNR,
VKONT TYPE ERCH-VKONT,
END OF GWA_ERCH,
GT_ERCH LIKE TABLE OF GWA_ERCH INITIAL SIZE 0,
BEGIN OF GWA_PRINTDOC,
OPBEL TYPE ERDK-OPBEL,
GPART TYPE ERDK-PARTNER,
VKONT TYPE ERDK-VKONT,
BUDAT TYPE ERDK-BUDAT,
FAEDN TYPE ERDK-FAEDN,
END OF GWA_PRINTDOC,
GT_PRINTDOC LIKE TABLE OF GWA_PRINTDOC INITIAL SIZE 0,
BEGIN OF GWA_DBERCHZ1,
BELNR TYPE DBERCHZ1-BELNR,
EIN01 TYPE DBERCHZ1-EIN01,
V_ABRMENGE TYPE DBERCHZ1-V_ABRMENGE,
END OF GWA_DBERCHZ1,
GT_DBERCHZ1 LIKE TABLE OF GWA_DBERCHZ1 INITIAL SIZE 0,
BEGIN OF GWA_ERCHC,
BELNR TYPE ERCHC-BELNR,
OPBEL TYPE ERCHC-OPBEL,
BUDAT TYPE ERCHC-BUDAT,
END OF GWA_ERCHC,
GT_ERCHC LIKE TABLE OF GWA_ERCHC INITIAL SIZE 0,
* arrears for inactive consumers.
BEGIN OF GWA_DFKKOP_INACTIVE_ARR,
OPBEL TYPE DFKKOP-OPBEL,
VKONT TYPE DFKKOP-VKONT,
BETRH TYPE DFKKOP-BETRH,
END OF GWA_DFKKOP_INACTIVE_ARR,
GT_DFKKOP_INACTIVE_ARR LIKE TABLE OF GWA_DFKKOP_INACTIVE_ARR
INITIAL SIZE 0,
BEGIN OF GWA_ETTIFN,
ANLAGE TYPE ETTIFN-ANLAGE,
OPERAND TYPE ETTIFN-OPERAND,
WERT1 TYPE ETTIFN-WERT1,
END OF GWA_ETTIFN,
GT_ETTIFN LIKE SORTED TABLE OF GWA_ETTIFN
WITH NON-UNIQUE KEY ANLAGE OPERAND INITIAL SIZE 0,
BEGIN OF GWA_DBERDLB,
PRINTDOC TYPE DBERDLB-PRINTDOC,
BILLDOC TYPE DBERDLB-BILLDOC,
BILLDOCLINE TYPE DBERDLB-BILLDOCLINE,
NETTOBTR TYPE DBERDLB-NETTOBTR,
END OF GWA_DBERDLB,
GT_DBERDLB LIKE TABLE OF GWA_DBERDLB INITIAL SIZE 0,
BEGIN OF GWA_DBERCHZ1_BILL,
BELNR TYPE DBERCHZ1-BELNR,
BELZEILE TYPE DBERCHZ1-BELZEILE,
TVORG TYPE DBERCHZ1-TVORG,
END OF GWA_DBERCHZ1_BILL,
GT_DBERCHZ1_BILL LIKE TABLE OF GWA_DBERCHZ1_BILL INITIAL SIZE 0,
BEGIN OF GWA_BILLDOC,
PRINTDOC TYPE ERDK-OPBEL,
BILLDOC TYPE DBERDLB-BILLDOC,
END OF GWA_BILLDOC,
GT_BILLDOC LIKE TABLE OF GWA_BILLDOC INITIAL SIZE 0,
BEGIN OF GWA_DBERCHV,
BELNR TYPE DBERCHV-BELNR,
OPERAND TYPE DBERCHV-OPERAND,
EZ_ABRMENGE TYPE DBERCHV-EZ_ABRMENGE,
ABLESGR TYPE DBERCHV-ABLESGR,
ABLESGRV TYPE DBERCHV-ABLESGRV,
END OF GWA_DBERCHV,
GT_DBERCHV LIKE TABLE OF GWA_DBERCHV INITIAL SIZE 0,
BEGIN OF GWA_DFKKOP_ARREAR,
OPBEL TYPE DFKKOP-OPBEL,
VKONT TYPE DFKKOP-VKONT,
HVORG TYPE DFKKOP-HVORG,
TVORG TYPE DFKKOP-TVORG,
BUDAT TYPE DFKKOP-BUDAT,
BETRH TYPE DFKKOP-BETRH,
AUGDT TYPE DFKKOP-AUGDT,
XBLNR TYPE DFKKOP-XBLNR,
END OF GWA_DFKKOP_ARREAR,
GT_DFKKOP_ARREAR LIKE TABLE OF GWA_DFKKOP_ARREAR INITIAL SIZE 0,
BEGIN OF GWA_MASTER_DATA,
SLNO(4) TYPE C,
IND_SECTOR TYPE TB024-IND_SECTOR,
TEXTLONG TYPE TB024-TEXTLONG,
AC_CODE TYPE TFK033D-FUN01,
END OF GWA_MASTER_DATA,
GT_MASTER_DATA LIKE TABLE OF GWA_MASTER_DATA INITIAL SIZE 0,
** internal table for final prepared data
BEGIN OF GWA_FINAL_DATA,
SLNO(4) TYPE C,
IND_SECTOR TYPE TB024-IND_SECTOR, " for testing
INDUSTRY TYPE TB024-TEXTLONG,
AC_CODE TYPE TFK033D-FUN01,
* *for urban partners
UPARTNER TYPE I , "(6) type n,
UBAD_METER TYPE I, "(4) type n,
UINACTIVE TYPE I,
ULOAD(8) TYPE P DECIMALS 2,
UUNITS(8) TYPE P DECIMALS 2,
UDEMANDS(8) TYPE P DECIMALS 2,
UARREARS(8) TYPE P DECIMALS 2, "FKKMAKO-Msalm,
UINACT_ARR(8) TYPE P DECIMALS 2,
BLANK(10) TYPE C,
* *for rural partners
RPARTNER TYPE I , "(6) type n,
RBAD_METER TYPE I , "(4) type n,
RINACTIVE TYPE I,
RLOAD(8) TYPE P DECIMALS 2,
RUNITS(8) TYPE P DECIMALS 2,
RDEMANDS(8) TYPE P DECIMALS 2,
RARREARS(8) TYPE P DECIMALS 2,
RINACT_ARR(8) TYPE P DECIMALS 2,
END OF GWA_FINAL_DATA,
GT_FINAL_DATA LIKE TABLE OF GWA_FINAL_DATA INITIAL SIZE 0,
BEGIN OF GWA_MANUAL_DATA,
COL_1(5) TYPE C,
COL_2(10) TYPE C,
COL_3(40) TYPE C,
COL_4(40) TYPE C,
COL_5(10) TYPE C,
COL_6(19) TYPE C,
COL_7(19) TYPE C,
COL_8(19) TYPE C,
COL_9(19) TYPE C,
COL_10(19) TYPE C,
COL_11(19) TYPE C,
COL_12(10) TYPE C,
COL_13(19) TYPE C,
COL_14(19) TYPE C,
COL_15(19) TYPE C,
COL_16(19) TYPE C,
COL_17(19) TYPE C,
COL_18(19) TYPE C,
END OF GWA_MANUAL_DATA,
GT_MANUAL_DATA LIKE TABLE OF GWA_MANUAL_DATA INITIAL SIZE 0.
** Variables for grand total of all heads.
DATA : GV_T_UPART TYPE I,
GV_T_RPART TYPE I,
GV_T_RINACTIVE TYPE I,
GV_T_UINACTIVE TYPE I,
GV_UDEF_MTR TYPE I,
GV_RDEF_MTR TYPE I,
GV_UCON_LOAD(16) TYPE P DECIMALS 2,
GV_RCON_LOAD(16) TYPE P DECIMALS 2,
GV_UUNITS(16) TYPE P DECIMALS 2,
GV_RUNITS(16) TYPE P DECIMALS 2,
GV_UDEMAND(16) TYPE P DECIMALS 2,
GV_RDEMAND(16) TYPE P DECIMALS 2,
GV_UARREAR(16) TYPE P DECIMALS 2,
GV_RARREAR(16) TYPE P DECIMALS 2,
GV_UINACT_ARR(16) TYPE P DECIMALS 2,
GV_RINACT_ARR(16) TYPE P DECIMALS 2,
GV_U_SD_DMD(16) TYPE P DECIMALS 2,
GV_R_SD_DMD(16) TYPE P DECIMALS 2,
GV_U_SD_ARR(16) TYPE P DECIMALS 2,
GV_R_SD_ARR(16) TYPE P DECIMALS 2,
GV_UR_PART TYPE I,
GV_UR_DEF_MTR TYPE I,
GV_UR_CON_LOAD(16) TYPE P DECIMALS 2,
GV_UR_UNITS(16) TYPE P DECIMALS 2,
GV_UR_DEMAND(16) TYPE P DECIMALS 2,
GV_UR_ARREARS(16) TYPE P DECIMALS 2,
GV_LOWDATE TYPE SY-DATUM,
GV_HIGHDATE TYPE SY-DATUM,
GV_YEAR LIKE DBERCHV-EZ_ABRMENGE,
GV_MONTH LIKE DBERCHZ1-V_ABRMENGE,
GV_MNTH_NAME TYPE T247-KTX.
DATA:
**total meter rent
GV_UTOTMETERRENT TYPE P LENGTH 16 DECIMALS 2,
GV_RTOTMETERRENT TYPE P LENGTH 16 DECIMALS 2,
**for total surcharge
GV_UTOTSURCHRG TYPE P LENGTH 16 DECIMALS 2, "dberdlb-nettobtr,
GV_RTOTSURCHRG TYPE P LENGTH 16 DECIMALS 2, "dberdlb-nettobtr,
**for total ED
GV_UTOTED TYPE P LENGTH 16 DECIMALS 2, "dberdlb-nettobtr,
GV_RTOTED TYPE P LENGTH 16 DECIMALS 2, "dberdlb-nettobtr,
**for total ED Cess
GV_UTOTEDCESS TYPE P LENGTH 16 DECIMALS 2, "dberdlb-nettobtr,
GV_RTOTEDCESS TYPE P LENGTH 16 DECIMALS 2, "dberdlb-nettobtr,
**for total other misc rev
GV_UTOTMISCREV TYPE P LENGTH 16 DECIMALS 2,
GV_RTOTMISCREV TYPE P LENGTH 16 DECIMALS 2,
**for total ED of Free agricultural pump
GV_RTOTEDAGRI TYPE P LENGTH 16 DECIMALS 2,
GV_UTOTEDAGRI TYPE P LENGTH 16 DECIMALS 2,
**for ED cess of Free agriculture pump
GV_RTOTCESSAGRI TYPE P LENGTH 16 DECIMALS 2,
GV_UTOTCESSAGRI TYPE P LENGTH 16 DECIMALS 2,
***data for selection screen data validation
GV_GSBER TYPE EVER-GSBER,
GV_COKEY TYPE EVER-COKEY,
DYFIELDS LIKE DYNPREAD OCCURS 0 WITH HEADER LINE,
GT_INTERN_DATA TYPE ALSMEX_TABLINE OCCURS 0 WITH HEADER LINE.
RANGES: R_COKEY FOR TECOKT-COKEY.
AT SELECTION-SCREEN.
AT SELECTION-SCREEN ON SO_GSBER.
IF NOT SO_GSBER-LOW IS INITIAL.
SELECT SINGLE GSBER FROM TGSBT INTO GT_TGSBT-GSBER WHERE
GSBER = SO_GSBER-LOW.
IF SY-SUBRC <> 0.
MESSAGE E007.
ENDIF.
ENDIF.
IF NOT SO_GSBER-HIGH IS INITIAL .
SELECT SINGLE GSBER FROM TGSBT INTO GT_TGSBT-GSBER WHERE
GSBER = SO_GSBER-HIGH.
IF SY-SUBRC <> 0.
MESSAGE E007.
ENDIF.
ENDIF.
AT SELECTION-SCREEN ON P_BMNTH.
**check the validity of Billing month format and for valid billing month
IF P_BMNTH CO '0123456789/'.
IF P_BMNTH CP '++++/++'.
IF P_BMNTH+5(2) > 12.
MESSAGE E002 WITH 'from'.
ENDIF.
** current year is less than year entered
IF SY-DATUM(4) < P_BMNTH(4).
MESSAGE E003 WITH 'from'.
ELSEIF SY-DATUM(4) = P_BMNTH(4).
** month in future
IF SY-DATUM+4(2) < P_BMNTH+5(2).
MESSAGE E004 WITH 'from'.
ENDIF.
ENDIF.
ELSE.
MESSAGE E001 WITH 'from'.
ENDIF.
ELSE.
** entry have some invalid char
MESSAGE E010.
ENDIF.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR SO_COKEY-LOW.
**for geting the CO account assigment key of the entered Business area
CLEAR: DYFIELDS[], DYFIELDS.
PERFORM DIVISON_DATA_GET CHANGING SO_COKEY-LOW.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR SO_COKEY-HIGH.
CLEAR: DYFIELDS[], DYFIELDS.
PERFORM DIVISON_DATA_GET CHANGING SO_COKEY-HIGH.
*AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
* REFRESH: IT_TAB.
***Opens File Open Dialog Box for selecting input file.
* CALL METHOD CL_GUI_FRONTEND_SERVICES=>FILE_OPEN_DIALOG
* EXPORTING
* WINDOW_TITLE = 'Select File'
* DEFAULT_FILENAME = '*.xls'
* MULTISELECTION = ' '
* CHANGING
* FILE_TABLE = IT_TAB
* RC = GD_SUBRC.
* LOOP AT IT_TAB INTO P_FILE.
** so_fpath-sign = 'I'.
** so_fpath-option = 'EQ'.
** append so_fpath.
* ENDLOOP.
AT SELECTION-SCREEN OUTPUT.
LOOP AT SCREEN.
IF SCREEN-GROUP1 = 'ACT'.
IF P_INCLUD = ' '.
SCREEN-INPUT = '0'.
ELSE.
SCREEN-INPUT = '1'.
ENDIF.
ENDIF.
MODIFY SCREEN.
ENDLOOP.
* End of Selection screen processing
TOP-OF-PAGE.
DATA : LV_TEXT(70) TYPE C.
FORMAT INTENSIFIED OFF.
WRITE:/50 'CHHATTISGARH STATE ELECTRICITY BOARD',
/42 'R-15 REPORT FOR LT CONSUMERS FOR THE MONTH OF',
GV_MNTH_NAME NO-GAP,'-' NO-GAP, P_BMNTH(4) NO-GAP,
180 'PAGE-NO. -', SY-PAGNO LEFT-JUSTIFIED.
SKIP.
WRITE:/2 'RAO', 15 ':'.
LOOP AT GT_TGSBT WHERE GSBER IN SO_GSBER.
IF STRLEN( LV_TEXT ) > 60.
WRITE: 16 LV_TEXT, / ''.
CLEAR LV_TEXT.
ENDIF.
CONCATENATE LV_TEXT GT_TGSBT-GSBER '-' GT_TGSBT-GTEXT
INTO LV_TEXT SEPARATED BY SPACE.
ENDLOOP.
WRITE: 16 LV_TEXT.
CLEAR LV_TEXT.
WRITE: /2 'Division',15 ':'.
LOOP AT GT_TECOKT.
IF STRLEN( LV_TEXT ) > 60.
WRITE: 16 LV_TEXT, /.
CLEAR LV_TEXT.
ENDIF.
CONCATENATE LV_TEXT GT_TECOKT-COKEY '-' GT_TECOKT-LTEXT
INTO LV_TEXT SEPARATED BY SPACE.
ENDLOOP.
WRITE: 16 LV_TEXT.
CLEAR LV_TEXT.
WRITE: /2 'Group',15 ':'.
LOOP AT GT_TE422 .
IF STRLEN( LV_TEXT ) > 60.
WRITE: 16 LV_TEXT, /.
CLEAR LV_TEXT.
ENDIF.
CONCATENATE LV_TEXT GT_TE422-TERMSCHL '-' GT_TE422-TERMTEXT
INTO LV_TEXT SEPARATED BY SPACE.
ENDLOOP.
WRITE: 16 LV_TEXT.
CLEAR LV_TEXT.
**legends
* write :/ text-028, 15 ':',16 text-029.
**now write the headings on every page
SET LEFT SCROLL-BOUNDARY COLUMN 50.
FORMAT COLOR 1 ON.
WRITE :/1(244) SY-ULINE.
WRITE :/1 SY-VLINE,6 SY-VLINE,50 SY-VLINE, 58 SY-VLINE, 59
'<----------------------------------',
'U R B A N A R E A',
'----------------------------------->'.
WRITE : 151 SY-VLINE, 152
'<----------------------------------',
'R U R A L A R E A',
'----------------------------------->',
244 SY-VLINE.
WRITE :/1 SY-VLINE,
2 'Slno',
6 SY-VLINE,
7 'Revenue Category',
50 SY-VLINE,
51 'A/C',
58 SY-VLINE,
59 'Cons-',
67 SY-VLINE,
68 'Deff',
75 SY-VLINE,
76 'Conn',
89 SY-VLINE,
90 'Sold',
108 SY-VLINE,
109 'Demand',
129 SY-VLINE,
130 'Previous',
151 SY-VLINE,
152 'Cons-',
160 SY-VLINE,
161 'Deff',
168 SY-VLINE,
169 'Conn',
182 SY-VLINE,
183 'Sold',
201 SY-VLINE,
202 'Demand',
222 SY-VLINE,
223 'Previous',
244 SY-VLINE.
WRITE :/1 SY-VLINE,
6 SY-VLINE,
50 SY-VLINE,
51 'Code',
58 SY-VLINE,
59 'umers',
67 SY-VLINE,
68 'mtrs',
75 SY-VLINE,
76 'Load-KW',
89 SY-VLINE,
90 'Units',
108 SY-VLINE,
129 SY-VLINE,
130 'Arrear',
151 SY-VLINE,
152 'umers',
160 SY-VLINE,
161 'mtrs',
168 SY-VLINE,
169 'Load-KW',
182 SY-VLINE,
183 'Units',
201 SY-VLINE,
222 SY-VLINE,
223 'Arrear',
244 SY-VLINE.
WRITE :/1(244) SY-ULINE.
SET LEFT SCROLL-BOUNDARY COLUMN 59.
* Start of Data Selection
START-OF-SELECTION.
SELECT GSBER GTEXT INTO TABLE GT_TGSBT FROM TGSBT
WHERE SPRAS = SY-LANGU.
SELECT IND_SECTOR TEXTLONG INTO TABLE GT_TB024
FROM TB024 WHERE SPRAS = SY-LANGU
AND ( ( IND_SECTOR >= '01' AND IND_SECTOR <= '55' )
OR ( IND_SECTOR = '57' OR IND_SECTOR = '58'
OR IND_SECTOR = '94' ) ).
SELECT SINGLE KTX INTO GV_MNTH_NAME FROM T247
WHERE MNR = P_BMNTH+5(2) AND SPRAS = SY-LANGU.
***master data selection
PERFORM CONSUMER_DATA_SELECTION.
*&--Get Meter Status Connected Load & MF for each installation
PERFORM OPERAND_DATA_SELECTION.
*&--Get the Demand corresponding to each Print documents selected
PERFORM BILLING_DATA_SELECTION.
*&--Get the arrears corresponding to each Print document selected
PERFORM ARREAR_DATA_SELECTION.
*&--To include the manual R-15 in the computerized R-15.
IF P_INCLUD = 'X'.
OPEN DATASET P_FILE FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
READ DATASET P_FILE INTO GWA_MANUAL_DATA.
IF SY-SUBRC <> 0.
EXIT.
ELSE.
APPEND GWA_MANUAL_DATA TO GT_MANUAL_DATA.
ENDIF.
ENDDO.
CLOSE DATASET P_FILE.
DELETE DATASET P_FILE.
ENDIF.
END-OF-SELECTION.
* End of data Selection
* Start of Data Processing
PERFORM FINAL_TABLE_PREPARE.
PERFORM FINAL_OUTPUT_PREPARE.
PERFORM FINAL_OUTPUT_DISPLAY.
* End of Data Processing
PERFORM FREE_MEMORY.
*& Form master_data_selection
* text
* --> p1 text
* <-- p2 text
FORM CONSUMER_DATA_SELECTION .
DATA: LV_MAX_DAYS TYPE I,
LV_LAST_DAY(2) TYPE C,
LV_IMONTH TYPE I,
LV_IYEAR TYPE I,
LV_BMNTH TYPE ZERDK_ERCHC-V_ABRMENGE,
LV_BYEAR TYPE ZERDK_ERCHC-EZ_ABRMENGE,
*&----Temporary tables for global internal tables.
LT_DBERCHZ1_TEMP LIKE TABLE OF GWA_DBERCHZ1,
LWA_DBERCHZ1_TEMP LIKE GWA_DBERCHZ1,
LT_ERCH LIKE TABLE OF GWA_ERCH,
LWA_ERCH LIKE GWA_ERCH,
LT_EVER LIKE TABLE OF GWA_EVER.
*DATA: lt_ever_fkkvkp like gwa_ever_eanlh_fkkvkp occurs 0.
LV_IMONTH = P_BMNTH+5(2).
LV_IYEAR = P_BMNTH(4).
LV_BMNTH = P_BMNTH+5(2).
LV_BYEAR = P_BMNTH(4).
CALL FUNCTION 'RTP_US_API_MAX_DAYS_IN_MONTH'
EXPORTING
I_DATE_MONTH = LV_IMONTH
I_DATE_YEAR = LV_IYEAR
IMPORTING
E_MAX_DAYS = LV_MAX_DAYS.
LV_LAST_DAY = LV_MAX_DAYS.
CONCATENATE P_BMNTH(4) P_BMNTH+5(2) LV_LAST_DAY INTO GV_HIGHDATE.
IF SO_GPART[] IS INITIAL.
SELECT VKONTO ANLAGE ABRSPERR
INTO TABLE GT_EVER
FROM EVER
WHERE GSBER IN SO_GSBER
AND COKEY IN SO_COKEY
AND BUKRS = 'CSEB'
AND SPARTE = '01'
AND KOFIZ = '02'.
**get installation no from contract data and check billing class in eanlh
IF GT_EVER[] IS INITIAL.
MESSAGE 'No Business Partners exist for the entered selection data' TYPE 'A'.
ENDIF.
IF NOT GT_EVER[] IS INITIAL.
SELECT VKONT KTOKL INTO TABLE GT_FKKVKP
FROM FKKVKP
FOR ALL ENTRIES IN GT_EVER
WHERE VKONT = GT_EVER-VKONT
AND GPART IN SO_GPART.
IF GT_FKKVKP[] IS INITIAL.
MESSAGE 'No Business Partners exist for the entered selection' TYPE 'A'.
ENDIF.
ENDIF.
ELSE.
SELECT VKONT KTOKL INTO TABLE GT_FKKVKP
FROM FKKVKP
WHERE GPART IN SO_GPART.
IF NOT GT_FKKVKP[] IS INITIAL.
SELECT VKONTO ANLAGE INTO TABLE GT_EVER
FROM EVER
FOR ALL ENTRIES IN GT_FKKVKP
WHERE VKONTO EQ GT_FKKVKP-VKONT
AND GSBER IN SO_GSBER
AND COKEY IN SO_COKEY
AND BUKRS EQ 'CSEB'
AND SPARTE EQ '01'
AND KOFIZ EQ '02'.
ENDIF.
ENDIF.
IF GT_EVER[] IS INITIAL AND GT_FKKVKP[] IS INITIAL.
MESSAGE 'No Business Partners exist for the entered selection' TYPE 'A'.
ENDIF.
SELECT ANLAGE BRANCHE INTO TABLE GT_EANLH
FROM EANLH
FOR ALL ENTRIES IN GT_EVER
WHERE ANLAGE EQ GT_EVER-ANLAGE
AND ABLEINH IN SO_MRU
AND AKLASSE EQ'0002'
AND AB <= GV_HIGHDATE
AND BIS => GV_HIGHDATE.
IF GT_EANLH[] IS INITIAL.
MESSAGE 'No Business Partners exist for the entered selection' TYPE 'A'.
ENDIF.
SORT : GT_EVER BY VKONT ANLAGE,
GT_FKKVKP BY VKONT,
GT_EANLH BY ANLAGE.
LOOP AT GT_EANLH INTO GWA_EANLH.
READ TABLE GT_EVER INTO GWA_EVER WITH KEY ANLAGE = GWA_EANLH-ANLAGE.
IF SY-SUBRC = 0.
READ TABLE GT_FKKVKP INTO GWA_FKKVKP
WITH KEY VKONT = GWA_EVER-VKONT.
IF SY-SUBRC = 0.
GWA_EVER_EANLH_FKKVKP-VKONT = GWA_EVER-VKONT.
GWA_EVER_EANLH_FKKVKP-ANLAGE = GWA_EVER-ANLAGE.
GWA_EVER_EANLH_FKKVKP-ABRSPERR = GWA_EVER-ABRSPERR.
GWA_EVER_EANLH_FKKVKP-BRANCHE = GWA_EANLH-BRANCHE.
GWA_EVER_EANLH_FKKVKP-KTOKL = GWA_FKKVKP-KTOKL.
APPEND GWA_EVER_EANLH_FKKVKP TO GT_EVER_EANLH_FKKVKP.
CLEAR GWA_EVER_EANLH_FKKVKP.
ENDIF.
ENDIF.
ENDLOOP.
IF GT_EVER_EANLH_FKKVKP[] IS INITIAL.
MESSAGE 'No Business Partners exist for the selection data' TYPE 'A'.
ENDIF.
*&----get all the bill documents for the business partner's contract
*&----account
SELECT BELNR VKONT INTO TABLE GT_ERCH
FROM ERCH
FOR ALL ENTRIES IN GT_EVER_EANLH_FKKVKP
WHERE VKONT EQ GT_EVER_EANLH_FKKVKP-VKONT.
*&----get the BILL MONTH & BILL YEAR FOR THE BILLDOCUMENTS.
IF NOT GT_ERCH[] IS INITIAL.
SELECT BELNR EIN01 V_ABRMENGE INTO TABLE GT_DBERCHZ1
FROM DBERCHZ1
FOR ALL ENTRIES IN GT_ERCH
WHERE BELNR EQ GT_ERCH-BELNR
AND AKLASSE = '0002'
AND ( ( EIN01 = 'BILL_MNTH1'
AND V_ABRMENGE = P_BMNTH+5(2) )
OR ( EIN01 = 'BILL_YEAR1'
AND V_ABRMENGE = P_BMNTH(4) ) ).
*&---- GET THOSE BILL DOCUMENTS WHICH ARE FOR THE ENTERED BILL MONTH
IF NOT GT_DBERCHZ1[] IS INITIAL.
LOOP AT GT_DBERCHZ1 INTO GWA_DBERCHZ1 WHERE EIN01 = 'BILL_MNTH1'
AND V_ABRMENGE = P_BMNTH+5(2).
READ TABLE GT_DBERCHZ1 INTO LWA_DBERCHZ1_TEMP
WITH KEY BELNR = GWA_DBERCHZ1-BELNR EIN01 = 'BILL_YEAR1'
V_ABRMENGE = P_BMNTH(4).
IF SY-SUBRC = 0.
APPEND LWA_DBERCHZ1_TEMP TO LT_DBERCHZ1_TEMP.
CLEAR: LWA_DBERCHZ1_TEMP.
ENDIF.
ENDLOOP.
GT_DBERCHZ1[] = LT_DBERCHZ1_TEMP[].
ENDIF.
ENDIF.
*&---NOW FIND THE PRINT DOCUMENTS FOR THE SELECTED BILLDOCUMENTS.
IF NOT GT_DBERCHZ1[] IS INITIAL.
SELECT BELNR OPBEL BUDAT FROM ERCHC
INTO TABLE GT_ERCHC
FOR ALL ENTRIES IN GT_DBERCHZ1
WHERE BELNR = GT_DBERCHZ1-BELNR
AND INTOPBEL EQ SPACE
AND SIMULATED EQ SPACE
AND INVOICED EQ 'X'.
ENDIF.
IF NOT GT_ERCHC[] IS INITIAL.
LOOP AT GT_ERCHC INTO GWA_ERCHC.
READ TABLE GT_ERCH INTO GWA_ERCH WITH KEY BELNR = GWA_ERCHC-BELNR.
IF SY-SUBRC = 0.
APPEND GWA_ERCH TO LT_ERCH.
CLEAR GWA_ERCH.
ENDIF.
CLEAR GWA_ERCHC.
ENDLOOP.
ENDIF.
GT_ERCH[] = LT_ERCH[].
FREE: LT_ERCH, LT_DBERCHZ1_TEMP,LWA_DBERCHZ1_TEMP.
LOOP AT GT_EVER_EANLH_FKKVKP INTO GWA_EVER_EANLH_FKKVKP.
* READ TABLE gt_erdk_erchc INTO gwa_erdk_erchc
* WITH KEY vkont = gwa_ever_eanlh_fkkvkp-vkont.
READ TABLE GT_ERCH INTO GWA_ERCH
WITH KEY VKONT = GWA_EVER_EANLH_FKKVKP-VKONT.
IF SY-SUBRC <> 0.
IF GWA_EVER_EANLH_FKKVKP-ABRSPERR = SPACE.
GWA_EVER_EANLH_FKKVKP-ABRSPERR = '01'.
MODIFY GT_EVER_EANLH_FKKVKP FROM GWA_EVER_EANLH_FKKVKP
TRANSPORTING ABRSPERR.
APPEND GWA_EVER_EANLH_FKKVKP TO GT_EVER_EANLH_FKKVKP_INACT.
ENDIF.
ELSE.
IF GWA_EVER_EANLH_FKKVKP-ABRSPERR <> SPACE.
GWA_EVER_EANLH_FKKVKP-ABRSPERR = SPACE.
MODIFY GT_EVER_EANLH_FKKVKP FROM GWA_EVER_EANLH_FKKVKP
TRANSPORTING ABRSPERR.
ENDIF.
ENDIF.
CLEAR GWA_ERCH.
ENDLOOP.
REFRESH GT_TECOKT[].
IF NOT SO_COKEY[] IS INITIAL.
SELECT COKEY LTEXT INTO TABLE GT_TECOKT
FROM TECOKT
WHERE COKEY IN SO_COKEY
AND SPRAS = SY-LANGU .
* %_HINTS ORACLE '("TECOKT","TECOKT~1")'.
ENDIF.
IF NOT SO_MRU[] IS INITIAL.
SELECT TERMSCHL TERMTEXT INTO TABLE GT_TE422
FROM TE422
WHERE TERMSCHL IN SO_MRU.
ENDIF.
*FREE lt_ever_fkkvkp[].
ENDFORM. " consumer_data_selection
*& Form Operand_data_selection
* text
* --> p1 text
* <-- p2 text
FORM OPERAND_DATA_SELECTION .
DATA: LT_EVER_EANLH_FKKVKP_ACT LIKE TABLE OF GWA_EVER_EANLH_FKKVKP.
* SORT gt_ever_eanlh_fkkvkp BY vkont.
CHECK NOT GT_EVER_EANLH_FKKVKP[] IS INITIAL.
**now, depending on the billing month, check the time slice dates:
**the last date of the billing month (or the billing month range)
**should fall within the time slice of the installation
** billing month "to" is space
**now select operands for the processed installations
LT_EVER_EANLH_FKKVKP_ACT[] = GT_EVER_EANLH_FKKVKP[].
DELETE LT_EVER_EANLH_FKKVKP_ACT WHERE ABRSPERR NE SPACE.
IF NOT LT_EVER_EANLH_FKKVKP_ACT[] IS INITIAL.
SELECT ANLAGE OPERAND WERT1
FROM ETTIFN
INTO TABLE GT_ETTIFN
FOR ALL ENTRIES IN LT_EVER_EANLH_FKKVKP_ACT
WHERE ANLAGE = LT_EVER_EANLH_FKKVKP_ACT-ANLAGE
AND OPERAND IN ('MTR_STS','CONN_LOAD','LOAD_CODE','KWH_MF')
AND ( AB <= GV_HIGHDATE AND BIS >= GV_HIGHDATE ).
* %_HINTS ORACLE '("ETTIFN","ETTIFN~003")'.
ENDIF.
ENDFORM. " Operand_data_selection
*& Form billing_data_selection
*&---------------------------------------------------------------------*

Hi,
Please do a runtime analysis, as Rob correctly mentioned, or an SQL trace (ST05) to find out where the program is spending most of its time.
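One concrete thing worth tracing in the listing above: a SELECT ... FOR ALL ENTRIES issued against an empty driver table drops the driver condition entirely and can read the whole database table. A minimal guard, sketched here with the report's own names (GT_EVER / EANLH), is the usual first fix — this is an illustration of the pattern, not a tested patch to the report:

```
* Hedged sketch: guard every FOR ALL ENTRIES with an emptiness check.
* If GT_EVER is empty, the FOR ALL ENTRIES condition is ignored and the
* SELECT degenerates into a full read of EANLH.
IF NOT GT_EVER[] IS INITIAL.
  SELECT ANLAGE BRANCHE INTO TABLE GT_EANLH
    FROM EANLH
    FOR ALL ENTRIES IN GT_EVER
    WHERE ANLAGE  EQ GT_EVER-ANLAGE
      AND ABLEINH IN SO_MRU.
ENDIF.
```

The same check applies to every FOR ALL ENTRIES in the report; an ST05 trace will show such an unguarded SELECT as a full table scan.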
Lanka -
Browsing websites with Process Monitor running, there are millions of instances of this executable running in the background. It apparently causes websites being browsed to go into a "not responding" state. I use Webroot for security, and the computer is new - purchased about 3 weeks ago - an HP Slimline.
The .EXE is installed as part of HP's "Simple Pass" which was apparently installed on my system when I downloaded a driver for my HP Officejet Pro 8500 printer/copier/scanner/fax.
It's supposed to be an identity protection software. Does this program actually work? I want to uninstall it because the performance impact is just ridiculous.
Thanks for any advice or suggestions you can provide.

I'm having the same issue with my new HP Envy running 8.1. I had to send it in at least twice for factory repairs. OPBHOBrokerDsktop.exe now shows up as a warning in the Reliability Monitor. Maybe that's what made my monitor and motherboard go out before. It has always shown the "SimplePass broker" (OPBHOBrokerDsktop.exe) as a warning; I thought it was just that I didn't scan my finger over the sensor right. When you hit the link to find a solution for the warning, it says no solution found. It also doesn't start automatic maintenance on schedule. Just a few days ago the start-up screen asked if I would like to use a PIN to log in instead of SimplePass, but it didn't say why. Now I'm beginning to understand.

They won't admit there is a problem either. I had a remote session today because the automatic maintenance scanner ran for 11 hours and never completed. All they told me to do was a factory recovery back to defaults, and to save all my info. They must know by now that there is a problem; why can't they find a fix instead of just resetting to factory defaults? This will make twice now that I've had to reset my computer - a $1000 computer with so many problems, brand new!? I bought it custom built in China, but the Recovery Manager is a black control panel, meaning it was manufactured before 2013 (a white Recovery Manager page means 2014 or newer). I bought mine new, custom built, in June 2014, and it shows it was manufactured before 2013. So I have an old Windows 8 computer that was updated to 8.1 and sold to me as a NEW computer. That could be why I'm having so many problems with mine.
-
Serious performance problems with IMAP
Hi,
our Groupwise server runs on SLES10 SP4, has about 2500 accounts in 7 postoffices, one gwia, one mta and webaccess.
When IMAP is activated on the gwia, the cpu load goes to 99% after a couple of hours, making the server unusable. When I turn IMAP off the cpu load is stable under 1%.
We have already limited the number of items per folder to 1000.
Where do I look? What do I do? I'm at my wits end!!!
Mike

On 1/20/2012 7:05 AM, Steven Eschenburg wrote:
> We were having a similar issue with our 8.0.2 GWIA running under Windows
> Server 2008 on a VM, and adding a second CPU cut the CPU utilization
> down substantially.
>
>
> >>> tgm its<[email protected]> 1/20/2012 7:36 AM >>>
>
> konecnya;2169415 Wrote:
> > In article <[email protected]>, Tgm its wrote:
> > > When IMAP is activated on the gwia, the cpu load goes to 99% after a
> > > couple of hours, making the server unusable.
> > >
> > Is this just a single core single CPU system? If not make sure you are
> > running the SMP kernel.
> > uname -a
> >
> Yes, it is a single core single CPU system. It's a virtual machine on a
> SLES10 xen host.
> uname -a:
> Linux gw1 2.6.16.60-0.85.1-xen #1 SMP Thu Mar 17 11:45:06 UTC 2011
> x86_64 x86_64 x86_64 GNU/Linux
>
> konecnya;2169415 Wrote:
> >
> > confirm that it is actually GWIA using up all that CPU and not
> > something else or a mix of things. Knowing what exactly is busy will
> > help.
> > top
> >
> > With 'top' if you have enough width to your terminal session, hitting
> > 'c' will tell you which POA is the offender if one is being the
> > problem.
> >
> Yes, I'm sure it's gwia and not a post office. Gwia is using 99% of the
> cpu time.
>
> konecnya;2169415 Wrote:
> >
> > As per Michael, do make sure maintenance has been run on all the Post
> > Offices and do so from the top-down approach as per TID 3347260
> > (formerly TID 10007365)
> >
> http://www.novell.com/support/dynami...rd=nonthreaded
> <http://www.novell.com/support/dynamickc.do?cmd=show&forward=nonthreaded>
> > KC&docType=kc&externalId=3347260&sliceId=1
> > If you don't have most of this automated, I wrote up the steps a while
> > back that still does the trick, ideally not running them all at the
> > same time on that system.
> > 'GroupWise Maintenance' (http://www.konecnyad.ca/andyk/gwmnt5x.htm)
>
> Mike
>
>
> --
> tgm_its
> ------------------------------------------------------------------------
> tgm_its's Profile: http://forums.novell.com/member.php?userid=93411
> View this thread: http://forums.novell.com/showthread.php?t=451021
>
>
>
Yes, this would be my next step. You do have a fair number of users. Is
allocating another core an option? -
Serious performance Problems with SQL
I am running this query against an Oracle 10g database:
update ETY_MEDIASHEET set PUB_TYPE = 'REVOKED' where id in (9090,9094,9100,9106,9112,9118,9124,9130,9136,9142,9148,9154,9160,9166,9172)
I am using EJB3 with Hibernate as the entity manager. However, for performance reasons I need to set the status quickly (so that the GUI gets correct data), so I have to set it "hard" in the database.
This query takes a very long time: seconds if the list has 1-3 items, and minutes if it has 50+ items.
Anyone got an idea how to solve this issue, or even why it occurs?
BTW - I am no SQL star.

Yes, the ID has an index (as well as TITLE).
The table holds data representing albums and album tracks.
"Column Name" "Data Type" "Nullable" "Data Default" "COLUMN ID" "Primary Key" "COMMENTS"
"ID" "NUMBER(19,0)" "No" "" "2" "1" ""
"SHEET_TYPE" "VARCHAR2(31 Char)" "No" "" "1" "" ""
"CRE_DATE" "TIMESTAMP(6)" "Yes" "" "3" "" ""
"CRE_USER" "VARCHAR2(255 Char)" "Yes" "" "4" "" ""
"MOD_DATE" "TIMESTAMP(6)" "Yes" "" "5" "" ""
"MOD_USER" "VARCHAR2(255 Char)" "Yes" "" "6" "" ""
"PUB_DATE" "TIMESTAMP(6)" "Yes" "" "7" "" ""
"PUB_USER" "VARCHAR2(255 Char)" "Yes" "" "8" "" ""
"PUB_MSG" "VARCHAR2(255 Char)" "Yes" "" "9" "" ""
"PUB_TYPE" "VARCHAR2(255 Char)" "Yes" "" "10" "" ""
"TITLE" "VARCHAR2(512 Char)" "Yes" "" "11" "" ""
"PUBLISHER_LINE" "VARCHAR2(512 Char)" "Yes" "" "12" "" ""
"COPYRIGHT_LINE" "VARCHAR2(512 Char)" "Yes" "" "13" "" ""
"RELEASE_DATE" "TIMESTAMP(6)" "Yes" "" "14" "" ""
"ORGRELEASE_DATE" "TIMESTAMP(6)" "Yes" "" "15" "" ""
"PARENTAL_ADVISORY" "NUMBER(1,0)" "Yes" "" "16" "" ""
"PUBLISHABLE" "NUMBER(1,0)" "Yes" "" "17" "" ""
"IS_FOR_SALE" "NUMBER(1,0)" "Yes" "" "18" "" ""
"PACKAGE_TYPE" "VARCHAR2(255 Char)" "Yes" "" "19" "" ""
"EDITION" "VARCHAR2(255 Char)" "Yes" "" "20" "" ""
"DURATION" "NUMBER(10,0)" "Yes" "" "21" "" ""
"LANGUAGE_CODE" "VARCHAR2(255 Char)" "Yes" "" "22" "" ""
"PRODUCTION_COMPANY" "VARCHAR2(255 Char)" "Yes" "" "23" "" ""
"OFFICIAL_SITE" "VARCHAR2(255 Char)" "Yes" "" "24" "" ""
"RECOMMENDED_FROM" "VARCHAR2(255 Char)" "Yes" "" "25" "" ""
"LABEL" "VARCHAR2(255 Char)" "Yes" "" "26" "" ""
"RECORDINGCOUNTRY" "VARCHAR2(255 Char)" "Yes" "" "27" "" ""
"VERSION" "VARCHAR2(255 Char)" "Yes" "" "28" "" ""
"CATALOG_ID" "NUMBER(19,0)" "Yes" "" "29" "" ""
"SUPPLIER_PRICECODE_ID" "NUMBER(19,0)" "Yes" "" "30" "" ""
"SUPPLIER_ID" "NUMBER(19,0)" "Yes" "" "31" "" ""
Index
"Index Name" "Columns" "Uniqueness" "Status" "Index Type" "Temporary" "Partitioned" "Function Index Status" "Join Index"
"SYS_C0025348" "ID" "UNIQUE" "VALID" "NORMAL" "N" "NO" "" "NO"
"IDX_ETY_SHEET_TITLE" "TITLE" "NONUNIQUE" "VALID" "NORMAL" "N" "NO" "" "NO" -
HELP: LAST_RECORD and POST_QUERY serious performance PROBLEM
Hi:
I've got an ORDERS table with 5000 records (nothing special). The ORDERS form has detail records for ORDERS_ITEMS. The form has a POST_QUERY trigger that selects TRANSPORT_TYPE from the TRANSPORT table, and other triggers like this one.
When I execute LAST_RECORD in this form it takes a huge amount of time to reach the last record, and database CPU stays at 100% for 30-40 seconds. I've analysed my top SQL, and I see that all my POST_CHANGE and block POST_QUERY SQL statements are executed thousands of times. THIS IS UGLY! LAST_RECORD should only go to the last record and show its data. What happens here is that Forms executes POST_CHANGE and POST_QUERY for every record in my table, which has a multiplying effect on queries.
I'm using forms9i patched with the latest IDS patch and 9ias with latest NonCore Patch.
I need help URGENT!
Joao Oliveira

POST-CHANGE is better not to use, as it is kept only for backward compatibility. POST-QUERY can be replaced by WHEN-NEW-RECORD-INSTANCE (add a GO_ITEM to a non-database item) plus the non-database item's PRE-TEXT-ITEM trigger.
But as for the original issue, you seem to want to use POST_QUERY with LAST_RECORD to go to the last record. I don't know the detailed situation, but you may try:
1) Put it in PRE-QUERY and fetch the last record by SQL:
select f.* from
( -- your SQL used in the form
select * from your_tables
where ...
order by key_column desc
) f
where rownum = 1
2) Another way:
Get rid of the POST-CHANGE and POST-QUERY triggers.
For POST-QUERY, use WHEN-NEW-RECORD-INSTANCE and PRE-TEXT-ITEM; for POST-CHANGE, it depends on the code inside, but you might be able to handle it in those two triggers too. -
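To make the WHEN-NEW-RECORD-INSTANCE suggestion concrete: unlike POST-QUERY, it fires only for the record the cursor lands on, so the lookup runs once per displayed record instead of once per fetched row. A sketch, using the question's TRANSPORT table but hypothetical block/item names:

```sql
-- WHEN-NEW-RECORD-INSTANCE trigger on the ORDERS block (sketch;
-- :orders.transport_id and :orders.transport_type_desc are assumed
-- item names, not taken from the original form).
BEGIN
  SELECT transport_type
    INTO :orders.transport_type_desc
    FROM transport
   WHERE transport_id = :orders.transport_id;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    :orders.transport_type_desc := NULL;
END;
```

With this in place, LAST_RECORD fetches the rows without firing any per-row lookups, and the description is resolved only when a record is actually shown.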
Performance problems with XMLTABLE and XMLQUERY involving relational data
Hello-
Is anyone out there using XMLTABLE or XMLQUERY with more than a toy set of data? I am running into serious performance problems trying to do basic things such as:
* Combine records in 10 relational tables into a single table of XMLTYPE records using XMLTABLE. This hangs indefinitely for any more than 800 records. Oracle has confirmed that this is a problem and is working on a fix.
* Combine a single XMLTYPE record with several relational code tables into a single XMLTYPE record using XMLQUERY and ora:view() to insert code descriptions after each code. Performance is 10 seconds for 10 records (terrible) when passing a batch of records, or 160 seconds for one record (unacceptable!). How can it take 10 times longer to process 1/10th the number of records? Ironically, the query plan says it will do a full table scan of records for the batch, but an index access for the one record passed to the XMLQUERY.
I am rapidly losing faith in XML DB, and desperately need some hints on how to work around these performance problems, or at least some assurance that others have been able to get this thing to perform.

<Note>Long post, sorry.</Note>
First, thanks for the responses above. I'm impressed with the quality of thought put into them. (Do the forum rules allow me to offer rewards? :) One suggestion in particular made a big performance improvement, and I’m encouraged to hear of good performance in pure XML situations. Unfortunately, I think there is a real performance challenge in two use cases that are pertinent to the XML+relational subject of this post and probably increasingly common as XML DB usage increases:
• Converting legacy tabular data into XML records; and
• Performing code table lookups for coded values in XML records.
There are three things I want to accomplish with this post:
• Clarify what we are trying to accomplish, which might expose completely different approaches than I have tried
• Let you know what I tried so far and the rationale for my approach to help expose flaws in my thinking and share what I have learned
• Highlight remaining performance issues in hopes that we can solve them
What we are trying to accomplish:
• Receive a monthly feed of 10,000 XML records (batched together in text files), each containing information about an employee, including elements that repeat for every year of service. We may need to process an annual feed of 1,000,000 XML records in the future.
• Receive a one-time feed of 500,000 employee records stored in about 10 relational tables, with a maximum join depth of 2 or 3. This is inherently a relational-to-XML process. One record/second is minimally acceptable, but 10 records/sec would be better.
• Consolidate a few records (from different providers) for each employee into a single record. Given the data volume, we need to achieve a minimum rate of 10 records per second. This may be an XML-only process, or XML+relational if code lookups are done during consolidation.
• Allow the records to be viewed and edited, with codes resolved into user-friendly descriptions. Since a user is sitting there, code lookups done when a record is viewed (vs. during consolidation) should not take more than 3 seconds total. We have about 20 code tables averaging a few hundred rows each, though one has 450,000 rows.
As requested earlier, I have included code at the end of this post for example tables and queries that accurately (but simply) replicate our real system.
What we did and why:
• Stored the source XML records as CLOBS: We did this to preserve the records exactly as they were certified and sent from providers. In addition, we always access the entire XML record as a whole (e.g., when viewing a record or consolidating employee records), so this storage model seemed like a good fit. We can copy them into another format if necessary.
• Stored the consolidated XML employee records as “binary XML”. We did this because we almost always access a single, entire record as a whole (for view/edit), but might want to create some summary statistics at some point. Binary XML seemed the best fit.
• Used ora:view() for both tabular source records and lookup tables. We are not aware of any alternatives at this time. If it made sense, most code tables could be pre-converted into XML documents, but this seemed risky from a performance standpoint because the lookups use both code and date range constraints (the meaning of codes changes over time).
• Stored records as XMLTYPE columns in a table with other key columns on the table, plus an XMLTYPE metadata column. We thought this would facilitate pulling a single record (or a few records for a given employee) quickly. We knew this might be unnecessary given XML indexes and virtual columns, but were not experienced with those and wanted the comfort of traditional keys. We did not use XMLTYPE tables or the XML Repository for documents.
• Used XMLTABLE to consolidate XML records by looping over each distinct employee ID in the source batch. We also tried XMLQUERY and it seems to perform about the same. We can achieve 10 to 20 records/second if we do not do any code lookups during consolidation, just meeting our performance requirement, but still much slower than expected.
• Used PL/SQL with XMLFOREST to convert tabular source records to XML by looping over distinct employee IDs. We tried this outside PL/SQL both with XMLFOREST and XMLTABLE+ora:view(), but it hangs in both cases for more than 800 records (a known/open issue). We were able to get it to work by using an explicit cursor to loop over distinct employee IDs (rather than processing all records at once within the query). The performance is one record/second, which is minimally acceptable, but it interferes with other database activity.
• Used XMLQUERY plus ora:view() plus XPATH constraints to perform code lookups. When passing a single employee record, the response time ranges from 1 sec to 160 sec depending on the length of the record (i.e., number of years of service). We achieved a 5-fold speedup using an XMLINDEX (thank you Marco!!). The result may be minimally acceptable, but I’m baffled why the index would be needed when processing a single XML record. Other things we tried: joining code tables in the FOR...WHERE clauses, joining code tables using LET with XPATH constraints and LET with WHERE clause constraints, and looking up codes individually via JDBC from the application code at presentation time. All those approaches were slower. Note: the difference I mentioned above in equality/inequality constraint performance was due to data record variations not query plan variations.
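To make the explicit-cursor workaround concrete, here is a stripped-down sketch of the shape of our PL/SQL conversion loop. The table and column names (LEGACY_EMP, LEGACY_SERVICE, EMP_ID, CODE1) are invented for illustration; the real feed tables are more numerous and complex:

```sql
-- Sketch of the explicit-cursor workaround: one employee per iteration.
-- LEGACY_EMP / LEGACY_SERVICE and their columns are placeholder names.
begin
  for e in (select distinct emp_id from legacy_emp) loop
    insert into records (ssn, xmlrec)
    select to_char(e.emp_id),
           xmlelement("Root",
             xmlelement("Id", e.emp_id),
             -- aggregate the repeating service rows into <Element> children
             (select xmlagg(
                       xmlelement("Element",
                         xmlforest(s.code1 as "Code")))
                from legacy_service s
               where s.emp_id = e.emp_id))
      from dual;
    commit;  -- committing per employee keeps undo small for the 500,000-record feed
  end loop;
end;
/
```

Processing one employee at a time is what avoids the hang we see when the whole conversion is attempted in a single query.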
What issues remain?
We have a minimally acceptable solution from a performance standpoint with one very awkward PL/SQL workaround. The performance of a mixed XML+relational data query is still marginal IMHO, until we properly utilize available optimizations, fix known problems, and perhaps get some new query optimizations. On the last point, I think the query plan for tabular lookups of codes in XML records is falling short right now. I’m reminded of data warehousing in the days before hash joins and star join optimization. I would be happy to be wrong, and just as happy for viable workarounds if I am right!
Here are the details on our code lookup challenge. Additional suggestions would be greatly appreciated. I’ll try to post more detail on the legacy table conversion challenge later.
-- The main record table:
create table RECORDS (
SSN varchar2(20),
XMLREC sys.xmltype
)
xmltype column XMLREC store as binary xml;
create index records_ssn on records(ssn);
-- A dozen code tables represented by one like this:
create table CODES (
CODE varchar2(4),
DESCRIPTION varchar2(500)
);
create index codes_code on codes(code);
-- Some XML records with coded values (the real records are much more complex of course):
-- I think this took about a minute or two
DECLARE
ssn varchar2(20);
xmlrec xmltype;
i integer;
BEGIN
xmlrec := xmltype('<?xml version="1.0"?>
<Root>
<Id>123456789</Id>
<Element>
<Subelement1><Code>11</Code></Subelement1>
<Subelement2><Code>21</Code></Subelement2>
<Subelement3><Code>31</Code></Subelement3>
</Element>
<Element>
<Subelement1><Code>11</Code></Subelement1>
<Subelement2><Code>21</Code></Subelement2>
<Subelement3><Code>31</Code></Subelement3>
</Element>
<Element>
<Subelement1><Code>11</Code></Subelement1>
<Subelement2><Code>21</Code></Subelement2>
<Subelement3><Code>31</Code></Subelement3>
</Element>
</Root>');
for i IN 1..100000 loop
insert into records(ssn, xmlrec) values (i, xmlrec);
end loop;
commit;
END;
-- Some code data like this (ignoring date ranges on codes):
DECLARE
description varchar2(100);
i integer;
BEGIN
description := 'This is the code description ';
for i IN 1..3000 loop
insert into codes(code, description) values (to_char(i), description);
end loop;
commit;
end;
-- Retrieve one record while performing code lookups. Takes about 5-6 seconds...pretty slow.
-- Each additional lookup (times 3 repeating elements in the data) adds about 1 second.
-- A typical real record has 5 Elements and 20 Subelements, meaning more than 20 seconds to display the record
-- Note we are accessing a single XML record based on SSN
-- Note also we are reusing the one test code table multiple times for convenience of this test
select xmlquery('
for $r in Root
return
<Root>
<Id>123456789</Id>
{for $e in $r/Element
return
<Element>
<Subelement1>
{$e/Subelement1/Code}
<Description>
{ora:view("disaac","codes")/ROW[CODE=$e/Subelement1/Code]/DESCRIPTION/text() }
</Description>
</Subelement1>
<Subelement2>
{$e/Subelement2/Code}
<Description>
{ora:view("disaac","codes")/ROW[CODE=$e/Subelement2/Code]/DESCRIPTION/text()}
</Description>
</Subelement2>
<Subelement3>
{$e/Subelement3/Code}
<Description>
{ora:view("disaac","codes")/ROW[CODE=$e/Subelement3/Code]/DESCRIPTION/text() }
</Description>
</Subelement3>
</Element>
</Root>
' passing xmlrec returning content)
from records
where ssn = '10000';
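(Aside: plans like the one pasted below can be reproduced with the standard EXPLAIN PLAN / DBMS_XPLAN tooling; a minimal example, substituting a trivial query for the long XMLQUERY one above:)

```sql
-- Capture and display an execution plan (trivial stand-in query shown):
explain plan for
select count(*) from records where ssn = '10000';

select * from table(dbms_xplan.display);
```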
The plan shows the nested loop access that slows things down.
By contrast, a functionally-similar SQL query on relational data will use a hash join and perform 10x to 100x faster, even for a single record. There seems to be no way for the optimizer to see the regularity in the XML structure and perform a corresponding optimization in joining the code tables. Not sure if registering a schema would help. Using structured storage probably would. But should that be necessary given we’re working with a single record?
Operation Object
|SELECT STATEMENT ()
| SORT (AGGREGATE)
| NESTED LOOPS (SEMI)
| TABLE ACCESS (FULL) CODES
| XPATH EVALUATION ()
| SORT (AGGREGATE)
| NESTED LOOPS (SEMI)
| TABLE ACCESS (FULL) CODES
| XPATH EVALUATION ()
| SORT (AGGREGATE)
| NESTED LOOPS (SEMI)
| TABLE ACCESS (FULL) CODES
| XPATH EVALUATION ()
| SORT (AGGREGATE)
| XPATH EVALUATION ()
| SORT (AGGREGATE)
| XPATH EVALUATION ()
| TABLE ACCESS (BY INDEX ROWID) RECORDS
| INDEX (RANGE SCAN) RECORDS_SSN
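One idea I have not benchmarked beyond the toy tables above: move the code joins out of the XQuery entirely by shredding the repeating elements with XMLTABLE and joining CODES in plain SQL, which leaves the optimizer free to use hash joins. The obvious drawback is losing the XML result shape, which would have to be re-assembled afterwards:

```sql
-- Shred the repeating elements, then join CODES in ordinary SQL.
-- The optimizer can now hash-join CODES instead of probing it per element.
select x.code1, c1.description as desc1,
       x.code2, c2.description as desc2,
       x.code3, c3.description as desc3
  from records r,
       xmltable('/Root/Element' passing r.xmlrec
                columns code1 varchar2(4) path 'Subelement1/Code',
                        code2 varchar2(4) path 'Subelement2/Code',
                        code3 varchar2(4) path 'Subelement3/Code') x,
       codes c1, codes c2, codes c3
 where r.ssn = '10000'
   and c1.code = x.code1
   and c2.code = x.code2
   and c3.code = x.code3;
```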
With an xmlindex, the same query above runs in about 1 second, so is about 5x faster (0.2 sec/lookup), which is almost good enough. Is this the answer? Or is there a better way? I’m not sure why the optimizer wants to scan the code tables and index into the (one) XML record, rather than the other way around, but maybe that makes sense if the optimizer wants to use the same general plan as when the WHERE clause constraint is relaxed to multiple records.
-- Add an xmlindex. Takes about 2.5 minutes
create index records_record_xml ON records(xmlrec)
indextype IS xdb.xmlindex;
Operation Object
|SELECT STATEMENT ()
| SORT (GROUP BY)
| FILTER ()
| NESTED LOOPS ()
| FAST DUAL ()
| TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
| INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
| SORT (AGGREGATE)
| FILTER ()
| TABLE ACCESS (FULL) CODES
| FILTER ()
| NESTED LOOPS ()
| FAST DUAL ()
| TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
| INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
| SORT (GROUP BY)
| FILTER ()
| NESTED LOOPS ()
| FAST DUAL ()
| TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
| INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
| SORT (AGGREGATE)
| FILTER ()
| TABLE ACCESS (FULL) CODES
| FILTER ()
| NESTED LOOPS ()
| FAST DUAL ()
| TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
| INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
| SORT (GROUP BY)
| FILTER ()
| NESTED LOOPS ()
| FAST DUAL ()
| TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
| INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
| SORT (AGGREGATE)
| FILTER ()
| TABLE ACCESS (FULL) CODES
| FILTER ()
| NESTED LOOPS ()
| FAST DUAL ()
| TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
| INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
| SORT (AGGREGATE)
| FILTER ()
| NESTED LOOPS ()
| FAST DUAL ()
| TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
| INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
| SORT (AGGREGATE)
| TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
| INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
| TABLE ACCESS (BY INDEX ROWID) RECORDS
| INDEX (RANGE SCAN) RECORDS_SSN
Am I on the right path, or am I totally using the wrong approach? I thought about using XSLT but was unsure how to reference the code tables.
I’ve done the best I can constraining the main record to a single row passed to the XMLQUERY. Given Mark’s post (thanks!) should I be joining and constraining the code tables in the SQL WHERE clause too? That’s going to make the query much more complicated, but right now we’re more concerned about performance than complexity. -
Performance problems with 0EC_PCA_3 datasource
Hi experts,
We have recently upgraded the Business Content in our BW system, as well as the plug-in on R/3 side. Now we have BC3.3 and PI2004.1. Given the opportunity, we decided to apply the new 0EC_PCA_3 and 0EC_PCA_4 datasources that provide more detailed data from table GLPCA.
The new datasources have been activated and transported to the QA system, where we experience serious performance problems while extracting data from R/3. All other data extractions work as before so there should not be any problem with the hardware.
Do you use 0EC_PCA_3? Have you experienced any problem with the speed of data extraction/transfer? We already have applied the changes suggested in note 597909 (Performance of FI-SL line item extractors: Creating indexes) and created secondary indexes on GLPCA table but it did not help.
thanks and regards,
Csaba
Seems the problem was caused by a custom development (quantity conversion in a user exit). However, when we tried loading earlier, after removal of the exit, it did not help then (loading took even longer...). Now it did.
-
Performance problems when using Premiere Elements for photo slideshows
Hello,
I had been using Premiere Elements 9 (PE9) to make a simple slideshow for my parents from their vacation trip and I ran into some serious performance problems. I had used it to create similar projects before, but not nearly as big. This one is like 260 photos, so basically it is 260 separate clips. I have a POWERHOUSE workstation (see below) so it isn't my PC. Even when PE9 crashes, looking at my performance monitor my CPU and RAM aren't even halfway utilized. I finally switched to Windows Movie Maker of all things and it worked seamlessly, amazing really. I'm wondering if I was just using PE9 for something other than what it was designed for, since there weren't really any video clips, just a ton of photos that I made into video clips, if that makes sense. Based upon my experience with this so far, I can't imagine using PE9 anymore for anything really. I might have the need for a more professional video editing program in the near future, although it does seem like PE has a lot of features. How can I make sure it utilizes my workstation to its full potential? Here are my specs:
PC
Intel Core i7-2600K 4.6 GHz Overclocked
ASUS P8P67 Deluxe Motherboard
AMD Firepro V8800 Video Card
Crucial 128 GB SATA 6Gb/s Solid State Drive (Operating System)
Corsair Vengeance 16GB (4x4GB) Memory
Corsair H60 Liquid CPU Cooler
Corsair Professional Series Gold AX850 Power Supply
Graphite Series 600T Mid-Tower Case
Western Digital Caviar Black 1 TB SATA III Hard Drive
Western Digital Caviar Black 2 TB SATA III Hard Drive
Western Digital Green 3 TB SATA III Hard Drive
Logitech Wireless Gaming Mouse G700
I don’t play any games but it’s a great productivity mouse with 13 customizable buttons
Wacom Intuos5 Pen Tablet
Yes, this system is blazingly fast. I have yet to feel it slow down, even with Photoshop, Lightroom, InDesign, Illustrator and numerous other apps running at the same time. HOWEVER, Premiere Elements 9 has crashed NUMEROUS times, every time when my system wasn't even close to being fully taxed.
Monitors – All run on the ATI V8800
Dell Ultra Sharp 30 inch
Samsung 27 Inch
HAANS-G 28 Inch
Herman Miller Embody Ergonomic Chair (one of my favorite items)

Andy,
There ARE some differences between PrE and PrPro w/ an approved CUDA-capable and MPE hardware acceleration-enabled nVidia video card, but those differences show up ONLY in the quality of the Scaling. The processing overhead is almost exactly the same, when it comes to handling the extra pixels.
As of PrPro CS 5, two things changed:
The max. size of Still Images went up from 4096 x 4096 pixels, to quite a bit larger (cannot recall the numbers now).
The Scaling algorithms have been improved, though ONLY with the correct nVidia cards, with MPE hardware support enabled.
Now, there CAN be another consideration between the two programs, in that PrPro CS 5 - CS 6 are 64-bit ONLY, so they require (and fully benefit from) a 64-bit computer and OS. PrE can be either 32-bit or 64-bit, so one might, or might not, be taking advantage of a 64-bit program and OS. Still, the processing overhead will be almost identical; it's just that the 64-bit OS can spread it around a bit.
I still recommend Scaling the large Still Images in PS, prior to Import, to keep that processing overhead as low as is possible. Scaled Still Images work just fine, and I have one Project with 3000+ Scaled Still Images that edits just fine in PrPro, even on my older 32-bit workstation. Testing that same machine and PrPro some years ago, I could ONLY work with up to five 4096 x 4096 Stills before things ground to a crawl.
Now, Adobe AfterEffects handles large Still Images differently, so I just moved that test Project to AE, and added another 20 large Images, which edited just fine. IIRC, AE can handle Still Images up to 10K x 10K pixels, and that might have gone up, as of CS 5.
Good luck, and hope that helps,
Hunt