SAP MaxDB 7.7.07.16 - performance issue due to IOWait(R) task
We are running MaxDB 7.7.07.16 Unicode in a non-SAP environment (it serves as the basis for a web application).
The database server has 4 CPUs and 8 GB RAM, running Debian Linux 6.0.
We experience performance problems at irregular intervals.
Each episode lasts between 3 and 15 minutes and more or less blocks the complete system.
After a long phase of checking and evaluating possible causes, we think we are very close to the root of the issue.
We identified the database tasks
- TblPrfC - Prefetch Table Coordinator:
- PrefPag - Prefetch Pages
having the current state "IO Wait (R)".
As long as these tasks are in state "IO Wait (R)", the users of the web application get awfully slow response times on their queries.
On 9 April 2011 we changed the following DB parameter:
- ReadAheadTableThreshold: old 0 => new 128
But today we got the same performance problem again.
If you have any hints or tips, you're welcome!
Best regards
Hannes
Hello Hannes,
OK, a brief but hopefully useful answer to your post:
- If the database I/O generated by SQL statements alone brings the system to a grinding halt, then your storage system is undersized. How should it be able to handle additional I/O, e.g. for backups, if it isn't capable of handling the standard I/O?
- The prefetching tasks are used to speed up large scan operations. So the next thing to do is to find statements that perform table scans and check whether these can be tuned to KEY or index accesses.
- As a workaround, I would propose increasing ReadAheadTableThreshold to, say, 500, so that prefetching is done only for really large scans.
regards,
Lars
Similar Messages
-
Cisco ASR 1002- performance issue due to access list
Hi,
We are planning to implement an inbound access list to block subnets from a particular country. Since the subnets are not contiguous, we have about 16,000 ACL entries.
I want to know: would there be any performance or latency issues after applying 16k lines of ACL?
Is there a good document where I can read more about ACL limitations and performance issues on the ASR?
This is for ASR1002, running IOS-XE 15.3(1)S1.
Thanks
Disclaimer
The Author of this posting offers the information contained within this posting without consideration and with the reader's understanding that there's no implied or expressed suitability or fitness for any purpose. Information provided is for informational purposes only and should not be construed as rendering professional advice of any kind. Usage of this posting's information is solely at reader's own risk.
Liability Disclaimer
In no event shall Author be liable for any damages whatsoever (including, without limitation, damages for loss of use, data or profit) arising out of the use or inability to use the posting's information even if Author has been advised of the possibility of such damage.
Posting
Sorry, I don't know the answer to your questions, but I'm writing to mention a 7200 feature, that if supported on the ASR, might help in your situation. See http://www.cisco.com/c/en/us/support/docs/security/ios-firewall/23602-confaccesslists.html#turbo -
Avoiding performance issue due to loop within loop on internal tables
Hi Experts,
I have a requirement wherein I want to check whether each of the programs stored in one internal table is called from any of the programs stored in another internal table. I am currently looping over two internal tables (a loop within a loop), which causes a major performance issue: the program runs very slowly.
Can anyone advise how to resolve this performance issue so that the program runs faster?
Thanks in advance.
Regards,
Chetan.
Forget the parallel cursor stuff; it is much too complicated for general usage and helps nearly nothing. I will publish a blog in the next days where this is shown in detail.
A loop within a loop is no problem if the inner table is a hashed or sorted table.
If it must be a standard table, then you must make a bit more effort and facilitate a binary search (READ ... BINARY SEARCH, then LOOP ... FROM index with an EXIT).
See here the exact coding and measurements, in "Runtimes of Reads and Loops on Internal Tables":
/people/siegfried.boes/blog/2007/09/12/runtimes-of-reads-and-loops-on-internal-tables
And don't forget: the outer table does not need to be sorted, since the loop reaches every line anyway. The parallel cursor requires both tables to be sorted, and the additional sort consumes nearly the whole advantage of the parallel cursor compared to the simple but good loop-in-loop solution.
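The hashed-inner-table idea above can be sketched outside ABAP. A minimal illustration (the program names are made up): the same caller/callee check done once as a naive nested loop over two standard tables, and once with a hash lookup playing the role of an ABAP hashed table.

```python
# Two "internal tables": programs and the calls they make, and the
# programs we want to watch for. Names are invented for illustration.
callers = [{"prog": "ZREP1", "calls": "ZUTIL1"},
           {"prog": "ZREP2", "calls": "ZUTIL9"},
           {"prog": "ZREP3", "calls": "ZUTIL2"}]
watched = [{"prog": "ZUTIL1"}, {"prog": "ZUTIL2"}]

# O(n*m): loop within loop over two standard tables
hits_nested = [c["prog"] for c in callers
               for w in watched if c["calls"] == w["prog"]]

# O(n): build the lookup once, probe it per outer row
# (a Python set stands in for the ABAP hashed table)
watched_keys = {w["prog"] for w in watched}
hits_hashed = [c["prog"] for c in callers if c["calls"] in watched_keys]

assert hits_nested == hits_hashed == ["ZREP1", "ZREP3"]
```

The results are identical; only the cost of the inner lookup changes, which is exactly why the hashed or sorted inner table removes the problem.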
Siegfried -
Performance issue due to column formula and filters
Hi,
I am facing strange issue with performance for my OBIEE reports. I have two sets of reports Static and Dynamic. Both runs against same tables. The only difference between these reports is that the Static reports would run against all the data for given aggregation level e.g. Year, Month, Date and so on. Where as for Dynamic one I have range prompts to filter data. Other difference is that I have a column formula for one of the column in the Dynamic report, which is nothing but Go URL to show another page with certain parameters.
The static report takes around 14-15 seconds, whereas the dynamic one takes around 3.5 minutes. The amount of data and the range are the same. From the logs I can see that for the static reports, i.e. the reports without filters, the GROUP BY is applied at the SQL level, whereas it is not for the dynamic one. Is this expected?
The second issue: even if I remove the filters and just have one report with the column formula and one without, there is a significant time difference in processing at the Presentation Services layer. Again, this is taken from the log: it takes 8 seconds to get the data from the DB but shows almost 218 seconds as response time at the Presentation layer.
Below are conceptual details about table and reports -
Table 1 (It is date dimension) : Date_Dim
DateCode Date
Day Number
MonthCode Varchar2
YearCode Varchar2
Table 2 (It is aggregate table at year level) : Year_Aggr
DateCode Date (FK to Table1 above)
Measure1
Measure2
Measure3
Measure4
Measure5
Report 1
Date_Dim.YearCode | Year_Aggr.Measure1 | Year_Aggr.Measure2 | Year_Aggr.Measure3 | Year_Aggr.Measure4
Report 2
Dashboard Filter : Dimension1 | Dimension2 | Year Start | Year End |
Date_Dim.YearCode | Year_Aggr.Measure1 | Year_Aggr.Measure2 | Year_Aggr.Measure3 | Year_Aggr.Measure4
Column formula for Date_Dim.YearCode is something like :
'<a href="saw.dll?Dashboard&PortalPath=somepath and parameters target=_self">' || "Date Dim"."YearCode" || '</a>'
Filters :
Dimension1 is prompted...
Dimension2 is prompted...
cast("Date Dim"."YearCode" as Int) is greater than or equal to @{Start_Year}
cast("Date Dim"."YearCode" as Int) is greater than or equal to @{End_Year}
Note: I need to cast to Int because the column is VARCHAR2 (a legacy problem).
How can I fix this? Am I missing something? In the results of Report 2, the database SQL doesn't show the year in the WHERE clause, although it is displayed in the logical SQL.
Let me know if anybody has faced this and fixed it, or has suggestions for changes to fix it.
Thannks,
Ritesh
Hi Ritesh,
I think you are right about the root cause of your problem. The first request does the GROUP BY in the database, which returns fewer records to the BI Server for processing. The second request does not do the GROUP BY and sends significantly more records back to the BI Server, forcing it to do the grouping itself. Compound that with the fact that pivot table views are relatively expensive computationally, and that explains the difference in execution times.
Assuming that the execution time of the first report is satisfactory, I would recommend you try to experiment with a few settings to see if you can get the second report to do the group by in the database.
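The group-by pushdown described here can be sketched with plain SQL: grouping in the database returns one row per group, while grouping on the client means shipping every detail row first. (The table and column names below are toy stand-ins for the Year_Aggr table in the question.)

```python
import sqlite3
from collections import defaultdict

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE year_aggr (yearcode TEXT, measure1 REAL)")
con.executemany("INSERT INTO year_aggr VALUES (?, ?)",
                [("2010", 10.0), ("2010", 5.0), ("2011", 7.0)])

# Pushed down: the database groups, only 2 rows cross the wire
pushed = con.execute(
    "SELECT yearcode, SUM(measure1) FROM year_aggr GROUP BY yearcode"
).fetchall()

# Not pushed down: all 3 detail rows are fetched, the client groups
rows = con.execute("SELECT yearcode, measure1 FROM year_aggr").fetchall()
totals = defaultdict(float)
for year, m in rows:
    totals[year] += m
client = sorted(totals.items())

assert sorted(pushed) == client == [("2010", 15.0), ("2011", 7.0)]
```

With millions of detail rows, the second path is what makes the BI Server (and the pivot view on top of it) so much slower.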
Are the two filters identical except for the following conditions?
cast("Date Dim"."YearCode" as Int) is greater than or equal to @{Start_Year}
cast("Date Dim"."YearCode" as Int) is greater than or equal to @{End_Year}
Best regards,
-Joe -
Performance issue due to RFC calls. (R/3 to R/3 system)
Hi,
My application face serious performance problem because of RFC calls (R/3 system to R/3 system).
1)is there any transaction code for doing performance analysis on RFC calls in R/3 system?
2)How far large volume of Data Transfer (mainly due to internal tables) in a RFC call affects the performance? is there any limit for data transfer size at a time in a RFC call? if so how to calculate for best performance?
Thanks and regards,
Prakash.
Hi Andreas,
Suppose an RFC-enabled function module has an internal table as an importing parameter. Executing this FM with 4000 entries in the internal table degrades performance. Is there any way to improve it?
Thanks and regards,
Prakash. -
Performance issue due to localization code in SSRS 2008
The reports I am working on consist lot of data and all the customers use it frequently.
Report title and columns are localized by expressions .
This takes long time for report rendering and exporting to csv. When I test without the localization code it doesn't take that long .
Can someone help me to optimize the report. SSRS 2008 R2
Archana
Hi Archana,
In Reporting Services, the total time to generate a reporting server report (RDL) can be divided into 3 elements:
Total time = (TimeDataRetrieval) + (TimeProcessing) + (TimeRendering)
TimeRendering is the number of milliseconds spent after the Rendering Object Model is exposed to the rendering extension. It includes the time spent in on-demand expression evaluations (e.g. TextBox.Value, Style.*). So it makes sense that the report with localization code takes longer.
Besides, we can improve report performance in other ways. For example, we can add filtering, sorting and aggregation to the dataset query, because these operations are more efficient on the data source than during report processing. For more details about report performance, please refer to the following article:
http://technet.microsoft.com/en-us/library/bb522806(v=sql.105).aspx
If there are any other questions, please feel free to ask.
Thanks,
Katherine Xiong
TechNet Community Support -
System performance issue due to multiple submission of a job
For month-end reconciliation, users run a few critical reports which are quite resource-consuming.
In order to control this, we want to restrict the usage of such reports: for example, if one session is already active (foreground or background), the user cannot submit another job, or gets a pop-up with an error message.
I searched SDN for this and couldn't find much.
anya
Anya, since this is the ABAP forum and I happen to be an ABAP programmer, I can give you an ABAP solution. It involves changes to the code in all these reports, so you will need the help of an ABAP programmer if you are not one.
a) Create a Z table containing 3 fields:
1) Client, of type MANDT (primary key field)
2) Program name, of type PROGRAM_ID (primary key field)
3) User name, of type XUBNAME (regular field)
b) Create a table maintenance dialog for this table.
c) Create one record for each of the programs you want to regulate. You only need to enter the program name initially; leave the user name blank.
d) In all the ABAP programs, make the following change:
1) Under the START-OF-SELECTION event of the report, use function module ENQUEUE_E_TABLEE to lock the record in the Z table for the program being executed. Look at the sample code below.
TABLES: <ztable>.
DATA: w_message(100) TYPE c,
      w_locked(1)    TYPE c.
CALL FUNCTION 'ENQUEUE_E_TABLEE'
  EXPORTING
    tabname        = <ztable name>
    varkey         = <concatenation of mandt and sy-cprog>
  EXCEPTIONS
    foreign_lock   = 1
    system_failure = 2
    OTHERS         = 3.
IF sy-subrc EQ 1.
  " Record is locked: another user is running this program
  SELECT SINGLE *
    FROM <ztable>
    INTO <ztable>
    WHERE <program name field> EQ sy-cprog.
  IF sy-subrc EQ 0.
    CONCATENATE 'Program'
                sy-cprog
                'is currently being used by'
                <ztable>-<user name field>
           INTO w_message SEPARATED BY space.
    MESSAGE e208(00) WITH w_message.
  ENDIF.
ELSEIF sy-subrc EQ 0.
  " Lock acquired: record who is running the program
  w_locked = 'X'.
  SELECT SINGLE *
    FROM <ztable>
    INTO <ztable>
    WHERE <program name field> EQ sy-cprog.
  IF sy-subrc EQ 0.
    <ztable>-<user name field> = sy-uname.
    MODIFY <ztable> FROM <ztable>.
  ENDIF.
ENDIF.
e) At the event END-OF-SELECTION (at the end of the program) use function module DEQUEUE_E_TABLEE to unlock the record. Look at sample code below.
CASE w_locked.
WHEN 'X'.
CALL FUNCTION 'DEQUEUE_E_TABLEE'
EXPORTING
tabname = <ztable name>
varkey = <concatenation of mandt and sy-cprog>.
ENDCASE.
This code is designed to allow just one user or job to run the program at a time. The second person will be issued an error message informing him/her that the program is being used by <user name>. -
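The ENQUEUE/DEQUEUE pattern above is SAP-specific, but the underlying "one run at a time" idea can be sketched in a few lines. This is an illustrative in-process model, not the SAP enqueue service; the program and user names are made up.

```python
# A registry mapping a program name to the user currently running it.
# try_lock plays the role of ENQUEUE_E_TABLEE, unlock of DEQUEUE_E_TABLEE.
registry = {}

def try_lock(program, user):
    """Return True and record the user if the program is free."""
    if program in registry:
        return False          # ENQUEUE would raise foreign_lock here
    registry[program] = user
    return True

def unlock(program):
    registry.pop(program, None)   # release, ignoring a missing lock

assert try_lock("ZMONTH_END", "ANYA") is True
assert try_lock("ZMONTH_END", "BOB") is False   # second submission rejected
unlock("ZMONTH_END")
assert try_lock("ZMONTH_END", "BOB") is True    # free again after DEQUEUE
```

The ABAP version gets cross-session and cross-server behavior for free because the enqueue server holds the lock centrally; that is what this sketch cannot show.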
Performance Issues due to Loading of ADF/JClient View Objects
Hi,
I developed a two-tier ADF/JClient application.
When the ADF/JClient application opens for the first time, there is a big delay.
However, opening it subsequently takes less than half the time.
I think it might be the XML files of the different view objects, which the framework loads the first time and which therefore take a lot of time; after that, the time to open the application drops.
We have around 200 view objects in our application. Can this be the reason?
If yes, can we load the view objects ahead of time, or can we load them only at the moment they are requested?
Thanks,
Makrand Pare
Hi Makrand,
Check this out:
"Limiting Fetching of Business Components Attributes in ADF Swing
http://www.oracle.com/webapps/online-help/jdeveloper/10.1.3/state/content/navId.4/navSetId._/vtTopicFile.developing_jclient_applications%7Cjui_plimitingfetching%7Ehtml/
Note: In local mode deployment (the client and Business Components reside in the same VM), the fetching of attribute properties is not supported.
You can optimize startup time for a Business Components client application and the remotely deployed Business Components by specifying the list of view object attributes that your client uses. If you create a project without the metadata, by coding to the API, you will want to add fetchAttributeProperties() to the bootstrap code of the client forms with a list of only the attributes used by the form. Without this method call, your client form would fetch all control hint properties (including the attributes format and label for example) for all the attributes of the named view objects in the application module, in a single network roundtrip.
For example, when you do not intend to use all the attributes of the ADF Swing form's bound view object, with the fetchAttributeProperties() method, your ADF Swing form fetches only the information required to layout your forms, while ignoring the attributes you do not require.
Calling fetchAttributeProperties() will prevent property methods such as getFormat() or getLabel() from being called on the Business Components attribute definition whenever the form is created."
Vlad -
Hi,
I am having a serious performance issue due to the BSEG table, and a change request in which I have to solve it. Previously the code used SELECT * on both BKPF and BSEG. I removed the SELECT * and selected only the required fields, as shown below. I also tried using cursors. But the problem occurs on the TEST server, where BSEG has more than 1 crore (10 million) entries. I have gone through some threads but am still not able to understand how to solve this problem. Please help.
select bukrs belnr gjahr bldat bstat from bkpf into table T_BKPF_p
WHERE BUKRS IN sd_bukrs AND
BLDAT < s_bldat-low
and BSTAT = ' ' .
select bukrs belnr gjahr shkzg dmbtr hkont from bseg into table T_BSEG_C
FOR ALL ENTRIES IN t_BKPF_p
WHERE BUKRS = T_bkpf_p-bukrs
AND BELNR = T_bkpf_p-belnr
AND GJAHR = T_bkpf_p-gjahr
AND HKONT = SKB1-SAKNR.
Hi Kunal,
Here is my take on your issue.
In your SELECT on BKPF you are selecting every BKPF record with the specified company code, a blank document status, and a creation date before the specified date. If your company implemented SAP 10 years ago and your user enters today's date and leaves the company code field empty, you will effectively retrieve almost all the records from BKPF (excluding the ones created today or those with a non-blank document status). That is a huge amount of data. After that you look up the corresponding BSEG records for everything you selected from BKPF.
My question to you is: why do you need to look at all the records before a given date? Why not ask the user to enter a smaller date range and make the document date and the company code mandatory entries? You do not have to look at 10 years' worth of data, especially if you are running this online (as opposed to in the background).
Your BSEG select looks correct. There is very little you can do, except add BUZEI to the field list: if you use FOR ALL ENTRIES and do not include the entire primary key, you can lose data.
TABLES: bkpf,
skb1.
SELECT-OPTIONS: s_bldat FOR bkpf-bldat OBLIGATORY,
sd_bukrs FOR bkpf-bukrs OBLIGATORY.
TYPES: BEGIN OF ty_bkpf,
bukrs TYPE bkpf-bukrs,
belnr TYPE bkpf-belnr,
gjahr TYPE bkpf-gjahr,
bldat TYPE bkpf-bldat,
bstat TYPE bkpf-bstat,
END OF ty_bkpf,
BEGIN OF ty_bseg,
bukrs TYPE bseg-bukrs,
belnr TYPE bseg-belnr,
gjahr TYPE bseg-gjahr,
buzei TYPE bseg-buzei,
shkzg TYPE bseg-shkzg,
dmbtr TYPE bseg-dmbtr,
hkont TYPE bseg-hkont,
END OF ty_bseg.
DATA: t_bkpf_p TYPE TABLE OF ty_bkpf,
t_bseg_c TYPE TABLE OF ty_bseg.
SELECT bukrs
belnr
gjahr
bldat
bstat
FROM bkpf
INTO TABLE t_bkpf_p
WHERE bukrs IN sd_bukrs
AND bldat IN s_bldat
AND bstat EQ space .
IF NOT t_bkpf_p[] IS INITIAL.
SELECT bukrs
belnr
gjahr
buzei
shkzg
dmbtr
hkont
FROM bseg
INTO TABLE t_bseg_c
FOR ALL ENTRIES IN t_bkpf_p
WHERE bukrs EQ t_bkpf_p-bukrs
AND belnr EQ t_bkpf_p-belnr
AND gjahr EQ t_bkpf_p-gjahr
AND hkont EQ skb1-saknr.
ENDIF. -
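The caveat about including the entire primary key can be sketched in plain SQL: FOR ALL ENTRIES implicitly deduplicates the result set, so leaving the line-item number (BUZEI) out of the field list collapses document lines that happen to carry identical values. (Toy data below; the DISTINCT simulates what FOR ALL ENTRIES does.)

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE bseg (bukrs TEXT, belnr TEXT, gjahr TEXT, "
            "buzei INTEGER, dmbtr REAL)")
# One document with two line items carrying the same amount
con.executemany("INSERT INTO bseg VALUES (?,?,?,?,?)",
                [("1000", "4711", "2011", 1, 100.0),
                 ("1000", "4711", "2011", 2, 100.0)])

# FOR ALL ENTRIES behaves like SELECT DISTINCT on the listed fields.
without_buzei = con.execute(
    "SELECT DISTINCT bukrs, belnr, gjahr, dmbtr FROM bseg").fetchall()
with_buzei = con.execute(
    "SELECT DISTINCT bukrs, belnr, gjahr, buzei, dmbtr FROM bseg").fetchall()

assert len(without_buzei) == 1   # one line item silently lost
assert len(with_buzei) == 2      # the full key keeps both lines
```

Any totalling over DMBTR would be wrong by 100.00 in the first case, which is why the corrected SELECT above adds BUZEI.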
Report Performance Issue - Activity
Hi gurus,
I'm developing an Activity report using the transactional database (online real-time objects).
The purpose of the report is to list all contact-related activities, and activities NOT related to a contact, by activity owner (user ID).
In order to fulfill that requirement I've created 2 reports:
1) All activities related to a contact -- Report A
pulls in Activity ID, Activity Type, Status, Contact ID
2) All activities not related to a contact UNION all activities related to a contact (base report) -- Report B
To get the list of activities not related to a contact, I'm using an advanced filter based on the result of another request, which I think is the part that slows down the query:
<Activity ID not equal to any Activity ID in Report B>
Has anyone encountered a performance issue due to the advanced filter in Analytics before?
Any input is really appreciated.
Thanks in advance,
Fina
Fina,
UNION is always the last option. If you can get all the records in one report, do not use a union.
Since all the records you are targeting are in the Activity subject area, it is not necessary to combine reports. Add a column with the following logic:
if contact id is null (or = 'Unspecified') then owner name else contact name
Hopefully, this is helping. -
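The fallback logic in that reply can be sketched as a small function; it is the same CASE-style expression, with hypothetical sample names for illustration.

```python
def display_name(contact_name, owner_name):
    # Fall back to the activity owner when there is no real contact,
    # mirroring: if contact id is null (or 'Unspecified') then owner
    if contact_name is None or contact_name == "Unspecified":
        return owner_name
    return contact_name

assert display_name(None, "J. Owner") == "J. Owner"
assert display_name("Unspecified", "J. Owner") == "J. Owner"
assert display_name("A. Contact", "J. Owner") == "A. Contact"
```

One report with this derived column replaces the UNION plus the expensive "not equal to any Activity ID in Report B" advanced filter.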
Performance issue with view selection after migration from oracle to MaxDb
Hello,
After the migration from Oracle to MaxDB we have serious performance issues with a lot of our table view selections.
Does anybody know about this problem and how to solve it?
Best regards!
Gert-Jan
Hello Gert-Jan,
most probably you need additional indexes to get better performance.
Using the command monitor you can identify the long running SQL statements and check the optimizer access strategy. Then you can decide which indexes might help.
If this is about an SAP system, you can find additional information about performance analysis in SAP notes 725489 and 819641.
SAP Hosting provides the so-called service 'MaxDB Migration Support' to help you in such cases. The service description can be found here:
http://www.saphosting.de/mediacenter/pdfs/solutionbriefs/MaxDB_de.pdf
http://www.saphosting.com/mediacenter/pdfs/solutionbriefs/maxDB-migration-support_en.pdf.
Best regards,
Melanie Handreck -
Performance issues with SAP BPC 7.0/7.5 (SP06, 07, 08) NW
Hi Experts
There are some performance issues with SAP BPC 7.5/7.0 NW: users are saying they are not getting data, or there are issues while retrieving data from the R/3 system or ECC 6.0. What do I need to check, e.g. which DataSources or cubes? How can this issue be solved?
What do I need to consider for an SAP NW BI 7.0 / SAP BPC 7.5 NW (SP06, 07, 08) implementation?
Your help is greatly appreciated.
Regards,
Qadeer
Hi,
A new SP was released in February, and most of the recent bugs should now have been caught. It has a Central Note: for SP06 it is Note 1527325 - Planning and Consolidation 7.5 SP06 NetWeaver Central Note. Most of the improvements in SP06 were related to performance, especially when logging on from the BPC clients. There you should be able to find a big list of fixes/improvements and Notes that describe them. Some of the Notes even include a test description of how to reproduce the issue in the old version.
hope this will help you
Regards
Rv -
Performance Issue in Dashboard using SAP BW NetWeaver Connection
HI Experts ,
We developed a dashboard based on BW queries; however, it takes a considerable amount of time to execute.
We are using BO Dashboards SP2, BO 4.1 and BI system 7.01.
We are looking for a few clarifications on SAP BO Xcelsius dashboards.
Though we know the limitations on component count and data volume that can badly affect dashboard performance, we have a requirement to handle huge data volumes and multiple components. Our source data lies in the SAP BI system, and we are using BICS connectivity / WebI with QaaWS for updating data in the BO dashboard.
Our requirement is complex: we must meet user expectations for a complete view of 75 KPIs in a single dashboard.
We have scenarios like YTD ,QTD and MTD other complex calculations in Bex Query.
Here are my questions,
Is there any way to provide complete functionality using large data sets to the users with the current architecture without any performance issues?
Are there any third party tools which can be used with BO Dashboard for the performance improvement and handling huge volumes?
Do you suggest any alternate solution for complete functionality?
Many thanks for your inputs in advance!
Regards
Venkat
Hi Rajesh,
Thank you so much for your response.
I have tried the way you suggested, but my issue is that I have a prompt in my WebI report based on month selection, and it is a single-valued prompt.
So I was able to schedule my report for only one month, whereas my dashboard needs to show values for one year of data, based on the months the user selects in the dashboard.
Though I use the WebI instance data in the dashboard, I am getting only one month's value, and I am not able to associate the selected month with the WebI instance in the dashboard.
Is there any option to schedule the WebI report for different months, so that the dashboard picks up the 12 monthly instances and the combo box selection in the dashboard is associated with them?
Please help me with your thoughts. -
Performance issue in Webi rep when using custom object from SAP BW univ
Hi All,
I had to design a report that runs for the previous day, so we created a custom object which ranks the dates, plus a pre-defined filter which picks the date with the highest rank.
the definition for the rank variable(in universe) is as follows:
<expression>Rank([0CALDAY].Currentmember, Order([0CALDAY].Currentmember.Level.Members ,Rank([0CALDAY].Currentmember,[0CALDAY].Currentmember.Level.Members), BDESC))</expression>
Now to the issue I am currently facing,
The report works fine when we run it in a test environment, i.e. with a small amount of data.
Our production environment has millions of rows of data, and when I run the report with the filter it just hangs. I think this is because it tries to rank all the dates (to find the max date), resulting in a huge performance issue.
Can someone suggest how this performance issue can be overcome?
I work on BO XI 3.1 with SAP BW.
Thanks and Regards,
Smitha.
Hi,
Using a variable on the BW side is not feasible since we want to use the same BW query for a few other reports as well.
Could you please explain what you mean by "use the LAG function"? How can it be used in this scenario?
Thanks and Regards,
Smitha Mohan. -
Performance issue of BI reports in SAP Enterprise portal
Dear Friends,
We have integrated BI reports with SAP Enterprise Portal 7.0. The reports run properly, but they take more time to display their content, which hurts performance.
In BEx (on the BI side), report performance is a little better than on the SAP EP platform. The BI team is also looking for ways to improve performance on the BI side.
Could you please share your valuable ideas to improve the performance at SAP EP side also ..
Thanks and Regards
Ratnakar Reddy
Hi Ratnakar,
The first step is to identify which component is causing the performance problem. Run your report in the portal, but try appending the string &PROFILING=X to the end of the URL. This will generate BI statistics which you can use to see which component (Java stack, ABAP stack, database) is causing the performance issue.
Hope this helps.