Performance Improvement Checks
Hi All,
I am trying to do some suggested performance tuning in the BW (BI 7) system.
My first step was reducing the number of sub chains in the Process Chains.
What I need help with is comparing this modified process chain with the existing process chain in terms of the number of dialog processes used and the time taken. The two comparisons can also be done separately.
Is there any transaction or tool that will help me make such comparisons?
The modified process chain is in the development server and the existing one is in the production server.
Any helpful inputs will be rewarded.
Thanks In Advance
I don't know how granular your sub-chains have been created. While there is some overhead as part of sub-chains needing to report their status, having sub-chains provides a lot of flexibility in supporting problem resolution, having multiple variations of a schedule, unit testing, etc.
So unless the sub-chains are exceptionally granular, I would look at whether you can simply rearrange things so that more steps run concurrently. Maybe Basis needs to configure more background/dialog processes so that more chains can run simultaneously, unless you have already maxed out the server.
Similar Messages
-
Performance improvement in OBIEE 11.1.1.5
Hi all,
In OBIEE 11.1.1.5, reports take a long time to load. Kindly provide me with some performance improvement guides.
Thanks,
Haree.
Hi Haree,
Steps to improve the performance.
1. Implement a caching mechanism
2. Use aggregates
3. Use aggregate navigation
4. Limit the number of initialization blocks
5. Turn off logging
6. Carry out calculations in the database
7. Use materialized views if possible
8. Use database hints
9. Alter the NQSConfig.INI parameters
Note: calculate all the aggregates in the repository itself, and create a fast refresh for materialized views (MVs).
You can also schedule an iBot to run the report every hour or so, so that the report data gets cached; when a user then runs the report, the BI Server serves the data from the cache.
This is the latest version for OBIEE11g.
http://blogs.oracle.com/pa/resource/Oracle_OBIEE_Tuning_Guide.pdf
Report level:
1. Enable the cache -- in NQSConfig.INI, change the setting from NO to YES.
2. Go to the Physical layer --> right-click the table --> Properties --> check Cacheable.
3. Try to implement an aggregate mechanism.
4. Create indexes/partitions at the database level.
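For reference, a hedged sketch of step 1: in releases where the file is edited directly, the query cache switch lives in the [ CACHE ] section of NQSConfig.INI (the path and sizes below are placeholders for illustration, not recommendations):

```ini
[ CACHE ]
ENABLE = YES;                    # step 1: turn the BI Server query cache on
DATA_STORAGE_PATHS = "C:\OracleBIData\cache" 500 MB;  # placeholder path/size
MAX_ROWS_PER_CACHE_ENTRY = 100000;   # 0 = unlimited rows per entry
MAX_CACHE_ENTRY_SIZE = 1 MB;
MAX_CACHE_ENTRIES = 1000;
```

Remember the BI Server must be restarted for NQSConfig.INI changes to take effect.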
There are multiple other ways to fine-tune reports from the OBIEE side itself:
1) You can check the granularity of the measures in your reports and have level-based measures created in the RPD using the OBIEE Aggregate Persistence utility.
http://www.rittmanmead.com/2007/10/using-the-obiee-aggregate-persistence-wizard/
This will make queries pick your aggregate tables instead of the detailed tables.
2) You can use cache seeding options, using either an iBot or the NQCMD command-line utility:
http://www.artofbi.com/index.php/2010/03/obiee-ibots-obi-caching-strategy-with-seeding-cache/
http://satyaobieesolutions.blogspot.in/2012/07/different-to-manage-cache-in-obiee-one.html
OR
http://hiteshbiblog.blogspot.com/2010/08/obiee-schedule-purge-and-re-build-of.html
Using one of the above 2 methods, you can fine-tune your reports and reduce query time.
Also, on the safer side, take the physical SQL from the log and run it directly on the DB to see the time taken, and check the explain plan with the help of a DBA.
Hope this helps.
Thanks,
Satya
-
Why GN_INVOICE_CREATE has no performance improvement even in HANA landscape?
Hi All,
We have a pricing update program which is used to update the price for a Material-Customer combination (CMC). This update is done using the FM 'GN_INVOICE_CREATE'.
The logic is designed to loop over customers, with the FM called once per customer and passed all the materials valid for that customer.
This process takes days (approx. 5 days) to execute and update a CMC set of 100 million records.
Hence we are planning to move to HANA for a better improvement in performance.
We built the same program in the HANA landscape and executed it in both systems for 1 customer and 1000 material combinations.
Unfortunately, both systems gave the same runtime of around 27 seconds.
This is very disappointing, considering the performance improvement we expected on the HANA landscape.
Could anyone shed light on areas where we are missing out and why no performance improvement was obtained?
Also, are there any configuration-related changes to be made on the HANA landscape for better performance?
The details regarding both the systems are as below.
Suite on HANA:
SAP_BASIS : 740
SAP_APPL : 617
ECC
SAP_BASIS : 731
SAP_APPL : 606
Also see the screenshots of the HANA and ECC system details attached to the original post.
Thanks & regards,
Naseem
Hi,
just to fill in on Lars' already exhaustive comments:
Migrating to HANA gives you lots of options to replace your own functionality (custom ABAP code) with HANA artifacts - views or SQLScript procedures. This is where you can really gain performance. Expecting ABAP code to automatically run faster on HANA may be unrealistic, since it depends on what the code does and how well it "translates" to a HANA environment. The key to really minimizing runtime is to replace DB calls with specific HANA views or procedures, and then call these from your code.
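As a minimal sketch of this code-pushdown idea (procedure, table, and column names are invented for illustration, not taken from the thread), a looped ABAP price update could be replaced by one set-based SQLScript procedure:

```sql
-- Hypothetical SQLScript procedure: updates prices for all materials of
-- one customer in a single set-based statement instead of an ABAP loop.
CREATE PROCEDURE update_customer_prices (IN iv_customer NVARCHAR(10))
LANGUAGE SQLSCRIPT SQL SECURITY INVOKER AS
BEGIN
  -- placeholder pricing rule, applied in one DB round trip per customer
  UPDATE prices SET price = price * 1.05
    WHERE customer = :iv_customer;
END;
```

The ABAP program would then just call this procedure per customer, so the row-by-row work stays inside the database.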
I wrote a blog on this; you might find it useful as a general introduction:
A practical example of ABAP on HANA optimization
When it comes to SAP standard code, like your mentioned FM, it is true that SAP is migrating some of this functionality to HANA-optimized versions, but this doesn't mean everything will be optimized in one go. This particular FM is probably not among those being initially selected for "HANAification", so you basically have to either create your own functionality (which might not be advisable due to the fact that this might violate data integrity) or just be patient.
But again, the beauty of HANA lies in the brand new options for developers to utilize the new ways of pushing code down to the DB server. Check out the recommendations from Lars and you'll find yourself embarking on a new and exciting journey!
Also - as a good starting point - check out the HANA developer course on open.sap.com.
Regards,
Trond -
Performance improve using TEZ/HIVE
Hi,
I’m a newbie in HDInsight. Sorry for asking simple questions. I have queries around performance improvement of my Hive query on file data of 90 GB (15 GB * 6).
We have enabled Tez as the execution engine. I heard the Avro format improves execution speed. Do Tez queries support AvroSerDe out of the box, or do I need to upload *.jar files to WASB? I’m using the latest version. Any sample query?
In Tez, can the ORC column format and Avro compression work together when we set the ORC compression level in Hive to Snappy or LZO? Is there any limitation on the number of columns for ORC tables?
Is there any best compression technique for uploading data files to Blob storage, i.e., compress and then upload? I used *.gz, which compressed files to about 1/4th of their size before upload to Blob, but the problem is that *.gz is not splittable and will always use fewer (single) mappers. Or should I use Avro with Snappy compression? Does the Microsoft Avro Library perform Snappy compression, or is there any compression format that is both splittable and compressed?
If the data structure of a file changes over time, will there be a need to reload older data? Can existing queries work without code changes?
It has been said that Tez has real-time reporting capability, but when I query the 90 GB file (including GROUP BY and ORDER BY clauses) it takes almost 8 minutes on 20 nodes. Are there any pointers to improve performance further and get the query result in seconds?
Mahender
-- Tez is an execution engine; I don't think you need any additional jar file to get AvroSerDe working on Hive when Tez is used. You can use AvroSerDe, AvroContainerInputFormat & AvroContainerOutputFormat to get Avro working when Tez is used.
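A hedged sketch of such an Avro-backed table using those classes (table and column names are invented; recent Hive versions infer the Avro schema from the column list, while older ones need an explicit avro.schema.literal in TBLPROPERTIES):

```sql
-- Hypothetical Avro-backed external table using the Avro SerDe and
-- container input/output formats mentioned above.
CREATE EXTERNAL TABLE events_avro (
  event_id   STRING,
  event_time BIGINT
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION '/data/events';
```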
-- I tried creating a table with about 220 columns; although the table was empty, I was able to query from it. How many columns does your table hold?
CREATE EXTERNAL TABLE LargColumnTable02 (t1 string, .... t220 string)
PARTITIONED BY (EventDate string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS ORC
LOCATION '/data'
TBLPROPERTIES ("orc.compress"="SNAPPY");
-- You can refer to the "Getting Avro data into Azure Blob Storage" section of
http://dennyglee.com/2013/03/12/using-avro-with-hdinsight-on-azure-at-343-industries/
-- It depends on what data has changed, and on whether you are using Hadoop, HBase, etc.
-- You will have to monitor your application and check the node manager logs to see whether there is any pause in execution. Again, it depends on what you are doing; I would suggest opening a support case to investigate further. -
Hi to all,
i am working on a performance improvement.
We are storing data in the session for specific search results, but it is growing to more than 2 MB.
Now we want to remove unnecessary data from the session.
For that we have designed the following:
we will create a side process that runs every 10 minutes and checks whether the user has accessed the session data in the last 15 minutes; if not, we remove it from the session.
What I want to know is how to create this side process.
If anybody knows an implementation, please reply to me.
Thanks in advance.
we have to create side process it will run every
10min,and it will check if the user is not accessing
session data from last 15min so we have to remove
from session.
How do you know when a user has accessed given session data?
Do you have timestamp management in your search result data structure?
Generally speaking, doing a repetitive task can be achieved with the help of java.util.Timer (http://java.sun.com/j2se/1.4.2/docs/api/java/util/Timer.html).
Just out of curiosity ... I was under the impression that only some call you ... Tim. Can I call you Sam? Or Vsevolod? Or even Matabei?
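A minimal sketch of such a side process with java.util.Timer, assuming the session keeps a last-access timestamp per entry (the map, names, and intervals below are illustrative, not from the original design):

```java
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;

public class SessionSweeper {
    // Hypothetical store: search-result key -> last-access time in millis.
    static final Map<String, Long> lastAccess = new ConcurrentHashMap<>();
    static final long IDLE_LIMIT_MS = 15 * 60 * 1000L; // 15 minutes

    // Removes entries idle longer than the limit; returns the removed keys.
    static List<String> sweep(long now) {
        List<String> removed = new ArrayList<>();
        for (Iterator<Map.Entry<String, Long>> it = lastAccess.entrySet().iterator(); it.hasNext(); ) {
            Map.Entry<String, Long> e = it.next();
            if (now - e.getValue() > IDLE_LIMIT_MS) {
                removed.add(e.getKey());
                it.remove();
            }
        }
        return removed;
    }

    public static void main(String[] args) {
        long now = System.currentTimeMillis();
        lastAccess.put("activeSearch", now);                  // touched just now
        lastAccess.put("staleSearch", now - 20 * 60 * 1000L); // idle 20 minutes
        System.out.println(sweep(now)); // prints [staleSearch]

        // The "side process": a daemon Timer firing every 10 minutes.
        Timer timer = new Timer(true);
        timer.schedule(new TimerTask() {
            @Override public void run() { sweep(System.currentTimeMillis()); }
        }, 10 * 60 * 1000L, 10 * 60 * 1000L);
    }
}
```

In a real web application the same sweep would read the timestamps from the HttpSession attributes rather than a static map.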
-
Tabular Model Performance Improvements
Hi !
We have built an inline tabular model which has a fact table and 2 dimension tables. The performance of the SSRS report is very slow, and this is a bottleneck in deciding on SSRS as the reporting tool.
Can you help us with performance improvements for the inline tabular model?
Regards,
Hi Bhadri,
As Sorna said, it is hard to give you detailed tips to improve tabular model performance given the limited information. Here is a useful link about performance tuning of tabular models in SQL Server 2012 Analysis Services; please refer to the link below.
http://msdn.microsoft.com/en-us/library/dn393915.aspx
If this is not what you want, please elaborate with more detailed information, so that we can make further analysis.
Regards,
Charlie Liao
TechNet Community Support -
Oracle 11g R2 client installation on Windows 7 64 bit hangs after performing prerequisite checks
Windows 7 Professional SP1 64bit OS
6 GB RAM
380 GB free space
Extracted client to C:\Stage\win64_11gR2_client\client
Ran setup.exe as administrator, but the installation always hangs after step 5 of 7 is completed (Perform Prerequisite Checks). The install log ends with the entry "INFO: Get view named [SummaryUI]".
What am I missing here? Any help would be greatly appreciated.
C:\Stage\win64_11gR2_client\client>dir
Directory of C:\Stage\win64_11gR2_client\client
06/15/2013 02:56 PM <DIR> .
06/15/2013 02:56 PM <DIR> ..
06/15/2013 02:56 PM <DIR> doc
06/15/2013 02:56 PM <DIR> install
06/15/2013 02:56 PM <DIR> response
06/15/2013 02:55 PM 341,304 setup.exe
06/15/2013 02:55 PM 56 setup.ini
06/15/2013 02:59 PM <DIR> stage
06/15/2013 02:55 PM 4,327 welcome.html
3 File(s) 345,687 bytes
6 Dir(s) 412,474,404,864 bytes free
Last few lines of install log:
WARNING: Active Help Content for InstallLocationPane.cbxOracleBases do not exist. Error :Can't find resource for bundle oracle.install.ivw.client.resource.ContextualHelpResource, key InstallLocationPane.cbxOracleBases.conciseHelpText
WARNING: Active Help Content for InstallLocationPane.cbxSoftwareLoc do not exist. Error :Can't find resource for bundle oracle.install.ivw.client.resource.ContextualHelpResource, key InstallLocationPane.cbxSoftwareLoc.conciseHelpText
INFO: View for [InstallLocationUI] is oracle.install.ivw.client.view.InstallLocationGUI@75f0f8ff
INFO: Initializing view <InstallLocationUI> at state <getOracleHome>
INFO: inventory location isC:\Program Files\Oracle\Inventory
INFO: Completed initializing view <InstallLocationUI> at state <getOracleHome>
INFO: Displaying view <InstallLocationUI> at state <getOracleHome>
INFO: Completed displaying view <InstallLocationUI> at state <getOracleHome>
INFO: Loading view <InstallLocationUI> at state <getOracleHome>
INFO: Completed loading view <InstallLocationUI> at state <getOracleHome>
INFO: Localizing view <InstallLocationUI> at state <getOracleHome>
INFO: Completed localizing view <InstallLocationUI> at state <getOracleHome>
INFO: Waiting for completion of background operations
INFO: Completed background operations
INFO: Executing action at state getOracleHome
INFO: Completed executing action at state <getOracleHome>
INFO: Waiting for completion of background operations
INFO: Completed background operations
INFO: Moved to state <getOracleHome>
INFO: inventory location isC:\Program Files\Oracle\Inventory
INFO: Waiting for completion of background operations
INFO: Completed background operations
INFO: Validating view at state <getOracleHome>
INFO: Completed validating view at state <getOracleHome>
INFO: Validating state <getOracleHome>
INFO: custom prereq file name: oracle.client_Administrator.xml
INFO: refDataFile: C:\Stage\win64_11gR2_client\client\stage\cvu\oracle.client_Administrator.xml
INFO: isCustomRefDataFilePresent: false
INFO: InstallAreaControl exists: false
INFO: Checking:NEW_HOME
INFO: Checking:COMP
INFO: Checking:COMP
INFO: Checking:COMP
INFO: Checking:COMP
INFO: Checking:COMP
INFO: Checking:COMP
INFO: Checking:COMP
INFO: Checking:COMP
INFO: Checking:ORCA_HOME
INFO: Reading shiphome metadata from c:\Stage\win64_11gR2_client\client\install\..\stage\shiphomeproperties.xml
INFO: Loading beanstore from file:/c:/Stage/win64_11gR2_client/client/install/../stage/shiphomeproperties.xml
INFO: Translating external format into raw format
INFO: Restoring class oracle.install.driver.oui.ShiphomeMetadata from file:/c:/Stage/win64_11gR2_client/client/install/../stage/shiphomeproperties.xml
INFO: inventory location isC:\Program Files\Oracle\Inventory
INFO: inventory location isC:\Program Files\Oracle\Inventory
INFO: size estimation for Administratorinstall is 1068.0003070831299
INFO: PATH has :==>C:\Temp\OraInstall2013-06-17_02-55-15PM\jdk\jre\bin;.;C:\windows\system32;C:\windows;C:\Perl\site\bin;C:\Perl\bin;C:\Program Files\Common Files\Microsoft Shared\Windows Live;C:\Program Files (x86)\Common Files\Microsoft Shared\Windows Live;C:\windows\system32;C:\windows;C:\windows\System32\Wbem;C:\windows\System32\WindowsPowerShell\v1.0\;C:\Program Files (x86)\Windows Live\Shared;C:\APPS\PuTTY\;C:\Program Files (x86)\IDM Computer Solutions\UltraEdit\
INFO: Completed validating state <getOracleHome>
INFO: InstallLocationAction to INVENTORY_NO
INFO: Verifying route INVENTORY_NO
INFO: Waiting for completion of background operations
INFO: Completed background operations
INFO: Executing action at state prereqExecutionDecider
INFO: Completed executing action at state <prereqExecutionDecider>
INFO: Waiting for completion of background operations
INFO: Completed background operations
INFO: Moved to state <prereqExecutionDecider>
INFO: Waiting for completion of background operations
INFO: Completed background operations
INFO: Validating view at state <prereqExecutionDecider>
INFO: Completed validating view at state <prereqExecutionDecider>
INFO: Validating state <prereqExecutionDecider>
WARNING: Validation disabled for the state prereqExecutionDecider
INFO: Completed validating state <prereqExecutionDecider>
INFO: Verifying route executeprereqs
INFO: Get view named [PrereqUI]
INFO: View for [PrereqUI] is [email protected]e4b
INFO: Initializing view <PrereqUI> at state <checkPrereqs>
INFO: Completed initializing view <PrereqUI> at state <checkPrereqs>
INFO: Displaying view <PrereqUI> at state <checkPrereqs>
INFO: Completed displaying view <PrereqUI> at state <checkPrereqs>
INFO: Loading view <PrereqUI> at state <checkPrereqs>
INFO: Completed loading view <PrereqUI> at state <checkPrereqs>
INFO: Localizing view <PrereqUI> at state <checkPrereqs>
INFO: Completed localizing view <PrereqUI> at state <checkPrereqs>
INFO: Waiting for completion of background operations
INFO: Completed background operations
INFO: Executing action at state checkPrereqs
INFO: custom prereq file name: oracle.client_Administrator.xml
INFO: refDataFile: C:\Stage\win64_11gR2_client\client\stage\cvu\oracle.client_Administrator.xml
INFO: isCustomRefDataFilePresent: false
INFO: Completed executing action at state <checkPrereqs>
INFO: Waiting for completion of background operations
INFO: Finishing all forked tasks at state checkPrereqs
INFO: Waiting for completion all forked tasks at state checkPrereqs
INFO: Creating PrereqChecker Job for leaf task Physical Memory
INFO: Creating CompositePrereqChecker Job for container task Free Space
INFO: Creating PrereqChecker Job for leaf task Free Space: PN-PC:C:\Temp
INFO: Creating PrereqChecker Job for leaf task Architecture
INFO: Creating PrereqChecker Job for leaf task Environment variable: "PATH"
INFO: CVU tracingEnabled = false
INFO: Nodes are prepared for verification.
INFO: *********************************************
INFO: Physical Memory: This is a prerequisite condition to test whether the system has at least 128MB (131072.0KB) of total physical memory.
INFO: Severity:IGNORABLE
INFO: OverallStatus:SUCCESSFUL
INFO: -----------------------------------------------
INFO: Verification Result for Node:PN-PC
INFO: Expected Value:128MB (131072.0KB)
INFO: Actual Value:5.9491GB (6238064.0KB)
INFO: -----------------------------------------------
INFO: *********************************************
INFO: Free Space: PN-PC:C:\Temp: This is a prerequisite condition to test whether sufficient free space is available in the file system.
INFO: Severity:IGNORABLE
INFO: OverallStatus:SUCCESSFUL
INFO: -----------------------------------------------
INFO: Verification Result for Node:PN-PC
INFO: Expected Value:130MB
INFO: Actual Value:384.1495GB
INFO: -----------------------------------------------
INFO: *********************************************
INFO: Architecture: This is a prerequisite condition to test whether the system has a certified architecture.
INFO: Severity:CRITICAL
INFO: OverallStatus:SUCCESSFUL
INFO: -----------------------------------------------
INFO: Verification Result for Node:PN-PC
INFO: Expected Value:64-bit
INFO: Actual Value:64-bit
INFO: -----------------------------------------------
INFO: *********************************************
INFO: Environment variable: "PATH": This test checks whether the length of the environment variable "PATH" does not exceed the recommended length.
INFO: Severity:CRITICAL
INFO: OverallStatus:SUCCESSFUL
INFO: -----------------------------------------------
INFO: Verification Result for Node:PN-PC
INFO: Expected Value:1023
INFO: Actual Value:369
INFO: -----------------------------------------------
INFO: All forked task are completed at state checkPrereqs
INFO: Completed background operations
INFO: Moved to state <checkPrereqs>
INFO: Waiting for completion of background operations
INFO: Completed background operations
INFO: Validating view at state <checkPrereqs>
INFO: Completed validating view at state <checkPrereqs>
INFO: Validating state <checkPrereqs>
INFO: Using default Validator configured in the Action class oracle.install.ivw.client.action.PrereqAction
INFO: Completed validating state <checkPrereqs>
INFO: Verifying route success
INFO: Get view named [SummaryUI]
Yes, I've tried using the -jreLoc option but it doesn't seem to like the path.
Version 6.0.200.2 is at C:\Program Files (x86)\Java\jre6\bin
C:\Stage\win64_11gR2_client\client>setup.exe -jreLoc "C:\Program Files (x86)\Java\jre6"
Starting Oracle Universal Installer...
Checking monitor: must be configured to display at least 256 colors Higher than 256 . Actual 4294967296 Passed
Preparing to launch Oracle Universal Installer from C:\Temp\OraInstall2013-06-17_04-47-16PM. Please wait ...
The Java RunTime Environment was not found at "C:\Program Files (x86)\Java\jre6"\bin\javaw.exe. Hence, the Oracle Universal Installer cannot be run.
Please visit http://www.javasoft.com and install JRE version 1.3.1 or higher and try again
I downloaded a newer version to C:\APPS\Java, thinking the space in the path for Program Files (x86) might be the problem, but that also fails with the same error message.
Version 7.0.210.11 is at C:\APPS\Java\bin
C:\Stage\win64_11gR2_client\client>setup.exe -jreLoc "C:\APPS\Java\"
Starting Oracle Universal Installer...
Checking monitor: must be configured to display at least 256 colors Higher than 256 . Actual 4294967296 Passed
Preparing to launch Oracle Universal Installer from C:\Temp\OraInstall2013-06-17_05-09-13PM. Please wait ...
The Java RunTime Environment was not found at C:APPS\Java"\bin\javaw.exe. Hence, the Oracle Universal Installer cannot be run.
Please visit http://www.javasoft.com and install JRE version 1.3.1 or higher and try again -
Best practice for performing spell check in ADF
Hi,
I would like to know if there is a way to perform spell checking in ADF. What is the best way to do it? Does ADF have some functionality for that, or do I need an external library?
Any help will be appreciated.
thank you in advance,
Abraham
I'm using JDeveloper 11.1.1.4.0 - Build JDEVADF_11.1.1.4.0_GENERIC_101227.1736.5923
A couple of related threads:
http://kr.forums.oracle.com/forums/thread.jspa?threadID=2167553
spell check for the input text
Maybe you can check if there is an AJAX way of doing this through JSF. -
How to perform validity checks just before form data is saved?
hello all,
i need to perform some checks on various fields just before data on a standard form is saved, and possibly roll back the transaction.
if the checks are successful, i have to perform some DB operations in the same transaction as the main data.
the solution (if one exists!) has to be add-on only -> no triggers / no procedures / no SBO_Transaction things
scenario:
involved tables:
OITM
USERDEFINEDTABLE1
USERDEFINEDTABLE2
involved forms:
Items - OITM - formtypeex = "150"
logic:
a user-defined field U_UDFCHECKACTIVE has been added to OITM, and a combo box containing the values Y and N, bound to OITM.U_UDFCHECKACTIVE, has been added to the Items form.
when the user saves an item, if OITM.U_UDFCHECKACTIVE == Y we have to perform additional checks on USERDEFINEDTABLE1.
if the check result is true, we have to save the current OITM data and insert a record in USERDEFINEDTABLE2 containing the itmgrp value ( )
if the check result is false, we have to block the save operation.
if OITM.U_UDFCHECKACTIVE == N, no check is performed and everything works normally.
we started hooking the form data event for oitm / form type 150:
Application.SBO_Application.FormDataEvent += new SAPbouiCOM._IApplicationEvents_FormDataEventEventHandler(FormDataEventHandler.SBO_Application_FormDataEvent);
then we filtered the event:
switch (BusinessObjectInfo.FormTypeEx)
{
    case "150":
        if (BusinessObjectInfo.BeforeAction &&
            (BusinessObjectInfo.EventType == SAPbouiCOM.BoEventTypes.et_FORM_DATA_UPDATE ||
             BusinessObjectInfo.EventType == SAPbouiCOM.BoEventTypes.et_FORM_DATA_ADD))
        {
            BL.Items_BusinessLogic.SBO_Application_FormDataEvent(ref BusinessObjectInfo, out BubbleEvent);
        }
        break;
}
in our business logic class i don't know how to correctly check the values just before they are saved to the DB.
for example i tried using oitm.Browser:
if (oitm.Browser.GetByKeys(BusinessObjectInfo.ObjectKey))
{
    String value = oitm.UserFields.Fields.Item("U_UDFCHECKACTIVE").Value as String;
    // perform checks
    // if OK, insert records and BubbleEvent = true
    // if KO, roll back and BubbleEvent = false
    // if checks are not to be performed, then BubbleEvent = true
}
but it's wrong, as it populates the oitm object from the DB, not from the form. i get the old values stored in the object - values that are about to be overwritten with those present on the form.
i tried inspecting the form for data sources, but i don't know how to navigate to the right one; i found only system-generated objects named SYS_##, with no clue about the related DB field.
i tried getting the values directly from the form, but the BusinessObjectInfo object contains nothing that uniquely identifies the form i'm editing.
i mean, if i have 2 item forms open, i don't know which one i'm saving.
i'm expecting to find somewhere a temporary OITM business object containing the values that are about to be saved - values that i can easily check.
so i'm asking you how to find those values in order to perform my BL checks.
thank you in advance
Hi Christian,
The values that are about to be saved should be in the datasources:
form.DataSources.DBDataSources.Item("OITM");
form.DataSources.DBDataSources.Item("ITM1");
The form variable should be form 150.
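As a hedged sketch of Pedro's suggestion (the FormUID lookup and helper are assumptions based on the thread, not verified against a specific UI API version), the pending value can be read from the form's DB data source inside the BeforeAction handler:

```csharp
// Hedged sketch inside the FormDataEvent handler (BeforeAction == true,
// FormTypeEx == "150"). Assumes BusinessObjectInfo exposes FormUID.
SAPbouiCOM.Form form =
    Application.SBO_Application.Forms.Item(BusinessObjectInfo.FormUID);
SAPbouiCOM.DBDataSource oitmSource =
    form.DataSources.DBDataSources.Item("OITM");

// Row 0 holds the header values currently on the form, i.e. the data about
// to be saved (not the old DB values that oitm.Browser would return).
string checkActive = oitmSource.GetValue("U_UDFCHECKACTIVE", 0).Trim();
if (checkActive == "Y")
{
    bool checksPassed = RunChecksAgainstUserTable1(); // hypothetical helper
    BubbleEvent = checksPassed; // false blocks the save operation
}
```

Using the event's FormUID also resolves the "which of my two open item forms is being saved" question from the original post.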
Best regards,
Pedro Magueija -
DS 5.2 P4 performance improvement
We have +/- 300,000 users that regularly authenticate against our DS. The user ou is divided into ou=internal (20,000 uids) and ou=external (280,000 uids). Approximately 85-90% of the traffic happens on the internal ou. The question is: could I get any performance improvement by separating the internal branch into its own suffix/database? Would running two databases adversely affect performance instead? We see performance impacts when big searches are performed on the ou=external branch. Would the separation isolate the issue, or would those searches most likely affect the DS as a whole?
Thanks for your help!
Enrique.
Thank you for the info. Are you a Sun guy - do you work for Sun?
Yes I am. I'm the Architect for Directory Server Enterprise Edition 6.0. Previously I worked on all DS 5 releases (mostly on Replication).
You are getting the Dukes!
Thanks.
Ludovic. -
Performance improvement in a function module
Hi All,
I am using SAP version 6.0. I have a function module to retrieve POs; for just 10,000 records it is taking a long time.
Can anyone suggest ways to improve the performance?
Thanks in advance.
Moderator message - Welcome to SCN.
But
Moderator message - Please see Please Read before Posting in the Performance and Tuning Forum before posting
Just 10,000 records? The first rule in performance improvement is to reduce the amount of selected data. If you cannot do that, it's going to take time.
I wouldn't bother with a BAPI for so many records. Write some custom code to get only the data you need.
Tob -
Pls help me to modify the query for performance improvement
Hi,
I have the below initialization
DECLARE @Active bit = 1;
DECLARE @id int;
SELECT @Active = CASE WHEN id = @id AND [Rank] = 'Good' THEN 0 ELSE 1 END FROM dbo.Students;
I have to change this query in such a way that the conditions id=@id and [Rank]='Good' go into the WHERE clause of the query. In that case, how can I use a CASE statement to retrieve 1 or 0? Can you please help me modify this initialization?
I don't understand your query... Maybe the below? Or provide us sample data and your expected output...
SELECT * FROM dbo.Students
WHERE @Active = CASE WHEN id = @id AND [Rank] = 'Good' THEN 0 ELSE 1 END
But I doubt you will get a performance improvement here.
Do you have an index on id?
If you are looking to get the data for @id with [Rank] = 'Good', then use the query below. Make sure you have an index on the (id, rank) combination.
SELECT * FROM dbo.Students
WHERE id = @id
AND [Rank] = 'Good'
MV Refresh Performance Improvements in 11g
Hi there,
the 11g New Features Guide says in section "1.4.1.8 Refresh Performance Improvements":
"Refresh operations on materialized views are now faster with the following improvements:
1. Refresh statement combinations (merge and delete)
2. Removal of unnecessary refresh hint
3. Index creation for UNION ALL MV
4. PCT refresh possible for UNION ALL MV"
While I understand (3.) and (4.) I don't quite understand (1.) and (2.). Has there been a change in the internal implementation of the refresh (from a single MERGE statement)? If yes, then which? Is there a Note or something in the knowledge base, about these enhancements in 11g? I couldn't find any.
These considerations are necessary for the decision whether to migrate to 11g or not...
Thanks in advance.
I am not quite sure what you mean. Do you perhaps mean that the MV logs work correctly when you perform MERGE statements with DELETE on the detail tables of the MV?
And where are the performance improvements? What is the refresh hint?
Though I am using MVs and MV logs at the moment, our app performs deletes and inserts in the background (no merges). The MV-log-based fast refresh scales very badly, meaning that performance drops very quickly as the changed data set grows. -
Will there be a performance improvement with separate tables vs. a single table with multiple partitions? Is it advisable to have separate tables rather than a single big table with partitions? Can we expect the same performance from a single big table with partitions? What is the recommended approach in HANA?
Suren,
first off a friendly reminder: SCN is a public forum and for you as an SAP employee there are multiple internal forums/communities/JAM groups available. You may want to consider this.
Concerning your question:
You didn't tell us what you want to do with your table or your set of tables.
As tables are not only storage units but usually carry semantics - read: data stored in one table means something different from the same data in a different table - partitioned tables cannot simply be substituted by multiple tables.
Looked at on a storage-technology level, table partitions are practically the same as tables. Each partition has its own delta store and can be loaded into and displaced from memory independently of the others.
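To make the storage-unit point concrete, here is a hedged sketch of a partitioned column table (table, columns, and partition count are invented for illustration):

```sql
-- Hypothetical HANA column table, hash-partitioned on the key column.
-- Each of the 4 partitions has its own delta store and can be loaded
-- into / unloaded from memory independently of the others.
CREATE COLUMN TABLE sales_items (
  item_id INTEGER,
  amount  DECIMAL(15,2)
) PARTITION BY HASH (item_id) PARTITIONS 4;
```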
Generally speaking there shouldn't be too many performance differences between a partitioned table and multiple tables.
However, when dealing with partitioned tables, the additional step of determining the partition to work on is always required. If computing the result of the partitioning function takes a major share of your total runtime (which is unlikely), then partitioned tables could have a negative performance impact.
Having said this: as with all performance related questions, to get a conclusive answer you need to measure the times required for both alternatives.
- Lars -
DMA Performance Improvements for TIO-based Devices
Hello!
DMA Performance Improvements for TIO-based Devices
http://digital.ni.com/public.nsf/websearch/1B64310FAE9007C086256A1D006D9BBF
Can I apply the procedure to NI-DAQmx 9? These INI files don't seem to exist anymore in the newer version.
Best, Viktor
Hi Viktor,
this page is 7 years old and doesn't apply to DAQmx.
Regards, Stephan