Reduce time

Hi
I have more than one lakh (100,000) records in my table.
1) I have created an index on the primary key column.
2) I have used analytic functions in the WHERE condition.
Now I want to reduce the time taken by that query. Is there any way to do so?

BluShadow wrote:
Or use a laptop and sit on a fast plane or train. The faster it goes, the slower time goes. ;)
You don't even need to do that!
A lesser known consequence of general relativity is that time moves slower in a stronger gravitational field. On Earth, one implication of this is that a clock on the second floor of an office building will run faster than one on the first floor. Using an ultra-precise clock setup, the NIST researchers tested this as well. One of the optical clocks was placed about a foot above the other and measurements were taken. They found a fractional frequency change of (4.1±1.6)x10^-17; plugging this number back into relativity's formulas produced an equivalent height differential of 14.5±5.9 inches, a result that nicely brackets the 12-inch difference in the experimental setup.
http://arstechnica.com/science/news/2010/09/einsteins-relativity-measured-in-newtons-domain.ars
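Back to the original question: Oracle does not allow analytic functions directly in a WHERE clause, so they have to be computed in an inline view and filtered in the outer query; any ordinary, index-friendly predicates should go inside the view so the analytic function is computed over fewer rows. A minimal sketch with hypothetical table and column names:

```sql
-- Hypothetical example: latest order per customer. The indexed filter on
-- order_date is applied inside the view, so the analytic function runs
-- over far fewer rows before the outer query filters on it.
SELECT *
FROM  (SELECT o.*,
              ROW_NUMBER() OVER (PARTITION BY customer_id
                                 ORDER BY order_date DESC) AS rn
       FROM   orders o
       WHERE  o.order_date >= DATE '2010-01-01')
WHERE rn = 1;
```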

Similar Messages

  • How can I reduce time of execution

    Hi all,
        I have a report program that uses the HR logical database (PNP). This report was developed a year ago and has been running well. The program was designed to take about 1.20 hours to execute, but for the past week it has been taking 3 hours. I need to find out why it is taking this much time, and I have no clue where to start. Please guide me on how to solve this issue.
       The right answer will be appreciated. Thank you.

    Hi,
    To reduce execution time, go for runtime analysis.
    I think this documentation will help you out...
    RUNTIME ANALYSIS
    Runtime analysis is an additional development workbench tool that is quite useful for analyzing the performance of an ABAP/4 program or transaction. With this tool, the system can display information about:
    •     Executed instructions
    •     Execution time
    •     Tables and types of access
    •     Chronological execution flow
    The runtime analysis tool creates lists that reveal expensive statements and summarize table accesses. Runtime analysis is specifically designed for tuning individual programs and transactions.
    The runtime analysis tool measures ABAP/4 statements that are potentially expensive in terms of CPU time. The most significant of these are:
    •     Statements used for database access, such as SELECT.
    •     Statements used for modularization, such as MODULE, PERFORM, and CALL FUNCTION.
    •     Internal table statements, such as APPEND and COLLECT.
    Starting Runtime Analysis
    •     From the ABAP/4 development workbench, select Test – Runtime Analysis.
    •     From the ABAP/4 editor, select Utilities – More Utilities – Runtime Analysis.
    •     From the ABAP/4 source code screen, select Execute – Runtime Analysis.
    •     From any R/3 screen, select System – Utilities – Runtime Analysis.
    •     Enter transaction code SE30 in the command field.
    Any of these takes you to the initial screen of transaction SE30.
    On the initial screen, select the type of object you want to analyze, i.e. program or transaction, enter the name of the object, and click Execute. The system will execute the specified object and generate a trace file (performance data file), which can be analyzed once the transaction or program has finished.
    Analyzing a performance data file
    These files are created at operating system level and often occupy a lot of disk space, so be sure to remove files that are no longer needed.
    To analyze the files:
    •     Click on Analyze.
    •     From the Goto menu you can get an overview of the runtime analysis.
    The options are as follows:
    •     Hit list – Displays a list of the most expensive instructions in the system.
    •     Tables – Displays the most important tables, the number of accesses, and the time needed for the accesses.
    •     Group hit list – Displays a list of the performed instructions, classified by instruction type.
    •     Call hierarchy – Presents a chronological listing of the flow of calls during the execution of the program.
    During runtime analysis, the system measures the statements and stores these measurements in a performance data file. If you measure the same program or transaction several times, the data can vary; many factors, e.g. network traffic, make it difficult to reproduce identical results.
    When you evaluate this file, the system displays the Runtime Analysis Evaluation overview screen, including a bar chart of the total execution time. From this screen you can analyze the same views: hit list, tables, group hit list, and call hierarchy.
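    For a quick, targeted measurement without a full SE30 trace, a small snippet can time an individual block of code. A minimal sketch (variable names are illustrative):

```abap
DATA: t1 TYPE i,
      t2 TYPE i.

GET RUN TIME FIELD t1.
" ... the statements you want to measure ...
GET RUN TIME FIELD t2.

" GET RUN TIME returns microseconds, so the difference between the two
" calls is the elapsed runtime of the measured block.
WRITE: / 'Runtime (microseconds):', t2 - t1.
```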

  • Synchronous scenario: how to reduce time? Please advise (urgent).

    Hi All,
    I have a synchronous scenario. It has no BPM; it is a simple synchronous scenario (HTTP --- Web service).
    It is taking 10 seconds. How can I reduce the time taken by this scenario?
    Please advise.

    Hi,
    The time consumption is based on the factors below:
    1. The amount of data to be processed across the interface.
    2. The response time of the receiver system.
    3. The type of mapping used and the implementation of any complex logic with advanced UDFs, etc.
    4. Hardware configuration.
    5. Resource consumption, etc.
    Please refer to the fine-tuning doc below.
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/489f5844-0c01-0010-79be-acc3b52250fd
    XI Performance Benchmarks ?
    Thanks
    Swarup

  • Reduce Time for Rman Backup

    Dear Experts;
    An RMAN level 0 backup is taking about 5:26 hours; the backup size is now 312 GB. I have enabled block change tracking, which reduces the time for an incremental level 1 backup from 2 hours to almost 3 minutes.
    The database shows that the biggest tablespace is "users".
    I would like suggestions for reducing the level 0 backup time, or is there any way to break up the level 0 backup? I can allocate channels, but it will ultimately still take time when backing up the "users" tablespace.
    Right now I am taking the backup to a USB 2.0 drive.
    Regards

    As you are taking the backup to a USB drive, there is not much that can be done to improve the speed. If you are concerned about the backup being slow, you could take the backup on local disk (which is faster and more efficient) and then move the backups from the disk to the USB drive.
    This can be done in a single backup script as a two-part operation:
    1) Take the backup to disk.
    2) Copy the backup to the USB drive and delete the backups from the disk.
    There are many additional features that you can add to enhance it, though.
    Regards,
    V
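    The two-step approach above can be sketched as an RMAN script; the paths are illustrative assumptions:

```
# Step 1: back up to fast local disk first.
RUN {
  ALLOCATE CHANNEL d1 DEVICE TYPE DISK FORMAT '/backup/stage/%U';
  BACKUP INCREMENTAL LEVEL 0 DATABASE PLUS ARCHIVELOG;
}

# Step 2: copy the backup sets to the USB mount and
# remove the staged copies from local disk.
BACKUP BACKUPSET ALL FORMAT '/mnt/usb/%U' DELETE INPUT;
```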

  • Reducing time while Filtering Out New Records with the Same Key

    Hi Experts,
    I have an issue. I am trying to load data into 0MATERIAL. While loading via DTP, the step "Filter Out New Records with the Same Key" takes a long time to process: for each data package it takes around 30 minutes to filter out new records with the same key, while for other master data it hardly takes 5 minutes.
    Any pointers on how to reduce the time...
    Thanks in advance
    Sam

    Hello,
    No, there is no need to make that change then.
    Can you tell me: are you doing a full load daily to the 0MATERIAL object? Why don't you use a delta?
    Also, if the DTP is of type FULL, are you deleting the previous PSA requests?
    Maybe you can ask the Basis guys to run a trace to check what is really happening on the DB side during these 30 minutes, and then maybe we can find the needed fix.
    Regards,
    Shashank

  • Reduce time to build setup project and size of .msi

    I have noticed that since we started working with Crystal 2008 (previously we used version 8.5), the merge module added to the Visual Studio 2005 setup project seems to cause the build process to take much longer than before (nearly 10 minutes), and the resulting .msi has ballooned from a few megs to around 54 MB.
    I think this is because it includes language support for all languages, even though the only language we need is English. I might be misunderstanding the cause - this is just my best guess.
    Does anyone know if there is a way to reduce the amount of time to build, and reduce the size of the .msi?
    Thank you for any help.. shannon

    Thank you for the response..
    My Visual Studio 2005 solution is configured to not build the setup project automatically.  I only build it manually when I need to do a test deployment on the development server.  As I am approaching the end of the development cycle, I am having to deploy to the dev server pretty frequently - this is why the long build process is becoming more of a problem for me.  It is making the testing process very cumbersome.
    To be honest, the size of the .msi file doesn't cause any problem for me, but I am guessing that if there were a way to reduce the size, it would also reduce the time, since it seems to take a long time to pack everything into that .msi during the setup project build.

  • How to reduce time for gather statistics for a table.

    I have a table of size 520 GB.
    One of its partitions is 38 GB,
    and the total size of the table's indexes is 412 GB.
    Server/instance details.
    ==========
    56 cpu -> Hyper threading enable
    280 gb ram
    35 gb sga
    27 gb buffer cache
    4.5 gb shared pool size
    25 gb pga
    undo size 90gb
    temp size 150 gb
    Details :
    exec dbms_stats.gather_table_stats('OWNER','TAB_NAME',PARTNAME=>'PART_NAME',CASCADE=>FALSE,ESTIMATE_PERCENT=>10,DEGREE=>30,NO_INVALIDATE=>TRUE);
    Even when I run this at an idle time, when there is no load, it takes 28 minutes to complete.
    Can anybody please tell me how we can reduce the stats gathering time?
    Thanks in advance,
    Tapas Karmakar
    Oracle DBA.

    Enable tracing to see where the time is going.
    Parallel 30 seems optimistic - unless you have a large number of discs to support the I/O?
    You haven't limited histogram collection, and most of the time spent on histograms may be wasted time - what histograms do you really need, and how many does Oracle analyse and then discard?
    Using a block sample may help slightly.
    You haven't limited the granularity of the stats collection to the partition - the default is partition plus table, so I think you're also doing a massive sample on the table after completing the partition. Is this what you want to do, or do you have an alternative strategy for generating table-level stats?
    Regards
    Jonathan Lewis
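    Putting those suggestions together, the call might look something like the sketch below; the granularity and method_opt values are illustrative assumptions, not a recommendation for this specific system:

```sql
-- Gather stats for just the one partition, skipping the table-level
-- re-sample, suppressing histograms, and using block sampling:
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'OWNER',
    tabname          => 'TAB_NAME',
    partname         => 'PART_NAME',
    granularity      => 'PARTITION',              -- partition only; default is partition plus table
    method_opt       => 'FOR ALL COLUMNS SIZE 1', -- no histograms unless you know you need them
    estimate_percent => 10,
    block_sample     => TRUE,
    cascade          => FALSE,
    degree           => 30,
    no_invalidate    => TRUE);
END;
/
```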

  • How to reduce time for replicating large tables?

    Hi
    Any suggestions on how to reduce the amount of time it takes to replicate a large table when it is first created?
    I have a table with 150 million rows in it, and it takes forever to start the replication process even if I run it in parallel, and I can’t afford the downtime.

    What downtime are you referring to? The primary doesn't need to be down when you're setting up replication and you're presumably still in the process of doing the initial configuration on the replicated database, so it's not really down, it's just not up yet.
    Justin

  • Reducing time required for ABAP-only copyback (system copy) process

    Our company is investigating how to reduce the amount of time it takes to perform a copyback (system copy) from a production ABAP system to a QA system.  We use a similar process for all ABAP-only systems in our landscape, ranging from 3.1h systems to ECC6.0 ABAP-only systems on both DB2 and Oracle database platforms, and the process takes approximately two weeks of effort from end-to-end (this includes time required to resolve any issues encountered). 
    Here is an overview of the process we use:
    •     Create and release backup transports of key system tables and IDs (via client copy) in the QA system to be overwritten (including RFC-related tables, partner profile and IDoc setup-related tables, scheduled background jobs, archiving configuration, etc.).
    •     Reconfigure the landscape transport route to remove the QA system from the transport landscape.
    •     Create a virtual import queue attached to the development system to capture all transports released from development during the QA downtime.
    •     Take a backup of the target production database.
    •     Overwrite the QA destination database with the production copy.
    •     Localize the database (performed by DBAs).
    •     Overview of Basis tasks (for smaller systems, this process can be completed in one or two days, but for larger systems, it takes closer to 5 days because of the BDLS runtime and the time it takes to import larger transport requests and the user ID client copy transports):
    o     Import the SAP license.
    o     Execute SICK to check the system.
    o     Execute BDLS to localize the system.
    o     Clear out performance statistics and scheduled background jobs.
    o     Import the backup transports.
    o     Import the QA client copy of user IDs.
    o     Import/reschedule background jobs.
    o     Perform any system-specific localization (example: for a CRM system with TREX, delete the old indexes).
    •     Restore the previous transport route to include the QA system back in the landscape.
    •     Import all transports released from the development system during the QA system downtime.
    Our company's procedure is similar to the procedure demonstrated in this 2010 TechEd session:
    http://www.sapteched.com/10/usa/edu_sessions/session.htm?id=825
    Does anyone have experience with a more efficient process that minimizes the downtime of the QA system?
    Also, has anyone had a positive experience with the system copy automation tools offered by various companies (e.g., UC4, Tidal)?
    Thank you,
    Matt

    Hi,
    > One system that immediately comes to mind has a database size of 2 TB. While we have reduced the copyback time for this system by running multiple BDLS sessions in parallel, that process still takes a long time to complete. Also, for the same system, importing the client copy transports of user IDs takes about 8 hours (one full workday) to complete.
    >
    For the BDLS run, I agree with Olivier.
    > The 2 weeks also factors in time to resolve any issues that are encountered, such as issues with the database restore/localization process or issues resulting from human error. An example of human error could be forgetting to request temporary IDs to be created in the production system for use in the QA system after it has been initially restored (our standard production Basis role does not contain all authorizations required for the QA localization effort).
    >
    As for the issues that you encounter because of the system copy, you can minimize that time by making a task list, since you will be doing the copy on a periodic basis, and by keeping notes of the issues you faced in previous runs. So I normally don't count it as system copy time.
    Thanks
    Sunny

  • Steps to reducing time for loading of data

    Hi
    Could anyone tell me how to reduce the time taken to load records into a particular cube or ODS? For example, I am loading some 1 lakh (100,000) records, and it is taking me some 5 hours. I want to reduce the time to 3 or 4 hours. What are the very first steps to consider to make it fast?
    Regards
    Ajay

    Hi Ajay,
    Check the following.
    1> Any routine you have in the transfer rule or update rule should not fire a database SELECT more than once in the same code.
    2> Load master data before transaction data.
    3> Reduce the data package size in the InfoPackage.
    4> Delete old PSA data, because you may face space issues while loading data.
    5> If you are loading into an ODS, remove the BEx check in the ODS maintenance screen if you are not reporting on that ODS.
    Hope this will help you.
    Suneel

  • How to reduce Time taken by DSO Activation.

    Hi
    My inventory transaction DSO is taking nearly 18 minutes to activate 180,000 records. That means it takes 1 minute for every 10,000 records. I think that's too much.
    Is there any way I can reduce the DSO activation time? What are the factors that increase it?
    Please let me know if you have some idea about this issue.
    Thanks.....

    Hi,
    Some of the factors which affect DSO activation are:
    1) Data volume and size of the DSO (if the volume and the number of fields are large, activation takes a long time).
    2) SID generation (uncheck this if the DSO is not used for reporting).
    3) Secondary indexes (remove unwanted secondary indexes).
    4) The number of background jobs and the package size allocated for activation (check the settings for your DSO in transaction RSODSO_SETTINGS).
    Apart from this, it also depends on database performance and other factors.
    Regards,
    Raghavendra.

  • Reducing Time of Rendering when Exporting to Encore

    Does anyone know how to reduce the rendering time when exporting from Premiere Pro CS3 to Encore? I'm trying to create a DVD of a 90 minute event using VBR 1 Pass and is taking 22 hours to get the job done! I have a 5 GB Mac Pro with 2 X 3 GHz Dual Core Intel Xeon Processors running on OS 10.4.11. Will "Forcing CBR" help? Will it compromise image quality? Any thoughts?

    With your machine, encoding to MPEG should be better than real time: a 90-minute timeline should take about 90 minutes or less. However, rendering effects and transitions can greatly add to this time; some effects can take "forever" to render. If you tell us which effects you are using, we may be able to tell you which ones are time-consuming to render.

  • Reducing time complexity in Java

    I have developed an application that recursively iterates through all document versions in a parent node (DOM)
    and lists a report in an output file.
    For each version found, I create one collection object, and at the end I iterate over all the objects and write them to a file:
         // create document iterator
         while (more documents) {
             // check each version of the document
             while (more versions) {
                 // read the version and collect file information
                 // add the information to the collection (each object is a Java bean)
             }
         }
         // iterate over the collection and write it to a CSV file
    I am currently following this format, but it is taking too much time. It is not throwing any OutOfMemory exception,
    but it runs for more than 20 hours to get all the node information (about 150 nodes, each having 50-70 versions).
    Please suggest how I can optimize this.
    Thanks
    Amit

    ejp wrote:
    That's only 10,500 operations. It's not many. You can't reduce the time complexity of M*N if you have to visit them all. You can loop to 10,500 in much less than a second. So I'd suggest your problems lie elsewhere, most probably in String concatenation when you should be using a StringBuffer or StringBuilder.
    I am reading the whole file version each time to calculate the file size (putting this in a StringBuffer, using a StringWriter for this).
    Even where there are string operations, the same string references are reassigned to new objects on each iteration.
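    To illustrate ejp's point, here is a minimal sketch of building the CSV output with a single StringBuilder instead of repeated String concatenation; the class and method names are made up for the example:

```java
import java.util.List;

public class CsvReport {
    // Build the whole CSV in one StringBuilder. String concatenation with
    // += inside a loop copies the entire accumulated string on every pass,
    // which is O(n^2) overall; a StringBuilder appends in amortized O(1).
    public static String toCsv(List<String[]> rows) {
        StringBuilder sb = new StringBuilder();
        for (String[] row : rows) {
            sb.append(String.join(",", row)).append('\n');
        }
        return sb.toString();
    }
}
```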

  • Reducing time Notifications on iMessages

    A badge appears when I get a new message in iMessages, but the badge stays too long. How do we reduce that time?

    So the 3 to 5 seconds is too long? I don't see anywhere that this can be changed.
    If your problem is that you are receiving messages while you are presenting something from your Mac, then might I suggest using Do Not Disturb.
    If you want it done automatically, check "When mirroring to TVs and Projectors" under the Do Not Disturb pane in the Notifications pane of System Preferences.
    KOT

  • Reducing time for rebooting

    Hi experts,
            I am using the Torch 9860 handset, and it is taking a long time to reboot - almost 3 to 5 minutes. Please help me reduce this time.
    Thanks & Regards,
      Karthik.m

    Use the latest OS and don't keep tons of stuff on the media card.
    Or do like I do and reboot it prior to going to bed; it should be fine when you wake up.
