Best way to reduce the load on a SELECT query?

Hi,
I have a requirement where I have to fetch data from WLK1 based on some conditions, but when I do a SELECT on WLK1 it takes more than 5 minutes to return the data. Please let me know how I can reduce the load on WLK1.
thanks,
AD
Moderator message - Please ask a specific question - post locked
Edited by: Rob Burbank on May 6, 2009 9:43 AM

Hi,
Try this way....
* Declarations - adjust the table name to the table you want to read
* (WLK1 here, as in your question). bypass_buffer = 'X' forces a database
* read instead of the table buffer; gen_key holds the leading key values
* you want to select by (the client is prepended below).
  TABLES dd02l.
  DATA: entries           LIKE wlk1 OCCURS 0 WITH HEADER LINE,
        buf               LIKE wlk1 OCCURS 0 WITH HEADER LINE,
        table_name        LIKE dd02l-tabname VALUE 'WLK1',
        gen_key(900)      TYPE c,
        from_key(900)     TYPE c,
        to_key(900)       TYPE c,
        keyln             TYPE i,
        number_of_entries TYPE i,
        max_entries       TYPE i,
        bypass_buffer     TYPE c.
* Read client-dependent tables always with the client
  SELECT SINGLE * FROM dd02l WHERE tabname = table_name AND as4local = 'A'.
  IF dd02l-clidep = 'X'.
    CONCATENATE sy-mandt gen_key INTO gen_key.
    keyln = STRLEN( gen_key ) * cl_abap_char_utilities=>charsize.
  ENDIF.
  IF keyln NE 0 OR from_key = space.
    CALL 'C_GET_TABLE' ID 'TABLNAME'  FIELD table_name
                       ID 'INTTAB'    FIELD buf-*sys*
                       ID 'GENKEY'    FIELD gen_key
                       ID 'GENKEY_LN' FIELD keyln
                       ID 'DBCNT'     FIELD number_of_entries
                       ID 'BYPASS'    FIELD bypass_buffer.
  ELSE.
    CALL 'C_GET_TABLE' ID 'TABLNAME'  FIELD table_name
                       ID 'INTTAB'    FIELD buf-*sys*
                       ID 'FROM_KEY'  FIELD from_key
                       ID 'TO_KEY'    FIELD to_key
                       ID 'DBCNT'     FIELD number_of_entries
                       ID 'BYPASS'    FIELD bypass_buffer.
  ENDIF.
* sy-subrc handling - the RAISE statements assume this code runs inside a
* function module where these exceptions are declared (use MESSAGE in a report)
  CASE sy-subrc.
    WHEN 4.  RAISE table_empty.
    WHEN 8.  RAISE table_not_found.
    WHEN 12. RAISE internal_error.
  ENDCASE.
  DESCRIBE TABLE buf LINES number_of_entries.
  IF max_entries = 0. max_entries = number_of_entries. ENDIF.
  LOOP AT buf TO max_entries.
    entries = buf.
    APPEND entries.
  ENDLOOP.
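Also worth trying before the kernel call: tighten the SELECT on WLK1 itself - fetch only the columns you need, in a single array fetch, and restrict the WHERE clause to fields at the start of the primary key or of a secondary index (check the access path with an SQL trace in ST05). Below is a minimal sketch, assuming you filter WLK1 by article; the field names artnr and filia and the range table lr_artnr are placeholders for whatever you actually select on:
* Sketch only - artnr / filia are assumed field names; replace them with
* the key or index fields you really filter on.
  TYPES: BEGIN OF ty_wlk1,
           artnr TYPE wlk1-artnr,
           filia TYPE wlk1-filia,
         END OF ty_wlk1.
  DATA: lt_wlk1  TYPE STANDARD TABLE OF ty_wlk1,
        lr_artnr TYPE RANGE OF wlk1-artnr.
* Fill lr_artnr with your selection first - an empty range selects everything
  SELECT artnr filia
    FROM wlk1
    INTO TABLE lt_wlk1
    WHERE artnr IN lr_artnr.
If the SELECT is still slow with a selective WHERE clause, compare the fields you filter on against the indexes of WLK1 in SE11 and discuss a secondary index with your Basis team.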
regards,
Prabhudas

Similar Messages

  • What is the best way to reduce the size of my Library

    Over time I have been extremely lazy and have not managed my Library very well. I have used CS2 along side Aperture and have done all the following:
    1 - Edited files in CS2 and then imported them into Aperture as a TIFF or PSD
    2 - Edited files in CS2 and then imported them into Aperture as a TIFF or PSD then made further adjustments in Aperture
    3 - From Aperture 'opened files with external editor' (CS2) and made adjustments in CS2
    4 - From Aperture 'opened files with external editor' and made adjustments in CS2 then made further adjustments in Aperture
    And to top all this off I have been merrily keywording and changing the metadata. Switching between CS2 and Aperture in this way has left me with some VERY big files (especially the TIFFs that Aperture produces when I use the 'open with external editor' function).
    I am looking for advice on how I can set about reducing the size of some of these files and what a suggested approach might be for 1-4, as each of them presents its own special problem.
    Here's hoping you can help
    Cheers

    Hard drive capacities keep increasing. Easiest (seriously) is just to buy a larger hard drive. The time expended trying to save space on previously manipulated images is seldom worth the effort.
    -Allen Wicks

  • What are the ways to reduce the load time of master data

    Hi
    The full load of the master data InfoPackage is taking 18 hours to 2 days, loading on average 2,516,360 records.
    Is there any option (other than restricting the data selection) to reduce the load time, as it is causing database locks and impacting transports?
    Thanks in advance
    Anuj

    You will have to do some research. Which master data extractor are you talking about?
    Test on the R/3 system: first try to extract a considerable number of records via transaction RSA3 in R/3 (10,000 or 100,000). Note the time it takes.
    Extract the same into BW, but only into the PSA. Again measure the time.
    Load the data from the PSA into the data target and see how long this takes. You should now have a picture of where the performance problems are located.
    Is the performance also bad for small loads, or is there a boundary below which performance is OK? (In other words, does loading 200,000 records take ten times longer than loading 20,000 records?)
    Suspect the following in R/3:
    - Datasource enhancements in R/3. A redesign may improve your extraction in a big way.
    - Missing indexes. If you are extracting data from tables without proper indexes, extraction performance can suffer dramatically.
    Suspect the following if loading to the PSA is bad:
    - The data packages read data in small chunks (much smaller than 50,000 records), so the overhead costs more time than the actual data transport.
    - Network problems. Maybe the network is congested by other servers?
    If loading from the PSA to the data target is slow:
    - Check start routines for performance.
    - Are enough batch processes available? Sometimes ODS activation can be improved with more parallel processes.
    - Are you using a lot of master-data lookups when filling the data targets? (See the sketch after this reply.)
    When you report your findings in this forum we may be able to help you further.
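    On the master-data lookup point above: a frequent cause of slow loads is a SELECT on the master data table for every record of the data package inside the start routine or update rules. Reading the lookup data once into an internal table and doing the lookup in memory is usually much faster. A rough sketch follows, assuming a material-group lookup from /BI0/PMATERIAL and a DATA_PACKAGE with MATERIAL and MATL_GROUP fields - the table, field and structure names are assumptions, so adapt them to your own objects:
    * Sketch only - table and field names are assumptions, adjust them.
    TYPES: BEGIN OF ty_mat,
             material(18)  TYPE c,
             matl_group(9) TYPE c,
           END OF ty_mat.
    DATA: lt_mat TYPE HASHED TABLE OF ty_mat WITH UNIQUE KEY material,
          ls_mat TYPE ty_mat.
    FIELD-SYMBOLS: <fs_rec> LIKE LINE OF data_package.
    * One database round trip before the loop over the data package
    IF data_package[] IS NOT INITIAL.
      SELECT material matl_group
        FROM /bi0/pmaterial
        INTO TABLE lt_mat
        FOR ALL ENTRIES IN data_package
        WHERE material = data_package-material
          AND objvers  = 'A'.
    ENDIF.
    * In-memory lookup instead of a SELECT per record
    LOOP AT data_package ASSIGNING <fs_rec>.
      READ TABLE lt_mat INTO ls_mat
           WITH TABLE KEY material = <fs_rec>-material.
      IF sy-subrc = 0.
        <fs_rec>-matl_group = ls_mat-matl_group.
      ENDIF.
    ENDLOOP.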

  • Best tool to reduce the network bandwidth for WEBI reports

    Hi Experts,
    We have a central BO XI server installed in Head Office. A few users need to connect to Head Office from their regional locations to access the WebI reports.
    For this we have two options to refresh the report.
              1. BO InfoView
              2. Web Intelligence Rich Client
    My question is: which is the best way to reduce internet bandwidth? Since my users are in remote locations and don't have a good link, which option consumes the least bandwidth?
    Regards,
    Suresh

    If you're launching the Webi Rich Client in 3-tier (ZABO) mode there should be no difference from an Infoview refresh, as WRC will use the same Webi report servers for the refresh as Infoview does.
    If you use 2-tier mode with WRC, then the refresh will be local, using a local connection to the reporting DB.
    I'd say Infoview will be less bandwidth intensive: you will be sending only structured and prepared data to the client browser, not the raw data the report needs.
    To further decrease bandwidth, you can design your server infrastructure to have regional processing; that way most communication will be local and requests will go to the central location only for the main data...

  • What is the best way to mimic the data from production to other server?

    Hi,
    Here we use Streams and Advanced Replication to send the data for 90% of the tables from production to another production database server, so if one goes down we can use the other. Is there any better option than using Streams and replication? We are having a lot of problems with Streams these days; it keeps breaking and we get calls.
    I have heard about Data Guard but don't know what it is used for. Please advise on the best way to replicate the data.
    Thanks a lot.....

    RAC, Data Guard. The first one is active-active, that is, you have two or more nodes accessing the same database on shared storage and you get both HA and load balancing. The second is active-passive (unless you're on 11.2 with Active Standby or Snapshot Standby), that is one database is primary and the other is standby, which you normally cannot query or modify, but to which you can quickly switch in case primary fails. There's also Logical Standby - it's based on Streams and generally looks like what you seem to be using now (sort of.) But it definitely has issues. You can also take a look at GoldenGate or SharePlex.

  • Best Way to port the data from one DB to another DB using Biztalk

    Hi,
    Please suggest the best way to move data from one DB to another DB using BizTalk.
    Currently I am doing it like this: for each transaction (coming from different source tables) through a receive port, I do some mapping (custom logic for data mapping), then insert into the target normalized tables (multiple tables), and finally go back and update the status of the transaction in the source table in the source DB. It processes transactions one by one.
    How best can we do this with a bulk transfer and then update the status, given that there are more than 10,000 transactions per call?
    Thanks,
    Vinoth

    Hi Vinoth,
    For SQL bulk inserts you can always use the SQL Bulk Load adapter:
    http://www.biztalkgurus.com/biztalk_server/biztalk_blogs/b/biztalksyn/archive/2005/10/23/processing-a-large-flat-file-message-with-biztalk-and-the-sqlbulkinsert-adapter.aspx
    However, even though a SQL Bulk Load adapter can efficiently insert a large amount of data into SQL, you are still stuck with the overhead of the MessageBox database and the memory issues of dealing with really large messages.
    I would personally suggest you use SSIS, as you have mentioned that records have to be processed at a specific time of day as opposed to when the records are available.
    Please refer to this link for more information about SSIS: http://msdn.microsoft.com/en-us/library/ms141026.aspx
    If you have any more questions related to SSIS, please ask them in the SSIS forum and you will get specific support.
    Rachit

  • Best Strategy to reduce the Database Size

    Hi Everyone,
    In our client's landscape the SAP systems have been upgraded to newer versions, but the client wants to retain one copy of each of the older production systems:
    1) SAP R/3 4.6C system (database size approx. 2 TB)
    2) SAP BW 3.0 system (database size approx. 2 TB)
    Now the client wants us to reduce the database size via reorganization, because we have already archived the IDocs & links.
    The client has suggested:
    1) Oracle Export/Import: only an Oracle DBA can do this (ignore this one)
    2) Database reorganization: we have tried reorganization via BRtools but found it very tedious (ignore this one)
    3) SAP Export/Import: this is the way we want to reduce the database size
    Can anybody tell us how much free space we require at OS level to store the database exports of the two databases (around 4 TB in total), and what would be the best strategy for reducing the database size?
    Approximately how much will the database size be reduced via SAP Export/Import?
    Thanks & Regards
    Deepak Gosain

    Hi,
    >Can anybody tell us how much free space we require at OS level to store the database exports of the two databases (around 4 TB in total), and what would be the best strategy for reducing the database size?
    The only realistic way to know is to do a system copy of the production system on a testbed system and to test the Database Export.
    If you really want to decrease the database size you will have to archive a lot more than the IDOC archiving object.
    Regards,
    Olivier

  • Is there a way to reduce the installation size of preview edition?

    Is there a way to reduce the installation size of preview edition?

    Hi IndianLamp,
    Thank you for posting in MSDN forum.
    Based on your issue, do you mean that you want to reduce the installation size of the VS Preview edition? Am I right?
    If yes: when you download the VS Preview from the official Microsoft site, the install package size is fixed by default. However, you can deselect some components during the installation, which will reduce the installation size of the Preview edition.
    If I have misunderstood your issue, please give me more details about it.
    Best Regards,

  • Once files are combined, the PDF is too large to email. Is there a way to reduce the size of the file?

    I combined files and now the result is too large to send. Is there a way to reduce the size of the converted file in Adobe?

    Hi yodonna,
    Adobe Reader doesn't allow you to reduce the size of a PDF file, as Acrobat does. But, you could upload that file to your Acrobat.com account, and create an anonymous link to it using Adobe Send. Then, you could send that link via email. You can access Adobe Send at https://cloud.acrobat.com/send. The Create an Anonymous Link feature doesn't require that you have an Adobe Send subscription.
    Best,
    Sara

  • What is the best way to replace the Inline Views for better performance ?

    Hi,
    I am using Oracle 9i.
    What is the best way to replace inline views for better performance? I see a lot of performance problems with inline views in my queries.
    Please suggest.
    Raj

    A WITH clause plus the /*+ MATERIALIZE */ hint can do you good.
    See the test case below.
    SQL> create table hx_my_tbl as select level id, 'karthick' name from dual connect by level <= 5
    2 /
    Table created.
    SQL> insert into hx_my_tbl select level id, 'vimal' name from dual connect by level <= 5
    2 /
    5 rows created.
    SQL> create index hx_my_tbl_idx on hx_my_tbl(id)
    2 /
    Index created.
    SQL> commit;
    Commit complete.
    SQL> exec dbms_stats.gather_table_stats(user,'hx_my_tbl',cascade=>true)
    PL/SQL procedure successfully completed.
    Now this is a normal inline view:
    SQL> select a.id, b.id, a.name, b.name
    2 from (select id, name from hx_my_tbl where id = 1) a,
    3 (select id, name from hx_my_tbl where id = 1) b
    4 where a.id = b.id
    5 and a.name <> b.name
    6 /
    Execution Plan
    0 SELECT STATEMENT Optimizer=ALL_ROWS (Cost=7 Card=2 Bytes=48)
    1 0 HASH JOIN (Cost=7 Card=2 Bytes=48)
    2 1 TABLE ACCESS (BY INDEX ROWID) OF 'HX_MY_TBL' (TABLE) (Cost=3 Card=2 Bytes=24)
    3 2 INDEX (RANGE SCAN) OF 'HX_MY_TBL_IDX' (INDEX) (Cost=1 Card=2)
    4 1 TABLE ACCESS (BY INDEX ROWID) OF 'HX_MY_TBL' (TABLE) (Cost=3 Card=2 Bytes=24)
    5 4 INDEX (RANGE SCAN) OF 'HX_MY_TBL_IDX' (INDEX) (Cost=1 Card=2)
    Now I use the WITH clause with the MATERIALIZE hint:
    SQL> with my_view as (select /*+ MATERIALIZE */ id, name from hx_my_tbl where id = 1)
    2 select a.id, b.id, a.name, b.name
    3 from my_view a,
    4 my_view b
    5 where a.id = b.id
    6 and a.name <> b.name
    7 /
    Execution Plan
    0 SELECT STATEMENT Optimizer=ALL_ROWS (Cost=8 Card=1 Bytes=46)
    1 0 TEMP TABLE TRANSFORMATION
    2 1 LOAD AS SELECT
    3 2 TABLE ACCESS (BY INDEX ROWID) OF 'HX_MY_TBL' (TABLE) (Cost=3 Card=2 Bytes=24)
    4 3 INDEX (RANGE SCAN) OF 'HX_MY_TBL_IDX' (INDEX) (Cost=1 Card=2)
    5 1 HASH JOIN (Cost=5 Card=1 Bytes=46)
    6 5 VIEW (Cost=2 Card=2 Bytes=46)
    7 6 TABLE ACCESS (FULL) OF 'SYS_TEMP_0FD9D6967_3C610F9' (TABLE (TEMP)) (Cost=2 Card=2 Bytes=24)
    8 5 VIEW (Cost=2 Card=2 Bytes=46)
    9 8 TABLE ACCESS (FULL) OF 'SYS_TEMP_0FD9D6967_3C610F9' (TABLE (TEMP)) (Cost=2 Card=2 Bytes=24)
    Here you can see that the table is accessed only once; after that, only the result set generated by the WITH clause is accessed.
    Thanks,
    Karthick.

  • Does anyone know of a way to reduce the number of colors of an image

    I have read all the documentation on this and there is no clear and easy way to reduce the number of colors of a BufferedImage. It would also be handy to know how to ensure that the image stays under a specific size.
    thanks for any help
    Lee

    Yeah Abuse, you are so right; I have written code to do this and it was/wasn't fun, depending on your sense of humour...
    Here's how it works:
    *1 Pick a palette: the more colours, the bigger the result file, but the better the quality.
    - So decide what you want.
    *2 For each pixel, find the nearest colour in your palette that matches it.
    - It's important to count each pixel rather than each colour in the original, as a large block of one colour should carry more weight.
    *3 Then, for each colour in your palette, average out the colours of all the pixels that would become that colour.
    - The average becomes the new best colour for that palette entry.
    - Since the palette has now changed, repeat steps 2 and 3 as long as you want, or rather until nothing changes.
    *4 If at any time a palette entry has nothing that maps to it, make it a colour near the most-used palette entry.
    Have fun. It's called cluster theory (essentially k-means clustering) and is used a lot in converting images to .GIF.

  • I have Lightroom 5.7. Now I have Apple TV to connect my Mac to the TV screen. I wish to do a slide show on the TV. However, on the Mac, using iTunes to share photos with the TV, I cannot locate all the photo files. What is the best way to save the slide show and put it where the Mac sharing can find it?

    I have Lightroom 5.7. Now I have Apple TV to connect my Mac to the TV screen. I wish to do a slide show on the TV. However, on the Mac, using iTunes to share photos with the TV, I cannot locate all the photo files. What is the best way to save the slide show and put it where the Mac sharing can find it? So far, I made one folder in Lightroom, put some photos there and I found them easily. Can this be done with a slide show? Please help quickly! I am also worried that the photos I put in the new folder will be hard to find afterwards, when the show is done. Where do they go when I delete them from the new folder?

    Not that I'm aware of. You just export JPEG copies to a folder that you can point iTunes to. For instance, I have created a folder in my Pictures folder called Apple TV, and within that folder I have other folders of pictures that I can choose from in iTunes to share with Apple TV. But there doesn't seem to be any way to share a Lightroom slideshow. If you have a way to create a video file, that would probably work. Apple TV is a little clunky in my opinion. Some things are a little more difficult to do now than they were a while back. I probably haven't provided you with much help, but just keep experimenting and I think you will figure it out.

  • Is there any way to reduce the JPEG compression applied to photos?

    I'm wondering if there is any way to reduce the fierce amount of JPEG compression applied to photos taken with the 6220 classic? I'm 99.99% sure that there isn't, but I thought I'd ask anyway.
    I'm a professional graphic designer with 15 years experience, and as such understand the technicalities of digital imaging better than most.
    What the general public fails to understand is that ever higher megapixelage doesn't automatically equate to ever higher quality images.
    The 6220 classic has a 5MP camera, which is one of the reasons I bought it, along with the fact that it has a Xenon flash and a proper lens cover. Its imaging quality also generally gets very positive reviews online.
    However, the 6220 classic takes far poorer photos than my 5 year old Olympus digital camera which only shoots 4MP. Why is this? Many reasons. The Olympus has a much larger imaging chip, onto which the image is recorded (physical size as opposed to pixel dimensions), a far superior lens (physical size & quality of materials), optical (not digital) zoom, and the ability to set various levels of JPEG compression, from fierce (high compression, small files, low quality images) to none at all (no compression, large files, high quality TIFF-encoded images).
    When I first used the camera on the 6220 classic (I've never owned a camera phone before) I was appalled at the miniscule file sizes. A 2592 x 1944 pixel image squashed into a few hundred kilobytes makes a mockery of having decent pixel dimensions in the first place, but then the average consumer neither cares about nor would notice the difference. They're not going to be examining & working on their images in Photoshop on a 30" Apple Cinema Display.
    Is fierce JPEG compression (and an inability to alter it) the norm with camera phones, or do other camera phones (perhaps from other manufacturers) allow greater latitude in how images are compressed?
    Thanks.

    Believe me, I was very aware that this was a phone with a camera attached, not a dedicated camera, before I bought it. I went into this with my eyes open. I knew the lens, imaging chip, zoom, etc, would all be grossly inferior, but given all of this, surely the phone manufacturers should help to compensate for this by adding a few lines of code to the software to reduce (or ideally remove) the JPEG compression, or at least give the user the option to do so if they want? The fierce compression just makes images obtained with compromised hardware even worse than they would have been otherwise.
    It adds insult to injury and is totally unnecessary, especially given that the memory card in the 6220 classic is 1 GB but the one in my Olympus is only 128 MB! It's not as if lack of storage space is an issue! On the Olympus I can only take about 8 pictures without compression (although I could obviously buy a much larger memory card). On the 6220 classic, given the ridiculous amount of compression, there's room for over 1200 photos! It would be far better to let 70 uncompressed images be stored than 1200 compressed ones. Does anyone seriously need to take over a thousand photos on a camera phone without having access to a computer to offload them? I doubt it.
    Also, compressing the images requires processing power, which equals time. If they were saved uncompressed, the recovery time between shots would be reduced, although obviously writing the larger files to memory may offset this somewhat.
    Just to give people an idea, an uncompressed 8-bit RGB TIFF with pixel dimensions of 2592 x 1944 takes up approximately 14.5 MB of space. (The exact number of bytes varies slightly depending on the header information stored with the file.) The 3 photos I've taken so far with the 6220 classic (and that I've decided to actually keep) have file sizes of 623, 676 & 818 KB respectively. The average of these 3 sizes is 706 KB. 706 KB is less than 5% of 14.5 MB, which means that, on average, after the camera records the 5,038,848 pixels in an image, it throws over 95% of that data away.
    I'm deeply unimpressed.

  • I am moving from PC to Mac. My PC has two internal drives and I have a 3 TB external. What is the best way to move the data from the internal drives to the Mac, and the best way to make the external drive read/write without losing data?

    I am moving from PC to Mac. My PC has two internal drives and I have a 3 TB external. What is the best way to move the data from the internal drives to the Mac, and the best way to make the external drive read/write without losing data?

    Paragon even has a non-destructive conversion utility if you do want to convert the drive.
    Hard to imagine a 3 TB drive that isn't NTFS. The Mac uses GPT as its default partition scheme, along with HFS+.
    www.paragon-software.com
    Some general Apple Help www.apple.com/support/
    Also,
    Mac OS X Help
    http://www.apple.com/support/macbasics/
    Isolating Issues in Mac OS
    http://support.apple.com/kb/TS1388
    https://www.apple.com/support/osx/
    https://www.apple.com/support/quickassist/
    http://www.apple.com/support/mac101/help/
    http://www.apple.com/support/mac101/tour/
    Get Help with your Product
    http://docs.info.apple.com/article.html?artnum=304725
    Apple Mac App Store
    https://discussions.apple.com/community/mac_app_store/using_mac_apple_store
    How to Buy Mac OS X Mountain Lion/Lion
    http://www.apple.com/osx/how-to-upgrade/
    TimeMachine 101
    https://support.apple.com/kb/HT1427
    http://www.apple.com/support/timemachine
    Mac OS X Community
    https://discussions.apple.com/community/mac_os

  • I have an iMac desktop and a MacBook Pro laptop. I generate my work-related files on both of these machines. What is the best way to keep the files on these machines synchronized?

    I have an iMac desktop and a MacBook Pro laptop. I generate my work-related files on both of these machines. After a few days or weeks, I have new files in some folders on one of the machines that are not on the other machine. What is the best way to keep the files on these machines synchronized?

    How did you transfer the files to the iMac? If you exported the files out of the MacBook library using Kind = Current, you should get the edited version. Any other option may not.
    If you want to keep the two libraries "synced", any photos you want to move to the iMac should be added to an album; connect the two Macs (via a LAN, Target Disk Mode, FireWire file transfer, or Wi-Fi) and use the paid version of iPhoto Library Manager to copy that album from the MB library to the iMac library. It will also copy the original and edited versions, keywords, titles, etc.
    OT
