drawImage takes a long time for images created with Photoshop

Hello,
I created a simple program to resize images using the drawImage method, and it works very well except for images that have been created or modified with Photoshop 8.
The main block of my code is
public static BufferedImage scale(BufferedImage image,
                                  int targetWidth, int targetHeight) {
    int type = (image.getTransparency() == Transparency.OPAQUE) ?
                    BufferedImage.TYPE_INT_RGB :
                    BufferedImage.TYPE_INT_ARGB;
    BufferedImage ret = image;
    BufferedImage temp = new BufferedImage(targetWidth, targetHeight, type);
    Graphics2D g2 = temp.createGraphics();
    g2.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                        RenderingHints.VALUE_INTERPOLATION_BICUBIC);
    g2.drawImage(ret, 0, 0, targetWidth, targetHeight, null);
    g2.dispose();
    ret = temp;
    return ret;
}
The program is a little longer, but this is the gist of it.
When I run a JPG through this program (without any Photoshop modifications), I get the following trace results (I trace each line of the code), showing how long each step took in milliseconds:
Temp BufferedImage: 16
createGraphics: 78
drawImage: 31
dispose: 0
However, the same image saved in Photoshop (no modifications other than saving it in Photoshop) gave me the following results:
Temp BufferedImage: 16
createGraphics: 78
drawImage: 27250
dispose: 0
The difference is shocking. It took drawImage about 27 seconds to resize the Photoshop-saved file, compared with well under a second for the original!
My questions:
1. Why does drawImage take so much longer to process the file once it has been saved in Photoshop?
2. Are there any code improvements which will speed up the image drawing?
Thanks for your help,
-Rogier

You saved the file in PNG format. The default PNGImageReader in core Java has a habit of occasionally returning TYPE_CUSTOM buffered images, and Photoshop 8 probably saves the PNG file in such a way that TYPE_CUSTOM pops up more often.
When you draw a TYPE_CUSTOM BufferedImage onto a graphics context, it almost always takes an unbearably long time.
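You can confirm that this is what is happening by printing the type of the decoded image; TYPE_CUSTOM reports itself as 0. A quick check, assuming the file is read with ImageIO (the file name is just a placeholder):
BufferedImage img = ImageIO.read(new File("photoshop.png"));
// BufferedImage.TYPE_CUSTOM is 0; any other constant means a standard layout
System.out.println("type = " + img.getType()
        + (img.getType() == BufferedImage.TYPE_CUSTOM ? " (TYPE_CUSTOM)" : ""));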
So a quick fix would be to load the file with the Toolkit instead, and then scale that image.
Image img = Toolkit.getDefaultToolkit().createImage(/* the file */);
new ImageIcon(img); // forces the image data to be loaded completely
// send the image off to be scaled
A more elaborate fix involves specifying the type of BufferedImage you want the PNGImageReader to use:
ImageInputStream in = ImageIO.createImageInputStream(/* file */);
ImageReader reader = ImageIO.getImageReaders(in).next();
reader.setInput(in, true, true);
ImageTypeSpecifier sourceImageType = reader.getImageTypes(0).next();
ImageReadParam readParam = reader.getDefaultReadParam();
// to implement
configureReadParam(sourceImageType, readParam);
BufferedImage img = reader.read(0, readParam);
// clean up
reader.dispose();
in.close();
The thing that needs to be implemented is the method I called configureReadParam. In this method you would check the color space, color model, and BufferedImage type of the supplied ImageTypeSpecifier and set a new ImageTypeSpecifier on the readParam if need be. The method essentially boils down to a series of checks (a sketch follows the list below):
1) If the image type specifier already uses a non-custom BufferedImage, then all is well and we don't need to do anything to the readParam.
2) If the ColorSpace is gray, then we create a new ImageTypeSpecifier based on a TYPE_BYTE_GRAY BufferedImage.
3) If the ColorSpace is gray but the color model includes alpha, then we do the above and also call setSourceBands on the readParam to discard the alpha channel.
4) If the ColorSpace is RGB and the color model includes alpha, then we create a new ImageTypeSpecifier based on an ARGB BufferedImage.
5) If the ColorSpace is RGB and the color model doesn't include alpha, then we create a new ImageTypeSpecifier based on TYPE_3BYTE_BGR.
6) If the ColorSpace is not gray or RGB, then we do nothing to the readParam and ColorConvertOp the resulting image to an RGB image.
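For illustration, here is a minimal sketch of what configureReadParam could look like. The choice of destination types and the band handling are my assumptions based on the rules above, so treat it as a starting point rather than a drop-in implementation:
private static void configureReadParam(ImageTypeSpecifier sourceImageType,
                                       ImageReadParam readParam) {
    // 1) already a standard BufferedImage type: leave the readParam alone
    if (sourceImageType.getBufferedImageType() != BufferedImage.TYPE_CUSTOM) {
        return;
    }
    ColorModel cm = sourceImageType.getColorModel();
    ColorSpace cs = cm.getColorSpace();
    if (cs.getType() == ColorSpace.TYPE_GRAY) {
        // 2) and 3) gray: decode into TYPE_BYTE_GRAY, dropping alpha if present
        readParam.setDestinationType(
            ImageTypeSpecifier.createFromBufferedImageType(BufferedImage.TYPE_BYTE_GRAY));
        if (cm.hasAlpha()) {
            readParam.setSourceBands(new int[] { 0 }); // keep only the gray band
        }
    } else if (cs.getType() == ColorSpace.TYPE_RGB) {
        // 4) and 5) RGB: pick ARGB or 3BYTE_BGR depending on alpha
        int target = cm.hasAlpha() ? BufferedImage.TYPE_INT_ARGB
                                   : BufferedImage.TYPE_3BYTE_BGR;
        readParam.setDestinationType(
            ImageTypeSpecifier.createFromBufferedImageType(target));
    }
    // 6) anything else: leave the readParam alone and ColorConvertOp the decoded image to RGB afterwards
}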
If this looks absolutely daunting to you, then go with the Toolkit approach mentioned first.
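In that case, a fuller sketch of the Toolkit route might look like the following; the file name and the target dimensions are placeholders, and reusing the bicubic rendering hint from the scale method above is my assumption:
Image img = Toolkit.getDefaultToolkit().createImage("photoshop.png");
new ImageIcon(img); // the ImageIcon constructor blocks until the image data is loaded
BufferedImage scaled = new BufferedImage(targetWidth, targetHeight,
                                         BufferedImage.TYPE_INT_RGB);
Graphics2D g2 = scaled.createGraphics();
g2.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                    RenderingHints.VALUE_INTERPOLATION_BICUBIC);
g2.drawImage(img, 0, 0, targetWidth, targetHeight, null); // the source is never a TYPE_CUSTOM BufferedImage
g2.dispose();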

Similar Messages

  • Takes long time for shutdown after adding to domain

    Hi,
    Workstation OS - Windows 7
    Domain Controller OS - Windows Server 2008 R2 Standard
    I measured the following with a stopwatch:
    1) When the laptop is in a workgroup, it takes just 17 seconds to shut down.
    2) When I join the same laptop to the domain (corp.abc.com), it takes 1 minute and 22 seconds to shut down (it just shows the shutting-down screen).
    Why has the shutdown time increased so much?
    Do you have any idea?
    Note: I have not made any changes to the laptop, nor added any software. I did the above testing because users started complaining that after joining a laptop to the domain it takes a long time to shut down. It is happening on every laptop running Windows 7.
    There are no logoff scripts or GPOs, and we don't use roaming profiles.
    Please advise.
    Thanks & Regards,
    Param
    www.paramgupta.blogspot.com

    Hi,
    To troubleshoot this issue, please install the Windows Performance Tools (WPT) Kit. The WPT Kit contains performance analysis tools and is designed for analysis of a wide range of performance problems, including application start times, boot issues, deferred procedure calls and interrupt activity (DPCs and ISRs), system responsiveness issues, application resource usage, and interrupt storms.
    To get the installer, you have to install the Windows 7 SDK.
    Microsoft Windows SDK for Windows 7 and .NET Framework 3.5 SP1
    http://www.microsoft.com/en-us/download/details.aspx?id=3138
    For shutdown tracing:
    Run command:
    xbootmgr -trace shutdown -noPrepReboot -traceFlags BASE+CSWITCH+DRIVERS+POWER -resultPath C:\TEMP
    Collect logs and post them for further troubleshooting.
    For more information please refer to following MS articles:
    Long Shutdown Time on Windows 7 Ultimate x64
    http://social.technet.microsoft.com/Forums/en/w7itproperf/thread/11a42a93-efd2-4184-9ce8-bbc1438b7ea6
    Long shutdown time on Windows 7 64 bit laptop
    http://social.technet.microsoft.com/Forums/en-US/w7itproperf/thread/4440fc6e-c81e-440c-9183-9b7e176729d2
    Lawrence
    TechNet Community Support

  • HT4759 Hello.. I've been subscribed to iCloud for $20 per year and I found it useless for many reasons: I cannot disconnect my mobile during the uploading process and it takes a long time to upload my data.. It's not a reliable system, that's why

    Hello.. I've been subscribed to iCloud for $20 per year and I found it useless for many reasons: I cannot disconnect my mobile during the uploading process, and it takes a long time to upload my data. It's not a reliable system; that's why I need to cancel the storage service and get my money back. Thanks

    The "issues" you've raised are nothing to do with the iCloud service.
    No service that uploads data allows you to disconnect the device you are uploading from while uploading data. Doing so would prevent the upload from completing. It is a basic requirement for any uploading service that you remain connected to it for uploading to be possible.
    The time it takes to upload data to iCloud is entirely dependent on how fast your Internet connection is, and how much data you are uploading. Both of these things are completely out of Apple's control. Whichever upload service you use will be affected by the speed of your Internet connection.

  • Bex Reports takes long time for filtering

    Hi,
    We went live last December, and our inventory cube already contains some 15 million records; the sales cube contains 12 million records.
    Is there a specific limit to the number of records? Filtering in the inventory or sales reports takes a very long time.
    Is there an alternative, or should we delete some of the data from the cube?
    Filtering on any value takes longer than running the query itself.
    Please help...
    Regards,
    viren.

    Hi Viren,
    A cube can perform well even at 100 million records with some performance tuning, so I doubt the problem is the 10-15 million records in your cube.
    Do a performance analysis and check whether aggregates would be helpful.
    Check the below link for how to do a performance analysis.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/media/uuid/d9fd84ad-0701-0010-d9a5-ba726caa585d
    Hope it helps.
    Thx,
    Soumya

  • Analyze a Query which takes longer time in Production server with ST03 only

    Hi,
    I want to analyze a query which takes a long time in the production server, using the ST03 t-code only.
    Please provide detailed steps on how to do this with ST03.
    ST03 - Expert mode - I need to know the steps after this. I have checked many threads, so please don't send me links.
    Please write out the steps in detail.
    <REMOVED BY MODERATOR>
    Regards,
    Sameer
    Edited by: Alvaro Tejada Galindo on Jun 12, 2008 12:14 PM

    Then please close the thread.
    Greetings,
    Blag.

  • When I use Software Update from my mobile, it takes a long time checking for the software update and nothing happens to update my iOS software

    When I use Software Update from my mobile, it takes a long time checking for the software update and nothing happens to update my iOS software.

    Servers have been swamped. Keep trying and be patient. Wait a few days.

  • Report takes long time for few records

    hi frends,
    I'm facing a problem with my web-based ERP application, which is developed in .NET. When I open a report from my application, a file named "rpt conmgr cache" gets created in my temp folder.
    Because of this, the report takes too much time and opens very slowly even for a few records. It happens in only some of the reports; the other reports work fine and don't create any file in the temp folder. Can you tell me what this file is and what the solution could be?
    Thanks
    Mithun

    hi sabhajit,
    I have already checked the SQL query; it takes less than a second.
    Are there any other steps you want me to check? Please let me know.
    thanks mithun

  • Takes Long time for Data Loading.

    Hi All,
    Good morning. I am new to SDN.
    Currently I am using the DataSource 0CRM_SRV_PROCESS_H, which contains 225 fields. I am using around 40 fields in my report.
    Can I hide the remaining fields at the DataSource level itself (transaction RSA6)?
    Currently, data loading takes a long time to load the data from the PSA to the ODS (ODS 1).
    I am also pulling some data from another ODS (ODS 2) as a lookup. It takes a long time to update the data in the active data table of the ODS.
    Can you please suggest how to improve the data-loading performance in this case?
    Thanks & Regards,
    Siva.

    Hi,
    Yes, you can hide them; just check the hide box for those fields. Are you on BI 7.0 or BW? Either way, is the number of records huge?
    If so, you can split the records and execute; I mean use the same InfoPackage, just execute it with different selections.
    Check in ST04 whether there are any locks or lock waits. If so, go to SM37 and check whether any long-running job is there and whether it is progressing: double-click on the job, copy the PID from the job details, go to ST04, expand the node, and check whether you can find that PID there.
    Also check the system log in SM21 and short dumps in ST22.
    Now, to improve performance, you can try to increase the virtual memory or the number of servers if possible; that will increase the number of work processes, since if many jobs run at once there may be no free work processes left to proceed.
    Regards,
    Debjani

  • Take long time for loading

    Hi Experts,
    One of my data targets (data coming from another ODS) is taking a long time to load. It normally takes under 10 minutes, but today it has been running for the last 40 minutes...
    The Status tab shows:
    Job termination in source system
    Diagnosis
    The background job for data selection in the source system has been terminated. It is very likely that a short dump has been logged in the source system
    Procedure
    Read the job log in the source system. Additional information is displayed here.
    To access the job log, use the monitor wizard (step-by-step analysis) or the menu path Environment -> Job Overview -> In Source System
    Error correction:
    Follow the instructions in the job log messages.
    Can anyone please help me solve this problem?
    Thanks in advance
    David

    Hi Experts,
    Thanks for your answers. My load fails when the data goes from one ODS to another ODS. Please find the job log below:
    Job started
    Step 001 started (program SBIE0001, variant &0000000007169, user ID RCREMOTE)
    Asynchronous transmission of info IDoc 2 in task 0001 (0 parallel tasks)
    DATASOURCE = 8ZPP_OP3
             Current Values for Selected Profile Parameters               *
    abap/heap_area_nondia......... 20006838008                             *
    abap/heap_area_total.......... 20006838008                             *
    abap/heaplimit................ 83886080                                *
    zcsa/installed_languages...... ED                                      *
    zcsa/system_language.......... E                                       *
    ztta/max_memreq_MB............ 2047                                    *
    ztta/roll_area................ 5000000                                 *
    ztta/roll_extension........... 4294967295                              *
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Result of customer enhancement: 100,000 records
    Asynchronous send of data package 1 in task 0002 (1 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:31:55, End = 06.09.2010 01:31:55
    Asynchronous transmission of info IDoc 3 in task 0003 (1 parallel tasks)
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Result of customer enhancement: 100,000 records
    Asynchronous send of data package 2 in task 0004 (2 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:32:00, End = 06.09.2010 01:32:00
    Asynchronous transmission of info IDoc 4 in task 0005 (2 parallel tasks)
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Result of customer enhancement: 100,000 records
    Asynchronous send of data package 3 in task 0006 (3 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:32:04, End = 06.09.2010 01:32:04
    Asynchronous transmission of info IDoc 5 in task 0007 (3 parallel tasks)
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Result of customer enhancement: 100,000 records
    Asynchronous send of data package 4 in task 0008 (4 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:32:08, End = 06.09.2010 01:32:08
    Asynchronous transmission of info IDoc 6 in task 0009 (4 parallel tasks)
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Result of customer enhancement: 100,000 records
    Asynchronous send of data package 5 in task 0010 (5 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:32:11, End = 06.09.2010 01:32:11
    Asynchronous transmission of info IDoc 7 in task 0011 (5 parallel tasks)
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Asynchronous send of data package 13 in task 0026 (6 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:01, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:32:44, End = 06.09.2010 01:32:45
    tRFC: Data Package = 8, TID = 0AEB465C00AE4C847CEA0070, Duration = 00:00:17,
    tRFC: Start = 06.09.2010 01:32:29, End = 06.09.2010 01:32:46
    Asynchronous transmission of info IDoc 15 in task 0027 (5 parallel tasks)
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Result of customer enhancement: 100,000 records
    Asynchronous send of data package 14 in task 0028 (6 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:32:48, End = 06.09.2010 01:32:48
    tRFC: Data Package = 9, TID = 0AEB465C00AE4C847CEF0071, Duration = 00:00:18,
    tRFC: Start = 06.09.2010 01:32:33, End = 06.09.2010 01:32:51
    Asynchronous transmission of info IDoc 16 in task 0029 (5 parallel tasks)
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Result of customer enhancement: 100,000 records
    Asynchronous send of data package 15 in task 0030 (6 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:32:52, End = 06.09.2010 01:32:52
    Asynchronous transmission of info IDoc 17 in task 0031 (6 parallel tasks)
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Result of customer enhancement: 100,000 records
    Asynchronous send of data package 16 in task 0032 (7 parallel tasks)
    tRFC: Data Package = 10, TID = 0AEB465C00684C847CF30070, Duration = 00:00:18,
    tRFC: Start = 06.09.2010 01:32:37, End = 06.09.2010 01:32:55
    tRFC: Data Package = 11, TID = 0AEB465C02E14C847CF70083, Duration = 00:00:17,
    tRFC: Start = 06.09.2010 01:32:42, End = 06.09.2010 01:32:59
    tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:32:56, End = 06.09.2010 01:32:56
    Asynchronous transmission of info IDoc 18 in task 0033 (5 parallel tasks)
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Result of customer enhancement: 100,000 records
    Asynchronous send of data package 17 in task 0034 (6 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:33:00, End = 06.09.2010 01:33:00
    tRFC: Data Package = 12, TID = 0AEB465C00AE4C847CFB0072, Duration = 00:00:16,
    tRFC: Start = 06.09.2010 01:32:46, End = 06.09.2010 01:33:02
    Asynchronous transmission of info IDoc 19 in task 0035 (5 parallel tasks)
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
    Result of customer enhancement: 100,000 records
    Asynchronous send of data package 18 in task 0036 (6 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
    tRFC: Start = 06.09.2010 01:33:04, End = 06.09.2010 01:33:04
    Asynchronous transmission of info IDoc 20 in task 0037 (6 parallel tasks)
    ABAP/4 processor: DBIF_RSQL_SQL_ERROR
    Job cancelled
    Thanks
    David
    Edited by: david Rathod on Sep 6, 2010 12:04 PM

  • Adobe form take long time for Check/Send at portal

    Hi Experts
    We have a form on the Portal which takes a long time when we click Check to validate it: around 3 minutes.
    Other forms do not take this much time. Can anyone help us with this issue?
    Thanks
    Sajal

    Hi Sajal,
    Did you already contact your Basis guys to trace the performance of the ADS itself?
    It sounds to me as if the connection to the portal is not that good, and maybe this is one of the problems.
    Also check the interface to the form: what takes the time, the driver program fetching the data or the form itself?
    In addition, you should have a look inside the form and see how much scripting is in there. Sometimes there is a lot of unnecessary source inside; and since you didn't share the form (if it is an SAP delivery), I cannot get more into detail with my answer here.
    Hope this gives you a clue where to start your journey.
    ~Florian
    PS: If you use the search with keywords "ADS + trace" you find a lot of useful information.

  • How to tune this SQL (takes long time to come up with results)

    Dear all,
    I have some SQL which takes a long time; can anyone help me tune it? Thank you.
    SELECT SUM (n_amount)
      FROM (SELECT DECODE (v_payment_type,
                           'D', n_amount,
                           'C', -n_amount) n_amount,
                   v_vou_no
              FROM vouch_det a, temp_global_temp b
             WHERE a.v_vou_no = TO_CHAR (b.n_column2)
               AND b.n_column1 = :b5
               AND b.v_column1 IN (:b4, :b3)
               AND v_desc IN (SELECT v_trans_source_code
                                FROM benefit_trans_source
                               WHERE v_income_tax_app = :b6)
               AND v_lob_code = DECODE (:b1, :b2, v_lob_code, :b1)
            UNION ALL
            SELECT DECODE (v_payment_type,
                           'D', n_amount,
                           'C', -n_amount) * -1 AS n_amount,
                   v_vou_no
              FROM vouch_details a, temp_global_temp b
             WHERE a.v_vou_no = TO_CHAR (b.n_column2)
               AND b.n_column1 = :b5
               AND b.v_column1 IN (:b12, :b11, :b10, :b9, :b8, :b7)
               AND v_desc IN (SELECT v_trans_source_code
                                FROM benefit_trans_source
                               WHERE income_tax_app = :b6)
               AND v_lob_code = DECODE (:b1, :b2, v_lob_code, :b1));
    Thank You.....

    Thanks a lot,
    Thanks a lot. I changed the SQL and it works fine, but it slows down my main query. My main query calls a function which does the sum.
    Here is the query:
    SELECT A.*
      FROM (SELECT a.n_agent_no, a.v_agent_code, a.n_channel_no, v_iden_no,
                   a.n_cust_ref_no, a.v_agent_type, a.v_company_code,
                   a.v_company_branch, a.v_it_no,
                   bfn_get_agent_name (a.n_agent_no) agentname,
                   PKG_AGE__TAX.GET_TAX_AMT (:P_FROM_DATE, :P_TO_DATE,
                                             :P_LOB_CODE, A.N_AGENT_NO) comm,
                   c.v_ird_region
              FROM agent_master a, agent_lob b, agency_region c
             WHERE a.n_agent_no = b.n_agent_no
               AND a.v_agency_region = c.v_agency_region
               AND :p_lob_code = DECODE (:p_lob_code, 'ALL', 'ALL', b.v_line_of_business)
               AND :p_channel_no = DECODE (:p_channel_no, 1000, 1000, a.n_channel_no)
               AND :p_agency_group = DECODE (:p_agency_group, 'ALL', 'ALL', c.v_ird_region)
             GROUP BY a.n_agent_no, a.v_agent_code, a.n_channel_no, v_iden_no,
                      a.n_cust_ref_no, a.v_agent_type, a.v_company_code,
                      a.v_company_branch, a.v_it_no,
                      bfn_get_agent_name (a.n_agent_no),
                      BPG_AGENCY_GEN_ACL_TAX.BFN_GET_TAX_AMOUNT (:P_FROM_DATE, :P_TO_DATE,
                                                                 :P_LOB_CODE, A.N_AGENT_NO),
                      c.v_ird_region
             ORDER BY c.v_ird_region, a.v_agent_code DESC) A
     WHERE (COMM < :P_VAL_IND OR COMM >= :P_VAL_IND1);
    Any idea how to make this faster?
    Thank You...

  • Sharepoint Designer workflow takes long time for execution of action

    Hi All ,
    I have created a declarative workflow using SharePoint Designer 2010. It executes successfully, but it takes a lot of time to execute.
    Below are the details:
    The workflow contains only one activity, "Assign Task to User", and the workflow starts automatically after a document is uploaded.
    The workflow takes 10 minutes to create the task for the user, 10 minutes to claim the task, and 10 minutes to execute when any action (Approve or Reject) is taken on the task.
    There is no error in the log file or event log related to the workflow.
    Options tried:
    1. I have tried the options suggested in this article (http://www.codeproject.com/Articles/251828/How-to-Improve-Workflow-Performance-in-SharePoint#_rating), but no luck.
    2. Reduced the interval of the workflow timer job from 5 to 1; still no luck.
    Any thoughts regarding this would be appreciated.
    ragava_28

    Hi Thuan,
    I have a similar issue posted here:
    http://social.msdn.microsoft.com/Forums/sharepoint/en-US/82410142-31bc-43a2-b8bb-782c99e082d3/designer-workflow-with-takes-time-to-execute?forum=sharepointcustomizationprevious
    Regards,
    SPGeek03

  • Takes long time for songs to start playing

    I have Windows 7. I have problems when I try to play a song: it takes a long time and "loading URL" is displayed on the screen. Why does it take so long? I have tried iTunes versions 8 through 10.5; the only version where it doesn't happen is iTunes 7. iTunes also freezes up sometimes. Thanks in advance.

    Hi,
    To troubleshoot this issue, please install the Windows Performance Tools (WPT) Kit. The WPT Kit contains performance analysis tools and is designed for analysis of a wide range of performance problems, including application start times, boot issues, deferred procedure calls and interrupt activity (DPCs and ISRs), system responsiveness issues, application resource usage, and interrupt storms.
    To get the installer, you have to install the Windows 7 SDK.
    Microsoft Windows SDK for Windows 7 and .NET Framework 3.5 SP1
    http://www.microsoft.com/en-us/download/details.aspx?id=3138
    For shutdown tracing:
    Run command:
    xbootmgr -trace shutdown -noPrepReboot -traceFlags BASE+CSWITCH+DRIVERS+POWER -resultPath C:\TEMP
    Collect logs and post them for further troubleshooting.
    For more information please refer to following MS articles:
    Long Shutdown Time on Windows 7 Ultimate x64
    http://social.technet.microsoft.com/Forums/en/w7itproperf/thread/11a42a93-efd2-4184-9ce8-bbc1438b7ea6
    Long shutdown time on Windows 7 64 bit laptop
    http://social.technet.microsoft.com/Forums/en-US/w7itproperf/thread/4440fc6e-c81e-440c-9183-9b7e176729d2
    Lawrence
    TechNet Community Support

  • Batch creation program takes long time for large file

    Hi,
    I am uploading batches using a custom program which uses BAPI_BATCH_SAVE_REPLICA. The program takes 4 hours to upload 100,000 records.
    But when I use a file with 400,000 records, it creates the second batch 8 hours after the start of the batch job, and the job takes
    more than 50 hours to complete.
    Any suggestions?
    Regards,
    Tan

    We had a similar problem loading materials, where it would have taken over 23 hours to load over 500,000 material masters. It may be that logged changes are switched on for the table you are loading the data into; you may need to switch this off temporarily, or copy and amend the BAPI you are using so that it doesn't update the log.
    The other thing you can do is split your data file into smaller batches and load it that way.

  • Wait for Image created with Toolkit.createImage()

    I am trying to create a new image using Toolkit.createImage(byte[]). When I then try to process this image with an ImageObserver via Container.prepareImage(), I get an error returned from Container.checkImage().
    It looks like Toolkit.createImage(byte[]) did not finish creating the image. But how can I wait for it?

    Try using a MediaTracker:
    MediaTracker tracker = new MediaTracker(this);
    tracker.addImage(image, 0);
    try {
        tracker.waitForID(0);
    } catch (InterruptedException e) {
    }
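    For completeness, here is a small sketch that combines the two steps into a helper; the method name is mine, and it assumes the byte array holds a valid image and that any Component can serve as the tracker host:
    static Image loadFully(byte[] data, Component host) {
        Image image = Toolkit.getDefaultToolkit().createImage(data);
        MediaTracker tracker = new MediaTracker(host);
        tracker.addImage(image, 0);
        try {
            tracker.waitForID(0); // returns once the image has finished loading or errored
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return image;
    }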
