How to improve site download time?

I use DW CS4. I have several sites that involve many photos. I notice that it sometimes takes a very long time for the sites to come up. I host with HostMonster. I know they offer a file compression option. Is it advisable for me to compress the JPEG photos in my sites to improve the initial download time of the sites I design? Does compression actually improve the load time? Thanks for your help.

How many files are being downloaded onto the page?
What is the approximate file size of these images?
How many machines have you tested this on?
What Operating System?
Also, if your computer is dated and running Windows XP, it will run slowly under all the patches Microsoft pushes out to fix (or, it sometimes feels, to slow down) your system.
Anyhow, there are many factors at play:
1. File size really isn't the issue it used to be unless your visitors are on dial-up; it's not 1995 AOL anymore, we're in the age of high-speed connections.
2. The user's computer and system configuration.
3. Your host's output and how they allocate bandwidth to small accounts like ours versus the big corporate customers.
4. Browsers: are you testing in older versions or newer ones?
5. How much of the system's resources are in use and how much is available on the machine you're testing from.
Just some ideas.
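On the compression question specifically: JPEG photos are already compressed, so a host-side gzip/deflate option mostly helps HTML, CSS and JavaScript rather than the images themselves. What usually makes the biggest difference is re-saving the photos at their display size and a lower JPEG quality before uploading. As a rough sketch of that idea (nothing Dreamweaver- or HostMonster-specific; the file names and the 0.7 quality factor are just placeholders), plain Java ImageIO can re-encode a photo like this:

    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.IOException;
    import javax.imageio.IIOImage;
    import javax.imageio.ImageIO;
    import javax.imageio.ImageWriteParam;
    import javax.imageio.ImageWriter;
    import javax.imageio.stream.ImageOutputStream;

    public class JpegRecompress {
        public static void main(String[] args) throws IOException {
            File src = new File("photo-original.jpg");   // placeholder input file
            File dst = new File("photo-web.jpg");        // placeholder output file

            BufferedImage image = ImageIO.read(src);

            ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
            ImageWriteParam param = writer.getDefaultWriteParam();
            param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
            param.setCompressionQuality(0.7f);           // ~70% quality is usually fine for web photos

            ImageOutputStream out = ImageIO.createImageOutputStream(dst);
            try {
                writer.setOutput(out);
                writer.write(null, new IIOImage(image, null, null), param);
            } finally {
                writer.dispose();
                out.close();
            }
        }
    }

Batch-running something like this, or simply using an image editor's "Save for Web" export, typically cuts multi-megabyte camera files down to a few hundred kilobytes each before they ever reach the server.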

Similar Messages

  • How to improve quality of time capsule connection signal?

    I have one Time Capsule on the ground floor, but the signal is very weak on the first floor. How can I improve the quality of the internet signal?

    wassimfrombeirut wrote:
    Dear LaPastenague
    Thank you for your reply
    i just need to ask you a question; i am a new system admin using Mac.
    I need software that can manage my internet bandwidth, quota, and restrictions on Facebook, YouTube and other websites.
    Could you guide me to software I can use?
    Note that my environment is all Mac laptops, using a Time Capsule as the Wi-Fi source to connect to the internet.
    Thank you,
    best regards
    wassim
    The best software to do what you want is not on the Mac at all; it is router firmware called Gargoyle.
    You can keep using the TC as the Wi-Fi source, but get a router that can take the Gargoyle firmware. The best in my testing is the Netgear WNDR3700. Gargoyle itself is a free download.
    http://www.gargoyle-router.com/download.php
    There are other routers and router boards it will work on, but make sure you pick something compatible. I have used several different routers and the one above is the best so far.

  • How to improve the load time of my swf group

    Hi,
    I need some tricks to improve the load time of my Captivate online training SWF. My training has 6 sections, and it takes 3 minutes to download each time I open the training window. That is too much time, and if there are 50 users at the same time it will take a lot of my website's bandwidth. Do you have any tips on Captivate settings, or other tips, to help reduce the training download time? I do not understand why the 6 modules load simultaneously instead of each one loading when I click to start that part of the training.
    Can you help me with my problem?
    Thank you


  • HOW TO IMPROVE SITE TO BE SHOWN BY GOOGLE

    Hello, is there any recommendation on how to improve a site created in iWeb so that it shows up better on Google? My site www.cykasy.cz shows up more or less fine, but when I search for some typical keywords it falls well behind substantially less important sites (even sites which are not visited much).
    I was for instance recommended to insert:
    * meta keywords
    * meta description
    * H2 headline
    but I do not know how to handle this in iWeb. I have basic knowledge of HTML.
    Thank you,

    PeregrinusX wrote:
    ...is there any recommendation on how to improve a site created in iWeb in order to show better on Google?
    See this article:
    SEO For iWeb: How to get your iWeb Websites into Google & Other Major Search Engines
    As you can see, they recommend using their free SEO tool:
    http://www.ragesw.com/products/iweb-seo-tool.html
    PeregrinusX wrote:
    I was for instance recommended to insert: meta keywords
    See this article:
    Google does not use the keywords meta tag in web ranking
    By the way, rather than posting your URL like this:
    www.cykasy.cz
    ...include the prefix to make it conveniently clickable:
    http://www.cykasy.cz
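    If you just want to see what those recommended tags look like in plain HTML before deciding how to get them into your iWeb pages (by hand or via the SEO tool above), here is a minimal, made-up example; note that there is deliberately no keywords tag, per the Google article above:
        <head>
          <title>A page title that matches what people actually search for</title>
          <meta name="description" content="One or two plain sentences summarising this page for the search results snippet.">
        </head>
        <body>
          <h2>A secondary headline that contains your key phrase</h2>
        </body>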

  • How to improve the execution time of my VI?

    My VI does data processing for hundreds of files and takes more than 20 minutes to complete. The setup is: first I use the directory list function to list all the files in a directory into a string array. Then I index this string array into a for loop, in which each file is opened one at a time and some other subVIs are called to do the data analysis. Is there a way to improve my execution time? Maybe by loading all the files into memory at once? It would also be nice to be able to see which section of my VI takes the longest. Thanks for any help.

    Bryan,
    If "read from spreadsheet file" is the main time hog, consider dropping it! It is a high-level, very multipurpose VI and thus carries a lot of baggage around with it. (You can double-click it and look at the "guts".)
    If the files come from a just-executed "list files", you can assume the files all exist and that you want to read them in one single swoop. All that extra detailed error checking for valid filenames is not needed, and you never want it to, e.g., pop up a file dialog if a file goes missing; you simply skip it silently. If open generates an error, just skip to the next in line. Case closed.
    I would do a streamlined low-level "open->read->close" for each and do the "spreadsheet string to array" in your own code, optimized to the exact format of your files. For example, notice that "read from spreadsheet file" converts everything to SGL, a waste of CPU if you later need to convert it to DBL for some signal processing anyway.
    Anything involving formatted text is not very efficient. Consider a direct binary file format for your data files; it will read MUCH faster and take up less disk space.
    LabVIEW Champion. Do more with less code and in less time.
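    Side note (not from the reply above, which is about LabVIEW VIs): the formatted-text-versus-binary point is general. As a rough illustration in Java, if each data file were laid out as a sample count followed by raw IEEE doubles (a hypothetical format), a tight low-level read needs no string-to-number conversion at all:
        import java.io.BufferedInputStream;
        import java.io.DataInputStream;
        import java.io.FileInputStream;
        import java.io.IOException;

        public class BinaryRead {
            // Reads one file of the hypothetical layout: an int count, then 'count' doubles.
            public static double[] readFile(String path) throws IOException {
                DataInputStream in = new DataInputStream(
                        new BufferedInputStream(new FileInputStream(path)));
                try {
                    int n = in.readInt();
                    double[] data = new double[n];
                    for (int i = 0; i < n; i++) {
                        data[i] = in.readDouble();   // raw bytes straight to a double, no text parsing
                    }
                    return data;
                } finally {
                    in.close();
                }
            }
        }
    The same idea applies in LabVIEW: a plain open->read->close with a known binary layout avoids all the per-value text parsing that "read from spreadsheet file" has to do.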

  • How do you schedule download times?

    We have to use HughesNet. There are severe download penalties if you download at peak times. I cannot add any new programs, download iTunes, or install software updates without it counting against our usage. We can download freely between 2 and 7 a.m., and I don't wish to stay up to do it.
    Any suggestions other than leaving Hughes? That's not an option out here in the sticks.

    The Hughes satellite website says to use a download manager.
    Then they say they can't recommend one. Bummer!
    http://www.nationwidesatellite.com/HughesNet/service/HughesNetfair_accesspolicy.asp
    But then I see this page mention a HughesNet Downloader?
    http://www.dslreports.com/forum/r24590060-HN9000-Download-Directv-on-demand-movie-via-HughesNet
    You can't schedule anything in iTunes itself. Perhaps you can find a download manager that can do it.

  • How to improve the activation time for a standard DSO

    Hi all,
    I'm facing an issue related to the activation of a DSO. The SM37 log is as follows....
    12:03:16 Job started
    12:03:16 Step 001 started (program RSPROCESS, variant &0000000006946, user ID BWREMOTE)
    12:03:20 Activation is running: Data target ZFIZIASA, from 93,627 to 93,627
    12:18:00 Overlapping check with archived data areas for InfoProvider ZFIZIASA
    12:18:00 Data to be activated successfully checked against archiving objects
    12:18:02 Status transition 2 / 2 to 7 / 7 completed successfully
    12:18:13 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4IC45QJA588GKZ0M7JEJ3HCAR with ID 1218130
    12:18:19 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4IC45QJA588GKZ0M7JEJ3HCAR with ID 1218190
    12:18:20 Parallel processes (for Activation): 000003
    12:18:20 Timeout for parallel process (for Activation): 000300
    12:18:20 Package size (for Activation): 020000
    12:18:20 Task handling (for Activation): Batch processes
    12:18:20 Server group (for Activation): No server group configured
    12:18:20 Activation started (process is running under user BWREMOTE)
    12:18:20 Not all data fields were updated in mode "overwrite"
    12:18:20 Process started
    12:18:20 Process completed
    12:18:20 Activation ended
    Please have a look at the 3rd and 4th lines above (12:03:20 to 12:18:00, almost 15 minutes in that single step). I am not able to work out where the issue is or what to do to minimize the activation time.
    It is very challenging, please reply!
    Please help.
    Thanks in adv.
    Ajay

    Hi Kundan,
    Thanks for the response!
    Actually, I have two identical DSOs, with all the same characteristics and key figures, fed from the same DataSource at the same time, but the issue is...
    1) For the 1st DSO, the activation log is...
    01.07.2010 02:02:41 Job started
    01.07.2010 02:02:41 Step 001 started (program RSPROCESS, variant &0000000006946, user ID BWREMOTE)
    01.07.2010 02:02:46 Activation is running: Data target ZFIZIASA, from 93,751 to 93,751
    01.07.2010 02:19:27 Overlapping check with archived data areas for InfoProvider ZFIZIASA
    01.07.2010 02:19:27 Data to be activated successfully checked against archiving objects
    01.07.2010 02:19:27 Status transition 2 / 2 to 7 / 7 completed successfully
    01.07.2010 02:19:28 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4ICC6UMXYT00Z2DYOQV8QL8CJ with ID 021
    01.07.2010 02:19:30 Parallel processes (for Activation): 000003
    01.07.2010 02:19:30 Timeout for parallel process (for Activation): 000600
    01.07.2010 02:19:30 Package size (for Activation): 020000
    01.07.2010 02:19:30 Task handling (for Activation): Batch processes
    01.07.2010 02:19:30 Server group (for Activation): No server group configured
    01.07.2010 02:19:30 Activation started (process is running under user BWREMOTE)
    01.07.2010 02:19:30 Not all data fields were updated in mode "overwrite"
    01.07.2010 02:19:30 Activation ended
    2) For the 2nd DSO, the activation log is...
    01.07.2010 02:01:13 Job started
    01.07.2010 02:01:13 Step 001 started (program RSPROCESS, variant &0000000006947, user ID BWREMOTE)
    01.07.2010 02:01:35 Attivazione is running: Data target ZFIGL_02, from 93,749 to 93,749
    01.07.2010 02:01:43 Overlapping check with archived data areas for InfoProvider ZFIGL_02
    01.07.2010 02:01:43 Data to be activated successfully checked against archiving objects
    01.07.2010 02:01:53 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4ICCAR91ALHE1VHSB6GV9PAPV with ID 02015300
    01.07.2010 02:01:54 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4ICCAR91ALHE1VHSB6GV9PAPV with ID 02015400
    01.07.2010 02:01:56 Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCTL_4ICCAR91ALHE1VHSB6GV9PAPV with ID 02015600
    Now my client is asking me to bring the activation time of the 1st DSO down to what the 2nd one takes.
    I'm totally blank on what to do and how to do it!
    The volume of data ...
    ZFIZIASA
    Active table - Number of entries: 13.960.508
    Change Log Table - Number of entries: 13.976.530
    ZFIGL_02
    Active Table - Number of entries: 21.923.947
    Change Log table - Number of entries: 21.938.657
    Thanks,
    ajay

  • How to improve server response time...

    We have two Solaris 5.8 machines with 4 GB of RAM in each of them, so they are pretty big. We have round-robin load balancing. We are using LoadRunner to monitor the performance of our application. So far our application has not even passed a 50-user load test. While we are performing the load test the site is not usable because it gets extremely slow. So we thought that by
    increasing the request threads (in both kxs & kjs) and the connection pool size we might increase the performance, but it did just the opposite. As the number of users increases, the response time increases as well. Our application framework is based on MVC. What are the best application server settings for a large number of users? Currently in production our old application supports 200 users at a time (per LoadRunner).
    We are using an Oracle 8i database.
    Any help will be appreciated!!
    Thanks

    Hi,
    we are having problems with server response on Solaris as well, although this is with an Informix database and we do not have load balancing set up. Our server has 3 GB of RAM. However, the same app ran fine on an NT iPlanet installation where the server only had 512 MB of RAM.
    This has occasioned a lot of hard work trying to pin down the problem. We are currently investigating the JDKs. JDK 1.3 almost completely eliminates the problem we had. Using the Java profiler we noticed that in JDK 1.2.2 most of the CPU time seemed to be spent in the socket read and socket accept methods. I've attached the profile output for JDK 1.3 and 1.2.2; there is certainly a marked difference.
    I would be really interested to see if you get similar results. I simply added -Xrunhprof:cpu=samples to the JAVA_ARGS in iasenv.ksh.
    Andy

  • How to improve computer response time

    I have the computer listed below, but would really like faster startup, both of the system and of applications. I don't use my computer for demanding tasks such as video editing, but I still want a quick response from it. I have a 7200 rpm drive, and I've heard that an SSD isn't going to give me much of an improvement and isn't worth the money (and that it's the only option on a MacBook Pro). The amount of storage capacity isn't a priority: I just want a fast computer for simple everyday tasks and don't like waiting for a slow hard drive to do its work.
    In a Mac Pro, should a striped RAID 0 with 4 identical HDDs give some improvement, or is this only the case when large files are accessed?
    It also ought to matter even more when dealing with 4 cores and above.
    Can one of you Mac Pro guys give me a hint of what would be a good idea?
    I have some money to spend.

    90% of the time there is no need to use striped RAID, or even mirrored.
    There are other ways, and I'd start with what you know, then experiment; 'striping' was designed for scratch performance, back when storage was limited in size and the only way to get 200 MB/sec was with 8 drives and $2k in equipment.
    Do some reading on Barefeats for now.
    http://www.barefeats.com/hard103.html
    http://www.barefeats.com/hard112.html
    http://www.barefeats.com/harper13.html
    http://techreport.com/articles.x/16130/10
    The only time you would want to is when working with 2 GB files in CS3, or video. But you really will be hard pressed to outgrow a 10K-rpm drive; even with the 'pedal to the metal', those are designed to do a lot of work.
    Use one drive for the OS, one drive for media and library files, a 3rd for scratch, and a 4th as internal backup. Outgrow that first. A Mac Pro is a lot of workstation, after all. And until quad-core chips come in more flavors and more applications (like video and pro apps) can exploit more than 2-4 cores, you probably don't need 8 cores today. Heck, a dual-core 2.8GHz can handle most any consumer's needs.
    CPU crunch: various Mac Pro configurations vs others
    http://www.barefeats.com/octopro3.html
    http://www.barefeats.com/harper7.html
    http://www.barefeats.com/hdvid02.html
    http://www.barefeats.com/harper.html
    iMac vs Mac Pro is a common question
    http://www.macworld.com/article/133467/2008/05/imaccomparison.html

  • How to improve the run time of this query

    Is there any way to improve this query?
    I have a table in which each SR_WID has at least one activity subtype among 'Break Fix', 'Break/Fix', 'Break Fix/Corrective Maint', 'Break Fix-Corrective Maint', and can then have other subtypes among 'Break Fix', 'Break/Fix', 'Follow Up', 'Follw-Up', 'T&'||'M Break Fix', 'Break Fix/Corrective Maint', 'Break Fix-Corrective Maint'.
    Let me know if this is okay or how to modify it.
    SELECT DISTINCT A.SR_WID AS SR_WID
    FROM WC_SR_ACT_SMRY_FS A
    WHERE EXISTS
    (SELECT NULL FROM WC_SR_ACT_SMRY_FS B
    WHERE B.ACT_TYPE = 'Maintenance'
    AND B.ACT_SUBTYPE in ('Break Fix', 'Break/Fix','Break Fix/Corrective Maint','Break Fix-Corrective Maint') AND B.NBR_OF_ACTIVITIES = 1 AND B.SR_WID = A.SR_WID
    GROUP BY B.SR_WID
    HAVING COUNT(B.SR_WID) >= 1)
    AND A.ACT_TYPE = 'Maintenance'
    AND A.ACT_SUBTYPE IN ('Break Fix', 'Break/Fix', 'Follow Up', 'Follw-Up','T&'||'M Break Fix','Break Fix/Corrective Maint','Break Fix-Corrective Maint')

    SELECT DISTINCT A.SR_WID AS SR_WID
               FROM WC_SR_ACT_SMRY_FS A
              WHERE EXISTS(
                       SELECT   NULL
                           FROM WC_SR_ACT_SMRY_FS B
                          WHERE B.ACT_TYPE = 'Maintenance'
                            AND B.ACT_SUBTYPE IN
                                  ('Break Fix',
                                   'Break/Fix',
                                   'Break Fix/Corrective Maint',
                                   'Break Fix-Corrective Maint')
                            AND B.NBR_OF_ACTIVITIES = 1
                            AND B.SR_WID = A.SR_WID
                       GROUP BY B.SR_WID
                         HAVING COUNT(B.SR_WID) >= 1 )
                AND A.ACT_TYPE = 'Maintenance'
                AND A.ACT_SUBTYPE IN
                      ('Break Fix',
                       'Break/Fix',
                       'Follow Up',
                       'Follw-Up',
                       'T&' || 'M Break Fix',
                       'Break Fix/Corrective Maint',
                       'Break Fix-Corrective Maint');
    First of all, you can omit the GROUP BY and HAVING part in the sub-select: we already know that the only value for B.SR_WID can be A.SR_WID, and COUNT(B.SR_WID) will always be at least 1; otherwise the sub-select would return nothing.
    As a second step, you could transform EXISTS() into IN():
    SELECT DISTINCT SR_WID
               FROM WC_SR_ACT_SMRY_FS A
              WHERE SR_WID IN(
                       SELECT B.SR_WID
                         FROM WC_SR_ACT_SMRY_FS B
                        WHERE B.ACT_TYPE = 'Maintenance'
                          AND B.ACT_SUBTYPE IN
                                ('Break Fix',
                                 'Break/Fix',
                                 'Break Fix/Corrective Maint',
                                 'Break Fix-Corrective Maint')
                          AND B.NBR_OF_ACTIVITIES = 1)
                AND ACT_TYPE = 'Maintenance'
                AND ACT_SUBTYPE IN
                      ('Break Fix',
                       'Break/Fix',
                       'Follow Up',
                       'Follw-Up',
                       'T&' || 'M Break Fix',
                       'Break Fix/Corrective Maint',
                       'Break Fix-Corrective Maint');
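    If it is still slow after those changes, check the execution plan; assuming the table has no suitable index yet, a composite index on the filtered columns is the usual next step (the index name here is made up, and you should confirm with EXPLAIN PLAN that the optimizer actually picks it up):
        CREATE INDEX wc_sr_act_smry_fs_ix1
            ON wc_sr_act_smry_fs (act_type, act_subtype, nbr_of_activities, sr_wid);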

  • How to improve jsp load time?

    Hi everyone --
    Got a question for all of you.
    I have a form in a JSP, which does a (Oracle 9i) db lookup on 13 fields and populates the drop-down boxes. The load time for the JSP is slooooww.
    These are my options for speeding up the load time (I think):
    1. When the web/app server starts the application, have it do all the db lookups and store the resultsets...somewhere. Maybe have some dblistener thread that periodically checks for updates to the db. Then, when someone actually requests the JSP, all the JSP has to do is read the pre-populated drop-down boxes and load.
    2. Wait until someone requests the JSP and store all the resultsets as cookies for the session. The resultsets will be good for the home JSP and its sister. This doesn't really solve the problem, since the initial load will still be slow.
    3. Combine all the lookup statements into one huge statement, and store the results somewhere. This method queries the db only once, and the drop-down boxes can be populated from one query (instead of 13 separate connections and queries).
    All of these ideas seem mediocre to me, and they don't address the fact that the results of the queries change constantly. What I need is a way to make the JSP load quickly and with the most current information.
    Does anyone have any ideas? Which of my suggestions is rather boneheaded?
    Thanks for any advice,
    -Kwj

    It's still not perfect, but it's better at least.
    I'm querying 4 different tables overall, and I have a Db class that the jsp instantiates.
    from home.jsp:
    <%
        Db d = new Db();
        ResultSet[] all = d.getAll();
        ResultSet rs;
    %>
    <body>
    <tr><td align=right>Correspondence ID: </td><td><select>
    <option label=""></option>
    <%
        rs = all[1];
        while (rs.next()) {
    %>
            <option><%= rs.getString("sak_correspondence") %></option>
    <%
        }
    %>
    </select></td>
    </body>
    from Db.java:
    public ResultSet[] getAll() throws SQLException {
        ResultSet[] all = new ResultSet[4];
        all[0] = con.prepareStatement(
            "select nam_category from doco.corr_category " +
            "order by nam_category asc").executeQuery();
        all[1] = con.prepareStatement(
            "select sak_correspondence, dte_received, dte_response_due, " +
            "dte_sent, file_path, id_outref, subject " +
            "from doco.correspondence").executeQuery();
        all[2] = con.prepareStatement(
            "select nam from doco.corr_type order by nam asc").executeQuery();
        all[3] = con.prepareStatement(
            "select nam_subcategory from doco.corr_subcategory " +
            "order by nam_subcategory asc").executeQuery();
        return all;
    } // getAll()
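    One way to avoid both the repeated lookups and the awkwardness of handing open ResultSets to a JSP (just a sketch; the class name is made up, and it reuses the doco.correspondence column from your code) is to copy each lookup into a plain List once, cache it in application scope, and refresh it on a timer:
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;

    public class LookupCache {
        private List<String> correspondenceIds = new ArrayList<String>();
        private long loadedAt = 0L;
        private static final long MAX_AGE_MS = 60 * 1000;   // refresh at most once a minute

        public synchronized List<String> getCorrespondenceIds(Connection con) throws SQLException {
            if (System.currentTimeMillis() - loadedAt > MAX_AGE_MS) {
                List<String> fresh = new ArrayList<String>();
                PreparedStatement ps = con.prepareStatement(
                        "select sak_correspondence from doco.correspondence");
                ResultSet rs = ps.executeQuery();
                try {
                    while (rs.next()) {
                        fresh.add(rs.getString("sak_correspondence"));
                    }
                } finally {
                    rs.close();
                    ps.close();
                }
                correspondenceIds = fresh;
                loadedAt = System.currentTimeMillis();
            }
            return correspondenceIds;
        }
    }
    The JSP then just loops over a List<String> it gets from the cache, the statements and result sets are always closed, and the data is never more than a minute stale, which also addresses the "results change constantly" concern.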

  • How can I improve the response time of the user interface?

    I'm after some tips on how to improve the response time to mouse clicks on a VI front panel.
    I have a data acquisition application which used to run fine, but after spending a couple of weeks making a whole bunch of changes to it I find that the user interface has become a bit sluggish.
    My main GUI VI has a while loop running 16 times a second, updating some waveform charts and polling about a dozen buttons on the front panel.
    There is sometimes a delay (variable, but up to 2 seconds sometimes) from when I click on a button to when it becomes depressed. I have wired the iteration terminal of the while loop to an indicator on the front panel and can see that the while loop is ticking over during the delayed response to the mouse click, so I know that the problem is not that the whole program is running slow, just the response to mouse clicks.
    Also, just for debugging purposes, I have indicators of the iterations of all the main while loops in my program on the front panel, so I can see that there are no loops running abnormally fast either.
    One thing I've tried is to turn off multi-threading, and this does seem to work - the response to mouse clicks is much faster. However, it has the side effect of making the main GUI while loop run less evenly. I was trying to get a fairly smooth waveform scrolling across the screen, and when multi-threading is off it gets a bit jerky.
    Any other suggestions welcome.
    (I am using LabVIEW 7.1, Windows 2000).
    Regards,
    Mark.

    Hi Altenbach,
    Thanks for your reply. In answer to your questions:
    I am doing both DAQ board and serial data acquisition. I am using NIDAQ traditional for the DAQ board, and VISA for the serial. I have other similar versions of this program that do only DAQ board, or only serial, and these work fine. It was only when I combined them both into the same program that I ran into problems.
    The multiple while loops are actually in separate VIs. I have one VI that acquires data from the DAQ card, another VI that acquires data from the serial port, another VI that processes the data and saves to file, and another VI, the GUI VI, that displays the data in graphs and charts.  The data is transferred betwen the VIs via LV2 globals.
    The GUI VI is a bit more complicated than I first mentioned. It has a tab control, with 4 waveform charts on one page, 4 waveform graphs on another page, and 3 waveform graphs on another page. The charts have a history length of 2560, and 16 data points are added 16 times a second. The waveform graphs are only updated once per minute.
    I don't use the value property at all, but I do use lots of property nodes for changing the properties of the graphs and charts e.g. changing plot colours, Y scale range etc. There is only one local variable (for the Tab control). All the graphs and charts have data wired directly to their terminals.
    I haven't done any profiling yet.
    I am building arrays in uninitialised shift registers, but this is all well under control. As the experiment goes on, more data is collected and stored, and so the memory usage does gradually increase, but only to the extent that I would expect.
    The CPU usage is 100%, but I thought this was always the case when using NIDAQ with DAQ cards. Or am I wrong about this? (As a side note, I am using NIDAQ traditional, but would NIDAQmx be better?)
    Execution priority of the GUI vi (and all the other VIs for that matter) is set to normal.
    The program is a bit large to post here, and I'm not sure if my company would be happy for me to publicise it anyway, so I suspect that this is turning into one of those questions that are going to be impossible to answer.
    Just as a more general question, why would turning off multi-threading improve the user interface response?
    Thanks,
    Mark.

  • How to improve quality of photos or clipart downloaded from web?

    Hi,
    My name is Gail and I am new to this forum. I am a novice in Adobe Photoshop Elements 2. Searched but couldn't find answer to this question:
    When I download images from the web (right click and Save As) they often appear blurry, especially if I enlarge them. Usually the only Save As choices are JPEG or bitmap. How can I improve the quality? Please answer simply, as I have limited knowledge in this area. :)
    I have Photoshop Elements 2. My operating system is Windows XP. I usually use the clipart or photo in Printshop 21 when creating greeting cards, etc., or in making a DVD cover design.
    Can someone help, please?
    Gail

    Hi Gail,
    I'm afraid the short answer to your question is that often you cannot improve the quality.
    The holy grail of web site design is to create web pages that take up as little space as possible. The reason for this is that the contents of web pages have to be transmitted over the public Internet, bit by bit and byte by byte, from a hosting server to the viewer's desktop. Fewer bits and bytes to shunt around therefore mean faster download times and less time for an entire page to display correctly on your screen.
    With this ideology in mind many applications, Photoshop Elements included, have features built into them to downsize images to the smallest possible file size (bytes) for displaying at the author's chosen size at an acceptable quality.
    That said, there are other examples on the Internet where people provide 'thumbnail' images that link to larger versions of that image or even to the full original file size. Look out for those instead.
    Hope this helps.
    Mark
    PS. Don't forget about the whole issue of copyright. That is another reason why some people will render their work to such small file sizes (i.e. to prevent unauthorised copying and printing).

  • How to let user download multi files at the same time in WebDynpro ABAP?

    hi all:
    As you know, Web Dynpro provides upload/download UI elements, but it seems they only support one file upload/download at a time. The following is the API method to download one file in Web Dynpro ABAP:
        cl_wd_runtime_services=>attach_file_to_response(
         EXPORTING
           i_filename      = lv_filename
           i_content       = lv_content
           i_mime_type     = lv_mime_type
           i_in_new_window = abap_true
           i_inplace       = abap_false
    *      EXCEPTIONS
    *        others          = 1
        ).
    But when the user clicks one button, we want to give the user an HTML file plus 2 icon files which are used as that HTML file's resources. How can we let the user download these 3 files together at the same time?
    One simple way is calling the download API (cl_wd_runtime_services=>attach_file_to_response) 3 times,
    but it is very ugly that three popup windows are shown to let the user select each file's download path, which is unacceptable.
    So does anyone know a more convenient way to handle it?
    thanks.

    Hi,
    I suggest you zip the files and attach the zip to the response. Do the "Add file" part for each of your files:
         "References
         DATA lr_zip TYPE REF TO cl_abap_zip.
         "Variables
         DATA lv_zip_xstring TYPE xstring.
         DATA lv_zip_name TYPE string.
         DATA lv_file_content TYPE xstring.
         DATA lv_file_name  TYPE string.
         "Create instance
         CREATE OBJECT lr_zip.
         "Add file
         lr_zip->add(
           EXPORTING name = lv_file_name
                  content = lv_file_content ).
         lr_zip->save( RECEIVING zip = lv_zip_xstring ).
         "Attach zip file to response
         cl_wd_runtime_services=>attach_file_to_response(
           EXPORTING i_filename      = lv_zip_name
                     i_mime_type     = 'application/zip'
                     i_content       = lv_zip_xstring ).
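    For the case in the question (one HTML page plus two icons), just call add( ) once per file before save( ); only a single popup appears because only the one zip file goes through attach_file_to_response. A rough sketch, with made-up file names and content variables that you would fill the same way you fill lv_content today:
         "Hypothetical example: add all three files to the same zip
         lr_zip->add( EXPORTING name = 'page.html' content = lv_html_content ).
         lr_zip->add( EXPORTING name = 'icon1.gif' content = lv_icon1_content ).
         lr_zip->add( EXPORTING name = 'icon2.gif' content = lv_icon2_content ).
    Then save and attach exactly as above, giving lv_zip_name a sensible name such as 'download.zip'.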

  • I downloaded a song from iTunes. It plays on my computer, but won't play on my iPhone. Any suggestions on how to fix this? Or how can I re-download the song without getting charged a second time? I could not find an option to "report a problem".


    I could not find an option to "report a problem".
    Log in to the Store. Click on "Account" in your Quick Links. When you're in your Account information screen, go down to Purchase History and click "See all".
    Find the item that is not playing properly. If you can't see "Report a Problem" next to the entry, click the "Report a problem" button. Now click the "Report a Problem" link next to the item.
    (Not entirely sure what happens after you click that link, but fingers crossed it should be relatively straightforward.)
