Using a SWF file for a very large static image

I have a large photo that I need to use as a banner image on a website. I have tried optimizing it for the web in Fireworks -- to reduce the file size, obviously -- but the client isn't happy with the resolution. I haven't had this issue come up before, but I see his point (to a degree), as the detail and sharpness of the original are far superior.
I have almost no familiarity with creating in Flash, but I remember reading somewhere that SWF files do not have the same file sizes as JPEG images.
Would it be feasible to include the single static image as a SWF file to preserve its original resolution? I mean, would that image as a SWF file be smaller than the same image as a JPEG?
Many thanks.

There are two things at play here: image quality (often harmed by compression) and image resolution (the number of pixels, width by height). Which one is the real problem?
Whether you render your image as a GIF, JPG, PNG, or SWF file, it will have the same resolution. A SWF stores an embedded photo internally as JPEG (or losslessly compressed) bitmap data anyway, so wrapping the image in a SWF gives you no inherent size or quality advantage over a well-optimized JPEG.
Packing an image into a SWF should be a last resort in any case, as not everyone has Flash installed. For example, iPhone users cannot see SWF content.

Similar Messages

  • How to use a SWF file for a Spark skin

    Hi all,
    I am a little confused about how to use a SWF file with a Spark skin.
    In Flex 3, I used the Flex Skin Design Extension for Flash to deal with skins, so basically I just created a SWF file and imported it into the project, and everything was fine.
    But that does not seem to work with Flex 4 Spark skins:
    1. I can't find the Flex Skin Design extension for Flash CS4.
    2. I tried to use the Flash CS3 version to import the skin art into the project, but it didn't seem to work.
    3. I googled and checked the Flex 4 Help; it seems Spark skins need a skin class, and I don't know how to use that with a SWF file. Everything I have found shows how to use FXG or JPG files in the skin class; none of it uses SWF files.
    So, is there any way to use a SWF file for skinning? And does the Flex Skin Design extension work with Flex 4?
    Thanks

    Hello,
    I'm new to Flex, but have come to it from Flash Pro.  I'd like to know the same thing.  I think I found the answer here:
    http://www.flashallys.com/blog/spark-button-skinning-with-flash-symbols/
    However my question is now: Is this a good approach to use?
    99% of the googling I've done tells me to skin components with FXG files. I've tried that by creating graphics and exporting them from Flash, but using SWFs containing lots of graphics is much faster, so I'd rather use that.
    The other thing I've done is to create custom components using these helpful tutorials:
    Creating component in flash:
    http://www.webkitchen.be/2008/12/12/video-tutorial-make-flex-components-with-flash-cs4/
    Dealing with Resizing with method overrides for your flash component:
    http://www.psyked.co.uk/flex/creating-flex-components-the-easy-way-for-flash-ide-converts.htm
    This allows me the flexibility of Flash Pro design with the layout, transitions, data binding etc. from Flex.
    So my question is: Is this approach (skinning Spark components with SWFs, and using custom SWC components made in Flash) a bad idea for any reason? E.g. does it create slow mobile apps?
    Cheers
    Chris

  • Slow performance or very large XDP file size

    There have been a few reports of people having slow performance in their forms (typically for dynamic forms) or XDP files being very large.
    These are the symptoms of a problem with cut and paste in Designer, where a processing instruction (PI) used to control how Designer displays a specific palette is repeated many, many times. If you look in your XDP source and see this line repeated more than once, then you have the issue:
    <?templateDesigner StyleID aped3?>
    Until now, the workaround has been to apply a style sheet to the XDP and remove the instruction. A patch has since been released that fixes the cut-and-paste issue and also repairs your templates when you open them in a Designer with the patch applied.
    Here is a blog entry that describes the patch as well as where to get it.
    http://blogs.adobe.com/livecycle/2009/03/post.html

    My XDP file grew to 145 MB before I decided to see what was actually happening.
    It appears that the LiveCycle Designer ES program sometimes writes a lot of redundant data... the same line millions of times, over and over again.
    I wrote this small Java program, which reduced the size to 111 KB!!! (wow, what a bug that must have been!)
    Here's the source code:
    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileNotFoundException;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.IOException;

    public class MakeSmaller {
        private static final String DELETE_STRING = "                           <?templateDesigner StyleID aped3?>";

        public static void main(String... args) {
            BufferedReader br = null;
            BufferedWriter bw = null;
            try {
                br = new BufferedReader(new FileReader(args[0]));
                bw = new BufferedWriter(new FileWriter(args[0] + ".small"));
                String line;
                boolean firstOccurrence = true;
                while ((line = br.readLine()) != null) {
                    if (line.equals(DELETE_STRING)) {
                        // Write the processing instruction only the first time it appears;
                        // every further repetition is the redundant data we want to drop.
                        if (firstOccurrence) {
                            bw.write(line + "\n");
                            firstOccurrence = false;
                        }
                    } else {
                        // All other lines are copied through unchanged.
                        bw.write(line + "\n");
                    }
                }
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                if (br != null) {
                    try {
                        br.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
                if (bw != null) {
                    try {
                        bw.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }
    The file that gets generated sits in the same location as the .xdp file but gets the extension .small. Just in case something goes wrong, the original file is NOT modified, as you can see in the source code. And yes, Designer REALLY wrote that line a gazillion times in the .xdp file (shame on the programmers!!)
    You can also see that I still write the first occurrence of the line to the .small file, just in case it's needed...
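    If you want to try it, with a JDK on your PATH it is just the usual two commands (the file name here is only an example):
    javac MakeSmaller.java
    java MakeSmaller myform.xdp
    This leaves the original file untouched and writes myform.xdp.small next to it.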

  • How to save a FrameworkElement as a very large raster image?

    I have to save my FrameworkElement as a very large raster image. For now I use the RenderTargetBitmap class and a BitmapEncoder, in this way:
    RenderTargetBitmap bmp = new RenderTargetBitmap(ElementWidth, ElementHeight,
         90, 96, PixelFormats.Default);
    bmp.Render(MyElement);  // OutOfMemoryException here
    PngBitmapEncoder encoder = new PngBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(bmp));
    using (var stream = File.Create(filePath))
    { encoder.Save(stream); }
    Where ElementWidth and ElementHeight are large numbers (about 10000x6000). But with this solution there's an OutOfMemoryException when I try to render my element.
    Is there another way to do what I need (without causing the OutOfMemoryException)? Thanks.

    The following link may help you:
    http://social.msdn.microsoft.com/Forums/en-US/wpf/thread/c5e31d70-08d1-4402-8016-0a0b7af49b04/
    Gaurav Khanna | Microsoft VB.NET MVP
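    In case the link goes stale: a common way around the OutOfMemoryException (not necessarily what the linked thread describes) is to avoid allocating one huge RenderTargetBitmap and instead render the element in tiles, using a VisualBrush to pick out one sub-rectangle at a time. Here is a minimal sketch; SaveElementInTiles, the 2000-pixel tile size, and the tile_{x}_{y}.png naming are my own inventions, and it assumes the element has already been measured and arranged at its full size:

    using System;
    using System.IO;
    using System.Windows;
    using System.Windows.Media;
    using System.Windows.Media.Imaging;

    public static class TiledRenderer
    {
        public static void SaveElementInTiles(FrameworkElement element, string folder,
                                              int totalWidth, int totalHeight, int tileSize = 2000)
        {
            for (int y = 0; y < totalHeight; y += tileSize)
            {
                for (int x = 0; x < totalWidth; x += tileSize)
                {
                    int w = Math.Min(tileSize, totalWidth - x);
                    int h = Math.Min(tileSize, totalHeight - y);

                    // A VisualBrush with an absolute viewbox shows only the
                    // (x, y, w, h) sub-rectangle of the element.
                    var brush = new VisualBrush(element)
                    {
                        ViewboxUnits = BrushMappingMode.Absolute,
                        Viewbox = new Rect(x, y, w, h)
                    };

                    var visual = new DrawingVisual();
                    using (DrawingContext dc = visual.RenderOpen())
                    {
                        dc.DrawRectangle(brush, null, new Rect(0, 0, w, h));
                    }

                    // Only one tile-sized bitmap is ever alive at a time.
                    var bmp = new RenderTargetBitmap(w, h, 96, 96, PixelFormats.Pbgra32);
                    bmp.Render(visual);

                    var encoder = new PngBitmapEncoder();
                    encoder.Frames.Add(BitmapFrame.Create(bmp));
                    using (var stream = File.Create(Path.Combine(folder, $"tile_{x}_{y}.png")))
                    {
                        encoder.Save(stream);
                    }
                }
            }
        }
    }

    Each tile is written as its own PNG, which is what keeps memory bounded; if you really need a single 10000x6000 image, stitch the tiles together afterwards with an external tool.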

  • Buffered TDMS file still very large

    Hello, I am trying to stream to a TDMS file. I am writing approximately 10 channels of data at 20 Hz. I was creating a new file every 5 minutes or so and defragmenting each file once I was finished with it, which was giving me a reasonable file size. I decided that I would like to make bigger files (over a period of hours), and then the defragmenting would be too computationally costly to do on the fly. I have been trying to set a buffer for the TDMS files, but the resultant files are very large. You can tell the file isn't being written until it's closed (from watching its size in Windows Explorer), which suggests the file is being buffered. However, it's about 6 times bigger than an ASCII file of the same data, with an index file almost as big again.
    Does anyone have any idea why it would appear to buffer but not actually reduce the data size? I'm running Windows XP, LabVIEW 10.0.
    When I create each new file I run this to set the buffers. (I have also hard-wired a buffer size of 100000 on a one-minute file, to no avail.)
    Thanks
    Niall

    I had this marked as the solution, but it turns out it only helped one of my programs. I am still getting very large (fragmented) TDMS and index files from another program. The problem is that if I just defragment the (supposedly buffered) file, it interrupts the data logging because it takes so long. I'm 99% sure that it *is* buffering, as it doesn't write until it closes the file, and if I use the read-properties function it reads back a set buffer size. Here is the VI that actually writes the data. The bit at the top is for writing an optional ASCII file, so you can ignore that.
    It's maybe a bit hard to see what's going on in the next one, but this is where the file is created before being passed to the set-buffer VI which I posted earlier. It also closes the last file.
    It's really hacking me off now and holding me up from moving on to other stuff. It would be great if someone had some ideas.
    Thanks
    Niall

  • Log and Transfer: my transferred files become very large...

    When I use Log and Transfer, my files become very large. For instance, the card that I'm transferring from only holds 32 GB, and after I transfer the footage to an external drive it has become around 70 GB. I checked the codec on the transferred files and it's showing Apple ProRes 422, 1920 x 1080. Is there an import setting I'm missing? This is all new to me...

    This is normal.
    Your camera likely shoots AVCHD, a highly compressed version of MPEG 4.
    It is so compressed because it uses a "Long GOP" format, where only every 15th frame is an independent frame (an I-frame).
    The frames in between only contain the differences from the prior I-frame and cannot exist without it.
    (It's actually somewhat more complicated, but now you know what it is called, you can do more research)
    You need the ability to edit on any frame, however. The footage is transcoded to make every frame independent so you can do just that. It takes more space.
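    To put rough numbers on it (these are approximate published target bitrates, so treat this as a ballpark only): AVCHD tops out around 24 Mbit/s, while ProRes 422 at 1920 x 1080, 29.97 fps runs at roughly 147 Mbit/s. File size scales with bitrate, and 147 / 24 ≈ 6, so every gigabyte of AVCHD becomes about 6 GB of ProRes. If your 32 GB card held around 12 GB of actual footage, 12 GB x 6 ≈ 72 GB, which is right around the 70 GB you are seeing.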

  • How can we suggest a new DBA OCE certification for very large databases?

    What web site or phone number can we use to suggest creating a VLDB OCE certification?
    The largest databases that I have ever worked with were barely over 1 trillion bytes.
    Some people have told me that the work of a DBA totally changes when you have a VERY LARGE DATABASE.
    I could guess that some of the following configuration topics might be on it:
    * Partitioning
    * parallel
    * bigger block size - DSS vs OLTP
    * etc
    Where could I send in a recommendation?
    Thanks Roger

    I wish there were more details about the Data Warehousing OCE.
    Look at the topics for 1Z0-515. Assume that the 'lightweight' topics will go (like Best Practices) and that there will be more technical topics added.
    Oracle Database 11g Data Warehousing Essentials | Oracle Certification Exam
    Overview of Data Warehousing
      Describe the benefits of a data warehouse
      Describe the technical characteristics of a data warehouse
      Describe the Oracle Database structures used primarily by a data warehouse
      Explain the use of materialized views
      Implement Database Resource Manager to control resource usage
      Identify and explain the benefits provided by standard Oracle Database 11g enhancements for a data warehouse
    Parallelism
      Explain how the Oracle optimizer determines the degree of parallelism
      Configure parallelism
      Explain how parallelism and partitioning work together
    Partitioning
      Describe types of partitioning
      Describe the benefits of partitioning
      Implement partition-wise joins
    Result Cache
      Describe how the SQL Result Cache operates
      Identify the scenarios which benefit the most from Result Set Caching
    OLAP
      Explain how Oracle OLAP delivers high performance
      Describe how applications can access data stored in Oracle OLAP cubes
    Advanced Compression
      Explain the benefits provided by Advanced Compression
      Explain how Advanced Compression operates
      Describe how Advanced Compression interacts with other Oracle options and utilities
    Data integration
      Explain Oracle's overall approach to data integration
      Describe the benefits provided by ODI
      Differentiate the components of ODI
      Create integration data flows with ODI
      Ensure data quality with OWB
      Explain the concept and use of real-time data integration
      Describe the architecture of Oracle's data integration solutions
    Data mining and analysis
      Describe the components of Oracle's Data Mining option
      Describe the analytical functions provided by Oracle Data Mining
      Identify use cases that can benefit from Oracle Data Mining
      Identify which Oracle products use Oracle Data Mining
    Sizing
      Properly size all resources to be used in a data warehouse configuration
    Exadata
      Describe the architecture of the Sun Oracle Database Machine
      Describe configuration options for an Exadata Storage Server
      Explain the advantages provided by the Exadata Storage Server
    Best practices for performance
      Employ best practices to load incremental data into a data warehouse
      Employ best practices for using Oracle features to implement high performance data warehouses

  • How to use .mov files for video in Flash...

    Hello-
    I am trying to use .mov files for my Flash videos. I know Flash uses .mp4/.flv/.f4v, but I really need to use .mov files. I know this works... but how?
    Suggestions?

    Unfortunately, running it through Adobe Media Encoder makes it a much, much larger file, and the quality goes out the window. I really need to use .mov.
    Any suggestions?

  • Saving a SWF file for Flash Player versions older than 10.3 in Flash CC

    Hi,
    I am working with Flash CC and need to save my SWF files for Flash Player versions older than 10.3.
    Is there any way to do that?
    Besides working with CS6 again ;-)
    Thanks for every help!

    In the meantime I found a very good answer myself.
    For anyone who might need it, too:
    Re: Can Flash Pro CC publish to flash player 9??

  • Loading swf files for anonymous user login

    Hi,
    We have a SWF file on our home page that is retrieved from an Oracle database. I am able to view the SWF file as a normal user, but with an anonymous user login it does not appear on the page. Other image formats like JPG/GIF load properly for anonymous users. Please advise where the issue might be.
    kumar. G

    Hi Ram
    Thanks for your response. Let me know how to connect to the Oracle database with an anonymous user login; based on that, I will work on that approach to fix this issue.
    By
    kumar. g

  • Option to always use sidecar files for XMP

    Lightroom gives you the option to write metadata to XMP files, which is convenient if you want to share data with other programs (ACR, various DAM programs). In the beta versions, this data would always be stored in .XMP sidecar files, presumably for safety reasons. However, V1 writes the XMP data into the image file itself for those formats that support it (DNG, JPG, TIFF).
    I would really like to have the option to *force* Lightroom to use sidecar files for XMP data, like the betas did. The current V1 behavior (embedding data) is quite inefficient for fairly common backup scenarios.
    For example, I tend to back up all my raw files onto DVD after importing and culling. In addition I make a daily backup of all metadata that gets sent over the internet to a web server - a relatively small upload. What happens if I use DNG for my raw files and update something as small as a keyword? Indeed, *all* of the corresponding DNG files are modified, and because my backup software cannot know that only the XMP data has changed, all files must be burned to DVD again. If the metadata were stored in sidecar files, a simple, efficient metadata backup would take care of things.
    To implement this, a simple switch in the preferences would suffice, and the necessary code is already there. The only possible problem is the occurrence of two files with identical names, excluding the extension. For this case, I can see several possible 'solutions':
    1) Force XMP metadata to be synchronized between files with equal names
    2) Make an .XMP sidecar file for only one of the files (always the raw file, for example)
    3) Update the XMP specification to specify what should be done in this situation (maybe include the file extension in the XMP data)
    Regards,
    Simon

    Actually, I don't (notice the 'if' :)). And this is only one of the reasons. I just picked DNG as an example, to keep things simple and because it seems Adobe wants DNG to work well for photographers.
    However, the same applies to JPG and TIFF images. I often end up changing details in the metadata after the image has been finished; for example, to correct the spelling of a keyword that has been applied to half the images in my database... Talk about a nightmare scenario!
    Simon

  • Is it possible to substitute a .swf file for a .png in the pwdconfirm.css?

    Presently my code in the pwdconfirm.css looks like this:
    display: block;
    color: #000;
    background: #FFF url(../images/vfw_message_logo.png) no-repeat left bottom;
    padding: 20 20 20 40;
    width: 200px;
    height: 40px;
    border-bottom: 1px solid #000;
    border-right: 1px solid #000;
    margin-top: 20px;
    text-align: center;
    I want to substitute a .swf file for the .png file above.  I've tried it and it doesn't show up at all when I preview it in the browser.  Is it possible?  The link to the webpage is:  http://www.vfwofwa.org/links.
    Thanks,

    Gramps,
    Thanks for the information.  I'm new to Spry.  I've been taking classes on
    Lynda.com.  Just trying to tweak the site like James Williamson does.
    Later,

  • Steps to load the data by using flat file for hierarchies in BI 7.0

    Hi Gurus,
    What are the steps to load data using a flat file for hierarchies in BI 7.0?

    Hi,
    You will get the steps in the following blog by Prakash Bagali:
    Hierarchy Upload from Flat files
    Regards,
    Rathy

  • I can't open SWF files for schoolwork. I installed VLC, Flip4Mac, and the Adobe Flash and Shockwave players. I get a white screen with a connection error message from Adobe Flash Player. Help me!


    Apple dropped playback support for .swf formats in QuickTime almost ten years ago.
    Adobe Flash Player is the only way to view them.

  • Grid Control Architecture for Very Large Sites: New Article published

    A new article on Grid Control was published recently:
    Grid Control Architecture for Very Large Sites
    http://www.oracle.com/technology/pub/articles/havewala-gridcontrol.html

    Oliver,
    Thanks for the comments. The article is based on practical experience. If one were to recommend a pool of 2 management servers for a large corporation with 1000 servers, that would mean that if 1 server were brought down for any maintenance reason (e.g. applying an EM patch), all the EM workload would fall on the remaining management server. So it is better to have 3 management servers instead of 2 when the EM system is servicing so many targets. Otherwise the DBAs would be a tad angry, since the single remaining management server would not be able to service them properly during the maintenance work on the first one.
    The article ends with these words: "You can easily manage hundreds or even *thousands* of targets with such an architecture. The large corporate which had deployed this project scaled easily up to managing 600 to 700 targets with a pool of just three management servers, and the future plan is to manage *2,000 or more* targets which is quite achievable." The 2,000 or more is based on the same architecture of three management servers.
    So as per the best practice document, 2 management servers would be fine for 1000 servers, although I would still advise 3 servers in practice.
    For your case of 200 servers, it depends on the level of monitoring you are planning to do and the type of database management activities the DBAs will perform. For example, if the DBAs are planning on creating standby databases now and then through Grid Control, running daily backups via Grid Control, cloning databases in Grid Control, patching databases in Grid Control, and so on, I would definitely advise a pool of 2 servers in your case. 2 is always better than 1.
    Regards,
    Porus.
    Edited by: Porushh on Feb 21, 2009 12:51 AM
