Aperture is exporting large files, e.g. the original image is 18.2 MB and the exported version (TIFF, 16-bit) is 47.9 MB. Any ideas, please?

Raws, even if not compressed, should be smaller than a 24-bit TIFF, since they store only one colour sample per pixel. My T3i shoots 14-bit 18 MP raws with a raw file size of 24.5 MB.* An uncompressed 24-bit TIFF would be 18 MP x 3 bytes per pixel, or 54 MB, and a 16-bit-per-channel TIFF twice that again.
*There must be some lossless compression going on, since 18 MP x 1.75 bytes per pixel is 31.5 MB for uncompressed raw.
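These back-of-the-envelope sizes are easy to check with plain arithmetic. A minimal sketch in Python (raw pixel payload only, ignoring file headers, metadata and any compression):

def uncompressed_mb(megapixels, bits_per_pixel):
    # pixels x bits per pixel, converted to megabytes
    return megapixels * 1e6 * bits_per_pixel / 8 / 1e6

print(uncompressed_mb(18, 14))  # 31.5 -- 14-bit raw, one sample per pixel (actual file 24.5 MB, so losslessly compressed)
print(uncompressed_mb(18, 24))  # 54.0 -- 8 bits x 3 channels, a 24-bit TIFF
print(uncompressed_mb(18, 48))  # 108.0 -- 16 bits x 3 channels, a 16-bit TIFF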

Similar Messages

  • I am developing a book via email with a collaborator. iBooks files are sent to me via a link to preview on iPad. This worked fine for a while, but now I am getting download errors and the unhappy Mac face saying the download failed. Any ideas?

    I am developing a book via email with a collaborator. iBooks files are sent to me via a link to preview on iPad. This worked fine for a while, but now I am getting download errors and the unhappy Mac face saying the download failed. Any help appreciated. Very frustrating!

        I can see that this issue has been quite extensive and frustrating, and I am so sorry for all that has happened, societygirl! I would like to help you work this issue out. Please follow and send me a Direct Message, so I can get your account specifics and help finally bring this to a resolution.
    Thank you,
    MichelleH_VZW
    Follow us on Twitter @VZWSupport

  • HT4914 I am adding files to Match and the program keeps crashing. No firewall conflict. Any idea? Windows 7

    I am trying to add music files to iTunes Match, but it keeps crashing iTunes frequently. Any suggestions?

    Many thanks.
    With those symptoms, I'd try the following document:
    Apple software on Windows: May see performance issues and blank iTunes Store
    (If there's a SpeedBit LSP showing up in Autoruns, it's usually best to just uninstall your SpeedBit Video Accelerator.)

  • Problem exporting '.txt' file size 23 KB and '.zip' file size 4 MB

    I am using an Apex 3.0 screen to upload a '.txt' file and a '.zip' file containing images.
    I can successfully export the '.txt' file and the '.zip' file containing images from database table 'TBL_upload_file' to the OS directory on the server, as long as the '.txt' file is < 23 KB and the '.zip' file is < 4 MB.
    Processing larger files (35 KB and 6 MB) produces the following error:
    'ORA-21560: argument 2 is null, invalid or out of range'
    Here is the code I am using to export documents from database table 'TBL_upload_file' to the OS directory on the server:
    create or replace procedure "PROC_LOAD_FILES_TO_FLDR_BYTES"
      (pchr_text_file IN VARCHAR2,
       pchr_zip_file  IN VARCHAR2)
    is
      lzipfile    varchar(100);
      lzipname    varchar(100);
      sseq        varchar(1000);
      ldocname    varchar(100);
      lfile       varchar(100);
      -- loaddoc (p_file in number) as
      l_file      UTL_FILE.FILE_TYPE;
      l_buffer    RAW(32000);
      l_amount    NUMBER := 32000;
      l_pos       NUMBER := 1;
      l_blob      BLOB;
      l_blob_len  NUMBER;
      l_file_name varchar(200);
      l_doc_name  varchar(200);
      a_file_name varchar(200);
      end_pos     NUMBER;
    begin
      -- Get the LOB locator for the text file
      SELECT blob_content, doc_name
        INTO l_blob, l_file_name
        FROM tbl_upload_file
       WHERE doc_name = pchr_text_file;
      -- get the length of the blob
      l_blob_len := DBMS_LOB.getlength(l_blob);
      -- save the blob length to track the bytes still to be written
      end_pos := l_blob_len;
      -- Open the destination file.
      -- l_file := UTL_FILE.fopen('BLOBS','MyImage.gif','w', 32767);
      l_file := UTL_FILE.fopen('BLOBS', l_file_name, 'WB', 32760); -- write-byte mode, supported in 10g
      IF l_blob_len < 32760 THEN
        -- small enough for a single write
        UTL_FILE.put_raw(l_file, l_blob);
        UTL_FILE.fflush(l_file);
      ELSE
        -- Read chunks of the BLOB and write them to the file until complete.
        WHILE l_pos < l_blob_len LOOP
          DBMS_LOB.read(l_blob, l_amount, l_pos, l_buffer);
          UTL_FILE.put_raw(l_file, l_buffer);
          UTL_FILE.fflush(l_file); -- flush pending data to the file
          -- advance the start position for the next chunk
          l_pos := l_pos + l_amount;
          -- once fewer than 32000 bytes remain, read only what is left
          end_pos := end_pos - l_amount;
          IF end_pos < 32000 THEN
            l_amount := end_pos;
          END IF;
        END LOOP;
      END IF;
      -- Close the text file before reusing the handle for the zip file.
      IF UTL_FILE.is_open(l_file) THEN
        UTL_FILE.fclose(l_file);
      END IF;
      --- zip file
      -- Get the LOB locator for the zip file
      SELECT blob_content, doc_name
        INTO l_blob, l_doc_name
        FROM tbl_upload_file
       WHERE doc_name = pchr_zip_file;
      l_blob_len := DBMS_LOB.getlength(l_blob);
      -- save the blob length to track the bytes still to be written
      end_pos := l_blob_len;
      -- Open the destination file.
      l_file := UTL_FILE.fopen('BLOBS', l_doc_name, 'WB', 32760); -- write-byte mode, supported in 10g
      IF l_blob_len < 32760 THEN
        UTL_FILE.put_raw(l_file, l_blob);
        UTL_FILE.fflush(l_file); -- flush out pending data to the file
      ELSE
        -- Reset BOTH chunk counters before the second loop. Resetting only
        -- l_pos leaves l_amount at 0 whenever the text file was written in
        -- chunks, and DBMS_LOB.read with a zero amount raises
        -- 'ORA-21560: argument 2 is null, invalid or out of range'.
        l_pos := 1;
        l_amount := 32000;
        WHILE l_pos < l_blob_len LOOP
          DBMS_LOB.read(l_blob, l_amount, l_pos, l_buffer);
          UTL_FILE.put_raw(l_file, l_buffer);
          UTL_FILE.fflush(l_file); -- flush pending data to the file
          l_pos := l_pos + l_amount;
          -- once fewer than 32000 bytes remain, read only what is left
          end_pos := end_pos - l_amount;
          IF end_pos < 32000 THEN
            l_amount := end_pos;
          END IF;
        END LOOP;
      END IF;
      -- Close the file.
      IF UTL_FILE.is_open(l_file) THEN
        UTL_FILE.fclose(l_file);
      END IF;
    exception
      WHEN NO_DATA_FOUND THEN
        RAISE_APPLICATION_ERROR(-20214, 'Screen fields cannot be blank, Proc_Load_Files_To_Fldr_BYTES.');
      WHEN TOO_MANY_ROWS THEN
        RAISE_APPLICATION_ERROR(-20215, 'More than one record exists in the tbl_upload_file table, Proc_Load_Files_To_Fldr_BYTES.');
      WHEN OTHERS THEN
        -- Close the file if something goes wrong.
        IF UTL_FILE.is_open(l_file) THEN
          UTL_FILE.fclose(l_file);
        END IF;
        RAISE_APPLICATION_ERROR(-20216, 'Some other error occurred, Proc_Load_Files_To_Fldr_BYTES.');
    end;
    I am new to Oracle.
    Any help to modify this script and resolve this problem will be greatly appreciated.
    Thank you.

    Ask this question in the Apex forums. See Oracle Application Express (APEX)
    Regards Nigel
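    A likely cause of the ORA-21560, flagged in the comments in the listing above: the zip-file section resets l_pos but, as originally posted, reused l_amount from the text-file loop, which ends that loop at 0; DBMS_LOB.read raises exactly this error when its amount argument is 0. Resetting l_amount := 32000 alongside l_pos := 1 before the second loop avoids it.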

  • Exported JPEG file sizes

    A number of my images exported to JPEG have a larger file size than the original raw file. My export is fairly basic: quality 100, colour space sRGB, no resizing, no watermark. Anyone any idea why? I thought JPEGs were supposed to be compressed.

    Possibly because you are not setting an output resolution, so Lightroom is making a best interpolation.
    Try setting the dimensions you want, or the dimension for the long edge, and use a resolution of 300 ppi for printing or 72 ppi for uploading to the web.
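    JPEG at quality 100 keeps almost everything, so files can genuinely come out larger than a compressed raw. A minimal sketch of how the quality setting drives file size, in Python with Pillow (the input file name is just a placeholder):

    from PIL import Image  # pip install Pillow
    import io

    img = Image.open("exported.jpg").convert("RGB")  # hypothetical exported image
    for quality in (100, 90, 80, 60):
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=quality)
        print(f"quality {quality}: {buf.tell() / 1e6:.1f} MB")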

  • Export constrain file size

    The lab my client uses doesn't like to handle JPEGs larger than about 2 MB. OK, that's their rule and I have to make it work.
    In Photoshop, when I converted to JPEG I could see how large a file it was going to produce. In Lightroom I don't know how to do that. I can choose a quality, which isn't measured in anything, and I can enter a dpi, which shouldn't make any difference if the pixel count is unchanged. There's no way to sort by size. So I guess and export them all, then check the sizes and go back again and re-export the files that are too large. Very clumsy.
    Can anyone advise me how to get these Canon 5D files to convert to JPEGs at around 2 MB?
    -wick

    Through trial and error (no help from Lightroom), I decided to constrain the image to within a 2K by 2K pixel rectangle. This gives me roughly the right file size. Thanks to all for your suggestions. In the end, of course, I either have to downsize the image or raise the compression to around 60 (as Ian suggested). Assuming a print size no larger than 8 x 10, do you think the better solution is to downsize or to compress harder?
    -wick
    PS: Has anyone else come across the bug mentioned in my second post, where exporting the same file over and over leads to the wrong file getting exported? Maybe Lightroom just got bored?
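    For anyone who wants to automate that guess-and-export loop, a small script can binary-search the JPEG quality for a target size. A rough sketch in Python with Pillow (the 2 MB target is from the thread; the file names are placeholders):

    from PIL import Image  # pip install Pillow
    import io

    def encode_under(img, target_bytes, lo=40, hi=95):
        # binary-search the highest JPEG quality whose output fits in target_bytes
        best = None
        while lo <= hi:
            q = (lo + hi) // 2
            buf = io.BytesIO()
            img.save(buf, format="JPEG", quality=q)
            if buf.tell() <= target_bytes:
                best, lo = buf.getvalue(), q + 1  # fits: try a higher quality
            else:
                hi = q - 1                        # too big: drop the quality
        return best

    img = Image.open("IMG_0001.tif").convert("RGB")  # hypothetical 5D export
    data = encode_under(img, 2 * 1024 * 1024)        # the lab's ~2 MB ceiling
    if data is not None:
        with open("IMG_0001_lab.jpg", "wb") as f:
            f.write(data)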

  • Creating a PDF from a .ppt or .doc in Acrobat 9 results in strangely large file sizes

    I am printing to PDF from Microsoft Word and PowerPoint (previously Office 07, now Office 10) and the file sizes are sometimes doubling. Also, for some larger documents, multiple PDFs are generated, as if Acrobat is choosing where to cut off a document due to large file size and generating secondary, tertiary, etc., docs to continue the PDF. The documents have placed images (Excel charts, etc.) but they are not enormous - that is, until they are generated into a PDF. I had hoped that when I upgraded to Office 10 this issue would go away, but unfortunately it hasn't. Any ideas? Thoughts? Thanks so much!

    One thing that can happen with MS Office applications when an image contains transparency is that the "image" is actually composed of multiple one-pixel images when converted to PDF. This can be confirmed by examining the file: try selecting the images using the TouchUp Object tool to see if what looks like a single image is entirely selected.

  • ACR PSD files - larger file size and max compatible

    When using ACR > Save As to save a bunch of PSD files from raw files, the resulting PSD file sizes are larger.
    For example:
    If I output and save from ACR directly to a folder as a PSD, I'm seeing a file size of 63.1 MB.
    If I open the raw file in PS and save it within Photoshop to a PSD file, the file size is 56.1 MB (this is true whether Maximum Compatibility is turned on or off... probably because it's a flat PSD file).
    Why the difference? Same output settings from ACR, bit depth, etc.
    Also,
    The file saved out of ACR appears to be "tagged" as a Maximum Compatibility file even though it is flat, so subsequent saves are treated that way even if the PS pref for Maximize Compatibility is set to Never. Whereas the same file that was opened into PS from ACR and then saved with the Never pref behaves correctly.
    Is there a preference setting within ACR to not save the PSDs as Maximum Compatibility?
    thanks
    j
    ACR 6.4
    PS 12.0.4
    10.6.7

    Hi Noel
    I may not have been clear in my post. It's not about PSD vs. raw file size; I'm just comparing PSD files.
    Starting with a raw file (in this case a 5D Mk II CR2), I use the same settings (mid-bottom of the ACR window) whether saving with "Save Image" within ACR or opening the file into PS and then saving. This is where I'm seeing the file size difference.
    The second part of the problem is that PSD files saved directly out of ACR using "Save Image" (bottom left of the ACR window) are all set to Maximum Compatibility. As a result, opening these files will always incur the extra data and saving time that files with Maximum Compatibility enabled have. The only way to get around this is to have your PS prefs set to Never or Ask and then "Save As" to overwrite the file.
    I just reprocessed a folder of PSDs that were saved out of ACR by overwriting as above, and the 36 files (with layers, retouching, etc.) went from 6.9 GB to 5.6 GB, and the save time is also faster.
    This isn't intended to be a discussion about the benefits or disadvantages of the Maximum Compatibility "feature", just a question as to whether it can be turned off in PSDs saved from ACR.
    thanks
    j

  • Video Stalls - Larger File Size

    Just downloaded Battlestar Galactica season 2 and the video stalls at full screen. I do not have this problem with previous Battlestar Galactica shows and movies that I have downloaded in the past.
    I notice the file size and total bit rate of the new shows are three times greater than the shows I've downloaded in the past.
    Using the utility Activity Monitor, I notice I'm maxing out my CPU when I go to full screen, which I think is causing the video to stall.
    I also notice that my previous TV show downloads no longer fully fill my screen like they did before the new downloads.
    I hope someone might know how I can get my TV shows back to working normally at full screen, and why the new TV shows' file sizes are larger than before.
    Thanks for Your Help
    Ricky
    PowerBook G4 15" LCD 1.67GHz   Mac OS X (10.3.9)  

    Thanks for the reply. Shortly after I posted, I realized that Apple was offering better TV show quality, which results in larger file sizes. I also went to the Energy Saver section in System Preferences and found my Optimize Energy Settings was set for Presentations. I changed it to Highest Performance, which resolved the issue.
    iTunes 7.0.2
    Quicktime 7.1.3
    RAM 512MB
    Thanks for your help.

  • Facebook application grows to a large file size

    Why does the Facebook application grow to a larger and larger file size? What can I do? The longer I use it, the more space it takes up. I am using Facebook version 4.0.3 on an iPhone 4 running iOS 5.0.1.
    thank you.
    Eak.

    Most of the log rotation for the different components can be set in the EM console, but if you just want to do it with a cron job or script, here is what I have used in the past. Edit it to your needs.
    #!/bin/sh
    ORACLE_HOME=/home/oracle10FRS
    apache_logs=$ORACLE_HOME/Apache/Apache/logs
    webcache_logs=$ORACLE_HOME/webcache/logs
    sysman_logs=$ORACLE_HOME/sysman/log
    j2ee_logs=$ORACLE_HOME/j2ee/OC4J_BI_Forms/log/OC4J_BI_Forms_default_island_1
    also_j2ee=$ORACLE_HOME/j2ee/home/log/home_default_island_1
    max_size=1500000000
    for file in `ls $apache_logs/*_log`
    do
         size=`ls -l $file | awk '{print $5}'`
         echo "File $file is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$file
         fi
    done
    for wfile in `ls $webcache_logs/*_log`
    do
         size=`ls -l $wfile | awk '{print $5}'`
         echo "File $wfile is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$wfile
         fi
    done
    for sfile in `ls $sysman_logs/*log`
    do
         size=`ls -l $sfile | awk '{print $5}'`
         echo "File $sfile is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$sfile
         fi
    done
    for jfile in `ls $j2ee_logs/*.log`
    do
         size=`ls -l $jfile | awk '{print $5}'`
         echo "File $jfile is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$jfile
         fi
    done
    for j2file in `ls $also_j2ee/*.log`
    do
         size=`ls -l $j2file | awk '{print $5}'`
         echo "File $j2file is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$j2file
         fi
    done
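    If you do run it from cron, a nightly entry along the lines of "0 2 * * * /path/to/truncate_logs.sh" is enough (the path and schedule are just placeholders, not from the original script).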

  • Need to get large file sizes down to downloadable sizes - how?

    I am using Compressor 4 and have several professional videos I produced with Final Cut Studio in QuickTime/.mov format using ProRes 422. They are all 720p and, of course, all have large file sizes. I need to convert them so they are compatible with iTunes as well as Apple devices (Apple TV, iPad/iPod/iPhone, etc.). They also need to be downloadable in a reasonable time, and of course I want to maintain a quality image. Several of the videos are about an hour long, some a little more.
    I am unable to get most of these files small enough for practical download while still maintaining acceptable quality. For example:
    I have one video (QuickTime, rendered from FCS) that runs 54 minutes: 1280x720, ProRes 422, Linear PCM, with a bit rate of 72.662. The best I seem to be able to get out of Compressor is the setting options for "HD for Apple Devices (5 Mbps)", which gives me a file that's just over 2 GB. Not bad, and the quality is still very good, but the file size is just too big for people to download. I get reports of people getting estimates of 77 hours to download this one file.
    I did try the setting under Video Sharing Services / Large 540p Video Sharing just to see what it would give me, but it really wasn't that much smaller and the quality was noticeably degraded.
    I'm confused as to what to do about this. I have downloaded things from iTunes before that are good quality and about an hour long, and their file sizes are simply not this big.
    Can anyone give me some advice? Thank you. Grateful for anything you can offer.
    MC

    If you're not concerned with a precise average bitrate, use the H.264 codec, set the bitrate to "automatic" and use the quality slider. Be sure to have "keyframes" on "automatic"; every 24 frames or so makes H.264 quite inefficient and is only useful for live streaming, where you want people to be able to resynchronise as quickly as possible. Having multi-pass enabled might help, but I'm not sure what it does when using constant-quality encoding.
    Do a test encoding of a representative clip of maybe about two minutes or so, play with the quality slider and see how far down you can go and still be satisfied with the quality. Using "automatic" bitrate uses as many bits to encode each frame as is necessary to get the required quality, and file sizes will vary depending on the type of material. Complex textures and high motion will use more bits; still frames and slow pans will use very little.
    A light chroma-channel denoising may help improve perceived quality, as may very light sharpening in case you scaled down the material.
    If you're still not happy with the bitrate vs. quality tradeoff, try reducing the resolution, but generally you should get 720p down to around 2.5 to 3.0 Mbps with adequate quality.
    On the other hand, if someone needs 77 hours to download 2 GB, they have another problem anyway. That sounds like a 56k modem connection. What are they doing trying to download HD content anyway?
    Bernd
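    As a sanity check on those numbers: file size is simply average bitrate times duration. A quick back-of-the-envelope in Python (the 54-minute duration and the bitrates come from the posts above):

    def size_gb(mbps, minutes):
        # average bitrate (megabits/s) x duration -> size in gigabytes
        return mbps * 1e6 / 8 * minutes * 60 / 1e9

    print(size_gb(5.0, 54))  # ~2.0 GB, matching the "HD for Apple Devices (5 Mbps)" export
    print(size_gb(2.5, 54))  # ~1.0 GB at the suggested 2.5 Mbps
    # and 2 GB over a 56k modem: 2e9 bytes x 8 bits / 56000 bps ~ 79 hours,
    # right where the 77-hour download reports land
    print(2e9 * 8 / 56000 / 3600)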

  • Saving JPEG and PNG files: large file sizes

    I've recently purchased Web Premium 5.5 and I'm trying to save files in JPEG and PNG format in Photoshop, and I'm getting unexpectedly large file sizes. For example, I save a simple 246x48 logo image which consists of only 3 basic colours; when I save it as a JPEG at quality 11, the Finder reports the file size as 61.6k, and when I save it as a PNG file it's 60.2k. I also cropped the image to 190x48 and saved it as a PNG, and the file size is actually larger at 61.9k; saving as a JPEG at quality 7, the file size is a still relatively large 54k.
    I have a similar non-indexed-colour PNG logo on my Mac that I saved in CS3 on a PC; it is actually larger at 260x148 and yet is only 17k, and that logo is actually more complex, with a gloss effect over part of it. What's going on and how do I fix it? It's making Photoshop useless, especially when saving for the web.
    Thanks

    Thanks, I had considered that, but all my old files are reporting the correct file sizes. I have been experimenting, and Fireworks saves the file as PNG-24 at 2.6k and as JPEG at 5.1k, but I don't really want to have to save the files twice, once cut from the comp in Photoshop and again in Fireworks; juggling between the two applications is a bit inconvenient even with just one file, but especially when you have potentially hundreds of files to save.
    I've also turned off the icon and Windows thumbnail in Photoshop preferences, and although this has decreased the file size, the files are still quite large at 27k, and Save for Web is better at 4k for the PNG and 16k for the JPEG. Is there any way to get Fireworks' file-saving performance in Photoshop? It seems strange that the compression in Photoshop would be second-rate in comparison to Fireworks, given they are both developed by Adobe and Photoshop is Adobe's primary image-editing software.
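    The sizes quoted fit the usual explanation: Save for Web (and Fireworks) strip metadata and, for flat logos, use an indexed palette, while a plain Save As writes truecolour plus previews. A rough illustration in Python with Pillow (the file name is a placeholder, not from the thread):

    from PIL import Image  # pip install Pillow
    import io

    img = Image.open("logo.png").convert("RGB")  # hypothetical 246x48, 3-colour logo

    truecolour = io.BytesIO()
    img.save(truecolour, format="PNG")  # PNG-24, truecolour

    indexed = io.BytesIO()
    img.quantize(colors=8).save(indexed, format="PNG", optimize=True)  # palette PNG, ample for 3 colours

    print(f"truecolour PNG: {truecolour.tell()} bytes")
    print(f"indexed PNG:    {indexed.tell()} bytes")  # typically several times smaller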

  • Does FW CS5 Mac crap out with large file sizes?

    Does FW CS5 Mac crap out with large file sizes like CS4 did? It seems like files over 4-5 MB tend to be flaky... Is there an optimum file size? I'm running a Mac tower, 2.93 GHz quad-core, with 8 GB RAM.

    Why not download the trial version and find out?

  • XSLT with large file size

    Hello all,
    I am very new to XSL. I have been reading a lot about it. I have a small transformation program which takes an XML file and an XSL file and performs the transformation. Everything works fine when the file size is in terms of KB. When I tried the same program with a file size of around 118 MB, it gave me an out-of-memory exception. I would appreciate any comments to make my program work for bigger file sizes. I am posting my Java code and XSL file.
    import java.io.FileInputStream;
    import java.io.FileNotFoundException;
    import java.io.FileOutputStream;
    import javax.xml.transform.Result;
    import javax.xml.transform.Source;
    import javax.xml.transform.SourceLocator;
    import javax.xml.transform.Templates;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerConfigurationException;
    import javax.xml.transform.TransformerException;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public static void xsl(String inFilename, String outFilename, String xslFilename) {
        try {
            // Create the transformer factory
            TransformerFactory factory = TransformerFactory.newInstance();
            // Use the factory to create a template containing the xsl file
            Templates template = factory.newTemplates(new StreamSource(
                    new FileInputStream(xslFilename)));
            // Use the template to create a transformer
            Transformer xformer = template.newTransformer();
            // Prepare the input and output files
            Source source = new StreamSource(new FileInputStream(inFilename));
            Result result = new StreamResult(new FileOutputStream(outFilename));
            // Apply the xsl file to the source file and write the result to the output file
            xformer.transform(source, result);
            System.out.println("No Exception");
        } catch (FileNotFoundException e) {
            System.out.println("Exception " + e);
        } catch (TransformerConfigurationException e) {
            // An error occurred in the XSL file
            System.out.println("Exception " + e);
        } catch (TransformerException e) {
            // An error occurred while applying the XSL file;
            // report where in the input it happened
            System.out.println("Exception " + e);
            SourceLocator locator = e.getLocator();
            if (locator != null) {
                System.out.println("locator " + locator);
                System.out.println("line : " + locator.getLineNumber());
                System.out.println("col : " + locator.getColumnNumber());
            }
        }
    }
    xsl file:
    <xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
      <xsl:output method="xml" indent="yes"/>
      <xsl:template match="/">
        <xsl:element name="Hosts">
          <xsl:apply-templates select="//maps/map/hosts"/>
        </xsl:element>
      </xsl:template>
      <xsl:template match="//maps/map/hosts">
        <xsl:for-each select="*">
          <xsl:element name="host">
            <xsl:element name="ip"><xsl:value-of select="./@ip"/></xsl:element>
            <xsl:element name="known"><xsl:value-of select="./known/@v"/></xsl:element>
            <xsl:element name="targeted"><xsl:value-of select="./targeted/@v"/></xsl:element>
            <xsl:element name="asn"><xsl:value-of select="./asn/@v"/></xsl:element>
            <xsl:element name="reverse_dns"><xsl:value-of select="./reverse_dns/@v"/></xsl:element>
          </xsl:element>
        </xsl:for-each>
      </xsl:template>
    </xsl:stylesheet>
    Thanks,
    Namrata

    One thing you could try is to avoid XPaths like ".//" and "*".
    I have had many problems in terms of memory consumption and performance with XPaths like those above.
    Although it is a little more work to code your XSLT this way, it performs better, and you will probably write it once and run it many times.
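    If the transformation really only flattens out the host records, another option (a suggestion of mine, not something asked in the thread) is to skip XSLT for the big file and stream it instead, so the whole 118 MB document never sits in memory at once. A sketch using Python's standard-library ElementTree, abridged to two of the five fields and based on the element names in the stylesheet above (note it does no XML escaping):

    import xml.etree.ElementTree as ET

    def extract_hosts(in_path, out_path):
        # stream the input and write a flat <Hosts> document without building the full tree
        with open(out_path, "w", encoding="utf-8") as out:
            out.write("<Hosts>\n")
            for event, elem in ET.iterparse(in_path, events=("end",)):
                if elem.tag == "hosts":  # each child becomes one <host> in the output
                    for h in elem:
                        known = h.find("known")
                        known_v = known.get("v", "") if known is not None else ""
                        out.write(f"  <host><ip>{h.get('ip', '')}</ip>"
                                  f"<known>{known_v}</known></host>\n")
                    elem.clear()  # free the subtree once processed
            out.write("</Hosts>\n")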
