XSLT with large file size

Hello all,
I am very new to XSL and have been reading a lot about it. I have a small transformation program that takes an XML file and an XSL file and performs the transformation. Everything works fine when the file size is on the order of KB. When I tried the same program with a file of around 118 MB, it gives me an out-of-memory exception. I would appreciate any suggestions for making my program work with bigger files. I am posting my Java code and XSL file.
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import javax.xml.transform.*;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public static void xsl(String inFilename, String outFilename, String xslFilename) {
    try {
        // Create transformer factory
        TransformerFactory factory = TransformerFactory.newInstance();
        // Use the factory to create a template containing the xsl file
        Templates template = factory.newTemplates(new StreamSource(
                new FileInputStream(xslFilename)));
        // Use the template to create a transformer
        Transformer xformer = template.newTransformer();
        // Prepare the input and output files
        Source source = new StreamSource(new FileInputStream(inFilename));
        Result result = new StreamResult(new FileOutputStream(outFilename));
        // Apply the xsl file to the source file and write the result to the output file
        xformer.transform(source, result);
        System.out.println("No Exception");
    } catch (FileNotFoundException e) {
        System.out.println("Exception " + e);
    } catch (TransformerConfigurationException e) {
        // An error occurred in the XSL file
        System.out.println("Exception " + e);
    } catch (TransformerException e) {
        // An error occurred while applying the XSL file
        System.out.println("Exception " + e);
        // Get location of error in input file (the locator may be null)
        SourceLocator locator = e.getLocator();
        if (locator != null) {
            System.out.println("locator " + locator);
            System.out.println("line : " + locator.getLineNumber());
            System.out.println("col : " + locator.getColumnNumber());
        }
    }
}
XSL file:
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
  <xsl:output method="xml" indent="yes"/>
  <xsl:template match="/">
    <xsl:element name="Hosts">
      <xsl:apply-templates select="//maps/map/hosts"/>
    </xsl:element>
  </xsl:template>
  <xsl:template match="//maps/map/hosts">
    <xsl:for-each select="*">
      <xsl:element name="host">
        <xsl:element name="ip"><xsl:value-of select="@ip"/></xsl:element>
        <xsl:element name="known"><xsl:value-of select="known/@v"/></xsl:element>
        <xsl:element name="targeted"><xsl:value-of select="targeted/@v"/></xsl:element>
        <xsl:element name="asn"><xsl:value-of select="asn/@v"/></xsl:element>
        <xsl:element name="reverse_dns"><xsl:value-of select="reverse_dns/@v"/></xsl:element>
      </xsl:element>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
Thanks,
Namrata

One thing you could try is to avoid XPath expressions like "//" and "*".
I have had many problems with memory consumption and performance with XPaths like those.
Although it is a little more work to code your XSLT with explicit paths, it performs better, and you will probably write it once and run it many times.
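To actually get past the out-of-memory error, the more fundamental fix is to avoid building the whole document in memory at all: the default JAXP transformer reads the entire input into an internal tree before applying the stylesheet, which is what blows up at 118 MB. For a simple extraction like the stylesheet above, a StAX pull parser can do the same job in near-constant memory. A minimal sketch, assuming the element and attribute names from the posted XSL (children of hosts carrying an ip attribute) and simplified to the ip field only:

```java
import java.io.Reader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;

public class StreamingHostExtractor {

    // Pull-parses the input instead of building a full in-memory tree,
    // so memory use stays roughly constant regardless of input size.
    // Element/attribute names (hosts, ip) come from the stylesheet above;
    // the per-host output is simplified to the ip field only.
    public static String extractHosts(Reader in) {
        StringBuilder out = new StringBuilder("<Hosts>");
        try {
            XMLStreamReader r = XMLInputFactory.newInstance().createXMLStreamReader(in);
            boolean insideHosts = false;
            while (r.hasNext()) {
                int event = r.next();
                if (event == XMLStreamConstants.START_ELEMENT) {
                    if ("hosts".equals(r.getLocalName())) {
                        insideHosts = true;
                    } else if (insideHosts) {
                        // direct children of <hosts> carry the ip attribute;
                        // their own children (known, targeted, ...) have no ip
                        String ip = r.getAttributeValue(null, "ip");
                        if (ip != null) {
                            out.append("<host><ip>").append(ip).append("</ip></host>");
                        }
                    }
                } else if (event == XMLStreamConstants.END_ELEMENT
                        && "hosts".equals(r.getLocalName())) {
                    insideHosts = false;
                }
            }
            r.close();
        } catch (XMLStreamException e) {
            throw new RuntimeException(e);
        }
        return out.append("</Hosts>").toString();
    }
}
```

The same pattern extends to the other fields (known/@v, targeted/@v, etc.) by tracking which child element is currently open; for running a full stylesheet over inputs this large you would otherwise need a streaming-capable XSLT processor.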

Similar Messages

  • Does FW CS5 Mac crap out w large file sizes?

    Does FW CS5 for Mac crap out with large file sizes like CS4 did? Files over 4-5 MB seem to be flaky... Is there an optimum file size? I'm running a Mac tower, 2.93 GHz quad core, with 8 GB RAM.

    Why not download the trial version and find out?

  • Creating .pdf from a ppt or .doc in acrobat 9 resulting in strangely large file sizes

    I am printing to .pdf from Microsoft Word and PowerPoint (previously Office 07, now Office 10) and the file sizes sometimes double. Also, for some larger documents, multiple .pdfs are generated, as if Acrobat is choosing where to cut off a document due to large file size and generating secondary, tertiary, etc., docs to continue the .pdf. The documents have placed images (Excel charts, etc.) but they are not enormous - that is, until they are generated into a .pdf. I had hoped that when I upgraded to Office 10 this issue would go away, but unfortunately it hasn't. Any ideas? Thoughts? Thanks so much!

    One thing that can happen with MS Office applications when an image contains transparency is that the "image" is actually composed of multiple one-pixel images when converted to PDF. This can be confirmed by examining the file. Try selecting the images using the TouchUp Object tool to see if what looks like a single image is entirely selected.

  • Video Stalls - Larger File Size

    Just downloaded Battlestar Galactica season 2 and the video stalls at full screen. Do not have this problem with previous Battlestar Galactica shows and movies that I have downloaded in the past.
    I notice the File Size and Total Bit Rate of the new shows are three times greater in size than the shows I've downloaded in the past.
    Using the Activity Monitor utility, I notice I'm maxing out my CPU when I go to full screen, which I think is causing my video to stall.
    I also notice that my previous TV show downloads no longer fully fill my screen like they did before the new downloads.
    I hope someone might know how I can get my TV shows working normally at full screen again, and why the new TV shows' file sizes are larger than before.
    Thanks for Your Help
    Ricky
    PowerBook G4 15" LCD 1.67GHz   Mac OS X (10.3.9)  

    Thanks for the reply. Shortly after I posted, I realized that Apple was offering better TV show quality, which results in larger file sizes. I also went to the Energy Saver section in System Preferences and found my Optimize Energy Settings was set to Presentations. I changed it to Highest Performance, which resolved the issue.
    iTunes 7.0.2
    Quicktime 7.1.3
    RAM 512MB
    Thanks for your help.

  • Facebook application grow large file size

    Why does the Facebook application's file size grow larger and larger? What can I do? The longer I use this program, the bigger its file size gets. I use Facebook version 4.0.3 on an iPhone 4 running iOS 5.0.1.
    thank you.
    Eak.

    Log rotation for most of the different components can be set in the EM console, but if you just want to do it with a cron job or script, here is what I have used in the past. Edit it to your needs.
    #!/bin/sh
    ORACLE_HOME=/home/oracle10FRS
    apache_logs=$ORACLE_HOME/Apache/Apache/logs
    webcache_logs=$ORACLE_HOME/webcache/logs
    sysman_logs=$ORACLE_HOME/sysman/log
    j2ee_logs=$ORACLE_HOME/j2ee/OC4J_BI_Forms/log/OC4J_BI_Forms_default_island_1
    also_j2ee=$ORACLE_HOME/j2ee/home/log/home_default_island_1
    max_size=1500000000
    for file in `ls $apache_logs/*_log`
    do
         size=`ls -l $file | awk '{print $5}'`
         echo "File $file is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$file
         fi
    done
    for wfile in `ls $webcache_logs/*_log`
    do
         size=`ls -l $wfile | awk '{print $5}'`
         echo "File $wfile is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$wfile
         fi
    done
    for sfile in `ls $sysman_logs/*log`
    do
         size=`ls -l $sfile | awk '{print $5}'`
         echo "File $sfile is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$sfile
         fi
    done
    for jfile in `ls $j2ee_logs/*.log`
    do
         size=`ls -l $jfile | awk '{print $5}'`
         echo "File $jfile is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$jfile
         fi
    done
    for j2file in `ls $also_j2ee/*.log`
    do
         size=`ls -l $j2file | awk '{print $5}'`
         echo "File $j2file is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$j2file
         fi
    done

  • Premiere Pro CS4: Exporting to High Quality SWF with Low File Size

    Adobe Program: Premiere Pro CS4 (to build slide show video)
    Adobe Program: Adobe Media Encoder (to export project into flash file)
    Project Details: A 20 minute slide show with still images and audio.
    Project Source: HDV 720p, 29.97 fps
    Objective: Export slide show into a flash file so it can be watched online. The file must be small enough for most people to be able to stream the video but good quality (like videos found at Hulu.com).
    Problem: I'm struggling to find an export setting that will let me maintain both good quality and a small file size.
    The export setting that allows the best quality is:
    Format: FLV / F4V
    Preset: F4V - HD 720p
    File Size: 800MB
    Troubleshooting History: I have tried the following Presets:
    F4V - Same as Source
    F4V - 1080p Source, Half Size
    F4V - 720p Source, Half Size
    F4V - 720p Source, Half Size, VBR 2pass
    F4V - Web Large, Widescreen Source
    While the file size is much reduced with these Presets, I lose too much video quality.
    Inquiry: Does anyone have advice for meeting my listed objective?

    I understand that if you want a smaller file you must sacrifice quality and if you want better quality you must expect a larger file size. I am looking for a solution in between the two extremes (again, I draw on the example of Hulu.com). Mainly, I am looking to reduce the file mentioned above from 800MB to about half that.
    Does anyone know a way to manipulate the export settings to make that possible and still maintain as high a picture quality as possible?

  • Need to get large file sizes down to downloadable sizes-how?

    I am using Compressor 4 and have several professional videos I produced with Final Cut Studio in QuickTime/.MOV format using ProRes 422. They are all 720p and, of course, all have large file sizes. I needed to convert them so they are compatible with iTunes as well as Apple devices (Apple TV, iPad/iPod/iPhone, etc.). They also need to be downloadable in a reasonable time, and I of course want to maintain a quality image. Several of the videos are about an hour long, some a little more.
    I am unable to get most of these files small enough for practical download while still maintaining acceptable quality. For example:
    I have one video (QuickTime, rendered from FCPS) that runs 54 minutes: 1280x720, ProRes 422, Linear PCM, with a bit rate of 72.662. The best I seem to be able to get out of Compressor is the "HD for Apple Devices (5 Mbps)" setting, which gives me a file that's just over 2 GB. Not bad, and the quality is still very good, but the file size is just too big for people to download. I get reports of people getting estimates of 77 hours to download this one file.
    I did try the setting under Video Sharing Services / Large 540p Video Sharing just to see what it would give me, but it really wasn't that much smaller and the quality was noticeably degraded.
    I'm confused as to what to do about this. I have downloaded things from iTunes before that are good quality and about an hour long, and their file sizes are simply not this big.
    Can anyone give me some advice? Thank you. Grateful for anything you can offer.
    MC

    If you're not concerned with precise average bitrate, use the h.264 codec, set bitrate to "automatic" and use the quality slider. Be sure to have "keyframes" on "automatic"; every 24 frames or so makes h.264 quite inefficient and is only useful for live streaming, where you want people to be able to resynchronise as quickly as possible. Having multi-pass enabled might help, but I'm not sure what it does when using constant-quality encoding.
    Do a test encoding of a representative clip of maybe about two minutes or so, play with the quality slider and see how far down you can go and still be satisfied with the quality. Using "automatic" bitrate uses as many bits to encode each frame as is necessary to get the required quality, and file sizes will vary depending on the type of material. Complex textures and high motion will use more bits; still frames and slow pans will use very little.
    A light chroma-channel denoising may help improve perceived quality, as may very light sharpening in case you scaled down the material.
    If you're still not happy with the bitrate vs. quality tradeoff, try reducing the resolution, but generally you should get 720p down to around 2.5 to 3.0 Mbps with adequate quality.
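    The file sizes being discussed follow directly from bitrate times duration, so a target can be sanity-checked before encoding. A quick sketch using the 54-minute clip from the question (5 Mbps is the preset named above; 3 Mbps is the suggested target):

```java
public class BitrateMath {

    // size in bytes = (bitrate in bits/s ÷ 8) × duration in seconds
    static long sizeBytes(long bitrateBps, long durationSeconds) {
        return bitrateBps / 8 * durationSeconds;
    }

    public static void main(String[] args) {
        long dur = 54 * 60; // the 54-minute clip, in seconds
        // "HD for Apple Devices (5 Mbps)": about 2.0 GB, matching the post
        System.out.println(sizeBytes(5_000_000L, dur)); // 2025000000
        // the suggested 3 Mbps: about 1.2 GB
        System.out.println(sizeBytes(3_000_000L, dur)); // 1215000000
    }
}
```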
    On the other hand, if someone needs 77 hours to download 2 GB, they have another problem anyway. That sounds like a 56k modem connection. What are they doing trying to download HD content anyway?
    Bernd

  • Saving jpeg and png files large file size

    I've recently purchased Web Premium 5.5, and I'm trying to save files in JPEG and PNG format in Photoshop and getting unexpectedly large file sizes. For example, I save a simple 246x48 logo image that consists of only 3 basic colours; when I save it as a JPEG at quality 11, the Finder reports the file size as 61.6k. I save it as a PNG file and it's 60.2k. I also cropped the image to 190x48 and saved it as a PNG, and the file size is actually larger at 61.9k; saving as a JPEG at quality 7, the file size is a still relatively large 54k.
    I have a similar non-indexed-colour PNG logo on my Mac that I saved in CS3 on a PC. It's actually larger at 260x148 yet only 17k, and that logo is actually more complex, with a gloss effect over part of it. What's going on and how do I fix it? It's making Photoshop useless, especially when saving for the web.
    Thanks

    Thanks, I had considered that, but all my old files report the correct file sizes. I have been experimenting, and Fireworks saves the file at 2.6k as PNG-24 and 5.1k as JPEG, but I don't really want to have to save the files twice, once cut from the comp in Photoshop and again in Fireworks. Juggling between the two applications is a bit inconvenient even with just one file, but especially when you have potentially hundreds of files to save.
    I've also turned off icon and Windows thumbnail in Photoshop preferences, and although this has decreased the file size, they are still quite large at 27k; Save for Web is better at 4k for the PNG and 16k for the JPEG. Is there any way to get Fireworks' file-saving performance in Photoshop? It seems strange that the compression in Photoshop would be second-rate compared to Fireworks, given they are both developed by Adobe and Photoshop is Adobe's primary image-editing software.

  • Large file size increase when editing in PS3

    I shoot mostly JPG, with most file sizes in the 5 MB range. Today I sent 3 files to PS3 for auto alignment and blending into a panorama. When complete, the resulting PSD file back in my Lightroom library is 495 MB. When I export the PSD panorama as a JPG for printing at 9" x 36" at 180 dpi, the resulting JPG is only 6.1 MB. I imported the 6.1 MB JPG panorama back into Lightroom for comparison with the 495 MB PSD file, and they look nearly identical even at 100%. Will someone please explain what is happening to cause these remarkable file size changes? Is there any reason, other than the time the process took, not to replace the PSD file in my library with the smaller JPG when finished?
    Powerbook G4 - 1.67GHz - 2GB SD Ram; OS 10.4.11; Library on external drive; D300

    JPEG is a lossy compression that uses an algorithm, based on human vision research, that gets rid of information in the file that we cannot generally see, and compresses colour ranges we are less likely to notice changes in (typically reds) more than colours our eyes are very sensitive to (yellow and green) - at least when you use a high quality setting. PSD, on the other hand, uses a lossless compression algorithm that retains ALL information in an image, regardless of whether you can see it or not. Now, sometimes you might want to do an edit that, for example, lifts up shadows. If you started with a JPEG, that information might already be gone, but if you saved as a PSD, you might be able to retrieve it, as the compression algorithm did not throw the information away. Another example is subtle gradients in certain colours, such as blues or reds. Say you want to make the colour more saturated. You're likely to start seeing banding with the JPEG, while you'll probably get good results with a losslessly compressed file.
    Lastly, the PSDs Lightroom writes are typically 16-bit, which again means more precision. They can also contain layers, such as adjustment layers, which allow you to later go back on your edits. JPEG cannot do any of these things. It might be all right to store a finalized file as JPEG, but be aware that you'll be limited in what you can do with it if you change your mind. Hard disk space is super cheap nowadays.

  • Photoshop CS6 keeps freezing when I work with large files

    I've had problems with Photoshop CS6 freezing on me and giving me RAM and Scratch Disk alerts/warnings ever since I upgraded to Windows 8.  This usually only happens when I work with large files, however once I work with a large file, I can't seem to work with any file at all that day.  Today however I have received my first error in which Photoshop says that it has stopped working.  I thought that if I post this event info about the error, it might be of some help to someone to try to help me.  The log info is as follows:
    General info
    Faulting application name: Photoshop.exe, version: 13.1.2.0, time stamp: 0x50e86403
    Faulting module name: KERNELBASE.dll, version: 6.2.9200.16451, time stamp: 0x50988950
    Exception code: 0xe06d7363
    Fault offset: 0x00014b32
    Faulting process id: 0x1834
    Faulting application start time: 0x01ce6664ee6acc59
    Faulting application path: C:\Program Files (x86)\Adobe\Adobe Photoshop CS6\Photoshop.exe
    Faulting module path: C:\Windows\SYSTEM32\KERNELBASE.dll
    Report Id: 2e5de768-d259-11e2-be86-742f68828cd0
    Faulting package full name:
    Faulting package-relative application ID:
    I really hope to hear from someone soon, my job requires me to work with Photoshop every day and I run into errors and bugs almost constantly and all of the help I've received so far from people in my office doesn't seem to make much difference at all.  I'll be checking in regularly, so if you need any further details or need me to elaborate on anything, I should be able to get back to you fairly quickly.
    Thank you.

    Here you go Conroy.  These are probably a mess after various attempts at getting help.

  • Aperture is exporting large file sizes, e.g. the original image is 18.2 MB and the exported version (TIFF, 16-bit) is 47.9 MB; any ideas please?

    Aperture is exporting large file sizes, e.g. the original image is 18.2 MB and the exported version (TIFF, 16-bit) is 47.9 MB; any ideas please?

    Raws, even if not compressed, should be smaller than a 24-bit TIFF, since they have only one bitplane. My T3i shoots 14-bit 18 MP raws and has a raw file size of 24.5 MB*. An uncompressed TIFF should have a size of 18 MP x 3 bytes per pixel, or 54 MB.
    *There must be some lossless compression going on, since 18 MP times 1.75 bytes per pixel is 31.5 MB for uncompressed raw.
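    The arithmetic in this reply is just megapixels times bits per pixel, divided by 8 (one 14-bit plane for raw, three 8-bit channels for a 24-bit TIFF). A small sketch to check the numbers:

```java
public class ImageSizeMath {

    // uncompressed size in MB (decimal) = megapixels × bits per pixel ÷ 8
    static double uncompressedMB(double megapixels, int bitsPerPixel) {
        return megapixels * bitsPerPixel / 8;
    }

    public static void main(String[] args) {
        // 18 MP at 24 bits/pixel (3 × 8-bit channels): the 54 MB TIFF in the reply
        System.out.println(uncompressedMB(18, 24)); // 54.0
        // 18 MP at 14 bits/pixel (single raw bitplane): the 31.5 MB footnote figure
        System.out.println(uncompressedMB(18, 14)); // 31.5
    }
}
```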

  • Exporting with specific file size possible ?

    Hi,
    say I have master files in the region of 50 - 350Mb and need to upload the versions at best possible JPEG quality with a file size restriction of say 8Mb.
    How would you do this straight out of Aperture ?
    AppleScript ?
    Thanks for any pointers,
    chris

    Thanks Ian,
    I thought so.
    If anybody could point me to a nice AppleScript/Automator workflow that could help me with full size exports on Finder level, please let me know.
    Thanks a lot in advance,
    chris

  • Wpg_docload fails with "large" files

    Hi people,
    I have an application that allows the user to query and download files stored in an external application server that exposes its functionality via webservices. There's a lot of overhead involved:
    1. The user queries the file from the application and gets a link that allows her to download the file. She clicks on it.
    2. Oracle submits a request to the webservice and gets a XML response back. One of the elements of the XML response is an embedded XML document itself, and one of its elements is the file, encoded in base64.
    3. The embedded XML document is extracted from the response, and the contents of the file are stored into a CLOB.
    4. The CLOB is converted into a BLOB.
    5. The BLOB is pushed to the client.
    Problem is, it only works with "small" files, less than 50 KB. With "large" files (more than 50 KB), the user clicks on the download link and about one second later gets:
    The requested URL /apex/SCHEMA.GET_FILE was not found on this server
    When I run the webservice outside Oracle, it works fine. I suppose it has to do with PGA/SGA tuning.
    It looks a lot like the problem described at this Ask Tom question.
    Here's my slightly modified code (XMLRPC_API is based on Jason Straub's excellent [Flexible Web Service API|http://jastraub.blogspot.com/2008/06/flexible-web-service-api.html]):
    CREATE OR REPLACE PROCEDURE get_file ( p_file_id IN NUMBER )
    IS
        l_url                  VARCHAR2( 255 );
        l_envelope             CLOB;
        l_xml                  XMLTYPE;
        l_xml_cooked           XMLTYPE;
        l_val                  CLOB;
        l_length               NUMBER;
        l_filename             VARCHAR2( 2000 );
        l_filename_with_path   VARCHAR2( 2000 );
        l_file_blob            BLOB;
    BEGIN
        SELECT FILENAME, FILENAME_WITH_PATH
          INTO l_filename, l_filename_with_path
          FROM MY_FILES
         WHERE FILE_ID = p_file_id;
        l_envelope := q'!<?xml version="1.0"?>!';
        l_envelope := l_envelope || '<methodCall>';
        l_envelope := l_envelope || '<methodName>getfile</methodName>';
        l_envelope := l_envelope || '<params>';
        l_envelope := l_envelope || '<param>';
        l_envelope := l_envelope || '<value><string>' || l_filename_with_path || '</string></value>';
        l_envelope := l_envelope || '</param>';
        l_envelope := l_envelope || '</params>';
        l_envelope := l_envelope || '</methodCall>';
        l_url := 'http://127.0.0.1/ws/xmlrpc_server.php';
        -- Download XML response from webservice. The file content is in an embedded XML document encoded in base64
        l_xml := XMLRPC_API.make_request( p_url      => l_url,
                                          p_envelope => l_envelope );
        -- Extract the embedded XML document from the XML response into a CLOB
        l_val := DBMS_XMLGEN.convert( l_xml.extract('/methodResponse/params/param/value/string/text()').getclobval(), 1 );
        -- Make a XML document out of the extracted CLOB
        l_xml := xmltype.createxml( l_val );
        -- Get the actual content of the file from the XML
        l_val := DBMS_XMLGEN.convert( l_xml.extract('/downloadResult/contents/text()').getclobval(), 1 );
        -- Convert from CLOB to BLOB
        l_file_blob := XMLRPC_API.clobbase642blob( l_val );
        -- Figure out how big the file is
        l_length    := DBMS_LOB.getlength( l_file_blob );
        -- Push the file to the client
        owa_util.mime_header( 'application/octet', FALSE );
        htp.p( 'Content-length: ' || l_length );
        htp.p( 'Content-Disposition: attachment;filename="' || l_filename || '"' );
        owa_util.http_header_close;
        wpg_docload.download_file( l_file_blob );
    END get_file;
    /
    I'm running XE, PGA is 200 MB, SGA is 800 MB. Any ideas?
    Regards,
    Georger

    Script: http://www.indesignsecrets.com/downloads/MultiPageImporter2.5JJB.jsx.zip
    It works great for files up to ~400 pages; when I have more pages than that, I get the crash at around page 332.
    Thanks

  • Saving TIFF with LZW compression results in larger file size.

    I know this never used to be the case with previous versions of Photoshop, but since I upgraded to CS5.1, when I save files with LZW compression checked, the resulting file size is higher than if I uncheck the box.
    For example, one file with LZW compression is 56.3 MB, and without it is 47.8 MB.
    Is it just me, or have Adobe got their code mixed up?
    Edit: This appears to only affect 16-bit images; with 8-bit images, LZW works as expected.
    Message was edited by: Toundy. Added info about 16-bit images.

    From the above link:
    Before you Post:
    Firstly - have you really checked the 'help' option in the program? Many problems can be solved far faster by getting the answer from the Help File.
    Secondly - check the Forum FAQ folder. There's advice there on many common questions and problems.
    Thirdly - use the 'Knowledgebase Search' option near the top of this page. Or you can click here to go to the relevant page and enter your search words there - or just search for 'Photoshop' there to see summaries of all the relevant items.
    Fourthly, try searching this forum. It may well be that your question has already been asked and answered. The archives are full of useful advice. Please remember to perform a search to see if your question has previously been answered. You can click here to search the Photoshop-for-Macintosh forum.
    Lastly, Check whether you are entitled to Adobe technical support -  click here for a summary of support options.
    If you can't find the answer by any of those methods, do post your question in the relevant forum here.

  • Working with Large files in Photoshop 10

    I am taking pictures with a 4X5 large format film camera and scanning them at 3,000 DPI, which is creating extremely large files. My goal is to take them into Photoshop Elements 10 to cleanup, edit, merge photos together and so on. The cleanup tools don't seem to work that well on large files. My end result is to be able to send these pictures out to be printed at large sizes up to 40X60. How can I work in this environment and get the best print results?

    You will need to work with 8-bit files to get the benefit of all the editing tools in Elements.
    I would suggest resizing at a resolution of 300 ppi, although you can use much lower resolutions for really large prints that will be viewed from a distance, e.g. hung on a gallery wall.
    That should give you an image size of 12,000 x 18,000 pixels if the original aspect ratio is 2:3.
    Use the top menu:
    Image >> Resize >> Image Size
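    The 12,000 x 18,000 figure above is just the print dimensions multiplied by the resolution; a quick sketch to check it for the 40 x 60 inch print at 300 ppi:

```java
public class PrintSizeMath {

    // pixels needed along one edge = print dimension in inches × pixels per inch
    static int pixelsNeeded(int inches, int ppi) {
        return inches * ppi;
    }

    public static void main(String[] args) {
        // 40 × 60 inch print at 300 ppi, as in the reply
        System.out.println(pixelsNeeded(40, 300)); // 12000
        System.out.println(pixelsNeeded(60, 300)); // 18000
    }
}
```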
