ADT Not Working With Large Amount Of Files

Hi,
I posted a similar issue in the Flash Builder forum; it seems these tools are not meant for anything too big. Here is my problem: I have an application that runs on the iPad just fine, but I cannot seem to package the resource files it uses, such as swf, xml, jpg, and flv. There are a lot of these files, maybe around 1000. This app won't be on the App Store; it is just used internally at my company. I tried to package it using Flash Builder, but it couldn't handle it, so I then tried the adt command-line tool. The problem now is that it runs out of memory:
Exception in thread "main" java.lang.OutOfMemoryError
        at java.util.zip.Deflater.init(Native Method)
        at java.util.zip.Deflater.<init>(Unknown Source)
        at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:428)
        at com.adobe.air.ipa.IPAOutputStream.addFile(IPAOutputStream.java:338)
        at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:273)
        at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:247)
        at com.adobe.air.ADTOutputStream.addFile(ADTOutputStream.java:367)
        at com.adobe.air.ipa.IPAOutputStream.addFile(IPAOutputStream.java:161)
        at com.adobe.air.ApplicationPackager.createPackage(ApplicationPackager.java:67)
        at com.adobe.air.ipa.IPAPackager.createPackage(IPAPackager.java:163)
        at com.adobe.air.ADT.parseArgsAndGo(ADT.java:504)
        at com.adobe.air.ADT.run(ADT.java:361)
        at com.adobe.air.ADT.main(ADT.java:411)
I tried increasing the Java heap size to 1400m, but that did not help. What is disturbing, though, is that in this day and age the adt tool is written solely for 32-bit Java runtimes, when you can no longer buy a 32-bit computer. Almost everything is 64-bit now. Being able to use a 64-bit Java runtime would eliminate this problem.
Has anyone else had to deal with packaging a large number of files within an iPad app?
thanks
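One workaround worth trying (a sketch, not an Adobe-sanctioned fix): bundle the resource files into a single archive before packaging, so adt only has to add one entry, and extract them at runtime on the device. The folder and file names below are hypothetical; this is just the bundling step, in Python:

```python
import zipfile
from pathlib import Path

# Hypothetical layout: create a couple of demo resource files to bundle
src = Path("assets_demo")
(src / "xml").mkdir(parents=True, exist_ok=True)
(src / "xml" / "config.xml").write_text("<config/>")
(src / "logo.jpg").write_bytes(b"\xff\xd8\xff\xe0")  # fake jpg header bytes

out = Path("assets_bundle.zip")
# ZIP_STORED avoids recompressing jpg/flv content that is already compressed
with zipfile.ZipFile(out, "w", compression=zipfile.ZIP_STORED) as zf:
    for f in sorted(src.rglob("*")):
        if f.is_file():
            zf.write(f, f.relative_to(src))

with zipfile.ZipFile(out) as zf:
    print(zf.namelist())  # ['logo.jpg', 'xml/config.xml']
```

On the AIR side you would then extract the archive at runtime with an AS3 zip library. Whether one large entry sidesteps the Deflater out-of-memory depends on the ADT version, so treat this as an experiment.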

Hi,
Yes, the bug number is 2896433, logged against Adobe AIR. I also logged one against Flash Builder for basically the same issue, FB-31239. In the tests I have done here, both issues are the result of including a large number of files in the package. When the number of included files is reduced, the bugs do not appear.
I would like to try out the new SDK, but it won't be available for a few more weeks, correct?
The main reason I want to include all these files is because the iPads will not always have a wireless connection.
thx

Similar Messages

  • Converting tdm to lvm / working with large amounts of data

    I use a PCI-6251 card for data acquisition, and LabVIEW version 8, to log 5 channels at 100 kHz for approximately 4-5 million samples on each channel (the more the better). I use the Express VIs for reading and writing data, which is stored in .tdm format (the .tdx file is around 150 MB). I did not store it in .lvm format, to reduce the time taken to acquire data.
    1. How do I convert this binary file to a .mat file?
    2. In another approach, I converted the tdm file to lvm format. This works as long as the file is small (say 50 MB); beyond that LabVIEW's memory fills up and it will not save the new file. What is an efficient way to write large files (say to lvm format) without exhausting LabVIEW's memory? I tried saving to multiple files, saving one channel at a time, and increasing the computer's virtual memory (up to 4880 MB), but I still get the 'LabVIEW memory full' error.
    3. Another problem I noticed is that once LabVIEW has been used to acquire data, it holds on to a lot of the computer's memory, even after the VI stops running. Is there a way to release that memory, and is this mainly due to bad programming?
    any suggestions?

    I assume from your first question that you are attempting to get your data into Matlab.  If that is the case, you have three options:
    You can treat the tdx file as a binary file and read directly from Matlab.  Each channel is a contiguous block of the data type you stored it in (DBL, I32, etc.), with the channels in the order you stored them.  You probably know how many points are in each channel.  If not, you can get this information from the XML in the tdm file.  This is probably your best option, since you won't have to convert anything.
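As an illustration of reading the tdx directly (sketched here in Python; in Matlab fread works the same way), assuming little-endian DBL data and channel-contiguous layout, with the channel and sample counts normally taken from the tdm XML:

```python
import struct

# Write a fake .tdx: 2 channels of 4 DBL samples each, channel-contiguous
channels = [[0.0, 1.0, 2.0, 3.0], [10.0, 11.0, 12.0, 13.0]]
with open("demo.tdx", "wb") as f:
    for ch in channels:
        f.write(struct.pack("<4d", *ch))

# Read back channel 1: seek past the earlier channel(s), then unpack
n_samples, ch_index = 4, 1   # in practice, read these from the tdm XML
with open("demo.tdx", "rb") as f:
    f.seek(ch_index * n_samples * 8)               # 8 bytes per DBL sample
    chan = struct.unpack(f"<{n_samples}d", f.read(n_samples * 8))

print(chan)  # (10.0, 11.0, 12.0, 13.0)
```

Because each channel is one contiguous block, you can read any slice of one channel without loading the whole file.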
    Early versions of TDM storage (those shipping with LV7.1 or earlier) automatically read the entire file into memory when you load it.  If you have LV7.1, you can upgrade to a version which allows you to read portions of the file by downloading and installing the demo version of LV8.  This will upgrade the shared USI component.  You can then read a portion of your large data set into memory and stream it back out to LVM.
    Do option 2, but use NI-HWS (available on your driver CD under the computer based instruments tab) instead of LVM.  HWS is a hierarchical binary format based on HDF5, so Matlab can read the files directly through its HDF5 interface.  You just need to know the file structure.  You can figure that out using HDFView.  If you take this route and have questions, reply to this post and I will try to answer them.  Note that you may wish to use HWS for your future storage, since its performance is much better than TDM and you can read it from Matlab.  HWS/HDF5 also supports compression, and at your data rates, you can probably pull this off while streaming to disk, if you have a reasonably fast computer.
    Handling large data sets in LabVIEW is an art, like most programming languages.  Check out the tutorial Managing Large Data Sets in LabVIEW for some helpful pointers and code.
    LabVIEW does not release memory until a VI exits memory, even if the VI is not running.  This is an optimization that prevents a repeatedly called VI from requesting the same memory every time it is called.  You can reduce this problem considerably by writing empty arrays to all your front panel objects before you exit your top-level VI.  Graphs are a particularly common problem.

  • GPU Acceleration not working with large .psd?

    My GPU acceleration works fine, but after I saved a large psd and quit, the next time I opened the file the GPU option was greyed out. If I quit PS and restart it, GPU acceleration works again for new and other files, but when I try to open this same one, acceleration is disabled every time -- it worked fine when I first created the file.
    Is there a reason this file in particular is suddenly incompatible with acceleration?  It's not really that big a file, 600 MB.  I'd really like to know why this is happening and somehow get acceleration back for this one file I'm working on.

    That's very interesting.
    I've had OpenGL become disabled on a couple of occasions without knowing why (never so far with 12.0.3, though).
    Clearly there's some kind of "restorative" logic in Photoshop to try to overcome the numerous possibilities for failure due to OpenGL implementations being so diverse.
    What's the pixel count, bit depth, number of layers, styles, content, etc. of this file?
    What display adapter (video card) do you have?  What version of drivers?
    I have recently seen a problem where the Text tool won't show its bounding box or cursor with a large file that's zoomed in.  I don't know whether this has anything to do with what you're seeing.
    I don't know how feasible it is but it's possible Chris Cox or someone at Adobe might like to see the file with which a problem like this can be reproduced.
    -Noel

  • Losing indexes when working with large amounts of data

    SCENARIO
    We are working on an Interface project with ORACLE 10g that works basically like this:
    We have some PARAMETER TABLES in which the key users can insert, update or delete parameters via a web UI.
    There is a download process that brings around 20 million records from our ERP system into what we call RFC TABLES. There are around 14 RFC TABLES.
    We developed several procedures that process all this data against the PARAMETER tables according to some business rules, and we end up with what we call XML TABLES, because they are sent to another piece of software, completing the interface cycle. We also have INTERMEDIATE TABLES that are loaded in the middle of the process.
    The whole process takes around 2 hours to run.
    We had to create several indexes to get to this time. Without the indexes the process will take forever.
    Every night the RFC, INTERMEDIATE and XML tables need to be truncated and then loaded again.
    I know it might seem strange that we delete millions of records and then load them again. The reason is that the data the users insert in the PARAMETER TABLES needs to be processed against ALL the data that comes from the ERP and goes to the other software.
    PROBLEMS
    As I said we created several indexes in order to make the process run in less than 2 hours.
    We were able to run the whole process in that time a few times but, suddenly, the process started to HANG forever, and we realized some indexes were just not being used anymore.
    When running EXPLAIN we saw the indexes were having no effect and we had some ACCESS FULLs. Curiously, when we took the HINTS out and put them back in, the indexes started working again.
    SOLUTION
    We tried things like
    DBMS_STATS.GATHER_SCHEMA_STATS(ownname => SYS_CONTEXT('USERENV', 'CURRENT_SCHEMA'), cascade=>TRUE);
    dbms_utility.analyze_schema
    Dropping all the tables and recreating them every time before the process starts
    Nothing solved our problem so far.
    We need advice from someone that worked in a process like this. Where millions of records are deleted and inserted and where a lot of indexes are needed.
    THANKS!
    Jose

    skynyrd wrote:
    I don't know anything about
    BIND variables
    Stored Outlines
    bind peeking issue
    or plan stability in the docs
    but I will research about all of them
    we are currently running the process with a new change:
    We put this line:
    DBMS_STATS.GATHER_SCHEMA_STATS(ownname => SYS_CONTEXT('USERENV', 'CURRENT_SCHEMA'), cascade=>TRUE);
    after every big INSERT or UPDATE (more than 1 million records)
    It is running well so far (it's almost in the end of the process). But I don't know if this will be a definitive solution. I hope so.
    I will post here after I have an answer if it solved the problem or not.
    Thanks a lot for your help so far

    Well, you'd best get someone in there who knows what those things are; basic development, basic performance tuning and basic administration are all predicated on understanding these basic concepts, and patching is necessary (unless you are on XE). I would recommend getting books by Tom Kyte; he clearly explains the concepts you need to know to make things work well. You ought to find some good explanations of bind peeking online if you google that term with +Kyte.
    You will be subject to this error at random times if you don't find the root cause and fix it.
    Here is some food for your thoughts:
    http://structureddata.org/2008/03/26/choosing-an-optimal-stats-gathering-strategy/ (one of those "what to expect from the 10g optimizer" links does work)
    http://kerryosborne.oracle-guy.com/2009/03/bind-variable-peeking-drives-me-nuts/
    http://pastebin.com/yTqnuRNN
    http://kerryosborne.oracle-guy.com/category/oracle/plan-stability/
    Getting stats on the entire schema as frequently as you do may be overkill and time wasting, or even counterproductive if you have an issue with skewed stats. Note that you can figure out what statistics you need and lock them, or if you have several scenarios, export them and import them as necessary. You need to know exactly what you are doing, and that is some amount of work. It's not magic, but it is math. Get Jonathan Lewis' optimizer book.

  • Working with large amount of data

    Hi! I am writing a peer-to-peer video player and I need to operate on huge amounts of data. The downloaded data should be stored in memory for sharing with other peers. Some video files can be 2 GB and more, so keeping all this data in RAM is not the best solution, I think =)
    Since the flash player does not have access to the file system I can not save this data in temporary files.
    Is there a solution to this problem?

    No ideas? very sad ((

  • Working with  large Avi (DivX) File - PE 2

    Hi there
    I'm trying to use PE2 to edit & generate scene markers in a 1.24GB avi file.
    The problem I'm having is that frames are very slow to update on the timeline, making it very difficult to work with, probably due to the size of the file and my low system specs.
    Any thoughts on how to best tackle this? Would it be best to split the avi file into more manageable pieces? If so, how would I do that?
    Any suggestions would be much appreciated.
    Thanks & regards
    Stephen

    Export the DivX clip as a DV-AVI file using File>Export>Movie. Then replace the DivX clip on the timeline with the DV-AVI clip.

  • Flex 14 SD card reader not working with larger cards on Windows 8.1

    The card reader in my Flex 14 is behaving very oddly. When I insert a 16GB SDHC card it makes the 'USB inserted' noise and the card shows up under device manager but I can't access it through file explorer. The card reader works fine with smaller SD cards that I have. I tried the card in another, Win7 laptop and it worked fine.
    I see a number of other people are having a similar problem and I have followed all of the troubleshooting advice I can find, but nothing is working. Does anyone have any other suggestions?
    This is what I have tried:
    1. Leaving the card in the camera and attaching the camera via USB cable to the laptop: same result, noise, but can't access the card.
    2. Trying to reassign the drive letter through the storage controller: it shows the card, but if you right-click it to assign a drive letter it says the 'file is missing'.
    3. Downloaded the Win8.1 x64 driver from the Lenovo website and installed it. This stopped the card being recognised at all.
    4. Installed the Win8 x64 driver: this got me back to the noise-but-no-action stage.
    5. Updated Windows.
    6. Uninstalled and reinstalled the driver numerous times, both with the card inserted and without it, and with restarting the computer after uninstalling and after reinstalling.
    7. Ran Ubuntu from a live USB: this allowed me to see the card and copy its contents to the laptop's HDD.
    So it seems to be some incompatibility between the Lenovo drivers and Win8.1. Potentially it's to do with the USB drivers rather than the card reader drivers as the result when using the USB cable and the camera was the same as putting the card in the card reader. Is there a way to update these?
    Any other troubleshooting steps I could take?
    Thanks.

    What size are the cards?
    Another way: you can use a USB flash drive and the camera connection kit.
    Plug the USB flash drive (this works the same with an SD card) into your computer and create a new folder titled DCIM. Then put your movie/photo files into the folder. The files must have filenames exactly 8 characters long (no spaces) plus the file extension (e.g., my-movie.mov; DSCN0164.jpg).
    Now plug the flash drive into the iPad using the camera connection kit. Open the Photos app; the movie/photo files should appear and you can import. (You cannot export using the camera connection kit.)
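The 8-character naming rule above is easy to get wrong by hand; here is a small sketch that builds a conforming name (the dash/zero padding scheme is my own convention, not Apple's):

```python
from pathlib import Path

def dcim_name(path: Path) -> str:
    """Pad/trim to an 8-character, space-free stem; keep a lowercase extension."""
    stem = path.stem.replace(" ", "-")[:8].ljust(8, "0")
    return stem + path.suffix.lower()

print(dcim_name(Path("my movie.MOV")))  # my-movie.mov
print(dcim_name(Path("clip.jpg")))      # clip0000.jpg
```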
    Using The iPad Camera Connection Kit
    http://support.apple.com/kb/HT4101
    Secrets of the iPad Camera Connection Kit
    http://howto.cnet.com/8301-11310_39-57401068-285/secrets-of-the-ipad-camera-connection-kit/
     Cheers, Tom 

  • Bookmark sync not working with large set and maybe javascript bookmark bookmarklets??

    Bookmark sync via iCloud doesn't seem to work for me. I use Chrome as my primary browser but keep a huge set of bookmarks synced to Safari via Xmarks.
    Basically I either get no bookmarks on my iOS devices (iPhone 4 and iPad 1), or a huge number of repeated copies of 1Password's very large JavaScript bookmarklet.
    I've tested by deleting all my Safari bookmarks and starting with a clean Mac. When I do this everything works and changes on any device propagate to the others.
    As soon as I bring in my big bookmark set from Xmarks, it all breaks.
    Do I need to perform the bookmark clean up I've been putting off for a decade or so?

    Can't help with your bidirectional bookmark syncing question but try this for backing up your touch's bookmarks.
    Export/backup all of your Safari/Internet Explorer bookmarks (depending on where you're syncing them from) and keep them in a safe place. This is your master bookmarks list. Then delete all bookmarks/folders from your browser. Now sync your bookmarks with the touch with the following iTunes settings:
    Under your touch's 'Info' tab, select the checkbox 'Sync bookmarks with Internet Explorer/Safari'.
    After syncing, your touch's bookmarks (and only those from your touch) will be synced to your browser. Now export/backup these bookmarks...these are your touch's bookmarks. Now you have a both a backup of your master bookmarks and your touch's bookmarks. When you want to sync/restore your touch's bookmarks back to the touch, delete all your PC browser's bookmarks and import the touch bookmarks:
    Under iTunes/your touch/'Info'/Web browser & Advanced, select the checkboxes 'Sync bookmarks with IE/Safari' and 'Replace information on this iPod: Bookmarks' and sync again.
    This will transfer your touch's bookmarks back to the touch. Now delete all bookmarks again from IE/Safari and re-import your master bookmarks. Note that after importing them back in, they will likely be in alphabetical order rather than the date added order.
    Also note that (at least in Windows) your browser needs to be open for bookmark syncing to take place.

  • Encore CS5 constantly locks up/freezes working with large H.264 file

    This is driving me mad. I'm trying to author my first Blu-ray with this program, a single-layer output with H.264/AVC video. I've encoded the video elsewhere and I have a 21GB .264 file. It's a Blu-ray-legal file, but as soon as I import it into Encore CS5 the fun and games begin. Firstly, it takes a good five minutes to import, and Encore appears to have crashed. It does finally import, though, and looking at 'Blu Ray transcode' it says 'Don't transcode', so it is a legal file. However, if I then try to do anything with the file in Encore, like add it to a timeline, the whole program locks up again. Ten minutes later it's added as a timeline, but trying to navigate that timeline to add chapters is impossible; the program locks up (Not responding) if I try to move the slider or do pretty much anything. It makes authoring a Blu-ray with this file pretty much impossible.
    My PC spec is as follows:
    Core i5 2500K 3.3GHz
    8GB DDR3 1600MHz RAM
    120GB Crucial SSD boot drive with Encore installed on
    500GB Hard Disk & 2TB Hard Disk
    Windows 7 64-Bit Home Premium
    Is there anything I can do to actually make Encore usable with this large file and finally finish authoring a disc? I've tried loading the file from all three drives and it's the same. The media cache is currently on my 2TB drive because I want to save space on my SSD and minimize writes to it.

    Taking 5 minutes to import is normal.  Encore has to index the entire file so it knows where the keyframes are.  Don't try to do anything while that's happening.
    Once in, however, Encore should respond normally.  At this point, I'm drawing a blank as to why it isn't.

  • WebUtil_File.File_Open_Dialog VERY SLOW with large amounts of files

    Dear Friend,
    We are using WebUtil_File.File_Open_Dialog on the client to allow for the selection of an image from a folder on the server, which contains around 2,000-3,000 files.
    It takes around 10-15 minutes to display the files in the folder. This is obviously unacceptable, and I was wondering if anyone had come up with a viable solution to this problem, as I noticed similar issues had been posted in the past with no solution suggested.
    The function call for the above operation is: :control.fichier := WebUtil_File.File_Open_Dialog(:global.path||'images\bdsurvey\', '', '*.jpg |*.jpg| (All files)*.* |*.*|', 'Nom du fichier photo');
    Many thanks,
    John

    This can be done by the following:
    Tuning the file system: reducing/filtering unwanted files, defragmenting, using FAT (NTFS might take more time)
    I am not sure whether you can do anything for this in the AS.
    Alternatively, there are some Java classes which may be used instead of WebUtil.
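As a sketch of the "reduce/filter unwanted files" idea: enumerating a folder server-side with a filter is far cheaper than letting a file dialog list thousands of entries. The demo folder below is hypothetical:

```python
import os, tempfile

# Hypothetical image folder with mixed content
d = tempfile.mkdtemp()
for name in ("a.jpg", "b.jpg", "notes.txt"):
    open(os.path.join(d, name), "w").close()

# scandir yields entries lazily, so non-matching files are skipped cheaply
jpgs = sorted(e.name for e in os.scandir(d)
              if e.is_file() and e.name.lower().endswith(".jpg"))
print(jpgs)  # ['a.jpg', 'b.jpg']
```

The filtered list could then be shown in a plain Forms list item instead of the OS file dialog; whether that is acceptable for your UI is a design call.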

  • Crossgrade Upgrade Not working with previous LiveType Media files

    I have recently received the crossgrade discs and was able to successfully upgrade FCP, Motion, and DVDSP. However, when it came to LiveType and the installer requested the previous LiveType Media discs, it didn't seem to recognize the first disc and spat it out, stalling the rest of the install. Has anyone else experienced this issue, or have any suggestions?

    None of what I posted should require the device. If you did not delete any folders, then it should snap back to where the file was stored in the first place. If you still have issues, it is possible to search for the files through Windows. This may be a little workaround that you need.
    If Garmin did not change any of the file types, then this may be the trick: using the search function, look for .dat or .bak; you may find some under palm/[Hotsync Username].
    The next thing would be to see if Garmin could help you. From the sound of it, you just need to know where the file location is, and an old forum board or KB just might help you.
    Post relates to: Treo 800w (Sprint)

  • ADE Not Compatible With Password Protected PDF Files

    The version of ADE I downloaded will not work with password-protected PDF files.  What do I do?
    Thanks.

    ADE does not (yet) support files that use the Password Security handler, so you will want to use a different PDF viewer, such as Adobe Reader.

  • VM Manager's Launch Console not working with OpenJDK and IcedTea

    Fedora 15
    java-1.6.0-openjdk-1.6.0.0-59.1.10.3.fc15.x86_64
    icedtea-web-1.0.4-1.fc15.x86_64
    ICEDTEA-WEB NOTES
    Invalid XML
    An error like
    netx: Unexpected net.sourceforge.jnlp.ParseException: Invalid XML document syntax. at net.sourceforge.jnlp.Parser.getRootNode(Parser.java:1203)
    indicates that the JNLP file is not valid XML. The error happens because netx uses an XML parser to parse the JNLP file. Other JNLP client implementations may use more lenient parsers and may or may not work with the given JNLP file. Errors caused by malformed JNLP files can often lead to subtle bugs, so it is probably best to fix the JNLP file itself. A tool like xmlproc_parse might be able to pinpoint the error.
    XMLPROC_PARSE OUTPUT
    $ xmlproc_parse ovm_rasproxy-ws.jnlp
    xmlproc version 0.70
    Parsing 'ovm_rasproxy-ws.jnlp'
    E:ovm_rasproxy-ws.jnlp:3:89: Undeclared entity 'machineName'
    E:ovm_rasproxy-ws.jnlp:3:89: ';' expected
    Parse complete, 2 error(s) and 0 warning(s)
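The "Undeclared entity" error above is the classic symptom of a raw & inside an XML attribute: the parser reads &machineName=... as a (malformed) entity reference. Escaping the ampersand as &amp; in the generated JNLP href fixes the parse; a minimal sketch with Python's stdlib parser (the shortened href is illustrative):

```python
import xml.etree.ElementTree as ET

bad  = '<jnlp href="a.jnlp?id=1&machineName=m"/>'
good = '<jnlp href="a.jnlp?id=1&amp;machineName=m"/>'

try:
    ET.fromstring(bad)
    result = "parsed"
except ET.ParseError:
    result = "rejected"   # the raw & is read as a broken entity reference
print(result)

# The escaped entity resolves back to a literal & when the attribute is read
print(ET.fromstring(good).get("href"))
```

So the fix belongs in whatever generates the JNLP: query-string ampersands in the href attribute must be written as &amp;.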
    JAVAWS OUTPUT
    $ javaws ovm_rasproxy-ws.jnlp
    netx: Unexpected net.sourceforge.jnlp.ParseException: Invalid XML document syntax. at net.sourceforge.jnlp.Parser.getRootNode(Parser.java:1243)
    JAVAWS -VERBOSE OUTPUT
    $ javaws -verbose ovm_rasproxy-ws.jnlp
    No User level deployment.properties found.
    Starting security dialog thread
    JNLP file location: ovm_rasproxy-ws.jnlp
    Status: CONNECTED DOWNLOADED STARTED +(CONNECTED DOWNLOADED STARTED) @ ./ovm_rasproxy-ws.jnlp
    <?xml version="1.0" encoding="UTF-8"?>
    line: 2 <jnlp spec="1.0+" codebase="http://192.168.4.21:7001/ovm/rasproxy/"
    line: 3 href="ovm_rasproxy-ws.jnlp?vmachineId=0004fb0000060000046ffdb8a331ce21&machineName=mytestmachine">
    line: 4 <information>
    line: 5 <title>Oracle VM Remote Access Service</title>
    line: 6 <vendor>Oracle</vendor>
    line: 7 <homepage>http://support.oracle.com/</homepage>
    line: 8 </information>
    line: 9 <resources>
    line: 10
    line: 11 <j2se version="1.5+"
    line: 12 href="http://java.sun.com/products/autodl/j2se" />
    line: 13 <jar href="ovm_rasproxy-signed.jar" main="true" />
    line: 14 <jar href="OvmCoreApi.jar" />
    line: 15 <jar href="MgmtUtil.jar" />
    line: 16 <jar href="Odof.jar" />
    line: 17 <jar href="commons-logging-1.1.1.jar" />
    line: 18 </resources>
    line: 19
    line: 20
    line: 21 <application-desc main-class="com.oracle.ovm.ras.proxy.RasProxyApplet">
    line: 22 <argument>-server</argument>
    line: 23 <argument>192.168.4.21</argument>
    line: 24 <argument>-service</argument>
    line: 25 <argument>003600010004fb0000060000046ffdb8a331ce21</argument>
    line: 26
    line: 27 </application-desc>
    line: 28 <security>
    line: 29      <all-permissions/>
    line: 30 </security>
    line: 31 <update check="background"/>
    line: 32 </jnlp>               net.sourceforge.jnlp.ParseException: Invalid XML document syntax.
         at net.sourceforge.jnlp.Parser.getRootNode(Parser.java:1243)
         at net.sourceforge.jnlp.JNLPFile.<init>(JNLPFile.java:177)
         at net.sourceforge.jnlp.JNLPFile.<init>(JNLPFile.java:162)
         at net.sourceforge.jnlp.JNLPFile.<init>(JNLPFile.java:148)
         at net.sourceforge.jnlp.runtime.Boot.getFile(Boot.java:267)
         at net.sourceforge.jnlp.runtime.Boot.run(Boot.java:196)
         at net.sourceforge.jnlp.runtime.Boot.run(Boot.java:56)
         at java.security.AccessController.doPrivileged(Native Method)
         at net.sourceforge.jnlp.runtime.Boot.main(Boot.java:173)
    Caused by: net.sourceforge.nanoxml.XMLParseException: XML Parse Exception during parsing of a jnlp element at line 32: Unexpected end of data reached
         at net.sourceforge.nanoxml.XMLElement.unexpectedEndOfData(XMLElement.java:1094)
         at net.sourceforge.nanoxml.XMLElement.readChar(XMLElement.java:877)
         at net.sourceforge.nanoxml.XMLElement.resolveEntity(XMLElement.java:1013)
         at net.sourceforge.nanoxml.XMLElement.scanString(XMLElement.java:658)
         at net.sourceforge.nanoxml.XMLElement.scanElement(XMLElement.java:915)
         at net.sourceforge.nanoxml.XMLElement.parseFromReader(XMLElement.java:512)
         at net.sourceforge.nanoxml.XMLElement.parseFromReader(XMLElement.java:464)
         at net.sourceforge.jnlp.Parser.getRootNode(Parser.java:1239)
         ... 8 more
    Caused by:
    net.sourceforge.nanoxml.XMLParseException: XML Parse Exception during parsing of a jnlp element at line 32: Unexpected end of data reached
         at net.sourceforge.nanoxml.XMLElement.unexpectedEndOfData(XMLElement.java:1094)
         at net.sourceforge.nanoxml.XMLElement.readChar(XMLElement.java:877)
         at net.sourceforge.nanoxml.XMLElement.resolveEntity(XMLElement.java:1013)
         at net.sourceforge.nanoxml.XMLElement.scanString(XMLElement.java:658)
         at net.sourceforge.nanoxml.XMLElement.scanElement(XMLElement.java:915)
         at net.sourceforge.nanoxml.XMLElement.parseFromReader(XMLElement.java:512)
         at net.sourceforge.nanoxml.XMLElement.parseFromReader(XMLElement.java:464)
         at net.sourceforge.jnlp.Parser.getRootNode(Parser.java:1239)
         at net.sourceforge.jnlp.JNLPFile.<init>(JNLPFile.java:177)
         at net.sourceforge.jnlp.JNLPFile.<init>(JNLPFile.java:162)
         at net.sourceforge.jnlp.JNLPFile.<init>(JNLPFile.java:148)
         at net.sourceforge.jnlp.runtime.Boot.getFile(Boot.java:267)
         at net.sourceforge.jnlp.runtime.Boot.run(Boot.java:196)
         at net.sourceforge.jnlp.runtime.Boot.run(Boot.java:56)
         at java.security.AccessController.doPrivileged(Native Method)
         at net.sourceforge.jnlp.runtime.Boot.main(Boot.java:173)
    netx: Unexpected net.sourceforge.jnlp.ParseException: Invalid XML document syntax. at net.sourceforge.jnlp.Parser.getRootNode(Parser.java:1243)

    Both realvnc 4.1.3 and tigervnc 1.1.0.1 work on the Linux side.
    I was able to get the console to work by wiping the contents of the directory I created at the beginning of this exercise and then copying in a fresh OVM 3.0.2 created ovm_rasproxy-ws.jnlp file.
    [gordon@ufo Downloads]$ rm /home/gordon/.icedtea/cache/http/192.168.4.16/ovm/rasproxy/*
    [gordon@ufo Downloads]$ javaws ovm_rasproxy-ws.jnlp
    java.io.EOFException
         at java.io.DataInputStream.readUnsignedShort(DataInputStream.java:340)
         at java.io.DataInputStream.readUTF(DataInputStream.java:589)
         at java.io.DataInputStream.readUTF(DataInputStream.java:564)
         at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:744)
         at sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:55)
         at java.security.KeyStore.load(KeyStore.java:1201)
         at net.sourceforge.jnlp.security.KeyStores.createKeyStoreFromFile(KeyStores.java:358)
         at net.sourceforge.jnlp.security.KeyStores.getKeyStore(KeyStores.java:124)
         at net.sourceforge.jnlp.security.KeyStores.getKeyStore(KeyStores.java:103)
         at net.sourceforge.jnlp.security.KeyStores.getCertKeyStores(KeyStores.java:157)
         at net.sourceforge.jnlp.security.VariableX509TrustManager.<init>(VariableX509TrustManager.java:91)
         at net.sourceforge.jnlp.security.VariableX509TrustManager.getInstance(VariableX509TrustManager.java:395)
         at net.sourceforge.jnlp.runtime.JNLPRuntime.initialize(JNLPRuntime.java:210)
         at net.sourceforge.jnlp.runtime.Boot.run(Boot.java:182)
         at net.sourceforge.jnlp.runtime.Boot.run(Boot.java:56)
         at java.security.AccessController.doPrivileged(Native Method)
         at net.sourceforge.jnlp.runtime.Boot.main(Boot.java:173)
    netx: Unexpected java.io.IOException: /home/gordon/.icedtea/cache/http/192.168.4.16/ovm/rasproxy/ovm_rasproxy-ws.jnlp (No such file or directory) at net.sourceforge.jnlp.JNLPFile.openURL(JNLPFile.java:255)
    [gordon@ufo Downloads]$ cp ovm_rasproxy-ws.jnlp /home/gordon/.icedtea/cache/http/192.168.4.16/ovm/rasproxy/
    [gordon@ufo Downloads]$ javaws ovm_rasproxy-ws.jnlp
    java.io.EOFException
         at java.io.DataInputStream.readUnsignedShort(DataInputStream.java:340)
         at java.io.DataInputStream.readUTF(DataInputStream.java:589)
         at java.io.DataInputStream.readUTF(DataInputStream.java:564)
         at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:744)
         at sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:55)
         at java.security.KeyStore.load(KeyStore.java:1201)
         at net.sourceforge.jnlp.security.KeyStores.createKeyStoreFromFile(KeyStores.java:358)
         at net.sourceforge.jnlp.security.KeyStores.getKeyStore(KeyStores.java:124)
         at net.sourceforge.jnlp.security.KeyStores.getKeyStore(KeyStores.java:103)
         at net.sourceforge.jnlp.security.KeyStores.getCertKeyStores(KeyStores.java:157)
         at net.sourceforge.jnlp.security.VariableX509TrustManager.<init>(VariableX509TrustManager.java:91)
         at net.sourceforge.jnlp.security.VariableX509TrustManager.getInstance(VariableX509TrustManager.java:395)
         at net.sourceforge.jnlp.runtime.JNLPRuntime.initialize(JNLPRuntime.java:210)
         at net.sourceforge.jnlp.runtime.Boot.run(Boot.java:182)
         at net.sourceforge.jnlp.runtime.Boot.run(Boot.java:56)
         at java.security.AccessController.doPrivileged(Native Method)
         at net.sourceforge.jnlp.runtime.Boot.main(Boot.java:173)
    java.io.EOFException
         at java.io.DataInputStream.readUnsignedShort(DataInputStream.java:340)
         at java.io.DataInputStream.readUTF(DataInputStream.java:589)
         at java.io.DataInputStream.readUTF(DataInputStream.java:564)
         at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:744)
         at sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:55)
         at java.security.KeyStore.load(KeyStore.java:1201)
         at net.sourceforge.jnlp.security.KeyStores.createKeyStoreFromFile(KeyStores.java:358)
         at net.sourceforge.jnlp.security.KeyStores.getKeyStore(KeyStores.java:124)
         at net.sourceforge.jnlp.security.KeyStores.getKeyStore(KeyStores.java:103)
         at net.sourceforge.jnlp.security.KeyStores.getCertKeyStores(KeyStores.java:157)
         at net.sourceforge.jnlp.tools.JarSigner.checkTrustedCerts(JarSigner.java:414)
         at net.sourceforge.jnlp.tools.JarSigner.verifyJars(JarSigner.java:269)
         at net.sourceforge.jnlp.runtime.JNLPClassLoader.verifyJars(JNLPClassLoader.java:946)
         at net.sourceforge.jnlp.runtime.JNLPClassLoader.initializeResources(JNLPClassLoader.java:422)
         at net.sourceforge.jnlp.runtime.JNLPClassLoader.<init>(JNLPClassLoader.java:169)
         at net.sourceforge.jnlp.runtime.JNLPClassLoader.getInstance(JNLPClassLoader.java:283)
         at net.sourceforge.jnlp.Launcher.createApplication(Launcher.java:650)
         at net.sourceforge.jnlp.Launcher.launchApplication(Launcher.java:436)
         at net.sourceforge.jnlp.Launcher$TgThread.run(Launcher.java:830)
    Oct 4, 2011 4:13:41 PM com.oracle.ovm.ras.proxy.RasProxyApplet main
    INFO: Server : 192.168.4.16
    Oct 4, 2011 4:13:41 PM com.oracle.ovm.ras.proxy.RasProxyApplet main
    INFO: service id : 003600010004fb0000060000281734f86e6aab47
    Oct 4, 2011 4:13:42 PM com.oracle.ovm.ras.proxy.external.VncViewerLauncherFactory getVncViewerLauncher
    INFO: Os is : linux
    Oct 4, 2011 4:13:53 PM com.oracle.ovm.ras.proxy.ProxyThread setupSSL
    INFO: DONE SSL Handshaking
    The console starts and all is well with the world.
    Downloading the .jnlp file and running it from the command line is obviously not how one would want to start a console, so I am now going to remove /home/gordon/.icedtea/cache/http/* and see what happens.
    First, let me say that did not work. IcedTea does recreate the directory structure /home/gordon/.icedtea/cache/http/192.168.4.16/ovm/rasproxy/, but there is only one file in the directory:
    [gordon@ufo rasproxy]$ cat ovm_rasproxy-ws.jnlp.info
    #automatically generated - do not edit
    #Tue Oct 04 16:43:29 EDT 2011
    last-modified=0
    last-updated=1317761009801
    content-length=883
    If I save the ovm_rasproxy-ws.jnlp file instead of opening it, and copy it into the /home/gordon/.icedtea/cache/http/192.168.4.16/ovm/rasproxy/ directory, the console now starts successfully, with one minor drawback: it will only launch the one VM described by the .jnlp file I downloaded.
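    For reference, the manual workaround above boils down to the following sketch, assuming IcedTea's cache layout of ~/.icedtea/cache/http/&lt;host&gt;/&lt;path&gt; (the host and path here are specific to this setup):

    ```shell
    # Workaround sketch: seed the IcedTea cache with the saved .jnlp
    # before launching javaws. Host and paths are from this thread.
    JNLP=ovm_rasproxy-ws.jnlp
    CACHE="$HOME/.icedtea/cache/http/192.168.4.16/ovm/rasproxy"
    if [ -f "$JNLP" ]; then
      mkdir -p "$CACHE"      # recreate the cache directory netx expects
      cp "$JNLP" "$CACHE/"   # seed it with the downloaded descriptor
      javaws "$JNLP"         # now starts without the IOException
    fi
    ```

    This only papers over whatever is preventing netx from writing the cached copy itself, so each new VM's descriptor would have to be seeded the same way.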
    Help?

  • Speed up Illustrator CC when working with large vector files

    (Mainly) raster files of up to 350 MB run fast in Illustrator CC, while vector files of 10 MB are a pain in the *blieb* (e.g. zooming and panning). When opening such a file, Illustrator seems to freeze at around 95% for a few minutes. Memory usage goes up to 6 GB; processor usage sits at 30-50%.
    Are there ways to speed things up while working with large vector files in Illustrator CC?
    System:
    64 bit Windows 7 enterprise
    Memory: 16 GB
    Processor: Intel Xeon 3.7 GHz (8 threads)
    Graphics: NVIDIA K4000

    Files with large numbers of vector points will strain even the fastest computers, but any speed increase we can get you can save a lot of time.
    Delete any unwanted stray points using Select >> Object >> Stray Points
    Optimize performance | Windows
    Did you draw this yourself, is the file as clean as can be? Are there any repeated paths underneath your art which do not need to be there from live tracing or stock art sites?
    Check the control panel >> programs and features and sort by installed recently and uninstall anything suspicious.
    Sorry, there is no short or single answer to this. As the previous poster said, using layers effectively and working in Outline mode when possible might be the best you can do.

  • Uploading files to website not working with Safari in Windows 7

    Safari version: 5.1.7
    OS: Windows 7
    Hi,
    I am a website owner, and I have a feature on my site where users can upload large (up to 50 MB) music files to my server. The feature works in Firefox on OS X and Windows, and it also works in Safari on OS X, but it is not working in Safari for Windows. Do you have any idea why, and how to fix this? It can't be an issue with my server, as uploads work fine in other browsers and in Safari for OS X.
    If you have any information on how to solve this or things for me to try to rectify this for use with Windows operating system it would be much appreciated.
    Thanks.

    GREAT NEWS!!!!
    I have solved this issue myself with some help from a user on the Microsoft forums. The issue stems from the default MIME type each browser sends: Chrome, IE, and Safari for Windows default to audio/wav. When the MIME type is changed to audio/x-wav, everything works. Firefox defaults to audio/x-wav, and when changed to audio/wav it DOES NOT work.
    Once I added some code to my .php upload page to change the MIME type when audio/wav is detected, everything works GREAT!!!
    So Here is what you need to do: Find your upload page and input this code:
    echo "<p>MIME Type: ".$_FILES["file"]["type"]."</p>";   
    right before your "If/then" statement of file type. For me it was near line 30 in my upload.php page, but I'm sure this is different for everyone.
    This will detect and DISPLAY your browser's default MIME type on the error page when the upload fails. Once you know which MIME type works for your file type, you can change your "if/then" statement to rewrite the detected MIME type to the correct one.
    I don't want to give the code here, because I'm sure it's specific to your file types and your site construction, but this should lead you on the right track.
    ***This is the solution if you are NOT ABLE to upload a certain file type in a certain browser. It comes down to MIME type handling.***
    I hope this helps others like it helped me!!!!!
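    As a hedged sketch of the approach (not the poster's exact code, and normalize_wav_mime is a hypothetical helper name), the rewrite in the upload handler could look like this:

    ```php
    <?php
    // Sketch only: map the Windows-browser default "audio/wav" to the
    // "audio/x-wav" value that the upload's file-type check accepts.
    function normalize_wav_mime($type) {
        return $type === 'audio/wav' ? 'audio/x-wav' : $type;
    }

    // In upload.php, right before the file-type if/then check:
    // $_FILES["file"]["type"] = normalize_wav_mime($_FILES["file"]["type"]);
    ```

    Note that the browser-reported MIME type in $_FILES is client-controlled, so this normalization is a compatibility shim, not a security check.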
