LV 8.2 vs. LV 7.1 and large VI file size

Hi,
I upgraded a large application developed in LV 7.1 to LV 8.2. After fixing some errors, I noticed that a large VI (a VI template, .vit) that was 18 MB in LV 7.1 is only 2.6 MB in LV 8.2!
Editing this VI is now extremely slow, and it's unacceptable. When I save it I can take a coffee break!
Also, opening a reference to this VI takes 35-40 s, while 2 s were enough in LV 7.1.
Finally, I also see a reduction in file size for some other VIs that were large (5-8 MB) in LV 7.1.
Any suggestions about this big problem?
Is it possible that LV 8.2 compresses (like ZIP) large VIs?
Thanks and regards

The extreme difference in file size indicates that LabVIEW is able to "squeeze" your VI down much more than the roughly 50% that zlib typically achieves. LabVIEW 8.0 introduced compression of the VI; see this LAVA thread. If your VI contains highly compressible items (BMPs, large strings of repeating text?), then the CPU must work harder to inflate and deflate the file whenever it is loaded or saved, which makes the development environment slower. Your reported values lead me to believe there is SOMETHING in your VI that could be improved.
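To see why an 18 MB VI could drop to 2.6 MB while less repetitive content barely shrinks, here is a minimal Java sketch using the JDK's java.util.zip.Deflater (the same zlib-style DEFLATE algorithm); the 1 MB buffer sizes are arbitrary choices for illustration:

```java
import java.util.Random;
import java.util.zip.Deflater;

// Rough illustration of why content matters: DEFLATE shrinks repetitive
// data dramatically but barely touches random (incompressible) data.
public class CompressDemo {
    // Returns the deflated size of the input in bytes.
    static int deflatedSize(byte[] input) {
        Deflater deflater = new Deflater();
        deflater.setInput(input);
        deflater.finish();
        byte[] out = new byte[64 * 1024];
        int total = 0;
        while (!deflater.finished()) {
            total += deflater.deflate(out);   // count output, discard contents
        }
        deflater.end();
        return total;
    }

    public static void main(String[] args) {
        byte[] repetitive = new byte[1 << 20];        // 1 MB of zeros
        byte[] random = new byte[1 << 20];            // 1 MB of noise
        new Random(42).nextBytes(random);
        System.out.println("repetitive: " + deflatedSize(repetitive) + " bytes");
        System.out.println("random:     " + deflatedSize(random) + " bytes");
    }
}
```

The repetitive buffer deflates to a tiny fraction of its size, while the random buffer stays near 1 MB; a VI full of such compressible content pays that inflate/deflate cost on every load and save.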
The other solution is to improve the computer's efficiency: start by cleaning up and defragmenting the hard disk, then upgrade memory, and as a last resort upgrade the CPU. LV 8.2 takes more memory; if your physical RAM is not up to par and your disk drive is full, you might be paging to disk a lot, and that would slow you down.
Now is the right time to use %^<%Y-%m-%dT%H:%M:%S%3uZ>T
If you don't hate time zones, you're not a real programmer.
"You are what you don't automate"
Inplaceness is synonymous with insidiousness

Similar Messages

  • Export QuickTime file with new audio and maintain the file size and quality as the original.

    I shot some footage for a client yesterday and ran into an issue. To make a long story short I have QuickTime mov files shot with my Panasonic GH4 that have a buzzing sound in the audio track. I have clean audio from a recorder that can be sync'd. Is there a way for me to do this for the client and deliver them as the same QuickTime file but with the clean audio and keep the file size close to the original and not have quality loss in the image?
    If so I will probably put all of the spanned clips together from each interview and sync the audio before delivery. I am just not sure which codec will give the client the same quality footage compared to the originals and not have a massive difference in the overall file size. They did request that they be Quicktime MOV files though so that is a must.
    I don't see this as an option in the codecs in the export settings in PP, but is there a way to export as ProRes or another Mac-native codec that will save them from having to convert it on their end? I am on a PC with Adobe CS5.5, so I am not too familiar with Macs, but from what I understand they would need to convert the files if I sent them straight out of the camera.
    I found some related search results, but they pertained to "Best quality" for export. I am not sure how the various options work, but I don't want to create files that are considerably larger, just not lower quality.
    If anyone has experience with this it would be greatly appreciated.
    Thanks,
    Cole

    Here's the workflow: I imported the video footage into iMovie '08 and did my edits. Then I exported it (share) to my desktop with compression set @ 740 X 480. Then I used QuickTime Pro to fix the audio track. The file plays perfectly with both audio tracks working. It's a QuickTime file (.mov).
    I hope this jars any replies as to why the file when uploaded to my iWeb gallery drops the second audio track.
    Hmm,
    Jack

  • NIO ByteBuffer and memory-mapped file size limitation

    I have a question/issue regarding ByteBuffer and memory-mapped file size limitations. I recently started using NIO FileChannels and ByteBuffers to store and process buffers of binary data. Until now, the maximum individual ByteBuffer/memory-mapped file size I have needed to process was around 80MB.
    However, I need to now begin processing larger buffers of binary data from a new source. Initial testing with buffer sizes above 100MB result in IOExceptions (java.lang.OutOfMemoryError: Map failed).
    I am using 32-bit Windows XP; 2 GB of memory (typically 1.3 to 1.5 GB free); Java version 1.6.0_03; with -Xmx set to 1280m. Decreasing the Java max heap size to 768m does let me memory-map larger buffers, but never bigger than roughly 500MB. However, the application that uses this code contains other components that require -Xmx to be set to 1280.
    The following simple code segment executed by itself will produce the IOException for me when executed using -Xmx1280m. If I use -Xmx768m, I can increase the buffer size up to around 300MB, but never to a size that I would think I could map.
    import java.io.RandomAccessFile;
    import java.nio.ByteBuffer;
    import java.nio.channels.FileChannel;
    import java.util.UUID;

    try {
        String mapFile = "C:/temp/" + UUID.randomUUID().toString() + ".tmp";
        FileChannel rwChan = new RandomAccessFile( mapFile, "rw" ).getChannel();
        ByteBuffer byteBuffer = rwChan.map( FileChannel.MapMode.READ_WRITE,
                                            0, 100000000 );
        rwChan.close();
    } catch( Exception e ) {
        e.printStackTrace();
    }
    I am hoping that someone can shed some light on the factors that affect the amount of data that may be memory mapped to/in a file at one time. I have investigated this for some time now and based on my understanding of how memory mapped files are supposed to work, I would think that I could map ByteBuffers to files larger than 500MB. I believe that address space plays a role, but I admittedly am no OS address space expert.
    Thanks in advance for any input.
    Regards- KJ

    See the workaround in http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4724038
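Besides the workaround in that bug report, the other common fix on a 32-bit JVM is to avoid one giant mapping altogether and map the file one window at a time, so no single mapping needs a huge contiguous run of virtual address space. A rough sketch of that idea; the 64 MB window size is an arbitrary choice for illustration:

```java
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

// Processes a file of any size while mapping at most WINDOW bytes at once,
// instead of one huge MappedByteBuffer covering the whole file.
public class ChunkedMap {
    static final long WINDOW = 64L * 1024 * 1024;

    // Example processing pass: sums all bytes of the file window by window.
    static long sumBytes(File f) throws IOException {
        long sum = 0;
        try (FileChannel ch = new RandomAccessFile(f, "r").getChannel()) {
            long size = ch.size();
            for (long pos = 0; pos < size; pos += WINDOW) {
                long len = Math.min(WINDOW, size - pos);
                MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, pos, len);
                while (buf.hasRemaining()) {
                    sum += buf.get() & 0xFF;   // unsigned byte value
                }
                // Each mapping is released only when its buffer is garbage
                // collected; there is no supported API to unmap it eagerly,
                // which is what the bug report above discusses.
            }
        }
        return sum;
    }
}
```

Because old windows are only reclaimed at GC time, address space can still fill up temporarily, but each individual map call stays small enough to succeed.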

  • Do i need acrobat pro to open a pdf i created in quark and reduce the file size?

    do i need acrobat pro to open a pdf i created in quark and reduce the file size?

    I'll try that! Thank you.

  • Can I resize photos and reduce JPEG file sizes by using Photoshop Elements?

    Can I resize photos and reduce JPEG file sizes by using Photoshop Elements?

    Go to Image>Resize>Image Size.
    For web posting, go to File>Save for Web.

  • FILE and FTP Adapter file size limit

    Hi,
    Oracle SOA Suite ESB related:
    I see that there is a file size limit of 7 MB for transfers using the File and FTP adapters, and that debatching can be used to overcome this limit. I also see that debatching is possible only for structured files.
    1) What can be done to transfer unstructured files larger than 7MB from one server to the other using FTP adapter?
    2) For structured files, could someone help me in debatching a file with the following structure.
    000|SEC-US-MF|1234|POPOC|679
    100|PO_226312|1234|7130667
    200|PO_226312|1234|Line_id_1
    300|Line_id_1|1234|Location_ID_1
    400|Location_ID_1|1234|Dist_ID_1
    100|PO_226355|1234|7136890
    200|PO_226355|1234|Line_id_2
    300|Line_id_2|1234|Location_ID_2
    400|Location_ID_2|1234|Dist_ID_2
    100|PO_226355|1234|7136890
    200|PO_226355|1234|Line_id_N
    300|Line_id_N|1234|Location_ID_N
    400|Location_ID_N|1234|Dist_ID_N
    999|SSS|1234|88|158
    I need the complete data in a single file at the destination for each file in the source. If the destination ends up with as many files as there are batches, I need each output file to have the following structure:
    000|SEC-US-MF|1234|POPOC|679
    100|PO_226312|1234|7130667
    200|PO_226312|1234|Line_id_1
    300|Line_id_1|1234|Location_ID_1
    400|Location_ID_1|1234|Dist_ID_1
    999|SSS|1234|88|158
    Thanks in advance,
    RV
    Edited by: user10236075 on May 25, 2009 4:12 PM
    Edited by: user10236075 on May 25, 2009 4:14 PM

    Ok Here are the steps
    1. Create an inbound file adapter as you normally would. The schema is opaque, set the polling as required.
    2. Create an outbound file adapter as you normally would, it doesn't really matter what xsd you use as you will modify the wsdl manually.
    3. Create a xsd that will read your file. This would typically be the xsd you would use for the inbound adapter. I call this address-csv.xsd.
    4. Create a xsd that is the desired output. This would typically be the xsd you would use for the outbound adapter. I have called this address-fixed-length.xsd. So I want to map csv to fixed length format.
    5. Create the xslt that will map between the 2 xsd. Do this in JDev, select the BPEL project, right-click -> New -> General -> XSL Map
    6. Edit the outbound file partner link WSDL and set the jca operations as the doc specifies; this is my example.
    <jca:binding  />
            <operation name="MoveWithXlate">
          <jca:operation
              InteractionSpec="oracle.tip.adapter.file.outbound.FileIoInteractionSpec"
              SourcePhysicalDirectory="foo1"
              SourceFileName="bar1"
              TargetPhysicalDirectory="C:\JDevOOW\jdev\FileIoOperationApps\MoveHugeFileWithXlate\out"
              TargetFileName="purchase_fixed.txt"
              SourceSchema="address-csv.xsd" 
              SourceSchemaRoot ="Root-Element"
              SourceType="native"
              TargetSchema="address-fixedLength.xsd" 
              TargetSchemaRoot ="Root-Element"
              TargetType="native"
              Xsl="addr1Toaddr2.xsl"
              Type="MOVE">
          </jca:operation>
    7. Edit the outbound header to look as follows
        <types>
            <schema attributeFormDefault="qualified" elementFormDefault="qualified"
                    targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/file/"
                    xmlns="http://www.w3.org/2001/XMLSchema"
                    xmlns:FILEAPP="http://xmlns.oracle.com/pcbpel/adapter/file/">
                <element name="OutboundFileHeaderType">
                    <complexType>
                        <sequence>
                            <element name="fileName" type="string"/>
                            <element name="sourceDirectory" type="string"/>
                            <element name="sourceFileName" type="string"/>
                            <element name="targetDirectory" type="string"/>
                            <element name="targetFileName" type="string"/>                       
                        </sequence>
                    </complexType>
                </element> 
            </schema>
        </types>
    8. The last trick is to have an assign between the inbound header and the outbound header partner links that copies the headers. You only need to copy the sourceDirectory and sourceFileName.
        <assign name="Assign_Headers">
          <copy>
            <from variable="inboundHeader" part="inboundHeader"
                  query="/ns2:InboundFileHeaderType/ns2:fileName"/>
            <to variable="outboundHeader" part="outboundHeader"
                query="/ns2:OutboundFileHeaderType/ns2:sourceFileName"/>
          </copy>
          <copy>
            <from variable="inboundHeader" part="inboundHeader"
                  query="/ns2:InboundFileHeaderType/ns2:directory"/>
            <to variable="outboundHeader" part="outboundHeader"
                query="/ns2:OutboundFileHeaderType/ns2:sourceDirectory"/>
          </copy>
        </assign>
    You should be good to go. If you just want pass-through, then you don't need the native format: set the type to opaque, with no XSLT.
    cheers
    James

  • JTable and large CSV files

    I have been looking for a clear answer (with source code) to a problem commonly asked in the forums. The question is "how do I display a large comma delimited (CSV) file in a JTable without encountering a Java heap space error?" One solution is described at http://forum.java.sun.com/thread.jspa?forumID=57&threadID=741313 but no source code is provided. Can anyone provide some example code as to how this solution works? It is not clear to me how the getValueAt(r, c) method can be used to get the (r+1)th row if only r rows are in the TableModel. I greatly appreciate any help.

    Perhaps if I posted my code, I might get a little help. First, here is my class that extends AbstractTableModel:
    public class DataTableModel extends AbstractTableModel {
         private static final long serialVersionUID = 1L;
         Object[][] data;
         DataColumnFormat[] colFormat;
         int nrows, ncols, totalRows;
         public DataTableModel(Object[][] aData, DataColumnFormat[] aColFormat, int aTotalNumberOfRows){
              data = aData;
              colFormat = aColFormat;
              nrows = data.length;
              ncols = data[0].length;
              // number of rows in the entire data file.
              // This will be larger than nrows if the data file has more than 1000 rows
              totalRows = aTotalNumberOfRows;
         }
         public int getRowCount(){
              return nrows;
         }
         public int getColumnCount(){
              return ncols;
         }
         public String getColumnName(int aColumnIndex){
              return colFormat[aColumnIndex].getName();
         }
         public Object getValueAt(int r, int c){
              if(colFormat[c].isDouble()){
                   return data[r][c];
              }
              return data[r][c];
         }
         public boolean isCellEditable(int nRow, int nCol){
              return true;
         }
         @SuppressWarnings("unchecked")
         public Class getColumnClass(int c) {
              return getValueAt(0, c).getClass();
         }
         protected void updateData(){
              // replace values in the data[][] object with new rows from the large data file
         }
    }
    Suppose data = new Object[1000][100] but my CSV file has 5000000000 lines (to exaggerate). By my understanding, getValueAt(r, c) could not go beyond r=1000 and c=100. So how would I update data with the next 1000 lines by way of the getValueAt(r, c) method? Moreover, how would I implement this so that the table appears to scroll continuously? I know someone has a solution to this problem. Thanks.
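The key point in the linked thread is that getValueAt(r, c) receives the absolute row index, not an index into the 1000-row array, so the model can translate it into a page and load that page on demand while getRowCount() reports the file's total row count. A rough sketch of that idea; readPage() is a hypothetical placeholder for whatever CSV-reading code you already have, not a real API:

```java
import javax.swing.table.AbstractTableModel;

// getRowCount() reports the TOTAL row count of the file, so the JTable
// scrollbar spans the whole file, while only one page of rows is kept in
// memory. getValueAt() maps the absolute row index to a page and (re)loads
// that page whenever the view scrolls past it.
public abstract class PagedCsvModel extends AbstractTableModel {
    static final int PAGE_SIZE = 1000;      // matches the 1000-row buffer above
    private Object[][] page = new Object[0][];
    private int pageStart = -1;             // absolute row index of page[0]
    private final int totalRows, columns;

    public PagedCsvModel(int totalRows, int columns) {
        this.totalRows = totalRows;
        this.columns = columns;
    }

    // Hypothetical: read rowCount rows of the CSV file starting at firstRow.
    protected abstract Object[][] readPage(int firstRow, int rowCount);

    @Override public int getRowCount()    { return totalRows; }   // whole file
    @Override public int getColumnCount() { return columns; }

    @Override public Object getValueAt(int r, int c) {
        if (r < pageStart || r >= pageStart + page.length) {
            pageStart = (r / PAGE_SIZE) * PAGE_SIZE;   // page containing row r
            page = readPage(pageStart, Math.min(PAGE_SIZE, totalRows - pageStart));
        }
        return page[r - pageStart][c];
    }
}
```

JTable only calls getValueAt() for the rows currently visible, so scrolling appears continuous at the cost of one disk read per page boundary; a real implementation would cache a page or two on each side to keep scrolling smooth.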

  • BUG: "Save and compact" increases file size by 1KB each time?

    Flash CS4
    I noticed that when I do "save and compact" on a .fla file, it causes the file size to increase by 1KB each time--even when no changes were made.  Can this be fixed?

    Yup, it works perfectly in CS3. The problem you will have now is that you don't have CS3 installed. Even if you save your files in CS3 format, you cannot (as far as I know) "save and compact" as CS3; Flash will prompt that the file must be converted from the older CS3 version to CS4, and you end up with the same problem. Perhaps there is a setting to force CS4 to save and compact in the CS3 version, but I doubt it. The only way I know to solve it is to save the file in CS3 format (you will lose some XMP file-info metadata, though I'm not sure that matters), close CS4, open the file in CS3, and "save and compact" it there. This is definitely a bug where some information is probably being duplicated in the file's info or history. If you are lucky, an Adobe employee may be able to give you a solution or at least confirm this bug.

  • Quicktime and large mpeg2 files

    Can QuickTime export large (5 or 6 GB) MPEG-2 files?
    I put a DVD (non-commercial, non-copyrighted) onto my hard drive, and it plays back fine in other apps like Media Player, but QuickTime can't play or export it. It works for about the first 15 minutes of the video, then the picture freezes though the sound continues. Also, the data size is messed up and is listed as a negative number (such as "-2.34257934359").
    thanks
    dell   Windows XP  

    QuickTime doesn't export MPEG-2 and has a limit on file length. This freeware will play large MPEG-2 files as long as you have purchased the Apple MPEG-2 Playback Component ($20 from the Apple Store). The component alone doesn't play the longer files, but the freeware apparently does once the component is installed: MPEG Streamclip.
    With the component and Streamclip you can export both sound and video to other formats, but you can't export to MPEG-2. Streamclip also allows some editing of MPEG-2 files.

  • PDFs rendered on AIX and Linux different file sizes

    Hello,
    If I render a PDF on AIX, the file size is around 400 KB. If I render the same PDF on Linux, the file size is around 1.2 MB. I am using the same XDP and XML files, with LiveCycle Forms v7.
    Any ideas?
    Thanks, Greg

    Just found out that the file size is larger on Linux when I apply usage rights with Reader Extensions.

  • With Premiere elements 11, when Publish/Share (saving) a finished video, selecting H.264  .mov file type, what are the recommended Codec and Aspect settings for a crisp video and a small file size?

    With Premiere Elements 11, when Publish/Share (saving) a finished video, selecting H.264  .mov file type, what are the recommended Aspect and Codec settings , for a crisp video with a small file size?

    Gary Krouse
    We will customize a very detailed reply, but
    (a) What are the properties of your source media?
    (b) What did you or the program set as the project preset to match the properties of the source media?
    (c) What specific requirements do you have for this H.264.mov export file?
    File size is a compromise among bitrate, duration, and quality.
    Thanks.
    ATR

  • Exporting Comp to Flash Animation (and Keeping the File Size Small)

    I have an animation I have made in AE which isn't terribly complicated.
    Maybe 450 X 250, mostly blank background. The artwork was done in Illustrator. There are about a dozen layers, all AI artwork, and the animation is about two seconds.
    When I export the animation as an SWF file, the file size is 1.2MB. What do I need to do to make the file size a lot smaller?
    I am not used to making animations for Flash. My AE animations are usually meant for video applications, but someone decided they want this animation to go on their web site instead.
    Thanks!

    If it is a frame by frame animation try importing the MOV file into Flash rather than exporting from AE.
    Not sure if this will work but it might.  When you publish the file in Flash you will get a resulting SWF file that is created automatically.  See if it is smaller.

  • CS4 and Raw- small file size

    I have raw files that are anywhere from 7-15 MB. They open as 10x15 inches. My JPEGs open at much larger sizes - about 50+ inches. I have downloaded the latest Adobe Camera Raw plug-in updates for CS4. Using a Canon 40D. I've searched and searched online for an explanation. Thank you

    I think it just has to do with the resolution settings in the files.
    They are the same pixel size, but the resolution is set differently.
    For example, the camera raw files are probably around 240 ppi and the JPEGs around 72 ppi. Document size is just pixels divided by resolution: the 40D's 3888-pixel width works out to 3888 / 72 = 54 inches at 72 ppi (your "50+ inches"), but only about 16 inches at 240 ppi.
    If you open a JPEG and a camera raw file and set the resolution of the JPEG to match the camera raw file, then they should
    be the same document size.
    Find the resolution of the camera raw file under Image>Image Size.
    Open one of the JPEGs, and in Image>Image Size uncheck Resample Image and type in the same resolution as the raw file,
    and see if the document size is the same.

  • Adding Click Tags to Flash and also reducing file size?

    Hi,
    I was wondering if anyone knows how to add a Click Tag to a Flash movie?
    I have the file finished but they require Click Tags added to each file.
    Also, my file is currently around 100 KB and they need it under 30 KB (last time they got someone to reduce my file from 110 KB to 30 KB), so I'm sure this can be done, but I'm not quite sure how. I am using Flash CS4.
    thanks,
    Paul

    I would recommend going to 15 if you are going to play with frame rate.
    If you want to know how data rate affects the size:
    double-click the setting for your movie,
    look at the inspector;
    under the Summary tab, the compressor shows you how big the file will be at a given data rate, etc.
    Changing the data rate will be reflected here.
    When a client doesn't want a file over 15 MB, I use this tool to understand how data rate and other settings affect the overall file. This works for H.264 but may not be the same for every setting.
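The arithmetic behind that inspector readout is simple enough to sketch: estimated file size is just combined data rate times duration. This is a back-of-the-envelope estimate that ignores container overhead, and the example bitrates are made up, not tied to any particular codec:

```java
// Estimated file size = (video bitrate + audio bitrate) x duration.
public class SizeEstimate {
    // Bitrates in kilobits per second, duration in seconds; returns megabytes.
    static double estimateMB(double videoKbps, double audioKbps, double seconds) {
        double kilobits = (videoKbps + audioKbps) * seconds;
        return kilobits / 8.0 / 1024.0;   // kilobits -> kilobytes -> megabytes
    }

    public static void main(String[] args) {
        // 60 s of 1800 kbps video plus 128 kbps audio is roughly 14 MB
        System.out.printf("%.1f MB%n", estimateMB(1800, 128, 60));
    }
}
```

Running the estimate in reverse also tells you the maximum bitrate that fits a size budget, which is how you'd hit a "nothing over 15 MB" requirement.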

  • Load and Read XML file size more than 4GB

    Hi All
    My environment is Oracle 10.2.0.4 on Solaris, and I have PL/SQL processes that work with XML files as detailed below:
    1. I read XML file over HTTP port into XMLTYPE column in table.
    2. I read value no.1 from table and extract to insert into another table
    On the test DB everything works, but I get the error below when I use the production XML file:
         ORA-31186: Document contains too many nodes
    The current XML is about 100 MB, but the procedure must support XML files larger than 4 GB in the future.
    Belows are some part of my code for your info.
    1. Read XML by line into variable and insert into table
    LOOP
    UTL_HTTP.read_text(http_resp, v_resptext, 32767);
    DBMS_LOB.writeappend (v_clob, LENGTH(v_resptext), v_resptext);
        END LOOP;
        INSERT INTO XMLTAB VALUES (XMLTYPE(v_clob));
    2. Read cell value from XML column and extract to insert into another table
    DECLARE
        CURSOR c_xml IS
            SELECT trim(y.cvalue)
            FROM XMLTAB xt,
                 XMLTable('/Table/Rows/Cells/Cell' PASSING xt.XMLDoc
                          COLUMNS cvalue VARCHAR2(50) PATH '/') y;
    BEGIN
        OPEN c_xml;
        LOOP
            FETCH c_xml INTO v_TempValue;
            EXIT WHEN c_xml%NOTFOUND;
            <Generate insert statement into another table>
        END LOOP;
        CLOSE c_xml;
    END;
    And one more problem: performance. When the XML file is big, the first step (loading the XML content into the XMLTYPE column) is slow.
    Could you please suggest any solution for reading large XML files and improving performance?
    Thank you in advance.
    Hiko      

    See Mark Drake's (Product Manager Oracle XMLDB, Oracle US) response in this old post: ORA-31167: 64k size limit for XML node
    The "in a future release" reference, means that this boundary 64K / node issue, was lifted in 11g and onwards...
    So first of all, if only for the performance improvements, I would strongly suggest upgrading to a database version that is supported by Oracle (see My Oracle Support). In short, Oracle 10.2.x was in extended support until summer 2013, if I am not mistaken, and is currently no longer supported.
    If you are able to upgrade, please use the much, much better performing XMLType Securefile Binary XML storage option instead of the XMLType (Basicfile) CLOB storage option.
    HTH
