TDMS Viewer + large files issue

I built a VI that continuously stores large [up to 50x50] 2D arrays; every new 2D array is stored as a new group in a TDMS file.
The VI streams the data out fine; however, when it finishes and the TDMS Viewer pops up, LabVIEW hangs and [seemingly] does nothing. The TDMS files that the viewer tries to display can easily contain over 10^6 2D arrays if the measurement has been running for days, for example.
Is there a clever way to make the TDMS Viewer display my file, or at least, say, a quarter of it?
Thanks,
Krivan

Also, if I use the attached code and try to open such a large file, LabVIEW hangs [all open LabVIEW windows]; the program never enters the while loop because the TDMS Open VIs do nothing.
However, if I run this program with a very small TDMS file [~300 kB] containing only a couple of thousand matrices, it runs without any problem.
I haven't tried the defragmenter so far, because it was reported here to be inadvisable.
I also tried to read only a small portion of the files by wiring 1000 to the TDMS Read VI, but [in the case of the large files] the problem remains the same: the operation never gets past the TDMS Open VIs.
Is there a clever way to handle large TDMS files?
Thanks,
Krivan
Attachments:
read_average_TDMS_difference_plots.vi 22 KB
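
For reference, here is a minimal sketch of the partial-read idea outside LabVIEW, using the npTDMS Python package; the file name, group limit, and the exact streaming calls are assumptions based on npTDMS's documented API, not part of the original program. The same principle applies inside LabVIEW: open the file once and read each group with an explicit offset/count wired to TDMS Read, rather than letting the TDMS Viewer try to load everything.

    # Minimal sketch: inspect only part of a huge TDMS file without loading
    # it all into memory. File name and group limit are placeholders.
    from nptdms import TdmsFile

    PATH = "measurement.tdms"   # hypothetical file written by the VI
    MAX_GROUPS = 1000           # look at the first N groups only

    # TdmsFile.open() reads metadata but defers channel data, so the whole
    # multi-gigabyte file is never pulled into memory at once.
    with TdmsFile.open(PATH) as tdms_file:
        for i, group in enumerate(tdms_file.groups()):
            if i >= MAX_GROUPS:
                break
            for channel in group.channels():
                data = channel[:]   # loads just this channel's samples
                print(group.name, channel.name, len(data))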

Similar Messages

  • Unable to View Large Files

    We're having a recurring issue where users start getting 'cannot access the file' error messages when trying to view documents in document libraries. It starts with the larger files (over 10 MB) and eventually spreads to all documents in all libraries. On the client side it looks as if the download is being interrupted or timing out after about 10 seconds (though the limit is set at 6 minutes).
    The only thing that appears to fix the issue is a complete reboot of the server, which is something we're not keen on doing multiple times a day.
    No error messages or warnings in the server logs and we don't appear to be hitting any thresholds. 
    Any ideas?

    It has been some time since I had to tackle this problem, but I think my article at
    http://www.loisandclark.eu/Pages/largefiles.aspx will provide some insights.
    Kind regards,
    Margriet Bruggeman
    Lois & Clark IT Services
    web site: http://www.loisandclark.eu
    blog: http://www.sharepointdragons.com

  • View large file

    How can I view large text files? Is there any solution for this problem?

    I meant a very large file, where only part of the file can fit in memory at a time.
    How can I implement such scrolling?
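
    One common approach, sketched below in plain Python (the file name and page size are placeholders), is to keep only a fixed-size window of the file in memory and seek to a new byte offset whenever the user scrolls, so the viewer never loads the whole file. A real viewer would also track line boundaries, but the seek/read pattern is the core of it.

        # Minimal sketch of "paged" viewing: keep one window of a huge text
        # file in memory and seek to a byte offset when the user scrolls.
        # File name and page size are placeholders.
        PAGE_SIZE = 64 * 1024  # bytes per displayed page

        def read_page(path, page_index):
            """Return one page of text, reading only PAGE_SIZE bytes."""
            with open(path, "rb") as f:
                f.seek(page_index * PAGE_SIZE)
                chunk = f.read(PAGE_SIZE)
            # Decode leniently; a page boundary may split a multi-byte char.
            return chunk.decode("utf-8", errors="replace")

        if __name__ == "__main__":
            print(read_page("huge_log.txt", 0))   # first page
            print(read_page("huge_log.txt", 10))  # page shown after scrolling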

  • Stor.E TV+ Samba Large Files Issue

    Hi,
    Using Samba I can copy small files to my Stor.e TV+, but I cannot copy files larger than about 500 MB. Is there any software/BIOS update that resolves this? I can't see any available downloads on the support pages for this device. Can we configure the Samba share ourselves at all?
    I need to be able to copy large files over Samba, to avoid having to attach the device locally to the PC over USB each time I want to copy my large digital files to the media player.
    Thanks

    Hi
    I think this could be because the energy-saving settings of the computer or the NAS have interrupted the connection.
    If the energy-saving settings of your laptop or NAS are active, they can go into an idle state or sleep mode even while they are being accessed over the network.
    I would recommend adjusting the settings of the media source to prevent it from going to sleep while transferring files over the network.
    You could also try different connection modes:
    - LAN setup
    - WLAN setup (ad-hoc connection)
    - PPPoE setup (if your device is connected to the network via DSL)
    I think it would be interesting to know which connection mode is affected.
    Good luck mate

  • Acrobat 9 Large File Issues?

    I work with very large Excel files that I need to turn into PDFs. Very large meaning 30,000 to 50,000 rows in the spreadsheet; once turned into a PDF it is 586 pages long. I had Adobe Acrobat 9 Standard and began to experience issues. If I was in the Excel spreadsheet and clicked Acrobat > Create PDF, the process would take 10-15 minutes and the resulting PDF was not correct (in Excel's page break preview, the last column was shown on a new page; I would drag the line to include the last column and save the file that way, but the PDF created would still have that last column as a separate page, thus creating over 1,000 pages in the PDF). Whatever I did, I could not get it to work. However, if I selected Print and chose Adobe Acrobat as the printer, it would create a perfect-looking PDF in about a minute. Much faster, and it looked correct. The issue I found, however, was that when searching for words in that PDF using the PDF search, some of the words were findable and some it reported could not be found, even though the words were in fact in the PDF.
    I thought that perhaps I was beyond the capabilities of Acrobat 9 Standard, so I installed Acrobat 9 Professional. Now I can in fact create a PDF directly from Excel. It takes about 5 minutes, but it completes and looks good. When I search for the words, it now finds all of them. However, as a test, I chose the print method, which again took less than a minute and produced a perfect-looking PDF; when doing the word searches again, it did not find some of the words.
    Does anyone have any idea why? I would much prefer to create the PDF using Print because it is so much faster, but I need search to find all the words.

    Are those poorly-searchable PDFs confined to Excel source documents, or does the problem also occur when starting from other Office apps such as Word?
    FWIW, I once saw a web review of a free PDF writer utility that gave searchable PDFs from word processor apps but not from Excel. Maybe there are some pointers there?
    http://www.techsupportalert.com/best-free-pdf-writer.htm
    Just a wild-assed guess (I don't even have Pro 9 or Office 2007): do those missing words in the poorly-searchable PDFs contain ligatures, where character strings such as 'fi' and 'fl' are replaced by single, special characters? Such words might remain invisible to your PDF text search because you are typing the plain-vanilla text version into the Acrobat search box.
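
    The ligature guess is easy to test if you can get the extracted text of one of the problem PDFs as a string. The short Python sketch below (the sample string is made up) shows how single-character ligatures defeat a literal search and how Unicode NFKC normalization expands them back to plain 'fi'/'fl' before searching:

        # Sketch of the ligature hypothesis: text from a PDF may contain
        # single-character ligatures (U+FB01 "fi", U+FB02 "fl"), so a literal
        # search misses those words. NFKC normalization expands them.
        import unicodedata

        def contains_word(extracted_text, word):
            normalized = unicodedata.normalize("NFKC", extracted_text)
            return word.lower() in normalized.lower()

        sample = "work\ufb02ow de\ufb01nition"   # "workflow definition" with ligatures
        print("flow" in sample)                   # False: literal search misses it
        print(contains_word(sample, "flow"))      # True after normalization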

  • Photoshop Elements 7 saving large files issue

    I use Photoshop Elements 7, and I recently bought a new camera (24 megapixels). PSE 7 is not happy; I am having lots of issues saving my work. Is anyone else having these issues?

    DJ, my computer runs Windows 8; below are the system requirements for PSE 12 - it looks like it will run in 64-bit. Thanks for pointing me in the right direction!
    No, you have seriously misunderstood what I said. PSE will not run as a 64-bit application under Windows, even though you have a 64-bit version of Windows. It will run as a 32-bit application under 64-bit Windows, giving it a maximum of approximately 3 GB of memory to work with. I don't know whether that will make your problems go away or not. Please download the 30-day free trial of PSE 12 and see whether your problems go away before you make a purchase.
    From the system requirements
    all other applications run native on 32-bit operating systems and in 32-bit compatibility mode on 64-bit operating systems

  • Streaming large files issue (flash streaming media server)

    We are streaming a one-hour F4V from our streaming media server (3.5.2), and for some reason it sees our one-hour video as being 10 hours long. We have tons of other videos and never ran into this problem with any of the other files. This is the only file that exceeds one hour. Any ideas? This occurs in the default player from the streaming server.

    Also, before anyone asks: we are using RTMP, and the file type is F4V, not FLV.
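
    One thing worth checking is whether the F4V container itself carries a bogus duration, since the player generally trusts the file's metadata. The sketch below reads that metadata with ffprobe driven from Python; ffprobe is a separate (non-Adobe) tool that must be installed, and the file name is a placeholder. If the container also reports roughly 10 hours, re-exporting or re-muxing the file is more likely to help than changing server settings.

        # Sketch: print the duration stored in the F4V container metadata.
        # Requires ffprobe on PATH; the file name is a placeholder.
        import subprocess

        SRC = "lecture_one_hour.f4v"

        result = subprocess.run(
            ["ffprobe", "-v", "error",
             "-show_entries", "format=duration",
             "-of", "default=noprint_wrappers=1:nokey=1",
             SRC],
            capture_output=True, text=True, check=True)

        print(f"Container-reported duration: {float(result.stdout)} seconds")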

  • WRT350N Large File Issue

    I am connecting wirelessly to my WRT350N using the WPC300N wireless card, on XP Pro. If I try to back up my laptop, either using the USB storage connection or using a gigabit NAS, the wireless connection either degrades to 1 Mbps or drops off completely, whether I use a firewall or not.
    If I connect using a slower NAS, this does not happen. Any idea what the problem may be, or whether there is a setting that can be adjusted?
    Thanks.


  • Why does the Reduced Size PDF command create larger files?

    I was a happy user of Adobe Acrobat version 7 until Adobe ceased support for that version and I was unable to print and do other activities since my last Mac OS 10.6.8 update, so I was forced to upgrade to Adobe Acrobat X, version 10.0.0. I had hoped to be able to produce Adobe Acrobat 3D files, but found out later that Adobe does not do that anymore and I would have to buy it as a plugin from Tetra 4D for another $399.00, oh well. I then wanted to reduce the file size of some larger PDFs I had edited. It took a while to locate the command, now in File > Save > Reduced Size PDF; once I ran the command on my file, it actually increased its size!! Why did this happen? Should I continue to use my disabled, Adobe-unsupported version 7 to get the superior performance I once had? This issue was mentioned in an earlier thread that started out asking where to locate this command, but the resulting larger-file issue remained unanswered. I am currently unhappy with my purchase. Will Adobe offer a free upgrade to fix this?

    I agree with Peter. Use Pages '09, but keep a document migration strategy in place. When you create a document in Pages '09, export it to Word .doc as well. Export any current content out of Pages v5.2.2, and then stop using it. That means no opening documents by double-clicking them.
    Pages v5 is not an evolution of Pages '09; it is an incomplete rewrite, and it uses a different internal document architecture than Pages '09 did. This internal architecture is to blame for the export-translation bloat into Word's different document architecture. It does not matter whether you type a single character or multiple pages; you will get roughly a 500 KB .docx file.
    Opening and resaving this monster .docx file in MS Word (Office for Mac 2011), LibreOffice, or TextEdit will result in a very small .docx footprint.

  • Problems viewing large tif files in preview

    I'm having trouble viewing large TIFF files in Preview. The TIFF files are on a CD-ROM; when I click on a file to open it, Preview opens only the first 250 pages. The file has approximately 3,000 pages. How can I get Preview to open the rest of the pages?
    Thanks for any suggestions.
    mac mini   Mac OS X (10.4.6)  

    No trick - I didn't create the CD-ROM, but it only has 3 TIFF files with approximately 3,000 pages each, not 3,000 large TIFF files, plus several smaller PDF files, which aren't giving me any problems.
    I don't know whether they're compressed, but I still can't get more than the first 250 pages to open, even after copying the file to my desktop. If anyone has any other ideas, I'd much appreciate it.
    mac mini   Mac OS X (10.4.6)  
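
    As a workaround outside Preview, a multi-page TIFF can be walked frame by frame with the Pillow library; the sketch below (file name, starting page, and page count are placeholders) exports a handful of the pages beyond the first 250 as individual PNG images.

        # Minimal sketch: walk a multi-page TIFF with Pillow and export the
        # pages Preview never showed. File names are placeholders.
        from PIL import Image

        SRC = "scanned_document.tif"
        START_PAGE = 250            # first page that failed to open in Preview

        with Image.open(SRC) as tif:
            total = tif.n_frames    # number of pages in the multi-page TIFF
            for page in range(START_PAGE, min(START_PAGE + 10, total)):
                tif.seek(page)                      # jump to that frame
                tif.save(f"page_{page:04d}.png")    # export it as one image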

  • Help with Aperture/T2i/iMovie/large file (>2 GB) issue.

    I am having an issue with a very large file (3.99 GB) that I shot with my Rebel T2i. The file imports fine into Aperture, but then it isn't recognized at all by iMovie. I found out that iMovie can only handle files that are 2 GB or smaller.
    So now I am trying to figure out how to chop my mega-file neatly into a pair of 2 GB halves. When I use the trim function, that does not seem to do the trick; this may be a case of Aperture's non-destructive nature actually working against me.
    Does anyone have a solution for this? My intuition suggests that this may be a job for QuickTime Pro -- but I wasn't sure how that works now that we all have QuickTime X.
    Much appreciated.

    The file may well be in the wrong format; can you tell us more about it? See this.
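
    If the clip really just needs to be cut in two, one option outside Aperture, iMovie, and QuickTime is ffmpeg's stream copy, which splits the file without re-encoding. This is a different tool from anything suggested above; the sketch below drives it from Python, and the file name and midpoint timestamp are placeholders. Stream-copy cuts land on keyframes, so the split point is approximate.

        # Hedged sketch: split a too-large clip into two halves with ffmpeg's
        # stream copy (no re-encode). ffmpeg must be installed; the file name
        # and midpoint timestamp are placeholders.
        import subprocess

        SRC = "MVI_1234.MOV"        # hypothetical clip from the T2i
        MIDPOINT = "00:06:00"       # roughly half the clip's duration

        # First half: everything up to the midpoint.
        subprocess.run(["ffmpeg", "-i", SRC, "-t", MIDPOINT,
                        "-c", "copy", "part1.mov"], check=True)

        # Second half: start copying at the midpoint.
        subprocess.run(["ffmpeg", "-ss", MIDPOINT, "-i", SRC,
                        "-c", "copy", "part2.mov"], check=True)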

  • Viewing Large PowerPoint File

    We are moving from Palm Treos to iPhones, and I have a user concerned about viewing PowerPoints. How does a user view a PowerPoint (or any large file) on an iPhone when it is too large to email? They average 50 MB.
    On the Treos we upload the file and open it from its drive.

    If you can put those PowerPoint files on a web page accessible to the iPhone user, they can just view the PowerPoint files from a web link.
    Pertti
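
    As a minimal sketch of that "put it on a web page" suggestion, Python's built-in http.server can share a folder of PowerPoint files on the local network so the iPhone can open them from a link in Safari; the folder path and port below are placeholders.

        # Minimal sketch: serve a folder of PowerPoint files over HTTP so an
        # iPhone on the same network can open them from a web link.
        # The folder path and port are placeholders.
        import http.server
        import os
        import socketserver

        FOLDER = "/srv/presentations"   # hypothetical folder with the files
        PORT = 8000

        os.chdir(FOLDER)
        handler = http.server.SimpleHTTPRequestHandler
        with socketserver.TCPServer(("", PORT), handler) as httpd:
            print(f"Browse to http://<server-ip>:{PORT}/ from the iPhone")
            httpd.serve_forever()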

  • Hello, I am having issues opening very large files I created, one being 1,241,776 KB. I have PS 12.1, 64-bit version. I am not sure whether the issues I am having are because of the PS version I have, and whether or not I have to upgrade?

    I think more likely, it's a memory / scratch disk issue.  1.25 gigabytes is a very big image file!!
    Nancy O.

  • Quicklook consumes a lot of memory on column view previewing large files

    After upgrading to Mountain Lion, every time I click on a large file (like a disk image, DMG, or ISO) in Finder's column view, it starts spinning the icon, and quicklookd consumes a lot of memory, even causing other apps to swap to disk. After the preview icon is done, the quicklookd memory is deallocated.
    This did not happen on Lion.

    Just found out that a plugin (QLStephen - http://whomwah.github.com/qlstephen/) was causing the problem.
    Removed it from /Library/QuickLook, restarted Quick Look (qlmanage -r), and the problem is gone.

  • [BUG] Performance Issue When Editing Large FIles

    Is anyone else experiencing this on the latest JDeveloper releases (11.1.1.4.0 and the version before it)?
    People in the office have been experiencing extremely slow performance, seemingly at random, while editing Java files. We have been using JDev for almost 10 years, and this has only become an issue for us recently. Typically we only use the Java editing and the database functionality in JDev.
    I have always felt that the issue was related to network traffic created by ClearCase and never really paid attention, but a few days ago, after upgrading to the latest version of JDev, I started for the first time to have slowdowns that affect my speed of work, and I decided to look into it.
    The main symptom is the editor hanging for an unknown reason in the middle of typing a line (even in a comment) or immediately after hitting the carriage return. All PCs in the office have 2Gig or more RAM and are well within recommended spec.
    I've been experimenting for a few days to try and determine what exactly is at the root of the slowness. Among the things I have tried:
    o cutting ClearCase out of the equation; not using it for the project filesystem; not connecting to it in JDev
    o Not using any features other than Java editing in JDev (no database connections)
    o never using split panes for side by side editing
    o downloading the light version of JDev
    o Increasing speed of all pop-ups/dynamic helpers to maximum in the options
    o disabling as many helpers and automations as possible in the options
    None of these have helped. Momentary freezes of 3-5 seconds while editing are common. My basic test case is simply to move the cursor from one line to another and type a simple one-line comment that takes up most of the line. I get the freeze most often right after typing the "//" that opens the comment - it happens almost 100% of the time.
    I have, however, noticed a link to file size/complexity.
    If I perform my tests on a small/medium-sized file of about 1,000 lines (20-30 methods), performance is always excellent.
    If I perform my test on one of our larger files of 10,000 lines (more than 100 methods), the freezes while editing almost always occur.
    It looks like there is some processor-intensive work going on (which cannot be turned off via the options panel) that takes control of the code editor and does not complete in a reasonable amount of time on large Java files. I have a suspicion that it is somehow related to the gutter on the right-hand side, which shows little red and yellow marks for run-time reports of compile errors and warnings, and I haven't found any way to disable it so I can check.
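
    A generic way to see what the IDE is actually doing during one of these freezes (a standard JVM diagnostic, not a JDeveloper-specific feature) is to capture a few thread dumps with the JDK's jstack tool while the editor is hung. The sketch below drives jstack from Python; the process id is a placeholder you would find with jps or a task manager.

        # Hedged sketch: capture thread dumps of the IDE's JVM during a freeze
        # with the JDK's jstack tool, to see which background task blocks the
        # editor. The process id is a placeholder (find it with "jps").
        import subprocess
        import time

        JDEV_PID = "12345"   # hypothetical pid of the JDeveloper JVM

        for i in range(3):   # a few dumps a second apart show what keeps running
            dump = subprocess.run(["jstack", JDEV_PID],
                                  capture_output=True, text=True,
                                  check=True).stdout
            with open(f"jdev_threads_{i}.txt", "w") as f:
                f.write(dump)
            time.sleep(1)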

    Just a small follow-up....
    It looks like the problem is happening on only a single Java file in our product! Unfortunately it happens to be the largest and most often updated file in the project.
    I'm still poking around to figure out why JDev is choking consistently on this one particular file and not on any of the others. The size/complexity is not much bigger than the next largest which can be edited without problems. The problem file is a little unusual in that it contains a large number of static functions and members.
    Nice little mystery to solve.
