Opening large TDMS files in Excel takes forever, can anything be done?

After recording some DAQ channels for 4 hours, my file is 120 MB, with about 20 columns of data and 150k rows. Opening this in Excel takes at least 5 minutes, brings several "Not Responding" screen fades, and a growing fear that all my data is impossible to get to.
What can I do about this? Is there an option to split the TDMS file, or maybe a way to speed up Excel?

The solution is to not make the TDMS files so large. Perhaps modify the code so it starts a new file every half hour or so. Could you post a screenshot of the part of the code that does the saving?
Something that just occurred to me: the TDMS file format is optimized for writing, not reading, so data is streamed into the file with an index keeping track of what goes with what. The result is a file where the data from individual channels can be very fragmented, like parts of a file on a hard drive. I think there is a routine for defragmenting a TDMS file. Alternatively, you could save the data to a temporary file (not necessarily TDMS) and resave it to the final TDMS file after all the data is collected.
Mike...
Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion
"... after all, He's not a tame lion..."
Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps
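If the goal is simply to get at the data without fighting Excel, here is a minimal sketch (not part of the LabVIEW code discussed above) that reads a large TDMS file with the open-source npTDMS Python package and writes each group to its own CSV, so Excel never has to parse the whole 120 MB file at once. The file name is hypothetical and the group/channel layout is an assumption.

```python
# Hedged sketch: export a large TDMS file to per-group CSVs with npTDMS,
# instead of opening the raw .tdms in Excel.
# Assumes: pip install nptdms pandas; "daq_log.tdms" is a hypothetical name.
from nptdms import TdmsFile
import pandas as pd

SOURCE = "daq_log.tdms"

with TdmsFile.open(SOURCE) as tdms:          # streaming open, low memory use
    for group in tdms.groups():
        # One column per channel in this group (channels assumed equal length)
        df = pd.DataFrame({ch.name: ch[:] for ch in group.channels()})
        df.to_csv(f"{group.name}.csv", index=False)
```

Splitting the export per group keeps each CSV small enough for Excel to open in seconds rather than minutes.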

Similar Messages

  • Problem opening multiple .tdms files in subroutine

    Heya-
    I'm using LabVIEW 8.2, professional development system.
    I'm using the Open / Read / Close TDMS functions as part of a subroutine in a larger program.  The subroutine is supposed to open one of several TDMS files (each of which contains bulk data saved previously), extract the data, then restructure it into a 3D array, which is passed back to the main program.  The particular TDMS file to open is controlled by the main program.  The TDMS Close function is used after the data is extracted.
    When running solo, the subVI performs as expected.  When running as a subVI, it works fine on the first call, but subsequent calls to open different data sets fail to actually open the TDMS file (or at least, the TDMS Read function returns an empty array).  However, returning to the original file that was opened, the subVI works fine - it opens the TDMS file.  It doesn't matter which file was initially used - whichever one was opened first works fine, any others do not.  That would make me think that the first TDMS file isn't being closed, except that I am using the TDMS Close function after reading.
    Attached are the subVI (TDMS_to_3D_array), a quick test program that uses the subVI, and a .zip of two data sets
    Not sure if the global variables located in the SubVI will default, so they are: UpperMaxCol: 55, UpperMinCo: 0, UpperMaxPixel: 48, UpperMinPixel: 6
    Thanks in advance for the help
    Dan
    Attachments:
    Data.zip 1060 KB
    TDMS_to_3D_array.vi 54 KB
    TDMS_loadfault_tester.vi 39 KB

    Hi Dan,
    Perhaps you could try a few things:
    Use the TDMS Flush VI in the subVI before the TDMS Close VI.
    Try running the program with the subVI's front panel open.
    Wire the TDMS File refnum from the subVI back to the parent and observe its value as the program switches between TDMS files.
    Perform the TDMS read in the parent VI and wire the TDMS data and the array size of the dataset into the subVI.  Does this make any difference?
    Try the subVI without the Channel Name string indicator.
    Include error cluster wires throughout the entire subVI.  Wire them through the TDMS VIs, then through the nested For Loops, and then back to the parent VI.
    Let us know what happens.
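    The suggestions above are LabVIEW wiring changes, but the pattern they probe - fully close each file before opening the next, and propagate errors - can be illustrated outside LabVIEW. Below is a hedged Python sketch using the npTDMS package; it is not the poster's VI, and the file and group names are hypothetical.

    ```python
    # Hedged sketch of the open -> read -> close pattern the reply describes.
    # Each file is closed by the context manager before the next call, so a
    # stale reference can never shadow the file requested on a later call.
    # Assumes: pip install nptdms; names below are hypothetical.
    from nptdms import TdmsFile

    def read_group(path, group_name):
        """Open one TDMS file, read every channel of one group, then close it."""
        with TdmsFile.open(path) as tdms:        # closed automatically on exit
            group = tdms[group_name]
            return [channel[:] for channel in group.channels()]

    # Each call gets its own open/close cycle, mimicking repeated subVI calls.
    for data_file in ("dataset_A.tdms", "dataset_B.tdms"):
        channels = read_group(data_file, "Measured Data")   # hypothetical group
        print(data_file, "->", len(channels), "channels read")
    ```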

  • Signal Express Large TDMS File Recording Error

    Hello,
    I have the following application and I am looking for some tips on the best way to approach the problem with Signal Express:
    I am attempting to use Signal Express 2009 (Sound and Vibration Assistant) to collect random vibration data on three channels over an extended period of time -- about 20 hours total.  My sample rate is 2 kHz.  Sampling at that rate over that period of time involves the creation of a very large TDMS file, which is intended for various types of analysis later on, in Signal Express or some other application.  One of the analysis functions to be done is a PSD (Power Spectral Density) plot to determine the vibration levels distributed over a band of frequencies during the log.
    My original solution was to collect a single large TDMS file.  I did this with the Signal Express recording options configured to save and restart "in current log" after one hour's worth of data is collected.  I configured it this way because, if there is a crash or sudden loss of power during data collection, I wanted to ensure that at most an hour's worth of data would be lost.  I tested this option and the integrity of the file after a crash by killing the SignalExpress process in the middle of recording the large TDMS file (after a few save-log-file conditions had been met).  Unfortunately, when I restart Signal Express and try to load the log file data in playback mode, an error indicating "TDMS Data Corrupt" (or similar) is displayed.  My TDMS file is large, so it obviously contains some data; however, Signal Express does not index its time and I cannot view the data within the file.  The .tdms_index file is also present, but the meta data.txt file is not generated.  Is there any way to ensure that I will have at least partially valid data that can be processed from a single TDMS file in the event of a crash mid-logging?  I don't have much experience dealing with random vibration data, so are there any tips for generating vibration-level PSD curves for large files over such a long time span?
    My solution to this problem thus far has been to log the data to separate TDMS files, about an hour in length each.  This should result in about 20 files in my final application.  Since I want to take a PSD, which ends up being a statistical average over the whole time period, I plan on generating a curve for each of these files and averaging all 20 of them together to get the overall vibration PSD curve for the 20-hour period.
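    As a hedged illustration of the averaging approach just described (done outside Signal Express), the per-file PSDs can be computed with Welch's method and averaged; if every hourly file has the same length, the plain mean of the 20 curves is the 20-hour average PSD. Only the 2 kHz sample rate comes from the post; the file, group, and channel names below are hypothetical.

    ```python
    # Hedged sketch: average per-file PSDs from ~20 hourly TDMS logs.
    # Assumes: pip install nptdms scipy numpy; names are hypothetical.
    import glob
    import numpy as np
    from nptdms import TdmsFile
    from scipy.signal import welch

    FS = 2000.0                                    # Hz, from the post
    psds = []

    for path in sorted(glob.glob("vibration_hour_*.tdms")):
        with TdmsFile.open(path) as tdms:
            x = tdms["Vibration"]["Channel 1"][:]  # hypothetical group/channel
        f, pxx = welch(x, fs=FS, nperseg=4096)     # PSD of this hour of data
        psds.append(pxx)

    # Equal-length files -> a plain mean gives the 20-hour average PSD.
    avg_psd = np.mean(psds, axis=0)
    ```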

    JMat,
    Based on the description of your application, I would recommend writing the data to a "new log" every hour (or more often). Based on some of my testing, if you use "current log" and S&V Assistant crashes, the entire TDMS file will be corrupted. This seems consistent with what you're seeing.
    It would be good if you could clarify why you're hoping to use "current log" instead of "new log". I'll assume an answer so I can provide a few more details in this response. I assume it's because you want to be able to perform the PSD over the entire logged file (all 20 hours). And the easiest way to do that is if all 20 hours are recorded in a continuous file. If this is the case, then we can still help you accomplish the desired outcome, but also ensure that you don't lose data if the system crashes at some point during the monitoring.
    If you use "new log" for your logging configuration, you'll end up having 20 TDMS files when the run is complete. If the system crashes, any files that are already done writing will not be corrupted (I tested this). All you need to do is concatenate the files to make a single one. If this would work for you, we can talk about various solutions we can provide to accomplish this task. Let me know.
    Now there is one thing I want to bring to your attention about logging multiple files from SignalExpress, whether you use "current log" or "new log". The Windows OS is not deterministic, meaning it cannot guarantee how long an operation takes to complete. For your particular application, this basically means that between log files there will be a short gap in time during which data is not being saved to disk. Based on my testing, this time could be between 1-3 seconds, and it depends heavily on how many other applications Windows is running at the same time.
    So when you concatenate the signals, you can choose to concatenate them "absolutely", meaning there will be a 1-3 second gap between the different waveforms recorded. Or you can concatenate them to assume there is no time gap between logs, resulting in a pseudo-continuous waveform (it looks continuous to you and the analysis routine).
    If neither of these options are suitable, let me know.
    Thanks, Jared 
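    To make the two concatenation options concrete, here is a hedged NumPy sketch (not an NI-provided tool): "pseudo-continuous" simply butts the per-log waveforms together, while "absolute" keeps the measured 1-3 second gaps by padding between them. The segment arrays and gap values are placeholders.

    ```python
    # Hedged sketch of the two ways to join hourly logs described above.
    import numpy as np

    FS = 2000.0  # Hz, from the original post

    def concat_pseudo_continuous(segments):
        """Butt the logs together as if no time were lost between them."""
        return np.concatenate(segments)

    def concat_absolute(segments, gaps_seconds):
        """Keep the real gaps by inserting NaN samples between the logs."""
        pieces = [segments[0]]
        for seg, gap in zip(segments[1:], gaps_seconds):
            pieces.append(np.full(int(round(gap * FS)), np.nan))
            pieces.append(seg)
        return np.concatenate(pieces)

    # Example with placeholder data: three short "logs", ~2 s dead time between.
    logs = [np.random.randn(int(FS) * 10) for _ in range(3)]
    continuous = concat_pseudo_continuous(logs)
    absolute = concat_absolute(logs, gaps_seconds=[2.1, 1.8])
    ```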

  • How can I open a TDMS file, replace a subset of data, and save that change without rewriting the entire file?

    Hi all,
    Is it possible to open a TDMS file, make a small change to an array subset, and then save the file without having to save the whole dataset as a different file with a new name? That is to say, is there something similar to "Save" in MS Word rather than "Save As"... I only want to change a 1D array of four data points in a file of 7M data points.
    I am not sure if this makes sense. Any help is appreciated.
    Thanks,
    Jack

    You can use either one, but for your application I would use the synchronous API - it requires far less setup.  When you open the file, set both "enable asynchronous" and "disable buffering" to FALSE so that you can use synchronous access with arbitrary data sizes.
    Attached code is LabVIEW 2011.
    This account is no longer active. Contact ShadesOfGray for current posts and information.
    Attachments:
    UpdateTDMS.zip 20 KB

  • Opening an XML file in Excel?

    I cannot open an XML file in Excel; I get a message that it is not a valid spreadsheet. The same file can be opened on a normal machine. Does anyone know how to fix this?

    altairdfdfdf wrote:
    I cannot open an XML file in Excel; I get a message that it is not a valid spreadsheet,
    It's not a spreadsheet, that is why.
    Open it with TextEdit.

  • Getting an extra line after each record when opening a .txt file in Excel

    Hi All,
    I have developed a program which downloads a file to the application server.
    Each record of the file is 500 characters long and has a CRLF at the end.
    The file looks fine when opened as a .txt file.
    However, when I download it from the application server to the presentation server (using the function "download to my computer") and then try to open it in Excel, it shows a blank line after every record.
    I don't want this blank line to appear when I download the file and open it in Excel.
    The file record is declared as type char500.
    Please suggest how to deal with this.
    Thanks in advance.
    Regards,
    Puja.

    Hi Puja,
    Check whether the file on the application server has any gaps between the lines.
    Otherwise, since the file looks OK as .txt, download it as .txt and open that same file in Excel (i.e. Open With > Excel).
    Hope this solves your problem.
    Regards,
    SB.
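    If the blank rows come from a doubled line ending in the downloaded file (a common cause of one blank line per record), the file can also be cleaned up on the PC before opening it in Excel. The sketch below is a hedged illustration outside the SAP toolchain; the file name is hypothetical.

    ```python
    # Hedged sketch: drop empty lines left by doubled CR/LF line endings and
    # rewrite the file with plain CRLF records before opening it in Excel.
    # "download.txt" is a hypothetical file name.
    with open("download.txt", "r", newline="") as src:
        text = src.read()

    records = [line for line in text.splitlines() if line != ""]

    with open("download_clean.txt", "w", newline="") as dst:
        dst.write("\r\n".join(records) + "\r\n")
    ```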

  • Errors opening large JPEG files in Windows 2000

    Q. I am having problems opening large JPEG files (e.g. 40"x30" at 270 ppi) after having saved them in Windows 2000.
    A. This is a known issue and can be resolved either by turning off thumbnails for JPEGs or by using the JPEG 2000 file format. The Photoshop plugin for Windows can be obtained from Adobe:
    http://www.adobe.com/products/photoshop/cameraraw.html
    Carol
    (Posted from the UK)

    This is from the post at this site: Re: Adobe CS6 file associations are all messed up
    Just change the directory to the correct EXE for CC 2014 and you should be good.
    Correct Answer by Curt Y on Jun 5, 2012 7:08 PM
    Here is a registry fix for Win7 machines for the file association problem.
    This is from user TinusHDCA:
    I installed Photoshop CS6, then found CS5 was still there. I thought, hey, I don't need CS5 anymore, so I uninstalled CS5. Then I found out that Photoshop was no longer available in 'Open with' when right-clicking an image...
    So I went into the Registry, to HKEY_CLASSES_ROOT\Applications\Photoshop.exe\shell\open\command, and changed the (Default) value from "C:\Program Files\Adobe\Adobe Photoshop CS5 (64 Bit)\Photoshop.exe" "%1" to "C:\Program Files\Adobe\Adobe Photoshop CS6 (64 Bit)\Photoshop.exe" "%1" (i.e. changed CS5 to CS6) - and now I see 'Adobe Photoshop CS6' in my 'Open with' list when right-clicking an image...

  • Opening large photo files produces black screen

    When opening a small photo (less than a meg), it works fine and I'm able to move to the next file. If I open a large file (8-10 MB) it sometimes works fine; however, if I use the arrow to move to the next file in the folder I get a black screen. Also, if I have a folder of many large photo files I can see all the thumbnail photos at a small size, but if I try to increase the view size to large, some will be black; if I click on the file repeatedly it sometimes opens fine, sometimes not. It was working fine and just recently started to act up. Using Adobe Photoshop I can open each file, so I know the files are there and are not corrupt.

    Hello @JohnBruin,
    Welcome to the HP Forums, I hope you enjoy your experience! To help you get the most out of the HP Forums I would like to direct your attention to the HP Forums Guide First Time Here? Learn How to Post and More.
    I have read your post on how opening large photo files produces a black screen on your desktop computer. I would be happy to assist you in this matter!
    If you boot your system into Safe Mode, are you able to open large photographs? Is this a recent or a recurring issue?
    In the meantime, I recommend following the steps in this document on Computer Locks Up or Freezes (Windows 8). This should help prevent your system from defaulting to a black screen.
    I would also encourage you to post the product number for your computer. Below is an HP Support document that will demonstrate how to find your computer's product number.
    How Do I Find My Model Number or Product Number?
    Please re-post with the results of your troubleshooting, as well as the requested information above. I look forward to your reply!
    Regards
    MechPilot
    I work on behalf of HP
    Please click “Accept as Solution ” if you feel my post solved your issue, it will help others find the solution.
    Click the “Kudos, Thumbs Up" on the right to say “Thanks” for helping!

  • When I open a pdf file on the net, I can only view page 1. How do I get to the other pages?

    When I open a pdf file on the net, I can only view page 1 and cannot move to page 2, 3, etc. How can I change pages?

    See the support article "Firefox keeps opening many tabs or windows".
    You can get endless tabs opening if you have selected Firefox as the application to handle a file type in the "Open with" download window.
    Firefox should not be selected as the application to handle that file type, and you have to remove the action that is associated with it.
    You can delete mimeTypes.rdf (http://kb.mozillazine.org/mimeTypes.rdf) in the Profile Folder (http://kb.mozillazine.org/Profile_folder_-_Firefox) to reset all actions, or set that action to 'Always Ask' in Tools > Options > Applications.
    See http://kb.mozillazine.org/File_types_and_download_actions ("File handling in Firefox 3 and SeaMonkey 2" and "Reset Download Actions")

  • Opening TDMS files in Excel - column limit

    Hi all,
    I'm saving data I obtain using LabVIEW as a TDMS file and have downloaded the add-in to open the files in Excel:
    http://zone.ni.com/devzone/cda/epd/p/id/2944
    I'm using Excel 2007, which has a column limit of 16,384 and a row limit of 1,048,576.  Each of my datasets is 1024 values (rows) long and I have 10,000 datasets in total - well within either limit.  However, the Excel importer will not open the file - pop-up errors occur and Excel opens multiple files.  Has anyone encountered this before?
    Miika

    Have the same issue.  As requested, I've attached a file with 300 dummy groups, each of which has 4 channels of dummy data.
    When loaded into Excel, only the first 254 groups are shown.
    Certified LabVIEW Architect
    Wait for Flag / Set Flag
    Separate Views from Implementation for Strict Type Defs
    Attachments:
    abc.vi 3381 KB
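    One hedged workaround is to bypass the Excel add-in entirely and build the workbook with the npTDMS and pandas Python packages, one worksheet per group, so the importer's group limit never applies. This is only a sketch, not NI-supported code; the file names are hypothetical, and note that Excel caps sheet names at 31 characters.

    ```python
    # Hedged sketch: write one worksheet per TDMS group without the add-in.
    # Assumes: pip install nptdms pandas openpyxl; file names are hypothetical.
    from nptdms import TdmsFile
    import pandas as pd

    with TdmsFile.open("abc.tdms") as tdms, \
         pd.ExcelWriter("abc.xlsx", engine="openpyxl") as workbook:
        for group in tdms.groups():
            # One column per channel in this group
            df = pd.DataFrame({ch.name: ch[:] for ch in group.channels()})
            df.to_excel(workbook, sheet_name=group.name[:31], index=False)
    ```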

  • Opening a Somat .sie file in DIAdem takes forever

    Hi All,
    I'm new to DIAdem but know LabVIEW well. I am trying to open a Somat .sie file. I have the Somat .sie DataPlugin installed and DIAdem sees the file fine, but it takes forever to load. The file opens fine (and very quickly) using SoMat InField. I've attached a sample for you to take a look at.
    Can anyone pop it open straight away?
    Thanks,
    Phil

    Oops. Never attached the file.
    Change .txt to .sie.
    Attachments:
    edaq_test (1) - Copy.txt 4756 KB

  • Primavera Contract Management takes a long time to open large PDF files

    Dears,
    I have a big problem!
    I integrated PCM with SharePoint 2010 and migrated the attachments from the file system to SharePoint.
    The SharePoint database has reached 355 GB.
    After that, unfortunately, when I try to open a large PDF attachment through PCM (Primavera Contract Management), it takes much longer than it did when it was opened from the file server.
    I have tried everything, including upgrading the RAM and processor, but the problem still exists.
    Please help!
    Edited by: 948060 on Sep 19, 2012 1:48 AM

    We started storing attachments in 2007. All of these files have now been migrated to SharePoint 2010 in the staging environment.
    However, we face the performance issue mentioned above:
    large files (5 MB and larger) take a lot of time to open through PCM.

  • TDMS files in Excel

    I have used an NI cRIO-9075 to log data to a TDMS file. The Write To Measurement File VI was used and all the data was saved to a file of my choice. The issue is that when I open the file in Excel, each data point is saved in a different worksheet, and there is so much data that it takes several minutes to open the file. Is there a way around this issue without redoing the tests?

    adamharden25 wrote:  The issue is that when I open the file in Excel, each data point is saved in a different worksheet, and there is so much data that it takes several minutes to open the file. Is there a way around this issue without redoing the tests?
    That sounds like you wrote the file wrong.  Each worksheet should be a different Group, so if you change your group for each sample, you will get a different worksheet for each sample.
    Can you post any code and/or a TDMS file for us to look at?
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines
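    If redoing the tests is not an option, a hedged workaround is to flatten the existing file outside Excel: read every group (one per logged sample) and stack the samples into a single table. The sketch below uses the npTDMS and pandas Python packages rather than LabVIEW; the file name is hypothetical and all groups are assumed to share the same channel names.

    ```python
    # Hedged sketch: merge a one-group-per-sample TDMS file into one flat CSV.
    # Assumes: pip install nptdms pandas; "crio_log.tdms" is a hypothetical name.
    from collections import defaultdict
    from nptdms import TdmsFile
    import pandas as pd

    columns = defaultdict(list)

    with TdmsFile.open("crio_log.tdms") as tdms:
        for group in tdms.groups():               # one group per logged sample
            for ch in group.channels():
                columns[ch.name].extend(ch[:])    # append this sample's values

    pd.DataFrame(columns).to_csv("crio_log_flat.csv", index=False)
    ```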

  • Trying to move files to the Trash takes forever, and Finder windows take forever to load files/apps

    After recently installing MacKeeper (yes, I know...) and attempting to uninstall it, my 13" aluminum MacBook Pro takes forever to move a file to the Trash, and it equally takes forever to open a Finder window. Running 10.6.8.

    MacKeeper is an exception to the rule that system modifications should be uninstalled according to the developer's instructions. Although not generally recognized as such, MacKeeper is in fact a trojan, because the developer distributes an uninstaller that purports to delete it, but leaves behind something that causes popup ads to appear in the user’s web browser. Try following the instructions here:
    how to uninstall MacKeeper « Phil Stokes
    I can't personally vouch for the accuracy of those instructions, but others seem to have had success with them.

  • Error opening Local Cube File in Excel 2007

    Excel crashes; the only error given is "Microsoft PowerPivot Engine has stopped working".
    The Event Viewer log is equally vague:
    "The description for Event ID 22 from source MSOLAP$LocalCube cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.
    If the event originated on another computer, the display information had to be saved with the event.
    The following information was included with the event:
    Internal error: An unexpected exception occurred. Internal error: An unexpected exception occurred."
    The local cube file is created using version 11 of Microsoft.AnalysisServices, using a relational SQL data source connected via SQLNCL11.1. Excel is version 2007. The OS is Windows 7 SP1 64-bit. The file is created successfully, with no errors.
    Also baffling is that some members of my team can open this file successfully and others cannot (including myself). I cannot find anything in common regarding the software installed on any of the systems.

    Hi Infide,
    According to your description, your PowerPivot Engine stopped unexpectedly. Right?
    In this scenario, since other members can open the cube file successfully, there should be no corruption in the cube file. One possibility that can cause this error is a memory issue. Please refer to the article below:
    Memory Considerations about PowerPivot for Excel
    For troubleshooting, please enable the "PowerPivot Diagnostics" functionality to create a trace file and capture some useful information:
    PowerPivot Options & Diagnostics Dialog Box
    How can I see what internal commands PowerPivot executes in its engine
    If that doesn't help, I would recommend submitting feedback to Microsoft Connect at this link:
    https://connect.microsoft.com/SQLServer/Feedback.
    This Connect site serves as a connection point between you and Microsoft, and ultimately the larger community. Your feedback enables Microsoft to offer the best software and deliver superior services; meanwhile, you can learn more about and contribute to the exciting projects on Microsoft Connect.
    Best Regards,
    Simon Hou
    TechNet Community Support
