Best way to write FPGA data to a TDMS file on an RT cRIO system

Hi,
I am struggling to write measured data from an analog input (NI 9215), sampled at up to 20 kHz, to a TDMS file on the RT system (cRIO-9022). I only need to save several periods of 4 arbitrary analog signals at frequencies between 5 Hz and 1 kHz, so storing up to 50k values should already be enough.
I use a high-priority and a low-priority loop. First I tried to adapt the example from "Getting Started with CompactRIO - Logging Data to Disk" (http://zone.ni.com/devzone/cda/tut/p/id/11198). But when I used this in my high-priority loop (running at 1 ms), the loop ran out of time and the RT system became unresponsive. If I change the number of elements to write (the number of elements to wait for in the FIFO Read node), it gets better, but data is still lost because the loop finishes late.
So I was thinking of creating an RT FIFO, storing all the measurement values in that buffer inside the high-priority loop, and then writing them to the TDMS file in the low-priority loop. This time I used the FPGA Read/Write Control instead of the FPGA FIFO. It already worked better, but writing to the TDMS file took a long time, since each value was read and written to the TDMS file individually. Unfortunately, I could not find a way to write the whole RT FIFO to the TDMS file at once. Is there a node available for this, or is it possible to build one big array first and then write the data to the TDMS file in a single call? The code I tried is in the second picture.
I hope someone can give me some tips on which method is better for my project, and a hint about what I did wrong or what I can optimize. I have been stuck for days on how to save my measurements on the cRIO system.
Thank you very much in advance. Have a nice weekend.
Best regards
Andy
Solved!

Hi Xiebo and Christian,
thank you very much for your answers. Actually, my high-priority loop is much slower than that: I run it with a loop time of 50 us (= 20 kHz) at the fastest, or slower, depending on the signal I want to measure. So my data production rate is at most 4 channels * 8 bytes * 20 kS/s = 640 kB/s. My low-priority loop runs at 2 ms to 5 ms (much slower than the high-priority loop), since I only do some simple math calculations there and control the front panel in that loop.
I understand that it is much more efficient to write blocks of data (e.g. 1024*32KB instead of just 32KB) to a file with TDMS. But is it also the same for a queue or RT FIFO, i.e. does the block size of the data chunks also matter for the queue and RT FIFO?
@Xiebo: I understand that first caching the data read from the FPGA in the high-priority loop will improve my code, but I do not know how to cache that data. I was thinking of doing it with the FPGA FIFO, but the FPGA Read/Write Control seems to be faster for me and I do not know why. Can you tell me a node/VI to cache the data I read from the FPGA, or maybe even point me to an example?
@Christian: The NI_MinimumBufferSize property looks like exactly what I was looking for. My question now is whether I should put the TDMS Write VIs in my high-priority loop and stream directly from the FPGA FIFO to the file, as it is done in the disk-logging example at http://zone.ni.com/devzone/cda/tut/p/id/11198. Or is it better to read the data from the FPGA via the FPGA Read/Write Control, write it to an RT FIFO in the high-priority loop, and then write it from the RT FIFO to the TDMS file (with the NI_MinimumBufferSize property set) in the low-priority loop?
In summary, I am still unsure whether the FPGA FIFO or the FPGA Read/Write Control combined with a queue or an RT FIFO is the better approach for me, and how I can build a cache that collects data into chunks so I can write larger blocks at once.
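The producer/consumer batching idea under discussion, as a minimal Python sketch (not LabVIEW; the rates, block sizes and names are invented purely for illustration): the high-priority side pulls whole blocks from the FPGA FIFO and queues them, and the low-priority side performs one file write per block instead of one write per value.

import queue
import random
import struct
import threading
import time

SAMPLES_PER_READ = 1000            # read the FIFO in blocks, never one value at a time
data_q = queue.Queue()             # stands in for the RT FIFO / queue between the loops

def acquisition_loop(stop):
    # High-priority side: pull a whole block per iteration (placeholder for a FIFO read).
    while not stop.is_set():
        block = [random.random() for _ in range(SAMPLES_PER_READ)]
        data_q.put(block)          # hand the complete block to the logging loop
        time.sleep(0.05)           # 1000 samples at 20 kS/s take 50 ms

def logging_loop(stop, path="log.bin"):
    # Low-priority side: one file write per block instead of per sample.
    with open(path, "ab") as f:
        while not stop.is_set() or not data_q.empty():
            try:
                block = data_q.get(timeout=0.2)
            except queue.Empty:
                continue
            f.write(b"".join(struct.pack("<d", x) for x in block))

stop = threading.Event()
producer = threading.Thread(target=acquisition_loop, args=(stop,))
consumer = threading.Thread(target=logging_loop, args=(stop,))
producer.start()
consumer.start()
time.sleep(1)                      # let it run briefly, then shut down cleanly
stop.set()
producer.join()
consumer.join()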
Thank you very much in advance for your help.
Best regards
Andy

Similar Messages

  • What is the best way to kill/stop a data load?

    Hi.
    What is the best way to kill/stop a data load?
    I have a data load from my QA R/3 system that is extracting 115.000.000+ records. The problem is that the selection in the function module used in the data source does not work, and the problem was not detected because of the nature of the data on the development system.
    I could kill processes owned by my background user (on both R/3 and BW) but I risk killing other loads, and sometimes the job seems to restart if I just try to kill processes. If I remove transactional RFCs in SM58 the load does not terminate; I only skip one or more datapackages. I have also tried to change the QM-status in the monitor to red, but that does not stop the load either...
    So isn't there a nice fool-proof way of stopping a dataload?
    Best regards,
    Christian Frier

    Hi,
    There are two ways to kill the job.
    The first is in transaction RSMO: locate the job, open the Status tab, and double-click the yellow light shown on the totals line. A 'Set overall status' pop-up appears; select the desired status (red) and save it. Then return to the monitor page, open the Header tab, double-click the data target, right-click it and go to 'Manage'. There should be a request sitting there, probably with yellow lights; highlight the line with the faulty request, click the Delete button, then click Refresh.
    The second is in SM37: click on the active selection, enter the job name, and click Execute. The particular job should appear; highlight the job name and click the Stop icon on the toolbar (third from the left).
    Hope this is clear.
    Regards-
    Siddhu

  • How to write a data type into an excel file?

    Hi everyone,
    What I am trying to do is to get some data from MS SQL then write into an excel file.
    stmtExcel.executeUpdate("Insert into [Output$] (CustomerID, ProductID, [Date], OrderQty) Values(" + rsSQL.getInt("CustomerID") + ", " + rsSQL.getInt("ProductID") + ", '" + (rsSQL.getString("Date")).substring(0,10) + "', " + rsSQL.getInt("OrderQty") + ")");
    There is no problem writing into the Excel file; however, after all the data is imported, the Excel sheet treats the integers and dates as Text format. The funny part is that after I double-click one of the cells with an integer value, that cell's format changes to integer; the same thing happens with the date type.
    I am wondering if there is any way I can write an integer or whatever type into an excel file, which excel can recognize their own types instead of Text format. Thank you so much in advance.
    Sincerely,

    Yes, use an API that supports such things, like Andy
    Khan's JExcel:
    http://www.andykhan.com/
    Thanks, duffymo. I haven't tried it yet, but according to the website, it seems that it is possible! Thanks again!

  • Best way to  back up your data

    What is the fastest and best way to back up your data in case of any problem? Is it still to transfer to an external HD?
    Thanks

    Hi Ferro;
    Best and simplest way to back up is Time Machine to an external drive.
    I think that any backup plan should always be to an external drive. If you back up to an internal drive and the Mac fails, what good is your backup then?
    Allan

  • Best way to transfer internal HD data in Mavericks to Mac with Yosemite ?

    Hello,
    I'm using a 2007 iMac with Mavericks and will be getting a new one which will presumably come with Yosemite installed. What's the best way to transfer all the data from my internal HD on the old system, to the new one ?
    I use SuperDuper to make backups to external HD's, so if I make a bootable copy of the mac HD to an ext HD using SuperDuper, will everything function fine despite the different OS's ?
    Thanks,
    Matrose.

    You can make a bootable copy of your system now, but you won't be able to boot the new computer from it; the older OS is not usually backwards compatible with newer hardware. But when you first boot into the new system, use Setup Assistant to migrate the data from the cloned copy. That will work just fine.

  • How to write the data into EEPROM using Labview?

    How to write the data into EEPROM using Labview?

    You would need some sort of EEPROM programmer; typically you might communicate with it via serial. I don't know how you would do this in LabVIEW. You would need the command protocol for the programming device to start with.
    Doug De Clue
    gpibssx wrote in message news:<[email protected]>...
    > How to write the data into EEPROM using Labview?

  • How to fetch the Date column(or Month column) from the file name from the specified path in ODI 11g

    Hi ALL,
    Can anyone help us with how to fetch the Date column (or month column) from the file name specified in the path, in a generalized way?
    For example :
    file name is :subscribers (Cost) Sep13.csv is specified in the below path
      E:\Accounting\documents\subscribers (Cost) Sep13.csv
    Here I need to fetch the "Sep13" as a Date column in ODI 11g, in a generalized way.
    Can anyone help us with this case as soon as possible?

    I would suggest using a piece of Jython code for this.  Something like this...
    import os
    import os.path
    # list the files in the folder (note the path must be a quoted string)
    filelist = os.listdir(r'E:\Accounting\documents')
    for file in filelist:
        # 'subscribers (Cost) Sep13.csv' -> slice positions 19 to -4 give 'Sep13'
        datestr = file[19:-4]
    You'd need to work out what to do with datestr next...  perhaps write it to a table or update an ODI variable with it.
    Hope this is of some help.
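    If it helps, one possible way to turn that "Sep13" string into a real date value in the same Jython/Python code (a sketch only; the '%b%y' format assumes the file names always end in an abbreviated month plus a two-digit year, as in the example above):

    from datetime import datetime
    datestr = 'Sep13'                              # e.g. taken from the loop above
    # '%b' = abbreviated month name, '%y' = two-digit year, so 'Sep13' -> 2013-09-01
    file_date = datetime.strptime(datestr, '%b%y')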

  • What is the best way to connect a MacBook Pro to my HiFi system ?

    Hi,
    I'm new to Apple support communities having just bought my first Macbook Pro (a 17" 2.5GHz i7 model) and this is my very first community post.
    I'm a bit of a hi-fi enthusiast, so with my new MacBook Pro I've taken the opportunity to completely reload my iTunes library in high-quality Apple Lossless format. It took a while, but now that I've completed the task I'm pleased I did it.
    My question is: what is the best way to connect a MacBook Pro to a high-quality hi-fi preamp? My hi-fi system is Musical Fidelity, and the preamp (an A5 CR preamp), which has superb audio quality, only has analogue phono inputs.
    I'm interested to understand what both the wired and wireless connection options are. I'm guessing one option must be to use the headphone output, but I have no idea whether the output from the headphone socket on my new MacBook (which is presumably designed to drive headphones) would be at a suitable level for a phono line-level input.
    I've also read a few comments about using a digital connection, so my second question is: what is the audio digital connection on my MacBook?
    Apologies if this question has been asked lots of times before, but I've looked through some previous posts and can't find any answers in simple and straightforward language concentrating on the issue of connecting a MacBook Pro to an analogue phono preamp input.
    Thanks in anticipation of some advice.

    What is the best way to connect a MacBook Pro to my HiFi system?
    Look at the back of the receiver and see what your connection options are; the front headphone jack is usually an output.
    The Mac's analog audio in/out ports double as optical in/out ports.
    The analog output won't overpower your amp; it's just a headphone jack, with just enough power to get the signal to the destination in one piece.
    Since you have an analog amp, you're stuck with the inferior analog connection; however, you mentioned you want to listen to your lossless music at the best sound quality.
    To do that you're going to need a decent surround sound system with Toslink optical input jacks, a mini-Toslink-to-regular-Toslink adapter, a Toslink cable, and the optical output enabled in System Preferences > Sound.
    Another method is to use an AirPort Express; it also has a dual analog/optical output port. Set up the AirPort Express for iTunes streaming using AirPort Utility, and then in iTunes, in the lower right corner, you can select the output connection.
    I recommend a Harman Kardon surround sound system, with their 100-watt satellite speakers and the 200-watt subwoofer. Enable the "concert hall" and other effects, which will turn your 2.0 (stereo) iTunes music into 4-channel speaker sound, filling the room with music and sending the lower channels to the subwoofer.
    It will likely bring tears to your eyes that you can actually hear the full quality of your music as intended; I almost guarantee you will never want to listen to music on a crappy iPhone or iPod again.
    If a surround sound system is out of your budget, you can get by using the Harman Kardon "Go Play" portable stereo and an analog stereo cable.
    If you're going to use your analog amp to power reference speakers, then there are various analog adapters of all sorts to make the appropriate connections once you know what they are; check out Radio Shack online for them.

  • How to write adapter module to convert the xml to pdf file

    Hi all,
    How do I write an adapter module to convert an XML file to a PDF file?
    Could anybody please assist with a step-by-step procedure?

    PI 7.1 XML to PDF transformation
    Have you seen the links below?
    http://forums.sdn.sap.com/thread.jspa?threadID=1212478
    http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/14363

  • Best way to stream lots of data to file and post process it

    Hello,
    I am trying to do something that seems like it should be quite simple but am having some difficulty figuring out how to do it.  I am running a test that has over 100 channels of mixed sensor data.  The test will run for several days or longer at a time and I need to log/stream data at about 4 Hz while the test is running.  The data I need to log is a mixture of different data types that include a time stamp, several integer values (both 32 and 64 bit), and a lot of floating point values.  I would like to write the data to file in a very compressed format because the test is scheduled to run for over a year (stopping every few days) and the data files can get quite large.  I currently have a solution that simply bundles all the data into a cluster and then writes/streams the cluster to a binary file as the test runs.  This approach works fine but involves some post processing to convert the data into a format, typically a text file, that can be worked with in programs like Excel or DIAdem.  After the files are converted to text files they are, no surprise, a lot larger (about 3 times) than the original binary file size.
    I am considering several options to improve my current process.  The first option is writing the data directly to a TDMS file, which would allow me to quickly import the data into DIAdem (or Excel with a plugin) for processing/visualization.  The challenge I am having (note, this is my first experience working with TDMS files and I have a lot to learn) is that I cannot find a simple way to write/stream all the different data types into one TDMS file and keep each scan of data (containing different data types) tied to one time stamp.  Each time I write data to file, I would like the write to contain a time stamp in column 1, integer values in columns 2 through 5, and floating point values in the remaining columns (about 90 of them).  Yes, I know there are no columns in binary files, but this is how I would like the data to appear when I import it into DIAdem or Excel.
    The other option I am considering is just writing a custom data plugin for DIAdem that would allow me to import the binary files that I am currently creating directly into DIAdem.  If someone could provide me with some suggestions as to what option would be the best I would appreciate it.  Or, if there is a better option that I have not mentioned feel free to recommend it.  Thanks in advance for your help.

    Hello,
    Here is a simple example; for simplicity I only create one value per iteration of the while loop. You can also set properties of the file, which can be useful, and set up different channels.
    Besides that, you can use multiple groups for more flexibility in data storage. You can think of channels as columns and groups as sheets in Excel; that is how you will see your data when you import the TDMS file into Excel.
    I hope it helps. Of course there are many more advanced features of TDMS files; read the help docs!
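    The example referred to above is a LabVIEW VI, but as a rough illustration of the same group/channel layout, here is a small sketch using the third-party npTDMS Python library; the group and channel names, the 4 Hz rate and the data are just placeholders:

    import numpy as np
    from nptdms import TdmsWriter, ChannelObject

    # One scan = timestamp + integer counter + float sensor values; each becomes its
    # own channel in a single group, so they line up as columns in DIAdem/Excel.
    n_scans = 100
    timestamps = np.arange(n_scans, dtype=np.float64) * 0.25      # 4 Hz logging rate
    counter = np.arange(n_scans, dtype=np.int32)
    sensor_a = np.random.rand(n_scans)

    with TdmsWriter("test_log.tdms") as writer:
        writer.write_segment([
            ChannelObject("Scan", "Timestamp", timestamps),
            ChannelObject("Scan", "Counter", counter),
            ChannelObject("Scan", "SensorA", sensor_a),
        ])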

  • What is the Best way To Copy and paste data from 1 book to another

    I have 18 sheets in 5 different books that I want to extract data from specific cells.  What is the best way to do this?  Example: one sheet is called Numbers E-O1, with data in 13:WXYZ. The data updates and moves up one row every time I enter a new number. So let's say I enter the number 12; through a lot of calculations the output goes into 13:WXYZ, to what I call a counter, which is a 4-digit number.  Anyway, how can I send that 4-digit number to a totally different sheet?  To bullet out what I'm talking about:
    data in cells Row 13:WXYZ in book called Numbers sheet E-O1
    send data to book called "Vortex Numbers" Sheet E-O row 2001:CDEF
    What formula or Macro can I use to make this work?
    thank you!

    Hello Larbec,
    Syntax:
    '[BookName]SheetName'!Range
    Enter in cell  2001:CDEF:
    ='[Numbers]E-O1'!13:WXYZ
    This assumes that the file is open in Excel. Otherwise you need to add the path:
    'ThePath[BookName]SheetName'!Range
    Best regards George

  • What is the best way to stack DAQ aquired data in labview?

    I'm developing an application that works with an M-series DAQ card and LabVIEW 8.5 to output a signal and then record on 8 differential inputs for a short period of time (~10 ms). I need to stack my data, however, because the incoming signal will be very, very small, even after amplification. So basically I'm running a slightly modified version of the Multi-Function Synch AI-AO.vi (included with the DAQmx install). What is the best way for me to rerun this VI a set number of times and add the new data directly to the old data (not concatenating, but like |sample 1 of run 1| + |sample 1 of run 2| = stacked sample 1)?
    A slightly modified version of the mutlifunction synch AI-AO.vi is attached.
    Attachments:
    des_v2_Multi-Function-Synch AI-AO.vi ‏143 KB

    Hi LSU,
    see the attachment on how to "stack" several measurements. I simply add the waveforms and use a shift register to keep the last iteration's value.
    Writing to a file in each iteration is extremely CPU-intensive - especially with Express VIs. Using a for loop for just one iteration is pointless. You could enable the conditional terminal of the for loop to implement your stop feature.
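    As a rough numpy analogue of that shift-register-plus-add pattern (acquire_once below is only a stand-in for the actual DAQmx read, and all sizes are invented for illustration):

    import numpy as np

    n_runs = 50            # how many times the acquisition is repeated
    n_samples = 200        # samples per ~10 ms run

    def acquire_once():
        # Stand-in for one AI read: a tiny 1 kHz signal buried in noise
        t = np.linspace(0.0, 0.01, n_samples)
        return 0.01 * np.sin(2 * np.pi * 1000 * t) + np.random.normal(0.0, 0.1, n_samples)

    stacked = np.zeros(n_samples)      # plays the role of the shift register
    for _ in range(n_runs):
        stacked += acquire_once()      # element-wise add: sample i of every run is summed

    averaged = stacked / n_runs        # divide if you want the averaged waveform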
    For your message 4:
    Have you ever tried all the things you asked about? Sometimes it's easiest to just use trial and error.
    And for the "n=n+x" question: It really helps to take the free online courses offered by NI!
    Message Edited by GerdW on 11-11-2009 06:27 PM
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
    Attachments:
    des_v2_Multi-Function-Synch AI-AO.vi ‏128 KB

  • How to write plan data into Transactional Cube from Visual Composer ?

    Dear experts,
    Visual Composer is very powerful for building 'nice-looking' web reports (I think it is much better than Web Application Designer in terms of freely positioning charts/graphs/tables/selection parameters anywhere we like; we can't have this flexibility in WAD), plus it also has strong integration with R/3 for posting transactions via RFC-enabled function modules (such as changing a customer credit limit, approving a PO, approving a sales order, etc.), so that the R/3 user won't even have to know the transaction code.
    In addition to this, I am just wondering whether VC is also capable of writing plan data into a transactional cube via a write-enabled query. We could simply insert a BW query into VC as a planning layout (with the edit mode set to "editable"), plus add some wizards to make it more user-friendly and self-explanatory for the users.
    But I am not sure how to save the changes back to transactional cube from VC.
    Could anyone please kindly advise how to achieve this.
    "Seeing is believing" is always my learning method, so I would appreciate some "actioanable" explanation or hints.
    Many thanks,
    Sen

    Hi Sen,
    As far as I know, it's not possible to write data into transactional cubes directly from a table in Visual Composer through standard methods, but it is possible to integrate a WAD template containing your input-ready query into a URL element in VC (you can fill entry variables in VC and call the WAD template dynamically), so you can create a planning application in VC that calls WAD templates dynamically.
    Another way to interact through VC is to create button objects that call planning sequences (via a BAPI call to RSPLS_PLSEQ_EXECUTE). You can also fill variables in VC and pass them to the sequence.
    Regards,
    Enrique

  • Best practice how to retrieve & update data w/o any jsf-lifecycle-overhead

    I have a request scoped jsf managed bean called "ManagedBean". This bean has a method annotated with "@PostConstruct" that retrieves data from a database. The data is shown in a jsp "showAndEditData.jsp" in <h:inputText /> components - so the data is editable.
    The workflow is as follows:
    First, when navigating to "showAndEditData.jsp", the ManagedBean is created, the "@PostConstruct"-method is invoked, and the data retrieved from the database is shown to the user.
    Second, the user changes the data.
    Third, the user presses the submit button, the ManagedBean is created again, the "@PostConstruct"-method is invoked again, and the data is retrieved from the database again. Then the data is overridden by the changes the user made and passed to the business-tier (where it will be saved to the database).
    Every step that I marked with "*again*" is completely unnecessary and a huge overhead.
    Is there a way to prevent these unnecessary steps?
    Or, asking in other words: is there a best practice for retrieving and updating data efficiently and without any overhead using JSF?
    I do not want to use session scoped managed beans, because this would be a huge overhead as well.

    The first "again" is neccessary, because after successfull validation, you need new object in request to store the submitted value.
    I agree to the second and third, really unneccessary and does not make sense.
    Additionally I think it�s bad practice putting data in session beansTotal agree, its a disadvantage of JSF that we often must use session.
    Think there is also an bigger problem with this.
    Dont know how your apps are working, my apps start an new database transaction per commit on every new request.
    So in this case, if you do an second query on postback, which uses an different database transaction, it could get different data as for the inital request.
    But user did his changes <b>accordingly</b> to values of the first snapshot during the inital request.
    If these values would be queried again on postback, and they have been changed meanwhile, it becomes inconsistent, because values of snapshot two, do not fit to user input.
    In my opionion zebhed has posted an major mistake in JSF.
    Dont now, where to store the data, perhaps page scope could solve this.
    Not very knowledge of that section, but still ask myself, if this data perhaps could be stored in the components and on an postback the data are rendered from components + submittedvalues instead of model.

  • Best way to manage global application data

    Hello,
    I'm looking for the best way to manage my global application data. I have a program containing about 70 application settings which are loaded from an INI file and organised as clusters and loose variables. I want to be able to use these settings across multiple VIs. The application settings can be modified during execution; some VIs can change settings and these changes need to be visible across all VIs.
    What is the best way to manage this? Ideally, I would have one cluster or class containing all the application settings, but since the subVIs run independently, I can't wire it through.
    From what I've read so far, global variables aren't really well suited for this due to race conditions. The FGV might be a solution, but it is not clear to me how to implement this for many variables. (One FGV or multiple FGVs, etc... Does anyone have a good example of this?)
    Are there any other good solutions? Any advice is welcome!
    Best regards,
    Wouter

    You can also create a singleton LVOOP object (maybe a few if you want to logically organize your settings). When you initialize the system you create the object, and it uses a DVR internally to store your data. Since this is a singleton object you do not need to wire it through your code. Your subVIs simply call the appropriate methods and it will interact with the class data. An FGV is basically the same thing, except that you only have a single VI, so you are more limited with your inputs and outputs. A singleton object lets you refine the connector panes based on the methods and what they are doing.
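    A rough Python analogue of that singleton idea (not LabVIEW, and all names here are invented), just to illustrate a single shared settings object that callers reach without wiring it through:

    import threading

    class AppSettings:
        """Singleton holding the application settings (analogue of the LVOOP object with a DVR)."""
        _instance = None
        _lock = threading.Lock()

        def __new__(cls):
            with cls._lock:
                if cls._instance is None:
                    cls._instance = super().__new__(cls)
                    cls._instance._data = {}          # settings loaded from the INI file
            return cls._instance

        def get(self, name, default=None):
            return self._data.get(name, default)

        def set(self, name, value):
            with self._lock:                          # serialize writers to avoid races
                self._data[name] = value

    # Any "subVI" can simply create AppSettings() and see the same data:
    AppSettings().set("sample_rate_hz", 20000)
    print(AppSettings().get("sample_rate_hz"))        # -> 20000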
    Mark Yedinak
    "Does anyone know where the love of God goes when the waves turn the minutes to hours?"
    Wreck of the Edmund Fitzgerald - Gordon Lightfoot
