Time Resolution in ELVIS DAQ

Hi All,
I need to find the time difference between two pulses that are generated by my circuit. For this, I need to know how reliable the time stamps are in data collected from the NI ELVIS DAQ board (i.e., if I sample data at 1 kHz, how sure can I be that the delta t between two data points is exactly 1 ms)?
Thanks,
Jari

Hi Jari,
Are you using ELVIS I or ELVIS II? If you are using ELVIS I, the specs will depend on which DAQ card you are plugging in. With ELVIS II, the DAQ is built in, so the specifications are available in this manual: http://www.ni.com/pdf/manuals/372590b.pdf, which specifies a timing accuracy of 50 ppm. If you are not familiar with ppm as a way of describing clock accuracy, I recommend looking through this article: http://www.best-microcontroller-projects.com/ppm.html. One caveat to keep in mind is that the sampling rate you specify is generated by dividing down one of the internal base clocks. The timebase frequency therefore needs to be an integer multiple of your sampling rate (ELVIS II has 0.1, 20 and 80 MHz timebases); otherwise, the DAQ will simply coerce to the nearest sampling rate that can be derived by dividing the timebase by an integer. See this document for more information: http://digital.ni.com/public.nsf/allkb/4BBE1409700F6CE686256E9200652F6B.
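To make the divide-down and the 50 ppm arithmetic concrete, here is a small C sketch (the 80 MHz timebase and the 50 ppm figure come from the discussion above; the helper function and variable names are just for illustration):

    #include <math.h>
    #include <stdio.h>

    /* Nearest rate the hardware can actually generate: the timebase divided
       by an integer divisor (rounded to the closest divisor). */
    static double coerced_rate(double timebase_hz, double requested_hz)
    {
        double divisor = floor(timebase_hz / requested_hz + 0.5);
        if (divisor < 1.0)
            divisor = 1.0;
        return timebase_hz / divisor;
    }

    int main(void)
    {
        double timebase  = 80e6;     /* 80 MHz internal timebase (ELVIS II) */
        double requested = 1000.0;   /* requested sample rate: 1 kHz        */
        double actual    = coerced_rate(timebase, requested);
        double dt        = 1.0 / actual;

        /* 50 ppm clock accuracy: worst-case error on one sample interval */
        printf("actual rate = %g Hz, dt = %g s\n", actual, dt);
        printf("worst-case timing error per interval = %g ns\n",
               dt * 50e-6 * 1e9);
        return 0;
    }

For a 1 kHz request against the 80 MHz timebase the divisor is exactly 80000, so dt is exactly 1 ms in hardware terms; the 50 ppm spec then bounds the real-world error at roughly 50 ns per 1 ms interval, plus whatever drift accumulates over longer spans.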

Similar Messages

  • About time resolution on PCI6221 card

    Hi all,
      I am going to purchase a PCI-6221 DAQ card. I read the specification and it turns out that the sampling rate is 250 kS/s, which means it can produce or collect 250,000 samples per second, correct? So does that mean the time resolution is 1/250000 s = 4000 ns? I am reading the datasheet and it shows that the timing resolution is 50 ns. So my question is, what does that 50 ns mean? If I have an external signal with a frequency higher than 250 kHz, does that mean the PCI-6221 will not be able to capture it correctly?

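    The two numbers in this question describe different things, and a quick sketch (not from the original thread) helps keep them apart: the 250 kS/s maximum rate sets the smallest interval between successive samples, while the 50 ns timing resolution is the granularity with which the board can place its sample-clock edges, consistent with a 20 MHz sample-clock timebase (1/20 MHz = 50 ns).

    #include <stdio.h>

    int main(void)
    {
        double max_rate_hz  = 250e3;   /* maximum sample rate from the spec  */
        double timing_res_s = 50e-9;   /* sample-clock placement granularity */

        printf("minimum sample interval = %.0f ns\n", 1e9 / max_rate_hz);  /* 4000 ns */
        printf("clock-edge granularity  = %.0f ns\n", timing_res_s * 1e9); /*   50 ns */
        return 0;
    }

    So the poster's arithmetic (1/250000 s = 4000 ns = 4 µs) is correct for the sample interval; the 50 ns figure only tells you how finely intermediate sample rates can be generated. A signal component above half the sample rate (125 kHz at 250 kS/s) will alias rather than be captured correctly.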

  • Is there a way to increase the time resolution to 0.1 ms?

    Hello. I am using "Download - Using_Counters_to_Time_Program_Execution.zip " that I downloaded from NI Developer Zone. I think this gives me a time resolution of 1 ms. Is there a way to increase the resolution to 0.1 ms? Thank you

    ilter,
    As for your test VI using "Count Events or Time.vi", that is not working because you are only calling the VI once.  In order to get a proper elapsed time, you need to call the VI a second time as shown in the attached picture "Using Count Events or Time VI.jpg". 
    In order to modify the original example to use a 100 kHz timebase instead of the 20 MHz timebase, see the attached picture "Time With Counters (DAQ) - Modified to 100 kHz.jpg".  The changes you need to make are circled in red. 
    I also meant to point out in my previous post that there is another example on the DevZone that does the same thing (timing code with counters) using the newer DAQmx driver. I would encourage you to use this example, as it will make your application more supportable and maintainable in the future. By default, this example also uses the 20 MHz timebase; to modify it to use the 100 kHz timebase, see the attached picture "Time_With_Counters (DAQmx) - Modified to 100 kHz.jpg". To select the 100 kHz timebase in the DAQmx terminal constant, right-click on that constant, go to "I/O Name Filtering" and select "Include Advanced Terminals". You will then be able to select "/Dev1/100kHzTimebase", after which you can delete the "/Dev1/" part. (A text-based DAQmx C-API sketch of the same idea appears after the attachment list below.)
    Hope this helps!
    Best regards,
    Message Edited by Jarrod B. on 04-05-2006 10:10 AM
    Attachments:
    Time_With_Counters (DAQmx) - Modified to 100 kHz.JPG ‏437 KB
    Using Count Events or Time VI.JPG ‏114 KB
    Time With Counters (DAQ) - Modified to 100 kHz.JPG ‏405 KB
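    For readers without the attached screenshots, here is a rough sketch of the same technique in the DAQmx C API (counting edges of the onboard 100 kHz timebase, so each tick is 10 µs); "Dev1" and the counter choice are assumptions about the device configuration:

    #include <NIDAQmx.h>
    #include <stdio.h>

    int main(void)
    {
        TaskHandle task = 0;
        uInt32 ticks = 0;

        /* Count rising edges on ctr0, with the 100 kHz timebase as the source. */
        DAQmxCreateTask("", &task);
        DAQmxCreateCICountEdgesChan(task, "Dev1/ctr0", "", DAQmx_Val_Rising,
                                    0, DAQmx_Val_CountUp);
        DAQmxSetCICountEdgesTerm(task, "Dev1/ctr0", "/Dev1/100kHzTimebase");
        DAQmxStartTask(task);

        /* ... the code section being timed goes here ... */

        DAQmxReadCounterScalarU32(task, 10.0, &ticks, NULL);
        printf("elapsed time = %f s\n", ticks / 100000.0);

        DAQmxClearTask(task);
        return 0;
    }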

  • CITADEL and RELATIONAL DATABASE Alarms & Events Logging Time Resolution

    Hello,
    I would like to know how I can set up the Logging Time Resolution when logging Alarms & Events to Citadel or a relational database.
    I tried using the Logging:Time Resolution property of the Variable Properties class, without success.
    In my application I need the SetTime, AckTime and ClearTime timestamps to be logged with one-second resolution, in other words with the fractional-seconds part equal to zero.
    I am using an Event Structure to get the SetTime, AckTime and ClearTime events, and I want to UPDATE the Area and Ack Comment fields through Database Connectivity. However, when I use the SetTime timestamp supplied by the Event Structure in the WHERE clause, I cannot get the right alarm record, because the LV SetTime timestamp and the timestamp value logged in the database have different time resolutions.
    Eduardo Condemarin
    Attachments:
    Logging Time Resolution.jpg ‏271 KB

    I'm configuring the variables to be logged in the same way that appears in the file you sent, but it doesn't work... I don't know what else to do.
    I'm sending you the configuration image file, the error message image and a simple VI that creates the database; after that, values are logged. I generate several values for the variable that are above the HI limit of the (previously configured) acceptance value, so alarms are generated. When I push the STOP button, the system stops logging values to the database and performs a query on the alarms database, and the corresponding error is generated... (file attached)
    The result: with the aid of MAX, I can see that the data is logged correctly in the DATA database (I can view the trace), but the alarm that is generated is not logged in the alarms database created programmatically...
    The same VI is used but creating another database manually with the aid of MAX and configuring the library to log the alarms to that database... same result.
    I tried the same conditions on three different PCs with the same result, and I tried reinstalling LabVIEW (development and DSC) completely (uff!) and it still doesn't work... what else can I do?
    I'd appreciate very much your help.
    Ignacio
    Attachments:
    error.jpg ‏56 KB
    test_db.vi ‏38 KB
    config.jpg ‏150 KB
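    One generic workaround for the resolution mismatch described above, shown only as a sketch (the table and column names are made up, and this is not from the original thread), is to match on the one-second window that contains SetTime instead of on exact equality:

    #include <stdio.h>
    #include <time.h>

    /* Build an UPDATE that matches any record whose set_time falls within the
       second containing 'set_time' (hypothetical schema). */
    void build_update(char *sql, size_t len, time_t set_time, const char *area)
    {
        char lo[32], hi[32];
        time_t next = set_time + 1;

        strftime(lo, sizeof lo, "%Y-%m-%d %H:%M:%S", localtime(&set_time));
        strftime(hi, sizeof hi, "%Y-%m-%d %H:%M:%S", localtime(&next));

        snprintf(sql, len,
                 "UPDATE alarms SET area = '%s' "
                 "WHERE set_time >= '%s' AND set_time < '%s'",
                 area, lo, hi);
    }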

  • Timer() resolution in CVI 2009 SP1

    In the CVI 2009 SP1 Contents Help for Timer() function I find
    The resolution is normally 1 microsecond. However, if you set the useDefaultTimer configuration option to True, the resolution is 55 milliseconds.
    But if you click on the useDefaultTimer hyperlink, a new page is opened, where you can read that the resolution is 1 millisecond if you set useDefaultTimer to False.
    Which is the right resolution?
    I wrote my application assuming 1 microsecond, but I found strange problems, and I now think it is actually 1 ms.
    Vix
    In claris non fit interpretatio
    Using LV 2013 SP1 on Win 7 64bit
    Using LV 8.2.1 on WinXP SP3
    Using CVI 2012 SP1 on Win 7 64bit, WinXP and WinXP Embedded
    Using CVI 6.0 on Win2k, WinXP and WinXP Embedded

    Hi vix,
    Thanks for bringing those documentation problems to our attention.
    1. There does seem to be a problem with a lot of the function prototypes in the .chm help in the CVI 2009 SP1 and 2010 versions, where an extra asterisk seems to have been added to a number of output parameters. We'll fix this and post a corrected version of the affected .chm files as soon as possible. In all these cases, the parameter data type as it appears in the Parameters section of the help topic is correct, as is the prototype in the corresponding function panel.
    2. Concerning the timer resolution issues, the async timer resolution is in fact 1 millisecond, as confirmed by the GetAsyncTimerResolution function. But note that asynchronous timers are a library in and of themselves, and are not covered by what is discussed in the documentation of the Timer() function or the useDefaultTimer option.
    The resolution of the Timer() function should be 1 microsecond, as of CVI 9.0 and later. The function documentation is correct. When that change was made, however, the documentation of the useDefaultTimer option was incorrectly not updated. It should say that "the performance counter timer provides a resolution of 1 microsecond".
    You mentioned that you didn't think that the resolution of the Timer() function was 1 microsecond. If you run the following code, what do you see?
    #include <utility.h>   /* Timer() and DebugPrintf() are in the CVI Utility Library */

    int     i;
    double  time[10];
    /* Take 10 back-to-back readings; with 1 us resolution, successive values
       should differ at the microsecond level rather than in 1 ms steps. */
    for (i = 0; i < 10; i++)
        time[i] = Timer();
    for (i = 0; i < 10; i++)
        DebugPrintf ("timer = %f\n", time[i]);

  • Mouse Coordinates and Timer Resolution

    First - I am not a Flash or ActionScript programmer. I have been asked by someone if I could port my ActiveX code (private) to the Flash client. In researching all of the Flash, Flex and ActionScript documentation, it appears that almost everything I need is present... Almost....
    A. My program relies on mouse coordinates being fed to it in twips. The only possible references I get to this issue have come from searching this forum.
    - Is it true that the X and Y coordinates are returned to ActionScript as floating-point values that represent fractional pixel values that I can translate to twips?
    B. My program also relies on good timer resolution. The Windows GetTicks() API is not sufficient, because it is returned via the Windows message queue and can be off enormously at a MouseMove event. Therefore, in my ActiveX code, I call QueryPerformanceCounter(), which gives me the resolution I need.
    - Can anyone tell me what timer API the Flash client engine is using for the values it returns?
    Thank you,
    Grant

    I still don't understand your problem and apparently nobody else does either since there are no responses. Why don't you write a simple program (like I did for you on your last post) that demonstrates the problem.
    "A picture is worth a thousand words".

  • Which software, C++ or LabVIEW, for high time resolution (µs)?

    Hi,
    I need to measure data with a time resolution of less than a few microseconds. Is this possible with LabVIEW? Or will I have to use C++?
    The interface cards I could use are the CB-68LP or the BNC-2120, both from National Instruments.
    Thanks
      Alex

    Alex,
    As the data acquisition is not timed by the software but by the hardware that you are using, it doesn't matter which programming language you choose for your task. By the way, you have only provided information about the external connector blocks, not about the data acquisition device that you will use.
    LabVIEW is capable of analyzing continuous data streams at several MB/second, which is comparable to C++, so I don't think that this will be the main criterion for deciding which programming language to choose.
    You should choose whichever approach is more appealing to you (graphical or text oriented). In my opinion the graphical approach is more intuitive and faster, and it's easier to maintain code that was written some time ago, but I don't want to start a debate on principles with this statement.
    If you prefer the text-based approach, please have a look at NI's Measurement Studio, which provides great features for user interface design, signal analysis and data acquisition.
    Best regards,
    Jochen Klier
    National Instruments Germany
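    To illustrate the hardware-timing point in C, here is a generic DAQmx sketch (the device name, channel and rate are assumptions; the achievable rate depends on the DAQ board sitting behind the connector block, not on the connector block itself):

    #include <NIDAQmx.h>
    #include <stdio.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 data[1000];
        int32 read = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        /* The sample clock runs on the board, so the 1 us spacing between
           samples does not depend on the OS or on the programming language. */
        DAQmxCfgSampClkTiming(task, "", 1e6, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, 1000);
        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                           data, 1000, &read, NULL);
        printf("read %d samples spaced 1 us apart\n", (int)read);
        DAQmxClearTask(task);
        return 0;
    }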

  • How to get better millisecond timer resolution

    I am running a Measurement Computing DAQ card. In software we are polling the card for voltage level at rapid intervals; these intervals are controlled using a call to the millisecond timer VI. We have set the software to poll at about 10 ms intervals. We can't seem to get it to poll the DAQ card that fast, and it is very inconsistent - sometimes 14 ms, sometimes even 20 ms. Is there a way to make the intervals more precise, down towards 10 ms, or is it because the OS doesn't return to the thread fast enough?
    Do you have any suggestions?

    Windows is not a deterministic OS. It's always doing other things in the background, so software timing is not reliable. Instead of polling, can't the board be set for continuous acquisition using its on-board clock at a 10 ms sample interval? If you're using LabVIEW 7.1 or greater, you can also look into using the timed loop.

  • How can I input an IRIG time as a timebase for DAQ?

    I've got a 6062E board for collecting analog data, but I need to have IRIG time from another source input as the timebase for the LabVIEW DAQ channels. How can I do this with LabVIEW and my existing hardware?
    thanks,
    jac

    Hello again,
    I wrote a VI to convert BCD to and from decimal and attached it here.  I quickly put it together moments ago, so it is likely not the most efficient possible, but it should work
    Hope this helps!
    Best Regards,
    JLS
    Best,
    JLS
    Sixclear
    Attachments:
    BCD To and From Decimal.vi ‏62 KB
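    Since the attached VI is LabVIEW-only, here is a rough text equivalent of the same idea in C; packed BCD (the encoding used by IRIG time fields) stores one decimal digit per 4-bit nibble:

    #include <stdio.h>

    /* e.g. 0x59 (packed BCD) -> 59 (decimal) */
    static unsigned bcd_to_decimal(unsigned bcd)
    {
        unsigned value = 0, place = 1;
        for (; bcd != 0; bcd >>= 4, place *= 10)
            value += (bcd & 0xF) * place;
        return value;
    }

    /* e.g. 37 (decimal) -> 0x37 (packed BCD) */
    static unsigned decimal_to_bcd(unsigned dec)
    {
        unsigned bcd = 0, shift = 0;
        for (; dec != 0; dec /= 10, shift += 4)
            bcd |= (dec % 10) << shift;
        return bcd;
    }

    int main(void)
    {
        printf("0x59 -> %u\n", bcd_to_decimal(0x59));   /* prints 59   */
        printf("37   -> 0x%X\n", decimal_to_bcd(37));   /* prints 0x37 */
        return 0;
    }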

  • Set time at start of DAQ to 0

    I am using the 'DAQ Assistant' (sub-VI) to measure some AIs, and I am then storing this data using 'Write LVM' (sub-VI). I acquire for a length of time when a trigger is received; the data is then stored and the program waits for the next trigger, where the process is repeated. When I look at the LVM file I see that the time axis doesn't start at 0 seconds. I think the time actually starts when I start the program running, because as you look at the files sequentially the start time always gets bigger, although I am not sure, as I am only using Express VIs, so it is all quite elusive and clever and not in my control. What do I need to reset in order for my time axis to start at 0 seconds?
    Nice

    Hello,
    I have had this post in my inbox for a few days and have been checking to see if you have posted further questions or if the posts from other users helped you out enough to get moving again with your project.
    If you have sorted out your problems then that is great, if you are still having the problems then please feel free to post to this thread again and I will monitor the thread for your response. I can then look into this further for you. I'm still not quite clear on exactly what you are trying to do though, so maybe if you could post a piece of example code which shows the problem then I will be in a better position to advise you.
    Best regards,
    Peter H
    Applications Engineer
    National Instruments UK

  • How to get rid of the "time out" when using DAQ AI in the example program

    I tried an example file called "AcqVoltageSamples_IntClkDigRef" (Visual C++ .NET). It works great. However, if the program has not received the data, it sends out a timeout message. How do I get rid of the "time out"? I cannot find it anywhere in the code. Is this a property I have to reset somewhere else?
    Thank you,
    Yajai.

    Hello Yajai,
    The example program will use the default value for the timeout, 10 seconds. To change this, you will have to set the Stream.Timeout value. I set this property in the example to -1, and the program will wait indefinitely for the trigger signal without timing out. Please see the attached image for how this was implemented.
    I hope this helps. Let me know if you have any further questions.
    Regards,
    Sean C.
    Attachments:
    SetTimeout.bmp ‏2305 KB
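    The change above is specific to the .NET API (Stream.Timeout). The equivalent idea in the DAQmx C API is to pass DAQmx_Val_WaitInfinitely (-1) as the read timeout; a minimal sketch, with the task assumed to be configured elsewhere:

    #include <NIDAQmx.h>

    /* Read n samples per channel, waiting indefinitely instead of timing out. */
    int32 read_block(TaskHandle task, float64 *data, int32 n)
    {
        int32 read = 0;
        return DAQmxReadAnalogF64(task, n, DAQmx_Val_WaitInfinitely,
                                  DAQmx_Val_GroupByChannel, data,
                                  (uInt32)n, &read, NULL);
    }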

  • Can I get better time resolution than 1 ms in LabVIEW?

    Hello,
    I am trying to generate an external waveform which requires timing in the microsecond range. I was trying to use the Wait function to define the timing dt, but found that I could not get below 1 ms. I tried to specify timing in the microsecond range by overloading the Wait function with a float (0.200) to represent 200 µs, but found the behavior to be erratic. Again, I am trying to create a waveform for external hardware which does not support buffering and requires software to drive.
    Thanks!
    Joel Abdullah

    "I am trying to create a waveform for extenal hardware which does not support buffering and requires software to drive."
    Sorry but that simply can not be done in a Windows environment due to the limitaions of the OS (1KHz was "way-fast" when WIndows replaced DOS).
    It can be done in a Real-Time OS like Pharlap for example.
    Attempts to deterministically perform any operation at a rate faster than 1KHz in a non-deterministic OS can be best described as an "exercise in futility."
    Trying to help,
    Ben
    Note to Waldermar:
    Timed loops can perform rather well at 1KHz or even better if you have a hardware clock available.
    Message Edited by Ben on 05-28-2008 10:47 AM
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Method to return bit resolution of detected DAQ card

    In the application I am developing, I am looking at the possibility of a user using one of several scope cards. The only thing that is important to me right now is to have a VI that returns the bit resolution of the detected card (i.e., is it an 8-bit, 12-bit, 14-bit, etc. card). Any ideas?

    Actually, NI-SCOPE (2.7 and later) has a property named "Resolution" found under the "Acquisition" properties that reads the actual resolution in bits of valid data for the session's High Speed Digitizer.
    Best Regards
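    As a rough sketch of reading that property from the NI-SCOPE C API (the resource name is a placeholder, and NISCOPE_ATTR_RESOLUTION is assumed to be the C constant behind the "Acquisition:Resolution" property mentioned above):

    #include <niScope.h>
    #include <stdio.h>

    int main(void)
    {
        ViSession vi = VI_NULL;
        ViInt32 bits = 0;

        niScope_init("Dev1", VI_TRUE, VI_TRUE, &vi);
        /* NISCOPE_ATTR_RESOLUTION: assumed constant for "Acquisition:Resolution" */
        niScope_GetAttributeViInt32(vi, "", NISCOPE_ATTR_RESOLUTION, &bits);
        printf("digitizer resolution: %d bits\n", (int)bits);
        niScope_close(vi);
        return 0;
    }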

  • Time resolution on the Macintosh

    I am trying to learn the resolution available with System.currentTimeMillis() on different platforms. I found references to the resolutions for Windows 9x (~54ms), Windows NT/2k/XP (~10ms) and Linux/Solaris (1 ms). However, I have not found a reference for the Mac OS or OS X. Could anyone point me toward that information?

    Try here, it's a good start:
    http://developer.apple.com/java/javaintro/docs.html

  • How can I make a timer that will record elapsed time up to 15 hours with a PCI-6503E DAQ card?

    I'm a beginner LabVIEW programmer, but I need to take measurements of elapsed time from the start of data acquisition for up to 15 hours. I'm using a PCI-6503E card. Looking at examples on the web and in LabVIEW itself, I was only able to find a way to take time measurements with a DAQ card up to 167 s, unless two counters were tied together, but there are no instructions on how to do that. Could someone help me? Thanks.

    I suggest you look in the Resource Library: it contains plenty of useful examples.
    I found this VI with two cascaded timers in the Measurement Hardware > Counter/Timer > Event/Time Measurement category:
    http://zone.ni.com/devzone/devzoneweb.nsf/opendoc?openagent&5293158F4EC950C4862568C1005F6CD9&cat=C7029C9ACBD3DF7386256786000F8EE6
    Hope this helps.
    Roberto
    Proud to use LW/CVI from 3.1 on.
    My contributions to the Developer Zone Community
    If I have helped you, why not giving me a kudos?
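    The 167 s ceiling mentioned in the question is simply the rollover of a single 24-bit counter driven by a 100 kHz timebase (2^24 / 100 kHz ≈ 167.8 s); cascading two counters, as in the linked example, extends the range far beyond 15 hours. A small sketch of the arithmetic (the counter width and timebase are inferred from that 167 s figure):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double timebase_hz = 100e3;                      /* 100 kHz timebase   */
        double single      = pow(2, 24) / timebase_hz;   /* one 24-bit counter */
        double cascaded    = pow(2, 48) / timebase_hz;   /* two cascaded       */

        printf("single counter rolls over after %.1f s\n", single);
        printf("cascaded counters roll over after %.3e s (decades)\n", cascaded);
        printf("15 hours = %.0f s, well within the cascaded range\n", 15.0 * 3600);
        return 0;
    }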
