Scanning rate with an ISA board

I have a problem with an ISA board. I use low-level I/O functions to scan an input port in a loop.
I have something like this:
do{
    x=inp(address);
}while(flag);
This gives a good scanning rate (1-10 usec),
but I have no control over my program.
If I insert into the loop a function like GetUserEvent() or
ProcessUserEvent() to catch a keystroke for terminating
the loop, the scanning rate becomes very low (30-50 msec).
Is there a way to control a loop like this and at the same
time keep a good scanning rate?

The only other feasible way to process events during your scanning loop is to make the program multithreaded, with the user interface running in the main thread and the processing loop running in a second thread. Once it is set up this way, you can use global flag variables to pass control information back and forth, and thread-safe queues to reliably pass data between the threads.
Take a look at the "Multithreading Overview" linked from the online help in CVI 5.5. It can also be found as a PDF file on your hard drive in the cvi\bin directory. Multithreading is a new feature in CVI 5.5 and is not available in earlier versions.
Hope that helps you out.
Jason Foster
Applications Engineer
National Instruments
www.ni.com/ask

Similar Messages

  • DAQ scan rate with external timebase: what will happen to the channel rate?

    PCI-MIO-16XE-50: I make the scan clock use a much more stable external timebase via the Alternate Clock Rate spec in AI Clock Config.vi. In this case, what happens to the channel clock? Which timebase will it use, the external one or the internal one? If the internal one, does that mean the final data from the DAQ is not stable and accurate? Shall I configure both the scan rate and the channel rate using the external timebase?

    When the external timebase is changed using the RTSI bus as described in http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/5601c8268d625a3c86256cf50071199d?OpenDocument, this changes the DAQ card's Master Timebase. The MasterTimebase signal, or Onboard Clock, is the timebase from which all other internally generated clocks and timebases on the board are derived. Therefore, both the scan and channel rates will be based on the external timebase.

  • How to set scan rate with NI Switch scan voltages

    Hi
    I have an SCXI 1130 switch and an NI 4070 DMM, with 3 voltage channels connected on the SCXI.
    When I read just one channel at a time, I get a correct voltage reading; here I gave the scan input as ch0->com0.
    Later, I placed a For Loop in the block diagram and programmatically wired the scan channel input
    for the switch and read the voltage output from the DMM, but I do not get the correct outputs.
    That is, for my 3 channels I gave the For Loop an iteration count of 3 and appended 'i' to form ch i ->com0, but the DMM measurement is not right. If I highlight execution in the block diagram (if I turn on the bulb and the probe), I can see the correct output voltages coming out; the moment I turn off execution highlighting, the program gives incorrect output. So do I have to add a scan delay, or what timing has to be set to get correct values? I am using a software trigger in the block diagram.

    Hi Hema,
    CJC is an acronym for Cold-Junction Compensation, and this value adjusts for the change in voltage caused by the thermocouple wire to copper wire junction. 
    For example, a J-type thermocouple will have thermocouple wire consisting of iron and constantan metals.  When these iron and constantan metals meet the copper at the switch connection, a difference in voltage results.  This difference in voltage is the "cold-junction".  The difference in voltage resulting from the iron and constantan connection in the thermocouple is the "hot-junction".  When you measure temperature using a thermocouple, what you desire is the "hot-junction" change in voltage.  Unfortunately, the DMM is going to measure the sum of both the "cold" and "hot" junctions, and a CJC measurement is needed so we can adjust the measurement to remove the undesired offset.
    Once Cold-Junction Compensation is performed, converting from voltage to temperature is fairly simple.  Each thermocouple type has its own temperature to voltage conversion equation and associated coefficients.  Here's a great resource for the equations, coefficients, and specific voltage to temperature tables:
    NIST ITS-90 Thermocouple Database
    http://srdata.nist.gov/its90/main/
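    The compensation itself is a small calculation once you have the conversion functions. Below is a rough C sketch of the idea; the linear 52 uV/C factor is only an illustrative approximation of type-J sensitivity, and real code should use the full NIST polynomials linked above:

```c
/* Sketch of cold-junction compensation: convert the CJC temperature
 * to the equivalent thermocouple voltage it offsets, add it back to
 * the measured voltage, then convert the total to temperature. */

#define UV_PER_C 52.0   /* rough type-J sensitivity, illustration only */

/* Voltage (uV) contributed by the cold junction at cjc_temp_c (C). */
double cjc_voltage_uv(double cjc_temp_c)
{
    return cjc_temp_c * UV_PER_C;
}

/* Hot-junction temperature (C) recovered from the DMM reading (uV)
 * plus the separately measured cold-junction temperature. */
double compensated_temp_c(double measured_uv, double cjc_temp_c)
{
    double total_uv = measured_uv + cjc_voltage_uv(cjc_temp_c);
    return total_uv / UV_PER_C;  /* inverse conversion, same approximation */
}
```

    For example, a DMM reading of 0 uV with the cold junction at 25 C means the hot junction is also at 25 C, since the two junction voltages cancel.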
    Hope this helps!
    Chad Erickson
    Switch Product Support Engineer
    NI - USA

  • Scan Rate - Sample Rate confusion with the E-Series

    I have been reading a lot of posts on the message board concerning the scan rate and the sample rate, but I still don't get it. I am using an E Series card and am aware that it does not sample
    each channel simultaneously.
    I want a time stamp for each value I get from each channel. If I have channels 1, 2, 3, 4 with rate divisors
    1, 2, 4, 8,
    the sample rate set to X,
    and the scan rate set to X/(number of channels + 1) (a ghost channel in order to avoid error -10092), i.e. X/5,
    what is the sample rate of each channel?

    VKral,
    Here are a few KnowledgeBase entries that I hope will clear up the confusion:
    Data Acquisition Sampling Terminology
    http://digital.ni.com/public.nsf/websearch/DBE7AC32661BCF9B86256AC000682154?OpenDocument
    How Is the Actual Scan Rate Determined When I Specify the Scan Rate for My Data Acquisition?
    http://digital.ni.com/public.nsf/websearch/5782F1B396474BAF86256A1D00572D6E?OpenDocument
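    In brief, the relationship those entries describe for a multiplexed board can be written down directly (a sketch; channels_per_scan includes the ghost channel, the function names are mine, and per-channel rate divisors are left out):

```c
/* A "scan" reads every channel in the list once, so on a multiplexed
 * board the channel (sample) clock runs channels_per_scan times faster
 * than the scan clock, while each individual channel is revisited only
 * once per scan. */

/* Rate at which any single channel is actually sampled. */
double per_channel_rate(double scan_rate)
{
    return scan_rate;
}

/* Rate of the channel (convert) clock driving the multiplexer. */
double channel_clock_rate(double scan_rate, int channels_per_scan)
{
    return scan_rate * channels_per_scan;
}
```

    So with the scan rate set to X/5 and 5 slots per scan (4 channels plus the ghost), the channel clock runs at X and each channel is sampled X/5 times per second.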
    Hope this helps. Have a great day!

  • How can I record several channels of data at different rates with the same card?

    I am building an application in which I need to record several channels of DC voltage data to disk at different rates, for example, Channel 1 at 40 Hz, Channels 3-5 at 10 Hz, Channels 6-8 at 1 Hz, etc.
    All these channels are time-referenced, so I need to make sure data for channels recorded at slower rates line up with their correct time stamp.
    Currently, I am building all data into a 2D array one row per loop iteration.

    Hello,
    At this time, NI DAQ boards have only one scan clock, so you can define only one scan rate for all of your channels.
    The best approach for you is to sample all channels at the highest rate you need, and then discard the samples you don't need to save.
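    That decimation step can be sketched in C (a rough sketch; the 40/10/1 Hz rates are the ones from the question, and the helper names are mine):

```c
#include <stdbool.h>

/* Number of fast-rate scans per kept scan for a slower channel,
 * e.g. acquiring at 40 Hz and keeping 10 Hz gives a factor of 4. */
int decimation_factor(double acq_rate_hz, double target_rate_hz)
{
    return (int)(acq_rate_hz / target_rate_hz + 0.5);
}

/* Keep scan i for a channel decimated by `factor`; because every kept
 * row is an exact multiple of the fast scan period, the slow channels
 * stay lined up with their correct time stamps. */
bool keep_scan(long scan_index, int factor)
{
    return scan_index % factor == 0;
}
```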
    Hope this helps.
    Filipe

  • How to set the scan rate in the example "NI435x.vi" for the 435x DAQ device?

    I am using LabVIEW 6i and a 4351 Temperature/Voltage DAQ device, with the example VI "NI 435x thermocouple.vi".
    How do I set the scan rate in this VI? I added a counter just to see the time between acquisitions, and it is roughly 4 s for only 4 channels. The notch filter is set at 60 Hz, but is there a control for the scan rate in this example? I'd like to acquire at around 1 Hz.

    Using this DAQ example, it may be easiest to simply place a wait function in the loop and software-time when the data is extracted from the board. You can also take a look at the NI-435x palette and the examples it ships with, along with the timing specs on page 3-3 of the 435x manual, and set the filters to meet your needs:
    http://www.ni.com/pdf/manuals/321566c.pdf
    Regards,
    Chris

  • DAQ MIO scan rate, channel rate and external timebase

    I am using a PCI-MIO-16XE-50 to collect data. If I use RTSI to replace the internal timebase with my external one (which is more stable), and then define a scan rate of 1000 scans/s without defining a channel rate (using the default setting), will both the scan and channel rates be based on the external timebase? Or only the scan rate, or neither of them? I am confused. Thanks for the help.

    When the external timebase is changed using the RTSI bus as described in http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/5601c8268d625a3c86256cf50071199d?OpenDocument, this changes the DAQ card's Master Timebase. The MasterTimebase signal, or Onboard Clock, is the timebase from which all other internally generated clocks and timebases on the board are derived. Therefore, both the scan and channel rates will be based on the external timebase.

  • How do I achieve a time stamp, in the first column of my spreadsheet in Excel, showing the time interval increased by the scan rate?

    From the AI Start VI to the Format and Append VI to Concatenate Strings: the second input is a tab, then every other one is a tab. The third and fifth inputs are attached to the Get Date/Time VI, then end of line, then channel 0, tab, channel 1, tab, and so on through channel 7. The last input is end of line. On the spreadsheet I receive the scan rate, then the date, then the time, all on the first row. The next row has the column labels, and the data goes to the correct columns, but there is no time stamp column. I would like it to be in column "A", even with the first row of data, e.g.:
    10:01:01 200 300 400
    10:01:03 200 300 400
    Thank you for your help, Joe Bob Crain

    I think the best way is to generate an array of time values and send it to your spreadsheet before the channel 0 data.
    To generate the array you can use Ramp Pattern.vi from 0 to (number of scans performed) / 'Actual Scan Rate' (from AI Start).
    That way your spreadsheet will have the first line the same as now, followed by columns with time (secs), channel 0, channel 1, ... channel 7.
    If you need the time column as date-and-time strings, you can use Get Date/Time in Seconds.vi when you begin the acquisition, add the seconds obtained to the array calculated above, then use Format Date/Time String.vi to obtain the time stamps you need, build an array of time stamps, and send it to your spreadsheet.
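    In C terms, the time column Roberto describes amounts to the following (a sketch; the function name is mine):

```c
/* Fill `out` (length n) with one time value per scan: row i is
 * start_seconds + i / scan_rate, the same ramp the Ramp Pattern
 * approach produces, offset by the acquisition start time. */
void time_column(double *out, long n, double start_seconds,
                 double scan_rate)
{
    for (long i = 0; i < n; i++)
        out[i] = start_seconds + (double)i / scan_rate;
}
```

    Prepending this column to the data rows gives each scan the time stamp matching its position in the acquisition.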
    Hope this helps
    Roberto
    Proud to use LW/CVI from 3.1 on.
    My contributions to the Developer Zone Community
    If I have helped you, why not give me kudos?

  • How to calculate min/max scan rate of PCI-MIO-16XE-50

    My board is a PCI-MIO-16XE-50 with a 20 MHz timebase.
    The min/max scan rate is 0.00596-20K scans/s and the channel rate is 1.53-20K channels/s.
    How are these numbers calculated?

    Hi May,
    Check out the KnowledgeBase entry located at http://digital.ni.com/public.nsf/websearch/EA4E306828AF05B586256B8F00661A8B?OpenDocument . I think it answers your question.
    Serges Lemo
    Applications Engineer
    National Instruments

  • Calculating motion rate for given line scan rate

    Dear friends,
    I need to understand how to calculate the motion rate for a given line scan rate.
    I have a line-scan camera connected to a servo motor.
    I know that the line-scan camera can run at a 7.2 kHz line rate (7200 lines in 1 second).
    Each pixel needs to be 0.1 mm (in the direction of motion), so 10 lines of scanning is 1 mm.
    I need to scan an image of an object whose length is 400 mm (0.1 mm * 4000 lines).
    First, the conveyor stops; the object I need to test with the line-scan camera is now 5 cm from the camera.
    I want the servo motor to move the conveyor 45 cm and stop.
    (The conveyor accelerates, moves at constant speed, decelerates, and stops. The line-scan camera is connected to a 10 kHz-per-revolution encoder.)
    Now I'm getting lost with the calculation...
    If 1 pixel = 0.1 mm, do I need the conveyor belt to move 0.1 mm for each pulse of the encoder?
    I want the servo motor to rotate a chain (like the attached image),
    but I don't know how to calculate the gear diameter, the chain pitch, etc.,
    and especially how these parameters influence the line scan rate, and what the role of the encoder is here.
    Thanks for any support.
    Moti 
    Attachments:
    Chain_gear.jpg ‏17 KB

    It sounds like you are mixing two methods of using a line scan camera.
    The first method is to run the camera at a fixed speed. If the speed of your belt is constant, this can work okay. To calculate the desired speed, let B = belt speed (mm/sec), L = line rate (lines/sec), and R = mm/line (0.1 mm/line in your case):
    B = L * R. This equation can be rearranged to calculate any of the quantities.
    The second method is to use an encoder to trigger each line captured by the camera. In this case the speed of the belt is not important as long as you don't go too fast for the camera, but you need to know how many mm the belt travels for each pulse. To calculate this, you would probably measure how far the belt travels during one revolution of the encoder; you should also know the number of encoder pulses per revolution. Dividing distance by pulses gives you distance per pulse. Since you want 0.1 mm/pulse, you would need to adjust either the number of pulses in the encoder or the diameter of the belt roller. If E = encoder pulses per revolution and D = belt distance per revolution, then R = D / E.
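    Both relations can be captured in a couple of lines of C (a sketch using the numbers from the question; the function names and the 1000 mm/revolution belt travel are mine, for illustration):

```c
/* Fixed-speed method: belt speed B (mm/s) equals
 * line rate L (lines/s) times pixel pitch R (mm/line). */
double belt_speed_mm_s(double line_rate_hz, double mm_per_line)
{
    return line_rate_hz * mm_per_line;
}

/* Encoder method: distance per pulse equals belt travel per
 * encoder revolution D (mm) divided by pulses per revolution E. */
double mm_per_pulse(double mm_per_rev, double pulses_per_rev)
{
    return mm_per_rev / pulses_per_rev;
}
```

    With the camera's 7200 lines/s maximum and 0.1 mm/line, the belt could move at up to 720 mm/s; and if, say, one encoder revolution moved the belt 1000 mm, the 10000-pulse encoder would give exactly the 0.1 mm/pulse you want.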
    Bruce
    Bruce Ammons
    Ammons Engineering

  • Converting Units in LabView and specifying sampling rate with Universal Library functions

    Hi,
    I am having trouble converting units in LabVIEW 7.0 and having it write to a file / output on a chart in Nm instead of in volts. I can't seem to find any straightforward instructions, even though it seems like a simple task.
    Another task that would seem rather simple is changing the sampling rate with the XAIn program from the Universal Library for LabVIEW (version 5.40).
    If anyone could help me out, I would greatly appreciate it; I have tried other sources without much luck!
    Thanks
    Jenna

    Do you really need to change units, or do you need to scale the voltages read from your sensors? If it's scaling, it would be trivial to set up in MAX or with the DAQ Assistant if you were using an NI board. Since you are not, you could use the Scaling and Mapping Express VI. The y-axis of a chart is just a label; you can use the text tool to change it to anything you want.
    I have no idea how to change the sample rate for your board. Have you asked the vendor?

  • Keithley 2701 LV driver: Setting the scan rate?

    I downloaded the Keithley 2701 LV driver and the example programs work great. There is just one question that I have: I cannot find anywhere in the examples or in the LV driver how to change the scan rate or scan interval. Everything else I'm able to change, such as number of points, channels, voltage levels, etc., but I can't figure out how to change my scan rate to once every 0.5 seconds or 1 second.
    Maybe this is a question for Keithley, but since it is a LV driver I decided to post here first.
    I'm using LV 8.2.
    Jeff

    I am using the Keithley 2700 with the 7700 cards. There is a VI named "ke27xx Configure Aperture Timing.vi" in "ke27xx.llb".
    The VI sets the integration time for the A/D converter. I was measuring 500 uV and needed to improve my accuracy, so I increased my integration time.
    Brian
    LV 8.2

  • During continuous scanning of thermocouples with an SCXI 1120 module, is the CJC reference read on each scan?

    Did the developers of the DAQmx drivers improve the temperature scan capability of the SCXI 1120 module over the methods used in the Traditional drivers?
    When using the SCXI 1120 module for temperature measurement in continuous scan mode with the Traditional drivers, the cold-junction reference was checked only at the initial start of the scan; in order to maintain accuracy it was necessary to restart the scan.
    For the most part I have switched to the 1125 module for thermocouple measurement, but I need to use some 1120 modules on a test stand now, and I will be using the DAQmx drivers with them. I was hoping that this behavior was corrected in the DAQmx drivers.
    I can conduct testing to determine this, but would like to know ahead of time if it is still an issue.
    Thanks

    Thank you Ben for the answer.
    I checked the article you referenced; it appears to me that I will be adding the physical address in the channel strings, then doing the voltage-to-temperature conversion, and then correcting the appropriate channels associated with the CJC in question. I will also need to adjust some on-board jumpers on the SCXI 1120. If I am wrong, please correct me.
    If scanning the physical channel is possible as suggested in this article, would it not be possible to create the channel as a virtual global channel and reference it as the cold junction in MAX? Then the temperature adjustment could be done at that level of acquisition rather than having to sort it out in the LabVIEW program. This takes me back to the days of LabVIEW 4.0, before we had virtual channels and had to handle the temperature conversions and other scaling ourselves. Again, if I am incorrect please let me know, for I would like to use the method recommended in the article if I don't have to perform all the calculations in the application.
    Just wondering if anybody reading this has actually used this method. If so, how did it work for you?
    I may end up building a start-up routine that tests for 1120 modules being used for thermocouple measurement and, if any are found, runs a stop-and-restart of the scan every 5 minutes to prevent large errors from being introduced.

  • Non-time based scan rate

    How do I acquire analog data with a scan rate based on a digital trigger rather than on time? I have two signals, one analog and one digital. I want to acquire an analog data point every time the digital signal goes from high to low and from low to high.
    Thank you.

    I don't think NI DAQ hardware supports triggering on both edges; it appears you have to trigger on one or the other.
    The only thing that I can think of to try is to take your analog signal and split it into two different analog inputs. Use the example "Acq&Graph Voltage-Int Clk-Dig Start.vi" and insert it into a new VI twice. You will have to open it and modify the connector pane so that the Physical Channel and Trigger Edge are inputs and the graph is an output. Then wire each copy to a different analog input and trigger edge and run them sequentially. Hopefully it will run fast enough for you.
    Randall Pursley

  • During slow scan rates, are events captured between scans?

    We are using 32 thermocouples to measure heated-cabinet performance and are using the Continuous Acquisition to Spreadsheet VI. I am saving the data to a text file while plotting the data to a chart in LabVIEW 6.1. I only wanted to read temperature every 60 seconds (0.01667 scans/sec), but the scan rate control rounds off to 2 decimal places, so at a 0.02 scan rate we read data every 50 seconds. I set the buffer size to 1 and the number of scans to write at a time to 1. Does this mean I have a lag of one scan in the buffer, or am I seeing the latest scan written to the chart? Also, temperature events between writes to the screen sometimes show up on the chart, sometimes not; it seems to depend on how long the event lasts or how close it is to the next write. I would appreciate any insight on how the software handles events between writes. Thanks.

    Willard,
    You can actually change the digits of precision in LabVIEW to display more than 2 decimal places: right-click on the scan rate control, select 'Format and Precision', and set the digits of precision to 5.
    By setting the buffer size to 1 you allocate memory to hold only 1 scan at a time, and by setting the number of scans to write at a time to 1, every time the AI Read VI is called within the while loop it reads only 1 scan from the buffer. The hardware is set up to read 1 scan every 50-60 seconds and places that scan in the buffer; each time the AI Read VI is called in software, it reads that 1 scan. If the AI Read VI has already read
    the scan, it waits until the buffer receives the next scan from hardware (50-60 seconds) and then plots it on the chart. So, with this set-up you will not see any lag in the temperature data: once the hardware scans, the software almost immediately reads that scan. I hope this helps.
    Regards,
    Todd D.
    Applications Engineer
    National Instruments

Maybe you are looking for

  • Black screen (or screen refresh) during the process of start up. T400, Vista home basic.

    My 2 weeks old T400 Vista home basic can start up successfully. But every time when I start or restart it, during the start process, after inputting the password, there are 4 times of black screen (or screen refresh). Is it normal? Thanks. 

  • Acrobat 9 Pro Extended: How to use Windows 7 Snipping Tool to Paste into PDF

    Acrobat 9 Pro Extended on Windows 7. I frequently use the Windows built-in Snipping Tool to copy parts of pages and paste them into other applications. Works great. I can't find a way to let Acrobat allow me to Paste. I create the snip like I always

  • Loading apps problem

    I seem to have a problem with my iphone 3g. Im trying to download apps from the app store. its lets me, just when the icon pops up on the screen it says loading but nothing happens. i thought maybe it was just taking longer then normal. so i download

  • Error "Maximum number of items in FI reached"

    Hi All, We are getting error "Maximum number of items in FI reached", if there are more than 999 items. We have found an SAP note "Note 36353 - AC interface: Summarizing FI documents". This note seems to do more with customizing & doesnt need any cod

  • Calling sp2010 workflow from within sp2013 workflow

    Hi, I have created a sp2010 workflow on a list in sp2013 through designer. The workflow has just one step in it - "Send an email". Now I have a 2013 workflow as well on the same list. I inserted a ""start a sp2010 workflow"" step in it. Now when I tr