SCXI-1125 scan rate controlling

The SCXI-1125 scan rate is 333 kHz in multiplexed mode. Does anyone know how to slow the scan rate down?
regards.
naushica

According to the web page "http://digital.ni.com/public.nsf/allkb/410A70C25A4D12B486256A1E0070BDAE",
you lose resolution as the scan rate increases. Is that right?
If the scan rate is 100 kS/s and the input range is ±5 V with 14-bit resolution, does that mean I have a noise floor of about 600 uV for that range?
If I want to measure a signal in the ±1 V range, does that mean the noise floor is around 120 uV?
The absolute accuracy table, on the other hand, states an accuracy for each full-scale range but says nothing about scanning speed.
For example, the data sheet for the 18-bit ADC module PXI-6289 states that the ±5 V range has a full-scale accuracy of 510 uV. Converted to bits, that is less than 15 bits.
So if I reduce the scanning speed to about 50 kS/s, will I get 16-bit accuracy?
I have a SCXI-1125 + PXI-6289 installed in a PXI-1052 combo chassis.
Please help on this matter.

Similar Messages

  • During slow scan rates, are events captured between scans?

    We are using 32 thermocouples to measure the performance of a heated cabinet, with the Continuous Acquisition to Spreadsheet VI. I am saving the data to a text file while plotting it to a chart in LabVIEW 6.1. I only wanted to read temperature every 60 seconds (0.01667 scans/sec), but the scan rate control rounds off to 2 decimal places, so at a 0.02 scan rate we read data every 50 seconds. I set the buffer size to 1 and the number of scans to write at a time to 1. Does this mean I have a lag of one scan in the buffer, or am I seeing the latest scan written to the chart? Also, temperature events between writes to the screen sometimes show up on the chart and sometimes not; it seems to depend on how long the event lasts or how close it is to the next write. I would appreciate any insight on how the software handles the events between writes. Thanks.

    Willard,
    You can actually change the digits of precision in LabVIEW to display more than 2 decimal places. If you right-click on the scan rate control and select 'Format and Precision' you can alter the digits of precision to include 5 decimal places.
    By setting the buffer size to 1 you are allocating memory to hold only 1 scan at a time. By setting the number of scans to write at a time to 1, every time the AI Read VI is called within the while loop it will only read 1 scan from the buffer. The hardware is therefore set up to read 1 scan every 50-60 seconds, and it places that scan in the buffer. Each time the AI Read VI is called in software, it reads 1 scan from the buffer; if it has already read that scan, it waits until the buffer gets the next scan from hardware (50-60 seconds) and then plots it on the chart. So with the setup you have, you will not see any lag in the temperature data. Once the hardware scans, the software will read that scan almost immediately. I hope this helps.
    Regards,
    Todd D.
    Applications Engineer
    National Instruments

  • How to increase scan rate of NI Switch SCXI 1130

    Hi,
    I have an NI PCI-4070 DMM used with an NI SCXI-1130 switch module, with 10 thermocouples connected to the 1130. I am scanning the channels and reading the values in the program using the niSwitch and niDMM VIs. I am using a software trigger, configured in niDMM Configure Trigger and niDMM Configure Multipoint. I get the correct values when I scan using chi->com0, where i goes from 0 to 9, but the rate of scanning is very slow.
    There is niSwitch Configure Scan Rate.vi; there I have set the scan delay to 0 seconds.
    It takes one second per channel when I run the program. Why is this? Is it because I used a software trigger for each channel scan? How can I improve the scan rate?

    Sorry for the confusion, I started writing a post and got interrupted and came back to it too late.  You can disregard the last post and here is my final answer:
    I would actually recommend that you use synchronous scanning if you want to maximize the speed of your scan, rather than using software triggers.  With synchronous scanning, the DMM generates a digital pulse (Measurement Complete) each time it completes a measurement, allowing the switch to advance to the next entry in the scan list the instant the DMM has finished.  The DMM then takes the next measurement after a specified hardware-timed interval.  This is much more efficient than sending software triggers back and forth to time the scanning.  To set up your application using synchronous scanning, follow these steps:
    Open the LabVIEW shipping example "niSwitch DMM Switch Synchronous Scanning.vi", found in the NI Example Finder in the folder Hardware Input and Output » Modular Instruments » NI-SWITCH (Switches).
    Physically connect the Measurement Complete output trigger from the DMM to the trigger input of the switch.  How you will do this depends on what type of chassis you are using (PXI/SCXI combo chassis or separate chassis) and what switch terminal block you're using.  If you need assistance with this please provide more details about your hardware setup and I'd be happy to help out.  The following resource may be helpful here: KnowledgeBase 3V07KP2W: Switch/DMM Hardware Configurations.
    Select valid values for all other front panel controls and run the VI.
    I hope this is helpful.  Please let me know if I have misunderstood your application, or if you would like me to go into more detail on any specific part of the solution provided above. 

  • Need continuous acquisition of 16 channels through SCXI-1125 to PCMCIA DA board + FFT on all channels. Can it be done?

    Where's the limit? Assume a new Dell Latitude running NT with 128 MB RAM and a fast P3. Equip it with the DAQCard-AI-16E-4 connected to an SCXI chassis with 2 SCXI-1125 modules.
    How fast can I run the sample rate (per channel) if I want to do continuous DAQ on all 16 channels while computing the power spectrum of all channels once per second (using one second's worth of samples)?
    The DAQCard has a tiny FIFO, and I'm hoping to use pretriggered scans so I don't have to wait for data. Is that an option without the DMA functionality of the PCI E-Series boards? Without it I will have to constantly empty the FIFO into a software buffer, which will cost beaucoup (many) processor cycles, correct?
    So what say the experts... how fast can this thing run? How about if I ditch the laptop and go PXI with the whole deal? Let's make the case!
    Thank you!

    I believe the PCMCIA interface is going to be the limiting factor in this case. The maximum continuous rate for a DAQCard will vary from computer to computer, but it isn't extremely fast.
    Being optimistic, let's estimate 64K samples/second. I don't know if this is high or low nowadays, but it is higher than I could ever achieve a couple of years ago. Divide that by your 16 channels and you have 4K-sample blocks per channel. I think you could process this in 1 second easily. This gives you 1 Hz resolution up to 2 kHz.
    Now, if you ditch the laptop and go to PXI, it is a whole different story. DAQ rates will not be the limiting factor. You just need to find out how large of a block you can process and keep up.
    My approach to this problem would be to figure out what analysis you need, then figure out what hardware can accomplish it. Do you really need 1 Hz resolution? Do you need to process every bit of data that comes in? Are you going to store the results, or just display them on the screen? Can you do post-processing of the data if necessary?
    Bruce
    Bruce Ammons
    Ammons Engineering

  • Scanning rate with a ISA board

    I have a problem with an ISA board. I use low-level I/O functions in a loop to scan an input port.
    I have something like this:
        do {
            x = inp(address);
        } while (flag);
    This way I get a good scanning rate (1-10 usec), but I have no control over my program.
    If I insert into the loop a function such as GetUserEvent() or ProcessUserEvent() to catch a keystroke and terminate the loop, the scanning rate becomes very low (30-50 msec).
    Is there a way to control a loop like this and keep a good scanning rate at the same time?

    The only other feasible way to process events during a scanning loop like yours is to make your program multithreaded, with the user interface running in the main thread and the processing loop running in a second thread. Once it is set up this way, you can use global flag variables to communicate process information back and forth, and use facilities like thread-safe queues to reliably pass data between threads.
    Take a look at the "Multithreading Overview" linked to in the online help in CVI 5.5. It can also be found as a .pdf file on your hard drive in the cvi\bin directory. Multithreading is a new feature in CVI 5.5 and is not available in earlier versions.
    Hope that helps you out.
    Jason Foster
    Applications Engineer
    National Instruments
    www.ni.com/ask

  • How to set the scan rate in the example "NI435x.vi" for the 435x DAQ device?

    I am using LabVIEW 6i, A 4351 Temperature/Voltage DAQ, and am using the example vi "NI 435x thermocouple.vi"
    How do I set the scan rate in this VI? I added a counter just to see the time between acquisitions, and it is roughly 4 s for only 4 channels. The notch filter is set at 60, but is there a control for the scan rate in this example? I'd like to acquire at around 1 Hz.

    Using this DAQ example, it may be easiest to simply place a wait function in the loop and software-time when the data is extracted from the board. You can also take a look at the NI-435x palette and the examples it ships with, along with the timing specs on page 3-3 of the 435x manual, and set the filters to meet your needs:
    http://www.ni.com/pdf/manuals/321566c.pdf
    Regards,
    Chris

  • Scxi 1125 R202 getting burned

    Wondering if anyone else has seen this issue.
    I have about 200 of these cards (SCXI-1125) in service in production test machines.  Quite a few of the failed units have R202 burned on the SCXI-1125 card.  Sometimes replacing R202 fixes the issue, but more often it does not.  Since I do not have a schematic of the 1125, it is hard for me to tell whether this is a design flaw or something we are doing.  Luckily, I also have a stack of 32 of these cards that were purchased for a project that was cancelled, so spares are thankfully not an issue.

    We are testing power supplies, specifically DC-to-DC converters.  So yes, static DC voltages.  We are also reading static DC voltages from the analog controls of some of the other instruments in the rack (power supplies and loads).  At no time is more than +10 V present on any channel.  These being 8-channel devices, I find it strange that we are only seeing damage to one component.  As far as I know, the way we have it cabled, we are using all 8 channels separately and using the mux in the 6031E card to route the readings to the A/D converter.
    So what is the purpose of R202 in the circuit of the SCXI-1125?

  • Noisy thermocouple signal with SCXI-1125

    Hi all
    I am using the SCXI-1125 with 1328 terminal blocks, and I have set the filter to 4 Hz in the signal conditioner. I am using 256 T-type thermocouples, with 1 sample per channel at a very low sampling rate (1 Hz).
    I am receiving a lot of noise in the signal, approximately ±1 deg, but there are other channels that are very stable, with hardly 0.2 deg variation. I have tried connecting the negative of the noisy signal to ground, but with no results. Can anyone give some input on this?
    Thanx a lot
    Arun

    Arun,
    Are you sure you are setting these properties correctly? Before
    modifying any of these properties, you must specify the active
    channels. Please take a look at the attached example program. It sets
    the active channels to all channels in the task and then sets the
    auto-zero mode. A similar process should work for enabling the filter
    and setting the cutoff frequency.
    Hope this helps!
    Ryan Verret
    Product Marketing Engineer
    Signal Generators
    National Instruments
    Attachments:
    Cont Acq Thermocouple Samples-Int Clk.vi ‏98 KB

  • How do I achieve a time stamp in the first column of my spreadsheet in Excel, showing the time interval increased by the scan rate?

    From the AI Start VI to the Format and Append VI to Concatenate Strings: the second input is a tab, then every other one is a tab. The third and fifth inputs are attached to the Get Date/Time VI, then End of Line, then channel 0, tab, channel 1, tab, and so on through channel 7. The last input is End of Line. On the spreadsheet I receive the scan rate, then the date, then the time, all on the first row. The next row has the column labels, and data goes to the correct column. But there is no time stamp column. I would like it to be in column "A", even with the first row of data, e.g.:
    10:01:01 200 300 400
    10:01:03 200 300 400
    Thank you for your help, Joe Bob Crain

    I think the best way is to generate an array of time values to send to your spreadsheet BEFORE the channel 0 data.
    To generate the array you can use Ramp Pattern.vi, from 0 to the number of scans performed divided by 'Actual Scan Rate' (from AI Start).
    That way your spreadsheet will have the first line the same as now, then columns with time (secs), channel 0, channel 1, ... channel 7.
    If you need the time column as a date-and-time string, you can use Get Date/Time in Seconds.vi when you begin the acquisition, add the seconds obtained to the array calculated as above, use Format Date/Time String.vi to obtain the time stamps you need, then build the array of time stamps and send it to your spreadsheet.
    Hope this helps
    Roberto
    Proud to use LW/CVI from 3.1 on.
    My contributions to the Developer Zone Community
    If I have helped you, why not give me kudos?

  • How to calculate min/max scan rate of PCI-MIO-16XE-50

    My board: PCI-MIO-16XE-50, 20 MHz timebase.
    The min/max scan rate is 0.00596-20K scans/s and the channel rate is 1.53-20K channels/s.
    How are these figures calculated?

    Hi May
    Check out the KnowledgeBase entry located at http://digital.ni.com/public.nsf/websearch/EA4E306828AF05B586256B8F00661A8B?OpenDocument . I think it answers your question.
    Serges Lemo
    Applications Engineer
    National Instruments

  • Connecting iBook to television - scan rates

    This may well be a stupid question, but it's late at night and my brain is already asleep!
    I want to connect my iBook G4 to a television to display some multimedia presentations, so I've bought the Apple Video Adapter. I also intend to get an s-video cable to connect the adapter to the television.
    When checking around on the Apple Support pages, I found some information about supported display resolutions for PAL, and all were quoted as 50Hz. My television has a 100Hz scan rate, so what effect (if any) will this have on the picture my iBook can display?
    Thanks.
    MDP

    Okay, this is not an issue.
    MDP

  • Need to be Able to use a TB-1328 with an SCXI-1125

    I need some LabVIEW code to show me how to read temperature from an SCXI-1125 with a TB-1328 terminal block on it. I have K-type thermocouple wire, and I have to stick with LabVIEW 7.0 and NI-DAQ 6.9.3f5; this is running on an RT machine using a PXI-1010 split PXI/SCXI chassis, with a PXI-6040E running the SCXI bus. If you could, please also tell me how to configure this for remote DAQ. Thank you.

    Hello,
    If you have NI-DAQ 6.9.3 support installed for LabVIEW 7.0, then you should have an example which will show you how to acquire a thermocouple reading from a SCXI-1125. The example which I am referring to is called "SCXI-1125 Thermocouple.vi". Regardless of what terminal block you are using with the SCXI-1125, you need to make sure that you configure it accordingly in Measurement and Automation Explorer.
    For assistance with Remote DAQ configurations, please refer to the following KnowledgeBase entry: What Do I Need to Know to Set Up My Remote SCXI System?
    Regards,
    Jared A

  • Calculating motion rate for given line scan rate

    Dear friends,
    I need to understand how to calculate the motion rate for a given line scan rate.
    I have a line-scan camera connected to a servo motor.
    I know that the line-scan camera can run at a 7.2 kHz line rate (7200 lines in 1 second).
    Each pixel needs to be 0.1 mm (in the direction of motion); 10 lines of scanning is 1 mm.
    I need to scan an image of an object whose length is 400 mm (0.1 mm × 4000 lines).
    First, the conveyor stops; the object I need to inspect with the line-scan camera is now 5 cm from the camera.
    I want the servo motor to move the conveyor 45 cm and stop
    (the conveyor accelerates, moves at constant speed, decelerates, stops; the line-scan camera is connected to a 10 kHz-per-revolution encoder).
    Now I'm getting lost with the calculation...
    If 1 pixel = 0.1 mm, do I need the conveyor belt to move 0.1 mm for each pulse of the encoder?
    I want the servo motor to rotate a chain (like the attached image),
    but I don't know how to calculate the gear diameter, the chain pitch, etc.,
    and especially how these parameters influence the line scan rate, and what the role of the encoder is here.
    Thanks for any support.
    Moti 
    Attachments:
    Chain_gear.jpg ‏17 KB

    It sounds like you are mixing two methods of using a line scan camera.
    The first method is to run the camera at a fixed speed.  If the speed of your belt is constant, this can work okay.  To calculate the desired speed, let B = belt speed (mm/sec), L = line rate (lines/sec), R = mm/line (0.1 mm/line in your case).
    B = L × R.  This equation can be rearranged to calculate any of the quantities.
    The second method is to use an encoder to trigger each line captured by the camera.  In this case, the speed of the belt is not important as long as you don't go too fast for the camera.  In this case, you need to know how many mm the belt will travel for each pulse.  To calculate this, you would probably measure how far the belt travels during one revolution of the encoder.  You should also know the number of encoder pulses per revolution.  Dividing distance by pulses gives you distance per pulse.  Since you want 0.1 mm/pulse, you would need to adjust either the number of pulses in the encoder or the diameter of the belt roller.  If E=encoder pulses and D=belt distance, R = D / E.
    Bruce
    Bruce Ammons
    Ammons Engineering

  • Scan Rate - Sample Rate confusion with the E-Series

    I have been reading a lot of posts on the message board concerning the scan rate and the sample rate, but I still don't get it. I am using an E-Series card and am aware that it does not sample each channel simultaneously.
    I want a time stamp for each value I get from each channel. If I have channels 1, 2, 3, 4 with rate divisors
    1, 2, 4, 8,
    the sample rate set to X,
    and the scan rate set to X/(number of channels + 1) (a ghost channel in order to avoid error -10092), i.e. X/5,
    what is the sample rate of each channel?

    VKral,
    Here are a few KnowledgeBase entries that I hope will clear up the confusion:
    Data Acquisition Sampling Terminology
    http://digital.ni.com/public.nsf/websearch/DBE7AC32661BCF9B86256AC000682154?OpenDocument
    How Is the Actual Scan Rate Determined When I Specify the Scan Rate for My Data Acquisition?
    http://digital.ni.com/public.nsf/websearch/5782F1B396474BAF86256A1D00572D6E?OpenDocument
    Hope this helps. Have a great day!

  • Keithley 2701 LV driver: Setting the scan rate?

    I downloaded the Keithley 2701 LV driver and the example programs work great. There is just one question that I have. I cannot find anywhere in the examples or in the LV drivers how to change the scan rate or scan interval. Everything else I'm able to change, such as number of points, channels, voltage levels, etc., but I can't figure out how to change my scan rate to once every 0.5 seconds or 1 second.
    I'm not sure, maybe this is a question for Keithley but since it is a LV driver I decided to post here first.
    I'm using LV8.2
    Jeff

    I am using the Keithley 2700 with the 7700 cards.  There is a VI named "ke27xx Configure Aperture Timing.vi" in "ke27xx.llb".
    This VI sets the integration time for the A/D converter.  I was measuring 500 uV and needed to improve my accuracy, so I increased my integration time.
    Brian
    LV 8.2
