Agilent E4980A Measurement Speed and Data Acquisition Speed

I am using an E4980A LCR meter and I need very fast data acquisition, but I am limited by the device's own speed of 5.6 ms per measurement, which is nearly 178 Hz. I am using the USB interface with the driver software provided by NI; in that program I made some modifications, putting the reading part of the program inside a while loop.
To test my speed, I start the program, count 5 s, and then stop, saving the results to an Excel file. If I plot the results in a graph while the program is running, I get 60 Hz. If I don't, it is 80 Hz, which is still far below the potential maximum of 178 Hz. If my laptop battery is very low, the speed gets worse; with no battery problem, my Celeron laptop performs the same as an i7 laptop.
Here is my program; it is the same one provided by NI with some modifications. What can I do to get a higher data acquisition speed?
Maybe the USB speed is not enough, but I cannot believe that a bus that can save gigabytes of data in a few minutes cannot take 200 data points in 1 s. Why can't I reach 178 Hz now, and how can I get to this limit?
Regards.
Attachments:
Read Measurement.PNG ‏44 KB

Instrument communication via serial, USB, Ethernet, GPIB, etc. just tends to be slow. The instrument has to interpret the command, react to it, and then send data back, and that round trip takes time; a few ms per measurement is quite normal. One option you may have is to tell the instrument to take several measurements and then request all of them once it is done. I haven't looked into the E4980A yet to see whether it can do that.
What exactly are you trying to measure? There might be better ways to get "fast" readings.
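If the E4980A does support that kind of buffered operation (its programming manual describes :MEM data-buffer commands, but I have not verified them against the instrument), the idea in a remote-control script would look roughly like the sketch below. This is a hedged example in Python/PyVISA rather than LabVIEW: the VISA address, buffer size, and exact SCPI spellings are assumptions to check against your manual. The point is that only triggers cross the bus inside the loop; the results come back in one bulk transfer at the end.
    # Sketch: buffered acquisition from an E4980A. SCPI names and the VISA
    # address are unverified placeholders taken from memory of the manual.
    import pyvisa
    rm = pyvisa.ResourceManager()
    lcr = rm.open_resource("USB0::0x0957::0x0909::MY00000000::INSTR")  # placeholder address
    lcr.write("*RST; *CLS")
    lcr.write(":FUNC:IMP CPD")        # measurement function, e.g. Cp-D
    lcr.write(":APER SHOR")           # shortest aperture for maximum speed
    lcr.write(":TRIG:SOUR BUS")       # measure only when triggered over the bus
    lcr.write(":INIT:CONT ON")        # re-arm automatically after each trigger
    lcr.write(":MEM:DIM DBUF, 200")   # size the internal data buffer
    lcr.write(":MEM:FILL DBUF")       # route triggered results into the buffer
    for _ in range(200):
        lcr.write(":TRIG:IMM")        # results stay on the instrument for now
    data = lcr.query(":MEM:READ? DBUF")   # one bulk transfer instead of 200 reads
    lcr.write(":MEM:CLE DBUF")
    print(data.split(",")[:6])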
There are only two ways to tell somebody thanks: Kudos and Marked Solutions

Similar Messages

  • Does anyone know how to display (in LabVIEW) the memory use during execution of an image and data acquisition VI to predict when it is time to cease the acquisition to prevent the program crashing?

    Does anyone know how to display (in LabVIEW) the memory use during execution of an image and data acquisition VI to predict when it is time to cease the acquisition to prevent the program crashing?
    I am acquiring images and data into a buffer at the edge of the while loop, and I am finding that the crashes of the program are unpredictable but almost always due to memory saturation when the buffers get too big.
    I have attached the VI.
    Thanks for the help
    Attachments:
    new_control_and_acquisition_program.vi ‏946 KB

    Take a look at this document that discusses how to monitor IMAQ memory usage:
    http://digital.ni.com/public.nsf/websearch/8C6E405861C60DE786256DB400755957
    Hope this helps -
    Julie

  • While loop and data acquisition timing worries

    Hello everyone, 
    I apologize in advance if this is a silly question, but I could not find the answer around here. 
    I have made a VI to record continuously from 64 analog channels at a 5 kHz sampling rate. I then stream this data to a TDMS file. The data acquisition and write functions are in a while loop.
    In the same loop, I have a bunch of other loops, each running on its own wait timer to help limit the amount of memory it takes up. I am now worried that this may somehow affect my data acquisition timing.
    If I put a bunch of timed loops within another loop, does the outer loop run at the same pace as the slowest of the inner loops? And could that mess up my sampling rate?
    I have attached my VI, in case what I just wrote makes no sense at all. 
    Thanks for any tips...
    Attachments:
    Record_M8viaDAQv3.vi ‏237 KB

    Well, looking at your code, you will only write to your TDMS file one time. You have multiple infinite loops within the main/outer loop, which means the main loop will only run a single iteration, because it cannot complete an iteration until all the code within it completes; with at least two infinite loops inside it, it will never complete. Not to mention that the only way to stop your code is to hit the stop/abort button, which is not a good way to stop it. As someone once said, using abort to stop your code is like using a tree to stop your car: it will work, but it is not advised.
    As Ben mentioned, try to understand dataflow better. You have unnecessary sequence frames in your code where normal dataflow will already control the execution order.
    Mark Yedinak
    "Does anyone know where the love of God goes when the waves turn the minutes to hours?"
    Wreck of the Edmund Fitzgerald - Gordon Lightfoot

  • Synchronised video and data acquisition

    I am a Labview user who would like to capture video and data (from one video
    camera and 4 strain gauges) simultaneously, then be able to play the video
    whilst being able to process the data streams. Is there a simple way to do
    this? - Thanks!

    I use IMAQ for LabVIEW to do the image acquisition while using a translation system. It works. The motion controller is controlled by LabVIEW too (RS232).
    From your mail it doesn't seem you need synchronisation, maybe just simultaneous acquisition. In that case it's very easy.
    Saint
    "Kevin Powell" ha scritto nel messaggio
    news:[email protected]..
    >
    > I am a Labview user who would like to capture video and data (from one
    video
    > camera and 4 strain gauges) simultaneously, then be able to play the video
    > whilst being able to process the data streams. Is there a simple way to do
    > this? - Thanks!

  • Simultaneous analog output 1k sine wave and data acquisition

    Although many topics on simultaneous analog output and input can be found in the NI Developer Zone, I have not found a case similar to my project. I'm trying to generate a 1 kHz sine wave using the analog output of an NI DAQPad-6070E and acquire one input channel simultaneously. I found that most of the examples cannot work properly above 100 Hz. Any suggestions are welcome. Thank you.
    Jian

    I successfully solved the problem after I observed the scan backlog number increasing. Simply increase the AI/AO buffer sizes and you can generate the 1 kHz sine wave output and acquire the data properly.
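    For anyone doing this today, the same idea (continuous AO and AI tasks with generous buffers) looks roughly like the following with the nidaqmx Python package. This is only a sketch: the device name "Dev1", the channels, and the rates are placeholders, it has not been run on hardware, and the original DAQPad-6070E would have used the Traditional NI-DAQ driver rather than this API.
        # Sketch: continuous 1 kHz sine output plus continuous analog input with
        # roomy buffers (modern nidaqmx API; device/channel names are placeholders).
        import numpy as np
        import nidaqmx
        from nidaqmx.constants import AcquisitionType, RegenerationMode
        rate = 100_000                                    # S/s for both tasks
        wave = np.sin(2 * np.pi * 1_000 * np.arange(rate // 10) / rate)  # 100 periods of 1 kHz
        with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
            ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
            ao.timing.cfg_samp_clk_timing(rate, sample_mode=AcquisitionType.CONTINUOUS,
                                          samps_per_chan=len(wave))
            ao.out_stream.regen_mode = RegenerationMode.ALLOW_REGENERATION
            ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
            ai.timing.cfg_samp_clk_timing(rate, sample_mode=AcquisitionType.CONTINUOUS,
                                          samps_per_chan=rate)            # ~1 s input buffer
            ao.write(wave, auto_start=False)
            ao.start()
            ai.start()
            for _ in range(50):                           # read 100 ms blocks; keep up with the buffer
                block = ai.read(number_of_samples_per_channel=rate // 10)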

  • Timing control signal and data acquisition

    I have a series of relays which, when activated, close different elements of a circuit, and I also have a Keithley sourcemeter supplying a constant source current and reading the voltage back (when no relays are closed the circuit is open and no source current flows).
    I have a microcontroller-based hardware circuit (programmed for 9600 baud) which closes the different relays sequentially based on the input it receives from the LabVIEW program over an RS232 cable (9600 baud, 8 data bits, no flow control).
    So I made a sequence structure that sends the signal to the microcontroller, waits for the signal transmission and relay activation, and then reads the voltage value from the Keithley sourcemeter back into LabVIEW over a GPIB cable.
    These three events are repeated sequentially in a while loop.
    A schematic picture is attached.
    But the problem I am facing is that the maximum reading speed is limited to 200 ms per reading, even though the relay activation time is as low as 30 ms.
    I tried many different delay times (140 down to 60 ms), but each reading still takes roughly 200 ms irrespective of the delay time.
    What could be the bottleneck? Is it the RS232 setting at 9600 baud? Could changing to a higher baud rate or adding flow control solve this problem?
    Attachments:
    daq_printscreen.GIF ‏45 KB

    siva0182 wrote:
    Find attached a picture of the VI where I tried to put in timers, one in each frame, and at the end I added one more frame just to measure the elapsed time.
    The last frame and the frame with the sourcemeter measurement show a difference of about 140 ms.
    Is there any way to improve the speed of this portion?
    Probably not. It sounds like the Keithley VI is taking the 140 ms. You can try opening it up to see if there is anything in there causing an unnecessary slowdown, but trying to tighten it up too much may cause occasional errors (such as timeouts) when getting a response back.
    "But surprisingly, I tried displaying the actual timer value at the start and at the start of the sourcemeter read function, but both indicators are almost equal, with a difference of only 10 ms?" I'm not sure which of the several timers you are referring to here. What is the value going into the wait function between the VISA write and the Keithley VI?
    When you do benchmarking, the timer functions should be in a frame all by themselves. Having them mixed in with another function or VI adds uncertainty about when the time was actually taken. For instance, you don't really know whether the Keithley VI was started before the time was taken or after. (The time was probably taken first, but without a dataflow or sequence-structure dependency, you can't be sure.)
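    On the benchmarking point, the same discipline looks like this in any text-based language: time each step in isolation so there is no ambiguity about what each interval covers. A rough Python/PyVISA illustration only; the resource strings and commands are placeholders, not the poster's actual setup.
        # Time the serial write, the settling wait, and the GPIB read separately.
        # Resource strings and instrument commands are illustrative placeholders.
        import time
        import pyvisa
        rm = pyvisa.ResourceManager()
        mcu = rm.open_resource("ASRL1::INSTR")         # RS232 link to the microcontroller
        dmm = rm.open_resource("GPIB0::24::INSTR")     # Keithley sourcemeter
        t0 = time.perf_counter()
        mcu.write("RELAY 3")                           # 1) select the relay (placeholder command)
        t1 = time.perf_counter()
        time.sleep(0.06)                               # 2) wait for relay settling
        t2 = time.perf_counter()
        reading = dmm.query(":READ?")                  # 3) fetch the voltage
        t3 = time.perf_counter()
        print(f"write {t1 - t0:.3f} s, wait {t2 - t1:.3f} s, read {t3 - t2:.3f} s")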

  • Stepper Motor with Linear Stage (Position Control and Data Acquisition)

    Hey All,
    So.. I've attached a stepper motor to a linear stage and so far it's working pretty well.  Using a stepper motor driver from Pololu, I've simplified the control of the motor by just using output pulses from the counter output of a PXI-6143.  I've tested the motor using the Pulse Train examples in LabVIEW and all is working well.  
    My goal now is to allow the user to collect data from a pressure sensor, attached to the linear stage, every X number of steps. From what I can tell so far, and please correct me if I'm wrong, the motor step movement is synchronised with the pulse train input. That is to say, if I give the motor a 200-step pulse train, the pulse train output is completed at the exact time the motor has moved 200 steps. From this, I've created a VI that moves the motor X steps, acquires a data point, and then repeats this process for the required number of data points. The problem with this is that the motor movement is not continuous; it stops for a split second to take the data point.
    How can I have LabVIEW output a pulse train of, say, 1000 steps and record a data point every 50 steps?
    Two ideas that came to mind were:
    1. Use the counter input port on the card to count the pulses being sent to the motor.  
    2. Use an encoder connected to the motor shaft.
    I wanted to stay away from these ideas, though, since they require resources from the DAQ card.
    Thanks,
    Ryan

    Hi Ryan,
    Just to cover all the bases, what version of LabVIEW are you using and can you attach your VI? Initially your ideas sound like they should work, do you expect to be nearly maxing out the DAQ?
    Thank you,
    Deborah Y.
    LabVIEW Real-Time Product Marketing Manager
    Certified LabVIEW Architect
    National Instruments

  • How do you cyclicly trigger data acquisition after n pulses counted

    Hello all, please forgive my ignorance, because I am very new to LabVIEW and data acquisition. I am working on a system which is going to scan an object and produce an image. The gimbal that I am scanning the object with is an X-Y type of gimbal with stepper motors on each axis. The stepper motor controller outputs pulses in real time to indicate the real-time position of the gimbal in each axis. What I need to be able to do is count pulses from the stepper motor controller and then output a trigger pulse to trigger the data acquisition in buffered mode when N pulses have passed, and then generate another pulse to stop the acquisition after another N pulses have passed. The controller puts out 10,000 pulses per degree of travel. The velocity I am traveling at is 20 degrees per second, so timing here is really important: I need to be able to use the speed of the DAQ card, not the speed of the computer iterating through a loop.
    I have tried using the count-down feature in the NI-DAQmx library, but it does not appear to be useful to me. I set it up and it counts down, but once it hits zero it continues to count down; my expectation was that it would either restart the down count or stop, and I was expecting some sort of trigger event to take place once the count reached zero, but I did not observe any such event. Once again, my knowledge and background are really limited, so I could be missing something really fundamental here. I have tried using some of the legacy functions, which would enable me to do exactly what I want, but they do not seem to work with my DAQ card. I have an NI PCI-6122, and if anyone knows how to get this type of card to talk to some of the non-MX functions, I would be more than happy to hear how. It seems to me, though, that I am limited to the MX functions, which I cannot really translate into what I have learned I can do with the legacy functions. Thank you all once again for taking the time to read this, and I will appreciate any and all responses that can be helpful.
    ~ Randy Brown

    I have run a few more tests and obtained some data at the request of a telephone support engineer. I have some scope screenshots that might shed some light on what is going on; I will give a brief description of what I discovered before showing the resulting data. I found that using the suggested numbers of up ticks and down ticks does not yield the right timing for the clock pulses I need to trigger my data acquisition. When I use 55 low ticks and 2 high ticks as my settings, I end up getting a pulse every 32 pulses read on the PFI line. I get the same result when I interchange the numbers: with 2 low ticks and 55 high ticks I still get one clock pulse per 32 pulses on the PFI line. I started playing with the numbers and found that I could generate a pulse every 57 input pulses: I set the high ticks to 2 and the low ticks to 71, and once I did that it generated a pulse every 57 pulses in. The results are not ideal, though; a number of things happen within the first second of operation. In one mode of operation the clock output pulse latches after a few pulses are generated. In another mode, it generates N pulses and then just stops, even though the program is still running. The results are not reproducible when it comes to long-term operation of the clock pulse generation, but the bottom line is that no matter what happens, the end result after 1 second is not what is expected. Below are screenshots of my program and scope shots for the respective modes of operation.
    Front End interface
    Block Diagram
    55 High ticks and 2 low ticks results
    55 low ticks and 2 high ticks results
    77 Low ticks and 2 high ticks results
    Undesired Latch after 1 second of operation
    N number of pulses generated and stopped while program was still running
    It appears that the long-term operation (and by long term I mean after a second) is intermittent: it either latches high or low after a random number of pulses are generated on the clock output. I am not sure why this is happening. The one setup I came up with that generates a pulse every 57 pulses is not going to work for my application; I think I would have to reduce the 71 to 69 to compensate for the two pulses that occur while the output pulse of the clock is high. To be honest, I have no idea what is going on, and I am starting to wonder about my DAQ card; since it is not reproducing the same results, I am starting to think maybe something is wrong with it. Another possibility is that it might be the BNC-2110 that I am using. I will try another one tomorrow and see if this problem persists. I am leaving now, so I won't be able to try that yet, but I wanted to pass this information and data along so that maybe you will notice something and be able to lead me in the right direction. Thank you again for all of your help.
    ~ Randy Brown
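    For reference, the tick-based divider being discussed maps onto the modern nidaqmx Python API roughly as below: the stepper pulses become the timebase of a counter-output channel, so one output pulse is produced every (low_ticks + high_ticks) input pulses. This is an untested sketch; the device name, PFI terminal, and tick counts are assumptions, and it has not been tried on a PCI-6122.
        # Sketch: divide the stepper pulse train on PFI0 down to one trigger pulse
        # every N input pulses using a ticks-based counter output. Device and
        # terminal names are assumptions.
        import nidaqmx
        from nidaqmx.constants import AcquisitionType
        N = 50                                         # one output pulse per N stepper pulses
        with nidaqmx.Task() as co:
            co.co_channels.add_co_pulse_chan_ticks(
                "Dev1/ctr0",
                source_terminal="/Dev1/PFI0",          # stepper pulses as the tick source
                low_ticks=N - 2,
                high_ticks=2)                          # output stays high for 2 input ticks
            co.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)
            co.start()
            input("Divided pulse train running on ctr0; press Enter to stop.")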

  • Data Acquisition VI

    Hello,
    I am a graduate student and new to LabVIEW. I have spent a couple of weeks learning about LabVIEW and its data acquisition techniques. So far I have been able to code a program, attached as a block diagram, which basically runs only in highlight-execution mode. I need to run the program throughout the test period and record all of the data. I have NI SCXI-1001 hardware and LabVIEW 2013. I need to read from multiple channels at a time, though the code seems to be doing that. The output, also attached, is not clear either; I want the output tabulated with headings like Load, Stroke, Potentiometer 1, and Potentiometer 2. Please help me solve this problem. Your suggestions are highly appreciated.
    Solved!
    Go to Solution.
    Attachments:
    block diagram.PNG ‏13 KB
    front panel display.PNG ‏42 KB
    out put2.txt ‏1 KB

    Dear Dennis,
    I have incorporated your comments and also revised it on my side; now it is working a little. I still want to improve its functionality in the areas of graph display and timing flexibility, so please advise. If you have any informative links or documents regarding data acquisition, please pass them along.
    I have attached screenshot of block diagram for your kind information.
    Attachments:
    block diagram-27 march.PNG ‏20 KB

  • Does anyone know how to display (in LabVIEW) the memory use during execution of an image and data acquisition VI to predict when it is time to cease the acquisition to prevent the program crashing?

    Does anyone know how to display (in LabVIEW) the memory use during execution of an image and data acquisition VI to predict when it is time to cease the acquisition to prevent the program crashing?
    I am acquiring images and data into a buffer at the edge of the while loop, and I am finding that the crashes of the program are unpredictable but almost always due to memory saturation when the buffers get too big.
    I have attached the VI.
    Thanks for the help
    Attachments:
    new_control_and_acquisition_program.vi ‏946 KB

    I got these VIs off the NI site a while ago - see if they help.
    Attachments:
    Memory_Monitor.zip ‏132 KB

  • High-speed data acquisition and DMA

    Hi, I have a custom home-made 4-channel, 16-bit data acquisition board that has 2 GB of onboard memory and can sample at 250 MS/s. The DAQ returns data in 1 MB blocks. The LabVIEW wrapper allows us to operate this board in two modes (continuous and acquire-once). In acquire-once mode, I can use a circular buffer architecture and stream data to disk (difficult to view the data in real time). The continuous mode works by arming the DAQ board, acquiring the data, re-arming the DAQ for the next 1 MB of data, and returning data for real-time display. However, during this arm/re-arm time crucial information is lost. The solution would be to access the DMA data as fast as we can and process it. Can you please show me an example of how to achieve this in simulation mode?
    Thanks
    RY

    You say the DAQ takes 1 second and the processing takes about 4 seconds. How do you know this without having any VIs? And why do you have to read your DAQ unit every second? Also, are you sure the processing part of your program is optimal? It is quite easy for a beginner to write LabVIEW code that is very slow when dealing with arrays.
    I am quite sure you have some LabVIEW program, but for some reason you do not like the thought of posting your code. Just post it and let us take a look; it is hard to help without knowing anything.
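    Since the original post asks for a simulation-mode example, here is a generic producer/consumer sketch in Python: one thread keeps pulling 1 MB blocks into a queue while the main thread processes them, so nothing is dropped while a block is being handled. It is an illustration of the pattern only, not the board's actual wrapper API.
        # Producer/consumer simulation: an acquisition thread fills a queue with
        # 1 MB blocks while the main thread processes them. Not the real driver API.
        import queue
        import threading
        import numpy as np
        BLOCK = 1_000_000 // 2              # 1 MB of 16-bit samples
        blocks = queue.Queue(maxsize=64)    # ~64 MB of headroom before the producer blocks
        def acquire(n_blocks):
            for _ in range(n_blocks):
                data = np.random.randint(-32768, 32767, BLOCK, dtype=np.int16)  # simulated DMA block
                blocks.put(data)            # back-pressure if the consumer falls behind
            blocks.put(None)                # sentinel: acquisition finished
        threading.Thread(target=acquire, args=(100,), daemon=True).start()
        while True:
            data = blocks.get()
            if data is None:
                break
            peak = int(np.abs(data).max())  # stand-in for the real processing/display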
    Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
    (Sorry no Labview "brag list" so far)

  • How to improve speed of data acquisition? Help needed urgently.

    I want to convert analog signals to digital signals and simultaneously perform a search on the acquired data, and this whole process has to run continuously for a few hours.
    So I tried writing two programs in MATLAB: one acquires the analog data, converts it to digital, and continuously saves it to the hard disk in small chunks (file1, file2, file3, ...). The other program continuously performs the search operation on those chunks. I run both programs at the same time by opening two MATLAB windows.
    But the problem I am facing is that the data acquisition is slow. As a result I get an error message in the second program saying:
    "??? Error using ==> load
    Unable to read file file4.mat: No such file or directory."
    I am unable to synchronize the two programs. I cannot use timers in the search program because I cannot add any delays.
    I am using an NI PCI-6036E, a 16-bit, 200 kS/s A/D board.
    Should I switch to another series, such as an M-series board with sampling rates on the order of MS/s?
    Can anyone please tell me how to improve the speed of data acquisition?
    Thanks.

    Hi Gayathri,
    My email is [email protected]; if you're from India, mail me back.
    Regards,
    labview boy

  • How to increase measuring speed of Agilent 34401A

    The process of my program (written in LabVIEW) is:
    1) A control code is automatically generated by the program and sent to the multiplexing circuit (the chips I use are ADG706) through a digital port (NI USB-6501), and the corresponding channel is selected (for example, code hex 0000 selects the 1st channel, 0001 selects the 2nd channel, and so on).
    2) An Agilent 34401A multimeter measures the current value of this channel and sends the value back to the computer for storage. After this, the program goes back to step 1) for the next channel, and the program keeps executing like this.
    The functions mentioned above are all implemented and the program works well, but the problem is that it is slow. It takes a little more than 1 second to measure one channel, and I have 2048 channels in total (32 groups with 64 channels in each group; I used two nested for loops), so it takes about 32 minutes to measure all 2048 channels. That is too slow; I'd like to reduce it to 10 minutes.
    I tried reducing the resolution to 4.5 digits and turning off the multimeter's autozero function, but nothing changed; the speed stayed the same. Every time the multimeter measures a channel it makes a "click" sound; does that mean that on each trigger the multimeter only takes one measurement? Is there any way I can increase the measuring speed? Can I send the multimeter 64 control codes at a time and then let it take 64 measurements on one trigger? Is that right? If so, how can I realize it? If I'm wrong about it, can you give me some solutions?
    Thank you very much! 

    I don't have the manuals with me, but I do have a couple of these in my lab.   There is an internal buffer where you can store (I think) 512 readings.   You can use a digital output to trigger the DMM, and only transfer after every 512 readings (4 times) or perhaps one read per group as you mentioned.  I suggest one read per trigger (you can do many more) since it is easier to synchronize with your channel changes.  I believe the commands are INIT and FETCH?, check your manual.  The two times I know the instrument "clicks" are changing functions and certain range changes.  I suspect in your attempts to speed things up you are sending some config. commands each time.  Configure it once initially and make certain you are only sending measurement related commands during the scan, any config. changes (even if you resend the current value) will likely cost you time. 
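    In script form, the configure-once / trigger-per-channel / fetch-in-bulk approach described above looks roughly like this with PyVISA. The 34401A SCPI commands are written from memory of the manual, and the GPIB address, the group size of 64, and the mux-control helper are placeholders, so treat it as a sketch to verify rather than working code.
        # Sketch: configure the 34401A once, take one reading per bus trigger into
        # internal memory, then fetch the whole group in one transfer.
        # Commands, the address, and select_mux_channel() are unverified placeholders.
        import pyvisa
        rm = pyvisa.ResourceManager()
        dmm = rm.open_resource("GPIB0::22::INSTR")
        def select_mux_channel(code):
            pass                            # placeholder: write the ADG706 code via the USB-6501 here
        dmm.write("*RST; *CLS")
        dmm.write("CONF:CURR:DC 1,MAX")     # configure once: DC current, fixed range, coarsest resolution
        dmm.write("ZERO:AUTO OFF")          # no autozero between readings
        dmm.write("TRIG:SOUR BUS")          # one reading per *TRG
        dmm.write("SAMP:COUN 1")
        dmm.write("TRIG:COUN 64")           # accept 64 triggers per INIT (one per channel in a group)
        dmm.write("INIT")                   # arm; readings accumulate in the DMM's memory
        for channel in range(64):
            select_mux_channel(channel)     # switch the multiplexer to this channel
            dmm.write("*TRG")               # trigger this channel's reading, no data transfer yet
        readings = dmm.query("FETC?")       # pull all 64 readings in one transfer
        print(readings.split(",")[:4])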

  • Measurement Probe Data Acquisition and Calculation

    Hi there,
    Can we do data acquisition from a measurement probe? I mean, can we put the data into the grapher so we can "convert" it to Excel or something? I need to show V(dc), V(p-p), I(dc), I(p-p), etc. in the grapher view.
    Second, according to my reference, V(dc) = V(p-p)/π, or 0.318 × V(p-p), where π ≈ 3.14.
    The same applies to current: I(dc) = I(p-p)/π, or 0.318 × I(p-p).
    In its calculation, MultiSim seems to use 0.333... or 0.32, and the result comes out quite different. So, can anyone tell me where I can find the formula MultiSim uses to calculate the circuit?
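    On the 0.318 factor: for a full-wave rectified sine the average value is (2/π) × V(peak) = V(p-p)/π ≈ 0.318 × V(p-p), so 0.318 is simply 1/π. A quick numerical check (illustrative only; this is not MultiSim's internal formula):
        # Numerical check that the full-wave rectified average of a sine is
        # (2/pi) per volt of peak, i.e. (1/pi) ~= 0.318 per volt of peak-to-peak.
        import numpy as np
        t = np.linspace(0.0, 1.0, 100_000, endpoint=False)   # one period
        v = np.sin(2 * np.pi * t)                             # unit peak, so V(p-p) = 2
        avg = np.mean(np.abs(v))                              # full-wave rectified average
        print(avg)        # ~0.6366 = 2/pi  (per volt of peak)
        print(avg / 2)    # ~0.3183 = 1/pi  (per volt of peak-to-peak)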
    Ghost Recon Team Leader

    Hey, thanks dude.
    I found the example and now it works.
    Once again, thanks a lot.

  • Agilent 34461A Read? and Measurement Commands

    Hi All,
    Right now, I have some code written in LabVIEW. I am monitoring a current over time until the device breaks, and everything is working as I want it to.
    I have my code set up so that the measurement command is sent every x seconds (a settable variable) in a while loop. I was wondering if there is a way to have the meter measuring constantly while I only read the current every x seconds. The reason I am asking is that every time this command is sent, you can hear the Agilent's relays clicking as it triggers, and if I am sending this command every second, that might be hard on the equipment.
    Is there a way for it to always be measuring, like when it is not in remote mode? Can I use the READ? command to read the current?
    I don't expect everyone to be familiar with this meter, of course. Since I am new to LabVIEW, I was wondering if this is possible (maybe even for other pieces of equipment).
    Thanks, Natalie

    Can you share your code? If you are doing the same measurement over and over again, you should only have to set up the measurement once (before your loop, as part of the initialization) and then just do the read when you need the data. My guess is that you are setting up the measurement each time you want data.
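    As an illustration of that configure-once pattern (a hedged PyVISA sketch; the 34461A commands are from memory of its SCPI manual and the VISA address is a placeholder), the relay clicks usually come from re-sending the configuration, not from reading:
        # Configure the 34461A once, then only read inside the loop.
        # Commands and the VISA address are unverified placeholders.
        import time
        import pyvisa
        rm = pyvisa.ResourceManager()
        dmm = rm.open_resource("USB0::0x2A8D::0x1401::MY00000000::INSTR")
        dmm.write("*RST; *CLS")
        dmm.write("CONF:CURR:DC 1")              # function and fixed range, set ONCE (this may click)
        dmm.write("CURR:DC:NPLC 1")              # integration time; trade speed against noise
        while True:
            current = float(dmm.query("READ?"))  # trigger and return one reading, no reconfiguration
            print(current)
            time.sleep(5)                        # the "every x seconds" interval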

Maybe you are looking for

  • Do I need to use Bindings?

    Hi All I wonder if you can help I have a class (driver.m) that takes an input from a USB device (the callback function is embedded into the class and passes data back to the instance of the class). There are a number of instance variables that change

  • Creating a new schema or adding some more tables in existing schema ??

    Hi All, We have a new requirement from the client asking us to include a few new functionalities(pages--- all together a new application has to be embedded into our application) in our application. By adding these new functionalities, the number of h

  • How do I see what (cabled) link speed I have between base stations?

    Hi I do own one new Airport Extreme base station and one older Airport Extreme (flat one). Both are capable of 1 Gbit-speed via ethernet cable. I have the new base station as "master" and the old as extender (via cable). How can I ensure that the spe

  • Memory leak after upgrading to ColdFusion 10

    We recently upgraded from CF8 to CF10 and we're running into some issues.  We started off getting a lot of OutOfMemory errors with the default heap settings. Chaning them to 768MB/1280MB which has helped, but we're still running into occasional OOM i

  • Flash player update scam

    Windows 7 IE 10 FlashPlayer version 11.7.700.224 McAfee Malwarebytes Anti-Malware I get the following pop-up screens when browsing internet. It seems random, not exclusive to any one website I visit. I can't isolate it to a specific location. McAfee