Labview timing problem

I'm kind of stuck with a bit of a problem in LabVIEW 7 (Base edition). I've got a PCI-6036E that can sample at 200 kS/s and has a 24-bit onboard timer.
I've got to set up an experiment that monitors a current indirectly through a potentiostat (1 A = 1 V scale). The program needs to integrate the current/voltage with respect to time until it reaches a specific value, and then generate an analogue output signal (this will repeat over and over).
Surely it should be quite straightforward to read a voltage on one of the inputs, sum this value in a loop until the limit is reached, and then output a voltage.
The problem is that I'm not quite sure how to approach this, considering that I need the loop interval to be ~1-10 microseconds.
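Roughly, the logic I'm after is something like this sketch (Python standing in for LabVIEW, with read_voltage/write_voltage as simulated stand-ins for the actual DAQ calls):

```python
import random
import time

THRESHOLD = 0.5        # integral limit in V*s (1 A = 1 V on the potentiostat scale)
OUTPUT_LEVEL = 2.0     # analogue level to emit once the limit is reached

def read_voltage():
    # Stand-in for the real analogue-input read; simulated here.
    return random.uniform(0.0, 1.0)

def write_voltage(level):
    # Stand-in for the real analogue-output write.
    print(f"output {level} V")

while True:
    integral = 0.0
    last = time.perf_counter()
    while integral < THRESHOLD:
        now = time.perf_counter()
        integral += read_voltage() * (now - last)   # rectangle-rule integration
        last = now
    write_voltage(OUTPUT_LEVEL)   # fire the output, then start over
```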
Many thanks,
Paul

The only way you're going to get microsecond loop speeds is to move up to the new FPGA board, currently only available for PXI.
Under Windows, or any GUI OS for that matter, loop speeds and determinism get shaky at double-digit millisecond rates, and anything less than 10 ms is really not reliable. LabVIEW Real-Time can get much better loop timing, but you're still limited to millisecond rates, as the CPU can't process any faster than that.
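You can see this for yourself with a quick software-timed loop test. A rough sketch in Python (any text language on a desktop OS will show the same jitter):

```python
import time

# Ask for a 1 ms software-timed loop and measure what we actually get.
TARGET = 0.001
periods = []
last = time.perf_counter()
for _ in range(1000):
    time.sleep(TARGET)
    now = time.perf_counter()
    periods.append(now - last)
    last = now

print(f"min {min(periods)*1e3:.3f} ms, "
      f"max {max(periods)*1e3:.3f} ms, "
      f"mean {sum(periods)/len(periods)*1e3:.3f} ms")
# On a stock desktop OS the max (and often the mean) lands well above the
# 1 ms target, which is exactly the non-determinism described above.
```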
Anything running on the FPGA is running in hardware, so extremely fast and deterministic loop times are possible. I've run them in the nanosecond range just doing some testing. Data from applications running on the FPGA can be accessed from LabVIEW Real-Time, LabVIEW, or both, so you can control what's going on.
Your best bet would be to either call NI or your local NI rep and have them get you some more info.
Ed
Ed Dickens - Certified LabVIEW Architect - DISTek Integration, Inc. - NI Certified Alliance Partner
Using the Abort button to stop your VI is like using a tree to stop your car. It works, but there may be consequences.

Similar Messages

  • Can anyone help me with Labview/GPIB timing problems?

    I am trying to use a very simple VI (downloaded from the NI site) to control a Newport 1830C optical power meter.
I have been spending most of my time trying to use the VI to issue a command to read the currently displayed power. When I run the VI normally, I get no response from the meter, and the VI simply terminates without any error message. (The "remote" indicator does light on the meter, though, so it seems to be receiving the command.)
However, when I run the VI in "highlight execution" mode, everything seems to work just fine, and I receive a correct reading from the meter! Needless to say, this makes it really hard to debug the problem.
My guess is that I have some kind of timing problem, and the "highlight execution" feature fixes it by introducing delays. Is this correct? If so, how can I achieve the same thing by putting in delays by hand?
    thanks, Whitney White

    Whitney,
    I agree that this is very likely to be a timing issue. VIs run much slower in highlight execution mode, therefore your instrument may need more time to prepare the data you have requested. Try wiring a delay inline prior to the GPIB reads and/or writes, or try using Service Requests (SRQs) to give the instrument enough time to generate the data it needs to send back to the computer.
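If it helps to see the idea outside LabVIEW, here is a minimal sketch using Python and PyVISA (the GPIB address is hypothetical, and the query string is a placeholder; check the 1830C manual for the exact command):

```python
import time
import pyvisa

rm = pyvisa.ResourceManager()
meter = rm.open_resource("GPIB0::4::INSTR")  # hypothetical address for the 1830C

meter.write("D?")   # placeholder query; see the instrument manual for the real command
time.sleep(0.2)     # explicit delay so the instrument has time to prepare its reply
print(meter.read())
```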
Another possibility is that your instrument may not be IEEE 488 compliant. Some older instruments indicate that they are ready to receive data before they really are. Try changing the GPIB bus timing to the slowest setting (in the GPIB Configuration Utility). This only affects the time that the GPIB controller waits before valid data is on the bus and the DAV line is asserted (a step in the GPIB handshaking that occurs each time a byte of information is sent to the device), so this may not be effective.
    Lastly, you may wish to try the instrument driver for your instrument. There is a contributed driver at this location: LabVIEW Traditional Instrument Driver: Newport Optical Power Meter 1830C. A 'contributed driver' means that someone outside of National Instruments wrote the driver and then submitted it for posting. This driver also contains a good example of adding a time out.
    Regards,
    Heather S.
    Applications Engineer
    National Instruments

  • Timing problems with AU Ivory Piano + ESB and Logic 7.2

    I have problems with Ivory and Logic 7.2:
    The Ivory (AU version 1.50) plays well in tempo, but when I want to bounce (export track as audio file) the Ivory plays totally out of tempo.
    Even when I touch the record button on audio tracks or when I want to record my mix on 2 audiotracks, the Ivory plays out of tempo.
    Even freezing the Ivory track gives the same problem.
    Then I need to go back to my last saved version.
    The same actions with Emagic, Native Instruments and Spectrasonics AU plugins, and everything works fine.
I already threw away the Ivory prefs, with no positive results.
My Ivory sample files are on a separate 400 GB internal disk, and I'm recording my audio on an external Glyph disk.
I think I have had this problem since I upgraded from Logic 7.1.1 to 7.2, but I'm not sure. I didn't work long with Logic 7.1.1.
    I don't work in Protools, but I use the DAE and Direct TDM (no Core Audio)
It happens in the simplest setup: no other DAE or AU plug-ins, only one Ivory AU plug-in.
    This is my configuration:
    Ivory AU (Audio object-Direct TDM-Instr) output ESB 1-2 -> input ESB1 2 (Audio Object-DAE) -> Output 1-2 of my digidesign interfaces
    Ivory becomes totally unusable for me this way. Is someone having the same problems?
    Suggestions?
    Wim Claes
    Setup
    Logic 7.2
    Protools 7.1cs4
    192, 2x96i, 1x 96 interface
    G5 dual 2,7   Mac OS X (10.4.3)   4 GB RAM

I already tried a lot of things, but it is very clear:
    1. Instantiating Ivory as AU- Ivory plays in tempo
    2. Record enable an audio track in the arrange window - Ivory plays out of tempo
This happens with the default song of Logic; no other plug-ins or AU instruments are instantiated.
I cannot find a way to convert my Ivory MIDI to audio without the timing problems.
Bouncing, freezing, recording to audio, exporting as an audio file: all of those result in timing problems.
I tried all this with Emagic plug-ins (ES2...) and Native Instruments plug-ins, and then I don't have the tempo problem.

  • [Q] Refreshing/Timing problem 21:9-monitor

    Hi fellas!
I'm using a MacBook Pro 15" non-Retina (Windows, with a GT 650M) together with my AOC q2963pm 21:9 monitor through HDMI. What I have encountered is what I think is a timing problem. I cannot find settings that work perfectly.
Either the screen looks a bit grainy at the max resolution of 2560x1080 with the games working (this with Automatic timing in the NVIDIA settings),
or I have a crisp display, but the games end with a flickering, black screen after I quit or Alt+Tab out of a fullscreen game, which causes the NVIDIA driver to stop responding and restart (using CVT Reduced Blank).
I had to create a custom resolution in order to use the full 2560x1080 resolution instead of just 1920x1080. I tried to manually create something that should meet the requirements, but with no luck. Either the display goes black, or I get the same problems as listed above.
    Here are the specs of the screen:
    Resolution: 2560x1080 @ 60Hz
Scanning Frequency: H: 30-99 kHz / V: 50-76 Hz
    Pixel Frequency: 185.58 MHz
Does someone know which settings I should use? The only thing I can think of to fix this is buying a DisplayPort cable, as "no one" else is experiencing this issue. Will this easy solution fix it?

    Try the Thunderbolt connection.
    <http://store.apple.com/us/product/MD861ZM/A/apple-thunderbolt-cable-20-m>

  • Labview programing problem

Hi, I have a problem with my LabVIEW program, which works with a cRIO-9025 doing data acquisition of temperature.
Below is my program; please help me check it and see what errors I made.
Appreciate it, thanks
    Attachments:
    asas.png ‏435 KB

    Hi, as Apok has mentioned, if you click on the run arrow it will come up with a list of errors and some explanation.
On a quick look though:
you have an error on the TDMS File Open block - the operation constant is shown with a dotted line (and seems to be spelt incorrectly).
only one input of the Build Array in the timed loop is connected.
the TDMS file out isn't wired in the case structure (TRUE case) in your while loop.
not necessarily an error, but the shift register storing errors isn't wired up at the input to the loop, so it will only pass through an error on the last iteration.
It's also not an error, but it is bad practice to use flat sequence structures to control program flow - you're better off using dataflow to manage things if possible. You could leave the first flat sequence in place to initialise variables, wire the error cluster line to both loops, and remove the other two flat sequences.
    Regards
    Paul
    Regards
    Paul
    CLD running LabVIEW 2012 32 & 64 bit on Windows 7 64 bit OS.

  • Timing problem when doing hourly rate

    Note:
    There are two problems:
1) Usually each passed test takes more than 1 minute, thus incrementing the track number. But what if the 2nd test fails and completes in less than 1 minute? Then the track number cannot be incremented, leading to a wrong result in the hourly rate.
2) The hourly rate needs to be calculated after each hour. The problem is that if a test lapses more than one minute, that minute is skipped,
so the hourly rate cannot be calculated exactly on the hour. Using a more-than or less-than comparison won't work either, as the hourly rate would then be calculated twice or more within each hour.
See the red box in the block diagram.
    See my attached.
I am using LabVIEW 7.1.
Please advise; I will be grateful for your generous help.
    Attachments:
    timing.vi ‏108 KB

    Hi Englund,
    Thanks
Ya, I know. I am trying to test whether the function works for my main program and test machine. Sometimes, in the case of a failed test, the whole test will take less than 1 minute to complete. For a passed test, of course, the test machine will continue through more stages. But when the test fails in the first or an earlier stage, the track number cannot be incremented.
What's more, I want to calculate the hourly rate, so if it misses the 60-minute mark, the hourly rate won't be calculated. I can use an equals comparison, but the minute tracker cannot possibly detect every minute, because a test takes 1 minute, more or less, missing the mark (see the sketch after this message).
I have checked the function (Wait Until Next ms Multiple) as you suggested and found that it is not the function I am looking for.
If you think I missed something, maybe post a small example of how it works.
    Clement
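A minimal sketch of the latch-on-hour-boundary idea described above (Python pseudocode standing in for the VI; the test step is simulated, and a plain variable plays the role of the shift register):

```python
import random
import time

def run_one_test():
    # Stand-in for one pass/fail test cycle; real tests take about a minute.
    time.sleep(random.uniform(0.5, 1.5))   # simulated test duration
    return 1                               # 1 = passed

HOUR = 3600.0        # shrink this (e.g. to 10.0) to watch the logic work quickly
start = time.time()
last_hour_mark = 0   # plays the role of a shift register in the VI
passed = 0

while True:
    passed += run_one_test()
    hours_elapsed = int((time.time() - start) // HOUR)
    if hours_elapsed > last_hour_mark:   # a boundary was crossed, even if the exact mark was missed
        print(f"hourly rate after {hours_elapsed} h: {passed / hours_elapsed:.1f}")
        last_hour_mark = hours_elapsed   # latch so the same hour is never counted twice
```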

  • Enhance Timing Problem

    Anyone notice that when Enhance Timing is applied to an audio track, that track takes a couple of seconds to start playback? I'm sure it has to do with the time it takes for the real-time processing to kick in, but it's a problem when trying to export the song to disk and that track doesn't start right away...

    That didn't seem to work. I locked the bass track, but there's still a pause before that track starts when I play the song from the start.

  • Possible memory timing problem

    Hello, I hope you all can help me.  I have been wrestling with this problem for about two weeks and have not been able to fix it.
    I recently upgraded my system by adding another gig of memory, setting up a raid array, and adding a X-fi Extreme Music card.  Before the upgrade my system ran smoothly for about 10 months.  After the upgrade it randomly freezes - the system locks up and won't take any more input from the mouse or keyboard.  The only way to recover is to hold the power button down for a few seconds.  After that it boots normally and runs perfectly until the next freeze.  The freeze can happen at any time, in any application, while the system is idle, or even during bootup.  I thought for a long time it was the sound card, but now I suspect my memory timings are off.  The problem is, I can't figure out how to adjust them in the BIOS.
    My system is this:
CPU: AMD Athlon 64 3500+
Motherboard: K8N Neo2 - BIOS 1.B
Memory: 4 sticks of Corsair Value Select PC3200 DDR-400 - 512 MB each
Video: ATI Radeon X800 Pro
HD: 2x 74 GB Raptors in a RAID array (striping)
    The only PCI card is the soundcard: Creative X-Fi Extreme Music
    Power: Antec Neo Power 480
    OS: Windows XP, SP2
I think that's everything; if I am missing something, please let me know.  I have run Microsoft's memory test utility overnight, and it checked out.  The memory timing should be 3-3-3-8, but I can't figure out how to set that in the BIOS.  Also, is there a change I should be making for going from two sticks to four?  I don't overclock or anything.
    I have been trying to figure this out on my own, but I am coming up short.  Any help would be appreciated.
    Thanks in advance
    Claudius

Hi Claudius.  Welcome to the forum.  Here are a few tips that will make it easier for you to get help here.  Create a sig as described in the forum rules (hint: use the profile button in the sign-on screen), which will locate all the info about your system in a convenient place at the bottom of each post, like I have.  It's the same info about your system that you posted in your first post, but having it there in each post makes everyone's life easier.  Also indicate whether you have an AMD Venice 3500+ (which I think you have).
Timings terminology.  It gets confusing because memory specs are listed in different orders when people refer to them, in the BIOS, and on the POST screen.  The first number should be CAS latency (Tcl). Then RAS# to CAS# delay (Trcd). Then Row Precharge Time (Trp). Min RAS# Active Time (Tras) is the last number, although the POST screen puts it next to last.
Tcl (CAS) = 3
Trcd = 3
Trp = 3
Tras = 8
That's what a BIOS using Serial Presence Detect (SPD) should pick up under Auto for your RAM.  I doubt the BIOS detected and used Tras 3 like you thought, as it would not boot with Tras 3.
    Get CPU-Z online for a great tool that gives you a lot of memory info.  It shows a lot of info about your chip too.  I suspect timings were detected properly or else your machine would not have booted and run.
After you added more identical memory (Corsair Value Select), your machine apparently booted and ran with 2 GB of RAM, at least that is what appears from your post.  Microsoft's memory test said things are good.  (I don't know what that test is; if you really want to test the memory, you should use Memtest, available online, to make a bootable floppy and test for about 1 hour.)  However, it sounds like your memory is OK and you have some kind of sound card interrupt (IRQ) conflict.  Did you disable onboard sound in the BIOS, since you are using a sound card?  Try that if you can.  If you still get lockups with onboard sound disabled and the sound card working, try removing the sound card and seeing whether the system still locks up.
    Good luck.
Edit: In re-reading your post, it may be that the system detected Tras 7 instead of Tras 8.  You can manually set Tras 8 in the BIOS if you want to, but use CPU-Z first and check the CPU-Z Memory tab (to see what the system is using) and the CPU-Z SPD tab (to see what your RAM is telling the system to use).  I think you will find both are 3,3,3,8.  If not, you can always change Tras to 8 in the BIOS and leave everything else on Auto.  However, you could also change the DRAM voltage in the BIOS to 2.7 volts without doing any harm, and this could help if you really are having a memory (as opposed to IRQ conflict) problem.

  • Labview Graph Problem

    Hello,
I have a problem with a tool which takes Excel columns and plots graphs.
The problem is that when I create the PNG of the graph, the point (0;0) additionally appears, which should not be on the graph.
In the preview I can't see this point. I attached an example.
    Attachments:
    Daimler Star22_unknown_unknown_unknown_ACCX_Offset vs temp.png ‏13 KB

    Have you tried putting a probe onto the data going into the graph to check there aren't any extra data points (i.e. at 0,0)?
    Certified LabVIEW Architect, Certified TestStand Developer
    NI Days (and A&DF): 2010, 2011, 2013, 2014
    NI Week: 2012, 2014
    Knowledgeable in all things Giant Tetris and WebSockets

  • LabView installer problem : undetected CRIO in MAX

    Hello,
I'm working on a data acquisition project in LabVIEW 2010 SP1, composed of a LabVIEW host program (XP SP3 PC) and a cRIO acquisition module. They communicate through Shared Variables and Network Streaming.
I made an installer for my host project composed of: the 2010 SP1 run-time, MAX, the NI-RIO 3.6.0 driver, and Microsoft Silverlight.
I created this installer on my dev PC and installed it on a client PC (see the attached image). The whole program works on the client PC and the data exchange (Shared Variables and Network Streaming) is OK.
The problem is that the cRIO is not detected by MAX under Remote Devices. On the other hand, when I use the NI Network Browser, it works well (it detects the cRIO).
The other problem is that, even though I have installed the NI-RIO 3.6.0 driver, RIO Device Setup is not present on the client PC.
If I install all the standard drivers from the NI Device Drivers DVD, it installs everything, and then the cRIO is detected correctly and RIO Device Setup is also installed.
My conclusion is that I haven't included all the necessary packages in my installer.
What are the necessary components to install in order to have:
the cRIO fully detected
RIO Device Setup installed
Must I also install some DAQmx components?
    Thanks in advance for your help.
    Attachments:
    Ni-programs.jpg ‏87 KB

There is a difference between a network and a subnet. I have a similar problem where I can see the device but cannot configure it. You can tell if you are on the same subnet by retrieving the IP address and subnet mask of the client machine. For example:
client machine: IP address 192.168.20.37, subnet mask 255.255.255.0
cRIO: 192.168.22.15
They are not on the same subnet. If the subnet mask is 255.255.0.0, they are on the same subnet.
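The same check, sketched in Python with the standard ipaddress module, using the addresses from the example above:

```python
from ipaddress import ip_interface

client = ip_interface("192.168.20.37/255.255.255.0")
crio = ip_interface("192.168.22.15/255.255.255.0")

print(client.network)             # 192.168.20.0/24
print(crio.ip in client.network)  # False -> not on the same subnet

# With a 255.255.0.0 mask both addresses fall inside 192.168.0.0/16:
wide = ip_interface("192.168.20.37/255.255.0.0")
print(ip_interface("192.168.22.15/255.255.0.0").ip in wide.network)  # True
```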

  • Labview Synthesiser: Problem outputting sound via MacBook soundcard

    Hi all!
First time posting here; hopefully I'm in the right section of the forum. Anyway, I'm a part-time student, currently studying a module on LabVIEW at university. As a mini project I decided to build a synth using LabVIEW. I've mostly built it, but I'm having a bit of trouble outputting sound to the soundcard. I'm also having a bit of trouble getting the waveforms to play for a longer time period.
I was sort of copying the setup of one of the example VIs (Generate Sound VI, I think) and another VI I found online, but I can't seem to get mine to work using my synth design. I have two problems: one is that the waveform only plays for a very short time, but the main problem is that I'm getting an error (error 4803) saying the soundcard cannot accommodate the specified configuration. As far as I can see, my setup is more or less the same as the Generate Sound VI (which works fine on my MacBook). Obviously I'm missing something, so I decided to come on here and ask for help.
I'm guessing the datatype connected to my Sound Output Configure VI could be causing a problem, since it has a red dot on the input terminal. Any suggestions on how I should fix this?
I've attached my VI. Any help would be appreciated!
Cheers!
Edit: I've already fixed the error 4803 problem. I had to change the input to the Sound Output Configure subVI. Now I just have to figure out how to get the sound to play for longer. Any ideas, anyone?
    Solved!
    Go to Solution.
    Attachments:
    LabVIEWSynth.vi ‏94 KB

    OK. You have several problems.
The cluster order in your Format cluster is Rate, Bits, Channels, while the order in the "sound format" cluster on Sound Output Configure.vi is sample rate (S/s), number of channels, bits per sample. LabVIEW connects clusters according to the cluster order. How to avoid this: Pop up on the sound format connector-pane terminal on the Sound Output Configure.vi icon on the block diagram and choose Create Control. That will put a control with the same names and cluster order on the front panel. You can edit the names if you wish, as long as you do not change the cluster order. The alternative is to Unbundle the data from your cluster control and then Bundle it into the input cluster. I show this in the modification of your VI attached.
    The VI does not respond to the Stop button until the event structure executes, which only happens when a key is pressed. Fix: Add an event case for the Stop Value Changed event. I show this in the modification of your VI attached.
    The VI does not recognize changes in Octave, Amplitude, Osc Select, or Filter Frequency until the second keypress after changing any of these controls. Why? Dataflow. Those controls are probably read within microseconds after an iteration of the for loop starts. They are not read again until the next iteration regardless of when or how many times they are changed. The loop will not iterate until the Event structure completes, which only happens when a key is pressed. The Fix: Event cases for Value Changes on those controls. Note that this does not work because now there is no defined frequency. So, you also need some shift registers. Because of the problems mentioned, I did not put this into the modified VI.
Next, the event structure freezes the front panel until the code inside has completed. This becomes quite apparent when you set the duration to 2 seconds and press several keys quickly. The fix for this and for the problem in the paragraph above is a parallel-loop architecture, such as the Producer/Consumer design pattern.
    Not a problem but a different way of doing something: Use Scale by Power of 2 from the Numeric palette in place of the case structure connected to Octave.  I show this in the modification of your VI attached.
    Now to your question about tone duration: The duration of a signal generated by the Sine Waveform.vi and the others is determined by the sampling frequency and the number of samples. You are a student so you can do the math. You need to adjust the number of samples because the sampling frequency is fixed.
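For example (a back-of-the-envelope sketch; the 44.1 kS/s rate is just an illustrative assumption):

```python
SAMPLE_RATE = 44_100   # S/s; fixed by the sound output configuration
DESIRED_SECONDS = 2.0  # how long the tone should last

n_samples = int(SAMPLE_RATE * DESIRED_SECONDS)  # duration = samples / rate
print(n_samples)       # 88200 samples for a 2 s tone at 44.1 kS/s
```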
    The modified VI works fine on my iMac.
    Lynn
    Attachments:
    LabVIEWSynth.2.vi ‏89 KB

  • Labview addons problem

While opening an existing project, I have been prompted to find the control named "Telnet Session.ctl". LabVIEW is trying to load it from "\addons\internet\telnet\telnet.llb\Telnet Session.ctl", but couldn't find it. What should I do? I also see this box during the process.
    Solved!
    Go to Solution.
    Attachments:
    addon.png ‏15 KB

Thanks Dennis. I will have a deeper look into the problem. For now, I think it needs the Telnet functions, since one of the errors I am getting is "SubVI 'Telnet Read.vi' is missing". Another is "Type definition 'telnet_handle': type definition not found or contains errors", and there are many more.

  • Labview memory problem

I have an application generated by IOTech for using their analog I/O card with LabVIEW 7.1. When I start the application, my Task Manager shows LabVIEW.exe at about 30,000 K. It then goes on increasing with no limit. This causes my PC to slow down and sometimes leads to a crash. I don't understand the reason for such a huge memory increase. The file is attached in the hope of getting a solution.
    Thanks.
    Attachments:
    DaqBoard 1000 High Stress_v71.zip ‏1056 KB

That is chowing down 1.8 MB a minute, so that would be a problem.  Your options are:
Add more memory, to the point where you can complete a test.
Set breakpoints in the program to identify what step or steps are allocating the memory.  Once identified, you can decide to limit the total memory consumed (delete from the array once it reaches a certain size) or FIFO the data to disk to prevent the PC from crashing, as sketched below.
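A minimal sketch of that bounded-buffer / stream-to-disk idea (Python standing in for the VI; the acquisition source is simulated):

```python
from collections import deque
import random

MAX_POINTS = 100_000               # cap on the in-memory history
buffer = deque(maxlen=MAX_POINTS)  # oldest points are dropped automatically

def acquire_samples(n=1_000_000):
    # Stand-in for the DAQ read; yields simulated readings.
    for _ in range(n):
        yield random.random()

with open("acquisition.log", "w") as f:
    for v in acquire_samples():
        buffer.append(v)   # memory use stays flat once MAX_POINTS is reached
        f.write(f"{v}\n")  # the full record is streamed to disk, not held in RAM
```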
Hope this helps,
    Matthew Fitzsimons
    Certified LabVIEW Architect
LabVIEW 6.1 ... 2013, LVOOP, GOOP, TestStand, DAQ, and Vision

  • Labview webcam Problem

    Could anyone help me, please?
I've downloaded the LabVIEW webcam library from Peter Parente's homepage:
http://www.cs.unc.edu/~parente/labview/webcam1.4.zip
I use my webcam to capture an indicator display screen, but the characters I get are reversed (like looking in a mirror).
Could anyone suggest how to solve this problem? Is there any VI to flip the image that can be used with Peter's library? I use LabVIEW 6i, sir.
    Thank you very much
    Attachments:
    display_capture.JPG ‏9 KB

    Hi Prart
1.- If you have IMAQ Vision 7.1, download the USB IMAQ driver from this web page and you will have no such problems.
2.- If not, convert your image to a 2D array and see the attached picture for how to mirror the image. If that does not work for you, try transposing the 2D array.
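In text form, the same 2D-array trick looks like this (a Python/NumPy sketch; the frame here is just simulated data):

```python
import numpy as np

# Stand-in for a grayscale frame from the capture VI (rows x columns of pixels).
img = np.arange(12).reshape(3, 4)

mirrored = img[:, ::-1]  # reverse the columns: undoes a left-right mirror
# equivalently: np.fliplr(img)
# if the image is rotated rather than mirrored, try the transpose instead: img.T
print(img)
print(mirrored)
```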
    Hope it helps
    Cheers
    Alipio
    "Qod natura non dat, Salmantica non praestat"
    Attachments:
    mirror.jpg ‏20 KB

  • Labview 6 problem

When I try to run LabVIEW 6, it always comes up with Error 37, which seems to be related to an I32 error and an "out ref" on the computer. How can I solve this problem? I am looking forward to your help. Thanks in advance.

If this program was provided with the laser, maybe you should contact them. This does not seem to be a problem with LabVIEW per se, but with the provided code.
Did it ever work before, or is this a new installation? Could it be that you are using the wrong serial port?
Do you have access to the block diagram of the program, or is it password-protected? Hit Ctrl+E when the VI is not running. Anything?
If the program instructions require you to run the program using the "Run Continuously" button, it seems that not a lot of thought and time went into the code development, and it is probably buggier than an ant farm. Without seeing the code, it is impossible to tell.
    LabVIEW Champion . Do more with less code and in less time .
