Synchronize waveform output from labview to bnc 2110

Hi,
I'm new to LabVIEW and I would appreciate examples of how to output a waveform graph from LabVIEW to a BNC-2110.
Eventually we will need several graphs, both analog and digital, outputting to a BNC-2110 synchronized with a clock.
Thanks.  Appreciate any help.
Grace

You can't output anything to a BNC-2110. The BNC-2110 is a dumb terminal block. You can, however, output signals (not graphs, which are visual elements) from a DAQ card that you control with LabVIEW. You would then connect the DAQ card to the BNC-2110. If you have an NI DAQ card, there are a large number of example programs that show you how it can be used.

Similar Messages

  • Create console output from LabView

    Hi all.
    Is there any way of creating console string output when calling a DLL from LabVIEW?
    I have to call a C++ program from LabVIEW and I'm using a DLL. I need to debug this program, and I think that using "cout" is the fastest and simplest way.
    Thanks in advance.
    Regards,
    Francisco

    No, there is no console output when calling a DLL. You could have your DLL simply write debug information to a log file.

  • Splitting data from waveform output from niScope Multi Fetch Cluster

    I am trying to split the data of a 1-D array of clusters.  I am using a PXI-5421 function generator and a PXI-5122 digitizer.  The niScope Multi Fetch Cluster.vi returns the waveform output data as a 1-D array of clusters of 3 elements.  This array contains information from channels 0 and 1.  I am trying to extract the waveform component information from each of the channels so I can operate on them and re-assemble two waveforms.  Can someone point me in the right direction?  I can't seem to come up with the right tool from the array or cluster tools.  Thanks.
    Jeff
    Solved!
    Go to Solution.

    You just use an Index Array and an Unbundle by Name or Unbundle.
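    Outside of LabVIEW, the Index Array + Unbundle pattern is just "pick one element, then pull one field out of the record." A minimal Python sketch of the same idea (the field names below are made up for illustration; they are not the actual niScope cluster element names):

```python
# Each element of the 1-D array is a "cluster" of 3 items; model it as a dict.
# Field names are illustrative stand-ins, not the real niScope cluster labels.
fetched = [
    {"relative_initial_x": 0.0, "x_increment": 1e-6, "wfm": [0.1, 0.2, 0.3]},  # channel 0
    {"relative_initial_x": 0.0, "x_increment": 1e-6, "wfm": [0.4, 0.5, 0.6]},  # channel 1
]

# "Index Array": pick the cluster for channel 1.
ch1_cluster = fetched[1]

# "Unbundle by Name": extract just the waveform samples from that cluster.
ch1_samples = ch1_cluster["wfm"]
print(ch1_samples)  # [0.4, 0.5, 0.6]
```

    In LabVIEW the same two steps are one Index Array node wired into one Unbundle by Name node, once per channel.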
    Message Edited by Dennis Knutson on 04-30-2009 10:41 PM
    Attachments:
    Index and Unbundle.PNG (4 KB)

  • How can I programmatically control the names of files output from LabVIEW into .pdf format (i.e. with Adobe PDF Writer or Distiller)?

    I would like to save some data forms for a standard test controlled with LabVIEW in PDF format. Due to the large number of forms and test reports, I would like to have LabVIEW automatically assign the .pdf file name based on the test number already contained in the LabVIEW code. How can I do this?
    Note: This question is very similar to:
    "Labview and Adobe Acrobat output" posted by John Balone on 1/26/2000. The responses offered good suggestions, but it is not clear to me how to implement them.

    This information is essential if you plan to use the Acrobat Distiller printer driver and any of the examples listed here:
    http://zone.ni.com/devzone/devzone.nsf/webcategories/EADE78F29101E8DB862567AC0058596B?opendocument&node=DZ52095_US
    -Graeme, rayodyne.com
    Attachments:
    1_-_Printer_Configuration_with_Acrobat_Distiller.jpg (64 KB)
    2_-_Printer_Configuration_with_Acrobat_Distiller.jpg (33 KB)

  • SRS DS340 Arbitrary Waveform load from LabView

    Has anyone else experienced problems using LabView to load arb wave data
    into one of these machines? Does anyone have a working piece of Labview to
    fulfil this task that I could see?
    Pete


  • Arbitrary waveform generation from large text file

    Hello,
    I'm trying to use a PXI 6733 card hooked up to a BNC 2110 in a PXI 1031-DC chassis to output arbitrary waveforms at a sample rate of 100kS/s.  The types of waveforms I want to generate are generally going to be sine waves of frequencies less than 10 kHz, but they need to be very high quality signals, hence the high sample rate.  Eventually, we would like to go up to as high as 200 kS/s, but for right now we just want to get it to work at the lower rate. 
    Someone in the department has already created for me large text files (> 1 GB) with nine columns of numbers representing the output voltages for the channels (there will be 6 channels outputting sine waves and 3 other channels with a periodic DC voltage).  The reason for the large file is that we want a continuous signal for around 30 minutes to allow for equipment testing and configuration while the signals are being generated.
    I'm supposed to use this file to generate the output voltages on the 6733 card, but I keep getting numerous errors and I've been unable to get something that works. The code, as written, currently generates error -200290 immediately after the buffered data is output from the card.  Nothing ever seems to get enqueued or dequeued, and although I've read the LabVIEW help on buffers, I'm still very confused about their operation, so I'm not even sure if the buffer is working properly.  I was hoping some of you could look at my code and give me some suggestions (or sample code too!) for the best way to achieve this goal.
    Thanks a lot,
    Chris(new Labview user)

    Chris:
    For context, I've pasted in the "explain error" output from LabVIEW to refer to while we work on this. More after the code...
    Error -200290 occurred at an unidentified location
    Possible reason(s):
    The generation has stopped to prevent the regeneration of old samples. Your application was unable to write samples to the background buffer fast enough to prevent old samples from being regenerated.
    To avoid this error, you can do any of the following:
    1. Increase the size of the background buffer by configuring the buffer.
    2. Increase the number of samples you write each time you invoke a write operation.
    3. Write samples more often.
    4. Reduce the sample rate.
    5. Change the data transfer mechanism from interrupts to DMA if your device supports DMA.
    6. Reduce the number of applications your computer is executing concurrently.
    In addition, if you do not need to write every sample that is generated, you can configure the regeneration mode to allow regeneration, and then use the Position and Offset attributes to write the desired samples.
    By default, the analog output on the device does what is called regeneration. Basically, if we're outputting a repeating waveform, we can simply fill the buffer once and the DAQ device will reuse the samples, reducing load on the system. What appears to be happening is that the VI can't read samples out from the file fast enough to keep up with the DAQ card. The DAQ card is set to NOT allow regeneration, so once it empties the buffer, it stops the task since there aren't any new samples available yet.
    If we go through the options, we have a few things we can try:
    1. Increase background buffer size.
    I don't think this is the best option. Our issue is with filling the buffer, and this requires more advanced configuration.
    2. Increase the number of samples written.
    This may be a better option. If we increase how many samples we commit to the buffer, we can increase the minimum time between writes in the consumer loop.
    3. Write samples more often.
    This probably isn't as feasible. If anything, you should probably have a short "Wait" function in the consumer loop where the DAQmx write is occurring, just to regulate loop timing and give the CPU some breathing space.
    4. Reduce the sample rate.
    Definitely not a feasible option for your application, so we'll just skip that one.
    5. Use DMA instead of interrupts.
    I'm 99.99999999% sure you're already using DMA, so we'll skip this one also.
    6. Reduce the number of concurrent apps on the PC.
    This is to make sure that the CPU time required to maintain good loop rates isn't being taken by, say, an antivirus scanner or something. Generally, if you don't have anything major running other than LabVIEW, you should be fine.
    I think our best bet is to increase the "Samples to Write" quantity (to increase the minimum loop period), and possibly to delay the DAQmx Start Task and consumer loop until the producer loop has had a chance to build the queue up a little. That should reduce the chance that the DAQmx task will empty the system buffer and ensure that we can prime the queue with a large quantity of samples. The consumer loop will wait for elements to become available in the queue, so I have a feeling that the file read may be what is slowing the program down. Once the queue empties, we'll see the DAQmx error surface again. The only real solution is to load the file to memory farther ahead of time.
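    The producer/consumer shape described above can be sketched with a plain queue. This Python stand-in (the chunk size, the "file read", and the "DAQmx Write" are all simulated here, not real DAQmx calls) shows the two ideas that matter: commit large chunks per write, and prime the queue before the consumer starts draining it:

```python
import queue
import threading

CHUNK = 10_000     # samples committed per "write" (bigger = longer minimum loop period)
PRIME_DEPTH = 4    # chunks to queue up before the consumer/task starts

buf = queue.Queue(maxsize=16)
written = []

def producer():
    # Stand-in for reading the big text file ahead of time, in large chunks.
    for i in range(10):
        buf.put([float(i)] * CHUNK)   # blocks if the queue is full
    buf.put(None)                     # sentinel: end of data

def consumer():
    # Stand-in for the DAQmx Write loop feeding the card's background buffer.
    while True:
        chunk = buf.get()
        if chunk is None:
            break
        written.append(len(chunk))    # "write" the chunk to the card

t = threading.Thread(target=producer)
t.start()
# Prime the queue before "starting the task", so the card can't outrun the file reader.
while buf.qsize() < PRIME_DEPTH:
    pass
consumer()
t.join()
print(sum(written))  # 100000 samples "written" in total
```

    The same structure in LabVIEW is the classic producer loop (file read + Enqueue Element) and consumer loop (Dequeue Element + DAQmx Write), with the DAQmx Start Task held off until the queue has depth.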
    Hope that helps!
    Caleb Harris
    National Instruments | Mechanical Engineer | http://www.ni.com/support

  • PCI 6014 and BNC 2110 digital output

    Hi,
    I am trying to use a PCI-6014 through a BNC-2110 to output 5 V on a digital line.
    Please give advice or examples.
    Thanks

    First, there's the Getting Started with DAQmx material. Then, on the Measurement I/O>DAQmx palette, there is the DAQ Assistant. Also, open the Example Finder and, from the main tab, go to Hardware Input & Output>DAQmx>Digital Generation. The simplest examples are Write to Digital Port and Write to Digital Line.

  • LabVIEW Video Waveform Source from Video Test Image for Multisim simulation

    Hi,
    I want to simulate a video amplifier circuit in Multisim. I need a composite video signal source for feeding the input to the amplifier. In one of the application notes I saw that LabVIEW can be used to create a Video Waveform Source from a Video Test Image (BMP). I've attached the sheet for reference. My questions are below:
    1. Do I need to create the Video Signal Source in LabVIEW or is it already available with LabVIEW library?
    2. How to call/access this Video Signal Source from LabVIEW for simulation in Multisim?
    3. Any examples available?
    4. Which versions of LabVIEW and Multisim support this feature?
    Any help is truly appreciated.
    (Topic of the Application Note: Circuit Design Using Simulation and Virtual Instrumentation, Applications in Biomedical Engineering)
    Regards,
    Sinoj
    Attachments:
    circuit_design_simulation_and_virtual_instrumentation_page26.pdf (314 KB)

    Hi Mahmoud,
    Thanks for your reply.
    I searched the internet for the VI 'Simple create composite signal from BMP image.vi' (since that is how it is named in the application note), but could not find it. I have some experience with LabVIEW, so I hope that once I get this VI I can plug it into Multisim. But in order to add it to the 'lvinstruments' folder of the Multisim installation directory I need the DLL and LLB files (not the VI, as I understand it). If these DLL and LLB files are available, I think it will be easier to integrate with Multisim; otherwise I don't know what to do with just the VI.
    Anyway attached the complete Application Note.
    Regards,
    Sinoj
    Attachments:
    circuit_design_simulation_and_virtual_instrumentation.pdf (2293 KB)

  • No output from Simulink to LabVIEW with Simulation Interface Toolkit (0/1)

    I'm able to send data from LabVIEW to Simulink, but the LabVIEW
    display doesn't show any output from the Simulink model. I've attached
    two very simple LabVIEW and Simulink files that give me this
    problem. I have the same problem when trying to implement the sine wave
    example in the manual for the toolkit as well.
    I'm using LabVIEW 7.1 Pro, Matlab 6.5 and Simulink 5.0 on WinXP pro.

    The Simulation Interface Toolkit 2.0.1 or earlier does not work with LabVIEW 7.1. There is a patch available (or available soon) from NI support that corrects this problem. The patch is Simulation Interface Toolkit 2.0.2.

  • Output of BNC-2110

    Hi.
    I am trying to generate an analog output to the BNC-2110 via graphical programming. How do I do it?
    Also, may I know if it is possible to convert this output into PWM before I output to the BNC-2110?
    Many thanks.

    By itself, the BNC-2110 is not capable of inputting or outputting any type of signal. It is just a dumb terminal block. You have to have it connected to an actual data acquisition board in your system. There are many shipping examples for the various types of data acquisition boards. The recommended API is called DAQmx. Go to Help>Find Examples and expand the Hardware Input and Output section.
    PWM is possible with some boards. You'll have to find out the type in your system.

  • BNC 2110 2111 synchronization

    Hi, we are using several setups, including 2 BNC-2111s and 3 BNC-2110s with DAQCard-6062s in different computers, for a big measurement. We are wondering whether it is possible to synchronize the measurement among these devices. Our basic idea is to route a trigger signal to PFI0 and other PFI channels. Does that work? Any suggestion is appreciated.

    Hi knji,
    On your BNC-2110 you have the "Digital and Timing IO" connector block. This is what you will use to connect your signals for your quadrature encoder. In the application note it says "Pin 37/PFI8/GPCTR_Source." You would connect this to PFI 8 on your BNC-2110. The other connections are just the same. For more information on the pinout for your DAQ card look in the E-series help manual:
    http://digital.ni.com/manuals.nsf/websearch/0E0DFDBB7706687A86256F6300560584?OpenDocument&node=132100_US
    Go to IO Connector >> IO Connector Pinouts >> 68-Pin Connector
    -Sal

  • How to display error messages and output from Matlab (which Matlab would typically send to its command window, but no longer does when called by LabVIEW) in LabVIEW, or allow it to be dumped into the Matlab Command Window?

    Using LabVIEW 6i and Matlab 6.1. I want to be able to see Matlab warnings and error messages either in the Matlab Command Window or in LabVIEW itself. Currently Matlab is called by LabVIEW (which is working). However, I would like to debug and/or modify my Matlab script file to better understand how the two programs are interfacing. It is difficult since no data or messages can currently be displayed in the Matlab Command Window; LabVIEW is suppressing that, and I would like to change it if possible. If it is not possible to send these messages to the Matlab Command Window, can I at least make it possible to see Matlab's actual warnings and/or error messages in LabVIEW?

    I don't think you can debug your Matlab script from LabVIEW. The following webpage talks about this:
    http://digital.ni.com/public.nsf/3efedde4322fef198​62567740067f3cc/19106e318c476e608625670b005bd288?O​penDocument
    My suggestion would be to write a script in Matlab and thoroughly test it before calling the script from a Matlab script node in LabVIEW.
    Chris_Mitchell
    Product Development Engineer
    Certified LabVIEW Architect

  • Output data from LabVIEW, input to C++ code

    I currently have a LabVIEW VI which grabs data (range and angle measurements) from the RS232 serial port, and formats this data into two values - X and Y coordinates (double data types).  What I want to do is pass these individual numerical values (not an array of X/Y coordinates) to a C++ gesture recognition program that inputs X and Y coordinates and determines the gestures.
    What is the best way of passing a value from LabVIEW to C++ code?
    I apologize if this was answered in another thread - I searched through some, but couldn't find any information relevant to my question.  Thanks for the help!

    Hi delvec28,
    You may want to build a DLL.  A DLL is like a collection of functions - compiled in a way to be used by other programs.
    If the C++ code calls a LabVIEW function which returns values to the C++ code, then the LabVIEW code will be compiled as a DLL.
    C++ code could also be compiled into a DLL usable by LabVIEW.
    There are also ways for separate applications to share data - LabVIEW can be an ActiveX server, LabVIEW also supports DDE (Dynamic Data Exchange)  - these are both Windows-OS-specific.  LabVIEW can be a .NET client, though (as far as I know) LabVIEW cannot yet implement a .NET server.
    TCPIP is yet another (OS independent) method of sharing data between LabVIEW and another application - it's really not too complicated (at least not on the LabVIEW side .)
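    As a sketch of the TCP/IP option (plain Python on localhost standing in for both the LabVIEW sender and the C++ gesture-recognition receiver; the 16-byte message layout is just an assumed convention, not a fixed protocol): pack each X/Y pair as two network-order doubles, send, and unpack on the other side.

```python
import socket
import struct
import threading

received = []

# Stand-in for the C++ receiving end: accept one connection, read one (x, y) pair.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def server():
    conn, _ = srv.accept()
    data = b""
    while len(data) < 16:          # two IEEE-754 doubles = 16 bytes
        data += conn.recv(16 - len(data))
    received.append(struct.unpack("!dd", data))   # network byte order
    conn.close()

t = threading.Thread(target=server)
t.start()

# Stand-in for the LabVIEW end: send one X/Y coordinate pair.
cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(struct.pack("!dd", 1.5, -2.25))
cli.close()
t.join()
srv.close()
print(received)  # [(1.5, -2.25)]
```

    On the LabVIEW side this maps onto the TCP Write/TCP Read primitives; on the C++ side, ordinary BSD sockets.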
    Are there two applications running (C++ and LabVIEW)?  If not, in which language is the main program written?
    Cheers!
    "Inside every large program is a small program struggling to get out." (attributed to Tony Hoare)

  • Difficulty synchronizing data from a VISA resource with data from a physical channel

    Essentially what I'd like my program to do is control the electrical power going (sourcing either current or voltage) into a light and measure the intensity of the light at a given power level, and then do this automatically for ~1000 increments of the source voltage/current.
    One of my lab partners made a program a while back which does what we want ("LIV Sweep Rev.vi", the first image in the link at the end of this post), but it's a bit sloppy: the program is able to interact with the power supply (connected directly to the computer via USB) and make it turn on, increase the voltage/current, and record the "IV" characteristics just fine. The program can also interact with the light detector (connected via an NI BNC-2110) and gather the photocurrent data. The problem, however, is that the data wouldn't be in sync. The photocurrent data for when the light was actually supplied with 1 V would be improperly recorded in the cell for when 2 V are applied to the light. To fix this problem my lab partner added a time delay so that the detector will pause for ~0.35 s (user-specified in the front panel) before gathering data.
    The program works, but I figured there had to be a better way. The thing about the timing, however, is that it changes from run to run. Sometimes a delay of 0.45 s works well, and other times the power supply will have shut down while the detector is still gathering data (and thus we miss the low end of the sweep). Other times the detector will turn on too early, which will cut off the high end of the sweep.
    (Note: I have next to zero experience with LabVIEW, but I know a little bit of java, so I understand most programming jargon)
    I spent all day yesterday trying to find out how to synchronize two data acquisitions (my attempt is shown in "LIV SweepSummerDuncanRev3.vi" in the link at the bottom, 2nd image), but I'm running into trouble when I try to trigger the sample clock using the VISA Resource Name for the power supply.
    The programs can be viewed here:
    http://imgur.com/a/Up3eS
    I'd really appreciate any and all advice that you folks have to offer!

    Change the code such that rather than using a ramp from your power supply, just output a single value. Then do your measurement and then step to the next value in your ramp.
    You can use a State Machine (search for that term in LV if you do not know it).
    Some gear will allow specifying a ramp driven by an external clock. If your widget can handle an external clock/trigger, that approach could run faster.
    Without hardware to synchronize the various sub-systems you would have to resort to using an RT OS, and depending on the instruments, even that could be hit-or-miss.
    So re-code to step-measure-step-measure etc.
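    Ben's step-measure loop, sketched as a tiny state machine in Python (set_voltage and read_detector are hypothetical stubs standing in for the VISA write to the supply and the DAQmx read from the detector):

```python
# Hypothetical instrument stubs: a real version would do a VISA write
# and a DAQmx read here. The fake detector is just linear in the voltage.
supply_v = 0.0

def set_voltage(v):
    global supply_v
    supply_v = v                      # VISA write to the power supply would go here

def read_detector():
    return 2.0 * supply_v             # DAQmx photocurrent read would go here

results = []
state = "SET"
sweep = [v * 0.01 for v in range(5)]  # tiny demo sweep: 0.00 .. 0.04 V
i = 0

# Step-measure state machine: each measurement happens only after its
# set-point has been applied, so the two streams can never drift out of sync.
while state != "DONE":
    if state == "SET":
        set_voltage(sweep[i])
        state = "MEASURE"
    elif state == "MEASURE":
        results.append((sweep[i], read_detector()))
        i += 1
        state = "SET" if i < len(sweep) else "DONE"

print(len(results))  # one reading per set-point
```

    The same structure in LabVIEW is a while loop around a case structure, with the state enum carried in a shift register.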
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Difficulties connecting 2 BNC-2110 signals to the scope

    Hi,
    maybe someone has encountered the same difficulties...
    I have a circuit with DAQmx blocks that generates a waveform and writes it to the buffer.
    The output channels 0 and 1 coincide with the outputs on the BNC-2110 block.
    These outputs are wired with coax to scope channels 1 (x) and 2 (y).
    Now when I disconnect 1 cable I see a perfect waveform. But once I connect both cables to the scope
    I get only a perfect reading at channel Y. Channel X is corrupted.
    When I switch the cables at the bnc-2110 and leave the connection at the scope, still Y is perfect, X is corrupted.
    So connecting output channels 0 or 1 to scope channel 1 (x)  does not make a difference. X stays corrupted.
    The same holds for Y. It stays perfect.
    The odd thing is that when I connect scope channel 1 (x) with a waveform generator, and scope channel 2 (y) with
    channel out 0 or 1 of the bnc, I get 2 perfect readings. Only thing I can see is that the output of the generator is grounded.
    When I reverse this. I get a perfect reading from the bnc at scope channel 2 (y) and again a corrupted signal at scope channel 1 (x)?!
    So it seems that I get 2 perfect readings when x from the bnc has been replaced by the function generator.
    Both should be grounded. The switch is in position GS (instead of FS). I connected the scope and computer to the same
    electrical adapter, to be sure, but still have the impression I have grounding issues...
    Bob 

    Hi Bob,
    Sorry for the late answer. Is your problem solved?
    If not, could you be a little more specific about what you mean by DAQmx blocks? Could you tell us which hardware you are using and what your program is?
    So we can do the test or try to reproduce the problem.
    Regards,
    Julien Roland - District Sales Manager
    NI Belgium - Technical Support
    Don't forget to rate a good answer
