Handling Differential Signals from cRIO

Hello All,
  Looking for methods to measure differential signals with cRIO.  I know this can be done pretty easily with DAQmx interfaces, but cRIO does not appear to be inherently capable of handling differential signals.  The application is measuring analog resolver voltages (HI and LO).  Using a cRIO-9082 with a 9205 module, LabVIEW 2013.
Question also posted on hardware boards.
Thanks in advance.
GSinMN

Answered my own question.  It turns out I was trying to configure the inputs in the wrong place: in the project, right-click the module and select Properties, rather than configuring the individual pins.
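(A follow-up note for anyone finding this later, worth double-checking against the NI 9205 manual: in differential mode the module pairs channels, e.g. AI0 with AI8, so only 16 of the 32 channels are available.)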
Thanks anyway.
GSinMN

Similar Messages

  • Handling POSIX signals in AIR

    Is there any way to handle signals on POSIX platforms in AIR? I'm developing a kiosk application in AS3 for AIR that will run on Linux, and some features would be easier to implement if I could handle signals from the OS (-HUP, -INT, etc.).
    If not, is there any other way to listen for events from other applications or the OS - for example, sending a signal to the AIR app to tell it to reload its configuration after it has been updated?

    How about using a native process as a bridge to your AIR application? You can implement stub handlers in the native process just to send messages to the AIR application. The fact that you need to signal the spawned process instead of your AIR application may of course be a drawback, depending on your implementation. As for interfacing with other applications, the first thing that comes to mind is sockets.
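    As a rough illustration of that bridge idea, here is a minimal sketch in Java (my assumption, not from the thread): the AIR side would launch it with NativeProcess and read the lines it prints on standard output. Note that sun.misc.Signal is an unsupported JDK API, so treat this as best-effort.

        import sun.misc.Signal;

        // Hypothetical "signal bridge": traps POSIX signals and forwards them
        // as lines on stdout for the AIR app's NativeProcess to consume.
        public class SignalBridge {
            public static void main(String[] args) throws Exception {
                for (String name : new String[] {"HUP", "INT", "USR1"}) {
                    Signal.handle(new Signal(name),
                            sig -> System.out.println("SIGNAL " + sig.getName()));
                }
                Thread.currentThread().join(); // block forever; handlers do the work
            }
        }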

  • Fastest way to transfer data from cRIO

    Does anybody know the fastest way to transfer data from a cRIO? I tested shared variables, but they are not fast enough. What is the fastest speed we could achieve with shared variables on cRIO? How can I transfer 50,000 32-bit words per second from the cRIO to my PC? This has to run 24 hours a day.
    Thanks
    B.
    Benoit Séguin
    Software Designer

    Hi Benoit,
    Thanks for your post and I hope you're well. I noticed you've not received a reply, so I would like to offer my advice.
    Shared variables are one way to communicate over Ethernet. You can also use UDP, TCP, and VI Server.
    UDP is the fastest network protocol, but it includes little error handling and can lose data. TCP is commonly used for large systems; it takes more programming effort than shared variables, but it provides a good mix of speed and reliability. You can use VI Server to run VIs and/or to access and set the values of controls and indicators, but it is slower and not recommended for larger data sets.
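    To give a feel for the PC side of a TCP approach, here is a minimal sketch in Java (the cRIO address and port are assumptions; it expects the cRIO to stream big-endian I32s, which matches LabVIEW's default byte order). At 50,000 I32s per second the link only needs about 200 KB/s, which is modest for TCP over Ethernet.

        import java.io.DataInputStream;
        import java.net.Socket;

        public class CrioReader {
            public static void main(String[] args) throws Exception {
                // Address and port are placeholders; match the cRIO-side server.
                try (Socket sock = new Socket("192.168.1.100", 6340);
                     DataInputStream in = new DataInputStream(sock.getInputStream())) {
                    int[] block = new int[50_000]; // one second of data at the stated rate
                    while (true) {
                        for (int i = 0; i < block.length; i++) {
                            block[i] = in.readInt(); // big-endian, LabVIEW's default
                        }
                        // process or log the block here
                    }
                }
            }
        }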
    Please let me know what you think,
    Kind Regards,
    James.
    Kind Regards
    James Hillman
    Applications Engineer 2008 to 2009 National Instruments UK & Ireland
    Loughborough University UK - 2006 to 2011
    Remember Kudos those who help!

  • Differential Signal to Single Ended Signal Conversion

    Hi,
    I'm facing a problem here. I need an ADC that reads 0-20 mA. The signal comes from a signal conditioner that produces a differential signal. However, an NI device with single-ended inputs is much cheaper than one with differential inputs.
    The signal conditioner already exists and I can't change that.
    I'm looking for something that can convert the differential signal to a single-ended signal.
    I tried finding an IC that does that, but couldn't find one. Do they exist?
    Maybe it's easier for voltage than for current.
    How is this usually done?
    Best regards,
    Arvel

    Use a current-sensing resistor and then a current-shunt amplifier.  The amplifier will amplify the voltage drop across the resistor.  From there, you can just read the output voltage with a DAQ.  Create a DAQmx scale to do the math to convert that voltage into a current reading.
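    For example, a hedged arithmetic sketch of that chain (the shunt value and amplifier gain are illustrative assumptions, not recommendations):

        // A 250-ohm shunt maps 0-20 mA onto 0-5 V; with unity amplifier gain,
        // the DAQmx custom scale is simply I = V / R.
        public class ShuntScale {
            public static void main(String[] args) {
                double shuntOhms = 250.0; // assumed shunt value
                double ampGain   = 1.0;   // assumed amplifier gain
                double voltsRead = 3.0;   // example DAQ reading
                double amps = voltsRead / (shuntOhms * ampGain);
                System.out.printf("Loop current: %.1f mA%n", amps * 1000.0); // 12.0 mA
            }
        }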
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • TTL signals from motor outputs

    I'm currently using a stepper motor with a MID-7604 drive and a PCI-7344 controller. I would like to output TTL signals from the drive at certain motor positions, but do not have an encoder (which is required for breakpoint signals). Is it possible to 'construct' a TTL-type signal using the low-level motion functions in LabVIEW, and then output it through a motor axis that is not currently being used?

    Hello,
    Depending on the type of output that you want to generate (single pulse or pulse train), you could use the digital I/O port of the motion controller with the Set IO Port MOMO.flx function to toggle DIO lines, or you could use the Configure PWM Output.flx function to generate a pulse train.  The downside is that this will be software-timed, based on the current position as determined by the controller.
    There is no way to manually modify the motion-control signals generated by the controller; that is all handled by the controller's FPGA.
    Regards,
    Scott R.
    Applications Engineer
    National Instruments

  • Generating a 1 mV output signal from AO1 using an NI 9381 card

    Hi,
    I am using an NI 9074 cRIO with an NI 9381 I/O card, and I am trying to generate a 1 mV signal from the AO port.
    1. Programmatically, I supply 1 mV to the AO port and read it back on the AI port.
    2. On the AI port I am getting 22 mV, even without the AO port connected to it.
    3. Is there any method to achieve this task?
    Please help me with this. Thank you.

    According to the specifications, the offset error can be as large as 16 mV, depending on calibration and temperature.
    Devices with input multiplexers can experience an effect called ghosting, which results from capacitances in the input circuitry charging to unknown voltages due to leakage currents from adjacent channel inputs or other internal circuit nodes. Measurements made with the input open or floating do not have any meaning. The manual recommends input source impedances of less than 1000 ohms.
    As has already been mentioned, the resolution is coarser than 1 mV. When you combine the offset and gain errors of both the AO and AI channels, in the best case you will not know the output to within 11.5 mV, and in the worst case the error could be greater than 66 mV.
    If you need both 1 mV resolution and 1 mV accuracy, you will need a better device.
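    To illustrate how such an error budget combines (a sketch with placeholder values, not the NI 9381's actual specifications - take the real offset and gain errors from the datasheet):

        public class ErrorBudget {
            public static void main(String[] args) {
                // All spec values below are placeholders, NOT NI 9381 figures.
                double setpointV  = 0.001; // 1 mV target
                double aoOffsetV  = 0.016; // assumed AO offset error
                double aiOffsetV  = 0.016; // assumed AI offset error
                double aoGainFrac = 0.005; // assumed AO gain error (fraction of reading)
                double aiGainFrac = 0.005; // assumed AI gain error (fraction of reading)
                double worstV = aoOffsetV + aiOffsetV
                        + setpointV * (aoGainFrac + aiGainFrac);
                System.out.printf("Worst-case uncertainty: %.2f mV%n", worstV * 1000.0);
            }
        }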
    Lynn

  • How to trap (interrupt) a signal from the JVM

    How can I capture (trap) the signal from the JVM when it terminates due to an Error/RuntimeException, from a shell script (Unix/Linux OS)? Is it possible?
    I am executing a Java class from a shell script. Under certain conditions it throws a RuntimeException.
    I want to capture the signal from the terminating JVM in the case of a RuntimeException.
    How do I achieve this? Can anyone solve this problem?

    Hi
    Using the POSIX signal facilities from a Java program, we can trap the JVM's signals.
    Additional info:
    Use the sun.misc.SignalHandler interface
    and the method sun.misc.Signal.handle(new sun.misc.Signal(signalName), this);
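    For the shell-script side of the question, note that an uncaught RuntimeException already terminates the JVM with a nonzero exit status, which the script can test with $?. A minimal, hedged sketch (class name and message are hypothetical):

        public class Worker {
            public static void main(String[] args) {
                // Shutdown hooks run on normal exit and on SIGINT/SIGTERM,
                // but not on SIGKILL.
                Runtime.getRuntime().addShutdownHook(new Thread(
                        () -> System.err.println("JVM shutting down")));
                if (args.length == 0) {
                    // Uncaught: the JVM prints a stack trace and exits with a
                    // nonzero status that the calling shell script can detect.
                    throw new RuntimeException("simulated failure");
                }
            }
        }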

  • Read/acquire signals from the remaining empty channels of an NI 9205 by modifying the VI of an ATI force/torque transducer, which uses the first 6 channels of the DAQ card

    Hello,
    I am using a force/torque transducer by ATI Automation. It provides its own VI to measure and write the data of the 6 signals (3 axes of force, 3 axes of torque).
    My NI system is as follows:
    cDAQ-9172 chassis: Slot 5 - NI 9401, Slot 6 - NI 9205, Slot 8 - NI 9237 (excitation voltage to the sensor/transducer (0-5 V) using the 9237).
    I connect the 6 signals from the transducer cable to channels ai0-ai5 on the NI 9205 (differential). The VI uses a 6x6 calibration matrix to finally display the calibrated voltage data.
    Now, I wish to use the remaining channels that are empty on the 9205, namely ai17, ai18 and ai19, for other signal measurements.
    Channel ai17 is connected to a Hall sensor that gives out a square waveform corresponding to the rotor RPM; ai18 and ai19 are connected to the rotor motor power supply's voltage and current signals, respectively.
    But I am not able to access channels ai17, ai18 and ai19 from ATI's VI. The VI loads the complete DAQ card (9205) but uses only the first 6 channels, so the array data wire contains only 6 channels, and I can only split it into 6 individual signals.
    If I use a separate DAQ task to read channels ai17-ai19, I get an error saying that these channels are reserved for another task. Please tell me how I can access the remaining channels apart from the first 6.
    I am attaching the VI here.
    Steps to run "MEASUREMENT main.vi":
    Load the calibration file: FT8840.cal
    Load the DAQ card: NI 9205
    Load ctr0 of the NI 9401 (I am using this counter for RPM measurement; this is my addition to the original VI)
    Attachments:
    DAQ - Copy.zip ‏574 KB

    There's nothing I can do to help directly.  I don't have your hardware, so I can't modify your code and set it up to make sure it runs properly, and most people on the forum probably don't either.  This is where you'll have to put your programming and LabVIEW skills to work to solve the problem and make it run the way you want.  If you run into a specific problem and get stuck, please post back.
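    One general pointer, though (worth verifying against the DAQmx documentation): a device's analog-input subsystem can be reserved by only one task at a time, which is why a second task on ai17-ai19 reports those channels as reserved. The usual fix is to add the extra channels to the same task the ATI VI creates - e.g. a channel list like cDAQ1Mod6/ai0:5, cDAQ1Mod6/ai17:19 (module name assumed) - and then index the resulting 9-channel array.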

  • Wireless signal from iMac to TV?

    I have a new 20" iMac G5 iSight on the way and have a question. Has anyone tried to hook up an iMac (or another Mac) wirelessly to a TV? There are products like this www.amazon.co.uk/exec/obidos/ASIN/B000068TYR/ref=erra_acc_dp1/026-6032989-4654855 that will send a signal from your antenna to your TV wirelessly. I thought maybe, in combination with a TV tuner like EZ, there would be a way to get the signal from the Mac to the TV. That way I don't have to drill a cable hole in my living-room floor down to my office to enjoy Front Row on my TV. Are there any other solutions or suggestions?
    (And I don't think there will be an AirPort Express A/V launched next week... though I wish!)

    You should be able to connect your ATV directly to your iMac using Internet sharing, but you can't use the same NIC that the iMac is using for its connection. If your iMac is connected to your router by wi-fi, you can share the Ethernet port with the ATV. If your iMac is connected to your router by Ethernet cable, you can share your iMac's AirPort with the ATV.

  • How to read and display a signal from my microcontroller (MCB1700) in LabVIEW, connected via a CAN port on a PXI machine

    How do you read and display a signal from a microcontroller (MCB1700) in LabVIEW, connected via a CAN port on a PXI machine?
    I tried using the DAQ Assistant, but the CAN port is not included as one of the supported physical channels, even though all its drivers are up to date.
    Please help..
    Thanks.

    Attached herewith is a screenshot of what is showing in MAX.
    The CAN ports are on an NI PXI-8461.
    Hopefully that clarifies something.
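    (A likely explanation, worth confirming: the PXI-8461 is an NI-CAN interface rather than a DAQmx device, so it is programmed with the NI-CAN API - the DAQ Assistant only lists DAQmx physical channels, which is why the CAN ports don't appear there.)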
    Attachments:
    Untitled.png ‏212 KB

  • How to read and display a signal from my PIC microcontroller in LabVIEW?

    Hi,
    I am doing a project on a pulse oximeter, and I am trying to read and display the signal from my PIC microcontroller in LabVIEW. How do I go about doing it? I am using the PIC16F877 and making use of the USART.
    How do I implement and initialise the USART on the PIC16? What are the steps to be taken? Please guide me through the process. This link provides information about the USART connection and initialisation: http://ww1.microchip.com/downloads/en/AppNotes/00774a.pdf - is it correct to use that code in my program?
    Also, I am using the RS-232 serial interface to connect to the PC, with a DB9 connector. Which template and VI can I use? Am I supposed to use NI-DAQmx and VISA? For my USART connection I am using the MAX232 driver, which suits my application as I am working with +5 V. So far I have been reading and trying things out in LabVIEW, and the steps I have taken are:
    1) Opened a new VI and used the Instrument I/O (Read & Display) template.
    2) How do I configure the Instrument I/O Assistant Express VI to read the information from my device connected to COM1?
    3) I was reading the LabVIEW manual, 'Getting Started with LabVIEW', and following the steps on page 55, Communicating with an Instrument. Am I on the right track?
    4) How do I check and make sure that the port settings of the I/O Assistant and my PIC micro match?
    Please help me out and guide me through the process. I am a student at a polytechnic and very new to LabVIEW, which we use as a development tool for the project. I have a deadline to meet and I hope I can get a response as soon as possible; your help will be greatly appreciated. I hope to hear from you soon. You can e-mail me your answers at [email protected]
    Thank You
    Best regards,
    Ashwin
    Ashwin Kumar Mansukhani
    Attachments:
    Getting Started with Labview.pdf ‏901 KB

    Hi Ashwin,
    It is a good idea to first make sure you can communicate with the microcontroller using serial-terminal software such as HyperTerminal or ProComm; refer to Microchip's recommendations on this.
    Once that works, you are ready to use LabVIEW.
    Here is a link which covers many aspects of serial communication. Link - click here
    You probably received a development kit, and they usually have a readily available interface. It's been a long time since I played with the PIC, but I seem to remember that you need to program the serial-communication driver (as well as at least a bootloader) to get serial communication going. The driver contains the protocols needed so that your PC can have a machine conversation with the target (PIC).
    It sounds like a fun and interesting project. Please avoid asking for replies to your personal email; by having the answers posted to this thread, you will get much more support and advice.
    Have fun,
    JLV

  • How to read and display a signal from my PIC microcontroller in LabVIEW

    Hi,
    I am doing a project on a pulse oximeter, and I am trying to read and display the signal from my PIC microcontroller in LabVIEW. How do I go about doing it? I am using the PIC16F877 and making use of the USART.
    How do I implement and initialise the USART on the PIC16? What are the steps to be taken? Please guide me through the process. This link provides information about the USART connection and initialisation: http://ww1.microchip.com/downloads/en/AppNotes/00774a.pdf - is it correct to use that code in my program?
    Also, I am using the RS-232 serial interface to connect to the PC, with a DB9 connector. Which template and VI can I use? Am I supposed to use NI-DAQmx and VISA? For my USART connection I am using the MAX232 driver, which suits my application as I am working with +5 V. So far I have been reading and trying things out in LabVIEW, and the steps I have taken are:
    1) Opened a new VI and used the Instrument I/O (Read & Display) template.
    2) How do I configure the Instrument I/O Assistant Express VI to read the information from my device connected to COM1?
    3) I was reading the LabVIEW manual, 'Getting Started with LabVIEW', and following the steps on page 55, Communicating with an Instrument. Am I on the right track?
    4) How do I check and make sure that the port settings of the I/O Assistant and my PIC micro match?
    Please help me out and guide me through the process. I am a student at a polytechnic and very new to LabVIEW, which we use as a development tool for the project. I have a deadline to meet and I hope I can get a response as soon as possible; your help will be greatly appreciated. I hope to hear from you soon. You can e-mail me your answers at [email protected]
    Thank You
    Best regards,
    Ashwin
    Ashwin Kumar Mansukhani
    Attachments:
    Getting Started with Labview.pdf ‏901 KB

    Hi Ashwin,
    I am not familiar with PIC microcontrollers, but I am assuming you mean that the microcontroller is sending out serial data that you want to read on another computer with LabVIEW. Please let me know if this is incorrect.
    What type of data is coming out of the serial port - is it ASCII or binary? The reason I ask is that serial communication in LabVIEW is done through an I/O layer called VISA, which sends and receives data as ASCII strings. You can later convert this data into whatever form you need, but this is what it is designed to read and write.
    You can check settings such as baud rate and data bits in a configuration utility called Measurement & Automation Explorer (MAX). When you open the MAX interface, expand the Devices and Interfaces entry on the left, and then expand the Ports entry to see your serial port. When you highlight this port and select the Port Settings tab at the bottom of the window, you can see what the current settings are and change them if you need to. You can also set these parameters in LabVIEW using the VISA Configure Serial Port VI.
    You can also test communication in MAX by right-clicking on the correct port and choosing Open VISA Session. Then choose the light-blue Basic I/O tab and go to the Read tab. When you click Execute, it should read in what is coming from the serial port. This will let you verify that the correct information is coming in before even trying to acquire the data in the LabVIEW environment.
    In LabVIEW, the best resource is the Basic Serial Write and Read example program that ships with LabVIEW. By examining the block diagram of this program, you will see the basic programming flow of serial communication in LabVIEW.
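    If you want to sanity-check the PIC's output outside LabVIEW and MAX first, here is a hedged sketch using the third-party jSSC Java library (the port name and the 9600-8-N-1 settings are assumptions - match them to your PIC's USART configuration):

        import jssc.SerialPort;

        public class PicSerialCheck {
            public static void main(String[] args) throws Exception {
                SerialPort port = new SerialPort("COM1"); // assumed port name
                port.openPort();
                port.setParams(SerialPort.BAUDRATE_9600, SerialPort.DATABITS_8,
                               SerialPort.STOPBITS_1, SerialPort.PARITY_NONE);
                for (int i = 0; i < 10; i++) {        // print ten reads, then stop
                    String chunk = port.readString(); // buffered bytes, or null
                    if (chunk != null) System.out.print(chunk);
                    Thread.sleep(500);
                }
                port.closePort();
            }
        }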
    I hope this information was helpful! Please let us know if there's anything else we can help with.
    john
    Applications Engineer

  • How to compare signals from two different .tdms files?

    I'm developing a lie-detection system in LabVIEW. For that, I need to compare the threshold physiological signals stored in a .tdms file with the signals I acquire continuously from the subject for each question asked. I use a respiration monitor and a heart-beat monitor together with the SensorDAQ to acquire the physiological signals. Which functions should I make use of? I have the following toolkits:
    1. Advanced Signal Processing Toolkit
    2. Adaptive Filter Toolkit
    3. Digital Filter Design Toolkit
    4. Biomedical Toolkit
    Please give me advice on this.
    Thank you.

    Lie-detection systems (polygraphs) generally work by comparing the physiological responses to the subject's own baseline, and do this in real time.  You could use a file that represents the stored "baseline" for the subject, but the overall environment, circumstances, general state of the subject, etc. may not be similar enough to easily see subtle changes.
    From the stored file, you might determine a set of parameters or thresholds that indicate "normal, relaxed" for each signal.  You could read the file into your LabVIEW program, process each channel, and save these parameters.  Then, when you have the new data, you can read the new data file in, process it in a similar way, and determine whether the parameters are outside the limits that you established for a "lie".
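    As an illustration of that thresholding step, here is a minimal sketch in Java (the sample values and the 3-sigma limit are arbitrary assumptions; the same logic maps directly onto LabVIEW array primitives):

        public class BaselineCheck {
            static double mean(double[] x) {
                double s = 0; for (double v : x) s += v; return s / x.length;
            }
            static double std(double[] x, double m) {
                double s = 0; for (double v : x) s += (v - m) * (v - m);
                return Math.sqrt(s / (x.length - 1));
            }
            public static void main(String[] args) {
                double[] baseline = {72, 74, 71, 73, 75, 72}; // e.g. heart rate, bpm
                double m = mean(baseline), sd = std(baseline, m), k = 3.0;
                double newSample = 88.0; // reading taken during a question
                boolean outside = Math.abs(newSample - m) > k * sd;
                System.out.println("Response outside baseline: " + outside); // true
            }
        }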
    You may get more responses to this question if posted in the Biomedical User Group
    Visit the NI Biomedical User Group at:
    www.ni.com/biomedusers

  • How can I tell if signals from two devices are truly synchronised?

    Hi there,
    How does one check that signals from two devices (two separate devices in a single X Series chassis) that should be synchronised actually are?  I am using a PXIe-6361 and a PXIe-4331 in a PXIe-1073 chassis, with LabVIEW 2011 SP1 64-bit. All devices use the Sample Clock from the 4331 and an AI Start Trigger, so they should be synchronised.
    I thought that writing the signal data to file and checking the timestamp for each column of data would be the most accurate, but I have been told that timestamps are software-created and therefore don't reflect the actual time that the signals were acquired by the hardware.  When I do this, the timestamps vary by up to 150 ms, which is larger than I expected.
    If I set the x-axis of the waveform graphs (on the GUI) to "Time", it appears that the first data sample is taken at different times for the two plots (one plot per device).
    If I set the x-axis of the waveform graphs to "Ignore time stamp" (so that the x-axis just starts from 0 rather than a date-time), the first data point occurs at "0" for both graphs. However, I'm not sure that this reflects the actual alignment of the signals.
    What is the best way to check if signals collected on different devices in the same chassis are actually synchronised?
    Thanks,
    Claire.

    Hi Lynn,
    Thanks for your help and for sending the demo.
    I understand the concept of how the signals will look if they're not synchronised, and your demo shows that nicely. I have been perplexed by someone telling me that the timestamps in the output file (and, I assume, the timestamps on a waveform graph) give no indication of whether signals are synchronised, the reason being that the timestamps are manufactured by the software, not the DAQ hardware. They suggested that I set "ignore waveform timestamps" on my waveform graphs and then check that both signals come in at the same time (i.e. both start at zero), but I'm not convinced by this.
    When I use an analog trigger, neither the timestamps in my output file nor those on the two waveform graphs are synchronised. If I don't use the trigger, there is far less disparity in the timestamps in the output file. I've attached the two output files here, along with my VI.
    This is my first attempt to synchronise a voltage module and a strain-gauge module on an X Series chassis, so I want to make sure that I'm achieving the best synchronisation I can, and the difference in behaviour with and without the trigger worries me.
    Thanks,
    Claire.
    Attachments:
    without trigger.txt ‏5 KB
    with trigger.txt ‏6 KB
    Multi-Device Synch-Analog Input-Finite Acq-Analog Start_Claire_wDigitalin_12June2012 PTbridge.vi ‏196 KB
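    A hardware-level check that sidesteps software timestamps entirely: wire the same physical test signal into one channel of each device, acquire both, and estimate the inter-channel lag directly from the data. A minimal, hedged sketch of such a lag estimate (the example arrays are hypothetical; a lag of 0 suggests the devices are aligned to within one sample):

        public class LagEstimate {
            public static void main(String[] args) {
                double[] a = {0, 0, 1, 2, 1, 0, 0, 0}; // channel from device 1
                double[] b = {0, 0, 0, 1, 2, 1, 0, 0}; // channel from device 2
                int maxLag = 4, bestLag = 0;
                double best = Double.NEGATIVE_INFINITY;
                for (int lag = -maxLag; lag <= maxLag; lag++) {
                    double sum = 0;
                    for (int i = 0; i < a.length; i++) {
                        int j = i + lag;
                        if (j >= 0 && j < b.length) sum += a[i] * b[j];
                    }
                    if (sum > best) { best = sum; bestLag = lag; }
                }
                System.out.println("Estimated lag: " + bestLag + " samples"); // 1
            }
        }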

  • Extreme Base Station can't extend signal from Mac mini desktop - TCP/IP settings?

    I have a Mac mini connected to DSL via a Linksys 8-port workgroup switch. I had 6 rooms in my house connected to the router by Ethernet wiring, which worked great for several years. Now the cabling to 2 rooms seems to have gone bad (mice or squirrels, maybe). To avoid rewiring the connection outlets or pulling wire again (ugh!), we bought an AirPort Extreme Base Station. It transmits the Internet signal fine when connected to the Ethernet network (roaming), but I cannot get it to allow clients to access the Internet in WDS mode.
    The base station is in a central location where there is no Ethernet jack. It picks up the AirPort signal from the mini (which has an Extreme card, and yes, I have the MAC address of the mini chosen).
    The Internet connection is "shared" on the mini, and the laptops have no trouble accessing the Internet within close range of the mini. The laptops can't access the Internet through the base station. I suspect it's how the TCP/IP is assigned; I have tried every possible combination in the AirPort Admin Utility.
    In the WDS diagrams, it always shows 2 or more base stations, never a desktop and a base. Are these incompatible?

    "...but can't figure out how for the Mini..."
    When you select "Share your Internet connection with AirPort-equipped computers," click AirPort Options to give your network a name and password.
    "Since they are the 'same' network, does it matter?"
    If one has no encryption, or is using weaker encryption, it will be the weak link for others to access your entire network.
    Normally, if you want to roam seamlessly between the two wireless networks, ensure that they use the same network name (SSID), the same encryption type, the same encryption level, and the same encryption password. However, any wireless device connecting to the Mac mini's network will be on a separate subnet.
    "Now I've got to figure out how to get files off the office desktop from the laptops; should I set up AFP over AirPort or Ethernet?"
    Enable Personal File Sharing on the desktop.
