PCI 6023E Analog Input with DAQmx

Hello everybody,
I have some problems with a PCI 6023E card. I'm using LabVIEW 7.1 on Windows 2000.
When I start MAX (Devices and.../NI-DAQmx - Devices.../PCI-6023E) and try to open the "Test Panel",
I get the following message:
LabVIEW: An exception occurred within the external code
called by a Call Library Node. This might have corrupted
LabVIEW's memory. Save any work to a new location and
restart LabVIEW.
VI "BlockatContainerState2.vi" was stopped at node
0xA10 of subVI "DAQmx

Sorry, I had a problem; here is the rest of the message:
subVI "DAQmx Assistant_DAQmx Create AI Channel (sub).vi"
I get the same message when I try to use the DAQmx Assistant for
analog input measurements in LabVIEW.
I hope you can help me.
Best regards,
Chris

Similar Messages

  • How do I get a 100 V, 100 Hz signal into an NI PCI-6111 analog input?

    I have a 100 V, 100 Hz signal that I want to bring in on an NI PCI-6111 analog input (±5 V input range). Is there some small off-the-shelf transformer or other type of signal conditioning I should use? Basically I just need to time-stamp my other analog inputs against this incoming signal.

    The PCI-6111 has a built-in attenuator that allows inputs of up to 42 V. In order to read 100 V you will have to use some sort of external attenuator. National Instruments offers several different modules and terminal blocks in the SCXI form factor that would accomplish this (a worked divider example is also sketched below).
    I would recommend reading through this KB for more information:
    http://digital.ni.com/public.nsf/websearch/38AEFF122C8D732F8625629800519927?OpenDocument
    I hope this helps.
    Joshua
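
    As a rough illustration (not from the original reply), a resistive divider is the simplest external attenuator when only the timing of the signal matters; the resistor values below are assumptions, and at 100 V the divider must be properly rated and isolated:

        Vout = Vin * R2 / (R1 + R2)
             = 100 V * 10 kOhm / (240 kOhm + 10 kOhm)
             = 4 V, which is within the ±5 V input range

    If accuracy or isolation from the 100 V source matters, an SCXI attenuator or isolation module, as suggested above, is the safer choice.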

  • PCI-6251 analog input setup

    Hi all,
    Can I set up the analog inputs of the PCI-6251 in a mixed configuration (some in differential mode and some in single-ended mode)? I ask because in my setup the differential option is greyed out.
    Thanks
    dphan128

    Yes, you should be able to do that. You have 16 analog inputs; differential mode uses two of them per channel. See the DAQ M Series user manual: you can program channels on an M Series device to acquire with different ground references. To enable multimode scanning in LabVIEW, use DAQmx Create Virtual Channel.vi of the NI-DAQmx API. You must use a new instance of that VI for each channel or group of channels configured in a different input mode (a C-API sketch of the same idea follows after this reply).
    Remember that for a differential channel you wire, for example, AI 0 to the + side and AI 8 to the - side, so you can no longer use AI 8 as a single-ended input. AI 1 pairs with AI 9, and so on up to AI 7 with AI 15.
    Hope this helps.
    Using LabVIEW 2010SP1 and TestStand 4.5
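
    A minimal NI-DAQmx C-API sketch of the mixed-mode idea described above (the device name "Dev1" and the ±10 V ranges are assumptions); each DAQmxCreateAIVoltageChan call plays the role of one DAQmx Create Virtual Channel.vi instance on the same task:

        #include <NIDAQmx.h>
        #include <stdio.h>

        int main(void)
        {
            TaskHandle task = 0;
            float64 data[2] = {0};
            int32 read = 0;

            DAQmxCreateTask("", &task);
            // AI 0 in differential mode (AI 0 is +, AI 8 is -)
            DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Diff,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            // AI 1 referenced single-ended, added to the same task
            DAQmxCreateAIVoltageChan(task, "Dev1/ai1", "", DAQmx_Val_RSE,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxStartTask(task);
            DAQmxReadAnalogF64(task, 1, 10.0, DAQmx_Val_GroupByChannel,
                               data, 2, &read, NULL);
            printf("ai0 (DIFF) = %f V, ai1 (RSE) = %f V\n", data[0], data[1]);
            DAQmxClearTask(task);
            return 0;
        }

    The same pattern scales to groups of channels (for example "Dev1/ai0:3" differential and "Dev1/ai4:7" single-ended), as long as none of the single-ended channels is the negative partner of one of the differential channels.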

  • The device PCI 6036E: analog input error "the device is not responding to the base address", output: no signal

    I am using a PCI 6036E DAQ card with NI-DAQ 6.9.2, and I have not connected it to any external device.
    When I tested the card (in the Test Panel of MAX 2.2, NI-DAQ 6.9.2) an error appeared: "the device is not responding to the base address". I pressed Yes, and then the Test Panel window appeared.
    In the analog input tab there is an error -10805. This error also appears when I run the standard analog input examples (using Delphi 6.0).
    I tested the analog output functions in the Test Panel, the Delphi examples, and LabVIEW 6.1. No errors appear, but there is no output signal (checked with an oscilloscope).
    The digital input and output functions work correctly.
    Next I tried to follow all the instructions as recommended:
    - changing to another PCI slot
    - reinstalling the OS (Windows XP, 98 SE)
    - using another computer
    But in all of these cases it still does not work (the above error still appears, and there is no output signal).

    Nvd,
    I'm sorry to hear about the problems that you are running into with your DAQ card, and I can understand your frustration with the errors that you are seeing. You mention that you have tried some troubleshooting; I was wondering if you have tried all the troubleshooting techniques listed in this KnowledgeBase:
    http://digital.ni.com/public.nsf/websearch/DCFCDB240788F8D486256D6A00834D80?OpenDocument
    If you have tried all of those things, I would suggest one additional step: updating your NI-DAQ drivers to version 7.0. These are the newest versions of our data acquisition drivers and can be downloaded from our website here:
    http://digital.ni.com/softlib.nsf/webcategories/85256410006C055586256BBB002C128D?OpenDocument&node=132060_US
    If you have tried all of the above troubleshooting options, please post a detailed description of what you have tried. This may help to clarify what is going wrong.
    Jed R.
    Applications Engineer
    National Instruments

  • Error -200072 using analog input with 3 PXI 6120 cards on realtime mx system

    I have just upgraded to the mx drivers for the 6120 S series boards.
    I am trying to sample 12 analog inputs at once with a pretrigger (4 channels per board).
    The error message -200072 comes up.
    One board works fine; when I add the second board's channels, the error occurs.
    Each board shows up as A, B, and C respectively in MAX and in the LabVIEW browse menu for selecting channels.
    Greg Morningstar
    Takata

    Probably the best way to do this is to use the Route Signal VI so that each of your boards looks at a particular trigger line for the trigger. You can do the same thing for the clock so that they are all sampling at the same time (a DAQmx C-API sketch of sharing a trigger this way follows below).
    You will also want to make sure that your device is defined in MAX. Once you do that, everything should be pretty easy to implement. You might also want to look at some of the examples that show how to route signals over RTSI; a PXI system works almost the same way, using the backplane trigger lines.
    Otis
    Training and Certification
    Product Support Engineer
    National Instruments
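
    A hedged NI-DAQmx C-API sketch (not from the original reply) of one way to share a start trigger and a common sample rate across two of the boards; the slot names "PXI1Slot2"/"PXI1Slot3", the "PXI_Trig0" line, and the rates are assumptions, and a reference (pretrigger) trigger could be routed over the same line:

        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle master = 0, slave = 0;

            DAQmxCreateTask("", &master);
            DAQmxCreateAIVoltageChan(master, "PXI1Slot2/ai0:3", "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(master, "", 500000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_FiniteSamps, 40000);
            // Drive the master's start trigger onto a backplane trigger line
            DAQmxExportSignal(master, DAQmx_Val_StartTrigger, "/PXI1Slot2/PXI_Trig0");

            DAQmxCreateTask("", &slave);
            DAQmxCreateAIVoltageChan(slave, "PXI1Slot3/ai0:3", "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(slave, "", 500000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_FiniteSamps, 40000);
            // The slave starts on the line the master drives
            DAQmxCfgDigEdgeStartTrig(slave, "/PXI1Slot3/PXI_Trig0", DAQmx_Val_Rising);

            DAQmxStartTask(slave);   // arm the slave first so it is waiting
            DAQmxStartTask(master);  // the master's start trigger now releases both
            DAQmxWaitUntilTaskDone(slave, 10.0);
            DAQmxWaitUntilTaskDone(master, 10.0);
            // ...read each task with DAQmxReadAnalogF64 here...
            DAQmxClearTask(slave);
            DAQmxClearTask(master);
            return 0;
        }

    If error -200072 comes from putting channels of several boards into one task, keeping each board in its own task and sharing the trigger and clock in this way is a common workaround.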

  • PCI-4472 not functioning with DAQmx Linux 8.0

    We have multiple systems with PCI-4472 cards in them running SUSE Linux, and all but one of them function properly.  On the particular system in question (which has identical hardware and software to the functioning systems), nilsdev lists no devices.  Further investigation revealed the following excerpt from dmesg:
    allocation failed: out of vmalloc space - use
    vmalloc=<size> to increase size.
    Unable to handle kernel NULL pointer dereference at virtual address
    000002fc
     printing eip:
    f9ede608
    *pde = 00000000
    Oops: 0002 [#1]
    SMP
    Modules linked in: nidsark nistcrk nicdrk nistc2k nimru2k nimxpk ipt_pkttype
    ipt_LOG ipt_limit nipxirmk nidimk nimsdrk nidmxfk nimxdfk nimstsk nimdbgk
    niorbk speedstep_lib freq_table nipalk nikal snd_pcm_oss snd_mixer_oss snd_seq
    snd_seq_device button battery ac af_packet video1394 raw1394 edd ide_cd cdrom
    sk98lin ohci1394 ieee1394 snd_intel8x0 snd_ac97_codec snd_ac97_bus snd_pcm
    snd_timer snd soundcore snd_page_alloc i2c_i801 i2c_core hw_random generic
    ehci_hcd intel_agp agpgart uhci_hcd usbcore shpchp pci_hotplug ip6t_REJECT
    ipt_REJECT ipt_state iptable_mangle iptable_nat iptable_filter ip6table_mangle
    ip_conntrack ip_tables ip6table_filter ip6_tables ipv6 parport_pc lp parport
    nls_iso8859_1 nls_cp437 vfat fat nls_utf8 ntfs dm_mod ext3 jbd sg fan thermal
    processor ata_piix libata piix sd_mod scsi_mod ide_disk ide_core
    CPU:    1
    EIP:    0060:[<f9ede608>]    Tainted: PF    U VLI
    EFLAGS: 00010246   (2.6.13-15-smp)
    EIP is at nidsark-unversioned0004550+0x70/0x11e4 [nidsark]
    eax: 00000000   ebx: f6be1d68   ecx: 00000000   edx: 00000000
    esi: f6514770   edi: f6514760   ebp: f6be1964   esp: f6be1958
    ds: 007b   es: 007b   ss: 0068
    Process nipalsm (pid: 6952, threadinfo=f6be0000 task=c2344540)
    Stack: f6be1d68 f6514b68 f6be1d68 f6be197c f9efb4d0 f6514760 f6be1d68 f6514b68
           f6b82690 f6be1a1c f9e93409
    f6514b68 f6be1d68 f6514760 f6be1d68 00000004
           f6be1d6c 00000100 f6be19b8
    f95bbf19 f6be1bd0 f6be1d6c f6be1d6c f6be1b98
    Assuming that the NULL pointer dereference was caused by the preceding failed vmalloc, I attempted to increase the vmalloc size to 512 MB (which is certainly overkill). This did nothing to clear the error. One final detail is that this computer dual-boots Windows, while the others have only ever run Linux. It shouldn't matter, but I feel it is worth mentioning in the event that DAQmx for Windows somehow mangles the configuration of the card to a point where it no longer functions with DAQmx for Linux. It should be noted that the card works just fine under Windows.
    I have the output from niSystemReport available in the event that it will help in diagnosing the problem, but am unable to attach it due to the size restriction of this forum.  The call trace from dmesg is in the attached file.  Any help would be greatly appreciated.
    -Peter Lisherness
    Attachments:
    dmesg.txt ‏5 KB

    Peter,
    I have a couple of other questions which might help us narrow this down.
    1.  You said both the working and non-working machines have identical hardware and software. I'm assuming this means both machines are Pentium 4 3.2 GHz SMP machines with 2 GB of RAM, have the same motherboards, etc. I also assume that they are both running SUSE 10.0 and have the same updates and kernel versions. Is this correct? A system report from the working machine could also help us confirm this.
    2.  Can you easily reproduce the Oops?  When does it occur?  Does it happen when the machine first boots up, or do you have to run nilsdev first?
    Thanks,
    Shawn B.
    Use NI products on Linux? Come join the NI Linux Users Community

  • Trigger an analog input with the NI 6115 using digital data sent by the NI 6534

    Hello,
    I have 2 NI cards: an NI 6534 (master) and an NI 6115 (slave).
    I have succeeded in doing the following:
    the slave waits for a trigger given by a data pattern generated through the NI 6534 to start an analog acquisition
    (I noted that the start of the analog acquisition is delayed by 100 ns; is that normal?).
    Now I want to delay the analog acquisition further (let's say by exactly 5.0 microseconds).
    Any idea how to do that?
    Loranger

    Loranger,
    The delay you are seeing is expected. When using a trigger pulse to "trigger" acquisition on another card, there will be a small delay: usually the card actually triggers within 1-2 ticks of its timebase clock after the trigger pulse is received over the RTSI line, and 2 ticks of the NI 6115 timebase (20 MHz) correspond to 100 ns.
    Regarding delaying the analog input acquisition, this is what I would suggest: route the trigger from the NI 6534 to a counter on the NI 6115, program the counter to generate a delayed pulse (5 us) when the trigger from the 6534 is received, and use the output of the counter to trigger the analog input acquisition (a rough code sketch of this arrangement follows below). This should work for you. I hope this helps.
    Todd D.
    National Instruments
    Applications Engineer
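
    A hedged NI-DAQmx C-API sketch of the counter-delay idea above (the original setup predates DAQmx on this hardware, so this is only an illustration); the device name "Dev1", the trigger terminal "/Dev1/PFI0", and the pulse widths are assumptions:

        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle ctr = 0, ai = 0;

            // Counter: a single pulse with a 5 us initial delay, started by the external trigger
            DAQmxCreateTask("", &ctr);
            DAQmxCreateCOPulseChanTime(ctr, "Dev1/ctr0", "", DAQmx_Val_Seconds,
                                       DAQmx_Val_Low, 5e-6 /* initial delay */,
                                       1e-6 /* low time */, 1e-6 /* high time */);
            DAQmxCfgImplicitTiming(ctr, DAQmx_Val_FiniteSamps, 1);
            DAQmxCfgDigEdgeStartTrig(ctr, "/Dev1/PFI0", DAQmx_Val_Rising);

            // AI task: starts on the counter's delayed output pulse
            DAQmxCreateTask("", &ai);
            DAQmxCreateAIVoltageChan(ai, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(ai, "", 1.0e6, DAQmx_Val_Rising,
                                  DAQmx_Val_FiniteSamps, 1000);
            DAQmxCfgDigEdgeStartTrig(ai, "/Dev1/Ctr0InternalOutput", DAQmx_Val_Rising);

            DAQmxStartTask(ai);   // arm the AI task so it waits for the counter pulse
            DAQmxStartTask(ctr);  // the counter now waits for the external trigger
            DAQmxWaitUntilTaskDone(ai, 10.0);
            // ...read the data with DAQmxReadAnalogF64 here...
            DAQmxClearTask(ai);
            DAQmxClearTask(ctr);
            return 0;
        }

    The AI start is then delayed from the external trigger by the counter's initial delay (5 us here), plus the small fixed trigger latencies described above.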

  • How to read 2 analog inputs with VISA

    Hello everyone! I want to read two analog inputs from an Arduino using VISA serial communication. I'm currently able to read one of the inputs, but how do I read two analog inputs in LabVIEW?

    Hi lamela,
    again I have to ask: WHAT does the string look like? Is it so hard to provide an exact example of your received string?
    "I think the string is showing one value from the two inputs"
    Are you sure, or are you guessing? How can we tell what you might "think" when you aren't able to provide examples?
    You wrote that you send values from the Arduino using the command "println(value1, value2)", but now you "think" you only receive value1 in LabVIEW?
    Get your data communication clear! (And learn to provide meaningful examples!)
    Edit after your edit:
    my code to serial print two inputs: Serial.println(Voltage, Current);
    Even if I print separately: Serial.println(Voltage);
                                Serial.println(Current);
    The first command is completely different from the second pair, and it does not do what you expect: in Arduino's print/println the second argument is a format specifier (the number of decimal places), so Serial.println(Voltage, Current) still prints only one value per line. The two separate println calls print one value per line each. You need to adapt your string parsing to the strings you actually send!
    Again: send both values in one line, use the LF as the termination character, separate the values with a separator character such as a comma, check the received string for validity, and use some error checking in your VI. (A sketch of the Arduino side of this is below.)
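
    A hedged Arduino-side sketch of the one-line, comma-separated format suggested above (the pins A0/A1, the 9600 baud rate, and the 5 V scaling are assumptions):

        void setup() {
          Serial.begin(9600);
        }

        void loop() {
          float voltage = analogRead(A0) * 5.0 / 1023.0;
          float current = analogRead(A1) * 5.0 / 1023.0;
          Serial.print(voltage, 3);    // first value, 3 decimal places
          Serial.print(',');           // separator
          Serial.println(current, 3);  // second value, then CR/LF termination
          delay(100);
        }

    On the LabVIEW side, read with VISA Read up to the termination character and split the received string at the comma (for example with Scan From String and a "%f,%f" format string) to recover both values.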

  • Monitor multiple channels for analog trigger with DAQmx drivers

    Hello! I would like to start a data acquisition of multiple analog channels (16) from an analog trigger. I would like the trigger to monitor four of those channels and fire when any one of them reaches a certain voltage. I found an example that would work with the Traditional DAQ drivers (using occurrences), but I can't figure out how to do something similar in DAQmx.
    Time is also an issue, as I would like to collect the first 80 milliseconds of data after the trigger (at a rate of 500,000 Hz).
    I'm using LabVIEW 7.0 and collecting data off two PXI-6133 cards.
    Thanks for your help!

    Hi Denise-
    After some research, I have found that it is not possible to use the functionality of DAQ Occurrences in DAQmx. Ironically, the reason this functionality is available in Traditional DAQ and not in DAQmx is that it exploited an inherent limitation of Traditional DAQ that was removed in DAQmx: the multithreading capability of DAQmx is a major advantage for most applications, but in this case it prevents the use of occurrences as they existed in Traditional DAQ.
    In short, this means that you can't directly use this functionality in DAQmx. You can, however, emulate it with minimal software analysis of the incoming signal. I have attached a modified example VI that logs data to a chart only when the analog level of one of the channels being measured has exceeded a user-defined reference value. Basically, the task runs continuously in the background, but the data is not actually logged until the signal is above a predetermined "trigger" level (a text-based sketch of the same approach follows below).
    Please let me know if the attached example is helpful for your application. You will see the input channels listed in the format "DevX/ai0:y", where X is the device number and y is the highest channel number of interest.
    Regards,
    Tom W
    National Instruments
    Attachments:
    Cont Acq&Graph Voltage-Int Clk Analog SW Trigger.vi ‏83 KB
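
    A hedged NI-DAQmx C-API sketch of the software-trigger approach described above (this is not the attached VI; the channel string "Dev1/ai0:3", the 2.0 V level, and the rates are assumptions):

        #include <NIDAQmx.h>
        #include <stdio.h>

        #define CHANNELS 4
        #define SAMPS    1000

        int main(void)
        {
            TaskHandle task = 0;
            float64 data[CHANNELS * SAMPS];
            int32 read = 0;
            const float64 trigLevel = 2.0;   // software "trigger" level in volts
            int triggered = 0;

            DAQmxCreateTask("", &task);
            DAQmxCreateAIVoltageChan(task, "Dev1/ai0:3", "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(task, "", 500000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_ContSamps, SAMPS);
            DAQmxStartTask(task);

            // Acquire continuously; start keeping data once any monitored channel
            // exceeds the reference level
            while (!triggered) {
                DAQmxReadAnalogF64(task, SAMPS, 10.0, DAQmx_Val_GroupByChannel,
                                   data, CHANNELS * SAMPS, &read, NULL);
                for (int i = 0; i < CHANNELS * read && !triggered; i++) {
                    if (data[i] >= trigLevel)
                        triggered = 1;
                }
            }
            printf("Level exceeded; log data from this block onward.\n");
            DAQmxClearTask(task);
            return 0;
        }

    Because the check happens in software after each block is read, the trigger timing is only as precise as the block size, which is the main trade-off compared with a hardware analog trigger.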

  • Cannot get analog input with SCC-SG04

    I used the wizard and wrote a simple AI channel read loop with a digital readout. The voltage doesn't change.

    Well, I'm not sure exactly what you have tried or how you have things set up and configured. I would first try verifying the input through the test panels in MAX: make sure your DAQ card is set to NRSE mode, make sure your SC termination equipment is set as the "Accessory" to the DAQ card under its properties, double-check your connections, and then see if you can read the voltage through the MAX test panels. Once you can read the voltage there, you can try using a program. I hope this helps!
    Russell

  • FPGA Analog Input with Scan Interface

    Hi all,
    I am rather new to the FPGA Module and have a question concerning analog sampling.
    Until now, I used a cRIO in Scan Mode to perform the following tasks:
    a state machine on the host PC performs one task after the other, e.g. output a voltage, run a stepper motor (with the NI 9512), acquire voltage signals using the Scan Engine, log the data; then the next voltage, the next stepper position, the next measurement, the next log entry, and so on.
    Now I want to perform the same task, but with a sampling rate that exceeds the performance of the Scan Engine, so I think I need to move to FPGA mode.
    The question is: can I "mix" the measurement in FPGA while running the other tasks of the state machine in Scan Mode? In more detail, I only "need" FPGA mode for the data acquisition; for running the stepper motor, Scan Mode is perfectly fine and somewhat easier.
    One idea of mine was to create an FPGA VI on the target that acquires a certain number of voltage samples and sends them to the host. Could I then run the state machine as I did before and, just for the "voltage measurement" state, refer to the FPGA VI on the target and get the data?
    Thanks for any hint,
    Jack

    It would be best not to mix the Scan Engine and the FPGA. Yes, you can use hybrid mode, but it uses A LOT of your FPGA resources. You can still use the state machine in your host; you will just need to change the commands slightly to use the FPGA interface instead. Use a DMA FIFO to send your analog data to the host to be logged.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • Analog input using DAQmx

    Hello,
    I am using LabVIEW 2011 but don't have DAQmx in my LabVIEW, and I want examples based on the usb62589.
    Regards, upasana

    1. Please post in the Hardware forums
    2. There is no USB-62589
    3. Why don't you install DAQmx and look for examples?
    Christian

  • DV video time stamp with analog input

    Hi,
    I have a DV video that I want to time-stamp with an analog input. For the video, which is DV brought into the computer over IEEE 1394, LabVIEW calls a C++ program which then stores the video. Simultaneously, in the program that calls the C++ code, an analog signal is also stored (separately). I know it sounds tricky as well as messy, but this is the best we've been able to work out, given the uncooperative nature of LabVIEW with DV cameras. Does anyone have suggestions for time-stamping these two signals so that I know exactly where in the video the analog signal occurs? The PCI board we use is the NI 6014, which I know is timestamp capable, but I don't know if it can stamp the video feed, since that doesn't come in through the NI card.
    All thanks in advance for any help!
    Daniel

    Daniel,
    Does your DV camera have any sort of timing or triggering signals? Without these, it will be virtually impossible to correlate the measurements. It is certainly possible to accurately time an analog input with your DAQ card, but correlating those samples to the digital video would be extremely difficult. Is this video streaming in real time, or are you reading it off an already-recorded tape? If the video is streaming in real time, with an acquisition started in software, I would suggest attempting to start the software DV reading at the same time as your DAQ task. If you do have an external signal that can trigger the video acquisition, then I would suggest using that same signal to start your analog acquisition.
    Hope this helps,
    Ryan Verret
    Product Marketing Engineer
    Signal Generators
    National Instruments

  • Scanning of analog inputs in PXI 7831R FPGA

    Hi all,
    I am new to the LabVIEW FPGA Module. I am using LabVIEW 7.1.1, LabVIEW FPGA Module 1.1, and a PXI-7831R FPGA card.
    I developed a program that scans analog inputs at a given scan rate for a given scan duration. As input I used a pulse signal with a 1 s period and 2 V amplitude.
    If I scan one analog input with a 10 ms scan rate for a 1000 ms scan duration, I get correct values. But if I scan 2 or more analog signals at the same time, I get multiples of the period, and if I increase or decrease the scan rate I get strange values. Could anybody please check my code and help me?
    Thanks in Advance.
    Regards,
    Sashi
    Attachments:
    AnlogIn_FPGA.zip ‏247 KB

    Customise your front panel with advanced picture creation methods.
    Attachments:
    SUF.ctl ‏20 KB

  • Triggered Analog Input

    I want to trigger an analog input with a pulse generated
    by gating counter 0 with the Analog Output UPDATE* signal.
    Any suggestions?
    Thanks,
    Erin

    Response: For what it is worth, this isn't a very straightforward thing to achieve with LabVIEW VIs. I have been trying to do the same thing for a month now with no luck. If it is any consolation, NI tech support hasn't provided any working solution to this either. If I get it figured out in a few days I'll try to remember to write back.
    Try using the analog scan clock to trigger the analog read instead of the traditional "triggering" schemes in the LabVIEW documentation. Their terminology is misleading: per LabVIEW, the term "trigger" relates to the oscilloscope concept, where you specify a voltage level and edge transition to recognize a waveform in order to high-speed sample the incoming waveform.
    In the embedded-systems world, the term "trigger" refers to a specific point in time (defined as an "event"), usually represented by a signal transition (rising or falling edge), at which you want something done, such as reading an analog input or saving a counter value. In microcontrollers this is generally handled with a specialized hardware port called an input capture, which is designed to automatically recognize signal transitions and simultaneously fire off an interrupt (read as "trigger") that can be used to start your unique action. The LabVIEW documentation doesn't seem to have a clue about this concept. Perhaps you know all this already; hope it helped in some manner anyway. (A modern DAQmx-style sketch of clocking an AI read from the AO update signal is appended below.)
    Frank
    "Erin Ryan" wrote:
    >>I want to trigger an analog input with a pulse generate>by gating counter
    0 with the Analog Output UPDATE* signal.>>any suggestion?>>Thanks,>Erin
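
    For reference, a hedged NI-DAQmx C-API sketch (the original thread predates DAQmx) of one way to take an AI sample on every analog output update, by using the AO sample clock, the modern name for the UPDATE* signal, as the AI sample clock; the device name "Dev1", the channels, and the rates are assumptions:

        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle ai = 0, ao = 0;
            float64 aoData[1000];
            float64 aiData[1000];
            int32 written = 0, read = 0;

            for (int i = 0; i < 1000; i++)
                aoData[i] = 5.0 * (i % 100) / 100.0;   // arbitrary ramp pattern to generate

            // AO task: its sample clock ("ao/SampleClock") plays the role of UPDATE*
            DAQmxCreateTask("", &ao);
            DAQmxCreateAOVoltageChan(ao, "Dev1/ao0", "", -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(ao, "", 1000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_FiniteSamps, 1000);
            DAQmxWriteAnalogF64(ao, 1000, 0, 10.0, DAQmx_Val_GroupByChannel,
                                aoData, &written, NULL);

            // AI task: clocked directly by the AO update signal, one conversion per update
            DAQmxCreateTask("", &ai);
            DAQmxCreateAIVoltageChan(ai, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(ai, "/Dev1/ao/SampleClock", 1000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_FiniteSamps, 1000);

            DAQmxStartTask(ai);   // AI waits for clock edges from the AO task
            DAQmxStartTask(ao);
            DAQmxReadAnalogF64(ai, 1000, 10.0, DAQmx_Val_GroupByChannel,
                               aiData, 1000, &read, NULL);
            DAQmxClearTask(ai);
            DAQmxClearTask(ao);
            return 0;
        }

    This sidesteps the counter-gating scheme from the original question: every AO update clocks an AI conversion, which may be what gating the counter with UPDATE* was aiming for.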
