PCMCIA E Series Card Voltage Range

In one of our applications, we want to use an E Series DAQ card for a single-channel analog input.
The voltage range is from 0.01 mV to 2 V.
Please let me know whether I can measure voltages in this range, and how.
Please also let me know the minimum voltage we can measure using a PCMCIA E Series card.

Vishal,
The input range on our current PCMCIA E Series boards is +/-10 V. There are example programs for measuring analog voltages in all the common programming environments, and they will work with our PCMCIA boards. Hope this helps. Have a great day!
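For reference, here is a minimal sketch of a single-channel read using the NI-DAQmx C API, assuming your card and driver version support DAQmx (with Traditional NI-DAQ the calls differ) and using "Dev1/ai0" and +/-2 V limits as placeholders you would adjust for your device. Passing tight minimum/maximum limits lets the driver select the highest gain the board offers, which is how you get the best resolution near the bottom of a 0.01 mV to 2 V span. Error checking is omitted for brevity.

    #include <stdio.h>
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64    data[100];
        int32      read = 0;

        /* Create a task and a single AI voltage channel. "Dev1/ai0" is a
           placeholder device/channel name; the -2.0/+2.0 limits tell the
           driver to use the tightest range (highest gain) the hardware
           offers for that span. */
        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                 -2.0, 2.0, DAQmx_Val_Volts, NULL);

        /* 1 kS/s, 100 finite samples -- adjust for your application. */
        DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, 100);

        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 100, 10.0, DAQmx_Val_GroupByChannel,
                           data, 100, &read, NULL);
        printf("Read %d samples, first value = %f V\n", (int)read, data[0]);

        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }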

Similar Messages

  • Driver for PCMCIA-CAN series 2 card needed

    Looking for the driver for the PCMCIA-CAN Series 2 card.

    Not sure what version is needed, but this may get you started:
    http://search.ni.com/nisearch/app/main/p/ap/tech/pg/1/sn/ssnav:sup,catnav:du,n13:hardwareDriver,n8:2...
    -AK2DM
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
    "It’s the questions that drive us.”
    ~~~~~~~~~~~~~~~~~~~~~~~~~~

  • Voltage Range E Series DAQ

    I have a PCI-MIO-16E DAQ. My input voltage range is 0.7 - 1.1 volts. The manual seems to suggest that I can only choose voltage ranges with discrete values, i.e. 0-1, 0-2, 0-5, etc.
    The NI Measurement & Automation Explorer allows me to select 0.7 - 1.1 volts, however. So my question is: will the resolution be based on dividing the range 0.7 - 1.1 volts into 4096 parts, or on dividing 0 - 2 volts into 4096 parts?
    Thanks,
    Shawn Clark

    The actual input ranges of the DAQ board are fixed. I don't know the specifics of your board, but the ranges might be +/-10, +/-5, +/-1, 0-1, 0-5, 0-10, etc. When you specify a minimum and maximum value, NI-DAQ uses that information to select the most appropriate range. So, in your case, if the board supports 0-2 volts, that is the range that will be selected, and that is the range the resolution is based on. If the DAQ board does not have a 0-2 volt range and only has a 0-5 volt range, then the resolution is based on 0-5 volts (see the sketch below).
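    To make the resolution arithmetic concrete, here is a tiny sketch assuming a 12-bit board (which is what 4096 codes implies) and assuming the driver coerces the 0.7-1.1 V request up to a 0-2 V hardware range, as described above:

        #include <stdio.h>

        int main(void)
        {
            /* Requested limits vs. the hardware range the driver coerces to. */
            const double requested_span = 1.1 - 0.7;   /* 0.4 V                 */
            const double hardware_span  = 2.0 - 0.0;   /* 0-2 V range (assumed) */
            const int    codes          = 4096;        /* 12-bit converter      */

            /* The LSB size is set by the hardware range, not the requested one. */
            printf("LSB if resolution followed the request: %g uV\n",
                   1e6 * requested_span / codes);      /* ~97.7 uV  */
            printf("Actual LSB on the 0-2 V hardware range:  %g uV\n",
                   1e6 * hardware_span / codes);       /* ~488.3 uV */
            return 0;
        }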

  • Using an MIO-16XE-10, I am trying to relate each gain setting to the resulting actual voltage range.

    I have tried to find the conversion table relating gain setting to resultant voltage range. I am a programmer and am having trouble understanding the relationship, or better yet, finding an algorithm, so I can programmatically set a gain based on the voltage returned by the previous measurement. The idea is to get the most reliable resistance measurement by ensuring that a measurement taken with a specific gain setting is not taken in the lower 1% of that gain's range.
    bookieb

    Hi Bill:
    Thanks for contacting National Instruments.
    The best way to use the most appropriate gain is to set the voltage range of the incoming signal and let the NI-DAQ driver choose the best gain to apply for that range.
    The DAQ card has some predefined gain settings, and the NI-DAQ driver selects among them depending on the input limits.
    To find out which input ranges correspond to which gains, please refer to Appendix A, Specifications, of the E Series User Manual. The table on page A-2 shows the relationship between the input signal ranges and the channel gain that the NI-DAQ driver chooses. (A code sketch at the end of this message illustrates the same idea of driving gain selection through input limits.)
    I hope this helps. Please do not hesitate to contact me back if you need additional information.
    Regards,
    Bharat S
    Applications Engineer
    National Instruments
    Penny
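    Following up on the reply above with the code sketch it mentions: with NI-DAQmx (assuming your board and driver support it; "Dev1/ai0" is a placeholder channel name), you do not set a gain code directly, you pass input limits and the driver picks the gain from the table in Appendix A. One simple auto-ranging strategy, sketched below as an illustration rather than an official recipe, is to take a reading on the widest range and then rebuild the channel with limits just above the measured value, so the reading never sits in the bottom of the selected range. Error checking is omitted.

        #include <math.h>
        #include <stdio.h>
        #include <NIDAQmx.h>

        /* Read one on-demand sample from "Dev1/ai0" (placeholder name) using
           the given input limits; the driver chooses the matching gain. */
        static float64 read_once(float64 lo, float64 hi)
        {
            TaskHandle task  = 0;
            float64    value = 0.0;

            DAQmxCreateTask("", &task);
            DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                     lo, hi, DAQmx_Val_Volts, NULL);
            DAQmxStartTask(task);
            DAQmxReadAnalogScalarF64(task, 10.0, &value, NULL);
            DAQmxClearTask(task);
            return value;
        }

        int main(void)
        {
            /* First pass: widest range so nothing clips. */
            float64 coarse = read_once(-10.0, 10.0);

            /* Second pass: limits just above the coarse reading (about 50%
               headroom), so the driver picks a higher gain and the reading
               does not sit in the bottom few percent of the selected range. */
            float64 limit = fabs(coarse) * 1.5;
            if (limit < 0.05)
                limit = 0.05;   /* do not request an unrealistically small range */
            float64 fine = read_once(-limit, limit);

            printf("coarse = %f V, refined = %f V\n", coarse, fine);
            return 0;
        }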

  • PCMCIA-FBus Series 2 is not recognized under Windows XP

    I'm trying to get a PCMCIA-FBus Series 2 card to work under Windows XP Professional SP2 together with an Elan P111 PCI-to-PCMCIA adapter, but when I plug in the card, Windows does not recognize it - I only get a "pcmcia unknown_manufacturer" message and no driver is installed.
    I've tried to install the driver manually, but then I get error code 10, "Device could not be started."
    The computer is a Dell OptiPlex 740 with an AMD Athlon 64 X2 3800+. NI-FBUS Configurator v3.1, v3.1.1 and v3.2.1 have been tested. The Elan P111 driver is PSeries v5.07.08.

    Hi,
    I am not sure whether this PCI-PCMCIA adapter works well with Windows XP SP2. I suggest that you also contact the adapter manufacturer about the compatibility issue.
    We did do some testing on Elan PCI-PCMCIA adapters, but I am sorry that we are unable to verify every combination of hardware and software. There is a possibility that the adapter does not work on some PCs.
    Another suggestion you can try is to install the PCI-PCMCIA adapter driver on a clean Dell OptiPlex 740 (Windows XP SP2 without Microsoft hotfixes). I think this can help you identify whether there are compatibility problems between the adapter and the system.
    Regards
    Feilian (Vince) Shen

  • How do I read out the actual voltage range of the DAQ board as calculated by LabVIEW based on my input voltage range?

    I am somewhat new to DAQ, so I hope I describe this well enough, but I have encountered a requirement for my VI that I am not sure how to satisfy in LabVIEW. It is my understanding that when using a DAQ board (I'm using a PCI-6034E) you can specify the voltage range you expect from your measurements, and the software then calculates the actual voltage range it will be measuring, to ensure that you get the best resolution your range allows given the voltage step size, etc. of the equipment. I know how to extract the voltage input that I enter into the virtual channel, but I was wondering how I can extract the actual voltage range that the hardware/software is using based on its own calculations from my input. Can anyone help me with this?

    If I understand correctly, you want to retrieve the actual gain value (and, therefore, the usable input range) used by your DAQ card for a particular measurement.
    I think there's a good KnowledgeBase entry with your solution, entitled "Programmatically Determining the Gain Used by a DAQ Device" (a short code sketch also follows at the end of this message). Here's the URL:
    http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/c93d3954ec02393c86256afe00057bb0?OpenDocument
    Regards,
    John Lum
    National Instruments
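    As a follow-up to the pointer above, here is the sketch mentioned there: if your driver version supports NI-DAQmx, the coerced hardware range can be read back directly through the AI.Rng.High / AI.Rng.Low channel properties. "Dev1/ai0" and the +/-1.2 V request are placeholders; error checking is omitted.

        #include <stdio.h>
        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle task  = 0;
            float64    rngLo = 0.0, rngHi = 0.0;

            /* "Dev1/ai0" and the +/-1.2 V request are placeholders. */
            DAQmxCreateTask("", &task);
            DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                     -1.2, 1.2, DAQmx_Val_Volts, NULL);

            /* AI.Rng.Low / AI.Rng.High report the hardware range the driver
               actually selected for the request above. */
            DAQmxGetAIRngLow(task, "Dev1/ai0", &rngLo);
            DAQmxGetAIRngHigh(task, "Dev1/ai0", &rngHi);
            printf("Driver selected %g V to %g V\n", rngLo, rngHi);

            DAQmxClearTask(task);
            return 0;
        }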

  • M Series Card Performance

    Hi,
    I am now using a PXI-6052E DAQ card with a real-time controller (P3, 833 MHz processor, 512 MB RAM, running the LabVIEW RT OS). The controller runs one time-critical loop at a 1 ms rate. I set the DAQ to acquire 20,000 samples/sec; every millisecond I read 20 samples using the DAQmx Read (AI) VI, and it takes ~250 microseconds to complete the read operation.
    We want to go to M Series because of its lower cost, preferably the PXI-6229. Can this card offer the same performance and data rate (i.e., does the AI read complete within 250 microseconds)?

    The PXI-6229 has a maximum scan rate of 250 kS/s.
    The PXI-6052E has a maximum scan rate of 333 kS/s.
    You need 20 kS/s, so your required scan rate is well within range.
    I have used this M Series card and found it to give better performance than E Series cards with respect to ADC settling time for multi-channel scans.
    So you would probably be happy with how M Series cards perform.
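    As a rough illustration (not a benchmark), the acquisition described above maps onto a continuous task clocked at 20 kS/s from which 20 samples are read on each 1 ms loop iteration. The sketch below uses the NI-DAQmx C API with a placeholder channel name; the original application would use the equivalent DAQmx Read VI under LabVIEW RT. Error checking is omitted.

        #include <stdio.h>
        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle task = 0;
            float64    data[20];
            int32      read = 0;

            /* "PXI1Slot2/ai0" is a placeholder physical channel. */
            DAQmxCreateTask("", &task);
            DAQmxCreateAIVoltageChan(task, "PXI1Slot2/ai0", "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);

            /* Continuous acquisition at 20 kS/s; for continuous tasks the last
               argument is only a buffer-size suggestion. */
            DAQmxCfgSampClkTiming(task, "", 20000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_ContSamps, 2000);
            DAQmxStartTask(task);

            /* Each iteration pulls the 20 samples that arrive per millisecond. */
            for (int i = 0; i < 1000; i++) {
                DAQmxReadAnalogF64(task, 20, 1.0, DAQmx_Val_GroupByChannel,
                                   data, 20, &read, NULL);
            }
            printf("last sample = %f V\n", data[19]);

            DAQmxStopTask(task);
            DAQmxClearTask(task);
            return 0;
        }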

  • Changing filter cut off on M series cards

    I want to sample 32 channels with the M Series 628x cards, but at the maximum sampling rate that would give me about a 15 kHz sampling rate per channel. To avoid aliasing, I need an analog filter at 5 kHz. The M Series card only offers a 40 kHz filter. Is there any way to change this filter's cutoff (-3 dB) to 5 kHz? Any suggestion is greatly appreciated, but I'd like to avoid building 32 separate analog filters, one in front of each channel.

    Hi,
    I'm really worried about this issue.
    Our customer has a PCI-6289, and he bought it because the catalogue said: "Programmable lowpass filter"
    http://sine.ni.com/nips/cds/view/p/lang/en/nid/14115
    He wants to use the anti-aliasing filter option with a cutoff frequency of 1 kHz.
    The answer you wrote means that he cannot use just any cutoff frequency, only 40 kHz.
    So why is it written "Programmable lowpass filter"?
    What is the range (or ranges) of the "Programmable lowpass filter"?
    Why isn't it documented in the M Series specifications?
    Thanks
    Dalia
    Dalia Russek
    Application Engineer Manager
    National Instruments ISRAEL Ltd
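    For what it is worth, on hardware that does expose a programmable lowpass filter, the cutoff is set per channel through the DAQmx lowpass properties, roughly as sketched below ("Dev1/ai0" and the 1 kHz cutoff are placeholders). On a device whose filter is fixed at 40 kHz, the driver will return an error or coerce the value rather than honor an arbitrary cutoff.

        #include <stdio.h>
        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle task   = 0;
            float64    actual = 0.0;

            DAQmxCreateTask("", &task);
            DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);

            /* Enable the onboard lowpass filter and request a 1 kHz cutoff.
               Devices without a programmable filter reject or coerce this. */
            int32 err = DAQmxSetAILowpassEnable(task, "Dev1/ai0", 1);
            if (err == 0)
                err = DAQmxSetAILowpassCutoffFreq(task, "Dev1/ai0", 1000.0);

            if (err == 0) {
                DAQmxGetAILowpassCutoffFreq(task, "Dev1/ai0", &actual);
                printf("Cutoff in use: %g Hz\n", actual);
            } else {
                char msg[2048];
                DAQmxGetExtendedErrorInfo(msg, sizeof msg);
                printf("Filter not configurable on this device: %s\n", msg);
            }

            DAQmxClearTask(task);
            return 0;
        }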

  • PCMCIA-FBUS/2 card startup

    I'm using NI-FBUS Configurator 3.2.1 in combination with the PCMCIA-FBUS/2 Series 2 cards.
    At startup, the fieldbus Communication Manager will not start (it hangs) until the card's connector is disconnected.
    Both ports are set as link master devices. (P.S. The Communication Manager does start up fine on the same segment in combination with a single-port card.)
    I reproduced this failure on different laptops with the same result.
    Has anybody else seen the same problem?

    Please send us an email from www.ni.com/ask with some detailed information, including the date you bought the card and the error code or a screenshot of the hang.
    Ryan Shi
    National Instruments

  • My PCMCIA-FBUS/2 card locks up the laptop when I install it

    I have a PCMCIA-FBUS/2 card that locks up the laptop when it is installed. We have tried it on 3 different laptops. All of them have the Communication Manager software and have been using Fieldbus. The one thing different I have found is that we are using the Series 2 FBUS cards. Do I need to load different software for this type of card, or upgrade to a different type?
    Terry McClain

    Hello Terry,
    Would you please let us know what kind of CardBus (PCMCIA) controller is used in the laptop? This is the key point.
    According to our tests, some particular CardBus controllers are not compatible with our PCMCIA-FBUS Series 2 interface card, e.g. the Ricoh R5C475II. With these controllers, inserting the PCMCIA-FBUS S2 always hangs the whole OS; the OS becomes responsive again when we remove the PCMCIA-FBUS S2 from the slot.
    Attachments:
    CadbusController.jpg 115 KB

  • N275GTX Lightning Core Voltage Range

    Hi, I am using the MSI Utility Afterburner to overclock my N275GTX Lightning 1792MB card.
    One of the options is to increase the core gpu voltage by an offset of 0 - 200 mV.
    GPU-Z says that the VDDC voltage of the card is 1.0500 V, so a max OC would take that up to 1.250 V.
    What I cannot find anywhere is the documented voltage range for my video card. I do not want to increase the voltage beyond what it can handle.
    Can anyone please tell me the voltage range of my card?

    http://forums.vr-zone.com/overclockers-hideout/478135-lightning-strikes-again-msi-n275gtx-1792mb-ddr3-lightning-edition-d.html
    "I raised the vGPU to 1.1415v max on the s/w."

  • PCMCIA-CAN Series 2 hanging

    Hello,
    We are using a PCMCIA-CAN Series 2 board on two of our machines to get/send PDO data on a CANopen network of 6 nodes.
    Built in 2007, the LabVIEW 2010 application (using the CANopen library 1.1.3 in combination with NI-CAN 2.7.4) seemed to work OK, mainly because we used it for short periods of time (under 12 h). To run longer tests, we now run the machine for over 24 h continuously, and this revealed a small bug. We first investigated the case with NI Spy. It reported a lost-communication problem with the board firmware and proposed different solutions or checks (Windows interrupts, application code, etc.).
    In our application, we have two loops to send and get PDOs on the network. We use the "Wait for PDO" function to wait for a TPDO during a time slot; if it is not received, we switch to waiting for another one, and so on. The other loop takes care of the RPDOs in the same way. We have no more than 14 RPDOs or TPDOs on our network. We checked the code and even slowed down the switching, but the bug still appears.
    We then ran the application on another, faster computer and applied all software updates. Now NI-TRACE reports other errors (attached to this post), but the symptoms are still the same: the application is not able to communicate with the board to get/send the PDO values (or send/receive SDOs), even though the nodes on the CAN network are running OK. It seems that the firmware on the board stopped functioning or hung.
    We contacted our NI sales representative to get version 1.1.4 of the CANopen library (with support for LabVIEW 2010) but have not received any proposal so far. We suppose there is no firmware update, as the product is now obsolete. Of course, the recommendation was also to switch to a new NI PCI board with the new industrial communications library, but that board is unfortunately a little too large to be installed in our panel PC. Another manufacturer's hardware would fit, for sure...
    Reading back my posts from 2007-2008, I was also reminded that the PCMCIA boards had more limitations than, for example, the PCI boards. But before changing hardware and rewriting the code, I want to give these boards (and all the work done in the past) a last chance.
    Does anyone have suggestions or solutions for me?
    Thanks in advance.
    Attachments:
    20130329SMTSpyCapture.nitrace 304 KB
    nicanErr.txt 5 KB

    Hello,
    Please find below two snapshots of the code and in particular the TPDO and RPDO loops I have in my main vi.
    The TPDO loop receives an array of clusters, each containing the declared TPDO elements. The array currently contains 7 elements. The TPDO refresh time has been set to 200 ms, so the loop waits about 28 ms for each TPDO in turn. The CAN network speed is 1 Mbit/s, and the 6 CAN nodes have been set up to send their TPDOs every 20 or 30 ms. Only the ones used are sent.
    The VI "Get TPDO data" implements the TPDO wait function. If a new TPDO is available, its value is compared with the stored one (in the array element). If it has changed, the TPDO cluster is updated and put in a queue. Then, following this producer/consumer queue principle, a second loop dispatches the values into the variables. (A generic sketch of this producer/consumer structure appears after the attachments below.)
    The RPDO loop is based on the same principle, except that the 2 RPDOs are set every 50 ms. The application then uses SDOs to send other commands to the nodes.
    When the problem appears, we notice that values are no longer refreshed on the front panel, and even SDO commands are no longer possible. Spying on the CAN network shows normal activity from the nodes.
    The first error message we got from NI Spy was:
    “Overflow in the lower level read queue of the CAN card (frames lost).  NI-CAN reads this queue at Windows interrupt-time. Solutions: Avoid tasks that generate excessive interrupts on your PC (mouse, ethernet, ...); Avoid running other
    applications during your test (screen savers, MAX, ...); use Series 2 Filter Mode to filter incoming traffic; For CAN Objects (Frame API), increase read queue length or call Read more frequently”
    So we checked our configuration and computer (screen saver, network connection, etc.). We also ran the application on another computer and made updates... Now NI-TRACE reports the information contained in the files attached to my last post. It is hard to find the bug when it only occurs after several hours of running time...
    With this information, any suggestions or comments?
    Thanks.
    Attachments:
    TPDO Loop.PNG 100 KB
    RPDO Loop.PNG 68 KB
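    For readers who have not used the producer/consumer structure described above, here is the generic sketch it refers to, written in C with pthreads rather than the NI-CAN or CANopen API: one loop stands in for the TPDO loop and enqueues only values that changed, while a second loop dequeues and dispatches them. Everything here (queue size, values, timing) is illustrative only.

        #include <pthread.h>
        #include <stdio.h>
        #include <unistd.h>

        /* A tiny fixed-size queue protected by a mutex/condition variable,
           standing in for the LabVIEW queue between the TPDO and dispatch loops. */
        #define QSIZE 32
        static int             q[QSIZE];
        static int             q_head, q_tail, q_count;
        static pthread_mutex_t q_mtx  = PTHREAD_MUTEX_INITIALIZER;
        static pthread_cond_t  q_cond = PTHREAD_COND_INITIALIZER;

        static void enqueue(int v)
        {
            pthread_mutex_lock(&q_mtx);
            if (q_count < QSIZE) {            /* drop on overflow, like a full queue */
                q[q_tail] = v;
                q_tail = (q_tail + 1) % QSIZE;
                q_count++;
                pthread_cond_signal(&q_cond);
            }
            pthread_mutex_unlock(&q_mtx);
        }

        static int dequeue(void)
        {
            pthread_mutex_lock(&q_mtx);
            while (q_count == 0)
                pthread_cond_wait(&q_cond, &q_mtx);
            int v = q[q_head];
            q_head = (q_head + 1) % QSIZE;
            q_count--;
            pthread_mutex_unlock(&q_mtx);
            return v;
        }

        /* Producer: stands in for the TPDO loop ("wait for PDO" per element,
           enqueue only values that changed). Here the "PDO" is just a counter. */
        static void *tpdo_loop(void *arg)
        {
            (void)arg;
            int last = -1;
            for (int i = 0; i < 100; i++) {
                int value = i / 3;            /* pretend PDO value        */
                if (value != last) {          /* enqueue only on change   */
                    last = value;
                    enqueue(value);
                }
                usleep(1000);                 /* ~1 ms per wait slot      */
            }
            enqueue(-1);                      /* sentinel: stop the consumer */
            return NULL;
        }

        int main(void)
        {
            pthread_t prod;
            pthread_create(&prod, NULL, tpdo_loop, NULL);

            /* Consumer: stands in for the dispatch loop writing to variables. */
            for (;;) {
                int v = dequeue();
                if (v < 0) break;
                printf("dispatch PDO value %d\n", v);
            }
            pthread_join(prod, NULL);
            return 0;
        }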

  • How to generate an interrupt using DI change detection on m-series card

    Hi,
    I want to generate an interrupt on the positive edge of a digital input signal on the IO connector.
    Does anybody know how to configure an m-series card (PXI-6224) for this use through RLP programming?
    Thanks in advance,
    Richard

    Richard vl wrote:
    I want to generate an interrupt on the positive edge of a digital input signal on the IO connector.
    Does anybody know how to configure an m-series card (PXI-6224) for this use through RLP programming?
    RuthC wrote:
    I also want to generate an external interrupt on an M Series PCI-6229 and on a PCI-6602.
    1. Is there an example of how to configure the registers?
    2. Which external signals can generate interrupts on those cards?
    Hi Richard, hi Ruth,
    Let me address your questions together: first digital change detection for the 622x (part of M Series), and then the 6602 (part of the 660x family).
    622x (M Series)
    Digital change detection has not been released in the DDK for M Series devices. If you must use an M Series device, please ask your field engineer to contact NI support so we can discuss options. On the other hand, digital change detection has been released in the DDK for X Series devices (63xx) [1].
    If you can use one from that family, then your programming will be much easier -- the RLP manual discusses change detection as well as interrupts (Chapter 1: Interrupts, beginning on PDF page 48), and the example distribution demonstrates how to configure change detection on the device (dioex3). The last piece is data transfer: the example's data transfer mechanism is DMA, so you would need to supply your own interrupt handler to move data to the host (or alert the host that a DMA transfer has completed).
    6602 (660x family)
    Moving to the 6602: change detection is not possible. The 660x device family only supports polling for transferring data read on the digital lines [2].
    Please let me know if I overlooked anything in your questions.
    [1] NI Measurement Hardware Driver Development Kit
    http://sine.ni.com/nips/cds/view/p/lang/en/nid/11737
    [2] NI 660x Specifications
    http://digital.ni.com/manuals.nsf/websearch/57893F11B0C0687F862579330064FF6F
    Joe Friedchicken
    NI VirtualBench Application Software
    Get with your fellow hardware users :: [ NI's VirtualBench User Group ]
    Get with your fellow OS users :: [ NI's Linux User Group ] [ NI's OS X User Group ]
    Get with your fellow developers :: [ NI's DAQmx Base User Group ] [ NI's DDK User Group ]
    Senior Software Engineer :: Multifunction Instruments Applications Group
    Software Engineer :: Measurements RLP Group (until Mar 2014)
    Applications Engineer :: High Speed Product Group (until Sep 2008)
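    As an aside for anyone who lands here without a hard register-level requirement: where the device and OS are supported, NI-DAQmx exposes digital change detection directly, roughly as sketched below with a placeholder line name. This is not the DDK/RLP route discussed above, and error checking is omitted.

        #include <stdio.h>
        #include <NIDAQmx.h>

        /* Callback invoked by the driver whenever a change is detected. */
        static int32 CVICALLBACK onChange(TaskHandle task, int32 signalID, void *data)
        {
            (void)task; (void)signalID; (void)data;
            printf("rising edge detected\n");
            return 0;
        }

        int main(void)
        {
            TaskHandle task = 0;

            /* "Dev1/port0/line0" is a placeholder digital line name. */
            DAQmxCreateTask("", &task);
            DAQmxCreateDIChan(task, "Dev1/port0/line0", "", DAQmx_Val_ChanPerLine);

            /* Sample only when the line changes; rising edges only here. */
            DAQmxCfgChangeDetectionTiming(task, "Dev1/port0/line0", "",
                                          DAQmx_Val_ContSamps, 1);

            /* Have the driver call onChange for every change-detection event. */
            DAQmxRegisterSignalEvent(task, DAQmx_Val_ChangeDetectionEvent, 0,
                                     onChange, NULL);

            DAQmxStartTask(task);
            printf("Press Enter to stop...\n");
            getchar();

            DAQmxStopTask(task);
            DAQmxClearTask(task);
            return 0;
        }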

  • How can I programmatically change the voltage range settings in a DAQ Assistant

    Hi,
    First post here.  
    I need to be able to change the voltage range properties of a DAQmx DAQ Assistant based on user input. My hardware, an SCXI-1102C, does not allow changing this property on a running task, so I'd like to either set the analog input voltage range before the DAQ Assistant activates, or pause the DAQ Assistant immediately after it starts, set the values, and then resume.
    I don't know how to edit the task ahead of time, because the DAQ Assistant creates the task when it runs, and there is no task before that.
    In the attached picture, I have a conditional section set to run only if the while loop iteration is 0. I take the task from the DAQ Assistant, send it to a Stop Task VI, set the property, and then send the task on to the Start Task VI. I can watch it run with the debug light on, and everything seems to work correctly, but on the second (and every subsequent) iteration of the loop, I read out AI.Max and it seems like the DAQ Assistant has reset it back to 5 V. Can anyone see what is going wrong here?
    By the way, this is a continuous acquisition, and the code does not produce error messages when it runs.
    I did come across a similar question that someone posted here back in 2006, but his question was specifically aimed at another API (VB, I think), and not at an actual G solution.
    Attached are the actual vi in question and a png image of the block diagram.
    Thanks! 
    Ruby K
    Attachments:
    Labview_question.PNG 14 KB
    Sample_AIV.vi 91 KB

    First, if you want to get beyond the basics with DAQ, you are going to have to stop using the DAQ Assistant and do it with lower-level DAQmx VIs. There are hundreds of examples in the Example Finder. You can even right-click on the DAQ Assistant and select Open Front Panel. That will create a subVI that you can open to see what is going on behind the scenes. Do it. I think you'll find the DAQ task is being recreated on each iteration (though I'm not 100% sure how the settings are established or maintained in each section of that subVI).
    The second problem is that you have a bit of a race condition on iteration 0. Those two DAQ property nodes run at the same time, so when you read AI.Max, the read may happen before or after AI.Max is set in your case structure.
    Third, make sure you wire up your error wires.
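    To put the "lower-level DAQmx" suggestion in text form (the G equivalent uses the same task/channel/start/read primitives), the idea is to create the task yourself, apply the user's range when the channel is created and before the task starts, and only then start and read. The channel name "SC1Mod1/ai0" and the limits below are placeholders, the SCXI-1102C's own range granularity still applies, and error checking is omitted.

        #include <stdio.h>
        #include <NIDAQmx.h>

        /* Build and run an AI task whose input limits come from user input.
           "SC1Mod1/ai0" is a placeholder channel name. */
        static void acquire_with_range(float64 userMin, float64 userMax)
        {
            TaskHandle task = 0;
            float64    data[100];
            int32      read = 0;

            DAQmxCreateTask("", &task);

            /* The range is fixed at channel-creation time, before the task
               starts, which avoids changing AI.Max/AI.Min on a running task. */
            DAQmxCreateAIVoltageChan(task, "SC1Mod1/ai0", "", DAQmx_Val_Cfg_Default,
                                     userMin, userMax, DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_ContSamps, 1000);

            DAQmxStartTask(task);
            DAQmxReadAnalogF64(task, 100, 10.0, DAQmx_Val_GroupByChannel,
                               data, 100, &read, NULL);
            printf("range %g..%g V: read %d samples\n", userMin, userMax, (int)read);

            DAQmxStopTask(task);
            DAQmxClearTask(task);
        }

        int main(void)
        {
            acquire_with_range(-1.0, 1.0);   /* e.g., limits chosen by the user */
            return 0;
        }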

  • Is there a simulator for E-Series Cards?

    Hi Folks,
    I developed an application that uses an E Series card. To make any updates, I need to connect remotely to the system that has the card. I'd like to try several programming techniques in a controlled environment first and then move the program to the actual system, so I need to be able to simulate the E Series card.
    Thanks,
    Mike

    Mike,
    Below are the links to a couple of tutorials concerning RDA:
    Networking Two PCs for Remote Data Acquisition
    Developing Networked Data Acquisition Systems with NI-DAQ
    Spencer S.
