Digital IO with DAQCard 1200

Hi!
Does anyone know the maximum data rate for digital I/O with the DAQCard-1200 (PC Card)?
Is it similar to the 1 MHz limit on the AT bus?
Thank you for your help.

Edmund,
The digital lines on the DAQCard-1200 do not have hardware timing associated with them, so the maximum read/write rate will depend on software timing.
This means the read/write rate depends on how fast your code can execute. One thing to remember is that as other programs demand CPU time, your application code will slow down; if nothing else is using CPU time, your application will run faster, up to a point.
If you need hardware timed digital I/O I would recommend using a DAQCard-6533.
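Since the 1200's digital lines are software-timed, the only way to know your real-world rate is to measure it. A minimal Python sketch of that idea, where the `write_line` callback is a hypothetical stand-in for the actual driver call (not the NI-DAQ API):

```python
import time

def measure_software_timed_rate(write_line, n_writes=100_000):
    """Toggle a digital line as fast as the software loop allows and
    return the achieved update rate in writes per second."""
    state = 0
    start = time.perf_counter()
    for _ in range(n_writes):
        state ^= 1          # flip the line state
        write_line(state)   # software-timed write: no hardware pacing
    elapsed = time.perf_counter() - start
    return n_writes / elapsed

# Stand-in for the driver call; a real program would call into the
# DAQ driver here, and the loop body's cost would set the rate.
rate = measure_software_timed_rate(lambda state: None)
print(f"achieved ~{rate:,.0f} software-timed writes/s")
```

The measured rate will vary with CPU load, which is exactly why software timing is unsuitable when you need deterministic update rates.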

Similar Messages

  • DAQCard 1200: How can I send two TTL signals on separate channels with an adjustable delay shortly after external triggering (typically a few milliseconds)?

    Hi, this is probably a simple question, but I have never used the digital and timing part of any DAQ board before... I wish to use an old Power Macintosh 7200 with a DAQCard 1200 for timing and synchronizing external events. The scenario is as follows: an external device (in my case, a CCD camera) sends a trigger signal (TTL, 600 ns). I want this signal to trigger the DAQCard, after which I want the DAQCard to send two TTL pulses (20 microsecond duration) from two separate channels (one pulse from each channel). The first pulse must be sent within a few milliseconds after the trigger is received.
    Then I wish to be able to control the delay from when the first pulse is sent to the second pulse. This delay should preferably be adjustable from a few microseconds up to at least 20 milliseconds. How can I achieve that kind of operation? (I need a few tips to get started, for instance: should I connect the external trigger signal to the EXTTRIG pin or to the gate pin of one of the counters? Do I have to interconnect the counters in some way? Do I have to invert the OUT signals? etc.) Thanks for your help in advance.
    regards,
    Michal

    I have not personally used the DAQCard 1200, but if it is similar to the E-series devices, you route the trigger signal to the gates of both counters.
    In your code, use the Data Acquisition->Counter->Generate Delayed Pulse VI twice, once for each counter. You can then set an input on that VI to trigger on the gate, and set the delay and pulse width of each pulse. Since both will start from the same trigger, you can delay either pulse as long as you wish, from microseconds to seconds, making the second pulse follow the first.
    Mark
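    Since both counters start from the same gate trigger, the spacing between the two pulses is simply the difference of the two programmed delays. A short sketch of that arithmetic (all numbers are hypothetical, in microseconds):

```python
def delayed_pulse(trigger_t, delay, width):
    """Edges of one delayed pulse: (rise, fall) relative to a shared trigger."""
    rise = trigger_t + delay
    return rise, rise + width

# Both counters are gated by the same trigger at t = 0.
trigger = 0.0
p1 = delayed_pulse(trigger, delay=100.0, width=20.0)   # first pulse
p2 = delayed_pulse(trigger, delay=350.0, width=20.0)   # second pulse

# The inter-pulse delay is just the difference of the two counter delays.
inter_pulse_delay = p2[0] - p1[0]
print(p1, p2, inter_pulse_delay)  # (100.0, 120.0) (350.0, 370.0) 250.0
```

    To change the spacing between the pulses, you only adjust the second counter's delay; the shared trigger guarantees both delays are measured from the same instant.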

  • I want to use an old Daqcard 1200 with Labview 8.5. Where is the driver?

    I'm using LabVIEW 8.5. I have an old DAQCard 1200 and there doesn't seem to be a driver for it. I downloaded an older version of NI-DAQ that claimed to have support for that card, but when I tried to install it, it wanted me to uninstall my newer version. Is there a way to use this card without going back to an ancient version of NI-DAQ?

    Unless there is some specific technical problem with interfacing to the old equipment, I don't think it is unreasonable at all to expect them to include drivers for legacy equipment. It would cost them very little to throw a few extra drivers into a package that encompasses their whole product line anyway, other than the missed opportunity cost of not selling new equipment in a few rare cases (which is probably what this is really about).
    NI will probably explain it better than I do. In general, they changed the way the DAQ driver operates, also introducing DAQmx, which is incompatible with the old way of functioning. For you, that forces a choice of which driver version to use.
    But anyway, it's not worth it for me to set up a machine just to run this thing. I'm just going to trash it. Thanks for the help.
    Why don't you sell it on eBay? There are quite a few people who would love to use it, cheaply enough. For example, this was one of the last cards with three counters, operating differently from the modern versions; it has its usefulness... Also, if this is the only card installed in the computer, you won't miss anything with the new drivers.

  • DAQCard 1200 / SC-2043-SG - Other Analogue Sensor Inputs?

    Hello!
    I am relatively new to these things, so sorry for any ignorance
    I have a self designed servo-hydraulic structural test rig with a load cell and displacement sensor on the hydraulic cylinder.
    Signals from both these sensors are input via a CB 50LP connector board and then a DAQCard 1200 to a PC running a LabVIEW 6i program, through pins 1 and 2.
    I also have an in-house made strain gauge signal box that inputs to pins 3, 4, 5 & 6.
    The DAQ card is using NRSE.
    Now, this all worked fine for years, but the in-house strain gauge board is now kaput, so I am trying to connect up an SC-2043-SG that we have, instead of the kaput in-house board and the CB 50LP.
    I have read the DAQ Card and SC-2043-SG manuals, and pretty much understand what I have to do, but....
     Page 3.6 in the manual says, "Pins 1-8: Analog Input Channels 0 through 7—These pins carry the conditioned strain gauge bridge signals (referenced to AISENSE) to the DAQ board. They are not routed to screw terminals."
    So my problem is, how can I connect the first two analogue channels (pins 1 & 2) to my load cell and displacement transducer, whilst using pins 3 to 8 for the strain gauges, with the SC-2043-SG?
    I thought I could just cut the 1 & 2 wires of the ribbon connector from the DAQ card to the SC-2043-SG and connect them directly to the sensors (and then not use strain gauge channels 1 & 2 on the SC-2043-SG), but am not sure if this would work / have any unwanted implications.
    Would this be OK, or is there an easier, neater way?
    Thanks
    Bife

    Hi Bife,
    The SC-2043 is designed for strain gauges only, so you can't connect basic voltage signals to it.
    It is true that you can cut the cable in order to provide your own signals to the DAQ card
    (not recommended, but it should work).
    You can also contact us so that we can suggest better hardware (perhaps the SC-2345, which can do multiple types of signal conditioning in a single enclosure).
    Best regards,
    Thomas B. | CLAD
    National Instruments France

  • Using a DAQCard-1200 on Windows XP w/ LabView 8.0

    I'm attempting to connect a Windows XP laptop running LabVIEW 8.0 to a DAQCard-1200.  I originally gathered data from an experiment on an older computer running LabVIEW 6, so I copied the VI over to the new system.  Originally I was missing some subVIs from the DAQ folder.  I've now copied the missing files from the older version (AI Acquire Waveforms, 1easyio.llb, files of that nature) into the new DAQmx folder.  These files produce the error message "... is not executable".  I've read some things on the discussion boards suggesting that DAQ isn't the same as DAQmx, but I'm not sure why.  On top of that, I'm also having difficulty installing the 6.9.3 driver for the DAQCard-1200 because there's a newer version installed.  I found the discussion board topic about uninstalling newer versions through the Control Panel and did that, but that's actually when the problems with non-executable VIs started.
    Any help would be appreciated.
    Mike

    Hi Mike,
    You are correct that Traditional NI-DAQ and DAQmx are different.  Before setting up a system you need to determine which one is right for you.  You can check here to see the most recent version(s) of the driver that are compatible with your hardware. The DAQCard-1200 is not supported by DAQmx, and the most recent compatible version of Traditional NI-DAQ is 6.9.3.
    The second thing to take into consideration is software compatibility.  You must check which LabVIEW versions are compatible with NI-DAQ 6.9.3.  This information is provided in a Knowledge Base article.  Traditional NI-DAQ was designed for use in LabVIEW 6.1.  It will also install to earlier versions, back to LabVIEW 5.0.1.  Although it is not recommended, it is possible to use NI-DAQ 6.9.3 with LabVIEW 8.0.  The procedure for getting this to work is outlined in Using Older Versions of NI-DAQ (6.9.3) with LabVIEW 7.x or 8.x.
    After installing the correct version of the driver, you can try opening and running your VI.  There may be some minor modifications required to get the application to run in LabVIEW 8.0.  Do not copy DAQ or DAQmx VIs between computers.  This is likely to cause difficulty in communicating with your device due to incompatibilities (i.e., a newer VI may not be compatible with your DAQCard-1200).  If you find that you are still unable to run your VI after following the above suggestions, post back with the problem you are having for further guidance. If possible, it would be recommended to upgrade your hardware and begin working with DAQmx to ensure the best possible support for your application.
    Hope this helps,
    Jennifer O.
    Applications Engineer
    National Instruments

  • Easiest way to connect DAQCard-1200 to BNC Input

    I have a DAQCard-1200 and I'm looking to capture a waveform from a BNC input signal. I can't seem to find any cables that go along with it, so I was wondering what the simplest setup is for collecting one channel of BNC input. Do I have to buy a Ribbon Cable and connect that to a BNC Adapter Board?
    Thanks for your help.
    Steve Russell

    Steve,
    To obtain BNC connectivity with your DAQCard-1200, you will want to use the BNC-2081 connector block. The PR50-50F cable will connect your DAQCard-1200 directly to the BNC-2081. Below is a link to the user manual for the BNC-2081:
    BNC-208X Series User Manual
    Good luck with your application.
    Spencer S.

  • R507 Digital Camera with Windows 7

    HELP!  I have an HP R507 digital camera with a SanDisk 64 MB memory card. Recently, I upgraded my OS to Windows 7, updated the drivers, and replaced the batteries in the camera. For some reason, the camera's memory will not empty after I download pictures. WHAT SHOULD I DO??  RSVP! Thanks! Jim

    I have solved the problem! Thanks!!  Jim (Namaycush) 
    WHEN ALL ELSE FAILS, READ THE INSTRUCTIONS!! 

  • Digital Output With Timer (Simulation)

    Hello everyone, I learned how to make LabVIEW programs just a week ago. I am trying to make a simulation of digital output in LabVIEW (see my attachment). In this simulation I have a slider as an input (0-10 V), two numeric controls (upper limit and bottom limit), a waveform chart that plots those 3 values, and two boolean LEDs (P0.0 and P0.1) as indicators. You can enter any number (between 0 and 10) in the numeric controls as a limit for the slider input. If the input from the slider exceeds the upper or bottom limit, the corresponding boolean LED turns on: P0.0 for the upper limit and P0.1 for the bottom limit. The problem is I don't know how to make a timer for those boolean LEDs. For example:
    1) Make an input from the slider.
    2) If input (1) exceeds the upper limit, P0.0 will turn on for 5 seconds, then turn off for 10 seconds.
    3) If within those 10 seconds you change the input back to normal (between the upper and bottom limits), then P0.0 will stay off until the slider input exceeds the upper limit again.
    4) If within those 10 seconds you don't change the input (it still exceeds the upper limit), then P0.0 will repeat process (2) until the slider input goes back to normal.
    (Same process for input that exceeds the bottom limit.)
    Can you help me make this timer? Thank you. (I'm sorry I made a double post.)
    Regards
    Juventom
    Attachments:
    Digital Output With Timer.vi 16 KB

    Hello Juventom,
    As I understand it, you want to continuously check the value of the slider and compare it to the upper and lower limit controls, whilst also changing the LED booleans to true for 5 seconds and then false for 10 seconds if the slider value is outside the limits.
    To do this you would probably be best off using a parallel loop design, where you have 3 while loops in place of the one you have currently. Each of these while loops would be responsible for one part of your program (e.g. the top one would display your values on the graph, the second one would check the slider value against the upper limit and then turn on the LED, etc.).
    I've found this tutorial about multiple-loop programs, and I think you should look at the section entitled "Parallel Execution":
    http://zone.ni.com/devzone/cda/tut/p/id/3749
    This way you can use normal delay VIs but when they run they only pause that loop rather than the whole program.
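    The retriggerable "5 s on / 10 s off" behaviour is easiest to reason about as a small state machine, with the loop's delay playing the role of the clock. A Python sketch of that logic (the state names and the use of a simulated clock are illustrative, not LabVIEW code):

```python
def led_timer(over_limit, now, state):
    """One step of a retriggerable LED timer state machine.

    over_limit: bool, is the slider past the limit right now
    now: current time in seconds (any monotonic clock)
    state: (phase, t0) where phase is 'off', 'on', or 'blank'
    Returns (new_state, led_lit).
    """
    phase, t0 = state
    if phase == "off":
        if over_limit:
            return ("on", now), True        # start the 5 s ON window
        return state, False
    if phase == "on":
        if now - t0 >= 5.0:
            return ("blank", now), False    # start the 10 s OFF window
        return state, True
    # phase == "blank": after 10 s, retrigger only if still over limit
    if now - t0 >= 10.0:
        if over_limit:
            return ("on", now), True
        return ("off", now), False
    return state, False

# Drive it with a simulated 1 Hz clock, over the limit the whole time:
# the LED should be on 5 s, off 10 s, then on again.
state, lit_log = ("off", 0.0), []
for t in range(0, 20):
    state, lit = led_timer(True, float(t), state)
    lit_log.append(lit)
print(lit_log)
```

    In LabVIEW the equivalent would be a while loop holding the phase and start time in shift registers, with a wait VI providing the clock tick, so only that loop pauses while the others keep running.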
    Please let me know how you get on with this, and ask me if you need further help.
    James W
    Controls Systems Engineer
    STFC

  • Can you use a digital recorder with Skype on a Mac mini?

    Can you use a digital recorder with Skype on a Mac mini?

    Have a look at http://www.ecamm.com/mac/callrecorder/

  • Digital Signatures with Adobe Reader

    So I created an Adobe form with Acrobat 9 and sent it out for everyone to digitally sign.  About half are getting an error when they try to sign it: "The credential selected for signature is invalid."
    We are a government agency and use Common Access Card (CAC) certificates to digitally sign with.
    I was hoping someone out there could either point me to a good resource for Adobe and digital signatures or to a possible fix.
    We have now narrowed it down to the problem being with specific machines.  A user can digitally sign the document on another machine, but not on their own machine.  Also, no one else can sign the doc on their machine either.
    Thanks in advance!

    The issue still exists.  I've been searching for some info on how Adobe handles digital signatures, like what folders are created on the machine.  I'm thinking maybe I can clear out the app data for Acrobat or something.  I'm at a loss at the moment.

  • How do I layer digital backgrounds with photoshop elements 2011? [was:Please Help]

    How do I layer digital backgrounds with photoshop elements 2011?

    What do you mean by "layer digital backgrounds"? Can you explain a little more about what you're trying to do?

  • How to install adobe elements on a minibook with a 1200 x 600 resolution

    how to install adobe elements on a minibook with a 1200 x 600 resolution?

    You don't... that is below the minimum specifications
    http://www.adobe.com/products/photoshop-premiere-elements/tech-specs.html

  • Digital Signatures with SmartCards.

    Hi guys,
    Has anyone implemented in R/3 digital signatures with smartcards?
    Currently I'm at customer side trying to implement digital signatures within workflow processes using ABAP SSF functions. The smartcard devices are already installed, but I can't read the data inside the smartcard, moreover, I can't link the smartcard device with R/3 and I don't know how to do it…
    I read in some weblogs and documents that a SAP-certified external security product is necessary. I believe this external security product is the software that comes on the smartcard drivers CD. It is something like a little application in which we can sign data and provide our fingerprint.
    I guess I am not supposed to develop an interface application between the smartcard and R/3! When I started this development I thought I only needed to configure some environment variables to connect these devices to R/3 and then develop the ABAP flow logic with SSF functions. Am I right?
    Can anyone provide me some guidelines for this issue?
    Thanks in advance,
    Ricardo.

    The SmartCard device is present at the frontend PC, and that's the place where the digital signature operation has to take place. The "What You See Is What You Sign" principle is important: it has to be ensured that the data being signed (using the private key stored on the SmartCard) is exactly the same as the data displayed to the user.
    Notice: there is a different scenario where the server signs the data (after prompting the user for a user ID and password and validating that information).
    The signed data is then transported back to the server where it is stored (to ensure auditability); usually you'll have to keep the (archived) data for years, and the public key needs to be archived as well.
    Notice: it is possible to attach the certificate (i.e. the public key) that was used to sign the data to the signed data.
    Regards, Wolfgang

  • Digital Handshaking with two PCI-DIO-32HS Cards

    Hardware: two PCI-DIO-32HS Cards
    Software: LabVIEW 5.1, NI DAQ 6.6
    Problem:
    I'd like to do burst digital handshaking with two PCI-DIO-32HS cards.
    One being used for sending bit stream while the other receive.
    Suppose I want to use burst handshake mode.
    How should I wire the connections?
    Where should I wire the REQ, and ACK line from the sending card?
    Should I wire REQ from card one to REQ of the other card?
    Also, how do I configure the LabVIEW VI to do burst handshaking mode?
    Can anyone send me a VI that can do this?
    Thanks a lot.

    Matt,
    I would recommend using the DIdoubleBufPatternGen.C example that ships with NI-DAQ. You can find it in your \NI-DAQ\Examples\VisualC\Di folder. If you don't have this example on your machine, you can get it by running NI-DAQ Setup and selecting support for C/C++.
    This example does double buffering to allow you to continuously acquire data from your card. Data is transferred only when a full half-buffer is ready. You can set how long to acquire data by setting the number of half-buffers to read, or by modifying the read loop's conditional parameters to fit your needs. See the NI-DAQ help on how to set your REQ pulse rate to 100 kS/s.
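    The half-buffer scheme that example relies on can be sketched in a few lines. This Python mock (the `sample_source` iterator is a hypothetical stand-in for the card's data stream, not the NI-DAQ API) only shows *when* data is handed back to the application:

```python
def double_buffered_read(sample_source, half_size, n_halves):
    """Consume a continuous sample stream half a buffer at a time,
    the way double-buffered pattern I/O hands data back to the app."""
    circular = [0] * (2 * half_size)   # device-side circular buffer
    out = []
    write_idx = 0
    halves_read = 0
    while halves_read < n_halves:
        circular[write_idx] = next(sample_source)   # "device" fills buffer
        write_idx = (write_idx + 1) % len(circular)
        # A half-buffer boundary was just crossed: copy that half out.
        if write_idx == half_size:
            out.extend(circular[:half_size])
            halves_read += 1
        elif write_idx == 0:
            out.extend(circular[half_size:])
            halves_read += 1
    return out

samples = iter(range(1000))          # stand-in for the DIO data stream
data = double_buffered_read(samples, half_size=4, n_halves=3)
print(data)  # first 12 samples, delivered in 3 half-buffers of 4
```

    The point of the scheme is that the application only ever copies a half that the device has finished writing, so acquisition and processing can overlap without corrupting data.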
    Nick W.
    www.ni.com/ask

  • Digital camera with iChat

    I have a digital camera with a usb connection. Can I use that for a video chat? I'm using chat 3.1.9
    Thanks, Michael

    Hi,
    Most likely the answer is No.
    I have a Kodak DC4800 It connects to iPhoto over USB.
    It can not be seen as a Live Camera over this route.
    It does in fact have a very small single-channel socket (smaller than a 3.5 mm jack) with an RCA connector on the other end.
    This can be used to display pics on a TV screen (there is a menu item in the setup for PAL or NTSC), which can be handy for taking indoor pics if you can't see the display properly. If you have a VCR you can record video that way.
    I used it to go through a DV converter.
    This was used to convert an Analogue CamCorder but I tried it with the camera.
    If the Video can be seen over USB in say iMovie (It can see many more Video codecs than iChat or Photo Booth) or Quicktime Player (if you have the Pro capabilities to Record) then it will be a question of an Utility for iChat. (http://www.ecamm.com/mac/iusbcam/) This can change the codecs of some USB cameras that have drivers and can be "Seen" in various other apps. (Essentially it converts it to DV format).
    If the USB feed can not be seen in any app, you would need a Mac driver for it, much like you would for a webcam.
    There are third-party drivers for many webcams, and some cameras come with drivers on the install disk or downloadable from the manufacturer's web site.
    However, these tend not to exist for stills cameras, nor do the third-party ones work for them.
    5:46 PM Sunday; September 13, 2009
