Compatibility of pattern I/O on a PXI-6533 card with a real-time application

Hi,
I use a PXI-6533 card in a chassis with a PXI-8145 RT controller. I want to acquire 32 bits of data with this card at 1.25 kHz. The data are generated by an external FPGA card together with a clock signal. According to the 6533 documentation, a simple way to synchronize the acquisition is to use pattern I/O, feeding the clock signal into the REQ pin. I tried the LabVIEW example Cont Pattern Input.vi. It seems to run well, but it stops working when I change the VI's priority to time-critical or any priority higher than normal. Is there an incompatibility between this type of acquisition and real-time?
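For reference, here is a minimal sketch of the same idea (an external clock pacing a continuous 32-bit pattern acquisition), written against the NI-DAQmx C API rather than the LabVIEW example used in this thread. The device name "PXI1Slot2" and the clock terminal "/PXI1Slot2/PFI2" are placeholders; on a 653x board the external REQ clock maps to a device-specific terminal, so check the available routes in MAX.

    /* Minimal sketch (not the poster's VIs): continuous 32-bit digital pattern
     * input clocked by an external signal, using the NI-DAQmx C API.
     * "PXI1Slot2" and "/PXI1Slot2/PFI2" are assumed names. */
    #include <stdio.h>
    #include <NIDAQmx.h>

    #define CHECK(call) do { int32 e = (call); if (e < 0) { \
        char msg[2048]; DAQmxGetExtendedErrorInfo(msg, sizeof msg); \
        fprintf(stderr, "DAQmx error: %s\n", msg); return -1; } } while (0)

    int main(void)
    {
        TaskHandle task = 0;
        uInt32 data[125];          /* 100 ms of data at 1.25 kHz */
        int32  read = 0;

        CHECK(DAQmxCreateTask("", &task));
        /* All 32 lines of ports 0-3 read as one 32-bit sample per clock edge. */
        CHECK(DAQmxCreateDIChan(task, "PXI1Slot2/port0:3", "",
                                DAQmx_Val_ChanForAllLines));
        /* The external 1.25 kHz clock from the FPGA card drives the acquisition. */
        CHECK(DAQmxCfgSampClkTiming(task, "/PXI1Slot2/PFI2", 1250.0,
                                    DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1250));
        CHECK(DAQmxStartTask(task));

        for (int i = 0; i < 100; i++) {   /* read ~10 s, 125 samples at a time */
            CHECK(DAQmxReadDigitalU32(task, 125, 10.0,
                                      DAQmx_Val_GroupByChannel,
                                      data, 125, &read, NULL));
            printf("read %d samples, first = 0x%08X\n",
                   (int)read, (unsigned)data[0]);
        }

        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }

The point the sketch illustrates is that the FPGA's clock, not a software loop, paces the acquisition, so the read call only has to drain the on-board buffer.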

Hi,
Sorry for this late answer. I did indeed add a "wait until next ms" function in the time-critical loop, and it seems to work. In fact, I don't see exactly why, but it works with wait times from 1 ms up to 50 ms.
Nevertheless, the next step is to add the content of this example to a more complex system with a communication loop that transfers the acquired data to the host PC. I already have such an architecture, with a time-critical-priority VI that reads the data at 1.25 kHz and an RT FIFO that transfers them to a second, normal-priority VI running on the PXI, which handles TCP communication with the host PC. This architecture works correctly in another application that uses an internal clock, BUT it does not work with this example, even with your solution.
In fact, it seems to work when I set both VIs running on the PXI to normal priority, but with tremendous jitter (the frequency varies from 1 to 2 kHz) and a scan backlog that fills up very quickly. The PXI then crashes as soon as I change either priority.
You'll find these three VIs attached, and I hope this helps in finding a solution.
Cordially
PS: Since we started this thread, I have been in contact with NI Support France on this subject. Nevertheless, we haven't found a solution yet, and any new idea is welcome!
Attachments:
HOST_Comm.vi 532 KB
RT_Comm.vi 265 KB
TCLoop.vi 256 KB
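As a side note on the architecture described above (a time-critical acquisition loop passing data through an RT FIFO to a normal-priority TCP loop), the sketch below shows the same producer/consumer idea in C, as a fixed-size single-producer/single-consumer ring buffer. It is a conceptual analogue of an RT FIFO, not the attached VIs; the buffer depth of 4096 is an arbitrary assumption.

    /* Conceptual sketch only: the RT FIFO idea as a fixed-size
     * single-producer/single-consumer ring buffer. The time-critical loop
     * pushes one 32-bit sample per clock tick and never blocks; the
     * normal-priority loop drains the buffer and ships data over TCP
     * (omitted here). FIFO_DEPTH of 4096 is an assumption. */
    #include <stdatomic.h>
    #include <stdint.h>
    #include <stdbool.h>

    #define FIFO_DEPTH 4096u               /* power of two */

    typedef struct {
        uint32_t    buf[FIFO_DEPTH];
        atomic_uint head;                  /* written by producer only */
        atomic_uint tail;                  /* written by consumer only */
    } rt_fifo;

    /* Called from the time-critical loop: never blocks, reports overflow. */
    static bool fifo_push(rt_fifo *f, uint32_t sample)
    {
        unsigned head = atomic_load_explicit(&f->head, memory_order_relaxed);
        unsigned tail = atomic_load_explicit(&f->tail, memory_order_acquire);
        if (head - tail == FIFO_DEPTH)
            return false;                  /* full: the equivalent of a backlog */
        f->buf[head % FIFO_DEPTH] = sample;
        atomic_store_explicit(&f->head, head + 1, memory_order_release);
        return true;
    }

    /* Called from the normal-priority communication loop. */
    static bool fifo_pop(rt_fifo *f, uint32_t *sample)
    {
        unsigned tail = atomic_load_explicit(&f->tail, memory_order_relaxed);
        unsigned head = atomic_load_explicit(&f->head, memory_order_acquire);
        if (head == tail)
            return false;                  /* empty */
        *sample = f->buf[tail % FIFO_DEPTH];
        atomic_store_explicit(&f->tail, tail + 1, memory_order_release);
        return true;
    }

The design point it illustrates is that the time-critical loop only ever performs a non-blocking push, while everything that can block (the TCP write) stays in the lower-priority loop.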

Similar Messages

  • Using PXI 6541 in real time applications

    Hi,
    I am using a PXI-6541 for data acquisition from an electronic card. The card generates an interrupt signal, and the PXI acquires data when it receives that signal. The interrupt signal is generated at 1 kHz, but under Windows XP I cannot acquire data at this rate; the highest rate at which I can acquire data is 50 Hz. What should I do to acquire data at 1 kHz from the electronic card using the PXI-6541 under Windows XP?

    Are you trying to use the 'interrupt signal' as a sample clock for your acquisition?  If so, you should be able to route the signal to the PFI input and configure it as the sample clock.  
    As mentioned in your other thread (http://forums.ni.com/t5/Digital-I-O/Using-PXI-6541-in-real-time-applications/m-p/2561885), I would also recommend looking through some of the examples to get a starting point for your application.  
    James K.
    National Instruments
    Applications Engineer
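    For what it's worth, the 654x family is programmed with the NI-HSDIO driver rather than DAQmx. The sketch below (C, untested against this setup) shows the approach described above: let the external 1 kHz signal clock the acquisition in hardware so Windows only has to read back buffered blocks. The resource name "PXI1Slot3", the 20-line channel list, and the assumption that the external clock is wired to the STROBE terminal of the DDC connector are all placeholders to adjust.

        /* Minimal sketch: acquire 20 digital lines on a PXI-6541 with an
         * external sample clock via the NI-HSDIO C API. Resource name,
         * channel list, and the use of STROBE as clock source are assumed. */
        #include <stdio.h>
        #include <niHSDIO.h>

        int main(void)
        {
            ViSession vi = VI_NULL;
            ViUInt32  data[1000];
            ViInt32   samplesRead = 0;

            if (niHSDIO_InitAcquisitionSession("PXI1Slot3", VI_FALSE, VI_FALSE,
                                               "", &vi) < VI_SUCCESS)
                return -1;

            niHSDIO_AssignDynamicChannels(vi, "0-19");
            /* Use the signal wired to STROBE as the sample clock (nominal 1 kHz). */
            niHSDIO_ConfigureSampleClock(vi, NIHSDIO_VAL_STROBE_STR, 1000.0);
            niHSDIO_Initiate(vi);

            /* One second of data: the hardware clocks the samples in, so the
             * host only fetches the buffered block -- no 1 kHz software loop. */
            niHSDIO_ReadWaveformU32(vi, 1000, 5000, &samplesRead, data);
            printf("read %d samples, first = 0x%08X\n",
                   (int)samplesRead, (unsigned)data[0]);

            niHSDIO_close(vi);
            return 0;
        }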

  • CRIO and ni 9234 modules not working or communicating through fpga with accelerometers, fpga connected to real time application which is also connected to shared variables linked to modbus slave

    Hi,
    I have a CompactRIO with a 4-slot chassis. Attached to that chassis are three NI 9234 modules, which are read through the FPGA by a real-time application, and shared variables in the low-speed loop are linked to a Modbus slave to communicate with the DCS. The NI 9234s have accelerometers connected to them, with the IEPE AC-coupled option enabled on the C-series modules. My problem is this: the real-time application itself seems to run fine (even after a power loss it restarts without problems, and the FPGA writes the .bin files to the portable hard drive correctly), but without an accelerometer connected I get low-level noise readings, and as soon as I connect an accelerometer to any one of the 10 channels the reading goes to a fixed number (0.03125); as soon as I disconnect it, it reverts to reading noise. I have run a scan on the modules and only get a spike when I connect or disconnect the accelerometer. I have tested the voltage at the pins of the module and I get 22 V DC, which makes it more likely that the problem is not the hardware but rather software causing this to hang up. I attach the project and files for your perusal. I also created a new project that, in scan mode, linked the module input directly to a shared variable, and got the same scenario again. Help would be much appreciated.
    Many thanks
    Jason
    Solved!
    Go to Solution.
    Attachments:
    logger 2plusmodbus2.zip 679 KB

    When using waveform acquisition with the 9234s, we recommend the following FPGA and RT template:
    http://sine.ni.com/nips/cds/view/p/lang/en/nid/209114
    It can be extended into a data logger with:
    http://zone.ni.com/devzone/cda/epd/p/id/6388
    or by using shared variables combined with the Scan Engine:
    http://zone.ni.com/devzone/cda/tut/p/id/9851
    The FPGA code in all of these, as well as the RT framework, has been used successfully by thousands of users. I would recommend giving these a try.
    Preston Johnson
    Principal Sales Engineer
    Condition Monitoring Systems
    Vibration Analyst III - www.vibinst.org, www.mobiusinstitute.com
    National Instruments
    [email protected]
    www.ni.com/mcm
    www.ni.com/soundandvibration
    www.ni.com/biganalogdata
    512-683-5444

  • HT204053 I recently had to cancel my debit card because of unauthorized purchases on Facebook. I have tried to update with 4 different credit cards and every time I get a message saying invalid security code. How do I fix this?

    I recently had to cancel my debit card because of unauthorized purchases on Facebook. I have tried to update with 4 different credit cards and every time I get a message saying invalid security code. How do I fix this?

    Debit card? Are you sure?
    USA iTunes Store does not appear to accept debit cards - http://www.apple.com/legal/itunes/us/terms.html  "The iTunes Store, Mac App Store, App Store, and iBookstore services (“Services”) accept these forms of payment: credit cards issued by U.S. banks, payments through your PayPal account, iTunes Cards, iTunes Store Gift Certificates, Content Codes, and Allowance Account balances."

  • Pxi 1042 configuration for Real-Time

    I have purchased
    PXI-1042
    controller 8187
    one module for data capturing.
    I want to configure this PXI system for real-time use,
    so what steps should I follow?
    thanks
    Abbas
    Solved!
    Go to Solution.

    The Real-Time Deployment License is different from the LabVIEW RT module.
    When you purchase a PXI controller, part of what you are buying is the operating system that is installed on it.  If you purchased the controller with Windows on it then you purchased the copy of Windows that came with it.  Now that you want to put the real-time operating system on it, you need to purchase the license for it.
    But that's all legal stuff.
    And you are correct that there is a copy of the RTOS on the controller.  But if your controller has its disk formatted as NTFS you will not be able to use it.   The RTOS requires that the hard drive be FAT32. 
    If your controller is FAT32 then you can boot into BIOS (hit delete while booting), and select Real-Time in the Boot Configuration to make your controller boot into Real-Time.
    If your controller is not FAT32, then you will need to format the controller first, using the disks I have previously mentioned.
    Please let me know if you have any questions.
    Justin Parker
    National Instruments
    Product Support Engineer

  • No real time RAM preview with Pr CS6.0.1 Open, Real time RAM preview with Pr Closed

    When Premiere is closed I see real-time performance on RAM previews (29.97 fps) and smooth UI interaction, with and without Fast Draft enabled.  When Premiere is open I get poor fps in RAM previews and lagging UI interaction unless Fast Draft is enabled.  Premiere's memory preferences are set to 6 GB reserved for other apps, and Optimize Rendering is set for Memory.  It seems that as soon as Pr takes on any processing, AE's performance just dies until Pr is closed.
    I've tried clearing caches in AE and in Premiere, restarts, and repair disk permissions,
    Included is a real-time screen capture of what I'm experiencing; I open and close Premiere twice to illustrate that it's not an isolated occurrence.
    Any ideas?
    System Specs:
    dual 2.26ghz quadcore MacPro4.1
    OSX 10.7.4
    32gigs RAM,
    4000 for Mac.
    CUDA 4.2.10, GPU driver 1.3.4.0. 
    Cache drive is a 2TB RAID 0 with 250MB/sec
    Media is on RAID 6 with 600+MB/sec.
    Premiere Pro CS6.0.1
    After Effects CS6.0.1

    I tried the same thing on my MacBook Pro and there was no issue.  Here's my specs:
    Processor  2.4 GHz Intel Core i7
    Memory  8 GB 1333 MHz DDR3
    Graphics  AMD Radeon HD 6770M 1024 MB
    Software  Mac OS X Lion 10.7.4 (11E53)
    I'd suggest:
    Submit bugs to http://www.adobe.com/go/wish . More on how to give feedback: http://bit.ly/93d6NF

  • Build real-time application with Compact RIO

    Good afternoon,
    I am currently trying to run a VI on compact RIO and would like to control it through remote front panel. I followed steps on this link http://digital.ni.com/public.nsf/allkb/AB6C6841486E84EA862576C8005A0C26 and successfully done everything with a simple example.
    However, when I moved on and did the same thing with a more complicated VI (my purpose is to make this VI work), everything was fine until I rebooted the CompactRIO. After a few seconds the connection between the host computer and the cRIO was lost, and I had to shut it down and delete the startup file (with extension .rtexe).
    I am not sure what happened, since everything works fine with the simple VI but not with the complicated one. It could be because the second VI has many subVIs as well as objective functions loaded in it; it could also be because the VI takes up too much of the cRIO's memory and stops it from connecting to the host computer.
    If anyone has any ideas of how to make it work, please let me know.
    Thanks very much
    Carl

    Hello zzzfreedom,
    There are a number of potential issues I can see with the VI you're trying to deploy as a startup executable.  How do you intend to interact with this VI? Are you running the front panel as a remote panel or connecting to the VI using debug tools? A few points:
    - Your VI will run immediately when the RIO boots unless you're using debugging tools to prevent this from happening, so keep that in mind.  It looks like you've accounted for this and required an initialize or network trigger of some sort for some of your loops, but the AI loop will start quickly and it appears that it may require user input. 
    - You have several "user prompt" style express VIs.  These will not work (or will not work as expected) on a standalone RT target.  There is usually no front panel to interact with!
    - Like dialogs, event structures watching for user interaction probably aren't going to do what you want.
    - You are writing quite a bit of data to the VI's front panel, and there is at least one chart indicator.  Again, how will the user interact with this VI?  It looks like you need a host VI that will run on a machine the user will interact with.
    - You're using quite a few local variables.  It looks like you've taken a lot of care to protect against race conditions, but this causes a lot of data copies and tends to be error prone.
    - I've not analyzed all cases, but it looks like you have a number of places where the execution of a timed loop may be blocked under certain conditions.  This will likely rail the CPU due to the much higher priority of the timed loops.
    - What will happen if you lose connection with the server in your TCP command loop?  It doesn't look like there is any way for the user to reconnect without restarting the RIO (see the reconnection sketch after this reply).
    If you do intend to run this as a remotely accessible VI on your RT target, another point to note is that when running from the development environment, the front panel of your VI executes on the host machine. Once you deploy it as a remote front panel or debuggable RTEXE, everything is hosted on the RIO, and this has the potential to bog things down quickly.
    Here are a few references I think you might find helpful:
    LabVIEW Help: Real-Time Operating Systems - see considerations for Express VIs and Front Panel interaction
    http://zone.ni.com/reference/en-XX/help/370622L-01/lvrtconcepts/rt_osnotes/
    LabVIEW Help: Real-Time Module on VxWorks Targets - see unsupported features
    http://zone.ni.com/reference/en-XX/help/370622L-01/lvrtconcepts/rt_vxworks/
    NI LabVIEW for CompactRIO Developer's Guide -lots of good general information on architecting RT applications, network communication and hosts, etc. It looks like you're using the RIO Scan Engine, so the FPGA portion might not be relevant at this time.
    http://www.ni.com/compactriodevguide/
    Best Regards,
    Tom L.
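    On the TCP reconnection point above, here is a minimal sketch of a command loop that falls back to reconnecting instead of requiring a reboot. It uses POSIX sockets with a placeholder host 192.168.1.10 and port 5000; a VxWorks cRIO target or a LabVIEW TCP loop would use different calls, but the retry structure is the same.

        /* Conceptual sketch of a TCP command loop that survives a lost
         * connection by looping back to connect() instead of exiting.
         * Host 192.168.1.10 and port 5000 are placeholders. */
        #include <stdio.h>
        #include <string.h>
        #include <unistd.h>
        #include <sys/types.h>
        #include <sys/socket.h>
        #include <arpa/inet.h>

        int main(void)
        {
            for (;;) {                                   /* outer reconnect loop */
                int fd = socket(AF_INET, SOCK_STREAM, 0);
                if (fd < 0)
                    return 1;

                struct sockaddr_in srv = {0};
                srv.sin_family = AF_INET;
                srv.sin_port   = htons(5000);
                inet_pton(AF_INET, "192.168.1.10", &srv.sin_addr);

                if (connect(fd, (struct sockaddr *)&srv, sizeof srv) != 0) {
                    close(fd);
                    sleep(2);                            /* back off, then retry */
                    continue;
                }

                char cmd[256];
                for (;;) {                               /* inner command loop */
                    ssize_t n = recv(fd, cmd, sizeof cmd - 1, 0);
                    if (n <= 0)                          /* peer closed or error */
                        break;                           /* fall out and reconnect */
                    cmd[n] = '\0';
                    printf("got command: %s\n", cmd);    /* dispatch would go here */
                }
                close(fd);
            }
        }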

  • How to make 1D Array but with only one element filled in Real Time

    Hi folks,
    here I am with another question. I want to implement a prediction discrete state-space observer that is going to run on a cRIO real-time target. I am going to do it just like in the example that comes with LabVIEW.
    I have some questions regarding the inputs and outputs, which in the example are "dummy" values.
    My model is a SISO model, but the function "Construct SS Model" returns the parameters (A, B, C, D matrices) as 2D arrays, so once you connect the model cluster to the Discrete Observer function, it takes y and u as 1D arrays despite the fact that it is a SISO model.
    I realized that the function I am using in the simulations uses 1D arrays with only one element filled.
    Does anyone know how to implement such 1D arrays in real time? I guess the way to do it is to preallocate an array of zeros of size 1, recirculate it through a shift register, and replace the element with my real input and output each iteration; but in dummy.vi they just use a simple "Build Array" function.
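    That "preallocate once and replace in place" guess is indeed the usual pattern for real-time code. As an illustration only, here is the same idea in C; observer_step() is a made-up stand-in for the Discrete Observer call, and its gains are arbitrary.

        /* Sketch of preallocating length-1 input/output arrays once and
         * replacing their contents each iteration -- the C analogue of a
         * preallocated array carried in a shift register. observer_step()
         * and its gains are hypothetical placeholders. */
        static void observer_step(const double u[], const double y[],
                                  double x_hat[])
        {
            const double a = 0.95, b = 0.05, c = 1.0, L = 0.3;  /* made-up */
            x_hat[0] = a * x_hat[0] + b * u[0] + L * (y[0] - c * x_hat[0]);
        }

        void control_loop(double x_hat[], int iterations)
        {
            double u[1] = {0.0};   /* allocated once, before the loop */
            double y[1] = {0.0};

            for (int k = 0; k < iterations; k++) {
                u[0] = 0.0;        /* replace with the real actuator command */
                y[0] = 0.0;        /* replace with the real measurement */
                observer_step(u, y, x_hat);   /* SISO: length-1 arrays */
            }
        }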

    Ok, I did it that way. But I am facing another problem right now...
    At some point the Discrete Observer returns a NaN array; you can see the code in the code snippet.
    I get rid of the NaNs component by component, but the observer gets "stuck" in that state, so my control law is zero while the state estimate is NaN.
    Also I am attaching the VI.
    I do not know why, since everything runs well in the simulation program. Any thoughts? Maybe the internal numeric precision of the state-space model?
    Attachments:
    RT - Pole Placement + Complete Observer.vi 40 KB
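    One note on the "stuck" behaviour: any arithmetic involving NaN stays NaN, so once one element of the state estimate goes NaN the observer can never recover on its own. A simple guard, sketched below in C and purely illustrative, is to check the estimate each iteration and re-seed it from a known-good state while the root cause (model conditioning, precision) is investigated.

        /* Guard against a latched-NaN state estimate: if any element of
         * x_hat is NaN, re-seed the whole state from a known-good copy. */
        #include <math.h>
        #include <stdbool.h>
        #include <stddef.h>
        #include <string.h>

        static bool reset_if_nan(double x_hat[], const double x_safe[], size_t n)
        {
            for (size_t i = 0; i < n; i++) {
                if (isnan(x_hat[i])) {                        /* latched? */
                    memcpy(x_hat, x_safe, n * sizeof *x_hat); /* re-seed */
                    return true;
                }
            }
            return false;
        }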

  • Gigabit ethernet card in a real-time PXI chassis

    I was reading document #DZ52103_US about general real-time information. This document states that plug-in Ethernet, serial, and GPIB cards are not supported for real-time I/O operations. I want to use a plug-in PXI gigabit Ethernet card to transfer data from my real-time PXI chassis to a host laptop computer. Can this be done?

    Currently, the only PXI ethernet controllers supported in LabVIEW Real-Time are the PXI-8211 and the PXI-8212. These are not Gigabit cards. This information is documented in the following link and will be updated as ethernet device support in LabVIEW Real-Time changes.
    If you have any further questions, please post a reply.
    Regards,
    Kristi H
    Applications Engineer
    National Instruments

  • Problems with my Mobo - Hangs at "Testing Real Time Clock" - Whats up?

    Specs:
    MSI K7 Master - MS-6341
    AthlonXP 1800+
    512 PC2100 DDR
    40 gig Maxtor or Seagate 7200rpm, 2mb cache - tried both
    Pine 56x CD Rom
    eVGA GeforceFX 5200
    Generic 420watt PS AND Antec TruePower 430 watt PS - same results
    Alright, I can access the BIOS, etc., but after the first startup screen (with RAM, processor type, IDE config, etc.) the screen just shows a blinking underscore at the top left, and the error system says "Testing Real Time Clock". I've basically narrowed it down to the IDE connectors, since if I unplug the hard drives it gets past that section and asks for a boot drive. Does anyone have similar problems? I know the board is old, but it's for the family, I'm stuck right now, and I've got some anxious family members wanting to know what's up with the system. Any help will be appreciated.
    It should be noted that I was able to install and run Windows for a couple of days. Then I got stability problems, which came from the BIOS overclocking my RAM past spec; I fixed that and Windows was fine, but now I can't boot back up. It can't be a virus or a bad hard drive, because I've tried more than one. Some people who have this problem can sometimes boot into their OS, but that's usually not the case. Thanks again for the suggestions.
    Things Ive Tried:
    Replaced the CMOS battery
    Cleared CMOS
    Change IDE channels
    Tried just the hard drive
    Removed jumper from hard drive
    Cleared CMOS
    Reseated the ram
    Reseated Video card
    Cleared CMOS
    If someone can find a working solution for this, I'll give them something in return, like some DVDs or cash. I really need some help.
    - Adam/vicks

    No problems with the mobo physically; I got it new (from an RMA, still sealed) from a trusted guy on some forums I go to (AnandTech).
    I have it running again. It doesn't like that first IDE channel for some reason. I've restarted/rebooted a few times and nothing has happened, so hopefully it's fixed, but it's still not great; I'm almost 100% sure it will happen again.

  • Error -200072 using analog input with 3 PXI 6120 cards on realtime mx system

    I have just upgraded to the mx drivers for the 6120 S series boards.
    I am trying to sample 12 analog inputs at once with a pretrigger. (4 channels per board)
    The error message -200072 comes up.
    One board works fine; when I add the second board's channels, the error occurs.
    Each board shows up as A,B,C respectively in MAX and in the Labview browse menu for selecting channels.
    Greg Morningstar
    Takata

    Probably the best way to do this would simply be to use the Route Signal VI so that each of your boards looks at a particular line for the trigger. You can do the same thing for the clock so that they are all sampling at the same time.
    You will also want to make sure that your device is defined in MAX. Once you do that, everything should be pretty easy to implement. You might also want to look at some of the examples that show how to do RTSI. It's almost the same as you would do for a PXI system.
    Otis
    Training and Certification
    Product Support Engineer
    National Instruments
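    For anyone doing this with the DAQmx driver mentioned in the question, the sketch below shows one common way to keep several boards in lockstep: designate one board as the timing master and have the others take their sample clock and reference (pretrigger) trigger from the master's exported terminals over the PXI trigger backplane. The slot names, channel lists, 500 kS/s rate, and analog trigger condition are all placeholders.

        /* Sketch of sharing timing across several S-series boards with the
         * NI-DAQmx C API. Slot names and channel lists are assumptions. */
        #include <NIDAQmx.h>

        static void make_task(TaskHandle *t, const char *chans,
                              const char *clkSrc, const char *refSrc,
                              uInt32 pretrig)
        {
            DAQmxCreateTask("", t);
            DAQmxCreateAIVoltageChan(*t, chans, "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            /* clkSrc == "" means "use the onboard clock" (the master board). */
            DAQmxCfgSampClkTiming(*t, clkSrc, 500000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_FiniteSamps, 10000);
            if (refSrc[0] != '\0')   /* slaves: pretrigger from the master */
                DAQmxCfgDigEdgeRefTrig(*t, refSrc, DAQmx_Val_Rising, pretrig);
        }

        int main(void)
        {
            TaskHandle master, slaveB, slaveC;

            make_task(&master, "PXI1Slot2/ai0:3", "", "", 0);
            /* Master's own pretrigger condition, e.g. an analog level on ai0. */
            DAQmxCfgAnlgEdgeRefTrig(master, "PXI1Slot2/ai0",
                                    DAQmx_Val_RisingSlope, 2.5, 1000);

            make_task(&slaveB, "PXI1Slot3/ai0:3", "/PXI1Slot2/ai/SampleClock",
                      "/PXI1Slot2/ai/ReferenceTrigger", 1000);
            make_task(&slaveC, "PXI1Slot4/ai0:3", "/PXI1Slot2/ai/SampleClock",
                      "/PXI1Slot2/ai/ReferenceTrigger", 1000);

            /* Start the slaves first so they are waiting on the master. */
            DAQmxStartTask(slaveB);
            DAQmxStartTask(slaveC);
            DAQmxStartTask(master);
            /* ... DAQmxReadAnalogF64 on each task, then stop/clear ... */
            return 0;
        }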

  • Is there any PXI oscilloscope card with a sampling rate of 1 GS/s or higher?

    Hi all
    In my test lab I am using a TDS684B oscilloscope. Is there any PXI card that can perform the same function as the TDS684B? I searched the NI catalogue and they don't have a 1 GS/s sampling-rate scope. Is there any other vendor for PXI that might have one? Please give me suggestions.
    Many thanks.
    Saw

    I would say that Acqiris is worth a look. They have very good software support for their hardware. One can actually simulate hardware using their software while debugging at home or consulting. I have also used Gage, and that product was pretty pathetic in terms of software support and hardware flexibility (that was 4 years ago, so perhaps that has changed).
    Reinis Kanders

  • Can RF signals be used with the NI PXI-7831R card?

    What is the frequency range of signals that can be analysed with the FPGA 7831R card?

    The analog input can acquire signals at 200 kHz. The digital inputs can acquire signals in one clock cycle of the FPGA, so the actual acquisition speed for digital depends on your specific application code, but can be as fast as 40 MHz. Processing of data also depends on the amount of code in your processing routine.
    As an alternative to the onboard analog inputs, it is possible to use a faster external A/D converter and use the high speed digital interface to read the data from the external ADC.
    Christian L
    Christian Loew, CLA
    Principal Systems Engineer, National Instruments
    Please tip your answer providers with kudos.
    Any attached Code is provided As Is. It has not been tested or validated as a product, for use in a deployed application or system,
    or for use in hazardous environments. You assume all risks for use of the Code and use of the Code is subject
    to the Sample Code License Terms which can be found at: http://ni.com/samplecodelicense

  • MAX 3.1.1 hangs when I try to run a Test Panel on a PXI-6533 card

    It appears that I need to update my NI-DAQ driver (currently Rev 6.8), but I need to know which version to use to be compatible with LabVIEW 6.0 and MAX 3.1.1.
    Thanks!

    Duplicate thread.  See NI_DAQ rev for LabView 6.0.

  • Is it possible to boot a 7300 with a PCI USB card with OS 8.5 on a flash drive? Thanks

    Just wondering if it's possible to run a PowerMac 7300/180 with the operating system installed on a flash drive which fits into a USB PCI card? The old hard drive finally went bad and was just wondering if there was any form of flash storage alternative to purchasing a new SCSI hard drive. I have an original OS 8.5 installation CD, would that install onto a flash drive if the computer is able to recognize it? Thanks!

    Unfortunately, USB support in the older "PCI" Power Macs isn't firmware-based, so it won't be found in the 7300's ROM code.  If available, it's through the addition of a USB PCI card and is driver-based.  You can't boot from a device, if the drivers needed for recognition of the bus (on which that device is connected) are loaded after the boot sequence has been initiated.  USB flash drives are very useful for extra data storage in those old Macs, but you really need to be running OS 8.6 and download/install the "USB Adapter Card Support 1.4.1" for recognition of the PCI card itself.  If running OS 8.6, I've found that the optimal database of supported USB peripherals is contained in the USB Support files, that are included in the OS 9.1 Update.  The files need to be extracted with "TomeViewer" and placed in the OS 8.6 Extensions Folder.  Again, this will only help with the recognition of a USB device when the computer is up and running.  It cannot make a connected USB flash drive capable of booting the computer.
