NI XNET

Hi,
I am pretty new to both CAN and LabVIEW. I would appreciate it if you could help me get around this problem:
I have an NI PCI-8512/2 CAN card in my PC, from which I am trying to send signals to a laptop with a Vector CANcardXL interface.
CANalyzer is running on the laptop side.
When I "Transmit Single Frame on CAN" (in a cyclic manner) from the NI-XNET Bus Monitor (with termination checked), I can see the correct frames received in the CANalyzer trace window and no error frames. But when I send the same frame with a LabVIEW VI (the very simple VI I attached), CANalyzer receives error frames instead. Note that the wait time in the while loop is quite long, and the CAN Comm State begins as Error Active and changes to Error Passive after a short time (less than 2 seconds). The last error is then Stuff, and the receive error counter is at 128.
But what does the NI-XNET Bus Monitor (launched from MAX) do that my VI does not?
Attachments:
Wakeup4.vi 18 KB

Well, the problem is solved!
The default value for termination on high-speed CAN is Off. I changed it to On and it worked.
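For anyone hitting the same thing from code rather than the Bus Monitor: termination can also be set programmatically on the session before it starts communicating. A minimal sketch against the NI-XNET C API, assuming the property constant nxPropSession_IntfCanTerm and value nxCANTerm_On from nixnet.h (the LabVIEW equivalent is the XNET Session property Interface:CAN:Termination; database, cluster, frame and interface names below are placeholders):

    // Sketch: enable on-board termination before communication starts.
    nxSessionRef_t l_Session;
    u32 l_Term = nxCANTerm_On;   // default is nxCANTerm_Off on high-speed CAN

    nxCheckErr(nxCreateSession("MyDatabase", "Cluster1", "Frame1",
                               "CAN1", nxMode_FrameOutSinglePoint, &l_Session));
    // Termination must be set before the session starts communicating.
    nxCheckErr(nxSetProperty(l_Session, nxPropSession_IntfCanTerm,
                             sizeof(l_Term), &l_Term));
    nxCheckErr(nxStart(l_Session, nxStartStop_Normal));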

Similar Messages

  • XNET problems with LabVIEW 8.6.1

    The existing LabVIEW code on the client's systems is 8.6.1, so I have to work with this.
    Using a CAN 2 PCMCIA card for the original program development, the API Channels approach was no problem, and the client ECU could be read without fail over long periods (24 hours plus).
    A dual-channel PCI card had to be used for client deployment, which required installing XNET to support this card. The version used was 1.5, because after this XNET no longer supports LabVIEW 8.6.1.
    CAN drivers loaded: version 2.7.3 (the last with support for 8.6.1).
    When the originally developed program was run with the PCI card and XNET, problems occurred, with the CAN link failing after about 10 minutes. The error message, summarized, was insufficient sampling time and/or too much data specified. The sample rate is 1 ms (the client ECU has new data available every 20 ms, so 1 ms should be well within limits?).
    From what I gather, the PCI card buffers are not read fast enough, fill up to their limit, then just crash.
    So after lots of messing around and reading the forum, it seems that Frames are the way forward for the versions of drivers/software I am using.
    OK, that's the background, so what do I want to know?
    1) Using MAX, I have created a .ncd file for Channels. Is it possible to use this for Frames? I found the XNET Database Editor and see you can develop clusters?
    2) Using the examples, I tried the XNET Read VI, and although it seems to allow me to select a frame, it does not recognise my installed PCMCIA CAN interface. Why? Looking in MAX, the card is there and tests fine.
    3) I tried a CAN example, and that quite happily found my CAN PCMCIA card and also obtained data from the client's ECU. Why?
    Summary
    Without going through the pain of what LabVIEW/drivers no longer support etc., I just want to know what I need to concentrate on.
    I think one conclusion is that I need to use Frames; Channels cannot cope?
    The XNET VIs are limited with legacy CAN and perhaps LabVIEW 8.6.1, so I should not use those VIs?
    Stick with the legacy CAN VIs using Frames and 'chunk' the frames using standard LV code?
    I'd appreciate comments on the best route to go. In fairness, the XNET stuff looks good, but I get the feeling it's not applicable to the version of LabVIEW I am using?
    Jack
    Labview Version 8.5
    Labview Version 8.6
    Labview Version 2013

    Hi Jack,
    I hope you are well. In response to your questions, please see below (if you are still having issues).
    1)      The .ncd file is used for configuration information about CAN messages such as frames and channels.
    Further information can be found here:
    http://digital.ni.com/public.nsf/allkb/FCC9622168F856B486256CA2005C981B
    You can use the NI-XNET Database Editor if you need to store information about the frames and signals on the network. It is possible to use a CAN network without using the Database Editor, but using one is highly recommended.
    2 & 3) To use an XNET example, you have to be using an XNET-supported card (listed in the XNET readme file). The PCMCIA CAN interface sits at the NI-CAN driver level, so it relies on the NI-CAN driver rather than XNET.
    Further information can be found here:
    http://www.ni.com/white-paper/9727/en/
    http://www.ni.com/white-paper/2732/en/#toc9
    I look forward to hearing from you.
    Kind Regards,
    Aidan H
    Applications Engineer
    National Instruments UK & Ireland
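    A follow-up note for anyone going the legacy Frame API route that Jack settled on: a frame read in LabWindows/CVI terms looks roughly like the sketch below. This is a sketch only; the attribute list, baud rate and queue lengths are illustrative, so check nican.h for your driver version. The same calls exist as Frame API VIs in LabVIEW 8.6.1.

        NCTYPE_ATTRID     l_AttrIds[]  = {NC_ATTR_BAUD_RATE, NC_ATTR_START_ON_OPEN,
                                          NC_ATTR_READ_Q_LEN, NC_ATTR_WRITE_Q_LEN};
        NCTYPE_UINT32     l_AttrVals[] = {500000, NC_TRUE, 100, 10};
        NCTYPE_OBJH       l_NetHandle;
        NCTYPE_STATE      l_State;
        NCTYPE_CAN_STRUCT l_Frame;

        // Configure and open the network interface object, started on open.
        ncConfig("CAN0", 4, l_AttrIds, l_AttrVals);
        ncOpenObject("CAN0", &l_NetHandle);
        // Block until at least one received frame is queued, then read it.
        ncWaitForState(l_NetHandle, NC_ST_READ_AVAIL, 10000 /* ms */, &l_State);
        ncRead(l_NetHandle, sizeof(l_Frame), &l_Frame);
        // l_Frame.ArbitrationId, l_Frame.DataLength, l_Frame.Data[] hold the message.
        ncCloseObject(l_NetHandle);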

  • Why can't I read the LIN slave response in XNET

    Hello,
    We have developed a small LIN application that starts an XNET schedule and sends master frames and headers.
    I am trying to read all the frames on the bus, and unfortunately I only see the master frames. The slave-response headers sent by the master are not filled in by the slave.
    We are using the single-point frame approach.
    Also, I have observed that when I disconnect the slave from the network, the slave headers are sent very fast (as they should be). But when I
     connect the slave, the headers are sent more rarely and the payload is not completed by the slave.
    The slave is working OK, because when we send frames manually without the schedule, the slave always responds correctly.
    What could the problem be?

    It's hard to say, can you post your source?  I've had success with LIN and XNet with signals, but haven't had a need to try frames yet.  My setup is similar to what you described.  I create two sessions for Signals Out Single-Point, and Signals In XY.  Then I have a write LIN Schedule on the Signal Out session selecting a schedule from the database.  Then I write Signal Single-Point on the Signal Output.  Then periodically I read Signal XY using the Signal Input session.  After doing this my read then returns all my signals once every 200ms or so, which is based on the schedule selected.  I've never done frame on LIN but I assume it is similar.
    Unofficial Forum Rules and Guidelines - Hooovahh - LabVIEW Overlord
    If 10 out of 10 experts in any field say something is bad, you should probably take their opinion seriously.
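    For the C API users, the schedule request Hooovahh describes maps onto a state write. A sketch, assuming nxState_LINScheduleChange accepts the schedule's index in the database as a u32 (database, cluster, signal and interface names below are placeholders):

        // Sketch: master session that runs a LIN schedule and writes signals.
        nxSessionRef_t l_OutSession;
        u32 l_Schedule = 0;            // index of the schedule in the database
        f64 l_Values[2] = {1.0, 0.0};

        nxCheckErr(nxCreateSession("MyLinDb", "LinCluster", "Sig1,Sig2",
                                   "LIN1", nxMode_SignalOutSinglePoint, &l_OutSession));
        // Request a schedule; the master then starts sending its headers.
        nxCheckErr(nxWriteState(l_OutSession, nxState_LINScheduleChange,
                                sizeof(l_Schedule), &l_Schedule));
        nxCheckErr(nxWriteSignalSinglePoint(l_OutSession, l_Values, sizeof(l_Values)));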

  • No signals appear in Signals tab of NI-XNET Bus Monitor

    Hi, I am trying to use a NI 9862 USB to read a J1939 PGN.  I will choose PGN 65265 (CCVS1), SPN 84 (Wheel Based Vehicle Speed) as an example.  In the Monitor tab of NI-XNET Bus Monitor, I can see that the Frame counter is increasing at the expected rate for that ID, and the Frame Name column is filled in with my custom text.  A couple weeks ago, I was able to click on the Signals tab and see the payload for the Signal I had defined listed converted to engineering units.  Now, nothing is appearing in the Signals tab.  I have used Add/Remove Signals to select the signal of interest to list in the "Signals in View".  Any ideas?  I did recently swap out a bad NI 9862 for a new one, but I don't see how that would be related to this issue.
    Solved!
    Go to Solution.

    Hello barkeram,
    In the Add/Remove Signals dialog, when you set Names to Search to Signals, are you getting results?
    Is the frame view still working fine? Did you change any settings such as Interface or Database?
    Regards
    Frank R.

  • Possible to allow user to select XNET DB?

    I'm writing an application to create a test sequence incorporating user-selected CAN commands. I want to allow the user to select the XNET DB file while the application is running, but this doesn't seem to be possible, at least not as far as I can tell. I have already successfully written all the parsing code I need, now I just have to figure out how to let the user choose the DB file.
    Any ideas? I'm looking into the Generic 'Create Session' approach right now, but I don't know if that's going to work.
    Thanks in advance!

    Under the XNet palette there is a palette for database control.  Generally what I do is delete the existing database on startup, clear the error (in case no database with that name exists), then add that database back with the path to the file that the user provides.  This way the user just replaces the DBC file (or whatever) and the application uses the new one.  There are all kinds of other things you can do with the database editing palette too.
    Unofficial Forum Rules and Guidelines - Hooovahh - LabVIEW Overlord
    If 10 out of 10 experts in any field say something is bad, you should probably take their opinion seriously.
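    In the C API the same pattern is just two calls; a sketch, where the alias name and file path are placeholders, and the error from the remove is ignorable when the alias does not exist yet:

        // Sketch: re-point the alias "XNet Database" at the file the user picked.
        nxdbRemoveAlias("XNet Database");   // may fail harmlessly on first run
        nxCheckErr(nxdbAddAlias("XNet Database", "C:\\Data\\UserSelected.dbc",
                                125000 /* default baud, only used if the file lacks one */));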

  • Error -1074384569; NI-XNET: (Hex 0xBFF63147) The database information on the real-time system has been created with an older NI-XNET version. This version is no longer supported. To correct this error, re-deploy your database to the real-time system.

    Hello
    I have a VeriStand project (VSP) created on my laptop host (LTH), which works with my PXI when
    deploying it from the LTH. I then installed the whole NI environment for PXI and VeriStand use on an
    industrial PC (iPC). I tried to deploy my VSP from the iPC to the PXI, but the following error
    message arose on the iPC:
    The VeriStand Gateway encountered an error while deploying the System Definition file.
    Details: Error -1074384569 occurred at Project Window.lvlib:Project Window.vi >> Project
    Window.lvlib:Command Loop.vi >> NI_VS Workspace ExecutionAPI.lvlib:NI VeriStand - Connect to System.vi
    Possible reason(s):
    NI-XNET:  (Hex 0xBFF63147) The database information on the real-time system has been created with an
    older NI-XNET version. This version is no longer supported. To correct this error, re-deploy your
    database to the real-time system. ========================= NI VeriStand:  NI VeriStand
    Engine.lvlib:VeriStand Engine Wrapper (RT).vi >> NI VeriStand Engine.lvlib:VeriStand Engine.vi >> NI
    VeriStand Engine.lvlib:VeriStand Engine State Machine.vi >> NI VeriStand Engine.lvlib:Initialize
    Inline Custom Devices.vi >> Custom Devices Storage.lvlib:Initialize Device (HW Interface).vi
    * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * • Unloading System
    Definition file... • Connection with target Controller has been lost.
    The software versions of the NI products (MAX > My System > Software) on my LTH and the iPC are
    almost the same. The only differences are:
    1. LabVIEW Run-Time 2009 SP1 (64-bit) is installed on the LTH but missing on the iPC. The iPC has a 32-bit system.
    2. LabVIEW Run-Time 2012 f3 is installed on the LTH but missing on the iPC.
    3. NI-DAQmx ADE Support 9.3.5: something strange on the LTH, because normally I use NI-DAQmx 9.5.5, and all other DAQmx products on my LTH are 9.5.5, i.e. NI-DAQmx Device Driver 9.5.5 and NI-DAQmx Configuration 9.5.5. On the iPC side all three products are 9.5.5: NI-DAQmx ADE Support 9.5.5, NI-DAQmx Device Driver 9.5.5 and NI-DAQmx Configuration 9.5.5.
    4. Traditional NI-DAQ 7.4.4: the iPC has this software installed; on the LTH it is missing.
    In order to fix this problem I have formatted my PXI and I have installed the following SW from the iPC:
    1. LabVIEW Real-Time 11.0.1
    2. NI-488.2 RT 3.0.0
    3. NI_CAN 2.7.3
    Unfortunately, the above problem still arose.
    What can I do to fix this problem?
    I found a hint at http://www.labviewforum.de/Thread-XNET-CAN-die-ersten-Gehversuche.
    There it is suggested to deploy the dbc file again.
    If this is a good hint, how do I deploy a dbc file?
    I would be very pleased if somebody could help me! :-)
    Best regards
    Lukas Nowak

    Hi Lukas,
    I think the problem is caused by different drivers for the CAN communication.
    NI provides two drivers for CAN: NI-CAN and NI-XNET.
    NI-CAN is the outdated driver, which is no longer used by new hardware. NI replaced the NI-CAN driver with NI-XNET some years ago, which supports the CAN, LIN and FlexRay communication protocols.
    You wrote:
    In order to fix this problem I have formatted my PXI and I have installed the following SW from the iPC:
    3. NI_CAN 2.7.3
    NI-CAN is the outdated driver. I think you should try installing NI-XNET instead of NI-CAN on your PXI system to get rid of the error message.
    Regards, Stephan
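    Regarding "how do I deploy a dbc file": you add it as an alias and deploy that alias to the RT target (in MAX this happens through the NI-XNET database management dialog). The C API also exposes an explicit deploy call; the sketch below assumes nxdbDeploy has this signature in your version of nixnet.h, and the IP address, alias and path are placeholders, so treat it as a starting point only:

        // Sketch: deploy a database alias to the PXI target and wait for completion.
        u32 l_Percent = 0;
        nxCheckErr(nxdbAddAlias("EngineBus", "C:\\Databases\\EngineBus.dbc", 500000));
        nxCheckErr(nxdbDeploy("10.0.0.10" /* PXI IP address */, "EngineBus",
                              1 /* wait for complete */, &l_Percent));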

  • How to extract Attribute Value from a DBC file with LabWindows and NI-XNET library

    Hi all,
    For my application, I would like to feed my LabWindows/CVI test program with data extracted from a *.dbc file (created by another team under Vector CANdb++).
    These files contain all the CAN frame definitions,
    and also some extra information added at:
    Message level,
    Signal level,
    Network level.
    This extra information is set using specific ATTRIBUTE DEFINITIONS - FUNCTIONALITY under Vector CANdb++.
    Opening the database works in the NI-XNET Database Editor as well as in LabWindows using: nxdbOpenDatabase ( ... )
    No attribute seems to be displayable in the NI-XNET Database Editor (that is not a problem for me).
    Now, how can I extract these custom attributes using the NI-XNET API and CVI?
    Thanks in advance.
    PS : In attached picture, a new attribute called Test_NI, connected to a message
    Attachments:
    EX1.jpg 36 KB

    Hi Damien, 
    To answer your question on whether the XNET API in LabWindows/CVI allows you to gain access to the custom attributes in a DBC file: this is not a supported feature. The DBC format is proprietary to Vector. Also, custom attributes are different for all customers and manufacturers. Those two together make it really difficult for NI to access them with an API that would be standard and reliable.
    We do support common custom attributes for cyclic frames. This is from page 4-278 of the XNET Hardware and Software Manual:
    "If you are using a CANdb (.dbc) database, this property is an optional attribute in the file. If NI-XNET finds an attribute named GenMsgSendType, that attribute is the default value of this property. If the GenMsgSendType attribute begins with cyclic, this property's default value is Cyclic Data; otherwise, it is Event Data. If the CANdb file does not use the GenMsgSendType attribute, this property uses a default value of Event Data, which you can change in your application."
    Link to the manual: http://digital.ni.com/manuals.nsf/websearch/32FCF9A42CFD324E8625760E00625940
    Could you explain the purpose of this attribute and what you need it for in your application?
    Thanks,
    Christophe S.
    FSE East of France І Certified LabVIEW Associate Developer І National Instruments France

  • Convert binary data to decimal in CAN XNET

    I am using the NI-XNET Read function to read CAN frames from port CAN1. The data output comes in a cluster in binary format. Is there any function I can use to convert the binary to decimal? I would prefer not to write another sub-VI to do the conversion, but to use existing functions that can use my .dbc file and parse all frames in decimal format.
    Thanks.

    Yes, Doug, you are correct. We have a PCB, and we have connected two inductive sensors to the PCB which measure distance and transpose it to voltage. The PCB has software which transposes this voltage through an ADC, and we read the sensors over the CAN bus. To communicate with the PCB we are using a USB-to-CAN device from IXXAT; we have built the LabVIEW drivers, and our problem, as you mention, is that we want to transpose this data back to voltage. How can we do this? I tried several times to upload photos, or even the program we have created, but for some reason the site is not letting me do that. If you have an email, I can send you the program we have created so you can give us your opinion on this problem; we can also send you the LabVIEW drivers and the datasheet of the sensor we are using.
    Attachments:
    Read from Can.vi 37 KB
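    Returning to the original question: the XNET way to get scaled decimal values without writing your own parser is a signal conversion session, which uses the .dbc to turn raw frames into signal values (in LabVIEW, the XNET Convert VIs). A C-API sketch, assuming the frames were already read from a stream session into l_FrameBuffer/l_NumBytes, and with placeholder database and signal names:

        // Sketch: convert raw CAN frames to scaled f64 signal values via the .dbc.
        nxSessionRef_t l_Conv;
        f64 l_Values[2];
        // The interface is unused for conversion sessions, hence the empty string.
        nxCheckErr(nxCreateSession("MyDb", "Cluster1", "EngineSpeed,VehicleSpeed",
                                   "", nxMode_SignalConversionSinglePoint, &l_Conv));
        // l_FrameBuffer / l_NumBytes come from an earlier nxReadFrame call.
        nxCheckErr(nxConvertFramesToSignalsSinglePoint(l_Conv, l_FrameBuffer,
                                                       l_NumBytes, l_Values,
                                                       sizeof(l_Values)));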

  • Trouble with Simple XNET CAN Example...

    I am trying to get a simple XNET CAN frame write going. Here is the VI I created...
    I keep getting this:
    It looks like I have a frame set up, so what else could be the issue?

    Hey,
    I am "trying" to create a session and send data on the fly. I changed my vi back to the simple one I had earlier in the day:
    Attachments:
    XNET_CAN_Write.vi 12 KB
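    For comparison, a known-good minimal frame write in the C API looks like the sketch below. The database/frame names and identifier are placeholders, and the buffer arithmetic assumes the standard raw-frame layout of nxFrameVar_t with its variable-length Payload, so double-check against your nixnet.h:

        // Sketch: single-point CAN frame write, the C-API equivalent of the VI.
        nxSessionRef_t l_Session;
        u8 l_Buffer[sizeof(nxFrameVar_t) + 8];
        nxFrameVar_t* l_pFrame = (nxFrameVar_t*)l_Buffer;

        nxCheckErr(nxCreateSession("MyDb", "Cluster1", "MyFrame",
                                   "CAN1", nxMode_FrameOutSinglePoint, &l_Session));
        l_pFrame->Timestamp     = 0;                     // ignored on transmit
        l_pFrame->Identifier    = 0x123;                 // must match the database frame
        l_pFrame->Type          = nxFrameType_CAN_Data;
        l_pFrame->Flags         = 0;
        l_pFrame->Info          = 0;
        l_pFrame->PayloadLength = 8;
        memset(l_pFrame->Payload, 0xAA, 8);
        nxCheckErr(nxWriteFrame(l_Session, l_Buffer,
                                sizeof(nxFrameVar_t) - 1 + 8, 10.0 /* s */));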

  • XNET CAN filtering based on payload information of different frames

    Dear all,
    The CAN frames I receive per cycle have the following identifiers:
    [1 frame] 513 (Radar status: range, elevation, target type)
    [1 frame] 1536 (Target status: number of near-field targets from byte 1 and number of far-field targets from byte 2)
    Followed by 96 message pairs, of which pairs 1-32 are for the near field and 33-96 are for the far field:
    1793 (Target Info 1)
    1794 (Target Info 2)
    For the filter, I need to identify the 1536 frame first and read byte 1 and byte 2, which give me the number of near and far targets respectively.
    Then I need to pick that many (equal to byte 1) message pairs from the first 32 pairs and retrieve signals from them; and then pick that many (equal to byte 2) message pairs from the next 64 message pairs and retrieve signals.
    This way, only the targets that are valuable get plotted. I am also looking to create more efficient projects that can be scaled to multiple radars. Any tips for efficient documentation would be much appreciated.
    Kind Regards,
    Red
    Attachments:
    MainRadar.vi 129 KB
    radar-.dbc files.zip 5 KB

    Hello Red,
    I saw your code. You need to make some modifications.
    1. You are using the Stop VI inside the For Loop. When it is true, it stops your main VI completely; it is like the Abort button. Instead of that, right-click on the For Loop and enable the conditional terminal.
    2. You are passing two auto-indexed arrays into the For Loop (CAN Frames and Search ID Array). Wired like this, the loop always executes based on the smaller array size. So if you have 100 frames, it will still run only 7 times (because your search array has 7 elements).
    3. Why are you taking the index from the For Loop and then using another loop with shift registers? You can take the last value directly from your first For Loop: right-click on the tunnel and select Last Element.
    4. In your VI, you are always clearing your XNET reference. If a CAN frame is not present in the next iteration, it won't keep the previous value; it will give you the default value.
    Munna

  • How to get Timestamp from NI-XNET hardware?

    Hi all,
    How can I get the timestamp directly from the hardware (NI-XNET), in the format used by the 'Raw Frame Format'?
    Thanks in advance,
    Aviad

    OK! I got it: nxReadState.
    Sorry, it's so clear from the documentation...
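    For anyone else landing here, the call in question is a state read; a short sketch, where nxState_TimeCurrent returns the same 100 ns timestamp used by the Raw Frame Format:

        // Sketch: read the interface's current hardware time from a running session.
        nxTimestamp_t l_Time  = 0;
        nxStatus_t    l_Fault = 0;
        nxCheckErr(nxReadState(l_Session, nxState_TimeCurrent,
                               sizeof(l_Time), &l_Time, &l_Fault));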

  • NI-XNET nxBlink(): what session type to use?

    What session should I pass to the nxBlink() NI-XNET function?
    I tried to pass sessions obtained from:
    - nxCreateSession(..., nxMode_FrameInStream, ...)
    - nxCreateSession(..., nxMode_FrameOutStream, ...)
    But I get an error. The nxBlink() help only says to use "The XNET Interface I/O name",
    but that doesn't tell me which of these session modes to use:
    nxMode_SignalInSinglePoint         0
    nxMode_SignalInWaveform            1
    nxMode_SignalInXY                  2
    nxMode_SignalOutSinglePoint        3
    nxMode_SignalOutWaveform           4
    nxMode_SignalOutXY                 5
    nxMode_FrameInStream               6
    nxMode_FrameInQueued               7
    nxMode_FrameInSinglePoint          8
    nxMode_FrameOutStream              9
    nxMode_FrameOutQueued              10
    nxMode_FrameOutSinglePoint         11
    nxMode_SignalConversionSinglePoint 12

    I already found out that a system session provides interface/port references, which can be used
    to blink a particular port's LED.
    Thank you, Radek
    // Open system session.
    nxCheckErr(nxSystemOpen(&l_SystemRef));
    // Query the size of the interface reference list first.
    nxCheckErr(nxGetPropertySize(l_SystemRef, nxPropSys_IntfRefs, &l_PropertySize));
    // Each interface is represented by a u32 (4 bytes).
    l_NumberOfInterfaces = l_PropertySize / 4;
    l_pInterfaceNames = (char**)malloc(sizeof(char*) * l_NumberOfInterfaces);
    l_InterfaceBuffer = (u32*)malloc(l_PropertySize);
    // This property returns the u32 value for each interface.
    // You can use this value to get individual interface properties.
    nxCheckErr(nxGetProperty(l_SystemRef, nxPropSys_IntfRefs, l_PropertySize, l_InterfaceBuffer));
    // Get the individual interface names.
    for (i = 0; i < l_NumberOfInterfaces; i++)
    {
        nxCheckErr(nxGetPropertySize((nxSessionRef_t)l_InterfaceBuffer[i], nxPropIntf_Name, &l_PropertySize));
        l_pInterfaceNames[i] = (char*)malloc(l_PropertySize);
        nxCheckErr(nxGetProperty((nxSessionRef_t)l_InterfaceBuffer[i], nxPropIntf_Name, l_PropertySize, l_pInterfaceNames[i]));
        // Identify the port that matches the resourceDescriptor.
        if (0 == strcmp(resourceDescriptor, l_pInterfaceNames[i]))
        {
            actualInterface = (nxSessionRef_t)l_InterfaceBuffer[i];
            nxBlink(actualInterface, 1); // Blink this port's LED for identification and
            Delay(1);                    // to indicate the input stream opened successfully.
            nxBlink(actualInterface, 0);
        }
    }

  • XNET Change Database Keep Alias

    I have a need to allow users to change the database file used by an XNet database alias at run time.  My intention was to start by deleting a database with a constant name, something like "XNet Database".  If this alias doesn't exist then the Remove Alias will return an error, which I just clear because I don't care.  Next I use the Add Alias and set the database name to "XNet Database" using the new file the user provides.
    Then if the user changes the database file to a new one, my software performs a Remove on "XNet Database" and then an Add using the new file.  The problem is after doing the remove and then add, all references to "XNet Database" will be for the first file, not the second one the user chose.  If I close the software and re-open then it will use the new alias.  So is there a way to force a close of a database in the software, so that I can use the same database name, but assign a new database file?
    I also tried generating random database names.  Something like "XNet Database %d" where %d can be a random number.  In my code I still remove all aliases, then add an alias using the new file the user selected.  The problem with this is after doing this 7 times I get the following error:
    Error -1074384592
    NI-XNET:  (Hex 0xBFF63130) Too many open files. NI-XNET allows up to 7 database files to be opened simultaneously. Solution: Open fewer files.
    So this confirms my suspicions that the database is still in memory, even after performing the remove alias.  How can I force a close of a database, so that I can allow my users to use a new database file?
    EDIT: okay so I missed the Close Database Polymorphic.  Doing a Close All before the remove seems to fix it.  If I don't close all and just try to close that one database it doesn't work.  In my situation I think I can live with just closing all.
    Unofficial Forum Rules and Guidelines - Hooovahh - LabVIEW Overlord
    If 10 out of 10 experts in any field say something is bad, you should probably take their opinion seriously.
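    In C-API terms, the close-all fix maps to the CloseAllRefs flag on the database close; a sketch, with the alias, ref and path as placeholders:

        // Sketch: force-close every open reference so the file is released,
        // then re-point the alias at the new file.
        nxdbCloseDatabase(l_DbRef, 1 /* CloseAllRefs */);
        nxdbRemoveAlias("XNet Database");
        nxCheckErr(nxdbAddAlias("XNet Database", l_NewFilePath, 125000));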


  • Passing values to XNet Controls (Bug?)

    I've run into something that may be a bug.
    I'm trying to pass some values from a custom Step Type to XNet controls on a VI. The Step Type has parameters added for each type of XNet control, so in the sequence, they are passed as "Step.Xnet_Interface", "Step.Xnet_Database" and "Step.Xnet_Signal". These are all of type "LabVIEWIOControl".
    When the first step runs, it seems to correctly pass all the values to all the controls. When the second step runs with different values, the XNet controls do not get updated with the new values. They instead stay at the value from the first step. 
    The VI also has a string control for each of 3 XNet control types that gets populated with the "DeviceName" portion of the IO Control from the Step Parameters. These do get updated correctly.
    The attachment contains a sequence, the VI, a Degub.ini (the Types configuration), and a couple of small XNet database files just to have some to work with.
    The first two steps in the sequence are run from the custom step type. The last two are just LabVIEW Actions running the same VI. The last two steps from the Action do correctly update all the controls while the first two from the custom step type only correctly update the String controls.
    Am I missing something simple here or is this not working correctly?
    Ed Dickens - Certified LabVIEW Architect - DISTek Integration, Inc. - NI Certified Alliance Partner
    Using the Abort button to stop your VI is like using a tree to stop your car. It works, but there may be consequences.
    Attachments:
    XNet Controls.zip 20 KB

    It's good to know it's not just me.
    For now I've worked around this issue by changing the XNet controls to Strings and casting them to the proper XNet controls type in the VI's.
    Seems to work so that's what I'm going with.
    Thanks
    Ed
    Ed Dickens - Certified LabVIEW Architect - DISTek Integration, Inc. - NI Certified Alliance Partner
    Using the Abort button to stop your VI is like using a tree to stop your car. It works, but there may be consequences.

  • XNET order of frames

    Hi,
    I have some problems with the XNET Write CAN block. I use a loop to read data from CAN via XNET, do some calculations, and later on send the results back to CAN via XNET. I would like to send data at the end of every loop iteration, which is why I chose Event as the trigger in the DBC. The data is sent to the write block as an array (Signal Input Single Point). Now when I check the order of frames on the CAN bus with additional tools (CANalyzer by Vector), the messages come out in random order. The message identifier seems to have no effect on the order.
    I tried the same setup with a cyclic trigger. In that case the order of frames corresponds to the frame ID. But this would lead to timing delays if CAN and my loop don't run synchronously (Windows...).
    Any suggestions? Is this behaviour normal? I thought that the frame ID would determine the order if Event is triggered for all frames at the same time...
    Thx Jan

    Hello!
    Could you please supply your .vi, or even better, break it down to a minimal example in which I can easily see what you are doing?
    Regards, RomBe

  • Xnet database from txt file

    I have a CAN database in .txt format.  I used Access and Excel to map it to an .xml file.  When I try to add it in the NI-XNET Database Editor, under Manage NI-XNET Databases > Add Alias, it gives me the error 0xBFF63081.  How can I take a .txt file database and convert it to an NI-XNET database?

    I think the common formats that XNET supports are CANdb and FIBEX files (amongst others, e.g. for FlexRay). A CANdb file is essentially just a text file, so maybe you can parse out the messages fairly easily that way. FIBEX files are the file format that LabVIEW creates when you edit/create an XNET database from its user interface.
    If you don't want to use either of those formats, then you need to parse the text file in LabVIEW and use it to manually create the XNET signals/database. You can create a :memory: database, populate it with a cluster and then populate that with signals using data from your text file.
    In the LabVIEW examples under Hardware Input and Output >> CAN >> NI-XNET >> Databases are some examples of manipulating an XNET database so that should point you in the right direction of which VIs you need to use.
    Certified LabVIEW Architect, Certified TestStand Developer
    NI Days (and A&DF): 2010, 2011, 2013, 2014
    NI Week: 2012, 2014
    Knowledgeable in all things Giant Tetris and WebSockets
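    A sketch of the :memory: route in the C database API, for anyone parsing the text file from LabWindows/CVI instead of LabVIEW. The property IDs are the ones I would expect from nixnet.h, so double-check them against your header; the frame/signal names and values are placeholders:

        // Sketch: build an in-memory database from parsed text data, then save it.
        nxDatabaseRef_t l_Db, l_Cluster, l_Frame, l_Signal;
        u32 l_Id = 0x100, l_Len = 8, l_Start = 0, l_Bits = 16;

        nxCheckErr(nxdbOpenDatabase(":memory:", &l_Db));
        nxCheckErr(nxdbCreateObject(l_Db,      nxClass_Cluster, "MyCluster", &l_Cluster));
        nxCheckErr(nxdbCreateObject(l_Cluster, nxClass_Frame,   "MyFrame",   &l_Frame));
        nxCheckErr(nxdbSetProperty(l_Frame,  nxPropFrm_ID,         sizeof(l_Id),    &l_Id));
        nxCheckErr(nxdbSetProperty(l_Frame,  nxPropFrm_PayloadLen, sizeof(l_Len),   &l_Len));
        nxCheckErr(nxdbCreateObject(l_Frame, nxClass_Signal, "MySignal", &l_Signal));
        nxCheckErr(nxdbSetProperty(l_Signal, nxPropSig_StartBit,   sizeof(l_Start), &l_Start));
        nxCheckErr(nxdbSetProperty(l_Signal, nxPropSig_NumBits,    sizeof(l_Bits),  &l_Bits));
        // A cluster baud rate may also be required before saving as FIBEX.
        nxCheckErr(nxdbSaveDatabase(l_Db, "C:\\Databases\\Converted.xml"));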
