Labview and Visual C++ 6.0

Howdy,
I need to send data from a Visual C++ application to Labview over a
tcp/ip network. How do I do it? Are there any examples to study out
there?
Thanks,
M Soderberg
Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

One word: DataSocket
Go to http://www.natinst.com/datasocket/ for more info. It's easy and painless.
Regards,
Carl Nybro
NI
Ed Hutchinson wrote:
> [email protected] wrote in message <7n9bhq$id0$[email protected]>...
> >Howdy,
> >
> >I need to send data from a Visual C++ application to Labview over a
> >tcp/ip network. How do I do it? Are there any examples to study out
> >there?
> >
> >Thanks,
> >
> >M Soderberg
> >
> >
> >Sent via Deja.com http://www.deja.com/
> >Share what you know. Learn what you don't.
>
> Buried deep in the examples on the MSDN Library disk 1 that comes with VC++
> 6.0 is code for simple TCP and UDP server/client apps. Not a straight
> example of comms between VC++ and LabVIEW but it will give you the basis for
> such. At least I hope so :-}
>
> If you're interested run MSDN and under the contents window go down this
> tree
>
> MSDN Library Visual Studio 6.0
> -Visual C++ Documentation
> -Samples
> -SDK Samples
> -Internet Samples
> -NetDS Samples
> -Winsock: Windows Sockets Samples
> -Simple
>
> Cheers,
> Ed Hutchinson
> GNS Ltd, New Zealand.
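
Not from the MSDN sample itself, but a minimal sketch of what the VC++ side can look like, assuming the LabVIEW end runs a VI built around TCP Listen and TCP Read; the address 192.168.0.10 and port 2055 are placeholders, not from the original posts. Link with wsock32.lib (or ws2_32.lib):

#include <winsock2.h>
#include <string.h>
#include <stdio.h>

int main()
{
    WSADATA wsa;
    if (WSAStartup(MAKEWORD(2, 0), &wsa) != 0) return 1;

    SOCKET s = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(2055);                      // port the LabVIEW VI listens on
    addr.sin_addr.s_addr = inet_addr("192.168.0.10"); // machine running LabVIEW

    if (connect(s, (sockaddr *)&addr, sizeof(addr)) == 0) {
        const char *msg = "3.1415\r\n";               // data for TCP Read to pick up
        send(s, msg, (int)strlen(msg), 0);
    }
    closesocket(s);
    WSACleanup();
    return 0;
}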

Similar Messages

  • Labview and Visual C++

    Hello,
How do I get LabVIEW (5.1) and Visual C++ to cooperate successfully?
Any ideas? All help is welcome, because I don't have a clue...
I'm a newbie at both languages, so it's really difficult.
    Thx,
    Bull's Eye

On the first link I see something like 'Calling a dll made by LabView 6i'.
The problem is that I'm working with LabView 5.1, so I can't really use that
one. A little question: can't LV 5.1 generate DLLs at all?
The second link doesn't help me either. There's one .cpp file, but I don't
know what to do with it, what kind of project to put it in, etc.
Anyway, thanks for the info...
Bull's
    "Dennis Knutson" schreef in bericht
    news:[email protected]..
    > If you want to control LabVIEW from a Visual C++ program, you can do
    > it by making a .dll from the VI. The application builder is required
    > for this. There are examples of external programs calling a LabVIEW
    > dll at
> http://zone.ni.com/devzone/devzone.nsf/webcategories/E2A99E7E10D5725D862567AC004F0A53?opendocument&node=DZ52048_US.
> Another way to control LabVIEW is through ActiveX. You can go to
> http://zone.ni.com/devzone/devzone.nsf/webcategories/46E7994B7483D781862567C300662667?opendocument&node=DZ52051_US
    > for an example. Good luck.
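
    A minimal sketch of the DLL route from the Visual C++ side. The DLL name
    MyVI.dll, the export MultiplyByTwo, and its double-in/double-out signature
    are made up for illustration; use the prototype from the header file the
    application builder generates.

    #include <windows.h>
    #include <stdio.h>

    // LabVIEW-built DLLs typically use the C calling convention; check the
    // generated header for the exact prototype and convention.
    typedef double (__cdecl *MultiplyFn)(double);

    int main()
    {
        HMODULE hDll = LoadLibraryA("MyVI.dll");   // hypothetical DLL name
        if (!hDll) { printf("could not load DLL\n"); return 1; }

        MultiplyFn fn = (MultiplyFn)GetProcAddress(hDll, "MultiplyByTwo");
        if (fn)
            printf("result = %f\n", fn(21.0));     // this call runs the VI

        FreeLibrary(hDll);
        return 0;
    }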

  • I am trying to integrate a Simulink model (.mdl) file with the SIT of Labview for RCP and HIL purposes. I am using Labview 8.6, Simulink 6.6 with RTW 6.6 and RTW Embedded Coder 4.6, Visual C Express 2008 and Visual C++ Express 2008.

    I am trying to integrate a Simulink model (.mdl) file with the SIT of Labview for RCP and HIL purposes. I am using Labview 8.6, Simulink 6.6 with RTW 6.6 and RTW Embedded Coder 4.6, Visual C Express 2008 and Visual C++ Express 2008. I have selected the system target file as nidll.tlc, the make command as make_rtw, and the template nidll_vs.tmf. When I try to generate the .dll file I get the following error.
    Attachments:
    SITProblem.JPG 101 KB

    Hi,
    No, I could not solve the issue. Presently we are using a MicroAutoBox (from dSPACE) for doing the RCP.
    Himadri 

  • LabView Vs HP VEE and Visual C++

    I am looking for some information on the differences between LabView,
    HP VEE, and Visual C++. I am especially interested in Object
    Oriented Programming and how that technique compares to what is
    written in HP VEE and LabView.
    Sent via Deja.com http://www.deja.com/
    Before you buy.

    In article <[email protected]>,
    [email protected] wrote:
    > > I am looking for some information on the differences between LabView,
    > > HP VEE, and Visual C++. I am especially interested in Object
    > > Oriented Programming and how that technique compares to what is
    > > written in HP VEE and LabView.
    > >
    >
    > Keep in mind who I work for and that my impressions aren't totally
    > unbiased -- they are my opinion.
    >
    > LV is a compiled dataflow language that is quite powerful in what can be
    > written in it because of the huge number of libraries available for it
    > and the modularity. The language itself still has quite a simple
    > syntax, and it is good for rapid prototyping. The language shares a few
    > features with object oriented languages, but it is not an object
    > oriented language.
    >
    > HP-VEE is an interpreted language that is part dataflow with lots of
    > control flow melded in. It also has libraries that target it towards
    > engineering apps, but it hasn't been around as long, and its libraries
    > and language are not nearly as powerful. It also has a few features
    > that are like those in object oriented languages, but it isn't object
    > oriented either.
    >
    > C++ is very flexible, very complicated, and quite object oriented,
    > although it is by no means the holy grail of OOP (object oriented
    > programming). Like C, it is meant to be structured, but flexible enough
    > to write things like operating systems, complete with embedded assembly,
    > and the like.
    >
    > I know two of these quite well, and I'm familiar with VEE and quite a
    > few similar tools. As for OOP, object oriented design is something that
    > you can do on any project, and then implement it in anything you want.
    > There are papers at the last couple NIWeeks by Stepan Riha about GOOP,
    > or graphical OOP, with LV. They explain how to construct objects out of
    > VI sets so that your application is better isolated from changes to the
    > different objects. I highly recommend looking at his NIWeek99 paper.
    >
    > If you want to more easily learn OOP, you may want to look at Java. If
    > you have lots of time and must have the most powerful tool ever written,
    > then be sure to stock up on the C++ books and bury yourself in a hole
    > for a while so that it has time to soak in. If you have a job to do, I'd
    > say you should look at Stepan's paper and try to apply some of those
    > techniques to your next project and use whatever tool you are
    > proficient in.
    > Greg McKaskle
    >
    Thanks for the info, Greg. I went looking for Stepan Riha's paper
    on OOP in LabVIEW from NIWeek 99, but I could not find any of
    the papers on the National Instruments website. Will they be posted
    later, or am I not looking in the right place?
    Sent via Deja.com http://www.deja.com/
    Before you buy.

  • Is it possible to convert a LabView .llb Instrument Driver into a dll that can be used with MAX and visual basic?

    I have the following problem. I'm using a Stanford DS360 Ultra-Low Distortion Function Generator and would like to access it with the CWIVIFgen ActiveX control. In the driver database I found an instrument driver for LabView (ds360fg.llb). But it's an .llb file and I am not sure what to do.
    Is it possible to convert it?
    Is it possible to use it anyway?
    Maybe I overlooked the answer, but I have searched the forums for quite a while now and I am none the wiser.
    Thanks in advance
    Felix

    If you have LabVIEW and the application builder, a DLL can be created from it, but it cannot be used as an IVI driver. The only way to create an IVI instrument driver is with LabWindows/CVI. I personally don't have the time right now to create the DLL, but maybe if you post to the LabVIEW forum someone there can do it. Another option is to hire an Alliance member to create either a DLL from the LabVIEW code or an IVI driver from scratch.

  • DAQmxCfgSampClkTiming NICARD 6031 E PCI and Visual C++ 6.0

    To the attention of the support department, or anyone else welcome! :-)
    I am using an NI 6031E PCI card. I am also using Visual C++ 6.0.
    I declared a variable for an external ANALOG SAMPLE CLOCK. I am supplying
    this CLOCK signal to the input pin /Dev1/PFI7:
    //, m_sSampClk(_T("ai/SampleClock"))
    I am not using this variable now, which is why it is commented out. Instead,
    I just pass /Dev1/PFI7 directly in the routine below.
    I am using the following routine to SAMPLE input values based on the
    external CLOCK signal entering the /Dev1/PFI7 pin:
    nErrorCode = DAQmxCfgSampClkTiming(m_hAnalogTask, // task handle
        "/Dev1/PFI7",           // Sample Clock from ATMEGA88 (AI SAMP CLK)
        50000,                  // sampling rate (float64), points per second
        DAQmx_Val_Rising,       // acquire samples on rising edges of the Sample Clock
        DAQmx_Val_FiniteSamps,  // finite mode for buffered hardware-timed acquisition
        128);                   // 128 samples per channel
    if (nErrorCode != 0)        // error encountered
    {
        DAQmxGetErrorString(nErrorCode, szErrorMsg, 1024);
        AfxMessageBox(szErrorMsg);
    }
    Can you please let me know if my routine is correct, and whether it is fine
    not to declare a variable for the SAMPLE CLOCK because I pass it to the
    routine directly? I searched for examples on your website but I only found
    LabVIEW examples, not Visual C++ ones.
    Regards,
    Javier Contreras.

    Hi Brad,
    First of all, thanks very much for your help; it is very much appreciated. I reply below your comments for clarity.
    Hi Javier,
    To debug a system with multiple possible points of failure, it really helps
    to break it down into smaller, easily verifiable pieces.
    For example:
    Connect a known good signal. If you don't know if your position detector
    works correctly, then connect something else in its place for now, like a
    battery or a function generator. Once you have verified that you can
    correctly measure a battery or a signal from a function generator, then
    connect your position detector and figure out how to get results that
    you expect from it.
    Ok. I could connect a power supply with a resistance and then I would be injecting a current into the system. That way I would see on the screen the expected current I am injecting.
    Verify that the signal is connected correctly using the device test panels
    in MAX. (Or the DAQ Assistant in MAX, which allows external clocks and
    other settings that the device test panel does not. To get there,
    right-click "Data Neighborhood", select "Create New...", select "DAQmx
    Task", etc.)
    Thanks. I already checked that the connections are properly done in Measurement and Automation Explorer. I have never used the DAQ Assistant in Measurement and Automation Explorer but I will try it too. However, the connections are correct: differential, with CH0 (pins 3 and 4 on the 100-pin connector) and CH1 (pins 5 and 6 on the 100-pin connector), which is declared in my code as m_sChannels(_T("Dev1/ai0,Dev1/ai1")). I need differential.
    Write your DAQmx code separately from your UI code. Acquire the data and
    display it in a simple, easily verifiable manner (hence OutputDebugString
    or printf or the debugger). Make sure you get the results you expect.
    In the past, and now, I am using the same UI code to properly display other values from very similar circuits and other circuits, so I believe my UI is working properly for sure.
    Write your UI code separately from your DAQmx code. Feed it fake data and make sure it gets displayed correctly.
    Again, I am using exactly the same base UI code for other setups and it works fine. 
    Once all of the pieces work in isolation, put them together and see if they work together.
    I am pretty sure that my UI code works perfectly from other setups.
    Looking at your NIWrapper.cpp, here are some specific things you should check:
    You are setting the min/max expected values to -2 and 0. Are those correct?
    If you always read -2 or you always read 0, then you should try a wider
    range, like -10 and 10. If you're expecting 0-1.8 V, then 0 to 2 would
    be better.
    Yes, maybe I should change it to 0 to 2. However, to be on the safe side, just in case my signal polarity is changed, can I use -2 to 2?
    You are setting the terminal configuration to DAQmx_Val_Diff. Make sure you
    have your signals connected to ACH0/ACH8 and ACH1/ACH9 and that the signals
    are within +/-10 V of AIGND.
    Yes, the signals are connected properly, as I checked in Measurement and Automation Explorer. I declared them as m_sChannels(_T("Dev1/ai0,Dev1/ai1")). The signals are within +/-10 V of AIGND, and AIGND is connected to the GND of my circuit as it should be. What happened to the UNICODE conversion issue we talked about last time? Do I need the conversion as you said? I do not have a clue about this conversion issue even after reading your link! :-)
    DAQmx ignores external sample clocks that happen before the task is
    started. If you need to send a serial command to start your external
    device (I don't know, I didn't read your whole program), it would be
    better to start the DAQmx task, send the serial command, and then read
    from the DAQmx task, so that you don't miss any external sample clocks.
    Yes, this is another issue I was working on. I was trying to send a char 'r' over the RS-232 to synchronise, as I did in other code in the past, which worked fine.
    However, I did not seem to make the MAX232 component do what I wanted. I am using the MAX232A from Maxim and I attach the datasheet to this message. From my UI code, I first send a char 'r' to an ATMEGA128 microcontroller; if the ATMEGA128 sees the char 'r', it starts the sending of data from another component and sends the external CLOCK to the NI card's AI sample pin.
    However, the char 'r' from my UI code did not seem to arrive at the ATMEGA128, since it did not start the data acquisition or the clock. I already used this char 'r' approach in the past with the ATMEGA128 and it worked fine with another type of SMD MAX232. Would it be fine if I measured the voltage values of the MAX232 pins while transmitting the char and then let you know these values, for you to check whether the MAX232 works fine?
    I also tried another approach to synchronise, which is to set one pin of the NI card HIGH from the UI code; the ATMEGA then checks whether this pin is HIGH and transmits if it is. However, this pin seemed to stay HIGH and the approach did not work. Of course I would like to synchronise, but I have not managed to, so I have the ATMEGA128 code sending inside a FOR loop on every iteration.
    And the NI card of course will be receiving sets of 129 CLOCKs on every iteration, spaced with a gap of 10 ms. I wish I could synchronise but it is not working. Can you check the voltage values of the MAX232A if I measure and send them to you myself while it is transmitting char 'r'? Just for you to check, of course.
    You're passing -1 as the timeout for DAQmxWaitUntilTaskDone(). This will
    cause DAQmx to wait forever if you don't get enough external sample
    clocks, hanging your program.
    Yes, I am passing -1. But I am providing enough CLOCK samples (129). I have the CLOCK waveform on a photo I took of the oscilloscope. The hanging of the program happened when the ATMEGA128 did not send anything, because the pin remained in one state, HIGH or LOW, without changing each time as it should in order to inform the ATMEGA128 that it could send data.
    I attach the MAX232 component datasheet.
    Thanks and best regards.
    Javier.
    Attachments:
    MAX220-MAX249.pdf 397 KB
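
    A minimal sketch of the start-then-trigger ordering Brad describes, using
    the DAQmx C API against the same m_hAnalogTask as above; the serial helper
    SendSerialStartChar() is hypothetical, standing in for Javier's RS-232 code.

    float64 data[2 * 128];                       // 2 channels x 128 samples
    int32   sampsRead = 0;

    nErrorCode = DAQmxStartTask(m_hAnalogTask);  // arm the task FIRST, so no
                                                 // external clock edges are missed
    if (nErrorCode == 0)
        SendSerialStartChar('r');                // now tell the ATMEGA128 to clock
    if (nErrorCode == 0)
        nErrorCode = DAQmxReadAnalogF64(m_hAnalogTask,
            128,                      // samples per channel
            10.0,                     // finite timeout in seconds instead of -1,
                                      // so a missing clock cannot hang the program
            DAQmx_Val_GroupByChannel,
            data, 2 * 128, &sampsRead, NULL);
    DAQmxStopTask(m_hAnalogTask);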

  • LabVIEW and Vision (8.2) 2D FFT - Some differences noted

    Hello:
    I am attaching a small llb showing the 2D FFT results obtained using both the LabVIEW and Vision (8.2) functions. Some differences in results are seen. The Vision results seem to have more components (move the scroll bars to get to the approximate vision container center) while the LabVIEW results show only one pixel near the center that is not zero. I wanted to know if someone can explain the differences seen.
    Given that I might trust the Vision results more, I need to display them either in a vision container of the size shown that autoscales X and Y and does not show scroll bars (i.e., behavior like an intensity graph), or I need to be able to convert the vision container results so as to be able to display them in an intensity graph. I don't know how to get the vision container to do the former, and since the results for the vision container are in complex format, it is not clear to me how to do the latter conversion.
    Any thoughts are appreciated.
    Sincerely,
    Don
    Attachments:
    2D FFT Comparison.llb 2184 KB

    Hi Gavin:
    The end of my post above proposes to do exactly what you state: converting the vision container to an array using IMAQ Image To Array. But how do you do it? Remember, the vision container is of type complex; you do not have the option to specify complex when using that function. Run the attached and see the error one gets.
    It is not clear to me from the references you cite why we should get different answers between the two functions. Does this mean that when I go to another function library, such as from Visual Basic or C++, I would get two more different answers? There is some subtle difference between the functions that only the R&D dept. can probably tell us. On something gross like the 2D FFT example (C:\Program Files\National Instruments\LabVIEW 8.2\examples\analysis\dspxmpl.llb\2D FFT of a Pulse.vi) included with LabVIEW, the results are substantially the same.
    Sincerely,
    Don
    Attachments:
    2D FFT Comparison.llb 2192 KB

  • LabVIEW and C#

    Hi, I'm experiencing problems transferring strings between LabVIEW and
    Microsoft Visual C#. I have a LabVIEW dll file which contains a method
    returning a string. I use this method in C# (via the [DllImport]
    attribute), but it doesn't seem to work. When I change the method to
    return an integer instead, it works fine. So I'm really looking for a
    casting function or something, to use LabVIEW strings in C#.
    Anyone?
    Cheers,
    Aasmund

    Hi Aasmund,
    you can take a look at this example:
    http://sine.ni.com/apps/we/niepd_web_display.DISPLAY_EPD4?p_guid=B123AE0CB9B3111EE034080020E74861&p_node=DZ52048&p_submitted=N&p_rank=&p_answer=&p_source=External
    That shows how to pass strings between a LV DLL and Visual C++.
    It might help.
    Good luck,
    Alberto
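
    For reference, a minimal sketch of the C/C++ side of that approach,
    assuming the DLL was built so the VI's string output is written into a
    caller-allocated C string buffer (rather than a LabVIEW string handle);
    the DLL name MyLabviewLib.dll and the export GetMessageString are
    hypothetical. The same caller-allocated-buffer pattern is what a C#
    [DllImport] declaration with a StringBuilder parameter marshals to.

    #include <windows.h>
    #include <stdio.h>

    typedef void (__cdecl *GetMsgFn)(char *buf, long len);

    int main()
    {
        HMODULE hDll = LoadLibraryA("MyLabviewLib.dll");
        if (!hDll) return 1;

        GetMsgFn getMsg = (GetMsgFn)GetProcAddress(hDll, "GetMessageString");
        if (getMsg) {
            char buf[256] = "";        // caller allocates; the VI fills it in
            getMsg(buf, sizeof(buf));
            printf("%s\n", buf);
        }
        FreeLibrary(hDll);
        return 0;
    }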

  • Labview vs visual basic vs visual C

    Hi,
    I am hoping to get advice about LabVIEW vs Visual Basic/C for developing education software.  I have programmed in LabVIEW for test and measurement applications. 
    I have also programmed a couple of short programs in C, but have never programmed in Visual C or Visual basic.  I want to develop software for educational purposes for
    students with special needs such as spelling and reading comprehension programs.  These programs would involve a lot of displaying text, graphics and
    sounds to the students and measuring student inputs via text or clicks on buttons to determine whether they need additional help from the computer to learn the lessons. 
    I was wondering if anyone has the knowledge to advise on using LabVIEW vs Visual C vs Visual Basic for such programming tasks?
    I am guessing that Visual C and Visual Basic may have more flexibility in displaying text, graphics, sounds etc., since this is mostly what they are used for. On the other hand,
    I am familiar with the ease of coding in LabVIEW for test/measurement applications, but I am wondering if there might be limitations or difficulties in programming with LabVIEW
    for a more general-purpose Windows application such as the education software I am planning. I would have to learn Visual C or Visual Basic, but if those environments
    would be easier or better in the long run, then I would be better off starting with one of them than getting down the road with LabVIEW and changing directions. Any advice will
    be welcomed.  Thanks!
    Dave Adams

    Hi Adam,
    I suspect I will get a lot of crap from the LabVIEW community here for saying this, but if you have the time to learn VB .NET, I think you will find that it makes things much easier in the long run for applications that involve a lot of user interface design.  In my experience, I have found that there is little that cannot be done in LabVIEW - it is more a matter of elegance.  Software like you are describing is best implemented using an event-driven approach.  VB .NET (as well as the older versions of VB, and C#) make this type of design pattern extremely easy to implement.  With the .NET framework libraries available to you, virtually any type of user interface control that you would need is available to use and the documentation provided on MSDN blows LabVIEW documentation out of the water.
    One caveat: if you expect to be running this software on platforms other than Windows, .NET may not be the right choice.  (There are .NET runtimes available on some other platforms (like Mono for Linux), but you would wind up needing to modify a lot of the user-interface code, and a lot of the framework features in general are in beta.)
    Also, the express version of Visual Studio is free and there are really no limitations to the express edition that would matter for writing a simple Windows application like you are describing.  Note that a fairly convincing argument can be made for why Visual Studio is THE best IDE on the market right now.
    On the other hand, using a tool you are familiar with has its benefits.  Nevertheless, I have never met a software engineer who could write decent LabVIEW code but somehow could not learn VB .NET.  LabVIEW is easier to use only for people without a software background - and those are the same people who really shouldn't be writing software (as evidenced by the hundreds of pitiful LabVIEW VIs that I'm exposed to on a daily basis).
    Anyway, I hope this helps,
    Rob

  • Labview and ole-server

    Hi,
    I need information about activating an "OLE server" from Labview.
    This server is activated without any problems by using the CreateObject("servername") command within the Excel and Visual Basic environments, and all methods are available. I tried an automation refnum to find the server in the Labview library, without any success.
    How might I solve this Labview problem?
    Thanks
    Alex

    Hi BSS
    Your ActiveX device must be registered in Windows.
    Then you can create, for example, an ActiveX container.
    Open the context menu by right-clicking and select "Insert ActiveX Object...".
    If your ActiveX device is not listed there, choose "Create Object from File"
    instead of "Create Control" and select it directly.
    I attach some images to illustrate the actions.
    Lars
    Attachments:
    ActiveX general programming example.PNG 3 KB
    ActiveX Container.JPG 82 KB
    Create Object From File.JPG 53 KB

  • Transfer between Labview and HTML page

    Is it possible to transfer data between LabVIEW and an HTML page without using a Java applet?
    Does LabVIEW include this possibility?
    The aim is to control a system over the Internet with an HTML page.

    Hi Mat,
    LabVIEW CGI is the best solution for that. I know that you can build ActiveX controls (in Visual Basic) that talk to LabVIEW, but LabVIEW CGI is the easiest way of doing it. You should have some knowledge of HTML in order to work with LabVIEW CGI.
    Also note that in order to have the LabVIEW CGI capability you should have Internet Toolkit for LabVIEW (available from NI). For more information
    http://www.ni.com/labview/internet/
    The examples that ship with the toolkit should get you started. LabVIEW CGI VIs allow you to execute a VI from a web browser and get the results back in the browser in the form of an HTML page.
    Hope this helps,
    A Rafiq
    National Instruments

  • I am receiving data through RS-232 in LabVIEW and I have to store the data into a Word file only if there is a change in the data, and I have to scan the data continuously. How can I do that?

    I am receiving data through RS-232 in LabVIEW and I have to store the data into a Word or text file only if there is a change in the data. I have to scan the data continuously. I was able to store the data into the text or Word file, but not only on changes. I am getting the data from RS-232 in terms of 0 or 1, and I have to print it only if there is a change in the data from 0 to 1. If I use an if-loop, the data gets printed every time there is a 0 or 1, as many times as it occurs. I don't know how to write this program; please help me if anybody knows the answer.

    I have attached the VI. It receives the data from the RS-232 as a string and converts it into binary, which is also indicated on LEDs; normally, if the data is 1 the LEDs will be off. If a 0 comes, the corresponding data status is written into the text file. But here the problem is that the same data gets printed many times, so I have to make it print only once, when there is a transition from 1 to 0. How do I do that? I have been at this for a few weeks; please reply if you know the answer.
    thanking you
    Attachments:
    MOTORTESTJIG.vi 729 KB
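
    In LabVIEW the usual fix is a shift register that carries the previous
    sample around the loop, writing only when the previous and current values
    differ. A sketch of the same edge-detection logic in C++ (here the RS-232
    read is stood in for by '0'/'1' characters on stdin, and the 1-to-0
    polarity is taken from the post):

    #include <fstream>
    #include <iostream>

    // Stand-in for the RS-232 read; returns -1 at end of input.
    int ReadBit()
    {
        char c;
        if (!(std::cin >> c)) return -1;
        return c - '0';
    }

    int main()
    {
        std::ofstream log("status.txt", std::ios::app);
        int previous = 1;                          // assume the line idles at 1
        for (int current = ReadBit(); current >= 0; current = ReadBit()) {
            if (previous == 1 && current == 0)     // write once, on the transition
                log << "signal dropped to 0\n";
            previous = current;                    // the "shift register"
        }
        return 0;
    }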

  • Hyperlinks do not work when exporting to PDF (Crystal and Visual Cut)

    I have a question about Crystal Reports XI Release 2 and Visual Cut 11. I create reports in Crystal and export them using Visual Cut. Some reports contain hyperlinks to files on a network drive. However, the links only work while in Crystal. The PDF files that are printed through Visual Cut display the links, but they are not clickable. How can this issue be fixed?
    In case it helps, the links also work when I export reports from Crystal to HTML, Word, and Excel, but not when I export to Acrobat. I believe it is a problem with tagging/accessibility features. So is there a setting/formula in Crystal or Visual Cut that would allow the exported PDF files to be accessible?

    While Crystal exports to PDF do indeed lose hyperlink functionality (both in Crystal as well as in Visual CUT), Visual CUT allows you to use Crystal formulas (acting as hidden tags) to create hyperlinks in the resulting PDF file. Another option supported by Visual CUT is to automatically detect file references in the PDF text and turn them into hyperlinks.  Contact MilletSoftware for more detail.
    Edited by: Ido Millet on Oct 27, 2010 5:11 PM

  • Problem with Labview and an ARM Cortex

    Good morning,
    I am currently trying to use Labview with a board from ST Microelectronics (MCBSTM32) with an ARM Cortex Processor.
    I use the SDK and have followed the tutorials.
    But, when I try to launch the program (the simple loop as written in tutorial no. 2: http://zone.ni.com/devzone/cda/tut/p/id/7029 ), Keil gives me an error via Labview:
    "Argument 'DARMSTM' not permitted for option 'device'."
    It seems that Keil does not accept an ARM Cortex from ST as the device.
    Moreover, after having this problem, I am unable to use a Keil project, even a project which worked before, without Labview. I need to restart the computer.
    I also tried to launch the Keil project generated by Labview without using Labview, and it works. But as soon as I use Labview, I get the error.
    Did anyone already have this error, or know how to solve it?
    Thank you for your answer, and sorry for my bad English.
    Regards,
    Raphaël VAISSIERE

    Hi Raphi,
    So let me make sure I understand,
    The project created in LabVIEW errors out with the message "Argument 'DARMSTM' not permitted for option 'device'."
    If you open the same project in Keil uVision, it runs fine.
    Here are my questions:
    1. How does the code run when run through Keil? Does it deploy and run fine?
    2. Did you follow the porting procedure completely?
    Your target STM32F103RB is technically supported by Keil, but you need to port the RTX kernel to it. This paragraph explains it:
    To determine if your target already supports the RTX Real-Time Kernel, browse to the \Keil\ARM\Startup directory, then browse to the folder that corresponds to the manufacturer of your ARM microcontroller. If there is an RTX_Conf*.c file for your target, then the RTX Real-Time Kernel has already been ported for your ARM device. If no such file exists, skip to chapter 4 for more information on the RTX Real-Time Kernel and a guide for porting RTX to your ARM microcontroller. 
    You also need to port the Real-Time agent to it.
    I just want to make sure that you have followed the guidelines. If you have and are still having problems, we will continue to explore this.
    Thanks,
    National Instruments
    LabVIEW Embedded Product Support Engineer

  • I am trying to use Labview and RP1210 compliant hardware to connect to a truck J1939 bus and receive messages.

    I am trying to use Labview and RP1210-compliant hardware to connect to a truck J1939 bus and receive messages.
    Specifically, I am attempting to read data frames using RP1210_READMESSAGE. I am able to configure the hardware and send a message to the J1939 bus, but I think I have not configured something correctly. I can use RP1210_SENDMESSAGE and see the message I have sent on the bus using CANalyzer. When I use RP1210_READMESSAGE I get the timestamp from a message, and the return from the function sends back the correct number of bytes (the number matches the number of bytes I sent out plus four bytes for the timestamp). What I am having trouble with is actually receiving the data. I have had the same type of behavior from two different pieces of hardware (Vector CANcase XL and Nexiq USB Link), so I don't think the issue is vendor specific.
    Has anyone been able to make the RP1210_READMESSAGE function work correctly?
    Thanks for any help

    Thanks
    I have already tried that. The links are for the NI RP1210 wrapper. The problem I am having is using LabVIEW to interface with the RP1210 layer. The ReadMessage parameter char *fpchAPIMessage is the output, a pointer to a character array. In this variable I can receive the timestamp of the message but not the message itself. The return shows that the correct number of bytes is available (18 for an 8-byte message), but I can only get the 4-byte timestamp. I think I have to dereference this pointer to view the data, but I am not sure how to fix this.
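
    A sketch of how the returned buffer is typically laid out and parsed on the
    C side, per the RP1210A convention of a 4-byte timestamp (most significant
    byte first) at the start of the read buffer; check your vendor's RP1210
    documentation for the exact J1939 header layout that follows. nClientID is
    assumed to come from an earlier RP1210_ClientConnect.

    char buf[512] = {0};
    short nBytes = RP1210_ReadMessage(nClientID, buf, (short)sizeof(buf), 0);
    if (nBytes > 4) {
        unsigned long timestamp =
            ((unsigned long)(unsigned char)buf[0] << 24) |
            ((unsigned long)(unsigned char)buf[1] << 16) |
            ((unsigned long)(unsigned char)buf[2] << 8)  |
             (unsigned long)(unsigned char)buf[3];
        char  *msg    = buf + 4;               // J1939 header + data start here
        short  msgLen = (short)(nBytes - 4);
        /* In the LabVIEW Call Library node, configure fpchAPIMessage as an
           array of U8 rather than a C string pointer, so NUL (0x00) data
           bytes do not truncate what LabVIEW hands back. */
    }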
