Visa read problem

Hello!!
I made a program that reads data from an Arduino microcontroller. Once I build the exe and connect the Arduino, the program starts reading the data, but only for a few minutes... then it stops and I have to disconnect and reconnect the Arduino. Any ideas?
Thanks a lot!

Hi
When you disconnect your Arduino, do you turn off the power? If so, it may be a memory issue in your Arduino. I've worked with Arduino once, and we had an issue with data stacking up over and over in the Arduino.
Giuliano Franchetto
Student at the École Nationale Supérieure des Mines de Saint-Étienne, cycle ISMIN (France)

Similar Messages

  • Visa Read problem from a PIC24's UART port

    Thanks for taking the time to read my post.
    I am using LabVIEW 8 on a Win XP PC (3.4 GHz CPU, 2 GB RAM).
    I am using a PIC24 microcontroller (on the Explorer 16 development board), which I have programmed to acquire an AC signal at a rate of about 4 kHz. The PIC24 has a 10-bit ADC and the acquired value is then padded to 16 bits in total (format: 000000xxxxxxxxxx).
    The serial settings for the PIC and my LabVIEW program are 115200 baud, 8 data bits, no parity, 1 stop bit.
    In order to send the acquired value to my PC's serial port (remember, the UART carries only 8 data bits at a time) I divide the 16-bit word into an MSB and an LSB. I then send the MSB and LSB 8-bit values one after the other.
    The total data rate for this communication is about 64 kbps.
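    For reference, the receive side has to reassemble each pair of bytes in the same MSB-first order described above; a minimal C sketch of that step (function and variable names are illustrative, not taken from the attached VI):

    #include <stdint.h>
    #include <stddef.h>

    /* Reassemble 16-bit samples from a byte stream sent MSB first, then LSB. */
    size_t unpack_samples(const uint8_t *raw, size_t raw_len, uint16_t *samples)
    {
        size_t n = 0;
        for (size_t i = 0; i + 1 < raw_len; i += 2) {
            samples[n++] = (uint16_t)((raw[i] << 8) | raw[i + 1]);  /* MSB, then LSB */
        }
        return n;   /* each sample is 0..1023 from the 10-bit ADC */
    }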
    The problem is that when I run the code I have written in LabVIEW, the CPU usage shoots up to 70% and I also see buffer overruns (a non-continuous sine wave). If I add a time delay in my while loop, the buffer overruns increase.
    Also, if I try to use the 'Bytes at Port' VI, the data I get is meaningless.
    I would be grateful if someone could look at my code and give me some suggestions as to how I could make the 'VISA Read' VI more efficient.
    Regards
    Alex
    Attachments:
    Serial Client for PIC24.vi ‏101 KB

    Dear all,
    I do not know if you have been following my post, but I am still getting buffer overruns (a non-continuous sine wave) when using VISA Serial Read.
    The only way to avoid this is by making the VISA Read VI read 2 bytes at a time (with no time delay in the main while loop). However, when I do this the CPU usage shoots up to around 60% (which is something I would expect anyway, as the main while loop is executing as fast as possible).
    I have attached the working code below and would appreciate ANY comments BIG or small.
    I am still puzzled as to why when I connect the ‘Bytes at Port’ Property Node, the data I get is not correct.
    I have gone through the Labview Examples, as well as the LV Basics 1 course examples (which are similar) and I have also looked in the Labview for Everyone / Labview Graphical Programming books.
    However, I have found the examples to be far too simple, for what I am trying to achieve.
    I am seriously thinking of purchasing the LV Instrument Control Self-Paced Course, but I am not quite certain this would help me much. I have read the course outline provided by NI, but it did not give me much more useful information.
    Can anyone who has 'done' this course advise me as to whether the material covers 'high'-speed acquisition using VISA Serial Read/Write?
    The course is slightly pricey at around £240 (with academic discount) and, as far as I understand, the course examples (might) use two HP instruments (a multimeter and a function generator) and a Tektronix oscilloscope, none of which I have direct access to.
    Regards
    Alex
    Attachments:
    Serial Client PIC24-Serial - Read 2-bytes.vi ‏42 KB

  • VISA read Problem: After a while the device stops responding

    Hi everyone,
    I am facing a really strange bug that nothing I have done could solve. I have a PIC device that I made myself communicating with LabVIEW through an RS232-to-USB converter. Everything seems to be OK, but suddenly (it can last 30, 45 or 50 seconds) LabVIEW stops receiving data and the buttons do not respond anymore, not even the stop button. I have to remove the USB converter to get the buttons to respond again, and then I get the error:
    Other programs that open this same port can communicate with my device for hours without errors. I am sure I open the session once and only close it after the while loop. Can anybody help me?
    Solved!

    Hello,
    Same problem for me! It can last 1 hour or 5 hours, but at some point the two parts of the program that use an RS-485-to-USB converter stop. This program was working perfectly until I changed the computer. The OS was Vista (32-bit) and it is now Windows 7 (64-bit). This is the only thing that has changed.
    I also found that if more parts of the program are running, this problem appears earlier. It was definitely not an issue before, since the program could run for weeks.
    Thanks for your help.

  • VISA Read over USB Problem: After a while the 0xBFFF0015 Timeout Error occurs

    Hi,
    I have trouble using LabVIEW with my non-NI USB device:
    The device is an analog input DAQ board. I was able to set up communication with the board using a VISA driver specifically created for this board and direct firmware calls based on the product's firmware specification provided by the manufacturer.
    This method has been working pretty well so far, but when I try to get a large amount of data (64k samples @ 100 ksps), the VISA Read returns the 'VISA: (Hex 0xBFFF0015) Timeout expired before operation completed.' error.
    Please see the attached screenshot of the block diagram for details.
    First, an 'analog input scan start' command is sent to the DAQ device, and then the VI tries to read all collected data from the device. Once the right amount of data has been retrieved, or no more data is available, the data-collecting process (the while loop) ends and an 'analog input scan stop' command is sent to the DAQ device.
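    For anyone reading along, that start/read/stop sequence maps roughly onto the VISA C API as sketched below; the command strings are placeholders, not the board's actual firmware commands, and error handling is reduced to the essentials:

    #include <visa.h>
    #include <stdio.h>

    /* Sketch of the scan/read/stop sequence described above (hypothetical commands). */
    ViStatus read_scan(ViSession dev, unsigned char *buf, ViUInt32 total)
    {
        ViUInt32 got = 0, ret = 0;
        ViStatus st = viWrite(dev, (ViBuf)"AISCAN:START\n", 13, &ret);   /* placeholder command */
        while (st >= VI_SUCCESS && got < total) {
            st = viRead(dev, buf + got, total - got, &ret);
            got += ret;                      /* viRead may return fewer bytes than requested */
        }
        viWrite(dev, (ViBuf)"AISCAN:STOP\n", 12, &ret);                  /* placeholder command */
        printf("collected %u of %u bytes (last status %ld)\n", (unsigned)got, (unsigned)total, (long)st);
        return st;
    }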
    The data collection starts with no problem, but after a while, in loop# 400, the VISA Read hangs and then returns the error mentioned above.
    I tried to increase the timeout value, but it didn't help: the error occurred after the same number of loops; the VISA Read still got hosed, and the error simply appeared after the longer timeout expired.
    I also tried to add some delays in the loop, but it didn't help either.
    I am not sure what I am missing here and I would highly appreciate it if anyone could give me some guidance on how to solve this issue.
    Thanks,
    John
    Attachments:
    usb-read.png ‏18 KB

    I just wanted to point out that this is not an NI board, to avoid making people think this is a hardware issue. And I think that the rest of the code is irrelevant in this case.
    I believe that I don't use the VISA functions correctly. I assumed that someone who used these functions before would be able to point out the obvious steps missing in the data collecting process using the VISA functions.

  • VISA Read function Read buffer problem in serial communication

    Hi, I use the VISA Write and Read functions in a serial communication app; the device continuously sends 0x00 if it does not receive a request from the LabVIEW program running on the PC.
    The request sent by LabVIEW is programmable. I have met a weird problem: each time the request changes, the VISA Read buffer output still shows the last request first; only from the second read onward does it show the right request.
    It works like: Req code: ... 50, 51,51,51,50....;  VISA Read buffer: ...50, 50, 51, 51, 51, 51, 50....
    Please refer to the program.
    Attachments:
    readOne_test.vi ‏21 KB

    How are you running this?  You don't have a while loop around it.  Is it part of a larger VI?  Please don't tell me you are using the run continuously button.
    You don't have any wait between your VISA Write and your Bytes at Port. So it is very likely the receive buffer is still empty, since you didn't give the device time to turn around and give a reply. If you read 0 bytes, your VISA Read string will be empty. How does your decoder subVI (which you didn't include) handle an empty string?
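    In text form, the write / wait / check / read sequence described above looks roughly like this with the VISA C API (the 50 ms settle time is an arbitrary example, not a measured value for this device):

    #include <visa.h>
    #include <windows.h>   /* Sleep(); assumes a Windows host as in these posts */

    /* Write a request, give the device time to answer, then read what actually arrived. */
    ViStatus query(ViSession dev, const char *req, ViUInt32 req_len,
                   unsigned char *resp, ViUInt32 resp_size, ViUInt32 *resp_len)
    {
        ViUInt32 avail = 0, ret = 0;
        ViStatus st = viWrite(dev, (ViBuf)req, req_len, &ret);
        if (st < VI_SUCCESS) return st;

        Sleep(50);   /* without some delay, "bytes at port" is usually still 0 */

        viGetAttribute(dev, VI_ATTR_ASRL_AVAIL_NUM, &avail);   /* LabVIEW's "Bytes at Port" */
        if (avail == 0) { *resp_len = 0; return VI_SUCCESS; }  /* nothing to decode yet */
        if (avail > resp_size) avail = resp_size;
        return viRead(dev, resp, avail, resp_len);
    }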

  • Visa Read Overrun Problem

    Dear all,
    I am using the VISA Read function to read data coming from my hardware.
    The data is coming really fast: the hardware is sending 1600 bytes every 10 ms.
    I am using an Event Structure (timeout event) to check whether 1600 bytes are at the port or not. The timeout is set to 1 ms; I even tried 0 ms.
    Baud rate = 3,000,000.
    But the program gives a buffer overrun error when I try to read data using VISA Read.
    I have set the buffer size to 64000, but with no success. The problem still persists.
    I have played around with the receive buffer of the COM port in Device Manager. I tried all the values from 64 to 1600, again with no success.
    Every time I run the program I get an error at the VISA Read function stating, "A character was not read from the hardware before the next character arrived."
    And I am not doing any other processing besides reading data from the port in the timeout event.
    What could be the problem?
     Thanks,
    Ritesh 

    Ritesh,
    The over-run is still caused by the driver of the hardware. What you are observing is that the driver is able to keep up with short bursts of data coming into the port: for the first few thousand bytes there is no error, but after a while the driver falls behind and cannot keep up (hence the over-run error).
    How does FTDI recommend that you use their ports in LabVIEW? VISA may interact with the driver in a different way or set properties to different values. All of this can have an impact on how the driver performs. Also, the DLL that they provided may not return over-run errors the same way that VISA does; the malformed string may just be missing data that was dropped because of an over-run.
    You have not mentioned anything about the types of flow control that are available to you. The only real way to prevent over-run errors in data-intensive communication is to have a way to hold off the data (flow control). If the data flow is interrupted for a fraction of a second, the driver has a chance to get the data out of the hardware in time, and communication resumes automatically once the data has been read out of the hardware.
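    As a rough illustration of the points above, the sketch below enlarges the VISA receive buffer and requests hardware flow control through the standard VISA C attributes; whether the FTDI driver honours RTS/CTS is exactly the kind of thing that has to be checked against the vendor's documentation, and the values used here are only examples:

    #include <visa.h>

    /* 1600 bytes / 10 ms = 160 kB/s of payload; at 3 Mbaud (~10 bits per byte on the
       wire) the link carries ~300 kB/s, so over-runs point at the driver, not the line. */
    ViStatus setup_fast_port(ViSession dev)
    {
        ViStatus st = viSetBuf(dev, VI_ASRL_IN_BUF, 64000);        /* enlarge VISA's receive buffer */
        if (st < VI_SUCCESS) return st;
        st = viSetAttribute(dev, VI_ATTR_ASRL_FLOW_CNTRL, VI_ASRL_FLOW_RTS_CTS);
        if (st < VI_SUCCESS) return st;                            /* device and driver must both support it */
        return viSetAttribute(dev, VI_ATTR_TMO_VALUE, 100);        /* 100 ms read timeout, arbitrary */
    }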
    Thanks,
    Steven T.

  • VISA USB read problems

    Hello,
    I'm doing a project where LabVIEW must communicate with a uC (Microchip, mikroC PRO for PIC). Unfortunately, it doesn't work correctly: sending data to the uC works, but the uC can't transmit data to the PC (LabVIEW). The program remains stuck in the loop "while (!HID_Write(&writebuff2, 64));".
    For writing and reading, I use the example program "USB RAW - bulk.vi". If I push the Bulk In button I get the following error message: "Error -1073807302 occurred at VISA Read in USB RAW - Bulk.vi", but if I click Bulk Out, the uC receives the data.
    Can someone help me maybe? thanks 
    Piece of uC code:
    unsigned char readbuff2[64] absolute 0x500;   // USB buffers should be in USB RAM
    unsigned char writebuff2[64] absolute 0x540;  // USB buffers should be in USB RAM

    void USB_enable()
    {
      char cnt;                                   // (unused in this snippet)
      HID_Enable(&readbuff2, &writebuff2);        // start the mikroC USB HID stack
    }

    void USB_Communication()
    {
      char Read_Reg;                              // (unused in this snippet)
      if (HID_Read())                             // non-zero when a packet has arrived
      {
        switch (readbuff2[0])
        {
          case Reg_Addr_Software_Version:         // read Software_Version
            writebuff2[0] = Reg_Addr_Software_Version;
            writebuff2[1] = Software_Revision_H;
            writebuff2[2] = Software_Revision_L;
            writebuff2[3] = 0;
            break;
          default:                                // unknown register address
            writebuff2[0] = Reg_Addr_Automatic_Response_Warning;
            writebuff2[1] = 0;
            writebuff2[2] = Address_Fail;
            writebuff2[3] = 0;
            break;
        }
        while (!HID_Write(&writebuff2, 64));      // retry until the IN packet is accepted
      } // end HID_Read()
    }
    Solved!

    Did you look at the links describing the solution?
    "Yes I solved the problem.
    I found the solution in this topic: http://forums.ni.com/t5/LabWindows-CVI/Control-PIC18F4550-via-USB-with-LabWindows-CVI/td-p/694804
    And see the solution in topic:
    http://forums.ni.com/t5/Instrument-Control-GPIB-Serial/VISA-RAW-FOR-USB-USING-PIC18F4550/m-p/2064030...
    Regards,
    Ronald"
    He said: "The descriptor file (USBdsc.c) of MikroC Pro was not correct" and made some modifications described in the forum post.
    Taylor B.
    National Instruments

  • Visa Read Timeout Occurs with multiple Reentrant VI Calls

    I have written a test application in LabVIEW (6.1) which will be used to test (burn-in) up to 15 serial instruments through a 16-port USB-to-RS232 hub. Here's how it works:
    When the app loads, I transmit a Connect command to each of 15 COM ports (one at a time) using VISA. If I receive the proper response from the unit on that port, I add the port to an array and continue on to the next system. Once I've found all systems on the hub, I wire my array of active VISA references to a for loop in which I open up to 15 reentrant VIs which will run in the background in parallel. Each of these reentrant VIs (all are identical with the exception of the VISA resource they use) running in the background is sending commands to the respective instrument and receiving a response. One function in particular, "Get Unit Status", is important, and the response determines whether or not the instrument is functioning correctly.
    Here's the problem: in my main loop, I am continuously acquiring indicator values from each of the reentrant VIs that are running in the background. After a period of time (not consistent) I will lose communication with a port (the symptom is no response from the unit). I've looked closely at the comm engine I created and found that the VISA Write function completes without error, then when I perform a VISA Read I immediately get the "Timeout occurred before operation completed" error (please keep in mind that this occurs after 100-5000 successful attempts at writing/reading). Eventually another port will drop out, followed by another. This seems to stop occurring and the remaining systems run to completion without a problem.
    Some background on how I'm setting up my VISA sessions...
    When I originally scan for systems (before I load and run the Reentrant VIs)
    - Init Visa Port
    - 19200, 8, N, 1
    - Use Termination = True
    - Timeout = 400 ms (I've tried larger values already); 400 ms should be plenty
    - Termination Char = 13 (\r)
    - Open Visa Session
    - Visa Write "CONN18/r" (the command required to connect to my instrument)
    - VISA Read with 1 for the requested byte count, to read 1 byte at a time, concatenating the results until \r is received (or a 1000 ms timeout occurs -- this is not a VISA timeout). I've also tried 16 for the requested byte count and just waiting for VISA to time out -- both methods work.
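    For comparison, the byte-at-a-time, concatenate-until-\r read described in the last item looks something like this in the VISA C API; it is only a sketch (the overall 1000 ms budget is left out for brevity, and the 400 ms per-byte timeout mirrors the settings above):

    #include <visa.h>
    #include <stddef.h>

    /* Read one byte at a time, concatenating until '\r' arrives or a read times out. */
    ViStatus read_until_cr(ViSession dev, char *resp, size_t resp_size)
    {
        ViUInt32 ret = 0;
        size_t len = 0;
        viSetAttribute(dev, VI_ATTR_TMO_VALUE, 400);      /* per-byte VISA timeout, as above */
        while (len + 1 < resp_size) {
            unsigned char ch;
            ViStatus st = viRead(dev, &ch, 1, &ret);
            if (st < VI_SUCCESS || ret == 0) return st;   /* timeout or error: give up */
            resp[len++] = (char)ch;
            if (ch == '\r') break;                        /* termination character reached */
        }
        resp[len] = '\0';
        return VI_SUCCESS;
    }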
    Once all 16 ports are scanned I Close ALL of the ports using the Visa Close Function.
    It is important to know at this time that I "AM" using proper wiring flow to ensure open occurs before write, write occurs before read, etc.
    I'm assuming at this time that all of my Visa sessions are closed.
    On to the Reentrant VIs:
    Inside each reentrant VI I first Initialize all of my variables and Init/Open a 'New'? Visa session using the same parameters mentioned above.
    Then I enter the "Run" case structure where all of the communication begins.
    I am using the same Communications Engine to operate the instrument as before (the only difference being that all of the VIs in the comm engine are now reentrant and operate at higher priorities) I have actually saved two different versions of the engine (one for the reentrant calls and one for when I first scan for systems from my Main GUI).
    When I init the reentrant VI, I am placing the Duplicate VISA Resource output of my VISA Open function on a shift register. When I enter the Run case, it takes the resource from the register on the left, wires it through any Comm Engine VIs, then back out to the shift register on the right, and keeps going for a 12-hour period or until "Get Unit Status" has returned 60 naughty results.
    On my main GUI I am continuously (every 500 ms) getting certain indicator values from each reentrant VI, and I am also setting some control values on each reentrant VI. There is no VISA interaction between the reentrant VIs and the main GUI.
    As I said earlier, up to 15 systems will run for a time, then one will stop responding, followed by another, and another until a few remaining systems will run to completion.
    Any advice as to why I'm encountering the timeouts with the VISA Read function as I have mentioned would be appreciated. I managed to find one suggestion which uses the Bytes at Port function to ensure there is data at the port before doing a read; otherwise, skip the read and retry the whole operation -- I haven't tried this yet.
    Sorry for the wordiness of my question. If anyone would like some screen shots of portions of my code (I can't submit the actual code because some of it is confidential) I'd be happy to post them.
    Doug.

    Hi Doug,
    The first thing I would recommend is the solution you have already found, to check and see if there is data at the port before attempting a read. I would be interested to see if this will solve the problem. Does there seem to be any trend to which ports have this timeout error? How many ports does it cut down to before operation seems to continue as expected? Does this number vary, or is it always the same number of ports? I think the best thing to do will be to identify constant attributes of how the error is occurring so that we can narrow it down and see what is going on.
    John M

  • VISA read in exe file is not working

    Hi all,
    I am having problems with VISA read in an exe file created.
    I am trying to write to and read from a programmable power supply via RS232. The VI writes a command to the instrument to set the voltage level. It then writes another command, requesting the resulting current value. This value is then read by VISA Read.
    The VI is working fine on the development PC, which has LabVIEW installed. The exe file is also working fine on this PC. However, when I try to run the exe file on another PC (I've tried several), everything seems to work except for the VISA Read functions. The voltage level command is sent, as well as the On and Off commands, but the current is not read back.
    I guess there must be something I have missed in the installation. I am working in LabVIEW 8.5. I have created an installer and included
    Runtime Engine 8.5.1
    VISA runtime 4.5
    Is there something else I should do? I am really running out of ideas here...
    I hope someone has a clue about this!
    Clara 

    Clara-
    1. Have you verified that the COM port settings in Windows (check under device manager) are matching how you initialize them (Baud, bits, parity, and flow control) and that these match the power supply's settings?
    2. Also, are you trapping an error message after the attempted Read command (this will make it a lot easier to diagnose).
    3. Do you programmatically close the VISA session at the end of the program?
    4. You can always post the code to see if the forum members will catch the problem.
    ~js
    2006 Ultimate LabVIEW G-eek.

  • NI VISA read stops at zero character, returning an 0xBFFF003E error

    Hi
    I’m trying to read some serial data from a UUT using the NI-VISA read function. The data is mostly text but does include some control codes. The first of these appears after the ‘OK’ in the Serial Bytes window on the front panel. More text should follow but for some reason, the read function stops at the first zero character (index 144 in the Byte Array), and returns an 0xBFFF003E (-1073807298) error. I found another thread where someone had a similar problem and I’ve tried the fix for this plus a few other things, but nothing’s worked. If I use Hyperterminal, the entire data block is returned as it should be.
    I wondered if this had anything to do with the LabVIEW 7.1 version I'm using (an upgrade is on the cards). The version of NI-VISA I'm running is 4.2.
    Very much appreciate any thoughts.
    Thanks
    Bruce

    The error code itself is a generic VISA error which often happens with USB-to-RS232 interfaces. Does your device connect to the PC through USB as a virtual COM port? If so, what chip and Windows driver is it using?
    Also, your function somehow looks wrong. The only criteria for the read loop to terminate are an error on the VISA Read or the TestStand termination status becoming true. Generally, if you use VISA Bytes at Serial Port you are almost always doing something wrong! That function does not synchronize with anything in your data at all: you will read whatever is there at that moment, and that could be a partial message, no bytes at all (LabVIEW is typically many times faster than any serial device, even a super-high-speed one), or multiple messages.
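    In VISA C terms, the message-synchronised alternative being described is to let the read return on a termination character rather than on whatever happens to be in the buffer; a minimal sketch, assuming a '\n'-terminated protocol (the terminator is an assumption, not taken from the UUT's specification):

    #include <visa.h>

    /* Let viRead return when the message terminator arrives instead of polling "bytes at port". */
    ViStatus read_one_message(ViSession dev, unsigned char *buf, ViUInt32 size, ViUInt32 *got)
    {
        viSetAttribute(dev, VI_ATTR_TERMCHAR, '\n');            /* assumed terminator */
        viSetAttribute(dev, VI_ATTR_TERMCHAR_EN, VI_TRUE);
        viSetAttribute(dev, VI_ATTR_ASRL_END_IN, VI_ASRL_END_TERMCHAR);
        viSetAttribute(dev, VI_ATTR_TMO_VALUE, 2000);           /* 2 s timeout, arbitrary */
        return viRead(dev, buf, size, got);   /* returns at the terminator, at 'size' bytes, or on timeout */
    }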
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Error -1073807253 occurred at VISA Read in transient SR830.vi VISA: (Hex 0xBFFF006B) A framing error occurred during transfer.

    Hi,
    I have written a program with LabVIEW to make transient C-V measurements using a Stanford Research SR830 lock-in amplifier. The program seems to be running fine, but sometimes it gives an error:
    Error -1073807253 occurred at VISA Read in transient SR830.vi
    Possible reason(s):
    VISA:  (Hex 0xBFFF006B) A framing error occurred during transfer.
    But if I press OK, the program starts running again. What might be the problem? BTW, I googled a bit and I see that in the LabVIEW topic "RS-232 Framing Error with HP 34401A Multimeter" by pkennedy32, this is what is said about framing errors:
    ""Framing Error" in an RS-232 context means a very specific thing - when the receiver was expecting a stop bit, the line was not in SPACE condition. This can be the result of:
    1... Baud rate mismatch (although other problems would likely crop up first).
    2... Data Length problem, If I send 8 data bits and you expect 7, the stop bit is in the wrong place.
    3... Parity setting mismatch - If I send 7 data bits + parity and you expect 7 data bits and no parity, the stop bit is in the wrong place.
    4... Mismatch in # Stop bits - If I send you 7 Data bits + parity + one stop bit, and you expect 7 data bits + parity + TWO stop bits, the second one might not be correct, although most devices do not complain about this.
    But I must say that this is the same COM port setting that I use to measure C-V hysteresis, and I never get this error there.
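    As a concrete reading of that list: a framing error is avoided only when both ends agree on every frame parameter, which in VISA C terms means the following attributes have to match the instrument's own configuration (the values shown are placeholders to check against the SR830's settings, not its verified defaults):

    #include <visa.h>

    /* Every one of these must match the instrument's settings, otherwise the stop bit
       lands in the wrong place and VISA reports a framing error (0xBFFF006B). */
    void configure_frame(ViSession dev)
    {
        viSetAttribute(dev, VI_ATTR_ASRL_BAUD,      9600);
        viSetAttribute(dev, VI_ATTR_ASRL_DATA_BITS, 8);
        viSetAttribute(dev, VI_ATTR_ASRL_PARITY,    VI_ASRL_PAR_NONE);
        viSetAttribute(dev, VI_ATTR_ASRL_STOP_BITS, VI_ASRL_STOP_ONE);
    }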
    I attach the program herewith for your kind perusal. Please help me resolve this issue.
    Thanks in advance.
    Solved!
    Attachments:
    transient SR830.vi ‏94 KB
    csac.vi ‏8 KB
    sr830 initialize1.vi ‏15 KB

    @Dennis Knutson, you are right: I checked the read indicator in backslash mode, and instead of a \n it is sending \r, so I changed the \n in my write strings to \r. But if I keep the VISA Close outside my loop instead of inside as you suggested, the termination character appears to come in the middle of the read string instead of at the end, and since the read terminates at the \r it displays some junk value before the \r. If I put the VISA Close outside the loop and play along with the bytes at the read buffer, I see the whole read string with the \r at the end. But whenever the values are in exponential form (when close to zero), like 6.938839e-5, I always get a timeout error whatever timeout I set at the VISA Initialize. Subsequently, if I stop the program and run it again, the program hangs and I do not get any reading. Then, after I close it again and restart, sometimes it hangs a few more times or starts working. If I put an arbitrarily large byte count at the VISA Read, then I always get the timeout-before-the-operation-completed error.
    @Ravens Fan, I have removed the CSAC VI altogether and am taking the CH1 and CH2 readings separately instead of as one string, so no more issues with that.
    I use the control at the delay so that I can choose how much delay I want to set, and I use the math operation because I am adding up the delay time to keep track of the elapsed time, since in the end I have to plot time vs. the CH1 and CH2 readings.
    I am not sure but probably I am making some silly errors. Please help me out. 
    Attachments:
    transient SR830-2.vi ‏103 KB
    sr830 initialize1.vi ‏15 KB

  • VISA Read gets incorrect data from serial connection

    I am having difficulty using the VISA functions in LabVIEW to read data from a virtual COM port. Data is being sent from a serial to USB chip via a USB connection using OpenSDA drivers. I have a python program written to read from this chip as well, and it never has an issue. However, when trying to achieve the same read in LabVIEW I am running into the problem of getting incorrect data in the read buffer using the VISA Read function.
    I have a VISA Configure Serial Port function set up with a control to select the COM port that the device is plugged into. Baud rate is the default 9600. The termination char of my data is a newline, which is also the default. Enable termination char is set to true. A VISA Open function follows this configuration and then feeds the VISA resource name out into a while loop where a VISA Read function displays the data in the read buffer. The byte count for the VISA Read is set to 20 so I can read more of the erroneous data; the actual data will only be 6-12 bytes. The while loop has a wait function, and no matter how much I slow down the readings I still get incorrect data (I have tried 20 ms through 1000 ms).
    The data I expect to receive in the read buffer from VISA Read is in the form of "0-255,0-255,0-255\n", like the following:
    51,93,31\n
    or
    51,193,128\n
    And occasionally I receive this data correctly, however I intermittently (sometimes every couple reads, sometimes a couple times in a row) get incorrect readings like this:
    51,1\n
    51,193739\n
    \n
    51,1933,191\n
    51,,193,196\n
    51,1933,252 
    51,203,116203,186\n
    Where it seems like the read data is truncated, missing characters, or has additional characters. Looking at these values, however, it seems like the read was done incorrectly because the bytes seem partially correct (51 is the first number even in incorrect reads).
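    One way to make sense of reads like those is to stop treating each VISA Read as one message: append everything that arrives to a running buffer and only split it at newlines, so a message that straddles two reads is reassembled instead of showing up truncated. A plain C sketch of that idea (not the attached VI, and the 256-byte buffer is an arbitrary size):

    #include <stdio.h>
    #include <stddef.h>

    /* Accumulate incoming serial bytes and emit only complete '\n'-terminated lines. */
    static char pending[256];
    static size_t pending_len = 0;

    void feed(const char *chunk, size_t chunk_len)
    {
        for (size_t i = 0; i < chunk_len; i++) {
            if (pending_len + 1 < sizeof pending)
                pending[pending_len++] = chunk[i];
            if (chunk[i] == '\n') {
                pending[pending_len] = '\0';
                printf("complete message: %s", pending);   /* e.g. "51,93,31\n" */
                pending_len = 0;
            }
        }
    }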
    I have searched but haven't found a similar issue, and I am not sure what to try from here on. Any help is appreciated. Thanks!
    Attachments:
    Serial_Read_debugging.vi ‏13 KB

    The first thing is that none of the error clusters are connected so you could be getting errors that you are not seeing. Are you sure about the comm parameters? Finally, I have never had a lot of luck looking for termination characters. You might want to just capture data and append each read into one long string just to see if you are still seeing this strangeness.
    What sort of device is returning the data? How often does it spit out the data? How much distance is there between it and your computer? Can you configure it to append something like a checksum or CRC to the data?
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps

  • Memory leak in Real-Time caused by VISA Read and Timed Loop data nodes? Doesn't make sense.

    Working with LabVIEW 8.2.1 Real-Time to develop a host of applications that monitor or emulate computers on RS-422 buses. The following screen shots were taken from an application that monitors a 200 Hz transmission. After a few hours, the PXI station would crash with an awesome array of angry messages... most implying something about a loss of memory. After much hair pulling and passing of the buck, my associate was able to discover, while watching the available memory on the controller, that memory loss was occurring with every loop containing a VISA Read and error propagation using the timed-loop data nodes (see Memory Leak.jpg). He found that if he switched the error propagation to regular old-fashioned shift registers, then the available memory was rock solid (a la No Memory Leak.jpg).
    Any ideas what could be causing this?  Do you see any problems with the way we code these sorts of loops?  We are always attempting to optimize the way we use memory on our time-critical applications and VISA reads and DAQmx Reads give us the most heartache as we are never able to preallocate memory for these VIs.  Any tips?
    Dan Marlow
    GDLS
    Solved!
    Attachments:
    Memory Leak.JPG ‏136 KB
    No Memory Leak.JPG ‏137 KB

    Hi thisisnotadream,
    This problem has been reported, and you seem to be exactly reproducing the conditions required to see this problem. This was reported to R&D (# 134314) for further investigation. There are multiple possible workarounds, one of which is the one that you have already found of wiring the error directly into the loop. Other situations that result in no memory leak are:
    1.  If the bytes at port property node is not there and a read just happens in every iteration and resulting timeouts are ignored.
    2.  If the case structure is gone and you just blindly check the Bytes at Port and read every iteration.
    3.  If the Timed Loop is turned into a While loop.
    Thanks for the feedback!
    Regards, Stephen S.
    National Instruments
    Applications Engineering

  • How to read three parameters (for example temperature, current and voltage) from the same port with VISA Read and separate them into a table

    Hi, I want to read parameters with VISA Read from three sensors connected to a PIC16F877A, using an XBee module, but I want to put each of those parameters (temperature, voltage and current) into a table. It's the first time I use LabVIEW, so I don't know if there is a solution for my problem; if anyone has any idea please help me. Thanks in advance.

    [email protected] wrote:
    Hi, I want to read parameters with VISA Read from three sensors connected to a PIC16F877A, using an XBee module, but I want to put each of those parameters (temperature, voltage and current) into a table. It's the first time I use LabVIEW, so I don't know if there is a solution for my problem; if anyone has any idea please help me. Thanks in advance.
    The short answer is: "Yes, of course."  But you are going to have to do the legwork and learn LabVIEW basics before we can offer meaningful help.
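    As a starting point for the parsing half of the question (the XBee link and building the LabVIEW table itself are separate topics), splitting one reading into its three fields only requires agreeing on a line format; a hedged C sketch, assuming the PIC sends comma-separated lines such as "23.5,12.1,0.75\n":

    #include <stdio.h>

    /* Parse one "temperature,voltage,current\n" line into its three fields.
       The comma-separated line format is an assumption about the PIC's output. */
    int parse_reading(const char *line, double *temperature, double *voltage, double *current)
    {
        return sscanf(line, "%lf,%lf,%lf", temperature, voltage, current) == 3;
    }

    Each parsed triple would then become one row of the table.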
    Bill
    (Mid-Level minion.)
    My support system ensures that I don't look totally incompetent.
    Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.

  • RS232 De-multiplexing data using Visa Read

    Hello
    I'm trying to do a simple task, but being new to this I'm having problems.
    I transmit data down an RS232 line at 9600 baud. I have two channels of data, say A and B, where A and B are one byte each.
    I multiplex/alternate them down the line, i.e. ABABABABABAB... etc.
    At the LabVIEW end I want to separate the data back into separate channels for plotting on a graph, i.e. A vs. time and B vs. time.
    There must be a way of toggling some array builder connected to the VISA Read VI, i.e. odd bytes for A and even bytes for B, and then plotting them in two separate waveform charts.
    I've got a single channel working, but A and B are mixed together (see attachment).
    Any ideas if it can be done.
    Thanks Bill
    Attachments:
    RS232-single.gif ‏15 KB

    Hi Bill
    I think you were 90% of the way there.
    If all you want to do is see the two waveforms on different plots you
    can right-click on your chart and select stacked plots. I think you
    have to increase the size of your legend to show two plots.
    If you want the data on two different charts, index your array to two
    different charts. I'm assuming by your picture that you are reading 2
    bytes at a time.
    Using two graphs could be useful if your plots had different time
    scales. But I think the stacked plots make more sense.
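    For the record, the even/odd split described above is straightforward once the stream is aligned on an A byte; a small C sketch (alignment on the first A byte is assumed, and keeping that synchronisation is the hard part in practice):

    #include <stdint.h>
    #include <stddef.h>

    /* De-multiplex an ABABAB... byte stream into two channels for separate charts.
       Assumes raw[0] is an A byte, i.e. the stream is already aligned. */
    void demux(const uint8_t *raw, size_t raw_len,
               uint8_t *chan_a, uint8_t *chan_b, size_t *pairs)
    {
        *pairs = 0;
        for (size_t i = 0; i + 1 < raw_len; i += 2) {
            chan_a[*pairs] = raw[i];       /* even positions -> channel A */
            chan_b[*pairs] = raw[i + 1];   /* odd positions  -> channel B */
            (*pairs)++;
        }
    }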
    Good Luck
    Eric
