VISA read: 8 bits to 32 bits?

I run a counter to measure the number of pulses and send that count over serial through a Bluetooth HC-06 module to a virtual serial port on my laptop. To do that, I break the unsigned 32-bit count into four 8-bit pieces a, b, c, d by shifting it right by 0, 8, 16, and 24 bit positions before sending it via UART. The problem is how to reassemble the 8-bit pieces with LabVIEW on the other end: I have to shift a, b, c, d left by 0, 8, 16, and 24 bit positions and recombine them with OR. Here is a C example:
unsigned char a, b, c, d;      /* final result dcba */
unsigned int count, temp;      /* use an unsigned 32-bit type */
a = 0x09; b = 0xEF; c = 0xCD; d = 0xAB;
count = 0x00000000;
count = count | a;
temp = (unsigned int)b << 8;
count = count | temp;
temp = (unsigned int)c << 16;
count = count | temp;
temp = (unsigned int)d << 24;
count = count | temp;
/* count is now 0xABCDEF09 */

Oh, you can use an array if you have multiple samples...
Attachments:
String to [U32].png 16 KB

Similar Messages

  • Can't visit JSP pages in 64-bit mode without first visiting them in 32-bit mode

    We have an application deployed into a standalone OC4J running on a Solaris 10 x86 Sun workstation. OC4J runs in 64-bit mode. After OC4J is started, any JSP page we try to visit fails to render, and the following error appears in the log:
    INFO: Unable to dispatch JSP Page : Exception:oracle.jsp.provider.JspCompileException: Errors compiling:/opt/oc4j/j2ee/home/application-deployments/App/App/persistence/_pages//_info_jspx.java
    Oct 17, 2009 1:46:41 PM oracle.jsp.logger.JspMessages infoCannotDispatchJspPage
    INFO: Unable to dispatch JSP Page : Exception:oracle.jsp.provider.JspCompileException: Errors compiling:/opt/oc4j/j2ee/home/application-deployments/App/App/persistence/_pages//_info_jspx.java
    As a workaround, we had to visit every page in the application while OC4J was running in 32-bit mode and then restart OC4J in 64-bit mode. After this, we are able to visit every JSP page successfully. This is very painful: every time we redeploy the application, we have to revisit all pages in 32-bit mode before we can visit them in 64-bit mode.
    How can we avoid this painful exercise of having to visit all pages in 32-bit mode in order to visit them successfully in 64-bit mode?
    We are using:
    JDeveloper 10.1.3.2.9.4066 with JDK 1.5.0_06.

    Could be corrupted preferences.
    In the Finder, choose Go > Go to Folder from the menu.
    Type ~/Library/Preferences in the "Go to the folder" field.
    Press the Go button.
    Remove the com.apple.logic.pro.plist file from the Preferences folder. Note that if you have programmed any custom key commands, this will reset them to the defaults; you may wish to export your custom key commands as a preset before performing this step. See the Logic Pro User Manual for details on how to do this, and for what to do if you are having trouble with a control surface in Logic Pro.
    Be sure to use the ~ character in the path name.

  • Can't read or use 64-bit Win 7 install DVD

    I just upgraded to a late 2012 Mac Mini and wanted to install Boot Camp. With this model and OS version, I only have Boot Camp 5, which requires 64-bit Windows. This model should support that, but the Mac doesn't seem to recognize the 64-bit DVD when I insert it, and trying to boot from it gives a message to the effect of "operating system not found" or "installer disk not found", or some other indication that the DVD is not bootable.
    Is there something I'm doing wrong? I actually have two copies of Windows 7 on DVD, and in both copies the 32-bit DVD can be read but the 64-bit one cannot. And since Boot Camp 5 doesn't even support 32-bit installations, I can't use that as a fallback. Any ideas?

    Update: Found a workaround. When I was in Mountain Lion and trying to boot directly from the DVD, the disc wasn't being recognized as a valid bootable volume.
    Then I tried rebooting while holding down the Option key to get a list of bootable devices. One of them was a DVD called "Windows", and using the boot menu to launch the installer worked. For some reason the Mac was unable to recognize this directly as a valid, bootable device...

  • Reader in IE9 64 bit?

    I have no problem using Reader in the IE9 32-bit browser, but it will not install in the 64-bit version. When using the 64-bit version, any click on a .pdf in IE9 just makes the cursor blink. Using Manage Add-Ons, Reader does not appear. I ran repair on Reader 9.3, installed 9.4, and ran repair on 9.4, still with no success in getting the add-on recognized in the 64-bit browser. I can't find anything on the MS site about how to install or force an add-on to be recognized, although I can delete or disable one that has been installed. Has anybody been successful at getting Reader into a 64-bit browser?

    Please post questions or problems regarding beta software in the beta forum; that is what beta software is for: getting feedback to the developers.

  • Google Reader Errors in 64-bit Safari 4.0.3

    Since the update to Snow Leopard and moving to a 64-bit version of Safari (Version 4.0.3 (6531.9)), I have been experiencing a script error in GOOGLE READER (http://www.google.com/reader/).
    After several minutes of browsing, I receive an error message stating "Oops…an error occurred. Please try again in a few seconds." After the error, I can view currently loaded content, but attempts to view any additional content result in an empty folder and the following error: "The feed being requested cannot be found." As a result, I am logged out of Google Reader and must log in again. Interestingly, I am only logged out of the Google Reader site and can access other Google web properties without logging in.
    I do not experience this error using Camino or FF, only Safari.
    I've tried emptying the browser cache, restarting Safari and even resetting Safari. I've even tried using a secure page (https) instead of the traditional Google Reader URL. None of these have any effect. Error seems to be specific only to Google Reader on 64-bit Safari.
    Any thoughts? Suggestions?

    "The feed being requested cannot be found."
    If it happens in both 32-bit and 64-bit mode, but not another browser, that would seem to implicate the HTML and/or JavaScript being used by Google Reader is somehow incompatible with Safari, so it's not a Safari or Snow Leopard issue, unless you didn't have this same problem with the same feed in a previous version of Safari.
    I note from your signature that you are using an older version of OSX.
    Yes, but that makes no difference. Safari 4 is Safari 4, regardless of which version of the OS it's running on.

  • Reader compatibility with 64-bit browsers?

    I'm hoping to find out how I can use the Adobe Reader plugin with 64-bit IE7 and IE8 so I can read and search PDF files from within the browser. I use Vista x64, which installed both 32-bit and 64-bit versions of IE7. When I use the 32-bit version of IE7, PDFs open in the browser just fine, but when I try to open a PDF using the 64-bit version of the browser, it insists on opening an external program instead of opening the file in a browser tab. Is this even possible with a 64-bit browser? Is the solution something simple, like reinstalling Adobe Reader 9.1 via the 64-bit browser to enable the feature, or is it a hopeless idea?
    I'm mainly wondering because I'm considering installing IE8 and don't want to have any problems.

    If you really have to have an older version, try these places for old or used software:
    http://www.emsps.com/oldtools/ or http://www.retrosoftware.com/
    http://forums.adobe.com/message/1636890 warns about buying from eBay

  • DAQ VI to perform digital write and read measurements using 32-bit binary data saved in a file

    Hi
    I need a DAQ VI to perform digital write and read measurements using 32-bit binary data saved in a file. There are two main sections:
    1) Perform write and read operations to and from different spreadsheet files, such that each file has a single row of 32 bits of binary data (analogous to a 1D array), where the leftmost bit is the MSB. I don't want to enter the 32 bits of binary data manually; I want the data written or read just by opening a file saved with the intended data.
    2) Using test patterns implemented with the digital pattern generator, the Build Digital Data function, or otherwise, I need to ensure that the binary data written to a spreadsheet file (or any supported file type) and then sent through the NI USB-6509 is the same as the data read back.
    I'm aware I can't use the simulated device to read data written to any port, but if the write part of the VI works, I'm sure the read part will work on the physical device, which I'll buy later.
    My plan of action: I've created a basic write/read file task and a write/read DAQ task for the NI USB-6509 and combined both in a while loop to form a progress VI, but I'm confused about how to proceed with the implementation. My greatest problem is linking the two together with the correct functions or operators so that there are no syntax/execution errors, and thus achieving my intended result.
    This project is one of my many assignments for my master's thesis, so I'll appreciate every bit of help, as I'm not really proficient with LabVIEW programming, but I prefer it because it is fun and interesting once you get to know it. Currently I'm practicing with the LabVIEW 8.6/NI-DAQmx 8.8 demo versions and an NI USB-6509 simulated device.
    Please see the attached file for my novice progress. Thanks in advance for the support.
    Rgds
    Paul
    Attachments:
    DIO_write_read DAQ from file.vi 17 KB

    What does your file look like?  The DAQmx write is expecting a single U32 value, not an array of I64. 
    Message Edited by vt92 on 09-16-2009 02:42 PM
    "There is a God shaped vacuum in the heart of every man which cannot be filled by any created thing, but only by God, the Creator, made known through Jesus." - Blaise Pascal

  • Latest Acrobat reader plugin for 32-bit Safari?

    I'm visiting web pages with PDFs that require the latest Adobe Reader. I'm using Safari 5.1 on an older Intel iMac that is 32-bit only. The latest Reader download claims it's only for 64-bit Macs. Is there a version of the plugin that does in fact support 32-bit and can read the newer Acrobat Reader format?

    The Reader app is 32-bit.
    By default, it opens on 64-bit OS X versions in 32-bit mode.
    Having a Core Duo Mac shouldn't be a problem for Reader 11.0.4.

  • Network stream using 64-bit and 32-bit VIs on same computer

    I have a project where I'm forced to use a VI written in 32-bit LV2012 (because I need MathScript, which isn't supported in 64-bit) to communicate via network streams with a VI written in 64-bit LV2012 (the only version that works with an externally written 64-bit toolkit for my camera communication) on the same PC.
    I need a network stream target to run in the 32-bit VI, and a host to run in the 64-bit VI.
    It kept failing, so finally I checked using the example host.vi and target.vi, and discovered that if I open both in either 32-bit or 64-bit they work, but if I open one in 32-bit and the other in 64-bit they won't connect, giving error -314100 (specified endpoint doesn't exist).
    So the question: is there any inherent reason this happens, since both 64-bit and 32-bit should support network streams? My first obvious alternative would be rewriting the MathScript code in native LabVIEW so it can all be in 64-bit VIs, but it's quite big (based on old, proven MATLAB script) and that would be a significant effort I don't have time for now.
    The reason I'm using network streams is that I also get data from two other networked PCs at the same time. Is there an alternative to network streams that can communicate data over a network in real(ish) time?
    Cheers
    Dan

    I just tested this and was able to get it to work. Were you running both the reader and writer applications on the same computer? If so, I suspect you might be running into context-naming issues with multiple application instances running on the same computer. See this link for more info on how to specify a context in the URL. I suspect you might find it easier to create your own test VI for this rather than modify the shipping examples.

  • Read/Write unsigned 32 bit integer to BLOB

    Dear All,
    I am writing an array of unsigned 32-bit integers to a BLOB field using Delphi, and I can read it back from Delphi as well.
    In a stored procedure, I need to read the unsigned 32-bit integers from the BLOB.
    I have used the dbms_lob.read function.
    For dbms_lob.read(lob_loc, amount, offset, buffer) I've given:
    lob_loc: the locator
    amount: 4 bytes
    offset: 1
    buffer: ?? The buffer here has to be a datatype that corresponds to an unsigned 32-bit integer. Which Oracle datatype can be used here? I tried RAW, but it does not seem to give the required output.
    Please help!

    What version of OraOLEDB? There were some issues with the NUMBER datatype in earlier versions; I'd recommend testing the most recent version.
    To patch 10.2 OraOLEDB, apply the 10.2.0.4 database patch to the client install.
    Hope it helps,
    Greg

  • Adobe Reader 11, No 64-bit execution

    Windows has shipped 64-bit versions since Vista, and 64-bit CPUs and operating systems have been mainstream for a long time.
    I use the 64-bit version of Windows 7.
    In the 64-bit versions of Windows, the bundled applications (Notepad, Calculator, Paint, etc.) are all built as 64-bit and run as 64-bit.
    Why does Adobe Reader not properly support the 64-bit versions of Windows? I would like to know why only a 32-bit version is provided. Please also release a 64-bit version of Adobe Reader.
    ※ Please see the attached screenshots: Adobe Reader is almost the only program still running as 32-bit.

    This is the same problem as in http://forums.adobe.com/thread/1334245
    So far nothing has fixed the problem there, except one user doing a system restore to a few weeks back.

  • The VI identifies the number of bytes to be read, but the VISA Read VI is not able to read the data from the port

    We are trying to communicate with the AT106 balance from Mettler Toledo. The VI is attached.
    We are sending "SI", which is a standard command recognised by the balance, and the balance reads it. The indicator after the property node shows that there are 8 bytes available at the serial port. However, the VISA Read VI fails to read the bytes at the serial port and gives the following error:
    Error -1073807253 occurred at VISA Read in visa test.vi
    Possible reason(s):
    VISA: (Hex 0xBFFF006B) A framing error occurred during transfer.
    Thanks
    Vivek
    Attachments:
    visa_test.vi 50 KB

    Hello,
    You should also definitely check the baud rates specified; a framing error often occurs when different baud rates are specified, as the UARTs will be attempting to transmit and receive at different rates, causing the receiving end to either miss bits in a given frame, or sample bits more than once (depending on whether the receiving rate is lower or higher respectively). You should be able to check the baud rate used by your balance in the user manual, and match it in your VI with the baud rate parameter of the VISA Configure Serial Port VI.
    Best Regards,
    JLS
    Sixclear
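For anyone debugging the same framing error outside LabVIEW, the point generalizes: every frame parameter has to match on both ends. Here is a hedged POSIX termios sketch that configures a port for 9600 baud, 8 data bits, no parity, 1 stop bit; the device path and the 9600 figure are assumptions, so substitute the values from the balance's manual, just as you would wire them into the VISA Configure Serial Port VI.

```c
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

/* Sketch: open a serial port on a POSIX system and configure 9600 8N1.
   9600 baud is an assumption; use the rate from the instrument's manual.
   Returns a file descriptor, or -1 on error. */
int open_serial_9600_8n1(const char *path)
{
    int fd = open(path, O_RDWR | O_NOCTTY);
    if (fd < 0) return -1;

    struct termios tio;
    if (tcgetattr(fd, &tio) != 0) { close(fd); return -1; }

    cfsetispeed(&tio, B9600);               /* receive baud rate  */
    cfsetospeed(&tio, B9600);               /* transmit baud rate */
    tio.c_cflag &= ~(PARENB | CSTOPB);      /* no parity, 1 stop bit */
    tio.c_cflag &= ~CSIZE;
    tio.c_cflag |= CS8;                     /* 8 data bits */
    tio.c_cflag |= CLOCAL | CREAD;          /* local line, enable receiver */
    tio.c_lflag &= ~(ICANON | ECHO | ECHOE | ISIG);   /* raw input */
    tio.c_iflag &= ~(IXON | IXOFF | INPCK | ISTRIP | ICRNL);
    tio.c_oflag &= ~OPOST;                  /* raw output */

    if (tcsetattr(fd, TCSANOW, &tio) != 0) { close(fd); return -1; }
    return fd;
}
```

If either end uses a different rate or frame format, the receiver samples the line at the wrong instants and reports exactly the kind of framing error described above.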

  • VISA read in exe file is not working

    Hi all,
    I am having problems with VISA Read in an exe file I created.
    I am trying to write to and read from a programmable power supply via RS-232. The VI writes a command to the instrument to set the voltage level, then writes another command requesting the resulting current value, which is then read by VISA Read.
    The VI works fine on the development PC, which has LabVIEW installed, and the exe file also works fine on this PC. However, when I try to run the exe file on another PC (I've tried several), everything seems to work except for the VISA Read functions: the voltage level command is sent, as are the On and Off commands, but the current is not read back.
    I guess there must be something I have missed in the installation. I am working in LabVIEW 8.5. I have created an installer and included:
    Runtime Engine 8.5.1
    VISA runtime 4.5
    Is there something else I should do? I am really running out of ideas here...
    I hope someone has a clue about this!
    Clara 

    Clara-
    1. Have you verified that the COM port settings in Windows (check under Device Manager) match how you initialize them (baud, bits, parity, and flow control), and that these match the power supply's settings?
    2. Also, are you trapping an error message after the attempted read command? (This will make it a lot easier to diagnose.)
    3. Do you programmatically close the VISA session at the end of the program?
    4. You can always post the code to see if the forum members will catch the problem.
    ~js
    2006 Ultimate LabVIEW G-eek.

  • Error -1073807253 occurred at VISA Read in transient SR830.vi VISA: (Hex 0xBFFF006B) A framing error occurred during transfer.

    Hi,
    I have written a program in LabVIEW to make transient C-V measurements using a Stanford Research SR830 lock-in amplifier. The program seems to be running fine, but sometimes it gives an error:
    Error -1073807253 occurred at VISA Read in transient SR830.vi
    Possible reason(s):
    VISA: (Hex 0xBFFF006B) A framing error occurred during transfer.
    If I press OK, the program starts running again. What might be the problem? BTW, I googled a bit and saw that in the LabVIEW topic "RS-232 Framing Error with HP 34401A Multimeter", pkennedy32 says this about framing errors:
    ""Framing Error" in an RS-232 context means a very specific thing: when the receiver was expecting a stop bit, the line was not in MARK condition. This can be the result of:
    1... Baud rate mismatch (although other problems would likely crop up first).
    2... Data length problem: if I send 8 data bits and you expect 7, the stop bit is in the wrong place.
    3... Parity setting mismatch: if I send 7 data bits + parity and you expect 7 data bits and no parity, the stop bit is in the wrong place.
    4... Mismatch in # stop bits: if I send you 7 data bits + parity + one stop bit, and you expect 7 data bits + parity + TWO stop bits, the second one might not be correct, although most devices do not complain about this."
    But I must say that these are the same COM port settings I use to measure C-V hysteresis, and I never get this error there.
    I attach the program herewith for your kind perusal. Please help me resolve this issue.
    Thanks in advance.
    Attachments:
    transient SR830.vi 94 KB
    csac.vi 8 KB
    sr830 initialize1.vi 15 KB

    @Dennis Knutson: you are right. I checked the read indicator in backslash mode, and instead of a \n it is sending \r, so I changed the \n in my write strings to \r. But if I keep the VISA Close outside my loop instead of inside, as you suggested, the termination character appears to come in the middle of the read string instead of at the end, and since the read terminates at the \r, it displays some junk value before the \r. If I put the VISA Close outside the loop and play along with the bytes at the read buffer, I see the whole read string with the \r at the end. But whenever the values are in exponential form (when close to zero), like 6.938839e-5, I always get a timeout error, whatever timeout I put at the VISA initialize. Subsequently, if I stop the program and run it again, the program hangs and I do not get any reading. Then after I close it and start again, sometimes it hangs for some more time or starts working. If I put an arbitrarily large byte count at the VISA Read, I always get the "timeout expired before the operation completed" error.
    @Ravens Fan: I have removed the CSAC VI altogether and am taking the CH1 and CH2 readings separately instead of as one string, so no more issues with that.
    I use the control at the delay so that I can choose how much delay to set, and I use the math operation because I am adding up the delay time to keep track of the elapsed time, since in the end I have to plot time vs. the CH1 and CH2 readings.
    I am not sure, but probably I am making some silly errors. Please help me out.
    Attachments:
    transient SR830-2.vi 103 KB
    sr830 initialize1.vi 15 KB

  • Reading in binary form with the VISA Read function

    Dear All,
    I have connected my Device to the serial Port. and data read from the buffer is stored in text format. I want to view the data in binary format .
    Actually, i have performed the same function in Visual Basic. there also if i view the data in text format, it shows some junk values. but if i view the data in binary, it shows the actual data coming from the instrument .
    I dont know how to modify the VISA read function. can any one pls tell me how can i read the data in binary format?
    Thanks
    Ritesh

    Oops!  No it didn't.
    LabVIEW 2012 (if that makes a difference).
    I'm using the VISA Read function to take ADC readings in from a microcontroller. The VISA Read function outputs the data as a string. It's easy to convert the string to U8, either with the conversion function or the Type Cast function, and it works great except for a tiny corner case: when the ADC reading is zero, the VISA Read function treats the 8-bit zero reading as a null character and strips it out.
    Apparently, since this is done by the VISA Read function as it builds the string, type casting or converting the output string from the VISA Read function doesn't bring the zeros back.
    I've tried setting the VISA property "Discard NUL Characters" to false, and that didn't seem to help.
    My current workaround is just to never have the micro send a zero ADC reading.
    Anyway, I'm a LabVIEW noob, so while this isn't essential to my project, I remain curious about how to send LabVIEW serial data that isn't automatically treated as characters and thrown into a string with all the zeros stripped out.
    Regards,
    Who

  • Parity error occurs in VISA read when building an executable

    Hi,
    I am doing some serial communication in LabVIEW and have battled for a long time with a parity error where it appeared that a lot of 0's were added to the data I read through VISA Read. I fixed my problem according to this description:
    Can I Do 9-bit Serial Communication Instead of 7 or 8 Bits?
    There I modified the VISA ini file to disable error replacement. This seemed to solve the problem (at least I got the data I expected) until I built an application.
    When I build an EXE, the problem occurs again when I run the application on another computer, i.e. not the computer on which it was built. I have made sure to include a custom configuration file, that is, the modified visa.ini file, to ensure that error replacement is disabled in the application as well. When I look in the configuration file that is shipped with the application, it also appears that error replacement is disabled; however, it seems not to take effect, since a lot of 0's are once again inserted into my dataset.
    I have of course made sure that the serial port settings are set up correctly.
    I am using LV2013.
    Has anyone tried this before, or is anyone able to help somehow?
    Thank you
    Nicolai

    Sure, I have attached my VI; however, I'm not sure it provides any useful information in this case. What I am doing is simply reading a serial port every 750 ms, accessing specific data in it, and plotting it in graphs. The VI works perfectly fine on the development computer.
    I don't think the VI or the serial port is what's holding me back. It seems the configurations in my visaconf.ini file are not transferred to the deployment computer. I have tried the following from the knowledge base:
    How to Include VISA Settings in a LabVIEW Installer
    Why Does Serial Communication Not Work on my LabVIEW Deployed Executable?
    Storing VISA Aliases and Moving Them to Another Machine
    I can also see that the 'DisableErrorReplacement' parameter is set in the .ini file that comes along with the application, but it seems it is not applied, since I keep receiving all these annoying 0's that ruin my data.
    As you can see in the VI, I configure my serial port in the 'false' state, and on the development machine I just added 'DisableErrorReplacement=1' to the visaconf.ini file, which solved my problem before I tried to distribute the app.
    Hope some of you guys can help.
    Best regards
    Nicolai
    Attachments:
    Read datablock.vi 55 KB
