32 bit acquisition

Hi,
As far as I know, Java natively supports 16-bit acquisition at most.
Since I need to acquire 32-bit signals from a professional board, I was wondering whether there is a way to do that in Java.
(I already tried the jsASIO stuff, but with no great results. With some boards it works; with others it doesn't.)
I also need to control the input/output routing in software.
Do you know a clean way to do all that, or do I have to write a wrapper DLL in C?
Thanks a lot


Similar Messages

  • 24-bit sound

Is it possible to play 24-bit sound with LabVIEW using SO Config, SO Write,
etc.?
TM

Of course I'm not sure what device you're talking about...
But I looked into one of these Soundblaster 5.1 cards as a possible
cheap 24-bit alternative to the NI cards. When I called the Creative Labs
developer support group and pressed them for information, they claimed that
the card in question actually was normal 16-bit I/O, and the 24-bit
advertised on the box applied only to some internal sections of the card.
So in the end LabVIEW only supported the 16-bit modes, but the soundcard
specs were kind of deceptive anyway and I couldn't have done 24-bit
acquisition. Double-check with the developer support line for your hardware
so you're not on a wild goose chase like I was.
    -joey
"TM" wrote in message news:3e2eb7db@newsgroups....
> Is it possible to play 24-bit sound with LabView using SO config SO write
> etc.?
>
> TM

  • Code Improvement : sorting function

I have a PXI-5122 digitiser (14-bit acquisition, NI-Scope 2.5) and I do the following measurement:
- sample rate = 50 MS/s
- record length = 4M samples (in 16-bit)
-> acquisition time = 0.08 sec
and I want to sort these 4 million samples.
My algorithm is this one:
Consider that I need 2^N divisions (usually N = 7 to 9).
I use a right shift of N bits on my 4M data points, so I get an index into a counter array that I increment for each occurrence:
    for each acquisition
    for all the samples do
    index = (data_16bit>>N)
    hist(index)++
    end
    end
My PXI is a Pentium 3 at 1.2 GHz, and in the worst case this should take less than 1 second. But in fact it takes around 26 sec for an acquisition of 0.08 sec!!!
So I tried to compare with a sorting method using not only the output waveform of niScope Fetch Binary 16 but also the gain and offset of the channel. I then get voltages, which are floats, so it should be slower to handle floats than plain binary numbers. But that's not the case: with the float handling it takes 26.1 sec to sort the same amount of data (4M). Does LabVIEW treat every number as a float? Because it is very weird that a basic low-level operation like bit shifting + counter incrementing is not faster than the same one with floats.
I assume that my algorithm is not perfect, so I would like to know if you have some ideas that could work better with LabVIEW.
    Thanks for your support.
    Attachments:
    sorting.bmp ‏335 KB

Instead of reading and writing to a local variable, use a shift register. This will *dramatically* improve the performance. Right-click on the array output tunnel and select "create shift register". Then wire "Count In" to the left shift register. Finally, delete the local variables.
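In text form, the binning loop with a shift register amounts to the following sketch (Python used here as neutral pseudocode; `bin_counts` and the unsigned-sample assumption are mine, not from the post). Note that shifting a 16-bit sample right by 16 − N gives 2^N bins, while the post's shift by N gives 2^(16−N) bins; the loop structure is identical either way.

```python
def bin_counts(samples, n_bits_out, sample_bits=16):
    """Histogram unsigned samples into 2**n_bits_out bins via a right shift."""
    shift = sample_bits - n_bits_out          # e.g. 16-bit data, 256 bins -> shift by 8
    hist = [0] * (1 << n_bits_out)
    for s in samples:
        hist[s >> shift] += 1                 # the shifted value is the bin index
    return hist
```

In LabVIEW, the point of the reply above is that `hist` should live in a shift register carried across loop iterations; reading and writing it through local variables forces a copy of the whole array on every sample.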

  • Need advice on best way to acquire multiple vibration waveforms from a PXI-4472 board

I'm using a PXI-4472 board to acquire vibration data from 8 proximity probes, each with different scaling. I'm confused as to the best approach to achieve this, given all the scaling options in MAX, LabVIEW, and its Sound & Vibration toolset. Reading other posts, it appears that 24-bit acquisition also requires some special treatment.
    This acquisition needs to be done continuously.
    As an additional side note, I will eventually need to synchronize these channels with the channels on 2 other 4472 boards.
    Any advice on how to use MAX in combination with LabView's AI CONFIG, START, READ, CLEAR and scaling
    for this application would be greatly appreciated.
    All the best,
    Hunter

    Hello Hunter --
    In regards to your questions...
    For scaling, I would recommend using either MAX or the Sound and Vibration Toolset (SVT) scaling, but not both together. The SVT is a bit more flexible, and in general I prefer that. One other thing to keep in mind about MAX is that virtual channels for accelerometers and microphones provide an option to control IEPE excitation, but in fact this can only be set from LabVIEW for the 4472.
    24 bit acquisition does not require any special treatment if you are only reading the scaled (i.e. voltage or engineering unit) data. If you use the binary array option for AI read, you will only receive the top 16 bits.
If you use the SVT scaling, you should not worry about MAX at all. If you do define virtual channels in MAX for scaling, you should simply wire a channel array with those channel names into AI Config. Other than this, what you do in MAX should not change how you write the LabVIEW code.
    Please see the attached example for synchronizing PXI-4472s.
    Hope this helps!
    Bryan
    Attachments:
    PXI_4472_Continuous_Synchronized_AI.vi ‏156 KB

  • Imaq C byte alignment

    Hello,
I am using a linear CCD camera which supports 8-, 10- and 12-bit acquisition. The camera is currently set to 12 bits, and I am retrieving the data from a C program, compiled against imaq.lib, with the functions defined in niimaq.h.
However, I can't figure out exactly how the data is encoded in the output array. The call is to imgGrab(Sid, &ImaqBuffer, TRUE), where ImaqBuffer is Int8 *, and therefore the 2048 pixels of the camera are encoded in a 4096-element 8-bit array.
Now I was assuming that the 12 bits are encoded such that ImaqBuffer[2*i] corresponds to the MSBs and ImaqBuffer[2*i+1] to the LSBs, and therefore (((ImaqBuffer[2*i] << 8) | ((ImaqBuffer[2*i+1])) ) >> 4) & 0xFFF would represent the actual pixel value.
But all values retrieved this way are multiples of 4, which means that either I'm doing it wrong, or the output is only 8 bits.
    I haven't found this information in the docs, but perhaps I got a bit lost in it.... so any help would be greatly appreciated.
    Thanks,

    BruceAmmons wrote:
Try treating the value as a 16-bit value to begin with.  Get rid of the shift on the LSB and the AND with 0xFFF.  Take a look at those values and you will be able to figure out how it works.  I would expect 8 bits of data in the LSB and 4 in the MSB, which is the opposite of how you did it.
    Bruce
    Thanks for your answer.
I get apparently good values by using only an 8-bit shift of the MSB followed by an OR with the LSB: (ImaqBuffer[2*i] << 8) | ImaqBuffer[2*i+1].
However, I systematically get overflows when too much power reaches the camera (the signal returns to 0). But I hardly see why the numbers don't fit in these two 8-bit bytes (the camera is supposed to support 10 and 12 bits per pixel). Any clue?
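One thing worth checking here: with an Int8* buffer in C, the expression (buf[2*i] << 8) | buf[2*i+1] sign-extends any LSB whose high bit is set, which corrupts the OR. Reading the buffer through an unsigned 16-bit view avoids that. A sketch of the two interpretations (Python here for illustration; `unpack_pixels` is my name, and which byte order applies depends on your frame grabber, not on anything stated in the thread):

```python
import struct

def unpack_pixels(raw, little_endian=True):
    """Interpret a byte buffer as unsigned 16-bit pixels, 2 bytes per pixel.

    Going through an unsigned view sidesteps the C pitfall where an Int8
    low byte with its high bit set sign-extends before the OR.
    """
    fmt = ("<" if little_endian else ">") + str(len(raw) // 2) + "H"
    return list(struct.unpack(fmt, raw))
```

Comparing the little-endian and big-endian results against a known test pattern (e.g. a saturated frame, which should read 0x0FFF everywhere in 12-bit mode) is a quick way to settle which byte holds the MSBs.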

  • 64-Bit Labview Vision Acquisition

I am currently in the process of upgrading from LabVIEW 7.1 to LabVIEW 2011. I have been able to get everything working with LabVIEW 2011 (32-bit). I have installed Vision Acquisition and all works well with LabVIEW 2011 (32-bit). When I try to use LabVIEW 2011 (64-bit), I no longer have the subVIs that are in the Vision and Motion palette. I was wondering if LabVIEW 2011 (64-bit) needs an additional download in order to allow my Vision Acquisition toolbox to work.
    Thanks,
    FChiragh

    Go to:
    START > Control Panel > Programs & Features
    Then double-click on "National Instruments Software" and remove everything related to Vision and IMAQ.
    Then reboot your PC.  And then reinstall it.
    When you reinstall it, make sure the installer indicates that the software will be installed to "C:\Program Files\..." and not "C:\Program Files(x86)\..."

  • Vision acquisition 12 bit

    Hello
I am using a Hamamatsu FireWire camera with 12-bit output, but I can't make it work in 12-bit mode: the image acquisition window in LabVIEW always shows just the 8-bit data. It would be nice to use 12-bit data for our research. Does anybody have some information to help me make this work?
    Yours faithfully

After further experimentation, we can get no images using the DCAM-API from Hamamatsu; acquiring an image with more than 8 bits is not possible when using the standard driver for the camera.
With the DCAM-API driver we get 12-bit images using the HCImage program supplied by Hamamatsu.

  • Creating a PR for acquisition of an asset

    Hi everyone,
This week some friends and I were talking about the process of buying an asset, and we're a little bit concerned about the situation below.
A user created a request at IMA11 to buy some laptops and then created an internal investment order related to the IM request made before. So far everything is fine.
He called our asset sector and asked them to create an asset number related to the internal order he had created. With the asset number, the user went to the ME51N transaction to create the PR using the asset number provided by the asset sector.
At that point, for some reason, he didn't notice and, instead of buying laptops, he bought some chairs, and the system didn't alert him that the asset number was for laptops, not chairs.
Is this correct, or does the system really not provide that kind of check, so that an asset number created for a particular demand can be used to buy anything the user wants?
Another thing: we've noticed that the asset number can be used more than once. Is this correct as well?
PS: In our company IM management isn't turned on; I mean, investment acquisitions don't check the budget.
    Thanks a lot for your answers.

    Hi AP,
    When the user is creating the request via IMA11, there is a tab called Variant.
In this tab, there is an option called Plan Values. In this option the user can assign that the request is related to a material by inputting the material code. But, based on what you've said, I realize that there isn't any constraint between the material code chosen in the IMA11 transaction and the material code chosen in ME51N.
    Thanks.
    Best Regards,
    Rodrigo

  • Data acquisition from SR 530 lockin

    Hello,
    I need to read the output of the sr530 vector lockin and store them in a file.
    I was wondering if there is any driver which will let me read frequency, output amplitudes & relative phase between the input & output signal!
    Any help will be much appreciated!
    Thanks a lot,
    Ashutosh.

    Thanks a lot for the help Matt.
I can connect the device via GPIB. Let me tell you a bit more about the problem. Until now I was using a single-channel lockin, EG&G 5209, to read the data (so I am kind of familiar with data acquisition... not much though). And the only thing I was reading was the output amplitude of the lockin.
Now I have to use the vector lockin, SR 530, and I will need to read both output amplitudes, the phase and the frequency (all four of them) at the same time. And I am looking for a driver which can do this, i.e. which can read all four values simultaneously and store the result in a *.txt or *.dat file.
    Do we have something like that?
    Thanks,
    Ashutosh.

  • Data acquisition time not continuous, but starting over and over again

    Hi,
    My data acquisition time is not continuous but loops over and over
    again.  Could someone please take a look at my vi to see if you
    have any thoughts on how to fix this.  Also, I tried to set up my
    vi so that I could take the mean of the signal at any time I
    choose.  However, at the same instance that the data acquistion
    time "starts over" again, the mean is calculated (before I trigger it to take the mean)
    I have attached my vi and my output file as a text file so that what
    I'm trying to say will (hopefully) make a bit more sense. 
    Thanks for your help in advance!
    Sherri
    Attachments:
    Datalog1 71805.vi ‏1252 KB

    Hi,
    Thanks for responding to my question. Attached is the vi with added
    comments as to what I think I am doing.  By DAQ time, I mean the
    first column of data that is in the data file that I attached
    previously.  I am only simulating a signal (at the beginning) so
    that I can program in my office.  I do, however, have the same
    problem as stated above when I change the simulate signal to DAQmx
    using my instrument.
    I hope this clearer.  If not, I'll try again. 
    Thanks so much!
    Sherri
    Attachments:
    Datalog1 71805 83.vi ‏1228 KB
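Without seeing the VI, one common cause of a time column that "starts over" is building timestamps relative to each read instead of carrying an offset across loop iterations (in LabVIEW, the offset would live in a shift register). A sketch of the idea in Python, with invented names, as an illustration rather than a diagnosis of this particular VI:

```python
def chunk_timestamps(n_samples, dt, start_index=0):
    """Timestamps for one read, offset by how many samples came before it."""
    t = [(start_index + i) * dt for i in range(n_samples)]
    return t, start_index + n_samples         # carry the new offset forward

# Each loop iteration passes the returned offset back in, so the time
# column keeps counting up instead of restarting at zero:
#   offset = 0
#   for chunk in reads:
#       t, offset = chunk_timestamps(len(chunk), dt, offset)
```

Dropping the offset (always calling with start_index=0) reproduces exactly the restarting-time symptom described above.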

  • Image acquisition not functioning after upgrading computer

    Hi all,
    I've run into a bit of a snag with our LabVIEW program. This program was written by someone that no longer works at my company and his VI scripts and installer do not exist, at least no where that we can locate. We've migrated the executable as well as the LabVIEW runtime environment over to a new Windows 7 PC (the original computer was running Windows XP). The program loads as it should but when it comes to the image acquisition within the program, the camera image is garbled and basically useless.
    Using NI MAX, I am able to detect the camera and I can set it to the settings from the original PC but the image is even scrambled inside NI MAX. The camera in question is an ATV Oscar F810C which is detected as such by NI MAX. I've updated the Vision runtime to the February 2014 build available on NI.com (about a 2GB download). Here are some screenshots of the original PC's settings in NI MAX:
    In reality, the image in NI MAX should look like this:
    On the new system with the same settings it looks like this:
    And when running the program on Windows 7 it looks like this:
    If I change the Video Mode to one of the lower res ones, such as the first greyscale 1024 x 768 option, the image stops being scrambled inside NI MAX and I'm able to see the image properly. However, the image acqusition inside the program still does the same thing. Without having any information on the contents of the program or access to the original files, it would appear to be some kind of driver issue since the setup works fine on the old system.
    Any ideas on what may be the issue?

    billko wrote:
    It may be that you have to download a WIn7 driver for the camera.  Dll(s) might be incompatible.
    Sorry, I did not see that that was one of the first things you did. 
    Bill
    (Mid-Level minion.)
    My support system ensures that I don't look totally incompetent.
    Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.

  • How to improve speed of data acquisition? Help needed urgently.

    I want to convert analog signals to digital signals and simultaneously perform some search on the data acquired and this whole process has to be done continuously for few hours.
So I tried writing two programs in MATLAB: one acquires the analog data, converts it to digital, and saves the data in small chunks to the hard disk (file1, file2, file3, ...) continuously. The other program performs the search operation on those chunks of data continuously. I run both programs at the same time by opening two MATLAB windows.
But the problem I am facing is that the data acquisition is slow. As a result I get an error message in the second program saying:
"??? Error using ==> load
Unable to read file file4.mat: No such file or directory."
I am unable to synchronize the two programs. I cannot use timers in the search program because I cannot add any delays.
I am using an NI PCI-6036E A/D board (16-bit resolution, 200 kS/s sampling rate).
Should I switch to some other series, such as M Series, with sampling rates on the order of MS/s?
Can anyone please tell me how to improve the speed of data acquisition?
Thanks.

    Hi gayathri,
well, my email is [email protected]
if you're from India, mail me back.
    Regards
    labview boy

  • I have a problem with running an EXE file on win2000, the Lab View is 5.1 and I do not know if it is 16 bit...

I have a problem running an EXE file on Windows 2000; the LabVIEW version is 5.1 and I do not know if it is 16-bit... What should I do?

    Hi Arika,
The drivers that you need to install to make your executable work depend on what your executable is doing. To get started, you need to have the LabVIEW Run-Time Engine installed on your target machine (the Win2000 machine you are planning to use) in order to run your executable. Next, you need to determine what drivers your executable uses, if any. For example, if you are using GPIB instrument control, you will need to install the NI-488 drivers on your target machine. If you are performing data acquisition, you will need to install NI-DAQ drivers. If you are doing image acquisition, you will need to install NI-IMAQ drivers.
All these drivers are available for download on ni.com. To get the drivers, go to http://www.ni.com/support , click on the link that takes you to Drivers and Updates (under Option 3), and click on the links to get to the driver(s) you need. For example, if you need the LabVIEW 5.1.1 Run-Time Engine, click on the All Drivers and Updates by Application link on the main page (http://www.ni.com/softlib.nsf/). Then click on the LabVIEW link, Windows 2000, Run Time Engine, and then you will see the link to the page where you can download the LabVIEW 5.1.1 Run-Time Engine.
    I hope this information helps.
    Best Regards,
    Wilbur Shen
    National Instruments

  • Labview with Vista 64 bit native support

I'm waiting day after day to see a new LabVIEW release supporting Vista 64-bit (I mean running natively at 64 bits, not in a 32-bit emulation layer). I expected it last year... then I waited, believing that of course it would come soon... but now I'm really panicking.
64-bit CPUs and Vista x64 have been available for a long time! What's going on?
Did I miss some release note??? Maybe it's already available and I don't know? (I hope so!)
In the meantime, Microsoft announced the discontinuation of Windows XP in June 2008 (3 months from now)!!!
Does anyone know when a full 64-bit LabVIEW version will be released? And what about the Vision libs? (I also need full 64-bit Vision libs for my applications...)
I have big memory allocation limits due to 32-bit addressing (max 2 GB) that I cannot afford anymore... not in 2008.
I write vision acquisition software using LabVIEW, IMAQ hardware and the Advanced IMAQ Vision libs.
As everybody knows, memory allocation is "generally" limited to 2 GB by a 32-bit OS (a little more with some tricks), but the maximum image allocation in LabVIEW is much smaller anyway because of memory fragmentation (LabVIEW and the Vision libs require allocating a "contiguous memory block", and generally no more than 300-500 MB are really available for a single image, even with 4 GB of memory installed in your PC! This is caused by all the DLLs and processes loaded in memory at different addresses and fragmenting it).
I spoke several times with NI concerning this problem over the last years and was told to wait... in the meantime I had to write a special "memory defragmentation" tool myself in order to free up to 1.2 GB of RAM for imaging with LabVIEW. Not enough for me, and not a clean solution anyway.
Then, after Microsoft announced Vista 64-bit (a few years ago now) and all PC manufacturers started building 64-bit architectures, I expected to solve all my problems soon thanks to the much larger usable memory addressability...
but this cannot be done yet if LabVIEW and the Vision libs run in 32-bit emulation only!!!
Furthermore, Vista 32-bit is much slower than XP and uses much more memory!!! This is really a nightmare, as I believe that for the first time I'm facing the possibility that soon I'll have to deliver a "newer" but worse product to my customers.
In fact I'll soon be forced to move from LV 7/8 to LV 8.5 (just in order to support Vista), only to be slower and have the same memory limitations (or maybe worse).
Maybe I'm wrong?
Can anyone reassure me... is LV 9 just around the corner with full 64-bit support as requested??? I'm really in a panic!
Thanks...
I'm really a LV lover but cannot wait anymore...

Dear Matt,
I was aware of the /3GB switch, and I tried it a while ago with LV 7 and XP, but without being able to allocate a larger image anyway...
I'll try again, and for sure 4 GB under Vista 64 is better than anything!!! Do you know if "Image Create.vi" (a VI included in the Vision package) is also able to allocate up to 4 GB?
Concerning memory defragging, I started with some tools available on the market (there are good ones for memory analysis, but none is really capable of defragging the memory)... then a couple of years ago I wrote a small application in LabVIEW based on 2 free tools provided on the web (free tools for developers provided by Microsoft)...
One is Listdlls.exe, a command-line tool capable of listing all running DLLs (associated with any running application); it produces a memory map of the DLL allocations, which lets you understand how memory is fragmented.
The other is also a command-line tool, named Rebase.exe, which allows you to move a DLL to a different base address in memory.
In fact, once rebased, a DLL will generally load automatically starting at the new address instead of loading in the middle of memory or at the original address... but not always, as some rare system DLLs will try to load at the same position anyway (as with the Asian Language Pack on Chinese/Japanese OSes). Furthermore, some DLLs need to be loaded at a specific address.
I wrote a program to automatically create a memory map of all running DLLs (including system ones) using Listdlls.exe and automatically rebase them above the first 128 MB of memory. Rebasing system/running DLLs is generally more difficult (because of the required access policy, and because you must plan a replacement only after the next reboot)... my software takes care of this.
Furthermore, rebasing running/system DLLs is also dangerous and can crash your system... after many tests I limited the rebasing to the memory addresses between 256 MB and 1.6 GB... I never had problems rebasing all DLLs in this range (including system ones) under XP 32-bit SP2.
I also move DLLs to free space only (not onto already occupied space). Furthermore, I never place DLLs in the first 128 MB (as that will probably crash your system). In this way I was capable of freeing a lot of memory (generally about 15-20 DLLs are relocated, and memory is free between 300 MB and 1.6 GB)... something more can be done manually at the higher memory addresses, but it takes time and must always be checked. In automatic mode my software is capable of freeing the memory in 2 seconds (but you need a reboot).
Backing up the original DLLs is also suggested, in order to be able to restore them in case of any problem. I implemented an automatic backup and a restore option for this. A manual mode allows me to move DLLs in memory by hand (but it's more dangerous, and I never allow my customers to do it).
Generally, after rebasing, the memory stays defragmented until you install a new OS patch (which replaces old DLLs).
Then you must be aware that, in order to keep the memory clean when running your application in LabVIEW, you should preallocate a large memory block when starting your application, preventing LabVIEW from allocating resources in your free memory (it must use memory outside the 300-1600 MB range). This is another problem... I found it hard to solve, and my application does a good job here but is not perfect.
P.S.: LabVIEW and the LV runtimes are a major cause of memory fragmentation, as they load right in the middle of the system memory... these are the first DLLs to be rebased!!!
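The fragmentation arithmetic behind all this is easy to sketch: given the base address and size of each loaded DLL (as reported by a tool like Listdlls.exe), the largest image you can allocate is bounded by the largest free gap between them. A toy illustration (Python; the function name and the numbers in the example are invented, not taken from the post):

```python
def largest_free_block(total, regions):
    """Largest contiguous gap in [0, total) given occupied (base, size) pairs."""
    best = cursor = 0
    for base, size in sorted(regions):
        best = max(best, base - cursor)       # gap before this module
        cursor = max(cursor, base + size)     # handle overlaps conservatively
    return max(best, total - cursor)          # gap after the last module
```

This makes the poster's point concrete: a single small DLL loaded mid-address-space can halve the biggest allocatable image, which is why rebasing the few modules sitting in the middle frees so much contiguous room.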

  • 64-bit Windows 7 Pro's Windows XP Mode, Palm Desktop 4.1.4 sync with Palm Vx?

I last talked about Palm Desktop compatibility with 64-bit Windows at http://forums.palm.com/palm/board/message?board.id=software&message.id=37767.  Fortunately, I haven't replaced my laptop yet, so I am in the enviable(?) position of planning the acquisition of a spanking new machine with 64-bit Windows 7 Pro (sidenote: likely a 14" ASUS).  However, my PDA is still a Palm Vx, which requires Palm Desktop 4.1.4, which requires 32 bits, which is provided in Windows 7 Pro by its Windows XP Mode.  Has anyone heard any word on whether XP Mode allows seamless syncing through the serial or IR ports?  Serial is most preferable, of course, due to its speed, the lack of headaches with line-of-sight alignment, and the fact that the device needs to be docked anyway for recharging.
I'm not sure if it's as dependent on the Palm Desktop version as much as it depends on the HotSync Manager, which is 4.1.0.  There are probably certain HotSync Manager versions that are compatible with Palm Desktop 4.1.4 and the Palm Vx.
    Post relates to: Palm Vx

