Scaling analog inputs

This question is generic, so I won't go into the hardware unless necessary; the range is simplified to 0-20 mA in, 0-20 scale out.
I am having problems with scaling in Lookout. My input is 0-20 mA, and the input conversion is done in BCD for a unipolar full-scale (FS) output of 16533.
My scaling in Lookout is: raw 0-16533 / eng units 0-20
The readings are like this:
Input  -  Output
0 mA   -  0
4 mA   -  2.52
12 mA  -  11.27
20 mA  -  20
So I tried converting to binary (BIN) with a unipolar FS output of 4095.
My scaling in Lookout is: raw 0-4095 / eng units 0-20
The readings are like this:
Input  -  Output
0 mA   -  0
4 mA   -  1.65
12 mA  -  12
20 mA  -  20
What am I missing?

Your numbers look quite linear to me, only a few counts off from what is expected. The values given scale just fine in my testing. The puzzle is fun, but I think it's time for you to share more about your hardware and what driver object you are using. The only hardware I have handy that uses BCD data types is an AutomationDirect PLC. Everything scales just as it should, i.e. 2084 BCD = 2.52 scaled, 3306 BCD scales to 3.99; likewise 3306 decimal scales to 3.99 and 2084 decimal scales to 2.52.
When you did your test, what were the scaled values at each step?
%      mA    Read     Scaled
0      0     0        ??
25     5     4135     ??
75     15    12405    ??
100    20    16533    ??
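
For reference, here is a minimal C sketch (the helper names are mine, not from either post) of the arithmetic under discussion: a plain linear raw-to-engineering-units conversion and a BCD register decode, using the 0-16533 raw / 0-20 eng ranges quoted above. It is not a claim about how Lookout or the PLC driver works internally, just the math.

#include <stdio.h>

/* Linear scaling: map a raw count onto engineering units. */
static double scale_raw(double raw, double raw_min, double raw_max,
                        double eng_min, double eng_max)
{
    return eng_min + (raw - raw_min) * (eng_max - eng_min) / (raw_max - raw_min);
}

/* Decode a BCD-coded register (e.g. 0x3306) to its decimal value (3306). */
static unsigned bcd_to_decimal(unsigned bcd)
{
    unsigned value = 0, place = 1;
    while (bcd) {
        value += (bcd & 0xF) * place;
        place *= 10;
        bcd >>= 4;
    }
    return value;
}

int main(void)
{
    /* 2084 raw over a 0-16533 span scales to ~2.52, matching the reply above. */
    printf("%.2f\n", scale_raw(2084, 0, 16533, 0, 20));
    /* A register holding BCD 0x3306 decodes to 3306, which scales to ~4.00. */
    printf("%.2f\n", scale_raw(bcd_to_decimal(0x3306), 0, 16533, 0, 20));
    return 0;
}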

Similar Messages

  • SbRio analog input scaling

    I have an sbRIO-9612 and I'm building an application that takes analog inputs (0-10 VDC) and scales them to an engineering unit, like psi for example. I know how to do the scaling directly on the properties of the analog input, but I was wondering if it is possible to get to those properties at run time. Basically, the inputs should be calibrated every 6 months, and if I want to calibrate the entire system from the sensor to the actual input, then I need a way to adjust the scaling or calibration of the input in LabVIEW Real-Time.
    Here is the example process:
    1) Put a known pressure on the pressure sensor, for this example 10 psi. The pressure sensor generates a voltage that goes into AI0, which scales to 10.1 psi, so I want to be able to calibrate it back to 10 psi.
    Any built-in methods to do this?

    It is unfortunate that there is no direct way to programmatically modify the scaling factors on a RIO Scan Engine I/O variable (IOV). This information is defined in the project and deployed to the target, encoded in a file called variables.xml. It is possible to decode this XML file, modify the scaling factor, replace the original file, and then reboot. I have mentioned this in a LabVIEW RT idea post but never got any traction. It seems fundamental that if you offer scaling configuration in the project, you should be able to programmatically modify it in a built application.
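    If editing variables.xml is too invasive, one workaround is to leave the deployed scaling alone and apply a two-point gain/offset correction in the application layer. Here is a minimal sketch of that idea in C; the helper names and the second calibration point (0 psi reading 0.05 psi) are assumptions for illustration, not part of the original post.

    #include <stdio.h>

    /* Two-point calibration: given the value the I/O variable reports at two
     * known reference pressures, compute a gain/offset correction that the
     * application applies on top of the deployed scaling. */
    typedef struct {
        double gain;
        double offset;
    } Calibration;

    static Calibration calibrate(double reading_lo, double actual_lo,
                                 double reading_hi, double actual_hi)
    {
        Calibration c;
        c.gain   = (actual_hi - actual_lo) / (reading_hi - reading_lo);
        c.offset = actual_lo - c.gain * reading_lo;
        return c;
    }

    static double apply(Calibration c, double reading)
    {
        return c.gain * reading + c.offset;
    }

    int main(void)
    {
        /* From the post: the channel reports 10.1 psi at a known 10 psi.
         * The 0 psi reference point below is assumed. */
        Calibration c = calibrate(0.05, 0.0, 10.1, 10.0);
        printf("corrected: %.3f psi\n", apply(c, 10.1));   /* ~10.000 */
        return 0;
    }

    The correction can be stored in a configuration file on the target, so a recalibration every 6 months only updates two numbers rather than the deployed project.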

  • Use analog input as sample clock

    Hi,
    I have a PCI 6115 DAQ card. I currently perform an analog acquisition on ai0, with an external clock on PFI7. But sometimes my clock signal is not high enough and the acquisition does not occur. At an NI show, I heard a trick to solve this problem: plug the clock into an analog input (say ai1); the clock signal gets amplified by the card's internal amplifiers, and this amplified signal can then be routed to the sampling clock. This seems to be a wonderful solution, but I cannot find out how to actually route the amplified ai1 to the sampling clock. Does someone know how to do it?
    Thanks a lot,
    Jérôme Lodewyck

    I tested the attached example on a simulated device so hopefully it will work on a real one without any kinks.  You didn't specify your programming environment, so I'm assuming you're using LabVIEW.  If not, hopefully you can translate to the appropriate ADE based on the picture of the block diagram. 
    In the example, I'm using an AO task to program the analog trigger as specified.  This has two consequences.  First, you won't be able to perform hardware timed AO while the AI acquisition is running.  If this isn't acceptable, you'll need to try the second approach described in the next paragraph.  Second, you'll have to wire the signal to PFI0 instead of an AI channel.  With this configuration, the signal will be seen with a +/- 10V range and referenced to AI Gnd.  Since the trigger DAC is an 8 bit comparator circuit for this board, you'll have ~80 mV of resolution.  You didn't mention what the amplitude or DC offset (if any) of your signal is, but hopefully this resolution will suffice.  You can use the level and hysteresis properties for the analog trigger to filter out noise in the analog signal or account for DC offset. 
    If the constraints listed above aren't to your liking, you can try to use a second AI channel as a trigger channel.  This has some advantages and disadvantages.  The disadvantages are that this requires you to use a trigger with your AI task and it also requires you to acquire another channel of data.  You mentioned the trigger wasn't a problem, so this can probably be taken care of with a simple analog start trigger.  The data can easily be thrown away, but depending on your sampling rates, it might require a lot of extra bus bandwidth or processing power when scaling the data.  On the positive side, it doesn't require you to use up your AO channels needlessly, and you can apply gain to the input signal in order to effectively increase the resolution of the trigger circuit.  You can also apply a low-pass filter and a different terminal configuration if desired.  The gain, coupling, terminal configuration, and filtering applied to the signal are controlled by the values used in the Create Channel VI and the Channel Property node.  To create an example that does this, simply start with one of the shipping examples for an Analog Start trigger, change the trigger source to one of the AI channels instead of a PFI or APFI pin, and change the clock source to the AnalogComparisonEvent as shown in the attached example.
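    For readers working outside LabVIEW, a rough equivalent of that second approach in the DAQmx C API might look like the sketch below. The device and channel names, the rate, the trigger level, and the input ranges are all assumptions; the original attachment is a LabVIEW VI, and error checking is omitted for brevity.

    #include <stdio.h>
    #include "NIDAQmx.h"

    /* ai0 is the measured signal; ai1 carries the external clock signal. */
    int main(void)
    {
        TaskHandle task = 0;
        float64 data[1000];
        int32 read = 0;

        DAQmxCreateTask("", &task);
        /* Acquire both the data channel and the clock channel; the smaller
         * range on ai1 (here +/-1 V) applies gain and so effectively
         * increases the trigger resolution. */
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai1", "", DAQmx_Val_Cfg_Default,
                                 -1.0, 1.0, DAQmx_Val_Volts, NULL);
        /* Trigger off the clock channel, then use the analog comparison event
         * as the sample clock, as described above. */
        DAQmxCfgAnlgEdgeStartTrig(task, "Dev1/ai1", DAQmx_Val_RisingSlope, 0.1);
        DAQmxCfgSampClkTiming(task, "/Dev1/AnalogComparisonEvent", 1000000.0,
                              DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000);

        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 500, 10.0, DAQmx_Val_GroupByChannel,
                           data, 1000, &read, NULL);
        printf("read %d samples per channel\n", (int)read);

        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }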
    That should do it.  Good luck with your application and post back if you have additional troubles.
    Attachments:
    AI - External Clock Using Analog Trigger Circuit.vi ‏81 KB
    AI - External Clock Using Analog Trigger Circuit.JPG ‏60 KB

  • Convert analog input signal (NI 9203) to meaningful units

    I'm very new to LabVIEW and I wanted to do something that I thought was fairly simple. Using the NI 9203 analog input module, I want to measure pressure with a sensor that has an output of 4-20 mA. Although I get some signal on my screen in the end, it's not really the pressure. How can I convert this to a meaningful signal? I also took a look at the example VIs for the module, but I don't have a clue how to use them.
    Hopefully someone can help me out.
    Jurgen

    Hi Jurgen,
    You have two possibilities, but first you must know the relation between the current generated by your sensor and the pressure measured by your sensor. Normally you should receive this kind of information with your sensor.
    First, you configure the scale in MAX (Measurement and Automation eXplorer) and associate this scale with a task. In this case, the value you receive in LabVIEW will be automatically converted into the right (scaled) units. If you are not familiar with MAX and creating scales, you can take a look at this article, the attached file, or the MAX NI-DAQmx Help (Start >> All Programs >> National Instruments >> NI-DAQ >> NI-DAQmx Help).
    Second, you create a subVI (a VI inside another VI) in LabVIEW that performs the conversion in LabVIEW itself. This is simply a VI that makes the correlation between the current and the pressure.
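    As an illustration of the second option, here is a minimal sketch of that 4-20 mA conversion written as plain C; the 0-10 bar sensor range is an assumption, so substitute the range from your sensor's datasheet.

    #include <stdio.h>

    /* Map a 4-20 mA current reading onto a pressure range (4 mA live zero). */
    static double current_to_pressure(double current_mA,
                                      double p_min, double p_max)
    {
        return p_min + (current_mA - 4.0) * (p_max - p_min) / (20.0 - 4.0);
    }

    int main(void)
    {
        printf("%.2f bar\n", current_to_pressure(4.0,  0.0, 10.0));  /* 0.00  */
        printf("%.2f bar\n", current_to_pressure(12.0, 0.0, 10.0));  /* 5.00  */
        printf("%.2f bar\n", current_to_pressure(20.0, 0.0, 10.0));  /* 10.00 */
        return 0;
    }

    The 4 mA live zero also lets you detect a broken loop: a reading well below 4 mA usually means the wiring is open rather than the pressure being negative.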
    I hope this will help you.
    Regards,
    Julien Roland - District Sales Manager
    NI Belgium - Technical Support
    Don't forget to rate a good answer

  • CAPTURE ANALOG INPUT DROP

    I have 3 analog inputs which I continuously read, displaying the scaled results on 3 gauges (voltage converted to pressure: 0 V = 0 bar, 10 V = 300 bar). The final part of my test is to record the pressure drop on one of the analog input channels from 280 bar to 0 bar, which should only take 70 ms, and display this on a graph against time. I'm struggling to find a way to capture this. I have had a look at the DAQ examples but just don't seem to get this to work. Any help would be great.

    Hi
    Check the image below; it might help you acquire the data. I think you have to sample faster than the 70 ms event when reading the channels.
    Let me know if you have problems.
    Thanks
    Viral
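    Another way to capture the transient, if your DAQ hardware supports analog triggering (this is not from Viral's image; the device name, rate, and threshold below are assumptions), is a finite acquisition with an analog reference trigger, so the driver keeps pre-trigger samples and records the whole 70 ms drop:

    #include <stdio.h>
    #include "NIDAQmx.h"

    /* Sample fast enough that the 70 ms event yields plenty of points:
     * at 10 kS/s, 70 ms is 700 samples. */
    int main(void)
    {
        TaskHandle task = 0;
        float64 data[10000];
        int32 read = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        /* Finite acquisition: 10 kS/s, 10000 samples = 1 s around the event. */
        DAQmxCfgSampClkTiming(task, "", 10000.0, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, 10000);
        /* Reference trigger: fire when the voltage falls through 9.0 V
         * (about 270 bar on a 0-300 bar scale), keeping 1000 pre-trigger samples. */
        DAQmxCfgAnlgEdgeRefTrig(task, "Dev1/ai0", DAQmx_Val_FallingSlope,
                                9.0, 1000);

        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 10000, 30.0, DAQmx_Val_GroupByChannel,
                           data, 10000, &read, NULL);
        printf("captured %d samples\n", (int)read);

        DAQmxClearTask(task);
        return 0;
    }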

  • Analog inputs no longer working on Audigy 2 ZS Platinum Pro's external dr

    I have an Audigy 2 ZS Platinum Pro. After two years of use without problems, the analog inputs on the external drive have stopped working. I get no signal from them, not even noise. All 3 inputs are silent: Line In/Mic and Line 2 in the front, and Line 3 in the back. Otherwise the drive appears to function normally - I get output from the headphone jack, and the volume control knob works. I have not tested digital inputs or outputs, as I currently have no digital devices to connect to them.
    I had made no major changes to my computer - software or hardware - when the problems began. I recently added a new SATA hard drive, but the Audigy drive did work after that.
    I've taken steps suggested in another thread on this forum and uninstalled all Creative software and drivers, used the latest driver package's uninstall program as well as Driver Cleaner, and finally manually deleted everything related to Creative and Audigy from the system registry (and rebooted quite a few times in between). I then installed the original drivers from the CD that came with the sound card (a setup that has worked before), but still the problem persists.
    Any ideas on where to even begin looking for a solution? I find it hard to believe that it's a hardware problem, since three inputs suddenly went silent at the same time, and no other connections were affected. Yet I've done as clean a reinstall as possible, and that hasn't solved the problem.
    My current computer configuration:
    OS: Windows XP Pro (SP2 and all latest updates)
    CPU: Athlon XP 500+
    RAM: 2x 52 MB
    Motherboard: Abit AN7 (integrated sound chip disabled)
    Video card: Sapphire Radeon 9800 Pro
    Network card: D-Link DFE-530TX
    Hard drive: Seagate ST320026A, 20 GB IDE
    Hard drive 2: Maxtor 6L080J4, 80 GB IDE
    Hard drive 3: Seagate ST330083, 300 GB SATA-II

    I don't have a Platinum Pro or the drive, but a general hardware troubleshooting step I would take would be to remove the sound card and either move it to another slot or re-seat it. Also, I'd disconnect and reconnect all cables to and from the card and the drive. If I had another machine I could try the card or drive in, I'd do that. If I could borrow another drive or even another card, I'd test those too. (Swapping parts is often the simplest diagnostic.) Yeah, maybe it's not a hardware problem, but I'd take these steps just to rule out some possibilities.
    Edit: And if you can remove or disconnect the SATA drive and restore the system to the condition before you installed it, that's another good test. Message Edited by Katman on 2-8-2005 2:56 AM

  • USB 6009 multiple analog inputs

    I am currently attempting to sample two different analog inputs at different sampling rates using a USB 6009.  I keep getting the 'resource reserved' error and am wondering if this is not possible using this DAQ.  Questions:
    1.  Does creating two analog input channels on the device cause this error?
    2.  Is it possible to sample at different rates on channels created in the same task?  (I am trying to 'slow down' the second analog input to display switch points to a customer.)
    3.  Would running multiple analog inputs with independent timing be better achieved by switching to a higher-end DAQ?  If so, which would you recommend?
    I have attached my vi.  Thank you in advance for your help. 
    I surf therefore I am....
    Attachments:
    demo_nolvl.vi ‏27 KB

    The DAQ boards only have one timing clock for the analog inputs/outputs, so you can only have one sample rate on a given card.  I would recommend just sampling at the highest of the desired rates in a single task.
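    A rough DAQmx C sketch of that suggestion (the device name and rates are assumptions): put both channels in one task at the faster rate and decimate the second channel in software for the 'slow' display.

    #include <stdio.h>
    #include "NIDAQmx.h"

    /* Both ai0 and ai1 go in one task because the device has a single
     * AI timing engine. */
    int main(void)
    {
        TaskHandle task = 0;
        float64 data[2000];     /* 1000 samples x 2 channels */
        int32 read = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0:1", "", DAQmx_Val_RSE,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                              DAQmx_Val_ContSamps, 1000);
        DAQmxStartTask(task);

        DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                           data, 2000, &read, NULL);

        /* "Slow down" the second channel for display by taking every 100th
         * sample (an effective 10 S/s view of the same data). The ai1 block
         * follows the ai0 block when grouped by channel. */
        for (int i = 0; i < read; i += 100)
            printf("ai1 decimated: %f\n", data[read + i]);

        DAQmxClearTask(task);
        return 0;
    }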
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • How do I use the analog input (ADC) to drive a motion profile in MAX

    We are trying to use an analog input signal from a DAQ card to control the motion control axis.  For now, we want the motor speed to follow a sinusoidal voltage waveform, and later on, we will be using a more complex velocity profile.
    In order to accomplish this task, we have searched the user forum and found information suggesting operating the servo motor in 'slave' mode under the 'gearing' option.  Therefore, we have set the Gear Master to 'ADC Channel-1', Mode to 'absolute', and Gear ratio to 1:1, and provided the sinusoidal voltage (+/- 5 V, 3 Hz) to the AIN1 connector on the MID-7654 controller.  When using the 1-D Interactive 'single axis' control panel, the motor runs at constant speed in one direction only.  It does not react to the sinusoidal speed profile request as we expected.
    How do we 'turn on' the 'slave' mode in MAX?  Could you please talk us through the necessary steps and settings?  Thanks.
    Staffan

    Staffan,
    you can configure and enable the Gearing mode in MAX, and it should be activated after saving the settings and initializing the board. In 1-D Interactive you can't use Gearing. For better flexibility, I recommend not using MAX for configuring and enabling Gearing; you should do this in your application instead.
    In fact, there is a perfect LabVIEW example that ships with the NI-Motion driver (Master Analog Input - Slave Axis Gearing.vi).
    I hope this helps,
    Jochen Klier
    National Instruments
    Message Edited by Jochen on 10-02-2007 08:37 AM
    Attachments:
    gearing.jpg ‏162 KB

  • I am getting a -50101 error when trying to get analog input from a compact rio

    I am just doing the initial setup of my CompactRIO system and have been able to successfully add the cRIO in MAX and then into a new project.  I created a very simple VI with an analog input that I am trying to read into an indicator, following one of the tutorials.  The module I am reading the input from is a cRIO-9201.  The input is voltage.  I have already added the module to the project under the FPGA target, which is under the RIO in the project tree.  Any ideas what I may be missing would be greatly appreciated!
    Thanks

    The 9201 C Series module needs to be created under the FPGA Target (cRIO-910x) in the LabVIEW Project. You can either discover the module or create it by type. In case you haven't created the FPGA Target (cRIO-910x) under the cRIO Controller (cRIO-900x), you must do that first. You also have the option of discovering the cRIO-910x or creating it by type. I suggest you do it through discovery, so you don't need to manually configure the addresses.
    As for how to create the items, that's done by right-clicking on the cRIO-900x and selecting New >> Targets and Devices.... This pops up a dialog in which you expand FPGA Target, and it will discover the FPGA Target. Similar steps are followed to create the 9201 module.
    If you don't see the FPGA Target and C Series module options, then you need to make sure NI-RIO 2.0 is installed on your computer.
    JMota

  • How do I acquire multiple signals in the NI cDAQ 9172 using 2 analog input modules?

    Hi everyone,
    Is anyone familiar with using the NI cDAQ-9172?  This is my first time using it and I am not sure what exactly I am doing wrong... With a single NI 9233 analog input module alone, it works great.  I can grab my 4 signals from each channel and go.  However, when I add another analog input module to the mix, I get an error.  It looks like it is reading only one module and not the other.  It fails at one of the DAQ Start Task and Read Task calls.
    Basically, the block diagram is just a duplicate of the one where the 9233 works alone.  Is there something else needed to make the 9172 work with both inputs?  Any ideas?
    Any help is much appreciated.  Thanks!!

    Hi Jud,
    Both threads are correct.  The cDAQ-9172 has a single analog input timing engine, so both of your analog input modules will need to be in a single task.  The other VI you referenced shows one analog input task (with channels added from two modules) as well as an analog output task.  Analog output has a separate timing engine from analog input, so both of those can run in parallel independent tasks.
    The beginning of this thread is a good example; a single DAQmx Create Task followed by a DAQmx Create Virtual Channel for channels from each module.  Also, Getting Started with NI-DAQmx will give you the fundamentals for data acquisition, though I don't know how many of their examples use CompactDAQ.
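    In DAQmx C terms, that single-task arrangement looks roughly like the sketch below (the module names, channel counts, input ranges, and rate are assumptions):

    #include <stdio.h>
    #include "NIDAQmx.h"

    /* One AI task containing channels from two C Series modules, sharing the
     * chassis's single analog input timing engine. */
    int main(void)
    {
        TaskHandle task = 0;
        float64 data[8000];   /* 1000 samples x 8 channels */
        int32 read = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "cDAQ1Mod1/ai0:3", "", DAQmx_Val_Cfg_Default,
                                 -5.0, 5.0, DAQmx_Val_Volts, NULL);
        DAQmxCreateAIVoltageChan(task, "cDAQ1Mod2/ai0:3", "", DAQmx_Val_Cfg_Default,
                                 -5.0, 5.0, DAQmx_Val_Volts, NULL);
        DAQmxCfgSampClkTiming(task, "", 25000.0, DAQmx_Val_Rising,
                              DAQmx_Val_ContSamps, 1000);

        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                           data, 8000, &read, NULL);
        printf("read %d samples per channel from 8 channels\n", (int)read);

        DAQmxClearTask(task);
        return 0;
    }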
    Regards,
    Kyle

  • Synchronizing two counter frequency inputs with multiple analog inputs

    Hello all,
    I'm fairly new to LabVIEW and I'm trying to collect data from multiple sources with synchronized timing on the acquisition, but I'm having trouble figuring it out. My problem is that I've got two counter frequency inputs: one optical tachometer reading one pulse per revolution, and a Max Machinery flow meter with a K factor of 12000. I can't seem to figure out how to sync their timing with my multiple analog inputs. I've been attempting to get the tachometer to sync with the analog inputs first by following the example linked here (https://decibel.ni.com/content/docs/DOC-10785). So far, each time I run it I either get a timeout error on the DAQmx Read or a "Multiple sample clock pulses were detected" error (see attached image). It seems that if I slow the sampling rate way down to say 10 Hz and ensure that the tachometer signal is over 800-1000 RPM (13-17 Hz) before starting the VI, then the program will run without errors until the RPM drops below that threshold, at which point the "Multiple sample clock pulses" error occurs. The code is attached below.
    Does anyone know of a more effective way of syncing counter frequency inputs with analog inputs? I'd like to have a VI that can show 0 RPM (and eventually 0 flow as well, but I think I need to figure out the timing of one counter before I add another, as it seems I can't have two counters in the same task). Any help on this would be greatly appreciated.
    LabVIEW version 13.0
    cDAQ-9178 Chassis with NI 9401 for the two counter inputs and NI 9205 for the analog inputs.
    Thanks!
    Richard
    Attachments:
    SimpleDAQ.vi ‏44 KB
    LV_Error.JPG ‏31 KB

    Maybe the third time's the charm?
    So I've finally got a good handle on why the VI is having problems at low RPM, though I'm somewhat embarrassed by how long it took me to get there.
    Because I have the counter timing synced to my analog input task, if the counter doesn't see at least two pulses between two clock pulses set by the analog input task, I get the -201314 "Multiple sample clock pulses" error. This seems fine at first, as it just sets a minimum RPM that I can measure, and it's well below the area I'm interested in, so no problems there.  I tried a simple error handler that would clear the error when it happened, assuming the loop would keep iterating until the RPM went above that minimum, at which point I would get a signal again. This is not the case: the read function just continues to spit out the -201314 error even after the RPM is back in the readable range. So then I tried adding two case structures so that when the error occurred it would stop the task, clear the error, and then start the task again on the next loop iteration (code attached). This also doesn't work, as the error shows up again on the stop task and then AGAIN on the start task on the next loop iteration. It seems this error is not actually being cleared, and once it happens it stays with the task regardless of what the error cluster is carrying.
    Anyone have any ideas?  The only solution I can think of is to just clear all tasks and recreate them each loop iteration until the RPM is readable again, but that strikes me as a horribly clunky solution.
    Richard 
    Attachments:
    SimpleDAQ_1_Start Stop.vi ‏48 KB

  • Audigy 4 Pro Hub Analog inputs problem

    Analog inputs on the Audigy 4 Pro external hub seem to be unsupported in Windows 7. Am I wrong?
    I installed the latest drivers, set the mixer to monitor sound from Line In, and also tried Windows ASIO, but I am unable to get any sound from Line In/Mic In.
    Please help me clarify this issue; I've been trying to find an answer on different forums for hours! Thanks

    Issue solved... Line In/Mic is now working. Audigy 4 Pro analog input is supported using the latest Windows 7 drivers.

  • Scanning of analog inputs in PXI 7831R FPGA

    Hi all,
    I am new to the LabVIEW FPGA Module. I am using LabVIEW 7.1.1 and LabVIEW FPGA Module 1.1, with a PXI-7831R FPGA card.
    I developed a program which scans analog inputs at a given scan rate for a given scan duration. I gave it an input pulse signal with a 1 s period and 2 V amplitude.
    If I scan one analog input with a 10 ms scan rate for a 1000 ms scan duration, I get correct values. But if I scan 2 or more analog signals at the same time, then I get multiples of the period. Also, if I increase or decrease the scan rate, I get strange values. Could anybody please check my code and help me?
    Thanks in Advance.
    Regards,
    Sashi
    Attachments:
    AnlogIn_FPGA.zip ‏247 KB

    Customise your front panel with advanced picture creation methods.
    Attachments:
    SUF.ctl ‏20 KB

  • How to get signal from analog input and send it to analog output (real-time​)

    Hi everyone,
    I am doing a simple task in Visual C++ and I am using a PCI-6221 (37-pin).
    Basically, I want to send the same signal from the analog input to the analog output
    at (almost) the same time, to make it a real-time application.
    Can someone please provide me with a sample program?
    I would also appreciate it if you could point me to a good tutorial which explains,
    step by step, everything about programming NI-DAQmx in C/C++.
    Best Regards,
    Khassan

    This is my code in C++; you can optimize it if it looks too messy. This code reads a signal from the analog input and outputs it through the analog output.
    To make this code work, the NI-DAQmx include directories and library directories must be added to the project.
    I hope it helps someone.
    #include <stdio.h>
    #include <conio.h>
    #include "NIDAQmx.h"
    #include <math.h>
    #define DAQmxErrChk(functionCall) { if( DAQmxFailed(error=(functionCall)) ) { goto Error; } }
    int main(int argc, char *argv[])
    {
        int32       error = 0;
        TaskHandle  taskHandleRead = 0, taskHandleWrite = 0;
        int32       read = 0;
        float64     dataRead[1000];
        char        errBuffRead[2048] = {'\0'};
        char        errBuffWrite[2048] = {'\0'};
        bool32      done = 0;
        int32       written;

        /* Analog input task: ai0, continuous acquisition at 100 S/s. */
        DAQmxErrChk (DAQmxCreateTask("", &taskHandleRead));
        DAQmxErrChk (DAQmxCreateAIVoltageChan(taskHandleRead, "Dev1/ai0", "",
                     DAQmx_Val_Cfg_Default, -10.0, 10.0, DAQmx_Val_Volts, NULL));
        DAQmxErrChk (DAQmxCfgSampClkTiming(taskHandleRead, "", 100.0,
                     DAQmx_Val_Rising, DAQmx_Val_ContSamps, 0));

        /* Analog output task: ao0, clocked from the AI sample clock so the
           regenerated signal stays synchronized with the acquisition. */
        DAQmxErrChk (DAQmxCreateTask("", &taskHandleWrite));
        DAQmxErrChk (DAQmxCreateAOVoltageChan(taskHandleWrite, "Dev1/ao0", "",
                     -10.0, 10.0, DAQmx_Val_Volts, NULL));
        DAQmxErrChk (DAQmxCfgSampClkTiming(taskHandleWrite, "ai/SampleClock", 100.0,
                     DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000));

        DAQmxErrChk (DAQmxStartTask(taskHandleRead));
        /* If the AO task refuses to start with an empty buffer on your setup,
           write the first block of samples before starting it. */
        DAQmxErrChk (DAQmxStartTask(taskHandleWrite));

        /* Read one sample at a time and immediately write it back out,
           until a key is pressed. */
        while( !done && !_kbhit() )
        {
            DAQmxErrChk (DAQmxReadAnalogF64(taskHandleRead, 1, 10,
                         DAQmx_Val_GroupByScanNumber, dataRead, 1000, &read, NULL));
            DAQmxErrChk (DAQmxWriteAnalogF64(taskHandleWrite, read, 0, 10.0,
                         DAQmx_Val_GroupByChannel, dataRead, &written, NULL));
        }
        _getch();

    Error:
        if( DAQmxFailed(error) )
        {
            DAQmxGetExtendedErrorInfo(errBuffRead, 2048);
            DAQmxGetExtendedErrorInfo(errBuffWrite, 2048);
        }
        if( taskHandleRead != 0 )
        {
            DAQmxStopTask(taskHandleRead);
            DAQmxClearTask(taskHandleRead);
        }
        if( taskHandleWrite != 0 )
        {
            DAQmxStopTask(taskHandleWrite);
            DAQmxClearTask(taskHandleWrite);
        }
        if( DAQmxFailed(error) )
        {
            printf("DAQmx Error: %s\n", errBuffRead);
            printf("DAQmx Error: %s\n", errBuffWrite);
        }
        printf("End of program, press Enter key to quit\n");
        getchar();
        return 0;
    }

  • Analog Input and Output in One Single VI

    I need help in setting up both analog input and output in one single VI. How do I assign channels to be either input or output? How do I simultaneously use both in one single VI with a while loop structure? Which AO am I supposed to use to obtain the signal from the function generator I have built to feed into the DAQCard-1200? Help!!
    Attachments:
    Test1.vi ‏48 KB

    One thing you'll need to be aware of is that you will need two DMA lines: one for AI and one for AO. If you don't have two, you can configure the DAQCard to do without DMA using the Config VI. But you certainly can do this.
    As far as your function generator goes, you will want to do a buffered analog output. You write your points to the buffer and then tell NI-DAQ how fast to update your analog output channel with those values.
    So, you can be reading from AI and checking the AO process in the same while loop. Just make sure you handle the while loop execution (the way it exits) correctly. This can get tricky when you're doing two types of measurements.
    J.R. Allen
