Signal Input and Histograms

I am trying to input a signal from an external noise generator into one of the ContAcq&Graph.vi examples and also take a histogram of the signal. The problem I am having is wiring the Histogram.vi correctly: its X input is DBL, while all of the outputs of the signal are SGL, so I can't make the connection. Thanks for the help.

What kind of data acquisition card are you using? There should not be a problem with the wiring between the nodes you are working with, since LabVIEW will automatically coerce the data type to the correct format. I suspect you have a DIMENSION conflict. Enable your context help, and then pause on the wire you are trying to connect. It should tell you exactly what the conflict is. Do you have an example of the VI you're working on?
Eric
Eric P. Nichols
P.O. Box 56235
North Pole, AK 99705
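
For readers coming from text-based languages, what a histogram VI computes is, roughly, the following C sketch (the bin limits and bin count are parameters here, not Histogram.vi's actual terminals):

#include <string.h>
/* Count how many samples of a single-precision signal fall into each of
   nbins equal-width bins spanning [lo, hi). */
void histogram(const float *x, int n, double lo, double hi,
               int *counts, int nbins)
{
    memset(counts, 0, nbins * sizeof *counts);
    for (int i = 0; i < n; i++) {
        /* x[i] is SGL (float); mixing it into double arithmetic is the same
           implicit widening that LabVIEW's coercion dot performs. */
        int b = (int)((x[i] - lo) / (hi - lo) * nbins);
        if (b >= 0 && b < nbins)
            counts[b]++;
    }
}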

Similar Messages

  • How to get signal from analog input and send it to analog output (real-time)

    Hi everyone,
    I am doing a simple task in Visual C++, using a PCI-6221 (37-pin).
    Basically, I want to send the same signal from the analog input to the analog output
    at (almost) the same time, to make it a real-time application.
    Can someone please provide me with a sample program?
    I would also appreciate a good tutorial that explains,
    step by step, how to program NI-DAQmx from C/C++.
    Best Regards,
    Khassan

    This is my C++ code; you can optimize it if it looks too messy. It reads the signal from the analog input and writes it to the analog output.
    To make this code work, the NI include and library directories must be added to the project.
    I hope it helps someone.
    #include <stdio.h>
    #include <conio.h>
    #include <math.h>
    #include "NIDAQmx.h"
    #define DAQmxErrChk(functionCall) { if( DAQmxFailed(error=(functionCall)) ) { goto Error; } }
    int main(int argc, char *argv[])
    {
        int32 error=0;
        TaskHandle taskHandleRead=0, taskHandleWrite=0;
        int32 read=0;
        float64 dataRead[1000];
        char errBuff[2048]={'\0'};
        bool32 done=0;
        int32 written;
        // Continuous analog input on Dev1/ai0 at 100 S/s
        DAQmxErrChk (DAQmxCreateTask("",&taskHandleRead));
        DAQmxErrChk (DAQmxCreateAIVoltageChan(taskHandleRead,"Dev1/ai0","",DAQmx_Val_Cfg_Default,-10.0,10.0,DAQmx_Val_Volts,NULL));
        DAQmxErrChk (DAQmxCfgSampClkTiming(taskHandleRead,"",100.0,DAQmx_Val_Rising,DAQmx_Val_ContSamps,0));
        // Continuous analog output on Dev1/ao0, clocked from the AI sample clock
        DAQmxErrChk (DAQmxCreateTask("",&taskHandleWrite));
        DAQmxErrChk (DAQmxCreateAOVoltageChan(taskHandleWrite,"Dev1/ao0","",-10.0,10.0,DAQmx_Val_Volts,NULL));
        DAQmxErrChk (DAQmxCfgSampClkTiming(taskHandleWrite,"ai/SampleClock",100.0,DAQmx_Val_Rising,DAQmx_Val_ContSamps,1000));
        DAQmxErrChk (DAQmxStartTask(taskHandleRead));
        DAQmxErrChk (DAQmxStartTask(taskHandleWrite));
        // Forward each sample from input to output until a key is pressed
        while( !done && !_kbhit() ) {
            DAQmxErrChk (DAQmxReadAnalogF64(taskHandleRead,1,10,DAQmx_Val_GroupByScanNumber,dataRead,1000,&read,NULL));
            DAQmxErrChk (DAQmxWriteAnalogF64(taskHandleWrite,read,0,10.0,DAQmx_Val_GroupByChannel,dataRead,&written,NULL));
        }
        _getch();
    Error:
        if( DAQmxFailed(error) )
            DAQmxGetExtendedErrorInfo(errBuff,2048);
        if( taskHandleRead!=0 ) {
            DAQmxStopTask(taskHandleRead);
            DAQmxClearTask(taskHandleRead);
        }
        if( taskHandleWrite!=0 ) {
            DAQmxStopTask(taskHandleWrite);
            DAQmxClearTask(taskHandleWrite);
        }
        if( DAQmxFailed(error) )
            printf("DAQmx Error: %s\n",errBuff);
        printf("End of program, press Enter key to quit\n");
        getchar();
        return 0;
    }

  • No signal from built-in input and MIDI crashing

    I've been trying to use GB to work out some rough ideas for a track. The idea was to drag and drop the stereo track into GB and then just plug the guitar straight into the computer using the 1/8" input. When I do that, no signal is getting to GB.
    I've checked the "Sound" setting under "System Preferences", tried adjusting the preferences within GB, and also tried adding a "new basic track" and selecting "built-in input". In all cases, no input signal reaches the meters within GB.
    The other thing that's going on, and I think it might be at the root of the problem, is that before updating to 10.6 I had a Digi-02 at my disposal and had Pro Tools LE on the computer. The version I had was not compatible with 10.6, so I removed it, but I think some traces are still on the computer, as every time I open GB I get a "CoreMIDI.framework" message and the audio/MIDI crashes. Reading the details box, I'm getting the following:
    "Process: MIDIServer [27096]
    Path: /System/Library/Frameworks/CoreMIDI.framework/MIDIServer
    Identifier: com.apple.audio.midi.CoreMIDI
    Version: 1.7 (42)
    Code Type: X86 (Native)
    Parent Process: launchd [97]
    PlugIn Path: /Library/Frameworks/DirectIO.framework/DirectIO
    PlugIn Identifier: com.digidesign.framework.DirectIO
    PlugIn Version: 7.1.1 (7.1.1f86)
    Date/Time: 2010-07-18 12:17:04.696 -0700
    OS Version: Mac OS X 10.6.4 (10F569)
    Report Version: 6
    Interval Since Last Report: 216034 sec
    Crashes Since Last Report: 11
    Per-App Interval Since Last Report: 858 sec
    Per-App Crashes Since Last Report: 10"
    I've tried doing a search for any files with "digi" or "ProTools" in the name to delete, but I'm getting no culprits. Any suggestions would be appreciated.
    dp

    Spotlight doesn't search Library folders; maybe that's why you don't find anything. Take a look at these two folders and remove anything that you don't need (i.e., anything you don't recognize; in a basic setup, there should be nothing in there):
    HD/Library/Audio/MIDI Drivers
    HD/Library/Audio/Plug-Ins/Components

  • How to temporally match/save Input and Output Channels's data?

    Hello, I have a voltage-input, voltage-output SISO system and need to identify its dynamic response (or transfer function) as a reference for a PID control. Without using the system identification toolbox, my goal is to generate a sine sweep input (1 Hz-5 kHz over 10 s) for the system and to save the corresponding output response signal as well as the sweep input signal simultaneously.
    I found a sample program online and am trying to modify it (attached), but I would appreciate your advice and comments on several problems I am facing:
    1) With the settings below, the number of acquired input-channel samples is 5461, which is smaller than expected (i.e., 10000). What am I missing in the parameter settings?
    - H/W: NI USB-6341 DAQ
    - AO: continuous samples with waveform timing (1 kS/s, 10000 samples => 10 s; slower sampling just for testing purposes)
    - AI: continuous samples, samples to read: 30k, rate: 1k
    2) I am using a flat sequence structure as a way of making the start points of the saved AI and AO data the same, but I wonder if this is the right method or if there are better approaches. (I had an idea of using an internal clock with "1-sample mode" for sync, but this may not work at a high-speed sine sweep like 5 kHz, right?)
    3) I want to provide the sine sweep to the system just once and do not need the "reset" functionality implemented in the original sample program. I failed to remove this "reset" part, as I did not fully understand the sample code. If I run the attached VI, the generated AO signal is provided to the system periodically. Please give me advice on modifying the program as I want.
    Thanks for your help and valuable time.
    Attachments:
    siso.vi 230 KB

    Hi J. Kim,
    To begin, you say you want to synchronize your analog input and output so that they start at exactly the same time, yes? To achieve tight synchronization, you need to use the DAQmx VIs instead of the DAQ Assistant. There's a good overview of the DAQmx VIs here. There's also a document that deals specifically with synchronization in DAQmx here. Additionally, if you go to Help>>Find Examples in the LabVIEW Example Finder, you can see many other examples of acquiring data using DAQmx.
    As for your configuration, you have your analog input DAQ Assistant wired far before your analog output DAQ Assistant in your sequence structure, so the dataflow of the program will cause the analog input DAQ Assistant to execute before the output. They cannot be in different frames if they are to execute simultaneously, and I would not use flat sequences here except to start the two tasks in DAQmx. Where did you find this example?
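    On question 3, the usual way to get a one-shot sweep is to precompute the whole chirp as a single finite buffer and write it once with a finite-samples AO task, so there is no periodic repeat. A minimal C sketch of the buffer math (the function and parameter names are illustrative, not from the attached VI):
    #include <math.h>
    /* Fill buf with a linear sine sweep from f0 to f1 Hz over n samples
       at sample rate fs, e.g. the 1 Hz - 5 kHz, 10 s sweep described above. */
    void linear_chirp(double *buf, int n, double fs,
                      double f0, double f1, double amp)
    {
        double T = n / fs;                 /* sweep duration in seconds */
        for (int i = 0; i < n; i++) {
            double t = i / fs;
            /* instantaneous phase = 2*pi*(f0*t + (f1 - f0)*t*t/(2*T)) */
            double phase = 2.0 * 3.14159265358979 *
                           (f0 * t + (f1 - f0) * t * t / (2.0 * T));
            buf[i] = amp * sin(phase);
        }
    }
    Writing that buffer as a finite-samples output plays the sweep exactly once, which removes the periodic "reset" behaviour you describe.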
    Best,
    Dan N
    Applications Engineer
    National Instruments 

  • "No Signal Input" on external VGA monitor

    About a year ago, the four-year-old eMac I gave my dad started having the video on the built-in display go out periodically. When it started happening more and more, rather than get a whole new computer, we purchased a 22" widescreen VGA monitor and hooked it up via a mini-DVI to VGA adapter. Everything was great and worked perfectly. Just last week, all of a sudden, the VGA monitor is displaying "No Signal Input". The eMac is starting up and making all of the correct sounds, but there's no picture at all.
    I've tried booting from a disk, and that was unsuccessful. I've connected the monitor to a different computer (an old Dell laptop) and the monitor displayed fine.
    What are the odds that the mini-DVI to VGA adapter just went bad?
    The next thing I was going to try was resetting the parameter RAM.
    Any other suggestions?
    My monitor also has a DVI input. Should I try connecting to that instead? (Although I'd have to buy a new adapter for DVI.)
    THANKS

    Hey Kat,
    What are the odds that the Mini-DVI to VGA adapter just went bad?
    That could happen. If you jiggle the plug you might get it to come back. I just had that problem on an iMac G4.
    Richard

  • Audio input and output

    I am trying to get audio to play through from the audio input to the audio output. I have plugged a line-level keyboard into the audio input, and as I play it, the Sound panel of System Preferences shows plenty of input level. However, there is no sound coming out. I have tried speakers, headphones, and the Griffin iMic. If I use a program like SoundStudio 3 and click the audio "thru" button, it works fine, but it does not work without that.
    Is there a THRU switch that I am missing somewhere? I tried Applications/Utilities/Audio MIDI Setup to configure it, but the Thru button at the bottom is greyed out (not in operation).
    Thanks.
    2 GHZ intel core duo   Mac OS X (10.4.8)  

    Hi Rich,
    Welcome to the Apple forum.
    In my opinion, I don't think OS X has the capability to send the audio input "THRU" to the output directly (I could be wrong).
    You need to use GarageBand or third-party software, like you mentioned before, to allow it.
    Maybe if you connect a third-party device that is "supported", the signal becomes audible and you can hear it, like an Mbox using a Core Audio driver.
    http://www.apple.com/macosx/features/coreaudio/
    Good luck.

  • Java Input and Output streams

    I have maybe a simple question, but I can't really figure out how to solve this problem.
    I have two applications (one on a mobile phone, J2ME; one on a computer, J2SE). They communicate with input and output streams. Everything is OK, but all communication is in sequence, for example,
    from the mobile phone:
    out.writeUTF("GETIMAGE");
    getImage();
    from the computer:
    reply = in.readUTF();
    if (reply.equals("GETIMAGE")) sendimage();
    But I need to include one simple thing in my applications: when the phone rings, the MIDlet's pauseApp() function is called, and I need to send some signal to the computer when that happens. But how can I catch this signal in J2SE? Maybe the phone rings while the computer is sending a byte array, and then suddenly it receives the command "RINGING"...?
    Please explain how to correctly solve such a problem.
    Thanks,
    Ervins

    Eh?
    TCP/IP is not a multiplexed protocol. And why would you need threads or polling to decipher a record-oriented input stream?
    Just send your images in packets with a type byte (1=command, 2=image, &c) and a packet length word. At the receiver:
    int type = dataInputStream.read();
    int length = dataInputStream.readInt();
    byte[] buffer = new byte[length];
    int count = 0, read = 0;
    while (read < length && (count = dataInputStream.read(buffer, read, length - read)) > 0)
        read += count;
    // At this point we either have:
    // type == -1 || count == -1 => EOF
    // or read == length, type >= 0, and buffer contains the entire packet.
    switch (type) {
    case -1:
        // EOF, not shown
        break;
    case COMMAND: // assuming a manifest constant somewhere
        // process the incoming command
        break;
    case IMAGE:
        // process, or continue to process, the incoming image
        break;
    }
    No threads, no polling, and nuthin' up my sleeve.
    Modulo bugs.

  • Analog Input and Output in One Single VI

    I need help setting up both analog input and output in one single VI. How do I assign channels to be either input or output? How do I use both simultaneously in one single VI with a while loop structure? Which AO VI am I supposed to use to take the signal from the function generator I have built and feed it into the DAQCard-1200? Help!!
    Attachments:
    Test1.vi 48 KB

    One thing you'll need to be aware of is that you will need two DMA lines: one for AI and one for AO. If you don't have them, you can configure the DAQCard to work without DMA using the Config VI. But you certainly can do this.
    As far as your function generator goes, you will want to do a buffered analog output: you write your buffer of points to the output buffer, and then tell NI-DAQ how fast to update your analog output channel with these values.
    So you can be reading from AI and checking the AO process in the same while loop. Just make sure you handle the while loop execution (the way it exits) correctly. This can get tricky when you're doing two types of measurements.
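    In modern NI-DAQmx C terms, a buffered output looks roughly like the sketch below (illustrative only: the DAQCard-1200 is actually programmed through the older Traditional NI-DAQ driver, and the device/channel names here are assumptions):
    #include <math.h>
    #include "NIDAQmx.h"
    /* Buffered AO sketch: load one sine cycle into the output buffer and
       let the board regenerate it continuously at the given update rate. */
    int32 buffered_sine_out(void)
    {
        TaskHandle task = 0;
        float64 data[1000];
        int32 written, error = 0;
        for (int i = 0; i < 1000; i++)
            data[i] = 5.0 * sin(2.0 * 3.14159265358979 * i / 1000.0);
        error = DAQmxCreateTask("", &task);
        if (!DAQmxFailed(error))
            error = DAQmxCreateAOVoltageChan(task, "Dev1/ao0", "", -10.0, 10.0,
                                             DAQmx_Val_Volts, NULL);
        if (!DAQmxFailed(error))   /* update rate 10 kS/s, regenerating buffer */
            error = DAQmxCfgSampClkTiming(task, "", 10000.0, DAQmx_Val_Rising,
                                          DAQmx_Val_ContSamps, 1000);
        if (!DAQmxFailed(error))   /* write the buffer before starting */
            error = DAQmxWriteAnalogF64(task, 1000, 0, 10.0,
                                        DAQmx_Val_GroupByChannel, data,
                                        &written, NULL);
        if (!DAQmxFailed(error))
            error = DAQmxStartTask(task);
        /* ...later: DAQmxStopTask(task); DAQmxClearTask(task); */
        return error;
    }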
    J.R. Allen

  • Synchronize input and output tasks to start at the same sample point [C++ NI_DAQmx Base]

    I'm trying to get the analog input and output streams to start reliably at the same sample. I've tried triggering the output from the start of the input using the following code [NI-DAQmx Base 2.1 under Mac OS X with an M-Series multifunction board]. It compiles and runs, but gives an error message at the call to "DAQmxBaseCfgDigEdgeStartTrig". Any suggestions about synchronized I/O on this platform?
    #include "NIDAQmxBase.h"
    #include
    #include
    #include
    #define DAQmxErrorCheck( functionCall ) { if ( DAQmxFailed( error=( functionCall ) ) ) { goto Error; } }
    int main( int argc, char *argv[] )
    {
    // Task parameters
    int32 error = 0;
    TaskHandle inputTaskHandle = 0;
    TaskHandle outputTaskHandle = 0;
    char errorString[ 2048 ] = {'\0'};
    int32 i;
    time_t startTime;
    // input channel parameters
    char inputChannelList[] = "Dev1/ai0, Dev1/ai1";
    float64 inputVoltageRangeMinimum = -10.0;
    float64 inputVoltageRangeMaximum = 10.0;
    // output channel parameters
    char outputChannelList[] = "Dev1/ao0, Dev1/ao1";
    char outputTrigger[] = "Dev1/ai/StartTrigger";
    float64 outputVoltageRangeMinimum = -10.0;
    float64 outputVoltageRangeMaximum = 10.0;
    // Timing parameters
    char clockSource[] = "OnboardClock";
    uInt64 samplesPerChannel = 100000;
    float64 sampleRate = 10000.0;
    // Input data parameters
    static const uInt32 inputBufferSize = 100;
    int16 inputData[ inputBufferSize * 2 ];
    int32 pointsToRead = inputBufferSize;
    int32 pointsRead;
    float64 timeout = 10.0;
    int32 totalRead = 0;
    // Output data parameters
    static const uInt32 outputBufferSize = 1000;
    float64 outputData[ outputBufferSize * 2 ];
    int32 pointsToWrite = outputBufferSize;
    int32 pointsWritten;
    for( int i = 0; i < outputBufferSize; i++ ) {
        outputData[ 2 * i ] = 9.95 * sin( 2.0 * 3.14159 * i / outputBufferSize );
        outputData[ 2 * i + 1 ] = -9.95 * sin( 2.0 * 3.14159 * i / outputBufferSize );
    }
    // ------------------- configure input task -----------------------
    DAQmxErrorCheck ( DAQmxBaseCreateTask( "", &inputTaskHandle ) );
    printf( "Created input task\n" );
    DAQmxErrorCheck ( DAQmxBaseCreateAIVoltageChan( inputTaskHandle, inputChannelList, "", DAQmx_Val_RSE, inputVoltageRangeMinimum, inputVoltageRangeMaximum, DAQmx_Val_Volts, NULL ) );
    printf( "Created AI Voltage Chan\n" );
    DAQmxErrorCheck ( DAQmxBaseCfgSampClkTiming( inputTaskHandle, clockSource, sampleRate, DAQmx_Val_Rising, DAQmx_Val_ContSamps, samplesPerChannel ) );
    printf( "Set sample rate\n" );
    // ------------------- configure output task -----------------------
    DAQmxErrorCheck ( DAQmxBaseCreateTask( "", &outputTaskHandle ) );
    printf( "Created output task\n" );
    DAQmxErrorCheck ( DAQmxBaseCreateAOVoltageChan( outputTaskHandle, outputChannelList, "", outputVoltageRangeMinimum, outputVoltageRangeMaximum, DAQmx_Val_Volts, NULL ) );
    printf( "Created AO Voltage Chan OK\n" );
    DAQmxErrorCheck ( DAQmxBaseCfgSampClkTiming( outputTaskHandle, clockSource, sampleRate, DAQmx_Val_Rising, DAQmx_Val_ContSamps, samplesPerChannel ) );
    printf( "Set sample rate\n" );
    // trigger output when input starts
    DAQmxErrorCheck ( DAQmxBaseCfgDigEdgeStartTrig( outputTaskHandle, outputTrigger, DAQmx_Val_Rising ) );
    printf( "Set output trigger\n" );
    // ------------------- configuration -----------------------
    // write output signal
    DAQmxErrorCheck ( DAQmxBaseWriteAnalogF64( outputTaskHandle, pointsToWrite, 0, timeout, DAQmx_Val_GroupByScanNumber, outputData, &pointsWritten, NULL ) );
    printf( "Write output signal\n" );
    // set up input buffer
    DAQmxErrorCheck ( DAQmxBaseCfgInputBuffer( inputTaskHandle, 200000 ) ); // use a 200,000 sample DMA buffer (100,000 per channel)
    // initiate acquisition - must start output task first
    DAQmxErrorCheck ( DAQmxBaseStartTask( outputTaskHandle ) );
    DAQmxErrorCheck ( DAQmxBaseStartTask( inputTaskHandle ) );
    // The loop will quit after 10 seconds
    Dr John Clements
    Lead Programmer
    AxoGraph Scientific

    Hi Michael,
    First of all, thanks very much for taking the time to investigate this problem! Much appreciated.
    You asked for "an actual error code you got and any description that is given". The full output from the program that I posted earlier in this thread is appended to the end of this message. In summary, following the call to...
    DAQmxErrorCheck ( DAQmxBaseCfgDigEdgeStartTrig( outputTaskHandle, outputTrigger, DAQmx_Val_Rising ) );
    ... with ...
    char outputTrigger[] = "Dev1/ai/StartTrigger";
    ...the error message is ...
    DAQmxBase Error: Specified route cannot be satisfied, because the hardware does not support it.
    You asked "specifically which M series device you are using"? It is the PCIe 6251 (with BNC 2111 connector block). I'm testing and developing on an Intel Mac Pro (dual boot OS X and Windows XP).
    You asked for "the location you pulled the code from". Here it is...
    http://zone.ni.com/devzone/cda/epd/p/id/879
    ...specifically from the file "Multi-Function-Synch AI-AO_Fn.c".
    I adapted the NI-DAQmx calls to their NI-DAQmx Base equivalents.
    Finally, you asked "Is the trigger necessary, or do you just need to know that the measurements are running on the same clock?". I believe that some kind of synchronized trigger is necessary in my situation (correct me if I'm wrong). Timing is crucial. Say I initiate an analog output stream that delivers a voltage command step 5 ms from the onset. I need to record the response (analog input stream) so that its onset is accurately aligned (synchronized) at 5 ms. A typical recording situation would stimulate and record a short data 'sweep', then wait for the (biological) system to recover, then stimulate and record another short sweep, and repeat. I need all the recorded sweeps to align accurately so that they can be averaged and analyzed conveniently.
    I definitely do not want my customers to rely on an expensive external TTL pulse generator to initiate and synchronize each 'sweep'. That would effectively eliminate the cost advantage of an NI board, as well as adding unnecessary complexity in setup and use. It would be a show-stopper for me.
    It seems perverse, but would it be possible to use a digital output channel connected directly to a digital input channel to trigger the input and output streams?
    Regards,
    John.
    Full output from test program. Compiled with gcc 4 under OS X...
    [Session started at 2007-05-23 14:17:01 +1000.]
    LoadRuntime: MainBundle
    CFBundle 0x303cc0 (executable, loaded)
    _CompatibleWithLabVIEWVersion: linkedAgainst: 08208002
    _CompatibleWithLabVIEWVersion: result= false, mgErr= 1, theActualVersion= 00000000
    _CompatibleWithLabVIEWVersion: linkedAgainst: deadbeef
    _CompatibleWithLabVIEWVersion: Reseting Linked Against
    _CompatibleWithLabVIEWVersion: linkedAgainst: 08208002
    _CompatibleWithLabVIEWVersion: result= true, mgErr= 0, theActualVersion= 00000000
    _CompatibleWithLabVIEWVersion: linkedAgainst: 08208002
    _CompatibleWithLabVIEWVersion: result= true, mgErr= 0, theActualVersion= 00000000
    com.ni.LabVIEW.dll.nidaqmxbaselv
    CFBundle 0x313760 (framework, loaded)
    {type = 15, string = file://localhost/Library/Frameworks/nidaqmxbaselv.framework/, base = (null)}
    Amethyst:Library:Frameworks:nidaqmxbaselv.framework
    2007-05-23 14:17:02.248 test-ni[4445] CFLog (21): Error loading /Library/Frameworks/LabVIEW 8.2 Runtime.framework/resource/nitaglv.framework/nitaglv: error code 4, error number 0 (no suitable image found. Did find:
    /Library/Frameworks/LabVIEW 8.2 Runtime.framework/resource/nitaglv.framework/nitaglv: mach-o, but wrong architecture)
    CFBundle 0x1751fdc0 (framework, not loaded)
    Created input task
    Created AI Voltage Chan
    Set sample rate
    Created output task
    Created AO Voltage Chan OK
    Set sample rate
    DAQmxBase Error: Specified route cannot be satisfied, because the hardware does not support it.
    test-ni has exited with status 0.
    Dr John Clements
    Lead Programmer
    AxoGraph Scientific

  • How to synchronize analog input and output from two different USB daq boards

    Hi all,
    I have two very different USB boards: the NI USB-6008, which I am using to acquire data (analog input), and the NI USB-9263, an analog-output-only board that I am using to deliver a signal (in this case a square pulse). The reason I am not using the 6008's analog outputs is that I need to deliver negative voltages and need the full +/-10V range.
    Looking at similar posts, I am pretty sure that I can't use an external trigger or a shared clock; I also tried the synchronization of timed structures, but no cigar.
    I am including a quick VI that I whipped up showing how the signal jitters due to the lack of synchronization. In this example, the AO from the 9263 connects to the AI on the 6008.
    Attachments:
    Test Pulse.vi 117 KB

    I talked to a specialist on the phone, and he told me that it is not possible.

  • Monitor display no signal input

    My computer turns on, but the monitor displays "no signal input". When I plug my monitor into another computer, the screen shows my icons and everything; but when I plug another monitor into my computer, it also says "no signal input". How do I fix my computer so it communicates with the monitor?

    When the desktop goes blank, there's no more signal from the PC.
    So the monitor says 'NO SIGNAL INPUT'.
    What's the problem?
    V.
    *** Say 'Thanks' with Kudos ***

  • In Labview 8.5, what happens if the signal input exceeds the signal input range set by the DAQ Assistant?

    Hello all,
    This should be a pretty simple question, but I can't seem to find the answer online and don't currently have the capability to test this:
    I'm using LabVIEW 8.5 and have a VI that imports sensor data through the DAQ Assistant. In the configuration tab there is a signal input range. What happens if my sensor exceeds this range? Will I get a warning? Will the value default to the maximum (or minimum)? I was interested in writing some code to display an error as I approach the limits of this range, but was unsure if I also needed to include code to display an error if the range is exceeded as well.
    Thanks for the help,
    Tristan

    Hello Tristan,
    The behavior depends on the range you choose and the device you are using.
    If you are using a device with only one valid input range, we will use this range even if you set a smaller minimum and maximum in the DAQ Assistant.  Thus, if your device only supports ±10V and you set the range to ±8V, you will still continue to get valid data after your sensor exceeds 8V, until you approach 10V.  Once you reach the limit of the range of your device, the output will "rail" and just return the maximum value until the signal drops below it again.
    Note: A device that is nominally ±10V usually has some overshoot (like ±10.2V) that is typically specced in the manual.
    However, if you are using a device with multiple input ranges, then things get more complex.
    The NI-DAQmx driver will pick the smallest range that fully encompasses the range you choose.  So, suppose your device supports the following input ranges: ±0.2V, ±1V, ±5V, and ±10V, and you choose 0V - 3V as the range in the DAQ Assistant.  The NI-DAQmx driver is going to look at your input range and the list of input ranges that your hardware supports, and choose the smallest that encompasses the full range you set.  This would be ±5V, because that's the smallest range that contains 3V.  As a result, any input signal within ±5V will be returned, and anything outside this range will "rail" to either the maximum or minimum value.
    We do this because using smaller ranges makes more effective use of the resolution of the ADC.  Thus we try to use the most efficient range based on what you request, without picking a range that will cause you to miss data.
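    That selection rule is easy to state in code. A toy sketch of the idea (not actual driver code; the names are illustrative):
    /* Return the smallest bipolar hardware range (+/- volts) that contains
       the requested [reqMin, reqMax] window; hwRanges is sorted ascending. */
    double pick_range(double reqMin, double reqMax, const double *hwRanges, int n)
    {
        for (int i = 0; i < n; i++)
            if (-hwRanges[i] <= reqMin && reqMax <= hwRanges[i])
                return hwRanges[i];        /* first fit is the smallest */
        return hwRanges[n - 1];            /* nothing fits: use the widest */
    }
    /* Example: with ranges {0.2, 1.0, 5.0, 10.0} and a request of 0 V to 3 V,
       pick_range returns 5.0, i.e. the +/-5V range; samples then rail at +/-5V. */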
    Let me know if I can clarify this further. 
    Seth B.
    Staff Test Engineer | National Instruments
    Certified LabVIEW Developer
    Certified TestStand Developer
    “Engineers like to solve problems. If there are no problems handily available, they will create their own problems.”- Scott Adams

  • Trying to use signal in and signal out but couldn't pass parameters

    Hi all,
    I am trying to use Signal Out from the first iView and to pass the same parameters to a second iView, which gets populated when I click a button in the result of the first iView.
    I have created these two iViews in the same model, and it deployed without errors.
    When I run the first iView, give the parameters, and hit Submit, I get the result; when I click a button, a new iView opens (as required), but it does not take the same parameters I gave in the first iView.
    Could anyone give me suggestions on how to use these, or where I should correct it?
    Thanks
    Venkat

    Hi,
    please follow these steps and you will be able to achieve the result:
    1.> Go to the form of the first iView. Drag an output port and create a Signal Out with a proper name. Click the link created between the Signal Out and the form and give it a proper name (the event name). You have to mention this name in the button's action property: go to the form, click the button property, and in the Action tab set the action as a custom action.
    2.> Configure the Signal Out, i.e., select the fields which you want to transfer to the other iView.
    3.> Then go to the other layer and drill down into it.
    4.> Create a form, drag its input port, create a Signal In, and give it the same name as you mentioned in the Signal Out.
    5.> Configure the Signal In, and give all the fields which you mentioned in the Signal Out. (It is case-sensitive, so be sure to give the correct names.)
    6.> Double-click the form and select all the fields which you want to be visible on the form.
    7.> Just save and deploy the model, and you can transfer data between the two iViews.
    You can check with this link having same type of issue:
    Re: pass data between forms in different layers
    Regards,
    Nutan

  • Synchronization of analogue input and analogue output?

    Hi there,
    I have a signal synchronization problem:
    I am sending two waveforms (a choice of sine, square, triangle, etc.; see the attached VI) to a mechanical system, and then I'm reading the acquired signal back from this same system (it should be similar).
    The signal I am reading is indeed similar; however, the synchronization is not perfect. Whenever I change the frequency of the signal, the phase of the acquired signal shifts...
    Does anyone have an idea how I should synchronize the sent and acquired signals?
    Many thanks,
    Best,
    Renaud
    PS: I attached the VI in question 
    Attachments:
    Galvo_Monitor_3.vi 43 KB

    Hi,
    As you stated, there is a mechanical system in between, so it is expected to have a delay in the response. If you try the same code while directly wiring the analog output to the analog input, you will observe no delay between them.
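    If you need to quantify that delay rather than just accept it, one common approach (not from this thread) is to find the lag that maximizes the cross-correlation between the sent and the acquired waveforms. A toy C sketch, with illustrative names:
    /* Return the lag (in samples) at which the acquired signal best
       matches the sent signal, searching lags 0..maxLag. */
    int estimate_delay(const double *sent, const double *got, int n, int maxLag)
    {
        int bestLag = 0;
        double bestSum = 0.0;
        for (int lag = 0; lag <= maxLag; lag++) {
            double sum = 0.0;
            for (int i = 0; i + lag < n; i++)
                sum += sent[i] * got[i + lag];
            if (lag == 0 || sum > bestSum) { bestSum = sum; bestLag = lag; }
        }
        return bestLag;
    }
    Dividing bestLag by the sample rate gives the delay in seconds, which you can then compensate for at any frequency.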
    Regards,
    DCKAN

  • Analog Levels vs SPDIF Levels Input and Output in Logic Pro

    Hello,
    I ran a test last night recording input and output levels from my Yamaha Motif XS8 through an Apogee Ensemble, to compare analog to S/PDIF.
    I connected two TRS cables from the L and R outputs on the Motif XS into the analog inputs on the Apogee, and I also have the S/PDIF connection from the Motif to the Apogee.
    I put the master fader all the way up on the XS for the analog volume.
    The Ensemble in Maestro has +4 and -10 reference level options for analog inputs. I had it at +4 but changed to -10, and the analog got louder (I figured it would get louder for +4; confusing).
    Anyway, why is it that I can record louder levels via analog than via the digital transfer?
    I tracked both options at the same time, then recorded vocals over them. The digital sound is too low. What's up with that?
    And when I tried to bounce the recording to listen to it in iTunes, the volume levels were way lower than my CD playing through iTunes. Please enlighten me on these points.
    I used to record on a Roland 2480 and had similar results with loudness; it got a little louder through mastering, but still... pro CDs are way louder and still clear.

    I think there is much confusion here!
    In summary, you won't be able to control the recording level of S/PDIF.
    The reason is that you don't want to!
    You need to think of the S/PDIF connection as being more like a file-transfer method: you are, in effect, copying the digital data at an output to your hard drive. If I sent you an MP3 via email, you'd never imagine that your email software was capable of changing the gain of the MP3 I sent. This might sound daft, but it's a useful analogy. If you needed to increase the "volume" of that MP3, you'd have to ask the sender. It's the same with your setup.
    There could well be somewhere on your synth that adjusts the instrument's level other than the master (analogue) output control. For example, make sure the MIDI volume of the instrument being played is set to full, i.e. MIDI vol 127. Perhaps there is some sort of virtual mixer onboard to control all the multi-timbral parts, so make sure your part has its virtual fader turned up.
    This is what (basically) is going on in the chain...
    Your synth creates sound in its "digital brain". This sound is sent to an "output stage", which distributes it to the various outputs. In the case of the S/PDIF, it just sends the raw digital data untouched. For the analogue side, the digital signal (the same one sent to S/PDIF) is converted to analogue and then sent to an amplifier to bring it to an appropriate "line level". This final level could well be controlled by an analogue volume control, which could be adding more gain than you think, too.
    When things go to your sound card/ daw...
    The purpose of an analogue gain control is to set the input signal so that it is suitably loud to beat any noise that exists in your input circuits, so that a good signal-to-noise ratio is achieved. Analogue signals need to work in the right loudness zone (so to speak), as the analogue electronics are designed to handle signal levels of a particular range; the gain control is there to make sure the signal is in that range.
    Digital signals are far more predictable, though, and there is no advantage to your recordings if the incoming digital signal gets an increase of level at the input stage. All you would be doing is effectively adding a few zeros to the binary digital data!
    Let's face it: the point of recording is to get a copy of the original sound that is as similar to the original as possible. With S/PDIF you get a perfect copy of what's coming out of your synth, so job's a good 'un!
    If, when you come to mix in Logic, you find the level of the digital recording is indeed too low for mixing/mastering purposes, then just boost it in Logic via a fader or via the gain plug-in.
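    That boost is nothing mysterious: raising the level of an already-digital signal is just arithmetic on the samples, which is all a fader or gain plug-in does. A toy C sketch (names illustrative):
    #include <math.h>
    #include <stddef.h>
    /* Scale every sample by a gain given in dB. Anything pushed past full
       scale (|x| > 1.0 in float audio) would clip, so boost with care. */
    void apply_gain_db(float *samples, size_t n, float gainDb)
    {
        float g = powf(10.0f, gainDb / 20.0f);   /* dB -> linear factor */
        for (size_t i = 0; i < n; i++)
            samples[i] *= g;
    }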
    Those reference values of -10dBV and +4dBu refer to analogue voltage levels only; they have nothing to do with the digital domain. The -10/+4 switch is only relevant to analogue inputs and outputs. Using an analogue VU meter, you should find that a sine wave that peaks at 0dBVU (totally different to 0dBFS, BTW) is the equivalent of a digital sine wave peaking at -18dBFS.
    The analogue headroom (how loud you can go before things distort) depends on the analogue electronics and varies between designs. Analogue gear (like mixers) often has headroom of 24dB or more. So that digital equipment can interface properly with analogue, we allow for that analogue headroom to be around 18dB (usually enough in practice!)... hence -18dBFS (digital) = 0dBVU (analogue).
    To make your digital and analogue input signals sound similar in level, you will probably have to reduce the gain of the analogue input. If you set the incoming analogue signal to peak around -14dB (or less!), you will probably find things more equal. If you are working in 24-bit, your analogue levels can be seemingly very low before sound quality is affected. It's quite safe to record at -20 or even -30dB as shown on Logic's meters, for example.
    I hope all this waffle helps LOL!

    This is an invalid LDIF entry: dn: cn=APPLICATION,ou=DBOLS_Profiles,ou=lookup,o=systems,dc=xyx,dc=abc objectClass:olsProfiles objectClass: top olspolicyname: MAIN olsmaxreadlabel: abc:def/jkl:mno because the olsmaxreadlabel attribute's value contains