PCI-6251 into LabVIEW DAQ Assistant block: interruptions and resets

The signal received from a PCI-6251 into a LabVIEW DAQ Assistant block interrupts and resets every 10 seconds after about 1 minute of data acquisition. Would this be a memory buffer problem, or is it something related to the hardware? I am using a BNC-2110 connector block. Please see the attached image.
Attachments:
Interuption1a.GIF 4 KB

Hey Peter,
Ahh, I understand the graph now!  Hmm, this is very strange behavior.  Instead of using the DAQ Assistant, try the explicit DAQmx VIs (see picture).  They give you more explicit control over what is going on, and keep as much as you can out of the loop.
Regards,
Erik
Attachments:
analoginput.JPG 23 KB
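For reference, the structure Erik describes (configure the task once, start it, and keep only the read call inside the loop) looks roughly like this in the text-based NI-DAQmx API. This is a minimal sketch using the nidaqmx Python package, not the poster's VI; the device name Dev1 and the rates are placeholders.
==============================
# Minimal sketch, not the poster's VI: explicit DAQmx task set up once,
# with only the read call inside the acquisition loop.
# Assumes the nidaqmx Python package; "Dev1" and the rates are placeholders.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")        # one AI channel
    task.timing.cfg_samp_clk_timing(
        rate=10_000,                                        # 10 kS/s, example value
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=10_000,                              # buffer size hint
    )
    task.start()
    for _ in range(60):
        # blocks until 1000 samples are available, then returns them
        data = task.read(number_of_samples_per_channel=1000)
        # plot/process `data` here; keep heavy work out of this loop
    task.stop()
==============================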

Similar Messages

  • DAQ assistant block

Hi, in my block diagram (LabVIEW) I cannot see the DAQ Assistant block. I checked by going to Express > Input, but I do not see the DAQ Assistant block there. Could you please tell me why?

    Did you install DAQmx after installing LabVIEW?
LabVIEW Champion. Do more with less code and in less time.

  • Using PCI-6251 with Labview 6.1

I recently purchased a PCI-6251 multifunction DAQ board to get a higher sampling rate than my old PCI-MIO-16E-4 board.  However, I have existing software that I want to run on the new board.  The board came with NI-DAQmx, which will only work with LabVIEW 7 or above.  I am currently running NI-DAQ 6.9.3, and the Measurement & Automation program does not recognize the new PCI-6251 board.  Is there any way to get LabVIEW 6.1/NI-DAQ 6.9.3 to recognize and configure the new board?  Or will I be forced to upgrade my LabVIEW?

The M-series boards (PCI-62xx are M-series) require the NI-DAQmx driver. The NI-DAQmx driver requires a newer LabVIEW. No way around it.
    John Weeks
    WaveMetrics, Inc.
    Phone (503) 620-3001
    Fax (503) 620-6754
    www.wavemetrics.com

  • LabVIEW DAQ Assistant

How do I make the DAQ Assistant stop after the number of seconds the user wants, if I don't have any counter or clock?
Thanks.
Isabela. A beginner.

Hello Isabela,
Thank you for using the National Instruments forum. To answer your question:
You can put the DAQ Assistant Express VI inside a while loop controlled by an Express VI such as Elapsed Time.
I believe I already sent you an example of this, so I will keep an eye out for your results,
and if you need anything else, please let me know.
Regards,
Erwin Franz R.
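For completeness, the same idea (stop the acquisition once a user-chosen number of seconds has elapsed, without any hardware counter) can be sketched in the text-based NI-DAQmx API as follows. This is only a rough illustration with the nidaqmx Python package; Dev1 and the 10-second limit are placeholder values.
==============================
# Rough sketch of an acquisition loop gated by elapsed time, the text-API
# analogue of a while loop controlled by the Elapsed Time Express VI.
# Assumes the nidaqmx Python package; "Dev1" and STOP_AFTER_S are placeholders.
import time
import nidaqmx

STOP_AFTER_S = 10.0                      # seconds requested by the user

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    start = time.monotonic()
    while time.monotonic() - start < STOP_AFTER_S:
        sample = task.read()             # one on-demand sample per iteration
        # use `sample` here
==============================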

  • LabVIEW won't let me configure DAQ assistant

    Hi,
When I launch the Data Acquisition with NI-DAQmx.vi and double-click the DAQ Assistant icon on the block diagram, the program won't let me configure the DAQ.  Also, right-clicking on the DAQ Assistant and selecting Properties does nothing.  How do I configure the DAQ so I can use input channels on my NI ELVIS board and then display the output on graphs in LabVIEW?
    Thank you.

    Hi cd384-
    In general, you can have two (or more) DAQ Assistants in the same VI.  I assume, however, that you're trying to have separate DAQ Assistants for analog input and this definitely will not work.  You must group all tasks of the same type (analog input, analog output) within the same DAQ Assistant or DAQmx Task.  You can add multiple channels by holding control and selecting multiple items when you are initially configuring the DAQ Assistant.
In order to separate signals in LabVIEW you can use the Split Signals function, which is available on several palettes.  To find it, just click "Search" on the Functions palette and search for "Split Signals".
    Thanks-
    Tom W
    National Instruments
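As a rough text-API illustration of the same rule (all analog-input channels grouped in one task, then separated in software), here is a minimal sketch with the nidaqmx Python package; the Dev1/ai0:3 channel string and the sample count are placeholders.
==============================
# Sketch only: one task owns every AI channel; the per-channel signals are
# "split" after the read. Assumes nidaqmx; names are placeholders.
import nidaqmx

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")    # ai0..ai3 in ONE task
    data = task.read(number_of_samples_per_channel=100)   # list of 4 lists
    ai0, ai1, ai2, ai3 = data                             # per-channel data
==============================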

  • DAQ Assistant: How Can I Control this at the lowest Level?

    DAQ Assistant:
I have the PCI-slot version of this, which I use to generate specific signatures. However, I would like to get down to the very low level of this block to see what makes it tick, if only to add some other features, such as having it dynamically change high and low times after "N" number of pulses/bursts.
At the current state I can't seem to get any further than the GUI that I work with right now. I can give you a VI upon request; however, this VI is included with LabVIEW from what I understand (I have version 8.2). If anyone wants a copy of the VI which contains the DAQ Assistant block, I will be more than happy to include it.
    DAQ Assistant Location: Right click on the block diagram of a VI --> Measurement I/O --> NI-DAQmx --> DAQ Assist (This is an icon in itself)
I see the read/write nodes; however, I have played with them and tried to see what they do, but have had no such luck. If anyone can point me in the right direction I would be grateful.
    Thank You.

You've got a couple of misconceptions about the DAQ Assistant. First, there is no such thing as a PCI version. There are some slight differences between versions of the DAQmx driver. Second, the DAQ Assistant is a code generator. When you start it up, it creates custom code for the type of task you want (digital I/O, analog I/O, etc.), so the VI you eventually have on the block diagram is not included in any version of LabVIEW.
Once you have configured the assistant, you can right-click on it and select 'Open Front Panel'; this will convert the assistant task to a normal subVI that you can open to view the block diagram. You can also right-click and select 'Create DAQmx Code'. This will place the low-level DAQmx functions on your block diagram. You could also skip the whole assistant and just start with the low-level DAQmx functions in the first place. There is help associated with each, and you have all of the shipping examples to look at. There is also the Getting Started with NI-DAQmx page.
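For the original pulse question, the low-level route looks roughly like the following counter-output sketch. It assumes the nidaqmx Python package; Dev1/ctr0, the 1 ms times, and the pulse count are placeholders, and changing the high/low times on the fly may require stopping and reconfiguring the task depending on the device.
==============================
# Hedged sketch of a low-level counter pulse task with software-set
# high/low times; not a drop-in replacement for the poster's DAQ Assistant VI.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.co_channels.add_co_pulse_chan_time(
        "Dev1/ctr0", high_time=0.001, low_time=0.001)      # 1 ms high / 1 ms low
    task.timing.cfg_implicit_timing(
        sample_mode=AcquisitionType.FINITE, samps_per_chan=100)  # 100 pulses
    task.start()
    task.wait_until_done()
==============================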

DAQ Assistant with multiple channels making the Simulation Loop slow?

Hi, another LabVIEW newbie here.
On a real-time target (NI 9132) I have a Control & Simulation Loop with a DAQ Assistant block inside, whose signals are fed into a Discrete State Space block. The discrete state-space model has a 1 second time step. I have set the Simulation Loop parameters so that it executes every 1 second as well (see Fig. A below). *sorry for the big white gap under the figures..
The DAQ Assistant acquisition mode is set to "1 Sample (On Demand)".
However, when I run the VI, the plot seems to be updated much more slowly than the 1 second rate. To confirm this, I put an "Elapsed Time" block inside the Simulation Loop. The "elapsed time" shows the actual time in seconds while the simulation plot shows a slower time (see Fig. B below).
I tried to isolate the problem by removing the blocks one by one. Finally, I found out that this problem was caused by (at least) the DAQ Assistant, which acquires multichannel data from an NI 9214. When I remove some channels and leave one or two, the VI runs at the actual rate (see Fig. C below). But when I add more channel readings, it becomes slow again.
Here is the snippet of the block diagram (after all other blocks were removed):
What am I doing wrong here? I'm going to use all of the NI 9214 channels, so how can I avoid a problem like this?
    I look forward to hearing any relevant comments from the members. Thanks in advance.
    Tian

    Hi Tian,
    why do you need a Sim loop anyway?
- When it comes to speed you shouldn't use the DAQ Assistant. Use basic DAQmx functions…
- Use parallel running loops for each task. Put the DAQmx functions in their own loop, running in parallel to your Sim loop…
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
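The parallel-loop advice can be sketched in text form as a simple producer/consumer: acquisition runs in its own loop and hands data to the processing/simulation loop through a queue. This is only an illustrative sketch with the nidaqmx Python package and Python threads; the channel string stands in for the NI 9214 channels and is not taken from the original VI.
==============================
# Sketch: DAQ producer loop in its own thread, consumer loop reads from a
# queue, so a slow multichannel read never stalls the consumer.
# Assumes nidaqmx; "Dev1/ai0:15" is a placeholder channel string.
import queue
import threading
import nidaqmx

data_q: "queue.Queue[list]" = queue.Queue()
stop = threading.Event()

def acquire() -> None:
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:15")
        while not stop.is_set():
            data_q.put(task.read())      # one on-demand scan of all channels

threading.Thread(target=acquire, daemon=True).start()
# The consumer (simulation) loop then calls data_q.get() at its own rate.
==============================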

  • DAQ Assist Reading Wrong Voltage

    Hello,
    I'm using the DAQ Assistant VI to read an analog input voltage from a National Instruments PCI-6221 card. I'm reading the voltage from pin AI0. I supply a voltage directly to this pin from a DC power supply, but the voltage measurement obtained from the DAQ Assistant is incorrect - it seems to be scaled by a factor of about 1/3. For example, if I supply 4 Volts to pin AI0, the DAQ Assist reads 1.43 Volts. I used a multimeter to confirm that the voltage at pin AI0 is in fact 4 Volts, and so I know the problem is with my LabVIEW program and not my power supply.
    Here are the steps that lead to my problem:
    1. In the block diagram, I insert a DAQ Assistant block.
    2. In the Properies of the DAQ Assistant, I select Analog Input->Voltage
    3. I select channel ai0
    4. I click "test" in order to test the channel
    5. The voltage is shown to be 1.43 Volts, even though 4 Volts is being supplied to the pin (this is confirmed with a multimeter).
6. I click OK to finish configuring the DAQ Assistant, run the program, and plot the voltage. The plot also shows 1.43 Volts.
Does anyone have an idea why this may be occurring? I've spent a good 4 hours trying to diagnose this and haven't found anything.
    Thanks,
    Abed Alnaif

Thanks so much for your help. I used MAX, and found out that the issue is that I had specified differential voltage, but I should have specified RSE voltage.
    However, now I have a different issue: When I apply a voltage to one pin, MAX also shows a voltage on other pins.
    For example:
    1. I apply a DC voltage of 4V to analog input pin ai0.
2. In MAX -> Test Panels..., I select Channel Name = Dev1/ai0 and Input Configuration = RSE
    3. I click Start, and the chart shows the correct voltage (4V)
4. I change the Channel Name to Dev1/ai1 and Input Configuration = RSE
    5. The chart shows a noisy voltage reading between 1.45 and 1.5 V, even though no voltage is applied to pin ai1
    6. When I change the voltage on pin ai0 to 2V, the voltage reading on pin ai1 changes to 0.63V
    7. Using my multimeter, I confirm that there is in fact 2V on pin ai0 and 0V on pin ai1
    Does anyone know why applying a voltage to pin ai0 causes a voltage reading on pin ai1?
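Two notes that may help here. First, the terminal configuration can be requested explicitly when the channel is created, rather than relying on the default; a minimal sketch with the nidaqmx Python package is below (Dev1/ai0 and the ±10 V range are placeholders). Second, the reading that appears on a floating ai1 is typical of ghosting on a multiplexed input: with nothing driving the pin, it picks up residual charge from the previously scanned channel, and tying unused inputs to AI GND (or driving them from a low-impedance source) normally makes it disappear.
==============================
# Sketch: create the AI channel with an explicit RSE terminal configuration.
# Assumes nidaqmx; "Dev1/ai0" and the ±10 V range are placeholders.
import nidaqmx
from nidaqmx.constants import TerminalConfiguration

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",
        terminal_config=TerminalConfiguration.RSE,   # referenced single-ended
        min_val=-10.0,
        max_val=10.0,
    )
    print(task.read())
==============================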

  • Input of DAQ assistant does not initialize soon enough

    Dear all,
We are trying to generate an analog voltage output with the NI 9263 and then read it back as an analog input with the NI 9215 as part of an academic exercise. Both of these are on board an NI cDAQ-9174 (CompactDAQ) chassis. An analog waveform is generated within LabVIEW (either a sine, square, or constant), and that signal is fed into a DAQ Assistant to produce the analog output.
    From there, wires connect the 9263 output into the 9215 input. A second DAQ assistant is used to acquire that input voltage from the 9215. 
    The problem occurs when the waveform generated voltage and voltage input are printed on a waveform graph. The input signal for the sine and square waves occurs before the waveform generation signal (i.e., the input is phase shifted to the left instead of to the right, as we would expect due to latency).
    We believe the problem occurs because the waveform is being output before the input has had a chance to initialize and start recording measurements. 
    Is there a way to initialize the DAQ assistant and start taking measurements before the output is generated? Or is there a way to determine the absolute time of each so that we can plot them in absolute time?
    Thank you for any help you can give.
    Attachments:
Basic inout4square.vi 412 KB

I tried cleaning up the program as best I could. I'm sure you can tell I am new to LabVIEW. Thanks for taking a look even though it was so messy.
The problem we are having now is that the input data lags the output data by anywhere from 30-50 ms, which seems like far too large a delay. This is when I set the waveform graph to "Ignore Time Stamp". Because the generated output voltage is graphed straight from the waveform generator, it seems to take a while for the signal to progress through the program and through the hardware, so the input voltage graph lags.
When I set the waveform graph not to "Ignore Time Stamp", the signal from the waveform generator occurs in the year 1903. I've tried searching for how to change this to the current time, but I can't seem to find a way to do it, despite plenty of other people wondering about this. Is there a way to do this so we can know the time at which the generated waveform signal is actually sent to the output DAQ Assistant?
    Attachments:
basic.vi 149 KB
blockdiagram.PNG 28 KB
delay.PNG 24 KB
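One way to attack the ordering problem is to arm and start the input task before the output is written, so the input cannot miss the start of the generated waveform; for sample-accurate alignment the two tasks would normally also share a sample clock or start trigger. Below is a hedged sketch of the start ordering with the nidaqmx Python package; module names, rates, and the waveform are placeholders and are not taken from the attached VI.
==============================
# Sketch: start the AI task first, then write/start the AO task.
# Assumes nidaqmx; cDAQ1Mod1/ao0 and cDAQ1Mod2/ai0 are placeholder names.
import nidaqmx
from nidaqmx.constants import AcquisitionType

waveform = [0.0, 1.0, 0.0, -1.0] * 250            # stand-in generated signal

with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    ai.ai_channels.add_ai_voltage_chan("cDAQ1Mod2/ai0")
    ai.timing.cfg_samp_clk_timing(1000, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=1000)
    ao.ao_channels.add_ao_voltage_chan("cDAQ1Mod1/ao0")
    ao.timing.cfg_samp_clk_timing(1000, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=len(waveform))

    ai.start()                                    # input armed/started first
    ao.write(waveform, auto_start=True)           # then the output begins
    data = ai.read(number_of_samples_per_channel=1000)
==============================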

PCI-6251 analog input setup

    Hi all,
Can I set the analog inputs of the PCI-6251 in a mixed configuration (some in differential mode and some in single-ended mode)? I ask because in my setup the differential option is greyed out.
    Thanks
    dphan128

Yes, you should be able to do that. You have 16 analog inputs; for differential you will use two of those per channel. See the DAQ M Series user manual. You can program channels on an M Series device to acquire with different ground references. To enable multimode scanning in LabVIEW, use the DAQmx Create Virtual Channel VI of the NI-DAQmx API. You must use a new VI for each channel or group of channels configured in a different input mode.
Remember that for a differential channel you would wire, for example, AI 0 to the + side and AI 8 to the - side. So now you can't use AI 8 for single-ended. AI 1 is paired with AI 9, and so on up to AI 7 and AI 15.
    Hope this helps.
    Using LabVIEW 2010SP1 and TestStand 4.5
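In the text-based NI-DAQmx API the same multimode idea is expressed as separate create-channel calls on one task, each with its own terminal configuration. A minimal sketch with the nidaqmx Python package follows; the channel names are placeholders, and the DIFF member was named DIFFERENTIAL in older nidaqmx releases.
==============================
# Sketch of mixed-mode scanning: one task, separate create-channel calls,
# each with its own terminal configuration. Assumes nidaqmx; names are
# placeholders. ai0 pairs with ai8 for the differential channel.
import nidaqmx
from nidaqmx.constants import TerminalConfiguration

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0", terminal_config=TerminalConfiguration.DIFF)    # differential
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai1:2", terminal_config=TerminalConfiguration.RSE)   # single-ended
    print(task.read())
==============================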

  • How to import Verilog codes into LabVIEW FPGA?

I tried to import Verilog code by instantiation, following the instructions in http://digital.ni.com/public.nsf/allkb/7269557B205B1E1A86257640000910D3,
but I still see some errors while compiling the VI file.
    Simple test Verilog file is as follows:
    ==============================
module andtwobits (xx, yy, zz);
  input xx, yy;
  output reg zz;
  always @(xx, yy) begin
    zz <= xx & yy;
  end
endmodule
    ==============================
    and after following up the above link, we created the instantiation file as
    ==============================================
    library ieee;
    use ieee.std_logic_1164.all;
    entity mainVHDL is
    port(
    xxin: in std_logic;
    yyin: in std_logic;
zzout: out std_logic
);
end mainVHDL;
    architecture mainVHDL1 of mainVHDL is
    COMPONENT andtwobits PORT (
    zz : out std_logic;
    xx : in std_logic;
    yy : in std_logic);
    END COMPONENT;
    begin
    alu : andtwobits port map(
    zz => zzout,
    xx => xxin,
    yy => yyin);
    end mainVHDL1;
    ==============================================
    Sometimes, we observe the following error when we put the indicator on the output port,
    ERROR:ConstraintSystem:58 - Constraint <INST "*ChinchLvFpgaIrq*bIpIrq_ms*" TNM =
    TNM_ChinchIrq_IpIrq_ms;> [Puma20Top.ucf(890)]: INST
    "*ChinchLvFpgaIrq*bIpIrq_ms*" does not match any design objects.
    ERROR:ConstraintSystem:58 - Constraint <INST "*ChinchLvFpgaIrq*bIpIrq*" TNM =
    TNM_ChinchIrq_IpIrq;> [Puma20Top.ucf(891)]: INST "*ChinchLvFpgaIrq*bIpIrq*"
    does not match any design objects.
and interestingly, if we remove the indicator from the output port, it successfully compiles in LabVIEW FPGA.
Could you please take a look and help me import Verilog into LabVIEW FPGA?
I've followed the basic steps of instantiation in the above link, but it still won't work.
    Please find the attachment for the all files.
    - andtwobits.v : original Verilog file
    - andtwobits.ngc: NGC file
    - andtwobits.vhd: VHD file after post-translate simulation model
    - mainVHDL.vhd: instantiation main file
Since there is no example file for Verilog (there is a VHDL example, but not one for Verilog), it is a bit hard to do even a simple run on LabVIEW FPGA.
Thank you very much for your support; I'm looking forward to any help/reply as soon as possible.
    Bests,
    Attachments:
attach.zip 57 KB

    Hi,
I am facing a problem in successfully importing a VHDL wrapper file for a Verilog module into LabVIEW FPGA using the CLIP node method. Please note that:
I am working on the sbRIO-9606 platform.
The LabVIEW version used is 2011 with the Xilinx 12.4 compiler tools.
NI-RIO 4.0 is installed.
The Xilinx ISE version installed on the PC is also 12.4 WebPack. (I previously used Xilinx 10.1 on the PC for generating the .ngc file for Verilog code for the sbRIO-9642 platform, but the problem remains the same for both versions.)
Query1. Which versions of Xilinx ISE (to be installed on the PC for generating the .ngc file) are compatible with LabVIEW 2011.1 (with Xilinx 12.4 compiler tools)? Can any version up to 12.4 be used?
Initially I took a basic AND-gate Verilog example to import into LabVIEW FPGA, i.e. simple_and.v, and its corresponding VHDL wrapper file is SimpleAnd_Wrapper.vhd.
    ///////////////// Verilog code of “simple_and.v”//////////////////////
    module simple_and(in1, in2, out1);
       input in1,in2;
       output reg out1;
       always@( in1 or in2)
       begin
          out1 <= in1 & in2;
       end
    endmodule
    /////////////////VHDL Wrapper file code of “SimpleAnd_Wrapper.vhd” //////////////////////
    LIBRARY ieee;
    USE ieee.std_logic_1164.ALL;
    ENTITY SimpleAnd_Wrapper IS
        port (
            in1    : in std_logic;
            in2    : in std_logic;
        out1   : out std_logic
    );
END SimpleAnd_Wrapper;
    ARCHITECTURE RTL of SimpleAnd_Wrapper IS
    component simple_and
       port(
             in1    : in std_logic;
             in2    : in std_logic;
          out1   : out std_logic
   );
end component;
    BEGIN
    simple_and_instant: simple_and
       port map(
                in1 => in1,
                in2 => in2,
             out1 => out1
   );
END RTL;
    Documents/tutorials followed for generating VHDL Wrapper file for Verilog core are:
    NI tutorial “How do I Integrate Verilog HDL with LabView FPGA module”. Link is http://digital.ni.com/public.nsf/allkb/7269557B205B1E1A86257640000910D3
In this case, I did not get any VHDL file after the “post-translate simulation model” step in the netlist project using the simple_and.ngc file previously generated through XST. What I got instead was simple_and_translate.v.
Query2. Do I have to rename that “.v” file to “simple_and.vhd”? Anyway, it did not work either way, i.e. naming it with a “.v” or a “.vhd” extension. In the end I copied that post-translate model file “simple_and.v”, “simple_and.ngc”, and the VHDL wrapper file “SimpleAnd_Wrapper.vhd” into the respective LabVIEW project directory.
Query3. The post-translate model file can also be generated by implementing the Verilog simple_and.v file, so why generate it by making a separate netlist project using the “simple_and.ngc” file? Is there any difference between the two simple_and_translate.v files generated through the separate approaches I mentioned?
    2. NI tutorial “Using Verilog Modules in a Component-Level IP Design”. Link is https://decibel.ni.com/content/docs/DOC-8218.
    In this case, I generated only “simple_and.ngc” file by synthesizing “simple_and.v “file using Xilinx ISE 12.4 tool. Copied that “simple_and.ngc” and “SimpleAnd_Wrapper.vhd” file in the same directory.
    Query4. What is the difference between this method and the above one?
3. I followed the tutorial “Importing External IP into LabVIEW FPGA” for the remaining steps of creating a CLIP, declaring it, and passing data between the CLIP and the FPGA VI. Link is http://www.ni.com/white-paper/7444/en. This VI executes perfectly on the FPGA for the example “simple_and.vhd” file provided in that tutorial.
Compilation errors/warnings received after compiling my SimpleAnd_Wrapper.vhd file:
    Elaborating entity <SimpleAnd_Wrapper> (architecture <RTL>) from library <work>.
    WARNING:HDLCompiler:89"\NIFPGA\jobs\WcD1f16_fqu2nOv\SimpleAnd_Wrapper.vhd"    Line 35: <simple_and> remains a black-box since it has no binding entity.
    2. WARNING:NgdBuild:604 - logical block 'window/theCLIPs/Component_ dash_Level _IP_ CLIP0/simple_and_instant' with type   'simple_and' could not be resolved. A pin name misspelling can cause this, a missing edif or ngc file, case mismatch between the block name and the edif or ngc file name, or the misspelling of a type name. Symbol 'simple_and' is not supported in target 'spartan6'.
    3. ERROR:MapLib:979 - LUT6 symbol   "window/theVI/Component_dash_Level_IP_bksl_out1_ind_2/PlainIndicator.PlainInd icator/cQ_0_rstpot" (output signal=window/theVI/ Component_dash_Level _IP_bksl_out1_ ind_2/PlainIndicator.PlainIndicator/cQ_0_rstpot) has input signal "window/internal_Component_dash_Level_IP_out1" which will be trimmed. SeeSection 5 of the Map Report File for details about why the input signal willbecome undriven.
Query5. Where is that “Section 5” of the map report? It may be a ridiculous question, but sorry, I really can’t find it; maybe it is in the Xilinx log file!
    4. ERROR:MapLib:978 - LUT6 symbol  "window/theVI/Component_dash_Level_IP_bksl_ out1_ind_2/PlainIndicator.PlainIndicator/cQ_0_rstpot" (output signal= window / theVI/Component_dash_Level_IP_bksl_out1_ind_2/PlainIndicator.PlainIndicator/ cQ_0_rstpot) has an equation that uses input pin I5, which no longer has a connected signal. Please ensure that all the pins used in the equation for this LUT have signals that are not trimmed (see Section 5 of the Map Report File for details on which signals were trimmed). Error found in mapping process, exiting.Errors found during the mapping phase. Please see map report file for more details.  Output files will not be written.
Seeing these errors, I have reached the following conclusions.
There is some problem in making that VHDL wrapper file; LabVIEW does not recognize the Verilog component instantiated in it and treats it as an unresolved black box.
Query6. Is there any step I may be missing while making this VHDL wrapper file? In my opinion I have tried every possibility in the docs/help available on the NI forums.
2. Query7. Maybe it is a pure Xilinx issue, i.e. some sort of library conflict, as the Verilog module is not binding to the top VHDL module, as can be seen from warning HDLCompiler:89. If this is the case, then how do I resolve that library conflict? A hint regarding this issue is given in point 7 of the tutorial “How do I Integrate Verilog HDL with LabView FPGA module” (http://digital.ni.com/public.nsf/allkb/7269557B205B1E1A86257640000910D3), but not much has been said about resolving it.
3. Because of this unidentified black box, the whole design could not be mapped and hence could not be compiled.
    P.S.
I have attached the LabVIEW project zip folder containing simple_translate.v, the simple_and_verilog.vi file, SimpleAnd_Wrapper.xml, and the Xilinx log file after compilation, along with other files. Kindly analyze them and help me out in resolving this basic issue.
    Please note that I have made all settings regarding:
Unchecked the “Add I/O Buffers” option in XST of the Xilinx ISE 12.4 project.
Set “Pack I/O Registers into IOBs” to NO in the XST properties of the project.
Synchronization registers are set to zero by default for all CLIP I/O terminals.
I need speedy help, please. Thanking you in anticipation.
    Attachments:
XilinxLog.txt 256 KB
labview project files.zip 51 KB

  • When using DAQ assistant to read frequency

When using the DAQ Assistant to read frequency, with the task timing set to:
N Samples, clock settings to read 26,
frequency setup to rising edge,
1 counter with a 1 kHz to 10 kHz range,
I get back a single number.
Can I assume this is an average reading of 25 samples with the first sample unused?
What is the base clock used?
Is the “26” 26 cycles of the frequency to be measured?

    Hello,
    If you choose to acquire N samples from the DAQ Assistant then it will acquire all of these samples and return them to LabVIEW as an array.  However, when you use the DAQ Assistant it outputs the data in the dynamic data type first.  This data type makes it easy to graph and run the data through other express VIs.  If you were to create a Numeric Indicator from this data type it would just display the last element from the dynamic data array.  To display this data properly in a numeric format convert the dynamic data to an array of doubles by using the Convert From Dynamic Data function in LabVIEW.  Then you can select to convert it to a 1D array of scalars and when you create an indicator off of the output of this function all of the data should be displayed.
    The timebase that is used for lower frequency measurements is the onboard clock, which is internally connected to the Source.  Then you connect your signal to be measured to the Gate of the counter.  Since the frequency of the onboard clock is known it can be used to calculate the frequency of an unknown source based on when the counter is on and off (determined by the Gate). 
    Have a good day,
    Brian P.
    Applications Engineer
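To make the "N samples, then average" behaviour explicit, here is a rough counter-frequency sketch in the nidaqmx Python package; the counter name, the 1-10 kHz range, and the 26-sample count mirror the question but are otherwise placeholders, and the averaging is done plainly in software.
==============================
# Sketch: read 26 frequency measurements from a counter task and average
# them in software. Assumes nidaqmx; "Dev1/ctr0" is a placeholder.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ci_channels.add_ci_freq_chan("Dev1/ctr0",
                                      min_val=1_000.0, max_val=10_000.0)
    task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=26)
    readings = task.read(number_of_samples_per_channel=26)   # 26 values in Hz
    print(sum(readings) / len(readings))                     # simple average
==============================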

  • DAQ Assistant Tasks

How do I create a task in the LabVIEW DAQ Assistant for one of our cDAQ modules without actually being connected to the cDAQ module?

You can simulate a large range of the instruments supported by DAQmx with Measurement & Automation Explorer (MAX).
Right-click on NI-DAQmx Peripherals, then Create New, then Simulate, and choose from the list of supported devices.
For cDAQ, create your chassis first, then your module.
Excuse me, but I don't have an English version of MAX, so I may not have the exact translation of the commands...
When your simulated device is configured, you can use it with the LabVIEW DAQ Assistant.

  • Daq Assist and Graphing

    A very simple problem...
Very new to LabVIEW, and I am struggling with wiring up my DAQ Assistant in order to graph data from a load cell. I've connected my load cell to the DAQ and want to measure force readings over a span of time / until I press stop. When I run my program it only graphs a finite number of readings and then erases the graph to draw new readings on top. I put my graph outside of the while loop so that it would graph one reading at a time as they were read, but it's not working. If my wiring isn't what's wrong, I have a feeling that my timing settings for the DAQ Assistant are not right (and I don't know how to set those either). I don't understand the description/effects of Rate and Samples to Read.
    Thanks for your help.
    Attachments:
Learning Load Test.vi 61 KB

    Hi AFLR,
I think the settings are fine: you have set the DAQ to read 100 samples at a rate of 100 samples/second, so you'll get 100 samples every second.
Now, in order to retain the previous data in the graph (which is not the nature of a graph), you need to preserve it by writing some extra code.
If you already know about:
1. Shift registers and
2. Components of a waveform,
you can easily implement this requirement; find the attached VI for your reference.
    I am not allergic to Kudos, in fact I love Kudos.
     Make your LabVIEW experience more CONVENIENT.
    Attachments:
Learning Load Test_Modified.vi 79 KB

  • DAQ Assistant verses DAQmx

What are the advantages of acquiring data with the DAQmx blocks rather than using the DAQ Assistant block? Are there pros and cons to each? What is normally used in industry?
I've also noticed there are multiple methods for writing data to file and wondered what the proper method, or the pros and cons, for this are.

    Dawud-Beale wrote:
What are the advantages of acquiring data with the DAQmx blocks rather than using the DAQ Assistant block? Are there pros and cons to each? What is normally used in industry?
You have A LOT more control over the acquisition with the DAQmx API than with the DAQ Assistant.  You can also make the code more efficient with the API.  The norm in industry is to avoid the DAQ Assistant.
    The only real advantage of the DAQ Assistant is that it is easier to set up your task.
    Dawud-Beale wrote:
I've also noticed there are multiple methods for writing data to file and wondered what the proper method, or the pros and cons, for this are.
    Well, that just depends on your requirements.
Need human-readable?  Go with a tab-delimited text file or CSV.
Need to log a lot of data quickly?  Use a TDMS or binary file.  I tend to use TDMS since it is well put together for DAQ and waveforms.
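As a small illustration of the human-readable option, here is a sketch that logs single readings with a timestamp to a CSV file using the standard library plus the nidaqmx Python package; for high-rate streaming a TDMS or binary file is usually the better fit. Dev1/ai0, the file name, and the sample count are placeholders.
==============================
# Sketch: log on-demand readings to a human-readable CSV file.
# Assumes nidaqmx; "Dev1/ai0" and "log.csv" are placeholders.
import csv
import time
import nidaqmx

with nidaqmx.Task() as task, open("log.csv", "w", newline="") as f:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    writer = csv.writer(f)
    writer.writerow(["t_s", "volts"])
    t0 = time.monotonic()
    for _ in range(100):
        writer.writerow([time.monotonic() - t0, task.read()])
==============================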
