Acquiring and Averaging DAQmx Values
All,
I am acquiring two voltage signals, one from a differential probe and one from a clamp-on current probe.
Two questions:
1. How do I configure my DAQ device (a USB-6212 in this instance) to acquire one signal on AI0 and the other signal on AI1?
2. Is there an automated way to do the averaging? My For Loop works, but it is not elegant.
As always, any help would be greatly appreciated. Image and LabVIEW 2009 code attached.
Thanks!
Attachments:
Fluke Voltage and Current.JPG 70 KB
Read and Graph Fluke Voltage and Current.vi 18 KB
To acquire both signals in one task (and use a different scale for each), wire the task output of one DAQmx Create Channel into the task input of the next, rather than creating two separate tasks as you have done.
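For question 2: LabVIEW ships a Mean VI (Mathematics » Probability & Statistics palette) that replaces the explicit For Loop. In text form the operation is just a per-channel arithmetic mean; here is a minimal Python sketch with made-up sample values:

```python
# Illustrative sketch only: average each acquired channel without an
# element-by-element loop. The sample values are invented for demonstration.
voltage_samples = [1.02, 0.98, 1.01, 0.99]   # AI0, differential probe
current_samples = [0.51, 0.49, 0.50, 0.50]   # AI1, clamp-on current probe

def mean(samples):
    """Arithmetic mean of one channel's samples."""
    return sum(samples) / len(samples)

print(mean(voltage_samples))  # 1.0
print(mean(current_samples))  # 0.5
```

In LabVIEW terms, wiring each channel's waveform into the Mean VI does the same thing in one node.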
Similar Messages
-
I'm analysing the differences between several images and I would like to select a large area of each (such as with the lasso tool) and find the average RGB value of all of the pixels within the selection.
With the selection active you could copy and paste it to a new layer and read the histogram with the histogram source set to the selected layer. The eyedropper's sample-size setting may also help to get the average of a square area around a point.
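In script form, the average reduces to a per-channel mean over the selected pixels. This is a hedged Python sketch with hypothetical pixel values; a real workflow would pull the selected pixels out of the image file first:

```python
# Hypothetical selection: each pixel is an (R, G, B) tuple.
selection = [(200, 100, 50), (210, 110, 60), (190, 90, 40)]

def average_rgb(pixels):
    """Per-channel mean over the selected pixels."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

print(average_rgb(selection))  # (200.0, 100.0, 50.0)
```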
-
Calculating Percent totals and Averages in SSRS
Below is a report created using SSRS. I want the percentage (%) totals to add up to 100% in the Total column. For some reason my averages are not working, as you can see from the table. The idea is to calculate the averages of Opportunities as well as Sold, but from my calculation things are not adding up as I expected. This report is not grouped by parent. How do I make this work as it should? Thank you.
(NB: Please zoom in to view the image)
Zionlite
Hi,
According to the description, I doubt you may manually add Total and Averages columns. Did you manually add Local and International or they were in a column group?
You can use the following expression for the Opportunity Cost(averages) column in the total row:
=Sum(Fields!opportunitycost.Value)/2
How did you calculate the percentage column? If it is Sold value divided by Opportunity Cost value, use:
=Sum(Fields!sold.Value)/Sum(Fields!opportunitycost.Value)
Also try defining the scope in the expression and see if it makes a difference.
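The reason the total row needs `Sum(...)/Sum(...)` rather than an average of the per-row percentages is that a ratio of sums and a mean of ratios generally disagree. A quick Python illustration with made-up figures:

```python
# Made-up figures: two rows of (sold, opportunity_cost).
rows = [(50, 100), (30, 300)]

# Correct total-row percentage: ratio of the sums.
ratio_of_sums = sum(s for s, o in rows) / sum(o for s, o in rows)

# Mean of per-row percentages: a different (usually wrong) number.
mean_of_ratios = sum(s / o for s, o in rows) / len(rows)

print(ratio_of_sums)              # 0.2  (80 / 400)
print(round(mean_of_ratios, 2))   # 0.3  ((0.5 + 0.1) / 2)
```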
If there is no progress, please capture the screenshot of your design tab.(Click on the report and capture the image of your report structure)
Also, tell us the expression of the text box which in problem.
Thanks.
Tracy Cai
TechNet Community Support -
Bind Variable in SELECT statement and get the value in PL/SQL block
Hi All,
I would like to pass a bind variable in a SELECT statement and get the value of the column in dynamic SQL.
Please see below.
I want to get the below value
Expected result:
select distinct empno ,pr.dept from emp pr, dept ps where ps.dept like '%IT' and pr.empno =100
100, HR
select distinct ename ,pr.dept from emp pr, dept ps where ps.dept like '%IT' and pr.empno =100
TEST, HR
select distinct loc ,pr.dept from emp pr, dept ps where ps.dept like '%IT' and pr.empno =100
NYC, HR
Using the below block I am getting only the column names, not the values of the columns. I need to pass that value (TEST, NYC, ...) into the l_col_val variable.
Please suggest
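One likely cause, worth stating up front: a bind variable always supplies a value, never an identifier, so binding `:b_l_col_name` makes the query select the literal string 'EMPNO' rather than the column of that name. The sketch below reproduces this behavior with Python's built-in sqlite3, used only as a stand-in for the same rule in Oracle; the PL/SQL fix is analogous, i.e. concatenate the (validated) column name into `l_col_ddl` and bind only the `pr.empno` value.

```python
# Demonstrates that a bind parameter is treated as a literal value,
# not as a column identifier. sqlite3 stands in for Oracle here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (empno INTEGER, ename TEXT)")
conn.execute("INSERT INTO emp VALUES (100, 'TEST')")

# Binding the column name: the bind is a string literal in the result.
bound = conn.execute("SELECT ? FROM emp WHERE empno = ?",
                     ("ename", 100)).fetchone()[0]
print(bound)  # prints: ename  (the literal, not the column's value)

# Concatenating the identifier into the SQL text instead, and binding
# only the value, returns the column's data. In real code the column
# name must be validated against an allow-list to avoid SQL injection.
col = "ename"
row = conn.execute(f"SELECT {col} FROM emp WHERE empno = ?",
                   (100,)).fetchone()[0]
print(row)  # prints: TEST
```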
----- TABLE LIST
CREATE TABLE EMP(
EMPNO NUMBER,
ENAME VARCHAR2(255),
DEPT VARCHAR2(255),
LOC VARCHAR2(255)
);
INSERT INTO EMP (EMPNO,ENAME,DEPT,LOC) VALUES (100,'TEST','HR','NYC');
INSERT INTO EMP (EMPNO,ENAME,DEPT,LOC) VALUES (200,'TEST1','IT','NYC');
INSERT INTO EMP (EMPNO,ENAME,DEPT,LOC) VALUES (300,'TEST2','MR','NYC');
INSERT INTO EMP (EMPNO,ENAME,DEPT,LOC) VALUES (400,'TEST3','HR','DTR');
INSERT INTO EMP (EMPNO,ENAME,DEPT,LOC) VALUES (500,'TEST4','HR','DAL');
INSERT INTO EMP (EMPNO,ENAME,DEPT,LOC) VALUES (600,'TEST5','IT','ATL');
INSERT INTO EMP (EMPNO,ENAME,DEPT,LOC) VALUES (700,'TEST6','IT','BOS');
INSERT INTO EMP (EMPNO,ENAME,DEPT,LOC) VALUES (800,'TEST7','HR','NYC');
COMMIT;
CREATE TABLE COLUMNAMES(
COLUMNAME VARCHAR2(255)
);
INSERT INTO COLUMNAMES(COLUMNAME) VALUES ('EMPNO');
INSERT INTO COLUMNAMES(COLUMNAME) VALUES ('ENAME');
INSERT INTO COLUMNAMES(COLUMNAME) VALUES ('DEPT');
INSERT INTO COLUMNAMES(COLUMNAME) VALUES ('LOC');
COMMIT;
CREATE TABLE DEPT(
DEPT VARCHAR2(255),
DNAME VARCHAR2(255)
);
INSERT INTO DEPT(DEPT,DNAME) VALUES ('IT','INFORMATION TECH');
INSERT INTO DEPT(DEPT,DNAME) VALUES ('HR','HUMAN RESOURCE');
INSERT INTO DEPT(DEPT,DNAME) VALUES ('MR','MARKETING');
INSERT INTO DEPT(DEPT,DNAME) VALUES ('IT','INFORMATION TECH');
COMMIT;
PL/SQL BLOCK
DECLARE
TYPE EMPCurTyp IS REF CURSOR;
v_EMP_cursor EMPCurTyp;
l_col_val EMP.ENAME%type;
l_ENAME_val EMP.ENAME%type;
l_col_ddl varchar2(4000);
l_col_name varchar2(60);
l_tab_name varchar2(60);
l_empno number ;
b_l_col_name VARCHAR2(255);
b_l_empno NUMBER;
begin
for rec00 in (
select EMPNO aa from EMP)
loop
l_empno := rec00.aa;
for rec in (select COLUMNAME as column_name from columnames)
loop
l_col_name := rec.column_name;
begin
l_col_val :=null;
l_col_ddl := 'select distinct :b_l_col_name ,pr.dept ' ||' from emp pr, dept ps where ps.dept like ''%IT'' '||' and pr.empno =:b_l_empno';
dbms_output.put_line('DDL ...'||l_col_ddl);
OPEN v_EMP_cursor FOR l_col_ddl USING l_col_name, l_empno;
LOOP
l_col_val :=null;
FETCH v_EMP_cursor INTO l_col_val,l_ename_val;
EXIT WHEN v_EMP_cursor%NOTFOUND;
dbms_output.put_line('l_col_name='||l_col_name ||' empno ='||l_empno);
END LOOP;
CLOSE v_EMP_cursor;
END;
END LOOP;
END LOOP;
END;
user1758353 wrote:
Thanks Billy. Would you be able to suggest any other, faster method to load the data into the table? Thanks.
As Mark responded - it all depends on the actual data to load, structure and source/origin. On my busiest database, I am loading on average 30,000 rows every second from data in external files.
However, the data structures are just that - structured. Logical.
Having a data structure with 100's of fields (columns in a SQL table), raise all kinds of questions about how sane that structure is, and what impact it will have on a physical data model implementation.
There is a gross misunderstanding by many when it comes to performance and scalability. The prime factor that determines performance is not how well you code, what tools/language you use, the h/w your code runs on, or anything like that. The prime factor that determines performance is the design of the data model, as that determines the complexity/ease of using the data model, and the amount of I/O (the slowest of all db operations) needed to use the data model effectively. -
I've attached a VI that I am using to acquire amplitude from spectrum analyzers. I tried to connect the amplitude output to Write Characters To File.vi and Write to Spreadsheet File.vi. Unfortunately, when I run this VI continuously without interruption, LabVIEW asks me many times to enter a new file name to save each new value.
So, how can I acquire and save data at the same time, into the same file, when I run my VI continuously, for example for 10 min?
Thank you in advance.
Regards,
Attachments:
HP8563E_Query_Amplitude.vi 37 KB
Hi,
Your VI does work perfectly. Unfortunately this is not what I want to do; I made an error in my last comment. I am sorry for this.
So I explain to you again what I want to do exactly. I want to acquire amplitude along road by my vehicle. I want to use wheel signal coming from vehicle to measure distance along road. Then I acquire 1 amplitude each 60 inches from spectrum analyzer.
I acquire from PC parallel port a coded wheel signal coming from vehicle (each period of the signal corresponds to 12 Inches). Figure attached shows the numeric signal coming from vehicle, and the corresponding values “120” and “88” that I can read from In Port vi.
So I want to acquire the amplitude from the spectrum analyser once every 5 periods of the signal that I am acquiring from the parallel port.
So first I have to find out how to count the number of periods from reading the values “120” and “88” from In Port (I don't know how to count periods from those readings).
Here is a new algorithm:
1) i = 0 (counter: number of periods)
2) Read a value from In Port
3) If a period is detected, i = i + 1 (another period)
4) If i is a multiple of 5 (i.e. 5 more periods have been read), acquire the amplitude once and write to the same file this amplitude and the corresponding distance (Distance = 12*i; remember each period of the signal corresponds to 12 inches). i takes the values 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, ...
5) Back to 2) if not stopped.
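The counting in steps 1-5 amounts to edge detection on the In Port readings. This is an illustrative Python sketch; 88 and 120 are the two signal levels described above, and the transition from 88 to 120 is assumed to mark the start of a new period:

```python
def count_trigger_points(readings, low=88, high=120, every=5):
    """Count signal periods in a stream of port readings and return the
    distances (in inches, at 12 in/period) at which an amplitude
    acquisition should fire: once every `every` periods."""
    periods = 0
    triggers = []
    previous = None
    for value in readings:
        # An edge from low to high marks the start of a new period.
        if previous == low and value == high:
            periods += 1
            if periods % every == 0:
                triggers.append(12 * periods)   # distance travelled so far
        previous = value
    return triggers

# Ten simulated periods: 88 then 120, repeated.
stream = [88, 120] * 10
print(count_trigger_points(stream))  # [60, 120]
```

At each trigger point the real VI would perform one GPIB amplitude read and append the amplitude plus distance to the (already open) file.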
Thank you very much for helping me.
Regards,
Attachments:
Acquire_Amplitude_00.vi 59 KB
Figure_Algorithm.doc 26 KB -
How to use and configure DAQmx software triggering in Labview 7.0
Dear NI support staff,
I want to use software triggers to start and stop acquiring an indefinite number of analog samples (as opposed to the fixed number of N samples in most of your code examples). I don't know the correct way to configure the DAQmx property nodes and the DAQmx Send Software Trigger VIs; the property node doesn't seem to have an entry for a software trigger, so I think I am not on the right track. Would you please help me out? It would be great if you could show me an example. Thanks a lot!
Regards,
Bennett
Hello Bennett,
The example below demonstrates how to perform an analog software triggered acquisition in NI-DAQmx. The example allows the user to specify the triggering condition and the number of pre-trigger samples to acquire.
Example Code - Triggering
Let me know if you have further questions.
Good luck and have a great day!
Regards,
Koninika
Applications Engineer
National Instruments -
Finding and averaging multiple rows
Hello,
I am having a little problem and was wondering if anyone had any ideas on how to best solve it.
Here is the problem:
- I have a large file 6000 rows by 2500 columns.
- First I sort the file by columns 1 and 2
- then I find that various rows in these two columns (1 and 2) have duplicate values, sometimes only twice, but sometimes three or four, or five or up to 9 times.
- this duplication occurs in only the first two columns, but we don't know in which rows and we don't know how much duplication there is. The remaining columns, i.e. column 3 to column 2500, for the corresponding rows contain data.
- Programmatically, I would like to find the duplicated rows by searching columns 1 and 2 and, when I find them, average the respective data for these rows in columns 3 to 2500.
- Once this is done I want to save the averaged data to file. In each row this file should have the values of columns 1 and 2 and the averaged row values for columns 3 to 2500. So the file will have n rows by 2500 columns, where n depends on how many duplicated rows there are in the original file.
I hope that this makes sense. I have outlined the problem in a simple example below:
In the example below we have two duplicates in rows 1 and 2 and four duplicates in rows 5 to 8.
Example input file:
Col1 Col2 Col3 ... Col2500
3 4 0.2 ... 0.5
3 4 0.4 ... 0.8
8 5 0.1 ... 0.4
7 9 0.7 ... 0.9
2 8 0.1 ... 0.5
2 8 0.5 ... 0.8
2 8 0.3 ... 0.2
2 8 0.6 ... 0.7
6 9 0.9 ... 0.1
So, based on the above example, the first two rows need averaging (two duplicates) as do rows 5 to 8 (four duplicates). The output file should look like this:
Col1 Col2 Col3 ... Col2500
3 4 0.3 ... 0.65
8 5 0.1 ... 0.4
7 9 0.7 ... 0.9
2 8 0.375 ... 0.55
6 9 0.9 ... 0.1
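For reference, the grouping itself is compact when written out. This is a Python sketch using the example rows above; a LabVIEW implementation would do the same keyed accumulation with a shift register:

```python
from collections import OrderedDict

rows = [
    [3, 4, 0.2, 0.5],
    [3, 4, 0.4, 0.8],
    [8, 5, 0.1, 0.4],
    [7, 9, 0.7, 0.9],
    [2, 8, 0.1, 0.5],
    [2, 8, 0.5, 0.8],
    [2, 8, 0.3, 0.2],
    [2, 8, 0.6, 0.7],
    [6, 9, 0.9, 0.1],
]

def average_duplicates(rows):
    """Group rows on the first two columns and average the remaining
    columns, preserving first-seen order of the keys."""
    groups = OrderedDict()
    for r in rows:
        groups.setdefault((r[0], r[1]), []).append(r[2:])
    out = []
    for (c1, c2), data in groups.items():
        n = len(data)
        means = [sum(col) / n for col in zip(*data)]
        out.append([c1, c2] + means)
    return out

for row in average_duplicates(rows):
    print(row)
# Output matches the expected table above (up to float rounding):
# keys (3,4), (8,5), (7,9), (2,8), (6,9) with averaged data columns.
```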
Solved!
Go to Solution.
Well, here's an initial crack at it. The premise behind this solution is to not even bother with the sorting. Also, trying to read the whole file at once just leads to memory problems. The approach taken is to read the file in chunks (as lines) and then for each line create a lookup key to see if that particular line has the same first and second columns as one that we've previously encountered. A shift register is used to keep track of the unique "keys".
This is only an initial attempt and has known issues. Since a Build Array is used to create the resulting output array, the loop will slow down over time, though it may slow down, speed up, and slow down again as LabVIEW performs internal memory management to allocate more memory for the resultant array. On the large 6000 x 2500 array it took several minutes on my computer. I did this on LabVIEW 8.2, and I know that LabVIEW 8.6 has better memory management, so the performance will likely be different.
Attachments:
Averaging rows.vi 30 KB -
Overall Average and Average of part of data-set displayed simultaneously
I am trying to write a report on Discoverer 10.1.2 that will display the average for a sub-set of a data population alongside the overall average of the whole population, BUT NOT display the average for the remaining sub-sets.
For example:
If there are 5 regions, A, B, C, D & E, the requirement is to display the average for region A against the overall average (averaged across all regions) and not to display the averages for regions B, C, D & E.
Can anyone suggest how this could be achieved in Discoverer Desktop/Plus/Admin?
Thanks
Duncan
Hi James,
It looks to me like your code is not giving you the results you desire because of an assignment syntax error. You want to add all the Upr_Values together and then divide by the Upr_Count to get the overall average for the upper lobe. Your code, though, just overwrites the previous Upr_Value with the current one and assigns the value +1 to Upr_Count each time:
Upr_Value =+ Chd(iLoop, "Strg_Angle")
Upr_Count =+ 1
Here's what you need instead to calculate the overall average:
Upr_Value = Upr_Value + Chd(iLoop, "Strg_Angle")
Upr_Count = Upr_Count + 1
If you want to get the average values for each X value below the blue line, then I recommend using the Calculate() command to create two new sets of X and Y channels, one set for the upper lobe and another set for the lower lobe. The Calculate() command can also set all unwanted data to 0 or Null so that you can then calculate the per-X value average in each lobe with the linear approximation function.
If you want help with this, post or email me a sample data set,
Brad Turpin
DIAdem Product Support Engineer
National Instruments
[email protected] -
Calculate total and average for same key figure
Hi Experts,
I have a requirement where I need to calculate the total and average for the same key figure, number of employees.
e.g.:
If I enter 03.2009 as input, the result should cover from the start of the financial year to the current month.
        11.2008  12.2008  1.2009  2.2009  3.2009  average
        11       10       12      10      10      10.6
        10       10       11      12      10      10.6
total   21       20       23      22      20      21.2
we have only one characteristic in rows... companycode.
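Numerically, the requested output is a row-wise average column plus a column-wise total row. A small Python sketch reproducing the example table above:

```python
# Example key-figure values per month, one list per company code row.
months = ["11.2008", "12.2008", "1.2009", "2.2009", "3.2009"]
data = [
    [11, 10, 12, 10, 10],
    [10, 10, 11, 12, 10],
]

def row_average(row):
    """Average across the months of one row."""
    return sum(row) / len(row)

# Column-wise totals across all rows.
totals = [sum(col) for col in zip(*data)]

for row in data:
    print(row, "average:", row_average(row))        # 10.6 for each row
print("total:", totals, "average:", row_average(totals))  # 21.2
```

In BEx this corresponds to a total result row plus a calculated key figure for the average.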
Waiting for your Inputs.
Regards
Prasad Nannuri
No, it will work for you.
You have to use a variable on 0CALMONTH or fiscal period, depending on which time characteristic you are using.
Let's say that variable is zcalmonth, based on 0CALMONTH for example.
Now restrict the key figure with this variable zcalmonth on the time characteristic 0CALMONTH.
Copy and paste the restricted key figure.
Now set the offset for the variable in the variable selection dialog box to -1.
Repeat this as many times as you need.
Make this variable mandatory.
Now at query execution the user will select a value for month/year, and you will see all 5 months in the result set.
Since there can be at most 12 months in a year, you end up creating only 12 restricted key figures.
Use YEAR in the restricted key figure too, and restrict it with a YEAR variable processed by customer exit = current year.
In this case it automatically removes any additional values.
For example:
YEAR = 2008 only
User entered 6/2008
Let's say your financial year starts in April 2007 and ends in April 2008, so you expect to see:
4/08
5/08
6/08
You created 12 restricted key figures, but it will show the months 4, 5 and 6 only. -
Where to put data processing routine when acquiring data using DAQmx
I have a program that is acquiring data using the DAQmx Acquire N Samples mechanism with automatic reset and a data-handler callback routine. DAQmx acquires N samples (usually 1024) from the board, calls the handler to do something with it, and then resets to get the next batch of data. The program acquires a number of lines of data, say 512 lines of N points each, with one callback call per line. Triggering is done by a hardware trigger at the start of each line of data. So far so good.
The issue is that the time that can be spent in the callback is limited, or else the callback is not finished when the next batch of data is ready to be transferred from the DAQmx buffers and processed. There is a substantial amount of analysis to be done after the entire frame has been acquired, and it ends up taking far longer than the time between lines; so where to put the processing? The data acquisition is started from a control callback that exits back to the idle loop after it starts the acquisition process, so there is no code waiting to execute, or to return to, when the data acquisition is finished.
I could try to put the data analysis routine into an idle-time routine and trigger it with a semaphore, or I could put it into a timer control callback with, say, a 10 millisecond repetition rate and poll a flag, setting the flag when all of the data has been acquired. Any suggestions would be appreciated.I would recommend using Thread Safe Queues. Your acquisition callback can place items in the TSQ and then you can process the data in a separate thread. TSQs are nice because they allow you to install a callback function to run for certain events. Most importantly, you can install a callback for the Items in Queue or Queue Size Changed event which will run the callback if a certain number of items are in the queue. This lets you take advantage of multithreading in a simple and safe way using a standard Producer/Consumer architecture. However, you may still run into problems with this architecture if your acquisition thread is running much faster than the consumer thread. You could eventually overflow the queue. In that case, your only options are to either get a faster system, slow down the acquisition or do the data handling post process.
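In outline, the recommended producer/consumer pattern looks like this (Python's `queue.Queue` standing in for CVI's Thread Safe Queues; the analysis step is a placeholder):

```python
import queue
import threading

data_queue = queue.Queue(maxsize=64)   # bounded, like a TSQ
results = []

def acquisition_callback(batch):
    """Producer: called once per N-sample batch. It must return quickly,
    so it only enqueues the data."""
    data_queue.put(batch)

def consumer():
    """Consumer thread: does the slow analysis off the acquisition path."""
    while True:
        batch = data_queue.get()
        if batch is None:                        # sentinel: acquisition done
            break
        results.append(sum(batch) / len(batch))  # placeholder analysis
        data_queue.task_done()

worker = threading.Thread(target=consumer)
worker.start()
for line in range(3):                   # simulate 3 lines of data
    acquisition_callback([line] * 4)    # fake 4-sample batches
data_queue.put(None)                    # signal end of acquisition
worker.join()
print(results)  # [0.0, 1.0, 2.0]
```

The bounded queue also makes the overflow risk mentioned above visible: if the consumer falls behind, `put` blocks (or fails) instead of silently exhausting memory.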
National Instruments
Product Support Engineer -
Acquiring signal using DAQmx 6221
I'm new to using DAQ. I need some pointers on how to go about the following
issue. I want to acquire a sine wave through DAQmx (PCI 6221). I need to
acquire the signal at a particular rate and store it as a binary and ASCII
file. Also I want to know how to output a signal through DAQmx. Could you
please give some pointers and example VIs if any?
Thanks,
Sukanya
Go to Help>Find Examples. Then expand Hardware Input and Output>DAQmx. Also look at Getting Started with DAQmx.
-
How can I acquire and record pre trigger samples
Hi, I would like to acquire and record pre-trigger samples. I have used many different methods, including:
http://digital.ni.com/public.nsf/allkb/9DE9E3E4DAD9EE93862579C60018B6EA
The problem is that I would like to acquire pre-trigger samples before a start trigger, not a reference trigger. I think the reference trigger is used for stop triggering, not start triggering. Could you please help me with this?
Thanks
Hi tintin_99,
I hope you are well and thank you for posting your issues to the forum.
I wanted to ask you some further information in regards to your application:
- What version of LabVIEW do you have?
- What operating system (OS) are you running on your PC/laptop?
- Which application programming interface (API) are you using? NI-DAQmx, NI-Scope?
When you mention you want to acquire the pre-trigger samples before the start trigger and not the reference trigger, how are you currently doing this? Would you be able to post up your code so that everyone can have a look at it?
Here are some further links to look at:
- Continuous Analog Input Acquisition with Pre-trigger Scans using 2 E-series Boards
- How Can I Acquire Pretrigger Samples with NI-SCOPE?
- Software Analog Trigger with Pre-Trigger Samples
- Synchronized Analog Input/Output with an Analog Trigger and Pre-Triggered Samples
I hope this is alright for now.
Kind Regards,
Dom C -
Dithering And Averaging With The 6025e Board
Hi Guys,
I'm trying to acquire a video analog signal, using the 6025e board. I've read that this board is able to dither and/or average the data stream in order to improve the signal and lower the noise.
The problem is that I did not find how to enable/disable dithering or averaging. Can you help me out with that?
Also, are there any parameters for dithering and averaging that I can additionally control (dither noise level, number of points to average)?
Hi AnalogKid2DigitalMan, thank you very much for your reply.
First, can you please explain why I can reliably sample only 100 kHz signals maximum?
Secondly, if you meant that the sampling rate of this board is too slow, then it's okay, since we're using a laser scanning microscope which produces a very low-rate H-Sync of about 100 - 500 Hz. Our only concern is that in the time between 2 H-Syncs the board is fast enough to acquire a given number of pixels. For example, say we have an H-Sync of 200 Hz and we would like a 1000x1000 image; then the time between 2 H-Syncs is 1/200 = 0.005 s. We need 1000 pixels during this time, so 1/(0.005/1000) = 200 kHz is the sample rate required.
A. Does anybody know how to activate/disable the dithering and the averaging capabilities of the 6025e board ? ( And if there are parameters that enable more control over these functions ).
B. Does dithering/averaging lower the effective sample rate of the board ?
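As background on how dither plus averaging improves the signal (per the behavior described above): adding small random noise before quantization decorrelates the quantization error, so averaging many samples recovers sub-LSB resolution. A quick Python illustration of the general principle with an ideal 1-LSB quantizer; this sketches the technique only, not the 6025E's actual implementation:

```python
import random

random.seed(42)

TRUE_LEVEL = 0.3   # signal level in LSBs, below one quantizer step

def quantize(x):
    """Ideal 1-LSB quantizer (round to nearest integer code)."""
    return round(x)

# Without dither every sample quantizes to the same wrong code.
plain = [quantize(TRUE_LEVEL) for _ in range(10000)]

# With +/-0.5 LSB uniform dither the codes straddle the true level,
# and their average converges to it.
dithered = [quantize(TRUE_LEVEL + random.uniform(-0.5, 0.5))
            for _ in range(10000)]

print(sum(plain) / len(plain))                   # 0.0, stuck on one code
print(round(sum(dithered) / len(dithered), 2))   # close to 0.3
```

This also suggests the answer to question B: averaging N dithered points per output value divides the effective output rate by N.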
Your help is kindly appreciated. -
I'm acquiring 64 channels at a sampling rate of about 1450 kHz continuously and showing the data second by second in a CWGraph. Now, I would like to add on-line averaging for all of those 64 channels and show it in another CWGraph. The averaging should occur with respect to an external trigger, and each sample should consist of about a 100 ms prestimulus and a 500 ms poststimulus period. Has anybody been handling this sort of thing? How is it solved with ComponentWorks and Visual Basic?
I'm using dual Pentium 700 MHz processors and an NI-DAQ PCI-6033E card.
1. To get started with external triggering, check out the examples in the Triggering folder that install with CW under MeasurementStudio\VB\Samples\DAQ\Triggering.
2. Create a prestimulus buffer that always contains the latest 100 ms of data. Just modify the CWAI1_AcquiredData event to transfer ScaledData (the data acquired) to your prestimulus buffer. See how there will always be a prestimulus buffer of data regardless of trigger state?
3. On the trigger, you can use the same CWAI1_AcquiredData event to route the data into poststimulus buffer and average the 2.
I see something like this:
if no trigger
100 ms of data to prestimulus buffer
if trigger
500 ms of data to poststimulus buffer
averaging and display function
reset trigger and buffers
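The if/else outline above can be sketched directly (Python; a deque serves as the rolling prestimulus buffer, and the small sample counts stand in for 100 ms and 500 ms of data):

```python
from collections import deque

PRE = 3    # stand-in for 100 ms worth of samples
POST = 5   # stand-in for 500 ms worth of samples

pre_buffer = deque(maxlen=PRE)   # always holds the latest PRE samples
sweeps = []                      # completed (pre + post) records

def on_acquired_data(samples, triggered):
    """Mimics the CWAI1_AcquiredData event handler described above."""
    if not triggered:
        pre_buffer.extend(samples)   # keep the prestimulus buffer rolling
    else:
        # Join the buffered prestimulus data with the poststimulus data.
        sweeps.append(list(pre_buffer) + samples[:POST])

# Feed untriggered data, then a triggered batch.
on_acquired_data([1, 2, 3, 4], triggered=False)
on_acquired_data([10, 20, 30, 40, 50], triggered=True)
print(sweeps[0])  # [2, 3, 4, 10, 20, 30, 40, 50]
```

Averaging across stimuli is then an element-wise mean over the collected sweeps.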
good luck
ben schulte
application engineer
national instruments
www.ni.com/ask -
Send analog trigger every time data is acquired and saved
I am currently writing a program that acquires and calibrates data from 32 channels of force plates. Once the data is acquired and calibrated it is saved to a file. I need to send a pulse to another computer running labview once every time data is acquired and saved. I have written several versions of this code that send a continuous pulse train while data is being saved. I need to modify the code so that it does not send a continuous pulse, but an individual pulse that is a set frequency every time I acquire data. I believe that I need to use a software trigger to control the pulse, but I can't seem to find a way to have the hardware send a pulse only when I want it to.
The program uses a boolean as a start trigger for the acquisition process. In this manner, when the data begins saving (i.e. the boolean is switched) it would send one analog pulse per iteration of the save loop. Unfortunately, I only seem to be able to either send a continuous pulse that I start before the acquisition begins, or not send any pulse at all. During the time this pulse is being sent I am also constantly acquiring data from the card. I was wondering if there is any way that I can have the pulse start and initialize before the loop begins and then have it send one analog pulse of a set frequency every time the loop iterates. I have tried both DAQmx and Traditional DAQ solutions but have had no success.
Hardware: PCI-6071-E and BNC-2110 boxes, so I have a large selection of ports and my card seems to be compatible with all the latest software capabilities.
Thanks so much for any help!
~Matt
Actually, after tinkering with it for several hours I managed to figure out a way to solve my problem. I'll explain it in case anyone else has the same problem.
What the software does: The software is an acquisition loop that outputs a square wave pulse of finite frequency to another acquisition system at the beginning of each acquisition iteration while data is being saved. This is so that the data from the two systems can be synchronized later.
The program flow is this: Send Single Synchronization Pulse -> Acquire Data -> Calibrate Data -> Save Data to File -> Repeat
The pulse is triggered once data begins saving, but the device begins acquiring data on execution of the software.
To do this I set up the DAQmx channel configuration to CO Pulse Frequency. I then set the timing to finite iterations and the number of iterations to 1. Inside the acquisition loop I enclosed the whole acquisition process in a flat sequence. In the first frame, if the program is saving data it turns the pulse on; if not, the task passes through. The acquisition occurs in the middle frame, and in the final frame, if saving, the task is turned off; if not saving, the task passes through. This allowed me to send one pulse of a finite frequency every time the loop runs.
Thank you very much for your response though. If anyone else runs into a problem like this I can post a screenshot for future reference when I get a chance. Thanks again.
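In pseudocode form, the flat-sequence flow described above is one pulse per save-loop iteration. This is an illustrative Python stand-in; the function names are hypothetical placeholders for the DAQmx tasks:

```python
# Sketch of the loop structure: pulse only while saving, acquire always.
pulses_sent = 0
saved_batches = []

def send_sync_pulse():
    """Stand-in for the finite (1-iteration) CO Pulse Frequency task."""
    global pulses_sent
    pulses_sent += 1

def acquire():
    """Stand-in for the analog input read."""
    return [0.1, 0.2, 0.3]

def loop_iteration(saving):
    # Frame 1: emit exactly one sync pulse, but only while saving.
    if saving:
        send_sync_pulse()
    # Frame 2: acquisition always runs.
    data = acquire()
    # Frame 3: save the calibrated data if requested.
    if saving:
        saved_batches.append(data)

for i in range(5):
    loop_iteration(saving=(i >= 2))   # saving starts on iteration 2

print(pulses_sent, len(saved_batches))  # 3 3
```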