Averaging during data collection

I would like to set up a general VI to take data using three different methods. The first method is software timed, and we would use it for sampling at 0.2 Hz and up. The second method is hardware timed, and would be used for sampling at 100 Hz and up. The third method is where I need help: I would like to set up a VI that collects data on a software-timed interval, but every time it goes to get a sample, it performs a hardware-timed acquisition and averages the results. For example, I would set the software-timed interval to one second, and every second I would like to collect 100 samples at 1000 Hz and return the average of those 100 samples.
Does anyone have any ideas on how to do this?
Christopher Quijano

I've attached a VI that would do this...
Michael Aivaliotis
VI Shots LLC
Attachments:
Acquire_N_Scans_Software_&_hardware_timed.vi (84 KB)
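For readers who can't open the attachment, here is a minimal sketch of the same pattern in text form, using the nidaqmx Python API instead of LabVIEW. The channel name "Dev1/ai0" and the timing constants are illustrative placeholders, not values taken from the attached VI:

    import time
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    RATE = 1000.0    # hardware-timed sample clock (Hz)
    SAMPLES = 100    # samples to average per software-timed tick
    PERIOD_S = 1.0   # software-timed interval (s)

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # placeholder channel
        task.timing.cfg_samp_clk_timing(
            RATE, sample_mode=AcquisitionType.FINITE, samps_per_chan=SAMPLES)
        while True:
            t0 = time.monotonic()
            task.start()                                   # arm one finite burst
            data = task.read(number_of_samples_per_channel=SAMPLES, timeout=2.0)
            task.stop()                                    # re-arm for the next burst
            print(f"average of {SAMPLES} samples: {sum(data) / len(data):.4f} V")
            # Sleep out the remainder of the software-timed period.
            time.sleep(max(0.0, PERIOD_S - (time.monotonic() - t0)))

Each 100-sample burst is hardware timed at 1000 Hz, while the one-second cadence between bursts is only software timed, which is exactly the hybrid the question describes.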

Similar Messages

  • Restrict specific sales order type during data collection

    Hello Gurus,
    Can we restrict a specific sales order type from being collected during data collection, or is customization the only way, i.e. removing all the sales orders with a specific order type after the data collection?
    Can somebody help me with this?
    Thanx & Regards

    You can try defaulting rules to default the demand class, so that you can exclude these demand classes during MDS.

  • Data collection was switched from an AI Config task writing to an hsdl file to synchronized DAQmx tasks logging to TDMS files. Why are different readings produced for the same test?

    A software application was developed to collect and process readings from capacitance sensors and a tachometer in a running spin rig. The sensors were connected to an Aerogate Model HP-04 H1 Band Preamp connected to an NI PXI-6115. The sensors were read using AI Config and AI Start VIs. The data was saved to a file using hsdlConfig and hsdlFileWriter VIs. In order to add the capability of collecting synchronized data from two Eddy Current Position sensors in addition to the existing sensors, which will be connected to a BNC-2144 connected to an NI PXI-4495, the AI and HSDL VIs were replaced with DAQmx VIs logging to TDMS. When running identical tests, the new file format (TDMS) produces reads that are higher and inconsistent with the readings from the older file format (HSDL).
    The main VIs are SpinLab 2.4 and SpinLab 3.8, in the folders "SpinLab old format" and "Spinlab 3.8" respectively. SpinLab 3.8 requires the Sound and Vibration suite to run correctly, but that is used after the part that is causing the problem. The problem is occurring during data collection in the Logger segment of code, or during processing in the Reader/Converter segment of code. I could send the readings from the identical tests if they would be helpful, but the data takes up approximately 500 MB.
    Attachments:
    SpinLab 3.8.zip (1509 KB)
    SpinLab 2.4.zip (3753 KB)
    SpinLab Screenshots.doc (795 KB)

    First of all, how different is the data?  You say that the reads are higher and inconsistent.  How much higher?  Is every point inconsistent, or is it just parts of your file?  If it's just in parts of the file, does there seem to be a consistent pattern as to when the data is different?
    Secondly, here are a couple things to try:
    Currently, you are not calling DAQmx Stop Task outside of the loop; you're just calling DAQmx Clear Task. This means that if any errors occurred in the logging thread, you might not be seeing them (DAQmx Clear Task clears outstanding errors within the task). Add a DAQmx Stop Task before DAQmx Clear Task to make sure that you're not missing an error.
    Try "Log and Read" mode. "Log and Read" is probably going to be fast enough for your application (it's quite fast), so you might just try it and see if you get a different result. All you would need to do is change the enum to "Log and Read", then add a DAQmx Read in the loop (you can just use Raw format since you don't care about the output). I'd recommend that you read in even multiples of the sector size (normally 512) for optimal performance. For example, your rate is 1 MHz, so perhaps read in sizes of 122880 samples per channel (roughly 1/8 of the buffer size, rounded down to the nearest multiple of 4096). Note: this is a troubleshooting step to try to narrow down the problem.
    Finally, how confident are you in the results from the previous HSDL test?  Which readings make more sense?  I look forward to hearing more detail about how the data is inconsistent (all data, how different, any patterns).  As well, I'll be looking forward to hearing the result of test #2 above.
    Thanks,
    Andy McRorie
    NI R&D
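    For reference, suggestion #2 might look like the following sketch in the nidaqmx Python API (the original VIs are LabVIEW; the channel name, TDMS path, and loop count here are assumptions for illustration only):

        import nidaqmx
        from nidaqmx.constants import AcquisitionType, LoggingMode

        CHUNK = 122880  # ~1/8 of the buffer at 1 MHz, an even multiple of 4096

        with nidaqmx.Task() as task:
            task.ai_channels.add_ai_voltage_chan("PXI1Slot2/ai0")  # placeholder channel
            task.timing.cfg_samp_clk_timing(
                1_000_000, sample_mode=AcquisitionType.CONTINUOUS)
            # "Log and Read": samples stream to TDMS *and* are returned to the loop.
            task.in_stream.configure_logging(
                "spinlab.tdms", logging_mode=LoggingMode.LOG_AND_READ)
            task.start()
            try:
                for _ in range(100):  # service the read loop; the data itself is discarded
                    task.read(number_of_samples_per_channel=CHUNK)
            finally:
                task.stop()           # surfaces task errors before the task is cleared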

  • How to average data collected in a loop

    Hey everyone,
    I am using an interface card to read the voltage across a resistor to measure the current through a photodiode. The VI I made slowly increases the voltage applied across the sample: it takes a starting voltage, increases it by a specified increment, and then takes a series of measurements at that voltage (usually around 200 or 300). I had the program just save all of this data to an external measurement file, which I would then average in Excel. I had to change the program to measure three variables, and I want the VI to average the data and then save the average current at each voltage in a measurement file.
    Ex. 
    It used to export the data as...
    -1     .90
    -1     .80
    -1     .85
    I'm trying to get the program to average all of these values and then save them as one data point:
    -1     .85
    I would like the program to take the 200 or so data points, average them, and then save just the average in a file. 
    I usually have about 300 different applied voltages to measure, and with 200 current readings at each it becomes a huge amount of data.
    Right now I have the part of the VI that takes the measurements in a while loop, and once the number of loop iterations reaches the specified number of measurements it stops running. The program then increases the voltage and runs the measurement loop again. I have everything else working; I just can't figure out a way to average all the data.
    Any help would be greatly appreciated.

    Alright, I just started using LabVIEW last week, and I knew that I would have to use shift registers, but when I tried to create one the Add Shift Register option was grayed out. All I had to do was click on the right or left side of the loop instead of the bottom, which is what I had been trying before.
    Thanks for the fast response
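    For later readers, the shift-register accumulation pattern described above looks like this in text form (a minimal sketch; read_current() is a hypothetical stand-in for the actual DAQ read, not part of the original VI):

        import random

        NUM_READINGS = 200   # measurements taken at each applied voltage

        def read_current():
            """Stand-in for the DAQ read; returns a simulated reading."""
            return 0.85 + random.uniform(-0.05, 0.05)

        total = 0.0                      # plays the role of the shift register
        for _ in range(NUM_READINGS):
            total += read_current()      # accumulate inside the measurement loop
        average = total / NUM_READINGS   # one averaged point per voltage step
        print(f"averaged data point: {average:.3f}")

    The shift register carries the running total from one iteration to the next; the single division after the loop is what replaces the Excel step.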

  • BCS - Data collection issue

    Hi,
    I'm using BCS 4.0. I'm working now on final testing, and I have some questions regarding the data collection process using the Load from Data Stream method. I ran the task in the Consolidation Monitor for period 007.2007 and for all companies without any errors or warnings, but we have differences in the financial information for that period.
    I reviewed the content of my BCS cube (RSA1) and we don't have any data for those accounts; the only thing I found was that all docs were created on the same date.
    I deleted the request ID in RSA1 in my BCS cube and executed the task again in the Consolidation Monitor, but the result was the same.
    Looking at the log / source data, the rows listed are not taken from the InfoProvider.
    Any idea what the problem could be?
    Thanks
    Nayeli

    Hi Nayeli,
    I had to do this kind of job (reconciliation of data between the source basis cube and the totals cube) during final testing a lot of times, with an accountant.
    The only way to get a first clue about what is happening is to compare every particular amount in both cubes, comparing and trying to figure out any dependencies in the data.
    The difference might arise for ANY reason. Only you can analyze the data and decide what to do.
    AFAIU, you compared only reported data and did no currency translation and eliminations?
    Then I'm afraid that you've made a very big mistake in deleting the request from the totals cube: you have broken the consistency between the totals and the document data. For a transactional cube, a request stays yellow until the number of records in it reaches 50000; it is then closed and becomes green. As you may understand, those 50000 records may contain data for many different posting periods, for example reported data for the previous closed period, while the documents still contain eliminations for that period - an inconsistency.

  • Data Collection in APS

    Hi,
    We have both the APS and OPM applications on the same instance. We have run the data pull program, and the required data such as organizations, items & formulas are coming into APS. But item forecasts entered in the Forecasting form of OPM Process Planning are not being pulled; in other words, there is no demand schedule.
    We have checked every bit of the setup involved, yet we have no idea what's going wrong.
    Is this a bug, or are we doing something wrong?

    Hello Ashok,
    At the risk of repetition, let me explain the set-up for the use of forecasts in OPM-APS; from this you should be able to determine the source of the problem you're encountering while transferring the forecast data.
    1. Have a forecast in OPM for the required item in the required warehouse.
    2. Have a schedule (in MPS of OPM) which has this forecast associated with it.
    3. Ensure that the Make to Stock indicator is turned ON for this schedule.
    4. Ensure that if the Make to Order indicator is also turned on, the forecast quantities are not "consumed" by sales order quantities.
    5. Ensure that the plant for which planning is being done has the warehouse (mentioned in the forecast) as a replenishment warehouse (Plant-Warehouse Effectivities) for the forecasted items.
    6. Do a data collection and try the LOV for MDS. You should see an MDS whose name starts with the letters of the schedule name followed by the warehouse code.
    This MDS is generated by the data extract program (during data extraction), and during the generation of this MDS the forecast is adjusted for the sales order quantities for the required warehouses.
    Please feel free to get in touch with me or [email protected] for any further queries.
    Rgds
    Abhay
    PS: Just one more simple confirmation: though the forecast is NOT being seen on the APS side, are you able to get the forecasted items across to APS?

  • Locks During Object Collection & Activation in Business content in NRIV

    Hi,
    During the collection & activation of InfoObjects (Business Content) the system gets locked; it looks like a deadlock in DB01 involving table NRIV.
    We are on MSSQL 2005 SP3.
    Has anyone encountered this?
    All of the OSS notes I found talk about the number range, but offer no solutions so far.
    Please help.

    Hi
    Database locks: snapshot data about exclusive wait situations for database locks can be obtained from the Database Lock Monitor (DB01). If exclusive database lock waits occur, document the locked object, the lock holder (program/transaction) and the lock waiter (program/transaction). See slide 24 of the first document below:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b0477d70-5082-2910-e49a-e53ea6d4c893
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/0667b7c9-0e01-0010-e4a3-873e87656048
    Re: Trace all processed ABAP statements

  • Consolidation Group in Target Cube (Data Collection Task)

    Dear Experts,
    In the Consolidation Monitor, while doing the Data Collection task via Load from Data Stream, after the update, when I look at the content of the target InfoCube in RSA1, the GL account line items have no Consolidation Group value.
    For example:
    GL account   Company  CCode  Cons Group  Currency  PV LC    PV GC
    100000           3000         3000     (Blank)          USD         1000      44000
    1) Is it necessary to have a value in Consolidation Group?
    2) If yes, what is it used for, and how do I get a value into this column?
    Regards
    Ritesh M.

    No, ConsGroups are determined later, during the ConsGroup-dependent tasks.

  • Memory Leak Detector data collection timing

    Hello,
    I am Yoshizo Aori, working at HP Japan.
    I would like to know the timing of the data collection used to update the object type byte size increase rate. Does the data collection happen at garbage collection, or at some other time? Is it possible to change the timing of the data collection?

    Yoshizo,
    The actual "Growth(bytes/sec)" column is updated along with the other columns at every normal GC and currently, independent of any GC, also every ten seconds. This interval is not yet configurable. (While the trend analysis is running, it is possible to manually press "Refresh" to get shorter time between updates.)
    However, the Growth column is primarily calculated using historic data collected during normal GCs while the trend analysis is active. Only if the historic data shows a difference in heap usage is the current value shown in the "Size (KB)" column taken into account. The effect is that the "Refresh" button only updates Growth column rows that have had non-zero values.
    This also means that if an application leaks slowly and doesn't generate enough garbage to trigger a GC in the near future, you may not notice it by looking at the Growth column. If so, it is possible to trigger a GC, and thus a possible collection of historic data, by selecting "Garbage Collect" from the Action menu.
    Remember though, since the Growth column represents the growth rate over the entire time that the trend analysis has been running, you may want to avoid very long running analyses. In fact, after a while, historic data is not collected every GC but more and more seldom.
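    As a rough illustration of the bookkeeping described above (the numbers here are made up, not from the tool), the Growth column is essentially the slope of heap usage across the whole analysis window:

        # (seconds since analysis start, heap bytes in use), sampled at each GC
        samples = [(0.0, 10_240_000), (12.5, 10_480_000), (31.0, 10_770_000)]
        (t0, b0), (t1, b1) = samples[0], samples[-1]
        growth = (b1 - b0) / (t1 - t0)   # bytes/sec over the entire analysis
        print(f"Growth: {growth:.0f} bytes/sec")

    This is also why a very long-running analysis dilutes a recent change in the leak rate.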

  • How can we upload questionnaires to CRM 2007 during a data migration activity?

    Hi all,
    I have a requirement where I have to upload questionnaires from an existing CRM system to SAP CRM 2007 during a data migration activity.
    I have just gone through what questionnaires are, how to create them, and how to assign them to transactions.
    How can I upload them during the data migration activity?
    Thanks & Regards
    Raman Khurana

    Hi,
    According to your post, my understanding is that the 'Upload.aspx' page was missing or corrupted in many document libraries after the migration from 2007 to 2010.
    To my knowledge, the upgrade process sets the SPWeb.CustomUploadPage property for upgraded sites to a custom upload page titled uploadex.aspx, as opposed to upload.aspx, which is the default page for file uploads. This custom upload page contains the "Destination Folder" field.
    To remove the custom upload page (uploadex.aspx) from SPWeb and revert to the default upload page (upload.aspx), we can use a PowerShell script to loop through all Web sites contained within the site collection, including the top-level site and its subsites, and check whether the SPWeb.CustomUploadPage property is not blank; if it is not, set it to blank.
    Here is a great blog for your reference:
    Destination Folder field in site upgraded from MOSS 2007 to SharePoint Server 2010
    More information:
    What’s This Destination Folder Field On My File Upload Page?
    Best Regards,
    Linda Li
    TechNet Community Support

  • Can not start data Collection service

    I have installed Fabric Manager version 3.3(4). In the Performance Web Client I cannot start the Data Collection service; it has been stuck in 'starting' forever.
    I also could not stop it. I rebooted the server, and the status of this service is still 'starting'.
    Thanks,

    Please provide more details like: Server OS, Java version, and what type of database you are using for the FM Server.  Have you considered using a newer version of FM Server?

  • Customer Name on Sales Order not in Data Collections

    Hi,
    I am looking for some help. My company just did an upgrade from 11i to R12 in the middle of May. As part of the upgrade we installed ASCP. We have noticed that certain customers are not being collected when running standard Data Collections. We have open sales orders in the system for these customers, and the sales orders are being collected properly and appear in the plan. However, when we show the Customer field in the Supply/Demand screen in ASCP, the field is blank for certain customers. Is there anything on the customer master or the sales order line that would prevent the customer name from being collected? Any insight would be helpful.
    Thanks
    Rich

    You can go to the Order Organizer screen, select the OA No which you are not able to see in the ASCP workbench, go to the line items, and use Show Field to bring the 'Demand Visible' field onto the screen. The default value should be 'Y'; if it is blank, that could be one of the reasons.

  • Data Collection -- Planning Data Pull process failed

    Hello Experts,
    I am having a problem with data collection: the Planning Data Pull completed with an error. The problem seems to be in the Planning Data Pull Worker.
    I am using 2 workers to run the Planning Data Pull, and both ended in errors. Below is part of the error log:
    16-MAY 19:11:34 : Procedure MSC_CL_SETUP_PULL.LOAD_CALENDAR_DATE started.
    Into populate_rsrc_cal
    16-MAY-2011 19:11:40
    APS string is Invalid, check for Error condition
    16-MAY 19:11:40 : Error in Routine GMP_CALENDAR_PKG.POPULATE_RSRC_CAL.
    16-MAY 19:11:40 : User-Defined Exception
    16-MAY 19:11:40 : User-Defined Exception
    16-MAY 19:11:40 : Error_Stack...
    16-MAY 19:11:40 : ORA-06510: PL/SQL: unhandled user-defined exception
    16-MAY 19:11:40 : Error_Backtrace...
    16-MAY 19:11:40 : ORA-06512: at "APPS.MSC_CL_PULL", line 6218
    ORA-06512: at "APPS.MSC_CL_PULL", line 1583
    Help me please.
    Thanks & Regards,
    Andi

    This error shows up when it cannot find any organizations.
    Use the SQL statements below to identify this:
    1. select MSC_CL_PULL.get_org_str(&instance_id) from dual;
    2. select instance_code, instance_id from msc.msc_apps_instances where instance_code = '&instance_code';
    3. select * from apps.msc_instance_orgs where sr_instance_id = &instance_id;
    SQL 1 should in principle return the orgs available; if it returns null or -9999, then it's an orgs problem. SQL 3 will show the available orgs.
    If the organizations are enabled for planning in the ASCP instances form, then run a targeted refresh for trading partners (suppliers/customers/orgs). If this works with no errors, then look at the table msc_trading_partners; it should contain records for the orgs with partner_type = 3.

  • How to perform Data Collection on single SFC with QTY = 1 with material lot size 1?

    Dear experts,
    We are working with SFC qty > 1 on a relaxed routing. At a given operation we want to collect data on a single quantity; i.e. the SFC qty at the operation where the collection will happen will be 1. The corresponding material lot size is, for example, 10. The operator must be able to collect data on the SFC with qty = 1 multiple times, until the quantities are consumed. He must also be able to collect other values on the remaining quantities at the same operation, with the same DC group or other DC groups. How many times the data must be collected depends on the shop order build quantity: the data may be collected several times, but not more than the build qty. In other words, some specific data will be collected on part of the quantity of a product, while other data will be collected against the remaining quantity. The data collection must also be done in a serialized manner.
    Here's what we have set up so far:
    1) 3 DC groups, each DC group has 3 data fields.
    2) Each data field has the following restrictions:  Required Data Entries = 0 and Optional Data Entries = 1
    3) All DC groups are attached on the same combination of operation\material\routing
    4) we are using relaxed routing
    Process description:
    The operator must be able to collect any data field on a single product. For that he will go to the operation where the data collection is attached, enter the SFC with qty = 1, and then run the data collection after selecting the appropriate DC group and entering the needed information. The operator will then complete the SFC with qty = 1.
    The operator will pick the next product, select the same SFC, enter qty = 1, and collect another value against this product.
    Problem is:
    Once the first collection is done on a given SFC with entered qty = 1, the system does not allow the operator to do further collections on the same SFC, whether with qty = 1 or any other quantity. He cannot select any DC group from the DC group list. We also tried the table selection menu on the DC group list, but nothing can be selected.
    So we tried to play around with the DC group definitions as follows:
    A) We set Required Data Entries = 0 and Optional Data Entries = 10. Still, the operator was not able to select any DC group after collecting data the first time. We tried to reopen the POD and the list again, but we get the same blocking behavior.
    B) We set Required Data Entries = 10 and Optional Data Entries = 1. The operator was able to select the DC group after collecting data the first time, BUT the operator must then enter the data fields 10 times on one SFC quantity, which is not what we want. Besides, again he cannot collect other information on the remaining quantities at the same operation.
    C) There is an option to serialize the SFC before it reaches the operation where the collection happens, then merge after completion. Automation is needed here, hence customization. We are strongly avoiding customization now, since we expect the data collection to work well on single quantities even when the main SFC has qty > 1.
    Questions:
    1) Are we missing any further configuration or setup?
    2) Or does the current system design not allow collecting data on single quantities of an SFC whose main quantity is greater than 1?
    3) Looking at this link, Approaches to Collection of Data - SAP Manufacturing Execution (SAP ME) - SAP Library, there is nothing mentioned about the same SFC number with multiple quantities!
    We are using SAP ME 15.0.3.0.
    Thanks in advance for your help
    Ali

    Ali
    to collect data for the same SFC multiple times, your system rule "Allow Multiple Data Collection" needs to be set to true for the site.
    Stuart

  • How to debug a transfer rule during data load?

    I am conducting a flat file (an Excel sheet saved as a CSV file) data load. The flat file contains a date field whose value is '12/18/1988'. In the transfer rule for this field, I use a function call to transform this value into '19881218', which corresponds to the BW DATS format, but the monitor of the InfoPackage shows a red error:
    "Value '1981218' of characteristic 0DATE is not a number with 000008 spaces".
    Somehow, the last digit of the year 1988 was cut off, and the year grabbed is 198 rather than 1988. The function code is shown below:
    FUNCTION zdm_convert_date.
    *"  Local Interface:
    *"  IMPORTING
    *"     REFERENCE(CHARDATE) TYPE STRING
    *"  EXPORTING
    *"     REFERENCE(DATE) TYPE D

      DATA:
        c_date(2)          TYPE c,
        c_month(2)         TYPE c,
        c_year(4)          TYPE c,
        c_date_combined(8) TYPE c.
      DATA: text(10).

      text = chardate.

    * Input arrives as MM/DD/YYYY; pad a single-digit month to two digits.
      SEARCH text FOR '/'.
      IF sy-fdpos = 1.
        CONCATENATE '0' text INTO text.
      ENDIF.

    * Fixed offsets assume the padded 10-character form MM/DD/YYYY.
      c_month = text(2).
      c_date  = text+3(2).
      c_year  = text+6(4).

    * Reassemble as YYYYMMDD to match the DATS format.
      CONCATENATE c_year c_month c_date INTO c_date_combined.
      date = c_date_combined.

    ENDFUNCTION.
    Could the experts here tell me what's wrong, and also how to debug a transfer rule during a data load?
    Thanks

    Hey Bhanu/AHP,
    I found the reason. Originally I set the character length of the date InfoObject ZCHARDAT1 to 9; then I noticed that the date field value (12/18/1988) is 10 characters long. So I modified the InfoObject ZCHARDAT1 length from 9 to 10 and activated it. But when defining the transfer rule for this field, before the code screen, I clicked the radio button "Selected Fields", picked the field /BIC/ZCHARDAT1, and continued to the transfer rule code screen, where the declaration lines for the InfoObject /BIC/ZCHARDAT1 still read:
      InfoObject ZCHARDAT1: CHAR - 000009
        /BIC/ZCHARDAT1(000009) TYPE C,
    That means that even though I've modified the length of the InfoObject to 10 and activated it, the transfer rule code screen somehow still uses the old length 9. Any idea how to get it to pick up the length 10 in the transfer rule code screen definition?
    Thanks
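    The truncation is easy to reproduce outside of BW. Here is a small Python sketch of the same slicing the ABAP function performs (an illustration only, not part of the original transfer rule):

        raw = "12/18/1988"
        text = raw[:9]                 # a length-9 InfoObject silently keeps "12/18/198"
        month, day, year = text[0:2], text[3:5], text[6:10]
        print(year + month + day)      # "1981218" -- the exact value in the monitor error

    Once the field really arrives with all 10 characters, the year slice yields "1988" and the assembled DATS value is correct.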
