Fluke 2625A Data Collection: need a how-to

I have a Fluke 2625A data logger and I downloaded the driver for LabVIEW 8. The driver does not come with any instructions. Does anyone know of instructions or examples I could look at to figure out how to use it? I have set up a thermocouple and I just want to read data from the Fluke.
Equipment: Fluke Hydra II 2625A
Software: LabVIEW 8
Thermocouple: T type
I downloaded the driver from this URL:
https://sine.ni.com/apps/utf8/niid_web_display.download_page?p_id_guid=E3B19B3E92A8659CE034080020E74...
Thanks in advance,
Matt O

I don't think I have used this instrument in the last 10 years or so, but I did look at the current driver and I see that it comes with a VI called Fluke 2625A Example. Have you tried running that? Instructions are not included with drivers; you are expected to have some familiarity with the instrument from reading its manual. Have you turned on context help? That should show what each control does and what each function is used for. This is a very old driver and was converted from one that used the old serial port functions. In that old driver, port 0 corresponds to COM1, so that is something to watch out for. The error handling in it is pretty poor as well, so you should turn on automatic error dialogs.
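Since the driver is a set of LabVIEW VIs there is no text code to quote, but if it helps to see what it is doing underneath, here is a minimal sketch of the serial conversation in Java, assuming the jSerialComm library. The port name, serial settings and the query string are placeholders only; check the 2625A manual for the instrument's actual command set, and match the settings configured on its front panel.

import com.fazecast.jSerialComm.SerialPort;

public class Fluke2625aSketch {
    public static void main(String[] args) {
        // Placeholder settings: use the COM port and serial parameters that
        // the 2625A is actually configured for (baud rate, parity, etc.).
        SerialPort port = SerialPort.getCommPort("COM1");
        port.setComPortParameters(9600, 8, SerialPort.ONE_STOP_BIT, SerialPort.NO_PARITY);
        port.setComPortTimeouts(SerialPort.TIMEOUT_READ_SEMI_BLOCKING, 2000, 2000);
        if (!port.openPort()) {
            System.err.println("Could not open the serial port");
            return;
        }
        try {
            // Placeholder query string; the real command set is in the manual.
            byte[] cmd = "*IDN?\r\n".getBytes();
            port.writeBytes(cmd, cmd.length);

            byte[] buf = new byte[256];
            int n = port.readBytes(buf, buf.length);
            if (n > 0) {
                System.out.println("Response: " + new String(buf, 0, n).trim());
            }
        } finally {
            port.closePort();
        }
    }
}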

Similar Messages

  • How to perform Data Collection on single SFC with QTY = 1 with material lot size 1?

    Dear experts,
    We are working with SFC qty > 1 on a relaxed routing. At a given operation we want to collect data on a single quantity; i.e. the SFC qty at that operation, where the collection will happen, will be 1. The corresponding material lot size is, for example, 10. The operator must be able to collect data on the SFC with qty = 1 multiple times until the quantities are consumed. He must also be able to collect other values on the remaining quantities at the same operation with the same DC group or other DC groups. How many times the data must be collected depends on the shop order build quantity. The data may be collected several times, but not more than the build qty. In other words, some specific data will be collected on a qty of the product while other data will be collected against the remaining quantity. The data collection must also be done in a serialized manner.
    Here's what we have set up so far:
    1) 3 DC groups, each DC group has 3 data fields.
    2) Each data field has the following restrictions:  Required Data Entries = 0 and Optional Data Entries = 1
    3) All DC groups are attached to the same combination of operation/material/routing
    4) We are using relaxed routing
    Process description:
    The operator must be able to collect any data field on a single product. To do that he will enter the operation where the data collections are attached, enter the SFC with qty = 1, then run the data collection after selecting the appropriate DC group and entering the needed information. The operator will then complete the SFC with qty = 1.
    The operator will pick the next product, select the same SFC, enter qty 1 and collect other values against this product.
    Problem is:
    Once the first collection is done on a given SFC with entered qty = 1, the system does not allow the operator to do further collections on the same SFC with qty = 1 or any other quantity. He cannot select any DC group from the DC group list. We also tried the table selection menu on the DC group list, but nothing can be selected.
    So we tried to play around with the DC group definitions as follows:
    A) We set Required Data Entries = 0 and Optional Data Entries = 10. Still the operator was not able to select any DC group after collecting data the first time. We tried to reopen the POD and the list again, but we get the same blocking behavior.
    B) We set Required Data Entries = 10 and Optional Data Entries = 1. The operator was able to select the DC group after collecting data the first time, BUT the operator must enter the data fields 10 times on one SFC quantity, which is not what we want. Besides, again he cannot collect other information on the remaining quantities at the same operation.
    C) There is an option to serialize the SFC before it reaches the operation where the collection happens, then merge after completion. Automation is needed here, hence customization. We are strongly trying to avoid customization, since we expect data collection to work on single quantities even when the main SFC has qty > 1.
    Questions:
    1) Are we missing any further configuration/setup?
    2) Or does the current system design not allow collecting data on single quantities of an SFC whose main quantity is greater than 1?
    3) Looking at this link, Approaches to Collection of Data - SAP Manufacturing Execution (SAP ME) - SAP Library, there is nothing mentioned about the same SFC number with multiple quantities.
    We are using SAP ME 15.0.3.0.
    Thanks in advance for your help
    Ali

    Ali
    to collect data for the same SFC multiple times, your system rule "Allow Multiple Data Collection" needs to be set to true for the site.
    Stuart

  • Data Collection - values remained filled in from previous run through route

    Hello,
    We have an SFC that is running through a route for the second time and requires specific Data Collection to be captured, but when Data Collection is attempted the values are all pre-filled (and grayed out) from the previous run through the route.
    The Data Collection needs to be entered each time the SFC runs through this route, especially if it is an RMA, no matter how many times.
    Am I missing a setting somewhere?
    Thanks for your help,
    Mike

    Hello Sergiy,
    The product goes through a normal route and is completed, SFC status = DONE.
    The Product comes back and goes through an RMA route. First Operation data needs to be collected so it is entered by the Operator. The Product continues on and is completed, SFC status = DONE.
    The Product comes back a second time and goes through the same RMA route. First Operation data needs to be collected, but it is already filled in with the data from the first time it went through the RMA route.
    I was just wondering whether I missed a configuration step or if this is a bug.
    As always thanks,
    Mike

  • Data Collection in BPC

    Hi,
    How does Data Collection (transaction UCMON) work in BPC? Do we have any standard Data Manager package to do the data collection? How can we achieve this?
    Would appreciate your time and response.
    Thanks.

    Hi,
    First, I am not sure what the transaction code you mentioned does. There is no such code in BPC as far as I am aware.
    Regarding data collection into a BPC application (in other words, a cube):
    You could take a flat-file approach: upload the flat file into the BPC file service using Upload Data File, and then import the data into the BPC application using the standard Import Data Manager package. You need to define and use the BPC transformations and conversions (if any) in the import process. These are nothing but files.
    Alternatively, you could load the data directly from a BW InfoCube or even from a MultiProvider. There is a separate Data Manager package to do this.
    You could also collect data into BPC through input schedules.
    This has the advantage of triggering the default logic while the data is being collected, which applies your business logic to the data before it is saved to the database.
    Hope the above gives you some insight.
    Thanks

  • I need to set up 9 iPhones and 4 iPads to be used as data collection devices. How can I do that without setting up 13 separate iTunes accounts?

    I need to set up 9 iPhones and 4 iPads to be used as data collection devices. How can I do that without setting up 13 separate iTunes accounts?

    You can use the same account to set up as many devices as you like.
    You might also find there are tools to help with a rollout.
    http://www.apple.com/support/iphone/enterprise/
    tt2

  • How to Check UPL data collection at Solution Manager

    Hello,
    I have activated UPL data collection as per the document "How-To Guide: SAP Custom Code Management Usage and Procedure Logging".
    I can check UPL data in my managed system with report /SDF/SHOW_UPL,
    but I am not able to get the same data in Solution Manager with the same report (/SDF/SHOW_UPL).
    When I run this report in Solution Manager, it shows me only Solution Manager data.
    How can I read managed-system data in Solution Manager? And what checkpoints (for both Solution Manager and the managed system) do I need to verify to ensure my UPL data collection is working fine?

    Set up CCLM and the custom code setup, and you will be able to view UPL data that way.

  • How do I synchronize data collection on two DT3001 boards?

    I am very new to LabVIEW, and I am trying to find an easy way to synchronize data collection on two DT3001 boards. I have tried modifying the AI Continuous Scan example; this works because I am using all 16 channels on each of my boards. Unfortunately, I cannot find a way to make both boards start collecting data at the same time. Is there an easy way to synchronize both boards so they collect data at the same time? Thank you for your help!

    Hello Kacie,
    I'm not sure how your DT3001 device and its driver's LabVIEW functions work, but for synchronizing acquisition on two National Instruments DAQ cards, you need to share a start trigger and sample clock between the two cards by connecting their Real-Time System Integration (RTSI) buses with an RTSI cable. This provides dedicated lines for sharing the same sample clock and start trigger, and gives highly accurate hardware-timed synchronization. Any synchronized start you program in software in LabVIEW will only be software-timed. You can use a sequence structure within LabVIEW to group different portions of code so that all the configuration is performed for both cards first, and then they are started at the same time. I know this probably isn't too much help, but I would need to know more about how your DT3001 device and the AI Continuous Scan example work. You might be able to find better help on this at Data Translation's website.
    Travis G.
    Applications Engineering
    National Instruments
    www.ni.com/support

  • How to enforce data collection

    Hi everybody,
    how can I enforce data collection at a specific operation? I am using the integrated POD, showing the data to be collected in one part of the window. The data collection itself works fine, but it is merely optional: I can complete the operation without any warning.
    So where is the hook or rule that I need to set up in order to check whether data collection has been done before the operation is completed?
    Georg

    Hello,
    Please check the link below:
    http://help.sap.com/saphelp_me52/helpdata/EN/d7/f7f0be3fec4a31bec083a035eb2423/content.htm
    It explains activity CT500.
    This activity deals with checking components, making them mandatory for assembly, etc.
    You can configure this for the CT500 and CT500_RICH activities in Activity Maintenance;
    in the POD you have to use these activities.
    Note: The system executes all code associated with a hook point in the same database transaction. For hook points within POD pushbutton activities, the transaction includes a single pushbutton activity, such as Start (PR500). If the hook point activity fails, the system rolls back, or cancels, the entire transaction. For example, in the figure in Site Level Hook Points, if you associate Check Configuration (CT520) with the POST_START hook point and the components have not been assembled, the system rolls back the Start as well. This is true for all hook points.
    br,
    Pushkar

  • I need to know how to retrieve data from SAP R/3 using WebLogic Server

    Hi everybody,
    I need to know how to retrieve R/3 data using BAPI methods
    from a WebLogic server.
    This is very critical.

    Hi,
    check these links:
    http://www.bea.com/content/news_events/white_papers/BEA_WLP_SAP_Portlets_81.pdf
    http://www.info-sun.com/docs/wp_sapinter.pdf
    regards.
    sowjanya.b
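    The usual Java approach (which works the same when the code is deployed on WebLogic, e.g. in a servlet or EJB) is the SAP Java Connector (JCo): configure a destination that points at the R/3 system, look up the BAPI in the repository, execute it, and read the result tables. Below is a minimal sketch; the destination name, the BAPI, and the table and field names are placeholders for illustration only.

    import com.sap.conn.jco.JCoDestination;
    import com.sap.conn.jco.JCoDestinationManager;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunction;
    import com.sap.conn.jco.JCoTable;

    public class BapiReadExample {
        public static void main(String[] args) throws JCoException {
            // "R3_DEST" is a placeholder; it must match a JCo destination you
            // have configured (host, system number, client, user, password).
            JCoDestination dest = JCoDestinationManager.getDestination("R3_DEST");

            // BAPI_CUSTOMER_GETLIST is only an example; use whichever BAPI
            // returns the data you need. Most list BAPIs also expect selection
            // parameters to be filled before execute; check the BAPI's docs.
            JCoFunction fn = dest.getRepository().getFunction("BAPI_CUSTOMER_GETLIST");
            if (fn == null) {
                throw new IllegalStateException("BAPI not found in the R/3 repository");
            }
            fn.execute(dest);

            // Table and field names depend on the BAPI; these are placeholders.
            JCoTable rows = fn.getTableParameterList().getTable("ADDRESSDATA");
            for (int i = 0; i < rows.getNumRows(); i++) {
                rows.setRow(i);
                System.out.println(rows.getString("CUSTOMER") + "  " + rows.getString("NAME"));
            }
        }
    }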

  • How to know which master data objects need to be activated in R/3

    SALES OVERVIEW CUBE -0SD_C03
    How do I know which master data objects need to be activated from the delivery version to the active version in R/3 for a particular standard cube like 0SD_C03?
    It's very urgent, please advise.
    R/3 in RSA5
    Sales Master Data
    0ACCNT_ASGN_TEXT               Account assignment group for this customer   
    0ACCNT_GRP_TEXT                Customer account group                       
    0BILBLK_DL_TEXT                Locked                                       
    0BILBLK_ITM_TEXT               Billing block for item                       
    0BILL_BLOCK_TEXT               Billing block in SD document                 
    0BILL_CAT_TEXT                 Billing Category                             
    0BILL_RELEV_TEXT               Relevant for Billing                         
    0BILL_RULE_TEXT                Billing rule                                 
    0BILL_TYPE_TEXT                Billing Type                                 
    0CONSUMER_ATTR                 Consumer                                     
    0CONSUMER_LKLS_HIER            Consumer                                     
    0CONSUMER_TEXT                 Consumer                                     
    0CUST_CLASS_TEXT               Customer Classification                      
    0CUST_GROUP_TEXT               Customer Group                               
    0CUST_GRP1_TEXT                Customer Group 1                             
    0CUST_GRP2_TEXT                Customer Group 2                             
    0CUST_GRP3_TEXT                Customer Group 3                             
    0CUST_GRP4_TEXT                Customer Group 4                             
    0CUST_GRP5_TEXT                Customer Group 5                             
    0DEALTYPE_TEXT                 Sales Deal Type                              
    0DEL_BLOCK_TEXT                Delivery block (document header)             
    0DEL_TYPE_TEXT                 Delivery Type                                
    0DISTR_CHAN_TEXT               Distribution Channel                         
    0DIVISION_TEXT                 Division                                     
    0DLV_BLOCK_TEXT                Schedule line blocked for delivery           
    0DOC_CATEG_TEXT                SD Document Category                         
    0DOC_TYPE_TEXT                 Sales Document Type                          
    0INCOTERMS_TEXT                Incoterms (Part 1)                           
    0INDUSTRY_TEXT                 Industry keys                                
    0IND_CODE_3_TEXT               Industry code 3                              
    0IND_CODE_4_TEXT               Industry code 4                              
    0IND_CODE_5_TEXT               Industry code 5                              
    0IND_CODE_TEXT                 Industry code                                
    0ITEM_CATEG_TEXT               Sales document item category                 
    0ITM_TYPE_TEXT                 FS item type                                 
    0KHERK_TEXT                    Condition Origin                             
    0MATL_GRP_1_TEXT               Material Group1                                         
    0MATL_GRP_2_TEXT               Material Group 2                                         
    0MATL_GRP_3_TEXT               Material Group 3                                         
    0MATL_GRP_4_TEXT               Material Group 4                                         
    0MATL_GRP_5_TEXT               Material Group 5                                         
    0MATL_TYPE_TEXT                Material Type                                            
    0MAT_STGRP_TEXT                Material statistics group                                
    0NIELSEN_ID_TEXT               Nielsen ID                                               
    0ORD_REASON_TEXT               Order reason (reason for the business transaction)       
    0PICK_INDC_TEXT                Indicator for picking control                            
    0PRODCAT_TEXT                  Product Catalog Number                                   
    0PROD_HIER_TEXT                Product Hierarchy                                        
    0PROMOTION_ATTR                Promotion                                                
    0PROMOTION_TEXT                Promotion                                                
    0PROMOTYPE_TEXT                Promotion Type                                           
    0PROV_GROUP_TEXT               Commission Group                                         
    0REASON_REJ_TEXT               Reason for rejection of quotations and sales orders      
    0REBATE_GRP_TEXT               Volume rebate group                                      
    0RECIPCNTRY_TEXT               Destination country                                      
    0ROUTE_TEXT                          Route                                                    
    0SALESDEAL_ATTR                Sales deal                                               
    0SALESDEAL_TEXT                Sales deal                                               
    0SALESORG_ATTR                 Sales organization                                       
    0SALESORG_TEXT                 Sales Organization                                       
    0SALES_DIST_TEXT               Sales district                                           
    0SALES_GRP_TEXT                Sales Group                                              
    0SALES_OFF_TEXT                Sales Office                                             
    0SCHD_CATEG_TEXT               Schedule line category                                   
    0SHIP_POINT_TEXT               Shipping point/receiving point  
    In BW
    Base Unit of Measure     0BASE_UOM
    Bill-to party                    0BILLTOPRTY
    Calendar Day                  0CALDAY
    Calendar Year/Month      0CALMONTH
    Calendar Year/Week      0CALWEEK
    Change Run ID              0CHNGID
    Company code              0COMP_CODE  
    Cost in statistics currency          0COST_VAL_S
    Credit/debit posting (C/D)            0DEB_CRED
    Distribution Channel       0DISTR_CHAN
    Division                                     0DIVISION 
    Number of documents    0DOCUMENTS
    Sales Document Category          0DOC_CATEG
    Document category /Quotation/Order/Delivery/Invoice 0DOC_CLASS
    Number of Document Items         0DOC_ITEMS
    Fiscal year / period
    Fiscal year variant                      0FISCVARNT
    Gross weight in kilograms           0GR_WT_KG
    Number of Employees    0HDCNT_LAST
    Material                                     0MATERIAL
    Net value in statistics currency    0NET_VAL_S
    Net weight in kilograms 0NT_WT_KG
    Open orders quantity in base unit of measure 0OPORDQTYBM
    Net value of open orders in statistics currency 0OPORDVALSC
    Payer                            0PAYER
    Plant                             0PLANT
    Quantity in base units of measure 0QUANT_B
    Record type                   0RECORDTP
    Request ID                    0REQUID
    Sales Employee            0SALESEMPLY
    Sales Organization         0SALESORG
    Sales group                   0SALES_GRP
    Sales Office                   0SALES_OFF
    Shipping point                0SHIP_POINT
    Ship-To Party                0SHIP_TO
    Sold-to party                  0SOLD_TO
    Statistics Currency                    0STAT_CURR
    In R/3 RSA5 we have all the master data DataSources mentioned above, and in BW as well. How do I find the related master data InfoSources for the R/3 master data DataSources?
    Thanks in advance,
       Bhima.

    Hi,
    "How to know which master data objects need to be activated from delivery version to active version in R/3 for a particular standard cube like 0SD_C03."
    I think you are looking for the master data DataSources (texts, attributes, hierarchies). Am I right?
    If so, this cube has almost all SD master data characteristics, so you can activate all the SD master data DataSources in R/3 (SD-IO).
    In any case, your requirement does not stop with this cube; you will activate other SD cubes as well, so trying to activate only the master data DataSources needed for one cube becomes pointless. There is nothing wrong with activating all the master data available under that application, even though you want to activate only one cube.
    With rgds,
    Anil Kumar Sharma .P

  • Is there anyone who knows how to apply machine learning algorithms to spectrum data collected using NI USRP for the ISM band, in order to use the spectrum efficiently through link adaptation?

    Hi,
    may I know if anyone is working in cognitive radio research and applying machine learning techniques to the spectrum data collected from NI USRP kits?

    Can't edit my message anymore, so for the tldr crowd (too long didn't read), here is a shorter version:
    The good:
    E4200 worked fine for about 15 hours with great throughput and link quality (with one brief disconnect in the middle - online gaming software shows brief disconnects that otherwise go totally unnoticed).
    The bad:
    Wireless on the E4200 stopped broadcasting entirely (to a Windows 7 laptop with a 802.11n 2.4 GHZ USB adapter).
    I had switched between two adapters (which use the same chipset and the same driver, and in fact show up as a single device in Device Manager) without rebooting the router, but it worked fine for an hour after I swapped the adapters.
    Right before I rebooted it, I checked and the E4200 was not hot to the touch and wired internet was still working after the wireless radio stopped working.
    A reboot of the router cured it.
    The open question as to the root cause:
    Now I want to know whether this wireless radio ceasing to broadcast (which requires a reboot of the E4200) will happen daily.
    I.e. I want to know whether it really is overheating (then again, why would only some have that problem and need to reboot daily, while others have gone weeks with the E4200 without a single reboot being needed and without any problems occurring), or whether it had to do with changing the adapters back and forth without rebooting the router. Keep in mind that the router worked fine for an hour or so after I stopped swapping the adapters.
    Extra question:
    Will getting a Cisco AE1000 USB adapter perhaps help? (I know this is a loaded question and there is no easy or sure answer, but even a "maybe" or reasons why it might help would be better than nothing at this point.)
    Any similar experiences (with the router needing to be rebooted to get the wireless radio back on) would be appreciated, as they may help me and others experiencing these types of issues.

  • How to load data from MS Excel (CSV format) into NW Excel

    Hi,
    I am doing a migration project from SAP MS to NW manually.
    I need to know how to load data from MS Excel (CSV format) into the NW version,
    for example the 2008 budget data.
    Could you please help me with this?
    Thanks and Regards
    Krishna

    Hi,
    You need to create a transformation file, and a conversion file if required. First upload the Excel (CSV) file into BPC using Manage Data and the Upload File option.
    Then create the transformation file (refer to the SAP Help on how to define a transformation file). You need to specify the mapping correctly and include all your application dimensions, mapping them to the appropriate columns of the flat file.
    Before running the import package, validate the data in the flat file you uploaded into BPC against the transformation file you created.
    Thanks,
    Sreeni
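    For illustration, a transformation file for a comma-delimited upload usually looks something like the sketch below. The dimension names under *MAPPING are hypothetical and have to be replaced with the dimensions of your own application, and the *CONVERSION entry is only needed if the member IDs in the file differ from those in BPC.

    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    VALIDATERECORDS = YES

    *MAPPING
    CATEGORY = *NEWCOL(BUDGET)
    TIME = *COL(1)
    ENTITY = *COL(2)
    ACCOUNT = *COL(3)
    SIGNEDDATA = *COL(4)

    *CONVERSION
    ACCOUNT = ACCOUNT_CONV.XLS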

  • [JAVA] How to insert collected data into a table

    Hello!
    I'm writing a program that monitors the signal of a sensor in a chart. Now I want to insert the data collected by the sensor into a table using a JTable. The problem is that when I run the program, the data does not appear in the table.
    Where did I go wrong?
    Here's part of the source code.
    import java.util.ArrayList;
    import javax.swing.table.AbstractTableModel;

    // Fields of the enclosing GUI class
    ArrayList<Data> datiArray = new ArrayList<Data>();
    DataTableModel dtm = new DataTableModel(datiArray);

    public class DataTableModel extends AbstractTableModel {
        private ArrayList<Data> datiArray;              // the table model holds the collection
        private String colName[] = {"Time", "G-value"};

        public DataTableModel(ArrayList<Data> datiArray) {
            this.datiArray = datiArray;
        }

        public void addData(Data d) {
            datiArray.add(d);
            int row = datiArray.size() - 1;
            fireTableRowsInserted(row, row);            // tells the JTable a new row exists
        }

        public String getColumnName(int col) {
            return colName[col];
        }

        public int getRowCount() {
            return datiArray.size();
        }

        public int getColumnCount() {
            return 2;
        }

        public boolean isCellEditable(int row, int col) {
            return false;
        }

        public Class getColumnClass(int c) {
            return (c == 0) ? Long.class : Double.class;
        }

        public Object getValueAt(int row, int column) {
            Data d = datiArray.get(row);
            switch (column) {
                case 0: return d.getTime();             // was dati.getTime(): wrong variable name
                case 1: return d.getGvalue();
            }
            return null;
        }
    }

    private class Data {
        public long time;
        public double gvalue;

        public Data(long time, double gvalue) {
            this.time = time;                           // was this.tempo = tempo: no such field
            this.gvalue = gvalue;
        }

        public long getTime() { return time; }
        public double getGvalue() { return gvalue; }
    }

    // RecordButton action
    private void recordButtonActionPerformed(java.awt.event.ActionEvent evt) {
        int i = 0;
        int j = graphView.getSampleTime();
        int k = graphView.getIndexMax();
        System.out.println(j);
        System.out.println(k);
        while (i < k) {
            Data dr = new Data(graphView.getTime(i), graphView.getGvalue(i));
            dtm.addData(dr);                            // adds the row and fires the table event
            // debug output to check that the values actually reached Data and DataTableModel
            System.out.print(graphView.getTime(i));
            System.out.println("\t" + graphView.getGvalue(i));              // was "/t": wrong escape
            System.out.println(dr.getTime());
            System.out.println(dtm.getValueAt(datiArray.size() - 1, 1));    // was getValueAt(i, 1): i is a sample index, not a row index
            i = i + j;
        }
        readyRecord = false;
    }
    Sorry for my bad English.

    Please don't cross-post the same question in multiple forums. Especially since you've already been given directions as to how to ask the question smartly.
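    For reference, here is a self-contained sketch of the pattern the poster is after: an AbstractTableModel that fires fireTableRowsInserted when a row is added, a JTable built on that same model instance, and all updates done on the Swing event dispatch thread. The class names and the fake one-sample-per-second source are invented for the illustration.

    import java.util.ArrayList;
    import java.util.List;
    import javax.swing.JFrame;
    import javax.swing.JScrollPane;
    import javax.swing.JTable;
    import javax.swing.SwingUtilities;
    import javax.swing.Timer;
    import javax.swing.table.AbstractTableModel;

    public class SensorTableDemo {

        static class Sample {
            final long time;
            final double gvalue;
            Sample(long time, double gvalue) { this.time = time; this.gvalue = gvalue; }
        }

        static class SampleTableModel extends AbstractTableModel {
            private final List<Sample> rows = new ArrayList<>();
            private final String[] names = {"Time", "G-value"};

            void add(Sample s) {
                rows.add(s);
                int r = rows.size() - 1;
                fireTableRowsInserted(r, r);   // this is what makes the JTable repaint
            }
            @Override public int getRowCount() { return rows.size(); }
            @Override public int getColumnCount() { return 2; }
            @Override public String getColumnName(int c) { return names[c]; }
            @Override public Class<?> getColumnClass(int c) { return c == 0 ? Long.class : Double.class; }
            @Override public Object getValueAt(int r, int c) {
                Sample s = rows.get(r);
                if (c == 0) return s.time;
                return s.gvalue;
            }
        }

        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                SampleTableModel model = new SampleTableModel();
                JTable table = new JTable(model);     // the table must be built on this model
                JFrame frame = new JFrame("Sensor data");
                frame.add(new JScrollPane(table));
                frame.setSize(400, 300);
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);

                // Fake sensor: one sample per second, added on the event dispatch thread.
                new Timer(1000, e -> model.add(new Sample(System.currentTimeMillis(), Math.random()))).start();
            });
        }
    }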

  • How much data is needed to download the whole Creative Suite?

    How much data is needed to download the whole Creative Suite?

    Jacoedit, you can see the size of the base installers at http://prodesigntools.com/adobe-cc-direct-download-links.html. Please be aware that you will still need to download the updates for the applications as well.

  • How to filter data after Data Collection for Oracle PS

    Hi Experts,
    I have a critical issue with the data volume after the data collection program.
    If I give this big chunk of data as input to PS, the integration program does not accept this high volume of data into PS.
    Hence I now have a critical task to reduce the amount of data.
    On this front, I first need to delete some items that are not required for PS from the base table (MSC_SYSTEM_ITEMS) after the collection program, but I do not know which dependent entities also have to be taken care of on the MSC side after collection if I delete the items from MSC_SYSTEM_ITEMS.
    Hence I need help with this data filtering issue.
    I appreciate anyone's help in advance.
    Thanks & Regards,
    Bharathram N

    Hello,
    Have you looked at the request Purge ODS Data of Collections? Using this you can delete all the collected data for a specific instance ID.
    Once that is done, you can run a complete refresh for all entities for all enabled organizations. This will ensure that old data pertaining to disabled orgs is removed.
    Additionally, in 11.5.10 (collections RUP 32 and later) there is a feature that does not pull data from disabled organizations into the source snapshots. This in itself does not make a difference for the ODS data, but it does give better performance in the refresh snapshots.
    One thing to take into account if you purge the ODS data: if you still have ASCP plans that are based on this data, you may end up with no data being displayed in the workbench (because the LID tables have been purged). In that case you would have to rerun the plans.
    best regards
    Geert
