Multiple Trigger Level Data Collection

I'm having some difficulty with a unique data collection problem. I'm using the DAQ Assistant to collect and display voltage data on a graph and in numeric indicators. I need to add functionality so that when the user clicks a control, the incoming data is sampled and shown in a table. Each sample should occur at a successively higher trigger level -- i.e., the 1st sample when channel 0 is near 1 V, the 2nd sample at 2 V, etc. Is this possible using LabVIEW 8.6? I have experimented with the Trigger and Gate function, but have been unable to fire the manual trigger at successively higher levels. Any help or ideas would be appreciated!

Hi Sailorguy,
Please have a look at this forum and see if it helps. Thanks!
Ipshita C.
National Instruments
Applications Engineer

Similar Messages

  • Can we run parallel data collections to collect multiple source instances?

    Hi,
    We have a business requirement where we will have multiple Source (ERP) instances and a single Destination (ASCP) instance. We would like to check whether we can run Data Collections in parallel to collect multiple ERP instances in the same time window. This is to reduce the total planning process duration, since after data collections we will be required to run multiple plan runs.
    Please help me with your expert comments
    Rgds
    Sid

    You may instead use Continuous Collections to save time; they can run collections from both instances periodically throughout the day, thereby avoiding a single long timespan to collect all the data.
    Thanks
    Navneet Goel
    Inspirage

  • I'm looking for a return value if a trigger level is reached, in LabVIEW. No data acquisition.

    I don't want to acquire data with the analog input, only a return value if the trigger level is reached -- ideally a Boolean expression to control another function, only to activate or deactivate it. I use the PCI-MIO-16E-1 and LabVIEW 6, and I hope there is someone who can help ;-)

    The DAQ Occurrence will probably be the best option for you. It can be set to generate an occurrence (interrupt) at a particular level. You won't get a Boolean output; instead, you have another node (Wait on Occurrence) that doesn't allow execution to continue until the occurrence is generated.
    Otherwise, you may have to do a point-by-point comparison.
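
    The point-by-point comparison suggested above can be sketched as follows. This is illustrative Python, not an NI-DAQ API call — both function names are invented for the example:

```python
# Point-by-point comparison: emit True exactly when the signal crosses
# the trigger level on a rising edge, which can then gate another function.

def level_crossed(previous, current, level):
    """Rising-edge detector: True only when the signal crosses `level` upward."""
    return previous < level <= current

def scan_for_trigger(samples, level):
    """Return the index of the first rising crossing of `level`, or -1."""
    for i in range(1, len(samples)):
        if level_crossed(samples[i - 1], samples[i], level):
            return i
    return -1
```

    In LabVIEW terms, `level_crossed` is the comparison you would wire inside the acquisition loop, with the previous sample held in a shift register.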

  • Multiple parametric Data Collection using createMultipleParametricData API

    Dear Experts,
    For the first time we are using the SAP ME PAPI functionality in SAP MII. As per the requirement, we need to do data collection against an SFC from an SAP MII transaction. We are successfully able to do the data collection for a single parametric measure using the SAP ME PAPI createParametricData. But now we need to do the data collection for multiple parametric measures in one call, and for that we can see that the createMultipleParametricData API is available. The sample XML file from which we are getting the data for the three parametric measures Temperature, Pressure and Volume is pasted below.
    <?xml version="1.0" encoding="UTF-8" standalone="no"?>
    <Rowsets CachedTime="" DateCreated="2014-02-10T06:52:01" EndDate="2014-02-10T06:52:01" StartDate="2014-02-10T05:52:01" Version="14.0 SP4 Patch 1 (Jan 8, 2014)">
        <Rowset>
            <Columns>
                <Column Description="Temperature" MaxRange="1" MinRange="0" Name="Temperature" SQLDataType="-9" SourceColumn="Temperature"/>
                <Column Description="Pressure" MaxRange="1" MinRange="0" Name="Pressure" SQLDataType="-9" SourceColumn="Pressure"/>
                <Column Description="Volume" MaxRange="1" MinRange="0" Name="Volume" SQLDataType="-9" SourceColumn="Volume"/>
            </Columns>
            <Row>
                <Temperature>32</Temperature>
                <Pressure>30</Pressure>
                <Volume>40</Volume>
            </Row>
        </Rowset>
    </Rowsets>
    This is the input source from which we need to extract tag names and values under node-Row and then do Data Collection against an SFC with the help of Public API createMultipleParametricData.
    But we are not able to assign multiple parameters as request parameters to this API.
    Can anybody help us with what input format is required to assign multiple parameters to this API or to assign multiple parameters at a time?
    It would be great if you could paste the request XML structure as well.
    Thanks in advance,
    Sanjeev Sharma

    If your input stream is coming into a 1D array, you might try using the Decimate 1D Array function in the array function group.
    Pull down the function until it matches the 8 data sets (or 9 if there is a framing character of some sort in the data stream); then each set should come out of a separate output of the block.
    You probably need to be careful that the serial stream always appears in the same order for this to work right.
    Hope that helps.
    The Hummer
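
    For reference, Decimate 1D Array amounts to stride slicing of an interleaved stream. A minimal Python sketch of the same idea (`decimate` is an invented name; the channel count is whatever your stream interleaves):

```python
# De-interleave a flat stream into per-channel lists, the way LabVIEW's
# Decimate 1D Array splits one array across its outputs.

def decimate(stream, n_channels):
    """Return n_channels lists: element i, i+n, i+2n, ... for each channel i."""
    return [stream[i::n_channels] for i in range(n_channels)]
```

    As the answer notes, this only works if the stream's channel order never shifts, since each output is defined purely by position.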

  • How to Save Multiple Records In Data Block

    Hi All,
    I have two blocks --> Control Block, Database Block.
    Please, any idea how to save multiple records in the Data Block when the user changes data in the Control Block?
    Thanks For Your Help
    Sa

    Now I have to use each record of the control block (ctl_blk) as a where condition in the database block (dat_blk) and display 10 records from the database table.
    Do you want this coordination to be automatic/synchronized, or only when the user clicks a button or something else to signal the coordination? Your answer here will dictate which trigger to put your code in.
    As to the coordination part: as the user selects a record in the Control Block (CB), you will need to take the key information and modify the Data Block's (DB) DEFAULT_WHERE block property. The logical place to put this code is the CB When-New-Record-Instance (WNRI) trigger. Your code will look something like the following:
    /* Sample WNRI trigger */
    /* This sample assumes you do not have a default value in the block WHERE property */
    DECLARE
       v_tmp_dw    VARCHAR2(250);
    BEGIN
       v_tmp_dw := ' DB_Key_Column1 = '||:CONTROL.Key_Column1||' AND DB_Key_Column2 = '||:CONTROL.Key_Column_2;
       Set_Block_Property('DATA_BLOCK', DEFAULT_WHERE, v_tmp_dw);
       /* If you want auto coordination to occur, do the following */
       Go_Block('DATA_BLOCK');
       Execute_Query;
       /* Now, return to the Control Block */
       Go_Block('CONTROL_BLOCK');
    END;
    The Control Block items are assigned values at form level (KEY-EXEQRY). If your CB is populated from a single table, it would be better to create a Master-Detail relationship (as Abdetu describes).
    Hope this helps,
    Craig B-)
    If someone's response is helpful or correct, please mark it accordingly.

  • Log NC based on data collection

    Is it possible to trigger the logging of an NC based on a data collection value being outside the acceptable range?
    i.e., the acceptable range for the data collection is a number less than 6; if the user enters 7, I would like to log an NC that says the data collection is out of range.

    To summarize:
    What I'm taking away from this is that it is the best practice to have only one parameter per DC group if you intend to trigger the automatic logging of an NC when that group "fails." The one parameter in the DC group MUST have a min/max value assigned and a fail is triggered when the operator enters a value outside of that range.  The NC is logged using the value assigned to the LOGNC_ID_ON_GROUP_FAILURE parameter in activity maintenance.
    If there are multiple parameters in the DC group, they all have to have a min/max value assigned and ALL of the responses have to be out of range in order to fail the SFC.
    I cannot have a DC group that contains parameters of multiple types and expect an NC to be logged based on an incorrect answer (for one question or multiple.)
    I cannot expect an NC to be logged based on an incorrect answer of one question, if the rest of the questions in the DC group are answered "correctly."
    Sound correct?
    Edited by: Allison Davidson on Apr 18, 2011 10:06 AM  - typo

  • How to perform Data Collection on single SFC with QTY = 1 with material lot size 1?

    Dear experts,
    We are working with SFC qty > 1 on a relaxed routing. At a given operation we want to collect the data on a single quantity; i.e. the SFC qty on that operation, where the collection will happen, will be 1. The corresponding material lot size is, for example, 10. The operator must be able to collect data on the SFC with qty = 1 multiple times until the quantities are consumed. He must also be able to collect other values on the remaining quantities at the same operation with the same DC group or other DC groups. How many times the data must be collected depends on the shop order build quantity. The data may be collected several times, but not more than the build qty. In other words, some specific data will be collected on a qty of a product while other data will be collected against the remaining quantity. The data collection must also be done in a serialized manner.
    Here's what we have set up so far:
    1) 3 DC groups, each DC group has 3 data fields.
    2) Each data field has the following restrictions:  Required Data Entries = 0 and Optional Data Entries = 1
    3) All DC groups are attached on the same combination of operation\material\routing
    4) we are using relaxed routing
    Process description:
    The operator must be able to collect any data field on a single product. For that he will enter the operation where the data collections are attached, he will enter the SFC with qty = 1, then he will run the data collection after selecting the appropriate DC group and entering the needed information. The operator will complete the SFC with qty = 1.
    The operator will pick the next product, select the same SFC and enter qty 1 and collect other value against this product.
    Problem is:
    Once the first collection is done on a given SFC with entered qty = 1, the system does not allow the operator to do further collections on the same SFC with qty = 1 or any other quantity. He cannot select any DC group from the DC group list. We also tried the table selection menu on the DC group list, but nothing can be selected.
    So we tried to play around with the DC group definitions as follows:
    A) We set Required Data Entries = 0 and Optional Data Entries = 10. Still the operator was not able to select any DC group after collecting data the first time. We tried reopening the POD and the list again, but we get the same blocking behavior.
    B) We set Required Data Entries = 10 and Optional Data Entries = 1. The operator was able to select the DC group after collecting data the first time, BUT the operator must enter the data fields 10 times on one SFC quantity, which is not what we want. Besides, again he cannot collect other information on the remaining quantities at the same operation.
    C) There's an option to serialize the SFC before reaching the operation where the collection happens, then merge after completion. Automation is needed here, hence customization. We are strongly avoiding customization now, since we expect the data collection to work well on single quantities even when the main SFC has qty > 1.
    Questions:
    1) Are we missing any kind of further configuration\setup?
    2) Or the current system design does not allow collecting data on single quantities of an SFC which main quantity is greater than 1?
    3) Looking at this link, Approaches to Collection of Data - SAP Manufacturing Execution (SAP ME) - SAP Library, there's nothing mentioned about the same SFC number with multiple quantities!
    We are using SAP ME 15.0.3.0.
    Thanks in advance for your help
    Ali

    Ali
    to collect data for the same SFC multiple times, your system rule "Allow Multiple Data Collection" needs to be set to true for the site.
    Stuart

  • What is the significance of the data collection and search help exit fields?

    Dear Gurus
    I know I am asking a very basic ABAP question, but SDN is the only source for me to learn SAP. I want to thank you all for your kind support.
    I read most of the posts related to search help and am trying to create one.
    For an elementary search help:
    SE11 -> SEARCH HELP -> ELEMENTARY SEARCH HELP
    I have doubts regarding the fields "DATA COLLECTION" and "SEARCH HELP EXIT".
    According to a tutorial it is a maintenance view -- shall I have to create a maintenance view first?
    And the other field is SEARCH HELP EXIT -- what is this?
    Please help me.
    Thanks in advance.
    Chitta Ranjan Mahato
    Edited by: chitto123 on Oct 8, 2010 5:59 AM

    Howdy,
    DATA COLLECTION - refers to a database table or view.  This is the data that the search help will search through and display based on the parameters provided, so you can create your own view for the search help if you want the search to cover multiple tables.
    SEARCH HELP EXIT - You can create a function module to be able to alter the Search Help's selection and results at various events throughout the search help.  An example of this function module is provided with some documentation in function module F4IF_SHLP_EXIT_EXAMPLE.
    Cheers
    Alex

  • Table in which system updates data collected through KKRV

    Hi,
    Can anyone tell me the table in which the system updates data collected through KKRV,
    and the correct procedure to collect the data for product drilldown?
    Can I run KKRC along with KKRV to get summarized reports?
    Thanks in advance
    Regards,

    You could probably trace them in the COSP or COSS tables, where the object starts with VD (summarization object).
    Summarization transactions are better run separately: first KKRV for summarizing plant-level information, and then KKRC for hierarchy summarization.

  • Converted Crystal Statement Report not showing line level data properly

    Hi all,
    We have an issue with a Crystal-converted PLD Customer Statement layout behaving strangely when printing for a customer with multiple pages.
    When statements are printed for a customer whose line-level data fills about an A4 page, they print absolutely fine: customer header data, line data, etc. However, when we print for a customer that has more lines than can fit onto one page, Crystal does the following:
    Page 1 - Customer Header info and column headers (OK); Line level info until bottom of page (OK)
    Page 2 - Customer Header info and column headers (MISSING); Line level info (ONLY ONE LINE APPEARS INDIVIDUALLY ON THIS PAGE BUT IS THE NEXT LINE CONTINUING FROM FROM THE PREVIOUS PAGE)
    Page 3 - Customer Header info and column headers (OK); Line level info until bottom of page (OK - CONTINUING FROM THE INDIVIDUAL LINE FROM THE PREVIOUS PAGE)
    Page 4 - Customer Header info and column headers (MISSING); Line level info (ONLY ONE LINE APPEARS INDIVIDUALLY ON THIS PAGE BUT IS THE NEXT LINE CONTINUING FROM FROM THE PREVIOUS PAGE)
    etc...
    So what basically seems to be happening is that every other page, Crystal just sticks in an individual line on its own, without any headers or any other lines.
    Help please! Quite likely a 'Section Expert' issue but we don't know what setting could be causing this.

    Sounds like the report is trying to print multiple pages with blank pages in between. Is the normal report (i.e. details fit onto an A4 page) printing with a blank back page?
    It will be page-number resets and suppression of details on pagenumber = 2.
    You need to go to the Section Expert and go through each section, noting what conditions have been suppressed; if any formula boxes are red, open them and see what they do.
    Try turning conditions on and off and commenting out any formula conditions and see what happens. By a process of elimination you should be able to work it out. However, you may have to stop the introduction of the blank page somehow.
    Ian

  • Access 2010 InfoPath Data Collection Export Fails Due to Date Format That Includes Time Zone

    I created an Access 2010 database that has multiple data collection (InfoPath) forms that were generated from Access and have been in use for about 1.5 years. Starting in 2013 (for about a week now), the submitted data fails to export due to a "data type conversion error" with the date fields. Prior to 2013, the date string returned in the InfoPath form looked like this: "2013-01-07T00:00:00", but now it looks like this: "2013-01-07T00:00:00-05:00". The time zone is appended to the string and it kills the Outlook export feature.
    To test this, I created a new database with one table and one date/time field. I generated an InfoPath template and emailed it to myself. I entered the date into the template and submitted it (tried manually entering the date as well as using the date-time picker control; it made no difference). The InfoPath template now contains "2013-01-07T00:00:00-05:00" and will not export from Outlook to Access. I tried manually pasting the string into the Access table and it would take it, but it would show "1/7/2013 5:00:00 AM" in the date/time field (which isn't correct either, but at least it took). Note: this problem has appeared at my office (Win 7 with Office 2010), but my testing was done on my personal laptop using Win 8 with Office 2010.
    It looks like Microsoft has created a bug and now all of my data collection forms are unusable. Any help will be appreciated.

    Microsoft confirmed that the issue was introduced with MS12-066 as follows:
    ***Start Quote
    We have been able to identify that this issue was introduced with the change made for the hotfix detailed in KB 2687395. This update was included in the security update MS12-066 when you installed KB 2687417.
    2687395          Description of the InfoPath 2010 hotfix package (ipeditor-x-none.msp): August 28, 2012
    http://support.microsoft.com/kb/2687395/EN-US
    2687417           MS12-066: Description of the security update for InfoPath 2010 Service Pack 1: October 9, 2012
    http://support.microsoft.com/kb/2687417/EN-US
    Investigating workarounds I've only come up with using HTML forms or changing the datatype of the control to text.
    ***End Quote
    My own testing also indicates that if you are using InfoPath with SQL Server, you may be able to change the Date/Time picker control in InfoPath to a Date-only picker control (if the SQL Server data type will support that).
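
    Building on the text-datatype workaround quoted above, stripping the UTC offset before the value reaches Access could look like the following. This is a Python sketch of the string transformation only, not part of InfoPath or Access; the function name is invented:

```python
# Turn the post-MS12-066 InfoPath value "2013-01-07T00:00:00-05:00"
# back into the offset-free form "2013-01-07T00:00:00" that the
# Access export accepted.

from datetime import datetime

def strip_offset(value):
    """Drop the trailing UTC offset from an ISO 8601 date-time string."""
    dt = datetime.fromisoformat(value)       # parses the "-05:00" offset
    return dt.replace(tzinfo=None).isoformat()
```

    Note this discards the offset rather than converting to local time, matching the pre-2013 behavior the poster describes.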

  • Data Collection Jobs fail to run

    Hello,
    I have installed three servers with SQL Server 2012 Standard Edition, which should be identically installed and configured.
    Part of the installation was setting up Data Collection, which worked without problems on two of the servers, but not on the last, where 5 of the SQL Server Agent jobs fail.
    The jobs failing are:
    collection_set_1_noncached_collect_and_upload
    Message:
    Executed as user: NT Service\SQLSERVERAGENT. The step did not generate any output.  Process Exit Code -1073741515.  The step failed.
    collection_set_2_collection
    Step 1 message:
    Executed as user: NT Service\SQLSERVERAGENT. The step did not generate any output.  Process Exit Code -1073741515.  The step failed.
    collection_set_2_upload
    Step 2 message:
    Executed as user: NT Service\SQLSERVERAGENT. The step did not generate any output.  Process Exit Code -1073741515.  The step failed.
    collection_set_3_collection
    Step 1 message:
    Executed as user: NT Service\SQLSERVERAGENT. The step did not generate any output.  Process Exit Code -1073741515.  The step failed.
    collection_set_3_upload
    Step 2 message:
    Executed as user: NT Service\SQLSERVERAGENT. The step did not generate any output.  Process Exit Code -1073741515.  The step failed.
    When I try to execute one of the jobs, I get the following event in the System Log:
    <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
      <System>
        <Provider Name="Application Popup" Guid="{47BFA2B7-BD54-4FAC-B70B-29021084CA8F}" />
        <EventID>26</EventID>
        <Version>0</Version>
        <Level>4</Level>
        <Task>0</Task>
        <Opcode>0</Opcode>
        <Keywords>0x8000000000000000</Keywords>
        <TimeCreated SystemTime="2013-06-04T11:23:10.924129800Z" />
        <EventRecordID>29716</EventRecordID>
        <Correlation />
        <Execution ProcessID="396" ThreadID="1336" />
        <Channel>System</Channel>
        <Computer>myServer</Computer>
        <Security UserID="S-1-5-18" />
      </System>
      <EventData>
        <Data Name="Caption">DCEXEC.EXE - System Error</Data>
        <Data Name="Message">The program can't start because SqlTDiagN.dll is missing from your computer. Try reinstalling the program to fix this problem.</Data>
      </EventData>
    </Event>
    I have tried removing and reconfiguring the Data Collection two times using the stored procedure msdb.dbo.sp_syscollector_cleanup_collector and removing the underlying database.
    Both times with the same result.
    Below are the basic information about the setup of the server:
    Does any one have any suggestions about what the problem could be and what the cure might be?
    Best regards,
    Michael Andy Poulsen

    I tried running a repair on the SQL Server installation.
    This solved the problem.
    Only thing is that now I have to figure out the meaning of these last lines in the log of the repair:
    "The following warnings were encountered while configuring settings on your SQL Server. These resources/settings were missing or invalid, so default values were used in recreating the missing resources. Please review to make sure they don't require further customization for your applications:
    Service SID support has been enabled on the service.
    Service SID support has been enabled on the service."
    /Michael Andy Poulsen

  • ASCP Data Collection Fails

    Dear Members :
    For the sake of learning ATP, in an R12.1.1 Vision instance I created a simple ATP rule with Demand as Sales Order and Supply as PO / On-hand Available.
    I followed all the steps as below:
    1) After creating ATP rule and attaching it to the order mgmt tab in item master, enabled ATP at the org level.
    2) created sourcing rule for that item and assigned it to an assignment set(Supplier Scheduling).
    3) This assignment set appears by default to MRP:ATP assignment set profile option
    4) Set INV : Capable to Promise to ATP/CTP based on collected data, at responsibility level
    5) Include organizations to instance TST(default)
    6) Run ATP Data collection Program.
    RCS errored out with a user-defined exception (Source Setup Objects Creation Requests did not complete successfully), and the Collection Triggers/Views requests errored out with
    "Error in creating Source Triggers/Views".
    Is this a setup problem on my side, or is there a patch I need to apply? If you could please guide.
    Thanks in advance.
    Atanu

    Following are in log for them:
    **Starts**03-JAN-2011 20:00:52
    **Ends**03-JAN-2011 20:12:22
    Error in creating Source Triggers
    Start of log messages from FND_FILE
    03-JAN 20:00:52 : Request : 5807365 :Creates Item Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807366 :Creates BOM Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807367 :Creates Routing Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807368 :Creates WIP Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807369 :Creates Demand Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807370 :Creates Supply Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807371 :Creates Other Triggers used by Collections Process
    03-JAN 20:00:52 : Request : 5807372 :Creates Repair Order Triggers used by Collections Process
    End of log messages from FND_FILE
    ***********************************AND that of Source Views details :**************************************************************
    **Starts**03-JAN-2011 20:00:47
    **Ends**03-JAN-2011 20:12:18
    Error in creating Source Views
    Start of log messages from FND_FILE
    03-JAN 20:00:47 : Request : 5807356 :Creates Setup Views used by Collections Process
    03-JAN 20:00:47 : Request : 5807357 :Creates Item Views used by Collections Process
    03-JAN 20:00:47 : Request : 5807358 :Creates BOM Views used by Collections Process
    03-JAN 20:00:47 : Request : 5807359 :Creates Routing Views used by Collections Process
    03-JAN 20:00:47 : Request : 5807360 :Creates WIP Views used by Collections Process
    03-JAN 20:00:47 : Request : 5807361 :Creates Demand Views used by Collections Process
    03-JAN 20:00:48 : Request : 5807362 :Creates Supply Views used by Collections Process
    03-JAN 20:00:48 : Request : 5807363 :Creates Other Views used by Collections Process
    03-JAN 20:00:48 : Request : 5807364 :Creates Repair Order Views used by Collections Process
    End of log messages from FND_FILE
    Executing request completion options...
    Output is not being printed because:
    The print option has been disabled for this report.
    Finished executing request completion options.
    Concurrent request completed
    Current system time is 03-JAN-2011 20:12:18
    ======================================================================
    Thanks
    Atanu

  • SAP ME Data Collection setup

    Hello Experts!
    We are trying to set up data collection for following scenario:
    Typically our order quantity can vary from 1000 - 5000 pcs and we want to create only one SFC per shop order. We want to collect data after a fixed number of pcs are reported, for example, after every 100 pcs reported. So in above example number of iterations for data collection could vary from 10 - 50.
    We see that there are two fields "Required Data Entries" & "Optional Data Entries" in the Data Collection parameter detail tab but looks like those are for static values but we want this to change based on order quantity.
    We also noticed another issue with these fields for our scenario: if we use the "Required Data Entries" field, then the user has to collect all the iterations together, but that is not possible since we are collecting after reporting a certain qty. If we use "Optional Data Entries", then the system allows the user to collect multiple times, but the pop-up does not indicate how many iterations have already been collected, which is confusing for the users.
    Has anyone else had any experience with a similar scenario?
    Thanks in advance!
    -Venkat

    Hello Venkat,
    To collect data against the same SFC several times you should enable the corresponding system rule 'Allow Multiple Data Collection'. The "Optional Data Entries" field rather controls the number of optional entries that you can enter (i.e. you measure the temperature of an SFC and need to enter several measures for a single Data Collection).
    As long as you enable the system rule, it will be up to you when to collect it. But there is no standard functionality to force it after a certain qty of the SFC has been processed. You'll need a customization for it.
    Br, Alex.

  • [JAVA] How to insert collected data into a table

    Hello!
    I'm writing a program that monitors the signal of a sensor in a chart. Now I want to insert the data collected by the sensor into a table using a JTable. The problem is that when I run the program, the data does not appear in the table.
    Where did I go wrong?
    Here's part of the source code.
    import java.util.ArrayList;
    import javax.swing.table.AbstractTableModel;

    // Collection and model shared by the chart and the table.
    ArrayList<Data> datiArray = new ArrayList<Data>();
    DataTableModel dtm = new DataTableModel(datiArray);

    public class DataTableModel extends AbstractTableModel {
        private final ArrayList<Data> datiArray;   // the table model owns the collection
        private final String[] colName = { "Time", "G-value" };

        public DataTableModel(ArrayList<Data> datiArray) {
            this.datiArray = datiArray;
        }

        public void addData(Data d) {
            datiArray.add(d);
            int row = datiArray.size() - 1;
            fireTableRowsInserted(row, row);   // notify the JTable that a row was added
        }

        @Override
        public String getColumnName(int col) {
            return colName[col];
        }

        @Override
        public int getRowCount() {
            return datiArray.size();
        }

        @Override
        public int getColumnCount() {
            return 2;
        }

        @Override
        public boolean isCellEditable(int row, int col) {
            return false;
        }

        @Override
        public Class<?> getColumnClass(int c) {
            return (c == 0) ? Long.class : Double.class;
        }

        @Override
        public Object getValueAt(int row, int column) {
            Data d = datiArray.get(row);
            switch (column) {
                case 0: return d.getTime();    // was dati.getTime(): wrong variable name
                case 1: return d.getGvalue();
            }
            return null;
        }
    }

    private class Data {
        private final long time;
        private final double gvalue;

        public Data(long time, double gvalue) {
            this.time = time;      // was this.tempo = tempo: the field was never assigned
            this.gvalue = gvalue;
        }

        public long getTime() { return time; }
        public double getGvalue() { return gvalue; }
    }

    // RecordButtonAction
    private void recordButtonActionPerformed(java.awt.event.ActionEvent evt) {
        int i = 0;
        int j = graphView.getSampleTime();
        int k = graphView.getIndexMax();
        while (i < k) {
            Data dr = new Data(graphView.getTime(i), graphView.getGvalue(i));
            dtm.addData(dr);       // addData fires the insert event so the JTable updates
            i = i + j;
        }
        readyRecord = false;
    }
    Sorry for my bad English.

    Please don't cross-post the same question in multiple forums, especially since you've already been given directions on how to ask the question smartly.
