Best way to read KEKO and KEPH data

Hi All,
I have to collect data from KEKO and KEPH.
I have tried the FM 'CK_F_KEKO_KEPH_READING', but it takes a long time. Is there a better way to read this data? I have to collect data for multiple materials and multiple plants.
Thanks in advance!

Try
CK_F_KEKO_KEPH_DIRECT_READ
CK_F_KEKO_KEPH_READING
Also take a look at function group "CK2U".
Hope this gives you an idea!
Please award the points.
Good luck
Thanks
Saquib Khan
"Some are wise and some are otherwise"

Similar Messages

  • What is the best way to read high rate serial data?

    My goal is to read 14 bytes of binary data from an instrument over RS232 serial (baud 460800) at 2000 Hz. I have not gotten a high-speed serial card yet, so I am currently using the standard serial port (115200 baud) and reading at 400 Hz. I configure the serial port, flush the buffer, then enter a while loop to read the incoming data. I have a VISA read of 14 bytes per iteration every 0.0025 seconds (400 Hz). However, it seems the sensor is spitting out data faster than LabVIEW can read it, because bytes are accumulating at the port. After a while the buffer fills up and the program fails.
    Is there a better way to do this?
    Would it be better to read larger amounts of data less often, e.g. 1400 bytes every 0.25 seconds?
    Thanks
    Attachments:
    serial_read.jpg (161 KB)
    serial_read.vi (19 KB)
    bytes read.png (20 KB)

    Here is a producer/consumer approach... not tested.
    The unwrapping of the string could be done in a better way (didn't spend much time on it, not tested!! Just a quick mod from another post).
    I can't remember which is faster: Scan From String, Match Pattern, or regular expressions...
    Avoid any front-panel objects in the serial read loop to keep that thread independent from the GUI thread. The block size (128) should be optimized... (A sketch of the same pattern in Java follows after the attachments.)
    EDIT: The scan loop will not work
    Greetings from Germany
    Henrik
    LV since v3.1
    “ground” is a convenient fantasy
    '˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'
    Attachments:
    ProducerConsumerData 2.vi (29 KB)
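    (A rough, untested Java sketch of the producer/consumer idea above, since the attachment is a LabVIEW VI. An in-memory InputStream stands in for the serial port, which would really come from a serial library; the 14-byte frame size is from the question, and the block size is a tuning knob like Henrik's 128. The point is the structure: one thread drains the port in large blocks, a bounded queue decouples it, and a second thread parses at its own pace.)

    import java.io.ByteArrayInputStream;
    import java.io.InputStream;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class ProducerConsumerSerial {
        static final int FRAME = 14;          // bytes per sensor frame (from the question)
        static final int BLOCK = FRAME * 10;  // read several frames per OS call

        public static void main(String[] args) throws Exception {
            // Stand-in for the serial port stream; a real port would come
            // from a serial library, which is outside this sketch.
            InputStream port = new ByteArrayInputStream(new byte[FRAME * 1000]);
            BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(1024);

            // Producer: drain the port in large blocks so bytes never pile up there.
            Thread producer = new Thread(() -> {
                byte[] buf = new byte[BLOCK];
                try {
                    int n;
                    while ((n = port.read(buf)) > 0) {
                        byte[] chunk = new byte[n];
                        System.arraycopy(buf, 0, chunk, 0, n);
                        queue.put(chunk);   // blocks if the consumer falls behind
                    }
                    queue.put(new byte[0]); // empty chunk = end-of-stream marker
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });

            // Consumer: unwrap frames at its own pace, decoupled from the port.
            Thread consumer = new Thread(() -> {
                int frames = 0;
                try {
                    for (byte[] chunk = queue.take(); chunk.length > 0; chunk = queue.take()) {
                        frames += chunk.length / FRAME; // real frame parsing goes here
                    }
                    System.out.println("frames consumed: " + frames);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
            producer.join();
            consumer.join();
        }
    }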

  • Best way to read NTFS and WRITE to it?

    I know there is a laborious third-party software package called MacFUSE that will allow Macs to read AND write to an NTFS drive... but does anyone know of a more reliable and easier way? MacFUSE sounds like a rather tech-savvy way to go about it...
    Just curious...
    Thanks!

    There is a shareware program called Paragon NTFS. However, I see no reason not to use MacFUSE and NTFS-3G. Both are open-source programs that are easy to use.
    Simply download and install MacFUSE, and then do the same for NTFS-3G. They both use preference panes for settings, but they work "as-is". Once installed together, NTFS volumes work the same as any other volume in OS X. It's not only for the "tech-savvy".

  • Best way of reading a CLOB and loading into a table

    Hi,
    I'm loading data from a CLOB into one of our tables. This task is taking 10 minutes for 8,000 records, and down the road we are expecting more than 20,000 records.
    Is there a faster way to load the data? Please help me out.
    The source table is lob_effect1; the target table can be any table.
    CREATE TABLE lob_effect1 (
      id  INTEGER NULL,
      loc CLOB    NULL
    )
    STORAGE (
      NEXT 1024K
    );

    CREATE OR REPLACE FUNCTION f_convert(p_list IN VARCHAR2)
      RETURN VARCHAR2
    AS
      l_string      VARCHAR2(32767) := p_list || ',';
      l_comma_index PLS_INTEGER;
      l_index       PLS_INTEGER := 1;
      v_col_val     VARCHAR2(32767);
      v_col_val_str VARCHAR2(32767);
    BEGIN
      -- Walk the comma-separated list and rebuild it with each value quoted.
      LOOP
        l_comma_index := INSTR(l_string, ',', l_index);
        EXIT WHEN l_comma_index = 0;
        v_col_val     := SUBSTR(l_string, l_index, l_comma_index - l_index);
        v_col_val_str := v_col_val_str || ',' || CHR(39) || v_col_val || CHR(39);
        v_col_val_str := LTRIM(v_col_val_str, ',');
        l_index       := l_comma_index + 1;
      END LOOP;
      RETURN v_col_val_str;
    END f_convert;
    /

    CREATE OR REPLACE PROCEDURE p_load_clob1(
      p_date     IN DATE DEFAULT NULL,
      p_tab_name IN VARCHAR2,
      p_clob     IN CLOB DEFAULT NULL)
    IS
      var_clob                 CLOB;
      var_clob_line            VARCHAR2(4000);
      var_clob_line_count      NUMBER;
      var_clob_line_word_count NUMBER;
      v_col_val                VARCHAR2(32767);
      v_col_val_str            VARCHAR2(32767);
      v_tab_name               VARCHAR2(200) := 'coe_emea_fi_fails_new_tmp';
      v_sql                    VARCHAR2(32767);
      n_id                     NUMBER;
      CURSOR cur_col_val(p_str VARCHAR2) IS
        SELECT * FROM TABLE(fn_split_str(p_str));
    BEGIN
      -- Stage the incoming CLOB (the original post inserted into lob_effect but
      -- read back from lob_effect1; the table name is unified here).
      INSERT INTO lob_effect1 VALUES (
        seq_lob_effect.nextval,
        p_clob
      )
      RETURNING id INTO n_id;
      COMMIT;
      SELECT loc INTO var_clob FROM lob_effect1 WHERE id = n_id;
      -- Count lines by counting CHR(10) separators, then process line by line.
      var_clob_line_count := LENGTH(var_clob) - NVL(LENGTH(REPLACE(var_clob, CHR(10))), 0) + 1;
      FOR i IN 1 .. var_clob_line_count
      LOOP
        var_clob_line            := REGEXP_SUBSTR(var_clob, '^.*$', 1, i, 'm');
        var_clob_line_word_count := LENGTH(var_clob_line) - NVL(LENGTH(REPLACE(var_clob_line, ',')), 0) + 1;
        v_col_val_str            := NULL;
        v_col_val                := NULL;
        -- Quote each comma-separated value on the line.
        FOR rec_col_val IN cur_col_val(var_clob_line)
        LOOP
          v_col_val     := rec_col_val.column_value;
          v_col_val_str := v_col_val_str || ',' || CHR(39) || v_col_val || CHR(39);
          v_col_val_str := LTRIM(v_col_val_str, ',');
        END LOOP;
        -- Row-by-row dynamic INSERT with literal values: this is the slow part.
        v_sql := 'insert into ' || p_tab_name || ' values (' || v_col_val_str || ')';
        EXECUTE IMMEDIATE v_sql;
      END LOOP;
      COMMIT;
    EXCEPTION
    WHEN OTHERS THEN
      dbms_output.put_line('Error:' || SQLERRM);
    END;
    /
    Thanks & Regards,
    Ramana.

    Thread: HOW TO: Post a SQL statement tuning request - template posting
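    The tuning template above is the canonical route. For what it's worth, one visible cost in the posted procedure is the row-by-row EXECUTE IMMEDIATE with literal values, which forces a hard parse for every line of the CLOB. Below is a minimal, hypothetical JDBC sketch of the usual alternative, bind variables plus batching; the target table name and two-column row shape are assumptions, not the poster's real schema.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.util.List;

    public class BatchLoad {
        // Hypothetical target table with two columns; adjust to the real schema.
        static final String INSERT_SQL = "INSERT INTO target_table VALUES (?, ?)";

        // One statement reused for every row and flushed in batches of 1000,
        // instead of building and hard-parsing a literal INSERT per row.
        static void load(Connection conn, List<String[]> rows) throws Exception {
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
                int pending = 0;
                for (String[] row : rows) {
                    ps.setString(1, row[0]);
                    ps.setString(2, row[1]);
                    ps.addBatch();
                    if (++pending == 1000) { // flush a full batch
                        ps.executeBatch();
                        pending = 0;
                    }
                }
                if (pending > 0) ps.executeBatch();
                conn.commit();
            }
        }
    }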

  • What is the best way to read, process, and write an Excel file server-side... SQL Server Agent Job

    So I was using dynamic Excel commands to open and save as using...
    Microsoft.Office.Interop.Excel.Application
    and
     workbook.SaveAs(StringDestinationFile, XlFileFormat.xlExcel8, Type.Missing, Type.Missing, Type.Missing, Type.Missing, XlSaveAsAccessMode.xlExclusive, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing);
    which worked all fine and dandy client side. Then when I attempted to create a SQL Server Agent Job, this failed because the SQL Server side is not able to execute dynamic Excel commands.
    So do I therefore need to try to do this via Microsoft.ACE.OLEDB.12.0 commands? And where can I find the commands and syntax to open and save as? I have to open a .xlsx file, save it as a .xls file, and then open this newly created .xls file and save it as a .csv file.
    Thanks for your review and am hopeful for a reply.
    ITBobbyP85

    I think you might be overcomplicating things.
    You can use SSIS with Excel Source/Destination connections to read in, or output to, an Excel sheet/file.
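    If a code-based route is still needed server-side without Excel Interop, one common alternative (not mentioned in the original thread, so treat it as a suggestion) is Apache POI, which can read the .xlsx and write the .csv directly, skipping the intermediate .xls. A minimal sketch, with hypothetical file names and no CSV quoting/escaping:

    import java.io.FileInputStream;
    import java.io.PrintWriter;
    import org.apache.poi.ss.usermodel.Cell;
    import org.apache.poi.ss.usermodel.DataFormatter;
    import org.apache.poi.ss.usermodel.Row;
    import org.apache.poi.ss.usermodel.Sheet;
    import org.apache.poi.ss.usermodel.Workbook;
    import org.apache.poi.xssf.usermodel.XSSFWorkbook;

    public class XlsxToCsv {
        public static void main(String[] args) throws Exception {
            // Hypothetical paths; pass the real ones in from the job step.
            try (Workbook wb = new XSSFWorkbook(new FileInputStream("input.xlsx"));
                 PrintWriter out = new PrintWriter("output.csv")) {
                Sheet sheet = wb.getSheetAt(0);          // first sheet only
                DataFormatter fmt = new DataFormatter(); // renders cells as Excel displays them
                for (Row row : sheet) {
                    StringBuilder line = new StringBuilder();
                    for (Cell cell : row) {              // note: iteration skips blank cells
                        if (line.length() > 0) line.append(',');
                        line.append(fmt.formatCellValue(cell));
                    }
                    out.println(line);
                }
            }
        }
    }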

  • What is the best way to read and manipulate large data in Excel files and show it in SharePoint?

    Hi,
    I have a large Excel file that has 700,000 records in it. The file has a few columns that change every day.
    What is the best way to read the data from the Excel file in the fastest and most efficient way?
    2nd problem:
    I have one Excel file with many rows; each row contains data with certain keywords.
    What I want is to segregate the rows into respective sheets (tabs) in the workbook.
    For example, the rows contain the following data:
    1. alfa
    2. beta
    3. gama
    4. beta
    5. gama
    6. gama
    7. alfa
    I want there to be 3 tabs, one for each of the keywords alfa, beta, and gama.

    Hi,
    I don't really see any better options for SharePoint. SharePoint uses another product called 'Office Web Apps' to let users view/edit Microsoft Office documents (Word, Excel, etc.). But the web version of Excel doesn't support that many records, and there are size limitations (the default maximum is probably 10MB).
    Regarding the second problem, I think you need a custom solution (like a SharePoint timer job/web part) to read and present the data.
    However, if you can reduce the Excel file to something near 16k records (the number of rows supported in the web version of Excel), then you can use SharePoint Excel Services to refresh the data automatically in the Excel file in SharePoint from an external source.
    Thanks,
    Sohel Rana
    http://ranaictiu-technicalblog.blogspot.com

  • What's the best way to read JSON data?

    Hi all;
    What is the best way to read in JSON data? And once it's read in, is the best way to use it to turn it into XML and apply XPath?
    thanks - dave

    jtahlborn wrote:
    Without a better understanding of what your definition of "use it" is, this question is essentially unanswerable. Jackson is a fairly popular library for translating JSON to/from Java objects. The JSON website provides a very basic library for parsing to/from XML. Which one is the "best" depends on what you want to do with it.
    Good point. We have a reporting product (www.windward.net) and we've had a number of people ask us for JSON support. But how complex the data is and what they want to pull is all over the place. The one thing that's common is that they generally want to pull down the JSON data and then put specific items from it in the report.
    XML/XPath struck me as a good way to do this for a couple of reasons. First, it seems to map well to the JSON data layout. Second, it provides a known query language. Third, we have a really good XPath wizard, and we could then use it for JSON also.
    ??? - thanks - dave
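    To make the Jackson suggestion concrete, here is a minimal sketch (using modern Jackson 2.x coordinates; the JSON document is invented for illustration). The tree model gives path-style navigation, and JSON Pointer is roughly the XPath analogue Dave is after:

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JsonReadDemo {
        public static void main(String[] args) throws Exception {
            // Invented report data.
            String json = "{\"customer\":{\"name\":\"Acme\",\"orders\":[{\"total\":12.5},{\"total\":7.0}]}}";

            ObjectMapper mapper = new ObjectMapper();
            JsonNode root = mapper.readTree(json);

            // Tree model: navigate node by node, no Java classes needed.
            String name = root.path("customer").path("name").asText();
            double first = root.path("customer").path("orders").get(0).path("total").asDouble();
            System.out.println(name + " " + first);

            // at() takes a JSON Pointer, the closest JSON analogue of an XPath expression.
            System.out.println(root.at("/customer/orders/1/total").asDouble());
        }
    }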

  • What is the best way to read data from an iPhone if you lost your iTunes data after a crash?

    What is the best way to read data from an iPhone if you lost your iTunes data after a crash?

  • I have a PPC iMac (10.4.11) and will shortly buy a new iMac. What is the best way to transfer all my HD data from old to new? Thank you!

    I have a PPC iMac (10.4.11) and will shortly buy a new iMac. What is the best way to transfer all my HD data from old to new? Thank you!

    Migrating from PPC Macs to Intel Macs:
    https://discussions.apple.com/docs/DOC-2295
    How to use Migration Assistant:
    http://support.apple.com/kb/HT4413?viewlocale=en_US
    http://support.apple.com/kb/TS1963
    Troubleshooting Firewire target disk mode:
    http://support.apple.com/kb/HT1661

  • My Mum (a pensioner) wants to purchase both an iPhone and an iPad. What is the best way for her to manage her data/calls/texts etc.? Obviously the cost needs to be as low as possible. She currently does not have WiFi but uses a dongle.

    My Mum (a pensioner) wants to purchase both an iPhone and an iPad. What is the best way for her to manage her data/calls/texts etc.? Obviously the cost needs to be as low as possible. She currently does not have WiFi but uses a dongle.

  • What are the best software programs I can use to read, write, and modify data/files on an external HD (NTFS format, i.e. Windows)?

    Hi guys,
    I'm new to Mac and have a MacBook Pro with Lion OS (10.6.8 I think!!!) and Parallels 7 (Windows 7) installed. Can someone please tell me the best software program I can use to read, write, and modify data/files on an external HD (NTFS format) from the Mac OS? I have heard of Paragon and Tuxera NTFS. Are they free? Are they good? Are there any other software programs out there? I have heard that some people have issues with Paragon.
    Thanks.

    Your best bet would be to take the drive to the oldest Windows PC that is compatible with it, grab the files off, right-click and format it as exFAT (XP users can download exFAT support from Microsoft), and then put the files back on.
    Macs can read and write all Windows file formats except writing to NTFS (and in some cases not even reading it), so if you can change the format of the drive to exFAT (all data has to be removed first), you will have a drive that doesn't require paid third-party NTFS software (a license fee goes to Microsoft) for updates.
    It's also one less hassle to deal with.
    Drives, partitions, formatting w/Macs + PCs

  • What's the best way for reading this binary file?

    I've written a program that acquires data from a DAQmx card and writes it to a binary file (attached file and picture). The data I'm acquiring comes from 8 channels, at 2.5 MS/s, for at least 5 seconds. What's the best way of reading this binary file, knowing that:
    - I'll need it also in graphs (only after acquiring)
    - I also need to see these values and use them later in MATLAB.
    I've tried "Array to Spreadsheet String", but LabVIEW runs out of memory (even if I don't use all 8 channels, but only 1).
    LabVIEW 8.6
    Attachments:
    AcquireWrite02.vi (15 KB)
    myvi.jpg (55 KB)

    But my real problem, at least for now, is how I can divide the information to get not just one graph but eight.
    I can read the file, but I get this (with only two channels):
    So what I tried was, using a for loop, saving 250 elements into different arrays and then writing them to the .txt file. But it doesn't come out right... I used 250 because that's what I got from the graph: every 250 points it plots the other channel.
    Am I missing something here? How should I treat the information coming from the binary file, if not the way I'm doing it?
    (Attached are the .vi files I'm using to save in the .txt format; a sketch of the de-interleaving logic follows after the attachments.)
    (EDITED: I just saw that I was dividing my graph's data by 4 just before plotting it... so it isn't 250 but 1000 elements for each channel... Still, the problem has not been solved.)
    Message Edited by Danigno on 11-17-2008 08:47 AM
    Attachments:
    mygraph.jpg (280 KB)
    Read Binary File and Save as txt - 2 channels - with SetFilePosition.vi (14 KB)
    Read Binary File and Save as txt - with SetFilePosition_b_save2files_with_array.vi (14 KB)
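    (A rough Java sketch of the de-interleaving bookkeeping discussed above, since the attachments are LabVIEW VIs: the writer appends one block of samples per channel in turn, so the reader routes each block to its channel. Big-endian doubles and 1000 samples per block are assumptions taken from the edit; an in-memory stream stands in for the real file.)

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.EOFException;
    import java.util.ArrayList;
    import java.util.List;

    public class DeinterleaveChannels {
        static final int CHANNELS = 2; // two channels, as in the screenshot
        static final int BLOCK = 1000; // samples per channel per write (from the edit)

        public static void main(String[] args) throws Exception {
            // Build a tiny stand-in "file" in memory: channel 0 = 0.0, channel 1 = 1.0.
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            DataOutputStream dos = new DataOutputStream(bos);
            for (int rep = 0; rep < 3; rep++) {
                for (int ch = 0; ch < CHANNELS; ch++) {
                    for (int i = 0; i < BLOCK; i++) {
                        dos.writeDouble(ch);
                    }
                }
            }

            // Read it back, routing each block of samples to its channel's list.
            List<List<Double>> channels = new ArrayList<>();
            for (int ch = 0; ch < CHANNELS; ch++) channels.add(new ArrayList<>());
            try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
                int ch = 0;
                while (true) {
                    for (int i = 0; i < BLOCK; i++) {
                        channels.get(ch).add(in.readDouble()); // big-endian doubles assumed
                    }
                    ch = (ch + 1) % CHANNELS;                  // next block -> next channel
                }
            } catch (EOFException endOfFile) {
                // all blocks consumed
            }
            System.out.println("ch0: " + channels.get(0).size() + " samples, ch1: "
                    + channels.get(1).size() + " samples");
        }
    }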

  • Best way to read chars from InputStream

    I hope this isn't too much of a newbie question.
    Suppose I have an unbuffered InputStream, inputStream. What is the best way to read chars from it (in terms of performance)?
    Reader reader = new BufferedReader(new InputStreamReader(inputStream));
    reader.read()
    or
    Reader reader = new InputStreamReader(new BufferedInputStream(inputStream))
    reader.read()
    Is there a difference between the two and if so, which one is better?
    thanks.

    If you are reading using a buffer of your own, then adding a buffer for binary data is a bad idea.
    However, for text, using a BufferedInputStream could be better, as it reduces calls to the OS.
    If it really matters, I suggest you do a simple performance test which runs for at least a few seconds to see what the difference is. (You should run the test more than once.)
    Edited by: Peter__Lawrey on 20-Feb-2009 21:37
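    A minimal sketch of such a test, timing the two arrangements from the question over an in-memory stream (real file or network I/O behaves differently, and as noted the test should be run more than once):

    import java.io.BufferedInputStream;
    import java.io.BufferedReader;
    import java.io.ByteArrayInputStream;
    import java.io.InputStreamReader;
    import java.io.Reader;
    import java.nio.charset.StandardCharsets;

    public class BufferPlacementTest {
        public static void main(String[] args) throws Exception {
            byte[] text = "hello world\n".repeat(500_000).getBytes(StandardCharsets.UTF_8);

            // Variant 1: decode bytes to chars first, then buffer the chars.
            time("BufferedReader(InputStreamReader)", new BufferedReader(
                    new InputStreamReader(new ByteArrayInputStream(text), StandardCharsets.UTF_8)));

            // Variant 2: buffer the bytes, then decode.
            time("InputStreamReader(BufferedInputStream)", new InputStreamReader(
                    new BufferedInputStream(new ByteArrayInputStream(text)), StandardCharsets.UTF_8));
        }

        static void time(String label, Reader reader) throws Exception {
            long start = System.nanoTime();
            int count = 0;
            while (reader.read() != -1) count++;   // single-char reads, as in the question
            reader.close();
            System.out.printf("%-42s %d chars in %d ms%n",
                    label, count, (System.nanoTime() - start) / 1_000_000);
        }
    }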

  • Best way to read from a file

    What would be the best way to read from a file? Which classes do I need to use?
    I have to write a program which reads data from a comma-separated flat file, parses it, and, after applying some business logic, inserts it into a database.
    I will have to read the data line by line.
    Any help?

    I would use:
         public void readData() {
              try {
                   data = new String[this.countRows("comp.txt")][];
                   BufferedReader br = new BufferedReader(new FileReader("comp.txt"));
                   for (int x = 0; x < data.length; x++) {
                        // The original post tokenized on "?"; for the comma-separated
                        // file described in the question, use "," instead.
                        StringTokenizer temp = new StringTokenizer(br.readLine(), "?");
                        data[x] = new String[temp.countTokens()];
                        for (int y = 0; y < data[x].length; y++) {
                             data[x][y] = temp.nextToken();
                        }
                   }
                   br.close();
              } catch (Exception e) {
                   System.out.println(e.toString());
              }
         }

         private int countRows(String f) {
              int t = 0;
              try {
                   BufferedReader brCountRows = new BufferedReader(new FileReader(f));
                   while (brCountRows.readLine() != null) {
                        t++;
                   }
                   brCountRows.close();
              } catch (Exception e) {
                   System.out.println(e.toString());
              }
              return t;
         }
    It works deliciously!

  • Best way to back up iPhoto and iTunes?

    I just purchased a 2TB Time Capsule, and after reading additional reviews I am reconsidering my purchase. What is the best way to back up iPhoto and iTunes? The idea of backing up the data wirelessly and seamlessly with Time Machine was what encouraged me to purchase Apple's solution. However, I am seeing that a lot of TCs are dying after their warranty expires, which is a deal breaker, as I have a newborn and do not want to lose pictures.
    My 2009 MBP's drive is nearly full, and my photos eat storage. I would like a solution that allows me to access previous photos etc. without relying on the MBP HD in the event it is stolen, crashes, etc.
    The same holds true for iTunes, which I rarely use now that I have Spotify; however, I do not want to lose my iTunes catalog in two years either. It would be great to have it off my MBP HD but readily accessible if necessary.
    The literature on purchasing cloud storage is pretty scarce (iCloud? Amazon?).
    Any insight is greatly appreciated, and I am sure I will have follow-up questions, as I need to resolve this soon now that I have a two-week window to return the TC.
    Thank you!

    Welcome to Apple Discussions!
    Presumably, the old iMac is pre-FireWire? If it is, it really would be very slow for backing up. You can use the methods on my FAQ to network the two machines:
    http://www.macmaps.com/network9X.html
