Best way to parse data

Hi, I'm fairly new to Java programming, coming from a midrange COBOL background.
I need to take data from a legacy program and use it in an online Java program. The data is stored in a table that occurs 1-25 times:
05  DATA-TABLE OCCURS 25 TIMES.
       10  FIELD1                       PIC X(25).
       10  FIELD2                       PIC X(25).
       10  FIELD3                       PIC X(50).
So, I could have between 100 and 2500 bytes of data to deal with in the Java program. Can anyone point me in the right direction on the best way to handle this? I thought about creating an initial array that has 25 elements and then substring-ing the returned data into that. Then, what would be the best way to break that data down into its individual components?
If you have any solutions or pointers, I would definitely appreciate it.
Thanks!
bfrmbama

I would suggest that you start by giving meaning to your data and putting it in classes. For example, if your data is about pet animals and you have the animal's name, nick, and description:
package example;

/**
 * @author leonardo     12/11/2004
 * @version 1.0
 */
public class Pet {

    private String name;
    private String nick;
    private String description;

    /**
     * Create a pet from the string with the triplet attributes.
     * @param triplet The triplet data.
     */
    public Pet( String triplet ) {
        tokenize( triplet );
    }

    /**
     * Method for tokenizing the pet data.
     * @param triplet the data.
     */
    private void tokenize(String triplet) {
        /*
         * I am assuming the data is space separated;
         * look at the API to see a complete usage of
         * split and maybe StringTokenizer.
         */
        String[] strings = triplet.split(" ");
        name = strings[0];
        nick = strings[1];
        description = strings[2];
    }

    /*
     * It's always a good idea to encapsulate your data
     * and give access to it through accessors and mutators.
     */
    public String getDescription() {
        return description;
    }
    public void setDescription(String description) {
        this.description = description;
    }
    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }
    public String getNick() {
        return nick;
    }
    public void setNick(String nick) {
        this.nick = nick;
    }
}
Then when you tokenize your file you can do:
// ... Incomplete ...
// I am assuming you will write a CobolFileReader that implements the file reading and
// line extraction logic; when there is no more data to be read it returns null.
List list = new ArrayList();
CobolFileReader cobolFileReader = new CobolFileReader("TheFile.txt");
for (String triplet = cobolFileReader.getNextTriplet(); triplet != null; triplet = cobolFileReader.getNextTriplet()) {
    list.add( new Pet( triplet ) );
}
This way you will have a list as big as it needs to be.
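One more note: since the COBOL record is fixed-width rather than delimited, a substring-based parse (as the original poster suggested) may fit better than split(). Here is a minimal sketch using only the field widths from the copybook; the class and method names are placeholders:
import java.util.ArrayList;
import java.util.List;

public class FixedWidthParser {

    // Field widths from the copybook: PIC X(25), PIC X(25), PIC X(50) = 100 bytes per occurrence.
    private static final int F1 = 25;
    private static final int F2 = 25;
    private static final int F3 = 50;
    private static final int ENTRY = F1 + F2 + F3;

    /** Splits the raw record (100-2500 bytes) into one String[3] per occurrence. */
    public static List<String[]> parse(String buffer) {
        List<String[]> entries = new ArrayList<String[]>();
        for (int off = 0; off + ENTRY <= buffer.length(); off += ENTRY) {
            String field1 = buffer.substring(off, off + F1).trim();
            String field2 = buffer.substring(off + F1, off + F1 + F2).trim();
            String field3 = buffer.substring(off + F1 + F2, off + ENTRY).trim();
            entries.add(new String[] { field1, field2, field3 });
        }
        return entries;
    }
}
Trimming matters because COBOL pads PIC X fields with spaces; each String[] could just as easily be mapped into a domain class like the Pet example above.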

Similar Messages

  • What's the best way to parse a BSTR in TestStand?

    I receive a BSTR from an OLE interface; what is the best way to parse this into something understandable? Has anyone written a routine for this?

    Hi,
    If you are using an OLE interface, I think you will be using the ActiveX Automation Adapter.
    The TestStand ActiveX Automation Adapter automatically converts BSTRs when it stores the data in a TestStand variable.
    You should then be able to use the regular string functions available in TestStand to parse the strings.
    Please let me know if you have any more questions.
    Regards
    Anand Jain

  • What is the best way to read data from an iPhone if you lost your iTunes data after a crash?

    What is the best way to read data from an iPhone if you lost your iTunes data after a crash?

  • Best way to import data to multiple tables in oracle d.b from sql server

    Hi all, I am a newbie to Oracle.
    What is the best way to import data into multiple tables in an Oracle database from SQL Server?
    1) Linked server?
    2) SSIS?
    If possible, share the query to do this task using a linked server.
    Regards,
    KoteRavindra.

    check:
    http://www.mssqltips.com/sqlservertip/2011/export-sql-server-data-to-oracle-using-ssis/
    Why so many unresolved questions? Remember to close your threads by marking them as answered.

  • What could be the best way to Export data from 11.5.8 instance to 12.1.2?

    Hi All
    What could be the best way to Export data from 11.5.8 instance to 12.1.2?
    Release: 11.5.8
    OS: Oracle Solaris on SPARC (32-bit) version 9
    DB: 9.2.0.1
    Thanks in advance

    What kind of data are you looking to move?
    Database export/import is only supported for full database export/import and the application release should be the same on the source/target nodes.
    You can move the setup using iSetup or FNDLOAD.
    Thanks,
    Hussein

  • What is the best way to transfer data from a PC to an iMac?

    What is the best way to transfer data from a PC to an iMac?

    If you know how to set up a computer-to-computer Ethernet network, then you can give that a try, but a hard drive will be faster than Ethernet unless you don't have a lot to transfer.
    Mac OS X 10.6 Help- Creating a computer-to-computer network

  • Best ways to view data, total records of an application table ie VBAK

    Hi all,
    What is the best way to view the data of an application table in the source system?
    I know about SE16, but are there other ways to get details, i.e. the total number of records and field information, for an
    application table such as VBAK in the source R/3 system?
    Also, using SE16, when I checked VBAK and clicked on "Number of Entries", it showed 0; however,
    when I checked directly from SQL*Plus I found about 5000 records in VBAK. I am not sure why
    SE16 showed 0. Does anybody have any idea what I missed here?
    Thanks... will give points for your input.
    ak

    I tried "number of enteried" on se16 and it shows 0 enteries without any selection criterion...i cheked by putting relevant time range as well but it shows 0...
    As i told that when i checked VBAK separately via logging to database directly then i did find 5678 rows there.
    Please note that this is a new demo version....so i thought that i first need to activate the table which i did using tcode SE11. Now the VBAK table is active but still via SE16 shows 0 nuber of enteris....
    Can anybody please advise here..
    Thx
    ak

  • Best way to export data with r.t. prompts and have dense dim mbrs on rows?

    Hi All-
    What is the best way to export data with Run time prompts out of Essbase?
    One thought was to use Business Rules with run time variables and DATAEXPORT command, but I came across at least one limitation where I cannot have months (part of dense Time Periods dimension) on rows.
    I have only two dense dimensions, Accounts and Time Periods, and I need both of these on rows. This would come in handy when the user enters start and end month and year for the data to be exported, e.g. if the start period is Feb 2010 and the end is Jan 2011, I get data for all months in 2010 and 2011. Currently the export looks like this:
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000",14202.24,14341.62,14560,13557.54,11711.92,10261.58,12540.31,15307.83,16232.88,17054.62,18121.76,18236
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000",19241,21372.84,21008.4,18952.75,23442.13,19938.18,22689.61,23729.29,22807.48,23365,23915.3,24253
    "CORP1","0173","FY11","Working","Budget","Local","HSP_InputValue","404000",21364,22970.37,23186,27302,25144.38,27847.91,27632.11,29007.39,24749.42,27183.39,26599,27112.79
    where ideally I would need to get the following:
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Feb",14341.62
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Mar",14560
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Apr",13557.54
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","May",11711.92
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Jun",10261.58
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Jul",12540.31
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Aug",15307.83
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Sep",16232.88
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Oct",17054.62
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Nov",18121.76
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Dec",18236
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Feb",21372.84
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Mar",21008.4,
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Apr",18952.75
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","May",23442.13
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Jun",19938.18
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Jul",22689.61
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Aug",23729.29
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Sep",22807.48
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Oct",23365
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Nov",23915.3
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Dec",24253
    "CORP1","0173","FY11","Working","Budget","Local","HSP_InputValue","404000","Jan",21364
    Thank you in advance for any tips.

    Have a read of the following post :- export data to sql
    It may give you a further option.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Best way to read data sources in parallel

    Hi,
    I'm looking for conceptual help as I start a project. I am trying to figure out the best way to get data from several sources at different timings and deliver them to a main vi.
    I have 4 systems, which each work well on their own (OK, one doesn't work yet, but let's assume that can be fixed).
    One system reads from 2 pH meters on serial ports. The meters are slow to respond, so it takes about 2 minutes to read 4 channels of data. I save these data to a file every 10 minutes.
    One system reads from a CO2 meter on the USB port. It reads the data every second, and does some averaging. Every 2-10 minutes, it saves the average to a file and then sends a command to the parallel port to switch the input to the meter.
    The third system reads from 6 valves, each on a serial port. These also take time, probably several minutes to poll all 6. These data will also be saved to a file.
    The 4th system reads a bank of temperature probes on the USB port. These get polled every few seconds and saved to a file every few minutes.
    Now that these individual routines are working, I am trying to create a front end that will display all the data in one place and allow me to set all the parameters from a single place. I would also like the possibility of using the data from one source at another place (for instance, having the output of the temperature probes sent to the pH meters to adjust their calibration). At this point, I get confused as to the best way to proceed.
    It seems like if I just want to read the data from each source, I could simply put all 4 routines together in a single vi (oh, what a mess that would be to read). Maybe I should start this way?
    However, if I want to have any communication between the different data sources, it seems like I will either need to use queues or VI server. I sort of envision a vi that lets me configure the various ports and the file operations and then can turn on monitoring of any or all of the various inputs. Each of them will do their thing at their own time and the main routine will simply display whatever data they deliver whenever they have new data. Fortunately, nothing is particularly time-critical, nor does it need to run fast.
    My questions: Am I correct in how I'm thinking about getting this to work? Is there a clear choice between queues and VI Server? I've looked at several examples of each, but without having done something like this before, it is hard for me to tell which is better.
    Thanks for any suggestions.
    mike

    Hi Mike,
    I think that you are on the right track with your thinking process. You might be able to implement this using queues. I'm not exactly sure how you would do it with VI server since it is just a set of functions that allows you to dynamically control front panel objects, VIs, and the LabVIEW environment. However, there are some great resources available with using queues for this type of application. I'm including the link to another discussion forum that had a very similar question to yours. There is a good example of using queues within this forum post. Also, there is a great example in NI Developer Zone about using queues and some other good ones in the NI Example Finder (just search 'queues' and you should get a few results). I hope this helps!
    Carla
    National Instruments
    Applications Engineer
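    The queue arrangement described above is essentially a producer/consumer pattern: each acquisition loop pushes readings onto a queue, and the main display loop pops them whenever data arrives. As an illustration of the pattern only (sketched in Java rather than LabVIEW, since the original thread in this digest is Java; all names are made up):
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class QueueDemo {
        public static void main(String[] args) throws InterruptedException {
            // One queue per instrument loop; the display loop drains them all.
            final BlockingQueue<String> phReadings = new ArrayBlockingQueue<String>(100);

            // Acquisition loop (producer): posts a reading every so often.
            Thread acquirer = new Thread(new Runnable() {
                public void run() {
                    try {
                        for (int i = 0; i < 5; i++) {
                            phReadings.put("pH reading " + i);
                            Thread.sleep(200);
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
            acquirer.start();

            // Display loop (consumer): handles data whenever it arrives.
            for (int i = 0; i < 5; i++) {
                System.out.println(phReadings.take());
            }
            acquirer.join();
        }
    }
    In LabVIEW the same roles would be played by the queue VIs Carla mentions, with one producer loop per instrument and a single consumer loop for display and logging.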

  • Best way to pass data sets to another program

    Hey
    I want to connect another (maths) program to my Java application. For that I need to pass data (some kind of tab-separated table) to this program.
    Right now I save the data in a separate, newly generated file and pass a command to this program with Java's Runtime.exec() method so it can read the data. Is this a good idea, or might there be better ways?
    If I do so, is there a way in Java to generate some kind of "temporary" file which will be deleted automatically after use, or is this no different from saving it in a common file and deleting it afterwards? What's the best way to pass data generally?
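    A minimal sketch of the temp-file idea, assuming the external maths program accepts the file path as a command-line argument (the program name and the sample data are placeholders):
    import java.io.File;
    import java.io.FileWriter;

    public class PassDataSet {
        public static void main(String[] args) throws Exception {
            // Temporary file; the JVM removes it on normal exit.
            File temp = File.createTempFile("dataset", ".tsv");
            temp.deleteOnExit();

            // Write the tab-separated table.
            FileWriter out = new FileWriter(temp);
            out.write("x\ty\tz\n");
            out.write("1\t2\t3\n");
            out.close();

            // Hand the file to the external program (placeholder name) and wait for it.
            Process p = Runtime.getRuntime().exec(new String[] { "mathsprog", temp.getAbsolutePath() });
            p.waitFor();
        }
    }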

    Well, the connection will not be over a network, so I'd rather think it's not a Socket or RMI problem (unless someone convinces me).
    Yes, it's very much external; it's a program written in C or C++ and I don't have any source code. So far I have generated a file for the input command and data, passed that to the maths program, and returned the output into another file.
    Now I would like to separate the output and obtain some tables and graphical things like charts. Do I have to generate three different types of output files? How do I store graphics, e.g. some distributions? I even thought of generating a database. I never thought about XML; I don't know if it works for this kind of problem.

  • Best Way to Load Data in Hash Partition

    Hi,
    I have hash partitioning on a large table of 5 TB. We have to load more than 500 GB of data daily into that table from ETL.
    What is the best way to load data into that big table with hash partitions?
    Regards
    Sahil Soni

    Do you have any specific requirements to match records to lookup tables, or is it just a straight load, that is, an insert?
    Do you have any specific performance requirements?
    The easiest and fastest way to load data into Oracle is via an external file and parallel query/parallel insert. Remember that parallel DML is not enabled by default; you have to enable it via an ALTER SESSION command. You can leverage multiple CPU cores and direct-path operations to perform the load.
    Assuming your database is on a Linux/Unix server, you could load the file over NFS if it is on a remote system, but then you will most likely be limited by network transfer speed.
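    For illustration only, a rough JDBC sketch of the sequence described above; the connection details and table names are placeholders, and the SQL shown (ALTER SESSION ENABLE PARALLEL DML plus a direct-path parallel INSERT ... SELECT from an external table) follows standard Oracle syntax:
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class ParallelLoad {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details; requires the Oracle JDBC driver.
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "etl_user", "secret");
            conn.setAutoCommit(false);
            Statement stmt = conn.createStatement();

            // Parallel DML is disabled by default for a session.
            stmt.execute("ALTER SESSION ENABLE PARALLEL DML");

            // Direct-path (APPEND) parallel insert from an external table (placeholder names).
            stmt.executeUpdate(
                "INSERT /*+ APPEND PARALLEL(big_table, 8) */ INTO big_table " +
                "SELECT * FROM ext_stage_table");

            conn.commit();
            stmt.close();
            conn.close();
        }
    }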

  • Best way to mark data that has changed when it is edited in  a spark datagrid?

    I am not sure if this is the best way to mark data that has changed (isModifiedClientSide ) when it is edited in  a spark datagrid:
    <GridColumn  width="140" headerText="Margin "dataField="margin" editable="false"  editable.editMode="true"  >
                        <itemRenderer>
                            <fx:Component>
                                <DefaultGridItemRenderer textAlign="right"   background="true" backgroundColor="#ffffff"  alpha="1.0" color="#000000" >
                                    <focusOut>
                                        <![CDATA[
                                        this.data.isModifiedClientSide = 1;
                                        ]]>
                                    </focusOut>
                                </DefaultGridItemRenderer>
                            </fx:Component>
                        </itemRenderer>
                    </GridColumn>

    I tried this, but the valueCommit doesn't get fired:
        <GridColumn  width="140" headerText="Margin (disabled)" headerText.editMode="Margin (editing)" dataField="margin" editable="false"  editable.editMode="true"  >
                        <itemEditor>
                            <fx:Component>
                                <DefaultGridItemEditor>
                                    <valueCommit>
                                        <![CDATA[
                                        this.data.isModifiedClientSide = 1;
                                        ]]>
                                    </valueCommit>
                                </DefaultGridItemEditor>
                            </fx:Component>
                        </itemEditor>
                    </GridColumn>
    what do you think?

  • Best way to extract data from archived cube

    Hello Experts,
    Can anyone tell me the best way to extract data from an archived cube?
    Basically I am trying to pull all the data from the archived cube and then load it into another brand new InfoProvider which is in a different box.
    Also I need to extract all the master data for all infoobjects.
    I have two options in my mind:
    1) Use open hub destination
    or
    2) Infoprovider>display data>select the fields and download the data.
    Is it really possible to extract data using option (2) if the record count is very high, and then load it into another InfoProvider in the new system?
    Please suggest me the pros and cons for the two options.
    Thanks for your time in advance.

    Hello Reddy,
    Thanks a lot for your quick reply.
    Actually, in my case I am trying to extract the archived InfoCube data and then load it into a new InfoProvider which is in a different system. If I had connectivity I could simply export a DataSource from the archived InfoCube and then reload it into the new InfoProvider.
    But there is no connectivity between those two systems (where the archived cube is and where the new InfoProvider is), so I am left with the two options I mentioned.
    1) Use Open Hub
    or
    2) Extract data manually from infoprovider into excel.
    Can anyone let me know which of the two options is best? I also have doubts about using Excel for extracting data, as Excel has a limit of 65,536 records.
    Thanks
    Edited by: saptrain on Mar 12, 2010 6:13 AM

  • Best way to export data for a migration

    Hi Oracle Community,
    What's the best way to export data from an Oracle 8i database for it to be suitable for import into an Oracle 10g database?
    What's the best way to export data if it is to go into a different RDBMS?
    Thanks, David

    Thanks everyone for all your help. You guys are great.
    There seem to be many good ways to export your data from Oracle into a flat-file format suitable for import into other RDBMSs: Oracle, MySQL, PostgreSQL, etc.
    A few tools were mentioned, but using SQL*Plus, which comes with Oracle (and SQL*Loader on the back end, which also comes with Oracle), seems the most straightforward.
    I found this script on asktom.oracle.com to work great, slightly modified here
    (to include the maximum linesize, and pipes rather than commas):
    set echo off newpage 0 space 0 pagesize 0 feed off head off trimspool on
    set linesize 32767
    spool payment.txt
    select
    PAYMENT_ID||'|'||
    USER_ID||'|'||
    <more fields here>
    from
    payment;
    spool off
    exit ;
    It works great. Rather than making one of these for each table, I wrote a Perl script called ora_export: http://crowfly.net/oracle/ora_export. It runs on Unix and only requires SQL*Plus. It creates these four files:
    <tablename>.def # list of table columns and types (SQL*Plus DESC)
    <tablename>_dump.sql  # a script to export the data
    <tablename>.psv # THE DATA (eq. - name|address|etc)
    <tablename>_load.ctl  # SQL*LDR control for ORCL if you need it.

  • Best way to move data and programs to another profile on same Mac?

    Hello,
    What is the best way to move data and programs to another profile on the same Mac? I have a user whose profile is corrupt. I know that most programs will work on both the new and old profiles; however, when trying to copy the Desktop or Documents folder I am getting permission-denied errors.
    Sort of like weeding a garden, I'm hoping I do not have to pick through the data in each folder and copy it individually.
    Thanks for your help!
    Johnathon

    This usually means that a configuration or preference file is corrupted.  In this user's /Home/Library/Preferences/ folder locate any preference files associated with iLinc and drag to the Trash.
    I would also check in the /Home/Library/Caches/ folder for a file or folder associated with iLinc and delete as well.
    See if the problem is resolved in the user's normal account.
    It's not that you cannot copy data from account to account, but doing so causes a lot of permissions issues that must be resolved.  The MacFixit article I linked above shows what you need to do after transferring from one account to another in order to change permissions on the "foreign" files to those of the destination account.
