Writing to the same file

So I am trying to write the output of two different Java class files to one txt file while the program runs.  The file name is determined before the program is run, passed through the command prompt as an argument.  How can I get the second class file to write to the same txt file without running into compile errors?
For right now I'm just going to send everything the second file outputs to a message String variable, so that the Main class writes it to the text file.  I still want to learn how to write to the same text file directly from the second class file.  I don't need the code written, just a couple of pointers in the right direction.
package Week7.Project3;
import java.io.*;
public class Test{
    public static void main(String[] args) throws IOException{
        int x;
        //create a new file internally called doc, but externally labelled with the user's input
        File doc = new File(args[0]);
        if (doc.exists()){
            System.out.println("File exists");
            System.exit(1);
        }
        //creates a new PrintWriter that writes to the doc file
        PrintWriter write = new PrintWriter(doc);
        //Creates 10 instances of the Banjo
        write.println("Let's gather 10 Banjos.");
        Banjo[] instruments = new Banjo[10];
        for (x = 0; x < instruments.length; x++){
            instruments[x] = new Banjo();
        }
        //Tunes all 10 Banjos
        write.println("\nTuning the 10 Banjos.");
        for (x = 0; x < instruments.length; x++){
            instruments[x].tunedBanjo();
        }
        //Gives the even Banjos resonators
        write.println("\nLets give the even Banjos resonators.");
        for (x = 1; x < instruments.length; x += 2){
            instruments[x].setResonator();
        }
        //Starts playing the banjos
        write.println("\nLets start playing the Banjos.");
        for (x = 0; x < instruments.length; x++){
            instruments[x].setPlayBanjo();
        }
        //Stops playing the banjos
        write.println("\nLets stop playing the Banjos.");
        for (x = 0; x < instruments.length; x++){
            instruments[x].stopBanjo();
        }
        write.close();
    }
}
Second file:
package Week7.Project3;
import java.util.Arrays;
import java.io.*;
public class Banjo{
    private int numberStrings = 5;
    private String banjoName = "Banjo "; //first part of each banjo's name
    private String stringNotes[] = {"D", "B", "G", "D", "G"};
    private String arg;
    private boolean tunedStatus; //Is the banjo tuned?
    private boolean playStatus; //Is the banjo playing?
    private boolean resonator; //Open back or resonator
    private static int banjoNumber = 0; //helps name banjos
    public Banjo(){
        tunedStatus = false;
        playStatus = false;
        resonator = false;
        banjoNumber += 1;
        banjoName += banjoNumber;
        //'write' is not declared anywhere in this class, so this is where the compile error appears
        write.println(banjoName + " created. \n\t It has " + numberStrings + " strings." +
                "\n\t Notes: " + Arrays.toString(stringNotes) + "\n\t Not tuned with open back.");
    }
    public void tunedBanjo(){
        tunedStatus = true;
        write.println(banjoName + " is now tuned");
    }
    public void setPlayBanjo(){
        playStatus = true;
        write.println(banjoName + " is now playing");
    }
    public void stopBanjo(){
        playStatus = false;
        write.println(banjoName + " has stopped being played.");
    }
    public void setResonator(){
        resonator = true;
        write.println(banjoName + " now has a resonator.");
    }
}

Welcome.
>I had to ... pass that PrintWriter to a local static PrintWriter.
That is only because you have all the code of the Test class in its main method, which is itself static.
If you move the code outside of main, static is no longer a must.
> ...so that all my methods could use it
But since this is a requirement, passing the PrintWriter into a class variable (this.write = write;) is necessary anyhow.
Regards
J.
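
A minimal sketch of what that looks like (the class and variable names are trimmed-down stand-ins, not the original project's code): Test creates one PrintWriter and passes it to each Banjo, whose constructor stores it in an instance field so every method can print through it. A StringWriter stands in for the real output file here.

```java
import java.io.PrintWriter;
import java.io.StringWriter;

// Sketch only: a cut-down Banjo that receives the shared writer.
class Banjo {
    private final PrintWriter write; // stored so all methods can use it (this.write = write)
    private boolean tunedStatus = false;

    public Banjo(PrintWriter write) {
        this.write = write;
        write.println("Banjo created.");
    }

    public void tunedBanjo() {
        tunedStatus = true;
        write.println("Banjo is now tuned");
    }
}

public class SharedWriterDemo {
    public static void main(String[] args) {
        // A StringWriter stands in for the real output file in this sketch.
        StringWriter out = new StringWriter();
        PrintWriter write = new PrintWriter(out, true);
        Banjo banjo = new Banjo(write);   // both classes now print through one writer
        banjo.tunedBanjo();
        write.close();
        System.out.print(out);
    }
}
```

The same pattern works unchanged with `new PrintWriter(new File(args[0]))` in place of the StringWriter.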

Similar Messages

  • Is it possible to read a file while writing the same file at the same time in Java?

    Hi ,
    I am copying an MS Access file (.mdb) from one location to another, but the source file is an online database which changes regularly, so while reading that file I am getting an error like
    FileNotFoundException:
    java.io.FileNotFoundException: c:\Msaccess\db2.mdb (The process cannot access the file because it is being used by another process)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:106)
    at read.main(read.java:16)

    alphabet soup,
    Yes, Windows helpfully prevents two processes from accessing a file simultaneously. Other operating systems, on the other hand, uniformly support it.
    Which probably doesn't help you very much, as I imagine that you can't just up and change the source machine over to *nix, but there you have it.
    You'll just have to close the program which has the database open, copy the database, and restart the program which uses the database.
    keith.
    Message was edited by: corlettk

  • Why do I get an NI-488 error message when writing into a file and at the same time copying this file with backup software like Easy2Sync?

    I have a small LabVIEW program which writes random numbers very fast into an ASCII file. I want that file to be copied to a new location every 10 min. Therefore I use backup/synchronisation software which does a copy operation every 10 min. It works fine for a certain amount of time, and after a while I get either a LabVIEW error (LabVIEW: File already open: NI-488 Command requires GPIB Controller to be System Controller) or a backup-software error (couldn't open file... whatever). I'm guessing it has something to do with file access, but I don't know why. If I run the LabVIEW program and copy and paste the random-number file with Windows Explorer very fast (pressing Ctrl+V rapidly) while LabVIEW is still writing into this file, no error appears. Can somebody help me?
    LabVIEW 2011

    Hi Serdj,
    you don't get a GPIB error; that error number just has two different explanations...
    Well, you have two programs accessing the same file. One program just wants to make a copy, while the other (LabVIEW) is trying to write to the file. When you copy a file that is being written to, you get inconsistent results! That's why one of the two programs reports an error...
    You can't have write and read access at the same time! (But you can have more than one read access at the same time...)
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome

  • Writing to a file and Reading from it "AT THE SAME TIME"

    Hello,
    If you have a VI (vi 1) that is generating and storing data to a file, is it possible to write ANOTHER VI (vi 2) that reads & plots the data located in the same file? (BTW, vi 1 will still be storing data to the file while vi 2 is reading it.)
    All I am trying to do is plot data that is generated by a different VI (vi 1) somewhat in "real time". Unfortunately, I cannot access the VI (vi 1) that is generating the data and add a diagram to it. I have to build my own separate VI (vi 2). Is this possible?
    Thanks in advance

    Once you open the file, you can't access it again until the original reference is closed. You might want to look at a functional global architecture: you use a case structure to determine how to access the data. Initialize, write, read, and save are common examples of sections inside the functional global.
    http://zone.ni.com/devzone/conceptd.nsf/webmain/D2​E196C7416F373A862568690074C759

  • How to read the data file and write into the same file without a temp table

    Hi,
    I have a requirement as below:
    We are running the lockbox process for several businesses, but for a few businesses we receive a flat file in a format different from how the transmission format is defined.
    This is a 10.7 to 11.10 migration. In 10.7 the users are using a custom table into which they first load the raw data, run a PL/SQL validation on it, load it into a new flat file, and then run the lockbox process.
    But in 11.10 we want to avoid using a temp table; how can we achieve this?
    Can we read the file first, do the validations accordingly, then write to the same file and process the lockbox?
    Any inputs are highly appreciated.
    Thanks & Regards,
    Lakshmi Kalyan Vara Prasad.

    Hello Gurus,
    Let me tell you about my requirement clearly with an example.
    Problem:
    I am receiving a dat file from the bank in the below format:
    105A371273020563007 07030415509174REF3178503 001367423860020015E129045
    In this detail-1 record, the 15 characters starting at the 38th character are the merchant reference number.
    REF3178503 --- REF denotes it as a Sales Order
    ACC denotes it as a Customer No
    INV denotes it as a Transaction Number
    My validation is based on these 15 characters.
    If I see REF, I need to pick that complete record, fill it with the SO details as per my system, and then submit the file for lockbox processing.
    In 10.7 they created a temporary table into which they load the data using a control file. Once the data is loaded into the temporary table, they do a validation, update the record exactly as required, create another file, and then submit that file for lockbox processing.
    Whereas in 11.10 they want to bypass these temporary tables and write into a different file.
    Can this be handled by writing a PL/SQL procedure?
    My findings:
    Maybe I am wrong, but I think that if we first get the data into the ar_payments_interface_all table, then do the validations, and then complete the lockbox process, it may help.
    Any suggestions from Oracle GURUS is highly appreciated.
    Thanks & Regards,
    Lakshmi Kalyan Vara Prasad.

  • Expdp with parallel writing to only one file at a time on the OS

    Hi friends,
    I am facing a strange issue. Despite the parallel=x parameter, expdp is writing to only one file at the OS level at a time, although it is writing into multiple files sequentially (not concurrently).
    On other servers I see that expdp is able to start writing to multiple files concurrently. Below is a sample log from my expdp.
    ++++++++++++++++++++
    Export: Release 10.2.0.3.0 - 64bit Production on Friday, 15 April, 2011 3:06:50
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Starting "CNVAPPDBO4"."EXPDP_BL1_DOCUMENT": CNVAPPDBO4/********@EXTTKS1 tables=BL1_DOCUMENT DUMPFILE=DUMP1_S:Expdp_BL1_DOCUMENT_%U.dmp LOGFILE=LOG1_S:Expdp_BL1_DOCUMENT.log CONTENT=DATA_ONLY FILESIZE=5G EXCLUDE=INDEX,STATISTICS,CONSTRAINT,GRANT PARALLEL=6 JOB_NAME=Expdp_BL1_DOCUMENT
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 23.93 GB
    . . exported "CNVAPPDBO4"."BL1_DOCUMENT" 17.87 GB 150951906 rows
    Master table "CNVAPPDBO4"."EXPDP_BL1_DOCUMENT" successfully loaded/unloaded
    Dump file set for CNVAPPDBO4.EXPDP_BL1_DOCUMENT is:
    /tksmig/load2/oracle/postpaidamd/DUMP1_S/Expdp_BL1_DOCUMENT_01.dmp
    /tksmig/load2/oracle/postpaidamd/DUMP1_S/Expdp_BL1_DOCUMENT_02.dmp
    /tksmig/load2/oracle/postpaidamd/DUMP1_S/Expdp_BL1_DOCUMENT_03.dmp
    /tksmig/load2/oracle/postpaidamd/DUMP1_S/Expdp_BL1_DOCUMENT_04.dmp
    Job "CNVAPPDBO4"."EXPDP_BL1_DOCUMENT" successfully completed at 03:23:14
    ++++++++++++++++++++
    uname -a: HP-UX ocsmigbrndapp3 B.11.31 U ia64 3522246036 unlimited-user license
    Is it hitting any known bug? Please suggest.
    regds,
    kunwar

    PARALLEL is always used with DUMPFILE=filename_%U.dmp. Did you put the same parameter on the target server?
    The PARALLEL clause depends on server resources. If the system resources allow, the number of parallel processes should be set to the number of dump files being created.

  • Error while writing to a file

    Hi,
    I am getting an error while writing to a file. Here is the sample code in which I am getting the error. The STDERR output is printed to the console, but the same is not getting written to the file.
    package Sample;
    import java.util.*;
    import java.io.*;
    public class MediocreExecJavac {
        public static void main(String args[]) {
            try {
                Runtime rt = Runtime.getRuntime();
                Process proc = rt.exec("perl ic_start");
                InputStream stderr = proc.getErrorStream();
                InputStreamReader isr = new InputStreamReader(stderr);
                BufferedReader br = new BufferedReader(isr);
                FileWriter fw = new FileWriter("result.txt");
                String line = null;
                System.out.println("<ERROR>");
                while ((line = br.readLine()) != null)
                    System.out.println(line);
                fw.write(line); // line is null here once the loop ends -- this matches the NullPointerException below
                System.out.println("</ERROR>");
                int exitVal = proc.waitFor();
                System.out.println("Process exitValue: " + exitVal);
                fw.close();
            } catch (Throwable t) {
                t.printStackTrace();
            }
        }
    }
    Below is the output -
    <ERROR>
    Can't open perl script "ic_start": No such file or directory
    java.lang.NullPointerException
    at java.io.Writer.write(Unknown Source)
    at Sample.MediocreExecJavac.main(MediocreExecJavac.java:21)
    Please tell me where the program is going wrong.

    I think it is just the path of the file that you are missing.

  • How to read from and write into the same file from multiple threads?

    I need to read from and write to the same file from multiple threads.
    How can we do that without any data contamination?
    Can you please provide code for this type of task?
    Thanks in advance.

    Assuming you are using RandomAccessFile, you can use the locking functionality in the Java NIO library to lock sections of a file that you are reading/writing from each thread (or process).
    If you can't use NIO, and all your threads are in the same application, you can create your own in-process locking mechanism that each thread uses prior to accessing the file. That would take some development, and the OS already has the capability, so using NIO is the best way to go if you can use JDK 1.4 or higher.
    - K
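
A rough sketch of the NIO approach described above (the file name and byte range are made up for illustration): open the file through a FileChannel, take an exclusive lock on just the region being written, and release it afterwards.

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

public class RegionLockDemo {
    public static void main(String[] args) throws IOException {
        // "shared.dat" is just an example name.
        try (RandomAccessFile raf = new RandomAccessFile("shared.dat", "rw");
             FileChannel channel = raf.getChannel()) {
            // Exclusive lock on bytes 0..63; other lockers of this region wait or fail here.
            FileLock lock = channel.lock(0, 64, /* shared = */ false);
            try {
                raf.seek(0);
                raf.writeBytes("written under an exclusive region lock\n");
            } finally {
                lock.release(); // always release so others can proceed
            }
        }
    }
}
```

Note that Java file locks are held on behalf of the whole JVM, so two threads in the same application locking overlapping regions get an OverlappingFileLockException; for that case the in-process coordination mentioned above is still needed.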

  • Write x y position and voltage to same file

    Hi,
    I've been writing a program to perform a 2D scan with stepper motors.  There are actually 3 in the program but only 2 perform the scan.  I have used a subvi to take 10 voltage measurements at each position in the scan.  What I'd like to do is record the X and Y position for each set of 10 voltage measurements.  I'm fairly sure I can record the position with the 'GetPosition' function for the motors, but I am a little lost on how to get them written to the same file, and whether or not those positions would correspond to the voltages taken there.  I have included my vi and the subvi used for reference.  If anyone has any suggestions on this problem I'd greatly appreciate it.
    Cheers
    RJ
    Attachments:
    xyz scan 015.vi 116 KB
    Keithley 2000 Read Multiple.vi 44 KB

    rjwilliams wrote:
    ... I'm fairly sure I can record the position with the 'GetPosition' function for the motors, but am a little lost
    on how to get them to write to the same file, and whether or not those positions would correspond to the
    voltages taken there.
    Well, this is a bit too diffuse. You seem to be writing to a file, but the code is nearly incomprehensible from a dataflow point of view.
    If you are not sure whether the file contains the correct values, just run the program, probe the wires, and look at the file later. You should be able to tell if they are correct or not. Why should they be different? What values do you expect? What values do you get?
    I agree with Ray that you NEED to clean up the code in order to troubleshoot. Your code is just a bewildering jumble of code fragments, trapped deep inside stacks of while loops and event structures, all tied together with confusing logic. All you need is a simple state machine with one while loop containing one event structure.
    Message Edited by altenbach on 07-07-2008 07:48 AM
    LabVIEW Champion. Do more with less code and in less time.

  • Writing a binary file

    Hi,
    I have trouble when writing a binary file. Here is what I do:
    I have some VIs collecting various data for a fixed period of time. I put all the data into 1D arrays at a fixed frequency (they all have the same size). Once that is done I merge all the arrays into one 2D array and write that to a binary file. If I wait about 10 seconds after writing the file and repeat the whole thing, everything is fine and the timing is correct. If I wait for a shorter period of time (say a second, which is the maximum I can wait to have a usable application) then the timing is wrong in the first part of the loop (the time-critical data-acquisition one). Of course, if I don't write the file, everything is fine (I have to wait about 250 msec between two runs, but that's ok). Any advice?
    I also tried several things like streaming the data to disk during the time-critical period, and this works fine only if the file already exists. If not, I experience jitter again. I don't really want to stream the data in real time, but it could work if I can get rid of the jitter. Again, any advice?
    I'm using LV7.1 and the real time module running on a PC (ETS).
    Another, sort of related, topic: I noticed that when using 2D arrays within time-critical loops all the processes are slowed down, whereas if I use several 1D arrays everything is fine... Any idea why?
    Thanks,
    Laurent

    Hello,
    File I/O transfer rates when streaming to disk depend on several factors: CPU speed and load, hard drive technology (IDE / Serial ATA / SCSI ...), and quality of programming. High jitter is always due to bad programming. File I/O operations are not deterministic, so file I/O VIs must not be called in the critical task. Data must be saved in the normal-priority VIs in order to keep jitter as low as possible. This is the reason why an RT application is based on an RT FIFO to transfer data from the critical loop to the normal-priority VI.
    You will find a lot of tutorials detailing the key concepts for RT programming at the link below:
    * Real-Time Module
    http://zone.ni.com/devzone/devzone.nsf/webcategories/C25F8C664230613A862567DF006ABB06
    Moreover, memory allocation in LabVIEW is implicit. You must handle large sets of data carefully because some functions reallocate new buffers for their outputs instead of reusing the input buffers (like the "build array" function). You will find an example of how to decrease memory use with arrays in LabVIEW in the tutorials linked above.
    If you need more specific advice, you can post sample code that reproduces the behavior that you do not understand. I will try to look at it and give you my feedback.
    Sincerely.
    Matthieu Gourssies, NIF.

  • Errors when writing metadata to files on Windows Server

    Good day!
    I have an iMac running OSX 10.9 (Maverick) and Lightroom 5.2.
    My catalog is on the local iMac drive, and the images are on SMB shared drive (handled by Windows 2012 server).
    I decided to do the final layout of my album using InDesign, and for that I needed to access the images using Bridge. So I selected all the images (the rated ones...) and issued a Metadata/Save command.
    The system came back with errors on many of the files (such as "photos are read only", and unknown errors).
    Retrying to save the metadata on an image that was flagged with an error usually (but not always) worked.
    I tried saving the metadata for a few (6) images at a time; many times it was ok, but sometimes one or two of them failed (and retrying again usually solved the problem).
    It looks to me like a timing-related issue.
    Any ideas?
    (After spending about an hour on this, I have exported the images to my local drive...). In general I don't have similar problems with other apps.
    Any ideas?
    Yuval

    Hi Yuval,
    I'm experiencing more or less the same behavior (see Read-only error when writing metadata to file over network (Synology DS1315+ using AFP)). For me, however, the problem is more pronounced if I use the AFP protocol and is better (I think as you described) if I use SMB (all done on a Synology DS1315+). With SMB it most often works, but in very rare cases I also get the "read only" message!
    Do you have a solution to this? Or are you stuck as I am?
    Cheers, Chris

  • Error when Writing Metadata to Files in Bridge (Mac) but not in Bridge (PC)

    We get an error when writing metadata to files in Bridge (Mac) but not in Bridge (PC). In the same drive and folder, the PC can successfully write a keyword to a file on the PC, where the Mac returns an error. I have researched this at the Adobe Knowledgebase, but their answer seemed to indicate it was a global issue, and we don't see that behavior on the PC.
    The client is a Mac of course, and the server volume is a Windows share volume. The Mac is bound to AD, and both the domain\username and username formats have been tried when logging in, but you receive the error with both.
    Any help would be appreciated.
    Thanks!
    Rich Oliver

    Hi, I'm having the same problem using FreeNAS (which uses Samba and Netatalk in the backend). I tried with both AFP and SMB on Mavericks and Yosemite, and I still have the same issue.  I think it might be a timing issue with how Lightroom interacts with the slower write latency of network shares.  I suggest you also chime in on this thread:  Lightroom 5 can't write metadata to DNG files.   I really hope this is resolved, as it is impacting my productivity now that I have moved my workflow to my MacBook with a shared NAS.

  • Writing to HDF5 file timing is erratic

    Dear Community,
    I am trying to acquire and stream 10-15 megabytes/sec from a bunch of 4472 cards to an HDF5 file, and I am having a problem: every once in a while, the H5Fwrite VI takes a lot of CPU time (3-10 seconds) to write a chunk to disk, during which time the backlog from the acquisition system gets too big and overflows the buffer.
    I have tried working around this by creating two loops, one of which reads data from the 4472s and puts it on a queue, while the other dequeues the data and writes it to disk.
    But I'm also wondering whether there's a way to control when the HDF5 VIs actually write to disk, and whether there are some parameters I can tune in the HDF5 libraries, such as block sizes and cache sizes, so that writing to disk is fast and not so "lumpy."
    I've tried using the HDF5 library functions H5Pset_cache() and H5Pset_sieve_buf_size() to change the sizes of various buffers, but that doesn't seem to change anything.
    Or maybe I should just use the SFP libraries to do this instead of rolling my own HDF5-based VIs?
    Thanks,
    Cas
    Casimir Wierzynski
    Graduate Student, Computation and Neural Systems
    California Institute of Technology, MC 139-74
    Pasadena, CA 91125

    What hardware are you using? A 2000 era computer (Pentium III 650) should be able to sustain 10MBytes/sec or better until you run out of disk space. A 2004 era computer (Pentium 4 2.4HT) should be able to do twice that. Both are disk speed limited. Are you using a PXI controller or a desktop machine? PXI controllers use laptop hard drives to keep power consumption down. Laptop hard drives are considerably slower than those in a typical desktop machine.
    Some general tips which may help you.
    1) Make sure all other processes are turned off. This includes screensavers, the Microsoft indexing service, virus checkers, etc. This can be tough on an NT, 2000, or XP machine, but is absolutely vital if you want to sustain high speeds.
    2) Tune your HDF5 chunk size appropriately. For Windows-based systems, the optimum is 65,000 points (just a bit under the 65,535 16-bit boundary). If this chunk size gets anywhere near the 1 MByte size of the normal HDF5 cache, your write speeds will take a huge hit.
    3) Tune your fetch size from the 4472s for maximum performance. I would expect this to be in the low 100s of thousands of points (that is where it is for NI high-speed digitizers). Unless you increase the HDF5 buffer size from 1MByte, you probably should keep this below about 350,000 points to avoid paging problems. Just fetch the large buffer and write the whole thing at once. The HDF5 software will chunk it to disk for you.
    4) Make sure you serialize calls to the HDF5 driver. HDF5 is NOT threadsafe under Windows (it is under Linux). This could cause serious data corruption and other weird problems if you are taking data from multiple boards and writing to the same file in multiple loops.
    5) Check your disks to make sure you do not have a hardware problem. A dying hard drive will produce symptoms similar to these. You can get utilities from the major disk manufacturers.
    If you tried all these things and are still having problems, please reply with more information. You should be able to succeed. You may also want to check out NI-HWS, available on the latest driver CD. It uses the same HDF5 format as the SFP routines, but is far easier to use.
    This account is no longer active. Contact ShadesOfGray for current posts and information.

  • \n is not working when writing text into a file ...

    Hi,
    I am reading each line from a file and writing it into another file ...
    It writes continuously into the file even though I use \n for a new line ...
    Why is "\n" not working when writing text into the file?
    Here is my code:
    import java.io.*;
    import java.net.*;
    import java.util.*;
    public class Test11 {
        private BufferedReader data;
        private String line = null;
        private StringBuffer buf = new StringBuffer();
        private BufferedWriter thewriter = null;
        private File file = null;
        private FileReader fr = null;
        private String inputLocation = "c:\\test14.txt";
        public Test11(){ }
        public void disp(){
            try {
                file = new File(inputLocation);
                fr = new FileReader(file);
                data = new BufferedReader(fr);
                while ((line = data.readLine()) != null) {
                    buf.append(line + "\n");
                }
                String text = buf.toString();
                thewriter = new BufferedWriter(new FileWriter("c:\\test15.txt"));
                thewriter.write(text);
                buf = null;
                thewriter.close();
            } catch(IOException e) { System.out.println("error ===" + e); }
        }
        public static void main(String[] args) {
            Test11 t = new Test11();
            t.disp();
            System.out.println("all files are converted...");
        }
    }
    I used "\n" after reading each line. I want the output file to be the same as the input file. How do I break each line when writing into the text file? "\n" works in WordPad but not in Notepad; in Notepad I get something like a rectangle instead of a new line.
    Any help please .....
    thanks.

    \n works just fine, and every text editor in the world except Notepad understands that it is supposed to be a line-ending character. You don't have a problem, except in your choice of text editor. If somebody is forcing you to use Notepad then you'll have to output "\r\n" instead.
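
A small sketch of that advice (the output path is illustrative): ask Java for the platform's separator instead of hard-coding \n, so the file opens correctly in Notepad on Windows too.

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class LineEndingDemo {
    public static void main(String[] args) throws IOException {
        String sep = System.lineSeparator(); // "\r\n" on Windows, "\n" elsewhere
        try (BufferedWriter out = new BufferedWriter(new FileWriter("test15.txt"))) {
            out.write("first line" + sep);
            out.write("second line");
            out.newLine(); // BufferedWriter.newLine() also writes the platform separator
        }
    }
}
```

Either form works; newLine() is just the BufferedWriter shorthand for writing System.lineSeparator().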

  • Time change to stop and start writing to a file

    Guys,
    I'm trying to stop writing to a file at 12:00 AM and start writing to a new file when it gets to that time. I also need to keep writing to the same directory. How can I do this? I believe I know how to do it with a close and open file, but I'm still learning these functions. Attached is my code.
    Thanks
    Attachments:
    Test1.vi 141 KB

    You were getting help on this problem in your other thread. It is usually best to continue the active one rather than start a new one for the same problem. If the answer(s) you get in the other thread don't help, specify why, and the original responder or others will usually add detail to their answers. Starting a new thread usually results in repeated work, as the respondents may not have seen what was proposed in the other thread.
    P.M.
    Putnam
    Certified LabVIEW Developer
    Senior Test Engineer
    Currently using LV 6.1-LabVIEW 2012, RT8.5
    LabVIEW Champion
