Writing to HDF5 file timing is erratic

Dear Community,
I am trying to acquire and stream 10-15 megabytes/sec from a bunch of 4472 cards to an HDF5 file, and I am having a problem: every once in a while the H5Fwrite VI takes a lot of CPU time (3-10 seconds) to write a chunk to disk, during which time the backlog from the acquisition system gets too big and overflows the buffer.
I have tried working around this by creating two loops, one of which reads data from the 4472s and puts it on a queue, while the other dequeues the data and writes it to disk.
But I'm also wondering whether there's a way to control when the HDF5 VIs actually write to disk, and whether there are parameters I can tune in the HDF5 library, such as block sizes and cache sizes, so that writing to disk is fast and not so "lumpy."
I've tried using the HDF5 library functions H5Pset_cache() and H5Pset_sieve_buf_size() to change the sizes of various buffers, but it doesn't seem to change anything.
Or maybe I should just use the SFP libraries to do this instead of rolling my own HDF5-based VIs?
Thanks,
Cas
Casimir Wierzynski
Graduate Student, Computation and Neural Systems
California Institute of Technology, MC 139-74
Pasadena, CA 91125

What hardware are you using? A 2000-era computer (Pentium III 650) should be able to sustain 10 MBytes/sec or better until you run out of disk space. A 2004-era computer (Pentium 4 2.4 GHz HT) should be able to do twice that. Both are disk-speed limited. Are you using a PXI controller or a desktop machine? PXI controllers use laptop hard drives to keep power consumption down. Laptop hard drives are considerably slower than those in a typical desktop machine.
Some general tips which may help you.
1) Make sure all other processes are turned off. This includes screensavers, the Microsoft indexing service, virus checkers, etc. This can be tough on an NT, 2000, or XP machine, but is absolutely vital if you want to sustain high speeds.
2) Tune your HDF5 chunk size appropriately. For Windows based systems, the optimum is 65,000 points (just a bit under the 65,535 16-bit boundary). If this chunk size gets anywhere near the 1 MByte size of the normal HDF5 chunk cache, your write speeds will take a huge hit (a rough C sketch of these property settings appears at the end of this post).
3) Tune your fetch size from the 4472s for maximum performance. I would expect this to be in the low 100s of thousands of points (that is where it is for NI high-speed digitizers). Unless you increase the HDF5 buffer size from 1MByte, you probably should keep this below about 350,000 points to avoid paging problems. Just fetch the large buffer and write the whole thing at once. The HDF5 software will chunk it to disk for you.
4) Make sure you serialize calls to the HDF5 driver. HDF5 is NOT threadsafe under Windows (it is under Linux). This could cause serious data corruption and other weird problems if you are taking data from multiple boards and writing to the same file in multiple loops.
5) Check your disks to make sure you do not have a hardware problem. A dying hard drive will produce symptoms similar to these. You can get utilities from the major disk manufacturers.
If you tried all these things and are still having problems, please reply with more information. You should be able to succeed. You may also want to check out NI-HWS, available on the latest driver CD. It uses the same HDF5 format as the SFP routines, but is far easier to use.
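To make tips 2 and 3 concrete, here is a rough C sketch (HDF5 1.6-era API) of where the chunk size, chunk cache, and sieve buffer get set. The specific numbers and the dataset name "/data" are illustrative only, not values I can verify for your system; tune them for your own hardware.

    #include "hdf5.h"

    /* Create the file with a larger raw-data chunk cache and sieve buffer
       (defaults are 1 MByte and 64 KBytes); the values here are examples only. */
    hid_t open_tuned_file(const char *path)
    {
        hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
        H5Pset_cache(fapl, 0, 521, 8 * 1024 * 1024, 0.75);   /* 8 MByte chunk cache */
        H5Pset_sieve_buf_size(fapl, 1024 * 1024);            /* 1 MByte sieve buffer */
        hid_t file = H5Fcreate(path, H5F_ACC_TRUNC, H5P_DEFAULT, fapl);
        H5Pclose(fapl);
        return file;
    }

    /* Create an extensible 1-D dataset of 16-bit samples, chunked at ~65,000 points. */
    hid_t create_chunked_dataset(hid_t file)
    {
        hsize_t dims[1]    = {0};
        hsize_t maxdims[1] = {H5S_UNLIMITED};
        hsize_t chunk[1]   = {65000};

        hid_t space = H5Screate_simple(1, dims, maxdims);
        hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(dcpl, 1, chunk);

        hid_t dset = H5Dcreate(file, "/data", H5T_NATIVE_SHORT, space, dcpl);
        H5Pclose(dcpl);
        H5Sclose(space);
        return dset;
    }

In LabVIEW you would make the equivalent property-list calls through whichever HDF5 wrapper VIs you are using; the point is simply that the chunk cache should comfortably hold a few chunks' worth of data.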
This account is no longer active. Contact ShadesOfGray for current posts and information.

Similar Messages

  • Writing a binary file

    Hi,
    I have trouble when writing a binary file. Here is what I do:
    I have some VIs collecting various data for a fixed period of time. I put all the data into 1D arrays at a fixed frequency (they all have the same size). Once it is done, I merge all the arrays into one 2D array and write that to a binary file. If I wait about 10 seconds after writing the file and repeat the whole thing, everything is fine and the timing is correct. If I wait for a shorter period of time (say a second, which is the maximum I can wait for a usable application), then the timing is wrong in the first part of the loop (the time-critical data-acquisition one). Of course, if I don't write the file, everything is fine (I have to wait about 250 ms between two runs, but that's ok). Any advice?
    I also tried several things like streaming the data to disk during the time-critical period, and this works fine only if the file already exists. If not, I experience jitter again. I don't really want to stream the data in real time, but it could work if I can get rid of the jitter. Again, any advice?
    I'm using LV7.1 and the real time module running on a PC (ETS).
    Another, sort of related, topic: I noticed that when using 2D arrays within time-critical loops, all the processes slow down, whereas if I use several 1D arrays everything is fine... Any idea why?
    Thanks,
    Laurent

    Hello,
    File I/O transfer rates when streaming to disk depend on several factors: CPU speed and load, hard drive technology (IDE / Serial ATA / SCSI ...), and the quality of the programming. High jitter is always due to bad programming: file I/O operations are not deterministic, so file I/O VIs must not be called in the time-critical task. Data must be saved in the normal-priority VIs in order to keep jitter as low as possible. This is why an RT application is based on an RT FIFO to transfer data from the critical loop to the normal-priority VI.
    You will find a lot of tutorials detailing the key concepts for RT programming at the link below:
    * Real-Time Module
    http://zone.ni.com/devzone/devzone.nsf/webcategories/C25F8C664230613A862567DF006ABB06
    Moreover, memory allocation in LabVIEW is implicit. You must handle large sets of data carefully, because some functions reallocate new buffers for their outputs instead of reusing the input buffers (such as the Build Array function). You will find an example of how to decrease memory use with arrays in LabVIEW in the tutorials linked above.
    If you need more specific advice, you can post sample code that reproduces the behavior you do not understand. I will try to look at it and give you my feedback.
    Sincerely.
    Matthieu Gourssies, NIF.

  • Errors when writing metadata to files on Windows Server

    Good day!
    I have an iMac running OS X 10.9 (Mavericks) and Lightroom 5.2.
    My catalog is on the local iMac drive, and the images are on an SMB shared drive (served by a Windows 2012 server).
    I decided to do the final layout of my album using InDesign, and for that I needed to access the images using Bridge. So I selected all the images (the rated ones...) and issued a Metadata > Save command.
    The system came back with errors on many of the files (such as "photos are read only", and unknown errors).
    Retrying to save the metadata on an image that was flagged with an error usually (but not always) worked.
    I tried to save the metadata for a few (6) images at a time; many times it was ok, but sometimes one or two of them failed (and retrying again usually solved the problem).
    It looks to me like a timing related issue.
    Any ideas?
    (After spending about an hour on this, I have exported the images to my local drive...). In general I don't have similar problems with other apps.
    Any ideas?
    Yuval

    Hi Yuval,
    I'm experiencing more or less the same behavior (see Read-only error when writing metadata to file over network (Synology DS1315+ using AFP)). For me, however, the problem is more pronounced if I use the AFP protocol and better (I think as you described) if I use SMB (all done on a Synology DS1315+). With SMB it most often works, but in very rare cases I also get the "read only" message!
    Do you have solution to this? Or are you stuck as I am?
    Cheers, Chris

  • Error when Writing Metadata to Files in Bridge (Mac) but not in Bridge (PC)

    We get an error when writing metadata to files in Bridge (Mac) but not in Bridge (PC). In the same drive and folder, the PC can successfully write a keyword to a file on the PC, where the Mac returns an error. I have researched this at the Adobe Knowledgebase, but their answer seemed to indicate it was a global issue, and we don't see that behavior on the PC.
    The client is a Mac of course, and the server volume is a Windows share volume. The Mac is bound to AD, and the domain\username and username formats have both been tried when logging in, but you receive the error in both.
    Any help would be appreciated.
    Thanks!
    Rich Oliver

    Hi, I'm having the same problem using FreeNAS (which uses Samba and Netatalk on the backend). I tried with both AFP and SMB on Mavericks and Yosemite, and I still have the same issue. I think it might be a timing issue with how Lightroom interacts with the slower write latency of network shares. I suggest you also chime in on this thread: Lightroom 5 can't write metadata to DNG files. I really hope this is resolved, as it is impacting my productivity now that I have moved my workflow to my MacBook with a shared NAS.

  • How do you stream data to a HDF5 file?

    I have managed to collect data into an HDF5 file, but only for the first block of data (100, 1000, 10,000); I would like to stream data until the hard disk is full or the user stops it. I have placed the HDF5 VIs and DLLs on my C: drive.

    There are at least 4 HDF5 APIs available for LabVIEW.  Which one are you using?  If you downloaded the sfpFile API from the NI Website, the example I attached below (which also uses NI-SCOPE) should work for you.  You will probably need to relink to the appropriate API VIs when you load it.  If you do not have NI-SCOPE, the example will still give you the info you need.  Treat the scope VI as a generic data source and ignore the VI linking errors from NI-SCOPE VIs on load (you can get the NI-SCOPE driver here).
    At each iteration, you need to do the following:
    Get the current dataspace from the data on disk.
    Extend this dataspace by the amount you will be adding (assumes an extensible data set, you can preallocate the size beforehand if you know what it is, which will make this step unnecessary).
    Select a hyperslab on this extended dataspace corresponding to the data you will be writing.
    Using the new dataspace, write your data to disk.
    This is a lot more complex than it could be, but that is the nature of HDF5 (very low level, very powerful).  Let us know if you need more information, and please let us know which version of LabVIEW you are using.  A rough C sketch of these four steps follows.
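    The sketch below shows the four per-iteration steps using the HDF5 1.6-era C API (which is what sfpFile-style wrappers call into). The names dset, block, and nsamples are placeholders for your own handles and data, and it assumes double-precision samples; substitute your actual datatype. Treat it as an outline, not a drop-in implementation.

        #include "hdf5.h"

        /* Append one block of doubles to an extensible 1-D dataset. */
        herr_t append_block(hid_t dset, const double *block, hsize_t nsamples)
        {
            /* 1. Get the current dataspace (current size) of the data on disk. */
            hid_t filespace = H5Dget_space(dset);
            hsize_t cur[1];
            H5Sget_simple_extent_dims(filespace, cur, NULL);
            H5Sclose(filespace);

            /* 2. Extend the dataset by the amount being added. */
            hsize_t newsize[1] = { cur[0] + nsamples };
            H5Dextend(dset, newsize);    /* H5Dset_extent() in later releases */

            /* 3. Select a hyperslab covering only the newly added region. */
            filespace = H5Dget_space(dset);
            hsize_t start[1] = { cur[0] };
            hsize_t count[1] = { nsamples };
            H5Sselect_hyperslab(filespace, H5S_SELECT_SET, start, NULL, count, NULL);

            /* 4. Write the new data into that hyperslab. */
            hid_t memspace = H5Screate_simple(1, count, NULL);
            herr_t status = H5Dwrite(dset, H5T_NATIVE_DOUBLE, memspace, filespace,
                                     H5P_DEFAULT, block);

            H5Sclose(memspace);
            H5Sclose(filespace);
            return status;
        }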
    This account is no longer active. Contact ShadesOfGray for current posts and information.
    Attachments:
    sfpFileEX Stream To Disk from NI-Scope.vi ‏234 KB

  • Expdp with parallel writing in one file at a time on OS

    Hi friends,
    I am facing a strange issue. Despite giving the parallel=x parameter, expdp is writing to only one file at the OS level at a time, although it is writing into multiple files sequentially (not concurrently).
    On other servers I see that expdp is able to start writing to multiple files concurrently. Below is a sample log of my expdp.
    ++++++++++++++++++++
    Export: Release 10.2.0.3.0 - 64bit Production on Friday, 15 April, 2011 3:06:50
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Starting "CNVAPPDBO4"."EXPDP_BL1_DOCUMENT": CNVAPPDBO4/********@EXTTKS1 tables=BL1_DOCUMENT DUMPFILE=DUMP1_S:Expdp_BL1_DOCUMENT_%U.dmp LOGFILE=LOG1_S:Expdp_BL1_DOCUMENT.log CONTENT=DATA_ONLY FILESIZE=5G EXCLUDE=INDEX,STATISTICS,CONSTRAINT,GRANT PARALLEL=6 JOB_NAME=Expdp_BL1_DOCUMENT
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 23.93 GB
    . . exported "CNVAPPDBO4"."BL1_DOCUMENT" 17.87 GB 150951906 rows
    Master table "CNVAPPDBO4"."EXPDP_BL1_DOCUMENT" successfully loaded/unloaded
    Dump file set for CNVAPPDBO4.EXPDP_BL1_DOCUMENT is:
    /tksmig/load2/oracle/postpaidamd/DUMP1_S/Expdp_BL1_DOCUMENT_01.dmp
    /tksmig/load2/oracle/postpaidamd/DUMP1_S/Expdp_BL1_DOCUMENT_02.dmp
    /tksmig/load2/oracle/postpaidamd/DUMP1_S/Expdp_BL1_DOCUMENT_03.dmp
    /tksmig/load2/oracle/postpaidamd/DUMP1_S/Expdp_BL1_DOCUMENT_04.dmp
    Job "CNVAPPDBO4"."EXPDP_BL1_DOCUMENT" successfully completed at 03:23:14
    ++++++++++++++++++++
    uname -a: HP-UX ocsmigbrndapp3 B.11.31 U ia64 3522246036 unlimited-user license
    Is it hitting any known bug? Please suggest.
    regds,
    kunwar

    PARALLEL is always used with DUMPFILE=filename_%U.dmp. Did you put the same parameter on the target server?
    The PARALLEL clause depends on server resources. If the system resources allow, the number of parallel processes should be set to the number of dump files being created.

  • Error in writing to export file

    Hi,
    I am getting the below error on running export:-
    EXP-00002: error in writing to export file
    The command is given below:
    exp cam3_mpcb2/cam3_mpcb2@ORA10G file='c:\MPCB_cam_20120113.dmp' log='c:\MPCB_20120113.log'
    Can I overcome this error by adding "ignore" at the end of the command, or how else can I modify the above command?

    Are there any other error messages that accompany the one you posted?
    EXP-00002: error in writing to export file
    Cause: Export could not write to the export file, probably because of a device error. This message is usually followed by a device message from the operating system.
    Action: Take appropriate action to restore the device.
    Do you have permission to write to this location?

  • Error while writing to a file

    Hi,
    I am getting an error while writing to a file. Here is the sample code in which I am getting the error. The STDERR output is getting printed to the console, but it is not getting written to the file.
    package Sample;
    import java.util.*;
    import java.io.*;
    public class MediocreExecJavac {
        public static void main(String args[]) {
            try {
                Runtime rt = Runtime.getRuntime();
                Process proc = rt.exec("perl ic_start");
                InputStream stderr = proc.getErrorStream();
                InputStreamReader isr = new InputStreamReader(stderr);
                BufferedReader br = new BufferedReader(isr);
                FileWriter fw = new FileWriter("result.txt");
                String line = null;
                System.out.println("<ERROR>");
                // Only the println is inside the loop body; fw.write() runs after the loop ends.
                while ((line = br.readLine()) != null)
                    System.out.println(line);
                fw.write(line);
                System.out.println("</ERROR>");
                int exitVal = proc.waitFor();
                System.out.println("Process exitValue: " + exitVal);
                fw.close();
            } catch (Throwable t) {
                t.printStackTrace();
            }
        }
    }
    Below is the output -
    <ERROR>
    Can't open perl script "ic_start": No such file or directory
    java.lang.NullPointerException
    at java.io.Writer.write(Unknown Source)
    at Sample.MediocreExecJavac.main(MediocreExecJavac.java:21)
    Please tell me where the program is going wrong.

    I think it is just the path of the file that you are missing.

  • Problem writing object to file

    Hi everyone,
    I am creating an index by processing text files. The number of files is 15,000 and the index is a B+ tree. When all the files have been processed and I try to write the index to a file, it gives me these errors:
    15000 files processed.
    writing to disk...
    Exception in thread "main" java.lang.StackOverflowError
            at sun.misc.SoftCache.processQueue(SoftCache.java:153)
            at sun.misc.SoftCache.get(SoftCache.java:269)
            at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:244)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1029)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1369)
            at java.io.ObjectOutputStream.defaultWriteObject(ObjectOutputStream.java:380)
            at java.util.Vector.writeObject(Vector.java:1018)
            at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
            at java.lang.reflect.Method.invoke(Method.java:585)
            at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:890)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1333)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1284)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1073)
            at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1245)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1069)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1369)
            at java.io.ObjectOutputStream.defaultWriteObject(ObjectOutputStream.java:380)
            at java.util.Vector.writeObject(Vector.java:1018)
            at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
            at java.lang.reflect.Method.invoke(Method.java:585)
            at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:890)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1333)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1284)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1073)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1369)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1341)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1284)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1073)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1369)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1341)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1284)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1073)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1369)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1341)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1284)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1073)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1369)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1341)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1284)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1073)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1369)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1341)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1284)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1073)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1369)
    ...... Can anyone point out the mistake I'm making?
    thanks

    The B+ tree is balanced and works perfectly without writing to the file, and I am using the default writeObject() method of ObjectOutputStream.
      try {
          FileOutputStream   f_out   = new FileOutputStream("tree.idx");
          ObjectOutputStream obj_out = new ObjectOutputStream(new BufferedOutputStream(f_out));
          for (int x = 0, l = files.length; x < l; x++) {
              ProcessTree(files[x].toString());
          }
          System.out.println("Writing main index to the disk...");
          obj_out.writeObject(tree);
          obj_out.flush();
          obj_out.close();
      } catch (Exception e) {
          System.out.println(e.toString());
          System.out.println("Error Writing to Disk!");
      }

  • Cannot install PhotoShop CC "Error writing to temporary file location", how do I fix this?

         I am working on a Mac OS X Server, version 10.6.8
     I use my Adobe applications - InDesign, Photoshop, Illustrator, and a couple of others - primarily for school work (graphic design). I downloaded the applications through the Adobe Application Manager, but they recently stopped working. I don't know if this was due to recent updates or to the fact that I had just changed out my hard drive when the original one failed. I uninstalled the applications and am trying to re-install them, but am receiving the error message, "Error writing to temporary file location".
         Does anyone know how to fix this? Is it just because I'm running 10.6.8? If so, can I make my applications work without getting a whole new computer?

    Please refer to the Photoshop system requirements at this link:
    http://helpx.adobe.com/photoshop/system-requirements.html#Photoshop CC system requirements/
    Regards,
    Ashutosh

  • How to insert new line char while writing bytes into file

    Hello Sir,
    Is it possible to insert newline characters into a set of String variables, store them in a byte array, and then finally write that to a file?
    This is the sample code which i tried:
                 File f = new File(messagesDir, "msg" + msgnum + ".txt");
                 FileOutputStream fout = new FileOutputStream(f);
                 String fromString = "From:    " + msg.getFrom() + "\n";
                 String toString   = "To:      " + msg.getTo() + "\n";
                 String dateString = "Sent:    " + msg.getDate() + "\n";
                 String msgString  = msg.getBody() + "\n";
                 String finalString = fromString + toString + dateString + msgString;
                 byte[] msgBytes = finalString.getBytes();
                 fout.write(msgBytes);
                 fout.close();
    In the above code I tried to add a newline as "\n" at the end of each string, but when I look at the generated file msg1.txt, it contains some junk character [] instead.
    Please provide me with some help.
    regards
    venki

    "but it has still shown the junk char, it's not able to create the new line in the created file, I am afraid. How am I going to get the solution? :("
    Do not be afraid, dear sir. You are obviously using a Windows operating system or a Mac operating system. On Windows a newline is "\r\n", not '\n', and on a Mac a newline is '\r', not '\n'. If you make that correction, dear sir, your program will work.
    However, there is a better way. First, you probably want to buffer your output if you are going to write to the file more than once, which will make writing to the file more efficient. In addition, when you buffer your output, you can use the newLine() method of the BufferedWriter object to insert a newline. The newline will be appropriate for the operating system that the program is running on. Here is an example:
    File f = new File("C:/TestData/atest.txt");
    BufferedWriter out = new BufferedWriter(new FileWriter(f) );
    String fromString = "From: Jane";
    out.write(fromString);
    //Not written to the file until enough data accumulates.
    //The data is stored in a buffer until then.
    out.newLine();
    String toString = "To: Dick";
    out.write(toString);
    out.newLine();
    String dateString = "Sent: October 27, 2006";
    out.write(dateString);
    out.newLine();
    out.close(); 
    //Causes any unwritten data to be flushed from
    //the buffer and written to the file.

  • Writing output to file does not work

    Hello All,
    I have a problem with writing to a file created from the command prompt:
    1) my base class is called TestRandom2
    2) I have a method called run which runs the print method
    3) I passed a reference of the run method into the main below. I don't know if I'm doing it correctly, but the error message I'm getting is "cannot resolve symbol : variable String, location: class TestRandom2, return String;".
    4) When I declare the method run as void, it also gives an error.
    5) I would like to fix this problem so I can write my output to a .xls file. Can someone help me as soon as possible please!
    public static void main(String[] args) throws Exception {
         TestRandom2 testrandom = new TestRandom2();
              byte b[] = new String(testrandom.run()).getBytes();
              OutputStream o = new FileOutputStream(args[0]);
              o.write(b);
              o.close();
         }

    This is my entire file. I want to be able to call the run method in main and have the output go to an .xls file. Can someone help me please?
    Code ------
    import java.io.*;
    public class TestRandom2{
         Deck deck = new Deck();
         public TestRandom2(){
              deck.populate();
         public void run(String file){
                   if(file.equals("") || file==null)
                   throws IOException {
                   FileOutputStream output=new FileOutputStream(file);
                   for(int i=0; i<5; i++){
                   deck.shuffle();
                   deck.print();
                   output.close;
         class Deck{
              public void populate(){
                   System.out.println("populating deck!");
                   cards=new Integer[52];
                   for(int i=0;i<52;i++){
                        cards= new Integer(i+1);
         public void shuffle(){
              System.out.println("Shuffling deck!");
              rnd.setSeed(randomseed.getSeed());
              java.util.List list = java.util.Arrays.asList(cards);
              java.util.Collections.shuffle(list,rnd);
              cards = (Integer[])list.toArray(new Integer[0]);
         public void print(){
              System.out.println("populating deck!");
              for(int i=0;i<cards.length;i++){
                   System.out.print("P:"+(i+1));
                   System.out.print("C:"+cards[i].intValue());
              Integer[] cards = null;
              RandomSeed randomseed = new RandomSeed();
              java.util.Random rnd = new java.util.Random();
              class RandomSeed {
         private long[] seeds = new long[] {
              9876, 54321,
              1299961164, 253987020,
              669708517, 2079157264,
              190904760, 417696270,
              1289741558, 1376336092,
              1803730167, 324952955,
              489854550, 582847132,
              1348037628, 1661577989,     
              350557787, 1155446919,          
              591502945, 634133404,
              1901084678, 862916278,     
              1988640932, 1785523494,     
              1873836227, 508007031,          
              1146416592, 967585720,
              1837193353, 1522927634,
              38219936, 921609208,
              349152748, 112892610,          
              744459040, 1735807920,
              1983990104, 728277902,
              309164507, 2126677523,
              362993787, 1897782044,
              556776976, 462072869,
              1584900822, 2019394912,
              1249892722, 791083656,          
              1686600998, 1983731097,
              1127381380, 198976625,
              1999420861, 1810452455,
              1972906041, 664182577,          
              84636481, 1291886301,
              1186362995, 954388413,
              2141621785, 61738584,
              1969581251, 1557880415,          
              1150606439, 136325185,
              95187861, 1592224108,
              940517655, 1629971798,
              215350428, 922659102,     
              786161212, 1121345074,
              1450830056, 1922787776,
              1696578057, 2025150487,
              1803414346, 1851324780,          
              1017898585, 1452594263,
              1184497978, 82122239,
              633338765, 1829684974,
              430889421, 230039326,          
              492544653, 76320266,
              389386975, 1314148944,
              1720322786, 709120323,
              1868768216, 1992898523,          
              443210610, 811117710,
              1191938868, 1548484733,
              616890172, 159787986,
              935835339, 1231440405,     
              1058009367, 1527613300,
              1463148129, 1970575097,
              1795336935, 434768675,
              274019517, 605098487,          
              483689317, 217146977,
              2070804364, 340596558,
              930226308, 1602100969,
              989324440, 801809442,          
              410606853, 1893139948,
              1583588576, 1219225407,
              2102034391, 1394921405,
              2005037790, 2031006861,          
              1244218766, 923231061,
              49312790, 775496649,
              721012176, 321339902,
              1719909107, 1865748178,          
              1156177430, 1257110891,
              307561322, 1918244397,
              906041433, 360476981,
              1591375755, 268492659,          
              461522398, 227343256,
              2145930725, 2020665454,
              1938419274, 1331283701,     
              174405412, 524140103,          
              494343653, 18063908,
              1025534808, 181709577,
              2048959776, 1913665637,
              950636517, 794796256,          
              1828843197, 1335757744,
              211109723, 983900607,
              825474095, 1046009991,
              374915657, 381856628,          
              1241296328, 698149463,
              1260624655, 1024538273,
              900676210, 1628865823,
              697951025, 500570753,          
              1007920268, 1708398558,
              264596520, 624727803,
              1977924811, 674673241,
              1440257718, 271184151,          
              1928778847, 993535203,
              1307807366, 1801502463,
              1498732610, 300876954,
              1617712402, 1574250679,          
              1261800762, 1556667280,
              949929273, 560721070,
              1766170474, 1953522912,
              1849939248, 19435166,          
              887262858, 1219627824,
              483086133, 603728993,
              1330541052, 1582596025,
              1850591475, 723593133,          
              1431775678, 1558439000,
              922493739, 1356554404,
              1058517206, 948567762,
              709067283, 1350890215,          
              1044787723, 2144304941,
              999707003, 513837520,
              2140038663, 1850568788,
              1803100150, 127574047,          
              867445693, 1149173981,
              408583729, 914837991,
              1166715497, 602315845,
              430738528, 1743308384,          
              1388022681, 1760110496,
              1664028066, 654300326,
              1767741172, 1338181197,
              1625723550, 1742482745,          
              464486085, 1507852127,
              754082421, 1187454014,
              1315342834, 425995190,
              960416608, 2004255418,          
              1262630671, 671761697,
              59809238, 103525918,
              1205644919, 2107823293,
              1615183160, 1152411412,          
              1024474681, 2118672937,
              1703877649, 1235091369,
              1821417852, 1098463802,
              1738806466, 1529062843,          
              620780646, 1654833544,
              1070174101, 795158254,
              658537995, 1693620426,
              2055317555, 508053916,          
              1647371686, 1282395762,
              29067379, 409683067,
              1763495989, 1917939635,
              1602690753, 810926582,          
              885787576, 513818500,
              1853512561, 1195205756,
              1798585498, 1970460256,
              1819261032, 1306536501,          
              1133245275, 37901,
              689459799, 1334389069,
              1730609912, 1854586207,
              1556832175, 1228729041,
              251375753, 683687209,
              2083946182, 1763106152,
              2142981854, 1365385561,
              763711891, 1735754548,
              1581256466, 173689858,
              2121337132, 1247108250,
              1004003636, 891894307,
              569816524, 358675254,
              626626425, 116062841,
              632086003, 861268491,
              1008211580, 779404957,
              1134217766, 1766838261,
              1423829292, 1706666192,
              942037869, 1549358884,
              1959429535, 480779114,
              778311037, 1940360875,
              1531372185, 2009078158,
              241935492, 1050047003,
              272453504, 1870883868,
              390441332, 1057903098,
              1230238834, 1548117688,
              1242956379, 1217296445,
              515648357, 1675011378,
              364477932, 355212934,
              2096008713, 1570161804,
              1409752526, 214033983,
              1288158292, 1760636178,
              407562666, 1265144848,
              1071056491, 1582316946,
              1014143949, 911406955,
              203080461, 809380052,
              125647866, 1705464126,
              2015685843, 599230667,
              1425476020, 668203729,
              1673735652, 567931803,
              1714199325, 181737617,
              1389137652, 678147926,
              288547803, 435433694,
              200159281, 654399753,
              1580828223, 1298308945,
              1832286107, 169991953,
              182557704, 1046541065,
              1688025575, 1248944426,
              1508287706, 1220577001,
              36721212, 1377275347,
              1968679856, 1675229747,
              279109231, 1835333261,
              1358617667, 1416978076,
              740626186, 2103913602,
              1882655908, 251341858,
              648016670, 1459615287,
              780255321, 154906988,
              857296483, 203375965,
              1631676846, 681204578,
              1906971307, 1623728832,
              1541899600, 1168449797,
              1267051693, 1020078717,
              1998673940, 1298394942,
              1914117058, 1381290704,
              426068513, 1381618498,
              139365577, 1598767734,
              2129910384, 952266588,
              661788054, 19661356,
              1104640222, 240506063,
              356133630, 1676634527,
              242242374, 1863206182,
              957935844, 1490681416 };
         public void longToLong(){
              for (int i=0;i<(seeds.length);i++){
                   seedsArray[i] = new java.lang.Long(seeds[i]);
         public boolean checkForSeed(long seed){
              for(int i=0;i<lastSeed.length;i++){
                   if (seed == lastSeed[i])
                        return true;
              addSeed(seed);
              return false;
         public void addSeed(long seed){     
              if (!(currentseedpos < maxSeeds))
                   currentseedpos = 0;
              lastSeed[currentseedpos] = seed;
         public boolean checkForValue(int value){
              for(int i=0;i<lastValue.length;i++){
                   if (value == lastValue[i])
                        return true;
                   addValue(value);
                   return false;
         public void addValue(int value){
              if (!(currentvaluepos < maxValues))
                   currentvaluepos = 0;
              lastValue[currentvaluepos] = value;
         public RandomSeed(){
              lastValue = new int[maxValues];
              lastSeed = new long[maxSeeds];
              longToLong();
              shuffle();
         public RandomSeed(int v){
              maxValues = v;
              lastValue = new int[maxValues];
              lastSeed = new long[maxSeeds];
              longToLong();
              shuffle();
         public RandomSeed(int v, int s){
              maxValues = v;
              maxSeeds = s;
              lastValue = new int[maxValues];
              lastSeed = new long[maxSeeds];
              longToLong();
              shuffle();
         public void shuffle(){
              random.setSeed(seeds[random.nextInt(seeds.length-1)]);
              java.util.List list = java.util.Arrays.asList(seedsArray);
              java.util.Collections.shuffle(list,random);
         public long getSeed(){
              shuffle();
              random.setSeed(seeds[random.nextInt(seeds.length-1)]);
              long seed=seedsArray[random.nextInt(seeds.length-1)].longValue();
              while (checkForSeed(seed))seed=seedsArray[random.nextInt(seeds.length-1)].longValue();
              return seed;
         public int nextInt(int max){
              shuffle();
              random.setSeed(seeds[random.nextInt(seeds.length-1)]);
              long seed = seedsArray[random.nextInt(seeds.length-1)].longValue();
              while (checkForSeed(seed))
                   seed = seedsArray[random.nextInt(seeds.length-1)].longValue();
                   random.setSeed(seed);
                   int seedone = random.nextInt(max)+1;
                   if (checkForValue(seedone))
                   seedone = random.nextInt(max)+1;
                   return (seedone);
         public int nextInt(){
              return nextInt(MAXINT);
         }     private int currentseedpos = 0;
         private int currentvaluepos = 0;
         public int lastValue[] = new int[0];
         public long lastSeed[] = new long[0];
         public final int MAXINT = 9999;
         public int maxSeeds = 400;
         public int maxValues = 5;
         private java.lang.Long[] seedsArray = new java.lang.Long[seeds.length];
         private java.util.Random random = new java.util.Random();
         public static void main(String[] args) throws Exception {
         TestRandom2 testrandom = new TestRandom2();
              testrandom.run();
    ----Code
    Thanks for all the responses I've been getting; please continue to help.

  • Problem in writing to the file

    I use this LabVIEW code to read and save some electrical measurement data from a set of instruments. I am having a problem where the code stops writing to the file after a while, and it stops responding too. The only way to stop it then is to use the Task Manager and kill it. The code was written for an older version of LabVIEW, but now I am using LabVIEW 9. Everything else seems updated, but there is a section that uses the Write Characters To File VI and that may be causing the problem. I made a few futile attempts to change it. I would highly appreciate it if someone could take a look at it and tell me what's going wrong.
    Attachments:
    JANUS 2.2_4K Probe edit (2).vi ‏60 KB

    I will second aeastet's advice - please look into how state machines and producer/consumer loops work and use them.  Your program is very inefficient, but is very similar to what I would have written before I learned about state machines and producer/consumer loops.  Start with the LabVIEW help and go from there.  These forums and the National Instruments website can give you lots of help.
    Two things that will help you for this particular problem:
    At every loop iteration, you are opening the file, seeking to the end of it, appending data, then closing the file.  This is very slow and the code you use, as mentioned above, will not work if the file size exceeds 2GBytes.  I would recommend you open the file once, then use the write primitive to write to it until you finish, then close it.  You do not need the write character to file VI.  No seeking.  No repetitive opening and closing.  You can either open and close outside the loop, or use case structures and boolean flags (as you have done for other things) to open and close inside the loop.
    After you write to the file, if you choose to graph, you are reopening the file, reading the entire thing, and plotting this data.  This is another major slowdown that will only get worse as your file gets bigger.  You would be far better off caching the data in a shift register and plotting it on demand.  It would probably take less memory, as well.  You may want to read the tutorial Managing Large Data Sets in LabVIEW.
    One last tip.  You use Value properties to read and set the values of front panel controls.  Local variables are far faster (about three orders of magnitude).  However, do not make the mistake of using local variables for data storage.  Data is wires.  Local variables are a way to communicate to the front panel.  You seem to have this down, but a reminder to others reading this thread is in order.
    Let us know if you need more explanation or help.  Good luck!
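    To illustrate the producer/consumer idea outside of LabVIEW, here is a minimal C sketch of the pattern: one thread acquires fixed-size blocks and queues them, while a second thread opens the file once, drains the queue to disk, and closes the file at the end. The block size, queue depth, and the acquire_block() stand-in are assumptions made for the sketch, not values from your application.

        #include <pthread.h>
        #include <stdio.h>
        #include <string.h>

        #define BLOCK_SAMPLES 100000   /* samples per block (illustrative) */
        #define QUEUE_DEPTH   16       /* blocks of backlog allowed        */
        #define NUM_BLOCKS    100      /* stop after this many blocks      */

        static double queue_buf[QUEUE_DEPTH][BLOCK_SAMPLES];
        static int q_head = 0, q_tail = 0, q_count = 0, done = 0;
        static pthread_mutex_t q_lock      = PTHREAD_MUTEX_INITIALIZER;
        static pthread_cond_t  q_not_empty = PTHREAD_COND_INITIALIZER;
        static pthread_cond_t  q_not_full  = PTHREAD_COND_INITIALIZER;

        /* Stand-in for the instrument read; real code would fetch from hardware. */
        static void acquire_block(double *dst)
        {
            memset(dst, 0, BLOCK_SAMPLES * sizeof(double));
        }

        static void *producer(void *arg)
        {
            int i;
            for (i = 0; i < NUM_BLOCKS; i++) {
                pthread_mutex_lock(&q_lock);
                while (q_count == QUEUE_DEPTH)            /* queue full: wait */
                    pthread_cond_wait(&q_not_full, &q_lock);
                acquire_block(queue_buf[q_tail]);
                q_tail = (q_tail + 1) % QUEUE_DEPTH;
                q_count++;
                pthread_cond_signal(&q_not_empty);
                pthread_mutex_unlock(&q_lock);
            }
            pthread_mutex_lock(&q_lock);
            done = 1;
            pthread_cond_signal(&q_not_empty);
            pthread_mutex_unlock(&q_lock);
            return NULL;
        }

        static void *writer(void *arg)
        {
            FILE *fp = fopen((const char *)arg, "wb");    /* open the file once */
            if (fp == NULL)
                return NULL;
            for (;;) {
                pthread_mutex_lock(&q_lock);
                while (q_count == 0 && !done)
                    pthread_cond_wait(&q_not_empty, &q_lock);
                if (q_count == 0 && done) {               /* drained and finished */
                    pthread_mutex_unlock(&q_lock);
                    break;
                }
                int idx = q_head;
                pthread_mutex_unlock(&q_lock);

                /* The disk write happens outside the lock so it never stalls the producer. */
                fwrite(queue_buf[idx], sizeof(double), BLOCK_SAMPLES, fp);

                pthread_mutex_lock(&q_lock);
                q_head = (q_head + 1) % QUEUE_DEPTH;      /* release the slot only after the write */
                q_count--;
                pthread_cond_signal(&q_not_full);
                pthread_mutex_unlock(&q_lock);
            }
            fclose(fp);                                   /* close the file once, at the end */
            return NULL;
        }

        int main(void)
        {
            pthread_t p, w;
            pthread_create(&w, NULL, writer, "stream.bin");
            pthread_create(&p, NULL, producer, NULL);
            pthread_join(p, NULL);
            pthread_join(w, NULL);
            return 0;
        }

    In LabVIEW, the built-in queue functions and a separate consumer loop give you the same decoupling with far less code.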
    This account is no longer active. Contact ShadesOfGray for current posts and information.

  • Error writing metadata to file

    I am getting constant error messages from Bridge CS4 when trying to add keywords to my images: "Error Writing Metadata to file "XXXX"". It's very annoying and is destroying my workflow. I have tried resetting the original preferences and purging the cache. Finally I reorganized my entire folder structure, which seemed to make the problem less frequent.
    The circumstance that makes this error pop up most frequently is when I add keywords to a handful of files that already have another keyword assigned. Leaving the folder and then re-entering it solves the problem for the photos with the current error, but then 20 or so photos later I get the error message again.
    Any Fixes?

    A new suspicion was that the read/write permissions might be different for the image files themselves than for the folder holding them, so I applied the read/write permission to the image files directly. No luck, though; I am still getting "Error writing metadata to file xxxx" errors. I would love to be spending all this time editing photos instead of chasing down bugs. Brand new MacBook Pro, and I'm spending more time trying to fix errors than using the software. ::sigh::

  • \n is not working by the time writing text into file ...

    Hi,
    I am reading each line from a file and writing it into another file ...
    The output is written continuously in the file even though I use \n for a new line.
    Why is "\n" not working when writing text into the file?
    Here is my code:
    import java.io.*;
    import java.net.*;
    import java.util.*;
    public class Test11 {
        private BufferedReader data;
        private String line = null;
        private StringBuffer buf = new StringBuffer();
        private BufferedWriter thewriter = null;
        private File file = null;
        private FileReader fr = null;
        private String inputLocation = "c:\\test14.txt";

        public Test11() { }

        public void disp() {
            try {
                file = new File(inputLocation);
                fr = new FileReader(file);
                data = new BufferedReader(fr);
                // Append each input line plus "\n" to the buffer.
                while ((line = data.readLine()) != null) {
                    buf.append(line + "\n");
                }
                String text = buf.toString();
                thewriter = new BufferedWriter(new FileWriter("c:\\test15.txt"));
                thewriter.write(text);
                buf = null;
                thewriter.close();
            } catch (IOException e) {
                System.out.println("error ===" + e);
            }
        }

        public static void main(String[] args) {
            Test11 t = new Test11();
            t.disp();
            System.out.println("all files are converted...");
        }
    }
    I used "\n" after reading each line .. i want output file also same as input file ... how do i break each line by the time writing into text file .. "\n" is working in word pad but now working in notepad ... in note pad i am getting some thing like rectangle insted of "\n" ....
    Any help please .....
    thanks.

    \n works just fine, and every text editor in the world except Notepad understands that it is supposed to be a line-ending character. You don't have a problem, except in your choice of text editor. If somebody is forcing you to use Notepad then you'll have to output "\r\n" instead.
