Process many files concurrently

My program removes footnotes from a file read in, where the footnotes have the form [...].
Currently it just reads in one file and prints the output to an output file.
But I want it to remove footnotes from about 40 files that I have.
How can I modify the program so that it reads in all of the files the user wants the footnotes removed from, and prints all of the resulting files to new files (i.e. without having to process each file individually)?
import java.io.*;
import java.util.regex.*;

/** This program reads in a text file and removes [...] footnotes from it,
  * even when there are punctuation characters between the brackets.
  * @author
  */
public class Footnotes2
{
    public static void main(String[] args)
    {
        try
        {
            // Set up input character stream (Input is the console-input helper class used here)
            InputStream in = new FileInputStream(Input.readString("Input Filename:"));
            BufferedReader r = new BufferedReader(new InputStreamReader(in, "8859_1"));

            // Open output stream
            Writer out = new BufferedWriter(new OutputStreamWriter(
                    new FileOutputStream(Input.readString("outfilename:")), "8859_1"));

            // Compile the footnote pattern once, before the loop
            String regexp = "(\\[[\\w\\s\\w\\.\\,\\'\\`\\!\\?\\(\\)\\;\\:]*\\])+";
            Pattern patt = Pattern.compile(regexp);

            String line;
            while ((line = r.readLine()) != null)
            {
                // Strip every [...] footnote from the current line
                String text = patt.matcher(line).replaceAll("");
                out.write(text);
                out.write("\n");
            }

            out.flush();
            out.close();
            r.close();
        }
        catch (IOException e)
        {
            System.out.println("IOException caught");
            System.out.println(e.getMessage());
        }
    }
}

Step one) Make the "remove" bit a method that takes a file name for the file you wish to edit.
Step two) Using the methods in java.io.File, create a list of all the files in a folder (or use the command line args, or read from standard in, up to you).
Step three) Call the method created earlier for each of the files returned above (see the sketch below).
Now, your subject says "concurrently", which suggests threading, but the question body does not suggest that threading is required.
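
A minimal sketch of those three steps, assuming a hypothetical removeFootnotes method built from the code above and a folder name passed on the command line (class and output naming are made up for illustration):

import java.io.*;
import java.util.regex.*;

public class FootnotesBatch
{
    // Step one: the "remove" logic as a method taking input and output files.
    static void removeFootnotes(File inFile, File outFile) throws IOException
    {
        Pattern patt = Pattern.compile("(\\[[\\w\\s\\w\\.\\,\\'\\`\\!\\?\\(\\)\\;\\:]*\\])+");
        BufferedReader r = new BufferedReader(
                new InputStreamReader(new FileInputStream(inFile), "8859_1"));
        Writer out = new BufferedWriter(
                new OutputStreamWriter(new FileOutputStream(outFile), "8859_1"));
        String line;
        while ((line = r.readLine()) != null)
        {
            out.write(patt.matcher(line).replaceAll(""));
            out.write("\n");
        }
        out.close();
        r.close();
    }

    // Steps two and three: list the files in the folder and process each one.
    public static void main(String[] args) throws IOException
    {
        File folder = new File(args[0]);
        for (File f : folder.listFiles())
        {
            if (f.isFile())
            {
                // Write each result next to the original, e.g. notes.txt -> notes.txt.out
                removeFootnotes(f, new File(f.getPath() + ".out"));
            }
        }
    }
}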

Similar Messages

  • How to Process flat File in Oracle Apps through Concurrent Program

    Hello Everyone,
    My client has a request to process a bank file (Lockbox), which is a flat file that will be copied onto a UNIX box; I will have to create a new concurrent request that processes this flat file and updates receipt information in the Oracle Apps database tables.
    Could you please suggest whether there are any standard Oracle Apps functions (for example, FND) available that can be used through a concurrent program to open a file in a particular directory, read from the flat file, and close it after processing.
    Please let me know, if you have a small example, that would help me a lot.
    Thanks

    There are base concurrent programs in Accts Receivable that do consume lockbox flat files. Please see the AR Setup/User Guides at
    http://download.oracle.com/docs/cd/B40089_10/current/html/docset.html
    Srini

  • Batch processing saves files in wrong folder.

    I ran into a problem the other day, and I believe an item should be added to the Process Multiple Files window, or perhaps it's a bug that should be squashed.
    I have a folder with many subfolders in it, all full of jpegs. I wanted to use the Process Multiple Files function to convert the jpegs into png files and have the png files placed into the same folder as the jpeg it originated from.
    I have the Destination set to Same as Source, but instead of placing the files into the subfolders, all the images go directly into the parent folder. It was much more time-consuming to try to sort them out than to start over and process each folder individually. I should be able to leave the computer unattended and have the files saved into the proper folders; instead I had to stay close so I could change the Source folder every few minutes.
    I see no reason why there isn't an option to place the converted files into their original folder. Isn't that what the Same as Source setting is supposed to do?

    Sure, if you owned all of those files and were doing it strictly for yourself you could certainly write a plugin to Acrobat (in C/C++) that could do some/all of the things you asked for.

  • Infopackage-Load Many Files from Application Server and later Archive/Move

    Hi All..
    I have a doubt. I have a requirement to load many files into BI 7.0. I used the infopackage before with the option:
    Load Binary File From Application Server
    I load the information successfully, but only with one file. I need to load many files (with different names), like the list below, and I don't think it's a good idea to modify the file name (path) in the infopackage each time:
    *All of these files will be on one server that is mapped into AL11, like:
    Infopfw
    BW_LOAD_20090120.txt
    BW_LOAD_20090125.txt
    BW_LOAD_OTHER_1.txt
    ….
    Etc..
    This directory is not on the BW server, it's on another server, but I can load from this location (one file at a time).
    Could you help me with these questions:
    -     How can I use an infopackage with a routine that takes all the files, one by one, in order of creation date, and loads them into the target? Is it possible? I have some knowledge of ABAP, but I don't know exactly how to express this logic to the system.
    -     In addition, is it possible to move these files to another location, like into Infopfwarchive, just to keep a history of the files loaded?
    I saw that in the infopackage you have an option to create a routine in ABAP code. I'm a little bit confused because I don't know how to specify the full path.
    I try with:
    Infopfw
    InfopfwFile.csv
    Infopfw
    This is the ABAP code that you see automatically and that you need to modify…
    Create a routine for file name
    This routine will be called by the adapter,
    when the infopackage is executed.
              p_filename =
              p_subrc = 0.
    Thank you for your ideas or recommendations.
    Al

    Hi Reddy, thank you for your answer.
    I have some doubts about the option you explained:
    ***All the above files have dates appended at the end of the file name.
    You can load the files through the infopackage by using routines and pick the files based on the date at the end of the file name.***
    I need to ask: when you know the date of the file and the infopackage picks each file, can this work for many files? And how is it possible to control this process?
    About this option, when you mention Unix code, where is this code programmed? In the routine of the BW infopackage?
    ****Or
    Create two folders in your BW at application server level, in AL11 (ask the Basis team).
    I call them the F1 and F2 folders.
    First dump the files into F1. Assuming the file name in F1 is "BW_LOAD_20090120.txt", using Unix code you rename the file and then keep it in the same folder F1 or move it to F2.
    Then create the InfoPackage and fix the file name (i.e. the renamed one), so you don't need to change your file name at infopackage level every day, because in AL11 the file is overwritten every day.
    So I get the BW_LOAD_20090120.txt file in F1, then I rename it to BW_LOAD.txt and load it into BW; tomorrow I get BW_LOAD_20090125.txt in F1 and rename it to BW_LOAD.txt again.
    This way it will work. You need to schedule the Unix script in AL11.
    This is the way to handle the application server... I'm using the same logic.
    Thank you soo much.
    Al

  • Processing Multiple Files for more than 100 Receive Location - File Size - 25 MB each file, file type DML

    Hi Everybody
    Please suggest.
    For one of our BizTalk interface, we have around 120 receive locations.  We are just moving (*.dml) files from the source to destination without doing any processing.  
    We receive lots of files in different receive locations, and in a few cases the file size will be around 25 MB, so while moving these large files the CPU usage varies between 10% and 90+%, and the time consumed for a single huge file is around 10 to 20 minutes. This solution was already in place and was designed by the previous vendor for moving the files to their clients. Each client has 2 receive locations and they have around 60 clients. Is there any better solution for implementing this within BizTalk or outside BizTalk? Please suggest.
    I am also looking for how to control the number of files which get picked up from the BizTalk receive location. For example, if we have say 1000 files in the receive location and we want to pick only 50 files at a time (a batch of 50), is that possible? Currently it picks up all the files available in the source location, and one of the processes drops thousands of files into the source location, so we want to control the number of files getting picked up (or even control the number of KBs picked). Please guide us on how we can control the number of files.

    Hi Rajeev,
    25 MB per file, 1000 files. Certainly you got to revisit the reason for choosing BizTalk.
    “the time consuming for this single huge file is around 10 to 20 minutes”
     - This is a problem.
    You could consider other file transfer options like XCopy or RoboCopy etc. if you want to transfer to another local/shared drive. Or you can consider using SSIS, which comes with many adapters to send to the destination system depending on its transfer protocol.
    But in your case, you have some of the advantages that you get with BizTalk. For your scenario, you have many source systems (many Receive Locations); with BizTalk it's always easier to manage these configurations, you can easily enable and disable them when a need arises, and you can easily configure tracking and configure host instances based on load. So you can consider the following design for your requirement. This design would suit you well since you're not processing the message, just passing it through from source to destination:
    Use a custom pipeline component in the Receive Locations which receives the large file.
    It stores the received file to disk and creates a small XML metadata message that contains the information about where the large file is stored.
    The small XML message is then published into the message box DB instead of the large file. Let the metadata file also contain the same context properties as the received file.
    In the send port, use another custom pipeline component that processes the metadata XML file, retrieves the location on disk where the file is stored, accesses the file and sends it to the destination.
    Read the following article on this design:
    http://www.codeproject.com/Articles/180333/Transfer-Large-Files-using-BizTalk-Send-Side
    This way you don't need to publish the whole message into the message box DB, which considerably reduces the processing time and lets the host instance process more files. This way you can still get the advantages of BizTalk and still process large files.
    And regarding your question about restricting the number of files the receive location picks up: no, it's not possible.
    Regards,
    M.R.Ashwin Prabhu
    If this answers your question please mark it accordingly. If this post is helpful, please vote as helpful by clicking the upward arrow mark next to my reply.

  • Read a same file concurrently by several threads

    Can anyone show me how to open and read a same file "concurrently" by several threads?

    You'll have to be more specific. Which part do you not know how to do? Do you know how to open and read a file? Do you know how to create multiple threads? Also, are all threads reading the file from start to end, or is each thread going to read a different part of the file, to either be combined after all are done or processed independently?
    Please be more specific about what you're asking and post code showing what you've attempted so far.
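
    If each thread just needs its own independent pass over the whole file, a minimal sketch is below (an assumption on my part, since the question doesn't say which case applies; every thread opens its own stream, so nothing is shared and no synchronization is needed, and the file name is made up):

    import java.io.*;

    public class ConcurrentFileRead
    {
        public static void main(String[] args) throws InterruptedException
        {
            final String path = "data.txt";              // hypothetical file name
            Thread[] readers = new Thread[4];
            for (int i = 0; i < readers.length; i++)
            {
                final int id = i;
                readers[i] = new Thread(new Runnable()
                {
                    public void run()
                    {
                        // Each thread opens and reads the same file independently.
                        try (BufferedReader r = new BufferedReader(new FileReader(path)))
                        {
                            int count = 0;
                            while (r.readLine() != null) count++;
                            System.out.println("Thread " + id + " read " + count + " lines");
                        }
                        catch (IOException e)
                        {
                            e.printStackTrace();
                        }
                    }
                });
                readers[i].start();
            }
            for (Thread t : readers) t.join();           // wait for all readers to finish
        }
    }

    If instead each thread should read a different part of the file, you would typically give each one its own RandomAccessFile positioned at a different offset, but that depends on the format of the file.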

  • OutOfMemoryError during many file open/close

    In my application, which reads many files (one by one), I close each file when it is finished. But when the number of files gets bigger, such as 16, I get the following problem. What can I do to avoid it? Any help is greatly appreciated.
    Exception occurred during event dispatching:
    java.lang.OutOfMemoryError
    <<no stack trace available>>

    I had this problem once and I solved it...
    I made an application that needed tables, and for each table I had to read from 2 text files, query a database, and then build up the table. The problem appeared when I had too large an amount of data in the arrays in which I stored all the data to be processed. I made 2 swap files into which I write all the data that is not useful at the moment, and I empty the arrays whose contents have been written to the files.
    Another problem may be the very slow image processing in Java: if you have many images and store them in buffers, that may get you this exception even if you close the files. If you have to paint too many objects on the screen and beyond it (using scrollbars or so), even objects that are not displayed will eat up resources as if they were painted on the screen.
    Also, you might simply have a buffer or an array that you keep filling without ever emptying.
    BTW, increasing the memory heap is not advisable, because I have never seen a well-designed application, even a graphics application, that needs more than 32 MB. So try to improve the way you manage the data; it will cost you some time now but will save you the holidays.. :)
    Paul
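
    As a rough Java illustration of that advice (hypothetical names, not the poster's actual code): process each file completely before opening the next, keep only a small per-file result, and let the full contents become garbage-collectable, so memory use stays flat as the number of files grows.

    import java.io.*;
    import java.util.*;

    public class OneFileAtATime
    {
        public static void main(String[] args) throws IOException
        {
            List<Long> lineCounts = new ArrayList<Long>();
            for (String name : args)
            {
                BufferedReader r = new BufferedReader(new FileReader(name));
                try
                {
                    long lines = 0;
                    while (r.readLine() != null) lines++;
                    lineCounts.add(lines);        // keep only a small summary, not the file's contents
                }
                finally
                {
                    r.close();                    // always release the file handle
                }
            }
            System.out.println("Counted lines in " + lineCounts.size() + " files");
        }
    }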

  • Mailbfr - rsync issue too many files??

    Hey, so basically we have an IMAP server with around 350 mail accounts.
    We use mailbfr to back up the mail database and the users' email, but mailbfr is not completing successfully. The error I get in the log that is emailed to me is this:
    rsync: connection unexpectedly closed (8 bytes received so far) [sender]
    rsync error: error in rsync protocol data stream (code 12) at /SourceCache/rsync/rsync-24/rsync/io.c(359)
    mailbfr was aborted. The process was NOT completed successfully.
    I'm guessing that, because we didn't have this issue a couple of months ago, the mail store has grown too big; it's around 190 GB at the moment, so perhaps too many files for rsync to back up correctly?
    Any ideas on what we can use to back this much data up?
    Cheers for any help
    Calum

    While I can't exclude the large number of files being the issue (I have seen rsync handle larger amounts, but that doesn't mean a particular combination of size/number couldn't break rsync), I somehow have the feeling it's a different issue.
    Calum, if you need to further investigate this, drop me an e-mail or catch me online.
    That said, with this amount of users, I'd consider splitting things over at least a couple of servers.

  • CS3 Camera Raw Too Many Files Open at the Same time alert!!!

    Please help me. I keep getting this error message in Camera Raw that says there are too many files open at the same time - but I only have 100 open:( Please help - I was getting this error with CS2 and thought upgrading to CS3 would fix it and it didn't!!!

    > "10 or 100 - you can quickly go through a stack of images in ACR and make any desired changes you want. Whether making the same or similar adjustment to similar files, or making radically different adjustments to different images as appropriate".
    I've done this with far more than 100! I think my maximum is 425 raw files, invoking ACR from Bridge without Photoshop even loaded, and it worked well. (I've also done 115 JPEGs in order to crop them under extreme time constraints).
    It can be very slick. For example, if I use a ColorChecker a number of times in a shoot, it is easy to select just the set (perhaps 100 or so) that a particular ColorChecker shot applies to and set the WB for all of them.
    Furthermore, in case people don't know, you can set ratings on raw images while many of them are open in ACR. (Just click under the thumbnail). It isn't as powerful as Lightroom, but it is not to be dismissed.
    I suspect that it is possible to apply sensor-dust-healing to lots of images in the same way, and certainly it is easy to apply presets based on various selections.
    Perhaps with AMG (Adobe Media Gallery) it will be sensible to use the above capability to process 100s of raw files, then create a set of web pages for the best of them, in not much more time than it would have taken in Lightroom. I judge that Lightroom is the "proper" tool for the job (perhaps after 1.1!), but Bridge+ACR can go a long way.

  • Too many files on 10.4 Desktop - Finder won't load

    2.66 DualCore Intel Xeon + OSX 10.4.10 + 2Gb DDR2 DIMM + 230Gb ST3250820AS + OPTIARC DVD RW AD-7170A
    Despite recurrent warnings, a relative kept leaving too many files on the Desktop, up to 300 files (JPGs, downloaded archives). Last week, after attempting to move/copy them to a folder, OSX crashed, and now Finder won't load; after password screen, menu bar and Dock appear, the 'MyDay' app (from Entourage) starts bouncing, and system freezes. Sometimes it goes back to password screen, and from there again to a frozen, empty finder with dead mouse pointer. Other times the spinning clock or wheel remain spinning forever.
    Tried DiskUtility, repaired disk and permissions, reset PRAM, no dice.
    Tried to boot off the 10.4.10 DVD; it doesn't allow running OSX from the disc (I have a vague recollection of doing this once, but...)
    Started in SafeBoot mode, finder froze at same stage.
    Re-Installed OSX 10.4.10 in Archive and Install mode, same freeze at same stage.
    The only explanation for the crash, as no updates or new app installs happened, is that there are, again, too many loose files on the desktop, and as it tries to load the Finder it crashes. I'm hauling the machine over ASAP to try to access his disk via FireWire target mode from my own Intel Mac, but I worry the same thing will happen on my system, that it will balk at too many files.
    Is there a way to try to fix this without resorting to clean install (needless to say, last backup was months ago)? If I install Leopard on that disk (it comes out fine in Disk Utility) will the same thing happen as soon as it tries to load the Finder? Is there a way to boot 10.4 off a CD/DVD?
    Thanks for any info or tips!

    Welcome to the forums!
    Make your relative write this out 1000 times:
    Performance tip: Keep the Desktop clutter-free (empty, if possible)
    Mac OS X's Desktop is the de facto location for downloaded files, and for many users, in-progress works that will either be organized later or deleted altogether. The desktop can also be gluttonous, however, becoming a catch-all for files that linger indefinitely.
    Unfortunately - aside from the effect of disarray it creates - keeping dozens or hundreds of files on the Desktop can significantly degrade performance. Not necessarily because the system is sluggish with regard to rendering the icons on the desktop and storing them in memory persistently (which may be true in some cases), but more likely because keeping an excessive number of items on the Desktop can cause the windowserver process to generate reams of logfiles, which obviously draws resources away from other system tasks. Each of your icons on your desktop is stored as a window in the window server, not as an alias. The more you have stored, the more strain it puts on the window server. Check your desktop for unnecessary icons and clear them out.
    Keeping as few items as possible on the Desktop can prove a surprisingly effective performance boon. Even creating a single folder on your Desktop and placing all current and future clutter inside, then logging out and back in can provide an immediately noticeable speed boost, particularly for the Finder.
    And it is why Apple invented 'Stacks' for Leopard.

  • Word cannot complete the operation because too many files are open

    Hello,
    I'm working on a 60 pg. document, and when I hit 'save', I get this annoying msg.:
    "word cannot complete the operation because too many files are open." I have no other files open, and I am unable to continue to work, because I cannot save - can anyone help? Thanks!

    Hello Kristina,
    Have you been able to save the file at all?
    More than anything, it seems to be a bug when you're trying to save to a server. There's more on the problem here. So what you can also try is simply saving the file to your desktop, or somewhere on your Mac's hard drive. That way you're saving the file locally instead of to a server. If that works, you've at least saved the file and can continue on. Otherwise, read on.
    If you can't do it from Word, then at least save your changes from another application. Open TextEdit. You should get a blank file on the desktop. Make sure it's set as a rich text file. You'll know just by looking at it: if you see a ruler and other buttons at the top of the TextEdit document window, it's rich text. If not, make it rich text by pressing Command-Shift-T.
    Copy and paste your entire Word text into the TextEdit window. Save the TextEdit document to the desktop. Now that you've at least gotten your work saved, you can try some other things.
    If you're not trying to work on the file from, or save it to, a remote server, then don't worry about the first workaround; it doesn't apply. Number two was poorly worded by John. He seems to assume everyone knows what RH is. I had to look it up: it means Remote Home folder. So open the Word preferences and click on File Locations. Highlight the first choice, Documents, and click on the Modify button. Choose any location you want on your Mac's hard drive. Click "Choose" and close the preferences. Can you now save the document to the folder you defined as the default save location?
    Spotlight is that little magnifying glass in the upper right corner of your screen by the time/date. There are a few methods to disable Spotlight. What John is saying is that if the Mac is spending a lot of time indexing your drives at the same time you're trying to save your Word document, the intense processing time to index may be keeping Word from saving.
    If you've managed to at least save your file as a rich text TextEdit document, then shut all of your applications down and restart your Mac. Launch Word and open the TextEdit file. You may lose some formatting if you've used tables or other things TextEdit can't save in Word's original form.

  • ERROR: Failed to process XAP file - Windows Phone 8 Cordova project

    I am having trouble running the Store Test Kit on my Cordova Windows Phone 8 app, using Visual Studio 2013. I can deploy to the device fine, but when it comes to submitting I get the following error:
    ERROR - Failed to process XAP file: CordovaAppProj_Release_AnyCPU.xap
    I have tried rebuilding the project multiple times and closing and reopening the test kit and Visual Studio, but no luck. Looking around the internet, similar problems were occurring but nothing worked.
    I have tried renaming the XAP file created, and even changing its extension to .zip to look for incorrect files that may be causing problems, but nothing.
    After rebuilding my project I don't receive errors of any kind, so I am totally stumped!

    Thanks for your kind reply,
    but the result is the same:
    PM>  Install-Package System.Spatial -Version 5.6.0
    Installing 'System.Spatial 5.6.0'.
    You are downloading System.Spatial from Microsoft Corporation, the license agreement to which is available at http://go.microsoft.com/?linkid=9809688. Check the package for additional dependencies, which may come with their own license agreement(s). Your use
    of the package and dependencies constitutes your acceptance of their license agreements. If you do not accept the license agreement(s), then delete the relevant components from your device.
    Successfully installed 'System.Spatial 5.6.0'.
    Adding 'System.Spatial 5.6.0' to Appify.
    Uninstalling 'System.Spatial 5.6.0'.
    Successfully uninstalled 'System.Spatial 5.6.0'.
    Install failed. Rolling back...
    Install-Package : Could not install package 'System.Spatial 5.6.0'. You are trying to install this package into a project that targets 'WindowsPhoneApp,Version=v8.1', but the
    package does not contain any assembly references or content files that are compatible with that framework. For more information, contact the package author.
    At line:1 char:2
    +  Install-Package System.Spatial -Version 5.6.0
    +  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : NotSpecified: (:) [Install-Package], InvalidOperationException
        + FullyQualifiedErrorId : NuGetCmdletUnhandledException,NuGet.PowerShell.Commands.InstallPackageCommand
    It's not only this reference; many other references are also not yet ready or do not support Windows Phone 8.1 RT apps.
    I am converting a Windows Phone 8 app to Windows Phone 8.1 RT, and for that I need the WindowsAzure Storage reference.

  • Too Many Files Opened

    All,
    I am running on Fedora Core 10 (2.6.27.21-170.2.56.fc10.x86_64) with Java 1.6.0_12-b04 (32-bit and 64-bit get the same issue) and am getting a "java.io.FileNotFoundException: ... (Too many open files)" error when running my unit tests. I have been very careful about placing things in a try { } finally { stream.close(); } block and can't figure out what is actually going wrong.
    The unit tests actually work fine on Windows XP, which points me to either Java or Linux itself.
    After doing some research online I found some articles and postings:
    - http://lj4newbies.blogspot.com/2007/04/too-many-open-files.html
    - http://www.mail-archive.com/[email protected]/msg15750.html
    - http://www.linuxforums.org/forum/redhat-fedora-linux-help/73419-we-facing-too-many-open-files-regularly.html
    However, is there anything else I can do within Java itself (a VM argument, code "best practices" that I may be missing, etc.) to combat this?
    My Linux core configuration as based on the articles are as follows:
    $ cat /proc/sys/fs/file-max
    564721
    $ ulimit -a
    core file size          (blocks, -c) 0
    data seg size           (kbytes, -d) unlimited
    scheduling priority             (-e) 0
    file size               (blocks, -f) unlimited
    pending signals                 (-i) 55296
    max locked memory       (kbytes, -l) 32
    max memory size         (kbytes, -m) unlimited
    open files                      (-n) 1024
    pipe size            (512 bytes, -p) 8
    POSIX message queues     (bytes, -q) 819200
    real-time priority              (-r) 0
    stack size              (kbytes, -s) 10240
    cpu time               (seconds, -t) unlimited
    max user processes              (-u) 1024
    virtual memory          (kbytes, -v) unlimited
    file locks                      (-x) unlimited

    tjacobs01 wrote:
    "The unit tests actually work fine on Windows XP, which points me to either Java or Linux itself."
    OR another app on your Linux machine has a lot of files open,
    OR you installed Java as yourself rather than as a system install and you don't have privileges to open the files,
    OR your code has a bug somewhere that doesn't affect running it on Windows.
    99.999% of the time, the problem is with you, not with Java or Linux.
    This actually seems to be it. I rebooted the system into console mode to avoid X loading. I then reset the JDK to be set up by root (I use the self-extracting, non-RPM version). When I ran the test suite via Ant there, it did not show the problem. Therefore, I'm fairly convinced it is actually some other application eating up too many files.
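
    On the "code best practices" part of the question: if a Java 7+ compiler is an option, try-with-resources closes every declared stream even when an exception is thrown part-way through, which removes a common way descriptors leak from hand-written finally blocks. A minimal sketch (a generic copy example, not the poster's tests):

    import java.io.*;

    public class CopyWithResources
    {
        // Both streams are closed automatically, in reverse order, even if
        // a read/write throws or the second constructor fails.
        static void copy(File src, File dst) throws IOException
        {
            try (InputStream in = new FileInputStream(src);
                 OutputStream out = new FileOutputStream(dst))
            {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1)
                {
                    out.write(buf, 0, n);
                }
            }
        }

        public static void main(String[] args) throws IOException
        {
            copy(new File(args[0]), new File(args[1]));
        }
    }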

  • Genunix: basic rctl process.max-file-descriptor (value 256) exceeded

    Hi,
    I am getting the following error on my console rapidly.
    I am using a Sun SPARC server running Solaris 10. We started getting this error suddenly after a restart of the server, and the error is continuously rolling on the console...
    The Error:
    Rebooting with command: boot
    Boot device: disk0 File and args:
    SunOS Release 5.10 Version Generic_118822-25 64-bit
    Copyright 1983-2005 Sun Microsystems, Inc. All rights reserved.
    Use is subject to license terms.
    Hardware watchdog enabled
    Failed to send email alert for recent event.
    SC Alert: Failed to send email alert for recent event.
    Hostname: nitwebsun01
    NOTICE: VxVM vxdmp V-5-0-34 added disk array DISKS, datype = Disk
    NOTICE: VxVM vxdmp V-5-3-1700 dmpnode 287/0x0 has migrated from enclosure FAKE_ENCLR_SNO to enclosure DISKS
    checking ufs filesystems
    /dev/rdsk/c1t0d0s4: is logging.
    /dev/rdsk/c1t0d0s7: is logging.
    nitwebsun01 console login: Nov 20 14:56:41 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 439
    Nov 20 14:56:41 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 414
    Nov 20 14:56:41 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 413
    Nov 20 14:56:41 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 414
    Nov 20 14:56:41 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 413
    Nov 20 14:56:41 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 121
    Nov 20 14:56:41 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 414
    Nov 20 14:56:41 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 413
    Nov 20 14:56:41 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 121
    Nov 20 14:56:41 nitwebsun01 last message repeated 1 time
    Nov 20 14:56:43 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 470
    Nov 20 14:56:43 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 467
    Nov 20 14:56:44 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 470
    Nov 20 14:56:44 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 121
    Nov 20 14:56:44 nitwebsun01 last message repeated 1 time
    Nov 20 14:56:49 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 503
    Nov 20 14:56:50 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 510
    Nov 20 14:56:50 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 121
    Nov 20 14:56:50 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 519
    Nov 20 14:56:50 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 516
    Nov 20 14:56:50 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 519
    Nov 20 14:56:53 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 540
    Nov 20 14:56:53 nitwebsun01 last message repeated 2 times
    Nov 20 14:56:53 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 549
    Nov 20 14:56:53 nitwebsun01 last message repeated 4 times
    Nov 20 14:56:56 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 665
    Nov 20 14:56:56 nitwebsun01 last message repeated 6 times
    Nov 20 14:56:56 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 667
    Nov 20 14:56:56 nitwebsun01 last message repeated 2 times
    Nov 20 14:56:56 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 121
    Nov 20 14:56:57 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 868
    Nov 20 14:56:57 nitwebsun01 /usr/lib/snmp/snmpdx: unable to get my IP address: gethostbyname(nitwebsun01) failed [h_errno: host not found(1)]
    Nov 20 14:56:58 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 887
    Nov 20 14:57:00 nitwebsun01 genunix: basic rctl process.max-file-descriptor (value 256) exceeded by process 976
    nitwebsun01 console login: root
    Nov 20 14:57:00 nitwebsun01 last message repeated 2 times
    Here I attached my /etc/project file also..
    [root@nitwebsun01 /]$ cat /etc/project
    system:0::::
    user.root:1::::
    process.max-file-descriptor=(privileged,1024,deny);
    process.max-sem-ops=(privileged,512,deny);
    process.max-sem-nsems=(privileged,512,deny);
    project.max-sem-ids=(privileged,1024,deny);
    project.max-shm-ids=(privileged,1024,deny);
    project.max-shm-memory=(privileged,4294967296,deny)
    noproject:2::::
    default:3::::
    process.max-file-descriptor=(privileged,1024,deny);
    process.max-sem-ops=(privileged,512,deny);
    process.max-sem-nsems=(privileged,512,deny);
    project.max-sem-ids=(privileged,1024,deny);
    project.max-shm-ids=(privileged,1024,deny);
    project.max-shm-memory=(privileged,4294967296,deny)
    group.staff:10::::
    [root@nitwebsun01 /]$
    Please help me to come out of this issue.
    Regards
    Suseendran .A

    This is an old post but I'm going to reply to it for future reference of others.
    Please ignore the first reply to this thread... by default /etc/rctladm.conf doesn't exist, and you should never use it. Just put it out of your mind.
    So, then... by default, a process can have no more than 256 file descriptors open at any given time. The likelihood that you'll have a program using more than 256 files is very low... but each network socket counts as a file descriptor, therefore many network services will exceed this limit quickly. The 256 limit is stupid, but it is a standard, and as such Solaris adheres to it. To look at the open file descriptors of a given process use "pfiles <pid>".
    So, to change it you have several options:
    1) You can tune the default threshold on the number of descriptors by specifying a new default threshold in /etc/system:
    set rlim_fd_cur=1024
    2) On the shell you can view your limit using 'ulimit -n' (use 'ulimit' to see all your limit thresholds). You can set it higher for this session by supplying a value, example: 'ulimit -n 1024', then start your program. You might also put this command in a startup script before starting your program.
    3) The "right" way to do this is to use a Solaris RCTL (resource control) defined in /etc/project. Say you want to give the "oracle" user 8152 fd's... you can add the following to /etc/project:
    user.oracle:101::::process.max-file-descriptor=(priv,8152,deny)
    Now log out the Oracle user, then log back in and startup.
    You can view the limit on a process like so:
    prctl -n process.max-file-descriptor -i process <pid>
    In that output, you may see 3 lines, one for "basic", one for "privileged" and one for "system". System is the max possible. Privileged is the limit which you need special privileges to raise. Basic is the limit that you as any user can increase yourself (such as by using 'ulimit' as we did above). If you define a custom "privileged" RCTL like we did above in /etc/project, it will dump the "basic" priv, which is, by default, 256.
    For reference, if you need to increase the threshold of a daemon that you can not restart, you can do this "hot" by using the 'prctl' program like so:
    prctl -t basic -n process.max-file-descriptor -x -i process <PID>
    The above just dumps the "basic" resource control (limit) from the running process. Do that, then check it a minute later with 'pfiles' to see that its now using more FD's.
    Enjoy.
    benr.

  • After running Magician app many files where deleted and after restarting I received the language portion of when you first start to set up a new Mac

    After running the Magician app, many files were deleted from my computer, nothing new. After restarting I got the language portion of setting up a new Mac, except after the language portion it went immediately to 5 options: install backup from Time Machine, Disk Utility, restart from startup disc, etc. I tried them all. Disc repair, rebooting from TM, but I don't save everything to TM; I use an external HD for backup. None of those options are working for me. Now I get a grey screen with a flashing globe for many minutes, then a folder appears with a ? in the middle of it. What am I to do? Can anyone help me, please!!!

    I appreciate the extra forums you suggested I look into. I ended up forcing a Safe Mode reboot on my Mac by holding the Shift key. I let the installation process commence and I'm back to the way things were.
