Too many sound files... AS3 no worky
I've created this little program to test with. It works perfectly for 5 or fewer sound files/for loops. As soon as I do more than that, the sounds stop working. Everything else is fine... just no sound. My plan is to have the xml file determine how many movie clips with attached sound files to create. I'm planning on having tons of movie clips and sound files once this is all finished. No luck so far.
Is there a limit to how many sound files a swf can reference? Any other ideas why the sounds aren't working for me?
//load xml
var xmlLoader:URLLoader = new URLLoader();
var xmlData:XML = new XML();
var xmlPath:String = "data.xml";
xmlLoader.load(new URLRequest(xmlPath));
trace("loading xml from: " + xmlPath);
xmlLoader.addEventListener(Event.COMPLETE, LoadXML);

function LoadXML(e:Event):void {
    trace("xml loading complete");
    xmlData = new XML(e.target.data);
    //load objects
    buildScroller();

    function buildScroller():void {
        var tl:MovieClip = this;
        for (var i:Number = 0; i < 5; i++) {
            //create movie clips
            var container_mc:container = new container();
            addChild(container_mc);
            container_mc.x = 325 * i; // set the number here to the width of the movie clip being repeated
            //sound info
            tl["snd" + i] = new Sound();
            tl["snd" + i].load(new URLRequest(xmlData.sound.loc[i]));
            container_mc.snd = tl["snd" + i];
            container_mc.addEventListener(MouseEvent.CLICK, playSound, false, 0, true);

            function playSound(event:MouseEvent):void {
                event.currentTarget.snd.play();
            }
        }
    }
}
Here is the XML file that I'm using:
<DATA>
<object_count>20</object_count>
<sound>
<loc>fart.mp3</loc>
<loc>burp.mp3</loc>
<loc>creak128.mp3</loc>
<loc>GateOpen.mp3</loc>
<loc>bang.mp3</loc>
<loc>pop.mp3</loc>
<loc>slap.mp3</loc>
</sound>
</DATA>
Dmennenoh, pulling the playSound function out of the other functions worked perfectly! I should have known better. Even though everything is working correctly, I am now receiving an error stating that "Parameter url must be non-null." I'm certain this has to do with the sound file URLs that I am passing from the xml document... but if the sounds are playing, the URLs must be non-null already.
I tried changing the tl["snd"+i] to container_mc["snd"+i] as you suggested, but the sounds would not play at that point. I receive an error saying: "a term is undefined and has no properties." Below is how I built it:
function buildScroller():void {
    for (var i:Number = 0; i < 55; i++) {
        //create movie clips
        var container_mc:container = new container();
        addChild(container_mc);
        container_mc.x = 325 * i; // set the number here to the width of the movie clip being repeated
        //sound info
        container_mc["snd" + i] = new Sound();
        container_mc["snd" + i].load(new URLRequest(xmlData.sound.loc[i]));
        container_mc.snd = container_mc["snd" + i];
        container_mc.addEventListener(MouseEvent.CLICK, playSound, false, 0, true);
    }
}

function playSound(event:MouseEvent):void {
    event.currentTarget.container_mc.play();
}
Similar Messages
-
What does "Too many open files" have to do with FIFOs?
Hi folks.
I've just finished a middleware service for my company, that receives files via a TCP/IP connection and stores them into some cache-directory. An external program gets called, consumes the files from the cache directory and puts a result-file there, which itself gets sent back to the client over TCP/IP.
After that's done, the cache file (and everything leftover) gets deleted.
The middleware-server is multithreaded and creates a new thread for each request connection.
These threads are supposed to die when the request is done.
All works fine, cache files get deleted, threads die when they should, the files get consumed by the external program as expected and so on.
BUT (there's always a butt;) to migrate from an older solution, the old data gets fed into the new system, creating about 5 to 8 requests a second.
After a time of about 20-30 minutes, the service drops out with "IOException: Too many open files" on the very line where the external program gets called.
I swept through my code, seeking to close even the most unlikely stream that gets opened (even the output streams of the external process ;) but the problem stays.
Things I thought about:
- It's the external program: unlikely, since the lsof command (shows the "list of open files" on Linux) says that the open files belong to java processes. Having a closer look at the list, I see a large number of "FIFO" entries that keeps growing, plus an (almost) constant number of "normal" open file handles.
So perhaps the handles get opened (and not closed) somewhere else, and the external program is just the drop that makes the cask flood over.
- Must be a file handle that's not closed: I find only the "FIFO" entries to grow. Yet I don't really know what that means. I just think it's something different than a "normal" file handle, but maybe I'm wrong.
- Must be a socket connection that's not closed: at least the client that sends requests to the middleware service closes the connection properly, and I am, well, quite sure that my code does it as well, but who knows? How can I be sure?
That was a long description, most of which will be skipped by you. To boil it down to some questions:
1.) What do the "FIFO" entries of the lsof-command under Linux really mean ?
2.) How can I make damn sure that every socket, stream, filehandle etc. pp. is closed when the worker thread dies?
Answers will be thanked a lot.
Tom
Thanks for the quick replies.
@BIJ001:
"ls -l /proc/<PID>/fd"
Gives the same information as lsof does, namely a slowly but steadily growing number of pipes.
"fuser"
Doesn't output anything at all.
"Do you make exec calls? Are you really sure stdout and stderr are consumed/closed?"
Well, the external program is called by
Process p = Runtime.getRuntime().exec(commandLine);
and the stdout and stderr are consumed by two classes that subclass Thread (named showOutput) and do nothing but prepend the corresponding outputs with "OUT:" and "ERR:" and put them into a log.
Are they closed? I hope so: I call showOutput's halt method, which should eventually close the handles.
@sjasja:
"Sounds like a pipe."
Thought so, too ;)
"Do you have the waitFor() in there?"
Mentioning the waitFor():
my code looks more like:
try {
    p = Runtime.getRuntime().exec(...);
    outShow = new showOutput(p.getInputStream(), "OUT");
    outShow.start();
    errShow = new showOutput(p.getErrorStream(), "ERR");
    errShow.start();
    p.waitFor();
} catch (InterruptedException e) {
    // can't wait for process? better go to sleep some.
    log.info("Can't wait for process! Going to sleep 10sec.");
    try { Thread.sleep(10000); } catch (InterruptedException ignoreMe) {}
} finally {
    if (outShow != null) outShow.halt();
    if (errShow != null) errShow.halt();
}

/** Within the class showOutput, this method gets called by showOutput's halt: */
public void notifyOfHalt() {
    log.debug("Registered a notification to halt");
    try {
        myReader.close(); // is initialized to read from the given InputStream
    } catch (IOException ignoreMe) {}
}

Seems as if both of you are quite sure that the pipes are actually created by the exec command and not closed afterwards.
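For comparison, a self-contained sketch of a drainer thread that always closes its stream in a finally block, so each exec leaves no pipe descriptor behind. This is not the poster's actual showOutput class; the class and variable names are illustrative, and it assumes a Unix-like system with echo on the PATH:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

// Drains a process stream on its own thread and always closes it,
// so the pipe's file descriptor is released even if reading fails.
class StreamDrainer extends Thread {
    private final InputStream in;
    private final String tag;

    StreamDrainer(InputStream in, String tag) {
        this.in = in;
        this.tag = tag;
    }

    public void run() {
        BufferedReader reader = new BufferedReader(new InputStreamReader(in));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(tag + ": " + line); // stand-in for the poster's logging
            }
        } catch (IOException ignored) {
        } finally {
            try { reader.close(); } catch (IOException ignored) {}
        }
    }
}

public class ExecDemo {
    public static void main(String[] args) throws Exception {
        Process p = Runtime.getRuntime().exec(new String[] {"echo", "hello"});
        Thread out = new StreamDrainer(p.getInputStream(), "OUT");
        Thread err = new StreamDrainer(p.getErrorStream(), "ERR");
        out.start();
        err.start();
        int exit = p.waitFor();
        out.join(); // both pipes hit EOF and were closed by the drainers
        err.join();
        p.getOutputStream().close(); // the third pipe: the child's stdin
        System.out.println("exit=" + exit);
    }
}
```

Because the reader thread closes the stream itself when it hits EOF, no separate halt()/notifyOfHalt() call is needed, and a forgotten halt() can no longer leak the pipe.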
Would you deem it unlikely that most of the handles are opened somewhere else and the exec command is just the final one that crashes the prog?
That's what I thought.
Thanks for your time
Tom -
Java.io.IOException: Too many open files while deploying in soa 11g
hi all,
I am getting a strange error while deploying any composite. It's a hello-world kind of composite, but while I am trying to deploy it I get "java.io.IOException: Too many open files" during deployment. I have tried to deploy it in 2-3 ways, but all of them resulted in the same error. Bouncing the SOA server might be an option, but can someone give an insight into why this is happening, and can it be resolved without restarting the server?
Thanks
Yes, so this problem is with Unix only, because I previously worked on Windows and never got this problem.
-
STARTING DATABASE : PROBLEM OF Linux Error: 23: Too many open files in syst
Hi everybody,
I am running an RMAN script and get this error,
9> @/u01/app/oracle/admin/devpose/backup/configuration.rcv
RMAN> ###################################################################
2> # Configuration file used to set Rman policies.
3> #
4> ###################################################################
5>
6> CONFIGURE DEFAULT DEVICE TYPE TO DISK;
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03002: failure of configure command at 08/26/2009 20:03:30
RMAN-06403: could not obtain a fully authorized session
ORA-01034: ORACLE not available
RMAN> CONFIGURE RETENTION POLICY TO REDUNDANCY 1;
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03002: failure of configure command at 08/26/2009 20:03:30
RMAN-06403: could not obtain a fully authorized session
ORA-01034: ORACLE not available
RMAN> #CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 7 DAYS;
2> CONFIGURE DEVICE TYPE DISK PARALLELISM 2;
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03002: failure of configure command at 08/26/2009 20:03:30
RMAN-06403: could not obtain a fully authorized session
ORA-01034: ORACLE not available
RMAN>
RMAN> CONFIGURE CHANNEL DEVICE TYPE DISK FORMAT '/u01/app/oracle/backup/db/ora_df%t_s%s_s%p';
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03002: failure of configure command at 08/26/2009 20:03:30
RMAN-06403: could not obtain a fully authorized session
ORA-01034: ORACLE not available
But this problem is understandable, as the database is not running. As for the main problem, why the database is not running, I have found the reason but do not understand how to solve it.
Since the database was not running, I tried to start it up, and then came across the following, which is my real problem (why are so many files open? The Linux OS error says too many files are open; see below):
SQL> conn /as sysdba
Connected to an idle instance.
SQL> startup
ORACLE instance started.
Total System Global Area 419430400 bytes
Fixed Size 779516 bytes
Variable Size 258743044 bytes
Database Buffers 159383552 bytes
Redo Buffers 524288 bytes
Database mounted.
ORA-00313: open failed for members of log group 2 of thread 1
ORA-00312: online log 2 thread 1: '/u01/app/oracle/oradata/devpose/redo02.log'
ORA-27041: unable to open file
Linux Error: 23: Too many open files in system
Has anybody run into such a problem who can guide me to a solution, please?
Thanks
Hi,
yes, this DB was functioning o.k. this configuration script was part of RMAN daily backup.
Last night the backup failed. So, when I opened "Failed job" in the EM, I saw this type of messages.
That was the starting point. Gradually, I tried to narrow down on to the actual problem and found the findings as I have posted.
One way of solving the problem, I thought, would be to kill all these processes and then try to open the database; it might start up. However, that would not ensure this won't occur again.
That's why I am trying to understand why it opens so many processes (why spawn so many .flb files?). Any thoughts you have around this?
I will try to restart the OS as the last resort.
Thanks for your help and suggestions.
Regards, -
"java.io.IOException: Too many open files" in LinuX
Hi Developers,
* I am continuously running and processing more than 2000 XML files by using SAX and DOM.....
* My process is as follows,
- Converting the XML file as Document object by DOM....
- And that DOM will be used while creating log file report, that log file will be created after executing all XML files..
* After processing approx 1000 files, it throws "java.io.IOException: Too many open files" on the Linux system....
* I have googled more and more on all sites, including the Sun forum, but they only say to increase the system limit via ulimit on Linux.... If I increase that, it executes fine without the exception........
* My question is: is it possible to fix this from the Java code itself, or with VM arguments like -Xms512m and -Xmx512m?
* Please let me know if you have any idea.....
Thanks And Regards,
JavaImran
Doh! I forgot to post my little code sample...
package forums.crap;

import java.io.*;
import java.util.*;

public class TooManyFileHandles {
    private static final int HOW_MANY = 8 * 1024;

    public static void main(String[] args) {
        List<PrintWriter> writers = new ArrayList<PrintWriter>(HOW_MANY);
        try {
            try {
                for (int i = 1; i <= HOW_MANY; i++) {
                    writers.add(new PrintWriter("file" + i + ".txt"));
                }
            } finally {
                for (PrintWriter w : writers) {
                    if (w != null) w.close();
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
... and the problem still isn't OOME ;-)
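On Java 7 and later, the hand-written finally loop in the demo above can be replaced by try-with-resources, which releases each descriptor deterministically. A minimal sketch (file names are illustrative):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class OneAtATime {
    // Copies the first line of each input file into one output file.
    // try-with-resources closes each reader before the next iteration,
    // so the process never holds more than two descriptors here,
    // no matter how many files are processed.
    public static void copyFirstLines(String[] inputs, String output) throws IOException {
        try (PrintWriter out = new PrintWriter(new FileWriter(output))) {
            for (String name : inputs) {
                try (BufferedReader in = new BufferedReader(new FileReader(name))) {
                    String first = in.readLine();
                    if (first != null) {
                        out.println(name + ": " + first);
                    }
                } // in.close() happens here, every iteration, even on exceptions
            }
        }
    }
}
```

Opening each file only for as long as it is needed is the code-level answer to descriptor exhaustion; raising ulimit only moves the ceiling.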
Cheers. Keith. -
WLS 10.3.5 on RHEL 5.4, SocketException: Too many open files
Hi
I'm running Weblogic server 10.3.5 on Red Hat Enterprise Linux Server release 5.4 (Tikanga), with Java jdk1.6.0_27.
My order handling application, when receiving client orders, needs to make outbound SOAP calls to fulfill the order. During a performance test, we got the following errors:
####<Feb 10, 2012 2:28:41 PM ICT> <Critical> <Server> <KKMOMAPP2> <KKMOMPE2> <DynamicListenThread[Default]> <<WLS Kernel>> <> <> <1328858921806> <BEA-002616> <Failed to listen on channel "Default" on 172.24.106.81:4095, failure count: 1, failing for 0 seconds, java.net.SocketException: Too many open files>
I monitored the java process of this application, when the "Too many open files" error happened, it had 1388 open file descriptors, among which 655 were sockets.
I also monitored the total open file descriptors of the weblogic user account, the count was around 6300 during this error.
These numbers are far smaller than the file limits configured on OS:
- Under weblogic account, ulimit -n shows 65536
- /proc/sys/fs/file-max shows 772591
- Following lines are already in /etc/security/limits.conf
weblogic soft nofile 65536
weblogic hard nofile 65536
weblogic soft nproc 16384
weblogic hard nproc 16384
I did another test using a simple Java program to open a large number of sockets under the weblogic account. It had no problem opening 15,000 sockets. It seems the file descriptor limit is indeed quite high, but for some reason the WebLogic process fails even when it has merely 1388 open files. Are there other Linux or WebLogic parameters I should tune? Or anything else I missed?
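As a diagnostic aid, the JVM can count its own open descriptors on Linux by listing /proc/self/fd. A sketch (assumes a Linux /proc filesystem; returns -1 elsewhere):

```java
import java.io.File;

public class FdCount {
    // On Linux, /proc/self/fd contains one symlink per descriptor the
    // current process has open; counting the entries gives the live FD count.
    public static int openDescriptors() {
        File[] fds = new File("/proc/self/fd").listFiles();
        return fds == null ? -1 : fds.length;
    }

    public static void main(String[] args) {
        System.out.println("open fds: " + openDescriptors());
    }
}
```

Logging this periodically alongside lsof output can show whether descriptors pile up gradually or in bursts around specific operations.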
Thank you very much
Ning
Hi All,
Any help on this issue ?
Thank you,
Ram -
Java.util.zip.ZipException: Too many open files on Linux
Hi,
We have a web application running on Caucho's Resin server on JDK 1.5.0_11 and Red Hat Linux. We are noticing that the java process runs out of file handles within 24-30 hours. We have a file limit of 5000, which it consumes in 24 hours, throwing 'java.util.zip.ZipException: Too many open files'.
I have made sure all sorts of file handles are closed from the application's point of view. Here is a snapshot of lsof (list of open files) from the java process. The following list keeps growing until it runs out of the limit. Do you have tips/suggestions on how to mitigate this problem (considering we don't want to increase the ulimit for this process)? Also, can you tell anything more from the description of the file handles, e.g. are they unclosed POP3 connections or URL connections to external sites?
java 7156 resin 120u IPv4 34930051 UDP localhost.localdomain:59693
java 7156 resin 121u IPv4 34927823 UDP localhost.localdomain:59663
java 7156 resin 122u IPv4 34931861 UDP localhost.localdomain:59739
java 7156 resin 123u IPv4 34932023 UDP localhost.localdomain:59745
java 7156 resin 124u IPv4 34930054 UDP localhost.localdomain:59700
java 7156 resin 125u IPv4 34927826 UDP localhost.localdomain:59665
java 7156 resin 126u IPv4 34927829 UDP localhost.localdomain:59666
java 7156 resin 127u IPv4 34930057 UDP localhost.localdomain:59703
java 7156 resin 128u IPv4 34930713 UDP localhost.localdomain:59727
java 7156 resin 129u IPv4 34930716 UDP localhost.localdomain:59730
java 7156 resin 130u IPv4 34932238 UDP localhost.localdomain:59789
java 7156 resin 131u IPv4 34932026 UDP localhost.localdomain:59749
java 7156 resin 132u IPv4 34932221 UDP localhost.localdomain:59770
java 7156 resin 133u IPv4 34932224 UDP localhost.localdomain:59775
java 7156 resin 134u IPv4 34932029 UDP localhost.localdomain:59753
java 7156 resin 135u IPv4 34932032 UDP localhost.localdomain:59754
java 7156 resin 138u IPv4 34932035 UDP localhost.localdomain:59760
java 7156 resin 139u IPv4 34932038 UDP localhost.localdomain:59763
java 7156 resin 140u IPv4 34932227 UDP localhost.localdomain:59780
java 7156 resin 141u IPv4 34932230 UDP localhost.localdomain:59781
java 7156 resin 144u IPv4 34932234 UDP localhost.localdomain:59786
java 7156 resin 146u IPv4 34932241 UDP localhost.localdomain:59792
java 7156 resin 147u IPv4 34932247 UDP localhost.localdomain:59802
Finally, we resolved this issue. It was the Oracle driver, which had a compatibility issue; we upgraded our Oracle client driver to a newer version and this fixed the problem. Bottom line: there was nothing wrong with the application code - the code was doing proper resource cleanup - but the Oracle driver was leaking handles on every connection.
-
Runtime.exec - Too Many Open Files
System version : Red Hat Enterprise Linux 2.4.21-47.ELsmp AS release 3 (Taroon Update 8)
JRE version : 1.6.0-b105
Important : the commands described below are launched from a Web application : Apache Tomcat 6.0.10
Hello,
I'm facing a problem that is already known but apparently never really solved??!! ;)
When I invoke many system commands with the 'Runtime.exec(...)' method, there are open files that are not released (I can see them with the "lsof" system command).
In the end, the unavoidable "too many open files" exception.
The launched commands are "ssh ..." commands.
In the topics relating to this problem, the solution is always to close all streams/threads and to explicitly invoke the method "Process.destroy()".
My problem is that this is exactly what I do! And I can't do more...
Here is the code :
Runtime rt = Runtime.getRuntime();
Process process = rt.exec("ssh ...");
// ProcessStreamHolder extends Thread and reads from the InputStream given in the constructor...
ProcessStreamHolder errorStream = new ProcessStreamHolder(process.getErrorStream());
ProcessStreamHolder outputStream = new ProcessStreamHolder(process.getInputStream());
errorStream.start();
outputStream.start();
exitValue = process.waitFor();
try {
    errorStream.interrupt();
} catch (RuntimeException e) {
    logger.warn("...");
}
try {
    outputStream.interrupt();
} catch (RuntimeException e) {
    logger.warn("...");
}
try {
    process.getInputStream().close();
} catch (RuntimeException e) {
    logger.warn("...");
}
try {
    process.getOutputStream().close();
} catch (RuntimeException e) {
    logger.warn("...");
}
try {
    process.getErrorStream().close();
} catch (RuntimeException e) {
    logger.warn("...");
}
process.destroy();

Does someone know if my code is wrong, or if there's a workaround for me?
Thanks by advance !
Richard.Don't interrupt those threads. Close the output stream first, then wait for the process to exit, then both threads reading the stdout and stderr of the process should get EOFs, so they should exit naturally, and incidentally close the streams themselves.
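The ordering described in that reply can be sketched as follows. This is illustrative code, not Richard's actual ProcessStreamHolder, and it assumes a Unix-like system with echo available:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ExecOrdering {
    public static void main(String[] args) throws Exception {
        Process p = Runtime.getRuntime().exec(new String[] {"echo", "done"});

        // 1. Close the child's stdin first: we have nothing to send it,
        //    and some programs wait for EOF on stdin before exiting.
        p.getOutputStream().close();

        // 2. Read stdout/stderr to EOF. When the child exits, the pipes
        //    hit EOF and the loops end naturally; no interrupt() needed.
        BufferedReader out = new BufferedReader(new InputStreamReader(p.getInputStream()));
        BufferedReader err = new BufferedReader(new InputStreamReader(p.getErrorStream()));
        String line;
        while ((line = out.readLine()) != null) System.out.println("OUT: " + line);
        while ((line = err.readLine()) != null) System.out.println("ERR: " + line);

        // 3. Now collect the exit code and close the readers,
        //    releasing the last two pipe descriptors.
        int exit = p.waitFor();
        out.close();
        err.close();
        System.out.println("exit=" + exit);
    }
}
```

For commands that write heavily to both stdout and stderr, drain the two streams on separate threads (as the original code does) to avoid a pipe-buffer deadlock; the order of operations, close stdin, read to EOF, waitFor(), then close the readers, stays the same.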
-
"Too many open files" Exception on "tapestry-framework-4.1.1.jar"
When a browser attempts to access my webwork, the server opens a certain number of file descriptors on the "tapestry-framework-4.1.1.jar" file and doesn't release them for a while.
Below is the output from "lsof | grep tapestry":
java 26735 root mem REG 253,0 62415 2425040 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-portlet-4.1.1.jar
java 26735 root mem REG 253,0 2280602 2425039 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-framework-4.1.1.jar
java 26735 root mem REG 253,0 320546 2425036 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-contrib-4.1.1.jar
java 26735 root mem REG 253,0 49564 2424979 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-annotations-4.1.1.jar
java 26735 root 28r REG 253,0 2280602 2425039 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-framework-4.1.1.jar
java 26735 root 29r REG 253,0 2280602 2425039 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-framework-4.1.1.jar
java 26735 root 30r REG 253,0 2280602 2425039 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-framework-4.1.1.jar
These unknown references are sometimes released automatically, but sometimes not.
And I get "Too many open files" exception after using my application for a few hours.
The number of the unknown references increases as I access to my webwork or just hit on "F5" key on my browser to reload it.
I tried different types of browsers to see if I could see any differences in consequence, and in fact it differed by the browser I used.
When viewed by Internet Explorer it increased by 3 for every access.
On the other hand it increased by 7 for each attempt when accessed by FireFox.
I have already tried raising the max number of file descriptors, and it solved the "Too many open files" exception.
But stil I'm wondering who actually is opening "tapestry-framework-4.1.1.jar" this many.
Could anyone figure out what is going on?
Thanks in advance.
The following is my environmental version info:
- Red Hat Enterprise Linux ES release 4 (Nahant Update 4)
- Java: 1.5.0_11
- Tomcat: 5.5.20
- Tapestry: 4.1.1
Hi,
The cause might be that the server got an exception while trying to accept client connections; it will try to back off to aid recovery.
The OS limit for the number of open file descriptors (FD limit) needs to be increased. Tune OS parameters that might help the server accept more client connections (e.g. the TCP accept backlog).
http://e-docs.bea.com/wls/docs90/messages/Server.html#BEA-002616
Regards,
Prasanna Yalam -
Intermittent too many open files error and Invalid TLV error
Post Author: jam2008
CA Forum: General
I'm writing this up in the hopes of saving someone else a couple of days of hair-pulling...
Environment: Crystal Reports XI Enterprise / also runtime via Accpac ERP 5.4
Invalid TLV error in Accpac
"too many open files" error in event.log file
Situation:
Invalid TLV error occurs seemingly randomly on report created in CR Professional 11. Several days of troubleshooting finally lead to the following diagnosis:
This error occurs in a report that contains MORE THAN 1 bitmap image.
The error only shows up after 20 or more reports have been generated sequentially, WITHOUT CLOSING the application that is calling the report - in our case the Invoice Report dialog within Accpac. This same error occurred in a custom 3rd party VB.NET app that also called the report through an Accpac API.
After getting this message you need to do 2 things:
1. delete the current workspace because it contains some bad data in one the config files - failure to delete the workspace will result the error message to appear even if trying to upload a single file.
2. add to DTR files in groups - no more than 500 in a single add. -
Actually, the stuff works in interpreted mode.
It's only when having the server partition compiled that this happen.
j-p
-----Message d'origine-----
De: Adamek, Zenon [mailto:ZAdamekpurolator.com]
Date: lundi 25 septembre 2000 17:13
À: 'Jean-Paul.Gabriellisema.fr'
Cc: Forte-userslists.xpedior.com
Objet: RE: (forte-users) [UNIX] "Too many open files" 3.0.M2 question
see Technote 10981
-----Original Message-----
From: Jean-Paul Gabrielli [SMTP:Jean-Paul.Gabriellisema.fr]
Sent: Monday, September 25, 2000 11:02 AM
To: zeForte-users
Subject: (forte-users) [UNIX] "Too many open files" 3.0.M2 question
Hi,
running a server partition that reads a configuration file,
and apparently doesn't close it afterwards, I get this exception:
SYSTEM ERROR: System Error: Too many open files, opening '....'with mode
'r'
Class: qqos_FileResourceException
1) Is there such a limit, or does this rely only on the OS one ?
2) How is this error not trapped, as I only got it interactively, whereas
my server log does an exception trap/segmentation fault?
thanks
j-p
For the archives, go to: http://lists.xpedior.com/forte-users and use
the login: forte and the password: archive. To unsubscribe,send in a new
email the word: 'Unsubscribe' to: forte-users-requestlists.xpedior.com
Hi Jean-Paul,
As described in Technote 10981, some Forte programs (Nodemanager and router) correctly handle the high-file-descriptor-use problem. It is possible that the Forte interpreter does it correctly too.
Zenon
-
Too many open files in system cause database goes down
Hello experts, I am very worried because of the following problems. I really hope you can help me.
Some server features:
OS: Suse Linux Enterprise 10
RAM: 32 GB
CPU: Intel quad-core
DB: There are 3 RAC database instances (version 11.1.0.7) on the same host.
Problem: The database instances begin to report the error message: Linux-x86_64 Error: 23: Too many open files in system
and here you are other error messages:
ORA-27505: IPC error destroying a port
ORA-27300: OS system dependent operation:close failed with status: 9
ORA-27301: OS failure message: Bad file descriptor
ORA-27302: failure occurred at: skgxpdelpt1
ORA-01115: IO error reading block from file 105 (block # 18845)
ORA-01110: data file 105: '+DATOS/dac/datafile/auditoria.519.738586803'
ORA-15081: failed to submit an I/O operation to a disk
At the same time I searched /var/log/messages as the root user, and the entries there point to the same problem:
Feb 7 11:03:58 bls3-1-1 syslog-ng[3346]: Cannot open file /var/log/mail.err for
writing (Too many open files in system)
Feb 7 11:04:56 bls3-1-1 kernel: VFS: file-max limit 131072 reached
Feb 7 11:05:05 bls3-1-1 kernel: oracle[12766]: segfault at fffffffffffffff0 rip
0000000007c76323 rsp 00007fff466dc780 error 4
I think I am clear about the cause: maybe I need to increase the fs.file-max kernel parameter, but I do not know how to choose a good value. Here are my sysctl.conf and limits.conf files:
sysctl.conf
kernel.shmall = 2097152
kernel.shmmax = 17179869184
kernel.shmmni = 4096
kernel.sem = 250 32000 100 128
fs.file-max = 6553600
net.ipv4.ip_local_port_range = 1024 65000
net.core.rmem_default = 4194304
net.core.rmem_max = 4194304
net.core.wmem_default = 262144
net.core.wmem_max = 4194304
limits.conf
oracle soft nproc 2047
oracle hard nproc 16384
oracle soft nofile 1024
oracle hard nofile 65536
process limit
bcm@bcm-laptop:~$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 20
file size (blocks, -f) unlimited
pending signals (-i) 16382
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) unlimited
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited -
WL10 Compiler executable.exec error "too many open files" deploying ear
When I try to deploy an ear containing a web module and some ejb modules, I obtain this error:
<Info> <J2EE Deployment SPI> <BEA-260121> <Initiating deploy operation for application, MB-ADM_EAR [archive: /wl/wlments/MB-ADM_EAR/MB-ADM_EAR.ear], to Cluster1 .>
Task 1 initiated: [Deployer:149026]deploy application MB-ADM_EAR on Cluster1.
Task 1 failed: [Deployer:149026]deploy application MB-ADM_EAR on Cluster1.
Target state: deploy failed on Cluster Cluster1
java.io.IOException: Compiler failed executable.exec:
/wl/servers/MS1/cache/EJBCompilerCache/-1dj0waj53cbu8/it/apps/ejbs/core/ExSvc_167qnt_Impl.java:17: error while writing it.apps.ejbs.core.ExSvc_167qnt_Impl: /wl/servers/MS1/cache/EJBCompilerCache/-1dj0waj53cbu8/it/apps/ejbs/core/ExSvc_167qnt_Impl.class (Too many open files)
If I split the ear in two parts, web in one ear and ejbs in another ear, deploy is successful.
Do you have any idea of what is happening?
Below the environment specifications:
JVM Version: jrockit_150_11
JVM Heap: 512
Web Logic: 10.0.1.0
Server: Hewlett Packard DL585 G2
OS: Linux / 2.6.5-7.287.3.PTF.345489.1-smp
Thank you, bye,
Marco
Hello Marco.
When you deploy an EAR, WebLogic Server unjars it at deployment time, compiles the files, and so on. Every operating system has a limit on how many files a process can open. If your EAR is big, the number of files WLS unjars will also be large, hence you hit the limit. By splitting your EAR in two, you split the WLS task into smaller parts, which means the number of files it unjars at a time is smaller.
The following note tells what needs to be done to avert this issue.
http://download.oracle.com/docs/cd/E12839_01/doc.1111/e14772/weblogic_server_issues.htm#CHDGFFHD -
WLS 92MP1: Application Poller issue Too many open files
Hi,
We have a wls92mp1 domain on Linux AS4 (64-bit) with Sun JDK 1.5.0_14. It contains only the Admin server, where we have deployed the application. Over a period of time the server starts showing the below message in the logs. We have not deployed the application from the autodeploy directory, and the file "/home/userid/wls92/etg/servers/userid_a/cache/.app_poller_lastrun" is available in that location, yet it still throws FileNotFoundException.
<Error> <Application Poller> <BEA-149411> <I/O exception encountered java.io.FileNotFoundException: /home/userid/wls92/etg/servers/userid_a/cache/.a
pp_poller_lastrun (Too many open files).
java.io.FileNotFoundException: /home/userid/wls92/etg/servers/userid_a/cache/.app_poller_lastrun (Too many open files)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:179)
at java.io.FileOutputStream.<init>(FileOutputStream.java:131)
at java.io.FileWriter.<init>(FileWriter.java:73)
at weblogic.management.deploy.GenericAppPoller.setLastRunFileMap(GenericAppPoller.java:423)
Any help regarding this would be highly appreciated.
Thanks.
Hi,
Looking at the error above, error code BEA-149411 is described as follows:
149411: I/O exception encountered {0}.
L10n Package: weblogic.management.deploy.internal
I18n Package: weblogic.management.deploy.internal
Subsystem: Application Poller
Severity: Error
Stack Trace: true
Message Detail: An I/O exception denotes a failure to perform a read/write operation on the application files.
Cause: An I/O exception can occur during file read/write while deploying an application.
Action: Take corrective action based on the exception message details.
I think this helps.
-abhi
Getting java.io.FileNotFoundException: Too many open files
I have to search for different strings in a file again and again. I am using the following code:
try {
    fileReaderObject = new BufferedReader(new FileReader(fileObject));
    String inputLine;
    while ((inputLine = fileReaderObject.readLine()) != null) {
        if (zipCode.equalsIgnoreCase(inputLine.split("\t")[0])) {
            s = inputLine;
            fileReaderObject.close();
            return s;
        }
    }
    fileReaderObject.close();
} catch (Exception e) {
    Utils.writeMessage(e.toString());
    e.printStackTrace();
}
But I am getting java.io.FileNotFoundException: file/emp_info (Too many open files)
1. What's the main reason for this exception and how can it be removed?
2. Is there any way to move the pointer to the start of the file? I tried reset() but it seems to have some problem.
-vc
A hint: your fileReaderObject is not closed if an exception occurs.
Lacking destructors, the Java-ish way is to close in a finally clause.
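Applied to the posted search loop, that hint might look like this (a sketch; the Utils logging is dropped and the names are illustrative):

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;

public class ZipLookup {
    // Returns the first tab-separated line whose first field matches zipCode,
    // or null if there is none. The finally clause guarantees the reader is
    // closed on every exit path, including the early return and exceptions,
    // so no descriptor leaks however often this is called.
    public static String find(File file, String zipCode) throws IOException {
        BufferedReader reader = new BufferedReader(new FileReader(file));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                if (zipCode.equalsIgnoreCase(line.split("\t")[0])) {
                    return line;
                }
            }
            return null;
        } finally {
            reader.close();
        }
    }
}
```

The early return no longer needs its own close() call, and reopening the file per lookup also answers question 2: each call starts reading from the beginning.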