Many JS files

Have the developers thought about trying to consolidate the JavaScript code? I am looking at the HTML for a tabbed panel with a fade effect, and it loads nine JavaScript files. That is a lot of HTTP requests for just one widget.
<script type='text/javascript' src='Spry-UI-1.7/includes/SpryDOMUtils.js'></script>
<script type='text/javascript' src='Spry-UI-1.7/includes/SpryDOMEffects.js'></script>
<script type='text/javascript' src='Spry-UI-1.7/includes/SpryWidget.js'></script>
<script type='text/javascript' src='Spry-UI-1.7/includes/SpryPanelSet.js'></script>
<script type='text/javascript' src='Spry-UI-1.7/includes/SpryPanelSelector.js'></script>
<script type='text/javascript' src='Spry-UI-1.7/includes/SpryFadingPanels.js'></script>
<script type='text/javascript' src='Spry-UI-1.7/includes/SpryTabbedPanels2.js'></script>
<script type='text/javascript' src='Spry-UI-1.7/includes/plugins/TabbedPanels2/SpryFadingPanelsPlugin.js'></script>
<script type='text/javascript' src='Spry-UI-1.7/includes/plugins/TabbedPanels2/SpryTabbedPanelsKeyNavigationPlugin.js'></script>

For Spry 1.0 widgets, each widget had its own core file, and each of those files duplicated common helpers like addClassName() and removeClassName(), along with a bunch of basic functionality. That added up to a lot of redundant code.
This way, all of that common code is kept in a single file, and each widget base class contains just what it needs to glue everything together.
Many widgets are panel-based: accordions, tabbed panels, slideshows. Handling those panels is done by the PanelSet and PanelSelector files.
You will find SpryTabbedPanels2.js much smaller than the first version.
The rest is split by component: for instance, SpryFadingPanels.js is only needed if you want to fade between panels; if you don't want that effect, you don't need to include it. With components and plugins, you or anyone else can write a different transition and make it available to all your Spry UI widgets.
I don't like all the includes either, but it's the right way to go for what we are doing.
Thanks,
Don

Similar Messages

  • Too many open files

    Hi gurus, I have a problem with SSL, LDAP, and file descriptors:
    OS: Sun Solaris 8
    JDK Version: 1.4.2
    Build Level: 1.4.2_04-b05
    Build Date: 06/27/2004
    The strange problem is related to too many open files and file descriptors. When a first user logs in and logs off, there are no problems with the /WEB-INF/cacerts files, but when a second user logs in and logs off, the files remain open and the log reports "too many open files":
    ] b44131 LdapRegistryI Could not get the users matching the pattern AW504P28 because of the following exception javax.naming.CommunicationException: webldap.IP:PORT [Root exception is java.net.SocketException: Too many open files]
         at com.sun.jndi.ldap.Connection.<init>(Connection.java:204)
         at com.sun.jndi.ldap.LdapClient.<init>(LdapClient.java:119)
         at com.sun.jndi.ldap.LdapClient.getInstance(LdapClient.java:1668)
         at com.sun.jndi.ldap.LdapCtx.connect(LdapCtx.java:2599)
         at com.sun.jndi.ldap.LdapCtx.<init>(LdapCtx.java:290)
         at com.sun.jndi.ldap.LdapCtxFactory.getUsingURL(LdapCtxFactory.java:175)
         at com.sun.jndi.ldap.LdapCtxFactory.getUsingURLs(LdapCtxFactory.java:193)
         at com.sun.jndi.ldap.LdapCtxFactory.getLdapCtxInstance(LdapCtxFactory.java:136)
         at com.sun.jndi.ldap.LdapCtxFactory.getInitialContext(LdapCtxFactory.java:66)
         at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:662)
         at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:243)
         at javax.naming.InitialContext.init(InitialContext.java:219)
         at javax.naming.InitialContext.<init>(InitialContext.java:195)
         at javax.naming.directory.InitialDirContext.<init>(InitialDirContext.java:80)
         at com.ibm.ws.security.registry.ldap.LdapConfig.getRootDSE(LdapConfig.java:287)
         at com.ibm.ws.security.registry.ldap.LdapRegistryImpl.getRootDSE(LdapRegistryImpl.java:181)
         at com.ibm.ws.security.registry.ldap.LdapRegistryImpl.search(LdapRegistryImpl.java:1622)
         at com.ibm.ws.security.registry.ldap.LdapRegistryImpl.search(LdapRegistryImpl.java:1564)
         at com.ibm.ws.security.registry.ldap.LdapRegistryImpl.search(LdapRegistryImpl.java:1559)
         at com.ibm.ws.security.registry.ldap.LdapRegistryImpl.getUsers(LdapRegistryImpl.java:1105)
         at com.ibm.ws.security.registry.ldap.LdapRegistryImpl.checkPassword(LdapRegistryImpl.java:256)
         at com.ibm.ws.security.registry.UserRegistryImpl.checkPassword(UserRegistryImpl.java:277)
         at com.ibm.ws.security.ltpa.LTPAServerObject.authenticate(LTPAServerObject.java:565)
         at com.ibm.ws.security.server.lm.ltpaLoginModule.login(ltpaLoginModule.java:411)
         at com.ibm.ws.security.common.auth.module.proxy.WSLoginModuleProxy.login(WSLoginModuleProxy.java:122)
         at sun.reflect.GeneratedMethodAccessor36.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at javax.security.auth.login.LoginContext.invoke(LoginContext.java:675)
         at javax.security.auth.login.LoginContext.access$000(LoginContext.java:129)
         at javax.security.auth.login.LoginContext$4.run(LoginContext.java:610)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.login.LoginContext.invokeModule(LoginContext.java:607)
         at javax.security.auth.login.LoginContext.login(LoginContext.java:534)
         at com.ibm.ws.security.auth.JaasLoginHelper.jaas_login(JaasLoginHelper.java:307)
         at com.ibm.ws.security.auth.JaasLoginHelper.jaas_login(JaasLoginHelper.java:349)
         at com.ibm.ws.security.auth.ContextManagerImpl.login(ContextManagerImpl.java:1001)
         at com.ibm.ws.security.auth.ContextManagerImpl.login(ContextManagerImpl.java:853)
         at com.ibm.ws.security.auth.ContextManagerImpl.login(ContextManagerImpl.java:844)
         at com.ibm.ws.security.auth.ContextManagerImpl.getServerSubject(ContextManagerImpl.java:1701)
         at com.ibm.ws.management.util.SecurityHelper.getServerSubject(SecurityHelper.java:539)
         at com.ibm.ws.management.event.NotificationDispatcher$2.run(NotificationDispatcher.java:247)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.ibm.ws.management.event.NotificationDispatcher$DispatchANotificationToAListener.setServerCredentials(NotificationDispatcher.java:245)
         at com.ibm.ws.management.event.NotificationDispatcher$DispatchANotificationToAListener.run(NotificationDispatcher.java:215)
         at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:912)
    Caused by: java.net.SocketException: Too many open files
         at java.net.Socket.createImpl(Socket.java:331)
         at java.net.Socket.<init>(Socket.java:304)
         at java.net.Socket.<init>(Socket.java:124)
         at com.sun.jndi.ldap.Connection.createSocket(Connection.java:346)
         at com.sun.jndi.ldap.Connection.<init>(Connection.java:181)
    what is the cause?
    thank you
    AP

    Sorry, let me explain the issue better. In our experience:
    Does LDAP access cause a lot of file descriptors to be opened?
    The LDAP process is still running and is still opening files. Meanwhile, because of the SSL communication with the application, some files are created in the cacerts directory.
    Suddenly, the OS reaches its file descriptor limit and the error message "Too many open files" appears.
    Question: is the situation that the files in the cacerts directory cannot be closed because the OS has no more file descriptors (because the LDAP process has used up all the file descriptors to the limit)?
    Is this problem related to LDAP?
    Thank you for your kindness, guys!
    AP
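The symptom described above is what happens when contexts, streams, or sockets are opened faster than they are closed: every one costs a file descriptor until close() is called. The sketch below is not from this thread; it is a minimal, hypothetical demonstration (the class name FdLeakDemo is made up) that uses com.sun.management.UnixOperatingSystemMXBean, available on HotSpot JVMs on Linux/Solaris, to watch the descriptor count grow while streams are "forgotten" and shrink once they are closed:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.lang.management.ManagementFactory;
import java.util.ArrayList;
import java.util.List;

import com.sun.management.UnixOperatingSystemMXBean;

public class FdLeakDemo {

    // Current number of open file descriptors for this JVM process
    // (works on HotSpot JVMs on Linux/Solaris).
    static long openFds() {
        return ((UnixOperatingSystemMXBean)
                ManagementFactory.getOperatingSystemMXBean())
                .getOpenFileDescriptorCount();
    }

    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("fd-demo", ".txt");
        tmp.deleteOnExit();
        long before = openFds();

        // Simulate the leak: open streams and "forget" to close them.
        List<FileInputStream> leaked = new ArrayList<FileInputStream>();
        for (int i = 0; i < 50; i++) {
            leaked.add(new FileInputStream(tmp));
        }
        long during = openFds();

        // The fix: always close in a finally block (or try-with-resources),
        // so each login/logoff cycle returns its descriptors.
        for (FileInputStream in : leaked) {
            in.close();
        }
        long after = openFds();

        System.out.println(before + " -> " + during + " -> " + after);
        if (during < before + 50) throw new AssertionError("expected descriptor growth");
        if (after > during - 50) throw new AssertionError("expected descriptors released");
    }
}
```

If the count only ever grows as users log in and off, some code path is skipping the close() step; on Solaris, `pfiles <pid>` (or lsof) will show which descriptors are left open.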

  • Java.io.IOException: Too many open files while deploying in soa 11g

    hi all,
    I am getting a strange error while deploying any composite. It's a hello-world kind of composite, but when I try to deploy it I get "java.io.IOException: Too many open files". I have tried to deploy it in two or three ways, but all of them resulted in the same error. Bouncing the SOA server might be an option, but can someone give an insight into why this is happening, and whether it can be resolved without restarting the server?
    Thanks

    Yes, so this problem is with Unix only. I previously worked on Windows and never got this problem.

  • Very slow Backup (Many little file in many subdirectories)

    - I have a Netware 6.5 SP8 server with the post SP8 NSS patch installed (NSS version 3.27.01 April 7, 2009)
    - iSCSI Initiator Version 1.06.05 October 30, 2008 connecting a 410 GB NSS Volume (Lefthand Networks iSCSI SAN)
    - The volume contains user data and, from the root, a USERDATA directory, which contains 32,749 individual user directories
    - The majority (90%) of the user directories are empty
    - Each user directory has its own trustee assignment and space restrictions
    - Compression is enabled
    - 376,269 files using 45,556,460,661 bytes in 221,439 directories, so the average is around 121 KB per file; as you can see, it is very directory-intensive
    TSATEST indicates approximately 850 MB/min, but when I attempt to back up the data using my backup software (NetVault, which uses NDMP), the performance is hideous. A full backup runs at around 135 KB/sec, or 8 MB/min (try getting that done in any size backup window). I have also tried other backup solutions, with the same basic result.
    I assume the issue is with indexing, but I'm not sure what to check at this point. I've been trying every suggestion I could find. I've gotten the throughput up to about 1.5 MB/s, but obviously need better. Just wanted to know if anyone here had any suggestions that might help me make this run more efficiently, or any TIDs that have proven helpful.
    Thank You!

    Matt Karwowski wrote:
    > [quoted message snipped]
    Known issue....
    http://www.novell.com/support/viewCo...1011&sliceId=1
    I have seen this in Java development shops with millions of 1 KB files and folders. The nature of Java.
    - Use tar or zip on the local server, not across the wire, to compress those types of data folders.
    - Exclude those folders from the backup.
    - Then back up only the zip or tar files.
    Or maybe look into rsync, as I think it copies by hardware blocks, not files?
    - Maybe review your environment if it is a development shop like Java. Maybe switch to GPFS or Reiser for many small files if you can; EXT could handle it for your small amount of data, though.
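The "compress locally, then back up the archive" advice can be sketched in a few lines. This is a hypothetical illustration, not code from the thread (the class name ZipFolder is made up): it packs the regular files of one directory into a single archive so the backup streams one large file instead of thousands of tiny ones.

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipFolder {

    // Pack every regular file in `dir` into a single zip archive, so a
    // backup job can transfer one large file instead of thousands of
    // tiny ones. Returns the number of entries written.
    static int zipDirectory(File dir, File zipFile) throws IOException {
        File[] files = dir.listFiles();
        if (files == null) throw new IOException("not a directory: " + dir);
        int count = 0;
        ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(zipFile));
        try {
            byte[] buf = new byte[8192];
            for (File f : files) {
                if (!f.isFile()) continue;
                zos.putNextEntry(new ZipEntry(f.getName()));
                FileInputStream in = new FileInputStream(f);
                try {
                    int n;
                    while ((n = in.read(buf)) > 0) zos.write(buf, 0, n);
                } finally {
                    in.close(); // close each source file right away
                }
                zos.closeEntry();
                count++;
            }
        } finally {
            zos.close();
        }
        return count;
    }
}
```

The point of the design is that the expensive per-file overhead (open, trustee check, close) happens once locally, and the backup window then only has to move one sequential stream.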

  • How to merge many XML files into one?

    Hi: I got a small project to combine many XML files into one and then convert the combined XML file to Excel using AppleScript. My XML files look like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <Metadataobject>
        <from>[email protected]</from>
        <jobname>B3_IM09MBDUF</jobname>
        <pages>2</pages>
        <priority>3</priority>
        <timezone>CEST</timezone>
        <year>2013</year>
        <month>7</month>
        <day>15</day>
        <hour>11</hour>
    </Metadataobject>
    and like this...
    <?xml version="1.0" encoding="UTF-8"?>
    <Metadataobject>
        <from>[email protected]</from>
        <jobname>P1_FR1330G006007_Kate_van der Vaart</jobname>
        <pages>2</pages>
        <priority>1</priority>
        <timezone>CEST</timezone>
        <year>2013</year>
        <month>7</month>
        <day>12</day>
        <hour>16</hour>
    </Metadataobject>
    I get many XML files like this. And I want them to be combined and shown like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <Metadataobject>
        <job id="1">
        <from>[email protected]</from>
        <jobname>B3_IM09MBDUF</jobname>
        <pages>2</pages>
        <priority>3</priority>
        <timezone>CEST</timezone>
        <year>2013</year>
        <month>7</month>
        <day>15</day>
        <hour>11</hour>
        </job>
    <job id="2">
        <from>[email protected]</from>
        <jobname>P1_FR1330G006007_Kate_van der Vaart</jobname>
        <pages>2</pages>
        <priority>1</priority>
        <timezone>CEST</timezone>
        <year>2013</year>
        <month>7</month>
        <day>12</day>
        <hour>16</hour>
        </job>
    </Metadataobject>
    And finally, the combined XML file is converted to an Excel sheet with the column headings "Job ID", "From", "Job Name", and so on...
    Or maybe there is a better way to get the same result...
    Thanks
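One possible way to produce the combined format above is a small DOM program. This is only a sketch of one approach, not a tested solution from the thread (the class name MergeJobs is made up, and it numbers the id attribute sequentially):

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class MergeJobs {

    // Combine several single-job Metadataobject files into one document:
    // each input file's children are wrapped in a numbered <job id="..."> element.
    static Document merge(File[] inputs) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        Document merged = dbf.newDocumentBuilder().newDocument();
        Element root = merged.createElement("Metadataobject");
        merged.appendChild(root);
        int id = 1;
        for (File f : inputs) {
            Element job = merged.createElement("job");
            job.setAttribute("id", String.valueOf(id++));
            NodeList children =
                dbf.newDocumentBuilder().parse(f).getDocumentElement().getChildNodes();
            for (int i = 0; i < children.getLength(); i++) {
                job.appendChild(merged.importNode(children.item(i), true));
            }
            root.appendChild(job);
        }
        return merged;
    }

    // Serialize the merged document back to a file.
    static void write(Document doc, File out) throws Exception {
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.transform(new DOMSource(doc), new StreamResult(out));
    }
}
```

From the merged document, each <job> element then maps directly to one spreadsheet row.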

    That is just an intermediary state to get to the Excel version. Actually, I get many small XML files (as shown above) from a client, and I want them all combined in an Excel sheet with common column headings, like this:

    from              jobname        pages  priority  timezone  year  month  day  hour  id
    [email protected]  B3_IM09MBDUF   2      3         CEST      2013  7     15    11    1
    [email protected]  B3_IM09MBDUF   2      3         CEST      2013  7     15    11    2

    Thanks for your response.

  • I recently deleted many duplicate files. In the midst of it my Itunes songs are not in Itunes. They are in two different folders. One under 'Music' in my ID. And two under a MyMusic file in Documents. How do I know which one to import?


    Thanks, this is not an ideal answer, but probably the most sensible one in my case.
    I will try it unless someone has a better suggestion, but I'll wait a bit as it will take me a few days anyway. (I had actually tried to create a new, smaller playlist and to download by album, but at this stage the Music app is not letting me queue a list of songs for download; I think I will have to disable and re-enable iTunes Match, which will probably delete all the songs.)
    I have to say I am not very impressed with Apple here: having an online backup of all your data and being able to restore it to a new device easily was a strong selling point of iCloud. For music, they are definitely not delivering at this stage.

  • Converting many numbers files to PDFs?

    Hi everyone!
    I would like to convert many Numbers files to PDF documents. Every Numbers document should become a new PDF with the same name!
    I even tried to write a workflow with Automator, but I can't get the "create a PDF" part running.
    Furthermore, I googled for two hours, but none of the scripts I found work with the latest Numbers version...
    Please help me... Maybe one of you has an AppleScript or a solution using Automator?
    OS X 10.9.3
    Numbers Version 3.2 (1861)

    Dear Lori,
    promptUser="false", see below.
    Does not work.
    All programs that might disturb PDF output have been removed from the computer.
    Does not work.
    I am now investing some evenings into Word VBA, making a script loop through the subdirectories over several levels (does Acrobat do that anyway?)...
    Does not work.
    Not yet.
    It will have to...
    Best regards,
    Boris
    <?xml version="1.0" encoding="UTF-8"?>
    <Workflow xmlns="http://ns.adobe.com/acrobat/workflow/2012" title="ZOPtest ORT" description="" majorVersion="1" minorVersion="0">
    <Sources defaultCommand="WorkflowPlaybackSelectFolder">
      <Folder path="/nas02/quinsee$/Fachabteilung/ZOP/Standards/Standards NCH"/>
    </Sources>
    <Group label="Unbenannt">
      <Command name="Scan:OPT" pauseBefore="false" promptUser="false">
       <Items>
        <Item name="ApplyMRC" type="boolean" value="false"/>
        <Item name="BkgrRemove" type="integer" value="0"/>
        <Item name="ColorCompression" type="integer" value="4"/>
        <Item name="Descreen" type="boolean" value="false"/>
        <Item name="Deskew" type="boolean" value="false"/>
        <Item name="Format" type="integer" value="1"/>
        <Item name="Language" type="integer" value="-1"/>
        <Item name="MonoCompression" type="integer" value="1"/>
        <Item name="QualityLevel" type="integer" value="1"/>
        <Item name="TextSharpen" type="integer" value="0"/>
        <Item name="doOCR" type="boolean" value="false"/>
       </Items>
      </Command>
    </Group>
    </Workflow>

  • STARTING DATABASE : PROBLEM OF Linux Error: 23: Too many open files in syst

    Hi everybody,
    I am running an RMAN script and get this error,
    9> @/u01/app/oracle/admin/devpose/backup/configuration.rcv
    RMAN> ###################################################################
    2> # Configuration file used to set Rman policies.
    3> #
    4> ###################################################################
    5>
    6> CONFIGURE DEFAULT DEVICE TYPE TO DISK;
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of configure command at 08/26/2009 20:03:30
    RMAN-06403: could not obtain a fully authorized session
    ORA-01034: ORACLE not available
    RMAN> CONFIGURE RETENTION POLICY TO REDUNDANCY 1;
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of configure command at 08/26/2009 20:03:30
    RMAN-06403: could not obtain a fully authorized session
    ORA-01034: ORACLE not available
    RMAN> #CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 7 DAYS;
    2> CONFIGURE DEVICE TYPE DISK PARALLELISM 2;
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of configure command at 08/26/2009 20:03:30
    RMAN-06403: could not obtain a fully authorized session
    ORA-01034: ORACLE not available
    RMAN>
    RMAN> CONFIGURE CHANNEL DEVICE TYPE DISK FORMAT '/u01/app/oracle/backup/db/ora_df%t_s%s_s%p';
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of configure command at 08/26/2009 20:03:30
    RMAN-06403: could not obtain a fully authorized session
    ORA-01034: ORACLE not available
    But this problem is understandable, as the database is not running. As to why the database is not running, I have found the reason but do not understand how to solve the problem.
    Since the database was not running, I tried to start it up, and then I came across the following, which is my problem. (Why are so many files open? The Linux OS error says too many files are open in the system. See below.)
    SQL> conn /as sysdba
    Connected to an idle instance.
    SQL> startup
    ORACLE instance started.
    Total System Global Area 419430400 bytes
    Fixed Size 779516 bytes
    Variable Size 258743044 bytes
    Database Buffers 159383552 bytes
    Redo Buffers 524288 bytes
    Database mounted.
    ORA-00313: open failed for members of log group 2 of thread 1
    ORA-00312: online log 2 thread 1: '/u01/app/oracle/oradata/devpose/redo02.log'
    ORA-27041: unable to open file
    Linux Error: 23: Too many open files in system
    Can anybody who has run into such a problem guide me to a solution, please?
    Thanks

    Hi,
    Yes, this DB was functioning OK; this configuration script was part of the RMAN daily backup.
    Last night the backup failed, so when I opened the "Failed job" in EM, I saw this type of message.
    That was the starting point. Gradually, I tried to narrow down to the actual problem, and I have posted my findings above.
    One way of solving the problem, I thought, would be to kill all these processes and then try to open the database; it might start up. However, that would not ensure this won't occur again.
    That's why I am trying to understand why it opens so many processes (and why it spawns so many .flb files). Any thoughts you have around this?
    I will try restarting the OS as a last resort.
    Thanks for your help and suggestions.
    Regards,

  • "java.io.IOException: Too many open files"  in LinuX

    Hi Developers,
    * I am continuously running and processing more than 2000 XML files using SAX and DOM.
    * My process is as follows:
    - Convert each XML file to a Document object with DOM.
    - The DOM is used while creating the log file report; the log file is created after processing all the XML files.
    * After processing approximately 1000 files, it throws "java.io.IOException: Too many open files" on the Linux system.
    * I have googled many sites, including the Sun forum, but they only say to increase the system limit with ulimit on Linux. If I increase that, it executes fine without the exception.
    * My question is: is it possible to fix this in Java code itself, or with VM arguments like -Xms512m and -Xmx512m?
    * Please let me know if you have any ideas.
    Thanks And Regards,
    JavaImran
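It can indeed be addressed in Java code if the exception comes from streams that are opened but never closed: closing each input stream as soon as its parse finishes caps the number of live descriptors at one, no matter how many files are processed. A minimal sketch (the class name BatchParser is made up, and the exact cause in the poster's code is an assumption):

```java
import java.io.File;
import java.io.FileInputStream;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class BatchParser {

    // Parse one file, guaranteeing the descriptor is released immediately
    // instead of waiting for the garbage collector to finalize the stream.
    static Document parse(File f) throws Exception {
        DocumentBuilder db = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        FileInputStream in = new FileInputStream(f);
        try {
            return db.parse(in);
        } finally {
            in.close(); // without this, ~1000 parses can exhaust the ulimit
        }
    }
}
```

Raising ulimit only moves the ceiling; releasing descriptors promptly removes it, which is why -Xms/-Xmx (heap settings) have no effect on this error.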

    Doh! I forgot to post my little code sample...
    package forums.crap;
    import java.io.*;
    import java.util.*;
    public class TooManyFileHandles {
      private static final int HOW_MANY = 8*1024;
      public static void main(String[] args) {
        List<PrintWriter> writers = new ArrayList<PrintWriter>(HOW_MANY);
        try {
          try {
            for (int i = 1; i <= HOW_MANY; i++) {
              writers.add(new PrintWriter("file" + i + ".txt"));
            }
          } finally {
            for (PrintWriter w : writers) {
              if (w != null) w.close();
            }
          }
        } catch (Exception e) {
          e.printStackTrace();
        }
      }
    }
    ... and the problem still isn't OOME ;-)
    Cheers. Keith.

  • WLS 10.3.5 on RHEL 5.4, SocketException: Too many open files

    Hi
    I'm running Weblogic server 10.3.5 on Red Hat Enterprise Linux Server release 5.4 (Tikanga), with Java jdk1.6.0_27.
    My order handling application, when receiving client orders, needs to make outbound SOAP calls to fulfill the order. During a performance test, we got following errors:
    ####<Feb 10, 2012 2:28:41 PM ICT> <Critical> <Server> <KKMOMAPP2> <KKMOMPE2> <DynamicListenThread[Default]> <<WLS Kernel>> <> <> <1328858921806> <BEA-002616> <Failed to listen on channel "Default" on 172.24.106.81:4095, failure count: 1, failing for 0 seconds, java.net.SocketException: Too many open files>
    I monitored the java process of this application, when the "Too many open files" error happened, it had 1388 open file descriptors, among which 655 were sockets.
    I also monitored the total open file descriptors of the weblogic user account, the count was around 6300 during this error.
    These numbers are far smaller than the file limits configured on OS:
    - Under weblogic account, ulimit -n shows 65536
    - /proc/sys/fs/file-max shows 772591
    - Following lines are already in /etc/security/limits.conf
    weblogic soft nofile 65536
    weblogic hard nofile 65536
    weblogic soft nproc 16384
    weblogic hard nproc 16384
    I did another test using a simple Java program to open a large number of sockets under the weblogic account. It had no problem opening 15,000 sockets. It seems the file descriptor limit is indeed quite high, but for some reason the WebLogic process fails even when it has merely 1388 open files. Are there other Linux or WebLogic parameters I should tune? Or anything else I missed?
    Thank you very much
    Ning
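For reference, a socket-opening test like the one mentioned above might look like the following sketch (the class name SocketLimitTest is made up; it is a reconstruction of the idea, not the poster's actual program). Each loopback connection consumes two descriptors, one per end:

```java
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.ArrayList;
import java.util.List;

public class SocketLimitTest {

    // Open `n` loopback connections; each consumes two descriptors
    // (the client end plus the accepted server end). Returns how many
    // pairs were opened before everything is closed again.
    static int openPairs(int n) throws IOException {
        ServerSocket server = new ServerSocket(0); // bind to any free port
        List<Socket> sockets = new ArrayList<Socket>();
        try {
            for (int i = 0; i < n; i++) {
                sockets.add(new Socket("127.0.0.1", server.getLocalPort()));
                sockets.add(server.accept());
            }
            return sockets.size() / 2;
        } finally {
            for (Socket s : sockets) s.close();
            server.close();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("opened " + openPairs(100) + " socket pairs");
    }
}
```

A test like this proves the per-process limit is reachable, which is exactly why a failure at 1388 descriptors points at something other than ulimit, e.g. a per-thread or WebLogic-internal constraint.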

    Hi All,
    Any help on this issue ?
    Thank you,
    Ram

  • Java.util.zip.ZipException: Too many open files on Linux

    Hi,
    We have a web application running on Caucho's Resin server on JDK 1.5.0_11 and Red Hat Linux. We are noticing that the Java process runs out of file handles within 24-30 hours. We have a file limit of 5000, which it consumes in 24 hours, throwing 'java.util.zip.ZipException: Too many open files'.
    I have made sure all sorts of file handles are closed from the application's point of view. Here is a snapshot of lsof (the list of file handles) from the Java process. The following list keeps growing until it runs out of the limit. Do you have tips or suggestions on how to mitigate this problem (considering we don't want to increase the ulimit for this process)? Also, can you make out anything more from the descriptions of the file handles: are they unclosed POP3 connections, or URL connections to external sites?
    java 7156 resin 120u IPv4 34930051 UDP localhost.localdomain:59693
    java 7156 resin 121u IPv4 34927823 UDP localhost.localdomain:59663
    java 7156 resin 122u IPv4 34931861 UDP localhost.localdomain:59739
    java 7156 resin 123u IPv4 34932023 UDP localhost.localdomain:59745
    java 7156 resin 124u IPv4 34930054 UDP localhost.localdomain:59700
    java 7156 resin 125u IPv4 34927826 UDP localhost.localdomain:59665
    java 7156 resin 126u IPv4 34927829 UDP localhost.localdomain:59666
    java 7156 resin 127u IPv4 34930057 UDP localhost.localdomain:59703
    java 7156 resin 128u IPv4 34930713 UDP localhost.localdomain:59727
    java 7156 resin 129u IPv4 34930716 UDP localhost.localdomain:59730
    java 7156 resin 130u IPv4 34932238 UDP localhost.localdomain:59789
    java 7156 resin 131u IPv4 34932026 UDP localhost.localdomain:59749
    java 7156 resin 132u IPv4 34932221 UDP localhost.localdomain:59770
    java 7156 resin 133u IPv4 34932224 UDP localhost.localdomain:59775
    java 7156 resin 134u IPv4 34932029 UDP localhost.localdomain:59753
    java 7156 resin 135u IPv4 34932032 UDP localhost.localdomain:59754
    java 7156 resin 138u IPv4 34932035 UDP localhost.localdomain:59760
    java 7156 resin 139u IPv4 34932038 UDP localhost.localdomain:59763
    java 7156 resin 140u IPv4 34932227 UDP localhost.localdomain:59780
    java 7156 resin 141u IPv4 34932230 UDP localhost.localdomain:59781
    java 7156 resin 144u IPv4 34932234 UDP localhost.localdomain:59786
    java 7156 resin 146u IPv4 34932241 UDP localhost.localdomain:59792
    java 7156 resin 147u IPv4 34932247 UDP localhost.localdomain:59802
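For what it's worth, leaked UDP descriptors like the ones in this listing appear whenever a DatagramSocket is created per operation and never closed. A minimal sketch of the close-in-finally pattern (the class name UdpSender is made up; whether the leak here was in application code or, as it turned out, the driver, the mechanism is the same):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpSender {

    // Every DatagramSocket holds one descriptor until close() is called;
    // creating a socket per send without closing it produces exactly the
    // kind of ever-growing UDP entries shown in the lsof listing above.
    static void send(byte[] data, InetAddress host, int port) throws Exception {
        DatagramSocket socket = new DatagramSocket();
        try {
            socket.send(new DatagramPacket(data, data.length, host, port));
        } finally {
            socket.close(); // release the UDP descriptor immediately
        }
    }
}
```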

    Finally we resolved this issue. It was the Oracle driver, which had a compatibility issue; we upgraded our Oracle client driver to a newer version, and this fixed the problem. Bottom line: there was nothing wrong with the application code, which was doing proper resource cleanup, but the Oracle driver was leaking handles on every connection.

  • Runtime.exec - Too Many Open Files

    System version : Red Hat Enterprise Linux 2.4.21-47.ELsmp AS release 3 (Taroon Update 8)
    JRE version : 1.6.0-b105
    Important : the commands described below are launched from a Web application : Apache Tomcat 6.0.10
    Hello,
    I'm facing a problem that is already known, but apparently never really solved??!! ;)
    When I invoke many system commands with the Runtime.exec(...) method, there are open files that are not released (I can see them with the "lsof" system command).
    In the end, the unavoidable "too many open files" exception.
    The launched commands are "ssh ..." commands.
    In the topics relating to this problem, the solution is always to close all streams/threads and to explicitly invoke the Process.destroy() method.
    My problem is that this is exactly what I do! And I can't do more...
    Here is the code:
           Runtime rt = Runtime.getRuntime();
           Process process = rt.exec("ssh ...");
           // ProcessStreamHolder extends Thread and reads from the InputStream given in the constructor...
           ProcessStreamHolder errorStream = new ProcessStreamHolder(process.getErrorStream());
           ProcessStreamHolder outputStream = new ProcessStreamHolder(process.getInputStream());
           errorStream.start();
           outputStream.start();
           exitValue = process.waitFor();
           try {
               errorStream.interrupt();
           } catch (RuntimeException e) {
               logger.warn("...");
           }
           try {
               outputStream.interrupt();
           } catch (RuntimeException e) {
               logger.warn("...");
           }
           try {
               process.getInputStream().close();
           } catch (IOException e) {
               logger.warn("...");
           }
           try {
               process.getOutputStream().close();
           } catch (IOException e) {
               logger.warn("...");
           }
           try {
               process.getErrorStream().close();
           } catch (IOException e) {
               logger.warn("...");
           }
           process.destroy();
    Does someone know if my code is wrong, or if there's a workaround for me?
    Thanks in advance!
    Richard.

    Don't interrupt those threads. Close the output stream first, then wait for the process to exit, then both threads reading the stdout and stderr of the process should get EOFs, so they should exit naturally, and incidentally close the streams themselves.
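That advice might be sketched as follows (the class name ExecHelper and the echo example are made up for illustration): close stdin first, let the gobbler threads run to EOF on their own, then reap the process.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class ExecHelper {

    // Drain a process stream until EOF on its own thread, then close it.
    static Thread drain(final InputStream in, final StringBuilder sink) {
        Thread t = new Thread(new Runnable() {
            public void run() {
                try {
                    BufferedReader r = new BufferedReader(new InputStreamReader(in));
                    try {
                        String line;
                        while ((line = r.readLine()) != null) sink.append(line).append('\n');
                    } finally {
                        r.close(); // the reader thread closes its own stream
                    }
                } catch (IOException ignored) { }
            }
        });
        t.start();
        return t;
    }

    // Run a command following the advice above: close stdin first, let the
    // reader threads hit EOF naturally, then reap the exit status.
    static String run(String... command) throws Exception {
        Process p = Runtime.getRuntime().exec(command);
        StringBuilder out = new StringBuilder();
        StringBuilder err = new StringBuilder();
        Thread outT = drain(p.getInputStream(), out);
        Thread errT = drain(p.getErrorStream(), err);
        p.getOutputStream().close(); // we send no input; close stdin first
        int exit = p.waitFor();
        outT.join();                 // readers exit on EOF, no interrupt needed
        errT.join();
        p.destroy();
        return exit == 0 ? out.toString() : err.toString();
    }
}
```

The design point is that EOF, not interruption, is the signal for the reader threads to finish, so every descriptor is closed exactly once and in order.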

  • "Too many open files" Exception on "tapestry-framework-4.1.1.jar"

    When a browser accesses my webwork, the server opens a certain number of file descriptors for the "tapestry-framework-4.1.1.jar" file and doesn't release them for a while.
    Below is the output of "lsof | grep tapestry":
    java 26735 root mem REG 253,0 62415 2425040 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-portlet-4.1.1.jar
    java 26735 root mem REG 253,0 2280602 2425039 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-framework-4.1.1.jar
    java 26735 root mem REG 253,0 320546 2425036 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-contrib-4.1.1.jar
    java 26735 root mem REG 253,0 49564 2424979 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-annotations-4.1.1.jar
    java 26735 root 28r REG 253,0 2280602 2425039 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-framework-4.1.1.jar
    java 26735 root 29r REG 253,0 2280602 2425039 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-framework-4.1.1.jar
    java 26735 root 30r REG 253,0 2280602 2425039 /usr/local/apache-tomcat-5.5.20/my_webwork/WEB-INF/lib/tapestry-framework-4.1.1.jar
    These references are sometimes released automatically, but sometimes not, and after using the application for a few hours I get a "Too many open files" exception.
    The number of references increases each time I access the web application, or even when I just press F5 in the browser to reload the page.
    I tried different browsers to see whether the behavior changed, and it did: with Internet Explorer the count increased by 3 per access, while with Firefox it increased by 7 per attempt.
    I have already raised the maximum number of file descriptors, which made the "Too many open files" exception go away.
    But I am still wondering what is actually opening "tapestry-framework-4.1.1.jar" so many times.
    Could anyone figure out what is going on?
    Thanks in advance.
    My environment:
    - Red Hat Enterprise Linux ES release 4 (Nahant Update 4)
    - Java: 1.5.0_11
    - Tomcat: 5.5.20
    - Tapestry: 4.1.1

    Hi,
    The cause might be that the server got an exception while trying to accept client connections; it will then back off to aid recovery.
    The OS limit on the number of open file descriptors (the FD limit) needs to be increased. Also tune OS parameters that can help the server accept more client connections (e.g. the TCP accept backlog).
    http://e-docs.bea.com/wls/docs90/messages/Server.html#BEA-002616
    Regards,
    Prasanna Yalam
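    Before (or alongside) raising the FD limit, it can help to confirm from inside the JVM that the descriptor count really climbs per request. On Linux (the poster is on RHEL 4), /proc/self/fd lists the current process's open descriptors, so counting its entries is a cheap leak check. A minimal sketch; the class name FdCount is an assumption:

```java
import java.io.File;

public class FdCount {

    // On Linux, /proc/self/fd contains one entry per open descriptor of
    // this process. Returns -1 on platforms without procfs. This is a
    // monitoring aid, not a fix: call it before and after a request to
    // see whether descriptors are leaking.
    public static int openFileDescriptors() {
        File fdDir = new File("/proc/self/fd");
        String[] entries = fdDir.list();
        return entries == null ? -1 : entries.length;
    }

    public static void main(String[] args) {
        System.out.println("open fds: " + openFileDescriptors());
    }
}
```

    Logging this number per request would show whether each page load really pins a few more handles on the jar, which is what the lsof output suggests.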

  • Why Are Many iPhoto Files Now Thumbnails?

    Greetings All: I've searched the forums for answers to my question, and there is a partially useful solution dating back to 2006 about shrinking photos in iPhoto. All one needed to do to get around the problem was to go to ~/Library/Pictures and make sure that iPhoto Data wasn't the folder selected for iPhoto; the solution was to select iPhoto Library as the source instead.
    Now things are a bit more complex. I cannot find a Pictures folder in the ~/Library directory; it has moved to ~/Pictures. I have only one iPhoto library file, and it's called "Rebuilt Library". When I'm in Events mode browsing iPhoto, I can get the app to show me the individual files, but I don't know where to find them. Often there are six copies of any given file, with (EDIT) appended to one of the copies. As I click on each one, it's apparent that they are all thumbnails. Some of the photos are original size, but many aren't.
    I wonder if I am doing something wrong, selecting the wrong library, or if I can kiss 800 or so of my Africa photos goodbye?
    Thanks in Advance.
    -jpkmd

    LN:
    I use iPhoto '11 v9.3. I own a MBP i7 dual core running OS 10.4.7. How did I get to the individual photo files? By selecting an individual photo, going to iPhoto's "Edit" menu, and choosing the "Reveal in Finder" command. I saw folders, marked by date, full of individual photo files. I clicked on several that seemed to be duplicates named in series; for example:
    IMO2345.JPG
    IMO2345.1.JPG...
    IMO2345.6.JPG
    All seven photos were identical and all were of thumbnail size.
    Also present were photo files marked with (EDIT) such as
    HOLIDAY2005.JPG
    HOLIDAY2005(EDIT).JPG
    Again, both photos identical and both in thumbnail size.
    I opened these photos with Lemke's GraphicConverter. You say this is an SQL database and cannot be interpreted by any other application but iPhoto. (I had to look up the definition of SQL.) From my non-tech point of view, I was struck by the fact that there was so much duplication, so many (EDIT) files, and all in thumbnail size which is how they appear in iPhoto. I didn't delete or rearrange any of the files. I won't monkey around with that database again.
    There are two iPhoto libraries in my ~Pictures folder. One is iPhoto Library which is blank. The other is Rebuilt Library which contains all the data; it was created at the suggestion of the iPhoto application I was running in about 2006-2007. I don't recall exactly how I accomplished this.
    There are still some photos which are at normal size. I recall going through the approximately 4,500 photos in my iPhoto collection (there have to be copies in here) and taking out thumbnail duplicate photos. I took a photo journey to Africa a few years ago, and some of those photos seem to be thumbnail now, which makes me panic; however most of the other photos I have in iPhoto are thumbnail size now. I don't know how to proceed from here. At the time I did it, looking at the individual photos in the folders seemed like a reasonable thing to do. I'm a doctor, and I felt like doing some exploratory surgery. But I won't do that again.

  • My macbook is frozen but I can't turn it off because I have many unsaved files

    I was watching a movie when it froze, so I pressed Escape, but then I got the rainbow wheel. When I closed the lid the Apple logo was still glowing and the sound was still playing. I am a student and have many unsaved files on it. Please help!!!

    I have the EXACT SAME PROBLEM and have been researching the past 2 hours!!!! Aaargh!

  • I've recently downloaded Wondershare software and converted many video files. I'm in the process of creating a movie and iMovie keeps quitting on me. Wondering if there is a problem with Wondershare iMac compatibility?

    I recently purchased Wondershare software. I've converted many video files and made a movie, but iMovie keeps quitting. Wondering if there is a problem with Wondershare and iMac compatibility. Any thoughts?
    Rick

    Not sure about the codec. Wondershare asked where I was sending the output during setup and I selected iMovie; it did the rest. I have been successful editing on the storyboard, but iMovie ('09) quits when I attempt to view the project full screen or prepare the movie for sharing. I have removed some files (to ensure headroom) and installed the most recent updates.
