File too large error unpacking WAR during app deploy - RHEL & WLS 10.3.5

I'm stumped and hoping someone can help out here. Does anyone have any insights into the cause of the problem below, or tips on how to diagnose it?
scenario
We ran into an open files limit issue on our RH Linux servers, and had the SA boost our open files limit from 1024 to 3096. This seems to have solved the open files limit issue once we restarted the node managers and the managed servers (our WLS startup script sets the soft limit to the hard limit).
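For reference, the ulimit handling in our startup script amounts to something like the following (a simplified sketch only; the real script, paths and server names differ):
     #!/bin/bash
     # Simplified sketch of the ulimit handling in our WLS startup script
     # (illustrative only; the real script, paths and names differ).

     # Raise the soft open-files limit to the hard limit the SA configured (3096).
     ulimit -Sn "$(ulimit -Hn)"
     echo "open files: soft=$(ulimit -Sn) hard=$(ulimit -Hn)"

     # The managed server is then launched and inherits these limits, e.g.:
     # "$DOMAIN_HOME"/bin/startManagedWebLogic.sh managed_server_1 http://adminhost:7001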
But now, right after this change, we've got a new issue. The admin server is no longer able to deploy any war/ear; when I click "Activate" after the install I get
Message icon - Error An error occurred during activation of changes, please see the log for details.
Message icon - Error Failed to load webapp: 'TemplateManagerAdmin-1.0-SNAPSHOT.war'
Message icon - Error File too large
on the console, and I see the stack trace below in the Admin server log (nothing in the managed server logs), indicating it hits the error while exploding the war.
I've tried both the default deployment mode and the "will make the deployment available in the following location" mode, where the war is manually copied to the same location on each box so it is available to each server, all with the same result. I've also tried restarting the admin server, but no luck.
The files are not overly large (<= 34 MByte) and we had no trouble with them before today. I'm able to log in as the WebLogic user and copy files, etc. with no problem.
There is no disk space issue - plenty of space left on all of our filesystems. There is, as far as I can tell, no OS or user file size limit issue:
     -bash-3.2$ ulimit -a
     core file size (blocks, -c) 0
     data seg size (kbytes, -d) unlimited
     scheduling priority (-e) 0
     file size (blocks, -f) unlimited
     pending signals (-i) 73728
     max locked memory (kbytes, -l) 32
     max memory size (kbytes, -m) unlimited
     open files (-n) 3096
     pipe size (512 bytes, -p) 8
     POSIX message queues (bytes, -q) 819200
     real-time priority (-r) 0
     stack size (kbytes, -s) 10240
     cpu time (seconds, -t) unlimited
     max user processes (-u) unlimited
     virtual memory (kbytes, -v) unlimited
     file locks (-x) unlimited
environment
WLS 10.3.5 64-bit
Linux 64-bit RHEL 5.6
Sun Hotspot 1.6.0_29 (64-bit)
stack trace
####<Mar 6, 2013 4:09:33 PM EST> <Error> <Console> <nj09mhm5111> <prp_appsvcs_admin> <[ACTIVE] ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'> <steven_elkind> <> <> <1362604173724> <BEA-240003> <Console encountered the following error weblogic.application.ModuleException: Failed to load webapp: 'TemplateManagerAdmin-1.0-SNAPSHOT.war'
at weblogic.servlet.internal.WebAppModule.prepare(WebAppModule.java:393)
at weblogic.application.internal.flow.ScopedModuleDriver.prepare(ScopedModuleDriver.java:176)
at weblogic.application.internal.flow.ModuleListenerInvoker.prepare(ModuleListenerInvoker.java:199)
at weblogic.application.internal.flow.DeploymentCallbackFlow$1.next(DeploymentCallbackFlow.java:517)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:52)
at weblogic.application.internal.flow.DeploymentCallbackFlow.prepare(DeploymentCallbackFlow.java:159)
at weblogic.application.internal.flow.DeploymentCallbackFlow.prepare(DeploymentCallbackFlow.java:45)
at weblogic.application.internal.BaseDeployment$1.next(BaseDeployment.java:613)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:52)
at weblogic.application.internal.BaseDeployment.prepare(BaseDeployment.java:184)
at weblogic.application.internal.SingleModuleDeployment.prepare(SingleModuleDeployment.java:43)
at weblogic.application.internal.DeploymentStateChecker.prepare(DeploymentStateChecker.java:154)
at weblogic.deploy.internal.targetserver.AppContainerInvoker.prepare(AppContainerInvoker.java:60)
at weblogic.deploy.internal.targetserver.operations.ActivateOperation.createAndPrepareContainer(ActivateOperation.java:207)
at weblogic.deploy.internal.targetserver.operations.ActivateOperation.doPrepare(ActivateOperation.java:98)
at weblogic.deploy.internal.targetserver.operations.AbstractOperation.prepare(AbstractOperation.java:217)
at weblogic.deploy.internal.targetserver.DeploymentManager.handleDeploymentPrepare(DeploymentManager.java:747)
at weblogic.deploy.internal.targetserver.DeploymentManager.prepareDeploymentList(DeploymentManager.java:1216)
at weblogic.deploy.internal.targetserver.DeploymentManager.handlePrepare(DeploymentManager.java:250)
at weblogic.deploy.internal.targetserver.DeploymentServiceDispatcher.prepare(DeploymentServiceDispatcher.java:159)
at weblogic.deploy.service.internal.targetserver.DeploymentReceiverCallbackDeliverer.doPrepareCallback(DeploymentReceiverCallbackDeliverer.java:171)
at weblogic.deploy.service.internal.targetserver.DeploymentReceiverCallbackDeliverer.access$000(DeploymentReceiverCallbackDeliverer.java:13)
at weblogic.deploy.service.internal.targetserver.DeploymentReceiverCallbackDeliverer$1.run(DeploymentReceiverCallbackDeliverer.java:46)
at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:528)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
Caused by: java.io.IOException: File too large
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:282)
at weblogic.utils.io.StreamUtils.writeTo(StreamUtils.java:19)
at weblogic.utils.FileUtils.writeToFile(FileUtils.java:117)
at weblogic.utils.jars.JarFileUtils.extract(JarFileUtils.java:285)
at weblogic.servlet.internal.ArchivedWar.expandWarFileIntoDirectory(ArchivedWar.java:139)
at weblogic.servlet.internal.ArchivedWar.extractWarFile(ArchivedWar.java:108)
at weblogic.servlet.internal.ArchivedWar.<init>(ArchivedWar.java:57)
at weblogic.servlet.internal.War.makeExplodedJar(War.java:1093)
at weblogic.servlet.internal.War.<init>(War.java:186)
at weblogic.servlet.internal.WebAppServletContext.processDocroot(WebAppServletContext.java:2789)
at weblogic.servlet.internal.WebAppServletContext.setDocroot(WebAppServletContext.java:2666)
at weblogic.servlet.internal.WebAppServletContext.<init>(WebAppServletContext.java:413)
at weblogic.servlet.internal.WebAppServletContext.<init>(WebAppServletContext.java:493)
at weblogic.servlet.internal.HttpServer.loadWebApp(HttpServer.java:418)
at weblogic.servlet.internal.WebAppModule.registerWebApp(WebAppModule.java:972)
at weblogic.servlet.internal.WebAppModule.prepare(WebAppModule.java:382)

In the end, the problem was not in the Admin server, where the log entry appeared, but in one of the managed servers, which logged nothing at all.
Somehow (we have no idea how) the Node Manager process had its soft limit for max file size set to 2k blocks, and the managed server inherited that. We restarted the Node Manager, then the managed server, and the problem went away.
The diagnostic that did the trick:
cat /proc/<pid>/limits
for the managed server showed the bad limit setting, and diagnosis proceeded from there. The admin server, of course, showed "unlimited", since it was not the source of the problem.
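For anyone else hitting this, the check that exposed it looks roughly like this (a sketch; the weblogic.Name pattern and server name are examples, adjust to your own setup):
     # Find the managed server's JVM and inspect the limits it is actually running with.
     MS_PID=$(pgrep -f "weblogic.Name=managed_server_1")
     grep "Max file size" /proc/"$MS_PID"/limits

     # Healthy output looks like:
     #   Max file size   unlimited   unlimited   bytes
     # Our broken managed server showed a small soft limit here (inherited from the
     # Node Manager); restarting the Node Manager and then the managed server cleared it.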

Similar Messages

  • Disk Utility: Creating a new blank image receiving "file too large" error.

    Hello All!
    I'm trying to create a 10GB non-encrypted, non-compressed RW blank image via Disk Utility. DU runs for a few minutes then barfs out a "file too large" error. I have over 30GB free on my HDD. I tried with a smaller size of 6GB to no avail. Also tried unsuccessfully to create it from a file (about 4 GB). My ultimate goal is to create a case-insensitive image to run an extremely important program needed for high priority work productivity (i.e. WoW). Thanks in advance for any advice! You will be my new best friend if you help me resolve this. =D
    Hollie
    "There are only 10 types of people in this world: Those who understand binary, and those who don't."

    Hi Hollie, and welcome to the forums!
    Have you created images before successfully?
    Is this to/on your boot drive, or an external drive?
    Have you done any Disk/OS maintenance lately?
    We might see if there are some big temp files left or such...
    How much free space is on the HD, where has all the space gone?
    OmniDiskSweeper is now free, and likely the best/easiest...
    http://www.omnigroup.com/applications/omnidisksweeper/
    WhatSize...
    http://www.macupdate.com/info.php/id/13006/
    Disk Inventory X...
    http://www.derlien.com/
    GrandPerspective...
    http://grandperspectiv.sourceforge.net/
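    If Disk Utility keeps failing, creating the image from Terminal with hdiutil is also worth a try; a minimal sketch (size, volume name and path here are just examples):
    # Create a 10GB read/write, non-encrypted, case-insensitive HFS+ image.
    hdiutil create -size 10g -fs HFS+ -volname "Scratch" ~/Desktop/scratch.dmg
    # Mount it when done:
    hdiutil attach ~/Desktop/scratch.dmg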

  • File too large error or corrupt file error

    I have scanned some images using a Nikon Cool Scan, and when trying to import the NEF files into Lightroom I get a corrupt or unrecognized file error. If I bring them into CS2 or CS3, save as TIFF and try the import, I get a File too large error.
    Any ideas or help on this? What is the max file size for import?
    The scan is 4000dpi even tried at 300dpi.
    Thanks in advance for any insight.

    > Is it truly a size problem? If so, what is recommended? Lee Jay states that 10000 pixels is the max on either side. Okay, in DPI, what does that translate to?
    There's no necessary relationship between pixels and dots. You could scan an image at 4,000,000 dpi and translate it into an image of 100 x 100 pixels. I've used ridiculous extremes to make a point. The LR limitation is currently 10,000 pixels for any side. So you could have 9,000 x 9,000 pixels but not 10,001 x 50 pixels.
    Is this now clearer?
    John "McPhotoman" ~~ John McWilliams
    MacBook Pro 2 GHz Intel Core Duo; G5 Dual 1.8
    Canon DSLRs

  • File too large error message

    I have an iBook G4 and I am running Numbers '08. I tried to open my budget document today and got a 'The document can't be opened because it is too large' error message. I opened it just a few days ago and it was fine, and my other Numbers documents open fine. It is a spreadsheet with about 5 sheets, not really THAT big, especially considering some of the Excel spreadsheets I deal with at work. Any ideas? This is (of course) the most important document on my computer...

    (1) Comparing to the size of Excel documents is meaningless.
    Given that Numbers uses XML, its files are huge.
    (2) Try closing every application, including Numbers, then double-click your document.
    That way you will have the maximum memory available for the document.
    (3) Check the amount of free space available on your HD.
    Numbers makes heavy use of virtual memory, so it needs a lot of free space available on the HD.
    If these suggestions don't help, you may send the document to my mailbox (use the free YouSendIt if it's too big for an attachment).
    I would try to open it and scan it to see if there is a way to reduce its memory requirements.
    Click my blue name to get my address.
    Don't worry if it contains personal info; I'm neither the KGB nor the CIA.
    Yvan KOENIG (VALLAURIS, France) Sunday, August 1, 2010 10:42:09

  • XCode 3 download gives "file too large" error msg

    When I tried to download XCode 3 (XCode 3.26 and iOS SDK 4.3) from the developer website, the download manager displayed an error message just after the file finished downloading, saying that the file was too large for my hard drive (insufficient memory). Although it was a 4GB file, I had plenty of disk space. This happened twice with two different disk drives (a 250 GB Iomega and an 8GB flash drive). Any recommendations?
    Thanks,
    Josh

    I had plenty of disk space.
    That's a technical term way over my head
    What is the total capacity of your hard drive in GB?
    What is the exact amount of unused space on your hard drive?

  • Outlook for mac 2011 sent mail file too large error message

    The synchronization of outlook with gmail is slowing down because there is a stuck message that appears to be too large.
    I have tried running a script I found to delete it, and it didn't work. I also went into Offline mode and sent myself an email so I could see the Outbox, and there is nothing there.
    Suggestions appreciated.

    Normanfrompalmyra,
    Gmail does not have an outbox per se. While you are composing or writing an e-mail, it is automatically saved in the Drafts folder. After you hit send, a copy of the e-mail is saved in the Sent folder. So, you can look in both those folders.
    If you are in Outlook, click on the Home tab, then look at the left side of the Outlook window. You should see a column on the left with each of the different e-mail accounts that you have set up in Outlook. At the top of this column are the four general folders: Inbox, Drafts, Sent Items, Deleted Items. Below that are folders for each e-mail account, under which you see the folder structure you created for each e-mail account. If you know which e-mail is causing the problem, look under Drafts, Sent Items, then Deleted Items for that e-mail.
    Good luck,
    Arbor Friend

  • Posfix error writing message: File too large

    I was looking for a reason for my Mac hanging occasionally. In Console I found the following recurring error message.
    10/2/11 8:15:19 PM     postfix/master[656]     daemon started -- version 2.5.5, configuration /etc/postfix
    10/2/11 8:16:00 PM     postfix/pickup[657]     B6095D33A25: uid=501 from=<Al>
    10/2/11 8:16:00 PM     postfix/cleanup[664]     B6095D33A25: message-id=<[email protected]>
    10/2/11 8:16:00 PM     postfix/qmgr[658]     B6095D33A25: from=<[email protected]>, size=668, nrcpt=1 (queue active)
    10/2/11 8:16:00 PM     postfix/local[666]     B6095D33A25: to=<[email protected]>, orig_to=<Al>, relay=local, delay=0.08, delays=0.04/0.02/0/0.02, dsn=5.2.2, status=bounced (cannot update mailbox /var/mail/al for user al. error writing message: File too large)
    10/2/11 8:16:00 PM     postfix/cleanup[664]     C5ACFD33A28: message-id=<[email protected]>
    10/2/11 8:16:00 PM     postfix/bounce[667]     B6095D33A25: sender non-delivery notification: C5ACFD33A28
    10/2/11 8:16:00 PM     postfix/qmgr[658]     C5ACFD33A28: from=<>, size=2366, nrcpt=1 (queue active)
    10/2/11 8:16:00 PM     postfix/qmgr[658]     B6095D33A25: removed
    10/2/11 8:16:00 PM     postfix/local[666]     C5ACFD33A28: to=<[email protected]>, relay=local, delay=0.01, delays=0/0/0/0, dsn=5.2.2, status=bounced (cannot update mailbox /var/mail/al for user al. error writing message: File too large)
    10/2/11 8:16:00 PM     postfix/qmgr[658]     C5ACFD33A28: removed
    10/2/11 8:16:19 PM     postfix/master[656]     master exit time has arrived
    I haven't done anything with Postfix, but I'm guessing one of the monitoring utilities is set up to send me an email message if it finds an error.
    Looking for ideas on how to fix this.

    Well, I think I figured this out.
    I found this posting on a message board: http://www.linuxquestions.org/questions/linux-networking-3/file-too-large-in-postfix-495988/#post2480213
    I ran this command in terminal: sudo postconf -e "virtual_mailbox_limit=0" then: sudo postfix reload
    That still didn't fix the error messages.
    Then I ran this command in terminal: sudo postconf -e "mailbox_size_limit=0" then: sudo postfix reload
    That stopped the "file too large" error message.
    Now the console message I get is:
    postfix/master[1630]          daemon started -- version 2.5.14, configuration /etc/postfix
    postfix/pickup[1631]          68EA012A66BD: uid=501 from=<Al>
    postfix/cleanup[1643]          68EA012A66BD: message-id=<[email protected]>
    postfix/qmgr[1632]          68EA012A66BD: from=<[email protected]>, size=707, nrcpt=1 (queue active)
    postfix/pickup[1631]          BA07112A66BE: uid=501 from=<Al>
    postfix/cleanup[1643]          BA07112A66BE: message-id=<[email protected]>
    postfix/local[1645]          68EA012A66BD: to=<[email protected]>, orig_to=<Al>, relay=local, delay=0.82, delays=0.5/0.09/0/0.23, dsn=2.0.0, status=sent (delivered to mailbox)
    postfix/master[1630]          master exit time has arrived
    I'm guessing this is normal functioning of the postfix system. However, I'm still not sure what postfix is doing. Maybe it's the under-the-hood stuff for Apple Mail?
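    For reference, the limits involved can be checked and relaxed roughly like this (a sketch recapping the commands above; the defaults noted are Postfix's usual ones, and 0 means unlimited):
    # Show the current Postfix size limits (typical defaults: mailbox_size_limit
    # and virtual_mailbox_limit 51200000, message_size_limit 10240000).
    postconf mailbox_size_limit virtual_mailbox_limit message_size_limit
    # Remove the per-mailbox cap that caused "File too large", then reload.
    sudo postconf -e "mailbox_size_limit=0"
    sudo postfix reload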

  • Weblogic 10 - application deployment error: Exception is: "File too large"

    I posted this in WebLogic -> general but realise it should really have gone here, as it's about admin server / deployment services setup and configuration.
    I am using WebLogic Application Server 10 in a WebLogic clustered environment.
    I am trying to deploy an application to a managed server when it starts up; all goes well and I can see it deploying the war files to the managed server.
    It hits a certain war and panics with the exception
    ####<Nov 19, 2011 2:03:59 PM BRST> <Error> <Deployer> <devnode01> <managedserver2> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <1321718639109> <BEA-149205> <Failed to initialize the application 'test_war' due to error weblogic.management.DeploymentException: Exception occured while downloading files.
    weblogic.management.DeploymentException: Exception occured while downloading files
    at weblogic.deploy.internal.targetserver.datamanagement.AppDataUpdate.doDownload(AppDataUpdate.java:43)
    at weblogic.deploy.internal.targetserver.datamanagement.DataUpdate.download(DataUpdate.java:56)
    at weblogic.deploy.internal.targetserver.datamanagement.Data.prepareDataUpdate(Data.java:97)
    at weblogic.deploy.internal.targetserver.BasicDeployment.prepareDataUpdate(BasicDeployment.java:682)
    at weblogic.deploy.internal.targetserver.BasicDeployment.stageFilesForStatic(BasicDeployment.java:725)
    at weblogic.deploy.internal.targetserver.AppDeployment.prepare(AppDeployment.java:104)
    at weblogic.management.deploy.internal.DeploymentAdapter$1.doPrepare(DeploymentAdapter.java:39)
    at weblogic.management.deploy.internal.DeploymentAdapter.prepare(DeploymentAdapter.java:187)
    at weblogic.management.deploy.internal.AppTransition$1.transitionApp(AppTransition.java:21)
    at weblogic.management.deploy.internal.ConfiguredDeployments.transitionApps(ConfiguredDeployments.java:233)
    at weblogic.management.deploy.internal.ConfiguredDeployments.prepare(ConfiguredDeployments.java:165)
    at weblogic.management.deploy.internal.ConfiguredDeployments.deploy(ConfiguredDeployments.java:122)
    at weblogic.management.deploy.internal.DeploymentServerService.resume(DeploymentServerService.java:173)
    at weblogic.management.deploy.internal.DeploymentServerService.start(DeploymentServerService.java:89)
    at weblogic.t3.srvr.SubsystemRequest.run(SubsystemRequest.java:64)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    Caused By: java.io.IOException: [DeploymentService:290066]Error occurred while downloading files from admin server for deployment request "0". Underlying error is: "[DeploymentService:290065]Deployment service servlet encountered an Exception while handling the deployment datatransfer message for request id "0" from server "managedserver2". Exception is: "File too large"."
    at weblogic.deploy.service.datatransferhandlers.HttpDataTransferHandler.getDataAsStream(HttpDataTransferHandler.java:86)
    at weblogic.deploy.service.datatransferhandlers.DataHandlerManager$RemoteDataTransferHandler.getDataAsStream(DataHandlerManager.java:153)
    at weblogic.deploy.internal.targetserver.datamanagement.AppDataUpdate.doDownload(AppDataUpdate.java:39)
    at weblogic.deploy.internal.targetserver.datamanagement.DataUpdate.download(DataUpdate.java:56)
    at weblogic.deploy.internal.targetserver.datamanagement.Data.prepareDataUpdate(Data.java:97)
    at weblogic.deploy.internal.targetserver.BasicDeployment.prepareDataUpdate(BasicDeployment.java:682)
    at weblogic.deploy.internal.targetserver.BasicDeployment.stageFilesForStatic(BasicDeployment.java:725)
    at weblogic.deploy.internal.targetserver.AppDeployment.prepare(AppDeployment.java:104)
    at weblogic.management.deploy.internal.DeploymentAdapter$1.doPrepare(DeploymentAdapter.java:39)
    at weblogic.management.deploy.internal.DeploymentAdapter.prepare(DeploymentAdapter.java:187)
    at weblogic.management.deploy.internal.AppTransition$1.transitionApp(AppTransition.java:21)
    at weblogic.management.deploy.internal.ConfiguredDeployments.transitionApps(ConfiguredDeployments.java:233)
    at weblogic.management.deploy.internal.ConfiguredDeployments.prepare(ConfiguredDeployments.java:165)
    at weblogic.management.deploy.internal.ConfiguredDeployments.deploy(ConfiguredDeployments.java:122)
    at weblogic.management.deploy.internal.DeploymentServerService.resume(DeploymentServerService.java:173)
    at weblogic.management.deploy.internal.DeploymentServerService.start(DeploymentServerService.java:89)
    at weblogic.t3.srvr.SubsystemRequest.run(SubsystemRequest.java:64)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    The error appears to be stating that the physical file is too big to be deployed.
    I'm running the managed servers with a heap size of 3GB and the managed server is running with 2GB - I know these are large, but they were being used for debugging.
    I can't find any documentation on the "file size too large" error, or how to resolve it.
    DeploymentService:290065 says to look in the log (details are above), and DeploymentService:290066 says the error will be explained in its description, which it is: "file size too big". It doesn't say where to see or set the max file size. There is plenty of disk space, so I can only assume there is a setting for the deployment service that needs to be increased, but I cannot find info on it.

    I don't think this would help, but would using the nostage option for deployment change this behaviour?
    I don't think it would, as that is for disk-based problems rather than transfer size issues.

  • "result too large" error when accessing files

    Hi,
    I'm attempting to make a backup copy of one of my folders (using tar from the shell). For several files, I got a "Read error at byte 0, reading 1224 bytes: Result too large" error message. It seems those files are unreadable; whatever application attempts to access them results in the same error.
    The files reside on a volume that I created a day ago. It's a non-journaled HFS+ volume on an external hard drive. They are part of an Aperture Vault that I wanted to archive and store offsite. Aperture was closed (not running) when I was creating the archive.
    This means two things. The onsite backup of my photos is broken, obviously (some of the files are unreadable), and my offsite backup is broken, since it doesn't contain those files.
    I've searched the net and found a couple of threads on some mailing lists describing the same problem, but no answer. A couple of folks on those mailing lists suggested it might point to a full disk. However, in my case there is some 450GB of free space on the volume I was getting read errors on (the destination volume had about 200GB free, and the system drive had about 50GB free, so there was plenty of space all around the system too).
    File system corruption?
      Mac OS X (10.4.9)  

    Here's the tar command with the output:
    $ tar cf /Volumes/WINNIPEG\;TOPORKO/MacBackups/2007-05-27/aperture.tar Alex\ -\ External\ HD.apvault
    tar: Alex - External HD.apvault/Library/2003.approject/2007-03-24 @ 08\:17\:52 PM - 1.apimportgroup/IMG0187/Thumbnails/IMG0187.jpg: Read error at byte 0, reading 3840 bytes: Result too large
    tar: Alex - External HD.apvault/Library/2006.approject/2007-03-24 @ 08\:05\:07 PM - 1.apimportgroup/IMG2088/IMG2088.jpg.apfile: Read error at byte 0, reading 1224 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Jasper and Banff 2006.approject/2007-03-25 @ 09\:41\:41 PM - 1.apimportgroup/IMG1836/IMG1836.jpg.apfile: Read error at byte 0, reading 1224 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Old Scanned.approject/2007-03-24 @ 12\:42\:55 AM - 1.apimportgroup/Image04_05 (1)/Info.apmaster: Read error at byte 0, reading 503 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Old Scanned.approject/2007-03-24 @ 12\:42\:55 AM - 1.apimportgroup/Image16_02/Info.apmaster: Read error at byte 0, reading 499 bytes: Result too large
    tar: Alex - External HD.apvault/Library/Vacation Croatia 2006.approject/2007-03-25 @ 09\:47\:17 PM - 1.apimportgroup/IMG0490/IMG0490.jpg.apfile: Read error at byte 0, reading 1224 bytes: Result too large
    tar: Error exit delayed from previous errors
    Here's the "ls -l" output for one of the files in question:
    $ ls -l IMG_0187.jpg
    -rw-r--r-- 1 dijana dijana 3840 Mar 24 23:27 IMG_0187.jpg
    Accessing that file (or any other from the above list) gives the same or a similar error. The wording differs from command to command, but basically it's the same thing (read error, or result too large, or both combined). For example:
    $ cp IMG_0187.jpg ~
    cp: IMG_0187.jpg: Result too large
    The console log doesn't show any related errors.
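    One way to find every unreadable file before redoing the backup is a quick read test over the vault (a sketch; the relative path matches the tar command above, adjust it if the vault lives elsewhere):
    # Try to read every file in the vault and list the ones that error out.
    find "Alex - External HD.apvault" -type f -exec sh -c \
        'cat "$1" > /dev/null 2>/dev/null || echo "UNREADABLE: $1"' _ {} \;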

  • Page Too Large Error with JHS 10.1.3.2

    Hi,
    I have migrated to the new version of JHS (10.1.3.2). According to the JHS documentation the "page too large" error has been fixed in this version, but I am still experiencing the problem.
    I have checked the boxes for Generate Group in Region File and Generate Search Area in Region File in the app def for the large page in my app. With this done, the "page too large" error still occurs. I then tried checking the box for Generate in Region File on each item group region on my page, but the "page too large" error still happens. None of my item group regions contain that many items (the average is about 10 items).
    Is anyone having the same issues?
    Regards
    Bar

    josealej,
    Have you split up your items into different regions and selected the option to generate the groups in a region file? If so, you can try setting the "generate in region file" option on each region container. Check out section 4.6.2 in the JHS developers manual.
    Sandra,
    Unfortunately I don't have enough free time to debug the problem. I will look into it when my schedule is not as hectic.
    Bar

  • WebLogic Issue: File too large

    Hi All,
    I am getting below error in logs while starting the WLS (10.3.5 on IBM AIX 6.1 using IBM JDK) AdminServer:
    ####<Nov 8, 2012 10:28:45 PM PST> <Notice> <Security> <edrpoc10.ftb.ca.gov> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1352442525279> <BEA-090082> <Security initializing using security realm myrealm.>
    ####<Nov 8, 2012 10:28:51 PM PST> <Notice> <WebLogicServer> <edrpoc10.ftb.ca.gov> <AdminServer> <main> <<WLS Kernel>> <> <> <1352442531303> <BEA-000365> <Server state changed to STANDBY>
    ####<Nov 8, 2012 10:28:51 PM PST> <Notice> <WebLogicServer> <edrpoc10.ftb.ca.gov> <AdminServer> <main> <<WLS Kernel>> <> <> <1352442531304> <BEA-000365> <Server state changed to STARTING>
    ####<Nov 8, 2012 10:28:54 PM PST> <Warning> <oracle.as.jmx.framework.MessageLocalizationHelper> <edrpoc10.ftb.ca.gov> <AdminServer> <JMX FRAMEWORK Domain Runtime MBeanServer pooling thread> <<anonymous>> <> <0000JfZqpLg4ykJLQm5Eid1GbAAX000001> <1352442534039> <J2EE JMX-46041> <The resource for bundle "oracle.jrf.i18n.MBeanMessageBundle" with key "oracle.jrf.JRFServiceMBean.checkIfJRFAppliedOnMutipleTargets" cannot be found.>
    ####<Nov 8, 2012 10:28:57 PM PST> <Error> <Deployer> <edrpoc10.ftb.ca.gov> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1352442537493> <BEA-149205> <Failed to initialize the application 'adf.oracle.domain [LibSpecVersion=1.0,LibImplVersion=11.1.1.2.0]' due to error weblogic.application.library.LibraryDeploymentException: [J2EE:160141]Could not initialize the library Extension-Name: adf.oracle.domain, Specification-Version: 1, Implementation-Version: 11.1.1.2.0. Please ensure the deployment unit is a valid library type (war, ejb, ear, plain jar). weblogic.application.library.LibraryProcessingException: java.io.IOException: File too large
         at weblogic.application.internal.library.EarLibraryDefinition.init(EarLibraryDefinition.java:93)
         at weblogic.application.utils.LibraryLoggingUtils.initLibraryDefinition(LibraryLoggingUtils.java:277)
         at weblogic.application.internal.library.LibraryDeployment.prepare(LibraryDeployment.java:44)
         at weblogic.application.internal.DeploymentStateChecker.prepare(DeploymentStateChecker.java:154)
         at weblogic.deploy.internal.targetserver.AppContainerInvoker.prepare(AppContainerInvoker.java:60)
         at weblogic.deploy.internal.targetserver.AppDeployment.prepare(AppDeployment.java:141)
         at weblogic.management.deploy.internal.DeploymentAdapter$1.doPrepare(DeploymentAdapter.java:39)
         at weblogic.management.deploy.internal.DeploymentAdapter.prepare(DeploymentAdapter.java:191)
         at weblogic.management.deploy.internal.AppTransition$1.transitionApp(AppTransition.java:21)
         at weblogic.management.deploy.internal.ConfiguredDeployments.transitionApps(ConfiguredDeployments.java:240)
         at weblogic.management.deploy.internal.ConfiguredDeployments.prepare(ConfiguredDeployments.java:165)
         at weblogic.management.deploy.internal.ConfiguredDeployments.deploy(ConfiguredDeployments.java:122)
         at weblogic.management.deploy.internal.DeploymentServerService.resume(DeploymentServerService.java:180)
         at weblogic.management.deploy.internal.DeploymentServerService.start(DeploymentServerService.java:96)
         at weblogic.t3.srvr.SubsystemRequest.run(SubsystemRequest.java:64)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    Caused by: java.io.IOException: File too large
         at java.io.FileOutputStream.writeBytes(Native Method)
         at java.io.FileOutputStream.write(FileOutputStream.java:282)
         at weblogic.utils.io.StreamUtils.writeTo(StreamUtils.java:19)
         at weblogic.utils.FileUtils.writeToFile(FileUtils.java:117)
         at weblogic.utils.jars.JarFileUtils.extract(JarFileUtils.java:285)
         at weblogic.utils.jars.JarFileUtils.extract(JarFileUtils.java:246)
         at weblogic.application.io.ExplodedJar.extractJarFile(ExplodedJar.java:301)
         at weblogic.application.io.ExplodedJar.<init>(ExplodedJar.java:54)
         at weblogic.application.io.Ear.<init>(Ear.java:47)
         at weblogic.application.internal.library.EarLibraryDefinition.init(EarLibraryDefinition.java:81)
         ... 16 more
    ####<Nov 8, 2012 10:28:59 PM PST> <Error> <Deployer> <edrpoc10.ftb.ca.gov> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1352442539037> <BEA-149205> <Failed to initialize the application 'emai' due to error weblogic.application.library.LibraryDeploymentException: [J2EE:160141]Could not initialize the library Extension-Name: emai. Please ensure the deployment unit is a valid library type (war, ejb, ear, plain jar). weblogic.application.library.LibraryProcessingException: java.io.IOException: File too large
    Any pointers on resolving the issue?
    Regards,
    Sunny
    Edited by: ajmerasunny on Nov 9, 2012 10:23 AM

    Hi Sunny,
    The issue is that large file support is not enabled for the user on your AIX OS.
    How do you enable large file support on AIX?
    In the file /etc/security/limits, change the value of "fsize" to -1 ("-1" means unlimited). Log off and log in again, then stop and start the applications for large files to work.
    Ref: http://unixfoo.blogspot.in/2008/11/aix-filesystem-tips.html
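    The check and the change look roughly like this (a sketch; run as root, and the user name here is just an example):
    # Show the current file-size limit for the user that runs WebLogic.
    lsuser -a fsize wlsadmin
    # Set it to unlimited, either by editing /etc/security/limits or with:
    chuser fsize=-1 wlsadmin
    # The user must log off and back in, and WebLogic must be restarted,
    # before the new limit takes effect.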

  • File too large - attachment settings not working

    Hi there
    We are having problems with attaching files in IMS 5.2 & wondered if anybody can help.
    Our outgoing mail message max size is set to 50MB (I know about the extra 33% space required for encoding) and yet we still cannot attach files to emails that are greater than 5MB.
    Does anyone have any idea why this is not working?
    Any time we try to send a 7 or 8 MB file, a "File too large" error comes up right away.
    The setting in the Messaging Server console is 50MB. This is under the HTTP service.
    There was a previous post but the solution did not solve my problem.
    Can anyone help?
    Thanks

    There are separate settings for webmail attachments. Please check the documentation at:
    http://docs.sun.com/source/816-6020-10/cfgutil.htm
    and look at:
    service.http.maxmessagesize
    and
    service.http.maxpostsize
    these both default to 5 megs.
    You have to restart the webmail daemon to make a change take effect.
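    For reference, the webmail limits can be checked and raised with configutil roughly like this (a sketch; the 50MB value is an example, and values are in bytes):
    # Show the current webmail (HTTP service) limits; both default to 5MB.
    configutil -o service.http.maxmessagesize
    configutil -o service.http.maxpostsize
    # Raise both to 50MB (52428800 bytes), then restart the webmail daemon
    # so the change takes effect.
    configutil -o service.http.maxmessagesize -v 52428800
    configutil -o service.http.maxpostsize -v 52428800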

  • Too large error troubleshooting jdev10.1.3.3- jhs10.3.2.52

    Hi, how are you? In my JHeadstart application definition I have cases where a detail or detail-detail group has a list of values. When I run the application I get the following error: Error: code segment of method _jspService(javax.servlet.http.HttpServletRequest, javax.servlet.http.HttpServletResponse) too large
    I then set the Generate Group in Region File and Generate Search Area in Region File properties, generate, and run the application, but I cannot open the detail groups or the lists of values in the master groups. Then I uncheck these properties and run again, but sometimes I get the same too large error, which prevents me from running the application. I've heard that if one buys the JHeadstart license one can get access to a patch that addresses the too large problem. We have the JHeadstart license from Oracle; could you point me to any possible solution to this problem?

    Yes, if you have a JHeadstart license, you can download the latest 10.1.3.3 release from cso.oracle.com.
    Can you download this version and see whether it solves your problem?
    Steven Davelaar,
    JHeadstart team.

  • Code too large error

    Hi,
    I have an enum with about 3000 values in my application. When I add some more items I get a "code too large" error during compilation.
    Is there any workaround for that problem? (Some configuration change?)
    What exactly is the compiler trying to tell me? Where can I find info about which limit I am exceeding?
    I will appreciate any kind of help :)
    Greetings
    Michal

    What can you do? The model of the form is not hardcoded; it is in the DB and is modifiable by the user (admin). Only elements which have specific business validation have an identifier (I mean an enum identifier). They must be identified somehow, not just by number, since the number may change if someone adds a question before this one.
    What's better than that? The solution was made before I came to the company, but I believe it is one of the best possible.

  • TFTP file too large for upload

    I'm trying to upgrade my router via TFTP. I keep getting this File too large for TFTP error. I'm using the recommended TFTP server from Solarwinds.
    There does not seem to be any setting in the server to let large files pass. It's the first time I've seen this, but this is the biggest IOS image I've had to upload. I had no problem sending the last IOS, which is only about 3MB smaller.

    correct. copy ftp://userid:password@servername/directory/filename flash:
    For more information, refer to the following URL:
    http://www.cisco.com/univercd/cc/td/doc/product/software/ios124/124tcr/tcf_r/cf_02ht.htm#wp1032450
    Hope this helps,
