Playing a large Clip and "Requested buffer too large" exception

Hi All.
Through JavaSound, I am trying to play back a WAV audio file that I create within my application. The recording/acquisition of the audio file is also performed through JavaSound.
The WAV audio file that I am trying to play has the following features:
- 44 kHz, 8-bit, mono
- Length = 1min 36sec
- File size = 4.04 MB
Here is a fragment of my code:
// Get the input stream from the input audio file
audioInStream = AudioSystem.getAudioInputStream(inputFile);
AudioFormat audioFormat = audioInStream.getFormat();
// Compute the buffer size (for a 0.25 seconds buffer)
float bufferTime = 0.25F; // 0.25 seconds of buffer
int frameSize = audioFormat.getFrameSize();
float frameRate = audioFormat.getFrameRate(); // frames per second
int numOfBufferedFrames = (int)(frameRate * bufferTime);
int bufferSize = frameSize * numOfBufferedFrames;
// Create the output data line
DataLine.Info sourceDataLineInfo = new DataLine.Info(Clip.class, audioFormat, bufferSize);
sourceClip = (Clip)AudioSystem.getLine(sourceDataLineInfo);
sourceClip.addLineListener(this);
// Prepare the output line for playing
sourceClip.open(audioInStream);
// Start playing
sourceClip.start();

This code works perfectly fine on Windows (I am testing on a Windows XP Pro 32-bit machine).
On the other hand, the same code throws the following exception on Mac OS X:
javax.sound.sampled.LineUnavailableException: Failed to allocate clip data: Requested buffer too large.
     at com.sun.media.sound.MixerClip.implOpen(MixerClip.java:561)
     at com.sun.media.sound.MixerClip.open(MixerClip.java:165)
     at com.sun.media.sound.MixerClip.open(MixerClip.java:256)
     at mypackage.AudioPlaybackThread.run(AudioPlaybackThread.java:70)
     at java.lang.Thread.run(Thread.java:637)

The exception is thrown by the sourceClip.open(audioInStream); call.
Googling around, I discovered that the MixerClip implementation seems to have an inherent buffer size limit of about 1 MB... Isn't that ridiculous?
But why does this work perfectly fine on Windows - even with clips of 4 megabytes - while failing with larger clips on Mac OS X?
Also, will the following code line influence the size of the buffer used by MixerClip?
DataLine.Info sourceDataLineInfo = new DataLine.Info(Clip.class, audioFormat, bufferSize);

I know that I could use a SourceDataLine and write() data into it to play my long audio clip, but I need support for pausing, seeking to a specific position, seeking forward, etc., and a Clip would be absolutely perfect for this purpose!
Any help or suggestion would be greatly appreciated.
Thanks,
Marco

> Googling around, I discovered that the MixerClip implementation seems to have an inherent buffer size limit of about 1 MB... Isn't that ridiculous?
> But why does this work perfectly fine on Windows - even with clips of 4 megabytes - while failing with larger clips on Mac OS X?

Because Apple changes stuff in the JRE and Microsoft doesn't... and I personally believe Mac specifically strips down the JavaSound stuff because of iTunes (anti-competition and such).

> Also, will the following code line influence the size of the buffer used by MixerClip?
> DataLine.Info sourceDataLineInfo = new DataLine.Info(Clip.class, audioFormat, bufferSize);

Probably not, no.

> I know that I could use a SourceDataLine and write() data into it to play my long audio clip, but I need support for pausing, seeking to a specific position, seeking forward, etc., and a Clip would be absolutely perfect for this purpose!

You can program that functionality yourself, and if you want to open large files on Mac, it sounds like you'll have to.
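A pause/seek wrapper around a SourceDataLine is not much code. Here is a minimal sketch of one way to do it (the SeekablePlayer class name and its methods are made up for this example, and error handling is kept to a minimum):

import javax.sound.sampled.*;
import java.io.*;

// Sketch only: stream a WAV file through a SourceDataLine with
// pause/resume and frame-based seeking layered on top.
public class SeekablePlayer implements Runnable {

    private final File file;
    private volatile boolean paused = false;
    private volatile long seekToFrame = -1;   // -1 means no pending seek

    public SeekablePlayer(File file) { this.file = file; }

    public void pause()          { paused = true; }
    public void resumePlayback() { paused = false; }
    public void seek(long frame) { seekToFrame = frame; }

    @Override
    public void run() {
        try {
            AudioInputStream in = AudioSystem.getAudioInputStream(file);
            AudioFormat format = in.getFormat();
            int frameSize = format.getFrameSize();

            // Read the whole file into memory once so seeking is just an index change.
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            for (int n; (n = in.read(buf)) != -1; ) bos.write(buf, 0, n);
            byte[] audio = bos.toByteArray();
            in.close();

            SourceDataLine line = AudioSystem.getSourceDataLine(format);
            line.open(format);
            line.start();

            int pos = 0;                          // current position in bytes
            int chunkSize = frameSize * 4096;     // feed the line in small chunks

            while (pos < audio.length) {
                if (seekToFrame >= 0) {           // honour a pending seek
                    pos = (int) Math.min(seekToFrame * frameSize, audio.length);
                    seekToFrame = -1;
                }
                if (paused) {                     // crude pause: stop feeding the line
                    Thread.sleep(50);
                    continue;
                }
                int n = Math.min(chunkSize, audio.length - pos);
                line.write(audio, pos, n);
                pos += n;
            }
            line.drain();
            line.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

A real implementation would also track the current frame for a position display, and on pause it could call line.stop() so that audio already queued in the line stops immediately instead of draining out.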

Similar Messages

  • Requested buffer too large - but data is already in memory

    Hello all,
    I am writing a program that generates sound and then uses the Java Sound API to play it back over the speakers. Until recently, using Clips has not led to any problems. On two computers I can play the sound without a hitch. However, on the newest computer (which also has the largest specs, especially more RAM), I get an error while trying to play back the sound. The exception that is thrown is:
    javax.sound.sampled.LineUnavailableException: Failed to allocate clip data: Requested buffer too large.
    I find this odd because the buffer already exists in memory: I don't have to read in a .wav file or anything, because I am creating the audio during the course of my program's execution (this is also why I use Clips instead of streaming - the values are saved as doubles during the calculations and then converted into a byte array, which is the buffer that is used in the clip.open() method call). It has no problems allocating the double array, the byte array, or populating the byte array. The exception is only thrown during the clip.open() call. I also find it strange that it works on two other computers, both of which have less RAM (it runs fine on machines with 512 MB and 2 GB of RAM, both XP 32-bit). The only difference is that the computer with the issue is running Windows 7 (the RTM build), 64-bit with 6 GB of RAM. I am running it through NetBeans 6.7.1 with memory options set to use up to 512 MB - but it has never gone up that far before. And I've checked the size of the buffer on all three computers and they are all the same.
    Does anyone know what the issue could be or how to resolve it? I am using JDK6 if that matters. Thank you for your time.
    Edited by: Sengin on Sep 18, 2009 9:40 PM

    Thanks for your answer. I'll try that.
    I figured it had something to do with Windows 7 since it technically hasn't been released yet (however, I have the RTM version thanks to a group at my university in cahoots with Microsoft which allows some students to get various Microsoft products for $12).
    Edit: I just changed the Clip to a SourceDataLine (plus the few other necessary changes, like changing the way the DataLine.Info object was created), wrote the whole buffer into it, drained the line and then closed it. It works fine. I'll mark the question as answered; however, that may not be the "correct" answer (perhaps it does have something to do with Windows 7 not being completely tested yet). Thanks.
    Edited by: Sengin on Sep 21, 2009 8:44 PM
    Edited by: Sengin on Sep 21, 2009 8:46 PM
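    For reference, the switch Sengin describes boils down to something like this (a minimal sketch; the playBuffer method name is made up, and it assumes the generated audio is already in a byte array with a known AudioFormat):

    import javax.sound.sampled.*;

    // Sketch only: play an in-memory PCM buffer through a SourceDataLine instead of a Clip.
    public static void playBuffer(byte[] data, AudioFormat format)
            throws LineUnavailableException {
        DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
        SourceDataLine line = (SourceDataLine) AudioSystem.getLine(info);
        line.open(format);
        line.start();
        line.write(data, 0, data.length);  // blocks until the whole buffer is queued
        line.drain();                      // wait for playback to finish
        line.close();
    }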

  • (413) Request Entity too large intermittent error

    I have page in a SharePoint 2013 website which is viewed over https. The page has several input fields including two file upload controls. I am trying to upload a sample picture less than 1MB for each of the controls.
    I am calling a BizTalk WCF service on Submit. Sometimes, when I try to submit, I get '413 Request Entity Too Large'. The error is intermittent: if I submit the same data a number of times, it fails sometimes and works other times.
    The binding settings for the service are set in code (not in Web.Config) as shown below ...
    var binding = RetrieveBindingSetting();
    var endpoint = RetrieveEndPointAddress("enpointAddress");
    var proxy = new (binding, endpoint);   // the proxy type name was omitted in the original post
    proxy.Create(request);

    public BasicHttpBinding RetrieveBindingSetting()
    {
        var binding = new BasicHttpBinding
        {
            MaxBufferPoolSize = 2147483647,
            MaxBufferSize = 2147483647,
            MaxReceivedMessageSize = 2147483647,
            MessageEncoding = WSMessageEncoding.Text,
            ReaderQuotas = new System.Xml.XmlDictionaryReaderQuotas
            {
                MaxDepth = 2000000,
                MaxStringContentLength = 2147483647,
                MaxArrayLength = 2147483647,
                MaxBytesPerRead = 2147483647,
                MaxNameTableCharCount = 2147483647
            },
            Security =
            {
                Mode = BasicHttpSecurityMode.Transport,
                Transport = { ClientCredentialType = HttpClientCredentialType.Certificate }
            }
        };
        return binding;
    }
    I have also set the uploadReadAheadSize in the applicationHost.config file on IIS by running the command below, as suggested here ...
    appcmd.exe set config "sharepoint" -section:system.webserver/serverruntime /uploadreadaheadsize:204800 /commit:apphost
    Nothing I do seems to fix this issue so I was wondering if anyone had any ideas?
    Thanks

    Sounds like it's not a SharePoint problem, does the page work correctly if you don't call the BizTalk WCF service? And what happens if a console app is calling the WCF service independently of SharePoint, does it fail then as well? In both cases, it would
    limit the troubleshooting scope to the WCF service, which gets you a step further.
    Kind regards,
    Margriet Bruggeman
    Lois & Clark IT Services
    web site: http://www.loisandclark.eu
    blog: http://www.sharepointdragons.com

  • When editing a wiki page, get a 'request entity too large' error message.

    https://stbeehive.oracle.com/teamcollab/wiki/Sales+Playbooks:Demonstrating+Differentiators
    I'm trying to edit one of my wiki pages that has been static for about 5 months now, and when I try and save the page, I get the following message (note I cropped some because of formatting issues when posting):
    *413 Request Entity Too Large*
    HTTP/1.1 413 Request Entity Too Large Date: Tue, 18 Oct 2011 15:35:41 GMT Server: Oracle-Application-Server-10g Connection: close Transfer-Encoding: chunked Content-Type: text/html; charset=iso-8859-1
    Request Entity Too Large
    The requested resource
    /teamcollab/wiki/<!DOCTYPE html ...> [followed by several kilobytes of HTML source from a "Collabsuite Outage" splash page appended to the URL - trimmed here]
    does not allow request data with GET requests, or the amount of data provided in the request exceeds the capacity limit.
    Additionally, a 413 Request Entity Too Large error was encountered while trying to use an ErrorDocument to handle the request.
    The page, should you wish to eyeball it, is at:
    https://stbeehive.oracle.com/teamcollab/wiki/Sales+Playbooks:Demonstrating+Differentiators

    Duane,
    This looks like the URL has the content of a wiki page attached to it, which is blowing up the GET request. Can you go back to an earlier version - the history should allow you to backtrack changes. If you access this earlier version and change something small, does it save OK? If so, then maybe the change you made is the problem.
    I cannot access the workspace without being given explicit access so this is a guess.
    Phil

  • Keep getting 413 Request Entity too large for only one website

    When I go to www.stapleseasyrebate.com my computer goes nuts and then I get a 413 Request entity too large. It only seems to be this one website.

    This issue can be caused by corrupted cookies or cookies that are blocked.
    *check the permissions on the about:permissions page and in "Tools > Page Info > Permissions"
    Clear the cache and remove cookies only from websites that cause problems.
    "Clear the Cache":
    *Firefox > Preferences > Advanced > Network > Cached Web Content: "Clear Now"
    "Remove Cookies" from sites causing problems:
    *Firefox > Preferences > Privacy > "Use custom settings for history" > Cookies: "Show Cookies"

  • Sync error: Uploading records failed: "client issue: request body too large"

    1328917629619 Sync.Engine.AdblockPlus DEBUG Uploading records failed: "client issue: request body too large"
    This happens from two Win 7 computers and one Win XP computer being synced. Without Adblock Plus being synced, everything works.

    Hi!
    That's a problem with the Adblock Plus sync engine. It seems that they have issues on their servers.
    Try to ask the same question in their forum: https://adblockplus.org/forum/
    Good luck!

  • Request Entity Too Large?

    What does "Request Entity Too Large" mean? It keeps preventing me from seeing ANY of the posts I try to open.

    Try clearing your cache and cookies in the browser. That should temporarily fix the problem until Lithium comes up with a permanent fix.
    WyreNut
    I am a Volunteer here, not employed by HP.
    You too can become an HP Expert! Details HERE!
    If my post has helped you, click the Kudos Thumbs up!
    If it solved your issue, Click the "Accept as Solution" button so others can benefit from the question you asked!

  • Invoking service from OSB throws Request Entity Too Large fault

    Invoking a .NET service with a large (>1 MB) base64Binary field from OSB returns a "BEA-380000: Request Entity Too Large" error.
    How do I enlarge the maximum message size?

    You have to slightly adjust WebLogic configuration:
    http://download.oracle.com/docs/cd/E12840_01/wls/docs103/config_wls/web_server.html#wp1059784

  • Best way to organise multiple large clips and hundreds of subclips

    Situation
    I have the task of producing a suite of short 2 minute videos comprising highlights of several hours of footage in multiple large files.
    I want to extract and manage the highlights in a non-destructive way and organise them according to subject matter.
    I would then use these in several projects to create the highlights videos required.
    I am using Adobe CC, so it's CS6.
    My approach so far is to:
    1. Import all the footage into a single project as individual large clips
    2. Scrub through each clip, set in/outs and extract to subclip, store in subject matter bin
    My questions:
    1. If I go back to the original footage and edit it, the edit doesn't seem to flow through to the subclips. How do I make the subclips inherit the attributes of their parent (e.g. sound, colour, etc.)?
    2. how do I access these highlights bins from another project?
    3. is this the best way to handle my situation?
    Thanks for your input.
    Ric

    I suppose applying the effects to the source clip and rendering out a new source file is one way of doing it.  Just be very careful that you don't compress the output.  And, of course, it will double the amount of hard drive space.  I'd still make all the sub-clips first and use the "paste attributes" command.  I think you can even lasso a whole bunch of clips and paste attributes to all of them at once (they have to be in the timeline, though - not the project panel).
    It shouldn't take an awful lot of hard drive space to duplicate projects.  Just duplicate the "prproj" file, not the whole projects folder (and certainly not the source material).
    Here's a screen shot of my set-up for a complicated, 5-camera shoot of a play:
    The original project (syncing all the cameras up) is 1 MB, each successive archive gets a little bigger but the current project is only 3.8 MB.  I would imagine if you are deleting bins and sequences for the mini-projects, the prproj file would actually get smaller than the original master.
    Okay.. so here's the mantra about non-tape based video... First you back up the card (including the complete folder structure) onto an archive drive.  Next you copy this folder onto an external or RAID or other "real" archive drive.  Then (and only then) do you start editing the material!
    Ideally, you need at least three physically separate hard drives (not partitions) in an editing station (this is true on Windows or Mac).  One drive for the operating system and applications only (no media or projects or anything!).  One for working projects (each project in its own folder) where everything except the source video is stored.  And finally one for the source video.  On my system my OS drive is Tardis ('cuz I'm a geek!).  I have a Projects drive for projects (duh!), a Scratch drive for working video files, and an Archive for "finished" projects awaiting delivery and back-ups of the back-ups of my SD cards.

  • Request-URI Too Large

    Hi, I am working on a project that involves exporting Excel spreadsheets to a JSP script for special charting, and as a result the URLs can be very long, like: http://preview.tinyurl.com/p9vykm (full URL shown in the tinyurl preview). Servers typically complain about such a URL being too large to process, even though browsers like Firefox will happily accept very long URLs. So obviously an alternative method of passing this data to my JSP script must be sought, and it must be manageable from VBA, which is unfortunately what interacts between the Excel spreadsheet and this script. My thought was base64 encoding the parameters, passing that to the JSP script, and then having the JSP script decode the parameters and process them. Is that possible, and if so, how? If not, how do you think it best to approach this issue? Thanks!

    Yes, it's true. Servers are allowed to put a limit on the length of a URL when it's used for a GET request. There's nothing you can do about that. However if you use a POST request, then the parameters are passed differently and they don't count as part of the length. I have no idea whether VBA can be made to use POST instead of GET.
    I don't see why Base64 encoding would help, since it's guaranteed to increase the size of the string by one third. You need something which reduces the size of the string.
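    For what it's worth, switching to POST from Java (or whatever HTTP client ends up being used) looks roughly like this (a sketch only; the URL and the "data" parameter name are placeholders, and the JSP would read the value back with request.getParameter("data")):

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.net.URLEncoder;

    // Sketch only: send a large payload in the POST body instead of the query string.
    public static void postChartData(String chartData) throws Exception {
        URL url = new URL("http://example.com/chart.jsp");   // placeholder URL
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        String body = "data=" + URLEncoder.encode(chartData, "UTF-8");
        OutputStream out = conn.getOutputStream();
        out.write(body.getBytes("UTF-8"));
        out.close();

        System.out.println("HTTP response code: " + conn.getResponseCode());
    }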

  • Trying to send vid clip results in "file too large". Yet friends can send me clips larger. Why is this?

    I cannot send video clips I have taken to friends.  I get "File too large". Yet friends seem to be able to send ME larger clips from their phones.
    Do they have some app or something that zips or compresses files automatically?

    Try reproducing the issue in Windows safe mode with networking... I think the guess of a synchronization issue was correct; I just think the blame finger went to the wrong place. Windows safe mode should confirm my guess - or prove me wrong.
    Note that I can move messages in IMAP to Live from Junk to Inbox and they are not corrupt.

  • SP01 - display spool request, buffer too short

    Dear all,
    when trying to display a spool request in transaction SP01, we are getting the error: internal error 0000000040 / 32 in spooler (buffer too short)
    But the same request could be displayed via transaction SPO10.
    R/3 4.6C.
    Any idea ?
    Thank you for the answer.
    Pavol Simko
    Edited by: Pavol Simko on Nov 3, 2009 3:18 AM

    In SM37:
    Goto>Display Requests>Settings...
    Then in "Display area"-->"From page"
    Set it to from page 1 - "no of pages" then try to display
    the spool again.

  • HT1338 Since having my Macbook Pro attached to a large display screen, everything is too large for the screen. Is there a cure please?

    Today I had my Mac linked to a projector screen, and since disconnecting, everything from the desktop onwards has been too large for the screen. Before that everything was perfect. I have tried the zoom buttons etc., but nothing seems to work. Any help gratefully received.

    So it's not the Zoom feature?  Open System Preferences > Universal Access pane > Seeing tab.  Make sure Zoom is turned OFF.
    If it's not that, then you may have the resolution set so that it is not optimal for the built-in display.  Open System Preferences > Displays pane > Display tab.  The optimal resolution is usually the one at the top of the list of choices.

  • Mkisofs: Value too large for defined data type too large

    Hi:
    Has anyone met this problem when using the mkisofs command?
    <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
    Warning: creating filesystem that does not conform to ISO-9660.
    mkisofs 2.01 (sparc-sun-solaris2.10)
    Scanning iso
    Scanning iso/rac_stage1
    mkisofs: Value too large for defined data type. File iso/rac_stage3/Server.tar.gz is too large - ignoring
    Using RAC_S000 for /rac_stage3 (rac_stage2)
    >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
    Thanks!

    An update on the original bug. This is the direct link to launchpad bug 455122:
    https://bugs.launchpad.net/ubuntu/+sour … bug/455122

  • I just created a slide show with music and tried to export it to a thumb drive to play on another computer.  File seemed too large and slow to export, so I deleted the music. Now, I can't get the music back on it and don't know what to do...

    Can't export my slide show.  Is the file too big? Tried to export without music.  Now, can't get the music back into it.

    If you want help you’ll need to give us more information. There are 9 different versions of iPhoto and they run on 8 different versions of the Operating System. The tricks and tips for dealing with issues vary depending on the version of iPhoto and the version of the OS. So to get help you need to give as much information as you can. Include things like:
    - What version of iPhoto.
    - What version of the Operating System.
    - Details. As full a description of the problem as you can. For example, if you have a problem with exporting, then explain by describing how you are trying to export, and so on.
    - History: Is this going on long? Has anything been installed or deleted? - Are there error messages?
    - What steps have you tried already to solve the issue.
    - Anything unusual about your set up? Or how you use iPhoto?
    Anything else you can think of that might help someone understand the problem you have.
