J2ME maximum response byte size (HTTP)

Hi,
I am developing an end-to-end application using J2ME and servlets.
I am getting the response as XML from the servlet and parsing it in my J2ME client.
I get an error if the total bytes returned from the server exceed 2000 bytes.
Is there a maximum incoming or outgoing message size in bytes for J2ME applications? I am connecting to the server through HTTP.
thanks,
RP

There is no official maximum, but some devices or networks may impose a maximum. In working with the Nokia 7210, for example, I found that large responses would either return an error from the WAP gateway or would cause intermittent IOExceptions on the phone. I ended up limiting the response size to 30000 bytes for the Nokia 7210.
With most devices, there is no maximum response size.
I have encountered a WAP gateway that required the response to a Nokia 6310i to be less than 2800 bytes, but I don't think a gateway like this would be encountered much by real users since it essentially kills over-the-air provisioning.
If you're getting an error with the J2ME Wireless Toolkit or a device that doesn't limit the response size, then it's probably a bug in your code. I frequently see developers fail to realize that InputStream.read(byte[] b) may return before reading b.length bytes; an error like this would result in the symptoms you're seeing.
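The read() pitfall described above can be sketched in plain Java. On a MIDP device you would obtain the InputStream from an HttpConnection opened via Connector.open(); that part is omitted here, so treat this as a hedged sketch rather than device-tested code (the class and method names are made up for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFully {
    // Reads the stream to the end, regardless of how many bytes
    // each individual call to read() happens to return.
    public static byte[] readFully(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int n;
        // A single read() may legally return fewer than buf.length bytes;
        // looping until -1 is what guarantees you get the full response.
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}
```

Code that instead calls read(buf) once and assumes the buffer is full will appear to work for small responses and then "truncate" larger ones, which matches the symptoms described in the question.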

Similar Messages

  • How do I get the byte size of a server file before sending output via HTTP?

    I need to get the byte size of the file prior to streaming it. I can't seem to find the class/method I need. Basically, I have the path c:\\tomcat\\webapps\\documents\\sample.pdf in the servlet; I was hoping I could just use something from the File class, but I couldn't find anything that seems to do the trick.
    thanks, in advance,
    Chuck

    maybe the source of the problem will help... I am trying to stream a PDF to IE and a blank page is being generated, although all other file types work.
    I have found a lot of answers in the forum but no specific code examples. Here's what I have so far from picking through threads in here (can someone please show me how to get the byte size of the file so that I can assign it to response.setContentLength()?):
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.ServletOutputStream;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String target = request.getParameter("filename");

        // Take the last THREE characters: the original end-2 only took two,
        // so "DOC", "PDF", etc. could never match
        String type = target.substring(target.length() - 3).toUpperCase();
        String contentType;
        if (type.equals("DOC")) {
            contentType = "application/msword";        // single slash, not "application//..."
        } else if (type.equals("XLS")) {
            contentType = "application/vnd.ms-excel";
        } else if (type.equals("PPT")) {
            contentType = "application/vnd.ms-powerpoint";
        } else if (type.equals("PDF")) {
            contentType = "application/pdf";           // the wrong PDF type is a likely cause of the blank page
        } else if (type.equals("MPP")) {
            contentType = "application/vnd.ms-project";
        } else if (type.equals("ZIP")) {
            contentType = "application/zip";
        } else if (type.equals("TXT")) {
            contentType = "text/plain";
        } else {
            contentType = "text/html";
        }

        // Reset the response first, then set headers before writing any body bytes
        response.reset();
        response.setContentType(contentType);

        // This answers the question above: File.length() gives the byte size.
        // It returns a long, and setContentLength() takes an int, so cast.
        File f = new File(target);
        response.setContentLength((int) f.length());

        // Get streams
        FileInputStream fileInputStream = new FileInputStream(target);
        ServletOutputStream servletOutputStream = response.getOutputStream();
        try {
            int bytesRead;
            byte[] byteArray = new byte[4096];
            // Read in bytes through the file stream, write out through the servlet stream
            while ((bytesRead = fileInputStream.read(byteArray)) != -1) {
                servletOutputStream.write(byteArray, 0, bytesRead);
            }
            // Flush once, after the loop; flushing and closing inside the loop
            // (as in the original) cuts the stream short on the first pass
            servletOutputStream.flush();
        } finally {
            servletOutputStream.close();
            fileInputStream.close();
        }
    }

  • Maximum Data file size in 10g,11g

    DB Versions:10g, 11g
    OS & versions: Aix 6.1, Sun OS 5.9, Solaris 10
    This is what Oracle 11g Documentation
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28320/limits002.htm
    says about the maximum data file size:
    "Operating system dependent. Limited by maximum operating system file size; typically 2^22 or 4 M blocks."
    I don't understand what this 2^22 thing is.
    In our AIX machine and ulimit command show
    $ ulimit -a
    time(seconds)        unlimited
    file(blocks)         unlimited  <-------------------------------------------
    data(kbytes)         unlimited
    stack(kbytes)        4194304
    memory(kbytes)       unlimited
    coredump(blocks)     unlimited
    nofiles(descriptors) unlimited
    threads(per process) unlimited
    processes(per user)  unlimited
    So, this means that in AIX both the OS and Oracle can create a data file of any size. Right?
    What about 10g, 11g DBs running on Sun OS 5.9 and Solaris 10 ? Is there any Limit on the data file size?

    How do I determine the maximum number of blocks for an OS?
    df -g would give you the block size. The OS block size is 512 bytes on AIX.
    Let's say the db_block_size is 8K. What would the maximum data file size be in a smallfile tablespace and a bigfile tablespace?
    Smallfile (traditional) tablespaces: a smallfile tablespace is a traditional Oracle tablespace, which can contain 1022 datafiles or tempfiles, each of which can contain up to approximately 4 million (2^22) blocks, i.e. 32 GB with 8K blocks.
    A bigfile tablespace contains only one datafile or tempfile, which can contain up to approximately 4 billion (2^32) blocks. The maximum size of the single datafile or tempfile is 128 terabytes (TB) for a tablespace with 32K blocks and 32 TB for a tablespace with 8K blocks.
    HTH
    -Anantha
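The arithmetic above can be checked directly. This is just a sketch of the multiplication (max blocks per datafile times db_block_size), not anything Oracle-specific; it assumes the 8K block size from the question:

```java
public class DatafileLimits {
    // Maximum datafile size = maximum blocks per file * db_block_size
    static long maxBytes(long maxBlocks, long blockSize) {
        return maxBlocks * blockSize;
    }

    public static void main(String[] args) {
        long blockSize = 8 * 1024;                         // db_block_size = 8K
        long smallfile = maxBytes(1L << 22, blockSize);    // ~4 million blocks per smallfile datafile
        long bigfile   = maxBytes(1L << 32, blockSize);    // ~4 billion blocks per bigfile datafile

        System.out.println(smallfile / (1L << 30) + " GB"); // prints "32 GB"
        System.out.println(bigfile / (1L << 40) + " TB");   // prints "32 TB"
    }
}
```

With 32K blocks the bigfile figure becomes 2^32 * 32K = 128 TB, matching the numbers quoted above.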

  • Maximum recommended file size for public distribution?

    I'm producing a project with multiple PDFs that will be circulated to a group of seniors aged 70 and older. I anticipate that some may be using older computers.
    Most of my PDFs are small, but one at 7.4 MB is at the smallest size I can output the document as it stands. I'm wondering if that size may be too large. If necessary, I can break it into two documents, or maybe even three.
    Does anyone with experience producing PDFs for public distribution have a sense of a maximum recommended file size?
    I note that at http://www.irs.gov/pub/irs-pdf/ the Internal Revenue Service hosts 2,012 PDFs, of which only 50 are 4 MB or larger.
    Thanks!

    First, open the PDF and use the Optimizer to examine it.
    A lot of times when I create PDFs I end up with a half-dozen copies of the same font and font faces. If you remove all the duplicates, that will reduce the file size tremendously.
    Another thing is to reduce the DPI of any graphics; even for printing they don't need to be any larger than 200 DPI.
    If they are going to be viewed on a computer screen only, no more than 150 DPI tops, and if you can get by with 75 DPI that will be even better.
    Once you set up the optimized file, save it under a different name and see what size it turns out. Those two things can sometimes reduce file size by as much as two-thirds.

  • Maximum hard drive size for 350 MHz slot loading?

    Hello,
    What is the maximum hard drive size for a 350 MHz slot loading imac?
    I am aware of a 128 GB limit on imacs. Does the 128 GB hard drive limit apply to all imac CRTs or just the earlier models?
    Here are the specs of my imac:
    Blueberry 350 MHz
    Slot loading CD-ROM drive,
    6 GB hard drive
    Does the maximum hard drive size limit of 127 GB apply to this particular model?
    Thanks

    Yes, the last CRT iMac was in 2001. The first machines with built-in greater than 127 GB support were released in June 2002, as this article explains:
    http://docs.info.apple.com/article.html?artnum=86178
    So all CRT iMacs have that limitation.
    http://support.apple.com/specs/imac/
    The 8 GB limit is explained here:
    http://docs.info.apple.com/article.html?artnum=106235

  • DV9000 Maximum Hard Disk Size Supported?

    I'm searching for a used HP Pavilion DV9000.  I need to know the maximum hard disk size the DV9000 series laptops support.  To be clear, could I put a 750G or 1TB disk in each bay?
    HP's web site isn't helpful. It gives only the "official" numbers at time of release which state it supports up to 240G, 120 in each bay. Those kinds of numbers are almost never an accurate statement of the limitation as I have seen similar "official" numbers for other HP laptops (supposedly limited to 120G or 250G or whatever) that have no problem supporting 640G in the real world.
    The size of hard disk a computer supports is usually a function of the main board's chipset, and I don't know how to determine the exact chipset or find the chipset's hard disk support capability. Again, HP's web site isn't helpful here either. They may list it as having an nVidia chipset, but they don't mention which nVidia chipset.
    To be clear, I don't own a DV9000 yet and I need to know how to find this information maybe with the exact model number. The model series may be DV9000; but, there are dozens of more exacting model numbers on the bottoms of the DV9000 series computers (example: DV9025ea).  I don't want to purchase a DV9000 and then find that its limitation really is 120G per bay.

    Hard drive size is a limitation of the BIOS, not the chipset; this was overcome with the implementation of 48-bit Logical Block Addressing (LBA) in modern BIOSes in 2003.
    HP does not state which BIOSes on older laptops support this and which do not.
    Looking at the production dates of the 9000 series, I would be willing to bet it supports any size hard drive, since 48-bit LBA was introduced in 2003:
    http://en.wikipedia.org/wiki/Logical_block_addressing
    That being said, there could be brand compatibility issues, so be sure to buy the hard drives from a source that has a friendly return policy.
    Maybe someone who has actually installed a large hard drive in a 9000 series can post.

  • Why is there a maximum email attachment size?

    I can understand why there is a maximum MMS message size, but why is there a maximum email attachment size set by the handset? The limit is preventing me from sending pictures from my phone. MMS messages are charged at a flat rate, but GPRS data is charged per kB. My operator, T-Mobile, says they have no limit on email message sizes (and obviously they make more money if you send a larger file). If I transfer my SIM card to my laptop or PDA, I am able to send any size of attachment with no problem. But with the SIM back in the handset, a limit of 100 kB is imposed. I get the same result using both T-Mobile and Orange SIMs. This is especially annoying if you have a phone that produces high-quality images with file sizes several times the limit. You can have a great image with no means of sending it at its original resolution. Does anyone know why this is?

    Most operators' (carriers') MMSC (Multimedia Messaging Service Center) servers do set a limit.
    The MMS size that is guaranteed to go through is defined by a document known as the MMS Conformance Document, maintained by OMA (the Open Mobile Alliance, a standardization organization formed primarily by mobile device manufacturers and network operators).
    So the handsets (phones) may or may not have a size limit, depending on how they were designed, and the networks also have their own limits; the two need not be the same, but the document I mentioned specifies the minimum that devices and networks have to support.
    More on OMA (and you can try to find the MMS Conformance Doc, if you wish):
    http://www.openmobilealliance.org

  • UCS C series maximum hard drive size

    Looking for info on the maximum hard drive size supported on the UCS C series. I have a C200 M2 to which I added a 4 TB drive; I have updated the server to the latest firmware and BIOS, however the server shows the drive size as 2 TB.

    Hi!  and welcome to the community mate!
    Here is the table that shows the supported disk's PID for the C200 server:
    http://www.cisco.com/c/dam/en/us/products/collateral/servers-unified-computing/ucs-c-series-rack-servers/c200m2_sff_specsheet.pdf
    If you successfully installed ONE HDD of 4 TB and made it work, you were lucky, because I don't see any disk of that size in the spec sheet; I have not even seen a disk of that size for UCS yet.
    I am wondering if perhaps you installed 4 disks of 1 TB and configured them in RAID 1, which would explain the size going down to half the capacity of the array, as RAID 1 is a mirror.
    Let me know if I misunderstood you.
    Rate ALL helpful answers.
    -Kenny

  • What is the Exchange 2010 maximum mailbox database size that is support by MS in a single DAG environment?

    My Exchange setup:
    Exchange 2010 Enterprise
    2 mailbox servers
    2 CAS, 2 HT
    12 mailbox databases. The total of all databases combined is about 2 TB. The largest mailbox databases are 530 GB, 250 GB, and 196 GB. Are these over the supported recommendations?
    bl

    2 TB; look at the second article below. But best practice is 200 GB or less:
    http://social.technet.microsoft.com/Forums/exchange/en-US/48431bab-4049-47db-9a84-359d5123d247/what-is-the-maximum-supported-database-size-in-exchange-2010-
    http://social.technet.microsoft.com/Forums/exchange/en-US/f96892b3-8e2d-4eef-b64a-4cbc0097396d/ideal-size-for-exchange-mailbox-database

  • The request exceeds the maximum allowed database size of 4 GB

    I have created a user in Oracle 10g with the following commands:
    CREATE USER username IDENTIFIED BY password
    DEFAULT TABLESPACE users TEMPORARY TABLESPACE temp;
    GRANT CREATE SESSION TO username;
    GRANT CREATE TABLE TO username;
    GRANT CREATE VIEW TO username;
    GRANT CREATE TRIGGER TO username;
    GRANT CREATE PROCEDURE TO username;
    GRANT CREATE SEQUENCE TO username;
    GRANT CREATE SYNONYM TO username;
    After that, when I tried to create a table, I got the following error:
    SQL Error: ORA-00604: error occurred at recursive SQL level 1
    ORA-12952: The request exceeds the maximum allowed database size of 4 GB
    00604. 00000 - "error occurred at recursive SQL level %s"

    Error starting at line 1 in command:
    SELECT /*+ RULE */ df.tablespace_name "Tablespace",
           df.bytes / (1024 * 1024) "Size (MB)",
           SUM(fs.bytes) / (1024 * 1024) "Free (MB)",
           Nvl(Round(SUM(fs.bytes) * 100 / df.bytes), 1) "% Free",
           Round((df.bytes - SUM(fs.bytes)) * 100 / df.bytes) "% Used"
    FROM dba_free_space fs,
         (SELECT tablespace_name, SUM(bytes) bytes
          FROM dba_data_files
          GROUP BY tablespace_name) df
    WHERE fs.tablespace_name = df.tablespace_name
    GROUP BY df.tablespace_name, df.bytes
    UNION ALL
    SELECT /*+ RULE */ df.tablespace_name tspace,
           fs.bytes / (1024 * 1024),
           SUM(df.bytes_free) / (1024 * 1024),
           Nvl(Round((SUM(fs.bytes) - df.bytes_used) * 100 / fs.bytes), 1),
           Round((SUM(fs.bytes) - df.bytes_free) * 100 / fs.bytes)
    FROM dba_temp_files fs,
         (SELECT tablespace_name, bytes_free, bytes_used
          FROM v$temp_space_header
          GROUP BY tablespace_name, bytes_free, bytes_used) df
    WHERE fs.tablespace_name = df.tablespace_name
    GROUP BY df.tablespace_name, fs.bytes, df.bytes_free, df.bytes_used
    ORDER BY 4 DESC
    Error at Command Line:1 Column:319
    Error report:
    SQL Error: ORA-00942: table or view does not exist
    00942. 00000 - "table or view does not exist"
    *Cause:
    *Action:

  • Maximum Temp tablespace size you've seen

    DB version: 10.2.0.4
    Our DB caters a retail application . Total DB Size of 3TB. Its is a bit of mix of both OLTP and Batch processing environment.
    Our temp tablespace has 1 file and we had set the tempfile as AUTOEXTEND. Somehow its size has reached 25GB now !
    We don't actually need this. Do we?
    For a fairly busy DB of around 3TB size, what is the maximum temp tablespace size you've ever seen?
    What is the maximum temp tablespace size you've ever seen for Big DB like Telecom, Banking?

    The point about temp space is that your requirements are dynamic - the actively used area shrinks and grows.
    Comparing apples with oranges: tempfile sizes on the system I am currently on amount to 50 GB, but to be honest that's mainly due to a not insignificant number of pretty poor queries, run concurrently, with multipass operations, etc.
    Kellyn Pedersen had an example of a process using 720GB here:
    http://dbakevlar.com/?p=43
    Whilst 25GB may not be earth shattering, a "large" temp area may perhaps be an indicator that you want to review some of your application code.
    (What is "large"? "unusual" compared to what is "normal" for your system may be a better metric)
    If you're using automatic pga memory management, then there's a limit to what each session can get, it may be that automatic work areas are not suitable for some of your code and some manual sizing is required to prevent operations spilling unnecessarily.
    Also, inaccurate CBO estimates on your queries can lead to inaccurately sized work areas that then spill into temp.
    There was a reminder of this in one of Randolf Geist's articles on hash aggregation:
    http://oracle-randolf.blogspot.com/2011/01/hash-aggregation.html

  • Rows per batch and Maximum insert commit size

    I have been reading various posts to understand what the properties "Maximum insert commit size" and "Rows per batch" actually do, and am confused, as each post has its own explanation of these properties. As far as I understand, MICS tells the SQL Server engine how many rows to commit at a time, and RPB defines the number of rows in a batch?
    In case I set the properties to, say, RPB = 10000 and MICS = 5000, does this mean the input data is fetched in batches of 10000 and each time only 5000 rows are committed?
    What happens in the case of RPB = 5000, MICS = 10000? Are batches formed for each 5000 records and commits done for each 10k records?
    One post mentioned that RPB is merely a property to help SQL Server devise an optimal plan for the data load. If this is true, then why have the MICS property at all? Instead, RPB could be assigned a value and the engine could decide on the best load method to use.
    Thanks!
    Rohit 

    Hi Rohit,
    Maximum insert commit size specifies the batch size that the component tries to commit during fast load operations. A value of 0 indicates that all data is committed in a single batch after all rows have been processed. A value of 0 might cause the running package to stop responding if the component and another data flow component are updating the same source table. To prevent the package from stopping, set the Maximum insert commit size option to 2147483647.
    Rows per batch specifies the number of rows in a batch. The default value of this property is -1, which indicates that no value has been assigned.
    For more details about those two properties, please refer to the following document:
    http://msdn.microsoft.com/en-IN/library/ms188439.aspx
    Thanks,
    Katherine Xiong
    Katherine Xiong
    TechNet Community Support

  • Error message "Maximum character output size limit reach Err_WIS_30272"

    Users are getting the error message "Maximum character output size limit reached (Err_WIS_30272)" while creating a universe query in Live Office, but when they create the same query in a Web Intelligence report it works fine.
    To resolve this error message we have to increase the value of the "Maximum Character Stream Size" parameter on the Web Intelligence Processing Server.
    But I want to know why the error message appears only for the Live Office query while the same query works fine in Web Intelligence.
    Are there architectural differences between Web Intelligence and Live Office?

    Hi,
    maybe one of the following SAP Notes will answer your Question.
    https://websmp230.sap-ag.de/sap%28bD1lbiZjPTAwMQ==%29/bc/bsp/sno/ui_entry/entry.htm?param=69765F6D6F64653D3030312669765F7361706E6F7465735F6E756D6265723D3133373030343526
    https://websmp230.sap-ag.de/sap%28bD1lbiZjPTAwMQ==%29/bc/bsp/sno/ui_entry/entry.htm?param=69765F6D6F64653D3030312669765F7361706E6F7465735F6E756D6265723D3133373537353526
    https://websmp230.sap-ag.de/sap%28bD1lbiZjPTAwMQ==%29/bc/bsp/sno/ui_entry/entry.htm?param=69765F6D6F64653D3030312669765F7361706E6F7465735F6E756D6265723D3133383336363426
    Regards
    -Seb.

  • Max Byte size for String?

    Friends,
    I want to know the maximum byte size a string can hold.
    Is there any restriction on the size of a string?
    Can it hold 898380 bytes in one string?

    Easily. Can you hold 830 MB in your memory?
    The maximum size is, as you could have easily found out by searching, Integer.MAX_VALUE characters.
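To make the numbers concrete, here is a small sketch. The 898380-byte figure is taken from the question above; the Integer.MAX_VALUE ceiling follows from String.length() returning an int (the class name is made up for illustration):

```java
public class StringLimits {
    // Decodes ASCII bytes to a String; the length equals the byte count
    // because each ASCII byte maps to exactly one char.
    static String fromAscii(byte[] data) {
        return new String(data, java.nio.charset.StandardCharsets.US_ASCII);
    }

    public static void main(String[] args) {
        // String.length() returns an int, so the hard ceiling is
        // Integer.MAX_VALUE (2147483647) characters.
        byte[] data = new byte[898380];     // the size from the question, well under the limit
        java.util.Arrays.fill(data, (byte) 'a');
        System.out.println(fromAscii(data).length()); // prints 898380
    }
}
```

In practice the real constraint is heap space: each char costs two bytes in memory, so a string anywhere near the ceiling needs several gigabytes of heap.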

  • How can I manage byte size of a string value in java?

    Hi,
    How can I manage the byte size of a string value in Java? I have a NAME column in my database table, of type VARCHAR2(128). It supports multiple languages, so the value in NAME can be English, French, Dutch, and so on. The byte size of an English character is 1, while a French character can be 2 bytes or more. Because of this I am having difficulty with my insert query. Please suggest a solution.

    But the event does not really have a size - you can export the photos and make the size pretty much what you want - while it is in iPhoto it is an event
    I guess that iPhoto could report the size of the original photos as imported - or the size of the modified photos if exported as JPEGs - or the size of the modified photos if exported with a maximum dimension of 1080 - but the event simply is photos and does not have a "size" until you export it
    Obviously you want to know the size but the question was
    what is your puprose for knowing the size?
    WIth that information maybe there is a way to get you what you want
    But the basic answer is simply that an event does not have a size - an event is a collection of photos and each photo has either two or three versions in the iPhoto library and each photo can be exported for outside use in several formats and at any size
    LN
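Back to the original VARCHAR2(128) question: a minimal sketch of checking a value's encoded byte length before the insert. It assumes the database character set is a UTF-8 variant such as AL32UTF8 and that the column uses byte-length semantics (VARCHAR2(128 BYTE)); both are assumptions to verify against your NLS settings, and the class and method names are made up for illustration:

```java
import java.nio.charset.StandardCharsets;

public class ByteLengthCheck {
    // Returns true if s fits in a column declared with byte-length
    // semantics, e.g. VARCHAR2(128 BYTE), assuming UTF-8 encoding.
    public static boolean fitsInBytes(String s, int maxBytes) {
        // getBytes() gives the encoded size, which is what the
        // database counts; String.length() counts chars, not bytes.
        return s.getBytes(StandardCharsets.UTF_8).length <= maxBytes;
    }
}
```

For example, "é" is one character but two UTF-8 bytes, so fitsInBytes("é", 1) is false while fitsInBytes("e", 1) is true; checking this before the insert avoids the ORA error on accented names.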
