Maximum data segment size for a process

Hello
We have an issue on our proxy server with the size of a process.
The process grows in size, and when it reaches 4 GB it stops with an error that it cannot allocate memory. We checked and there's plenty of swap left, and all the ulimit settings on Solaris 10 are set to 'unlimited'.
Is there a way I can see, determine, or configure the maximum process size on 64-bit Solaris?
thank you in advance
Mario G.

It sounds like you're running a 32-bit app. You need a 64-bit application for it to be able to address more than 4 GB of memory.
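
A quick way to confirm is to check the binary and the running process directly. A minimal sketch; the proxy path and PID are placeholders:

    # Is the kernel running 64-bit?
    isainfo -kv
    # Is the proxy binary itself built 32- or 64-bit?
    file /opt/proxy/bin/proxyd        # placeholder path
    # For the running process, pflags prints the data model
    # (_ILP32 = 32-bit, i.e. a ~4 GB address-space ceiling; _LP64 = 64-bit):
    pflags 1234 | head -1             # placeholder PID

If the data model comes back _ILP32, no ulimit setting will lift the 4 GB ceiling; the application needs to be built 64-bit.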

Similar Messages

  • Maximum Data file size in 10g,11g

    DB versions: 10g, 11g
    OS & versions: AIX 6.1, SunOS 5.9, Solaris 10
    This is what the Oracle 11g documentation
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28320/limits002.htm
    says about the maximum data file size:
    "Operating system dependent. Limited by maximum operating system file size; typically 2^22 or 4 M blocks."
    I don't understand what this 2^22 thing is.
    On our AIX machine, the ulimit command shows:
    $ ulimit -a
    time(seconds)        unlimited
    file(blocks)         unlimited  <-------------------------------------------
    data(kbytes)         unlimited
    stack(kbytes)        4194304
    memory(kbytes)       unlimited
    coredump(blocks)     unlimited
    nofiles(descriptors) unlimited
    threads(per process) unlimited
    processes(per user)  unlimited

    So, this means that on AIX both the OS and Oracle can create a data file of any size. Right?
    What about 10g and 11g DBs running on SunOS 5.9 and Solaris 10? Is there any limit on the data file size?

    How do I determine the maximum number of blocks for an OS?

    df -g would give you the block size. The OS block size is 512 bytes on AIX.

    Let's say the db_block_size is 8K. What would the maximum data file size in a smallfile tablespace and a bigfile tablespace be?

    Smallfile (traditional) tablespaces: a smallfile tablespace is a traditional Oracle tablespace, which can contain 1022 datafiles or tempfiles, each of which can contain up to approximately 4 million (2^22) blocks; with 8K blocks that works out to 32 GB per file.
    A bigfile tablespace contains only one datafile or tempfile, which can contain up to approximately 4 billion (2^32) blocks. The maximum size of the single datafile or tempfile is 128 terabytes (TB) for a tablespace with 32K blocks and 32 TB for a tablespace with 8K blocks.
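
    To make the block arithmetic concrete, a quick back-of-envelope check (shell arithmetic, assuming a 64-bit shell; the exact ceilings are one block short of the powers of two):

        # smallfile datafile ceiling: ~2^22 blocks x 8 KB block size
        echo $(( 4194303 * 8192 ))       # 34359721984 bytes, ~32 GB
        # bigfile datafile ceiling: ~2^32 blocks x 8 KB block size
        echo $(( 4294967295 * 8192 ))    # 35184372080640 bytes, ~32 TB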
    HTH
    -Anantha

  • Sql loader maximum data file size..?

    Hi - I wrote a SQL*Loader script, run from a shell script, that imports data into a table from a CSV file. The CSV file is around 700 MB. I am using Oracle 10g in a Sun Solaris 5 environment.
    My question is: is there a maximum data file size? The following code is from my shell script.
    SQLLDR=
    DB_USER=
    DB_PASS=
    DB_SID=
    controlFile=
    dataFile=
    logFileName=
    badFile=
    ${SQLLDR} userid=$DB_USER"/"$DB_PASS"@"$DB_SID \
              control=$controlFile \
              data=$dataFile \
              log=$logFileName \
              bad=$badFile \
              direct=true \
              silent=all \
              errors=5000

    Here is my control file code:
    LOAD DATA
    APPEND
    INTO TABLE KEY_HISTORY_TBL
    WHEN OLD_KEY <> ''
    AND NEW_KEY <> ''
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (
            OLD_KEY "LTRIM(RTRIM(:OLD_KEY))",
            NEW_KEY "LTRIM(RTRIM(:NEW_KEY))",
            SYS_DATE "SYSTIMESTAMP",
            STATUS CONSTANT 'C'
    )

    Thanks,
    -Soma

    Hello Soma.
    How many records exist in your 700 MB CSV file? How many do you expect to process in 10 minutes? You may want to consider performing a set of simple unit tests, e.g. 1) 1 record, 2) 1,000 records, 3) a 100 MB file, etc., first to validate that your shell script and control-file syntax work as expected (including the writing of log files), and second to gauge how long processing the full file will take. See the sketch below.
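
    For those incremental tests, something along these lines works (the data and control-file names are placeholders; the variables come from the shell script above):

        # carve test slices out of the full extract
        head -n 1000   full_extract.csv > test_1k.csv
        head -n 100000 full_extract.csv > test_100k.csv
        # time a run against the smallest slice first
        time sqlldr userid=$DB_USER/$DB_PASS@$DB_SID control=key_history.ctl \
             data=test_1k.csv log=test_1k.log bad=test_1k.bad direct=true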
    Hope this helps,
    Luke

  • Maximum hard drive size for 350 MHz slot loading?

    Hello,
    What is the maximum hard drive size for a 350 MHz slot-loading iMac?
    I am aware of a 128 GB limit on iMacs. Does the 128 GB hard drive limit apply to all CRT iMacs or just the earlier models?
    Here are the specs of my iMac:
    Blueberry 350 MHz
    Slot-loading CD-ROM drive
    6 GB hard drive
    Does the maximum hard drive size limit of 128 GB apply to this particular model?
    Thanks
    Thanks

    Yes, the last CRT iMac was released in 2001. The first machines with built-in support for drives greater than 128 GB were released in June 2002, as this article explains:
    http://docs.info.apple.com/article.html?artnum=86178
    So all CRT iMacs have that limitation.
    http://support.apple.com/specs/imac/
    The 8 GB limit is explained here:
    http://docs.info.apple.com/article.html?artnum=106235

  • Suggested data file size for Oracle 11

    Hi all,
    Creating a new system (SolMan 7.1) on AIX 6.1 running Oracle 11.
    I have 4 logical volumes for data, sized at 100 GB each. During the installation I'm asked to input the size for the data files. The default is 2000 MB (2 GB); is this acceptable for a system sized like mine, or should I double them to 4 GB each? I know the max is 32 GB per data file, but that seems a bit large to me. I just wanted to know if there is a standard best practice for this, or a formula to use based on system sizing.
    I was not able to find any quick suggestions in the Best Practices guide on this, unfortunately.
    Any help would be greatly appreciated.
    Thanks!

    Hi Ben,
    Check SAP Note 129439 (Maximum file sizes with Oracle).
    Best regards,
    Orkun Gedik
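
    As a quick sanity check on the tradeoff (simple shell arithmetic; 400 GB just reflects the four 100 GB volumes mentioned above):

        # datafiles needed to hold 400 GB at each candidate file size
        echo $(( 400 * 1024 / 2048 ))    # 2 GB files  -> 200 files
        echo $(( 400 * 1024 / 4096 ))    # 4 GB files  -> 100 files
        echo $(( 400 * 1024 / 32768 ))   # 32 GB files -> 12 files (integer division)

    Fewer, larger files mean less administration overhead; smaller files keep moves and restores more granular.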

  • Maximum input payload size(for an XML file) supported by OSB

    Hey Everyone,
    I wanted to know what the maximum payload size is that OSB can handle.
    The requirement is to pass XML files as input to OSB and insert the data from the XML files into Oracle staging tables. OSB will host all the .jca, wsdl, xml, xml schema and other files required to perform the operation.
    The hurdle is to understand the maximum XML file size that OSB will pass through without breaking.
    I did some test runs and got the following output:
    Size of the XML file: OSB successfully read a file of 3176 KB but failed on a file of 3922 KB, so per the test runs the breaking point is somewhere between 3 and 4 MB.
    Range of number of lines of XML: 102995 to 126787, since OSB consumed a file of 102995 lines (3176 KB) but broke on a file of 126787 lines (3922 KB).
    Please share your views on these test runs regarding the OSB breaking point, and kindly share your results if you have performed the same test.
    Thank you very much.


  • Data Conversion rules for EDI processing (same client IDOC processing)

    Hi,
    I am trying to post IDocs in the same client. It's a PO -> SO process,
    i.e. there will be one outbound and one inbound IDoc in the same client using EDI processing.
    I am using data conversion rules for converting the sender fields.
    The LIFNR and PAORG of segment E1EDKA1 have to be converted.
    For ALE processing, the data conversion is done correctly,
    but no conversion is done for EDI.
    Can anybody help me with this problem?
    Thanks in advance.
    Regards
    Megha

    Issue solved

  • Maximum recommended file size for public distribution?

    I'm producing a project with multiple PDFs that will be circulated to a group of seniors aged 70 and older. I anticipate that some may be using older computers.
    Most of my PDFs are small, but one, at 7.4 MB, is the smallest size I can output the document as it stands. I'm wondering if that size may be too large. If necessary, I can break it into two documents, or maybe even three.
    Does anyone with experience producing PDFs for public distribution have a sense of a maximum recommended file size?
    I note that at http://www.irs.gov/pub/irs-pdf/ the Internal Revenue Service hosts 2,012 PDFs, of which only 50 are 4 MB or larger.
    Thanks!

    First, open the PDF and use the Optimizer to examine it.
    A lot of times when I create PDFs I end up with a half-dozen copies of the same fonts and font faces. If you remove all the duplicates, that will reduce the file size tremendously.
    Another thing is to reduce the DPI of any graphics; even for printing they don't need to be any larger than 200 DPI,
    and if they are going to be viewed on a computer screen only, no more than 150 DPI tops; if you can get by with 75 DPI, that will be even better.
    Once you have set up the optimized file, save it under a different name and see what size it turns out. Those two things can sometimes reduce file size by as much as two-thirds.

  • Maximum WPA2 key size for Apple TV (gen 2 & 3)

    Does anyone know the maximum key size for WPA2? I am using an AirPort Extreme with a key size of 63 characters, and the Apple TV doesn't seem to accept it. Before I reconfigure the whole network, I would like to know the max key size rather than find it by trial and error (it's kind of a pain to use the remote to input the characters).

    OK, in order to help other users I will post this.
    I have resolved the issue; it was rather simple and dumb.
    I never use Windows Media Player. I do not like it and cannot stand it. Because mine was a brand-new Windows installation, this fix will only apply to certain users: since it was a fresh install, I had never opened Windows Media Player. Apparently, the media streaming settings for Windows 7 are embedded in Windows Media Player. In past installations that I used for years, I simply never clicked on it, as I never had a reason to; I use iTunes for my iPad, iPhone, etc.
    By accident and sheer luck I decided to open Windows Media Player and go through its express settings. After I did this, all of my streaming problems to my Apple TV (gen 2) disappeared; HD movies now start almost instantly. So, as I said, there are apparently media-streaming settings embedded in the OS through Windows Media Player. It doesn't make any sense to me since I never use the program, but whatever; that fixed my issue.
    So, for any of you Windows 7 people out there using a Windows 7 PC as your iTunes sync computer and streaming that iTunes library to your Apple TV: make sure you have done at least the initial setup of Windows Media Player. I still left iTunes as my default player, and everything now works flawlessly. In fact, even in an HD movie, almost half the movie is now buffered across the bar before the movie starts, so my Apple TV is blazing fast.
    I had been struggling with this for two days and simply never gave it a thought that Windows Media Player would have anything to do with iTunes streaming, a totally separate program, but it did.
    I hope this helps other people with similar problems.
    Good luck

  • Data type size for the view column

    I am creating a view from a base table, using substr() for some fields. The data type size of the column in the view becomes twice the size of the corresponding column in the base table. Example:
    SQL> create table temp_sat(name varchar2(20));
    Table created.
    SQL> create view temp_view as select substr(name,1,10) name from temp_sat;
    View created.
    SQL> desc temp_view
    Name Null? Type
    NAME VARCHAR2(40)
    Can I specify the size of the column while creating the view?

    Hi,
    Which platform and Oracle DB version?
    I tried to replicate, but I can't:
    SQL> create table t_v
      2  (id varchar2(20));
    Table created.
    SQL> desc t_v
    Name                                                              Null?   Type
    ID                                                                        VARCHAR2(20)
    SQL> r
      1  create view v_t_v
      2  as
      3  select substr(id,1,10) id
      4* from t_v
    View created.
    SQL> desc v_t_v
    Name                                                              Null?   Type
    ID                                                                        VARCHAR2(10)

    John
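
    If the goal is to pin the view column's width explicitly, a CAST does it. A minimal sketch against the table from the question (the doubling the original poster sees is most likely a multibyte database character set, where the SUBSTR result is sized as characters times the maximum bytes per character):

        SQL> create or replace view temp_view as
          2  select cast(substr(name, 1, 10) as varchar2(10)) name
          3  from temp_sat;
        View created.
        SQL> desc temp_view
        Name                                      Null?    Type
        NAME                                               VARCHAR2(10)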

  • Is it possible to config memory size for a process?

    Dear experts,
    I would like to control the memory size assigned to a process on Solaris 9. Can I configure the memory size assigned to a process?
    Many thanks,
    Shun

    You can use ulimit/limit to place a bound on the amount of space that a process and its children can allocate.
    I'm not sure if that's what you're asking though.
    Darren
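
    A minimal sketch of that approach (the size values and the command are placeholders; sh/ksh take the sizes in KB):

        # Bourne/Korn shell: cap the data segment at 2 GB, then start the
        # process; the limit is inherited by children of this shell
        ulimit -d 2097152
        ulimit -v 4194304          # total virtual memory, if your shell supports -v
        /opt/myapp/bin/server &    # placeholder command

        # csh equivalent (kbytes):
        #   limit datasize 2097152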

  • Maximum attachment file size for email account

    Hi, what is the maximum file attachment size for .com email accounts, and where do I find the means to adjust/change the setting? Thanks, Brewsta

    Our BigPond mail attachment limit is 10MB. This is the user guide for our BigPond email service:
    https://my.bigpond.com/res/pdf/premiummail/installation_user_guide.pdf
     

  • Maximum bitrate and size for Apple TV3

    Hello,
    What is the maximum bitrate and file size for Apple TV 3 Home Sharing? I transferred some Blu-rays, but playback lags. I'm streaming via cable.

    No official answer on this but specs say:
    Video Formats
    H.264 video up to 1080p, 30 frames per second, High or Main Profile level 4.0 or lower, Baseline profile level 3.0 or lower with AAC-LC audio up to 160 Kbps per channel, 48kHz, stereo audio in .m4v, .mp4, and .mov file formats
    MPEG-4 video up to 2.5 Mbps, 640 by 480 pixels, 30 frames per second, Simple Profile with AAC-LC audio up to 160 Kbps, 48kHz, stereo audio in .m4v, .mp4, and .mov file formats
    Motion JPEG (M-JPEG) up to 35 Mbps, 1280 by 720 pixels, 30 frames per second, audio in ulaw, PCM stereo audio in .avi file format
    AppleTV 2/3 have less than 8GB of memory to buffer movies but there are movies over 8GB and I've certainly streamed larger than that.
    What lags?  Playback?  Stuttery display?
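
    A back-of-envelope bitrate check shows why an untouched Blu-ray rip can lag (the 30 GB / 2-hour figures are hypothetical):

        # average bitrate, in Mbit/s, of a 30 GB, 2-hour file
        echo $(( 30 * 1024 * 1024 * 1024 * 8 / (2 * 3600) / 1000000 ))   # ~35

    That is well above what the H.264 profile levels quoted above allow, so re-encoding at a lower bitrate is the usual cure.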

  • Save for Web with Optimize File Size for Batch Process

    I have a client who needs to run a batch process on JPEG images, ranging in size from 600 KB to 2.5 MB, optimizing their file size as close to 200 KB as possible. The feature is included in Photoshop's Save for Web optimize dialog, but it doesn't work for batch processes. If you record an action with the optimize-to-file-size option set, every image in the batch is processed with the quality setting that was required for the image used during recording: if the image you recorded was 1.5 MB, the quality setting might be 70 percent to hit the 200 KB mark, but that wouldn't be the correct quality setting for a 600 KB file or a 2.5 MB file; those would come out either too small or too large.
    I've done quite a bit of searching both online and in the photoshop scripting forums but I haven't been able to find anything that would point me in the right direction.
    My client's paying for this work so if there's someone who knows how to script this who'd rather be paid than just do some pointing we can arrange for that quite easily.
    Thanks in advance for your assistance and suggestions.
    In case it's of importance we're using Photoshop CS4.

    If you don't mind spending a few bucks on a shareware program, JpegSizer has the option to save to a target file size.
    You can find it here:
    http://www.tangotools.com/jpegsizer/register.htm
    There is a free trial version, but it can't save files.
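
    If a tool outside Photoshop is an option, ImageMagick can target a JPEG output size directly. A minimal sketch (paths are placeholders):

        # batch-resave JPEGs, letting ImageMagick search for the quality
        # that fits each file into a ~200 KB budget
        mkdir -p optimized
        for f in *.jpg; do
            convert "$f" -define jpeg:extent=200kb "optimized/$f"
        done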

  • Maximum PDF file size for browsers to cache?

    My site distributes daily newsletters via PDF that are typically about 2MB in size. A link is provided to the PDF, and when clicked, it opens the Adobe Reader plugin within the browser (Firefox, IE) and opens the file therein.
    Recently it was brought to my attention by a dialup user (they do still exist) that when he navigates back or forward in his browser's history, the PDF has to be redownloaded rather than reloaded from his browser's cache. I, with a broadband connection and some wonderful Firefox extensions (Firefox Throttle and Live HTTP Headers) confirmed this behavior.
    However, a similarly sized file (2MB) of a different type (text) is cached by the browser and I can go back and forth in the history without the file having to be redownloaded. I also had success with a smaller sized PDF (70kb).
    What appears to happen is this:
    1) Click the link to the 2MB PDF, and it loads in the browser.
    2) Click browser's back button to back to previous page in browser history.
    3) Click forward button to return to PDF.
    4) A request with a "Range: <byte range>" header is sent which I think requests part of the PDF file (the byte ranges).
    5) The server responds with a "206 Partial Content" response code and resends some/most/all of the PDF to the browser.
    What should happen (and does with other file types and smaller sized PDFs) is that the browser should make a simple request for the PDF file, and assuming the PDF hasn't been modified since the latest request for it, the server should send a "304 Not Modified" response code which instructs the browser to use what it has in cache.
    I've increased my browser cache sizes dramatically to no avail. Cleared them before starting the process to no avail. Should be no problem with any browser's cache limit.
    I'm using Firefox 2 and IE7. My co-worker has IE6 and Acrobat Reader 6 installed on another machine and it works fine there (PDF is cached and does not need to be redownloaded).
    Help! Why aren't these larger PDF files being cached correctly, and what can we do to get them saved in the cache so our poor dialup users don't have to redownload them every time they navigate their browser history?
    Thanks!

    Just wanted to reply with my discovery and fix. I'm not sure if it is Adobe Reader or the web browsers (I tested Firefox 2 and IE7), but they appear to request parts of large PDF files "on demand." That is, they send requests to the web server with a "Range: <byte range(s)>" header, which instructs the web server to send only segments of the file. I think they use the HTTP 1.1 feature that keeps the connection open or alive to request/send the data incrementally.

    While this might seem preferable, dialup users who navigate through their browser history should instead download the PDF in its entirety and let the browser cache the file. This way, when they click the link to open the PDF again (or go back/forward in their browser's history), they use what they have in their cache instead of redownloading the PDF (which for dialup users is quite painful).

    We fixed the problem by instructing our web server to ignore/disregard byte-range requests. In a nutshell, you need to have the web server issue the header:

    Accept-Ranges: none

    In Apache, you can do this on a per-file-type basis using a FilesMatch directive. In IIS (we're using 5.0), you cannot specify a file type for this behavior; you have to set it in Internet Services Manager by adding a custom header under the "HTTP Headers" tab.

    Personally, I prefer folks use their browser cache rather than repeatedly request data (PDF files in this case) from our server. To each his own, I guess. Hope this helps someone.
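
    For reference, the Apache side of that fix can be as small as this. A sketch assuming mod_headers is enabled; some setups also strip the incoming Range header so the server never answers with a 206:

        # advertise that byte ranges are not supported, for PDFs only,
        # so browsers fetch (and cache) the whole file
        <FilesMatch "\.pdf$">
            Header set Accept-Ranges none
            RequestHeader unset Range
        </FilesMatch>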
