Finding CachedRowSet max. memory limit

Hi,
I am using a CachedRowSet to store data retrieved from a ResultSet. I understand that there is a memory limit on the amount of data a CachedRowSet can hold.
I would like to know how to determine the maximum amount of data the CachedRowSet can hold.
Thanks,
Shashi.

That depends on many things. Experiment with it if you must.
But if your result sets are so large that memory use is an issue, try to find a better approach.
The best way is not to need all the results in memory at the same time. Process each row as it comes in if you can.
Or, if you do need the result set in memory (try not to need it!), don't use CachedRowSet. Instead, convert the rows into real Java objects. They are likely to be smaller and faster, and you get object-oriented programming, access control, looser coupling, and independence from the data source, which can help reusability.
I've never used CachedRowSet. My gut reaction to it is "hmm, is that a code smell..."
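For what it's worth, here is a minimal sketch of the advice above: stream the ResultSet and either process each row as it arrives or map it to a small plain object. The table, columns, and JDBC URL are placeholders, not anything from the original post.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical row object; usually smaller and easier to work with than a cached row set.
    record Customer(long id, String name) {}

    public class StreamRows {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details and query; substitute your own.
            try (Connection con = DriverManager.getConnection("jdbc:yourdb://host/db", "user", "pass");
                 PreparedStatement ps = con.prepareStatement("SELECT id, name FROM customers")) {
                ps.setFetchSize(500); // hint the driver to fetch in batches rather than buffer everything
                List<Customer> kept = new ArrayList<>();
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        // Either process the row right here and keep nothing,
                        // or map it to a small object only if it is really needed later.
                        kept.add(new Customer(rs.getLong("id"), rs.getString("name")));
                    }
                }
                System.out.println("Rows kept in memory: " + kept.size());
            }
        }
    }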

Similar Messages

  • How to Determine Safe Max Memory Limit?

    From what I understand, the amount of memory available to your AIR game on a device will vary depending on how many other apps the user is running, the type of device, etc. 
    So with this in mind, how does one define a safe max memory limit for a given device? What percentage of a device's total RAM can we assume we have access to? 
    Here are the iOS RAM Specs
    iPod (4th Gen): 128 MB
    iPad 1: 256 MB
    iPad 2, iPod Touch (5th Gen): 512 MB
    iPad 3: 1 GB
    I already have highly optimized texture atlases using PVRTC ATF for best use of memory, but if you've ever worked with a passionate creative department, you'll know they always want more. I need to be able to tell them exactly how many spritesheets they have to work with for a given device - how would I do this? 
    NOTE: iPhones/iPod Touches/iPads all have a Unified Memory Architecture, which means that the CPU and GPU share system memory; there is no dedicated GPU VRAM on these devices. So the RAM listed above, I assume, is shared by both the game logic and the GPU.

    Update: I've found an Objective-C post on Stack Overflow that answers this question.
    Here are the (rough) max memory limits to expect before an out-of-memory crash will occur:
    iPad1: 127MB/256MB (crash amount/total amount)
    iPad2: 275MB/512MB
    iPad3: 645MB/1024MB
    iPhone4: 325MB/512MB
    Note that max memory limit will vary depending on how many apps a user is running, but these figures are a good rough guide. See the Stack Overflow post for more details.

  • Max application memory limit on the iPad 4th gen?

    Is there any material that discusses this? What is the max memory limit for an application running on the iPad 4th generation?

    If you have to ask, you're doing it wrong.

  • To find out the min and max memory been used by each parameter under SGA_MA

    Hi,
    Can anyone please tell me how to find out the min and max memory being used by each component under SGA_MAX and SGA_TARGET? Below is the db CRMS65T. If there is any such script, please provide it.
    SQL> select name from v$database;
    NAME
    CRMS65T
    SQL> show parameter sga
    NAME                                 TYPE        VALUE
    lock_sga                             boolean     FALSE
    pre_page_sga                         boolean     FALSE
    sga_max_size                         big integer 1000M
    sga_target                           big integer 1000M
    Thanks in advance

    Can anyone please tell me how to find out the min and max memory being used by each component under SGA_MAX and SGA_TARGET? Below is the db CRMS65T. If there is any such script, please provide it.
    I guess your question is about each memory component of the SGA? If so:
    SQL> select * from v$sgainfo;
    NAME                                  BYTES RES
    Fixed SGA Size                      2088504 No
    Redo Buffers                       18882560 No
    Buffer Cache Size                 616562688 Yes
    Shared Pool Size                  301989888 Yes
    Large Pool Size                     4194304 Yes
    Java Pool Size                      4194304 Yes
    Streams Pool Size                         0 Yes
    Granule Size                        4194304 No
    Maximum SGA Size                  947912704 No
    Startup overhead in Shared Pool   125829120 No
    Free SGA Memory Available                 0
    Also check:
    SQL> select COMPONENT,CURRENT_SIZE,MIN_SIZE,MAX_SIZE,USER_SPECIFIED_SIZE from v$sga_dynamic_components;
    COMPONENT                    CURRENT_SIZE   MIN_SIZE  MAX_SIZE  USER_SPECIFIED_SIZE
    shared pool                     301989888  301989888         0            209715200
    large pool                        4194304    4194304         0              4194304
    java pool                         4194304    4194304         0              4194304
    streams pool                            0          0         0                    0
    DEFAULT buffer cache            616562688  616562688         0            603979776
    KEEP buffer cache                       0          0         0                    0
    RECYCLE buffer cache                    0          0         0                    0
    DEFAULT 2K buffer cache                 0          0         0                    0
    DEFAULT 4K buffer cache                 0          0         0                    0
    DEFAULT 8K buffer cache                 0          0         0                    0
    DEFAULT 16K buffer cache                0          0         0                    0
    DEFAULT 32K buffer cache                0          0         0                    0
    ASM Buffer Cache                        0          0         0            603979776
    13 rows selected.

  • MAX memory 8GB

    Hi, I have a problem I cannot understand: why does HP block the maximum memory support on this notebook? The published configuration says 8 GB max, but the hardware supports 16 GB. The chipset supports 16 GB, the processor supports 16 GB, and the motherboard supports 16 GB, yet the BIOS blocks the maximum memory at only 8 GB. I tested it: if I install one 8 GB DDR3 module, Windows works and sees it, but if I install one 8 GB and one 4 GB module (12 GB total), only 8 GB is usable. I started researching this problem, read many forums, and came to understand that this limit on the maximum memory is in the BIOS. I then wrote to Insyde and got this answer: "Insyde Software sells BIOS source code to PC manufacturers who modify the source code to meet their specific BIOS needs. Thus, each PC manufacturer has a unique BIOS. Insyde does not track the changes made to the BIOS by the PC manufacturer because the PC manufacturer has full control over the BIOS features. Your laptop is limited to 8GB because either: 1) HP configured it that way and will not allow additional memory or 2) the processor is limited to work with 8GB. Insyde has no control over the amount of memory allocated." I am asking HP to help me. I want to use my notebook to its full capability and not have this limit. I think this is bad for the buyer. Thanks, I hope you can help.

    That is the theoretical configuration of the chipset. I think you already know that HP designed the unit (a g6-2003sr) to take no more than an i5-2450M processor and no more than 8 GB of DDR3-1600. Sometimes the specs on these are conservative and a unit will in fact take more memory, but that would be an expensive experiment. There is no way it is going to take a 3rd-gen i7 quad core, though. What is it exactly you want done here? You think that if HP would just write a new BIOS you could install the parts you want to install? I don't think it is that simple, and even a new BIOS is not going to happen. I did go back and search the internet, and there are posts on the Dell and Asus forums with exactly the same complaint: HM76-chipset, 2nd-gen Intel Core laptops maxing out at 8 GB when they "could" take 16 GB. So this was a common design across the industry in 2012.

  • CF10 VFS Max Size Limit?

    Is there a maximum amount of RAM that can be allocated to the virtual file system? When I specify a size of more than 1 GB, the system returns a negative number for the remaining free space, and any attempt to store a file results in storage-limit-exceeded errors.
    I've been unable to locate much detail about the VFS configuration via web searches.  Any details are appreciated.
    Ronnie

    AustinValley wrote:
    Is there a maximum amount of RAM that can be allocated to the virtual file system? When I specify a size of more than 1 GB, the system returns a negative number for the remaining free space, and any attempt to store a file results in storage-limit-exceeded errors.
    Well, you will also get that when you have $25 in your pocket but spend $32. It is easy to find out how much memory you have. Search for ColdFusion java.lang.Runtime on the web. You will find something like:
    <cfscript>
      rt = CreateObject("java","java.lang.Runtime").getRuntime();
      memory = StructNew();
      memory.freeAllocated = rt.freeMemory() / 1024^2;
      memory.total = rt.totalMemory() / 1024^2;
      memory.max = rt.maxMemory() / 1024^2;
      memory.used = memory.total - memory.freeAllocated;
      memory.freeTotal = memory.max - memory.total + memory.freeAllocated;
      memory.heapMemory = memory.used;
    </cfscript>
    <cfdump label="Memory in MB" var="#memory#">

  • MS 6147 max memory and disk sizes

    For the MS-6147, can anybody confirm the max hard disk size and max memory?
    It will not recognise the 40 GB HD I am trying to fit, and only recognises half of the 256 MB memory module I have fitted.
    The BIOS version is 1.9, but I think that is a 'special' by Packard Bell. The MSI BIOS download site makes no mention of disk problems being rectified right up to V1.8, which is the latest, with the exception of one fix for the ZX chipset only, which addresses a UDMA/66 problem.
    Anybody got a definitive answer on this?

    It supports a maximum memory size of 256 MB (8M x 8) or 512 MB (16M x 4) registered DIMMs only.
    How many chips are on the DIMM is what counts with older boards.
    Go to the drive maker's web site, get the jumper settings to limit the drive to 32 GB, and try that.

  • Solaris-x86 mount fat32 partition, the partition max size limit?

    Solaris 10 x86, laptop, 10 GB FAT32 partition used to exchange data between Windows and Solaris x86.
    The FAT32 partition mounts normally and can be read fine,
    but some files written from the Solaris x86 side cannot be found from Windows.
    Does anybody know whether the Solaris x86 FAT32 mount has a maximum partition size limit, or no limit? Why does this problem occur?

    Mounting Windows partition in Solaris
    The easiest way to share data now is to do it through a FAT32 partition. Solaris
    recognises it as partition of type pcfs. It is specified as device:drive where drive is
    either the DOS logical drive letter (c through z) or a drive number (1 through 24).
    Drive letter c is equivalent to drive number 1 and represents the Primary DOS partition
    on the disk; drive letters d through z are equivalent to drive numbers 2 through 24,
    and represent DOS drives within the Extended DOS partition.
    The syntax is
    mount -F pcfs device:drive /directory-name
    where directory name specifies the location where the file system is mounted.
    To mount the first logical drive (d:) in the Extended DOS partition from an IDE hard
    disk in the directory /d use
    mount -F pcfs /dev/dsk/c0d0p0:d /d
    You can then use mount directory-name after adding the following line to the
    /etc/vfstab file:
    device:drive - directory-name pcfs - no rw
    for example
    /dev/dsk/c0d0p0:c - /c pcfs - no rw
    If your Windows partitions are laid out like the following:
    C: - NTFS, D: - FAT32, E: - NTFS, F: - FAT32
    then you can only mount D and F, not C and E.
    Mounting the D drive:
    mount -F pcfs /dev/dsk/c0d0p0:c /mountpoint
    Mounting the F drive:
    mount -F pcfs /dev/dsk/c0d0p0:d /mountpoint
    The drive letters count only FAT partitions, not other file systems (NTFS or any Linux file systems).

  • Max memory, and 1.3 vs 1.4

    Hi Guys.
    I have a Windows 2000 computer with 2 GB of memory, running Sun's JVMs.
    I'm using JRE 1.4.2_4, and I can only allocate 1.2 GB of memory (with the JVM switches). Any more and I get a 'cannot allocate memory' error.
    How can I get more memory? If I upgrade my machine to 4 GB, will that help?
    When using Java 1.3, I was able to allocate 1.5 GB of memory. I've now switched to 1.4, and only 1.2 GB is possible. Why is it lower now?
    Cheers,
    R.

    A previous reply said: "This is entirely an OS-based problem, and it depends on the libraries the JRE uses (they use the address space in their own way). I have run a 1.8 GB JRE on Solaris (with 2 GB of memory) with no problems (except if I want to run anything else)."
    Yes, but that is the limit on Solaris. Again, if you had added more memory it would not have allowed you to increase the maximum.
    The same reply said: "Running 1.5 may be better, but using another OS like XP or Linux will probably make more difference. You may also find adding more memory for the OS to hide may also help."
    No, 1.5 will not help. It will likely decrease the maximum on all OSes (as all other releases have done).
    To increase the potential memory the following is required...
    - A 64-bit VM must be used.
    - An OS compatible with the 64-bit VM must be used.
    And realistically, in addition to the above, one would also have to have enough physical memory to support the maximum. Although this is not strictly necessary, the VM will probably run at least an order of magnitude slower if it is not the case.
    I believe someone reported on the forums that they are running a VM with 16 GB of memory (a 64-bit VM, of course).
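    Incidentally, a quick way to check what maximum heap the running VM actually accepted (regardless of the -Xmx you asked for) is the standard java.lang.Runtime API. This is just a small sketch; the class name is arbitrary.
    public class HeapReport {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long mb = 1024L * 1024L;
            // maxMemory() reflects the heap ceiling the VM actually accepted at startup.
            System.out.println("max heap   (MB): " + rt.maxMemory() / mb);
            System.out.println("total heap (MB): " + rt.totalMemory() / mb);
            System.out.println("free heap  (MB): " + rt.freeMemory() / mb);
        }
    }
    Run it as, for example, java -Xmx1200m HeapReport and compare the reported figure with the value you requested.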

  • Error: Max processing time or Max records limit reached

    Hi All,
    When I run the report in InfoView, I get the error below:
    Unable to retrieve object:
    Max processing time or Max records limit reached
    Kindly advise.
    Thanks,
    Meena

    There is a default limit on the number of records returned and on the timeout of an 'idle' connection. These can be set in the CMC; however, first check the query for that report and see if it is applying your record selection criteria at the database level (use the Show SQL option and see if all your selection criteria have been turned into WHERE clauses)
    - this will drastically reduce both the number of records returned to Crystal and the time it takes for...
    You can find setting here:
    CMC>servers>page server>properties
    It is generally not recommended to set it to unlimited, as the Page Server is not a robust server; you should schedule such reports so that they use the Job Server, which is more robust.
    Regards,
    Parsa.

  • HotSpot(TM)64-BitServer VM warning: (benign)Hit CMSMarkStack max size limit

    Hi, I am getting the "CMSMarkStack max size limit" warning. Could anyone explain why I am getting it?
    61083.003: [GC 61083.003: [ParNew: 523392K->0K(523840K), 0.1802670 secs] 2866168K->2364464K(4193856K), 0.1804250 secs]
    61087.107: [GC 61087.107: [ParNew: 523392K->0K(523840K), 0.1970010 secs] 2887856K->2396761K(4193856K), 0.1971990 secs]
    61087.349: [GC [1 CMS-initial-mark: 2396761K(3670016K)] 2408426K(4193856K), 0.0330660 secs]
    61087.382: [CMS-concurrent-mark-start]
    61089.382: [CMS-concurrent-mark: 2.000/2.000 secs]
    61089.382: [CMS-concurrent-preclean-start]
    61089.637: [CMS-concurrent-preclean: 0.253/0.255 secs]
    61089.637: [CMS-concurrent-abortable-preclean-start]
    CMS: abort preclean due to time 61090.703: [CMS-concurrent-abortable-preclean: 0.224/1.067 secs]
    61090.721: [GC[YG occupancy: 336074 K (523840 K)]61090.721: [Rescan (parallel) , 0.4475020 secs]61091.169: [weak refs processing, 1.8464740 secs]Java HotSpot(TM) 64-Bit Server VM warning: (benign) Hit CMSMarkStack max size limit
    [1 CMS-remark: 2396761K(3670016K)] 2732836K(4193856K), 2.5285500 secs]
    61095.521: [CMS-concurrent-sweep-start]
    61104.793: [CMS-concurrent-sweep: 9.271/9.271 secs]
    61104.793: [CMS-concurrent-reset-start]
    61104.821: [CMS-concurrent-reset: 0.029/0.029 secs]
    61110.338: [GC 61110.338: [ParNew: 523392K->0K(523840K), 0.6183310 secs] 2133391K->1628588K(4193856K), 0.6184950 secs]
    61162.032: [GC 61162.032: [ParNew: 523392K->0K(523840K), 0.2259220 secs] 2151980K->1662904K(4193856K), 0.2261040 secs]
    61171.154: [GC 61171.155: [ParNew: 523392K->0K(523840K), 0.1890640 secs] 2186296K->1686907K(4193856K), 0.1892200 secs]
    regards
    R.Sriram

    "errno = 28" is an error code from the OS which means "No space left on device" this could indicate you don't have enough swap space.
    I suspect you are using too much memory even when you don't think you are.
    I would start with a simple "hello world" program and increase the memory until you get an error. If you can't run even a hello world program you have a serious system error.
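    As a rough variation on that suggestion, here is a small sketch that keeps allocating until the heap runs out, so you can see how much memory a given -Xmx (and the underlying swap) really gives you. The class name and the 16 MB step size are arbitrary choices.
    import java.util.ArrayList;
    import java.util.List;

    public class HeapProbe {
        public static void main(String[] args) {
            List<byte[]> blocks = new ArrayList<>();
            int allocatedMb = 0;
            try {
                while (true) {
                    blocks.add(new byte[16 * 1024 * 1024]); // grab 16 MB at a time
                    allocatedMb += 16;
                }
            } catch (OutOfMemoryError e) {
                blocks.clear(); // release the memory so the final print cannot fail too
                System.out.println("hit the limit after roughly " + allocatedMb + " MB");
            }
        }
    }
    Run it with increasing heap settings (java -Xmx512m HeapProbe, then -Xmx1g, and so on) and watch where the JVM either refuses to start or the OS starts reporting errors such as errno 28.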

  • I want to know the max memory of my T500, thanks. Does it support 8 GB of memory?

    I got one T500 (CTO2242 L3A3865   08/09) one year ago.
    I want to know the max memory of my T500, thanks. Does it support 8 GB of memory (once I install Win7 64-bit)?
    I configured a T500 on the Lenovo online shop today and found that I can choose 8 GB of memory, so I want to know whether my T500 supports 8 GB of memory with Win7 64-bit.
    Thanks a lot!

    ernest100 wrote:
    thanks a lot, is there any definite answer?
    The T500 (2242CTO) has 2 memory slots and it will support 8GB of memory in a 2 X 4GB configuration. 

  • DataServices Error "Unable to find free in-memory datarecord"

    Hi,
    While running a process to suppress a few million data warehouse records (1.8M), the process errors out with the following error:
    DQX-058302: |Data flow Test_SuppressPS_2|Transform BASE_MATCH Transform <BASE_MATCH>: DLL <libmatchtransformu.so> runtime function <ProcessCollection> failed with error <:|Data flow Test_SuppressPS_2|Transform BASE_MATCH:Transform <BASE_MATCH>: Internal error: Unable to find free in-memory datarecord while processing pageable collection as all <1000> records are locked.>. More detailed information may be obtained from previous errors.
    Previous message:
    SYS-058310: |Data flow Test_SuppressPS_2|Transform BASE_MATCH Transform <BASE_MATCH>: Internal error: Unable to find free in-memory datarecord while processing pageable collection as all <1000> records are locked.
    If I try to limit the data warehouse records to a smaller number (about 0.5M), it seems to work fine.
    Or if I change the Cache type to "In-Memory" in the data transform, it works fine for all the 1.8M records.
    My problem is that I am not able to understand the above error well enough to change things to correct it. Running In-Memory all the time is not good for us, as there may be more than one process running at the same time in production.
    Thanks,
    Gaurav

    Hello Gaurav,
    This internal error seems like it will require detailed investigation. I would recommend filing a customer support request and we will take it from there.
    Thanks,
    Abhiram

  • Frequent memory limit reached

    Hi all, I need some help. I have already installed SLES on 6 servers and found the following in /var/log/messages:
    Aug 30 05:00:22 pgw2 rcd[28133]: Running heartbeat at Tue Aug 30 05:00:22 2011
    Aug 30 05:00:22 pgw2 rcd[28133]: Memory limit reached, restarting
    Aug 30 05:00:22 pgw2 rcd[28133]: Shutting down daemon...
    Aug 30 05:00:22 pgw2 rcd[28133]: Shutting down local server
    Aug 30 05:00:22 pgw2 rcd[28133]: Shutting down remote server
    Aug 30 05:00:22 pgw2 rcd[13273]: Red Carpet Daemon 2.4.9
    Aug 30 05:00:22 pgw2 rcd[13273]: Copyright (C) 2000-2003 Ximian Inc.
    Aug 30 05:00:22 pgw2 rcd[13273]: Start time: Tue Aug 30 05:00:22 2011
    Aug 30 05:00:22 pgw2 rcd[13273]: Initializing RPC system
    Aug 30 05:00:22 pgw2 rcd[13273]: Initializing modules
    Aug 30 05:00:22 pgw2 rcd[13273]: [rcd.serverpoll] Starting server-poll
    Aug 30 05:00:22 pgw2 rcd[13273]: Starting local server
    Aug 30 05:00:22 pgw2 rcd[13273]: Starting remote server
    Aug 30 05:00:23 pgw2 rcd[13273]: Loading system packages
    Aug 30 05:00:23 pgw2 rcd[13273]: Done loading system packages
    Aug 30 05:00:30 pgw2 rcd[13273]: id=1 COMPLETE 'Downloading https://update.novell.com/data/serviceinfo.xml' time=7s (failed)
    Aug 30 05:00:30 pgw2 rcd[13273]: Unable to download service info: IO error - Soup error: Cannot resolve hostname (2)
    Aug 30 05:00:30 pgw2 rcd[13273]: Unable to load service for default host URL 'https://update.novell.com/data': Unable to download service info: IO error - Soup error: Cannot resolve hostname (2)
    Aug 30 05:00:30 pgw2 rcd[13273]: Can't find rcd 1.x subscription file '/var/lib/redcarpet/subscriptions.xml'
    Aug 30 05:00:30 pgw2 rcd[13273]: Starting heartbeat
    Aug 30 05:22:43 pgw2 -- MARK --
    Aug 30 05:42:43 pgw2 -- MARK --
    Aug 30 05:59:01 pgw2 /USR/SBIN/CRON[5035]: (root) CMD ( rm -f /var/spool/cron/lastrun/cron.hourly)
    Aug 30 06:22:44 pgw2 -- MARK --
    Aug 30 06:42:44 pgw2 -- MARK --
    These messages keep repeating. I believe the following lines point to the cause of the problem:
    Aug 30 05:00:30 pgw2 rcd[13273]: id=1 COMPLETE 'Downloading https://update.novell.com/data/serviceinfo.xml' time=7s (failed)
    Aug 30 05:00:30 pgw2 rcd[13273]: Unable to download service info: IO error - Soup error: Cannot resolve hostname (2)
    Aug 30 05:00:30 pgw2 rcd[13273]: Unable to load service for default host URL 'https://update.novell.com/data': Unable to download service info: IO error - Soup error: Cannot resolve hostname (2)
    Aug 30 05:00:30 pgw2 rcd[13273]: Can't find rcd 1.x subscription file '/var/lib/redcarpet/subscriptions.xml'
    Aug 30 05:00:30 pgw2 rcd[13273]: Starting heartbeat
    Why does this download keep failing?

    ayel,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Visit http://support.novell.com and search the knowledgebase and/or check all
    the other self support options and support programs available.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://forums.novell.com)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://forums.novell.com/faq.php
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://forums.novell.com/

  • HP Pavilion 6803w Windows 7 memory limit

    What is the max memory for this system?

    Hi,
    Please use the following information:
    Memory
    4 GB
    Amount: 4 GB
    Speed: PC3-10600 MB/sec (message as PC3-8500)
    Type: DDR3-1333
    Memory upgrade information
    Dual channel memory architecture
    Four DDR3 DIMMs (240-pin) sockets
    PC3-8500 (DDR3-1066)
    PC3-10600 (DDR3-1333)
    Non-ECC memory only, unbuffered
    Supports 1GB, 2GB, and 4GB DDR3 DIMMs
    Supports up to 16 GB on 64-bit systems
    Supports up to 4 GB on 32-bit PCs
    *32-bit PCs cannot address a full 4.0 GB of memory.
    Source:  http://h20564.www2.hp.com/hpsc/doc/public/display?docId=emr_na-c02859378
    Regards.
    BH
