DocumentSplitter and memory

This is regarding the DocumentSplitter class that allows SAX parsing of large XML documents.
Is there a way to free the memory of each of the "temporary" XMLDocuments that get created? I know there's an XMLNodeCover.freeDocument(int id)
method in xmlplsql.jar - where do you get the id from? I'm trying XMLNodeCover.getDocumentId(curDoc) - is that correct?
And can I use that same method for each Node and Element that gets created as well?
Also, I would like to free the memory for the XSL processor, SAXParser, and XSLStylesheet, but this is even less obvious (although not as critical).
I know that there are PL/SQL wrappers for all of these, but I'm trying to use the DocumentSplitter and associated custom classes
in Java stored procs only, and then write a single XSLT.transform wrapper. This way it's largely invisible to the PL/SQL layer.
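
For reference, here is a minimal sketch of the kind of single XSLT.transform wrapper meant here, written against the standard JAXP API as a neutral stand-in for the XDK XSLProcessor/XSLStylesheet calls (the class and method names below are illustrative, not the actual DocumentSplitter interfaces). For objects created purely on the Java side, dropping the references and letting the JVM garbage collector reclaim them is normally all that is needed; explicit freeDocument-style calls presumably matter mainly for handles tracked through the PL/SQL wrapper layer.

    // Illustrative only: JAXP is used here instead of the XDK XSLProcessor/XSLStylesheet classes.
    import java.io.StringReader;
    import java.io.StringWriter;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class XsltWrapper {
        // Intended to be published as a Java stored procedure and called once per split document.
        public static String transform(String xmlText, String xslText) throws Exception {
            TransformerFactory factory = TransformerFactory.newInstance();
            Transformer transformer =
                factory.newTransformer(new StreamSource(new StringReader(xslText)));
            StringWriter out = new StringWriter();
            transformer.transform(new StreamSource(new StringReader(xmlText)),
                                  new StreamResult(out));
            // No explicit free call: once this method returns, the transformer and
            // result objects are unreachable and the JVM garbage collector reclaims them.
            return out.toString();
        }
    }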
Thanks

A short update: both database instances are configured with a 500MB memory_target, and currently one database is using 877MB of virtual memory and the other 784MB (according to the Windows Task Manager, process oracle.exe). One of these instances is a recovery catalog that is doing nothing at all at the moment.
I'm really worried about where all that memory is going!

Similar Messages

  • Get CPU and memory usage

    Hi!
    I would like to know if there is any way of getting system CPU and memory usage using Java code.

    I want to get the system CPU and memory usage using the performance monitor DLL, perfctrs.dll, but access this data from Java.
    Then you should create a wrapper DLL between your Java code and perfctrs.dll and convert the data from the DLL's format to your Java code's format.
    So, that is the next question - how do you create the wrapper DLL, and how does perfctrs.dll work?
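
    For the system-wide numbers specifically, and assuming a Sun/Oracle JDK 7 or later is acceptable, the JDK-specific com.sun.management extension of OperatingSystemMXBean already exposes CPU load and physical memory figures without a native wrapper DLL. A minimal sketch (no perfctrs.dll involved; the class name is illustrative):

        import java.lang.management.ManagementFactory;

        public class SystemStats {
            public static void main(String[] args) {
                // JDK-specific interface, not part of the standard java.lang.management API
                com.sun.management.OperatingSystemMXBean os =
                    (com.sun.management.OperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean();

                System.out.println("System CPU load:  " + os.getSystemCpuLoad());   // 0.0 - 1.0, or -1 if unavailable
                System.out.println("Process CPU load: " + os.getProcessCpuLoad());  // 0.0 - 1.0
                System.out.println("Total RAM (MB):   " + os.getTotalPhysicalMemorySize() / (1024 * 1024));
                System.out.println("Free RAM (MB):    " + os.getFreePhysicalMemorySize() / (1024 * 1024));
            }
        }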

  • What is the difference between Azure RemoteApp Basic vs Standard Plans in terms of compute cores and memory?

    So our customer has asked us to compare the Amazon Workspace and Azure RemoteApp offerings for them to choose from. Amazon Workspace clearly defines bundles with specific CPU cores, memory and user storage. However, Azure RemoteApp
    only specifies user storage and vaguely compares its Basic vs. Standard plans in terms of "task worker" vs. "information worker".
    I tried looking up its documentation but couldn't find the specific CPU cores that are dedicated per user in the Basic vs. Standard plans. I have the following questions:
    Can anyone point me in the right direction or help me understand how many CPU cores and how much memory are dedicated (or shared) per user in each plan?
    Our customer would most likely need a "custom" image for their custom apps. Is it possible for us to choose specific CPU cores and memory for the users to be able to run their apps in Azure RemoteApp?
    In case I am misunderstanding the basic difference between AWS Workspace and Azure RemoteApp, I'd appreciate some help in understanding that as well.
    Thanks!

    Hi,
    With Azure RemoteApp users see just the applications themselves, and the applications appear to be running on their local machine similar to other programs.  With Workspaces users connect to a full desktop and launch applications within that.
    1. Azure RemoteApp currently uses size A3 Virtual Machines, which have 4 vCPUs and 7GB RAM.  Under Basic each VM can have a maximum of 16 users using it whereas under Standard each VM is limited to 10 users.  The amount of CPU available
    to a user depends on what the current demands are on the CPU at that moment from other users and system processes that may be on the server.
    For example, say a user is logged on to a VM with 3 other users and the other users are idle (not consuming any CPU).  At that moment the user could use all 4 vCPUs if a program they are running needed to.  If a few moments later
    the other 3 users all needed lots of CPU as well, then the first user would only have approximately 1 vCPU for their use.  The process is dynamic and seeks to give each user their fair share of available CPU when there are multiple users demanding CPU.
    Under the Standard plan a user will receive a minimum of approximately 0.4 vCPU, assuming that the VM has the maximum number of users logged on and that all users are using as much CPU as possible at a given moment.  Under the Basic plan the approximate
    minimum would be 0.25 vCPU.
    2. You cannot choose the specific number of cores and memory.  What you can do is choose the Azure RemoteApp billing plan, which affects the user density of each VM as described above.  If you need a lower density than Standard you
    may contact support.
    -TP
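
    (For reference, those per-user minimums follow directly from the A3 size: 4 vCPUs / 10 users = 0.4 vCPU under Standard, and 4 vCPUs / 16 users = 0.25 vCPU under Basic.)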

  • CPU usage and memory

    What is FF doing when it sits at idle with nothing streaming or playing, yet uses 40-70% of CPU while memory usage keeps going up? I just sit there and watch it in the Task Manager. Every other program uses nothing.
    There has to be a programmer intelligent enough to fix this. This has been going on for years.

    The simple fact is that all the "fixes" and answers are bogus. Just try using the same tabs in any other browser.
    Right now Opera is at 2-3% when idle, with no memory problems. Other browsers are similar, and I will post results.
    Mozilla is just like Microsoft. They have no desire to fix the problem.

  • Upgrade chip and memory

    Can I upgrade the chip and memory? If so, what can I use? The system is a little too slow. Yes, I have all the software to speed it up. Thanks
    HP 2000 Notebook PC; Windows 8.1 64-bit; [Edited for Personal Information]; AMD E-300 APU with Radeon(tm) HD Graphics microprocessor; 4GB SODIMM Kingston 800MHz memory; 188B KBC Version 69.18 system board; F.37 system BIOS; HGST HTS545032A7E380 SATA Disk Device; Product number: E0M17UA#ABA

    Here is the Service Manual:
    http://h10032.www1.hp.com/ctg/Manual/c03763129.pdf
    Go through pages 16-17. Each processor is soldered to the board, so you would have to replace the whole motherboard to change it; you can see that the part numbers differ across the three different boards.
    I suggest a RAM upgrade from 4GB to 8GB by adding another 4GB module in the vacant slot, configured as DDR3 1333MHz dual channel for a boost.
    thanks,
    ++Please click KUDOS / White thumb to say thanks
    ++Please click ACCEPT AS SOLUTION to help others, find this solution faster
    **I'm a Volunteer, I do not work for HP**

  • HDD and memory upgrade on Satellite A350

    Hi All
    I have a Satellite A350 laptop. Can anyone tell me the biggest hard drive and the most memory I can have, please?
    Thanks

    Hi
    The RAM upgrade depends on the available chipset.
    I'm not quite sure about your A350-xxx model, but many of the Sat A350 notebooks were equipped with an Intel GM45 Express chipset, and in that case you could upgrade the memory to a maximum of 8GB RAM.
    Regarding the HDD: this notebook supports SATA HDDs, which means you are not limited to a special size.
    Theoretically it should be possible to use 500GB or 750GB 2.5" SATA HDDs.

  • MSI 785GM-E51 and memory modules problems

     
    Could anyone help me with a problem? I have an MSI 785GM-E51 motherboard and 4 GOODRAM GR 1333D364L9/2G memory modules. When I insert ALL the modules, my system does not start (not even the POST procedure). The working variants are when I insert:
    - one module in the first (boot) slot
    or
    - modules in the first and third memory slots
    Any ideas?

    It turned out not to be a motherboard problem but the processor itself - I have an AMD Phenom II X4 965 Black Edition 3.4 GHz, core revision C2, and this revision (according to the article: http://www.fcenter.ru/online.shtml?articles/hardware/processors/27650#02) has problems supporting more than 2 DDR3-1333 MHz memory modules. Conclusion: either adjust the BIOS settings for memory timings and memory controller voltage, or replace this processor with the same type but core revision C3.

  • How to display CPU and memory utilization from ST06 in a report

    Hi,
    I want to display CPU utilization, memory utilization and file system details from the ST06 transaction in a report.
    Is there any function module or any other method to do that?
    Please advise.
    Thanks,
    Sandeep.

    Hi Ranganath,
    Thanks for your time.
    And thank you very much for the reply.
    Both the function modules are helpful.
    But can you also help me in getting the FileSys data from ST06?
    Thankyou,
    Sandeep.

  • Profile Performance and Memory

    Hi, guys
    I am trying to analyze my profile data from Profile Performance and Memory.
    When I check the data for VI Time, I found the same VIs with different VI Times - what does that mean?
    The reason I am doing this is that I am going to add a pulse model to my VI, so I need to know the relationship between all my instruments and time. Am I on the right track?
    Thanks
    Attachments: VI Time.png (72 KB)

    It looks like the timing had a resolution of 15.6 ms, which I think is common on some flavors of the Windows OS.  Note that the average times on some of those VIs are zero.  This suggests that for many of the 1878 runs the time was probably recorded as zero and a few were at 15.6 ms.
    This is one of the limitations of the Profiler.
    Lynn
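
    (15.6 ms is the default Windows timer tick: 1 s / 64 = 15.625 ms. A call that completes within a single tick is recorded as 0, and one that happens to straddle a tick boundary is recorded as a full 15.6 ms, so an average over 1878 mostly sub-tick runs can easily show up as zero.)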

  • Profile Performance and Memory shows very large 'VI Time' value

    When I run the Profile Performance and Memory tool on my project, I get very large numbers for VI Time (and Sub VIs Time and Total Time) for some VIs.  For example 1844674407370752.5.  I have selected only 'Timing statistics' and 'Timing details'.  Sometimes the numbers start with reasonable values, then when updating the display with the snapshot button they might get large and stay large.  Other VI Times remain reasonable.
    LabVIEW 2011 Version 11.0 (32-bit).  Windows 7.
    What gives?
     - les

    les,
    the number indicates some kind of rollover... So, do you have a VI where this happens all the time? Can you share it with us?
    thanks,
    Norbert
    CEO: What exactly is stopping us from doing this?
    Expert: Geometry
    Marketing Manager: Just ignore it.
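
    (For what it's worth, 1844674407370752.5 is very close to 2^64 / 10,000, which fits the rollover theory: a small negative time difference stored as an unsigned 64-bit count comes out as an enormous positive number.)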

  • Hard Drive and memory upgrade on macbook pro (13" mid 2009)

    I want to upgrade my current hard drive and memory (currently 250GB and 4GB), but I'm having trouble finding out which ones are going to fit.
    I considered buying these:
    http://www.bestbuy.com/site/Western+Digital+-+Scorpio+Blue+1TB+Internal+Serial+ATA+Hard+Drive+for+Laptops/9960668.p?skuId=9960668&productCategoryId=pcmcat189600050008&id=1218253425468
    http://www.bestbuy.com/site/Kingston+Technology+-+4GB+PC3-8500+DDR3+SoDIMM+Laptop+Memory/2129381.p?skuId=2129381&productCategoryId=abcat0506002&id=1218404812331
    Any ideas?

    OWC is a reputable supplier of hard drives and RAM upgrades to the Mac community.
    Go to their site, lookup your model MacBook Pro and see what upgrades are available.
    The 1tb WD Scorpio Blue is a 5400rpm drive and slower than the crop of 7200rpm drives.
    Do you need space over speed?

  • Hard drive on HP notebook PC failed. After replacement, can I upgrade OS and memory?

    I have an HP Pavilion Entertainment Notebook PC, Model dv9617nr (HP pn GS730UA). An error message on boot-up says "Operating System not found", so my Windows Vista Ultimate 32-bit (upgraded two years ago) will not load. I am assuming this means that my hard drive has failed. How can I verify this assumption?
    Assuming I am correct, and I order the recommended replacement (HP pn 590736-001) from SpareParts Warehouse ($142.50 plus shipping), are there instructions on the internet for doing this replacement?
    After I do this replacement, I thought I could install the Windows 7 I already have and upgrade the memory to 4GB SODIMM from the 1GB SODIMM that was originally used. What problems would I have with hardware compatibility using Windows 7? If the memory module I use has the same specs as the HP recommended replacement (HP pn 598861-001), i.e. 800MHz, 200 pin, PC2-6400, SDRAM, but a larger 4GB memory capacity, would I have any compatibility problems using it?
    Please advise me on these issues. Thanks, Jeff

    Hi, Jeff:
    Let me break down your question...
    First, to confirm your HDD died, go into your BIOS menu. There should be a hard drive and memory diagnostics utility.  Run the HDD diagnostics utility and see what it reports. More than likely the HDD has crashed and is no longer usable.
    You can order a replacement drive or you can get any SATA II hard drive --either 5400 RPM or 7200 RPM.
    You could go to 250 to 500 GB if you choose. Unfortunately right now, HDD prices are unreasonably high due to the flooding in Thailand where many HDD manufacturing plants are located. Some were damaged heavily, causing HDD prices to triple almost overnight.
    This is a drive you can get that I think is better than what you have now: This one is 7,200 RPM. Faster but uses more battery power.
    http://www.newegg.com/Product/Product.aspx?Item=N82E16822136279
    You can get the 5400 rpm version too; it's similar, but not as good as the Caviar Black. This is more along the lines of what you would get from HP.
    http://www.newegg.com/Product/Product.aspx?Item=N82E16822136387
    Read the reviews on both or you are free to get the one from HP.
    There are instructions to install the hard drive and memory on your notebook's support and driver page. Click on Manuals on the right side of the page.
    http://h10025.www1.hp.com/ewfrf/wc/document?docname=c01278446&tmp_task=prodinfoCategory&cc=us&dlc=en...
    Then you want to look at the maintenance and service guide.
    You can upgrade the memory to a maximum of 4 GB using 2 x 2GB of PC2-6400 memory. You can get it from HP or you can get it at the link below. This is the memory I use. As a matter of fact I just ordered this very same memory to install in my sister-in-law's dv6700.
    http://www.newegg.com/Product/Product.aspx?Item=N82E16820148159
    Now, you can install W7 just fine. I installed W7 Home Premium 64 bit on my dv6810us that has pretty much the same hardware as yours and 4 GB of memory (the same as I posted above).
    Here is the catch though. It depends on what version of W7 you have. If you have a W7 upgrade version, you must install a qualifying operating system on the hard drive before you can upgrade it.
    If you have a full version of Windows 7, then you can install it on a blank hard drive.
    If you have the upgrade version, contact HP support and order a set of Vista recovery disks. Install Vista and then upgrade to W7.
    You can use the vista drivers from your notebook's support page for any drivers you need that W7 didn't supply.
    One thing... DO NOT flash (update) the BIOS when running Windows 7. You can only flash it from Windows Vista.
    Hope this helps. If you have any other questions, please let us know.
    Paul

  • 13" MacBook Pro (mid 2009) hard drive and memory upgrade

    So I'm looking to upgrade my hard drive and memory for my 13" MacBook Pro. I'm still wondering just how easy it will be to recover my data and put it back onto my new drive. I currently have everything backed up to an external drive using Time Machine. If I back it up and then do the replacement, is it really as simple as it sounds to start it up, put in the Mac OS disk and then restore from a Time Machine backup? I have 150GB worth of data and would be ticked off if I can't get it back onto my MacBook Pro. Any expertise out there? The actual installation looks easy, but it's loading the new drive that has me concerned. When I install the new drive, is there anything else I need to do to prep it? And how do I do it if there is? Looking at a 500GB or even 1TB Western Digital drive.

    I would say get yourself an external enclosure. Put the hard drive that comes out of your MacBook Pro into the enclosure and put the new hard drive into the notebook. Boot up from the external hard drive and use Disk Utility to format the new drive. Use Carbon Copy Cloner to move everything as it is from the old hard drive to the new internal one. Reboot to the internal drive and you are done.
    Carbon Copy Cloner is freeware; get it from download.com.

  • What are the minimum CPU and Memory requirements for R12 Vision instance?

    We are in the process of trying to figure out what the minimum hardware requirements are for installing an R12 Vision instance. This Vision instance would only have 5 or fewer concurrent users. We may have to order a new server and we are wondering what the minimum CPU and memory would be. Oracle Support will not give us this information.
    We already know how much disk space it will need.
    Dan

    Hi,
    See these links.
    A Primer on Hardware Sizing for Oracle E-Business Suite
    http://blogs.oracle.com/stevenChan/2010/08/ebs_sizing_primer.html
    What Are the Minimum Desktop Requirements for EBS?
    http://blogs.oracle.com/stevenChan/2010/09/ebs_pc_clients.html
    Also, please see old threads for similar discussion.
    Hardware Requirements
    http://forums.oracle.com/forums/search.jspa?threadID=&q=Hardware+Requirements&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    Benchmark
    http://forums.oracle.com/forums/search.jspa?threadID=&q=Benchmark&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    Thanks,
    Hussein

  • Nio ByteBuffer and memory-mapped file size limitation

    I have a question/issue regarding ByteBuffer and memory-mapped file size limitations. I recently started using NIO FileChannels and ByteBuffers to store and process buffers of binary data. Until now, the maximum individual ByteBuffer/memory-mapped file size I have needed to process was around 80MB.
    However, I now need to begin processing larger buffers of binary data from a new source. Initial testing with buffer sizes above 100MB results in IOExceptions (java.lang.OutOfMemoryError: Map failed).
    I am using 32-bit Windows XP; 2GB of memory (typically 1.3 to 1.5GB free); Java version 1.6.0_03; with -Xmx set to 1280m. Decreasing the Java heap max size down to 768m does make it possible to memory-map larger buffers, but never bigger than roughly 500MB. However, the application that uses this code contains other components that require the -Xmx option to be set to 1280.
    The following simple code segment executed by itself will produce the IOException for me when executed using -Xmx1280m. If I use -Xmx768m, I can increase the buffer size up to around 300MB, but never to a size that I would think I could map.
    // Requires: java.io.RandomAccessFile, java.nio.ByteBuffer, java.nio.channels.FileChannel, java.util.UUID
    try {
        String mapFile = "C:/temp/" + UUID.randomUUID().toString() + ".tmp";
        FileChannel rwChan = new RandomAccessFile( mapFile, "rw" ).getChannel();
        ByteBuffer byteBuffer = rwChan.map( FileChannel.MapMode.READ_WRITE, 0, 100000000 );
        rwChan.close();
    } catch ( Exception e ) {
        e.printStackTrace();
    }
    I am hoping that someone can shed some light on the factors that affect the amount of data that may be memory mapped to/in a file at one time. I have investigated this for some time now and based on my understanding of how memory mapped files are supposed to work, I would think that I could map ByteBuffers to files larger than 500MB. I believe that address space plays a role, but I admittedly am no OS address space expert.
    Thanks in advance for any input.
    Regards- KJ

    See the workaround in http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4724038
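
    For what it's worth, on 32-bit Windows a process only gets about 2GB of user address space, and with -Xmx1280m the heap reservation plus the JVM and loaded DLLs leave little contiguous room for one large mapping, which matches the roughly 500MB ceiling described above. Two mitigations are commonly used: retrying the map() after a System.gc() (a MappedByteBuffer only releases its address space once it has been garbage collected), or mapping the file as a series of smaller windows so that no single map() call needs a big contiguous block. The sketch below (class and method names are made up, and the window size is arbitrary) shows the second approach:

        import java.io.RandomAccessFile;
        import java.nio.MappedByteBuffer;
        import java.nio.channels.FileChannel;

        public class ChunkedMapper {
            private static final long WINDOW = 64L * 1024 * 1024;  // 64MB per window, tunable

            public static void process(String path) throws Exception {
                RandomAccessFile raf = new RandomAccessFile(path, "rw");
                FileChannel ch = raf.getChannel();
                long size = ch.size();
                for (long pos = 0; pos < size; pos += WINDOW) {
                    long len = Math.min(WINDOW, size - pos);
                    MappedByteBuffer window = ch.map(FileChannel.MapMode.READ_WRITE, pos, len);
                    // ... read or write the bytes visible through this window ...
                    // Drop the reference before mapping the next window; the mapping
                    // itself is released only once the buffer has been garbage collected.
                    window = null;
                }
                ch.close();
                raf.close();
            }
        }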

Maybe you are looking for

  • Insert statement not working when Interface is run through PI ..

    Hi Every one, There is an interface(Server proxy) which updates a ztable in the program. Updates are happening when we are running this interface manually(putting Payload manually) but in case there are already entries in ztable and we are trying to

  • Consolidation of items in accounting document

    Hi All I have this scenario in MySAP ERP 2005 version SD Billing doc has two line items Item 1{Material=A,Plant=2000,P-centre=21000,Amount=$3,Qty=1,Tax=$0.42} Item 2{Material=A,Plant=2000,P-centre=21000,Amount=$4,Qty=1,Tax=$0.56} Revenue was posted t

  • Cannot change Datasource location in report

    Post Author: TomS CA Forum: Data Connectivity and SQL I have a CR report that I am working on (inherited from another developer). For the life of me, this report will NOT let me change the database location! I need to point it to another database (MS

  • IPod nano 7th generation sensor issue

    I just got an iPod nano 7th generation. I'm trying to figure out the nike fitness section. It keeps telling me "not linked to sensor" when I try to set up a run. I was under the impression that I wouldn't have to buy a separate sensor. Is this true?

  • How to check schedule line PO text before PO saving.

    How can i check the schedule line PO text before po saving. My idea is add a trigger in user exits. but i don't know how to check this data. I can't check the table stxh or stxl because those data are stored after PO saved. I think before that the da