How to take a heap dump in WebLogic
Hi All,
We have WebLogic 11.1.1.5.0 installed with the JRockit JVM. We need to be able to take a heap dump of our server on demand.
Please suggest a possible way to take a heap dump.
TIA,
Bob
Hi, under the JROCKIT_HOME/bin directory you can run this command:
./jrcmd [Process_id] hprofdump filename=[filename]
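For repeatable, on-demand dumps the command can be wrapped in a small script. This is only a sketch: the process pattern "weblogic.Server" and the JROCKIT_HOME variable are assumptions you will need to adapt to your installation.

```shell
#!/bin/sh
# Sketch only: "weblogic.Server" as the process pattern and JROCKIT_HOME
# being set are assumptions; adjust both for your environment.
PID=$(pgrep -f "weblogic.Server" | head -n 1)

# Timestamped file name so repeated dumps do not overwrite each other.
DUMPFILE="heap_$(date +%Y%m%d_%H%M%S).hprof"

if [ -n "$PID" ] && [ -x "$JROCKIT_HOME/bin/jrcmd" ]; then
    "$JROCKIT_HOME/bin/jrcmd" "$PID" hprofdump filename="$DUMPFILE"
else
    echo "jrcmd or target process not found; would have written $DUMPFILE"
fi
```

The resulting .hprof file can then be opened in a heap analyzer such as jhat or Eclipse MAT.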
Similar Messages
-
Hi,
I have taken a data dump on an Oracle 9i machine and imported it into Oracle 10g (the production machine), but it is showing an error: language set error.
Could you tell me how to take a data dump with the correct language set?
Regards,
Suva

Hi PaulM,
Please find the details below.
The development server is a 9i machine (I exported on this machine) and I imported on the production server (Oracle 10g).
When importing on the production server the error occurs; the error log is added below.
Production Database (language details)
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET UTF8
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY $
NLS_COMP BINARY
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
NLS_NCHAR_CHARACTERSET UTF8
NLS_RDBMS_VERSION 10.2.0.1.0
Development Database (language details)
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET UTF8
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY $
NLS_COMP BINARY
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
NLS_NCHAR_CHARACTERSET UTF8
NLS_RDBMS_VERSION 10.2.0.1.0
Log file
Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
Export file created by EXPORT:V09.02.00 via conventional path
import done in WE8MSWIN1252 character set and UTF8 NCHAR character set
import server uses UTF8 character set (possible charset conversion)
export server uses AL16UTF16 NCHAR character set (possible ncharset conversion)
. importing JW_OR's objects into JW_OR
. importing JW_OS's objects into JW_OS
. importing JW_ADMIN's objects into JW_ADMIN
. importing JW_OR's objects into JW_OR
. . importing table "ACCRXNS" 234671 rows imported
. . importing table "AUTHORLINKS" 790450 rows imported
. . importing table "AUTHORS" 79500 rows imported
. . importing table "CATSOL" 25505 rows imported
. . importing table "CATSOLSYNONYMS" 80045 rows imported
. . importing table "CHAPTERTITLES" 133 rows imported
. . importing table "COMPOUNDLINKS" 601785 rows imported
. . importing table "CONDITIONS" 207445 rows imported
. . importing table "JOURNALS" 2327 rows imported
. . importing table "LANGUAGE" 0 rows imported
. . importing table "MAINDATA" 234659 rows imported
. . importing table "MOLDATA" 721174 rows imported
. . importing table "PLAN_TABLE" 1 rows imported
. . importing table "REFERENCES" 276783 rows imported
. . importing table "ROLES" 2 rows imported
. . importing table "RXNKEYLINKS" 1724404 rows imported
. . importing table "RXNKEYWORDS" 848 rows imported
. . importing table "TABLETITLES" 2400 rows imported
. . importing table "TEMP_TABLE" 165728 rows imported
. . importing table "TEMP_WILEY_MAINDATA" 155728 rows imported
. . importing table "TEMP_WILEY_PDF_MAP" 16672 rows imported
. . importing table "TEMP_WILEY_YEAR_VOL_MAP" 42 rows imported
. . importing table "WEX_ACCRXNS" 3465 rows imported
. . importing table "WEX_AUTHORLINKS" 14183 rows imported
. . importing table "WEX_AUTHORS" 79500 rows imported
. . importing table "WEX_CHAPTERTITLES" 133 rows imported
. . importing table "WEX_COMPOUNDLINKS" 10925 rows imported
. . importing table "WEX_CONDITIONS" 5297 rows imported
. . importing table "WEX_JOURNALS" 2327 rows imported
. . importing table "WEX_LANGUAGE" 0 rows imported
. . importing table "WEX_MAINDATA" 3465 rows imported
. . importing table "WEX_MOLDATA" 10358 rows imported
. . importing table "WEX_REFERENCES" 3795 rows imported
. . importing table "WEX_RXNKEYLINKS" 34540 rows imported
. . importing table "WEX_RXNKEYWORDS" 848 rows imported
. . importing table "WEX_TABLETITLES" 2400 rows imported
. . importing table "WEX_WILEY_HTML_MAP" 17316 rows imported
. . importing table "WEX_WILEY_MAINDATA" 3465 rows imported
. . importing table "WEX_WILEY_PDF_MAP" 23925 rows imported
. . importing table "WEX_WILEY_YEAR_VOL_MAP" 58 rows imported
. . importing table "WILEY_HTML_MAP" 17316 rows imported
. . importing table "WILEY_MAINDATA" 234659 rows imported
. . importing table "WILEY_PDF_MAP" 23925 rows imported
. . importing table "WILEY_YEAR_VOL_MAP" 58 rows imported
. importing JW_OS's objects into JW_OS
. . importing table "ACCRXNS" 7116 rows imported
. . importing table "ATMOSPHERE" 47 rows imported
. . importing table "AUTHORLINKS" 33276 rows imported
. . importing table "AUTHORS" 6555 rows imported
. . importing table "CATSOL" 1463 rows imported
. . importing table "CATSOLSYNONYMS" 9370 rows imported
. . importing table "CHEMICALS" 78197 rows imported
. . importing table "COMPOUNDLINKS" 20799 rows imported
. . importing table "EXPDET" 1 rows imported
. . importing table "FOOTNOTES" 77825 rows imported
. . importing table "JOURNALS" 2 rows imported
. . importing table "LANGUAGE" 2 rows imported
. . importing table "MAINDATA" 7116 rows imported
. . importing table "PATHSTEP" 7199 rows imported
. . importing table "PROCEDURENOTES" 77293 rows imported
. . importing table "ROLES" 2 rows imported
. . importing table "RXNKEYLINKS" 23096 rows imported
. . importing table "RXNKEYWORDS" 1272 rows imported
. . importing table "WEX_ACCRXNS" 135 rows imported
. . importing table "WEX_ATMOSPHERE" 47 rows imported
. . importing table "WEX_AUTHORLINKS" 613 rows imported
. . importing table "WEX_AUTHORS" 6555 rows imported
. . importing table "WEX_CHEMICALS" 0 rows imported
. . importing table "WEX_COMPOUNDLINKS" 497 rows imported
. . importing table "WEX_EXPDET" 1 rows imported
. . importing table "WEX_FOOTNOTES" 2184 rows imported
. . importing table "WEX_JOURNALS" 2 rows imported
. . importing table "WEX_LANGUAGE" 2 rows imported
. . importing table "WEX_MAINDATA" 135 rows imported
. . importing table "WEX_PATHSTEP" 135 rows imported
. . importing table "WEX_PROCEDURENOTES" 2253 rows imported
. . importing table "WEX_RXNKEYLINKS" 695 rows imported
. . importing table "WEX_RXNKEYWORDS" 1272 rows imported
. importing JW_ADMIN's objects into JW_ADMIN
. . importing table "APP_USER" 76 rows imported
. . importing table "AUTHOR" 61874 rows imported
. . importing table "CITATION"
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10794
Column 2 77
Column 3 1
Column 4 24
Column 5
Column 6 Science of Synthesis
Column 7 Negishi, E.-i.; Takahashi, T. Science of Synthesis...
Column 8 681–848
Column 9 2
Column 10
Column 11 2002
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10879
Column 2 77
Column 3 1
Column 4 110
Column 5
Column 6 Comprehensive Organic Synthesis
Column 7 Hiemstra, H.; Speckamp, W. N.; Trost, B. M.; Flemi...
Column 8 1047–108
Column 9 2
Column 10
Column 11
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10880
Column 2 77
Column 3 1
Column 4 111
Column 5
Column 6 Houben-Weyl Methods of Organic Chemistry
Column 7 De Koning, H.; Speckamp, W. N.; Helmchen, G.; Hoff...
Column 8 1953–200
Column 9 E21b
Column 10
Column 11 1995
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10904
Column 2 77
Column 3 1
Column 4 135
Column 5
Column 6 Houben-Weyl Methods of Organic Chemistry
Column 7 Ryu, I.; Murai, S.; de Meijere, A., Ed. Houben-Wey...
Column 8 1985–204
Column 9 E17c
Column 10
Column 11 1997
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10905
Column 2 77
Column 3 1
Column 4 136
Column 5
Column 6 The Chemistry of the Cyclopropyl Group
Column 7 Tsuji, T.; Nishida, S.; Patai, S.; Rappoport, Z., ...
Column 8 307–373
Column 9
Column 10
Column 11 1987
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10906
Column 2 77
Column 3 1
Column 4 137
Column 5
Column 6 The Chemistry of the Cyclopropyl Group
Column 7 Vilsmaier, E.; Patai, S.; Rappoport, Z., Eds. The ...
Column 8 1341–145
Column 9
Column 10
Column 11 1987
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10952
Column 2 77
Column 3 1
Column 4 183
Column 5
Column 6 Cyclopropane-Derived Reactive Intermediates
Column 7 Boche, G.; Walborsky, H. M. Cyclopropane-Derived R...
Column 8 117–173
Column 9
Column 10
Column 11 1990
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10958
Column 2 77
Column 3 1
Column 4 189
Column 5
Column 6 Houben-Weyl Methods of Organic Chemistry
Column 7 Klunder, A. J. H.; Zwanenburg, B. Houben-Weyl Meth...
Column 8 2419–243
Column 9 E17c
Column 10
Column 11 1997
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10995
Column 2 77
Column 3 1
Column 4 226
Column 5
Column 6 Science of Synthesis
Column 7 Cha, J. K. Science of Synthesis 2005, 325–338.
Column 8 325–338
Column 9
Column 10
Column 11 2005
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 17123
Column 2 82
Column 3 1
Column 4 13
Column 5
Column 6 Comprehensive Organometallic Chemistry II
Column 7 Dushin, R. G.; Edward, W. A.; Stone, F. G. A.; Wil...
Column 8 1071–109
Column 9 12
Column 10
Column 11 1995
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 17124
Column 2 82
Column 3 1
Column 4 14
Column 5
Column 6 Modern Carbonyl Olefination
Column 7 Ephritikhine, M.; Villiers, C.; Takeda, T. Ed. Mod...
Column 8 223–285
Column 9
Column 10
Column 11 2004
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 17126
Column 2 82
Column 3 1
Column 4 16
Column 5
Column 6 Transition Metals for Organic Synthesis (2nd Editi...
Column 7 Furstner, A.; Beller, M.; Bolm, C. Eds. Transition...
Column 8 449–468
Column 9
Column 10
Column 11 2004
17712 rows imported
. . importing table "FOOTNOTE" 38 rows imported
. . importing table "GT_STATS_REPORT" 0 rows imported
. . importing table "GT_VALIDATION_REPORT" 0 rows imported
. . importing table "OR_USERS" 1 rows imported
. . importing table "OS_USERS" 1 rows imported
. . importing table "PROCEDURENOTE" 70 rows imported
. . importing table "QC_TRACKING" 539881 rows imported
. . importing table "ROLE" 5 rows imported
. . importing table "SCHEMA" 3 rows imported
. . importing table "TASK_ALLOCATION" 159370 rows imported
. . importing table "USER_LOG" 174488 rows imported
. . importing table "VERSION" 3 rows imported
About to enable constraints...
IMP-00017: following statement failed with ORACLE error 2298:
"ALTER TABLE "AUTHOR" ENABLE CONSTRAINT "FK_AUTHOR_CITATIONID""
IMP-00003: ORACLE error 2298 encountered
ORA-02298: cannot validate (JW_ADMIN.FK_AUTHOR_CITATIONID) - parent keys not found
Import terminated successfully with warnings.
Regards,
Subash -
How to take IIS dump remotely for remote server
Hello All,
Is there any way to take an IIS dump remotely, using the Debug Diagnostic tool or a PowerShell script, on remote computers?
Please answer if anyone knows. Every time we have to log in to the servers and take a dump before resetting IIS on our web servers.
Regards,
Satya

Hi Satya,
You can post the question to IIS forum instead:
http://forums.iis.net/
If you have any feedback on our support, please send to [email protected] -
How to take DB dump for "virtual Columns" enabled env in oracle11g
Hi,
Could you please let me know the procedure/steps for taking a DB dump of a "virtual columns" enabled environment in Oracle 11g.
We are not able to take the database dump using the 'exp' tool.
Thanks,
Satya Aditham

Wrong forum; this is a Secure Backup specific forum, not an RDBMS/RMAN forum.
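One note grounded in the question itself: the classic 'exp' utility predates 11g features such as virtual columns, while Data Pump (expdp) exports virtual-column definitions as part of the table metadata. A hedged sketch, in which the schema name, directory object, and credentials are all placeholders:

```shell
#!/bin/sh
# Sketch only: APP_SCHEMA, DATA_PUMP_DIR and the system credentials are
# placeholders. Data Pump (expdp), unlike classic exp, handles 11g
# features such as virtual columns.
cat > schema_export.par <<'EOF'
schemas=APP_SCHEMA
directory=DATA_PUMP_DIR
dumpfile=app_schema.dmp
logfile=app_schema_exp.log
EOF

if command -v expdp >/dev/null 2>&1; then
    expdp system/password parfile=schema_export.par
else
    echo "expdp not on PATH; parameter file written to schema_export.par"
fi
```

The DATA_PUMP_DIR directory object must exist in the database and be readable/writable by the exporting user.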
-
I have a table TBTCO in quality with 290,000 records. I want to download all of them, but Excel allows only 65,536 records. How can I download the remaining records? Could anyone please suggest a way?
Hi Venkat,
As you know, MS Excel 2003 and older versions are limited to 65,536 (256²) rows per worksheet, but the newer version of MS Excel, i.e. 2007, can hold up to 1,048,576 (1024²) rows. So one solution in your case would be to install the new version.
OR Alternatively there is a work around but it needs to put in some efforts.
If you are not comfortable with installing the new MS Office components and still need the data in Excel format, there is a workaround, though it needs some effort: download the table in 'unconverted' format using the 'Download to local file' option in the table browser, then open the file in EditPlus (a free trial version is available on the Internet) and copy the records into an Excel file across different worksheets.
EditPlus is suggested because it shows line numbers when you open a file, so it is easier to copy 65,536 records at a time and paste them into Excel.
Try it out.
Cheers, enjoy.
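The manual copy-and-paste step can also be scripted. A sketch, assuming the 'unconverted' download is a plain text file with one record per line (the file name is a placeholder):

```shell
#!/bin/sh
# Sketch: split a large text export into chunks that Excel 2003 can open.
# 65535 data rows per chunk leaves one row free for a header.
INPUT="${1:-tbtco_download.txt}"

# For illustration only: create a demo file if the input does not exist.
[ -f "$INPUT" ] || seq 1 200000 > "$INPUT"

split -l 65535 "$INPUT" tbtco_part_

# Each tbtco_part_* file now fits in one Excel 2003 worksheet.
ls tbtco_part_*
```

Each resulting chunk can then be opened or pasted into its own worksheet.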
Regards,
Abhijit G. Borkar -
Error while taking GC heap Dump using Microsoft PerfView
Hello,
When I try to take a heap dump of an application process using Microsoft PerfView, an error is observed; the error log is given below. Can you please let me know the root cause of this issue?
Steps followed
From the PerfView UI, choose “Take Heap Snapshot,” located on the Memory menu.
And choose the process to capture
Click the “Dump GC Heap” button or simply double click on the process name.
Error Log
Completed: Dumping GC Heap to C:\Install\TM\PerfView\TestProcess.gcDump (Elapsed Time: 1.156 sec)
Error: HeapDump failed with exit code 1
Directory TestProcess.gcdump does not exist
Started: Dumping GC Heap to C:\Install\TM\PerfView\TestProcess.1.gcDump
Collecting a GC Heap SnapShot for process 1704
[Taking heap snapshot of process '1704' ID 1704 to TestProcess.1.gcdump. This can take 10s of seconds to minutes.]
During the dump the process will be frozen. If the dump is aborted, the process being dumped will need to be killed.
Starting dump at 8/08/2014 3:55:15 PM
Starting Heap dump on Process 1704 running architecture AMD64.
set _NT_SYMBOL_PATH=SRV*C:\Users\UserId\AppData\Local\Temp\3\symbols*http://msdl.microsoft.com/download/symbols
Exec: "C:\Users\UserId\AppData\Roaming\PerfView\VER.2014-08-08.13.49.17.346\AMD64\HeapDump.exe" /MaxDumpCountK=250 "1704" "TestProcess.1.gcdump"
Looking for C:\Users\UserId\AppData\Roaming\PerfView\VER.2014-08-08.13.49.17.346\Microsoft.Diagnostics.FastSerialization.dll
Dumping process 1704 with id 1704.
Process Has DotNet: False Has JScript: False Has ClrDll: False
HeapDump Error: Could not dump either a .NET or JavaScript Heap. See log file for details
Completed: Dumping GC Heap to C:\Install\TM\PerfView\TestProcess.1.gcDump (Elapsed Time: 1.172 sec)
Error: HeapDump failed with exit code 1
Directory TestProcess.1.gcdump does not exist

Hi,
Below is the DDL taken from a different database. Will this be enough? One more thing, please: what should the password be? Should it be DMSYS's, since this will be used not by me but by the system?
CREATE USER "DMSYS" PROFILE "DEFAULT" IDENTIFIED BY "*******" PASSWORD EXPIRE DEFAULT TABLESPACE "SYSAUX" TEMPORARY TABLESPACE "TEMP" QUOTA 204800 K ON "SYSAUX" ACCOUNT LOCK
GRANT ALTER SESSION TO "DMSYS"
GRANT ALTER SYSTEM TO "DMSYS"
GRANT CREATE JOB TO "DMSYS"
GRANT CREATE LIBRARY TO "DMSYS"
GRANT CREATE PROCEDURE TO "DMSYS"
GRANT CREATE PUBLIC SYNONYM TO "DMSYS"
GRANT CREATE SEQUENCE TO "DMSYS"
GRANT CREATE SESSION TO "DMSYS"
GRANT CREATE SYNONYM TO "DMSYS"
GRANT CREATE TABLE TO "DMSYS"
GRANT CREATE TRIGGER TO "DMSYS"
GRANT CREATE TYPE TO "DMSYS"
GRANT CREATE VIEW TO "DMSYS"
GRANT DROP PUBLIC SYNONYM TO "DMSYS"
GRANT QUERY REWRITE TO "DMSYS"
GRANT SELECT ON "SYS"."DBA_JOBS_RUNNING" TO "DMSYS"
GRANT SELECT ON "SYS"."DBA_REGISTRY" TO "DMSYS"
GRANT SELECT ON "SYS"."DBA_SYS_PRIVS" TO "DMSYS"
GRANT SELECT ON "SYS"."DBA_TAB_PRIVS" TO "DMSYS"
GRANT SELECT ON "SYS"."DBA_TEMP_FILES" TO "DMSYS"
GRANT EXECUTE ON "SYS"."DBMS_LOCK" TO "DMSYS"
GRANT EXECUTE ON "SYS"."DBMS_REGISTRY" TO "DMSYS"
GRANT EXECUTE ON "SYS"."DBMS_SYSTEM" TO "DMSYS"
GRANT EXECUTE ON "SYS"."DBMS_SYS_ERROR" TO "DMSYS"
GRANT DELETE ON "SYS"."EXPDEPACT$" TO "DMSYS"
GRANT INSERT ON "SYS"."EXPDEPACT$" TO "DMSYS"
GRANT SELECT ON "SYS"."EXPDEPACT$" TO "DMSYS"
GRANT UPDATE ON "SYS"."EXPDEPACT$" TO "DMSYS"
GRANT SELECT ON "SYS"."V_$PARAMETER" TO "DMSYS"
GRANT SELECT ON "SYS"."V_$SESSION" TO "DMSYS"
The other database has DMSYS and the status is EXPIRED & LOCKED, but I'm still able to take the dump using Data Pump??
How to take regular heap dumps using HPROF
Hi Folks,
I am using Oracle App Server as my application server. I found that the memory grows gradually and maxes out within 1 hour. I am using 1 GB of heap.
I definitely feel this is a memory leak issue. Once heap usage reaches 100%, I start getting full GCs, my whole server hangs, and nothing works. Sometimes the JVM even crashes and restarts.
I didn't find an OutOfMemory exception in any of my logs either.
I came to know that we can use HPROF to deal with this.
I use the below as my JVM args:
-agentlib:hprof=heap=all,format=b,depth=10,file=$ORACLE_HOME\hprof\Data.hprof
I ran my load test for 10 minutes; now my heap usage has grown to some extent.
My Questions:
1. Why are there 2 files generated, one named Data.hprof and another Data.hprof.tmp? Which is which?
2. How do I get a dump at 2 different points in time, so that I can compare the 2 dumps and say which objects are growing?
I downloaded HAT, and if I open this Data.hprof file with HAT, I get the error below. This error appears if I open the file without stopping the JVM process.
java.io.EOFException
at java.io.DataInputStream.readFully(DataInputStream.java:178)
at java.io.DataInputStream.readFully(DataInputStream.java:152)
at hat.parser.HprofReader.read(HprofReader.java:202)
at hat.parser.Reader.readFile(Reader.java:90)
at hat.Main.main(Main.java:149)
If I stop the JVM process and then open the file through HAT, I get this error:
Started HTTP server on port 7000
Reading from hprofData.hprof...
Dump file created Wed Dec 13 02:35:03 MST 2006
Warning: Weird stack frame line number: -688113664
java.io.IOException: Bad record length of -1551478782 at byte 0x0008ffab of file.
at hat.parser.HprofReader.read(HprofReader.java:193)
at hat.parser.Reader.readFile(Reader.java:90)
at hat.Main.main(Main.java:149)
JVM version I am using: Sun JVM 1.5.0_06
I am seriously fed up with this memory leak issue. Please help me out, folks. I need this as early as possible.
Thanks in advance.

First, the suggestion of using jmap is an excellent one; you should try it. On large applications, with the hprof agent you have to restart your VM, and hprof can disturb your JVM process, so you may not be able to see the problem as quickly. With jmap, you can get a heap snapshot of a running JVM when it is in the state you want to understand more of, and it's really fast compared to the hprof agent. The hprof dump file you get from jmap will not have the stack traces of where objects were allocated, which was a concern of mine a while back, but all indications are that these stack traces are not critical to finding memory leak problems. The allocation sites can usually be found with a good IDE or search tool, like the NetBeans 'Find Usages' feature.
On hprof, there is a temp file created during the heap dump creation, ignore the tmp file.
The HAT utility has been added to JDK6 (as jhat) and many problems have been fixed. But most importantly, this JDK6 jhat can read ANY hprof dump file, from JDK5 or even JDK1.4.2. So even though the JDK6 jhat is using JDK6 itself, the hprof dump file it is given could have come from pretty much anywhere, including jmap. As long as it's a valid hprof binary dump file.
So even if it's just to have jhat handy, you should get JDK6.
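To make the jmap suggestion concrete, here is a minimal sketch (JDK 6 syntax; the target process id is passed as an argument and the dump file name is arbitrary):

```shell
#!/bin/sh
# Sketch (JDK 6 tools): snapshot the heap of a live JVM without
# restarting it. The target PID is an argument; the file name is arbitrary.
PID="$1"
DUMPFILE="snapshot_$(date +%Y%m%d_%H%M%S).hprof"

if [ -n "$PID" ] && command -v jmap >/dev/null 2>&1; then
    jmap -dump:format=b,file="$DUMPFILE" "$PID"
    # Browse the result with: jhat "$DUMPFILE"  (HTTP server on port 7000)
    echo "wrote $DUMPFILE"
else
    echo "usage: $0 <java-pid>  (jmap must be on PATH)"
fi
```

Taking one snapshot before and one after the 10-minute load run gives the two comparable dumps asked about in question 2.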
Also, the Netbeans profiler (http://www.netbeans.org) might be helpful too. But it will require a restart of the VM.
-kto -
How can I get a heap dump for 1.4.2_11 when OutOfMemory occurs
Hi guys,
How can I get a heap dump for 1.4.2_11 when an OutOfMemoryError occurs, since it has no options like -XX:+HeapDumpOnOutOfMemoryError and -XX:+HeapDumpOnCtrlBreak?
We are running WebLogic 8.1 SP3 applications on this Sun 1.4.2_11 JVM and it's throwing OutOfMemory errors, but we cannot find a heap dump. The application runs as a service on Windows Server 2003. How can I do more analysis on this issue?
Thanks.

The HeapDumpOnOutOfMemoryError option was added to 1.4.2 in update 12. Further work to support all collectors was done in update 15.
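On update levels that do support the option (1.4.2_12 and later, and the 5.0/6 trains), the start-script fragment would look something like this; the dump path is a placeholder, and if -XX:HeapDumpPath is not recognized on your update level it can simply be omitted (the dump then lands in the working directory):

```shell
# Start-script fragment (placeholder path). Requires 1.4.2_12 or later.
JAVA_OPTIONS="${JAVA_OPTIONS} -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=C:\\dumps"
```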
-
Full heap dump - weblogic 8.1
Hi to everyone,
Is there any possibility of taking a full heap dump (core dump) of WebLogic 8.1 manually?
Best regards,
Peter

In JDK 6 there are the jhat and jmap utilities, which are described here:
http://weblogs.java.net/blog/kellyohair/archive/2005/09/heap_dump_snaps.html
But, prior to that (JDK5 and earlier), you have to use the HAT utility which can be found at
http://hat.dev.java.net/
If you are using JRockit, you can use Mission Control for this, I believe. There's an intro to this tool at
http://dev2dev.bea.com/pub/a/2005/12/jrockit-mission-control.html?page=1 -
How to take schema level Encrypted export dump
Hi ,
Please let me know how to take a schema-level export dump with encryption.
regards,
chandrasekar.v

This article on how to set up and maintain TDE may help you: Transparent Data Encryption
Additionally, you may find this interesting: Transparent Data Encryption (TDE) in Oracle 10g Database Release 2
~ Madrid. -
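If the database is 11g, Data Pump can encrypt the whole schema-level dump with a passphrase (on 10gR2 only encrypted columns are covered, via ENCRYPTION_PASSWORD, and Data Pump encryption may require the Advanced Security Option). A sketch with placeholder schema, directory, and passwords:

```shell
#!/bin/sh
# Sketch only (Oracle 11g Data Pump syntax): APP_SCHEMA, DATA_PUMP_DIR
# and both passwords are placeholders.
cat > enc_export.par <<'EOF'
schemas=APP_SCHEMA
directory=DATA_PUMP_DIR
dumpfile=app_schema_enc.dmp
encryption=all
encryption_mode=password
encryption_password=ChangeMe123
EOF

if command -v expdp >/dev/null 2>&1; then
    expdp system/manager parfile=enc_export.par
else
    echo "expdp not on PATH; parameter file written to enc_export.par"
fi
```

The same passphrase must be supplied to impdp when the dump is imported.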
How to turn off core dump for WebLogic 8.1 running on AIX 5.3
Hi there,
Is there a way to turn off core dumping for WebLogic 8.1 running on AIX 5.3?
Thank you.
Regards,
Surender

Hi Surender,
Please add the following flag to the JAVA_OPTIONS in your server's start script, like this:
JAVA_OPTIONS="${JAVA_OPTIONS} -Xdump:system:none"
Thanks
Jay SenSharma
http://jaysensharma.wordpress.com/2009/12/30/jvm-crash-and-native-outofmemory/ (WebLogic Wonders Are Here) -
Why does hprof=heap=dump have so much overhead?
I understand why the HPROF option heap=sites incurs a massive performance overhead; it has to intercept every allocation and record the current call stack.
However, I don't understand why the HPROF option heap=dump incurs so much of a performance overhead. Presumably it could do nothing until invoked, and only then trace the entire heap from the system roots.
Can anyone speak to why it doesn't work that way?
- Gordon @ IA

Traditionally, agents like hprof had to be loaded into the virtual machine at startup, and this was the only way to capture these object allocations. The new hprof in the JDK 5.0 release (Tiger) was written using the newer VM interface JVM TI, and this new hprof was mostly meant to reproduce the functionality of the old hprof from JDK 1.4.2 that used JVMPI. (Just FYI: run 'java -Xrunhprof:help' for help on hprof.)
The JDK 5.0 hprof will at startup, instrument java.lang.Object.<init>() and all classes and methods that use the newarray bytecodes. This instrumentation doesn't take long and is just an initial startup cost, it's the run time and what happens then that is the performance bottleneck. At run time, as any object is allocated, the instrumented methods trigger an extra call into a Java tracker class which in turn makes a JNI call into the hprof agent and native code. At that point, hprof needs to track all the objects that are live (the JVM TI free event tells it when an object is freed), which takes a table inside the hprof agent and memory space. So if the machine you are using is low on RAM, using hprof will cause drastic slowdowns, you might try heap=sites which uses less memory but just tracks allocations based on site of allocation not individual objects.
The more likely run-time performance issue is that at each allocation hprof wants to get the stack trace, which can be expensive, depending on how many objects are allocated. You could try using depth=0 and see if the stack-trace samples are a serious issue for your situation. If you don't need stack traces, then you would be better off looking at the jmap command, which gets you an hprof binary dump on the fly with no overhead; then you can use jhat (or HAT) to browse the heap. This may require the JDK 6 (Mustang) release; see http://mustang.dev.java.net for the free downloads of JDK 6 (Mustang).
There is an RFE for hprof to allow the tracking of allocations to be turned on/off in the Java tracker methods that were injected, at the Java source level. But this would require adding some Java APIs to control sun/tools/hprof/Tracker which is in rt.jar. This is very possible and more with the JVM TI interfaces.
If you haven't tried the NetBeans Profiler (http://www.netbeans.org) you may want to look at it. It does take an incremental approach to instrumentation and tries to focus in on the areas of interest and allows you to limit the overhead of the profiler. It works with the latest JDK 5 (Tiger) update release, see http://java.sun.com/j2se.
Oh yes, also look at some of the JVM TI demos that come with the JDK 5 download. Look in the demo/jvmti directory and try the small agents HeapTracker and HeapViewer, they have much lower overhead and the binaries and all the source is right there for you to just use or modify and customize for yourself.
Hope this helps.
-kto -
Java 1.4.2. - full heap dump?
Hello,
Is there any way to generate a full heap dump (core dump) in Java 1.4.2 on demand?
Best regards,
Peter

If you are on a Unix platform, you can use this script to take a thread dump.
I am not sure whether you can generate a core dump without an application crash.
kill -3 <java pid> will produce a full stack trace (thread dump) of the Java process.
Note: kill -3 will not terminate the Java process; it only writes the full stack trace to your log file and is safe to use while the process is running.
You can get the Java process id using the Unix command "ps -ef | grep java".
#!/bin/ksh
[ $# -le 0 ] && echo "USAGE: $0 <pid>" && exit
for i in 1 2
do
DT=`date +%Y%m%d_%H%M`
prstat -Lmp $1 1 1 >> prstat_Lmp_${i}_${DT}.dmp
pstack $1 >> pstack_${i}_${DT}.dmp
kill -3 $1
echo "prstat, pstack, and thread dump done. #" $i
sleep 1
echo "Done sleeping."
done
Pls go through some of this links, this will provide you on how to debug the issue with the logs generated by the scripts:
http://support.bea.com/application_content/product_portlets/support_patterns/wls/UnexpectedHighCPUUsageWithWLSPattern.html
http://www.unixville.com/~moazam/stories/2004/05/18/debuggingHangsInTheJvm.html -
JVMPI_GC_ROOT_MONITOR_USED - what does this mean in a heap dump?
I'm having some OutOfMemory errors in my application, so I turned on a profiler and took a heap dump before and after an operation that is blowing up the memory.
What changes after the operation is that I get an enormous amount of data reported under the node JVMPI_GC_ROOT_MONITOR_USED. This includes some Oracle PreparedStatements which are holding a lot of data.
I tried researching the meaning of JVMPI_GC_ROOT_MONITOR_USED, but found little help. Should these be objects that are ready for garbage collection? If so, they are not being garbage collected, but I'm getting an OutOfMemoryError instead (I thought the JVM was supposed to guarantee GC would run before OutOfMemory occurred).
Any help on how to interpret what it means for objects to be reported under JVMPI_GC_ROOT_MONITOR_USED, and any ways to eliminate those objects, will be greatly appreciated!
Thanks

I tried researching the meaning of JVMPI_GC_ROOT_MONITOR_USED, but found little help. Should this be objects that are ready for garbage collection?
Disclaimer: I haven't written code to use JVMPI, so anything here is speculation.
However, after reading this: http://java.sun.com/j2se/1.4.2/docs/guide/jvmpi/jvmpi.html
It appears that the "ROOT" flags in a level-2 dump are used with objects that are considered a "root reference" for GC (those references that are undeniably alive). Most descriptions of "roots" are static class members and variables in a stack frame. My interpretation of this doc is that objects used in a synchronized statement are also considered roots, at least for the life of the synchronized block (which makes a lot of sense when you think about it).
AD4J: Unable to start Heap Dump due to OS Error
When Java 'Heap Dump' is requested, the following message is shown:
Unable to start Heap Dump due to OS Error.
Details of the monitored JDK and application server are given below:
JDK: Sun JDK 1.4.2_08 on Solaris 9
Application Server: WebLogic 8.1 SP4
1. What could be the possible cause? No errors are logged in jamserv/logs/error_log. Is there any way to enable detailed logging?
2. Each time the heap dump is requested, a file of the format heapdump<n> gets created in /tmp (e.g. /tmp/heapdump12.txt). If you see the file, it contains the following:
a) a line containing summary of the heap usage, and
b) stack traces of all the threads
Thanks!

Wrong forum?