MemoryAnalyzer

About

The Eclipse Memory Analyzer tool (MAT) is a fast and feature-rich heap dump analyzer that helps you find memory leaks and analyze high memory consumption issues.

With Memory Analyzer one can easily:

  • find the biggest objects, as MAT provides reasonable accumulated sizes (retained sizes)
  • explore the object graph, both inbound and outbound references
  • compute paths from the garbage collector roots to interesting objects
  • find memory waste, such as redundant String objects, empty collection objects, etc.

Getting Started

Installation

See the download page for installation instructions.

Basic Tutorials

Both the Basic Tutorial chapter in the MAT documentation and the Eclipse Memory Analyzer Tutorial by Lars Vogel are good starting points if you are new to MAT.

Further Reading

Check MemoryAnalyzer/Learning Material. There you will find a collection of presentations and web articles on Memory Analyzer, which are also a good resource for learning.


Getting a Heap Dump

HPROF dumps from Sun Virtual Machines

The Memory Analyzer can work with HPROF binary formatted heap dumps. Those heap dumps are written by Sun HotSpot and any VM derived from HotSpot. Depending on your scenario, your OS platform and your JDK version, you have different options to acquire a heap dump.

Non-interactive

If you run your application with the VM flag -XX:+HeapDumpOnOutOfMemoryError, a heap dump is written on the first OutOfMemoryError. There is no overhead involved unless an OOM actually occurs. This flag is a must for production systems, as it is often the only way to analyze the problem further.

As per this article, the heap dump is generated in the "current directory" of the JVM by default. It can be explicitly redirected with -XX:HeapDumpPath=, for example -XX:HeapDumpPath=/disk2/dumps. Note that the dump file can be huge, up to several gigabytes, so ensure that the target file system has enough space.
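
For example, a launch line that enables the dump on OOM and redirects it to a dedicated file system could look like the following (myapp.jar is just a placeholder for your own application):

   java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/disk2/dumps -jar myapp.jar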


Interactive

As a developer, you may want to trigger a heap dump on demand. On Windows, use JDK 6 and JConsole. On Linux and Mac OS X, you can also use jmap, which ships with JDK 5.

Via MAT:

  • tutorial: http://community.bonitasoft.com/blog/effective-way-fight-duplicated-libs-and-version-conflicting-classes-using-memory-analyzer-tool

Via Java VM parameters:

  • -XX:+HeapDumpOnOutOfMemoryError writes heap dump on OutOfMemoryError (recommended)
  • -XX:+HeapDumpOnCtrlBreak writes heap dump together with thread dump on CTRL+BREAK
  • -agentlib:hprof=heap=dump,format=b combines the above two settings (old way; not recommended as the VM frequently dies after CTRL+BREAK with strange errors)

Via Tools:

  • Sun (Linux, Solaris; not on Windows) JMap Java 5: jmap -heap:format=b <pid>
  • Sun (Linux, Solaris; Windows see link) JMap Java 6: jmap.exe -dump:format=b,file=HeapDump.hprof <pid>
  • Sun (Linux, Solaris) JMap with Core Dump File: jmap -dump:format=b,file=HeapDump.hprof /path/to/bin/java core_dump_file
  • Sun JConsole: Launch jconsole.exe and invoke the dumpHeap() operation on the HotSpotDiagnostic MBean (a programmatic sketch of the same operation follows below)
  • SAP JVMMon: Launch jvmmon.exe and call menu for dumping the heap

The heap dump will be written to the working directory.
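
The dumpHeap operation that JConsole exposes can also be invoked from code through the platform MBean server. The sketch below is only an illustration; it assumes a HotSpot-based JDK (6 or later) that provides the com.sun.management.HotSpotDiagnosticMXBean interface, and the file name and the live-objects-only flag are example values.

   import java.lang.management.ManagementFactory;
   import javax.management.MBeanServer;
   import com.sun.management.HotSpotDiagnosticMXBean;

   public class HeapDumper {
       public static void main(String[] args) throws Exception {
           // Look up the HotSpotDiagnostic MXBean on the platform MBean server
           MBeanServer server = ManagementFactory.getPlatformMBeanServer();
           HotSpotDiagnosticMXBean diagnostic = ManagementFactory.newPlatformMXBeanProxy(
                   server, "com.sun.management:type=HotSpotDiagnostic", HotSpotDiagnosticMXBean.class);
           // Write an HPROF binary heap dump; 'true' limits the dump to live (reachable) objects.
           // Note that dumpHeap fails if the target file already exists.
           diagnostic.dumpHeap("HeapDump.hprof", true);
       }
   }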

Vendor / Release | VM Parameter: On OoM, On Ctrl+Break, Agent | VM Tools: JMap, JConsole
Sun, HP
1.4.2_12 Yes Yes Yes
1.5.0_07 Yes Yes (Since 1.5.0_15) Yes Yes (Only Solaris and Linux)
1.6.0_00 Yes Yes Yes Yes
SAP
1.5.0_07 Yes Yes Yes Yes (Only Solaris and Linux)

System Dumps and Heap Dumps from IBM Virtual Machines

Memory Analyzer can read memory-related information from IBM system dumps and from Portable Heap Dump (PHD) files once the IBM DTFJ feature is installed. With the feature installed, File > Open Heap Dump should offer the following file types:

  • All known formats
  • HPROF binary heap dumps
  • IBM 1.4.2 SDFF
  • IBM Javadumps
  • IBM SDK for Java (J9) system dumps
  • IBM SDK for Java Portable Heap Dumps

For a comparison of dump types, see Debugging from dumps. System dumps are simply operating system core dumps; therefore, they are a superset of portable heap dumps. System dumps are far superior to PHDs: they provide more accurate GC roots, enable thread-based analysis, and, unlike PHDs, they contain memory contents, as HPROF dumps do. Older versions of IBM Java (e.g. < 5.0SR12, < 6.0SR9) require running jextract on the operating system core dump, which produces a zip file containing the core dump, an XML or SDFF file, and the shared libraries. The IBM DTFJ feature still supports reading these jextracted zips; however, newer versions of IBM Java do not require jextract for use in MAT, since DTFJ can directly read each supported operating system's core dump format. Simply ensure that the operating system core dump file ends with the .dmp suffix so that it is visible in the MAT Open Heap Dump file selection. It is also common to zip core dumps because they are so large and compress very well. If a core dump is compressed with .zip, the IBM DTFJ feature in MAT can decompress the ZIP file and read the core from inside (just like a jextracted zip). The only significant downsides of system dumps compared to PHDs are that they are much larger, they usually take longer to produce, they may be useless if taken manually in the middle of an exclusive event that manipulates the underlying Java heap (such as a garbage collection), and they sometimes require operating system configuration (Linux, AIX) to ensure they are not truncated.

In recent versions of IBM Java (> 6.0.1), by default, when an OutOfMemoryError is thrown, IBM Java produces a system dump, PHD, javacore, and Snap file on the first occurrence for that process (although the core dump is often suppressed by the default core ulimit of 0 on operating systems such as Linux). For the next three occurrences, it produces only a PHD, javacore, and Snap. If you plan to use only system dumps, and you have configured your operating system correctly as per the links above (particularly core and file ulimits), then you can disable PHD generation with -Xdump:heap:none. For versions of IBM Java older than 6.0.1, you can switch from PHDs to system dumps with the options -Xdump:system:events=systhrow,filter=java/lang/OutOfMemoryError,request=exclusive+prepwalk -Xdump:heap:none (see the example below).
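
For illustration, a launch line for an older IBM Java that switches from PHDs to system dumps on OutOfMemoryError could look like this (myapp.jar is a placeholder; the -Xdump options are exactly the ones given above):

   java -Xdump:system:events=systhrow,filter=java/lang/OutOfMemoryError,request=exclusive+prepwalk -Xdump:heap:none -jar myapp.jar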

In addition to an OutOfMemoryError, system dumps may be produced using operating system tools (e.g. gcore in gdb for Linux, gencore for AIX, Task Manager for Windows, SVCDUMP for z/OS, etc.), using the IBM Java APIs, using the various options of -Xdump, using Java Surgery, and more.
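
As a sketch of the IBM Java API route, the following snippet triggers a system dump from inside the application. It assumes an IBM J9-based SDK, where the com.ibm.jvm.Dump class is available; it will not compile or run on other JDKs.

   // Assumes an IBM J9-based JDK; com.ibm.jvm.Dump is not available elsewhere.
   import com.ibm.jvm.Dump;

   public class TriggerSystemDump {
       public static void main(String[] args) {
           // Writes an operating system core dump (system dump) of the running JVM,
           // which MAT can then open via the IBM DTFJ feature.
           Dump.SystemDump();
       }
   }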

Versions of IBM Java older than IBM JDK 1.4.2 SR12, 5.0 SR8a and 6.0 SR2 are known to produce inaccurate GC root information.

What if the Heap Dump is NOT Written on OutOfMemoryError?

Heap dumps are not written on OutOfMemoryError for the following reasons:

  • Application creates and throws OutOfMemoryError on its own
  • Another resource like threads per process is exhausted
  • C heap is exhausted

As for the C heap, the clearest sign that you won't get a heap dump is an OutOfMemoryError raised from C code (eArray.cpp in the example below):

   # An unexpected error has been detected by SAP Java Virtual Machine:
   # java.lang.OutOfMemoryError: requested 2048000 bytes for eArray.cpp:80: GrET*. Out of swap space or heap resource limit exceeded (check with limits or ulimit)?
   # Internal Error (\\...\hotspot\src\share\vm\memory\allocation.inline.hpp, 26), pid=6000, tid=468

C heap problems may arise for different reasons, e.g. out-of-swap-space situations, exhausted process limits, or address space limitations such as heavy fragmentation or outright depletion on machines with limited address space (e.g. 32-bit machines). The hs_err file provides more information on this type of error. A Java heap dump would not be of any help here anyway.

Also note that a heap dump is written only on the first OutOfMemoryError. If the application catches it and continues to run, a later OutOfMemoryError will never cause another heap dump to be written!
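
The following contrived sketch (run with -XX:+HeapDumpOnOutOfMemoryError) illustrates this behaviour: only the first OutOfMemoryError produces a dump; the second one, thrown after the first was caught, does not. The class name and allocation sizes are made up for illustration.

   import java.util.ArrayList;
   import java.util.List;

   public class CatchOomExample {
       public static void main(String[] args) {
           for (int attempt = 1; attempt <= 2; attempt++) {
               try {
                   List<byte[]> hog = new ArrayList<byte[]>();
                   while (true) {
                       hog.add(new byte[1024 * 1024]); // allocate until the Java heap is exhausted
                   }
               } catch (OutOfMemoryError e) {
                   // Only the first OOM triggers the heap dump; later ones are silent.
                   System.err.println("Caught OutOfMemoryError #" + attempt + ", continuing...");
               }
           }
       }
   }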

Extending Memory Analyzer

Memory Analyzer is extensible, so new queries and dump formats can be added. Please see MemoryAnalyzer/Extending_Memory_Analyzer for details.
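
As a rough idea of what a custom query can look like, here is a minimal, unverified sketch. The type and annotation names (IQuery, ISnapshot, @Argument, @CommandName, TextResult) are assumptions based on the MAT extension API; check the exact names and the required extension-point registration on MemoryAnalyzer/Extending_Memory_Analyzer before relying on them.

   import org.eclipse.mat.query.IQuery;
   import org.eclipse.mat.query.IResult;
   import org.eclipse.mat.query.annotations.Argument;
   import org.eclipse.mat.query.annotations.CommandName;
   import org.eclipse.mat.query.results.TextResult;
   import org.eclipse.mat.snapshot.ISnapshot;
   import org.eclipse.mat.util.IProgressListener;

   @CommandName("object_count")
   public class ObjectCountQuery implements IQuery {
       // The snapshot under analysis is injected by MAT.
       @Argument
       public ISnapshot snapshot;

       public IResult execute(IProgressListener listener) throws Exception {
           // A trivial result: report how many objects the snapshot contains.
           return new TextResult("The snapshot contains "
                   + snapshot.getSnapshotInfo().getNumberOfObjects() + " objects.");
       }
   }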
