- 1 Frequently Asked Questions
- 1.1 Problems Starting the Memory Analyzer
- 1.2 Problems Getting Heap Dumps
- 1.2.1 Error: Found instance segment but expected class segment
- 1.2.2 Error: Invalid heap dump file. Unsupported segment type 0 at position XZY
- 1.2.3 Parser found N HPROF dumps in file X. Using dump index 0. See FAQ.
- 1.2.4 OutOfMemoryError: Requested length of new long[xxxx] exceeds limit of 2,147,483,639
- 1.3 Enable Debug Output
- 1.4 Problems Interpreting Results
- 1.5 How to analyse unreachable objects
- 1.6 Crashes on Linux
- 1.7 Extending Memory Analyzer
Frequently Asked Questions
Problems Starting the Memory Analyzer
java.lang.RuntimeException: No application id has been found.
Memory Analyzer 1.12 and later needs a Java 11 or later VM to run. Memory Analyzer 1.8 to 1.11 needs a Java 1.8 or later VM to run (of course, heap dumps from JDK 1.4.2_12 onwards are still supported). If in doubt, provide the runtime VM on the command line:
MemoryAnalyzer.exe -vm <path/to/java8/bin>
Alternatively, edit the MemoryAnalyzer.ini to contain (on two lines):
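A sketch of the two lines in question, reusing the path placeholder from the command above (substitute the bin directory of your actual JVM installation):

```
-vm
<path/to/java8/bin>
```

Note that in an Eclipse-style .ini file the -vm option and its path must be on separate lines.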
(This error happens because the MAT plug-in requires a Java 1.8 runtime via its MANIFEST.MF file, and the OSGi runtime dutifully does not activate the plug-in.) Memory Analyzer version 1.1 and later give a better error message pop-up.
Version 1.4.2 of the JVM is not suitable for this product. Version 1.5.0 or greater is required.
Out of Memory Error while Running the Memory Analyzer
Analyzing big heap dumps can itself require more heap space. Give the Memory Analyzer more memory (possibly by running it on a 64-bit machine):
MemoryAnalyzer.exe -vmargs -Xmx4g -XX:-UseGCOverheadLimit
Alternatively, edit the MemoryAnalyzer.ini to contain:
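A sketch of the corresponding MemoryAnalyzer.ini entries, mirroring the command line above (in an Eclipse-style .ini file, -vmargs and the JVM arguments after it must come last):

```
-vmargs
-Xmx4g
-XX:-UseGCOverheadLimit
```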
As a rough guide, Memory Analyzer itself needs 32 to 64 bytes for each object in the analyzed heap, so -Xmx2g might allow a heap dump containing 30 to 60 million objects to be analyzed. Memory Analyzer 1.3 using -Xmx58g has successfully analyzed a heap dump containing over 948 million objects.
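The rough guide above can be checked with a quick back-of-the-envelope calculation (a sketch; the 32 to 64 bytes-per-object figure is the estimate quoted above, not an exact cost):

```shell
# Estimate how many heap-dump objects fit in a given MAT heap size,
# assuming MAT needs 32 to 64 bytes of its own heap per analyzed object.
heap_bytes=$((2 * 1024 * 1024 * 1024))        # -Xmx2g
echo "pessimistic: $((heap_bytes / 64)) objects"
echo "optimistic:  $((heap_bytes / 32)) objects"
```

This prints roughly 33.5 million and 67 million objects, in line with the 30 to 60 million quoted above.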
The initial parse and generation of the dominator tree uses the most memory, so it can be useful to do the initial parse on a large machine, then copy the heap dump and index files to a more convenient machine for further analysis.
For more details, check out the section Running Eclipse in the Help Center. It also contains more details if you are running on Mac OS X.
If you are running the Memory Analyzer inside your Eclipse SDK, you need to edit the eclipse.ini file instead.
How to run on a 64-bit VM while the native SWT libraries are 32-bit
In short: if you run a 64-bit VM, then all native parts must also be 64-bit. But what if - like Motif on AIX - the native SWT libraries are only available as a 32-bit version? You can still run the command-line parsing on a 64-bit VM by executing the following command:
/usr/java5_64/jre/bin/java -jar plugins/org.eclipse.equinox.launcher_1*.jar -consoleLog -application org.eclipse.mat.api.parse path/to/dump.dmp.zip org.eclipse.mat.api:suspects org.eclipse.mat.api:overview org.eclipse.mat.api:top_components
Alternatively, the latest version of Memory Analyzer ships with a ParseHeapDump.sh script, which relies on java being on the path.
#!/bin/sh
#
# This script parses a heap dump.
# Adjust the path to java, version 5 or later, and the heap size as required.
# Suitable for 64-bit and 32-bit Java, but a 64-bit Java is required
# for larger heap sizes.
#
# Usage: ParseHeapDump.sh <path/to/dump.dmp.zip> [report]*
#
# The leak report has the id org.eclipse.mat.api:suspects
# The top component report has the id org.eclipse.mat.api:top_components
#
java -Xmx3072M -jar "`dirname "$0"`"/plugins/org.eclipse.equinox.launcher_1*.jar -consoleLog -application org.eclipse.mat.api.parse "$@"
- plugins/org.eclipse.equinox.launcher_1*.jar finds a version of the Equinox launcher available in your installation without having to specify the exact name of the launcher file, as this version changes regularly.
- The org.eclipse.mat.api:suspects argument creates a ZIP file containing the leak suspect report. This argument is optional.
- The org.eclipse.mat.api:overview argument creates a ZIP file containing the overview report. This argument is optional.
- The org.eclipse.mat.api:top_components argument creates a ZIP file containing the top components report. This argument is optional.
With Memory Analyzer 0.8, but not Memory Analyzer 1.0 or later, the IBM DTFJ adapter has to be initialized in advance. To parse IBM dumps with the IBM DTFJ adapter, Memory Analyzer 0.8 should use this command:
/usr/java5_64/jre/bin/java -Dosgi.bundles=org.eclipse.mat.dtfj@4:start,org.eclipse.equinox.common@2:start,org.eclipse.update.configurator@3:start,org.eclipse.core.runtime@start -jar plugins/org.eclipse.equinox.launcher_*.jar -consoleLog -application org.eclipse.mat.api.parse path/to/mydump.dmp.zip org.eclipse.mat.api:suspects org.eclipse.mat.api:overview org.eclipse.mat.api:top_components
Problems Getting Heap Dumps
Error: Found instance segment but expected class segment
This error indicates an inconsistent heap dump: the data in the heap dump is written in various segments. In this case, an address expected in a class segment was written into an instance segment.
The problem has been reported for heap dumps generated by jmap on Linux and Solaris operating systems with JDK 1.5.0_13 and below. Solution: use the latest JDK/jmap version, or use jconsole to write the heap dump (requires JDK 6).
Error: Invalid heap dump file. Unsupported segment type 0 at position XZY
This almost always means the heap dump has not been written properly by the virtual machine, and the Memory Analyzer is not able to read it.
Parser found N HPROF dumps in file X. Using dump index 0. See FAQ.
This warning message is printed to the log file if the heap dump was written by the (obsolete and unstable) HPROF agent. The agent can write multiple heap dumps into one HPROF file. Memory Analyzer 1.2 and earlier has no UI support for choosing which heap dump to read; by default, MAT takes the first one. If you want to read an alternative dump, start MAT with the system property MAT_HPROF_DUMP_NR=<index>.
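For example, to select the second dump in the file (index 1) - a sketch assuming the property is passed to the VM as a regular system property via -vmargs:

```
MemoryAnalyzer -vmargs -DMAT_HPROF_DUMP_NR=1
```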
Memory Analyzer 1.3 provides a dialog for the user to select the appropriate dump.
OutOfMemoryError: Requested length of new long[xxxx] exceeds limit of 2,147,483,639
Eclipse MAT currently only supports heap dumps with up to ~2 billion objects, as it uses Java arrays internally when processing the file (and Java arrays are limited to 2^31 entries).
To work around this, Eclipse MAT supports a setting to "discard" a percentage of objects so that only a fraction of the objects is loaded.
To configure it, enable object discard under Window > Preferences > Memory Analyzer > Enable discard. You should then be able to open the file with Eclipse MAT.
This is useful because it allows loading heap dumps with very many objects. However, the discarded objects take some links and object references out of the processing, so the results will be less accurate.
See MAT Configuration help page for more information.
Enable Debug Output
To show debug output of MAT:
1. Create or append to the file ".options" in the eclipse main directory the lines:
org.eclipse.mat.parser/debug=true
org.eclipse.mat.report/debug=true
org.eclipse.mat.dtfj/debug=true
org.eclipse.mat.dtfj/debug/verbose=true
org.eclipse.mat.hprof/debug=true
org.eclipse.mat.hprof/debug/parser=true
Edit this file to remove some lines if you are not interested in output from a particular plug-in.
On macOS, this file should be placed in *.app/Contents/MacOS/.options
2. Start eclipse with the -debug option. This can be done by appending -debug to the eclipse.ini file in the same directory as the .options file.
3. Be sure to also enable the -consoleLog option to actually see the output.
4. To enable debug output for the stand-alone Memory Analyzer, create the .options file in the MAT directory and start the Memory Analyzer using MemoryAnalyzer -debug -consoleLog
See the Eclipse FAQ entry "How do I use the platform debug tracing facility" for a general explanation of how the debug trace works in Eclipse.
Problems Interpreting Results
MAT Does Not Show the Complete Heap
Symptom: When monitoring the memory usage interactively, the used heap size is much bigger than what MAT reports.
During the index creation, the Memory Analyzer removes unreachable objects because the various garbage collector algorithms tend to leave some garbage behind (if the object is too small, moving it and re-assigning addresses is too expensive). This should, however, be no more than 3 to 4 percent. If you want to know which objects are removed, enable debug output as explained in the Enable Debug Output section above.
Another reason could be that the heap dump was not written properly. Especially older VMs (1.4, 1.5) can have problems if the heap dump is written via jmap.
Otherwise, feel free to report a bug.
How to analyse unreachable objects
By default, unreachable objects are removed from the heap dump while parsing and will not appear in the class histogram, dominator tree, etc. Yet it is possible to open a histogram of unreachable objects. You can do this:
1. From the link on the Overview page
2. From the Query Browser via Java Basics --> Unreachable Objects Histogram
This histogram has no object graph behind it (unreachable objects are removed during the parsing of the heap dump; only class names are stored). Thus it is not possible to see, e.g., a list of references for a particular unreachable object.
But it is possible to keep unreachable objects while parsing. For this you need to either:
- parse the heap dump from the command line providing the argument -keep_unreachable_objects, i.e.
ParseHeapDump.bat -keep_unreachable_objects <heap dump>
- set the preference using 'Window' > 'Preferences' > 'Memory Analyzer' > 'Keep Unreachable Objects', then parse the dump. Memory Analyzer version 1.1 and later has this preference page option.
Crashes on Linux
Depending on the type of crash, consider testing with one or more of these options in MemoryAnalyzer.ini:
- Normally you must first install your distribution's xulrunner-compat package
Extending Memory Analyzer
Is it possible to extend the Memory Analyzer to analyze the memory consumption of C or C++ programs?
No, this is not possible. The design of the Memory Analyzer is specific to Java heap dumps.