Eclipse Community Forums
Archived » Memory Analyzer (MAT)
Analyser 'losing' over half of the used memory [message #10425] Mon, 18 May 2009 10:55
Matthew Seaborn
Messages: 3
Registered: July 2009
Junior Member
I am attempting to analyse a 3 GB HPROF heap dump (using Eclipse 3.4.2, JVM
1.5.0_18 on 32-bit Windows XP) of a 7 GB Java process (1.5.0_18, 64-bit on
Solaris). I know from JConsole that the majority of this memory
consumption is distributed between the Eden and Old memory spaces.
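(For reference, a dump like this is typically captured with the JDK's jmap tool; the exact syntax is version-dependent, and the process id below is a placeholder:)

```shell
# JDK 6+ syntax; on JDK 5 the equivalent was "jmap -heap:format=b <pid>",
# which wrote a fixed file name (heap.bin) in the current directory.
jmap -dump:format=b,file=heap-ws01.hprof <pid>
```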

However, when I run the Eclipse Memory Analyzer it reports that the heap
is only 2.5 GB in size.

When loading the HPROF file, the Eclipse log reports the following:

!ENTRY org.eclipse.mat.ui 1 0 2009-05-18 11:29:59.361
!MESSAGE Heap C:\Archive\javamem\heap-ws01-20090512.hprof contains
14,969,072 objects

!ENTRY org.eclipse.mat.ui 1 0 2009-05-18 11:38:25.134
!MESSAGE Too many suspect instances (133664). Will use 10000 random from
them to search for common path.

As far as I can tell the file is complete and without error.

How can I get an accurate analysis of this heap dump?
Re: Analyser 'losing' over half of the used memory [message #10490 is a reply to message #10425] Tue, 19 May 2009 08:17
Andreas Buchen
Messages: 123
Registered: July 2009
Senior Member
Hi Matthew,

Both messages in the log are only informational. The first reports the
number of objects found during parsing; the second means that the leak
suspect report uses sampling.

What does the Memory Analyzer do? Well, it removes the objects which are
not reachable. Usually those comprise only a minor percentage and are
leftovers of optimizations done by the garbage collector.

You have two possibilities (described in a little more detail on the FAQ
page): either look at the histogram of unreachable objects, or (during the
initial parsing) let MAT mark those objects as pseudo GC roots and
thereby prevent their removal.

http://wiki.eclipse.org/MemoryAnalyzer/FAQ#How_to_analyse_unreachable_objects
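(The second option can also be enabled when parsing from the command line; a sketch, assuming MAT's standalone parser script is available and reusing the dump file name from the log above:)

```shell
# Parse the dump and keep unreachable objects as pseudo GC roots
# so they survive into the histogram (ParseHeapDump.bat on Windows).
./ParseHeapDump.sh heap-ws01-20090512.hprof -keep_unreachable_objects
```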

What is the number of objects removed versus the number of objects kept
alive in your heap dump? The first you find via "Java Basics" ->
"Unreachable Histogram"; the second is the total number of objects in the
default histogram.


Andreas.
Re: Analyser 'losing' over half of the used memory [message #10523 is a reply to message #10490] Tue, 19 May 2009 16:16
Matthew Seaborn
> You have two possibilities (described in a little more detail on the FAQ
> page): Either look at the histogram of unreachable objects or (during the
> initial parsing) let MAT mark those objects as pseudo GC roots and
> therefore prevent their removal.
>
> http://wiki.eclipse.org/MemoryAnalyzer/FAQ#How_to_analyse_unreachable_objects

I ran the parser with keep_unreachable_objects and the reported heap size
rose to 2.8 GB, still nowhere near what it actually was at the time the
heap dump was taken.
Re: Analyser 'losing' over half of the used memory [message #10555 is a reply to message #10490] Tue, 19 May 2009 16:48
Matthew Seaborn
Is this a result of the fact that I am running on a 32-bit OS using a
32-bit JVM, and so have hit the limit of 32-bit numbers (2,147,483,648)?
Would it have problems if the retained heap of an object exceeded this?
The object I most expected to be a memory hog cannot be found in the trace
at all.

If this is the case, would running this process on 64-bit solve it?
Re: Analyser 'losing' over half of the used memory [message #10620 is a reply to message #10555] Wed, 20 May 2009 15:37
Andreas Buchen
> If this is the case, would running this process on 64-bit solve it?

No. We have processed many heap dumps from 64-bit machines and have seen
ridiculously big objects; running MAT on 64-bit will not change anything.


Keep in mind that the VM itself does a garbage collection before writing
the heap dump, so heap dumps are usually slightly smaller - but, of
course, I understand you are talking about much bigger differences. Sorry.
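(One practical aside, an assumption not stated in this thread: while a 64-bit MAT will not change the reported heap size, it does let the analyzer itself use more memory when parsing a dump of this size, which a 32-bit workstation may struggle with. As with any Eclipse-based application, MAT's own heap can be raised at launch, e.g.:)

```shell
# Give MAT a 4 GB heap of its own (hypothetical size; the same setting
# can be made permanent in the MemoryAnalyzer.ini file).
./MemoryAnalyzer -vmargs -Xmx4g
```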

Andreas.