Analyser 'losing' over half of the used memory [message #10425]
Mon, 18 May 2009 06:55  |
Eclipse User |
I am attempting to analyse a 3Gb hprof heap dump (using Eclipse 3.4.2, JVM
1.5.0_18 on 32-bit Windows XP) of a 7Gb Java process (1.5.0_18, 64-bit on
Solaris). I know from jConsole that the majority of this memory
consumption is distributed between the Eden and Old memory spaces.
However, when I run the Eclipse Memory Analyzer it reports that the heap
is only 2.5Gb in size.
When loading the HPROF file, the Eclipse log reports the following:
!ENTRY org.eclipse.mat.ui 1 0 2009-05-18 11:29:59.361
!MESSAGE Heap C:\Archive\javamem\heap-ws01-20090512.hprof contains 14,969,072 objects
!ENTRY org.eclipse.mat.ui 1 0 2009-05-18 11:38:25.134
!MESSAGE Too many suspect instances (133664). Will use 10000 random from them to search for common path.
As far as I can tell, the file is complete and without error.
How can I get an accurate analysis of this heap dump?
Re: Analyser 'losing' over half of the used memory [message #10620 is a reply to message #10555]
Wed, 20 May 2009 11:37  |
Eclipse User |
> If this is the case, would running this process on 64-bit solve it?
No. We have processed many heap dumps from 64-bit machines and have seen
ridiculously big objects; running MAT on 64-bit will not change anything.
Keep in mind that the VM itself does a garbage collection before writing
the heap dump, so heap dumps are usually slightly smaller than the live
process. But, of course, I understand you are talking about a much bigger
difference. Sorry.
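A side note on that GC-before-dump behaviour: on newer HotSpot JVMs (Java 6 and later, not the 1.5 VM discussed in this thread) the same distinction is exposed programmatically through `HotSpotDiagnosticMXBean.dumpHeap`, whose `live` flag decides whether a full GC runs first so that only reachable objects land in the file. A minimal sketch (class and file names are my own, not from this thread):

```java
// Editor's sketch: trigger an hprof dump from inside the JVM.
// Requires a HotSpot-based JDK (com.sun.management package).
import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;

public class DumpHeap {
    public static void main(String[] args) throws Exception {
        HotSpotDiagnosticMXBean mx = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        // live = true: run a full GC first and dump only reachable
        // objects, which is why such dumps come out smaller than the
        // process footprint. live = false would include unreachable
        // objects still sitting on the heap.
        mx.dumpHeap("heap-live.hprof", true);
    }
}
```

The equivalent from the command line is `jmap -dump:live,format=b,file=heap.hprof <pid>`; omitting `live` skips the collection.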
Andreas.