Large Heap Dump analysis - Facing issue [message #756565]
Mon, 14 November 2011 10:56
Eclipse User |
Hi,
I am trying to analyze an IBM heap dump (a production heap dump) of 474 MB, and calculating the precise retained sizes in the histogram is taking far too long - the progress estimate says 5-6 days to complete. I am using MAT through Eclipse and changed the -Xmx argument in the eclipse.ini configuration from 512 MB to 1100 MB, but that didn't help.
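For reference, the relevant lines at the end of my eclipse.ini now look roughly like this (other options omitted; the -Xmx setting goes after -vmargs):

    -vmargs
    -Xmx1100m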
My system details:
OS platform - Windows XP
1 GB RAM.
Please let me know how I can overcome this issue.
Thanks,
Re: Large Heap Dump analysis - Facing issue [message #757092 is a reply to message #756787]
Wed, 16 November 2011 13:26
Eclipse User |
Hi Soumya,
Large heap dumps can take some time to process in MAT; this is a known issue and, to some extent, unavoidable given the nature of the tool and its operation.
However, it seems possible that you are seeing a problem over and above normal processing.
To diagnose this further it will probably be necessary for us to have a copy of your PHD file so that we can reproduce the problem. Would this be possible?
If so, would you be able to make the dump available for download (e.g. via FTP) and send the download details to our public support email address, "javatool at uk.ibm.com"?
Thanks.
Re: Large Heap Dump analysis - Facing issue [message #757760 is a reply to message #756565]
Mon, 21 November 2011 14:22
Eclipse User |
Do you really need a precise retained size for every object?
Would it be enough to get the minimum retained size for everything, and then the precise retained size just for the objects of interest?
We could check the operation to see whether it goes back to the PHD file for field/array references - that would be slow.
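As a rough illustration of the difference (this is only a sketch - the class name and file path are placeholders, and the org.eclipse.mat.snapshot calls are written from memory rather than checked against your setup):

    import java.io.File;
    import java.util.Collection;

    import org.eclipse.mat.snapshot.ISnapshot;
    import org.eclipse.mat.snapshot.SnapshotFactory;
    import org.eclipse.mat.snapshot.model.IClass;
    import org.eclipse.mat.util.IProgressListener;
    import org.eclipse.mat.util.VoidProgressListener;

    public class RetainedSizeSketch {
        public static void main(String[] args) throws Exception {
            IProgressListener listener = new VoidProgressListener();
            // "heap.phd" is a placeholder path to the (already parsed) dump.
            ISnapshot snapshot = SnapshotFactory.openSnapshot(new File("heap.phd"), listener);
            try {
                // "com.example.Cache" is just an example class name.
                Collection<IClass> classes = snapshot.getClassesByName("com.example.Cache", false);
                if (classes == null)
                    return;
                for (IClass clazz : classes) {
                    int[] objectIds = clazz.getObjectIds();

                    // Quick lower-bound approximation - cheap enough to run for
                    // every class in the histogram.
                    long minRetained = snapshot.getMinRetainedSize(objectIds, listener);
                    System.out.println(clazz.getName() + " retains >= " + minRetained + " bytes");

                    // Precise retained size of the set of instances - this is the
                    // expensive calculation, so only run it for the few classes or
                    // objects that actually look interesting.
                    int[] retainedSet = snapshot.getRetainedSet(objectIds, listener);
                    long preciseRetained = snapshot.getHeapSize(retainedSet);
                    System.out.println(clazz.getName() + " retains " + preciseRetained + " bytes");
                }
            } finally {
                SnapshotFactory.dispose(snapshot);
            }
        }
    }

In the histogram UI the same choice shows up as the quick minimum retained size approximation versus the full precise retained size calculation.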
Andrew Johnson