Eclipse Community Forums
Home » Archived » Memory Analyzer (MAT) » Performing initial analysis/conversion using command line tool
Performing initial analysis/conversion using command line tool [message #3389] Tue, 26 August 2008 21:02
Taras Tielkes (Member, Messages: 38, Registered: July 2009)
I've got some large heap dumps that I cannot load when running EMA on Win32.

It would be nice if I could transfer the dump to a 64-bit *nix server,
run the initial parsing/conversion there, then transfer the index files
that EMA creates back to my local workstation.

Is such a scenario supported already (without using X, etc.)?
If not, is something like this on the roadmap?

Thanks,
-tt
Re: Performing initial analysis/conversion using command line tool [message #3459 is a reply to message #3389] Wed, 27 August 2008 07:22
Andreas Buchen (Senior Member, Messages: 123, Registered: July 2009)
> It would be nice if I could transfer the dump to a 64-bit *nix server,
> run the initial parsing/conversion there, then transfer the index files
> that EMA creates back to my local workstation.

That is supported. Transfer all the <dump_name>*.index files back and you
should be able to open the heap dump on your local workstation. Analyzing
a dump that is already parsed and indexed is much less memory-intensive.

To parse the heap dump without X, run the equivalent of the following
Windows command:

MemoryAnalyzer.exe -application org.eclipse.mat.api.parse <path/to/dump>
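
The round trip described above can be sketched as a dry-run script (the hostname "bigbox" and all paths are placeholders, not taken from this thread):

```shell
# Round trip: upload the dump, parse it headlessly on a 64-bit box,
# fetch the *.index files back for local analysis.
# Hostname "bigbox" and every path here are placeholders.
run() { echo "+ $*"; [ -n "$DRY_RUN" ] || "$@"; }  # print, then (maybe) execute
DRY_RUN=1  # remove this line to actually run the commands

run scp heap.hprof bigbox:/data/
run ssh bigbox '/opt/mat/MemoryAnalyzer -application org.eclipse.mat.api.parse /data/heap.hprof'
run scp 'bigbox:/data/heap*.index' .
```

With DRY_RUN set, the script only prints each command, which makes it easy to review the steps before pointing it at a real server.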
Re: Performing initial analysis/conversion using command line tool [message #3622 is a reply to message #3459] Wed, 27 August 2008 16:52
Taras Tielkes (Member, Messages: 38, Registered: July 2009)
Andreas Buchen wrote:
>> It would be nice if I could transfer the dump to a 64-bit *nix
>> server, run the initial parsing/conversion there, then transfer the
>> index files that EMA creates back to my local workstation.
>
> That is supported. Transfer all the <dump_name>*.index files back and
> you should be able to open the heap dump on your local workstation.
> Analyzing a dump that is already parsed and indexed is much less
> memory-intensive.
>
> To parse the heap dump without X, run the equivalent of the following
> Windows command:
>
> MemoryAnalyzer.exe -application org.eclipse.mat.api.parse <path/to/dump>

What would be the equivalent on a non-Windows platform?

At first I thought I could just add "-jar something.jar", but probably
something more elaborate is required? (I assume the application is
actually a group of OSGi bundles?)

1)
Given some random *nix machine with a 64-bit version of Java >= 1.5
installed, what parameters (-classpath?) would I need to launch the
indexing tool?

2)
When using the tool in such an "offline" mode, are there some general
rules for the amount of heap required to analyze a snapshot of some
size? For example, how much heap will I need to index a "typical" heap
dump of 8GB?

3)
In general, if I have created the initial index files using a 64-bit
environment, will I then be able to open a snapshot of any given size on
a 32-bit system for analysis?

In other words, will some (huge) snapshots require 64-bit anyway, even
in the presence of already generated index files?

Thanks,

-tt
Re: Performing initial analysis/conversion using command line tool [message #3688 is a reply to message #3622] Thu, 28 August 2008 07:47
Andreas Buchen (Senior Member, Messages: 123, Registered: July 2009)
Taras Tielkes wrote:
>> MemoryAnalyzer.exe -application org.eclipse.mat.api.parse <path/to/dump>
> What would be the equivalent on a non-Windows platform?

On my Ubuntu installation, I can run

./MemoryAnalyzer -application org.eclipse.mat.api.parse mydump.hprof

Or, if you want to do this by hand, one could say something like:

java -jar plugins/org.eclipse.equinox.launcher_1.0.100.v20080509-1800.jar -application org.eclipse.mat.api.parse mydump.hprof

Make sure to pick the right version number of the launcher JAR.
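
If the exact version string is a nuisance, a glob avoids hard-coding it. A minimal sketch, assuming the standard MAT layout with the Equinox launcher under plugins/ and that you run from the installation root:

```shell
# Pick the newest Equinox launcher JAR instead of hard-coding its version.
# Assumes the current directory is the MAT installation root.
launcher=$(ls plugins/org.eclipse.equinox.launcher_*.jar 2>/dev/null | sort | tail -n 1)
if [ -n "$launcher" ]; then
  java -jar "$launcher" -application org.eclipse.mat.api.parse mydump.hprof
else
  echo "no Equinox launcher JAR found under plugins/" >&2
fi
```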


> (I assume the application is actually a group of OSGI bundles?)

Yes. It should behave just like an Eclipse IDE.


> 2)
> When using the tool in such an "offline" mode, are there some general
> rules for the amount of heap required to analyze a snapshot of some
> size? For example, how much heap will I need to index a "typical" heap
> dump of 8GB?

The answer is... it depends. More specifically, it depends on the number
of objects (not the size of the heap dump) and the layout of the object
graph. That's why it is hard to predict what is possible. As a rule of
thumb, we can usually parse up to 33 million objects on a 32-bit machine.

As for the 8GB heap dump: if it contains few but big objects (e.g. huge
primitive arrays), 32-bit might be enough. In general, I hope that one
does not need more memory than the size of the heap dump. Sorry for being
so vague.
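
Since the headless parser is just a JVM process, its heap ceiling can be raised explicitly. A small sketch, assuming the standard Eclipse launcher behavior that everything after -vmargs goes to the JVM; the 4g figure is illustrative, not a number from this thread:

```shell
# Build the headless-parse command with an explicit JVM heap limit.
# Everything after -vmargs is passed to the JVM (Eclipse launcher convention).
mat_parse_cmd() {
  printf './MemoryAnalyzer -application org.eclipse.mat.api.parse %s -vmargs -Xmx%s\n' "$1" "$2"
}

mat_parse_cmd mydump.hprof 4g  # prints the command to run on the big machine
```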

> 3)
> In general, if I have created the initial index files using a 64-bit
> environment, will I then be able to open a snapshot of any given size on
> a 32-bit system for analysis?

> In other words, will some (huge) snapshots require 64-bit anyway, even
> in the presence of already generated index files?

Again, I don't have exact numbers. Usually, re-opening the dump on a
32-bit machine is not a problem. Of course, more memory speeds up a lot of
operations (calculating retained sizes, etc.), but it should be possible
to analyze the dump on 32-bit.


Andreas.
Re: Performing initial analysis/conversion using command line tool [message #4688 is a reply to message #3688] Fri, 05 September 2008 08:28
Taras Tielkes (Member, Messages: 38, Registered: July 2009)
Hi Andreas,

Can I pass multiple .hprof files as arguments, to instruct the tool to
index a number of snapshots "in batch"?

If not, this would be a valuable addition.

Taras
Re: Performing initial analysis/conversion using command line tool [message #4826 is a reply to message #4688] Mon, 08 September 2008 07:59
Andreas Buchen (Senior Member, Messages: 123, Registered: July 2009)
> Can I pass multiple .hprof files as arguments, to instruct the tool to
> index a number of snapshots "in batch"?

Currently, the arguments are <dump> (<report>)*: a single dump, optionally
followed by report names.
Sorry, for now you have to use OS functionality to parse multiple dumps.
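
Until batch support exists, a shell loop over the dumps covers this case. A sketch, assuming MemoryAnalyzer sits in the current directory (adjust the path to your install):

```shell
# Parse every .hprof in the current directory, one dump at a time.
# The ./MemoryAnalyzer location is an assumption; point it at your install.
for dump in ./*.hprof; do
  [ -e "$dump" ] || continue        # the glob matched nothing
  echo "parsing $dump"
  ./MemoryAnalyzer -application org.eclipse.mat.api.parse "$dump" ||
    echo "failed: $dump" >&2        # keep going even if one dump fails
done
```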