Any insights or suggestions? [message #1863931]
Thu, 07 March 2024 01:42
Eclipse User
Hello. When I attempt to build more than 1,500 files, the process crashes with an Out of Memory error. Despite adding the flags -d64 and -Xmx6g to increase the JVM's memory, only 2 GB of memory is reserved at the peak. It seems there might be another issue, perhaps related to garbage collection. Could it be that garbage collection only kicks in after processing 1,500 files? Additionally, I currently build each file separately; is there a way to build everything at once and obtain a large number of CompilationUnits? Any insights or suggestions would be greatly appreciated.
Re: Any insights or suggestions? [message #1863949 is a reply to message #1863931]
Thu, 07 March 2024 11:26
Eclipse User
If only 2 GB are used, the most likely explanation is that the arguments are not being passed correctly to the JVM that crashes. If you're using the JDT inside Eclipse, ensure that -Xmx6g in the eclipse.ini file appears on a line after -vmargs.
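For reference, a minimal eclipse.ini fragment with the heap setting in the right place might look like the following (the other lines are placeholders; your actual eclipse.ini will contain different entries, and the key point is only that -Xmx6g comes after -vmargs):

```
-vmargs
-Xms512m
-Xmx6g
```

Anything placed before -vmargs is interpreted as an Eclipse launcher argument rather than a JVM argument, which is a common reason heap settings silently fail to take effect.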
Does the tool that you're using to check peak memory give you access to the JVM command line or the maximum heap size?
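If the external tool doesn't show the JVM command line, you can check from inside the process itself what heap limit actually took effect by querying the standard Runtime API (plain Java, no extra dependencies):

```java
// HeapCheck.java: print the maximum heap size the running JVM will allow.
// If -Xmx6g reached the JVM, this should report roughly 6144 MiB.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MiB");
    }
}
```

Running this (or the same two lines pasted into your build code) tells you immediately whether the -Xmx setting is being applied.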
The JVM determines when garbage collection runs. Before the JVM fails with an OutOfMemoryError (OOME), it will run a full garbage collection, so the error genuinely means the reachable objects no longer fit in the heap.
Does the OOME contain an explanation message?
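On the second question (building everything at once): assuming you are using the JDT Core AST APIs, ASTParser.createASTs lets you parse a batch of files in one call and hands each CompilationUnit to a FileASTRequestor callback as it is produced, which is usually more memory-friendly than accumulating all units in a collection. A sketch, assuming org.eclipse.jdt.core is on the classpath and that sourcePaths holds your file paths (the environment arguments to setEnvironment would need to be adjusted if you want resolved bindings):

```java
import org.eclipse.jdt.core.JavaCore;
import org.eclipse.jdt.core.dom.AST;
import org.eclipse.jdt.core.dom.ASTParser;
import org.eclipse.jdt.core.dom.CompilationUnit;
import org.eclipse.jdt.core.dom.FileASTRequestor;

public class BatchParse {
    static void parseAll(String[] sourcePaths) {
        ASTParser parser = ASTParser.newParser(AST.getJLSLatest());
        parser.setKind(ASTParser.K_COMPILATION_UNIT);
        parser.setCompilerOptions(JavaCore.getOptions());
        // Minimal environment; pass real classpath/sourcepath entries for binding resolution.
        parser.setEnvironment(null, null, null, true);
        parser.createASTs(sourcePaths, null, new String[0], new FileASTRequestor() {
            @Override
            public void acceptAST(String sourceFilePath, CompilationUnit ast) {
                // Process each CompilationUnit here, then drop the reference so
                // it can be garbage collected before the next unit arrives.
                System.out.println(sourceFilePath + ": "
                        + ast.getProblems().length + " problems");
            }
        }, null);
    }
}
```

The design point is that the requestor style lets you process and release each unit in turn; if instead you store every CompilationUnit in a list, memory use will grow with the number of files regardless of the heap settings.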