GC overhead limit exceeded problem [message #949064]
Thu, 18 October 2012 12:47
Eclipse User
Hi,
I have some memory problems (GC overhead limit exceeded) when I generate reports with large datasets (200k records). With small amounts of data it runs just fine. My reports contain cubes represented in cross tables; both are generated with the DE API.
I searched the forums and tried some of the suggestions, such as raising -XX:MaxPermSize to 256m or editing the DataEngine parameters, but I had no luck. These are the parameters I tried (using a RunAndRenderTask):
// Keep the data engine's memory buffer small (the intent was to push work to disk)
task.getAppContext().put(DataEngine.MEMORY_BUFFER_SIZE, 2);
// Ask the engine to favor low memory usage over speed
task.getAppContext().put(DataEngine.MEMORY_USAGE, DataEngine.MEMORY_USAGE_CONSERVATIVE);
// Limit in-memory data set caching
task.getAppContext().put(DataEngine.MEMORY_DATA_SET_CACHE, 0);
task.getAppContext().put(DataEngine.DATA_SET_CACHE_ROW_LIMIT, 100);
// Cap the size of cubes held in memory
task.getAppContext().put(DataEngine.IN_MEMORY_CUBE_SIZE, 10);
My goal was to make the engine use the disk instead of memory, but maybe I misunderstood these parameters.
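In case it helps, here is a stripped-down sketch of how I run the report. The design file name and output path are just placeholders, and error handling is omitted:

import org.eclipse.birt.core.framework.Platform;
import org.eclipse.birt.data.engine.api.DataEngine;
import org.eclipse.birt.report.engine.api.EngineConfig;
import org.eclipse.birt.report.engine.api.HTMLRenderOption;
import org.eclipse.birt.report.engine.api.IReportEngine;
import org.eclipse.birt.report.engine.api.IReportEngineFactory;
import org.eclipse.birt.report.engine.api.IReportRunnable;
import org.eclipse.birt.report.engine.api.IRunAndRenderTask;

public class ReportRunner {
    public static void main(String[] args) throws Exception {
        // Start the BIRT platform and create a report engine
        EngineConfig config = new EngineConfig();
        Platform.startup(config);
        IReportEngineFactory factory = (IReportEngineFactory) Platform
                .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
        IReportEngine engine = factory.createReportEngine(config);

        // "report.rptdesign" stands in for the real design file
        IReportRunnable design = engine.openReportDesign("report.rptdesign");
        IRunAndRenderTask task = engine.createRunAndRenderTask(design);

        // The memory-related settings shown above
        task.getAppContext().put(DataEngine.MEMORY_BUFFER_SIZE, 2);
        task.getAppContext().put(DataEngine.MEMORY_USAGE, DataEngine.MEMORY_USAGE_CONSERVATIVE);
        task.getAppContext().put(DataEngine.MEMORY_DATA_SET_CACHE, 0);
        task.getAppContext().put(DataEngine.DATA_SET_CACHE_ROW_LIMIT, 100);
        task.getAppContext().put(DataEngine.IN_MEMORY_CUBE_SIZE, 10);

        // Render to HTML; "output.html" is a placeholder
        HTMLRenderOption options = new HTMLRenderOption();
        options.setOutputFormat("html");
        options.setOutputFileName("output.html");
        task.setRenderOption(options);

        task.run();
        task.close();
        engine.destroy();
        Platform.shutdown();
    }
}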
Do you have any suggestions about what else I could try?
Thanks for your help,
David
Re: GC overhead limit exceeded problem [message #949737 is a reply to message #949393]
Fri, 19 October 2012 04:38
Eclipse User
Hi Jason,
Thanks for the answer. I would create a Bugzilla entry, but unfortunately I cannot provide an example rptdesign because it contains too much company information, and I don't have enough time at the moment to build an example with the sample database. I hoped you could give me some general advice; for now I will limit my datasets somehow, and later on I will create an example and open a Bugzilla entry.
Thanks again,
David