GC overhead limit exceeded problem [message #949064]
Thu, 18 October 2012 16:47
David Vinicz
Messages: 13
Registered: July 2012
Junior Member
Hi,
I have memory problems (GC overhead limit exceeded) when I generate reports with large datasets (200k records). With small amounts of data it runs just fine. My reports contain cubes represented in crosstabs, and both are generated with the DE API.
I searched the forums and tried some of the suggestions, such as raising MaxPermSize to 256 MB and adjusting the DataEngine parameters, but had no luck. I tried these parameters (using a RunAndRenderTask):
task.getAppContext().put(DataEngine.MEMORY_BUFFER_SIZE,2);
task.getAppContext().put(DataEngine.MEMORY_USAGE, DataEngine.MEMORY_USAGE_CONSERVATIVE);
task.getAppContext().put(DataEngine.MEMORY_DATA_SET_CACHE, 0);
task.getAppContext().put(DataEngine.DATA_SET_CACHE_ROW_LIMIT, 100);
task.getAppContext().put(DataEngine.IN_MEMORY_CUBE_SIZE, 10);
My intent was to make the engine use the disk instead of memory, but maybe I misunderstood these parameters.
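For context, my task setup roughly follows the standard standalone report engine pattern; the design file path, output file and HTML render option below are just placeholders, not my real values:

import org.eclipse.birt.core.framework.Platform;
import org.eclipse.birt.data.engine.api.DataEngine;
import org.eclipse.birt.report.engine.api.EngineConfig;
import org.eclipse.birt.report.engine.api.HTMLRenderOption;
import org.eclipse.birt.report.engine.api.IReportEngine;
import org.eclipse.birt.report.engine.api.IReportEngineFactory;
import org.eclipse.birt.report.engine.api.IReportRunnable;
import org.eclipse.birt.report.engine.api.IRunAndRenderTask;

public class ReportRunner {
    public static void main(String[] args) throws Exception {
        // Start the standalone BIRT platform and create the report engine.
        EngineConfig config = new EngineConfig();
        Platform.startup(config);
        IReportEngineFactory factory = (IReportEngineFactory) Platform
                .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
        IReportEngine engine = factory.createReportEngine(config);

        // "report.rptdesign" and "report.html" are placeholder paths.
        IReportRunnable design = engine.openReportDesign("report.rptdesign");
        IRunAndRenderTask task = engine.createRunAndRenderTask(design);

        // Hint the data engine to prefer disk over memory for large result sets.
        task.getAppContext().put(DataEngine.MEMORY_USAGE, DataEngine.MEMORY_USAGE_CONSERVATIVE);
        task.getAppContext().put(DataEngine.MEMORY_BUFFER_SIZE, 2);
        task.getAppContext().put(DataEngine.MEMORY_DATA_SET_CACHE, 0);
        task.getAppContext().put(DataEngine.DATA_SET_CACHE_ROW_LIMIT, 100);
        task.getAppContext().put(DataEngine.IN_MEMORY_CUBE_SIZE, 10);

        // Render to HTML as an example; the output format does not matter here.
        HTMLRenderOption options = new HTMLRenderOption();
        options.setOutputFileName("report.html");
        task.setRenderOption(options);

        task.run();
        task.close();
        engine.destroy();
        Platform.shutdown();
    }
}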
Do you have any suggestions on what else I could try?
Thanks for your help,
David