Memory crash after using output() repeatedly on large datasets [message #1785137]
Mon, 09 April 2018 23:09
Vincenzo Caselli
Registered: January 2012
During training, when the overall error drops below a given threshold (e.g. 0.1), I periodically test results by calling output() on around 100 test images, in order to get a custom evaluation of the network's training.
However, after a few such testing iterations the memory slowly starts to rise, until the JVM crashes without any error message in the Java console.
I double-checked how I instantiate new Java objects in this phase and cleaned up every allocation, until I now have a static array of INDArrays (one per test image) and a corresponding static array of Strings for the labels: they are prepared once in advance, and during training they are only passed to the output() method.
Nevertheless, the memory still increases as soon as I start using output().
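For reference, the testing phase looks roughly like this (a minimal sketch: the variable and method names such as testImages, testLabels, and evaluateTestSet are placeholders, not my actual code; output(input, false) and Nd4j.argMax are standard DL4J/ND4J API calls):

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class PeriodicEval {
    // Prepared once in advance, before training starts:
    static INDArray[] testImages;   // one INDArray per test image
    static String[] testLabels;     // corresponding ground-truth labels

    // Called periodically during training, once the error drops below the threshold:
    static void evaluateTestSet(MultiLayerNetwork model) {
        int correct = 0;
        for (int i = 0; i < testImages.length; i++) {
            // false = inference mode (no dropout etc.); this is the call that leaks
            INDArray out = model.output(testImages[i], false);
            int predicted = Nd4j.argMax(out, 1).getInt(0);
            if (testLabels[i].equals(String.valueOf(predicted))) {
                correct++;
            }
        }
        System.out.println("Accuracy: " + (double) correct / testImages.length);
    }
}
```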
Please note that the number of training inputs is quite high (over 300,000) and the network is a CNN (very similar to the MNIST CNN example included in the DL4J examples). Overall it behaves wonderfully, as long as I do not use the output() method.
I experienced this behavior both on Linux CentOS 7.4 and on Ubuntu 16.04. The DL4J version is 0.9.1; 16 GB of RAM plus 64 GB of swap are available. The memory options at launch are
-XX:+UseSerialGC -Xmx20G -Dorg.bytedeco.javacpp.maxbytes=16G -Dorg.bytedeco.javacpp.maxphysicalbytes=16G
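For completeness, the full launch command looks roughly like this (the jar name and main class are placeholders; the flags are exactly the ones listed above, with -Xmx controlling the JVM heap and the javacpp properties limiting ND4J's off-heap allocations):

```shell
java -XX:+UseSerialGC -Xmx20G \
     -Dorg.bytedeco.javacpp.maxbytes=16G \
     -Dorg.bytedeco.javacpp.maxphysicalbytes=16G \
     -cp myapp.jar com.example.TrainCnn
```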
I could of course run this periodic testing session in a separate JVM instance, so as to avoid crashing the training instance, but I am still wondering what is causing the memory problem.
Could the output() method be affected by some form of memory leak, or is there any cleanup I could do after each usage? (I have already tried calling .cleanup() and also .detach() on the output, but without success.)
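Concretely, the cleanup attempts mentioned above were along these lines (a sketch of what I tried, inside the test loop; both methods exist on INDArray in 0.9.1, .detach() being relevant only when workspaces are in use):

```java
INDArray out = model.output(testImages[i], false);
// ... read predictions from out ...
out.detach();   // attempted: detach the array from any workspace
out.cleanup();  // attempted: release the array's buffer explicitly
```

Neither call made any visible difference to the memory growth.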
Thank you very much