In our application we have a requirement to insert huge amounts of data into a Sybase database (up to 5 lakh, i.e. 500,000, records). We are using JPA with EclipseLink.
While optimizing the code to improve performance I came across batch writing.
Persisting 10k records as a batch takes 3.9 seconds on a MySQL server, but the same 10k records take around 45 seconds on Sybase when using batch writing.
I tried both options for configuring batch writing in persistence.xml, i.e. "JDBC" and "Buffered". "JDBC" gives better performance than "Buffered" when persisting to the MySQL server, but with Sybase the timing stays the same either way.
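For reference, the relevant part of my persistence.xml looks roughly like this (the unit and provider names are placeholders for my setup; the `eclipselink.jdbc.batch-writing` and `eclipselink.jdbc.batch-writing.size` properties are the standard EclipseLink ones):

```xml
<persistence-unit name="myUnit" transaction-type="RESOURCE_LOCAL">
  <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
  <properties>
    <!-- "JDBC" uses the driver's parametrized batch API; "Buffered" chains statements into one string -->
    <property name="eclipselink.jdbc.batch-writing" value="JDBC"/>
    <!-- maximum number of statements grouped into a single batch -->
    <property name="eclipselink.jdbc.batch-writing.size" value="1000"/>
  </properties>
</persistence-unit>
```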
Please help me figure out how to optimize or tune this so that Sybase performs comparably to MySQL (from 45 seconds down to around 4 seconds).
JDBC batch writing requires that the JDBC driver support parametrized batch writing. Some JDBC drivers do not support this, so the optimization will not be available. Check your JDBC driver's documentation to see what it supports, and see whether other drivers are available for Sybase.
If parametrized batch writing is not supported, then you could try dynamic batch writing. To enable this, set "eclipselink.jdbc.bind-parameters" to "false" and set batch writing to either "JDBC" or "Buffered".
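A minimal sketch of that configuration in persistence.xml (using the standard EclipseLink property names; with binding disabled, EclipseLink emits the SQL with literal values, which some drivers can batch even without parametrized batch support):

```xml
<properties>
  <!-- dynamic batch writing: literal SQL, no parameter binding -->
  <property name="eclipselink.jdbc.batch-writing" value="JDBC"/>
  <property name="eclipselink.jdbc.bind-parameters" value="false"/>
</properties>
```

Note that disabling parameter binding has its own cost (no prepared-statement reuse), so it is worth benchmarking both settings against your actual Sybase driver.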