[CDO] Committing large amounts of data as a single commit [message #1803666]
Wed, 06 March 2019 13:19
Robert Schulk (Messages: 144, Registered: July 2015)
Senior Member
In order to keep a persistent state of a model, we would like to apply a huge change in a single commit.
This works, but I have some questions about the processing:
1. The actual commit itself seems to be quite fast: other clients see the changes, and the locks on the elements are released. Nevertheless, the committing client keeps blocking for a long time after the other clients can already see the new state. During this time, only the server shows CPU load. Is this due to writing to the database backend?
2. CDOTransaction#options()#getCommitInfoTimeout() returns 10000 ms, but the commit takes much longer. Should I increase this value?
3. CDOSession#options()#getCommitTimeout() is set to 10 seconds, but the commit takes much longer. Should I increase this value?
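For reference, this is roughly how I would raise both timeouts before the big commit. It is only a sketch: the setter names (`setCommitTimeout`, `setCommitInfoTimeout`) are my assumption, mirroring the getters quoted above, and the units follow the values they return (seconds for the session commit timeout, milliseconds for the commit-info timeout).

```java
// Sketch only: setter names and units are assumptions mirroring the
// getters mentioned above (getCommitTimeout / getCommitInfoTimeout).
import org.eclipse.emf.cdo.session.CDOSession;
import org.eclipse.emf.cdo.transaction.CDOTransaction;

public class BigCommit
{
  public void commitLargeChange(CDOSession session) throws Exception
  {
    // Assumption: value is in seconds, like getCommitTimeout().
    session.options().setCommitTimeout(600); // 10 minutes instead of 10 s

    CDOTransaction transaction = session.openTransaction();

    // Assumption: value is in milliseconds, like getCommitInfoTimeout().
    transaction.options().setCommitInfoTimeout(600_000L);

    // ... apply the huge model change on the transaction's resources ...

    transaction.commit(); // blocks until the server has processed the commit
    transaction.close();
  }
}
```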