Persisting thousands of files [message #385455]
Wed, 11 February 2009 13:53
Andre Ribeiro
Hi all,
I'm implementing a use case that requires me to persist, update and
read thousands of database records (each containing a file in this case).
So far I've been using Spring + EclipseLink successfully with a few
hundred rows per transaction, but once I get into the thousands I run
into OutOfMemoryError: Java heap space exceptions, etc.
So, I would like to know if there is any existing (unit) test in
EclipseLink that persists, updates and fetches a large number (thousands
or hundreds of thousands) of rows.
Currently my code is roughly as follows:
@Service
public class FileImportingService {
    ...
    @Transactional(rollbackFor = Exception.class)
    public void insertFilesFromDisk(Map<String, MyFile> discPaths_files,
            IProgressMonitor monitor) {
        for (String discPathString : discPaths_files.keySet()) {
            // Read each file from disk and persist it as a MyFile entity.
            MyFile myFile = read_file_from_disc(discPathString);
            getJpaTemplate().persist(myFile);
            monitor.worked(1);
        }
    }
    ...
}
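For context, here is a rough sketch of the kind of chunked flush-and-clear variant I had in mind, in case that is the recommended direction. Just a sketch, not code I have running: the direct EntityManager injection, the FLUSH_BATCH_SIZE value and the readFileFromDisk helper are placeholders.

import java.util.Map;

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

import org.eclipse.core.runtime.IProgressMonitor;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class FileImportingService {

    // Placeholder batch size; would need tuning against the available heap.
    private static final int FLUSH_BATCH_SIZE = 500;

    @PersistenceContext
    private EntityManager entityManager;

    @Transactional(rollbackFor = Exception.class)
    public void insertFilesFromDisk(Map<String, MyFile> discPathsToFiles,
            IProgressMonitor monitor) {
        int count = 0;
        for (String discPath : discPathsToFiles.keySet()) {
            MyFile myFile = readFileFromDisk(discPath);
            entityManager.persist(myFile);
            monitor.worked(1);

            // Periodically push pending inserts to the database and detach
            // the managed entities so the persistence context (and heap use)
            // does not grow without bound within the single transaction.
            if (++count % FLUSH_BATCH_SIZE == 0) {
                entityManager.flush();
                entityManager.clear();
            }
        }
    }

    private MyFile readFileFromDisk(String discPath) {
        // Placeholder: the real implementation loads the file content from disk.
        throw new UnsupportedOperationException("not shown here");
    }
}

Would periodically calling flush() and clear() like this be the expected way to keep memory bounded with EclipseLink, or is there a better approach?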
Thanks in advance for any ideas and suggestions.
Cheers,
André Ribeiro