Persisting thousands of files [message #385455]
Wed, 11 February 2009 08:53
Eclipse User |
Hi all,
I'm implementing a use case that requires me to persist, update and
read thousands of database records (each containing a file in this case).
So far I've been using Spring + EclipseLink successfully with a few
hundred rows per transaction, but once I get to the magnitude of
thousands I get OutOfMemoryError (Java heap space) exceptions, etc.
So, I would like to know if there is any existing (unit) test in
EclipseLink that persists, updates and reads a large number (thousands or
hundreds of thousands) of rows.
Currently my code is roughly as follows ...
@Service
public class FileImportingService {
    ...
    @Transactional(rollbackFor = Exception.class)
    public void insertFilesFromDisk(Map<String, MyFile> discPaths_files,
            IProgressMonitor monitor) {
        for (String discPathString : discPaths_files.keySet()) {
            MyFile myFile = read_file_from_disc(discPathString);
            getJpaTemplate().persist(myFile);
            monitor.worked(1);
        }
    }
    ....
}
Thanks in advance for the ideas, suggestions ...
Cheers,
André Ribeiro
Re: Persisting thousands of files [message #385482 is a reply to message #385456]
Thu, 12 February 2009 08:36
Eclipse User |
Hi Doug,
Thank you very much for your tips :)
Are your example applications available somewhere with public access?
I'd find it very useful to see some sample implementations to get some
ideas.
Cheers,
Andre
Doug Clarke wrote:
> Andre,
>
> I am working on some large scale example applications as well. My data
> loading was running out of memory so I increased the heap size on the
> Java VM startup but that only got me a bit further. It turned out that
> I was simply requiring more memory for my data load scenario than was
> available.
>
> To get around this I modified my data load to flush the data I am
> loading in batches. This allowed me to still have a single transaction
> for the entire data set but EclipseLink would not need to manage all
> of the instances in memory until after commit. After each batch of
> objects that I added using entityManager.persist I then issued:
>
> entityManager.flush();
> entityManager.clear();
>
> Using this approach I have been able to continually increase the
> volume of objects I load and I simply tested with various batch sizes
> to find the optimal size.
>
> Doug
>
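Doug's flush/clear batching can be sketched as follows. This is a minimal, self-contained illustration, not the real EclipseLink or JPA API: the `BatchSink` interface is a hypothetical stand-in for the three `javax.persistence.EntityManager` methods involved (`persist`, `flush`, `clear`), so the batching logic can be shown and tested without a database. In real code you would call those methods on the injected `EntityManager` inside the single `@Transactional` method; the batch size (500 here) is an assumed value to be tuned by experiment, as Doug describes.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for the relevant javax.persistence.EntityManager
// methods: persist() registers an entity with the persistence context,
// flush() writes pending INSERTs to the database, clear() detaches all
// managed instances so they become eligible for garbage collection.
interface BatchSink {
    void persist(Object entity);
    void flush();
    void clear();
}

public class BatchLoader {
    // Persist all items, flushing and clearing every batchSize entities so
    // the persistence context never holds more than batchSize managed
    // objects at once. Everything can still run inside one enclosing
    // transaction; only the in-memory context is trimmed, nothing commits
    // early. Returns the number of flush/clear cycles performed.
    public static <T> int persistInBatches(List<T> items, int batchSize, BatchSink em) {
        int flushes = 0;
        for (int i = 0; i < items.size(); i++) {
            em.persist(items.get(i));
            if ((i + 1) % batchSize == 0) {
                em.flush();  // push this batch's pending SQL to the database
                em.clear();  // detach managed entities, freeing heap
                flushes++;
            }
        }
        em.flush();          // final (possibly empty) partial batch
        em.clear();
        return flushes + 1;
    }

    public static void main(String[] args) {
        List<Integer> items = new ArrayList<>();
        for (int i = 0; i < 2500; i++) items.add(i);

        // Stub sink that tracks how many entities are "managed" at once,
        // to show the context size stays bounded by the batch size.
        final int[] live = {0};
        final int[] peak = {0};
        BatchSink stub = new BatchSink() {
            public void persist(Object e) { live[0]++; peak[0] = Math.max(peak[0], live[0]); }
            public void flush() { }
            public void clear() { live[0] = 0; }
        };

        int flushes = persistInBatches(items, 500, stub);
        System.out.println("flushes=" + flushes + " peakManaged=" + peak[0]);
        // prints: flushes=6 peakManaged=500
    }
}
```

With the naive loop (no flush/clear), `peakManaged` would equal the full data set size, which is exactly the heap growth behind the OutOfMemoryError in the original question.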