Eclipse Community Forums
OutOfMemory for simple example persisting large amount of data [message #385244] Wed, 14 January 2009 16:43
Johannes Stamminger
Hi,

I have to persist a large number of records to a database and am giving
EclipseLink a try (1.2.0 with derby-10.4.2.0 and, alternatively,
hsqldb-1.8.0.10). But even for my IMHO simple example (just one entity
with 4 persistable attributes, 3x long and 1x short, all annotated as not
updatable), I fail to persist a large number of instances.

I try to persist 1,500,000 instances, committing the transaction every 10,000.
The test method does not hold any reference to the created instances, yet the
test runs out of memory. The heap dump shows instances of
org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork
preventing the entity instances from being garbage collected.

AFAIU those units of work track changes to the entity instances - even though
none of the attributes can ever be updated.

Is this expected behavior?

How can I prevent the test (and later on the app) from running out of
memory? I searched for and tried properties dealing with this, but so far
nothing did the trick ... :-(

Johannes
Re: OutOfMemory for simple example persisting large amount of data [message #385246 is a reply to message #385244] Wed, 14 January 2009 17:11
Johannes Stamminger
My test code snippets:

<persistence-unit name="entries-test-eclipselink-derby"
                  transaction-type="RESOURCE_LOCAL">
    <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
    <class>test.SimpleEntryEntity</class>
    <properties>
        <property name="eclipselink.jdbc.driver"
                  value="org.apache.derby.jdbc.EmbeddedDriver"/>
        <property name="eclipselink.target-database" value="Derby"/>
        <property name="eclipselink.jdbc.user" value=""/>
        <property name="eclipselink.jdbc.password" value=""/>
        <property name="eclipselink.orm.throw.exceptions" value="true"/>
        <property name="eclipselink.ddl-generation" value="create-tables"/>
        <property name="eclipselink.ddl-generation.output-mode" value="database"/>
    </properties>
</persistence-unit>



import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
@Table(name = "SIMPLE_ENTRY_t")
public class SimpleEntryEntity {

    private long fA;
    private short fB;
    private long fC;
    private long fD;

    protected SimpleEntryEntity() { } // JPA requires a no-arg constructor

    SimpleEntryEntity(final long a, final short b, final long c, final long d) {
        fA = a;
        fB = b;
        fC = c;
        fD = d;
    }

    @Column(name = "A", nullable = false, updatable = false)
    public long getA() {
        return fA;
    }

    public void setA(final long a) {
        fA = a;
    }

    @Column(name = "B", nullable = false, updatable = false)
    public short getB() {
        return fB;
    }

    public void setB(final short b) {
        fB = b;
    }

    @Column(name = "C", nullable = false, updatable = false)
    public long getC() {
        return fC;
    }

    public void setC(final long c) {
        fC = c;
    }

    @Id
    @Column(name = "D", nullable = false, updatable = false)
    public long getD() {
        return fD;
    }

    public void setD(final long d) {
        fD = d;
    }
}



// NB_ENTRIES = 1,500,000 and COMMIT_AFTER = 10,000 (see above)
public void testEclipselinkHSQLDB() throws Exception {
    final Map<String, String> dbProperties = new HashMap<String, String>();
    dbProperties.put(PersistenceUnitProperties.JDBC_URL,
            "jdbc:hsqldb:file:" + getCurrentTestDir() + File.separator
                    + "entries-test-eclipselink-hsqldb;shutdown=true;create=true");
    dbProperties.put(PersistenceUnitProperties.LOGGING_FILE,
            getCurrentTestDir() + File.separator + "entries-test-eclipselink-hsqldb.log");
    final EntityManagerFactory entityManagerFactory =
            Persistence.createEntityManagerFactory("entries-test-eclipselink-hsqldb", dbProperties);
    final EntityManager entityManager = entityManagerFactory.createEntityManager();
    final EntityTransaction transaction = entityManager.getTransaction();
    transaction.begin();
    for (int i = 0; i < NB_ENTRIES; i++) {
        entityManager.persist(new SimpleEntryEntity(i, (short) (i + 1), i + 2, i + 3));
        if (i > 0 && i % COMMIT_AFTER == 0) {
            commit(transaction, i);
            transaction.begin();
        }
    }
    commit(transaction, NB_ENTRIES);
    entityManager.close();
    LOG.debug("entityManager.closed");

    final EntityManager entityManagerLoad = entityManagerFactory.createEntityManager();
    final Query q = entityManagerLoad.createQuery("select E from SimpleEntryEntity E");
    final List<SimpleEntryEntity> r = q.getResultList();
    assertEquals(NB_ENTRIES, r.size());
    LOG.debug("contents verified");

    entityManagerLoad.close();
    LOG.debug("entityManager.closed");
    entityManagerFactory.close();
    LOG.debug("entityManagerFactory.closed");
}

private void commit(final EntityTransaction transaction,
                    final int i) throws StandardException {
    LOG.debug("commit " + i);
    transaction.commit();
    LOG.debug("commit " + i + " done");
}
Re: OutOfMemory for simple example persisting large amount of data [message #385249 is a reply to message #385246] Thu, 15 January 2009 12:44
James Sutherland

This occurs because an EntityManager is an extended persistence context,
in that it must store and track changes to everything you read through it.

In JPA you must either:
- Create a new EntityManager for each transaction.
- Call clear() after each transaction to clear the persistence context (sketched below).
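
A minimal sketch of the second option, reusing the names (entityManagerFactory,
NB_ENTRIES, COMMIT_AFTER, SimpleEntryEntity) from your test snippet above; this
illustrates the pattern and is not a tested drop-in:

    // Sketch: commit each chunk, then clear the persistence context so the
    // already-written entities can be garbage collected.
    final EntityManager em = entityManagerFactory.createEntityManager();
    final EntityTransaction tx = em.getTransaction();
    tx.begin();
    for (int i = 0; i < NB_ENTRIES; i++) {
        em.persist(new SimpleEntryEntity(i, (short) (i + 1), i + 2, i + 3));
        if (i > 0 && i % COMMIT_AFTER == 0) {
            tx.commit();   // write this chunk to the database
            em.clear();    // detach all managed instances
            tx.begin();
        }
    }
    tx.commit();
    em.close();

The first option looks the same, except that instead of clear() you close the
EntityManager after each commit and create a new one before the next begin().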

If you are concerned about performance for your mass insert, you could
also enable batch writing, and just for the purpose of the mass insert you
could also disable caching (shared=false).
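
For example (a sketch only; the constants are from
org.eclipse.persistence.config.PersistenceUnitProperties and the values are
ones you would tune for your setup), you could add to the dbProperties map
from your test:

    // Sketch: enable JDBC batch writing and switch off the shared (2nd-level)
    // cache for the duration of the mass insert.
    dbProperties.put(PersistenceUnitProperties.BATCH_WRITING, "JDBC");          // eclipselink.jdbc.batch-writing
    dbProperties.put(PersistenceUnitProperties.CACHE_SHARED_DEFAULT, "false");  // eclipselink.cache.shared.default

The same two properties can also be set in the persistence.xml <properties>
element instead.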

----
James
http://www.nabble.com/EclipseLink---Users-f26658.html


Re: OutOfMemory for simple example persisting large amount of data [message #385252 is a reply to message #385249] Thu, 15 January 2009 15:43
Johannes Stamminger
Thanks a lot for the reply!

James wrote:
> This occurs because an EntityManager is an extended persistence context,
> in that it must store and track changes to everything you read through it.

I understood this when I had a look at the heap dump after the out-of-memory error.

But the annotations define that nothing may be changed. My application
use case is to store immutable entities (just an index of something else).
Am I wrong in saying that the unit of work is not a must-have in that
situation, or that it at least should not prevent the cached entities from
being garbage collected (at least after a commit)?


> In JPA you must either:
> - Create a new EntityManager for each transaction.

I thought I had tested this - but since it obviously works, maybe I tried it
with hsqldb as the database backend, which runs into a different issue there.


I just checked with the hsqldb backend again: it still runs out of memory. But
the heap dump now shows a different cause: a big bunch of BigDecimals sits at
the top of the heap histogram. My first guess (and the RootSet references
show this, too) is that this is because hsqldb does not use cached tables but
memory ones ("create table XYZ ..." instead of "create cached table XYZ ...").
Is there any other way to change this than hand-coding the table create statement?



> - Call clear() after each transaction to clear the persistence context.

Works, too. Yesterday was too hard a day for me to see this ;-)


> If you are concerned about performance for your mass insert, you could

Yes, I am - or rather: I was. With your hints it looks like it is becoming
practicable.


> also enable batch writing, and just for the purpose of the mass insert you
> could also disable caching (shared=false).

I had already found the batch writing.
I need to read more about the @Cache(shared = false) annotation before I
fully understand its meaning. But obviously it does not completely disable
the unit-of-work tracking (the test still runs out of memory without the
clear() call).
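
For reference, this is roughly how I applied the annotation (a sketch of the
relevant part only; the annotation class is
org.eclipse.persistence.annotations.Cache):

    import org.eclipse.persistence.annotations.Cache;

    @Entity
    @Table(name = "SIMPLE_ENTRY_t")
    @Cache(shared = false)  // only disables the shared (2nd-level) cache for this entity
    public class SimpleEntryEntity {
        // ... fields and accessors unchanged from the first post ...
    }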


Thanks again,
Johannes
Re: OutOfMemory for simple example persisting large amount of data [message #385254 is a reply to message #385252] Thu, 15 January 2009 15:48
Johannes Stamminger
I just forgot to mention: meanwhile I also tested without JPA and found that
it is possible to insert the whole bunch of 1,500,000 entries within one
transaction when using Derby (but not hsqldb - though the latter is much,
much faster at inserting, even when committing regularly).

Is there any possibility to keep this capability while using JPA instead of
interfacing with Derby directly?