Eclipse Community Forums
Insert Duplicate key Exception - Eclipselink [message #1287336] Mon, 07 April 2014 17:26
maverickml ml
Messages: 2
Registered: April 2014
Junior Member
Hi

This post is regarding a persistence issue with JPA. The JPA provider is Oracle TopLink, as shipped with WebLogic 12c, which is built on EclipseLink.

The user makes 'n' interactions/transactions, and the app writes each transaction to the DB.

Under heavy load, the app hits duplicate key exceptions while writing these transactions.

The first transaction is written to the DB successfully, but a subsequent transaction is sometimes rejected with a duplicate key exception.

As I said, the app uses JPA 2.0, in which the shared cache is enabled by default, and I think this has something to do with the shared cache.
I say this because the same app works fine on WebLogic 10, which uses JPA 1.0, where there is no concept of a shared cache.
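(For reference, JPA 2.0 exposes that default through the shared-cache-mode element in persistence.xml; a sketch, with a hypothetical persistence-unit name:)

```xml
<!-- persistence.xml fragment; the unit name "callerPU" is hypothetical -->
<persistence-unit name="callerPU">
    <!-- NONE disables the shared (L2) cache for the whole unit;
         DISABLE_SELECTIVE caches everything except entities marked @Cacheable(false) -->
    <shared-cache-mode>DISABLE_SELECTIVE</shared-cache-mode>
</persistence-unit>
```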

Now back to the issue. Each entity that takes part in the insert transaction is uniquely identified by an embedded primary key class with overridden hashCode()/equals() (please see below for the class definition).

@Entity
@Table(name="T_CALLER_TRANS")
public class CallerEntity implements Serializable {

	@EmbeddedId
	private CallerEntityPK pk;

	//@Column attributes

}

@Embeddable
public class CallerEntityPK implements Serializable {

	@Column(name="SESSION_ID")
	private String sessionId;   //FIRST_USER_SESSION,SECOND_USER_SESSION

	@Column(name="TRANSACTION_NBR")
	private String transNo;     //01 , 02 etc... 

	//Getter setters

	@Override
	public boolean equals(Object o) {
		if (o == this) {
			return true;
		}
		if (!(o instanceof CallerEntityPK)) {
			return false;
		}
		CallerEntityPK other = (CallerEntityPK) o;
		return this.sessionId.equals(other.sessionId)
			&& this.transNo.equals(other.transNo);
	}

	@Override
	public int hashCode() {
		final int prime = 31;
		int hash = 17;
		hash = hash * prime + this.sessionId.hashCode();
		hash = hash * prime + this.transNo.hashCode();
		return hash;
	}
}
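(To sanity-check the equals()/hashCode() contract the cache relies on, here is a minimal standalone sketch of the PK class, without the JPA annotations; class and field names mirror the ones above but the harness itself is hypothetical:)

```java
import java.io.Serializable;
import java.util.Objects;

// Plain-Java stand-in for CallerEntityPK, to exercise equals()/hashCode()
// the way an identity map would.
class PK implements Serializable {
    final String sessionId;
    final String transNo;

    PK(String sessionId, String transNo) {
        this.sessionId = sessionId;
        this.transNo = transNo;
    }

    @Override
    public boolean equals(Object o) {
        if (o == this) return true;
        if (!(o instanceof PK)) return false;
        PK other = (PK) o;
        return sessionId.equals(other.sessionId) && transNo.equals(other.transNo);
    }

    @Override
    public int hashCode() {
        return Objects.hash(sessionId, transNo);
    }
}

public class PkContractCheck {
    public static void main(String[] args) {
        PK first = new PK("FIRST_USER_SESSION", "01");
        PK second = new PK("FIRST_USER_SESSION", "02");
        PK firstAgain = new PK("FIRST_USER_SESSION", "01");

        // Keys with different transaction numbers must NOT be equal...
        System.out.println(first.equals(second));                       // false
        // ...while keys with identical fields must be equal with equal hashes.
        System.out.println(first.equals(firstAgain));                   // true
        System.out.println(first.hashCode() == firstAgain.hashCode());  // true
    }
}
```

If this prints false/true/true, the PK contract itself distinguishes the two transactions, which would point the suspicion elsewhere (e.g. at the cast to the entity type in the original equals()).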

The primary key is the combination of the session id (FIRST_USER_SESSION) and the transaction number (01 for the first insert, 02 for the second insert, ...).
For example: FIRST_USER_SESSION and 01

1st transaction pk: FIRST_USER_SESSION01
2nd transaction pk: FIRST_USER_SESSION02


1. Before the 1st insert transaction (the entity with pk FIRST_USER_SESSION 01) is written, it's checked against the L2 cache; since it's not in the cache, it's successfully persisted to the DB.

2. After the first transaction is written, it's put into the L2 cache (the entity with the FIRST_USER_SESSION 01 key is cached).

3. Now for the second insert transaction (the entity with key FIRST_USER_SESSION 02), the L2 cache is checked before persisting, and my guess is that the entity for the second transaction is considered identical to the one already in the L2 cache. Even though the pk is different (FIRST_USER_SESSION02), I think the framework identifies it as a duplicate object (based on the overridden equals() and hashCode()).

As a result, the same duplicate object is attempted for insert and a duplicate key exception is thrown.

Question 1) Is my understanding correct?

Question 2) If this is the case, can I make the entity use an isolated cache, refresh always, and expire instantly (as highlighted below)?
I just want the cache to be disabled for this entity. Please let me know your comments.

@Entity 
@Table(name="T_CALLER_TRANS") 
@Cache(isolation=CacheIsolationType.ISOLATED, expiry=0, alwaysRefresh=true)
public class CallerEntity implements Serializable {
	//...
}
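(A portable alternative to the EclipseLink-specific @Cache annotation would be the JPA 2.0 @Cacheable(false) annotation, which excludes just this entity from the shared cache; a sketch, assuming shared-cache-mode is ENABLE_SELECTIVE or DISABLE_SELECTIVE:)

```java
import javax.persistence.Cacheable;
import javax.persistence.Entity;
import javax.persistence.Table;

// @Cacheable(false) keeps this one entity out of the L2 cache
// without touching the caching behavior of other entities.
@Cacheable(false)
@Entity
@Table(name = "T_CALLER_TRANS")
public class CallerEntity implements java.io.Serializable {
	// fields as above
}
```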
Powered by FUDForum. Page generated in 0.02076 seconds