read blob as byte array issue [message #386601]
Mon, 06 April 2009 20:17
andiqo (Member; Messages: 32; Registered: July 2009)
Hello,
I have an issue accessing a byte[] field with Postgres and EclipseLink 2.0.0. I don't understand why the conversion to a byte array fails (see the exception below).
In my persistence.xml, I have specified compatible=7.1 in order to force the use of Large Objects in Postgres (http://jdbc.postgresql.org/documentation/83/connect.html):
<property name="javax.persistence.jdbc.url"
          value="jdbc:postgresql://localhost:5432/jenmo?compatible=7.1" />
Persisting my SplitBlobPart causes no problem, but I can't read my data back from it...
Thanks a lot for your help.
andiqo
// ---
Local Exception Stack:
Exception [EclipseLink-3002] (Eclipse Persistence Services - 2.0.0.r3652-M1): org.eclipse.persistence.exceptions.ConversionException
Exception Description: The object [35 714], of class [class java.lang.Long], from mapping [org.eclipse.persistence.mappings.DirectToFieldMapping[data-->SPLITBLOBPART.DATA]] with descriptor [RelationalDescriptor(org.jenmo.core.domain.SplitBlobPart --> [DatabaseTable(SPLITBLOBPART)])], could not be converted to [class [B].
    at org.eclipse.persistence.exceptions.ConversionException.couldNotBeConverted(ConversionException.java:71)
    at org.eclipse.persistence.internal.helper.ConversionManager.convertObjectToByteArray(ConversionManager.java:334)
    at org.eclipse.persistence.internal.helper.ConversionManager.convertObject(ConversionManager.java:138)
    at org.eclipse.persistence.internal.databaseaccess.DatasourcePlatform.convertObject(DatasourcePlatform.java:151)
    at org.eclipse.persistence.mappings.converters.TypeConversionConverter.convertDataValueToObjectValue(TypeConversionConverter.java:119)
    at org.eclipse.persistence.mappings.foundation.AbstractDirectMapping.getAttributeValue(AbstractDirectMapping.java:595)
    at org.eclipse.persistence.mappings.foundation.AbstractDirectMapping.valueFromRow(AbstractDirectMapping.java:1102)
    at org.eclipse.persistence.mappings.foundation.AbstractDirectMapping.buildCloneFromRow(AbstractDirectMapping.java:1075)
    at org.eclipse.persistence.internal.descriptors.ObjectBuilder.buildAttributesIntoWorkingCopyClone(ObjectBuilder.java:1238)
    at org.eclipse.persistence.internal.descriptors.ObjectBuilder.buildWorkingCopyCloneFromRow(ObjectBuilder.java:1359)
    at org.eclipse.persistence.internal.descriptors.ObjectBuilder.buildObjectInUnitOfWork(ObjectBuilder.java:540)
    at org.eclipse.persistence.internal.descriptors.ObjectBuilder.buildObject(ObjectBuilder.java:485)
    at org.eclipse.persistence.internal.descriptors.ObjectBuilder.buildObject(ObjectBuilder.java:437)
    at org.eclipse.persistence.queries.ObjectLevelReadQuery.buildObject(ObjectLevelReadQuery.java:571)
    at org.eclipse.persistence.queries.ReadObjectQuery.registerResultInUnitOfWork(ReadObjectQuery.java:712)
    at org.eclipse.persistence.queries.ReadObjectQuery.executeObjectLevelReadQuery(ReadObjectQuery.java:436)
    at org.eclipse.persistence.queries.ObjectLevelReadQuery.executeDatabaseQuery(ObjectLevelReadQuery.java:930)
    at org.eclipse.persistence.queries.DatabaseQuery.execute(DatabaseQuery.java:664)
    at org.eclipse.persistence.queries.ObjectLevelReadQuery.execute(ObjectLevelReadQuery.java:891)
    at org.eclipse.persistence.queries.ReadObjectQuery.execute(ReadObjectQuery.java:397)
    at org.eclipse.persistence.queries.ObjectLevelReadQuery.executeInUnitOfWork(ObjectLevelReadQuery.java:954)
    at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.internalExecuteQuery(UnitOfWorkImpl.java:2697)
    at org.eclipse.persistence.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:1187)
    at org.eclipse.persistence.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:1171)
    at org.eclipse.persistence.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:1131)
    at org.eclipse.persistence.jpa.JpaHelper.loadUnfetchedObject(JpaHelper.java:191)
    at org.jenmo.core.domain.SplitBlobPart._persistence_checkFetched(SplitBlobPart.java)
    at org.jenmo.core.domain.SplitBlobPart._persistence_getdata(SplitBlobPart.java)
    at org.jenmo.core.domain.SplitBlobPart.openBuffer(SplitBlobPart.java:170)
    at org.jenmo.core.multiarray.MultiArrayBlobPart.onPartChange(MultiArrayBlobPart.java:158)
    at org.jenmo.core.multiarray.MultiArrayBlobPart.getPart(MultiArrayBlobPart.java:141)
    at org.jenmo.core.multiarray.MultiArrayBlobPart.getDouble(MultiArrayBlobPart.java:245)
    at org.jenmo.core.domain.perf.TestDbHeavyScalarField.testGetAll(TestDbHeavyScalarField.java:141)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:73)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:46)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:41)
    at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:45)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
@Entity
@Table(name = "SPLITBLOBPART")
public class SplitBlobPart implements ICopyable {

    @Id
    @GeneratedValue(generator = "SplitBlobPartSeq")
    @SequenceGenerator(name = "SplitBlobPartSeq", sequenceName = "SPLITBLOBPART_ID_SEQ", allocationSize = 10)
    @Column(name = "ID")
    private long id;

    @Version
    @Column(name = "VERSION")
    private int version;

    @SuppressWarnings("unused")
    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "PARENT_ID", nullable = false, updatable = false)
    private SplitBlob parent;

    @Basic
    @Column(name = "DATASIZE", updatable = false)
    private int dataSize;

    @Lob
    @Basic(fetch = FetchType.LAZY)
    @Column(name = "DATA", columnDefinition = "OID")
    private byte[] data;

    @Transient
    // We don't synchronize on the buffer as we want the best performance...
    private ByteBuffer buffer;

    /**
     * Opens the underlying buffer in order to be able to put/get values.
     */
    public final void openBuffer() {
        ByteBuffer var = buffer;
        if (var == null) { // First check (no locking)
            synchronized (this) {
                var = buffer;
                if (var == null) { // Second check (with locking)
                    if (data == null) {
                        data = new byte[dataSize];
                    }
                    var = buffer = ByteBuffer.wrap(data);
                }
            }
        }
    }

    public final boolean isBufferOpen() {
        return (buffer != null);
    }

    public final double getDouble(final int index) {
        return buffer.getDouble(index);
    }

    public final void putDouble(final int index, final double v) {
        buffer.putDouble(index, v);
    }
}
Re: read blob as byte array issue [message #387020 is a reply to message #386601]
Tue, 07 April 2009 13:07
This is odd. For some reason it looks like a Long was returned by the database instead of a Blob or byte[]. Either the DATA field has the wrong type in the database, or the driver is returning the wrong type, or something is going wrong in EclipseLink.
Confirm with a SQL select through your driver that the data has the correct type. If it does, then it could be a field-indexing issue in EclipseLink: somehow the ID field could be being used for the DATA field. Please include the SQL that was generated. Also, is the 35 714 long value the same as the id of the object? Try removing LAZY from the data mapping and see if the issue still occurs.
---
James
http://www.nabble.com/EclipseLink---Users-f26658.html
Re: read blob as byte array issue [message #387038 is a reply to message #387020]
Wed, 08 April 2009 19:44
andiqo
Hello James,
My table splitblobpart is defined as:
(
  id bigserial NOT NULL,
  data oid,
  datasize integer,
  ordr integer,
  "version" integer,
  parent_id bigint NOT NULL,
  CONSTRAINT splitblobpart_pkey PRIMARY KEY (id)
)
I have the following data:
"1";36354;8000000;0;1;1
With this url:
<property name="javax.persistence.jdbc.url"
value="jdbc:postgresql://localhost:5432/jenmo" />
the code:
Connection conn = JpaSpiActions4Test.getInstance().getConnection(em);
ResultSet rs = ps.executeQuery();
while (rs.next()) {
    Long oid = rs.getLong(1);
    System.out.println(oid);
}
produces 36354 -> OK
With this url:
<property name="javax.persistence.jdbc.url"
value="jdbc:postgresql://localhost:5432/jenmo?compatible=7.1" />
the code:
Connection conn = JpaSpiActions4Test.getInstance().getConnection(em);
ResultSet rs = ps.executeQuery();
while (rs.next()) {
    byte[] bytes = rs.getBytes(1);
    System.out.println(bytes.length);
}
produces 8000000 -> OK
Then accessing my SplitBlobPart object:
[EL Fine]: 2009-04-08 21:36:57.199--ServerSession(1884473012)--Connection(112258534)--Thread(Thread[main,5,main])--SELECT ID, ORDR, DATASIZE, DATA, VERSION, PARENT_ID FROM SPLITBLOBPART WHERE (PARENT_ID = ?) ORDER BY ORDR ASC
    bind => [1]
[EL Warning]: 2009-04-08 21:36:57.204--UnitOfWork(203141979)--Thread(Thread[main,5,main])--Exception [EclipseLink-3002] (Eclipse Persistence Services - 2.0.0.r3652-M1): org.eclipse.persistence.exceptions.ConversionException
Exception Description: The object [36 354], of class [class java.lang.Long], from mapping [org.eclipse.persistence.mappings.DirectToFieldMapping[data-->SPLITBLOBPART.DATA]] with descriptor [RelationalDescriptor(org.jenmo.core.domain.SplitBlobPart --> [DatabaseTable(SPLITBLOBPART)])], could not be converted to [class [B].
Same issue with lazy loading disabled.
Thanks a lot for your help!
andiqo
Re: read blob as byte array issue [message #387092 is a reply to message #387085]
Wed, 15 April 2009 12:57
I think I am beginning to understand what you are doing.
You have a LargeObject, but your JDBC driver returns it as a Long (its oid), not as a Blob, and you just want the bytes. If you call getBytes() on your ResultSet, the driver reads the Blob and gives you the bytes, but if getObject() is called, you just get the Long oid back.
You might be able to configure something in your driver, or define your column type differently, to fix this.
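For illustration, one such column-type change (a hypothetical sketch, assuming the data can be migrated out of the Large Object facility; the migration itself is not shown) would be to store the data in a PostgreSQL bytea column, which the driver returns directly as bytes without the compatible=7.1 flag:

```java
// Hypothetical alternative mapping, assuming the DATA column is migrated
// from a Large Object (oid) to bytea; with bytea, getObject() on the
// driver yields a byte[] rather than a Long oid.
@Basic(fetch = FetchType.LAZY)
@Column(name = "DATA", columnDefinition = "BYTEA")
private byte[] data;
```

Note that bytea rows are read into memory as a whole, which may matter for 8 MB values like yours.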
To work around the driver issue in EclipseLink, you could customize your DatabasePlatform and override the method:
public Object getObjectFromResultSet(ResultSet resultSet, int columnNumber, int type, AbstractSession session) throws java.sql.SQLException
If the "type" is your oid type, then call getBytes() on the ResultSet; otherwise call super.
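A minimal sketch of such a platform, assuming EclipseLink 2.0 on the classpath and that the OID-backed DATA column surfaces as a binary JDBC type (the class name and the exact type check are illustrative, not part of the original post):

```java
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;

import org.eclipse.persistence.internal.sessions.AbstractSession;
import org.eclipse.persistence.platform.database.PostgreSQLPlatform;

public class OidAwarePostgreSQLPlatform extends PostgreSQLPlatform {

    @Override
    public Object getObjectFromResultSet(ResultSet resultSet, int columnNumber,
            int type, AbstractSession session) throws SQLException {
        // For binary/blob-typed columns, ask the driver for the raw bytes
        // instead of letting getObject() hand back the Long oid.
        if (type == Types.BLOB || type == Types.BINARY
                || type == Types.LONGVARBINARY) {
            return resultSet.getBytes(columnNumber);
        }
        return super.getObjectFromResultSet(resultSet, columnNumber, type, session);
    }
}
```

The custom platform is picked up by pointing the eclipselink.target-database persistence-unit property at the fully qualified class name.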
James