Re: [geomesa-users] Does GeoMesa has row level size limitation?
Hi Amit,
Based on the Kryo exception[1] in the stack trace, do you happen
to have rows which are larger than 64 MB of data?
If so, you may be able to sort things out by increasing the Kryo
buffer size in Spark[2]. (That's a quick guess at a solution.)
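For reference, the Spark settings involved are spark.kryoserializer.buffer and spark.kryoserializer.buffer.max; the latter defaults to 64m and must be larger than the biggest single object being serialized. A minimal sketch, assuming a spark-submit deployment (the jar name is a placeholder):

```shell
# Hypothetical spark-submit invocation; "your-ingest-job.jar" is a placeholder.
spark-submit \
  --conf spark.kryoserializer.buffer=64m \
  --conf spark.kryoserializer.buffer.max=256m \
  your-ingest-job.jar
```

The same two keys can equally be set on the SparkConf/SparkSession builder if you construct the session in code.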
Cheers,
Jim
1. Caused by: com.esotericsoftware.kryo.KryoException: Buffer overflow. Available: 0, required: 42
On 8/3/2020 4:15 PM, Amit Srivastava wrote:
Thanks, Emilio, for replying. I was not able to reproduce it via
a unit test. Below is the full stack trace. Please let me know
if this is a known issue.
at org.locationtech.geomesa.index.geotools.GeoMesaFeatureWriter$class.writeFeature(GeoMesaFeatureWriter.scala:55) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.geotools.GeoMesaFeatureWriter$TableFeatureWriter.writeFeature(GeoMesaFeatureWriter.scala:141) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.geotools.GeoMesaFeatureWriter$GeoMesaAppendFeatureWriter$class.write(GeoMesaFeatureWriter.scala:227) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.geotools.GeoMesaFeatureWriter$$anon$3.write(GeoMesaFeatureWriter.scala:108) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.utils.geotools.FeatureUtils$.write(FeatureUtils.scala:141) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.geotools.GeoMesaFeatureStore$$anonfun$addFeatures$2.apply(GeoMesaFeatureStore.scala:44) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.geotools.GeoMesaFeatureStore$$anonfun$addFeatures$2.apply(GeoMesaFeatureStore.scala:42) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.utils.io.WithClose$$anonfun$apply$3$$anonfun$apply$4.apply(SafeClose.scala:66) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.utils.io.WithClose$.apply(SafeClose.scala:64) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.utils.io.WithClose$$anonfun$apply$3.apply(SafeClose.scala:66) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.utils.io.WithClose$.apply(SafeClose.scala:64) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.utils.io.WithClose$.apply(SafeClose.scala:66) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.geotools.GeoMesaFeatureStore.addFeatures(GeoMesaFeatureStore.scala:42) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at com.gaia.atlas.geotools.accessor.GeoToolsStorageClientStrategy.put_aroundBody20(GeoToolsStorageClientStrategy.java:205) ~[GaiaAtlasGeoMesaClient-1.0.jar:?]
at com.gaia.atlas.geotools.accessor.GeoToolsStorageClientStrategy$AjcClosure21.run(GeoToolsStorageClientStrategy.java:1) ~[GaiaAtlasGeoMesaClient-1.0.jar:?]
at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149) ~[aspectjweaver.jar:1.8.1]
at com.metrics.declarative.aspectj.JoinpointInvocationHandle.proceed(JoinpointInvocationHandle.java:60) ~[DeclarativeCoralMetricsAspectJ-2.1.jar:?]
at com.metrics.declarative.AbstractMethodMetricInterceptor.handleInvocation(AbstractMethodMetricInterceptor.java:283) ~[DeclarativeCoralMetrics-2.1.jar:?]
at com.metrics.declarative.aspectj.MetricMethodAspect$ConfiguredMethodAspect.invoke(MetricMethodAspect.java:101) ~[DeclarativeCoralMetricsAspectJ-2.1.jar:?]
at com.metrics.declarative.aspectj.MetricMethodAspect.captureMethodMetrics(MetricMethodAspect.java:52) ~[DeclarativeCoralMetricsAspectJ-2.1.jar:?]
at com.gaia.atlas.geotools.accessor.GeoToolsStorageClientStrategy.put_aroundBody22(GeoToolsStorageClientStrategy.java:196) ~[GaiaAtlasGeoMesaClient-1.0.jar:?]
at com.gaia.atlas.geotools.accessor.GeoToolsStorageClientStrategy$AjcClosure23.run(GeoToolsStorageClientStrategy.java:1) ~[GaiaAtlasGeoMesaClient-1.0.jar:?]
at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149) ~[aspectjweaver.jar:1.8.1]
at com.metrics.declarative.aspectj.JoinpointInvocationHandle.proceed(JoinpointInvocationHandle.java:60) ~[DeclarativeCoralMetricsAspectJ-2.1.jar:?]
at com.metrics.declarative.servicemetrics.AbstractServiceMetricsInterceptor.handleInvocation(AbstractServiceMetricsInterceptor.java:59) ~[DeclarativeCoralMetrics-2.1.jar:?]
at com.metrics.declarative.servicemetrics.aspectj.ServiceMetricsMethodAspect$ConfiguredMethodAspect.invoke(ServiceMetricsMethodAspect.java:59) ~[DeclarativeCoralMetricsAspectJ-2.1.jar:?]
at com.metrics.declarative.servicemetrics.aspectj.ServiceMetricsMethodAspect.invoke(ServiceMetricsMethodAspect.java:37) ~[DeclarativeCoralMetricsAspectJ-2.1.jar:?]
at com.gaia.atlas.geotools.accessor.GeoToolsStorageClientStrategy.put(GeoToolsStorageClientStrategy.java:196) ~[GaiaAtlasGeoMesaClient-1.0.jar:?]
at com.gaia.atlas.geotools.accessor.DelegateStorageClientStrategy.put(DelegateStorageClientStrategy.java:86) ~[GaiaAtlasGeoMesaClient-1.0.jar:?]
at com.gaia.atlas.geotools.helper.GeoMesaSimpleFeatureUpdater.putFeature(GeoMesaSimpleFeatureUpdater.java:131) ~[GaiaAtlasGeoMesaClient-1.0.jar:?]
at com.gaia.atlas.geotools.callable.PutSingleFeatureTask.call(PutSingleFeatureTask.java:26) ~[GaiaAtlasGeoMesaClient-1.0.jar:?]
at com.gaia.atlas.geotools.callable.PutSingleFeatureTask.call(PutSingleFeatureTask.java:16) ~[GaiaAtlasGeoMesaClient-1.0.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_252]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_252]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_252]
at java.lang.Thread.run(Thread.java:749) [?:1.8.0_252]
Caused by: com.esotericsoftware.kryo.KryoException: Buffer overflow. Available: 0, required: 42
at com.esotericsoftware.kryo.io.Output.require(Output.java:163) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at com.esotericsoftware.kryo.io.Output.writeString_slow(Output.java:462) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at com.esotericsoftware.kryo.io.Output.writeString(Output.java:363) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.features.kryo.impl.KryoFeatureSerialization$KryoStringWriter$.apply(KryoFeatureSerialization.scala:177) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.features.kryo.impl.KryoFeatureSerialization$class.writeFeature(KryoFeatureSerialization.scala:70) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.features.kryo.impl.KryoFeatureSerialization$class.serialize(KryoFeatureSerialization.scala:42) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.features.kryo.KryoFeatureSerializer$MutableActiveSerializer.serialize(KryoFeatureSerializer.scala:78) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.api.WritableFeature$FeatureLevelWritableFeature$$anonfun$values$1$$anonfun$apply$1.apply(WritableFeature.scala:154) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.api.WritableFeature$FeatureLevelWritableFeature$$anonfun$values$1$$anonfun$apply$1.apply(WritableFeature.scala:154) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.api.package$KeyValue.value$lzycompute(package.scala:184) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.api.package$KeyValue.value(package.scala:184) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.hbase.data.HBaseIndexAdapter$HBaseIndexWriter$$anonfun$write$1.apply(HBaseIndexAdapter.scala:549) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.hbase.data.HBaseIndexAdapter$HBaseIndexWriter$$anonfun$write$1.apply(HBaseIndexAdapter.scala:547) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at scala.collection.Iterator$class.foreach(Iterator.scala:893) ~[scala-library.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336) ~[scala-library.jar:?]
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[scala-library.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[scala-library.jar:?]
at org.locationtech.geomesa.hbase.data.HBaseIndexAdapter$HBaseIndexWriter.write(HBaseIndexAdapter.scala:547) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.api.IndexAdapter$BaseIndexWriter.write(IndexAdapter.scala:149) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
at org.locationtech.geomesa.index.geotools.GeoMesaFeatureWriter$class.writeFeature(GeoMesaFeatureWriter.scala:52) ~[geomesa-hbase-spark-runtime_custom_2.11-2.4.1-0719.jar:?]
... 35 more
Hi Amit,
There shouldn't be any limit on the record size, other
than the constraints imposed by the underlying database.
Can you reproduce it in a unit test, or by stepping
through it in the REPL? That would make it easier to figure out
what's wrong. Including the full stack trace would also be
useful.
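As an aside, the failure mode here is the generic fixed-capacity-buffer problem rather than anything geospatial. A minimal illustration in plain Java (no GeoMesa or Kryo involved, and just an analogy, not GeoMesa's actual code path) of why a write fails once the payload exceeds the buffer's maximum:

```java
import java.nio.ByteBuffer;

public class BufferOverflowDemo {
    // Try to copy a payload into a fixed-capacity buffer, the way a
    // serializer with a bounded maximum buffer would. Returns whether it fit.
    static boolean tryWrite(byte[] payload, int capacity) {
        ByteBuffer buf = ByteBuffer.allocate(capacity);
        if (payload.length > buf.remaining()) {
            return false; // analogous to KryoException: Buffer overflow
        }
        buf.put(payload);
        return true;
    }

    public static void main(String[] args) {
        System.out.println(tryWrite(new byte[32], 64));  // fits
        System.out.println(tryWrite(new byte[128], 64)); // does not fit
    }
}
```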
Thanks,
Emilio
On 7/13/20 6:53 PM, Amit Srivastava wrote:
Hi Team,
I am using GeoMesa version 2.4.1. I am seeing an
exception when putting a record larger than a
certain size threshold (sample record attached). Can
anyone help me investigate what the issue is and
how we can fix it?
Caused by: com.esotericsoftware.kryo.KryoException: Buffer overflow. Available: 0, required: 245
at com.esotericsoftware.kryo.io.Output.require(Output.java:163) ~[geomesa-hbase-spark-runtime_2.11-2.4.0.jar:?]
at com.esotericsoftware.kryo.io.Output.writeString_slow(Output.java:462) ~[geomesa-hbase-spark-runtime_2.11-2.4.0.jar:?]
at com.esotericsoftware.kryo.io.Output.writeString(Output.java:363) ~[geomesa-hbase-spark-runtime_2.11-2.4.0.jar:?]
Table Schema:
INFO Describing attributes of feature 'OSMv2Relations'
geometry           | MultiPolygon (Spatio-temporally indexed)
ingestionTimestamp | Timestamp (Spatio-temporally indexed)
nextTimestamp      | Timestamp
idValue            | String (Attribute indexed)
serializerVersion  | String
featurePayload     | String

User data:
geomesa.index.dtg    | ingestionTimestamp
geomesa.indices      | xz3:2:3:geometry:ingestionTimestamp,id:4:3:,attr:8:3:idValue:ingestionTimestamp
geomesa.stats.enable | true
geomesa.z.splits     | 127
--
Regards,
Amit Kumar Srivastava
_______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxx
To unsubscribe from this list, visit https://dev.eclipse.org/mailman/listinfo/geomesa-users
--
Regards,
Amit Kumar Srivastava