Hi Diethard,
I think the fix'll be easy. Can you try this?
prepedData
  .write
  .format("geomesa")
  .options(dsParams)
  .save()
I *think* you are just missing the call to 'save'. We did recently
fix a different bug with writing in Spark, so if that doesn't do it,
let us know which version of GeoMesa you are using, etc.
As a more complete example, check out (1).
Cheers,
Jim
1.
https://github.com/locationtech/geomesa/blob/master/geomesa-accumulo/geomesa-accumulo-spark-runtime/src/test/scala/org/locationtech/geomesa/accumulo/spark/AccumuloSparkProviderTest.scala#L86
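In case it helps, here is a rough end-to-end sketch of what the write might look like (untested against your setup — the dsParams keys, catalog name, and feature type name below are placeholders; check the data store parameters for your GeoMesa version and substitute your own Accumulo connection details):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("geomesa-write-example")
  .getOrCreate()

// Placeholder Accumulo data store parameters -- replace with your own;
// the exact keys depend on your GeoMesa version
val dsParams = Map(
  "instanceId" -> "myInstance",
  "zookeepers" -> "zoo1,zoo2,zoo3",
  "user"       -> "myUser",
  "password"   -> "myPassword",
  "tableName"  -> "myCatalog"
)

val prepedData = spark.sql(
  """SELECT *, st_makePoint(Actor1Geo_Lat, Actor1Geo_Long) AS geom
    |FROM ingested_data""".stripMargin)

// DataFrameWriter is lazy: format/options only configure the write,
// and nothing touches Accumulo until save() triggers the action
prepedData
  .write
  .format("geomesa")
  .options(dsParams)
  .option("geomesa.feature", "gdelt") // feature type name -- adjust to yours
  .save()
```

The key point is that last line: without save(), the write is never executed, which matches the "no error, no data" behavior you saw.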
On 07/06/2017 05:16 PM, Diethard Steiner wrote:
Thanks a lot Jim! It's working now. My next step is
to write it out into Accumulo. I tried this:
val prepedData = spark.sql("""SELECT *,
st_makePoint(Actor1Geo_Lat, Actor1Geo_Long) as geom
FROM ingested_data""")
prepedData.show(10)
prepedData
  .write
  .format("geomesa")
  .options(dsParams)
However, the `write` part does not seem to do anything.
I get no error, but there is also no data in Accumulo.
Can you please let me know how to resolve this?
I also have the feature definition available (partial code
example):
// GeoMesa Feature
var geoMesaSchema = Lists.newArrayList(
  "GLOBALEVENTID:Integer",
  "SQLDATE:Date",
  "MonthYear:Integer",
  "Year:Integer",
Is there a way to add this as an option to the write
function?
Best regards,
Diethard
_______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://dev.locationtech.org/mailman/listinfo/geomesa-users