[geomesa-users] Write to Geomesa from Spark

Does anyone have an example of how to write to GeoMesa from Spark? This is what I have so far; I'd welcome any feedback.

val job = new Job()
// Configure the job's own Configuration -- the settings below must end up on the
// same object that gets passed to saveAsNewAPIHadoopFile
val config = job.getConfiguration
ConfiguratorBase.setConnectorInfo(classOf[AccumuloOutputFormat], config, ds.connector.whoami(), ds.authToken)
ConfiguratorBase.setZooKeeperInstance(classOf[AccumuloOutputFormat], config, ds.connector.getInstance().getInstanceName, ds.connector.getInstance().getZooKeepers)
OutputConfigurator.setDefaultTableName(classOf[AccumuloOutputFormat], config, ds.getSpatioTemporalIdxTableName(sft))

val output = inputRDD.map(toFeature)
  .saveAsNewAPIHadoopFile(config.get("accumulo.instance"),
                          classOf[Void],
                          classOf[SimpleFeature],
                          classOf[AccumuloOutputFormat],
                          job.getConfiguration)


Assume I already have the GeoMesa tables created and a handle to the data store as ds, along with the SimpleFeatureType as sft. Something tells me I have the default table name identified incorrectly. I'm also not sure what should go in all the classOf calls.
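For comparison, here is a variant I've been experimenting with. AccumuloOutputFormat consumes (Text, Mutation) pairs, where the Text key is the destination table name, so the key/value classes would be Text and Mutation rather than Void and SimpleFeature. The toMutations helper below is a placeholder for whatever encoding produces GeoMesa's index rows -- that's the part I don't have:

```scala
import org.apache.accumulo.core.client.mapreduce.AccumuloOutputFormat
import org.apache.accumulo.core.data.Mutation
import org.apache.hadoop.io.Text

// Plain String so the Spark closure serializes cleanly (Text is not Serializable)
val tableName = ds.getSpatioTemporalIdxTableName(sft)

inputRDD.map(toFeature)
  .flatMap(toMutations)                // hypothetical: SimpleFeature => Seq[Mutation]
  .map(m => (new Text(tableName), m))  // key each mutation by its target table
  .saveAsNewAPIHadoopFile(
    "/unused",                         // output path is ignored by AccumuloOutputFormat
    classOf[Text],
    classOf[Mutation],
    classOf[AccumuloOutputFormat],
    job.getConfiguration)
```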

Thanks
-- Adam
