Marcel,
 That makes sense.  Does your cluster have YARN available?  I wonder
    if you can try out Spark 1.3.1 running on YARN.  Alternatively, you
    could try installing Spark 1.3.1 on your own.
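 For example, if YARN is available, something along these lines might work on the Java side (just a sketch, not tested against your cluster; it assumes the Hadoop/YARN client configuration is on the classpath and that a Spark 1.3.1 assembly is available to YARN):

    // Sketch only: point the SparkConf at YARN instead of the standalone master.
    // Assumes HADOOP_CONF_DIR / YARN_CONF_DIR point at the cluster configuration.
    SparkConf yarnConf = new SparkConf(true)
        .setMaster("yarn-client")   // instead of "spark://node1-scads02:7077"
        .setAppName("countryWithMostEvent");
    SparkConf sc2 = GeoMesaSpark.init(yarnConf, ds);
    SparkContext sparkContext = new SparkContext(sc2);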
 
 Thanks for sticking with it.
 
 Cheers,
 
 Jim
 
 
 On 07/27/2015 01:56 PM, Marcel wrote:
 
      
      You're right, the problem is fixed in version 1.3.1. But when
      executing, a serialVersionUID mismatch appears due to the different
      Spark versions (1.3.1 as a Maven dependency and 1.3.0 installed via
      Cloudera Manager). In local mode everything is fine :/
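
      A quick way to confirm which Spark version the driver actually loads at runtime (just a sketch, using the sparkContext from the code further down the thread), so it can be compared with what the cluster's web UI reports:

          // Sketch: print the Spark version found on the driver's classpath.
          System.out.println("Driver Spark version: " + sparkContext.version());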
 Marcel Jacob.
 
 
 On 27.07.2015 16:36, Jim Hughes wrote:
 
        
        Hi Marcel,
 Spark is a 'provided' dependency for GeoMesa, so you should be
        able to use slightly different versions of Spark and still see
        things work.
 
 To that end, I'd suggest trying out Spark 1.3.1.  It should have
        the fix for the issue you linked to, and it will not require
        you to rebuild Spark or work through issues like that.
 
 Thanks for the great questions.  Feel free to post more complete
        stack traces.  They help us figure out issues more quickly, and
        they also make it easier for other users to find threads like this one.
 
 Cheers,
 
 Jim
 
 
 On 07/27/2015 07:16 AM, Marcel wrote:
 
          
          Hello,
 I've found the problem. The feature name 'gdelt' was wrong.
 
 But unfortunately I've got another problem using Spark with
          GeoMesa. I'm not quite sure where the error comes from, but I
          assume it's a problem with Spark.
 A ClassNotFoundException is thrown with the following message:
          "Failed to register classes with Kryo".
 Please have a look at https://github.com/apache/spark/pull/4258.
 A solution is described there, but I'm not sure how to use
          this "patch".
 
 I'm using Spark version 1.3.0, and it's not possible for me to
          update my version because I use GeoMesa.
 
 Thanks in advance,
 Marcel Jacob.
 
 
 On 22.07.2015 18:38, Jim Hughes wrote:
 
            
            Hi Marcel,
 From a quick look, I'm guessing that your DataStore is
            null.  I'd suggest adding a quick check to see if 'ds' is
            null.  You don't need to specify the 'featureName' to get a
            datastore.  I don't know if that would hurt anything, but
            I'd suggest removing it.
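
 A quick check like this (just a sketch) would surface that:

     // Sketch: fail fast if DataStoreFinder could not match the connection parameters.
     AccumuloDataStore ds = (AccumuloDataStore) DataStoreFinder.getDataStore(map);
     if (ds == null) {
         throw new IllegalStateException("Could not load a data store with parameters: " + map);
     }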
 
 Other than that, you can double-check the settings you are
            passing by using the GeoMesa tools (http://www.geomesa.org/geomesa-tools-features/)
            like 'list' and 'describe'.  You can also use the Accumulo
            shell to scan the 'gdelt' table to make sure that sensible
            metadata is present in that table.
 
 Let us know how getting a DataStore in this code works out
            for you.  I'll add the idea of a Java GeoMesaSpark
            tutorial/example project to our list of additions to make.
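
 In the meantime, here is a rough, untested sketch of what a transformation over the resulting RDD<SimpleFeature> could look like in Java (the attribute name comes from your query; the rest is only illustrative):

     // Sketch only: count features per Actor1CountryCode.
     // Assumes 'actorResultRDD' is the RDD<SimpleFeature> returned by GeoMesaSpark.rdd(...)
     // and uses org.apache.spark.api.java.JavaRDD and
     // org.apache.spark.api.java.function.Function.
     JavaRDD<SimpleFeature> features = actorResultRDD.toJavaRDD();

     Map<String, Long> countsByCountry = features
         .map(new Function<SimpleFeature, String>() {
             public String call(SimpleFeature f) {
                 Object code = f.getAttribute("Actor1CountryCode");
                 return code == null ? "UNKNOWN" : code.toString();
             }
         })
         .countByValue();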
 
 Cheers,
 
 Jim
 
 
 On 07/22/2015 09:13 AM, Marcel wrote:
 
              
              Hey,
 I'm trying to retrieve an RDD using the GeoMesaSpark class.
              Unfortunately, a NullPointerException is thrown during
              execution of this method:
 
 GeoMesaSpark.rdd(conf, sparkContext, ds, query1);

 It says SimpleFeatureType.encodeType throws this exception. Is something
              wrong with my datastore or my arguments? Here is my code:

     // Connection parameters for the Accumulo data store.
     Map<String, String> map = new HashMap<String, String>();
     map.put("instanceId", "accumulo");
     map.put("zookeepers", "node1-scads02:2181");
     map.put("user", "user");
     map.put("password", "password");
     map.put("tableName", "gdelt");
     map.put("featureName", "event");

     AccumuloDataStore ds = (AccumuloDataStore) DataStoreFinder.getDataStore(map);

     // Spark configuration using Kryo serialization, initialized for GeoMesa.
     SparkConf sc = new SparkConf(true);
     sc.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
     sc.set("spark.kryo.serializer.buffer.mb", "24");

     Configuration conf = new Configuration();
     SparkConf sc2 = GeoMesaSpark.init(sc, ds);
     SparkContext sparkContext = new SparkContext("spark://node1-scads02:7077",
         "countryWithMostEvent", sc2);

     // Query the 'gdelt' feature for the two country-code attributes.
     Filter f = Filter.INCLUDE;
     Query query1 = new Query("gdelt", f,
         new String[]{"Actor1CountryCode", "Actor2CountryCode"});

     RDD<SimpleFeature> actorResultRDD = GeoMesaSpark.rdd(conf, sparkContext, ds, query1);

 Thanks again.
 
 PS: It would be great if anybody could post a working
                GeoMesaSpark example in Java, including an RDD
                transformation.
 
 Best regards
 Marcel Jacob.
 
 
 
 _______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
http://www.locationtech.org/mailman/listinfo/geomesa-users 
 
 