All:

Ok, more details after sifting through the logs. This appears to be a dependency issue. In the interpreter log I am seeing:
I am back with what I hope is an easy question. I am attempting to follow the instructions for getting geomesa to work with Zeppelin. I have successfully run the Zeppelin Tutorial/Basic Features (spark) notebook configured with a master of yarn-client. However, as soon as I add the geomesa external dependency, either as a Maven artifact or an absolute path, the default tutorial breaks. I get the following stack trace no matter what Spark code I run:
             
            
              java.lang.NullPointerException
              	at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
              	at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
              	at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_2(SparkInterpreter.java:398)
              	at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:387)
              	at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:146)
              	at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:843)
              	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
              	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:491)
              	at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
              	at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
              	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
              	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
              	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
              	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
              	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
              	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
              	at java.lang.Thread.run(Thread.java:748)
            
If I remove the external dependency, the Basic Features code works like a champ. I have geomesa 1.3.4 on my cluster, so I have used the following dependency in Zeppelin:
org.locationtech.geomesa:geomesa-accumulo-spark-runtime_2.11:1.3.4
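One generic sanity check worth running first (this is just an illustration, not something from the geomesa or Zeppelin docs): the `_2.11` suffix on that artifact has to match the Scala version that Zeppelin's Spark interpreter actually runs on, or classloading falls apart in exactly this kind of opaque way. A quick `%spark` paragraph will print both versions:

```scala
%spark
// Illustrative sanity check: the geomesa artifact's _2.11 suffix must match
// the interpreter's Scala version, and the geomesa release must target the
// Spark version deployed on the cluster.
println(scala.util.Properties.versionString)  // Scala version of the interpreter
println(sc.version)                           // Spark version on the cluster
```

(This relies on Zeppelin's injected `sc`, so it only runs inside a working interpreter session, i.e. with the dependency removed.)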
              
I have also tried versions 1.3.5 and 2.0.0. I have tried absolute paths in addition to the Maven artifact. I figure this is something I missed in setting up Zeppelin, but I am hoping that someone on this list has seen the issue.

Below is the sample code I am trying to run, which does not even invoke geomesa:
             
%spark
import org.apache.commons.io.IOUtils
import java.net.URL
import java.nio.charset.Charset

// Zeppelin creates and injects sc (SparkContext) and
// sqlContext (HiveContext or SQLContext),
// so you don't need to create them manually.

// load bank data
val bankText = sc.parallelize(
    IOUtils.toString(
        new URL("https://s3.amazonaws.com/apache-zeppelin/tutorial/bank/bank.csv"),
        Charset.forName("utf8")).split("\n"))

case class Bank(age: Integer, job: String, marital: String,
                education: String, balance: Integer)

val bank = bankText.map(s => s.split(";")).filter(s => s(0) != "\"age\"").map(
    s => Bank(s(0).toInt,
            s(1).replaceAll("\"", ""),
            s(2).replaceAll("\"", ""),
            s(3).replaceAll("\"", ""),
            s(5).replaceAll("\"", "").toInt
        )
).toDF()
bank.registerTempTable("bank")
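Since the NPE happens before any geomesa code runs, one plausible culprit is a class in the shaded geomesa runtime jar shadowing one of Zeppelin's or Spark's own dependencies on the interpreter classpath. A rough way to check (illustrative only; `IOUtils` here is just an example of a shared class, not a known offender) is to ask the JVM which jar a class was actually loaded from:

```scala
%spark
// Illustrative diagnostic: print which jar a shared class was loaded from,
// to spot versions pulled in unexpectedly by an added dependency.
val src = classOf[org.apache.commons.io.IOUtils].getProtectionDomain.getCodeSource
println(if (src == null) "bootstrap classpath" else src.getLocation)
```

Comparing that output with and without the geomesa dependency added would show whether the jar is displacing something the interpreter needs.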
            
            
              
            -- 
========= mailto:dboyd@xxxxxxxxxxxxxxxxx ============
David W. Boyd                     
VP,  Data Solutions       
10432 Balls Ford, Suite 240  
Manassas, VA 20109         
office:   +1-703-552-2862        
cell:     +1-703-402-7908
============== http://www.incadencecorp.com/ ============
ISO/IEC JTC1 WG9, editor ISO/IEC 20547 Big Data Reference Architecture
Chair ANSI/INCITS TC Big Data
Co-chair NIST Big Data Public Working Group Reference Architecture
First Robotic Mentor - FRC, FTC - www.iliterobotics.org
Board Member- USSTEM Foundation - www.usstem.org
            _______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://dev.locationtech.org/mailman/listinfo/geomesa-users