Error while running Spark java program

2015-08-18T15:50:25

I am trying to run a standalone Spark Java program using Maven, but I am facing issues while building and running the project, as shown below.

Spark version: 1.1.0

Commands used:

    mvn clean package
    mvn exec:java -Dexec.mainClass="SimpleApp"

After running mvn exec:java -Dexec.mainClass="SimpleApp", I get the error below.
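For context, the exec-maven-plugin section of a pom.xml for this kind of setup looks roughly like the sketch below. This is not my exact file; the plugin version and the cleanupDaemonThreads option are assumptions (Spark leaves non-daemon threads running, which is why the plugin can fail to destroy its thread group on exit):

    <!-- Hypothetical exec-maven-plugin configuration; version and options are assumptions -->
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <version>1.3.2</version>
      <configuration>
        <mainClass>SimpleApp</mainClass>
        <!-- Spark's lingering non-daemon threads can trigger
             IllegalThreadStateException when the plugin tries to
             tear down its thread group; this disables that cleanup -->
        <cleanupDaemonThreads>false</cleanupDaemonThreads>
      </configuration>
    </plugin>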

[WARNING] Couldn't destroy threadgroup  org.codehaus.mojo.exec.ExecJavaMojo$IsolatedThreadGroup[name=SimpleApp,maxpri=10]
java.lang.IllegalThreadStateException
at java.lang.ThreadGroup.destroy(ThreadGroup.java:754)
at org.codehaus.mojo.exec.ExecJavaMojo.execute(ExecJavaMojo.java:328)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)

Copyright License:
Author: Gaurav A. Reproduced under the CC BY-SA 4.0 license, with a link to the original source and disclaimer.
Link to: https://stackoverflow.com/questions/32066603/error-while-running-spark-java-program
