
Problem:

My Spark application runs without any problem in local mode, but it fails when I try to submit it to my Spark cluster.

The error message is as follows:

20/01/05 15:42:06 WARN scheduler.TaskSetManager: Lost task 2.0 in stage 0.0 (TID 2, cluster-node-02): java.lang.ExceptionInInitializerError
    at GroupEvolutionES$$anonfun$6.apply(GroupEvolutionES.scala:579)
    at GroupEvolutionES$$anonfun$6.apply(GroupEvolutionES.scala:579)
    at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:390)
    at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1595)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1157)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1157)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)

In the error message above, GroupEvolutionES is my main class. The message says "A master URL must be set in your configuration", but I have already passed the "--master" parameter to spark-submit.

Can anybody help me fix this problem?

My Spark version: 1.6.1



1 Answer


Solution:

For a standalone cluster, "spark.master" takes a URL of the form spark://HOST:PORT. The code below builds a session without specifying a master, so it expects a "spark.master" value (the HOST:PORT of your running cluster) to already be present in your Spark configuration:

SparkSession myspark = SparkSession
    .builder()
    .appName("SomeMyAppName")
    .getOrCreate();   // fails with the above exception if no master URL can be found

Here "org.apache.spark.SparkException: A master URL must be set in your configuration" trying to say that HOST:PORT is not set in your spark configuration file.

If you do not want to point at a HOST:PORT at all and just run locally, set "spark.master" to local:

SparkSession myspark = SparkSession
    .builder()
    .appName("SomeMyAppName")
    .config("spark.master", "local")   // the property is "spark.master"; .master("local") does the same
    .getOrCreate();
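
Note that SparkSession only exists from Spark 2.0 onwards; since the question mentions Spark 1.6.1, the equivalent fix there goes on the SparkConf. A minimal Scala sketch, assuming a bare application skeleton rather than the original GroupEvolutionES code:

import org.apache.spark.{SparkConf, SparkContext}

// Spark 1.6-style setup: "local[*]" uses one worker thread per local core.
val conf = new SparkConf()
  .setAppName("SomeMyAppName")
  .setMaster("local[*]")   // for cluster runs, omit this and pass --master to spark-submit

val sc = new SparkContext(conf)

Keep in mind that a master set directly in code takes precedence over the --master flag passed to spark-submit, which in turn takes precedence over spark-defaults.conf, so for cluster submissions it is usually better to leave the master out of the code and let spark-submit supply it.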

 

