
Problem:

I am getting the below error while trying to run PySpark on my MacBook Air:

Exception: Java gateway process exited before sending the driver its port number



2 Answers


Solution:

One possible reason for this error is that JAVA_HOME is not set; another is that Java is not installed at all.
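Before changing anything, it helps to confirm which of the two causes applies. The sketch below (my own diagnostic, not part of PySpark) checks whether `java` is on the PATH and whether `JAVA_HOME` is set:

```python
import os
import shutil

def java_diagnostics():
    """Return (java_path, java_home) to check the two usual culprits
    behind the 'Java gateway process exited' error."""
    java_path = shutil.which("java")         # None if java is not on PATH
    java_home = os.environ.get("JAVA_HOME")  # None if JAVA_HOME is unset
    return java_path, java_home

java_path, java_home = java_diagnostics()
print("java on PATH:", java_path or "NOT FOUND")
print("JAVA_HOME   :", java_home or "NOT SET")
```

If either line prints NOT FOUND / NOT SET, install Java or export `JAVA_HOME` before launching PySpark.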

I had the same issue, as shown below:

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/launcher/Main : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:643)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:296)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:406)
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/opt/spark/python/pyspark/conf.py", line 104, in __init__
    SparkContext._ensure_initialized()
  File "/opt/spark/python/pyspark/context.py", line 243, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/opt/spark/python/pyspark/java_gateway.py", line 94, in launch_gateway
    raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number

The exception occurred at sc = pyspark.SparkConf(). I solved it by running:

sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java8-installer
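After installing Java 8, PySpark still needs to find it. A minimal sketch for pointing `JAVA_HOME` at the new JDK from within a Python session; the `/usr/lib/jvm/java-8-*` glob is an assumption about where the Ubuntu installer puts the JDK, so adjust it to the actual path on your machine:

```python
import glob
import os

# Assumed install location for Java 8 on Ubuntu; verify on your system.
candidates = glob.glob("/usr/lib/jvm/java-8-*")
if candidates:
    # Set JAVA_HOME for this process (and any PySpark it launches).
    os.environ["JAVA_HOME"] = candidates[0]
print("JAVA_HOME =", os.environ.get("JAVA_HOME", "NOT SET"))
```

To make the setting permanent, export `JAVA_HOME` from your shell profile instead.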

Further reading:

https://github.com/jupyter/notebook/issues/743


Solution:

One solution is to add pyspark-shell to the shell environment variable PYSPARK_SUBMIT_ARGS:

export PYSPARK_SUBMIT_ARGS="--master local[2] pyspark-shell"

There is a change in python/pyspark/java_gateway.py which asserts that PYSPARK_SUBMIT_ARGS includes pyspark-shell whenever a user sets the PYSPARK_SUBMIT_ARGS variable.
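If you prefer to keep the setting inside your script or notebook rather than the shell, you can set the variable from Python before PySpark launches the gateway. A minimal sketch (the `--master local[2]` value is taken from the export above):

```python
import os

# Must be set before the SparkContext is created; the trailing
# "pyspark-shell" token is the part java_gateway.py checks for.
os.environ["PYSPARK_SUBMIT_ARGS"] = "--master local[2] pyspark-shell"

print(os.environ["PYSPARK_SUBMIT_ARGS"])
```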

One probable cause is that JAVA_HOME is not set because Java is not installed.

I encountered a similar problem. It reports:

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/launcher/Main : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:643)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:296)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:406)
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/opt/spark/python/pyspark/conf.py", line 104, in __init__
    SparkContext._ensure_initialized()
  File "/opt/spark/python/pyspark/context.py", line 243, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/opt/spark/python/pyspark/java_gateway.py", line 94, in launch_gateway
    raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number

The exception occurred at sc = pyspark.SparkConf(). I resolved it by running:

sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java8-installer

What was missing in my case was setting the master URL in the $PYSPARK_SUBMIT_ARGS environment variable, like this (assuming you use bash):

export PYSPARK_SUBMIT_ARGS="--master spark://<host>:<port>"

For example:

export PYSPARK_SUBMIT_ARGS="--master spark://192.168.2.40:7077"

You can place this in your .bashrc file. You will find the correct URL in the log for the Spark master (the location of this log is reported when you start the master with /sbin/start_master.sh).
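If you want to pull the URL out of the master log programmatically, a small sketch like the one below works; the sample log line is illustrative, not taken from a real log:

```python
import re

# A line of the shape the Spark master prints at startup (illustrative).
sample = "INFO Master: Starting Spark master at spark://192.168.2.40:7077"

def extract_master_url(log_line):
    """Pull the spark://host:port URL out of a master log line,
    or return None if the line does not contain one."""
    m = re.search(r"spark://[\w.\-]+:\d+", log_line)
    return m.group(0) if m else None

print(extract_master_url(sample))  # spark://192.168.2.40:7077
```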

After spending many hours trying diverse solutions, I can confirm that the Java 10 SDK causes this error. On Mac, navigate to /Library/Java/JavaVirtualMachines, then run this command to uninstall JDK 10 completely:

sudo rm -rf jdk-10.jdk/
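Before deleting anything, it is worth listing what is actually installed. A sketch that enumerates the JDK bundles in the standard macOS location (it returns an empty list on systems without that directory, e.g. Linux):

```python
import os

# Standard macOS location for installed JDK bundles.
JVM_DIR = "/Library/Java/JavaVirtualMachines"

def list_installed_jdks(jvm_dir=JVM_DIR):
    """List JDK bundles so you can spot a jdk-10.jdk that needs removing.
    Returns [] if the directory does not exist."""
    if not os.path.isdir(jvm_dir):
        return []
    return sorted(os.listdir(jvm_dir))

for jdk in list_installed_jdks():
    print(jdk)
```

If `jdk-10.jdk` appears in the output, remove it as shown above and keep a Java 8 JDK.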

