+2 votes
12 views

Problem:

I'm hitting a Spark connection issue: when the notebook starts, no SparkContext is created, so sc is undefined.

The command I use to launch the IPython notebook:

ipython notebook --profile=pyspark

Environment:
Mac OS
Python 2.7.10
Spark 1.4.1
java version "1.8.0_65"

Can anyone explain or help?

asked by (2.6k points)

1 Answer

+2 votes

Answer:

If you set PYSPARK_SUBMIT_ARGS yourself, you must include "pyspark-shell" as the final token. Without it, PySpark never launches the JVM gateway, so no SparkContext is created and sc stays undefined.

For instance:

import os

os.environ['PYSPARK_SUBMIT_ARGS'] = (
    '--master mymaster --total-executor-cores 2 '
    '--conf "spark.driver.extraJavaOptions='
    '-Dhttp.proxyHost=proxy.mycorp.com -Dhttp.proxyPort=1234 '
    '-Dhttp.nonProxyHosts=localhost|.mycorp.com|127.0.0.1 '
    '-Dhttps.proxyHost=proxy.mycorp.com -Dhttps.proxyPort=1234 '
    '-Dhttps.nonProxyHosts=localhost|.mycorp.com|127.0.0.1" '
    'pyspark-shell'
)

Note that "pyspark-shell" sits outside the quoted spark.driver.extraJavaOptions value, as the last token of the whole string.
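Since the usual failure mode is forgetting that trailing token, a small sanity check can catch it before the notebook starts. This is a minimal sketch; valid_submit_args is a hypothetical helper, not part of PySpark, and "--master local[2]" is just a placeholder value:

```python
import os

# Hypothetical helper: guard against the common mistake of setting
# PYSPARK_SUBMIT_ARGS without the trailing "pyspark-shell" token.
def valid_submit_args(value):
    """Return True if the PYSPARK_SUBMIT_ARGS value ends with 'pyspark-shell'."""
    tokens = value.strip().split()
    return bool(tokens) and tokens[-1] == "pyspark-shell"

os.environ['PYSPARK_SUBMIT_ARGS'] = "--master local[2] pyspark-shell"
print(valid_submit_args(os.environ['PYSPARK_SUBMIT_ARGS']))  # True

# A value missing the token would fail the check:
print(valid_submit_args("--master local[2] --total-executor-cores 2"))  # False
```

Running a check like this at the top of the notebook profile makes the misconfiguration fail loudly instead of silently leaving sc undefined.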

 

answered by (4.5k points)