
Problem:

I am hitting a Spark connection issue: the SparkContext is not available as sc in my notebook.

The command to initialize ipython notebook:

ipython notebook --profile=pyspark

Environment:
Mac OS
Python 2.7.10
Spark 1.4.1
java version "1.8.0_65"

Can anyone explain or help?



1 Answer


Answer:

If you set PYSPARK_SUBMIT_ARGS yourself, you must include "pyspark-shell" as the last token, otherwise the Java gateway will not start and no SparkContext is created.

For instance:

import os

# Build the value piece by piece so the nested quotes stay valid Python.
# Note: the flag is --total-executor-cores, and "pyspark-shell" must come last.
os.environ['PYSPARK_SUBMIT_ARGS'] = (
    '--master mymaster '
    '--total-executor-cores 2 '
    '--conf "spark.driver.extraJavaOptions='
    '-Dhttp.proxyHost=proxy.mycorp.com -Dhttp.proxyPort=1234 '
    '-Dhttp.nonProxyHosts=localhost|.mycorp.com|127.0.0.1 '
    '-Dhttps.proxyHost=proxy.mycorp.com -Dhttps.proxyPort=1234 '
    '-Dhttps.nonProxyHosts=localhost|.mycorp.com|127.0.0.1" '
    'pyspark-shell'
)
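As a quick sanity check before launching the notebook, you can verify that the variable is set and that "pyspark-shell" really is the final token. This is a minimal sketch using a hypothetical placeholder value; substitute your own master URL and flags:

```python
import os

# Hypothetical minimal value for illustration; your real flags go
# before the trailing "pyspark-shell".
os.environ['PYSPARK_SUBMIT_ARGS'] = '--master local[2] pyspark-shell'

# The gateway only starts correctly when "pyspark-shell" is last.
tokens = os.environ['PYSPARK_SUBMIT_ARGS'].split()
assert tokens[-1] == 'pyspark-shell', 'append pyspark-shell at the end'
print('PYSPARK_SUBMIT_ARGS looks OK')
```

If the assertion fails, fix the variable before starting ipython notebook --profile=pyspark; otherwise you will see the "Java gateway process exited before sending the driver its port number" error.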

 

