Question
Here are the installed kernels:
$ jupyter kernelspec list
Available kernels:
  apache_toree_scala    /usr/local/share/jupyter/kernels/apache_toree_scala
  apache_toree_sql      /usr/local/share/jupyter/kernels/apache_toree_sql
  pyspark3kernel        /usr/local/share/jupyter/kernels/pyspark3kernel
  pysparkkernel         /usr/local/share/jupyter/kernels/pysparkkernel
  python3               /usr/local/share/jupyter/kernels/python3
  sparkkernel           /usr/local/share/jupyter/kernels/sparkkernel
  sparkrkernel          /usr/local/share/jupyter/kernels/sparkrkernel
A new notebook was created but fails with:

The code failed because of a fatal error:
Error sending http request and maximum retry encountered..

There is no [error] message in the Jupyter console.
Answer 1:
If you use sparkmagic to connect your Jupyter notebook to Spark, you also need to start Livy, the REST API service that sparkmagic uses to talk to your Spark cluster:
- Download Livy from the Apache Livy site and unzip it
- Check that the SPARK_HOME environment variable is set; if not, set it to your Spark installation directory
- Run the Livy server with <livy_home>/bin/livy-server in the shell/command line (see the sketch below)
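For concreteness, here is a minimal shell sketch of those three steps. The archive name, the /opt install location, and the SPARK_HOME path are placeholders for this example; substitute your own.

# Unzip the release downloaded from the Apache Livy site (file name is a placeholder)
unzip apache-livy-0.7.1-incubating-bin.zip -d /opt
export LIVY_HOME=/opt/apache-livy-0.7.1-incubating-bin

# Point SPARK_HOME at your Spark installation if it is not already set
export SPARK_HOME=/usr/local/spark

# Make sure the default log directory exists, then start the server (runs in the foreground)
mkdir -p "$LIVY_HOME"/logs
"$LIVY_HOME"/bin/livy-server

# From another shell: Livy listens on port 8998 by default; an empty session list means it is up
curl http://localhost:8998/sessions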
Now go back to your notebook; you should be able to run Spark code in a cell.
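If Livy is up but the kernel still reports "Error sending http request", also check the endpoint sparkmagic is configured to call. Below is a minimal sketch modeled on sparkmagic's example_config.json, assuming Livy at its default http://localhost:8998 with no authentication:

mkdir -p ~/.sparkmagic
cat > ~/.sparkmagic/config.json <<'EOF'
{
  "kernel_python_credentials": {
    "username": "",
    "password": "",
    "url": "http://localhost:8998",
    "auth": "None"
  },
  "kernel_scala_credentials": {
    "username": "",
    "password": "",
    "url": "http://localhost:8998",
    "auth": "None"
  }
}
EOF

Restart the notebook kernel after editing the file so sparkmagic picks up the new endpoint.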
Source: https://stackoverflow.com/questions/54965621/pyspark-pyspark-kernels-not-working-in-jupyter-notebook