I wanted to install pyspark on my home machine. I did

    pip install pyspark
    pip install jupyter

Both seemed to work well.
PySpark from PyPI (i.e. installed with pip) does not contain the full PySpark functionality; it is only intended for use with a Spark installation in an already existing cluster [EDIT: or in local mode only - see accepted answer]. From the docs:
The Python packaging for Spark is not intended to replace all of the other use cases. This Python packaged version of Spark is suitable for interacting with an existing cluster (be it Spark standalone, YARN, or Mesos) - but does not contain the tools required to setup your own standalone Spark cluster. You can download the full version of Spark from the Apache Spark downloads page.
NOTE: If you are using this with a Spark standalone cluster you must ensure that the version (including minor version) matches or you may experience odd errors
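To check the version match that NOTE warns about, you can print the version of the pip-installed package and compare it against what your cluster reports (for example in its web UI). A minimal sketch:

    # Version of the pip-installed PySpark package; the minor version
    # should match the Spark version running on the standalone cluster.
    import pyspark
    print(pyspark.__version__)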
You should download a full Spark distribution from the Apache Spark downloads page, as described here.
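That said, per the EDIT above, the pip-installed package is enough for local mode: it ships with an embedded Spark, so nothing else needs to be downloaded. A minimal sketch of running it locally (this assumes a compatible JDK is already on your PATH, which PySpark requires):

    # Local mode: the pip-installed PySpark is self-contained here,
    # so no separate cluster or Spark download is needed for this to run.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .master("local[*]")          # use all local cores, no cluster required
             .appName("pip-pyspark-demo")
             .getOrCreate())

    # Build and display a tiny DataFrame to confirm the installation works.
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "letter"])
    df.show()

    spark.stop()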