Spark Exception: Python in worker has different version 3.4 than that in driver 3.5
Question: I am using Amazon EC2, with the master and development server on one instance and a single worker on another instance. I am new to this, but I have managed to get Spark working in standalone mode, and now I am trying cluster mode. The master and worker are active (I can see the web UI for them and they are functioning). I have Spark 2.0, and I have installed the latest Anaconda 4.1.1, which comes with Python 3.5.2. On both the worker and the master, if I go into pyspark and run sys.version_info, I get