Spark pyspark vs spark-submit
Question: The documentation on spark-submit says the following:

> The spark-submit script in Spark's bin directory is used to launch applications on a cluster.

Regarding pyspark it says the following:

> You can also use bin/pyspark to launch an interactive Python shell.

This question may sound stupid, but when I run commands through pyspark, they also run on the "cluster", right? They do not run on the master node only, right?

Answer 1: There is no practical difference between these two. If not
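To make the comparison concrete: where the work actually runs is governed by the `--master` option (or the `spark.master` setting), not by which launcher script you use. A minimal sketch, assuming a YARN cluster manager (`yarn` and `my_app.py` are placeholders; adjust to your setup):

```shell
# Interactive shell: the driver process starts where you type this,
# but jobs are distributed to executors on the cluster named by --master.
bin/pyspark --master yarn

# Batch submission goes through the same machinery with the same option.
bin/spark-submit --master yarn my_app.py
```

If no master is specified, both fall back to running locally, which is likely the source of the confusion in the question.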