If I start up pyspark and then run this command:
import my_script; spark = my_script.Sparker(sc); spark.collapse('./data/')
Everything works fine.
If you have built a Spark application, you need to use spark-submit to run it. The application code can be written in either Python or Scala, and it can run in either local or cluster mode. If you just want to test or run a few individual commands, you can use the shell provided by Spark (pyspark for Python, spark-shell for Scala).
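As a sketch of the two workflows, assuming your code lives in my_script.py (the module imported above) and that it builds its own SparkContext in a `__main__` block when submitted; the `--master` value here is illustrative:

```shell
# Interactive testing: the pyspark shell starts up and provides `sc` for you,
# so you can import your module and call it directly, as in the question
pyspark --master "local[*]"

# Running a packaged application: spark-submit launches the script,
# which must create its own SparkContext (no `sc` is predefined)
spark-submit --master "local[*]" my_script.py ./data/
```

In cluster mode you would replace `local[*]` with the cluster's master URL (for example `spark://host:7077`, or `yarn` when running on YARN).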