I'm trying to use spark-submit to execute my Python code on a Spark cluster.
Generally we run spark-submit with Python code like below.
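As a minimal sketch (the file name args.py and its contents are assumptions for illustration, not from the original post), such a driver script simply reads whatever follows it on the spark-submit command line from sys.argv:

```python
# args.py -- illustrative sketch: arguments passed after the script name
# on the spark-submit command line show up in sys.argv.
import sys

if __name__ == "__main__":
    # For "spark-submit args.py a b c d e", sys.argv[1:] is ['a', 'b', 'c', 'd', 'e'].
    print("script arguments:", sys.argv[1:])
```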
Aniket Kulkarni's spark-submit args.py a b c d e
seems to suffice, but it's worth mentioning that we had issues with optional/named args (e.g. --param1).
It appears that a double dash (--) helps signal that the Python script's optional args follow:
spark-submit --sparkarg xxx yourscript.py -- --scriptarg 1 arg1 arg2
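For completeness, here is a rough sketch of how yourscript.py might consume those trailing arguments (the argparse usage below is an assumption for illustration; also, depending on the Spark version the literal -- may or may not be forwarded to the script, so it is stripped defensively before parsing):

```python
# yourscript.py -- illustrative sketch of reading the args passed after "--"
import argparse
import sys

def main():
    argv = sys.argv[1:]
    # If the literal "--" separator is forwarded to the script, drop it so
    # argparse still treats --scriptarg as an option rather than a positional.
    if argv and argv[0] == "--":
        argv = argv[1:]

    parser = argparse.ArgumentParser()
    parser.add_argument("--scriptarg", type=int, default=0)  # named/optional arg
    parser.add_argument("positionals", nargs="*")            # arg1, arg2, ...
    args = parser.parse_args(argv)

    print("scriptarg =", args.scriptarg)
    print("positionals =", args.positionals)

if __name__ == "__main__":
    main()
```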