Running a Python Spark application via API call - submitting the application returns "Failed SSH into the Worker"
My Python application exists in
I just created /tmp/spark-events on the {master} node and then distributed it to the other nodes in the cluster to get it working:
mkdir /tmp/spark-events
rsync -a /tmp/spark-events/ {slaves}:/tmp/spark-events/
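Since the directory is empty at this point, it could also be created directly on each node instead of being synced over; a minimal sketch, assuming the workers (worker1 and worker2 are placeholder hostnames) are reachable over passwordless SSH:

for host in worker1 worker2; do
    # create the Spark event log directory on each worker node
    ssh "$host" 'mkdir -p /tmp/spark-events'
done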
My spark-defaults.conf:
spark.history.ui.port=18080
spark.eventLog.enabled=true
spark.history.fs.logDirectory=hdfs:///home/elon/spark/events
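From the Spark docs, spark.eventLog.enabled=true makes applications write their event logs to spark.eventLog.dir (which defaults to file:///tmp/spark-events), while spark.history.fs.logDirectory is only where the history server reads them from, so with the config above the logs would not land in the HDFS path. A sketch of what I believe the two settings should look like when they point at the same HDFS location (only the hdfs:///home/elon/spark/events path is taken from above; everything else is an assumption):

spark.eventLog.enabled=true
spark.eventLog.dir=hdfs:///home/elon/spark/events
spark.history.fs.logDirectory=hdfs:///home/elon/spark/events
spark.history.ui.port=18080

The HDFS directory also has to exist before an application is submitted, e.g.:

hdfs dfs -mkdir -p /home/elon/spark/events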