Running a Python Spark application via API call: on submitting the application, the response is "Failed SSH into the Worker"
My Python application exists in
/tmp/spark-events is the location where Spark stores the event logs. Just create this directory on the master machine and you're set.
$ mkdir /tmp/spark-events
$ sudo /root/spark-ec2/copy-dir /tmp/spark-events/
RSYNC'ing /tmp/spark-events to slaves...
ec2-54-175-163-32.compute-1.amazonaws.com
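For context, here is a minimal PySpark sketch showing how that directory is typically wired up through the event-logging settings. The app name, master URL, and the use of the default /tmp/spark-events path are assumptions for illustration, not taken from the original post.

    from pyspark import SparkConf, SparkContext

    # Assumed configuration: enable event logging and point it at the
    # directory created above. Adjust the master URL to your cluster.
    conf = (SparkConf()
            .setAppName("example-app")                      # placeholder name
            .set("spark.eventLog.enabled", "true")
            .set("spark.eventLog.dir", "file:///tmp/spark-events"))

    sc = SparkContext(conf=conf)

    # Trivial job so the application produces an event log entry.
    rdd = sc.parallelize(range(10))
    print(rdd.sum())

    sc.stop()

If the directory does not exist on the node where the driver runs, Spark fails at startup when it tries to write the event log, which is why creating it (and copying it to the slaves, as above) resolves the error.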