I need to measure the execution time of a query on Apache Spark (Bluemix). What I tried:
import time
startTimeQuery = time.clock()
df = sqlContext.sql(query)
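For reference, a minimal sketch of timing the query from the driver side (reusing the `sqlContext` and `query` variables from the snippet above, which are assumed to exist): note that `sqlContext.sql()` is lazy and only builds the query plan, so an action such as `count()` is needed to force execution, and `time.time()` is used because `time.clock()` was removed in Python 3.8.

```python
import time

startTimeQuery = time.time()   # wall-clock start; time.clock() no longer exists in Python 3.8+
df = sqlContext.sql(query)     # lazy: this only builds the logical plan, no work happens yet
df.count()                     # an action forces Spark to actually execute the query
endTimeQuery = time.time()

print("Query took %.3f seconds" % (endTimeQuery - startTimeQuery))
```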
Spark itself provides much more granular information about each stage of your Spark job.
You can view your running job at http://IP-MasterNode:4040, or you can enable the History Server to analyze the jobs at a later time.
Refer to the Spark monitoring documentation for more info on the History Server.
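As a rough sketch of what enabling the History Server typically involves (the property names are standard Spark settings, but the log directory path below is only an example and a managed service such as Bluemix may preconfigure this for you):

```
# conf/spark-defaults.conf -- turn on event logging so finished jobs can be replayed later
spark.eventLog.enabled           true
spark.eventLog.dir               file:/tmp/spark-events        # example path; use a shared/HDFS path on a cluster
spark.history.fs.logDirectory    file:/tmp/spark-events

# then start the History Server and browse to port 18080:
# ./sbin/start-history-server.sh
```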