When using Spark 1.6.2 with PySpark, I saw this:
where the active tasks show as a negative number (the difference of the total tasks from the completed tasks).
This is a known Spark issue. It occurs when executors restart after failures. A JIRA ticket has already been filed for it; see https://issues.apache.org/jira/browse/SPARK-10141 for more details.
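If you want to detect affected executors programmatically rather than eyeballing the UI, the same counters are exposed by Spark's monitoring REST API (`GET /api/v1/applications/<app-id>/executors`). The sketch below is illustrative: the field names follow the `ExecutorSummary` schema, but the sample payload and values are made up, and in practice you would fetch the JSON from your cluster instead of hard-coding it.

```python
import json

# Hypothetical sample of what the executors endpoint might return.
# Field names match Spark's ExecutorSummary; the values are fabricated
# to show the SPARK-10141 symptom (activeTasks going negative).
sample = json.loads("""
[
  {"id": "driver", "activeTasks": 0,  "completedTasks": 12, "totalTasks": 12},
  {"id": "1",      "activeTasks": -3, "completedTasks": 40, "totalTasks": 37}
]
""")

def negative_active(executors):
    """Return the ids of executors whose reported activeTasks is negative,
    i.e. the symptom described in SPARK-10141."""
    return [e["id"] for e in executors if e["activeTasks"] < 0]

print(negative_active(sample))  # ['1'] -- executor hit by the bug
```

Since the bug is purely a bookkeeping error in the UI counters, a negative value here does not by itself mean running tasks were lost; it usually just means an executor died and its in-flight task count was decremented past zero.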