When I execute a SQL query via spark-submit or spark-sql, the corresponding Spark application always fails with an error like the following:
15/03/10 18:50:52 INFO util.AkkaUtil
Finally I found the reason: YARN kills the executor (container) because the executor exceeds its memory overhead limit. Just turn up the values of spark.yarn.driver.memoryOverhead or spark.yarn.executor.memoryOverhead, or both.
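As a sketch, these properties can be passed with `--conf` when launching the job (the memory values and application details below are placeholders, not taken from the original post):

```shell
# Hypothetical example: raise the off-heap overhead YARN allows per container.
# On older Spark versions these keys are spark.yarn.{driver,executor}.memoryOverhead
# (values in MB); adjust to whatever your workload actually needs.
spark-submit \
  --master yarn \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  --conf spark.yarn.driver.memoryOverhead=1024 \
  --class com.example.MyApp \   # placeholder class name
  my-app.jar                    # placeholder jar
```

The same keys can also be set in spark-defaults.conf if you want them applied to every job.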
In my case, I resolved the problem by increasing the number of parallel tasks that read the data into the RDD, so each task holds a smaller slice of the data and the per-executor memory pressure drops.
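One way to get more read tasks without touching the code is to raise the default parallelism at submit time; a minimal sketch, assuming the job picks up spark.default.parallelism for its input RDDs (the value 200 is just an illustration):

```shell
# Hypothetical example: more partitions per input means less data per task,
# which lowers each executor's peak memory usage.
spark-submit \
  --master yarn \
  --conf spark.default.parallelism=200 \
  my-app.jar                    # placeholder jar
```

If you control the code, passing an explicit minPartitions argument to the read call (e.g. sc.textFile(path, minPartitions)) achieves the same effect per input.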