I am trying to use Spark Streaming (version 1.1.0) with Kafka, but the Spark job keeps crashing with this error:
14/11/21 12:39:23 ERROR TaskSetManager: Tas
Have you tried persisting the input stream with MEMORY_AND_DISK_SER?

inputs.persist(StorageLevel.MEMORY_AND_DISK_SER)
A similar issue ("Could not compute split, block not found") is discussed here: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-Could-not-compute-split-block-not-found-td11186.html
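For context, here is a minimal sketch of where that persist call would sit, assuming a receiver-based KafkaUtils.createStream setup; the ZooKeeper quorum, consumer group, and topic name below are placeholders, not taken from your job:

    import org.apache.spark.SparkConf
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object KafkaPersistExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("KafkaPersistExample")
        val ssc = new StreamingContext(conf, Seconds(10))

        // Receiver-based Kafka stream; quorum, group id and topic map are assumptions.
        val inputs = KafkaUtils.createStream(
          ssc,
          "zkhost:2181",         // ZooKeeper quorum (placeholder)
          "my-consumer-group",   // consumer group id (placeholder)
          Map("my-topic" -> 1)   // topic -> number of receiver threads (placeholder)
        )

        // Keep received blocks serialized in memory and spill to disk rather than
        // dropping them, so later tasks can still find the blocks they need.
        inputs.persist(StorageLevel.MEMORY_AND_DISK_SER)

        // Simple downstream computation on the message values.
        inputs.map(_._2).count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

Note that createStream also takes an optional StorageLevel argument, so you could pass the level directly there instead of calling persist afterwards.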