Apache Spark Scala exception handling
**Question:** How do I do exception handling in Spark/Scala for invalid records? Here is my code:

```scala
import org.apache.spark.sql.Row
import spark.implicits._

val rawData = sc.textFile(file)
val rowRDD = rawData.map(line => Row.fromSeq(line.split(",")))
val rowRDMapped = rowRDD.map { x => (x.getString(1), x.getString(10)) }
val DF = rowRDMapped.toDF("ID", "name")
```

Everything works fine if the input data is well-formed, but if a line does not have enough fields, I get an ArrayIndexOutOfBoundsException. I am trying to put a try-catch around the mapping, but I am not able to skip the records with invalid data via try-catch.
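One common way to skip malformed records is to wrap the per-record parsing in `scala.util.Try` and keep only the successes, so bad lines are dropped instead of throwing. Below is a minimal sketch, assuming a `SparkSession` in scope as `spark` and the same column indices (1 and 10) as the snippet above:

```scala
import scala.util.Try
import spark.implicits._

val parsed = rawData.flatMap { line =>
  Try {
    val fields = line.split(",")
    // Throws ArrayIndexOutOfBoundsException on lines with too few fields
    (fields(1), fields(10))
  }.toOption // Failure becomes None, so flatMap silently skips the record
}

val df = parsed.toDF("ID", "name")
```

`flatMap` over an `Option` keeps `Some` values and discards `None`, which is what makes the invalid records disappear. If the only failure mode is too few fields, a simpler alternative is to filter on the split array's length before indexing into it.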