Question
1) Initially filtered null values out of the RDD:
val rddWithOutNull2 = rddSlices.filter(x => x(0) != null)
2) Then converted this RDD to an RDD of Row.
3) Then converted the RDD of Row to a DataFrame using Scala:
val df = spark.createDataFrame(rddRow,schema)
df.printSchema()
Output:
root
|-- name: string (nullable = false)
println(df.count())
Output:
Error while executing count:
[Stage 11:==================================> (3 + 2) / 5][error] o.a.s.e.Executor - Exception in task 4.0 in stage 11.0 (TID 16)
java.lang.IndexOutOfBoundsException: 0
- No other Spark SQL functions work on this DataFrame either (a minimal reproduction sketch is shown below).
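For context, here is a minimal sketch that can reproduce this failure, assuming each slice is a Seq[String] and one of the slices is empty; the question does not show how rddSlices, rddRow, or the schema were built, so those pieces and the sample data ("alice", "bob") are assumptions:

import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

val spark = SparkSession.builder().master("local[*]").appName("repro").getOrCreate()

// Assumed input: slices of strings, one of them empty.
val rddSlices = spark.sparkContext.parallelize(Seq(
  Seq("alice"),
  Seq.empty[String],   // x(0) on this element throws java.lang.IndexOutOfBoundsException: 0
  Seq("bob")
))

// The filter from the question only checks for null, not for emptiness,
// and it is lazy: the exception only surfaces when an action runs the tasks.
val rddWithOutNull2 = rddSlices.filter(x => x(0) != null)
val rddRow = rddWithOutNull2.map(x => Row(x(0)))

val schema = StructType(Seq(StructField("name", StringType, nullable = false)))
val df = spark.createDataFrame(rddRow, schema)

df.printSchema()     // works: only needs the schema, no Spark job is run
println(df.count())  // fails: executing the tasks evaluates x(0) on the empty slice

This also explains why printSchema succeeds while count and "every other" Spark SQL function fail: printSchema never touches the data, whereas any action re-runs the lazy filter and hits the empty slice.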
Answer 1:
Agreeing with the comments, the problem seems to be in x(0): if there is an empty row, x(0) will throw that exception. One solution (depending on the type of the variable x) is to check the first element with headOption:
val rddWithOutNull2 = rddSlices.filter(_.headOption.isDefined)
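Plugged into the reproduction sketch above (same assumed names), the pipeline then completes. Note that headOption.isDefined only guards against empty slices; if the intent of the original x(0) != null filter should also be preserved, one option is to combine both checks, for example:

// Drops empty slices; exists(_ != null) additionally drops slices whose first
// element is null, preserving the intent of the original x(0) != null filter.
val rddWithOutNull2 = rddSlices.filter(_.headOption.exists(_ != null))
val rddRow = rddWithOutNull2.map(x => Row(x(0)))
val df = spark.createDataFrame(rddRow, schema)
println(df.count())  // runs without IndexOutOfBoundsException (2 rows for the sample data above)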
Source: https://stackoverflow.com/questions/48416695/spark-dataframe-count-function-and-many-more-functions-throw-indexoutofboundsexc