Access Array column in Spark
A Spark DataFrame contains a column of type `Array[Double]`. When I try to read the value back in a `map()` function, it throws a `ClassCastException`. The following Scala code reproduces the exception:

```scala
case class Dummy(x: Array[Double])

val df = sqlContext.createDataFrame(Seq(Dummy(Array(1, 2, 3))))
val s = df.map(r => {
  val arr: Array[Double] = r.getAs[Array[Double]]("x")
  arr.sum
})
s.foreach(println)
```

The exception is:

```
java.lang.ClassCastException: scala.collection.mutable.WrappedArray$ofRef cannot be cast to [D
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:24)
    at $iwC$$iwC$$iwC
```
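For context, the stack trace suggests Spark hands the column back as a `scala.collection.mutable.WrappedArray` (which is a `Seq`) rather than a raw JVM `Array`, so the `Array[Double]` cast fails. A minimal sketch of what I believe is going on, without Spark (the `sumStored` helper is hypothetical, standing in for what `r.getAs[Seq[Double]]("x")` would do on a `Row`):

```scala
object WrappedArrayDemo {
  // Mirrors reading the column as a Seq instead of an Array:
  // WrappedArray extends Seq, so this cast succeeds where
  // asInstanceOf[Array[Double]] would throw ClassCastException.
  def sumStored(stored: Any): Double =
    stored.asInstanceOf[Seq[Double]].sum

  def main(args: Array[String]): Unit = {
    // Array(...).toSeq produces an array-backed Seq wrapper,
    // analogous to what Spark appears to store in the Row.
    val stored: Any = Array(1.0, 2.0, 3.0).toSeq
    println(sumStored(stored))
  }
}
```

Under this assumption, `r.getAs[Seq[Double]]("x")` inside the `map()` would be the shape of a workaround.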