A Spark DataFrame contains a column of type Array[Double]. It throws a ClassCastException when I try to get it back in a map() function.
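The ClassCastException usually comes from reading the array column back as Array[Double]: Spark exposes ArrayType columns to Scala code as a Seq (a WrappedArray or ArraySeq under the hood, depending on the Scala version), so a cast such as row.getAs[Array[Double]] fails at runtime. A minimal sketch of reading such a column with getSeq, assuming an active SparkSession named spark and a hypothetical column called "values":

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("array-column").getOrCreate()
import spark.implicits._

// Hypothetical DataFrame with an Array[Double] column named "values"
val df = Seq(("a", Seq(1.0, 2.0)), ("b", Seq(3.0, 4.0))).toDF("id", "values")

// Read the array back as Seq[Double] instead of casting it to Array[Double]
val sums = df.collect().map(row => row.getSeq[Double](1).sum)
sums.foreach(println)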
This approach can also be considered:

// Assumes an active SparkSession; toDF needs its implicits in scope:
// import spark.implicits._
case class StudentInfo(fName: String, lName: String, subjects: Seq[String])

val tuples = Seq(("Abhishek", "Sengupta", Seq("MATH", "PHYSICS")))
val dF = tuples.toDF("firstName", "lastName", "subjects")

// Collect the rows to the driver and map each Row onto the case class
val students = dF
  .collect()
  .map(row => StudentInfo(row.getString(0), row.getString(1), row.getSeq[String](2)))

students.foreach(println)
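If the goal is to work with StudentInfo without collecting every row to the driver, a typed Dataset is an alternative worth considering. A sketch, assuming StudentInfo is defined as a top-level case class and spark.implicits._ is in scope; note that as[...] matches columns to case class fields by name, so the columns are renamed first:

// Rename columns so they line up with the case class fields (fName, lName, subjects)
val studentDs = dF
  .withColumnRenamed("firstName", "fName")
  .withColumnRenamed("lastName", "lName")
  .as[StudentInfo]

studentDs.show()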