Access Array column in Spark

Backend · Unresolved · 2 answers · 428 views
时光取名叫无心 asked 2020-12-01 15:16

A Spark DataFrame contains a column of type Array[Double]. It throws a ClassCastException when I try to read it back inside a map() function. The following Scala code generates the exception.
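This exception typically happens because Spark materializes array columns as a Seq (a WrappedArray under the hood), not as a JVM Array[Double], so a direct cast fails. A minimal sketch of the failure and the fix, assuming a hypothetical column named "values":

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("array-column")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// Hypothetical DataFrame with an array-of-doubles column named "values".
val df = Seq((1, Seq(1.0, 2.0)), (2, Seq(3.0, 4.0))).toDF("id", "values")

// Fails: the cell is backed by a WrappedArray, so casting it to
// Array[Double] throws ClassCastException:
// df.map(row => row.getAs[Array[Double]]("values").sum)

// Works: read the cell as Seq[Double], converting only if an Array is needed.
val sums = df.map(row => row.getAs[Seq[Double]]("values").toArray.sum)
sums.show()
```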

2 Answers
  •  爱一瞬间的悲伤
    2020-12-01 15:32

    This approach can also be considered:

      // Requires a SparkSession in scope; toDF needs its implicits.
      import spark.implicits._

      val tuples = Seq(("Abhishek", "Sengupta", Seq("MATH", "PHYSICS")))
      val dF = tuples.toDF("firstName", "lastName", "subjects")
    
      case class StudentInfo(fName: String, lName: String, subjects: Seq[String])
    
      // collect() brings every row to the driver; each Row is then mapped
      // into the case class. getSeq needs its element type spelled out.
      val students = dF
        .collect()
        .map(row => StudentInfo(row.getString(0), row.getString(1), row.getSeq[String](2)))
    
      students.foreach(println)
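A caveat with the approach above: collect() pulls every row onto the driver. A sketch of the same conversion that keeps the data distributed by mapping into a typed Dataset instead (assumes a SparkSession and `import spark.implicits._` are in scope, and that StudentInfo is defined at top level so Spark can derive an encoder for it):

```scala
case class StudentInfo(fName: String, lName: String, subjects: Seq[String])

// map on the DataFrame yields a Dataset[StudentInfo]; nothing is
// collected to the driver until an action such as show() runs.
val students = Seq(("Abhishek", "Sengupta", Seq("MATH", "PHYSICS")))
  .toDF("firstName", "lastName", "subjects")
  .map(row => StudentInfo(row.getString(0), row.getString(1), row.getSeq[String](2)))

students.show(truncate = false)
```

Note that `dF.as[StudentInfo]` would not work directly here because the column names (firstName, lastName) differ from the case-class field names, which is why the explicit map is used.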
    
