scala - Spark: How to union all DataFrames in a loop

抹茶落季 2020-12-14 22:41

Is there a way to build a single DataFrame by unioning DataFrames inside a loop?

Here is some sample code:

var fruits = List(
  "apple"
  ,"orange"
  ,"melon"
)
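
One common way to do this (a minimal sketch, not from the original post: the object name UnionInLoop, the local SparkSession, and the three-column schema are assumptions for illustration) is to create one DataFrame per element and fold the list together with union:

import org.apache.spark.sql.{DataFrame, SparkSession}

object UnionInLoop {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("union-in-loop")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val fruits = List("apple", "orange", "melon")

    // One small DataFrame per fruit; in practice each might come from a different source.
    val dfs: List[DataFrame] = fruits.map(f => Seq(("aaa", "bbb", f)).toDF("aCol", "bCol", "name"))

    // Repeatedly union the DataFrames into a single one.
    val unioned = dfs.reduce(_ union _)
    unioned.show()

    spark.stop()
  }
}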


        
6 Answers
  •  无人及你
    2020-12-14 23:09

You can first build up a sequence in the loop and then call toDF on it to create the DataFrame.

    scala> var dseq : Seq[(String,String,String)] = Seq[(String,String,String)]()
    dseq: Seq[(String, String, String)] = List()
    
    scala> for ( x <- fruits){
         |  dseq = dseq :+ ("aaa","bbb",x)
         | }
    
    scala> dseq
    res2: Seq[(String, String, String)] = List((aaa,bbb,apple), (aaa,bbb,orange), (aaa,bbb,melon))
    
    scala> val df = dseq.toDF("aCol","bCol","name")
    df: org.apache.spark.sql.DataFrame = [aCol: string, bCol: string, name: string]
    
    scala> df.show
    +----+----+------+
    |aCol|bCol|  name|
    +----+----+------+
    | aaa| bbb| apple|
    | aaa| bbb|orange|
    | aaa| bbb| melon|
    +----+----+------+
    
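    Note that the spark-shell pre-imports spark.implicits._, which is what makes toDF available on a Seq. In a standalone application you need that import yourself; a minimal sketch, assuming a local SparkSession (the app name and master are placeholders):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("seq-to-df")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val fruits = List("apple", "orange", "melon")

    // Accumulate plain tuples and convert once at the end -- usually cheaper
    // than calling union on many tiny DataFrames inside the loop.
    val rows = fruits.map(f => ("aaa", "bbb", f))
    val df = rows.toDF("aCol", "bCol", "name")
    df.show()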
