Spark Row to JSON

Backend · Unresolved · 4 answers · 2056 views
灰色年华 2020-11-30 03:19

I would like to create JSON from a Spark v1.6 DataFrame (using Scala). I know that there is the simple solution of doing df.toJSON.

However, my probl…

4 Answers
  •  忘掉有多难
    2020-11-30 04:11

    Here is a solution that uses no JSON parser and adapts to your schema:

    import org.apache.spark.sql.functions.{col, concat, concat_ws, lit}

    df.select(
      // Keep the first two columns as-is.
      col(df.columns(0)),
      col(df.columns(1)),
      // Assemble the remaining columns into a JSON object string.
      concat(
        lit("{"),
        concat_ws(",", df.dtypes.slice(2, df.dtypes.length).map { case (c, t) =>
          // Quote string-typed values; leave other types bare.
          val quote = if (t == "StringType") "\"" else ""
          concat(
            lit("\"" + c + "\":" + quote),
            col(c),
            lit(quote)
          )
        }: _*),
        lit("}")
      ) as "C"
    ).collect()
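    The per-row string assembly above can be sketched in plain Scala without Spark, assuming a toy schema of hypothetical (columnName, typeName) pairs; `rowToJson` is an illustrative helper, not a Spark API:

    ```scala
    // Plain-Scala sketch of the same string assembly (no Spark).
    // Assumes a toy schema of (columnName, typeName) pairs and one row
    // of values already rendered as strings.
    object JsonConcatSketch {
      def rowToJson(schema: Seq[(String, String)], row: Seq[String]): String =
        schema.zip(row).map { case ((name, tpe), value) =>
          // Quote string-typed values; leave other types bare,
          // mirroring the quote logic in the Spark answer above.
          val quoted = if (tpe == "StringType") "\"" + value + "\"" else value
          "\"" + name + "\":" + quoted
        }.mkString("{", ",", "}")

      def main(args: Array[String]): Unit = {
        val schema = Seq(("name", "StringType"), ("age", "IntegerType"))
        println(rowToJson(schema, Seq("Alice", "30")))
        // → {"name":"Alice","age":30}
      }
    }
    ```

    Note that, like the Spark version, this does no escaping of quotes or special characters inside string values, so it is only safe for data known not to contain them.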
    
