Converting a DataSet to a JSON Array in Spark using Scala

Submitted by 空扰寡人 on 2019-12-02 02:53:03
  1. You can use collect_list or collect_set on the mal_name column, depending on whether you want duplicates kept.
  2. You can save the DataFrame/Dataset directly as a JSON file.
import org.apache.spark.sql.functions.{collect_list, collect_set}
import spark.implicits._

rawData.groupBy($"file_md5")
  .agg(collect_set($"mal_name").alias("mal_name"))
  .write
  .format("json")
  .save("json/file/location/to/save")

As suggested by @mrsrinivas, I changed my code as below:

import org.apache.spark.sql.functions.{asc, collect_list}

val mat2 = rawData.select(rawData("file_md5"), rawData("mal_name")).distinct().orderBy(asc("file_md5")).cache()
val labeledDf = mat2.toDF("file_md5", "mal_name")
labeledDf.groupBy($"file_md5")
  .agg(collect_list($"mal_name"))
  .coalesce(1)
  .write
  .format("json")
  .save("/home/umesh/Documents/Demo2/src/test/run8/")
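
Note that Spark's JSON writer emits one JSON object per line (JSON Lines), not a single JSON array. If a literal JSON array string is required, one possible approach (my own sketch, only workable when the aggregated result is small enough to fit on the driver) is to collect the rows as JSON strings and join them:

val jsonArray = labeledDf
  .groupBy($"file_md5")
  .agg(collect_list($"mal_name").alias("mal_name"))
  .toJSON                    // Dataset[String], one JSON object per row
  .collect()                 // pulls the rows to the driver; small results only
  .mkString("[", ",", "]")   // wrap the objects into a single JSON array string

// jsonArray can then be written to a single file with plain Scala/Java IO.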

Keeping this question open for some more suggestions, if any.
