SPARK: How to parse an Array of JSON objects using Spark

灰色年华 2021-01-14 12:29

I have a file with normal columns and a column that contains a JSON string, as shown below (picture also attached). Each row actually belongs to a column named Demo (not Vis

2 Answers
  •  温柔的废话
    2021-01-14 13:22

    Aleh, thank you for the answer. It works fine. I did the solution in a slightly different way because I am using Spark 2.3.3.

    import org.apache.spark.sql.functions.{col, expr, from_json}
    import org.apache.spark.sql.types.{ArrayType, StringType, StructField, StructType}
    
    // Schema for the JSON column: an array of {key, value} string pairs
    val sch = ArrayType(StructType(Array(
      StructField("key", StringType, true),
      StructField("value", StringType, true)
    )))
    
    // Parse the JSON string column into an array of structs
    val jsonDF3 = mdf.select(from_json(col("jsonString"), sch).alias("Demographics"))
    
    // Pull individual values out by position (assumes every row lists keys in the same order)
    val jsonDF4 = jsonDF3.withColumn("device_kind", expr("Demographics[0].value"))
      .withColumn("country_code", expr("Demographics[1].value"))
      .withColumn("device_platform", expr("Demographics[2].value"))
    
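    Note that indexing by position (`Demographics[0]`, `Demographics[1]`, ...) only works if every row lists the key/value pairs in the same order. If that ordering is not guaranteed, one order-independent sketch (assuming the same `jsonDF3` and `Demographics` names as above) is to explode the array and pivot on the key:
    
    import org.apache.spark.sql.functions.{col, explode, first, monotonically_increasing_id}
    
    // Tag each row with an id so it survives the explode, then pivot key -> value
    val withId = jsonDF3.withColumn("row_id", monotonically_increasing_id())
    val byKey = withId
      .select(col("row_id"), explode(col("Demographics")).alias("kv"))
      .groupBy("row_id")
      .pivot("kv.key")           // one output column per distinct key
      .agg(first("kv.value"))
    
    This is heavier than direct indexing (pivot triggers a shuffle), so prefer the positional version when the key order is fixed.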
