Disable parquet metadata summary in Spark

既然无缘 · 2021-01-02 10:30

I have a Spark job (on 1.4.1) receiving a stream of Kafka events. I would like to save them continuously as Parquet on Tachyon.

val lines = KafkaUtils.createStream(ssc, zkQuorum, group, topicMap)
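
The snippet above is cut off in the original post; below is a minimal sketch of what such a job could look like on Spark 1.4.1, assuming the receiver-based KafkaUtils.createStream API, with the ZooKeeper quorum, consumer group, topic map, and tachyon:// output path all hypothetical placeholders:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

val sparkConf = new SparkConf().setAppName("KafkaToParquet")
val ssc = new StreamingContext(sparkConf, Seconds(10))
val sqlContext = new SQLContext(ssc.sparkContext)
import sqlContext.implicits._

// Hypothetical Kafka parameters: ZooKeeper quorum, consumer group, topic -> receiver threads
val lines = KafkaUtils.createStream(ssc, "zk1:2181", "events-group", Map("events" -> 1))
  .map(_._2) // keep the message value, drop the Kafka key

// Append each micro-batch to one Parquet directory on Tachyon.
// On 1.4.1 every append also rewrites the _metadata summary files,
// which is the overhead this question is about.
lines.foreachRDD { rdd =>
  if (!rdd.isEmpty()) {
    rdd.toDF("line").write.mode("append").parquet("tachyon://master:19998/events")
  }
}

ssc.start()
ssc.awaitTermination()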

2 Answers
  •  陌清茗 (OP) · 2021-01-02 11:36

    Spark 2.0 no longer writes Parquet metadata summary files by default; see SPARK-15719.
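
    For Spark releases before 2.0, such as the 1.4.1 in the question, the summary files can be switched off through Parquet's own Hadoop setting. This workaround is not part of the original answer, so treat it as a sketch of the commonly used approach:

    // Stop Parquet writing the _metadata / _common_metadata summary files
    // (they are still written by default before Spark 2.0)
    sc.hadoopConfiguration.set("parquet.enable.summary-metadata", "false")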

    If you are working with data hosted in S3, you may still find Parquet performance hurt by Parquet itself trying to read the tail of every object to check its schema. That can be disabled explicitly:

    sparkConf.set("spark.sql.parquet.mergeSchema", "false")
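
    The same behaviour can also be disabled per read rather than globally, assuming Spark 1.5 or later where the Parquet reader's mergeSchema option exists; the bucket and path here are hypothetical:

    sqlContext.read.option("mergeSchema", "false").parquet("s3a://bucket/events")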
    
