How to split parquet files into many partitions in Spark?

萌比男神i · 2020-12-06 05:10

So I have just 1 parquet file I'm reading with Spark (using the SQL stuff) and I'd like it to be processed with 100 partitions. I've tried setting spark.default.parallelism, without success.

5 Answers

悲&欢浪女 · 2020-12-06 05:58

    The new way of doing it (Spark 2.x) is setting

    spark.sql.files.maxPartitionBytes
    

    Source: https://issues.apache.org/jira/browse/SPARK-17998 (the official documentation is not correct yet; it misses the .sql part of the key)

    From my experience, the Hadoop settings no longer have any effect.
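
    A minimal sketch of how that setting can be applied, assuming Spark 2.x and Scala; the file path and the ~1 GB file size are made-up values for illustration:

        import org.apache.spark.sql.SparkSession

        // Cap how many bytes Spark packs into a single input partition.
        // For a ~1 GB file (assumed), 10 MB per partition yields ~100 partitions.
        val spark = SparkSession.builder()
          .appName("SplitParquetDemo")
          .config("spark.sql.files.maxPartitionBytes", 10 * 1024 * 1024)
          .getOrCreate()

        val df = spark.read.parquet("/path/to/file.parquet") // hypothetical path
        println(df.rdd.getNumPartitions)                     // expect roughly 100

    The key takes a byte count (default 134217728, i.e. 128 MB), so dividing the file size by the desired partition count gives the value to set.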
