How to split parquet files into many partitions in Spark?

萌比男神i 2020-12-06 05:10

So I have just one parquet file I'm reading with Spark (using the SQL stuff) and I'd like it to be processed with 100 partitions. I've tried setting spark.default.parallelism, without success.
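
For context, I can get 100 partitions with an explicit repartition, as in the minimal sketch below (assuming the Spark 2.x DataFrame API; the path and app name are placeholders), but that costs a full shuffle, and I'd like the read itself to produce the partitions:

    import org.apache.spark.sql.SparkSession

    // Placeholder session and path, for illustration only.
    val spark = SparkSession.builder().appName("split-parquet").getOrCreate()
    val df = spark.read.parquet("/path/to/table.parquet")

    // repartition(100) shuffles the data into 100 roughly equal partitions.
    val df100 = df.repartition(100)
    println(df100.rdd.getNumPartitions) // prints 100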

5 Answers
  •  被撕碎了的回忆
    2020-12-06 05:40

    Maybe your parquet file only takes up one HDFS block. Create a big parquet file that spans many HDFS blocks and load it:

    // Spark 1.x reads parquet through SQLContext, not SparkContext.
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    val k = sqlContext.parquetFile("the-big-table.parquet")
    k.partitions.length  // one partition per HDFS block
    

    You'll see the same number of partitions as there are HDFS blocks. This worked fine for me (Spark 1.1.0).
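
    As a side note (my assumption, not something I tested on 1.1.0): newer Spark versions (2.0+) split parquet input by spark.sql.files.maxPartitionBytes rather than purely by HDFS block count, so lowering it yields more read partitions for the same file. A sketch, assuming a SparkSession named spark:

    // Ask for at most 16 MB per input partition instead of the 128 MB default.
    spark.conf.set("spark.sql.files.maxPartitionBytes", 16L * 1024 * 1024)
    val df = spark.read.parquet("the-big-table.parquet")
    println(df.rdd.getNumPartitions) // more partitions for the same file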
