Dynamically write parquets using pyspark

Asked by 野性不改 on 2020-12-09 01:42

Is there a way to dynamically size parquet output files on `dataframe.write` using PySpark? We have a generic job that writes many tables to S3; some of those tables are small, but some are large.
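One common approach (a sketch, not an official Spark feature) is to estimate the DataFrame's size and repartition so each output file lands near a target size before calling `df.write.parquet(...)`. The helper below is plain arithmetic; the Spark-specific pieces are shown in comments because they need a live `SparkSession`, and the 128 MB target is an assumed default you would tune per workload.

```python
import math

# Assumed target size per parquet file; tune for your tables (assumption).
TARGET_FILE_BYTES = 128 * 1024 * 1024  # ~128 MB


def target_partitions(estimated_bytes: int,
                      target_file_bytes: int = TARGET_FILE_BYTES) -> int:
    """Choose a partition count so each output file is roughly the target size.

    Always returns at least 1 so tiny tables still produce one file.
    """
    return max(1, math.ceil(estimated_bytes / target_file_bytes))


# Sketch of how this would plug into a generic writer job (requires Spark):
#
#     n = target_partitions(estimated_bytes)      # estimate from sampling,
#                                                 # catalog stats, or input size
#     df.repartition(n).write.parquet("s3://bucket/path")  # hypothetical path
#
# Alternatively, Spark can cap rows per file directly, which bounds file size
# without an explicit repartition:
#
#     spark.conf.set("spark.sql.files.maxRecordsPerFile", 1_000_000)
```

The `repartition` route gives files of roughly uniform size at the cost of a shuffle; `spark.sql.files.maxRecordsPerFile` avoids the shuffle but only caps the row count per file, so actual byte sizes still vary with row width.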
