When are files “splittable”?

无人共我 2020-12-17 03:04

When I'm using Spark, I sometimes run into one huge file in a Hive table, and sometimes I'm trying to process many smaller files in a Hive table.

1 Answer
  • 2020-12-17 03:24

    Considering that Spark accepts Hadoop input files, the question comes down to which Hadoop compression formats are splittable.

    Only bzip2-formatted files are splittable; other formats such as zlib, gzip, LZO, LZ4 and Snappy are not splittable (but see EDIT 2 below regarding LZO).
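
    A quick way to see the effect is to compare partition counts when reading the same data compressed with different codecs. This is only a sketch (the file paths are hypothetical): a gzip file comes back as a single partition, while a comparable bzip2 file can be split into several.

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SplittabilityCheck {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext("local[*]", "splittability-check");

            // Hypothetical paths: the same data compressed two different ways.
            JavaRDD<String> gz = sc.textFile("/data/events.log.gz");    // gzip: not splittable
            JavaRDD<String> bz2 = sc.textFile("/data/events.log.bz2");  // bzip2: splittable

            System.out.println("gzip partitions:  " + gz.getNumPartitions());   // expect 1
            System.out.println("bzip2 partitions: " + bz2.getNumPartitions());  // can be > 1 for large files

            sc.stop();
        }
    }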

    Regarding your query on partitioning: partitioning does not depend on the file format you use. It depends on the content of the file, i.e. the values of the partition column, such as a date.
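
    For example (a hypothetical sketch, with made-up column and path names), writing a DataFrame with partitionBy creates one directory per distinct value of the partition column, and that layout is the same whether the files inside are Parquet, ORC or plain text.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class PartitionByExample {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("partition-by-example")
                    .getOrCreate();

            // Hypothetical input; "event_date" is the partition column.
            Dataset<Row> events = spark.read().parquet("/data/events");

            // Produces directories such as /warehouse/events/event_date=2020-12-17/
            events.write()
                    .partitionBy("event_date")
                    .parquet("/warehouse/events");

            spark.stop();
        }
    }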

    EDIT 1: Have a look at this SE question and this working code for reading a zip file with Spark.

    import java.util.List;

    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.function.Function;

    import scala.Tuple2;

    // wholeTextFiles reads each file as a single (path, content) pair, which is
    // how a non-splittable archive is typically handled: one file per task.
    JavaPairRDD<String, String> fileNameContentsRDD = javaSparkContext.wholeTextFiles(args[0]);
    JavaRDD<String> lineCounts = fileNameContentsRDD.map(new Function<Tuple2<String, String>, String>() {
        @Override
        public String call(Tuple2<String, String> fileNameContent) throws Exception {
            String content = fileNameContent._2();
            int numLines = content.split("[\r\n]+").length;
            return fileNameContent._1() + ":  " + numLines;
        }
    });
    List<String> output = lineCounts.collect();
    

    EDIT 2:

    LZO files can in fact be splittable:

    LZO files can be split as long as the splits occur on block boundaries

    Refer to this article for more details.
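
    As a sketch of how that works in practice: this assumes the hadoop-lzo package is on the classpath (it provides com.hadoop.mapreduce.LzoTextInputFormat) and that an .index file has already been generated for the .lzo file, e.g. with that package's indexer; the path below is hypothetical.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    import com.hadoop.mapreduce.LzoTextInputFormat;

    public class ReadIndexedLzo {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext("local[*]", "read-indexed-lzo");

            // With a sibling .index file, splits can land on LZO block boundaries;
            // without one, the whole file ends up in a single split.
            JavaRDD<String> lines = sc
                    .newAPIHadoopFile("/data/events.lzo",      // hypothetical path
                            LzoTextInputFormat.class,
                            LongWritable.class,
                            Text.class,
                            new Configuration())
                    .values()
                    .map(Text::toString);

            System.out.println("lines: " + lines.count());
            sc.stop();
        }
    }

    Without the index the job still runs, but a single task processes the entire .lzo file.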
