Exporting Hive Table to a S3 bucket

无人及你  2020-12-29 05:29

I've created a Hive table through an Elastic MapReduce interactive session and populated it from a CSV file like this:

CREATE TABLE csvimport(id BIGINT, time STRING, log STRING)
row format delimited fields terminated by ',' lines terminated by '\n'
STORED AS TEXTFILE;

How can I export the table's contents to an S3 bucket?

3 Answers
  •  温柔的废话
    2020-12-29 06:04

    The query above needs to use the EXTERNAL keyword, i.e.:

    CREATE EXTERNAL TABLE csvexport ( id BIGINT, time STRING, log STRING ) 
    row format delimited fields terminated by ',' lines terminated by '\n' 
    STORED AS TEXTFILE LOCATION 's3n://bucket/directory/';
    INSERT OVERWRITE TABLE csvexport select id, time, log from csvimport;
    

    Another alternative is to use the query:

    INSERT OVERWRITE DIRECTORY 's3n://bucket/directory/'  select id, time, log from csvimport;
    

    With this form, the exported data is written to the S3 directory using Hive's default delimiters (Ctrl-A, \001, between fields).
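
    If a downstream consumer cannot handle the default \001 delimiter, Hive 0.11 and later also accept a ROW FORMAT clause directly on INSERT OVERWRITE DIRECTORY. A sketch, reusing the same csvimport table and a placeholder bucket path:

    -- Hive 0.11+: export with an explicit comma delimiter
    -- instead of the default Ctrl-A (\001)
    INSERT OVERWRITE DIRECTORY 's3n://bucket/directory/'
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    SELECT id, time, log FROM csvimport;

    On older Hive versions the EXTERNAL table approach above is the usual way to control the output delimiter.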
