How to import a zipped file into a Postgres table


Question


I would like to import a file into my PostgreSQL system (specifically Redshift). I have found an argument for COPY that allows importing a gzip file, but the provider of the data I am trying to include in my system only produces the data as a .zip. Are there any built-in Postgres commands for opening a .zip?


Answer 1:


From within Postgres:

COPY table_name FROM PROGRAM 'unzip -p input.csv.zip' DELIMITER ',';

From the man page for unzip -p:

-p     extract files to pipe (stdout).  Nothing but the file data is sent to stdout, and the files are always extracted  in  binary
       format, just as they are stored (no conversions).
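
Note that FROM PROGRAM requires PostgreSQL 9.3 or later and the right to run server-side programs (superuser, or membership in pg_execute_server_program on PostgreSQL 11+); Redshift's COPY does not support FROM PROGRAM at all. A minimal sketch of the same load with explicit CSV options, where the table and file names are placeholders:

-- Assumes PostgreSQL 9.3+; 'table_name' and 'input.csv.zip' are placeholders,
-- and the zip file must be readable by the server process.
COPY table_name FROM PROGRAM 'unzip -p input.csv.zip'
WITH (FORMAT csv, DELIMITER ',', HEADER);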



Answer 2:


Can you just do something like:

unzip -p myfile.zip | gzip > myfile.gz

This is easy enough to automate if you have many files.
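
A rough sketch of such a batch conversion, assuming each .zip holds a single data file (file names are placeholders):

# Re-compress every .zip in the current directory as .gz,
# which gzip-aware COPY commands can then read directly.
for f in *.zip; do
    unzip -p "$f" | gzip > "${f%.zip}.gz"
done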




Answer 3:


This might only work when loading Redshift from S3, but you can simply include the GZIP flag when copying data into Redshift tables, as described in the Redshift COPY documentation.

This is the format that works for me if my S3 bucket contains a gzipped .csv:

copy <table> from 's3://mybucket/<foldername>' credentials '<aws-auth-args>' delimiter ',' gzip;
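
If you authenticate with an IAM role instead of key-based credentials, the same load might look like the sketch below; the table name, bucket path, and role ARN are all placeholders:

-- Hypothetical Redshift load of a gzipped CSV using an IAM role.
copy my_table
from 's3://mybucket/data/part.csv.gz'
iam_role 'arn:aws:iam::123456789012:role/MyRedshiftRole'
delimiter ',' gzip;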



Answer 4:


unzip -p /path/to/.zip | psql -U user

The 'user' must have superuser rights, or you will get:

ERROR:  must be superuser to COPY to or from a file

To learn more about this, see:

https://www.postgresql.org/docs/8.0/static/backup.html

This approach is typically used when restoring large database dumps.
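
If you cannot get superuser rights, psql's client-side \copy avoids the error above, because the file is read by the client rather than by the server process. A minimal sketch, assuming a zipped CSV and a target table (all names are placeholders):

# \copy runs with the privileges of the connecting user only;
# 'data.csv.zip', 'my_table', and 'mydb' are placeholders.
unzip -p data.csv.zip | psql -U user -d mydb -c "\copy my_table from stdin with (format csv, delimiter ',')"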



Source: https://stackoverflow.com/questions/17750706/how-to-import-zipped-file-into-postgres-table
