Bulk loading into PostgreSQL from a remote client

Submitted by 旧城冷巷雨未停 on 2019-12-23 02:57:05

Question


I need to bulk load a large file into PostgreSQL. I would normally use the COPY command, but this file needs to be loaded from a remote client machine. With MSSQL, I can install the local tools and use bcp.exe on the client to connect to the server.

Is there an equivalent way for PostgreSQL? If not, what is the recommended way of loading a large file from a client machine if I cannot copy the file to the server first?

Thanks.


Answer 1:


The COPY command is supported by PostgreSQL protocol v3.0 (PostgreSQL 7.4 or newer).

The only thing you need to run COPY from a remote client is a libpq-enabled client, such as the psql command-line utility.

From the remote client, run:

$ psql -d dbname -h 192.168.1.1 -U uname < yourbigscript.sql
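For COPY to work over that connection, the script itself should use the COPY ... FROM STDIN form with the data inlined after the statement. A minimal sketch of what yourbigscript.sql might contain (the table and column names here are hypothetical; adjust them to your schema):

```sql
-- Hypothetical table "measurements" with two columns.
COPY measurements (sensor_id, reading) FROM STDIN WITH (FORMAT csv);
1,20.5
2,19.8
\.
```

psql sends the rows after the COPY statement over the wire and the `\.` line marks the end of the data.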



Answer 2:


You can use the \copy command from the psql tool, like:

psql -h IP_REMOTE_POSTGRESQL -d DATABASE -U USER_WITH_RIGHTS -c "\copy TABLE(FIELD_LIST_SEPARATE_BY_COMMA) from 'FILE_IN_CLIENT_MACHINE(MAYBE IN THE SAME DIRECTORY)' with csv header"



Answer 3:


Assuming you have some sort of client with which to run the query, you can use the COPY FROM STDIN form of the COPY command: http://www.postgresql.org/docs/current/static/sql-copy.html
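The same COPY FROM STDIN mechanism is available from client libraries, not just psql. A minimal sketch in Python, assuming the third-party psycopg2 driver; the table, column, and connection names are hypothetical placeholders:

```python
def build_copy_sql(table, columns):
    """Build a COPY ... FROM STDIN statement for the given table and columns."""
    col_list = ", ".join(columns)
    return f"COPY {table} ({col_list}) FROM STDIN WITH (FORMAT csv, HEADER true)"

def bulk_load(conn_params, table, columns, path):
    """Stream a local CSV file to a remote PostgreSQL server via COPY FROM STDIN."""
    import psycopg2  # third-party driver: pip install psycopg2-binary
    sql = build_copy_sql(table, columns)
    with psycopg2.connect(**conn_params) as conn:
        with conn.cursor() as cur, open(path) as f:
            # copy_expert reads from the local file object and sends the
            # rows over the wire, so the file never has to be on the server.
            cur.copy_expert(sql, f)
```

Usage would be something like bulk_load({"host": "192.168.1.1", "dbname": "dbname", "user": "uname"}, "measurements", ["sensor_id", "reading"], "file.csv").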




Answer 4:


Use psql's \copy command to load the data:

$ psql -h <IP> -p <port> -U <username> -d <database>

database# \copy schema.tablename from '/home/localdir/bulkdir/file.txt' delimiter as '|' 

database# \copy schema.tablename from '/home/localdir/bulkdir/file.txt' with csv header


Source: https://stackoverflow.com/questions/3602976/bulk-loading-into-postgresql-from-a-remote-client
