Dump to CSV/Postgres memory


Question


I have a large table (300 million rows) that I would like to dump to a CSV - I need to do some processing that cannot be done in SQL. Right now I am using Squirrel as a client, and as far as I can tell from my own (limited) experience it does not deal very well with large datasets. If I run the query on the actual host, will it use less memory? Thanks for any help.


Answer 1:


I'd bet it would. You can dump a table directly to a CSV file using COPY, and I don't think that would use much memory.
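
A minimal sketch of what that could look like, assuming a table named tablename (the table name and file paths here are placeholders). Note that COPY ... TO writes the file on the database server and normally requires superuser (or equivalent) privileges, while psql's \copy meta-command writes the file on the client machine:

-- Server-side dump: the file is created on the database server
COPY tablename TO '/tmp/tablename.csv' WITH CSV HEADER;

-- Client-side alternative from psql: the file is created where psql runs
\copy tablename TO 'tablename.csv' WITH CSV HEADER

Either way the rows are streamed out, so neither the server nor the client needs to hold the whole table in memory.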




Answer 2:


Try this:

COPY tablename
TO 'filename.csv'
WITH
      DELIMITER AS ','
      NULL AS ''
      CSV HEADER;
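
For reference, on PostgreSQL 9.0 and later the same command can also be written with the parenthesized option-list syntax (an equivalent restatement, not part of the original answer):

COPY tablename
TO 'filename.csv'
WITH (FORMAT csv, DELIMITER ',', NULL '', HEADER);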


Source: https://stackoverflow.com/questions/2768702/dump-to-csv-postgres-memory
