Handling backups of a large table (>1 TB) in Postgres?

Posted by 瘦欲 on 2019-12-25 03:19:08

Question


I have a 1 TB table (X) that is a pain to back up.

The table X contains historical log data that is not often updated after creation. We usually only access a single row at a time, so performance is still very good.

We currently take nightly full logical backups, but exclude X for the sake of backup time and space. We do not need historical backups of X, since the log files from which it is populated are themselves backed up. However, recovering X by re-processing those log files would take an unnecessarily long time.
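For reference, the nightly dump is essentially the following (a sketch; mydb and public.x are placeholder names, and --exclude-table-data could be swapped for --exclude-table to omit the table's schema as well):

    # Nightly logical backup that skips the data in the big log table X
    # (mydb and public.x are placeholders for the real names)
    pg_dump --format=custom \
            --exclude-table-data=public.x \
            --file=/backups/mydb_$(date +%F).dump \
            mydb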

I'd like to include X in our backup strategy so that our recovery time can be much faster. It doesn't seem feasible to include X in the nightly logical backup.

Ideally, I'd like a single full backup for X that is updated incrementally (purely to save time).

I lack the experience to investigate solutions on my own, so I'm wondering: what are my options?

Barman for incremental updates? Partition X? Both?
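To make the partitioning idea concrete: declarative range partitioning (Postgres 10+) on the log timestamp would look roughly like the sketch below. The column names are made up, and the existing table would still need its data migrated into the new partitioned structure:

    -- Hypothetical schema: X is assumed to have a created_at timestamp
    CREATE TABLE x (
        id         bigint,
        created_at timestamptz NOT NULL,
        payload    text
    ) PARTITION BY RANGE (created_at);

    -- One partition per month; since the data is append-mostly, only
    -- the current month's partition would ever change
    CREATE TABLE x_2019_12 PARTITION OF x
        FOR VALUES FROM ('2019-12-01') TO ('2020-01-01');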

After doing some more reading, I'm inclined to partition the table and write a nightly script that performs logical backups only on the changed partitions, replacing their previous backups (sketched below). However, recovery with pg_restore might still take a long time... Thoughts?
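A rough sketch of that nightly script, assuming monthly partitions named x_YYYY_MM as above, and that only the current month's partition receives writes (the database name and backup path are placeholders):

    #!/bin/bash
    # Re-dump only the partition for the current month; older
    # partitions never change, so their existing dumps stay valid.
    PART="x_$(date +%Y_%m)"
    pg_dump --format=custom \
            --table="public.${PART}" \
            --file="/backups/${PART}.dump" \
            mydb

Since each partition lands in its own custom-format dump, recovery could run several restores concurrently (e.g. one pg_restore --dbname=mydb /backups/x_2019_12.dump per partition), which is where I'd hope to win back restore time.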

Thanks!

Source: https://stackoverflow.com/questions/52770582/handling-backups-of-a-large-table-1-tb-in-postgres
