Best practices for daily MySQL (partial and filtered) replication?

Submitted by 孤街浪徒 on 2019-12-10 10:39:45

Question


I have a reasonably large database with > 40 tables. I only need to replicate a few of them (+/- 5), and each of those tables also needs to be filtered.

I'm looking for best practices for replicating this data (daily is enough), where I can select just a few tables and include a WHERE clause for each one.

I'm thinking of running mysqldump for each table (with a WHERE clause) and writing each table to a separate .sql file. I can then truncate all tables on the destination database (all data is overwritten daily) and run mysql to import each table's file separately.

Example:

   # dump each table
   mysqldump -u repl_user --where='id between 1000 and 1005' my_database my_table > my_table.sql
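
A minimal daily script along these lines could look like the sketch below. The host names, table list, and WHERE clauses are placeholders for illustration, not tested against your setup:

    #!/bin/sh
    # Sketch of a daily job: dump each filtered table from the source,
    # then truncate and reload it on the destination. Hosts, user,
    # table names and WHERE clauses are placeholders.
    set -e

    for spec in 'my_table:id between 1000 and 1005' \
                'my_other_table:created_at >= curdate()'; do
        table=${spec%%:*}
        filter=${spec#*:}

        # --no-create-info dumps data only; the destination tables
        # already exist and are emptied just before the import.
        mysqldump -u repl_user -h source_host --no-create-info \
            --where="$filter" my_database "$table" > "$table.sql"

        mysql -u repl_user -h dest_host my_database \
            -e "TRUNCATE TABLE $table"
        mysql -u repl_user -h dest_host my_database < "$table.sql"
    done

If the destination is queried while the job runs, one refinement is to load into staging tables and swap them in with RENAME TABLE, so readers never see a half-loaded table.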

I'm aware of replicating the full database and using the BLACKHOLE storage engine for the unwanted tables. But since 35 of the tables are not needed, that seems like overkill. Besides, some tables only need a filtered version, which I can't achieve with BLACKHOLE.

Any better solutions?


Answer 1:


MySQL natively supports replication filters, but only at the database or table level. This doesn't meet your requirement to filter a subset of rows from these tables.
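
For reference, the native filters are configured on the replica with options like these (the database and table names are placeholders), which shows why a row-level condition has nowhere to go:

    # my.cnf on the replica: filtering is per database/table only
    [mysqld]
    replicate-do-table      = my_database.my_table
    replicate-wild-do-table = my_database.order%
    # No option accepts a WHERE-style row condition.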

FlexViews is a tool that reads the binary log and replays only the changes relevant to keeping a materialized view up to date. You could define your materialized views in such a way as to implement your row-level filtering.
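
As a rough sketch of that approach, the FlexViews stored-procedure API is driven roughly as follows. The table and filter here are invented, and you should check the FlexViews manual for the exact procedure signatures in your version:

    -- Define an incrementally refreshed materialized view that keeps
    -- only the filtered rows of one source table.
    CALL flexviews.create('my_database', 'my_table_mv', 'INCREMENTAL');
    SET @mvid = flexviews.get_id('my_database', 'my_table_mv');
    CALL flexviews.add_table(@mvid, 'my_database', 'my_table', 't', NULL);
    CALL flexviews.add_expr(@mvid, 'COLUMN', 't.id', 'id');
    CALL flexviews.add_expr(@mvid, 'WHERE', 't.id BETWEEN 1000 AND 1005', 'flt');
    CALL flexviews.enable(@mvid);
    -- Periodically apply binlog changes captured by FlexViews' consumer:
    CALL flexviews.refresh(@mvid, 'BOTH', NULL);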



Source: https://stackoverflow.com/questions/23065411/best-practices-for-daily-mysql-partial-and-filtered-replication
