Question
I'm trying to implement something like https://www.elastic.co/blog/how-to-keep-elasticsearch-synchronized-with-a-relational-database-using-logstash, which uses the Logstash JDBC input and Elasticsearch output plugins.
I can make it work with simple queries, but it's getting harder to properly prepare (serialize) the data for output when multiple joins are involved. When the data comes from several tables and needs to be shaped correctly for ES, doing it all in a single SQL query becomes difficult.
I'm wondering if I could write the serialization part in Python rather than SQL. I could do this by running a periodic Celery task. Would that be feasible, or is there an actual advantage to doing it in Logstash? (One advantage I can see is that Logstash keeps track of the tracking_column; with Celery I'd have to implement that myself.)
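For what it's worth, the two pieces described above (re-implementing the tracking_column bookkeeping, and collapsing a multi-table join into nested ES documents) are small in plain Python. Below is a minimal sketch of that idea, not a definitive implementation: the schema, the `sync_to_es` name, and the `state` dict are all hypothetical, and a stdlib SQLite connection stands in for the real database so the example is self-contained.

```python
import sqlite3


def fetch_changed_rows(conn, since):
    """Join across tables and return only rows updated after `since`
    (this is what Logstash's tracking_column does for you)."""
    return conn.execute(
        """SELECT u.id, u.name, u.updated_at, o.item
           FROM users u LEFT JOIN orders o ON o.user_id = u.id
           WHERE u.updated_at > ?
           ORDER BY u.updated_at""",
        (since,),
    ).fetchall()


def serialize(rows):
    """Collapse the flat join result into one nested document per user --
    the part that is awkward to express in a single SQL query."""
    docs = {}
    for uid, name, _updated_at, item in rows:
        doc = docs.setdefault(uid, {"id": uid, "name": name, "orders": []})
        if item is not None:
            doc["orders"].append(item)
    return list(docs.values())


def sync_to_es(conn, state):
    """Body of the periodic task. In real code you would decorate this
    with @app.task, schedule it with Celery beat, persist `state`
    somewhere durable (Redis, a DB row), and ship `docs` to
    Elasticsearch with elasticsearch.helpers.bulk()."""
    rows = fetch_changed_rows(conn, state.get("last_seen", 0))
    docs = serialize(rows)
    if rows:
        # Advance our own tracking_column equivalent.
        state["last_seen"] = max(r[2] for r in rows)
    return docs
```

A design note: keeping `serialize` a pure function of the query rows makes it easy to unit-test the document shape without a database or an ES cluster, which is one real advantage of moving this logic out of SQL and into Python.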
Source: https://stackoverflow.com/questions/59116809/logstash-sync-database-with-elasticsearch