ElasticSearch river JDBC MySQL not deleting records

Asked by 春和景丽, 2020-12-16 08:01

I'm using the JDBC river plugin for ElasticSearch to index my MySQL database. It picks up new and changed records, but does not delete documents for records that have been removed from MySQL.

2 Answers
  •  轮回少年
    2020-12-16 08:23

    Since this question was asked, the parameters have changed considerably: versioning and digesting have been deprecated, and `poll` has been replaced by `schedule`, which takes a cron expression controlling how often the river reruns (the example below runs every 5 minutes).

        curl -XPUT 'localhost:9200/_river/account_river/_meta' -d '{
            "type" : "jdbc",
            "jdbc" : {
                "driver" : "com.mysql.jdbc.Driver",
                "url" : "jdbc:mysql://localhost:3306/test",
                "user" : "test_user",
                "password" : "test_pass",
                "sql" : "SELECT `account`.`id` as `_id`, `account`.`id`, `account`.`reference`, `account`.`company_name`, `account`.`also_known_as` from `account` WHERE NOT `account`.`deleted`",
                "strategy" : "simple",
                "schedule": "0 0/5 * * * ?" ,
                "autocommit" : true,
                "index" : "headphones",
                "type" : "Account"
            }
        }'
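The `WHERE NOT account.deleted` clause above assumes a soft-delete flag on the MySQL side: rows are marked rather than removed, so each river run simply skips them. A minimal sketch of that pattern, assuming the column does not exist yet and using an illustrative row id:

```shell
# Soft-delete setup sketch; credentials and database match the river
# definition above, the column name "deleted" matches its WHERE clause,
# and id 42 is purely illustrative.
mysql -u test_user -ptest_pass test <<'SQL'
ALTER TABLE `account` ADD COLUMN `deleted` TINYINT(1) NOT NULL DEFAULT 0;

-- "Deleting" a row is now an UPDATE, which the river's SELECT will
-- exclude on its next scheduled run:
UPDATE `account` SET `deleted` = 1 WHERE `id` = 42;
SQL
```

Note that this only keeps removed rows out of future river runs; documents already in the index remain there until the index itself is rebuilt.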
    

    As for the main question, the answer I got from the developer is here: https://github.com/jprante/elasticsearch-river-jdbc/issues/213

    Deletion of rows is no longer detected.

    I tried housekeeping with versioning, but this did not work well with incremental updates and adding rows.

    A good method would be windowed indexing. Each timeframe (maybe once per day or per week) a new index is created for the river, and added to an alias. Old indices are to be dropped after a while. This maintenance is similar to logstash indexing, but it is outside the scope of a river.
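The windowed indexing the developer describes can be sketched with plain curl. The ES host, alias name, index prefix, and daily window are all assumptions for illustration, not details from the post:

```shell
#!/bin/sh
# Windowed-indexing sketch: one index per day, exposed through a stable alias.
set -e
ES="localhost:9200"
ALIAS="accounts"
NEW_INDEX="$ALIAS-$(date +%Y.%m.%d)"
OLD_INDEX="$ALIAS-$(date -d yesterday +%Y.%m.%d)"   # GNU date syntax

# Create today's index (the river definition would point "index" at it).
curl -XPUT "http://$ES/$NEW_INDEX"

# Swap the alias in one atomic request: searches against "accounts" never
# see a half-built state, and rows deleted in MySQL vanish once the old
# index is dropped.
curl -XPOST "http://$ES/_aliases" -d "{
  \"actions\" : [
    { \"remove\" : { \"index\" : \"$OLD_INDEX\", \"alias\" : \"$ALIAS\" } },
    { \"add\"    : { \"index\" : \"$NEW_INDEX\", \"alias\" : \"$ALIAS\" } }
  ]
}"

# Old indices can then be deleted once outside the retention window:
# curl -XDELETE "http://$ES/$OLD_INDEX"
```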

    The method I am currently using, while I research aliasing, is to recreate the index and river nightly and schedule the river to run every few hours. This ensures that new data is indexed the same day and that deletions are reflected within 24 hours.
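That nightly rebuild can be driven by cron. A hedged sketch, reusing the index and river names from the definition above and assuming the river JSON is kept in a local file `account_river.json` (both the file and the script path are illustrative):

```shell
#!/bin/sh
# Nightly rebuild sketch: drop the index and the river, then re-register the
# river so it reindexes from scratch. Deletions in MySQL disappear because
# the index is rebuilt, not because the river detects them.
set -e
ES="localhost:9200"

curl -XDELETE "http://$ES/headphones"
curl -XDELETE "http://$ES/_river/account_river"
curl -XPUT  "http://$ES/_river/account_river/_meta" -d @account_river.json

# Example crontab entry to run this at 01:00 every night:
# 0 1 * * * /usr/local/bin/rebuild_account_index.sh
```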
