I'm using the Logstash JDBC input plugin to read from two (or more) databases and send the data to Elasticsearch, and using Kibana 4 to visualize the data.
This is my log
sql_last_start is now sql_last_value, please check here: the special parameter sql_last_start has been renamed to sql_last_value for better clarity, as it is not limited to datetime columns and may track other column types as well.
So now the solution may be something like this:
input {
  jdbc {
    type => "A"
    jdbc_driver_library => "C:\DEV\elasticsearch-1.7.1\plugins\elasticsearch-jdbc-1.7.1.0\lib\jtds-1.3.1.jar"
    jdbc_driver_class => "Java::net.sourceforge.jtds.jdbc.Driver"
    jdbc_connection_string => "jdbc:jtds:sqlserver://dev_data_base_server:1433/dbApp1;domain=CORPDOMAIN;useNTLMv2=true"
    jdbc_user => "user"
    jdbc_password => "pass"
    # run at minute 5 of every hour
    schedule => "5 * * * *"
    # track the "date" column instead of the last run time
    use_column_value => true
    tracking_column => "date"
    statement => "SELECT id, date, content, status FROM test_table WHERE date > :sql_last_value"
    # clean_run => true resets sql_last_value to zero, or to the initial value if the datatype is date (the default is false)
    clean_run => false
  }
  jdbc {
    # ... same options for type "B"
  }
}
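If you want each type to end up in its own Elasticsearch index, a minimal output sketch could look like the following; the hosts and index names here are placeholders I'm assuming, not part of the original setup, and the exact option names can vary slightly between Logstash versions:

output {
  if [type] == "A" {
    elasticsearch {
      hosts => ["localhost:9200"]   # placeholder host
      index => "app1"               # hypothetical index for type A documents
    }
  } else if [type] == "B" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "app2"               # hypothetical index for type B documents
    }
  }
}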
I have tested this with a SQL Server DB. Please run with clean_run => true the first time to avoid datatype errors, since during development a value of a different datatype may already be stored in the sql_last_value variable.
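As a rough sketch of that one-time reset, you could temporarily flip clean_run and optionally point last_run_metadata_path at a known file (the path below is only an assumed example), then switch clean_run back to false so incremental loading resumes:

input {
  jdbc {
    # ... same connection settings as above ...
    use_column_value => true
    tracking_column => "date"
    # run once with clean_run => true to discard any previously stored sql_last_value
    clean_run => true
    # optional: control where sql_last_value is persisted (example path only)
    last_run_metadata_path => "C:\DEV\logstash\.logstash_jdbc_last_run"
  }
}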