logstash-configuration

Continuously extract new Elasticsearch updates into Kafka using Logstash

Submitted by 你离开我真会死。 on 2019-12-25 04:40:13

Question: I have an ES cluster with multiple indices that all receive updates at random intervals. I have a Logstash instance extracting data from ES and passing it into Kafka. What would be a good method to run this every minute and pick up any updates in ES?

Configuration:

    input {
      elasticsearch {
        hosts  => [ "hostname1.com:5432", "hostname2.com" ]
        index  => "myindex-*"
        query  => "*"
        size   => 10000
        scroll => "5m"
      }
    }
    output {
      kafka {
        bootstrap_servers => "abc-kafka.com:1234"
        topic_id          => "my.topic.test"
      }
    }
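
One approach is a minimal sketch along these lines, using the elasticsearch input plugin's schedule option and assuming every document carries an update timestamp field, here called last_updated, which is not part of the original config:

    input {
      elasticsearch {
        hosts    => [ "hostname1.com:5432", "hostname2.com" ]
        index    => "myindex-*"
        # cron-style schedule: run the query once every minute
        schedule => "* * * * *"
        # fetch only documents touched in the last minute;
        # the last_updated field is an assumption about the mapping
        query    => '{ "query": { "range": { "last_updated": { "gte": "now-1m" } } } }'
        size     => 10000
        scroll   => "5m"
      }
    }

A pure time window can miss or duplicate documents at the window edges; tracking a monotonic sequence field is more robust if one is available.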

Does Logstash support Elasticsearch's _update_by_query?

Submitted by 一笑奈何 on 2019-12-25 00:19:00

Question: Does the Elasticsearch output plugin support Elasticsearch's _update_by_query?

https://www.elastic.co/guide/en/logstash/6.5/plugins-outputs-elasticsearch.html
https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-update-by-query.html

Answer 1: The elasticsearch output plugin can only make calls to the _bulk endpoint, i.e. using the Bulk API. If you want to call the Update by Query API, you need to use the http output plugin and construct the query inside the event yourself. If you…
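
A minimal sketch of what the answer describes, using the http output plugin; the cluster URL, index name, and the user_id field are illustrative assumptions:

    output {
      http {
        url          => "http://localhost:9200/my-index/_update_by_query"   # hypothetical endpoint
        http_method  => "post"
        content_type => "application/json"
        format       => "message"
        # the request body is rendered from event fields; user_id is an assumed field
        message      => '{"query":{"term":{"user_id":"%{user_id}"}},"script":{"source":"ctx._source.status = params.s","params":{"s":"updated"}}}'
      }
    }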

Logstash configuration file error (answer not working)

Submitted by 萝らか妹 on 2019-12-24 00:26:51

Question: The only thing that is certain about the keys in [url][queryString] I want to drop is that they begin with "404;" or that the key is long. I need to remove such keys. If I use the Ruby code below, it gives a "cannot convert linked hashmap to string" exception.

    input {
      file {
        # Wildcards work, here :)
        path => ["C:\Users\ppurush\Desktop\test\*.log"]
        start_position => "beginning"
      }
    }
    filter {
      ruby {
        code => "
          require json
          my_hash = JSON.parse([url][queryString])
          my_hash.delete_if { |key,value| key.to_s.match(/^404;/) }
        "
      }
    }
    output…
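
The exception suggests that [url][queryString] already arrives at the ruby filter as a hash, so there is nothing to JSON.parse, and event fields have to be read through the event API rather than referenced bare. A minimal sketch of a working filter under that assumption (the 100-character cutoff for "long" keys is also an assumption):

    filter {
      ruby {
        code => "
          qs = event.get('[url][queryString]')
          if qs.is_a?(Hash)
            # drop keys that start with '404;' or are suspiciously long
            qs.delete_if { |k, v| k.to_s.start_with?('404;') || k.to_s.length > 100 }
            event.set('[url][queryString]', qs)
          end
        "
      }
    }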

Using a table's id for sql_last_value in Logstash?

Submitted by 无人久伴 on 2019-12-23 03:47:08

Question: I have a MySQL statement like this in my jdbc input plugin in Logstash:

    statement => "SELECT * from TEST where id > :sql_last_value"

My table doesn't have any date or datetime field. So I'm trying to update the index by checking minute by minute, using a scheduler, whether any new rows have been added to the table. I should only pick up newly added records, not value changes in existing records. To do this I'm having this kind of a…
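
A minimal sketch of a jdbc input that tracks a numeric id column instead of a timestamp; the connection settings and driver paths are placeholders, not taken from the question:

    input {
      jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb"    # placeholder
        jdbc_user              => "user"
        jdbc_password          => "secret"
        jdbc_driver_class      => "com.mysql.jdbc.Driver"
        jdbc_driver_library    => "/path/to/mysql-connector-java.jar"     # placeholder
        schedule               => "* * * * *"    # check for new rows every minute
        statement              => "SELECT * FROM TEST WHERE id > :sql_last_value"
        # track the id column rather than the time of the last run
        use_column_value       => true
        tracking_column        => "id"
        tracking_column_type   => "numeric"
      }
    }

With use_column_value enabled, :sql_last_value holds the highest id seen so far, so only rows inserted after the previous run match; updated rows keep their old id and are skipped.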

How to authenticate Logstash output to a secure Elasticsearch URL (version 5.6.5)

Submitted by 戏子无情 on 2019-12-22 10:53:51

Question: I am using Logstash and Elasticsearch version 5.6.5. So far I have used the elasticsearch output with the HTTP protocol and no authentication. Now Elasticsearch is being secured with basic authentication (user/password) and a CA-certified HTTPS URL. I don't have any control over the Elasticsearch server; I just use it as an output from Logstash. When I try to configure the HTTPS URL of Elasticsearch with basic authentication, Logstash fails to create the pipeline.

Output configuration:

    output {
      elasticsearch {
        …
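
For reference, a minimal sketch of an elasticsearch output using basic auth over HTTPS; the endpoint, credentials, certificate path, and index name are placeholders:

    output {
      elasticsearch {
        hosts    => ["https://es.example.com:9200"]    # placeholder HTTPS endpoint
        user     => "logstash_writer"
        password => "changeme"
        ssl      => true
        # CA certificate used to verify the server's certificate chain
        cacert   => "/etc/logstash/certs/ca.pem"
        index    => "my-index-%{+YYYY.MM.dd}"
      }
    }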

Retrieving RESTful GET parameters in logstash

Submitted by 浪子不回头ぞ on 2019-12-22 08:47:11

Question: I am trying to get Logstash to parse the key-value pairs of an HTTP GET request in my ELB log files. The request field looks like:

    http://aaa.bbb/get?a=1&b=2

I'd like there to be a field for a and a field for b from the log line above, and I am having trouble figuring it out. My Logstash conf (formatted for clarity) is below; it does not load any additional key fields. I assume that I need to split off the address portion of the URI, but have not figured that out.

    input {
      file {
        path => "/home/ubuntu/logs/*…
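
One way to do this is a sketch along these lines: cut the query string off the URL with grok, then explode it with the kv filter. The request field name is taken from the question; everything else is an assumption:

    filter {
      grok {
        # capture everything after the '?' into query_string
        match => { "request" => "%{URIPROTO}://%{URIHOST}%{URIPATH}(?:\?%{GREEDYDATA:query_string})?" }
      }
      kv {
        # turn a=1&b=2 into fields a and b
        source      => "query_string"
        field_split => "&"
      }
    }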

Java Filter For Logstash

Submitted by 一笑奈何 on 2019-12-21 08:18:10

Question: You know how there is a Ruby filter for Logstash that lets me write code in Ruby; it is usually included in the config file as follows:

    filter {
      ruby {
        code => "...."
      }
    }

Now I have two JAR files that I would like to include in my filter, so that my input can be processed according to the operations in these JAR files. However, I apparently cannot include a JAR file in the Ruby code. I've been looking for a solution.

Answer 1: So to answer this, I found this wonderful…
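
Because the ruby filter runs on JRuby, jars can be loaded directly from Ruby. A minimal sketch; the jar path, the com.example.MyProcessor class, and its process method are hypothetical:

    filter {
      ruby {
        # init runs once at pipeline startup
        init => "
          require 'java'
          require '/opt/logstash/lib/my-processor.jar'   # hypothetical jar path
          java_import 'com.example.MyProcessor'          # hypothetical class
        "
        code => "
          # process is a hypothetical method on the imported class
          result = MyProcessor.new.process(event.get('message'))
          event.set('processed', result)
        "
      }
    }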

Why is Logstash throwing a daylight saving time 'gap' error with SQL Server data?

Submitted by 这一生的挚爱 on 2019-12-20 06:09:23

Question: We are using Logstash version 7.3.2 to fetch SQL Server data. It works fine, but sometimes it throws the exception below:

    Exception when executing JDBC query {:exception=># transition (daylight savings time 'gap'): 1942-09-01T00:00:00.000 (Asia/Kolkata)>}

When I check in SQL Server, there is no value like 1942-09-01T00:00:00.000. My Logstash config is as below:

    jdbc_connection_string => "jdbc:sqlserver://HOST:PORT;databaseName=DB_NAME;integratedSecurity=false"
    jdbc_user =>…
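
The timezone database records a forward offset change for Asia/Kolkata at exactly 1942-09-01 (wartime +06:30), so local timestamps falling into that gap cannot be converted. A commonly suggested workaround, sketched below with placeholder credentials and statement, is to have the jdbc input interpret database timestamps as UTC so that no local-zone transition is applied; this is only correct if the stored values are in fact UTC:

    input {
      jdbc {
        jdbc_connection_string => "jdbc:sqlserver://HOST:PORT;databaseName=DB_NAME;integratedSecurity=false"
        jdbc_user              => "user"
        jdbc_password          => "secret"
        jdbc_driver_class      => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
        # read DATETIME values as UTC instead of the JVM's local zone,
        # sidestepping Asia/Kolkata's historical offset changes
        jdbc_default_timezone  => "UTC"
        statement              => "SELECT * FROM my_table"    # placeholder statement
      }
    }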

Logstash not reading in new entries from MySQL

Submitted by 北城以北 on 2019-12-19 21:25:31

Question: I have Logstash and Elasticsearch installed locally on my Windows 7 machine, and I installed logstash-input-jdbc in Logstash. I have data in a MySQL database which I send to Elasticsearch using Logstash so I can do some report generating. The Logstash config file that does this:

    input {
      jdbc {
        jdbc_driver_library    => "C:/logstash/lib/mysql-connector-java-5.1.37-bin.jar"
        jdbc_driver_class      => "com.mysql.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/test"
        jdbc_user              => "root"
        jdbc…
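
By default the jdbc input runs its statement once and then does nothing further. A sketch of the two options that usually fix "new rows never appear"; the updated_at column and the metadata path are assumptions:

    input {
      jdbc {
        # ... driver and connection settings as in the question ...
        schedule               => "* * * * *"    # re-run the query every minute
        # fetch only rows newer than the previous run;
        # assumes the table has an updated_at timestamp column
        statement              => "SELECT * FROM my_table WHERE updated_at > :sql_last_value"
        last_run_metadata_path => "C:/logstash/.logstash_jdbc_last_run"
      }
    }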
