logstash

Is there a way to join databases of two different data sources (i.e. MySQL and PostgreSQL) using Logstash and index them into Elasticsearch?

Question: I am very new to ELK and want to know if there is a way to join two databases from different sources (i.e. MySQL and PostgreSQL) and index the result into a single Elasticsearch index using Logstash. I am able to achieve this with PySpark, but I want to achieve the same thing using Logstash if possible. Also, please suggest other feasible ways to achieve this apart from Spark and Logstash. Thanks in advance!

Answer 1: You can definitely achieve this by sourcing data …
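
A minimal sketch of that approach, assuming the answer continues by sourcing each database through its own jdbc input and writing both to one index (connection details, driver paths, queries, and the index name are placeholders, not the answerer's actual values):

    input {
      jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/db1"
        jdbc_user => "user"
        jdbc_password => "pass"
        jdbc_driver_library => "/path/to/mysql-connector-java.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        statement => "SELECT * FROM orders"      # hypothetical table
        type => "mysql"
      }
      jdbc {
        jdbc_connection_string => "jdbc:postgresql://localhost:5432/db2"
        jdbc_user => "user"
        jdbc_password => "pass"
        jdbc_driver_library => "/path/to/postgresql-jdbc.jar"
        jdbc_driver_class => "org.postgresql.Driver"
        statement => "SELECT * FROM customers"   # hypothetical table
        type => "postgres"
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "joined-index"   # both inputs land in the same index
      }
    }

Note that this co-locates documents from both sources rather than performing a relational join; correlating rows on a shared key would need an extra step, such as a lookup filter or the PySpark route the asker already has.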

TTL in Elasticsearch not working

Question: I need to put a TTL on each of the logs exported from Logstash. I have already created a folder 'mappings' under the config folder, under which I have a folder '_default', under which I have the JSON file default.json, which has:

    { "_default_" : { "_ttl" : { "enabled" : true, "default" : "10s" } } }

I am exporting my logs to the Elasticsearch server with Logstash. The config file is:

    input { stdin { type => "stdin-type" } }
    filter { grok { type => "stdin-type" pattern => "I am %{USERNAME:username}" …
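
On Elasticsearch versions that still support _ttl (it was deprecated in 2.0 and removed in 5.0), the same default can also be applied through the index template API rather than the config-folder mappings; a hedged sketch, where the template name and index pattern are assumptions:

    curl -XPUT 'http://localhost:9200/_template/ttl_defaults' -d '{
      "template": "logstash-*",
      "mappings": {
        "_default_": {
          "_ttl": { "enabled": true, "default": "10s" }
        }
      }
    }'

This takes effect for indices created after the template is registered, which avoids depending on where the node looks for mapping files on disk.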

Dynamic Template not working for short, byte & float

Question: I am trying to create a template, and in the template I am trying to achieve dynamic mapping. Here is what I wrote. Since in 6.2.1 only boolean, date, double, long, object, and string are automatically detected, I am facing issues mapping float, short, and byte. Here, if I index 127, it is mapped to short from the short_fields template, which is fine; but when I index something like 325566, I get the exception "Numeric value (325566) out of range of Java short". I want to suppress this and let long_fields …
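
The root issue is that dynamic detection sees every whole number as long, so a dynamic template cannot route small values to short or byte by magnitude; the value is coerced into whatever type the first matching template names. A sketch of the usual workaround, keeping dynamically detected integers as long and reserving short/byte for explicitly mapped fields (the template name and index pattern are assumptions; syntax is for 6.x):

    PUT _template/numeric_defaults
    {
      "index_patterns": ["logs-*"],
      "mappings": {
        "doc": {
          "dynamic_templates": [
            {
              "integers_as_long": {
                "match_mapping_type": "long",
                "mapping": { "type": "long" }
              }
            }
          ]
        }
      }
    }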

Logstash with ActiveMQ/STOMP

Question: Hi all. I am using logstash-1.4.2 to consume messages stored in my ActiveMQ (with the STOMP plugin). In my activemq.xml config file, I have the line:

    <transportConnector name="stomp" uri="stomp://0.0.0.0:61613?maximumConnections=1000&amp;wireFormat.maxFrameSize=104857600"/>

When I run Logstash, I get this error:

    C:\logstash\logstash-1.4.2\bin>logstash agent -f logstashconfig.conf
    +---------------------------------------------------------+
    | An unexpected error occurred. This is probably a bug.   |
    …
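
For reference, a minimal stomp input of the kind logstashconfig.conf would contain (the queue name is a placeholder; the host and port mirror the transport connector above):

    input {
      stomp {
        host => "localhost"          # ActiveMQ broker
        port => 61613                # STOMP transport connector port
        destination => "/queue/logs" # hypothetical queue name
      }
    }
    output {
      stdout { codec => rubydebug }
    }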

Grok multiple message patterns and process them with different tags

Question: I want to make a filter in Logstash (version 2.4) with different matches in the same grok, and I would like to add different tags depending on the match. Basically, I receive three different message patterns:

    "##MAGIC##%message"
    "##REAL##%message"
    "%message"

What I am trying to do is:

    grok {
      match => { "message" => "##MAGIC##%{GREEDYDATA:magic_message}" }
      match => { "message" => "##REAL##%{GREEDYDATA:real_message}" }
      match => { "message" => "%{GREEDYDATA:basic_message}" }
      if [magic_message] { overwrite => [ …
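
Conditionals are not valid inside a filter block, so the `if` cannot live inside the grok. One way to get per-pattern tags (a sketch of the general technique, not necessarily the accepted answer): list the patterns in one grok, which tries them in order and stops at the first match (break_on_match defaults to true), then tag afterwards:

    filter {
      grok {
        match => { "message" => [
          "##MAGIC##%{GREEDYDATA:magic_message}",
          "##REAL##%{GREEDYDATA:real_message}",
          "%{GREEDYDATA:basic_message}"
        ] }
      }
      if [magic_message]     { mutate { add_tag => ["magic"] } }
      else if [real_message] { mutate { add_tag => ["real"] } }
      else                   { mutate { add_tag => ["basic"] } }
    }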

Logstash conf error - amazon_es

Question: I am trying to configure my logstash.conf file for the first time with an output to amazon_es. My whole logstash.conf file is here:

    input {
      jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb"
        # The user we wish to execute our statement as
        jdbc_user => "root"
        jdbc_password => "root"
        # The path to our downloaded jdbc driver
        jdbc_driver_library => "/mnt/c/Users/xxxxxxxx/mysql-connector-java-5.1.45/mysql-connector-java-5.1.45-bin.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        …
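
The quoted file is cut off before the output section; for orientation, a hedged sketch of what an amazon_es output block typically looks like (the domain endpoint, region, credentials, and index name are all placeholders):

    output {
      amazon_es {
        hosts => ["my-domain.us-east-1.es.amazonaws.com"]
        region => "us-east-1"
        aws_access_key_id => "ACCESS_KEY"
        aws_secret_access_key => "SECRET_KEY"
        index => "testdb"
      }
    }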

Logstash does not process files sent by filebeat

Question: I have set up an ELK stack infrastructure with Docker, but I can't see files being processed by Logstash. Filebeat is configured to send .csv files to Logstash, and from Logstash to Elasticsearch. I see the Logstash Filebeat listener starting, and the Logstash-to-Elasticsearch pipeline works; however, there is no document/index written. Please advise. filebeat.yml:

    filebeat.prospectors:
    - input_type: log
      paths:
        - logs/sms/*.csv
      document_type: sms
      paths:
        - logs/voip/*.csv
      document_type: voip
    output.logstash:
      enabled …
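
One thing that stands out (my observation, not a confirmed answer from the thread): the single prospector repeats the paths and document_type keys, and duplicate keys in one YAML mapping are at best ambiguous; many parsers silently keep only the last value, so the sms files would never be read. A sketch of the likely intent, with one prospector per file set (the Logstash host is a placeholder):

    filebeat.prospectors:
    - input_type: log
      paths:
        - logs/sms/*.csv
      document_type: sms
    - input_type: log
      paths:
        - logs/voip/*.csv
      document_type: voip
    output.logstash:
      enabled: true
      hosts: ["logstash:5044"]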

Logstash kv filter issue with blank values

Question: I am using the kv filter in my Logstash configuration and I have a string that looks something like this:

    key1="value1" key2="" key3= key4=1

Notice that key3 has no value; that causes key3 to be assigned a value of "key4=1". How do I fix this?

Answer 1: It might not be the best solution, since we're blindly replacing:

    mutate { gsub => [ "message", "= ", '="" ' ] }

With this filter placed before the kv filter, every equals sign followed by a space is rewritten as ="" plus the space, giving this result: "key1": …
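
One caveat worth adding (my extension, not part of the original answer): the plain string replacement misses a bare key= at the very end of the line. Since gsub patterns are regular expressions, a lookahead covers both cases:

    filter {
      mutate {
        # insert "" after any '=' that is followed by whitespace or end of line
        gsub => [ "message", '=(?=\s|$)', '=""' ]
      }
      kv { }
    }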

Logstash: handle more than one format of text in one log file

Question: I am using ELK (Filebeat, Logstash, Elasticsearch, Kibana) for log management. In one log file, I have three kinds of format. In the first format, I have date + parameters + JSON + stack trace; this format spans multiple lines. In the second format, it is just date + request method (GET or POST) + some text, on one line. In the third format, it has date + module name (in this case, paymentAdmin) + JSON. I suppose I could use Logstash to handle these three kinds of format with if / else if …
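
A sketch of that if / else-if idea (the grok patterns and field names are illustrative guesses, since the actual log lines are not shown; joining the multi-line stack traces into one event is usually done upstream, in Filebeat's multiline settings):

    filter {
      if [message] =~ /paymentAdmin/ {
        # third format: date + module name + JSON
        grok { match => { "message" => "%{TIMESTAMP_ISO8601:ts} paymentAdmin:%{GREEDYDATA:payload}" } }
        json { source => "payload" }
      } else if [message] =~ /(GET|POST)/ {
        # second format: date + request method + text
        grok { match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{WORD:method} %{GREEDYDATA:text}" } }
      } else {
        # first format: date + parameters + JSON + stack trace
        grok { match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{GREEDYDATA:rest}" } }
      }
    }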

Filtering specific lines from a log file in Logstash

Question: I am not able to get specific lines from the log file /var/log/messages. I am using logstash-forwarder on the client server, and Logstash, Elasticsearch, and Kibana on the log server. I tried to install the grep filter but it gives me an error, so I am trying to implement the below with grok. My original post is here. I found this, but I am quite unsatisfied. The following is the configuration for logstash-forwarder (file name: logstash-forwarder, on the client server):

    {
      "network": {
        "servers": [ "logstashserver-ip:5000" ],
        …
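
For context, the grep filter was deprecated and later removed from Logstash, which may explain the install error; the usual replacement (a sketch, with the match criterion as a placeholder) is a conditional plus the drop filter on the Logstash side:

    filter {
      # keep only lines containing "sshd" (hypothetical criterion); drop everything else
      if [message] !~ /sshd/ {
        drop { }
      }
    }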