logstash

Logstash file input: registered JSON file grew but data not ingested in some cases

Submitted by 牧云@^-^@ on 2020-01-06 10:04:53
Question: My config file is shown below:

    input {
      file {
        codec => "json"
        path => "/home/data/*"
        sincedb_path => "/dev/null"
      }
    }
    output {
      elasticsearch {
        protocol => "http"
        host => "localhost"
        index => "data"
      }
    }

When I download a .json file to the data directory, Logstash will not receive the data or output it to Elasticsearch unless I first open the file and save it in gedit. Running Logstash with the -vvv flag shows no errors; all I get when a file is put in that directory is _discover_file: /home/data/*:
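A likely cause, assuming the JSON itself is valid, is that the file input tails files by default (start_position => "end"), so a file that arrives fully written never produces "new" lines to read; re-saving it in gedit modifies the file, which is what finally gets noticed. A minimal sketch of the common fix:

    input {
      file {
        codec => "json"
        path => "/home/data/*"
        sincedb_path => "/dev/null"
        # read pre-existing files from the top instead of tailing them
        start_position => "beginning"
      }
    }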

Logstash output from json parser not being sent to elasticsearch

Submitted by 一世执手 on 2020-01-05 09:07:56
Question: This is kind of a follow-up from another one of my questions: JSON parser in logstash ignoring data? But this time I feel the problem is clearer than last time and might be easier for someone to answer. I'm using the JSON parser like this:

    json { # Parse all the JSON
      source => "MFD_JSON"
      target => "PARSED"
      add_field => { "%{FAMILY_ID}" => "%{[PARSED][platform][family_id][1]}_%{[PARSED][platform][family_id][0]}" }
    }

The part of the output for one of the logs in logstash.stdout looks like
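One thing worth checking, as a hedged guess from the snippet alone: the add_field key is the sprintf template "%{FAMILY_ID}", which names the new field after the *value* of an existing FAMILY_ID field rather than creating a field literally called FAMILY_ID. If a literal field name is the intent, a sketch would be:

    filter {
      json {
        source => "MFD_JSON"
        target => "PARSED"
        # literal field name, populated from the two parsed array elements
        add_field => { "FAMILY_ID" => "%{[PARSED][platform][family_id][1]}_%{[PARSED][platform][family_id][0]}" }
      }
    }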

logstash - use ruby code inside of a filter

Submitted by 非 Y 不嫁゛ on 2020-01-05 08:31:58
Question: Is it possible to use Ruby code inside of a filter? Something like this:

    filter {
      csv {
        ruby {
          code => "
            fieldArray = event['message'].split(',')
            for field in fieldArray
              event[field] = field
            end
          "
        }
      }
    }

Answer 1: No, csv{} is a filter and ruby{} is a filter, so they don't nest inside each other as you've shown. You haven't described the problem, but perhaps just using ruby{} on its own is what you're looking for.

EDIT: with more information on the problem, here are some more notes: Logstash runs one event at
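For illustration, a standalone ruby{} filter along the lines the question sketches (it keeps the legacy event['...'] API from the question's snippet; Logstash 5+ uses event.get/event.set instead):

    filter {
      ruby {
        code => "
          # split the message on commas and store each value as its own field,
          # mirroring the intent of the nested snippet in the question
          event['message'].split(',').each do |field|
            event[field] = field
          end
        "
      }
    }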

Multiple config files causing duplicate messages

Submitted by 两盒软妹~` on 2020-01-05 06:36:07
Question: I have a Logstash machine running in AWS. In Logstash I have 3 config files, each with 1 input defined in it. These inputs read logs from the following sources:

- From S3
- From an HTTP input
- From Filebeat

The problem is that I am getting duplicate messages in Kibana, so for 1 message generated by Filebeat I see 3 messages in Kibana. I tried removing 1 config file and the count dropped to 2, so I am pretty sure this is due to these config files. What is confusing me is why
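This behavior follows from how Logstash assembles configuration: every file in the config directory is concatenated into one pipeline, so events from every input flow through every output, tripling each message. One hedged sketch of a fix is to tag events in each input and route outputs conditionally (the tag, port, and output body here are assumptions, shown for the Filebeat file only):

    # e.g. in the filebeat config file
    input {
      beats {
        port => 5044
        tags => ["filebeat"]
      }
    }
    output {
      # only ship events that came from this input
      if "filebeat" in [tags] {
        elasticsearch { hosts => ["localhost:9200"] }
      }
    }

On Logstash 6+, defining three separate pipelines in pipelines.yml (one per config file) achieves the same isolation without conditionals.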

Unable to connect Kibana to Elasticsearch

Submitted by 纵饮孤独 on 2020-01-05 05:22:45
Question: I've installed ES 7.5 and Kibana 7.5 on RHEL7, but after starting Kibana and checking the UI, I'm seeing the error "Kibana server is not ready yet." Checking the Kibana log, I see that it is not properly connecting to ES. Any help is appreciated! Here is the output of journalctl --unit kibana:

    Dec 11 10:03:05 mcjca033031 systemd[1]: kibana.service holdoff time over, scheduling restart.
    Dec 11 10:03:05 mcjca033031 systemd[1]: Started Kibana.
    Dec 11 10:03:05 mcjca033031 systemd[1]: Starting
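"Kibana server is not ready yet" usually means Kibana cannot reach, or finish handshaking with, Elasticsearch. As a minimal sketch, assuming Elasticsearch runs on the same host with default ports and no security enabled, the relevant kibana.yml settings are:

    # /etc/kibana/kibana.yml
    server.host: "0.0.0.0"                          # listen address (assumption)
    elasticsearch.hosts: ["http://localhost:9200"]  # must point at a reachable ES node

It is also worth confirming ES itself is up first: curl http://localhost:9200 should return the cluster banner.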

How do I replace a string in a field in Logstash

Submitted by 一笑奈何 on 2020-01-04 11:19:11
Question: I have an IP address field from the Windows event log that contains characters like "::fffff:" in front of the IP address. I cannot change the source here, so I have to fix this in Logstash. I must suck at googling, but I really can't find a simple way to just strip these characters from the IP address fields in Logstash. I have tried, for example:

    if ("" in [event_data][IpAddress]) {
      mutate {
        add_field => { "client-host" => "%{[event_data][IpAddress]}" }
        gsub => ["client-host", ":", ""]
      }
      dns {
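The prefix is almost certainly the IPv4-mapped IPv6 form "::ffff:". A hedged one-filter sketch that strips it in place (the field path is taken from the question):

    filter {
      mutate {
        # remove a leading IPv4-mapped-IPv6 prefix, leaving the plain IPv4 address
        gsub => ["[event_data][IpAddress]", "^::ffff:", ""]
      }
    }

One likely pitfall in the attempt above: add_field is a common option applied only after the filter's own operations succeed, so gsub on "client-host" runs before that field exists. Splitting the two into separate mutate blocks makes the order explicit.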

Logstash Multiline filter

Submitted by 耗尽温柔 on 2020-01-04 07:18:12
Question: We have some files that are written out to our web servers whenever we have PHP errors. Each error has its own file, but there are always multiple lines in each file. The files always start with the text "Exception:". Is there a way to easily just say, "take the whole file as a log event"? See example below:

    Exception: ABC_Exception_Domain -- Message: There is no valid performance dimension for the given nodeRootId.
    Error Date and Time:
    Date: September 25, 2014
    Time: 10:38:15
    Timestamp:
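Since each file is one complete event, the multiline codec can glue every following line onto the "Exception:" line. A sketch, assuming the error files live under a path like the one below (the path is an assumption):

    input {
      file {
        path => "/var/log/php_errors/*.log"
        start_position => "beginning"
        codec => multiline {
          # any line that does NOT start a new exception belongs to the previous event
          pattern => "^Exception:"
          negate => true
          what => "previous"
        }
      }
    }

One caveat: multiline only knows an event is finished when the next one starts, so for one-event-per-file sources the codec's auto_flush_interval option is worth setting.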

How to handle multiple inputs with Logstash in the same file?

Submitted by 喜你入骨 on 2020-01-03 04:48:09
Question: Let's say you have 3 very different lines in your firewall log file, and you want to grok them and store the result in an Elasticsearch cluster using the dedicated Elasticsearch output. What should I put in my logstash.conf? Thanks.

Answer 1: Assuming the different logs come from the same log source (i.e. the same file) and should be regarded as being of the same type (which is a judgment call), you can just list multiple grok patterns:

    filter {
      grok {
        match => ["message", "pattern1",
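A sketch of what the completed filter might look like, with pattern1..pattern3 standing in for your three actual line formats (they are placeholders, not real grok patterns):

    filter {
      grok {
        # grok tries each pattern in order and keeps the first match;
        # events matching none of them get the _grokparsefailure tag
        match => ["message", "pattern1", "pattern2", "pattern3"]
      }
    }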

Add fields to Logstash Twitter input and Elasticsearch output

Submitted by 丶灬走出姿态 on 2020-01-03 04:47:32
Question: I am using Logstash to save the Twitter stream to Elasticsearch. Before saving, I want to:

- add a new field which indicates whether the tweet is a RT, a reply, or organic
- use the tweet id as Elasticsearch's document id

But I've been unable to do either! Logstash config file:

    input {
      twitter {
        oauth_token => ""
        oauth_token_secret => ""
        consumer_key => ""
        consumer_secret => ""
        full_tweet => true
        keywords => ["test"]
      }
    }
    filter {
      ruby {
        code => "
          if !event['retweeted_status'].nil?
            event['tweet_type
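Continuing the idea in the truncated filter, a hedged sketch of both pieces (field names follow the full_tweet Twitter JSON, and the legacy event['...'] API matches the question's snippet; the index name is an assumption):

    filter {
      ruby {
        code => "
          # classify the tweet: retweet, reply, or original ('organic') content
          if !event['retweeted_status'].nil?
            event['tweet_type'] = 'RT'
          elsif !event['in_reply_to_status_id'].nil?
            event['tweet_type'] = 'reply'
          else
            event['tweet_type'] = 'organic'
          end
        "
      }
    }
    output {
      elasticsearch {
        index => "tweets"
        document_id => "%{id_str}"   # reuse the tweet id as the ES document id
      }
    }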