logstash

Logstash - Failed to open <file_path> Permission denied

匆匆过客 submitted on 2019-12-22 12:32:53
Question: I am using Logstash to push all the text logs from storage to Elasticsearch. My storage size is about 1 TB. To start, I pushed 368 GB of data (perhaps a few hundred thousand files) to Elasticsearch, but Logstash is failing with the following error: {:timestamp=>"2014-05-15T00:41:12.436000-0700", :message=>"/root/share/archive_data/sessionLogs/965c6f46-1a5e-4820-a68d-7c32886972fc/Log.txt: file grew, old size 0, new size 1557420", :level=>:debug, :file=>"filewatch/watch.rb", :line=>"81
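A "Permission denied" error on a path under /root usually means Logstash is not running as a user that can read the files: when installed as a service, Logstash typically runs as its own `logstash` user, and /root is normally mode 700. A hedged sketch of the diagnosis (the path below is taken from the question; whether relaxing permissions is acceptable depends on your environment):

```shell
# Check who owns the files and what the current modes are.
ls -l /root/share/archive_data/sessionLogs/965c6f46-1a5e-4820-a68d-7c32886972fc/Log.txt

# Grant read on files and execute (traverse) on directories below the share.
# Note that /root itself is usually 700, so the cleaner fixes are to move the
# data outside /root or run Logstash as a user that can read it.
chmod -R o+rX /root/share/archive_data
```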

Getting timestamp of event from file name in logstash

心已入冬 submitted on 2019-12-22 12:18:10
Question: We have a process that writes events to a file without a timestamp. The file names themselves are suffixed with a timestamp, which is the timestamp that should be used for all the events in the file. Now I am trying to parse the file using Logstash's file input plugin. Is there a way to get the name of the file into a field, so that I can then use the gsub filter to extract the timestamp and the date filter to set the event's timestamp? Answer 1: I had a
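The file input already records the source file name in the `path` field (`[log][file][path]` on newer, ECS-enabled versions), so no gsub is strictly needed: a grok on that field plus a date filter can do it. A minimal sketch, assuming a hypothetical file-name layout like `events-20191222123456.log` (adjust the pattern to the real suffix):

```
filter {
  grok {
    # Capture the timestamp components from the file name suffix.
    match => { "path" => "-%{YEAR:y}%{MONTHNUM:m}%{MONTHDAY:d}%{HOUR:h}%{MINUTE:min}%{SECOND:s}\.log$" }
    # Assemble them into one field (only runs when the match succeeds).
    add_field => { "file_ts" => "%{y}-%{m}-%{d} %{h}:%{min}:%{s}" }
  }
  date {
    # Overwrite @timestamp for every event read from this file.
    match => ["file_ts", "yyyy-MM-dd HH:mm:ss"]
  }
}
```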

Logstash: Parse Complicated Multiline JSON from log file into ElasticSearch

北城以北 submitted on 2019-12-22 11:32:05
Question: Let me first say that I have gone through as many examples on here as I could find, and they still did not work. I am not sure if that is because of the complicated nature of the JSON in the log file. I want to take the example log entry, have Logstash read it in, and send the JSON as JSON to Elasticsearch. Here is what the (shortened) example looks like: [0m[0m16:02:08,685 INFO [org.jboss.as.server] (ServerService Thread Pool -- 28) JBAS018559: { "appName": "SomeApp", "freeMemReqStartBytes":
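The usual shape of a solution here is: a multiline codec to glue the pretty-printed JSON lines back into one event, a grok to cut off the JBoss prefix, then the json filter. A sketch under assumptions (the path is hypothetical; the `JBAS018559` marker comes from the sample above; the pattern assumes the ANSI escape codes like `[0m` have been stripped or are tolerated by the leading match):

```
input {
  file {
    path => "/var/log/jboss/server.log"   # hypothetical path
    codec => multiline {
      # Lines that do NOT start with an "HH:mm:ss,SSS" timestamp are
      # continuation lines of the previous event's pretty-printed JSON.
      pattern => "^%{TIME}"
      negate  => true
      what    => "previous"
    }
  }
}
filter {
  grok {
    # Keep only the JSON payload; (?m) lets "." span the glued-in newlines.
    match => { "message" => "JBAS018559: (?m)(?<json_payload>\{.*)" }
  }
  json { source => "json_payload" }
}
```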

How to authenticate Logstash output to a secure Elasticsearch URL (version 5.6.5)

戏子无情 submitted on 2019-12-22 10:53:51
Question: I am using Logstash and Elasticsearch version 5.6.5. So far I have used the elasticsearch output over HTTP with no authentication. Now Elasticsearch is being secured with basic authentication (user/password) and a CA-certified HTTPS URL. I don't have any control over the Elasticsearch server; I only output to it from Logstash. When I try to configure the HTTPS URL of Elasticsearch with basic authentication, Logstash fails to create the pipeline. Output configuration: output { elasticsearch {
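The elasticsearch output plugin in the 5.x line supports basic auth and TLS directly via the `user`, `password`, `ssl`, and `cacert` options. A minimal sketch (the URL, credentials, certificate path, and index name below are placeholders, not values from the question):

```
output {
  elasticsearch {
    hosts    => ["https://es.example.com:9200"]   # hypothetical HTTPS endpoint
    user     => "logstash_writer"                  # hypothetical credentials
    password => "changeme"
    ssl      => true
    cacert   => "/etc/logstash/certs/ca.pem"       # CA that signed the server cert
    index    => "logs-%{+YYYY.MM.dd}"
  }
}
```

If the pipeline still fails to start, the Logstash log usually distinguishes a 401 (bad credentials) from a TLS failure (wrong or missing CA file), which narrows down which of the two settings is at fault.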

Correlate messages in ELK by field

六月ゝ 毕业季﹏ submitted on 2019-12-22 10:53:48
Question: Related to: Combine logs and query in ELK. We are setting up ELK and want to create a visualization in Kibana 4. The issue is that we want to relate two different types of message. To simplify: Message type 1 fields: message_type, common_id_number, byte_count, ... Message type 2 fields: message_type, common_id_number, hostname, ... Both message types share the same index in Elasticsearch. As you can see, we were trying to graph without taking that common_id_number into account,
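One way to correlate the two message types at ingest time, rather than at query time, is Logstash's aggregate filter: it keeps a map keyed by a task id, so a field from one message type can be copied onto later messages sharing the same id. A sketch under assumptions (the `"type1"`/`"type2"` literals are placeholders for however message_type is actually encoded, and the aggregate filter requires running with a single pipeline worker, `-w 1`, to keep event order):

```
filter {
  if [message_type] == "type2" {
    aggregate {
      # Remember the hostname seen for this id.
      task_id => "%{common_id_number}"
      code    => "map['hostname'] = event.get('hostname')"
    }
  }
  if [message_type] == "type1" {
    aggregate {
      # Enrich type-1 events with the hostname recorded for the same id.
      task_id    => "%{common_id_number}"
      code       => "event.set('hostname', map['hostname'])"
      map_action => "update"
    }
  }
}
```

With both event types carrying `hostname`, a single Kibana visualization can then split on it without any cross-document join.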

CSV geodata into elasticsearch as a geo_point type using logstash

前提是你 submitted on 2019-12-22 09:31:05
Question: Below is a reproducible example of the problem I am having with the most recent versions of Logstash and Elasticsearch. I am using Logstash to load geospatial data from a CSV into Elasticsearch as geo_points. The CSV looks like the following:
$ head simple_base_map.csv
"lon","lat"
-1.7841,50.7408
-1.7841,50.7408
-1.78411,50.7408
-1.78412,50.7408
-1.78413,50.7408
-1.78414,50.7408
-1.78415,50.7408
-1.78416,50.7408
-1.78416,50.7408
I have created a mapping template that looks like the following:
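For geo_point ingestion, the filter side typically needs three things: parse the CSV columns, convert them to floats, and nest them under one field that the index template maps as `geo_point`. A sketch (field names `location`, `lon`, `lat` follow the CSV above; the header-row check assumes the csv filter has already stripped the quotes):

```
filter {
  csv {
    columns   => ["lon", "lat"]
    separator => ","
  }
  # Drop the CSV header row itself.
  if [lon] == "lon" { drop {} }
  mutate {
    convert => { "lon" => "float"  "lat" => "float" }
    # Nest both coordinates under a single "location" object.
    rename  => { "lon" => "[location][lon]"  "lat" => "[location][lat]" }
  }
}
```

This only produces a usable geo_point if the mapping template declares `location` with `"type": "geo_point"` and the template was applied before the index was first created; a pre-existing index keeps its old mapping.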

Logstash CSV filter doesn't work

二次信任 submitted on 2019-12-22 09:02:11
Question: I was trying to use the CSV filter in Logstash, but it cannot load the values from my file. I'm using Ubuntu Server 14.04, Kibana 4, Logstash 1.4.2, and Elasticsearch 1.4.4. Below are my CSV file and the filter I wrote. Am I doing something wrong? CSV file:
Joao,21,555
Miguel,24,1000
Rodrigo,43,443
Maria,54,2343
Antonia,67,213
Logstash CSV filter: # This is the filter that reads the file and loads the data into an Elasticsearch index input { file { path => ["/opt/logstash/bin/testeFile_lite.csv"] start
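A very common pitfall with this setup is the file input's tailing behavior: without `start_position => "beginning"` (and with an existing sincedb entry), Logstash only reads lines appended after it starts, so a static CSV appears to load nothing. A hedged full-pipeline sketch, with hypothetical column names for the three CSV columns above:

```
input {
  file {
    path           => "/opt/logstash/bin/testeFile_lite.csv"
    start_position => "beginning"   # read the existing file, not just new lines
    sincedb_path   => "/dev/null"   # re-read on every run (testing only)
  }
}
filter {
  csv {
    columns   => ["name", "age", "value"]   # hypothetical column names
    separator => ","
  }
  mutate { convert => { "age" => "integer"  "value" => "integer" } }
}
output {
  elasticsearch { }
  stdout { codec => rubydebug }   # verify the parsed fields on the console
}
```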

Retrieving RESTful GET parameters in logstash

浪子不回头ぞ submitted on 2019-12-22 08:47:11
Question: I am trying to get Logstash to parse key-value pairs in an HTTP GET request from my ELB log files. The request field looks like http://aaa.bbb/get?a=1&b=2 and I'd like there to be a field for a and b in the log line above, but I am having trouble figuring it out. My Logstash conf (formatted for clarity) is below; it does not produce any additional key fields. I assume that I need to split off the address portion of the URI, but have not figured that out. input { file { path => "/home/ubuntu/logs/*
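The kv filter is built for exactly this: isolate the query string after the `?`, then split it on `&` and `=`. A sketch assuming the parsed URL already sits in a field named `request`, as in the example above:

```
filter {
  grok {
    # Capture everything after the first "?" as the raw query string.
    match => { "request" => "\?%{GREEDYDATA:uri_query}" }
  }
  kv {
    # Turn "a=1&b=2" into fields a => "1" and b => "2".
    source      => "uri_query"
    field_split => "&"
    value_split => "="
  }
}
```

For the sample request this yields fields `a` and `b` on the event; the address portion stays untouched in `request`.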

logstash could not be started when running multiple instances - path.data setting

这一生的挚爱 submitted on 2019-12-22 08:16:27
Question: Hi, I am new to the internals of the ELK stack. I am running a Logstash process in the background, and when it picked up the matching file pattern it failed with the message below. I want to understand the importance of the path.data option. Please help me out. [FATAL][logstash.runner] Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances, you must change the "path.data" setting. 回答1: The path.data directory is used by
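The data directory holds instance-private state (a lock file, persistent queue data, plugin state), so two Logstash instances cannot share it; each one needs its own `path.data`. A sketch of running two instances side by side (pipeline file names and /tmp paths are placeholders):

```shell
# Give each instance a private data directory via the --path.data flag
# (or the path.data setting in its own logstash.yml).
bin/logstash -f pipeline_a.conf --path.data /tmp/logstash-a &
bin/logstash -f pipeline_b.conf --path.data /tmp/logstash-b &
```

The same FATAL error also appears when a previous Logstash run crashed and left a stale `.lock` file in the default data directory, in which case removing the stale lock (with no instance running) resolves it.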

Logstash delete type and keep _type

↘锁芯ラ submitted on 2019-12-21 20:37:30
Question: I have a Logstash client and a server. The client sends log files to the server with Logstash's udp output, and the server also runs Logstash to receive those logs. On the server, a json filter pulls the JSON-formatted message into fields of the actual log event so that Elasticsearch can index them. Here is my code from the server: input{ udp{} } filter{ json { source => "message" } } output{ elasticsearch{ } } And from the client: input{ file{ type => "apache-access" path => "/var/log
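When the json filter expands keys at the event root, a `type` key inside the payload can clobber the `type` set by the client's file input (e.g. "apache-access"). One hedged way around this, using the json filter's `target` option (the sub-field name `log` is a placeholder):

```
filter {
  json {
    source => "message"
    # Parse into a sub-field instead of the event root, so a "type" key
    # inside the JSON cannot overwrite the event's own type field.
    target => "log"
  }
}
```

If flat fields are preferred, the alternative is to parse at the root and then use a mutate filter to rename or remove the conflicting key before it reaches the elasticsearch output.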