logstash

How to get url path using logstash on elasticsearch

独自空忆成欢 submitted on 2019-12-25 18:27:05
Question: I have tested my logstash configuration against this access log line:

    127.0.0.1 - - [02/Jun/2016:15:38:57 +0900] "GET /ad/adInfos?id=1 HTTP/1.1" 404 68

using this filter:

    filter {
      grok {
        match => { "message" => "%{COMMONAPACHELOG}" }
      }
    }

It works, producing events like:

    {
        "message"    => "127.0.0.1 - - [02/Jun/2016:15:39:02 +0900] \"POST /ad/signIn?id=1 HTTP/1.1\" 200 26",
        "@version"   => "1",
        "@timestamp" => "2016-06-02T06:39:02.000Z",
        "path"       => "/opt/node-v4.3.1/logs/access.log",
        "host"       => "0.0.0.0",
        "clientip"   => "127.0.0.1",
        "ident"      => "-",
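The question is cut off above, but given its title, here is a minimal sketch of one way to pull the URL path out of such events: %{COMMONAPACHELOG} already extracts the request (e.g. "/ad/adInfos?id=1"), and a second grok pass can split it into path and query string. The field names url_path and url_params are illustrative, not from the source:

    filter {
      grok {
        match => { "message" => "%{COMMONAPACHELOG}" }
      }
      # COMMONAPACHELOG puts "/ad/adInfos?id=1" into "request";
      # split that into the bare path and the optional query string.
      grok {
        match => { "request" => "%{URIPATH:url_path}(?:%{URIPARAM:url_params})?" }
      }
    }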

Unable to start logstash using MongoDB config?

萝らか妹 submitted on 2019-12-25 16:50:57
Question: I am using logstash-1.5.2 with MongoDB 3.0.4, and logstash will not start with the configuration below:

    input {
      stdin { }
    }
    output {
      mongodb {
        database   => "logdb"
        collection => "plain"
        uri        => "mongodb://localhost:27017"
      }
    }

Running ./logstash -f conf/mongo.conf fails with:

    The error reported is: uninitialized constant Mongo::URIParser

Please help.

Answer 1: The problem is caused by a bug in the latest version of logstash-output-mongodb. Please see the issue reported
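The answer breaks off before naming a fix; one common workaround for a plugin regression on the 1.5.x line is pinning the output plugin to an earlier release. This is a sketch only: the version number below is a placeholder, since the excerpt never says which release is safe.

    # Replace the broken plugin with an older release (2.0.2 is a
    # placeholder version, not taken from the source).
    bin/plugin uninstall logstash-output-mongodb
    bin/plugin install --version 2.0.2 logstash-output-mongodb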

Logstash

一世执手 submitted on 2019-12-25 13:15:11
Syncing SQL Server data with logstash 5.6. From D:\Java\ELK\logstash-5.6.16\bin, start logstash with:

    .\logstash.bat -f .\mssql\jdbc.conf

where jdbc.conf contains:

    input {
      stdin { }
      jdbc {
        jdbc_driver_library => "D:\Java\ELK\logstash-5.6.16\bin\mssql\sqljdbc42.jar"
        jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
        jdbc_connection_string => "jdbc:sqlserver://127.0.0.1:1433;databaseName=SCST;"
        jdbc_user => "sa"
        jdbc_password => "123456"
        jdbc_default_timezone => "Asia/Shanghai"
        statement => "SELECT * FROM dbo.Student"
      }
    }
    output {
      elasticsearch {
        # IP address and port of the ES node
        hosts => ["localhost:9200"]
        # index name
        index => "article"
        # auto-increment ID: the source table must have an id column, which maps to the document id of the index
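The excerpt stops inside the output block. A minimal sketch of how such a block is typically closed out, assuming (this is not stated in the source) that the id column should become the Elasticsearch document id:

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "article"
        # Assumed completion: reuse the table's id column as the document id,
        # so re-running the sync updates documents instead of duplicating them.
        document_id => "%{id}"
      }
    }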

How to grep a particular field from logstash output

最后都变了- submitted on 2019-12-25 12:02:16
Question: I am trying to extract just a few pieces from this logstash output: 1. repositories#create and 2. \"repo\":\"username/reponame\". Please share your ideas on how to pull that particular info out of the output below and assign it to another variable.

    "message" => "<190>Nov 01 20:35:15 10-254-128-66 github_audit: {\"actor_ip\":\"192.168.1.1\",\"from\":\"repositories#create\",\"actor\":\"myuserid\",\"repo\":\"username/reponame\",\"action\":\"staff.repo_route\",\"created_at\":1516286634991,\"repo_id\":44743,\"actor_id\"
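Since everything after github_audit: in the sample is JSON, one approach (a sketch; the field names come from the sample message, while audit_json and audit are illustrative names) is to cut the JSON out with grok and hand it to the json filter:

    filter {
      # Capture the raw JSON that follows "github_audit: ".
      grok {
        match => { "message" => "github_audit: %{GREEDYDATA:audit_json}" }
      }
      # Parse it, so "from" and "repo" become queryable event fields
      # under [audit][from] and [audit][repo].
      json {
        source => "audit_json"
        target => "audit"
      }
    }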

Logstash filter Regex into Field

会有一股神秘感。 submitted on 2019-12-25 12:00:29
Question: I am facing some issues with parsing a log line. I have thousands of log lines, and every one contains a hostname such as ABC123DF. I have written a regex, and I want to apply it to the log line and put the hostname into the field "victim", like this:

    add_field => [ "victim", "/[a-z][a-z][a-z][0-9][0-9][0-9].." ]

I have used the mutate filter, but the result is:

    victim    /[a-z][a-z][a-z][0-9][0-9][0-9]..

I would like to see:

    victim    ABC123DF

How do I do this?

Answer 1: You don't even need complex regex action to
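The answer is truncated, but the root cause is visible in the question itself: mutate's add_field copies a literal string and never evaluates a regex; pattern matching belongs in grok. A sketch using a named capture (the character classes mirror the question's pattern, widened to upper case to match the sample hostname ABC123DF):

    filter {
      grok {
        # Named capture: three letters, three digits, two more characters.
        match => { "message" => "(?<victim>[A-Za-z]{3}[0-9]{3}..)" }
      }
    }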

Nginx logs without year and seconds information

好久不见. submitted on 2019-12-25 08:57:09
Question: I have Nginx as a load balancer, and it generates logs without year or second information in the timestamp. One such log is:

    08-10 09:28 root ERROR Error connecting to CRS REST API : [Errno 111] Connection refused
    Error connecting to CRS REST API : [Errno 111] Connection refused

The grok pattern for this is:

    (?m)%{MONTHNUM:monthNum}\-%{MONTHDAY:monthDay}\s*%{HOUR:hour}:%{MINUTE:minute}\s*%{WORD}\s*%{LOGLEVEL_CUSTOM:severity}\s*%{GREEDYDATA:messagePayload}

While I understand that year information
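The question is cut off here, but a common way to deal with year-less timestamps is to reassemble a single time field from the captures above and let the date filter parse it: when the pattern carries no year, the date plugin falls back to the current year, and missing seconds default to zero. A sketch building on the fields the question's grok pattern already extracts (the timezone is an assumption):

    filter {
      mutate {
        add_field => { "log_time" => "%{monthNum}-%{monthDay} %{hour}:%{minute}" }
      }
      date {
        # No year in the source: the date filter assumes the current year.
        match => [ "log_time", "MM-dd HH:mm" ]
        timezone => "UTC"   # assumption; set to the load balancer's zone
      }
    }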

Elasticsearch mapping: select all fields via template to change their data type

北战南征 submitted on 2019-12-25 08:37:35
Question: Hi all, I am using elasticsearch-template.json to set the data type of all of my fields to string. Below is a snippet of the template:

    {
      "template": "logstash-*",
      "settings": {
        "index.refresh_interval": "5s",
        "number_of_shards": 1,
        "number_of_replicas": 0
      },
      "mappings": {
        "logs": {
          "_all": { "enabled": true },
          "properties": {
            "level1": {
              "properties": {
                "level2": {
                  "properties": {
                    "_all": { "type": "string" }
                  }
                }
              }
            }
          }
        }
      }
    }

Here, under level2, I have lots of fields that get created; I want to
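The question breaks off here, but one thing worth noting: _all inside a properties block is treated as an ordinary field named "_all", not as a wildcard over its siblings. The Elasticsearch mechanism for "every field under this path" is a dynamic template. A sketch, with the path_match value assumed from the level1/level2 layout in the snippet:

    {
      "template": "logstash-*",
      "mappings": {
        "logs": {
          "dynamic_templates": [
            {
              "level2_fields_as_string": {
                "path_match": "level1.level2.*",
                "mapping": { "type": "string" }
              }
            }
          ]
        }
      }
    }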

Logstash - JDBC - MySQL config error

血红的双手。 submitted on 2019-12-25 08:08:57
Question: After watching this tutorial, https://www.youtube.com/watch?v=ZnI_rlrei1s, I am trying to fetch data from my localhost MySQL (using Laravel Valet MySQL) with logstash's jdbc input and send it to an Elasticsearch server. This is my config:

    # file: db.conf
    input {
      jdbc {
        # MySQL jdbc connection string to our database, mydb
        jdb_connection_string => "jdbc:mysql://localhost:3306/dragon"
        # The user we wish to execute our statement as
        jdbc_user => "root"
        # The user password
        jdbc_password => ""
        # The path to our
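The excerpt already contains the likely culprit: the first option is misspelled as jdb_connection_string, while the jdbc input expects jdbc_connection_string. A corrected sketch of the visible portion (the options elided in the excerpt stay elided):

    input {
      jdbc {
        # Corrected option name: "jdbc_", not "jdb_".
        jdbc_connection_string => "jdbc:mysql://localhost:3306/dragon"
        jdbc_user => "root"
        jdbc_password => ""
        # ... driver library, driver class and statement as in the tutorial
      }
    }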

Logstash Elasticsearch Input Plugin for streaming data

陌路散爱 submitted on 2019-12-25 07:59:56
Question: I would like to know whether the logstash-input-elasticsearch plugin can be used for streaming data. For example, if data is already in my Elasticsearch database and I run the elasticsearch input plugin, it will index that data into an output; but if more data arrives in the Elasticsearch database later, can the input plugin pick it up without restarting logstash? Thank you for your attention and your help.

Answer 1: By default, the elasticsearch input will run a scroll query on your ES
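The answer is cut short; the relevant knob on this input is its schedule option, which re-runs the query on a cron-like timetable instead of once at startup. A sketch (host, index and query are placeholders):

    input {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "myindex"
        query => '{ "query": { "match_all": {} } }'
        # Without "schedule" the query runs once and the input finishes;
        # with it, the query repeats (cron syntax: every minute here).
        schedule => "* * * * *"
      }
    }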