elastic-stack

Config file not getting read by Logstash

Submitted by 心已入冬 on 2020-01-07 03:01:45
Question: I have set up the ELK stack on my Windows machine with the following: Elasticsearch, Logstash, Kibana.

My logstash.conf:

    input {
      file {
        path => "\bin\MylogFile.log"
        start_position => "beginning"
      }
    }
    output {
      elasticsearch {
        hosts => localhost:9200
      }
    }

MylogFile.log (Apache log):

    127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"

When I run logstash.conf it creates the following index in
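A plausible culprit, offered as an assumption since the snippet is truncated: on Windows the Logstash file input expects an absolute path written with forward slashes, and hosts takes an array of quoted strings. A minimal sketch of the adjusted config (the C:/ELK prefix is a hypothetical install location):

    input {
      file {
        # Absolute path with forward slashes; backslashes are treated as escapes
        path => "C:/ELK/bin/MylogFile.log"
        start_position => "beginning"
        # sincedb remembers read offsets; NUL forces a re-read on each run (Windows)
        sincedb_path => "NUL"
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
      }
    }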

Elasticsearch - matchPhraseQuery API to search with multiple fields

Submitted by 一曲冷凌霜 on 2020-01-06 05:10:19
Question: I am searching a specific field that uses an ngram tokenizer, and querying that field (code) with matchPhraseQuery works fine. Now I want to search across three fields. How can I do this? Here is my Java code, which searches only the one field (code):

    SearchRequest searchRequest = new SearchRequest(INDEX);
    searchRequest.types(TYPE);
    SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
    QueryBuilder qb = QueryBuilders.matchPhraseQuery("code", code);
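One way to extend this, sketched under the assumption that the other two field names are name and description (hypothetical stand-ins), is a bool query whose should clauses each carry a matchPhraseQuery:

    // Sketch: match the phrase in any of three fields; at least one clause must match
    BoolQueryBuilder qb = QueryBuilders.boolQuery()
        .should(QueryBuilders.matchPhraseQuery("code", code))
        .should(QueryBuilders.matchPhraseQuery("name", code))
        .should(QueryBuilders.matchPhraseQuery("description", code))
        .minimumShouldMatch(1);
    searchSourceBuilder.query(qb);
    searchRequest.source(searchSourceBuilder);

Elasticsearch also offers QueryBuilders.multiMatchQuery with a phrase type, which expresses the same intent more compactly.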

Logstash output from JSON parser not being sent to Elasticsearch

Submitted by 一世执手 on 2020-01-05 09:07:56
Question: This is kind of a follow-up to another one of my questions: JSON parser in logstash ignoring data? But this time I feel the problem is clearer than last time and might be easier for someone to answer. I'm using the JSON parser like this:

    json {
      # Parse all the JSON
      source => "MFD_JSON"
      target => "PARSED"
      add_field => {
        "%{FAMILY_ID}" => "%{[PARSED][platform][family_id][1]}_%{[PARSED][platform][family_id][0]}"
      }
    }

The part of the output for one of the logs in logstash.stdout looks like
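One detail worth flagging, as an observation about the snippet rather than a confirmed diagnosis: a sprintf reference used as the field name only resolves if FAMILY_ID already exists on the event at that point. A sketch with a literal field name, which is easier to verify in the output:

    json {
      source => "MFD_JSON"
      target => "PARSED"
    }
    mutate {
      add_field => {
        # Literal name; the value still interpolates from the parsed structure
        "family_id" => "%{[PARSED][platform][family_id][1]}_%{[PARSED][platform][family_id][0]}"
      }
    }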

Unable to connect Kibana to Elasticsearch

Submitted by 纵饮孤独 on 2020-01-05 05:22:45
Question: I've installed ES 7.5 and Kibana 7.5 on RHEL7, but after starting Kibana and checking the UI, I'm seeing the error "Kibana server is not ready yet." Checking the Kibana log, I see that it is not properly connecting to ES. Any help is appreciated! Here is the output of journalctl --unit kibana:

    Dec 11 10:03:05 mcjca033031 systemd[1]: kibana.service holdoff time over, scheduling restart.
    Dec 11 10:03:05 mcjca033031 systemd[1]: Started Kibana.
    Dec 11 10:03:05 mcjca033031 systemd[1]: Starting
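A first thing to verify, as a general pointer rather than a diagnosis of this specific log: in the 7.x series Kibana finds Elasticsearch through the elasticsearch.hosts setting in kibana.yml, and that URL must be reachable from the Kibana host. A sketch (values are placeholders):

    # /etc/kibana/kibana.yml
    server.host: "0.0.0.0"
    elasticsearch.hosts: ["http://localhost:9200"]

Running curl -XGET http://localhost:9200 from the Kibana machine is a quick reachability check.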

How do I replace a string in a field in Logstash

Submitted by 一笑奈何 on 2020-01-04 11:19:11
Question: I have an IP address field from the Windows event log that contains characters like "::fffff:" in front of the IP address. I cannot change the source here, so I have to fix this in Logstash. I must suck at googling, but I really can't find a simple way to just strip these characters from the IP address fields in Logstash. I have tried, for example:

    if ("" in [event_data][IpAddress]) {
      mutate {
        add_field => { "client-host" => "%{[event_data][IpAddress]}" }
        gsub => ["client-host", ":", ""]
      }
      dns {
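A simpler route, sketched as a suggestion since the full filter is cut off: mutate's gsub takes a regex, so the IPv4-mapped prefix can be stripped in place by anchoring on it, rather than deleting every colon (which would also mangle genuine IPv6 addresses):

    filter {
      mutate {
        # Remove only a leading ::ffff: (IPv4-mapped IPv6 prefix), leaving the rest intact
        gsub => ["[event_data][IpAddress]", "^::ffff:", ""]
      }
    }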

Kibana doesn't show any results in “Discover” tab

Submitted by 戏子无情 on 2020-01-04 05:24:07
Question: I have set up Elasticsearch (version 1.7.3) and Kibana (version 4.1.2) to index our application's Elmah XML error files. I am using .NET to parse the XML files and the NEST Elasticsearch client to insert the documents into Elasticsearch. The issue is that Kibana doesn't display any data in the "Discover" tab. When I run the command

    curl -XGET localhost:9200/.kibana/index-pattern/eol?

I get the following response:

    {"_index":".kibana","_type":"index-pattern","_id":"eol","_version":2,"found":true,"
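Two quick sanity checks, offered as general suggestions rather than a diagnosis: confirm the data index actually holds documents, and widen Kibana's time filter, since Discover hides documents whose time field falls outside the selected range. Assuming the data index is also named eol:

    # Does the index contain any documents at all?
    curl -XGET "localhost:9200/eol/_count?pretty"

    # Inspect one document to confirm the time field Kibana expects
    curl -XGET "localhost:9200/eol/_search?size=1&pretty"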

Docker container cannot send logs to Docker ELK stack

Submitted by 折月煮酒 on 2020-01-03 02:30:21
Question: I deployed the ELK stack and a separate Docker container running a Spring Boot app. In the Java app I use the LogstashSocketAppender to send logs to Logstash. When the Java app runs standalone, outside Docker, it works fine; but when it runs as a Docker container, Logstash receives no logs. Can anyone help me figure this out? Part of the Logstash configuration:

    input {
      udp {
        port => 5000
        type => syslog
        codec => json
      }
    }

Docker ports:

    logstash$ 5000/udp -> 0.0.0.0:5000
    springboot$ 8088/tcp -
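One common cause, stated as an assumption since the appender configuration isn't shown: inside a container, localhost refers to the container itself, so the appender must target the Logstash container by a name it can resolve. A docker-compose sketch (service names are hypothetical):

    # Both services share the default compose network, so "logstash" resolves by name
    services:
      logstash:
        ports:
          - "5000:5000/udp"
      springboot:
        environment:
          # The app's appender should use this host instead of localhost
          - LOGSTASH_HOST=logstash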

Kibana - How to extract fields from existing Kubernetes logs

Submitted by 纵饮孤独 on 2020-01-01 18:19:06
Question: I have a sort-of ELK stack, with fluentd instead of Logstash, running as a DaemonSet on a Kubernetes cluster and sending all logs from all containers, in Logstash format, to an Elasticsearch server. Among the many containers running on the Kubernetes cluster are nginx containers that output logs of the following format:

    121.29.251.188 - [16/Feb/2017:09:31:35 +0000] host="subdomain.site.com" req="GET /data/schedule/update?date=2017-03-01&type=monthly&blocked=0 HTTP/1.1" status=200 body
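One way to pull fields out of such lines at collection time, sketched under the assumption of fluentd v1 syntax and a hypothetical tag pattern for the nginx pods:

    <filter kubernetes.var.log.containers.nginx-**>
      @type parser
      key_name log
      reserve_data true
      <parse>
        @type regexp
        # Named captures become structured fields in Elasticsearch
        expression /^(?<remote>[^ ]+) - \[(?<time>[^\]]+)\] host="(?<vhost>[^"]+)" req="(?<request>[^"]+)" status=(?<status>\d+)/
        time_format %d/%b/%Y:%H:%M:%S %z
      </parse>
    </filter>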

Logging from a Java app to ELK without needing to parse logs

Submitted by 心不动则不痛 on 2019-12-30 00:33:26
Question: I want to send logs from a Java app to Elasticsearch, and the conventional approach seems to be to set up Logstash on the server running the app and have Logstash parse the log files (with regex...!) and load them into Elasticsearch. Is there a reason it's done this way, rather than just setting up Log4j (or Logback) to log in the desired format directly to a log collector that can then ship to Elasticsearch asynchronously? It seems crazy to me to have to fiddle with grok
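The direct route the asker describes does exist. One common shape, sketched under the assumption of Logback plus the logstash-logback-encoder library, is to emit JSON lines so nothing downstream ever needs grok; a shipper such as Filebeat then forwards them verbatim:

    <!-- logback.xml sketch: structured JSON output, no parsing required downstream -->
    <configuration>
      <appender name="JSON_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>logs/app.json.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
          <fileNamePattern>logs/app.json.%d{yyyy-MM-dd}.log</fileNamePattern>
          <maxHistory>7</maxHistory>
        </rollingPolicy>
        <!-- Writes each event as a single JSON object per line -->
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
      </appender>
      <root level="INFO">
        <appender-ref ref="JSON_FILE"/>
      </root>
    </configuration>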