elastic-stack

Kibana stopped working and now the server is not getting ready, although kibana.service starts up fine

耗尽温柔 submitted on 2019-12-06 14:50:21
Without any major system update of my Ubuntu (4.4.0-142-generic #168-Ubuntu SMP), Kibana 7.2.0 stopped working. I am still able to start the service with `sudo systemctl start kibana.service` and the corresponding status looks fine. There is only a warning and no error, so this does not seem to be the issue:

```
# sudo systemctl status kibana.service
● kibana.service - Kibana
   Loaded: loaded (/etc/systemd/system/kibana.service; enabled; vendor preset: enabled)
   Active: active (running) since Wed 2019-07-10 09:43:49 CEST; 22min ago
 Main PID: 14856 (node)
    Tasks: 21
   Memory: 583.2M
      CPU: 1min 30.067s
   CGroup
```

Running Elastic without the Trial License

人走茶凉 submitted on 2019-12-06 00:07:40
Background: I'm trying to use the Elastic stack (Elasticsearch, Logstash & Kibana), but I have no money to pay. I don't mind using the parts that are closed source, as long as they are free. In this regard, I'm trying to understand how Elastic licensing works. "We Opened X-Pack" seems to suggest that after Elastic 6.3 the X-Pack code is included (though under a different license). I also understand that some parts of X-Pack are free, but others are not. This is all a bit confusing. Objective: I

Correlate messages in ELK by field

人走茶凉 submitted on 2019-12-05 20:49:30
Related to: Combine logs and query in ELK. We are setting up ELK and want to create a visualization in Kibana 4. The issue here is that we want to relate two different types of message. To simplify:

Message type 1 fields: message_type, common_id_number, byte_count, ...
Message type 2 fields: message_type, common_id_number, hostname, ...

Both messages share the same index in Elasticsearch. As you can see, we were trying to graph without taking that common_id_number into account, but it seems that we must use it. We don't know how yet, though. Any help? EDIT These are the relevant
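One way to see what the correlation should produce, independent of Kibana, is a simple in-memory join on common_id_number. A minimal sketch; the field names follow the question, but the sample values are invented:

```python
from collections import defaultdict

# Invented sample events; field names match the two message types in the question.
type1 = [
    {"message_type": 1, "common_id_number": 42, "byte_count": 512},
    {"message_type": 1, "common_id_number": 43, "byte_count": 128},
]
type2 = [
    {"message_type": 2, "common_id_number": 42, "hostname": "web-01"},
]

def correlate(t1_events, t2_events):
    """Merge events that share a common_id_number; keep only ids that
    carry fields from both message types."""
    by_id = defaultdict(dict)
    for ev in t1_events + t2_events:
        by_id[ev["common_id_number"]].update(ev)
    return [rec for rec in by_id.values()
            if "byte_count" in rec and "hostname" in rec]

merged = correlate(type1, type2)
```

In Elasticsearch itself the equivalent is usually done at index time (denormalizing one type's fields into the other) rather than at query time, since documents in one index are not joined automatically.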

ELK not passing metadata from filebeat into logstash

眉间皱痕 submitted on 2019-12-05 19:46:00
Installed an ELK server via: https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-centos-7 It seems to work except for the filebeat connection; filebeat does not appear to be forwarding anything, or at least I can't find anything in the logs to indicate anything is happening. My filebeat configuration is as follows:

```yaml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/*.log
        - /var/log/messages
        - /var/log/secure
      encoding: utf-8
      input_type: log
      timeout: 30s
      idle_timeout: 30s
  registry_file: /var/lib/filebeat/registry
output:
  logstash:
    hosts: ["my_elk
```
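For events to arrive at all, the Logstash side must have a beats input listening on the same port that the Filebeat `hosts` entry points to. A minimal sketch of that Logstash pipeline, assuming the conventional port 5044 and omitting the SSL settings the tutorial configures:

```
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

If the ports or SSL settings on the two sides disagree, Filebeat typically fails silently from the server's point of view, which matches the "nothing in the logs" symptom described above.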

Serilog HTTP sink + Logstash: Splitting Serilog message array into individual log events

≡放荡痞女 submitted on 2019-12-05 15:41:09
We're using the Serilog HTTP sink to send the messages to Logstash. But the HTTP message body is like this:

```json
{
  "events": [
    {
      "Timestamp": "2016-11-03T00:09:11.4899425+01:00",
      "Level": "Debug",
      "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
      "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
      "Properties": {
        "Heartbeat": {
          "UserName": "Mike",
          "UserDomainName": "Home"
        },
        "Computer": "Workstation"
      }
    },
    {
      "Timestamp": "2016-11-03T00:09:12.4905685+01:00",
      "Level": "Debug",
      "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
```
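Logstash's `split` filter can fan the `events` array out into one document per element. The intended result can be sketched in plain Python; the payload below is an abbreviated, hand-written stand-in for the batch shown above:

```python
import json

# Abbreviated stand-in for one Serilog HTTP sink batch body.
body = json.dumps({
    "events": [
        {"Timestamp": "2016-11-03T00:09:11.4899425+01:00", "Level": "Debug",
         "MessageTemplate": "Logging {@Heartbeat} from {Computer}"},
        {"Timestamp": "2016-11-03T00:09:12.4905685+01:00", "Level": "Debug",
         "MessageTemplate": "Logging {@Heartbeat} from {Computer}"},
    ]
})

def split_events(http_body):
    """Return one log event per array element, mirroring what a Logstash
    split filter on the "events" field produces."""
    payload = json.loads(http_body)
    return payload["events"]

events = split_events(body)
```

In Logstash the corresponding filter is `split { field => "events" }`, which emits a separate event per array entry instead of one event holding the whole batch.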

Logstash CSV filter doesn't work

天大地大妈咪最大 submitted on 2019-12-05 15:13:18
I was trying to use the CSV filter in Logstash, but it doesn't load the values from my file. I'm using Ubuntu Server 14.04, Kibana 4, Logstash 1.4.2 and Elasticsearch 1.4.4. Below are my CSV file and the filter I wrote. Am I doing something wrong?

CSV file:

```
Joao,21,555
Miguel,24,1000
Rodrigo,43,443
Maria,54,2343
Antonia,67,213
```

Logstash CSV filter:

```
# This is the filter that reads the file and loads the data into an Elasticsearch index
input {
  file {
    path => ["/opt/logstash/bin/testeFile_lite.csv"]
    start_position => "beginning"
    # sincedb_path => "NIL"
  }
}
filter {
  csv {
    columns => ["nome", "idade", "salario"]
```
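Independent of Logstash, the expected parse of those rows can be checked with Python's csv module, using the same column names as the filter (`nome`, `idade`, `salario`):

```python
import csv
import io

# Same rows as testeFile_lite.csv in the question.
raw = """Joao,21,555
Miguel,24,1000
Rodrigo,43,443
Maria,54,2343
Antonia,67,213
"""

# DictReader with explicit fieldnames plays the role of the csv filter's
# `columns` option: every row becomes a name -> value mapping.
reader = csv.DictReader(io.StringIO(raw),
                        fieldnames=["nome", "idade", "salario"])
rows = list(reader)
```

If the rows parse cleanly here, the problem is usually on the input side (e.g. the file plugin's sincedb remembering the file was already read), not in the csv filter itself.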

ElasticSearch Indexing 100K documents with BulkRequest API using Java RestHighLevelClient

筅森魡賤 submitted on 2019-12-04 22:13:25
I am reading 100k+ file paths from the index documents_qa using the scroll API. The actual files are on my local d:\ drive. Using each file path, I read the file, convert it to base64, and reindex the base64 content (of the file) into another index, document_attachment_qa. My current implementation reads a file path, converts the file to base64, and indexes the document along with the fileContent one by one, so it takes a long time; for example, indexing 4000 documents takes more than 6 hours, and the connection also terminates with an IO exception. So now I want to
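Batching documents instead of indexing them one at a time is exactly what a BulkRequest is for. The chunking logic itself is language-neutral and can be sketched in Python; the index field names follow the question, while the batch size and sample paths are assumptions to tune:

```python
import base64

def chunked(items, size):
    """Yield fixed-size batches so each bulk request stays bounded."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def build_bulk_batches(file_paths, contents, batch_size=500):
    """Group (path, base64 content) documents into bulk-sized batches.
    `contents` stands in for reading each file from disk."""
    docs = [
        {"filePath": p, "fileContent": base64.b64encode(c).decode("ascii")}
        for p, c in zip(file_paths, contents)
    ]
    return list(chunked(docs, batch_size))

batches = build_bulk_batches(
    [f"d:/files/{i}.txt" for i in range(1200)],
    [b"example" for _ in range(1200)],
)
```

With the Java RestHighLevelClient, each batch would become one BulkRequest of IndexRequests; bounding the batch size (by document count or total bytes) is also what keeps large base64 payloads from blowing up a single request and triggering the IO exceptions described above.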

Optimal way to set up ELK stack on three servers

时光怂恿深爱的人放手 submitted on 2019-12-04 13:55:34
I am looking to set up an ELK stack and have three servers to do so. While I have found plenty of documentation and tutorials about how to install and configure Elasticsearch, Logstash, and Kibana, I have found less information about how I should distribute the software across my servers to maximize performance. For example, would it be better to set up Elasticsearch, Logstash, and Kibana on all three instances, or perhaps install Elasticsearch on two instances and Logstash and Kibana

Docker apps logging with Filebeat and Logstash

不问归期 submitted on 2019-12-04 07:37:02
I have a set of dockerized applications scattered across multiple servers and am trying to set up production-level centralized logging with ELK. I'm OK with the ELK part itself, but I'm a little confused about how to forward the logs to my Logstash instances. I'm trying to use Filebeat because of its load-balancing feature. I'd also like to avoid packing Filebeat (or anything else) into all my Docker images, and to keep it separate, dockerized or not. How can I proceed? I've been trying the following. My Dockers

How to make Logstash multiline filter merge lines based on some dynamic field value?

放肆的年华 submitted on 2019-12-04 05:34:15
I am new to Logstash and desperate to set up ELK for one of our use cases. I found this question relevant to mine: Why won't Logstash multiline merge lines based on grok'd field? If the multiline filter does not merge lines on grok fields, then how do I merge lines 2 and 10 from the log sample below? Please help. Using grok patterns I have created a field 'id' which holds the value 715.

```
Line1 - 5/08/06 00:10:35.348 [BaseAsyncApi] [qtp19303632-51]: INFO: [714] CMDC flowcxt=[55c2a5fbe4b0201c2be31e35]
```
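The grouping the question is after (merge every line that carries the same grok'd id) can be sketched directly in Python. The log lines below are invented stand-ins, with the bracketed number playing the role of the extracted `id` field:

```python
import re
from collections import defaultdict

# Invented sample lines; "[714]" / "[715]" stands in for the grok'd `id` field.
lines = [
    "00:10:35.348 INFO: [714] CMDC start",
    "00:10:35.501 INFO: [715] request received",
    "00:10:36.002 INFO: [714] CMDC done",
    "00:10:36.850 INFO: [715] response sent",
]

ID_PATTERN = re.compile(r"\[(\d+)\]")

def merge_by_id(log_lines):
    """Concatenate lines that share the same id, preserving order --
    a multiline merge keyed on a dynamic field value."""
    merged = defaultdict(list)
    for line in log_lines:
        match = ID_PATTERN.search(line)
        if match:
            merged[match.group(1)].append(line)
    return {key: "\n".join(parts) for key, parts in merged.items()}

events = merge_by_id(lines)
```

The stock multiline filter keys on fixed patterns rather than on a field value extracted per line, which is why this kind of dynamic-key grouping usually has to happen in a later aggregation step instead.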