elastic-stack

Elasticsearch aggregation with hierarchical category/subcategory; limiting the levels

て烟熏妆下的殇ゞ submitted on 2019-12-04 05:15:58
Question: I have products with a categories field. Using an aggregation I can get the full category paths with all subcategories, but I want to limit the number of levels in the facet. For example, I have facets like:

auto, tools & travel (115)
auto, tools & travel > luggage tags (90)
auto, tools & travel > luggage tags > luggage spotters (40)
auto, tools & travel > luggage tags > something else (50)
auto, tools & travel > car organizers (25)

Using an aggregation like: "aggs": { "cat_groups": { "terms": { "field": "categories
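Not part of the original question, but a minimal sketch of one common way to cap the depth: since the category paths are flat strings with ">" separators, a terms aggregation can use an include regex that admits at most one separator. The field and aggregation names are taken from the question; the index name products is a placeholder, and the field is assumed to be indexed as a keyword:

POST /products/_search
{
  "size": 0,
  "aggs": {
    "cat_groups": {
      "terms": {
        "field": "categories",
        "size": 50,
        "include": "[^>]+(>[^>]+)?"
      }
    }
  }
}

The Lucene regex in "include" is anchored to the whole term, so paths containing two or more ">" separators (third-level categories and deeper) are filtered out of the buckets.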

Optimal way to set up ELK stack on three servers

旧街凉风 submitted on 2019-12-03 08:54:23
I am looking to set up an ELK stack and have three servers to do so. While I have found plenty of documentation and tutorials about how to install and configure elasticsearch, logstash, and kibana, I have found less information about how I should distribute the software across my servers to maximize performance. For example, would it be better to set up elasticsearch, logstash, and kibana on all three instances, or perhaps install elasticsearch on two instances and logstash and kibana on the third? Related to that question, if I have multiple elasticsearch servers in my cluster, will I
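The excerpt cuts off mid-question, but the layout it floats (Elasticsearch on two nodes, Logstash and Kibana on the third) can be sketched in a few config fragments. Everything below is a hedged illustration: hostnames es-1, es-2, and elk-3 are placeholders, and the settings use Elasticsearch 7.x-style discovery:

# elasticsearch.yml on es-1 (mirror it on es-2 with node.name: es-2)
cluster.name: my-elk
node.name: es-1
network.host: 0.0.0.0
discovery.seed_hosts: ["es-1", "es-2"]
cluster.initial_master_nodes: ["es-1", "es-2"]

# kibana.yml on elk-3
elasticsearch.hosts: ["http://es-1:9200", "http://es-2:9200"]

# logstash pipeline output on elk-3
output {
  elasticsearch { hosts => ["es-1:9200", "es-2:9200"] }
}

One caveat worth noting: with only two master-eligible nodes, losing either one can stall master election; a common tweak is to make the third server a dedicated (data-less) master-eligible node so the cluster keeps a quorum of three.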

Docker apps logging with Filebeat and Logstash

試著忘記壹切 submitted on 2019-12-02 14:42:58
I have a set of dockerized applications scattered across multiple servers and I am trying to set up production-level centralized logging with ELK. I'm OK with the ELK part itself, but I'm a little confused about how to forward the logs to my logstashes. I'm trying to use Filebeat because of its load-balancing feature. I'd also like to avoid packing Filebeat (or anything else) into all my containers, and keep it separate, dockerized or not. How can I proceed? I've been trying the following. My containers log to stdout, so with a non-dockerized Filebeat configured to read from stdin I do: docker logs -f
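As a hedged alternative to piping docker logs into Filebeat's stdin, one common pattern is a single Filebeat per host harvesting the Docker JSON log files directly and load-balancing across Logstash instances. A minimal sketch of filebeat.yml (the logstash hostnames are placeholders):

filebeat.inputs:
  - type: log
    paths:
      - /var/lib/docker/containers/*/*.log
    json.keys_under_root: true   # lift the JSON keys (log, stream, time) to the top level
    json.message_key: log        # the actual log line lives under the "log" key

output.logstash:
  hosts: ["logstash1:5044", "logstash2:5044"]
  loadbalance: true              # the load-balancing feature the question mentions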

JSON parser in logstash ignoring data?

寵の児 submitted on 2019-12-02 10:07:37
I've been at this a while now, and I feel like the JSON filter in logstash is removing data for me. I originally followed the tutorial from https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-ubuntu-14-04 I've made some changes, but it's mostly the same. My grok filter looks like this:

uuid {        #uuid and fingerprint to avoid duplicates
  target => "@uuid"
  overwrite => true
}
fingerprint {
  key => "78787878"
  concatenate_sources => true
}
grok {        #Get device name from the name of the log
  match => { "source" => "%{GREEDYDATA}%{IPV4:DEVICENAME}%
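The excerpt cuts off before the JSON filter that the question is actually about. For context, a hedged sketch of a logstash json filter with an explicit target, one common way to keep parsed keys from silently clobbering existing event fields (the source field "message" and target name "payload" are assumptions, not from the question):

filter {
  json {
    source => "message"
    target => "payload"                      # nest parsed keys instead of writing to the event root
    tag_on_failure => ["_jsonparsefailure"]  # failed parses get tagged rather than dropped silently
  }
}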

Failed to execute fabric8 docker plugin

谁说胖子不能爱 submitted on 2019-12-02 09:58:45
Running mvn clean install on Windows produces this error:

[ERROR] Failed to execute goal io.fabric8:docker-maven-plugin:0.20.1:start (prepare-environment) on project integration-test: Execution prepare-environment of goal io.fabric8:docker-maven-plugin:0.20.1:start failed: Start-Job failed with unexpected exception: [sebp/elk:latest] "elk": Timeout after 120365 ms while waiting on url http://localhost:32774/

Source: https://stackoverflow.com/questions/44226315/failed-to-execute-fabric8-docker-plugin
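The error says the plugin gave up waiting for the ELK container's HTTP endpoint, which usually means the container needs longer to start than the configured wait window. A hedged sketch of raising that window via the plugin's <wait> configuration (image name taken from the error; the URL port and the 240000 ms value are illustrative placeholders):

<plugin>
  <groupId>io.fabric8</groupId>
  <artifactId>docker-maven-plugin</artifactId>
  <version>0.20.1</version>
  <configuration>
    <images>
      <image>
        <name>sebp/elk:latest</name>
        <alias>elk</alias>
        <run>
          <wait>
            <http>
              <url>http://localhost:${elk.http.port}/</url>  <!-- placeholder for the mapped port -->
            </http>
            <time>240000</time>  <!-- wait up to 4 minutes before failing -->
          </wait>
        </run>
      </image>
    </images>
  </configuration>
</plugin>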

Is it possible to update an existing field in an index through mapping in Elasticsearch?

烈酒焚心 submitted on 2019-12-02 08:37:28
I've already created an index, and it contains data from my MySQL database. I've got a few fields which are strings in my table, where I need them as different types (integer and double) in Elasticsearch. So I'm aware that I could do it through a mapping as follows:

{
  "mappings": {
    "my_type": {
      "properties": {
        "userid": { "type": "text", "fielddata": true },
        "responsecode": { "type": "integer" },
        "chargeamount": { "type": "double" }
      }
    }
  }
}

But I've only tried this when creating the index as a new one. What I wanted to know is how I can update an existing field (i.e. chargeamount in this scenario)
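The excerpt ends there, but the usual constraint is worth spelling out: Elasticsearch cannot change the type of an existing field in place, so the standard route is to create a new index with the corrected mapping and copy the data over with the _reindex API. A hedged sketch (the index names my_index and my_index_v2 are placeholders):

PUT /my_index_v2
{
  "mappings": {
    "my_type": {
      "properties": {
        "userid": { "type": "text", "fielddata": true },
        "responsecode": { "type": "integer" },
        "chargeamount": { "type": "double" }
      }
    }
  }
}

POST /_reindex
{
  "source": { "index": "my_index" },
  "dest": { "index": "my_index_v2" }
}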

logstash grok filter for custom logs

吃可爱长大的小学妹 submitted on 2019-12-02 06:58:25
I have two related questions. The first is how best to grok logs that have "messy" spacing and so on; the second, which I'll ask separately, is how to deal with logs that have arbitrary attribute-value pairs. (See: logstash grok filter for logs with arbitrary attribute-value pairs.) So for the first question, I have a log line that looks like this:

14:46:16.603 [http-nio-8080-exec-4] INFO METERING - msg=93e6dd5e-c009-46b3-b9eb-f753ee3b889a CREATE_JOB job=a820018e-7ad7-481a-97b0-bd705c3280ad data=71b1652e-16c8-4b33-9a57-f5fcb3d5de92

Using http://grokdebug.herokuapp.com/ I was able to eventually
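The excerpt cuts off before the author's pattern, but a hedged sketch of one way to grok this line is below: \s+ absorbs the irregular spacing, and a kv filter (the theme of the companion question) splits the trailing attribute-value pairs. Field names like ts, thread, and kvpairs are my own placeholders:

filter {
  grok {
    match => {
      "message" => "%{TIME:ts}\s+\[%{DATA:thread}\]\s+%{LOGLEVEL:level}\s+%{WORD:logger}\s+-\s+%{GREEDYDATA:kvpairs}"
    }
  }
  kv {
    source => "kvpairs"   # turns msg=..., job=..., data=... into individual fields
  }
}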

How to make Logstash multiline filter merge lines based on some dynamic field value?

旧街凉风 submitted on 2019-12-02 06:44:41
I am new to logstash and desperate to set up ELK for one of our use cases. I have found this question relevant to mine: Why won't Logstash multiline merge lines based on grok'd field? If the multiline filter does not merge lines on grok'd fields, then how do I merge lines 2 and 10 from the log sample below? Please help. Using grok patterns I have created a field 'id' which holds the value 715.

Line1 - 5/08/06 00:10:35.348 [BaseAsyncApi] [qtp19303632-51]: INFO: [714] CMDC flowcxt=[55c2a5fbe4b0201c2be31e35] method=contentdetail uri=http://10.126.44.161:5600/cmdc/content/programid%3A%2F%2F317977349~programid
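The multiline filter/codec can only match regexes against the raw line, not against fields produced by grok, so it cannot key on a dynamic value like id. One hedged alternative is the aggregate filter, which correlates events by a task_id built from an event field; the sketch below (grok pattern simplified, merge and emit logic illustrative only) just shows the shape:

filter {
  grok {
    match => { "message" => "INFO: \[%{NUMBER:id}\]" }  # simplified stand-in for the question's pattern
  }
  aggregate {
    task_id => "%{id}"   # all events sharing the same id land in one map
    code => "
      map['merged'] ||= ''
      map['merged'] += event.get('message') + ' '
    "
    push_map_as_event_on_timeout => true
    timeout => 5
    timeout_task_id_field => "id"
  }
}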

Filebeat not supported on Solaris. How to collect logs?

本秂侑毒 submitted on 2019-12-01 11:42:48
Question: Our server is hosted on Solaris, but we are not able to install Filebeat to forward the logs to the desired port, as Filebeat is not supported on Solaris. Can someone here suggest a way to solve this problem? Please note we have been told not to install Logstash on the server host machine. Any advice is highly appreciated.

Answer 1: Filebeat can easily be compiled to run on Solaris 11/amd64, but that is not an officially supported platform based on Elastic's support matrix.
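Following up on that answer with a hedged sketch of the compile step: Filebeat is written in Go, and Go can cross-compile for solaris/amd64 from another machine. The checkout step and paths are illustrative, and building Beats for an unsupported platform may still need tweaks:

# on any machine with a Go toolchain
git clone https://github.com/elastic/beats.git
cd beats/filebeat
GOOS=solaris GOARCH=amd64 go build   # emits a "filebeat" binary targeting Solaris 11/amd64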