elastic-stack

Dashboard Only Mode in Kibana 7.0.1

Submitted by ≯℡__Kan透↙ on 2020-02-24 04:52:32
Question: I am very new to the ELK stack and just exploring Kibana. I have already created dashboards, but when I share a dashboard with others it shows all the other Kibana tabs too, when it should display only the dashboard. How do I set this up in Kibana? I have installed Kibana 7.1.0 on my machine and can access it directly, without a login, at http://10.42.35.14:5601. I have learned that a Kibana 7.1.0 dashboard can be shared using the "Dashboard only mode" option, but I don't know how to set …
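
One common route, sketched below under the assumption that Elasticsearch/X-Pack security is enabled (the login-free setup described above would first need it turned on), is to give viewers an account whose only role is Kibana's built-in kibana_dashboard_only_user role; the username and password here are placeholders.

```sh
# Hypothetical example: create a user restricted to Kibana's built-in
# dashboard-only role (placeholder credentials, requires security enabled).
curl -u elastic -X POST "http://localhost:9200/_security/user/dash_viewer" \
  -H 'Content-Type: application/json' -d'
{
  "password": "changeme",
  "roles": ["kibana_dashboard_only_user"],
  "full_name": "Dashboard-only viewer"
}'
```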

Set _Id as update key in logstash elasticsearch

Submitted by 。_饼干妹妹 on 2020-02-06 08:07:32
Question: I have an index with documents like the one below: { "_index": "mydata", "_type": "_doc", "_id": "PuhnbG0B1IIlyY9-ArdR", "_score": 1, "_source": { "age": 9, "@version": "1", "updated_on": "2019-01-01T00:00:00.000Z", "id": 4, "name": "Emma", "@timestamp": "2019-09-26T07:09:11.947Z" } } My Logstash conf for updating the data is: input { jdbc { jdbc_connection_string => "***" jdbc_driver_class => "***" jdbc_driver_library => "***" jdbc_user => *** statement => "SELECT * from agedata WHERE updated_on > :sql_last_value …
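
A minimal sketch of the matching output section, assuming the intent is to reuse the SQL id column as the Elasticsearch _id so that re-running the pipeline updates documents instead of duplicating them; the host and index name are placeholders.

```conf
output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "mydata"
    document_id   => "%{id}"     # take _id from the JDBC row's "id" field
    action        => "update"
    doc_as_upsert => true        # insert the document if it does not exist yet
  }
}
```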

ElasticSearch group by documents field and count occurrences

Submitted by 可紊 on 2020-01-25 06:53:27
Question: My Elasticsearch 6.5.2 index looks like this: { "_index" : "searches", "_type" : "searches", "_id" : "cCYuHW4BvwH6Y3jL87ul", "_score" : 1.0, "_source" : { "querySearched" : "telecom" } }, { "_index" : "searches", "_type" : "searches", "_id" : "cSYuHW4BvwH6Y3jL_Lvt", "_score" : 1.0, "_source" : { "querySearched" : "telecom" } }, { "_index" : "searches", "_type" : "searches", "_id" : "eCb6O24BvwH6Y3jLP7tM", "_score" : 1.0, "_source" : { "querySearched" : "industry" } } and I would like a query that …
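
The usual way to group by a field value and count occurrences is a terms aggregation; the sketch below assumes querySearched was dynamically mapped as text with a .keyword sub-field.

```sh
curl -X GET "localhost:9200/searches/_search?pretty" -H 'Content-Type: application/json' -d'
{
  "size": 0,
  "aggs": {
    "by_query": {
      "terms": { "field": "querySearched.keyword", "size": 100 }
    }
  }
}'
```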

How can I safely move Elasticsearch indices to another mount in Linux?

Submitted by 只愿长相守 on 2020-01-17 06:36:10
Question: I have a number of indices that are currently causing disk-space issues on my Ubuntu machine, and they keep growing daily. So I thought of moving them to another mount directory that has more space. How can I do this safely? I have to make sure that the existing ES indices and the Kibana graphs remain intact after the move. What I did: I followed this SO answer and moved my Elasticsearch data directory to the directory (/data …
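
A rough sketch of one safe sequence, assuming a package install managed by systemd; the source path /var/lib/elasticsearch and the target /data/elasticsearch are assumptions, not the asker's actual paths.

```sh
# Stop Elasticsearch so nothing writes to the data directory during the copy.
sudo systemctl stop elasticsearch

# Copy the data directory to the larger mount, preserving ownership and permissions.
sudo rsync -a /var/lib/elasticsearch/ /data/elasticsearch/
sudo chown -R elasticsearch:elasticsearch /data/elasticsearch

# Point Elasticsearch at the new location in /etc/elasticsearch/elasticsearch.yml:
#   path.data: /data/elasticsearch

sudo systemctl start elasticsearch
# Verify the indices (and hence the Kibana dashboards built on them) are back
# before deleting the old directory:
curl 'localhost:9200/_cat/indices?v'
```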

ElasticSearch is up and running in docker but not responding to request

Submitted by 生来就可爱ヽ(ⅴ<●) on 2020-01-16 19:11:53
Question: I am new to Docker and the ELK stack. I referred to this doc for running Elasticsearch in Docker. The Docker container listing says Elasticsearch is up on 9200 and 9300: CONTAINER ID: ef87e2bccee9, IMAGE: docker.elastic.co/elasticsearch/elasticsearch:6.6.1, CREATED: 18 minutes ago, STATUS: Up 18 minutes, PORTS: 0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp, NAMES: dreamy_roentgen. And the Elasticsearch logs say: C:\Windows\system32>docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" docker …
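
Given the C:\Windows\system32 prompt this looks like Docker on Windows; a quick sanity check is sketched below, noting that with Docker Toolbox the container is reached via the VirtualBox VM's IP rather than localhost (192.168.99.100 is the usual default, but that is an assumption).

```sh
# Confirm the container is running and the ports are mapped.
docker ps

# Then query the HTTP port:
curl http://localhost:9200          # Docker Desktop / native Docker
curl http://192.168.99.100:9200     # Docker Toolbox (VirtualBox VM IP)
```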

Unable to bind elasticsearch transport service to external interface

Submitted by 僤鯓⒐⒋嵵緔 on 2020-01-16 19:00:28
Question: I am trying to set up an Elasticsearch cluster with two virtual machines. I am not able to configure the cluster transport service on an external interface. I can use localhost:9300 as the transport service, but I cannot use a localhost URL to join the cluster, and it throws an error when I use the external interface name/IP to configure the cluster. [2017-12-22T06:58:56,979][INFO ][o.e.t.TransportService ] [node-1] publish_address {10.0.1.33:9300}, bound_addresses {10.0.1.33:9300} [2017-12 …
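
A minimal sketch of the elasticsearch.yml settings typically involved, assuming a 5.x/6.x-era cluster (matching the zen-discovery log format above); 10.0.1.33 is taken from the log, while the node names and the second VM's address 10.0.1.34 are assumptions.

```yaml
# /etc/elasticsearch/elasticsearch.yml on node-1
cluster.name: my-cluster
node.name: node-1
network.host: 10.0.1.33              # bind HTTP and transport to the external interface
transport.tcp.port: 9300
discovery.zen.ping.unicast.hosts: ["10.0.1.33:9300", "10.0.1.34:9300"]
discovery.zen.minimum_master_nodes: 2   # (master-eligible nodes / 2) + 1 for two nodes
```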

Serilog not working with Elasticsearch

Submitted by ﹥>﹥吖頭↗ on 2020-01-14 03:47:26
Question: I have installed Elasticsearch locally on my Windows 7 machine. It is working properly and I can query it using Postman. Using Postman, I have also created an index called cmstest, which has a type called zdsf. If I get the mapping (http://localhost:9200/cmstest/_mapping) for this index I get the following: I can post entries to this type as well using Postman. Now, I am trying to use Serilog from my .NET 4.5 application to log to this localhost instance of Elasticsearch, but nothing seems to …
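
A minimal sketch using the Serilog.Sinks.Elasticsearch package (not the asker's code); enabling SelfLog is a common way to surface why nothing reaches Elasticsearch, and the index pattern shown is an assumption.

```csharp
using System;
using Serilog;
using Serilog.Debugging;
using Serilog.Sinks.Elasticsearch;

class Program
{
    static void Main()
    {
        // Surface sink errors (connection failures, template problems) on stderr.
        SelfLog.Enable(Console.Error);

        Log.Logger = new LoggerConfiguration()
            .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
            {
                AutoRegisterTemplate = true,
                IndexFormat = "cmstest-logs-{0:yyyy.MM.dd}"   // placeholder index pattern
            })
            .CreateLogger();

        Log.Information("Hello from Serilog");

        // Events are sent in batches, so flush before the application exits.
        Log.CloseAndFlush();
    }
}
```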

ElasticSearch 6, copy_to with dynamic index mappings

Submitted by 陌路散爱 on 2020-01-13 09:57:27
Question: Maybe I'm missing something simple, but I still could not figure out the following: as of ES 6.x the _all field is deprecated, and it is suggested to use the copy_to instruction instead (https://www.elastic.co/guide/en/elasticsearch/reference/current/copy-to.html). However, I got the impression that you need to explicitly specify the fields you want copied to the custom _all field. But if I use dynamic mappings, I don't know the fields in advance, so can I not use copy_to ?
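
copy_to can be combined with dynamic mappings through dynamic_templates, which attach the copy_to to every field matched at index time; a sketch for ES 6.x follows, where the index name myindex and the catch-all field all_text are assumptions.

```sh
curl -X PUT "localhost:9200/myindex" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "_doc": {
      "dynamic_templates": [
        {
          "strings_to_all": {
            "match_mapping_type": "string",
            "mapping": { "type": "text", "copy_to": "all_text" }
          }
        }
      ],
      "properties": {
        "all_text": { "type": "text" }
      }
    }
  }
}'
```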

Is it possible to update an existing field in an index through mapping in Elasticsearch?

Submitted by 安稳与你 on 2020-01-11 13:17:09
Question: I've already created an index, and it contains data from my MySQL database. I've got a few fields that are strings in my table but that I need as different types (integer and double) in Elasticsearch. I'm aware that I could do this through a mapping as follows: { "mappings": { "my_type": { "properties": { "userid": { "type": "text", "fielddata": true }, "responsecode": { "type": "integer" }, "chargeamount": { "type": "double" } } } } } But I've tried this when creating the index as a new …
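
Because an existing field's type cannot be changed in place, the usual route is to create a new index with the desired mapping and copy the data over with _reindex; a sketch follows, with the index names myindex and myindex_v2 as assumptions.

```sh
curl -X PUT "localhost:9200/myindex_v2" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "my_type": {
      "properties": {
        "userid":       { "type": "text", "fielddata": true },
        "responsecode": { "type": "integer" },
        "chargeamount": { "type": "double" }
      }
    }
  }
}'

# Then copy the existing documents into the new index:
curl -X POST "localhost:9200/_reindex" -H 'Content-Type: application/json' -d'
{
  "source": { "index": "myindex" },
  "dest":   { "index": "myindex_v2" }
}'
```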