elasticsearch-plugin

How to run Elasticsearch 2.1.1 as the root user on a Linux machine

Submitted by 风格不统一 on 2019-11-29 05:26:06
Question: I am trying to run Elasticsearch 2.1.1 on a Linux machine where I am the root user. When I try to start Elasticsearch, I get the following error:

    Exception in thread "main" java.lang.RuntimeException: don't run elasticsearch as root.
        at org.elasticsearch.bootstrap.Bootstrap.initializeNatives(Bootstrap.java:93)
        at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:144)
        at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:285)
        at org.elasticsearch.bootstrap
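The usual fix is to run Elasticsearch under a dedicated unprivileged account rather than as root. A minimal sketch of the admin commands, to be run as root (the user name "esuser" and the install path are assumptions; adjust them to your layout):

```shell
# create a dedicated service account (the name "esuser" is arbitrary)
useradd -m esuser
# give it ownership of the Elasticsearch install (and data/log) directories
chown -R esuser:esuser /opt/elasticsearch-2.1.1
# start Elasticsearch as that user
su - esuser -c '/opt/elasticsearch-2.1.1/bin/elasticsearch'
```

Elasticsearch 2.x also reportedly accepts the startup flag `-Des.insecure.allow.root=true` to bypass the check, but running as root is discouraged and that escape hatch was removed later.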

Removing old indices in elasticsearch

Submitted by [亡魂溺海] on 2019-11-28 18:19:13
Question: I have many of my logs indexed in logstash-Year-Week format. If I want to delete indices older than a few weeks, how can I achieve that in Elasticsearch? Is there an easy, seamless way to do that?

Answer: Curator would be an ideal match here. You can find it here - https://github.com/elastic/curator. A command like the one below should work just fine:

    curator --host <IP> delete indices --older-than 30 --prefix "twitter-" --time-unit days --timestring '%Y-%m-%d'

You can keep this in CRON to remove the indices periodically. You can find some examples and docs here - https://www
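If Curator is not available, the same cutoff logic can be reproduced client-side. A minimal sketch in Python (the index names, prefix, and date format are illustrative and mirror the curator flags above; each returned name would then be removed with a `DELETE /<index>` request):

```python
from datetime import datetime, timedelta

def indices_older_than(names, days, prefix="twitter-", fmt="%Y-%m-%d", today=None):
    """Return the index names whose date suffix is more than `days` days old,
    mirroring curator's --older-than/--prefix/--timestring options."""
    today = today or datetime.utcnow()
    cutoff = today - timedelta(days=days)
    old = []
    for name in names:
        if not name.startswith(prefix):
            continue  # different prefix: leave the index alone
        try:
            stamp = datetime.strptime(name[len(prefix):], fmt)
        except ValueError:
            continue  # suffix is not a date in the expected format
        if stamp < cutoff:
            old.append(name)
    return old

names = ["twitter-2019-01-01", "twitter-2019-11-20", "kibana-int"]
print(indices_older_than(names, 30, today=datetime(2019, 11, 28)))
# → ['twitter-2019-01-01']
```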

Logstash sprintf formatting for elasticsearch output plugin not working

Submitted by 夙愿已清 on 2019-11-28 14:43:44
Question: I am having trouble using sprintf to reference event fields in the elasticsearch output plugin, and I'm not sure why. Below is the event received from Filebeat and sent to Elasticsearch after filtering is complete:

    {
            "beat" => {
            "hostname" => "ca86fed16953",
                "name" => "ca86fed16953",
             "version" => "6.5.1"
        },
        "@timestamp" => 2018-12-02T05:13:21.879Z,
              "host" => {
            "name" => "ca86fed16953"
        },
              "tags" => [
            [0] "beats_input_codec_plain_applied",
            [1] "_grokparsefailure"
        ],
            "fields" => {
            "env" => "DEV"
        },
            "source" => "/usr/share/filebeat/dockerlogs/logstash_DEV.log",
          "@version" => "1",
        "prospector" =
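A frequent cause of sprintf references not resolving in the elasticsearch output is that nested fields need the full bracket path. A hedged sketch of an output section against the event shown above (the index name pattern is an assumption):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # nested fields such as fields.env must be written as [fields][env];
    # a bare %{env} would not resolve against this event
    index => "logstash-%{[fields][env]}-%{+YYYY.MM.dd}"
  }
}
```

Top-level fields like `source` can be referenced as plain `%{source}`; only sub-fields need the bracketed path.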

Find and replace in elasticsearch all documents

Submitted by 扶醉桌前 on 2019-11-28 06:22:08
Question: I want to replace a single username in all of my Elasticsearch index documents. Is there an API query for this? I tried searching around but couldn't find one. Does anyone have an idea? My scenario:

    curl -XPOST 'http://localhost:9200/test/movies/' -d '{"user":"mad", "role":"tester"}'
    curl -XPOST 'http://localhost:9200/test/movies/' -d '{"user":"bob", "role":"engineer"}'
    curl -XPOST 'http://localhost:9200/test/movies/' -d '{"user":"cat", "role":"engineer"}'
    curl -XPOST 'http://localhost:9200/test/movies/' -d
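On 5.x and later clusters this kind of rename can be done with the _update_by_query API. A sketch in Python that only builds the request body (the field name follows the scenario above; the target username "alice" is made up; sending the body to http://localhost:9200/test/_update_by_query is left to curl or a client):

```python
import json

def rename_user_body(old, new):
    """_update_by_query body: set user=<new> on every doc where user=<old>.
    Uses a Painless script; "inline" is the 5.x key (later versions use "source")."""
    return {
        "script": {
            "inline": "ctx._source.user = params.new",
            "lang": "painless",
            "params": {"new": new},
        },
        "query": {"term": {"user": old}},
    }

body = rename_user_body("mad", "alice")
print(json.dumps(body))
```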

Elasticsearch plugin to classify documents

Submitted by 假装没事ソ on 2019-11-28 05:56:47
Question: Is there an Elasticsearch plugin out there that would allow me to classify the documents that I enter into an index? The best solution for me would be a classification of the most recurrent terms (or concepts), displayed as a kind of tag cloud that the user can navigate. Is there a way to achieve this? Any suggestions? Thanks.

Answer 1: The basic idea is to use a terms aggregation, which will yield one bucket per term.

    POST /_search
    {
        "aggs": {
            "genres": {
                "terms": { "field": "genre" }
            }
        }
    }
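The bucket list returned by the terms aggregation above maps directly onto a tag cloud: each key becomes a tag and its doc_count the weight. A small sketch (the response shape follows the aggregation above; the sample buckets are made up):

```python
def tag_cloud(response, agg_name="genres"):
    """Turn a terms-aggregation response into (term, weight) pairs,
    largest bucket first, ready to render as a tag cloud."""
    buckets = response["aggregations"][agg_name]["buckets"]
    return sorted(((b["key"], b["doc_count"]) for b in buckets),
                  key=lambda kv: kv[1], reverse=True)

sample = {"aggregations": {"genres": {"buckets": [
    {"key": "jazz", "doc_count": 2},
    {"key": "rock", "doc_count": 10},
]}}}
print(tag_cloud(sample))
# → [('rock', 10), ('jazz', 2)]
```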

How to control the elasticsearch aggregation results with From / Size?

Submitted by 若如初见. on 2019-11-28 02:25:18
Question: I have been trying to add pagination to an Elasticsearch terms aggregation. In a query we can add pagination like this:

    {
        "from": 0,   // the start offset, to control pagination
        "size": 10,
        "query": { }
    }

This is pretty clear, but when I want to add pagination to an aggregation, I couldn't find anything despite reading a lot about it. My code looks like this:

    {
        "from": 0,
        "size": 0,
        "aggs": {
            "group_by_name": {
                "terms": { "field": "name", "size": 20 },
                "aggs": {
                    "top_tag_hits": {
                        "top_hits": { "size": 1 }
                    }
                }
            }
        }
    }

Is there any way to create pagination with a function, or any other suggestions? Seems like
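The terms aggregation itself has no `from` parameter, so one common workaround is to over-fetch buckets (set the aggregation's `size` to at least from + page size) and slice the page out client-side. A sketch of that slicing step (the bucket data is illustrative):

```python
def paginate_buckets(buckets, start, page_size):
    """Client-side pagination over terms-aggregation buckets.
    Elasticsearch must have been asked for at least start + page_size
    buckets via the aggregation's own "size" parameter."""
    return buckets[start:start + page_size]

# pretend these 20 buckets came back from the group_by_name aggregation
buckets = [{"key": f"name-{i}", "doc_count": 20 - i} for i in range(20)]
page2 = paginate_buckets(buckets, 10, 10)
print([b["key"] for b in page2])
```

For large cardinalities this gets expensive; from Elasticsearch 5.2 on, the terms aggregation's `include` clause reportedly supports `partition`/`num_partitions` to split the term space into server-side pages instead.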

What is the best way to implement Email Alerts in Elasticsearch?

Submitted by 南笙酒味 on 2019-11-28 01:09:21
Question: We will be building a new job-board-type site that runs in AWS, and we are using Elasticsearch for all the job and candidate search functionality. The site will have email alerts.

1) Candidates can set an alert so that a newly posted job that matches certain keywords and is within X miles of a certain zip code will be emailed to them.

2) Recruiters will be able to set alerts so that a resume with certain keywords within X miles of a certain zip code will be emailed to them.

Is there
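One Elasticsearch-native building block for this pattern is the percolator: store each saved alert as a query, then percolate every newly posted job or resume against the stored queries and email the owners of the matches. A hedged sketch against the 2.x percolate API (the index, field, and id names are hypothetical):

```
# store a candidate's alert as a query document
curl -XPUT 'localhost:9200/jobs/.percolator/alert-123' -d '{
  "query": { "match": { "description": "java developer" } }
}'

# when a new job is posted, find which stored alerts it matches
curl -XGET 'localhost:9200/jobs/job/_percolate' -d '{
  "doc": { "description": "Java developer needed in New York" }
}'
```

The X-miles requirement can be expressed by adding a geo_distance filter to each stored query; the actual email delivery would be handled by application code (or a tool such as Watcher) reacting to the percolate matches.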

Is there any way to import a JSON file (containing 100 documents) into an Elasticsearch server?

Submitted by 时光怂恿深爱的人放手 on 2019-11-27 17:24:23
Question: Is there any way to import a JSON file (containing 100 documents) into an Elasticsearch server? I want to import a big JSON file into es-server.

Answer: You should use the Bulk API. Note that you will need to add a header line before each JSON document.

    $ cat requests
    { "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
    { "field1" : "value1" }
    $ curl -s -XPOST localhost:9200/_bulk --data-binary @requests; echo
    {"took":7,"items":[{"create":{"_index":"test","_type":"type1","_id":"1","_version":1,"ok":true}}]}

As dadoonet already mentioned, the bulk API is probably the way to go. To transform your
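To transform a plain JSON array of documents into the bulk format shown above, each document needs an action/metadata header line prepended. A small sketch (the index and type names follow the example; assigning sequential `_id`s is an arbitrary choice, and omitting `_id` would let Elasticsearch generate ids):

```python
import json

def to_bulk(docs, index="test", doc_type="type1"):
    """Convert a list of documents into bulk-API NDJSON:
    one {"index": ...} header line before each source line."""
    out = []
    for i, doc in enumerate(docs, start=1):
        out.append(json.dumps({"index": {"_index": index, "_type": doc_type, "_id": str(i)}}))
        out.append(json.dumps(doc))
    return "\n".join(out) + "\n"  # bulk bodies must end with a newline

docs = [{"field1": "value1"}, {"field1": "value2"}]
print(to_bulk(docs), end="")
```

The result can be written to a file and sent exactly as in the answer above, with `curl -s -XPOST localhost:9200/_bulk --data-binary @requests`.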

How to index a pdf file in Elasticsearch 5.0.0 with ingest-attachment plugin?

Submitted by 依然范特西╮ on 2019-11-27 11:43:39
Question: I'm new to Elasticsearch, and I read here https://www.elastic.co/guide/en/elasticsearch/plugins/master/mapper-attachments.html that the mapper-attachments plugin is deprecated in Elasticsearch 5.0.0. I am now trying to index a PDF file with the new ingest-attachment plugin and upload the attachment. What I've tried so far is:

    curl -H 'Content-Type: application/pdf' -XPOST localhost:9200/test/1 -d @/cygdrive/c/test/test.pdf

but I get the following error:

    {"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse"}],"type":"mapper_parsing_exception","reason":"failed to parse",
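The parse error is expected here: the ingest-attachment processor does not accept raw PDF bytes posted as the document body. The file has to be base64-encoded into a JSON field (named `data` in the plugin's examples) and indexed through an ingest pipeline that has an attachment processor reading that field. A sketch of the encoding step (the sample bytes stand in for the real file contents):

```python
import base64
import json

def attachment_doc(raw_bytes):
    """Build the JSON body expected by an ingest pipeline whose
    attachment processor reads the base64-encoded "data" field."""
    return {"data": base64.b64encode(raw_bytes).decode("ascii")}

doc = attachment_doc(b"%PDF-1.4")  # stand-in for open("test.pdf", "rb").read()
print(json.dumps(doc))
# → {"data": "JVBERi0xLjQ="}
```

The pipeline itself would be created first (roughly, PUT _ingest/pipeline/attachment with a body of `{"processors": [{"attachment": {"field": "data"}}]}`), and the document then indexed with `?pipeline=attachment` appended to the index URL; see the ingest-attachment plugin docs for the exact shape.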
