elasticsearch-5

Integration of elasticsearch with neo4j database

Submitted by 蓝咒 on 2019-12-02 08:41:44
Question: I am trying to use Elasticsearch with my Neo4j database for fast querying. I tried many sites, but they are all old articles, so I didn't get a clear idea. Steps I followed so far: installed Neo4j; installed Elasticsearch; copied the Elasticsearch plugins into the Neo4j plugins folder; added these lines to the neo4j.properties file: elasticsearch.host_name=http://localhost:9200 and elasticsearch.index_spec=people:Person(first_name,last_name), places:Place(name). My question is: how are Elasticsearch and Neo4j integrated? Please clarify this for me.

Is it possible to update an existing field in an index through mapping in Elasticsearch?

Submitted by 烈酒焚心 on 2019-12-02 08:37:28
I've already created an index, and it contains data from my MySQL database. A few fields that are strings in my table need to be different types (integer and double) in Elasticsearch. I'm aware that I could do this through a mapping as follows: { "mappings": { "my_type": { "properties": { "userid": { "type": "text", "fielddata": true }, "responsecode": { "type": "integer" }, "chargeamount": { "type": "double" } } } } } But I have only tried this when creating a new index. What I want to know is how to update an existing field (i.e. chargeamount in this scenario).
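Elasticsearch cannot change the type of an existing field in place; the usual route is to create a new index with the corrected mapping and copy the documents over with the `_reindex` API. A minimal sketch in console syntax, where `my_index` and `my_index_v2` are placeholder names standing in for the old and new index:

```
PUT my_index_v2
{
  "mappings": {
    "my_type": {
      "properties": {
        "userid":       { "type": "text", "fielddata": true },
        "responsecode": { "type": "integer" },
        "chargeamount": { "type": "double" }
      }
    }
  }
}

POST _reindex
{
  "source": { "index": "my_index" },
  "dest":   { "index": "my_index_v2" }
}
```

After reindexing, an alias can be switched from the old index to the new one so existing queries keep working unchanged.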

Simple date histogram?

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-02 07:39:11
How can I view documents classified per weekday? My data is in a format like this: {"text": "hi", "created_at": "2016-02-21T18:30:36.000Z"} For this I am using a dateConversion.groovy script, kept in the scripts folder of ES 5.1.1: Date date = new Date(doc[date_field].value); java.text.SimpleDateFormat format = new java.text.SimpleDateFormat(format); format.format(date) When I executed the following aggregation in the ES plugin: "aggs": { "byDays": { "terms": { "script": { "lang": "groovy", "file": "dateConversion", "params": { "date_field": "created_at", "format": "EEEEEE" } } } } } I am getting an…
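The Groovy script above simply formats each timestamp as a weekday name so the terms aggregation can bucket on it. As a client-side illustration of the same logic (not the author's script), here is a Python equivalent, where the `%A` directive plays the role of SimpleDateFormat's "EEEEEE" pattern:

```python
from datetime import datetime, timezone

def weekday_bucket(created_at: str) -> str:
    """Return the weekday name for an ISO-8601 timestamp such as
    "2016-02-21T18:30:36.000Z" -- the value the terms aggregation
    would group documents by."""
    dt = datetime.strptime(created_at, "%Y-%m-%dT%H:%M:%S.%fZ")
    dt = dt.replace(tzinfo=timezone.utc)
    return dt.strftime("%A")  # full weekday name

print(weekday_bucket("2016-02-21T18:30:36.000Z"))  # Sunday
```

Each document then falls into one of at most seven buckets, which is what a per-weekday histogram needs.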

bucket_script inside filter aggregation throws error

Submitted by 纵饮孤独 on 2019-12-02 04:29:55
I am trying to filter out empty buckets inside a filter aggregation block, and I get an error from Elasticsearch. Without this filtering the response is huge, as I am querying lots of metrics and nested aggregations (this is part of a bigger query, simplified here): GET index/type/_search?ignore_unavailable { "size": 0, "aggs": { "groupby_country": { "terms": { "field": "country", "size": 2000 }, "aggs": { "exists__x__filter": { "filter": { "bool": { "filter": [ { "exists": { "field": "x" } } ] } }, "aggs": { "sum": { "sum": { "script": "def val = doc['x'].value; if(val>0) Math.min(val , 20000)" } }, "average…
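Pipeline aggregations such as `bucket_script` and `bucket_selector` must sit under a multi-bucket parent (here the `terms` aggregation), not inside a single-bucket `filter` aggregation, which is the likely cause of the error. A hedged sketch of dropping the empty country buckets with a `bucket_selector` at the `groupby_country` level, reusing the names from the query above:

```
GET index/type/_search?ignore_unavailable
{
  "size": 0,
  "aggs": {
    "groupby_country": {
      "terms": { "field": "country", "size": 2000 },
      "aggs": {
        "exists__x__filter": {
          "filter": { "exists": { "field": "x" } }
        },
        "drop_empty": {
          "bucket_selector": {
            "buckets_path": { "x_docs": "exists__x__filter._count" },
            "script": "params.x_docs > 0"
          }
        }
      }
    }
  }
}
```

Buckets whose `exists__x__filter` sub-aggregation matched no documents are then removed from the response, shrinking it considerably.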

What is the most suitable datatype to use in aggregations with ElasticSearch 5: numeric or keyword?

Submitted by 為{幸葍}努か on 2019-12-01 20:25:15
In an Elasticsearch index, I have a few fields that reference main categories' ids (e.g. sector_id, country_id, etc.). These fields are used solely for filtering (with term/terms filters) and for creating buckets in terms aggregations (among others). Each of them currently uses the smallest suitable numeric datatype (e.g. byte, short, etc.). Is this the best datatype to use for heavy aggregations, or should these use the keyword datatype? Thanks in advance for any advice! If the values of those fields are numeric, you should go for a numeric type; if they…
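For identifier-like fields that are only ever used in exact-match term filters and terms aggregations, the Elasticsearch tuning documentation suggests mapping them as `keyword`, since numeric types are optimized for range queries rather than exact matching. A sketch of such a mapping (the index name and the `doc` mapping type are placeholders for whatever the real index uses):

```
PUT categories_index
{
  "mappings": {
    "doc": {
      "properties": {
        "sector_id":  { "type": "keyword" },
        "country_id": { "type": "keyword" }
      }
    }
  }
}
```

If range queries over the ids might ever be needed, a numeric copy can be kept alongside via a multi-field.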

Bulk request throws error in elasticsearch 6.1.1

Submitted by 时光总嘲笑我的痴心妄想 on 2019-12-01 03:32:02
I recently upgraded to Elasticsearch version 6.1.1 and now I can't bulk index documents from a JSON file. When I do it inline, it works fine. Here are the contents of the file: {"index" : {}} {"name": "Carlson Barnes", "age": 34} {"index":{}} {"name": "Sheppard Stein","age": 39} {"index":{}} {"name": "Nixon Singleton","age": 36} {"index":{}} {"name": "Sharron Sosa","age": 33} {"index":{}} {"name": "Kendra Cabrera","age": 24} {"index":{}} {"name": "Young Robinson","age": 20} When I run this command: curl -XPUT 'localhost:9200/subscribers/ppl/_bulk?pretty' -H 'Content-Type: application/json'…
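Two details commonly break file-based `_bulk` requests: the body must be newline-delimited JSON ending with a final newline, and with curl the file should be sent using `--data-binary @file.json` rather than `-d`, because `-d` strips the newlines. A small Python sketch (not the original setup) that builds a well-formed bulk body:

```python
import json

def bulk_body(docs):
    """Build a _bulk request body: an action line plus a source line per
    document, newline-delimited, with the mandatory trailing newline."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {}}))  # action line
        lines.append(json.dumps(doc))            # source line
    return "\n".join(lines) + "\n"               # final newline is required

body = bulk_body([
    {"name": "Carlson Barnes", "age": 34},
    {"name": "Sheppard Stein", "age": 39},
])
print(body.endswith("\n"))  # True
```

A body like this can then be piped to curl with `--data-binary`, which preserves the line structure that the bulk API parses.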

Extract record from multiple arrays based on a filter

Submitted by 荒凉一梦 on 2019-11-29 12:28:58
I have documents in Elasticsearch with the following structure: "_source": { "last_updated": "2017-10-25T18:33:51.434706", "country": "Italia", "price": [ "€ 139", "€ 125", "€ 120", "€ 108" ], "max_occupancy": [ 2, 2, 1, 1 ], "type": [ "Type 1", "Type 1 - (Tag)", "Type 2", "Type 2 (Tag)" ], "availability": [ 10, 10, 10, 10 ], "size": [ "26 m²", "35 m²", "47 m²", "31 m²" ] } Basically, the detail records are split across five arrays, and the fields of the same record share the same index position in each array. As can be seen in the example data, there are five arrays (price, max_occupancy, type,…
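Elasticsearch cannot correlate positions across independent arrays at query time; the usual fix is to reindex each record as one object in a single array mapped as `nested`, so a nested query can filter whole records. A hypothetical Python helper (field and variable names are illustrative) that zips the parallel arrays back into per-record objects ready for such a reindex:

```python
def rows_from_parallel_arrays(source):
    """Turn {"price": [...], "max_occupancy": [...], ...} into a list of
    row dicts, pairing the values that share the same index position."""
    fields = ["price", "max_occupancy", "type", "availability", "size"]
    return [dict(zip(fields, row))
            for row in zip(*(source[f] for f in fields))]

doc = {
    "price": ["€ 139", "€ 125"],
    "max_occupancy": [2, 1],
    "type": ["Type 1", "Type 2"],
    "availability": [10, 10],
    "size": ["26 m²", "35 m²"],
}
rows = rows_from_parallel_arrays(doc)
print(rows[1]["max_occupancy"])  # 1
```

Each resulting row dict is one self-contained record, which is exactly the shape a `nested` field expects.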

TransportError(403, u'cluster_block_exception', u'blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];')

Submitted by ∥☆過路亽.° on 2019-11-29 11:54:29
Question: When I try to store anything in Elasticsearch, an error says: TransportError(403, u'cluster_block_exception', u'blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];') I have already inserted about 200 million documents into my index, but I have no idea why this error is happening. I've tried: curl -u elastic:changeme -XPUT 'localhost:9200/_cluster/settings' -H 'Content-Type: application/json' -d '{"persistent":{"cluster.blocks.read_only":false}}' As mentioned here:…
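The FORBIDDEN/12 block corresponds to the index-level setting `index.blocks.read_only_allow_delete`, which Elasticsearch switches on automatically when a node crosses the disk flood-stage watermark, so the cluster-level `cluster.blocks.read_only` setting in the command above does not clear it. After freeing disk space, the block can be lifted per index; a sketch using the `_all` wildcard for illustration:

```
PUT _all/_settings
{
  "index.blocks.read_only_allow_delete": null
}
```

If disk usage climbs back above the watermark, Elasticsearch will re-apply the block, so the underlying disk pressure needs to be addressed as well.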