Question
As we know, we can't fetch more than 10,000 rows from Elasticsearch in Python in a single request; requests beyond that limit fail with an error. I want two hours of data from my Elastic cluster, and for every 5 minutes I have approximately 10,000 observations.
1.) Is there any way to dump the data from Elasticsearch directly into a CSV file or into some NoSQL DB when the result count exceeds 10,000?
I am writing my code in Python.
I am using Elasticsearch version 5.
Answer 1:
Try the code below, which uses a scroll query via helpers.scan:
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch()

es_index = "your_index_name"
documento = "your_doc_type"

body = {
    "query": {
        "term": {"user": "your_user_value"}  # replace with your own filter
    }
}

# helpers.scan wraps the scroll API and yields every matching hit,
# so it is not capped at 10,000 results
res = helpers.scan(
    client=es,
    scroll='2m',        # keep the scroll context alive for 2 minutes per batch
    query=body,
    index=es_index,
    doc_type=documento  # document types still apply in Elasticsearch 5
)
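
Since the question asks about getting the data into a CSV file, here is a minimal sketch of how the hits yielded by helpers.scan could be written out with Python's standard csv module. The output file name and the column names (timestamp, user, value) are placeholder assumptions; substitute the fields that actually appear in your documents.

import csv

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch()
es_index = "your_index_name"
body = {"query": {"match_all": {}}}  # or reuse the term query from above

# assumed column names -- replace with the fields in your own documents
fieldnames = ["timestamp", "user", "value"]

with open("dump.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    for hit in helpers.scan(client=es, scroll='2m', query=body, index=es_index):
        # the document fields of each hit live under "_source"
        writer.writerow(hit["_source"])

Because helpers.scan returns a generator, rows are streamed to disk batch by batch, so the full result set never has to fit in memory at once.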
Source: https://stackoverflow.com/questions/52063840/dumping-elastic-data-into-csv-or-into-any-nosql-through-python