Druid: how to cache all historical node data in memory


Druid doesn't have any direct mechanism to force data to be cached. To work around this, you can fire some dummy queries at startup that load the segment data into memory. There are several levels of caching that come into play when Druid queries are executed:

  1. Cache at historical nodes
  2. Cache at broker nodes
  3. Page cache

The first two caches are configurable and can be turned on or off as required, whereas the page cache is entirely controlled by the underlying OS. Since your setup has plenty of free memory on the historicals, I would suggest firing dummy queries at startup that span all historical segments; this brings all of the segment data into the page cache, and any queries fired later will benefit from it.
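As a rough illustration, a warm-up step might look like the sketch below. It posts a Druid native timeseries query over a very wide interval to the broker so that every segment of the datasource gets scanned; the broker URL, datasource name, and aggregated column are placeholders for your own setup, not anything from the original question.

```python
# Warm-up sketch: fire a broad dummy query so the historicals read every
# segment and the OS page cache fills with the segment data.
# BROKER_URL, DATASOURCE and the aggregated column are assumptions --
# adjust them to your cluster.
import json
import urllib.request

BROKER_URL = "http://localhost:8082/druid/v2/"   # Druid broker native query endpoint
DATASOURCE = "my_datasource"                      # hypothetical datasource name

# A timeseries query over an effectively unbounded interval touches all
# segments of the datasource; add aggregators (or filters) on the columns
# you query most so their column data is paged in as well.
warmup_query = {
    "queryType": "timeseries",
    "dataSource": DATASOURCE,
    "granularity": "all",
    "intervals": ["1000-01-01/3000-01-01"],
    "aggregations": [
        {"type": "count", "name": "rows"},
        {"type": "doubleSum", "name": "warm_metric", "fieldName": "some_metric"},  # hypothetical column
    ],
}

request = urllib.request.Request(
    BROKER_URL,
    data=json.dumps(warmup_query).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    # The result itself is irrelevant; the point is forcing the segment scan.
    print(response.read().decode("utf-8"))
```

Running something like this once after the historicals come up (or after new segments are loaded) should be enough to keep subsequent queries served mostly from memory.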

The historical and broker caches don't store a segment's entire data, only the per-segment results of a query, so they won't be of much use if your queries are very dynamic in nature and require different aggregations and filters each time.
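To make that last point concrete, here is a sketch (same hypothetical broker and datasource as above) of two queries that differ only in their filter. Because the per-segment result cache keys on the whole query, the second query cannot reuse the first one's cached results; it will still be fast, though, if the segment data is already sitting in the page cache from the warm-up step.

```python
# Sketch: two otherwise identical queries with different filters.  The
# per-segment result cache keys on the full query, so query_b cannot reuse
# query_a's cached results -- but both benefit from a warmed page cache.
import copy
import json
import urllib.request

BROKER_URL = "http://localhost:8082/druid/v2/"   # same hypothetical broker as above
DATASOURCE = "my_datasource"

base_query = {
    "queryType": "timeseries",
    "dataSource": DATASOURCE,
    "granularity": "all",
    "intervals": ["2019-01-01/2020-01-01"],        # hypothetical interval
    "aggregations": [{"type": "count", "name": "rows"}],
}

def run(query):
    request = urllib.request.Request(
        BROKER_URL,
        data=json.dumps(query).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

query_a = copy.deepcopy(base_query)
query_a["filter"] = {"type": "selector", "dimension": "country", "value": "US"}  # hypothetical dimension

query_b = copy.deepcopy(base_query)
query_b["filter"] = {"type": "selector", "dimension": "country", "value": "DE"}  # different filter => result-cache miss

print(run(query_a))
print(run(query_b))
```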
