elastic-stack

Logstash: configuring aggregate + elapsed filters

北战南征 submitted on 2021-01-29 14:02:07
Question: I have these logs:

"03.08.2020 10:56:38","Event LClick","Type Menu","t=0","beg"
"03.08.2020 10:56:38","Event LClick","Type Menu","Detail SomeDetail","t=109","end"
"03.08.2020 10:56:40","Event LClick","t=1981","beg"
"03.08.2020 10:56:40","Event LClick","t=2090","end"
"03.08.2020 10:56:41","Event LClick","Type ToolBar","t=3026","beg"
"03.08.2020 10:56:43","Event LClick","Type ToolBar","Detail User_Desktop","t=4477","end"
"03.08.2020 10:56:44","Event FormActivate","Name Form_Name:IsaA","t=5444"
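A minimal sketch of how "beg"/"end" pairs like these are typically correlated with Logstash's elapsed filter: tag each line as a start or end event, extract a correlation field, and let the filter compute the duration. The tag names are the filter's defaults; using the event name as the correlation id is an assumption for illustration, not from the original question.

```
filter {
  # Tag start/end events so the elapsed filter can pair them.
  if [message] =~ /"beg"$/ {
    mutate { add_tag => ["taskStarted"] }
  } else if [message] =~ /"end"$/ {
    mutate { add_tag => ["taskTerminated"] }
  }
  # Extract a field shared by both events of a pair (assumed here).
  grok {
    match => { "message" => "\"Event %{WORD:event_name}\"" }
  }
  elapsed {
    start_tag       => "taskStarted"
    end_tag         => "taskTerminated"
    unique_id_field => "event_name"
    timeout         => 10
  }
}
```

The elapsed filter adds an `elapsed_time` field (in seconds) to the matching end event; unmatched starts expire after `timeout` seconds.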

Kafka Consumer Failed to load SSL keystore (Logstash ArcSight module) for any keystore type and path

柔情痞子 submitted on 2021-01-29 02:17:25
Question: I need to supply a certificate for client authentication for the Kafka consumer, but it always fails with the following exception (Failed to load SSL keystore):

ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = /usr/lib/jvm/java-8-openjdk-amd64/jre/lib/security/cacerts
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol
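"Failed to load SSL keystore" from a Kafka client usually means the path, store type, or password does not match the file on disk; note that in the dump above `ssl.keystore.location` points at the JVM's `cacerts` trust store rather than a client keystore holding a private key. A typical consumer SSL block looks like the sketch below; every path and password is a placeholder, not a value from the original question.

```
security.protocol=SSL
ssl.truststore.location=/etc/kafka/client.truststore.jks
ssl.truststore.password=<truststore-password>
ssl.keystore.type=JKS
ssl.keystore.location=/etc/kafka/client.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
```

If the keystore was exported as PKCS#12, `ssl.keystore.type` must be set to `PKCS12`, since a JKS loader will fail on a PKCS#12 file.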

How to send data from HTTP input to ElasticSearch using Logstash and jdbc_streaming filter?

耗尽温柔 submitted on 2021-01-29 00:56:23
Question: I want to send data from HTTP to Elasticsearch using Logstash, and I want to enrich my data using the jdbc_streaming filter plugin. This is my Logstash config:

input {
  http {
    id => "sensor_data_http_input"
    user => "sensor_data"
    password => "sensor_data"
  }
}
filter {
  jdbc_streaming {
    jdbc_driver_library => "E:\ElasticStack\mysql-connector-java-8.0.18\mysql-connector-java-8.0.18.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/sensor_metadata"
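The excerpt cuts off before the lookup itself. For reference, a complete jdbc_streaming filter needs a `statement`, a `parameters` mapping from event fields to SQL bind variables, and a `target` field to hold the result rows. The sketch below shows that shape; the credentials, table, and column names are assumptions for illustration only.

```
filter {
  jdbc_streaming {
    jdbc_driver_library    => "E:\ElasticStack\mysql-connector-java-8.0.18\mysql-connector-java-8.0.18.jar"
    jdbc_driver_class      => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/sensor_metadata"
    jdbc_user              => "<db-user>"
    jdbc_password          => "<db-password>"
    # Enrich each event with metadata looked up by a field on the event.
    statement  => "SELECT sensor_name, sensor_location FROM sensors WHERE id = :id"
    parameters => { "id" => "sensor_id" }
    target     => "sensor_meta"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sensor_data"
  }
}
```

The looked-up rows arrive as an array of hashes under `[sensor_meta]` on each event.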


Consider defining a bean named 'elasticsearchTemplate' in your configuration

不打扰是莪最后的温柔 submitted on 2021-01-27 05:20:11
Question: I have just started with Spring Boot and tried to integrate Elasticsearch, but while running the Spring Boot app I get this error: Consider defining a bean named 'elasticsearchTemplate' in your configuration. POM.XML:

<dependencies>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
  </dependency>
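This error typically appears when code (or a repository interface) requires an `elasticsearchTemplate` bean but the Spring Data Elasticsearch starter is not on the classpath; the POM excerpt above only declares JPA and web starters. A hedged sketch of the usual fix, assuming the version is managed by the Spring Boot parent POM:

```
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
```

With the starter present, Spring Boot auto-configures the Elasticsearch template/operations bean, provided the spring-data-elasticsearch version is compatible with the running Elasticsearch cluster.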

ELK. Nested values are not found

 ̄綄美尐妖づ submitted on 2021-01-07 04:14:09
Question: I have an index mapping like the one below:

{
  "mapping": {
    "properties": {
      "MyMapProperty": {
        "type": "nested",
        "properties": {
          "first": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "second": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          }
        }
      },
      "SecondProperty": {
        "type": "text",
        "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
      },
      "ThirdProperty": {
        "type": "text",
        "fields": { "keyword": { "type":
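Because `MyMapProperty` is mapped as `nested`, its sub-fields are indexed in separate hidden documents and will not match ordinary term or match queries; they must be wrapped in a `nested` query with the correct `path`. A sketch of that shape, where the index name and search value are assumptions:

```
GET /my_index/_search
{
  "query": {
    "nested": {
      "path": "MyMapProperty",
      "query": {
        "term": { "MyMapProperty.first.keyword": "some_value" }
      }
    }
  }
}
```

Querying `MyMapProperty.first` directly, without the `nested` wrapper, returns no hits even when the value exists, which is the usual cause of "nested values are not found".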

Elasticsearch unassigned shards CircuitBreakingException[[parent] Data too large

霸气de小男生 submitted on 2021-01-01 13:58:38
Question: I got an alert stating that Elasticsearch has 2 unassigned shards. I made the API calls below to gather more details.

curl -s http://localhost:9200/_cluster/allocation/explain | python -m json.tool

Output below:

"allocate_explanation": "cannot allocate because allocation is not permitted to any of the nodes",
"can_allocate": "no",
"current_state": "unassigned",
"index": "docs_0_1603929645264",
"node_allocation_decisions": [
  {
    "deciders": [
      {
        "decider": "max_retry",
        "decision": "NO",
        "explanation": "shard
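A `max_retry` decider saying NO means the shard failed allocation the maximum number of times (by default 5) and Elasticsearch stopped retrying; with a `CircuitBreakingException[[parent] Data too large`, the underlying failures were heap-pressure rejections. Once the memory pressure is addressed (more heap, fewer shards, or a raised `indices.breaker.*` limit), allocation can be retried manually. The host below is assumed to be the local node from the question:

```
curl -s -XPOST "http://localhost:9200/_cluster/reroute?retry_failed=true"
```

This resets the retry counter and asks the cluster to attempt allocation of the failed shards again; if the breaker still trips, the shards will return to unassigned.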


Logstash XML Parse Failed

假装没事ソ submitted on 2020-12-30 04:01:12
Question: I'm running the latest ELK stack 6.6 on the deviantony/docker-elk image. I have the following XML file, which I am trying to parse into an ES JSON object:

<?xml version="1.0" encoding="UTF-8"?>
<root>
  <ChainId>7290027600007</ChainId>
  <SubChainId>001</SubChainId>
  <StoreId>001</StoreId>
  <BikoretNo>9</BikoretNo>
  <DllVerNo>8.0.1.3</DllVerNo>
</root>

My conf file is:

input {
  file {
    path => "/usr/share/logstash/logs/example1.xml"
    type => "xml"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec =>
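The config above is cut off at the codec, which is also where multi-line XML usually goes wrong: the file input emits one event per line, so the xml filter sees fragments and fails to parse. A common pattern is a multiline codec that folds the whole document into one event, then the xml filter. This is a sketch under that assumption, not the asker's full config:

```
input {
  file {
    path => "/usr/share/logstash/logs/example1.xml"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # Accumulate lines until the closing tag; auto_flush_interval
    # flushes the buffered event after 1s of inactivity.
    codec => multiline {
      pattern => "</root>"
      negate => true
      what => "next"
      auto_flush_interval => 1
    }
  }
}
filter {
  xml {
    source => "message"
    target => "parsed"
    force_array => false
  }
}
```

With `target => "parsed"`, the child elements land as fields such as `[parsed][ChainId]` on the event.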

How to remove empty spaces from query results in Elasticsearch analytics

夙愿已清 submitted于 replaced: 夙愿已清 submitted on 2020-12-15 03:33:04
Question: So, I have Elasticsearch up and running, but when I look in my analytics console I see results for " ". For example, say a user types "Red and Black Sneakers" <---- the spaces between the words are tracked in the query and assigned clicks. That seems to throw off the analytics, because a space, " ", is not really a viable search. How do I omit these results, or properly attribute click-throughs to the actual search rather than to the spaces? This is a screen grab from the Elastic analytics section. See
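One common approach is to keep whitespace-only entries out of the analytics aggregation rather than out of the index. On a keyword field, the terms aggregation accepts an `exclude` regular expression that can drop blank keys. A sketch of that idea, where the index and field names are assumptions:

```
GET /search-analytics/_search
{
  "size": 0,
  "aggs": {
    "top_queries": {
      "terms": {
        "field": "search_query.keyword",
        "exclude": "\\s*"
      }
    }
  }
}
```

The cleaner long-term fix is to trim the query string client-side (or in an ingest pipeline) before it is logged, so clicks are attributed to the real search term in the first place.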