logstash-configuration

Migrating 3 million records from Oracle to Elasticsearch using Logstash

让人想犯罪 submitted on 2021-02-11 14:24:50
Problem: We are trying to migrate around 3 million records from Oracle to Elasticsearch using Logstash. We apply a couple of jdbc_streaming filters as part of our Logstash script: one to load connected nested objects and another to run a hierarchical query that loads data into a second nested object in the index. We are able to index 0.4 million records in 24 hours; the total size occupied by those 0.4 million records is around 300 MB. We tried multiple approaches to migrate the data into Elasticsearch more quickly …
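A minimal sketch of such a pipeline, assuming an Oracle thin JDBC driver and hypothetical host, table, and column names; the per-event jdbc_streaming lookups are usually the bottleneck, so enabling their cache and raising the input's jdbc_fetch_size are the first knobs worth trying:

    input {
      jdbc {
        jdbc_connection_string => "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1"   # hypothetical host/service
        jdbc_user              => "app_user"
        jdbc_password          => "secret"
        jdbc_driver_library    => "/opt/drivers/ojdbc8.jar"
        jdbc_driver_class      => "Java::oracle.jdbc.OracleDriver"
        statement              => "SELECT id, name FROM orders"                 # hypothetical parent query
        jdbc_fetch_size        => 10000        # pull rows from Oracle in large batches
        jdbc_paging_enabled    => true
        jdbc_page_size         => 50000
      }
    }
    filter {
      jdbc_streaming {
        jdbc_connection_string => "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1"
        jdbc_user              => "app_user"
        jdbc_password          => "secret"
        jdbc_driver_library    => "/opt/drivers/ojdbc8.jar"
        jdbc_driver_class      => "Java::oracle.jdbc.OracleDriver"
        statement              => "SELECT * FROM order_lines WHERE order_id = :id"   # hypothetical child query
        parameters             => { "id" => "id" }
        target                 => "lines"       # nested objects land under this field
        use_cache              => true          # avoid re-running the lookup for repeated keys
        cache_size             => 10000
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "orders"
      }
    }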

Create a new index per day for Elasticsearch in Logstash configuration

天大地大妈咪最大 submitted on 2021-02-08 19:53:48
Problem: I intend to have an ELK stack setup where daily JSON inputs get stored in log files, one created per date. My Logstash shall listen to the input via these logs and store it in Elasticsearch at an index corresponding to the date of the log file entry. My logstash-output.conf goes something like: output { elasticsearch { host => localhost cluster => "elasticsearch_prod" index => "test" } } Thus, as of now, all inputs to Logstash get stored at the index test of Elasticsearch. What I want …
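A minimal sketch of the usual fix, keeping the asker's 1.x-era host/cluster options and only changing the index setting to a date pattern; the pattern is expanded from each event's @timestamp, so a date filter should first set @timestamp from the log entry's own date:

    output {
      elasticsearch {
        host    => localhost
        cluster => "elasticsearch_prod"
        index   => "test-%{+YYYY.MM.dd}"   # one index per day, e.g. test-2021.02.08
      }
    }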

Not able to insert JSON from PostgreSQL to elasticsearch. Getting error - “Exception when executing JDBC query”

半城伤御伤魂 submitted on 2021-02-08 10:44:22
Problem: I am trying to migrate data from a PostgreSQL server to Elasticsearch. The Postgres data is in JSONB format. When I start the Logstash pipeline, I get the error below:
[INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-07T14:22:34,625][INFO ][logstash.inputs.jdbc ] (0.128981s) SELECT to_json(details) from inventory.retailer_products1 limit 1
[2019-01-07T14:22:35,099][WARN ][logstash.inputs.jdbc ] Exception when executing JDBC query {:exception=>#<Sequel: …
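A minimal sketch of a workaround often suggested for this class of error, under the assumption that the Sequel/JDBC layer cannot map the Postgres json type: cast the column to text in the SELECT and parse it back into a structured field with the json filter. The driver path, credentials, database name, and index name below are placeholders:

    input {
      jdbc {
        jdbc_connection_string => "jdbc:postgresql://localhost:5432/inventory_db"   # hypothetical database
        jdbc_user              => "postgres"
        jdbc_password          => "secret"
        jdbc_driver_library    => "/opt/drivers/postgresql-42.2.5.jar"
        jdbc_driver_class      => "org.postgresql.Driver"
        statement              => "SELECT to_json(details)::text AS details FROM inventory.retailer_products1"
      }
    }
    filter {
      json {
        source => "details"   # parse the text column back into a nested JSON object
        target => "details"
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "retailer_products"
      }
    }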

Logstash: configuring aggregate + elapsed filters

北战南征 submitted on 2021-01-29 14:02:07
Problem: I have these logs:
"03.08.2020 10:56:38","Event LClick","Type Menu","t=0","beg"
"03.08.2020 10:56:38","Event LClick","Type Menu","Detail SomeDetail","t=109","end"
"03.08.2020 10:56:40","Event LClick","t=1981","beg"
"03.08.2020 10:56:40","Event LClick","t=2090","end"
"03.08.2020 10:56:41","Event LClick","Type ToolBar","t=3026","beg"
"03.08.2020 10:56:43","Event LClick","Type ToolBar","Detail User_Desktop","t=4477","end"
"03.08.2020 10:56:44","Event FormActivate","Name Form_Name:IsaA","t=5444" …
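A minimal sketch of how the elapsed filter pairs such "beg"/"end" lines, assuming earlier filters (csv or grok, not shown) have already parsed each line into an event field plus a marker field holding "beg" or "end", and assuming the event name is an adequate correlation key; lines like these have no shared unique identifier, so that choice of unique_id_field is only an assumption:

    filter {
      # assumption: parsing produced [event] (e.g. "LClick") and [marker] ("beg"/"end")
      if [marker] == "beg" {
        mutate { add_tag => ["taskStarted"] }
      }
      if [marker] == "end" {
        mutate { add_tag => ["taskTerminated"] }
      }
      elapsed {
        start_tag       => "taskStarted"
        end_tag         => "taskTerminated"
        unique_id_field => "event"   # assumption: the Event value correlates a beg with its end
        timeout         => 60        # stop waiting for a matching end after 60 seconds
      }
    }

When a pair matches, the filter adds an elapsed_time field (in seconds) to the end event, which can then be aggregated further downstream.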

Read log file from a remote machine with file input plugin using logstash

我的梦境 submitted on 2020-08-02 20:54:01
Problem: Presently my logs and Logstash run on the same machine, so I read the logs placed on my local machine with this config (a pull model): input { file { path => "/home/Desktop/Logstash-Input/**/*_log" start_position => "beginning" } } Now we have Logstash running on a different machine and want to read the logs from the remote machine. Is there a way to set the IP in the file input of the config file? EDIT: I managed to do this with logstash-forwarder, which is a push model (log shipper/logstash …
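The file input only reads the local filesystem, so the usual answer is the push model the asker ended up with: run a shipper (logstash-forwarder, or its successor Filebeat) on the machine that holds the logs and have Logstash listen on a network input. A minimal sketch of the receiving side, assuming the Beats protocol and a placeholder port:

    input {
      beats {
        port => 5044   # Filebeat (or another Beats shipper) on the remote machine pushes events here
      }
    }

With logstash-forwarder specifically, the equivalent receiving input is lumberjack, which additionally requires ssl_certificate and ssl_key settings on the Logstash side.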