Docker apps logging with Filebeat and Logstash

执念已碎 2021-01-30 07:34

I have a set of dockerized applications scattered across multiple servers and I am trying to set up production-level centralized logging with ELK. I'm OK with the ELK part itself, but I'm not sure how to forward the container logs to Logstash, for example with Filebeat.

6 Answers
  •  Happy的楠姐
    2021-01-30 08:04

    Here's one way to forward docker logs to the ELK stack (requires docker >= 1.8 for the gelf log driver):

    1. Start a Logstash container with the gelf input plugin, reading from gelf and outputting to an Elasticsearch host (ES_HOST:PORT):

      docker run --rm -p 12201:12201/udp logstash \
          logstash -e 'input { gelf { } } output { elasticsearch { hosts => ["ES_HOST:PORT"] } }'
      
    2. Now start a Docker container and use the gelf Docker logging driver. Here's a dumb example:

      docker run --log-driver=gelf --log-opt gelf-address=udp://localhost:12201 busybox \
          /bin/sh -c 'while true; do echo "Hello $(date)"; sleep 1; done'
      
    3. Load up Kibana, and everything that would previously have landed in docker logs is now visible. The gelf source code shows that some handy fields are generated for you (hat-tip: Christophe Labouisse): _container_id, _container_name, _image_id, _image_name, _command, _tag, _created.
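
    For a concrete picture, each log line from the busybox container above is shipped to Logstash as a GELF message over UDP. A rough, hand-written sketch of one such payload (the host, container name, IDs and timestamps here are made up) could look like this:

      {
        "version": "1.1",
        "host": "docker-host-01",
        "short_message": "Hello Sat Jan 30 07:34:01 UTC 2021",
        "timestamp": 1611992041,
        "level": 6,
        "_command": "/bin/sh -c 'while true; do echo \"Hello $(date)\"; sleep 1; done'",
        "_container_id": "f2f5...",
        "_container_name": "nostalgic_morse",
        "_created": "2021-01-30T07:34:00Z",
        "_image_id": "sha256:6d5f...",
        "_image_name": "busybox",
        "_tag": "f2f5..."
      }

    Each of the fields listed above ends up on the resulting event, so you can search and filter on them in Kibana.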

    If you use docker-compose (make sure it is docker-compose >= 1.5), start the logstash container as above and then add the appropriate settings to docker-compose.yml:

    log_driver: "gelf"
    log_opt:
      gelf-address: "udp://localhost:12201"
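
    The snippet above uses the version 1 Compose file format. If you are on the version 2+ file format, the same idea (sketched here with a placeholder service name and the busybox example image) moves under a logging key:

      version: "2"
      services:
        app:
          image: busybox
          command: /bin/sh -c 'while true; do echo "Hello $(date)"; sleep 1; done'
          logging:
            # ship this service's stdout/stderr to the gelf endpoint from step 1
            driver: gelf
            options:
              gelf-address: "udp://localhost:12201"

    Either way, docker-compose up should then send the container's output to the Logstash container started in step 1.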
    
