Routing Docker container logs to ELK using Docker Compose
Suppose you have several Docker containers in production and one (or many!) of
them suddenly decides to explode. Now it’s your turn to debug them.
Sure, if the error is easy to detect you can run:
docker logs -f containername
Then you wait until all the logs are downloaded or the error shows up again,
deploy a fix, test, and get back to bed.
Now for the challenge: try doing this with a container that spits out 30
log lines per second and has been running non-stop for months.
It will take longer to download all those logs than to actually
fix the problem.
You need a more robust logging solution, ideally one that lets you filter
your log messages down to a specific level, such as error.
For this, we’ll use ELK (Elasticsearch + Logstash + Kibana) together with Docker Compose.
I’ve set up a repo with an example and further instructions: https://github.com/francolaiuppa/docker-gelf-logging-example
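As a rough sketch of the idea (the service names, image, and port here are illustrative assumptions, not the repo’s actual configuration), a Compose file can ship a service’s stdout/stderr to Logstash via Docker’s gelf logging driver:

```yaml
# docker-compose.yml (sketch -- see the linked repo for the real config)
version: "2"
services:
  app:
    image: myorg/myapp            # hypothetical application image
    logging:
      driver: gelf                # ship stdout/stderr as GELF messages
      options:
        # 12201 is the conventional GELF UDP port; point this at the
        # host/port where Logstash's GELF input is listening.
        gelf-address: "udp://localhost:12201"
```

Note that gelf-address is resolved by the Docker daemon on the host, not from inside the container, which matters when choosing the hostname.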
This works with any language or framework: as long as your application can write to
stdout, you should be good to go.
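For example, a service in any runtime only has to print its log lines to stdout for the logging driver to pick them up. A minimal Python sketch (the log helper and its field names are my own illustration, not part of the example repo):

```python
import json
import sys
import time

def log(level, message):
    # Emit a structured log line on stdout; Docker's logging driver
    # (json-file, gelf, etc.) captures anything written to stdout/stderr.
    record = {"ts": time.time(), "level": level, "message": message}
    print(json.dumps(record))
    sys.stdout.flush()  # avoid losing lines to output buffering
    return record

log("error", "database connection refused")
```

Emitting JSON rather than free text makes it easy for Logstash to index the level field, so you can later filter on level:error in Kibana.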
Unfortunately, you can’t configure this from inside the container: the
logging property applies at the (host) Docker level, not at the container level.
That said, you can expose the port only to the local machine or to a specific IP address.
For more info, please check the docker-compose docs on ports.
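For instance (values illustrative), binding Logstash’s GELF UDP port to the loopback interface in the ports mapping keeps it unreachable from other machines:

```yaml
# sketch: expose the GELF input on localhost only
services:
  logstash:
    ports:
      - "127.0.0.1:12201:12201/udp"  # host_ip:host_port:container_port
```

Replacing 127.0.0.1 with a specific host IP restricts the listener to that interface instead.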
Thanks to Christophe Labouisse’s article for pointing me in the right direction.