Routing Docker container logs to ELK using Docker Compose

 

The problem

Suppose you have several Docker containers in production and one (or many!) of
them suddenly decides to explode; now it's your turn to debug it.
Sure, if the error is easy to spot you can run:

docker logs -f containername

Then wait until all the logs are downloaded or the error shows up again, deploy
a fix, test it, and get back to bed.

Now for the challenge: try doing this with a container that spits out 30
log lines per second and has been running non-stop for months.
Downloading all those logs will take longer than actually fixing the problem.

The solution

Use a more robust logging solution, ideally one that lets you filter your
log messages down to just the errors or warnings.

For this, we need ELK (Elasticsearch + Logstash + Kibana) and the Docker Compose
logging property.
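
To give a feel for it, here's a minimal sketch of the relevant piece of a docker-compose.yml, assuming Logstash is listening for GELF on UDP port 12201 (the service name and image are made up; the repo linked below has the full, working example):

version: "2"
services:
  app:
    image: myapp:latest    # hypothetical application image
    logging:
      driver: gelf         # ship the container's stdout/stderr as GELF messages
      options:
        # resolved by the Docker host, not inside the container
        gelf-address: "udp://127.0.0.1:12201"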

I’ve set up a repo with an example and further instructions: https://github.com/francolaiuppa/docker-gelf-logging-example
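
Once the logs are flowing, the error/warn filtering mentioned above becomes a one-line query. GELF messages carry a numeric syslog severity (3 = error, 4 = warning), so in Kibana's search bar a Lucene range query along these lines keeps only warnings and worse (the exact field name depends on your Logstash pipeline):

level:[0 TO 4]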

Does it work with other languages?

Yes! As long as the language you’re using can write to stdout and stderr, you should be good to go.
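
For a quick smoke test in any language, you can point a one-off container at the same endpoint straight from the Docker CLI; whatever the process writes to stdout ends up in Logstash (the address is a placeholder for your own setup):

docker run --log-driver gelf --log-opt gelf-address=udp://127.0.0.1:12201 alpine echo "hello from alpine"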

I don’t like exposing my Logstash instance. Can I link it using Docker links?

Unfortunately no, you can’t: the logging property runs at the (host) Docker
level, not at the container level, so container links don't apply.
That said, you can expose the port only to the local machine or to a specific IP address.
For more info, please check the docker-compose docs on ports.
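
For instance, publishing the GELF port only on the loopback interface keeps Logstash unreachable from the outside while still reachable by the host-level logging driver (12201/udp is the conventional GELF port; pin a real Logstash version in practice):

services:
  logstash:
    image: logstash    # assumed image name
    ports:
      - "127.0.0.1:12201:12201/udp"    # bind only to the local machine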

Thanks to Christophe Labouisse’s article for pointing me in the right direction.