This is the final part of setting up Elasticsearch, Logstash and Kibana using the official Docker Hub images. If you haven’t already read Part 1 Logstash or Part 2 Elasticsearch, it might be good to read them first.
So far I have Logstash sending messages to Elasticsearch, both running in separate Docker containers, and now I’m going to add Kibana. Kibana provides the graphical UI that lets you visualise data, build dashboards and search through messages.
Kibana talks to Elasticsearch to query the data, so for our Docker run statement we need to link the Kibana instance to the elasticsearch-node. For consistency, the Kibana instance is given the name kibana-node, and it’s passed a parameter telling it to talk to the elasticsearch-node via a fully qualified URL.
docker run \
  --link elasticsearch-node:elasticsearch-node \
  --name kibana-node \
  -p 5601:5601 \
  -d \
  -e ELASTICSEARCH_URL=http://elasticsearch-node:9200 \
  kibana
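As a quick sanity check after starting it, you can confirm the container is up and watch its logs for the connection to Elasticsearch (this assumes the container name kibana-node from the run command above):

```shell
# Confirm the kibana-node container is running
docker ps --filter name=kibana-node

# Follow the Kibana logs to see it connect to Elasticsearch on startup
docker logs -f kibana-node
```

Once the logs settle, the UI should be reachable on http://localhost:5601 via the published port.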
When you first log into Kibana, it will ask for an index to be created. This bit did catch me out, as you can’t create an index until Elasticsearch has received a few events. Once it has, it’s as simple as giving the index a name, selecting the right date field and clicking create.
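One way to check whether Elasticsearch has received any events yet is to list its indices directly (assuming port 9200 is published on localhost, as in the elasticsearch-node setup from Part 2):

```shell
# List all indices with headers; by default Logstash writes to
# daily indices named logstash-YYYY.MM.DD
curl "http://localhost:9200/_cat/indices?v"
```

If a logstash-* index shows up with a non-zero document count, Kibana should be able to create the index pattern.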
Clicking on the Discover menu will then allow you to see the recent events and start creating queries that can be turned into dashboard widgets.
I haven’t yet had enough time to create any fancy dashboards to give an example as it’s a little different to the tool I’m more familiar with (Splunk).
The last point to mention is that none of the examples so far persist data anywhere other than inside the Docker containers themselves. Therefore if a container is removed, so is all your historic data! To persist the data, I’d suggest having a look at Docker volumes – but since I haven’t tried it yet, I can’t guarantee it’s the right answer!
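As a rough sketch of what that might look like (untested by me, and the volume name and data path are assumptions – check the documentation for the image version you’re running):

```shell
# Create a named volume to hold the Elasticsearch data
docker volume create esdata

# Start Elasticsearch with the volume mounted at its data directory
# (/usr/share/elasticsearch/data is the path used by official images,
# but verify this against the docs for your image version)
docker run \
  --name elasticsearch-node \
  -p 9200:9200 \
  -v esdata:/usr/share/elasticsearch/data \
  -d \
  elasticsearch
```

With a named volume, removing and recreating the container leaves the index data intact in the volume.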
That concludes my mini-series on the subject of Elasticsearch, Logstash and Kibana (E.L.K.) on Docker 🙂