Elasticsearch, Logstash and Kibana (E.L.K.) on Docker – Part 3 Kibana

This is the final part of setting up Elasticsearch, Logstash and Kibana using the official Docker Hub images. If you haven’t already read Part 1 Logstash or Part 2 Elasticsearch, it might be good to read them first.

So far I have Logstash sending messages to Elasticsearch, each running in its own Docker container, and now I’m going to add Kibana. Kibana provides the graphical UI that lets you visualise the data, build dashboards, search for messages and so on.

Kibana talks to Elasticsearch to query the data, so the Docker run command needs to link the Kibana instance to the elasticsearch-node. For consistency, the Kibana instance is given the name kibana-node, and the ELASTICSEARCH_URL environment variable points it at elasticsearch-node via a fully qualified URL.

docker run --link elasticsearch-node:elasticsearch-node --name kibana-node -p 5601:5601 -d -e ELASTICSEARCH_URL=http://elasticsearch-node:9200 kibana
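Once that’s running, a quick check is to confirm the container is up:

docker ps

and then browse to Kibana on the published port, e.g. http://localhost:5601 (substituting your Docker host’s name for localhost if needed).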

When you first log into Kibana, it will ask for an index pattern to be created. This bit did catch me out, as you can’t create one until Elasticsearch has received a few events. Once it has, it’s as simple as giving it a name, selecting the right date field and clicking create.
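If you want to confirm that events have actually arrived before creating the index pattern, you can list the Elasticsearch indices. This assumes you’ve also published port 9200 when starting elasticsearch-node (e.g. by adding -p 9200:9200 to its run command, which the command in Part 2 doesn’t do):

curl 'http://localhost:9200/_cat/indices?v'

Once a logstash-YYYY.MM.DD index shows up with a document count above zero, the index pattern should create without complaint.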

Clicking on the Discover menu will then let you see the recent events and start creating queries that can be turned into dashboard widgets.

[Screenshot: the Kibana Discover view]

I haven’t yet had enough time to create any fancy dashboards to show as an example, as Kibana is a little different to the tool I’m more familiar with (Splunk).

The last point to mention is that none of the examples so far persist data anywhere other than inside the Docker container itself. Therefore, if the container is removed, so is all your historic data! To persist the data, I’d suggest having a look at Docker volumes, but since I haven’t tried it yet I can’t guarantee it’s the right answer!
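As a starting point, and very much an untested sketch, the official elasticsearch image keeps its data under /usr/share/elasticsearch/data, so mounting a host directory there should mean the data survives the container being removed:

docker run -d --name elasticsearch-node -v "$PWD/esdata":/usr/share/elasticsearch/data elasticsearch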

That concludes my mini-series on the subject of Elasticsearch, Logstash and Kibana (E.L.K.) on Docker 🙂

Elasticsearch, Logstash and Kibana (E.L.K.) on Docker – Part 2 Elasticsearch

It’s worth reading Part 1 Logstash first.

So, today I had a chance to try out Elasticsearch on Docker and it was semi-easy to get it to work… the trickiest part was linking the Logstash and Kibana instances with the Elasticsearch instance. The trick is to name everything!

So here’s the command to run Elasticsearch; note that I’ve given it the name elasticsearch-node.

docker run -d --name elasticsearch-node elasticsearch
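A quick way to confirm it has come up cleanly is to follow the container log and wait for the node to report that it has started:

docker logs -f elasticsearch-node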

In order to then link Logstash to the Elasticsearch node, we need to change the command used to run Logstash from this…

docker run -p 13456:9999 -it --rm -v "$PWD":/config-dir logstash -f /config-dir/logstash.conf

To this…

docker run -p 13456:9999 -d -it -v "$PWD":/config-dir --link elasticsearch-node:elasticsearch-node --name logstash-node logstash -f /config-dir/logstash.conf

The differences are that the Logstash container is now given the name logstash-node when it’s run, it runs detached (-d) rather than being removed on exit (--rm), and it is linked to the Elasticsearch node using its name, with an identical alias.
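If you want to check the link is in place, --link writes an entry for the alias into the linked container’s /etc/hosts, so the alias should be visible from inside the Logstash container:

docker exec logstash-node cat /etc/hosts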

The Logstash config file has also been changed to reference the Elasticsearch node as shown below.

input {
  tcp {
    port => 9999
    codec => line
  }
}

filter {
  kv {
    source => "message"
    recursive => "true"
  }
}

output {
  stdout {codec => rubydebug}
  elasticsearch {
    hosts => ["elasticsearch-node"]
  }
}
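One side effect of running detached (-d) rather than with --rm is that the rubydebug output no longer appears in your terminal, but it is still available in the container log:

docker logs -f logstash-node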

Coming up next… Kibana on Docker!

Elasticsearch, Logstash and Kibana (E.L.K.) on Docker – Part 1 Logstash

When I set up my new server back in May, I decided to try out Elasticsearch, Logstash and Kibana (E.L.K.) on it against the Aggregator (my PVOutput aggregating and uploading application) logs. It took me most of a day to get it installed, but because I followed a guide somewhere on the internet, I can’t remember how it was all configured!

I was searching Docker Hub the other day and thought… “I wonder if they have Docker images for E.L.K.?”

Luckily for me, they have taken the time to create official Docker images, and it’s a good excuse to uninstall E.L.K. from my server and redo it using those images!

I started off by trying out the basic Logstash example given on Docker Hub, which worked fine, and then decided to try to get the image to receive my log files via the Log4j SocketAppender. No matter how I tried to get the existing Aggregator application to send the logs (SocketAppender or even via a Docker volume), I could not get it to work…

So back to the drawing board!  Time to get a simple Java application up and running to try things out with… and that’s how the Spring Boot Web Example was born.

I started out with a basic Logstash config which had previously worked, using the log4j input, but found out later that Log4j and Log4j2 have incompatibilities here and an additional plugin would be needed if that input is to be used with Log4j2.

logstash.conf (1st attempt)

input {
  log4j {
    port => 9999
  }
}
filter {

}
output {
  stdout {}
}

But I don’t want to install another plugin… so I tried out various other “methods” (a.k.a. trial and error…) and eventually found that if you use the tcp input type, you can send data to it using the Log4j2 SocketAppender, provided the layout isn’t SerializedLayout.

logstash.conf (2nd attempt with tcp input and json codec)

input {
  tcp {
    port => 9999
    codec => json
  }
}

filter {
}

output {
  stdout {}
}

The above logstash.conf was combined with the SocketAppender and JSONLayout combo in the log4j2.xml config file:

<Socket name="socket" host="pompeii" port="13456" reconnectionDelayMillis="5000">
    <JSONLayout complete="true" compact="false" eventEol="true" />
</Socket>

But I still couldn’t get it to produce the results I was after until it dawned on me that perhaps I should change the problem around… If I just throw standard log strings at it, maybe I can break them up or format them into something that’s easier for Logstash to consume!

So I switched to the tcp input with the line codec, and decided that if I send key-value paired log messages to Logstash, I can use the kv filter. I’ve now ended up with…

input {
  tcp {
    port => 9999
    codec => line
  }
}

filter {
  kv {
    source => "message"
    recursive => "true"
  }
}

output {
  stdout {codec => rubydebug}
}

And changed the log4j2.xml config to use a pattern layout that works better with the kv filter.

log4j2.xml (the full file can be found in the spring-boot-example in the log4j2.xml.tcp file)

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn" strict="true" monitorInterval="30">
    <Properties>
        ...
        <Property name="defaultpattern">logdate=(%d{ISO8601}) thread=(%thread) level=(%level) loggerclass=(%logger{36}) message=(%msg)%n</Property>
    </Properties>
    <Filter type="ThresholdFilter" level="trace"/>
    <Appenders>
        ...
        <Socket name="socket" host="pompeii" port="13456" reconnectionDelayMillis="5000">
            <Layout type="PatternLayout" pattern="${defaultpattern}" />
        </Socket>
    </Appenders>
    <Loggers>
        <Logger name="uk.co.vsf" level="info" additivity="false">
            <AppenderRef ref="STDOUT"/>
            <AppenderRef ref="File"/>
            <AppenderRef ref="socket"/>
        </Logger>
        ...
    </Loggers>
</Configuration>

The above Log4j2 config file specifies a pattern that the key-value (kv) filter can read easily, working out where each value ends because the values are wrapped in brackets. The Logstash config file also specifies the rubydebug output codec, as I found out (the hard way) that having the debug output on gives you an awful lot of help when trying out config changes!
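To make that concrete, a log line produced by the pattern above looks something like this (the values here are purely illustrative):

logdate=(2015-08-30T20:15:01,123) thread=(http-nio-8080-exec-1) level=(INFO) loggerclass=(uk.co.vsf.SomeController) message=(user lookup complete)

Because every value sits inside brackets, the kv filter should split out logdate, thread, level, loggerclass and message as separate fields, even where a value contains spaces.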

Putting it all together and running Logstash in Docker is probably the easiest part! To run Logstash I have a run script containing the following command:

docker run -p 13456:9999 -it --rm -v "$PWD":/config-dir logstash -f /config-dir/logstash.conf

It publishes the Logstash container’s port 9999 as host port 13456 and mounts the current directory as /config-dir so that the custom logstash.conf file can be loaded.
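If you want to sanity-check the pipeline before wiring up a real application, and assuming you have netcat available, you can push a key-value line straight at the published port (localhost here, or your Docker host’s name, pompeii in my case) and watch what Logstash makes of it:

echo 'level=(INFO) message=(hello from netcat)' | nc localhost 13456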

At present Logstash doesn’t send the received messages anywhere, but it will log the input to the console. So here’s an example of calling the get users by id service.

[Screenshot: calling the get users by id service]

And the stdout from Logstash.

[Screenshot: Logstash stdout showing the parsed event]

That’s it for now, but (hopefully) in the next part I plan to hook Logstash up to Elasticsearch in Docker!