Why the news about Tesco Bank today doesn’t surprise me!

There’s a headline on the BBC that has caught my eye today – “Tesco Bank blames ‘systematic sophisticated attack’ for account losses” – and I’m not at all surprised given my experience back in 2009 when I had a Tesco Bank credit card account.

In 2006, before I got my first proper (full-time) job, I opened a credit card account with Tesco. Once the card arrived and all was well, I proceeded to set up online banking, as it’s the most convenient way of operating accounts, and gave the new account a unique password – as I do for all my accounts!

Roll on 2009 – I got home from work one day to find an envelope from Tesco Bank containing a copy (not the original) of one of my statements… After a brief “Huh?” moment I thought nothing of it, shredded it and forgot about it.

Then a while later I received a second copy of one of my statements and alarm bells rang!!

I rang Tesco Bank and asked why I’d received a copy of my statement, to which the advisor said they didn’t know, but that I had ordered it. I asked the advisor for the date and time of the request, and the date and time I was given fell while I was sitting in a client’s meeting room discussing a project we were working on… I was not impressed. I questioned how that was possible given that I was in the meeting with no phone or internet access, and was advised that perhaps I’d given my account details to someone else!! To which I replied: you have a problem with security and I want to close my account now.

I was never asked any further questions by the advisor and no one ever contacted me afterwards, so I can only assume the advisor thought I was nutty… but I know for a fact that the account details were never shared, were unique and were very unlikely to have been guessed.

Elasticsearch, Logstash and Kibana (E.L.K.) on Docker – Part 3 Kibana

This is the final part of setting up Elasticsearch, Logstash and Kibana using the official Docker Hub images. If you haven’t already read Part 1 Logstash or Part 2 Elasticsearch, it might be good to read them first.

So far I have a running Logstash instance sending messages to Elasticsearch, both running in separate Docker containers, and now I’m going to add Kibana. Kibana adds the graphical UI that enables you to visualise data, create dashboards, search for messages and so on.

Kibana talks to Elasticsearch to query the data, so for our Docker run statement we need to link the Kibana instance to the elasticsearch-node. For consistency, the Kibana instance is given the name kibana-node, and it’s passed a parameter telling it to talk to the elasticsearch-node via a fully qualified URL.

docker run --link elasticsearch-node:elasticsearch-node --name kibana-node -p 5601:5601 -d -e ELASTICSEARCH_URL=http://elasticsearch-node:9200 kibana
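
Since the command publishes port 5601, Kibana should then be reachable in a browser at http://localhost:5601. A quick way to check the container came up cleanly (using the container name above):

docker logs kibana-node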

When you first log into Kibana, it will ask for an index pattern to be created. This bit did catch me out, as you can’t create the index pattern until Elasticsearch has received a few events. Once it has, it’s as simple as giving it a name and selecting the right date field, then clicking create.
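
If Elasticsearch hasn’t received anything yet, one quick way to generate a test event is to send a line to the Logstash TCP input from Part 1 (published on host port 13456) – a rough sketch assuming netcat is installed; the key=value content is just an example:

echo "action=test user=alice" | nc localhost 13456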

Clicking on the Discover menu will then allow you to see the recent events and start creating new queries for turning into dashboard widgets.

[Kibana screenshot]

I haven’t yet had enough time to create any fancy dashboards to give an example as it’s a little different to the tool I’m more familiar with (Splunk).

The last point to mention is that none of the examples so far save data anywhere other than the container’s own filesystem. Therefore if the container is removed, so is all your historic data! To persist the data, I’d suggest having a look at Docker volumes – but since I haven’t tried it yet, I can’t guarantee it is the right answer!
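
For what it’s worth, a minimal (and, as noted, untested) sketch would be to mount a named volume over the Elasticsearch data directory, which I believe is /usr/share/elasticsearch/data in the official image:

docker run -d --name elasticsearch-node -v esdata:/usr/share/elasticsearch/data elasticsearch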

That concludes my mini-series on the subject of Elasticsearch, Logstash and Kibana (E.L.K.) on Docker 🙂

Elasticsearch, Logstash and Kibana (E.L.K.) on Docker – Part 2 Elasticsearch

It’s worth reading Part 1 Logstash first.

So, today I had a chance to try out Elasticsearch on Docker and it was semi-easy to get working… the trickiest part was linking the Logstash and Kibana instances with the Elasticsearch instance. The trick is to name everything!

So here’s the command to run Elasticsearch; note I’ve given it a name of elasticsearch-node.

docker run -d --name elasticsearch-node elasticsearch
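
To confirm the node has started before wiring anything else to it, you can check its logs (a simple sanity check using the name given above):

docker logs elasticsearch-node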

In order to then link Logstash to the Elasticsearch node, we need to change the command used to run Logstash from this…

docker run -p 13456:9999 -it --rm -v "$PWD":/config-dir logstash -f /config-dir/logstash.conf

To this…

docker run -p 13456:9999 -d -it -v "$PWD":/config-dir --link elasticsearch-node:elasticsearch-node --name logstash-node logstash -f /config-dir/logstash.conf

The differences are that the Logstash container is now given the name logstash-node when it’s run, it runs detached (-d rather than --rm), and it links to the Elasticsearch node via its name and an identical alias.

The Logstash config file has also been changed to reference the Elasticsearch node as shown below.

input {
  # Listen for newline-delimited events on TCP port 9999
  # (published as host port 13456 in the docker run command above)
  tcp {
    port => 9999
    codec => line
  }
}

filter {
  # Split key=value pairs out of the raw message into fields
  kv {
    source => "message"
    recursive => "true"
  }
}

output {
  # Print each event to stdout for debugging...
  stdout { codec => rubydebug }
  # ...and index it into Elasticsearch via the linked container alias
  elasticsearch {
    hosts => ["elasticsearch-node"]
  }
}
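
To sanity-check the pipeline end to end, you can send a test line to the published port and then look for the rubydebug output in the container’s logs – a rough sketch assuming netcat is available; the fields are made up:

echo "user=bob action=login" | nc localhost 13456
docker logs logstash-node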

Coming up next… Kibana on Docker!