Monitoring Speedtest.net CLI Results in Splunk
I was reading through posts on thinkbroadband’s forum yesterday and came across a post mentioning that Speedtest.net has a CLI! I don’t know why, but I’d never realised there was a CLI for Speedtest.net. I then went on to Docker Hub to see if anyone had already created an image based on the example on Speedtest.net’s website for installing the Speedtest CLI into a container, and luckily quite a few people had, so I pulled down tamasboros/ookla-speedtest after checking its Dockerfile didn’t contain any unexpected code.
docker run \
--rm \
tamasboros/ookla-speedtest
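If you want to sanity-check an image like this before trusting it, the Dockerfile is visible on Docker Hub, and you can also inspect locally what the image is configured to run. A quick sketch (just one way of doing it):
docker pull tamasboros/ookla-speedtest
# Show the entrypoint and default command baked into the image
docker image inspect tamasboros/ookla-speedtest \
--format 'Entrypoint: {{json .Config.Entrypoint}}, Cmd: {{json .Config.Cmd}}'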
I then thought… wouldn’t it be great to put this into Splunk on a 30 minute basis? A quick Google later and I found someone had already done much the same thing (https://amiracle19.blogspot.com/2016/02/getting-bandwidth-data-into-home.html), but rather than run the script from Splunk, I’d prefer to have a script run on a cron and fire the data to Splunk. Below is how I achieved that in a matter of an hour from start to finish.
First step is to BACK UP MY SPLUNK INDEX DIRECTORIES!! Last time I added an index, I accidentally overwrote my Splunk index directory and lost over a year’s worth of data 🙁 I’d rather not go through that again…
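My backup is nothing clever – just a copy of the index data directory taken while Splunk is stopped so the copy is consistent. Something along these lines, assuming Splunk lives in /opt/splunk and your data directory matches mine (adjust both to your own setup):
# Stop Splunk so nothing is writing to the index files mid-copy
sudo /opt/splunk/bin/splunk stop
# Take a dated copy of the index data directory
sudo cp -a /mnt/splunk-data /mnt/splunk-data.bak-$(date +%Y%m%d)
sudo /opt/splunk/bin/splunk start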
Next, create a new index to store the speedtest data within indexes.conf:
[speedtest]
coldPath = /mnt/splunk-data/speedtest/colddb
enableDataIntegrityControl = 0
enableTsidxReduction = 0
homePath = /mnt/splunk-data/speedtest/db
maxTotalDataSizeMB = 1024
thawedPath = /mnt/splunk-data/speedtest/thaweddb
I’ve given the index 1GB but this can be increased later as required.
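For the new stanza to take effect, Splunk needs a restart, after which you can confirm the index exists. Assuming a standard /opt/splunk install (adjust the path if yours differs):
sudo /opt/splunk/bin/splunk restart
# List indexes and check "speedtest" appears
sudo /opt/splunk/bin/splunk list index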
Because I’m going to use cURL to post the data after getting the Speedtest results, I need an HTTP listener rather than a TCP listener – if you cURL a TCP listener, the request hangs, and the request itself gets recorded in the index as a separate event from the data rather than as a single event.
To set up the HTTP listener, see https://docs.splunk.com/Documentation/Splunk/8.0.4/Data/UsetheHTTPEventCollector for setting up the listener using the GUI, or add the following stanza to inputs.conf in your app (or system/local):
[http://speedtest]
disabled = 0
index = speedtest
indexes = speedtest
sourcetype = _json
token = 22e2f868-72fb-4962-b6b9-71c7f26540da
To generate a UUID for the token, go to https://www.uuidgenerator.net/ and copy the randomly generated UUID.
Please note I’ve disabled SSL within the GUI for the above endpoint (see the previously linked documentation for how to get to the relevant setting).
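Before wiring the script up, it’s worth throwing a test event at the collector to confirm the token and index are happy – a successful post should come back with something like {"text":"Success","code":0}:
curl "http://127.0.0.1:8088/services/collector" \
-H "Authorization: Splunk 22e2f868-72fb-4962-b6b9-71c7f26540da" \
--data '{"sourcetype": "_json", "index": "speedtest", "event": {"message": "hec test"}}'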
Now I need a script that can run on a cron, triggering a Docker container which executes the speedtest and sends the JSON to the Splunk HTTP listener. I’ve overridden the default entrypoint provided by the author of that image as I want flat JSON rather than pretty-printed output, plus I want to use a specific speedtest server*:
speedtest.sh
#!/bin/bash

# Run the Speedtest CLI inside the container and capture the flat JSON output
result=$(docker run \
--rm \
tamasboros/ookla-speedtest \
speedtest \
--server-id=838 \
--format=json \
--progress=no \
--accept-license \
--accept-gdpr)

# Post the result to Splunk's HTTP Event Collector using the token from inputs.conf
curl "http://127.0.0.1:8088/services/collector" \
-H "Authorization: Splunk 22e2f868-72fb-4962-b6b9-71c7f26540da" \
--data "{\"sourcetype\": \"_json\", \"event\": ${result}}"
If you want a list of the parameters that can be passed to the speedtest command, run:
docker run \
--rm \
tamasboros/ookla-speedtest \
speedtest --help
Now I need a cron entry** to run the speedtest script on a regular basis – in my case, I’ve chosen to run it every 30 minutes.
*/30 * * * * /home/victoria/scripts/cron/speedtest.sh
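Make sure the script is executable and that the crontab belongs to a user who is allowed to run Docker. The entry can be installed non-interactively like this (the path is from my setup):
chmod +x /home/victoria/scripts/cron/speedtest.sh
# Append the schedule to the current crontab
(crontab -l 2>/dev/null; echo "*/30 * * * * /home/victoria/scripts/cron/speedtest.sh") | crontab -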
I can now see data coming into the index every 30 minutes, so it’s time to create a dashboard to display the new dataset.
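As a starting point for the dashboard panels, a search along these lines converts the bandwidth figures (which, as far as I can tell, the Speedtest CLI reports in bytes per second) into Mbps. I’m running it here via the Splunk CLI, but the search drops straight into a dashboard panel too – the field names assume the standard Speedtest CLI JSON layout, and admin:changeme is a placeholder for your own credentials:
sudo /opt/splunk/bin/splunk search \
"index=speedtest | eval download_mbps=round('download.bandwidth'*8/1000000,2), upload_mbps=round('upload.bandwidth'*8/1000000,2) | timechart span=30m avg(download_mbps) AS download, avg(upload_mbps) AS upload" \
-auth admin:changeme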
* I’ve chosen 838, a.k.a. Xilo – see https://c.speedtest.net/speedtest-servers-static.php for a list of servers
** If you don’t know your cron syntax very well – try out this great website https://crontab.guru