blitzortung.org Daily Position Cron

As you probably know from my blog, I have a number of Arduinos around the house for monitoring household and weather metrics, and I’m always looking for ways to add more devices and data sets.

My dad happened to find a map one summer on blitzortung.org showing live lightning strikes, and we decided to sign up for a lightning detector in December 2015 (I think!).  When you register, you use your email address, and you can see where you are in the list of people waiting for a device by going back to en.blitzortung.org/cover_your_area.php and filling in your email plus the other boxes.

After one year on the waiting list, our position in the queue had moved a bit, but checking the list once a week became tedious, so I wrote a script that I’ve now made generic enough for anyone.  Simply add it to crontab and take the pain out of checking every week 🙂

GitHub gist: https://gist.github.com/vls29/aac9d3efaf265734dfc1b64c46482160#file-blitzortung-position-sh

#!/bin/bash

# The only argument is the email address used on the waiting list
email=$1

# The form expects the current epoch time
epc=$(date +%s)
echo "epoch: $epc"

# Country does not seem to be important
res=$(curl --silent --data "info_time=$epc&info_email=$email&info_country=United+Kingdom&info_text=TSqrb" http://en.blitzortung.org/cover_your_area.php)

# Grab the line of the response containing our email address
html=$(echo "$res" | grep "$email")

# The position appears at a fixed offset plus the length of the email address
textpositionstart=$((411 + ${#email} + 1))
echo "textpositionstart: $textpositionstart"

# Take a short slice from that offset and keep only the leading digits
position=${html:$textpositionstart:6}
position=$(echo "$position" | sed 's@^[^0-9]*\([0-9]\+\).*@\1@')
echo "position: $position"

# Compare against the position saved on the previous run
lastpositionfilename=blitzortung-last-position.txt
if [ -f "$lastpositionfilename" ]; then
    echo "found last position file"
    lastposition=$(cat "$lastpositionfilename")
else
    echo "didn't find last position file"
    lastposition=100000
fi
echo "lastposition: $lastposition"

# Only send an email when we have moved up the queue
if [ "$position" -lt "$lastposition" ]; then
    echo "$position" | mail -s "blitzortung.org position" "$email"
    echo "$position" > "$lastpositionfilename"
else
    echo "position not less than $lastposition: $position"
fi

exit 0

The only input to the script is the email address you used on the waiting list (assuming you haven’t hardcoded it in the script like I have).  You don’t need the country, as that doesn’t appear to be used by the site to verify the email address.
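To run the script from cron as suggested above, a crontab entry along these lines would do the job. The schedule, paths and email address here are placeholders, not from the original post:

```shell
# Check the waiting-list position once a week (Mondays at 08:00).
# Script path, log path and email address are placeholders - adjust to suit.
0 8 * * 1 cd /home/pi && ./blitzortung-position.sh you@example.com >> blitzortung-position.log 2>&1
```

Note the script writes blitzortung-last-position.txt relative to the current working directory, which is why the entry cd’s to a fixed directory first.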

Monumental App Update Mess Up!

A few weeks back I received a request to add in the ability to select half days in the Retirement Countdown Clock app (http://blog.v-s-f.co.uk/2016/02/retirement-countdown-clock-app/) and I decided that this was a quick change that wouldn’t take too long, so why not 🙂

Well, I made a complete mess of the update… It started off seeming like a simple change, but I’d just had a rather large problem on my laptop that killed the SSD, so I didn’t have much software installed on the new drive. Once all the necessary apps were installed, I set about updating the app, adding in the ability to select half days. It only took about four hours in total to make the code changes and test them (most of which was spent updating the runtime target version). I packaged it, tested it on my laptop and an old phone (both of which installed it from fresh), and then added a new submission to the store.

Job done 🙂 or so I thought…

Two days after the app was published to the store, I logged in and to my horror I’d received over 11,000 crash reports!!! O.M.G!

All the crash reports were for the new version (2.1.0.0) and all were in exactly the same line of code… So how come it had worked on my laptop and phone? The key answer was that they had installed the app from fresh rather than doing an update. I dashed around the house to find another phone I hadn’t tested on and updated the app from the store. Lo and behold, it crashed as soon as you tried to open it from the start screen 🙁

I had all the info I needed in the crash reports to find the particular dodgy line of code: it wasn’t handling the previously stored int and converting it correctly into a decimal. Less than two hours later, a new submission was sent to the store for approval, but it takes a minimum of a day to get a submission approved… In that time the crash reports topped 20,000.

I learnt a very valuable lesson: don’t rush a change through, even if it seems simple, and make sure you test it as an upgrade as well as a fresh install!

Sorry to all those people who downloaded the dodgy update; hopefully you’ve updated to 2.1.1.0 and it’s now working again.

Home Monitoring Upgrade (Part 2) – HSQLDB to MySQL

As mentioned in my previous post (see http://blog.v-s-f.co.uk/2017/04/home-monitoring-upgrade/), the first task I have is to migrate from HSQLDB to MySQL.

Because the system logs data every minute while I have power in the house and I want to minimise downtime when I actually have a fully working upgrade, I’ve experimented with a copy of the live HSQL database.

Once I’d copied it from my server to my laptop, I attempted to view the data in Notepad++ – yeah, not a particularly smart move! NP++ cannot handle files over about 100MB. It also turns out (having more’d the .data file) that the data is not stored in a readable state: I’d had a performance issue with a select query a long time back and changed the table to cached.

So, luckily, a colleague at work had introduced us all to a great database tool called SQL Workbench. It works with most databases and, unlike SQuirreL, it doesn’t crash when looking at the work DB2 database.

Using SQL Workbench, I then created a script file which creates the new consolidated table, loads the data from the old tables into the new one, and drops the old tables. The end result is a 122,398KB HSQLDB script file which is human readable.

The next step was to get MySQL running on my server. Instead of installing it directly, though, I used the Docker image that’s available.

My first few attempts at inserting the data from the HSQLDB file into the MySQL database were less than impressive! One attempt had the server running flat out (100% CPU) for over an hour before I finally decided it was probably not going to complete the import this year and nuked it!

So, having learnt a few lessons about not using single-row inserts(!) but batching them into multi-row inserts of 100,000, plus a few MySQL default parameter increases (although I’m not sure these are necessary, as the batch inserts seemed to make the most difference), I was finally able to import the data. It still took a few minutes from starting the MySQL container to the data being available, but that’s significantly better than running for hours importing the data!!
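The batching step above can be sketched with a little awk: it collapses runs of single-row INSERT statements from the dump into multi-row statements. The table name `readings`, the file names and the tiny batch size are assumptions for illustration (the post used batches of 100,000):

```shell
# Tiny sample dump standing in for the real HSQLDB export
# ("readings" is a placeholder table name, not from the original).
cat > dump.sql <<'EOF'
INSERT INTO readings VALUES (1,'2017-04-01',230);
INSERT INTO readings VALUES (2,'2017-04-01',231);
INSERT INTO readings VALUES (3,'2017-04-01',229);
INSERT INTO readings VALUES (4,'2017-04-01',232);
EOF

# Collapse single-row INSERTs into multi-row batches of n rows
# (100,000 in practice; 3 here so the output is easy to eyeball).
awk -v n=3 '
  /^INSERT INTO readings VALUES / {
    sub(/^INSERT INTO readings VALUES /, ""); sub(/;$/, "")
    rows = rows (count ? "," : "") $0
    if (++count == n) {
      print "INSERT INTO readings VALUES " rows ";"
      rows = ""; count = 0
    }
    next
  }
  { print }   # pass any other statements through untouched
  END { if (count) print "INSERT INTO readings VALUES " rows ";" }
' dump.sql > batched.sql

cat batched.sql
```

Each multi-row statement costs one round trip and one commit instead of one per row, which is where the bulk of the speed-up comes from.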

Now that I have the necessary scripts and knowledge to migrate the data, the next part is re-writing the application that receives the Arduino data, uploads to PVOutput and serves the hot water display Arduino.