notes by alifeee, tagged scripting (19)


here I may post some short, text-only notes, mostly about programming.



automating the turning on and off of my Minecraft server

2025-06-10 • tags: minecraft, scripting, tmux, cron, nginx • 1108 'words', 332 secs @ 200wpm

I run a Minecraft server weekly on Tuesdays. Sometimes, I even play on it.

This describes automating the process for turning it on and off. Won't somebody at https://ggservers.com/ please hire me /jk.

The process

Turning it on

My process every Tuesday to turn on the server has been:

Turning it off

Then, on Wednesday mornings (if I remember), I:

The problems

Each of these steps can take a few seconds to run, so I am often multitasking, and I often forget things (like the backup, or actually running shutdown after everything else is done).

So, I've tried to automate it.

Doing it automatically

I found out that Kamatera (the server host) has an API that you can use to remotely turn on/off servers, which is the only thing that I was really missing.

cron tasks - web server

Here are the cron tasks on my web server:

# turn Minecraft server on/off
45 16 * * 2 /home/alifeee/minecraft/togglepower.sh on >> /home/alifeee/minecraft/cron.log 2>&1
5 4 * * 3 /home/alifeee/minecraft/rsync_backup.sh >> /home/alifeee/minecraft/cron.log 2>&1
15 4 * * 3 /home/alifeee/minecraft/togglepower.sh off >> /home/alifeee/minecraft/cron.log 2>&1

cron tasks - minecraft box

…and the cron tasks on the minecraft box:

55 16 * * 2 /home/alifeee/minecraft/tmux_make.sh >> /home/alifeee/minecraft/cron.log 2>&1
0 4 * * 3 /home/alifeee/minecraft/tmux_kill.sh >> /home/alifeee/minecraft/cron.log 2>&1

human description of cron jobs

Hopefully you can see the similarities to the process I described above.
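The five fields of a cron line are minute, hour, day-of-month, month, day-of-week. As a sketch, the web-server entries above decode like this (the `decode` helper is mine, not part of the real setup):

```shell
#!/bin/bash
# decode the five cron fields: minute hour day-of-month month day-of-week
# (day-of-week: 0 = Sunday … 6 = Saturday)
decode() {
  local days=(Sunday Monday Tuesday Wednesday Thursday Friday Saturday)
  printf '%02d:%02d every %s\n' "$2" "$1" "${days[$5]}"
}
decode 45 16 '*' '*' 2  # togglepower.sh on  → 16:45 every Tuesday
decode 5  4  '*' '*' 3  # rsync_backup.sh    → 04:05 every Wednesday
decode 15 4  '*' '*' 3  # togglepower.sh off → 04:15 every Wednesday
```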

The scripts

These scripts are pretty simple. They are:

togglepower.sh - turn on/off the minecraft box

$ cat togglepower.sh
#!/bin/bash
# power on server
date
onoroff="${1}"
echo "got instruction: turn server <${onoroff}>"
if [[ ! "${onoroff}" == "on" ]] && [[ ! "${onoroff}" == "off" ]]; then
  echo "usage: ./togglepower.sh [on|off]"
  exit 1
fi
# clientId, secret, and serverid come from the environment (values redacted)
serverid="${serverid}"
auth=$(curl -s --request POST 'https://console.kamatera.com/service/authenticate' \
--header 'Content-Type: application/json' \
--data '{
    "clientId": "'"${clientId}"'",
    "secret": "'"${secret}"'"
}')
authentication=$(echo "${auth}" | jq -r '.authentication')
status=$(curl -s --request \
  GET "https://console.kamatera.com/service/server/${serverid}" \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer ${authentication}"
)
power=$(echo "${status}" | jq -r '.power')
echo "current power: ${power}"
if [[ "${power}" == "${onoroff}" ]]; then
  echo "power is already ${onoroff}… quitting…"
  exit 1
fi
result=$(curl -s --request PUT \
  "https://console.kamatera.com/service/server/${serverid}/power" \
  --header 'Content-Type: application/json' \
  -H "Authorization: Bearer ${authentication}" \
  --data '{"power": "'"${onoroff}"'"}'
)
echo "complete! got ${result} from API call"

run - run the Minecraft server

$ cat ./run 
#!/bin/bash
java \
  -Xmx1G \
  -jar fabric-server-mc.1.21.4-loader.0.16.10-launcher.1.0.1.jar \
  nogui

tmux_make.sh - make a tmux session and run the Minecraft server in it

$ cat tmux_make.sh 
#!/bin/bash
date
session="minecraft"
echo "making tmux session ${session}"
tmux new-session -d -s "${session}"
echo "sending run"
tmux send-keys -t "${session}" './run' 'C-m'
echo "created !"

tmux_kill.sh - stop the Minecraft server and stop the tmux session

$ cat tmux_kill.sh 
#!/bin/bash
date
session="minecraft"
echo "sending CTRL+C to ${session}"
tmux send-keys -t "${session}" 'C-c'
echo "sent CTRL+C… sleeping 30s…"
sleep 30
echo "killing session ${session}"
tmux kill-session -t "${session}"
echo "killed session"

rsync_backup.sh - get backups using rsync

$ cat rsync_backup.sh 
#!/bin/bash
date
echo "saving cron log"
rsync minecraft:/usr/alifeee/minecraft/cron.log cron_minecraft.log
date
echo "saving world"
rsync -r minecraft:/usr/alifeee/minecraft/world/ world/
date
echo "saving dynmap"
rsync -r minecraft:/usr/alifeee/minecraft/dynmap/web/ dynmap/web/
date
echo "done!"

What about the map?

Well, I figured this was too annoying to automate, so I just wrote a front page to pick whether you wanted the "dead map" or the "live map" (on https://map.mc.alifeee.net/ – link probably dead).

The HTML for this simple picker makes quite a nice page:

see HTML
<!DOCTYPE html>
<html>
<head>
<meta name="viewport" content="width=device-width, initial-scale=1" />
<style>
html, body {
  background: black;
  color: white;
  height: 100%;
  font-family: sans-serif;
}
body {
  display: flex;
  flex-direction: column;
  align-items: center;
  justify-content: center;
}
img {
  margin-bottom: 1rem;
}
.options {
  display: flex;
}
.option {
  background: orange;
  padding: 1rem;
  border-radius: 0.5rem;
  margin: 0.5rem;
  color: black;
  text-decoration: none;
  max-width: 10rem;
  text-align: center;
}
.option.two {
  background: purple;
  color: white;
}
.option span {
  opacity: 0.75;
}
</style>
</head>

<body>

<h1>minecraft dynmap</h1>

<img src="/map/tiles/world/flat/0_0/zz_16_4.webp" />

<section class="options">
  <a class="option one" href="/map/">
    <h2>dead map</h2>
    <span>viewable all week, updates on server shutdown</span>
  </a>
  <a class="option two" href="https://livemap.mc.alifeee.net/">
    <h2>live map</h2>
    <span>viewable only when the server is live, shows players</span>
  </a>
</section>
</body>
</html>

This is served with a special nginx configuration, which serves the picker as a static file at the root, and otherwise serves the map content via alias (not root):

server {
    server_name map.mc.alifeee.net;
    location / {
        root /var/www/dynmap/;
        try_files /whichmap.html =404;
    }
    location /map/ {
        alias /var/www/dynmap/;
        try_files $uri $uri/ =404;
    }
}

Does it work?

I think it works. I'll see if I have to make any edits tomorrow or next week.

I love scripting !


creating a desktop overlay to view players on a Minecraft server with conky

2025-06-03 • tags: conky, minecraft, scripting, overlay • 1049 'words', 315 secs @ 200wpm

Currently, I'm hosting a Minecraft server weekly on Tuesdays. Sometimes I even play.

It's Vanilla with a proximity voice chat mod (walk near people to hear them). Proximity voice chat is endlessly fun (see Barotrauma, Factorio, et cetera…)

Today, I wanted to have an overlay (think Discord voice chat overlay, or when you pop-out a video in Firefox, or when you use chat heads on mobile) which showed me who was online on the server.

Querying the Minecraft server status

The server's server.properties file has an "enable-status" option; searching up what it meant told me it allows services to "query the status of the server". I'd used https://mcsrvstat.us/ before to check the status of the server, which shows you the player list in a browser.

But a local overlay would need a local way to query the server status. So I did some web searching, found a Python script which wasn't great (and written for Python 2), then a self-hostable server status API, which led me to mcstatus, a Python API (with command line tool) for fetching server status.

I installed and tested it with

$ cd ~/temp/minecraft/
$ python3 -m venv env
$ ./env/bin/pip install mcstatus
$ ./env/bin/python -m mcstatus $SERVER_IP json
{"online": true, "kind": "Java", "status": {"players": {"online": 7, "max": 69, "sample": [{"name": "Boldwolf5491", "id": "289qfhj8-a8f2-298g-19ga-897ahwf8uwa8"}, {"name": "……………

Neat!

How to make an overlay on Linux

Next, a way of having an overlay. Searching for "linux x simple text overlay" led me to xmessage, which can show simple windows, but they're more like confirmation windows, not like long-lasting status windows (i.e., it's hard to update the text).

I was also led to discover conky, which – if nothing else – has a great name. It's designed to be a "system monitor", i.e., a thing wot shows you your CPU temperature, uptime, RAM usage, et cetera. The configuration is also written in Lua, which is super neat! I still want to get more into Lua.

Using conky

By modifying the default configuration (in /etc/conky/conky.conf) like so:

diff --git a/etc/conky/conky.conf b/.config/conky/conky.conf
index 44053d5..cc319e1 100644
--- a/etc/conky/conky.conf
+++ b/.config/conky/conky.conf
@@ -37,8 +37,9 @@ conky.config = {
     out_to_stderr = false,
     out_to_x = true,
     own_window = true,
+    own_window_title = 'Minecraft',
     own_window_class = 'Conky',
-    own_window_type = 'desktop',
+    own_window_type = 'normal', -- or desktop
     show_graph_range = false,
     show_graph_scale = false,
     stippled_borders = 0,
@@ -48,25 +49,9 @@ conky.config = {
     use_xft = true,
 }
 
 conky.text = [[
-${color grey}Info:$color ${scroll 32 Conky $conky_version - $sysname $nodename $kernel $machine}
-$hr
-${color grey}Uptime:$color $uptime
-${color grey}Frequency (in MHz):$color $freq
-${color grey}Frequency (in GHz):$color $freq_g
-${color grey}RAM Usage:$color $mem/$memmax - $memperc% ${membar 4}
-${color grey}Swap Usage:$color $swap/$swapmax - $swapperc% ${swapbar 4}
-${color grey}CPU Usage:$color $cpu% ${cpubar 4}
-${color grey}Processes:$color $processes  ${color grey}Running:$color $running_processes
-$hr
-${color grey}File systems:
- / $color${fs_used /}/${fs_size /} ${fs_bar 6 /}
-${color grey}Networking:
-Up:$color ${upspeed} ${color grey} - Down:$color ${downspeed}
-$hr
-${color grey}Name              PID     CPU%   MEM%
-${color lightgrey} ${top name 1} ${top pid 1} ${top cpu 1} ${top mem 1}
-${color lightgrey} ${top name 2} ${top pid 2} ${top cpu 2} ${top mem 2}
-${color lightgrey} ${top name 3} ${top pid 3} ${top cpu 3} ${top mem 3}
-${color lightgrey} ${top name 4} ${top pid 4} ${top cpu 4} ${top mem 4}
+${execpi 5 ~/temp/minecraft/check.sh}
 ]]

…when we run conky it opens a small window which contains the output of the script ~/temp/minecraft/check.sh (the 5 after execpi means it runs every 5 seconds). If this script was just echo "hi!" then that conky window looks a bit like:

 ———————+x
 |       |
 |  hi!  |
 |_______|

I use Pop!_OS, which uses Gnome/X for all the windows. With that (by default), I can right click the top bar of a window and click "Always on Top", which effectively makes the little window into an overlay, as it always displays on top of other windows, with the added bonus that I can easily drag it around.

Writing a script for conky to use

Now, I can change the script to use the above Minecraft server status JSON information to output something which conky can use as an input, like:

#!/bin/bash
#~/temp/minecraft/check.sh
json=$(~/temp/minecraft/env/bin/python -m mcstatus $SERVER_IP json)
online=$(echo "${json}" | jq -r '.status.players.online')
players=$(echo "${json}" | jq -r '.status.players.sample[] | .name')

echo '${color aaaa99}'"${online} players online"'${color}'
echo "---"
echo "${players}" \
  | sort \
  | awk '
  BEGIN{
    for(n=0;n<256;n++)ord[sprintf("%c",n)]=n
  }{
    r=0; g=0; b=0;
    split($0, arr, "")
    for (i in arr) {c=arr[i]; n=ord[c]; r+=n*11; g+=n*15; b+=n*21}
    printf "${color %X%X%X}%s\n",
      r%128+128, g%128+128, b%128+128, $0
  }
'

The fancy awk is just to make each player be a different colour, and to randomly generate the colours from the ASCII values of the player's username.
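Unrolled into plain bash for a single (made-up) two-letter name, the same arithmetic looks like:

```shell
#!/bin/bash
# derive the colour for one username, with the same weights as the awk:
# sum ASCII values weighted 11/15/21, then squash each channel into 128..255
name="ab"   # hypothetical player name
r=0; g=0; b=0
for ((i = 0; i < ${#name}; i++)); do
  c=$(printf '%d' "'${name:i:1}")  # ASCII code of the i-th character
  r=$((r + c * 11)); g=$((g + c * 15)); b=$((b + c * 21))
done
printf '#%02X%02X%02X\n' $((r % 128 + 128)) $((g % 128 + 128)) $((b % 128 + 128))
# → #E1EDFF
```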

The final output

The final output looks like:

 ——————————————————+x
 | 8 players online |
 | ---              |
 | Kick_Flip_Barry  |
 | Blue_Outburst    |
 | Kboy8082         |
 | lele2102         |
 | Compostmelon101  |
 | Nobody808        |
 | Kaithefrog       |
 | BrinnanTheThird  |
 |__________________|

…which I can drag anywhere on my screen. When people join or leave the server, I can see a flash of change out of the corner of my eye.

Conclusions

Is this useful? Should I – instead – just have been playing the game? Do I use too many en-dashes? The world only knows.

Maybe I'll use conky for something else in future… I like to wonder what it could do…


getting my wifi name and password from the terminal

2025-05-25 • tags: scripting, wifi, aliases • 265 'words', 80 secs @ 200wpm

I often want to get my current WiFi name (SSID) and password.

How to get name/password manually

Sometimes, it's for a microcontroller. Sometimes, to share it. This time, it's for setting up an info-beamer device with WiFi.

Before today, I would usually open my phone and go to "share" under the WiFi settings, and copy the password manually, and also copy the SSID manually.

It's finally time to write a way to do it with bash!

How to get name/password with bash

After some web-searching, these commands do what I want:

alias wifi='iwgetid -r'
alias wifipw='sudo cat "/etc/NetworkManager/system-connections/$(wifi).nmconnection" | pcregrep -o1 "^psk=(.*)"'

How to use

…and I can use them like:

$ wifi
the wood raft (2.4G)
$ wifipw
[sudo] password for alifeee: 
**************

Neat!

Using Atuin aliases

Finally, above I suggested I was using Bash aliases, but I actually created them using Atuin, specifically Atuin dotfile aliases, like:

atuin dotfiles alias set wifi 'iwgetid -r'
atuin dotfiles alias set wifipw 'sudo cat "/etc/NetworkManager/system-connections/$(wifi).nmconnection" | pcregrep -o1 "^psk=(.*)"'

Now, they will automatically be enabled on all my computers that use Atuin. This is actually not… amazingly helpful as my other computers all use ethernet, not WiFi, but… it's mainly about having the aliases all in the same place (and "backed up", if you will).


Getting hackspace Mastodon instances from SpaceAPI

2025-05-22 • tags: scripting, spaceapi, mastodon, hackspaces, json • 622 'words', 187 secs @ 200wpm

We're back on the SpaceAPI grind.

This time, I wanted to see what Mastodon instances different hackspaces used.

The "contact" field in SpaceAPI

SpaceAPI has a "contact" object, which is used for this kind of thing. For example, for Sheffield Hackspace, this is:

$ curl -s "https://www.sheffieldhackspace.org.uk/spaceapi.json" | jq '.contact'
{
  "email": "trustees@sheffieldhackspace.org.uk",
  "twitter": "@shhmakers",
  "facebook": "SHHMakers"
}

Downloading all the SpaceAPI files

Once again, I start by downloading the JSON files, so that (in theory) I can make only one request to each SpaceAPI endpoint, and then work with the data locally (instead of requesting the JSON from the web every time I interact with it).

This script is modified from last time I did it, adding some better feedback about why some endpoints fail.

# download spaces
tot=0; got=0
echo "code,url" > failed.txt
RED='\033[0;31m'; GREEN='\033[0;32m'; YELLOW='\033[0;33m'; NC='\033[0m'
while read double; do
  tot=$(($tot+1))
  name=$(echo "${double}" | awk -F';' '{print $1}');
  url=$(echo "${double}" | awk -F';' '{print $2}');
  fn=$(echo "${name}" | sed 's+/+-+g')
  echo "saving '${name}' - <${url}> to ./spaces/${fn}.json";

  # skip unless manually deleted  
  if [ -f "./spaces/${fn}.json" ]; then
    echo -e "  ${YELLOW}already saved${NC} this URL!" >> /dev/stderr
    got=$(($got+1))
    continue
  fi
  
  # get, skipping if HTTP status >= 400
  code=$(curl -L -s --fail --max-time 5 -o "./spaces/${fn}.json" --write-out "%{http_code}" "${url}")
  if [[ "${?}" -ne 0 ]] || [[ "${code}" -ne 200 ]]; then
    echo "${code},${url}" >> failed.txt
    echo -e "  ${RED}bad${NC} status code (${code}) for this url!"  >> /dev/stderr
    continue
  fi
  
  echo -e "  ${GREEN}fetched${NC}! maybe it's bad :S" >> /dev/stderr
  got=$(($got+1))
done <<<$(cat directory.json | jq -r 'to_entries | .[] | (.key + ";" + .value)')
echo "done, got ${got} of ${tot} files, $(($tot-$got)) failed with HTTP status >= 400"
echo "codes from failed.txt:"
cat failed.txt | awk -F',' 'NR>1{a[$1]+=1} END{printf "  "; for (i in a) {printf "%s (%i) ", i, a[i]}; printf "\n"}'

# some JSON files are malformed (i.e., not JSON) - just remove them
rem=0
for file in spaces/*.json; do
  cat "${file}" | jq > /dev/null
  if [[ "${?}" -ne 0 ]]; then
    echo "=== ${file} does not parse as JSON... removing it... ==="
    rm -v "${file}"
    rem=$(( $rem + 1 ))
  fi
done
echo "removed ${rem} malformed json files"

Extracting contact information

This is basically copied from last time I did it, changing membership_plans? to contact?, and changing the jq format afterwards.

# parse contact info
for file in spaces/*.json; do
  plans=$(cat "${file}" | jq '.contact?')
  [[ "${plans}" == "null" ]] && continue
  echo "${file}"
  echo "${plans}" | jq -r 'to_entries | .[] | (.key + ": " + (.value|tostring) )'
  echo ""
done > contact.txt

It outputs something like:

$ cat contact.txt | tail -n20 | head -n13
spaces/Westwoodlabs.json
twitter: @Westwoodlabs
irc: ircs://irc.hackint.org:6697/westwoodlabs
email: vorstand@westwoodlabs.de

spaces/xHain.json
phone: +493057714272
email: info@x-hain.de
matrix: #general:x-hain.de
mastodon: @xHain_hackspace@chaos.social

spaces/Zeus WPI.json
email: bestuur@zeus.ugent.be

Calculating Mastodon averages

We can filter this file to only the "mastodon:" lines, and then extract the server with a funky regex, and get a list of which instances are most common.

$ cat contact.txt | grep '^[^:]*mastodon' | pcregrep -o1 '([^:\.@\/]*\.[^\/@]*).*' | sort | uniq -c | sort -n
      1 c3d2.social
      1 caos.social
      1 hachyderm.io
      1 hackerspace.pl
      1 mas.to
      1 social.bau-ha.us
      1 social.flipdot.org
      1 social.okoyono.de
      1 social.saarland
      1 social.schaffenburg.org
      1 telefant.net
      2 social.c3l.lu
      3 mastodon.social
      4 hsnl.social
     39 chaos.social

So… it's mostly chaos.social. Neat.


comparing historical HMO licence data in Sheffield

2025-05-14 • tags: scripting, hmos, open-data • 1321 'words', 396 secs @ 200wpm

What is an HMO licence

Sheffield city council publishes a list of HMO (House in Multiple Occupation) licences on their HMO page, along with other information about HMOs (in brief, an HMO is a shared house/flat with more than 3 non-family members, and it must be licenced if this number is 5 or more).

How accessible is the data on HMO licences

They provide a list of licences as an Excel spreadsheet (.xlsx). I've asked them before if they could (also) provide a CSV, but they told me that was technically impossible. I also asked if they had historical data (i.e., previous spreadsheets), but they said they deleted it every time they uploaded a new one.

Therefore, as I'm interested in private renting in Sheffield, I've been archiving the data in a GitHub repository, as CSVs. I also add additional data like lat/long coordinates (via geocoding), and parse the data into geographical formats like .geojson, .gpx, and .kml (which can be viewed on a map!).
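Building one such GeoJSON feature from a licence's coordinates can be done with jq (a sketch: the address and coordinates below are made up, and the repository's real pipeline differs):

```shell
#!/bin/bash
# one CSV row's lat/long as a GeoJSON Feature (note: coordinates are [lon, lat])
lat="53.3811"; lon="-1.4701"; address="1 Example Street, Sheffield"
jq -n --arg address "$address" --argjson lat "$lat" --argjson lon "$lon" '{
  type: "Feature",
  geometry: { type: "Point", coordinates: [$lon, $lat] },
  properties: { address: $address }
}'
```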

Calculating statistics from the data

What I hadn't done yet was any statistics on the data (I'd only been interested in visualising it on a map) so that's what I've done now.

I spent the afternoon writing some scripts to parse CSV data and calculate things like mean occupants, most common postcodes, number of expiring licences by date, et cetera.

General Statistics

I find shell scripting interesting, but I'm not so sure everyone else does (the script for the interested). So I'm not going to put the scripts here, but I will say that I used these command line tools (CLI tools) this many times:

Anyway, here are the statistics from the script (in text form, as is most shareable):

hmos_2024-09-09.csv
  total licences: 1745
  6.29 mean occupants (IQR 2 [5 - 7]) (median 6)
  amount by postcode:
    S1 (60), S2 (214), S3 (100), S4 (12), S5 (18), S6 (90), S7 (62), 
    S8 (10), S9 (5), S10 (742), S11 (425), S12 (1), S13 (2), S14 (1), S20 (1), S35 (1), S36 (1), 
  streets with most licences: Crookesmoor Road (78), Norfolk Park Road (72), Ecclesall Road (48), Harcourt Road (38), School Road (29), 
  
hmos_2025-01-28.csv
  total licences: 1459
  6.35 mean occupants (IQR 2 [5 - 7]) (median 6)
  amount by postcode:
    S1 (50), S2 (199), S3 (94), S4 (9), S5 (17), S6 (78), S7 (57), 
    S8 (10), S9 (4), S10 (614), S11 (321), S12 (1), S13 (2), S20 (1), S35 (1), S36 (1), 
  streets with most licences: Norfolk Park Road (73), Crookesmoor Road (57), Ecclesall Road (43), Harcourt Road (28), School Road (26), 
  
hmos_2025-03-03.csv
  total licences: 1315
  6.37 mean occupants (IQR 2 [5 - 7]) (median 6)
  amount by postcode:
    S1 (48), S2 (161), S3 (92), S4 (8), S5 (13), S6 (70), S7 (55), 
    S8 (9), S9 (3), S10 (560), S11 (290), S12 (1), S13 (2), S20 (1), S35 (1), S36 (1), 
  streets with most licences: Crookesmoor Road (54), Norfolk Park Road (41), Ecclesall Road (38), Harcourt Road (27), Whitham Road (24), 
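For flavour, the kind of one-liners behind numbers like these (a sketch: toy data, and an assumed column layout that differs from the real spreadsheet):

```shell
#!/bin/bash
# a toy CSV in roughly the shape of the licence register
cat > hmos_sample.csv <<'EOF'
address,postcode,occupants
1 Example Street,S10 1AA,6
2 Example Street,S10 1AB,8
3 Example Street,S2 2BB,4
EOF
# mean occupants (column 3)
awk -F',' 'NR > 1 { sum += $3; n++ } END { printf "%.2f mean occupants\n", sum / n }' hmos_sample.csv
# licences per postcode district (first word of column 2), most common first
awk -F',' 'NR > 1 { split($2, a, " "); c[a[1]]++ } END { for (p in c) print c[p], p }' hmos_sample.csv | sort -rn
```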

Potential Conclusions

Draw your own conclusions there, but some could be that:

Statistics on issuing and expiry dates

I also did some statistics on the licence issue and expiry dates with a second stats script, which – as it parses nearly 5,000 dates – takes longer than "almost instantly" to run. As above, this used:

The script outputs:

hmos_2024-09-09.csv
  1745 dates in 1745 lines (627 unique issuing dates)
    637 expired
    1108 active
  Licence Issue Dates:
    Sun 06 Jan 2019, Sun 06 Jan 2019, … … … Wed 12 Jun 2024, Tue 09 Jul 2024, 
    Monday (275), Tuesday (440), Wednesday (405), Thursday (352), Friday (256), Saturday (5), Sunday (12), 
    2019 (84), 2020 (311), 2021 (588), 2022 (422), 2023 (183), 2024 (157), 
  Licence Expiry Dates:
    Mon 09 Sep 2024, Mon 09 Sep 2024, … … … Mon 11 Jun 2029, Sun 08 Jul 2029, 
    2024 (159), 2025 (824), 2026 (263), 2027 (225), 2028 (185), 2029 (89), 
    
hmos_2025-01-28.csv
  1459 dates in 1459 lines (561 unique issuing dates)
    334 expired
    1125 active
  Licence Issue Dates:
    Mon 28 Oct 2019, Mon 04 Nov 2019, … … … Mon 06 Jan 2025, Tue 14 Jan 2025, 
    Monday (243), Tuesday (380), Wednesday (338), Thursday (272), Friday (211), Saturday (6), Sunday (9), 
    2019 (2), 2020 (130), 2021 (567), 2022 (406), 2023 (181), 2024 (170), 2025 (3), 
  Licence Expiry Dates:
    Thu 30 Jan 2025, Fri 31 Jan 2025, … … … Mon 22 Oct 2029, Wed 28 Nov 2029, 
    2025 (681), 2026 (264), 2027 (225), 2028 (184), 2029 (105), 
    
hmos_2025-03-03.csv
  1315 dates in 1315 lines (523 unique issuing dates)
    189 expired
    1126 active
  Licence Issue Dates:
    Mon 28 Oct 2019, Mon 04 Nov 2019, … … … Wed 05 Mar 2025, Wed 05 Mar 2025, 
    Monday (217), Tuesday (339), Wednesday (314), Thursday (244), Friday (189), Saturday (4), Sunday (8), 
    2019 (2), 2020 (64), 2021 (494), 2022 (399), 2023 (177), 2024 (170), 2025 (9), 
  Licence Expiry Dates:
    Fri 07 Mar 2025, Fri 07 Mar 2025, … … … Mon 22 Oct 2029, Wed 28 Nov 2029, 
    2025 (533), 2026 (262), 2027 (225), 2028 (184), 2029 (111),
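The weekday and year histograms above boil down to something like this (a sketch with GNU date and made-up sample dates; one `date` process per line is also why ~5,000 dates take a while):

```shell
#!/bin/bash
# weekday/year histograms over a handful of ISO dates
printf '%s\n' 2021-03-01 2021-03-02 2022-03-01 > dates_sample.txt
# weekday histogram: one GNU `date` call per line is the slow part
while read -r d; do date -d "$d" +%A; done < dates_sample.txt | sort | uniq -c
# year histogram: no date parsing needed at all
cut -c1-4 dates_sample.txt | sort | uniq -c
```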

Potential conclusions on dates

Again, draw your own conclusions (homework!), but some could be:

Why is this interesting

I started collecting HMO data originally because I wanted to visualise the licences on a map. Over a short time, I have created my own archive of licence history (as the council do not provide such).

Since I had multiple months of data, I could make some comparison, so I made these statistics. I don't find them incredibly useful, but there could be people who do.

Perhaps as time goes on, the long-term comparison (over years) could be interesting. I think the above data might not be greatly useful as it seems that Sheffield council are experiencing delays over licensing at the moment, so the decline in licences probably doesn't reflect general housing trends.

Plus, I just wanted to do some shell-scripting ;]


taking a small bunch of census data from FindMyPast

2025-04-29 • tags: jq, scripting, web-scraping, census, data • 507 'words', 152 secs @ 200wpm

The 1939 Register was basically a census, taken in 1939. On its National Archives page, it says that it is entirely available online.

However, further down, it lists how to access it, which says:

You can search for and view open records on our partner site Findmypast.co.uk (charges apply). A version of the 1939 Register is also available at Ancestry.co.uk (charges apply), and transcriptions without images are on MyHeritage.com (charges apply). It is free to search for these records, but there is a charge to view full transcriptions and download images of documents. Please note that you can view these records online free of charge in the reading rooms at The National Archives in Kew.

So… charges apply.

Anyway, for a while in April 2025 (until May 8th), FindMyPast is giving free access to the 1939 data.

Of course, family history is hard, and what's much easier is "who lived in my house in 1939". For that you can use:

I created an account with a bogus email address (you're not collecting THIS guy's data) and took a look around at some houses.

Then, I figured I could export my entire street, so I did.

The code and more context is in a GitHub Repository, but in brief, I:

Now it looks like:

AddressStreet,Address,Inhabited,LatLon,FirstName,LastName,BirthDate,ApproxAge,OccupationText,Gender,MaritalStatus,Relationship,Schedule,ScheduleSubNumber,Id
Khartoum Road,"1 Khartoum Road, Sheffield",Y,"53.3701,-1.4943",Constance A,Latch,31 Aug 1904,35,Manageress Restaurant & Canteen,Female,Married,Unknown,172,3,TNA/R39/3506/3506E/003/17
Khartoum Road,"4 Khartoum Road, Sheffield",Y,"53.3701,-1.4943",Catherine,Power,? Feb 1897,42,Music Hall Artists,Female,Married,Unknown,171,8,TNA/R39/3506/3506D/015/37
Khartoum Road,"4 Khartoum Road, Sheffield",Y,"53.3701,-1.4943",Charles F R,Kirby,? Nov 1886,53,Newsagent Canvasser,Male,Married,Head,172,1,TNA/R39/3506/3506D/015/39
Khartoum Road,"4 Khartoum Road, Sheffield",Y,"53.3701,-1.4943",Constance A,Latch,31 Aug 1912,27,Manageress Restairant & Cante,Female,Married,Unknown,172,3,TNA/R39/3506/3506D/015/41

Neat!

Some of my favourite jobs in the sheet of streets I collected are:


comparing EPC certificates with git-diff

2025-02-28 • tags: git-diff, scripting, housing • 484 'words', 145 secs @ 200wpm

Our house just got a new EPC certificate. You can (maybe) check yours on https://www.gov.uk/find-energy-certificate.

I'm interested in easy ways to see change. Trying to compare the old and new webpages by eye is hard, which leads me to text-diffing. I can copy the contents of the website to a file and compare them that way. Let's. I did a similar thing a while ago with computer benchmarks.

I manually create two files by copying the interesting bits of the webpage, called 1 and 2 (because who has time for .txt extensions). Then, I can run:

git diff --no-index -U1000 ~/1 ~/2 > diff.txt
cat diff.txt | sed -E 's#^\+(.*)#<ins>\1</ins>#' | sed -E 's#^-(.*)#<del>\1</del>#' | sed 's/^ //'

The latter command turns each line into HTML by turning + lines into <ins> ("insert"), - lines into <del> ("delete"), and removing leading spaces on other lines. Then, I can whack the output into a simple HTML template:
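For example, run over a toy three-line diff hunk, the pipeline gives:

```shell
#!/bin/bash
# the same sed pipeline applied to a minimal diff hunk
printf '%s\n' ' context line' '-old value' '+new value' \
  | sed -E 's#^\+(.*)#<ins>\1</ins>#' \
  | sed -E 's#^-(.*)#<del>\1</del>#' \
  | sed 's/^ //'
# → context line
# → <del>old value</del>
# → <ins>new value</ins>
```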

<!DOCTYPE html>
<html>
  <head>
    <style>
    body { background: black; color: white; }
    pre { padding: 1rem; }
    del { text-decoration: none; color: red; }
    ins { text-decoration: none; color: green; }
    </style>
  </head>
  <body>
<pre>
diff goes here...
<del>del lines will be red</del>
<ins>ins lines will be green</ins>
</pre>
  </body>
</html>

The final output is something like this (personal information removed. don't doxx me.)

Energy rating
D

Valid until 05 February 2025 05 February 2035

Property type Mid-terrace house Total floor area 130 square metres 123 square metres

This property’s energy rating is D. It has the potential to be C. This property’s energy rating is D. It has the potential to be B.

Features in this property

Window Fully double glazed Good Roof Pitched, no insulation (assumed) Very poor Roof Roof room(s), no insulation (assumed) Very poor Roof Roof room(s), insulated (assumed) Good Lighting Low energy lighting in 64% of fixed outlets Good Lighting Low energy lighting in all fixed outlets Very good Secondary heating None N/A

Primary energy use

The primary energy use for this property per year is 303 kilowatt hours per square metre (kWh/m2). The primary energy use for this property per year is 252 kilowatt hours per square metre (kWh/m2).

Good job on us for having 100% low energy lighting fixtures, I guess...

Really, this is a complicated way to simplify something. I like simple things, so I like this.


getting hackspace membership prices from SpaceAPI

2025-02-21 • tags: spaceapi, scripting, hackspaces, json • 1100 'words', 330 secs @ 200wpm

SpaceAPI is a project to convince hackspaces to maintain a simple JSON file describing themselves.

For example, see Sheffield Hackspace's on https://www.sheffieldhackspace.org.uk/spaceapi.json. It's currently a static file.

I wanted to know which hackspaces published their membership prices using SpaceAPI, and what those rates were. Here are a few bash scripts to do just that:

There is an updated version of this script in the newer note about SpaceAPI.

# get the directory of SpaceAPIs
mkdir -p ~/temp/spaceapi/spaces
cd ~/temp/spaceapi
curl "https://directory.spaceapi.io/" | jq > directory.json

# save (as many as possible of) SpaceAPIs to local computer
tot=0; got=0
while read double; do
  tot=$(($tot+1))
  name=$(echo "${double}" | awk -F';' '{print $1}');
  url=$(echo "${double}" | awk -F';' '{print $2}');
  fn=$(echo "${name}" | sed 's+/+-+g')
  echo "saving '${name}' - <${url}> to ./spaces/${fn}.json";

  # skip unless manually deleted  
  if [ -f "./spaces/${fn}.json" ]; then
    echo "already saved!"
    got=$(($got+1))
    continue
  fi
  
  # get, skipping if HTTP status >= 400
  curl -L -s --fail --max-time 5 "${url}" -o "./spaces/${fn}.json" || continue
  echo "fetched! maybe it's bad :S"
  got=$(($got+1))
done <<<$(cat directory.json | jq -r 'to_entries | .[] | (.key + ";" + .value)')
echo "done, got ${got} of ${tot} files, $(($tot-$got)) failed with HTTP status >= 400"

# some JSON files are malformed (i.e., not JSON) - just remove them
for file in spaces/*.json; do
  cat "${file}" | jq > /dev/null
  if [[ "${?}" -ne 0 ]]; then
    echo "${file} does not parse as JSON... removing it..."
    rm "${file}"
  fi
done

# loop every JSON file, and nicely output any that have a .membership_plans object
for file in spaces/*.json; do
  plans=$(cat "${file}" | jq '.membership_plans?')
  [[ "${plans}" == "null" ]] && continue
  echo "${file}"
#  echo "${plans}" | jq -c
  echo "${plans}" | jq -r '.[] | (.currency_symbol + (.value|tostring) + " " + .currency + " " + .billing_interval + " for " + .name + " (" + .description + ")")'
  echo ""
done

The output of this final loop looks like:

...

spaces/CCC Basel.json
20 CHF monthly for Minimal ()
40 CHF monthly for Recommended ()
60 CHF monthly for Root ()

...

spaces/RevSpace.json
32 EUR monthly for regular ()
20 EUR monthly for junior ()
19.84 EUR monthly for multi2 ()
13.37 EUR monthly for multi3 ()

...

spaces/Sheffield Hackspace.json
£6 GBP monthly for normal membership (regularly attend any of the several open evenings a week)
£21 GBP monthly for keyholder membership (come and go as you please)

...
see full output
spaces/CCC Basel.json
20 CHF monthly for Minimal ()
40 CHF monthly for Recommended ()
60 CHF monthly for Root ()

spaces/ChaosStuff.json
120 EUR yearly for Regular Membership (For people with a regular income)
40 EUR yearly for Student Membership (For pupils and students)
40 EUR yearly for Supporting Membership (For people who want to use the space to work on projects, but don't want to have voting rights an a general assembly.)
1 EUR yearly for Starving Hacker (For people, who cannot afford the membership. Please get in touch with us, before applying.)

spaces/dezentrale.json
16 EUR monthly for Reduced membership ()
32 EUR monthly for Regular membership ()
42 EUR monthly for Nerd membership ()
64 EUR monthly for Nerd membership ()
128 EUR monthly for Nerd membership ()

spaces/Entropia.json
25 EUR yearly for Regular Members (Normale Mitglieder gem. https://entropia.de/Satzung_des_Vereins_Entropia_e.V.#Beitragsordnung)
19 EUR yearly for Members of CCC e.V. (Mitglieder des CCC e.V. gem. https://entropia.de/Satzung_des_Vereins_Entropia_e.V.#Beitragsordnung)
15 EUR yearly for Reduced Fee Members (Schüler, Studenten, Auszubildende und Menschen mit geringem Einkommen gem. https://entropia.de/Satzung_des_Vereins_Entropia_e.V.#Beitragsordnung)
6 EUR yearly for Sustaining Membership (Fördermitglieder gem. https://entropia.de/Satzung_des_Vereins_Entropia_e.V.#Beitragsordnung)

spaces/Hacker Embassy.json
100 USD monthly for Membership ()

spaces/Hackerspace.Gent.json
25 EUR monthly for regular (discount rates and yearly invoice also available)

spaces/Hack Manhattan.json
110 USD monthly for Normal Membership (Membership dues go directly to rent, utilities, and the occasional equipment purchase.)
55 USD monthly for Starving Hacker Membership (Membership dues go directly to rent, utilities, and the occasional equipment purchase. This plan is intended for student/unemployed hackers.)

spaces/Hal9k.json
450 DKK other for Normal membership (Billing is once per quarter)
225 DKK other for Student membership (Billing is once per quarter)

spaces/Leigh Hackspace.json
24 GBP monthly for Member (Our standard membership that allows usage of the hackspace facilities.)
30 GBP monthly for Member+ (Standard membership with an additional donation.)
18 GBP monthly for Concession (A subsidised membership for pensioners, students, and low income earners.)
40 GBP monthly for Family (A discounted family membership for two adults and two children.)
5 GBP daily for Day Pass (Access to the hackspace's facilities for a day.)
5 GBP monthly for Patron (Support the hackspace without being a member.)

spaces/LeineLab.json
120 EUR yearly for Ordentliche Mitgliedschaft ()
30 EUR yearly for Ermäßigte Mitgliedschaft ()
336 EUR yearly for Ordentliche Mitgliedschaft + Werkstatt ()
120 EUR yearly for Ermäßigte Mitgliedschaft + Werkstatt ()

spaces/<name>space Gera.json

spaces/Nerdberg.json
35 EUR monthly for Vollmitgliedschaft (Normal fee, if it is to much for you, contact the leading board, we'll find a solution.)
15 EUR monthly for Fördermitgliedschaft ()

spaces/NYC Resistor.json
115 USD monthly for standard ()
75 USD monthly for teaching ()

spaces/Odenwilusenz.json
0 CHF yearly for Besucher ()
120 CHF yearly for Mitglied ()
480 CHF yearly for Superuser ()
1200 CHF yearly for Co-Worker ()

spaces/RevSpace.json
32 EUR monthly for regular ()
20 EUR monthly for junior ()
19.84 EUR monthly for multi2 ()
13.37 EUR monthly for multi3 ()

spaces/Sheffield Hackspace.json
£6 GBP monthly for normal membership (regularly attend any of the several open evenings a week)
£21 GBP monthly for keyholder membership (come and go as you please)

spaces/TkkrLab.json
30 EUR monthly for Normal member (Member of TkkrLab (https://tkkrlab.nl/deelnemer-worden/))
15 EUR monthly for Student member (Member of TkkrLab, discount for students (https://tkkrlab.nl/deelnemer-worden/))
15 EUR monthly for Student member (Junior member of TkkrLab, discount for people aged 16 or 17 (https://tkkrlab.nl/deelnemer-worden/))

spaces/-usr-space.json

I think a couple have weird names like <name>space or /dev/tal which screw with my script. Oh well, it's for you to improve.

Overall, not that many spaces have published their prices to SpaceAPI. Also, the ones in the US look really expensive. As ever, a good price probably depends on context (size/city/location/etc).

Perhaps I can convince some other spaces to put their membership prices in their SpaceAPI...

back to top

uploading files to a GitHub repository with a bash script#prevsinglenexttop

2025-02-02 • tags: obsidian, github, scripting • 364 'words', 109 secs @ 200wpm

I write these notes in Obsidian. To upload them, I could visit https://github.com/alifeee/blog/tree/main/notes, click "add file", and copy and paste the file contents. I probably should do that.

But, instead, I wrote a shell script to upload them. Now, I can press "CTRL+P" to open the Obsidian command palette, type "lint" (to lint the note), then open it again, type "upload", and upload the note. At this point I could walk away and assume everything went fine, but what I normally do is open the GitHub Actions tab to check that it worked properly.

The process the script undertakes is:

  1. check user inputs are good (all variables exist, file is declared)
  2. check if file exists or not already in GitHub with a curl request
  3. generate a JSON payload for the upload request, including:
    1. commit message
    2. commit author & email
    3. file contents as a base64 encoded string
    4. (if file exists already) sha1 hash of existing file
  4. make a curl request to upload/update the file!
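The payload-building step (step 3) can be sketched with jq against the standard GitHub contents API (`PUT /repos/{org}/{repo}/contents/{path}`). This is a minimal sketch, not the actual script: the function name and fallback defaults are mine, and it assumes GNU base64 (for `-w0`) and jq are installed.

```shell
# build_payload: step 3 — JSON payload for the GitHub contents API.
# $1 = file to upload, $2 = sha of the existing file ("" for a new file)
build_payload() {
  jq -n \
    --arg message "upload $(basename "$1")" \
    --arg name "${git_name:-alifeee}" \
    --arg email "${git_email:-alifeee@alifeee.net}" \
    --arg content "$(base64 -w0 "$1")" \
    --arg sha "$2" \
    '{message: $message,
      committer: {name: $name, email: $email},
      content: $content}
     + (if $sha != "" then {sha: $sha} else {} end)'
}

# steps 2 & 4 wrap this with curl, roughly:
#   api="https://api.github.com/repos/${org}/${repo}/contents/${fpath}$(basename "${f}")"
#   sha=$(curl -s -H "Authorization: Bearer ${GITHUB_TOKEN}" "${api}" | jq -r '.sha // empty')
#   curl -s -X PUT -H "Authorization: Bearer ${GITHUB_TOKEN}" -d "$(build_payload "${f}" "${sha}")" "${api}"
```

The `sha` key is only included when updating an existing file, which is why step 2 fetches it first.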

As I use it from inside Obsidian, I use an extension called Obsidian shellcommands, which lets you specify several commands. For this, I specify:

export org="alifeee"
export repo="blog"
export fpath="notes/"
export git_name="alifeee"
export git_email="alifeee@alifeee.net"
export GITHUB_TOKEN="github_pat_3890qwug8f989wu89gu98w43ujg98j8wjgj4wjg9j83wjq9gfj38w90jg903wj"
{{vault_path}}/scripts/upload_to_github.sh {{file_path:absolute}}

…and when run with a file open, it will upload/update that file to my notes folder on GitHub.

This is maybe a strange way of doing it, as the "source of truth" is now "my Obsidian", and the GitHub is really just a place for the files to live. However, I enjoy it.

I've made the script quite generic: you supply most information via environment variables. You can use it to upload an arbitrary file to a specific folder in a specific GitHub repository. Or… you can modify it and do what you want with it!

It's here: https://gist.github.com/alifeee/d711370698f18851f1927f284fb8eaa8

back to top

combining geojson files with jq#prevsinglenexttop

2024-12-13 • tags: geojson, jq, scripting • 520 'words', 156 secs @ 200wpm

I'm writing a blog about hitchhiking, which involves a load of .geojson files, which look a bit like this:

The .geojson files are generated from .gpx traces that I exported from OSRM's (Open Source Routing Machine) demo (which, at time of writing, seems to be offline, but I believe it's on https://map.project-osrm.org/), one of the routing engines on OpenStreetMap.

I put in a start and end point, exported the .gpx trace, and then converted it to .geojson with, e.g., ogr2ogr "2.1 Tamworth -> Tibshelf Northbound.geojson" "2.1 Tamworth -> Tibshelf Northbound.gpx" tracks. Here, ogr2ogr is a command-line tool (from sudo apt install gdal-bin) which converts geographic data between many formats (I like it a lot; it feels nicer than searching the web for "errr, kml to gpx converter?"). I also then semi-manually added some properties (see how).

{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {
        "label": "2.1 Tamworth -> Tibshelf Northbound",
        "trip": "2"
      },
      "geometry": {
        "type": "MultiLineString",
        "coordinates": [
          [
            [-1.64045, 52.60606],
            [-1.64067, 52.6058],
            [-1.64069, 52.60579],
            ...
          ]
        ]
      }
    }
  ]
}

I then had a load of files that looked a bit like

$ tree -f geojson/
geojson
├── geojson/1.1 Tamworth -> Woodall Northbound.geojson
├── geojson/1.2 Woodall Northbound -> Hull.geojson
├── geojson/2.1 Tamworth -> Tibshelf Northbound.geojson
├── geojson/2.2 Tibshelf Northbound -> Leeds.geojson
├── geojson/3.1 Frankley Northbound -> Hilton Northbound.geojson
├── geojson/3.2 Hilton Northbound -> Keele Northbound.geojson
└── geojson/3.3 Keele Northbound -> Liverpool.geojson

Originally, I was combining them into one .geojson file using https://github.com/mapbox/geojson-merge, which is a small tool that merges .geojson files, but I decided to use jq because I wanted to do something a bit more complex: to create a structure like

FeatureCollection
  Features:
    FeatureCollection
      Features (1.1 Tamworth -> Woodall Northbound, 1.2 Woodall Northbound -> Hull)
    FeatureCollection
      Features (2.1 Tamworth -> Tibshelf Northbound, 2.2 Tibshelf Northbound -> Leeds)
    FeatureCollection
      Features (3.1 Frankley Northbound -> Hilton Northbound, 3.2 Hilton Northbound -> Keele Northbound, 3.3 Keele Northbound -> Liverpool)

I spent a while making a quite-complicated jq query, using variables (an "advanced feature"!) and a reduce statement, but when I completed it, I found out that the above structure is not valid .geojson, so I went back to just having:

FeatureCollection
  Features (1.1 Tamworth -> Woodall Northbound, 1.2 Woodall Northbound -> Hull, 2.1 Tamworth -> Tibshelf Northbound, 2.2 Tibshelf Northbound -> Leeds, 3.1 Frankley Northbound -> Hilton Northbound, 3.2 Hilton Northbound -> Keele Northbound, 3.3 Keele Northbound -> Liverpool)

...which is... a lot simpler to make.

A query which combines the files above is (the sort puts the files in numerical order, so they appear top-to-bottom in the resulting .geojson):

while read file; do cat "${file}"; done <<< $(find geojson/ -type f | sort -t / -k 2 -n) | jq --slurp '{
    "type": "FeatureCollection",
    "name": "hitchhikes",
    "features": ([.[] | .features[0]])
}' > hitching.geojson

While geojson-merge was cool, it feels nice to have a more "raw" command to do what I want.
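For the curious, the abandoned nested-grouping query could be reconstructed something like this. This is my hypothetical sketch using jq's reduce, not the original query, and (as noted above) its output is not valid GeoJSON:

```shell
# group each file's first feature by its "trip" property into nested
# FeatureCollections (a jq exercise only — the result is not valid GeoJSON)
group_by_trip() {
  jq --slurp '
    reduce .[] as $fc ({};
      .[$fc.features[0].properties.trip] += [$fc.features[0]])
    | {type: "FeatureCollection",
       features: [to_entries[] | {type: "FeatureCollection", features: .value}]}
  '
}

# usage, with the same file-concatenation pipeline as the query above:
#   while read file; do cat "${file}"; done <<< $(find geojson/ -type f | sort -t / -k 2 -n) | group_by_trip
```

The reduce builds an object keyed by trip number, and to_entries turns it back into an ordered list of per-trip collections.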

back to top see more (+9)