Today, I wanted to have an overlay (think Discord voice chat overlay, or when you pop-out a video in Firefox, or when you use chat heads on mobile) which showed me who was online on the server.
Querying the Minecraft server status
After seeing an "enable status" option in the server's server.properties file, and searching up what it meant (it allows services to "query the status of the server"), I remembered I'd used https://mcsrvstat.us/ before to check the status of the server, which shows you the player list in a browser.
But a local overlay would need a local way to query the server status. So I did some web searching, found a Python script which wasn't great (and written for Python 2), then a self-hostable server status API, which led me to mcstatus, a Python API (with command line tool) for fetching server status.
Next, a way of having an overlay. Searching for "linux x simple text overlay" led me to xmessage, which can show simple windows, but they're more like confirmation windows, not like long-lasting status windows (i.e., it's hard to update the text).
I was also led to discover conky, which – if nothing else – has a great name. It's designed to be a "system monitor", i.e., a thing wot shows you your CPU temperature, uptime, RAM usage, et cetera. The configuration is also written in Lua, which is super neat! I still want to get more into Lua.
Using conky
By modifying the default configuration (in /etc/conky/conky.conf) like so:
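The relevant change is the conky.text section at the bottom of the config; a sketch of the shape (the rest of the default config can stay as it is):
conky.text = [[
${execpi 5 ~/temp/minecraft/check.sh}
]]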
…when we run conky it opens a small window which contains the output of the script ~/temp/minecraft/check.sh (the 5 after execpi means it runs every 5 seconds). If this script was just echo "hi!" then that conky window looks a bit like:
———————+x
|       |
| hi!   |
|_______|
I use Pop!_OS, which uses Gnome/X for all the windows. With that (by default), I can right click the top bar of a window and click "Always on Top", which effectively makes the little window into an overlay, as it always displays on top of other windows, with the added bonus that I can easily drag it around.
Writing a script for conky to use
Now, I can change the script to use the above Minecraft server status JSON information to output something which conky can use as an input, like:
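Something along these lines (a sketch, not my exact script; the mcstatus subcommand and the JSON field names are assumptions, so check them against the real output):
#!/bin/bash
# ~/temp/minecraft/check.sh (sketch): turn mcstatus's JSON into a few lines of
# text for conky to display
json=$(mcstatus localhost json)
online=$(echo "${json}" | jq -r '.players.online')
max=$(echo "${json}" | jq -r '.players.max')
echo "Minecraft: ${online}/${max} online"
# one player name per line, if the server includes a sample of names
echo "${json}" | jq -r '.players.sample[]?.name'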
I often want to get my current WiFi name (SSID) and password.
How to get name/password manually
Sometimes, it's for a microcontroller. Sometimes, to share it. This time, it's for setting up an info-beamer device with WiFi.
Before today, I would usually open my phone, go to "share" under the WiFi settings, and copy the password and the SSID out manually.
It's finally time to write a way to do it with bash!
How to get name/password with bash
After some web-searching, these commands do what I want:
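Roughly this shape (a sketch using NetworkManager's nmcli and Atuin's alias syncing; the flags and alias names are illustrative rather than verbatim):
# current SSID (the active WiFi connection)
nmcli -t -f active,ssid dev wifi | awk -F: '$1 == "yes" {print $2}'
# current WiFi password (assumes a single-word password)
nmcli dev wifi show-password | awk '/^Password:/ {print $2}'
# save them as aliases, synced by Atuin's dotfiles/alias feature
atuin dotfiles alias set wifiname "nmcli -t -f active,ssid dev wifi | awk -F: '\$1 == \"yes\" {print \$2}'"
atuin dotfiles alias set wifipass "nmcli dev wifi show-password | awk '/^Password:/ {print \$2}'"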
Now, they will automatically be enabled on all my computers that use Atuin. This is actually not… amazingly helpful as my other computers all use ethernet, not WiFi, but… it's mainly about having the aliases all in the same place (and "backed up", if you will).
Once again, I start by downloading the JSON files, so that (in theory) I can make only one request to each SpaceAPI endpoint, and then work with the data locally (instead of requesting the JSON from the web every time I interact with it).
This script is modified from the last time I did it, adding some better feedback about why some endpoints fail.
# download spaces
tot=0; got=0
echo "code,url" > failed.txt
RED='\033[0;31m'; GREEN='\033[0;32m'; YELLOW='\033[0;33m'; NC='\033[0m'
while read double; do
  tot=$(($tot+1))
  name=$(echo "${double}" | awk -F';' '{print $1}')
  url=$(echo "${double}" | awk -F';' '{print $2}')
  fn=$(echo "${name}" | sed 's+/+-+g')
  echo "saving '${name}' - <${url}> to ./spaces/${fn}.json"
  # skip unless manually deleted
  if [ -f "./spaces/${fn}.json" ]; then
    echo -e "  ${YELLOW}already saved${NC} this URL!" >> /dev/stderr
    got=$(($got+1))
    continue
  fi
  # get, skipping if HTTP status >= 400
  code=$(curl -L -s --fail --max-time 5 -o "./spaces/${fn}.json" --write-out "%{http_code}" "${url}")
  if [[ "${?}" -ne 0 ]] || [[ "${code}" -ne 200 ]]; then
    echo "${code},${url}" >> failed.txt
    echo -e "  ${RED}bad${NC} status code (${code}) for this url!" >> /dev/stderr
    continue
  fi
  echo -e "  ${GREEN}fetched${NC}! maybe it's bad :S" >> /dev/stderr
  got=$(($got+1))
done <<< $(cat directory.json | jq -r 'to_entries | .[] | (.key + ";" + .value)')
echo "done, got ${got} of ${tot} files, $(($tot-$got)) failed with HTTP status >= 400"
echo "codes from failed.txt:"
cat failed.txt | awk -F',' 'NR>1{a[$1]+=1} END{printf "  "; for (i in a) {printf "%s (%i) ", i, a[i]}; printf "\n"}'
# some JSON files are malformed (i.e., not JSON) - just remove them
rem=0
for file in spaces/*.json; do
  cat "${file}" | jq > /dev/null
  if [[ "${?}" -ne 0 ]]; then
    echo "=== ${file} does not parse as JSON... removing it... ==="
    rm -v "${file}"
    rem=$(( $rem + 1 ))
  fi
done
echo "removed ${rem} malformed json files"
Extracting contact information
This is basically copied from last time I did it, changing membership_plans? to contact?, and changing the jq format afterwards.
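The gist is something like this (a sketch rather than the exact script; the flattened "key: value" output format here is simplified):
# pull the "contact" object out of every saved JSON file and flatten it into
# "key: value" lines (contacts.txt is a hypothetical filename)
for file in spaces/*.json; do
  cat "${file}" | jq -r '.contact? | select(. != null) | to_entries[] | "\(.key): \(.value)"'
done > contacts.txt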
We can filter this file to only the "mastodon:" lines, and then extract the server with a funky regex, and get a list of which instances are most common.
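Something like this (the regex here is a stand-in for the "funky" one, and will mangle some formats):
# keep the "mastodon:" lines, pull out something domain-shaped, and count
# which instances appear most often
grep "^mastodon:" contacts.txt \
  | grep -oE '[A-Za-z0-9.-]+\.[A-Za-z]{2,}' \
  | sort | uniq -c | sort -rn | head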
Sheffield city council publishes a list of HMO (House in Multiple Occupation) licences on their HMO page, along with other information about HMOs (in brief, an HMO is a shared house/flat with more than 3 non-family members, and it must be licenced if this number is 5 or more).
How accessible is the data on HMO licences
They provide a list of licences as an Excel spreadsheet (.xlsx). I've asked them before if they could (also) provide a CSV, but they told me that was technically impossible. I also asked if they had historical data (i.e., previous spreadsheets), but they said they deleted it every time they uploaded a new one.
Therefore, as I'm interested in private renting in Sheffield, I've been archiving the data in a GitHub repository, as CSVs. I also add additional data like lat/long coordinates (via geocoding), and parse the data into geographical formats like .geojson, .gpx, and .kml (which can be viewed on a map!).
Calculating statistics from the data
What I hadn't done yet was any statistics on the data (I'd only been interested in visualising it on a map) so that's what I've done now.
I spent the afternoon writing some scripts to parse CSV data and calculate things like mean occupants, most common postcodes, number of expiring licences by date, et cetera.
General Statistics
I find shell scripting interesting, but I'm not so sure everyone else does (the script for the interested). So I'm not going to put the scripts here, but I will say that I used these command line tools (CLI tools) this many times:
cat 7 times, tail 8 times, wc 1 time, csvtool 7 times, awk 4 times, sort 7 times, head 3 times, echo 7 times, uniq 2 times, sed 3 times.
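For a flavour of the sort of thing, the mean-occupants calculation is roughly this shape (a sketch, not the real script; the column number is an assumption about the spreadsheet layout):
# mean occupants per licence; column 7 is an assumption, so check the CSV's
# header row first
csvtool col 7 hmos_2025-03-03.csv \
  | tail -n +2 \
  | awk '{sum += $1; n += 1} END {printf "mean occupants: %.2f\n", sum / n}'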
Anyway, the script prints the statistics out in text form, as is most shareable. Draw your own conclusions, but some could be:
the mean number of occupants may be rising, or the change may be statistically insignificant
there are either a lot of houses becoming "not HMOs" overall (and on roads like Crookesmoor Road), or a lot of HMOs are becoming unlicenced
Statistics on issuing and expiry dates
I also did some statistics on the licence issue and expiry dates with a second stats script, which – as it parses nearly 5,000 dates – takes longer than "almost instantly" to run. As above, this used:
date 10 times, while 6 times, cat 3 times, csvtool 2 times, tail 5 times, wc 3 times, echo 23 times, sort 11 times, uniq 4 times, sed 4 times, awk 9 times, head 2 times
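The expired/active split, for example, looks roughly like this (a sketch, not the actual script; the column index and the date format are assumptions):
# count expired vs active licences; assumes the expiry date is column 9 and
# formatted like 09/09/2024
today=$(date +%s)
csvtool col 9 hmos_2025-03-03.csv | tail -n +2 | while read -r expiry; do
  ts=$(date -d "$(echo "${expiry}" | awk -F'/' '{print $3"-"$2"-"$1}')" +%s)
  if [[ "${ts}" -lt "${today}" ]]; then echo "expired"; else echo "active"; fi
done | sort | uniq -c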
The actual script outputs:
hmos_2024-09-09.csv
1745 dates in 1745 lines (627 unique issuing dates)
637 expired
1108 active
Licence Issue Dates:
Sun 06 Jan 2019, Sun 06 Jan 2019, … … … Wed 12 Jun 2024, Tue 09 Jul 2024,
Monday (275), Tuesday (440), Wednesday (405), Thursday (352), Friday (256), Saturday (5), Sunday (12),
2019 (84), 2020 (311), 2021 (588), 2022 (422), 2023 (183), 2024 (157),
Licence Expiry Dates:
Mon 09 Sep 2024, Mon 09 Sep 2024, … … … Mon 11 Jun 2029, Sun 08 Jul 2029,
2024 (159), 2025 (824), 2026 (263), 2027 (225), 2028 (185), 2029 (89),
hmos_2025-01-28.csv
1459 dates in 1459 lines (561 unique issuing dates)
334 expired
1125 active
Licence Issue Dates:
Mon 28 Oct 2019, Mon 04 Nov 2019, … … … Mon 06 Jan 2025, Tue 14 Jan 2025,
Monday (243), Tuesday (380), Wednesday (338), Thursday (272), Friday (211), Saturday (6), Sunday (9),
2019 (2), 2020 (130), 2021 (567), 2022 (406), 2023 (181), 2024 (170), 2025 (3),
Licence Expiry Dates:
Thu 30 Jan 2025, Fri 31 Jan 2025, … … … Mon 22 Oct 2029, Wed 28 Nov 2029,
2025 (681), 2026 (264), 2027 (225), 2028 (184), 2029 (105),
hmos_2025-03-03.csv
1315 dates in 1315 lines (523 unique issuing dates)
189 expired
1126 active
Licence Issue Dates:
Mon 28 Oct 2019, Mon 04 Nov 2019, … … … Wed 05 Mar 2025, Wed 05 Mar 2025,
Monday (217), Tuesday (339), Wednesday (314), Thursday (244), Friday (189), Saturday (4), Sunday (8),
2019 (2), 2020 (64), 2021 (494), 2022 (399), 2023 (177), 2024 (170), 2025 (9),
Licence Expiry Dates:
Fri 07 Mar 2025, Fri 07 Mar 2025, … … … Mon 22 Oct 2029, Wed 28 Nov 2029,
2025 (533), 2026 (262), 2027 (225), 2028 (184), 2029 (111),
Potential conclusions on dates
Again, draw your own conclusions (homework!), but some could be:
Sheffield council's most productive day of the week is Tuesday
Something is slowing down the licensing process, or the data uploaded in March wasn't up to date, or the distribution of licences isn't even across the year, as only 9 licences are marked as issued in 2025
Why is this interesting
I started collecting HMO data originally because I wanted to visualise the licences on a map. Over a short time, I have created my own archive of licence history (as the council does not provide one).
Since I had multiple months of data, I could make some comparisons, so I made these statistics. I don't find them incredibly useful, but there could be people who do.
Perhaps as time goes on, the long-term comparison (over years) could be interesting. I think the above data might not be greatly useful, as it seems that Sheffield council are experiencing delays with licensing at the moment, so the decline in licences probably doesn't reflect general housing trends.
However, further down, the page lists how to access the records, which says:
You can search for and view open records on our partner site Findmypast.co.uk (charges apply). A version of the 1939 Register is also available at Ancestry.co.uk (charges apply), and transcriptions without images are on MyHeritage.com (charges apply). It is free to search for these records, but there is a charge to view full transcriptions and download images of documents. Please note that you can view these records online free of charge in the reading rooms at The National Archives in Kew.
So… charges apply.
Anyway, for a while in April 2025 (until May 8th), FindMyPast is giving free access to the 1939 data.
Of course, family history is hard, and what's much easier is "who lived in my house in 1939". For that you can use:
You can self-host OwnTracks on a Raspberry Pi, a server, or otherwise a computer which is always on and has access to the Internet (with a static IP).
I tried installing it on my server, and this is what I ran. Notably, it wiped my existing nginx configuration, so I suggested a clarification of that in the documentation, and made sure my config was backed up so that I could restore it. Thankfully, I was already backing it up.
# install files
cd /usr/alifeee/
git clone --depth=1 https://github.com/owntracks/quicksetup
mv quicksetup owntracks
cd /usr/alifeee/owntracks/
# edit configuration
cp configuration.yaml.example configuration.yaml
nano configuration.yaml
# back up nginx.conf
(cd /media/alifeee; sudo ./back-up.sh)
# set up
sudo ./bootstrap.sh
# reset nginx conf (it blanks it)
sudo chmod u+w /etc/nginx/nginx.conf
sudo cp /media/alifeee/20250307T1749/nginx.conf /etc/nginx/nginx.conf
# put this into config
echo '# for owntracks
map $cookie_otrauth $mysite_hascookie {
  "vhwNkyPGPCvnMCiQRkCs" "off";
  default "My OwnTracks";
}'
# get passwords
sudo tail /usr/local/owntracks/userdata/*.pass
Files are stored in /usr/local/owntracks. I can then visit https://owntracks.alifeee.net/, log in, and see a setup page. I downloaded the Android app and set it up by opening the file from my website with the app, and it set itself up pretty well!
Then, I went for a walk, and looked at the "frontend", which puts out a view like this, of my location history:
___
___/ \
/ alifeee
|
/
|
\___
\__
It's pretty neat! And self-hosted! Another service to join the maybe-too-many on my server…
I made an account on it for my friends and we plan to use it to keep track of each other in our hitchhiking adventure next week.
I wanted to use Android on Linux, so I searched the web and found https://waydro.id/.
Instead of running in some kind of virtual machine, it seems to run Android slightly more natively on Linux (I really don't know how any of this works).
Here is a small adventure at trying to install it:
$ sudo apt install curl ca-certificates
$ curl -s https://repo.waydro.id | sudo bash
$ sudo apt install waydroid
$ waydroid init
[11:27:05] Failed to load binder driver
[11:27:05] modprobe: FATAL: Module binder_linux not found in directory /lib/modules/6.13.0-061300-generic
[11:27:05] ERROR: Binder node "binder" for waydroid not found
[11:27:05] See also: https://github.com/waydroid
I don't really know what this error meant, but after searching, it seemed that Waydroid needed "wayland" and I was using X (which are both Desktop thingies which make pixels appear on the screen). I read things about Pop!_OS not having something necessary installed in the kernel, but I could use "DKMS", meaning Dynamic Kernel Module Support. So I tried installing what I'd found links to with:
git clone https://github.com/choff/anbox-modules
cd anbox-modules && ./INSTALL.sh
Now when I ran waydroid init it worked, but then I got nothing. I wasn't really sure what I was supposed to be doing to "open" it now that it was "init"'d. So I deleted all I could, and tried an alternative install script:
git clone https://github.com/n1lby73/waydroid-installer
cd waydroid-installer
sudo bash install_script.sh
The script didn't complete, and complained about packages not installing; specifically, that lxd-client could not be installed.
Looking at the script I saw it was trying to run apt install lxd-client but running that myself, it seemed that it didn't exist:
$ sudo apt install lxd-client
E: Unable to locate package lxd-client
After searching, it seems lxd-client provides a command lxc, so I looked for how to install lxc, and found it was possible via snap. I've not really used snap before, and people have complained about it (about file size and automatic updates), so I was wary of installing it, but I did, with:
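Something like (the lxd snap is what provides the lxc command; a sketch rather than my exact commands):
# install snapd, then the lxd snap (which provides `lxc`)
sudo apt install snapd
sudo snap install lxd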
I removed lxd-client from the install script and re-ran it, and it seemed to work OK. It said it installed a "Wayland" desktop option on my login page if I rebooted.
Opening waydroid
So I rebooted, and on the login screen selected "Pop on Wayland" (I'm still not fully sure what this X/Wayland thing is), and tried starting Waydroid.
…installed some apps and filled one of my screens with a big Android display.
I found the APKs either on F-Droid, which just has them available for download (sweet), or by searching the web and downloading them from sketchy sites.
It seems to work well!
I suppose there's a lot you can do with Waydroid, if you want. I don't think I want.
In some ways, this is an example of the involved nature of installing things on Linux.
I've toyed for a while with microcontrollers, and only really used Arduino/C/C++. Sometimes, I've heard talk of MicroPython, but I've never tried it out.
Until today!
I had a little experiment, and it seems promising. I might have a larger experiment soon (maybe retrying some of my hardware hacking).
I'll share here my initial experiments! I'm running on a Linux computer, on Pop!_OS.
# install files and virtual environment
mkdir -p /git/micropython
cd /git/micropython
python -m venv env
. env/bin/activate
pip install esptool
# download firmware
$ ls
ESP8266_GENERIC-20241129-v1.24.1.bin
# at this point I plugged in the ESP but it was not recognised
# after looking at `tail -f /var/log/syslog`, I saw that `brltty`
# was doing something spooky. I remembered having this issue before,
# and that `brltty` was something to help Braille readers. As I don't
# need that, I...
# disabled brltty
sudo systemctl stop brltty.service
sudo systemctl mask brltty.service
sudo systemctl disable brltty.service
sudo systemctl restart
# now I could see the ESP as a USB device
$ lsusb
$ ls /dev/ | grep "ttyUSB"
ttyUSB0
# flash ESP
esptool.py --port /dev/ttyUSB0 erase_flash
esptool.py --port /dev/ttyUSB0 --baud 1000000 write_flash --flash_size=4MB -fm dio 0 ESP8266_GENERIC-20241129-v1.24.1.bin
That's it installed! Now, using the guides above I found I needed a terminal emulator, so I used picocom. And, to reach the pinnacles of complexity, tried turning the inbuilt LED on and off.
sudo apt install picocom
picocom /dev/ttyUSB0 -b 115200
>>> from machine import Pin
>>> p = Pin(2, Pin.OUT)
>>> p.off()
>>> p.on()
It works! Neat! The REPL (Read-Evaluate-Print-Loop) is really nice to quickly debug with. Perhaps nicer than the "waiting-for-20-seconds-for-your-C-code-to-flash-onto-the-device".
I also tried connecting to WiFi and using the Web-REPL, so you can execute Python commands over the air! With...
>>> import network
>>> wlan = network.WLAN(network.WLAN.IF_STA)
>>> wlan.active(True)
>>> wlan.scan()
>>> wlan.isconnected()
>>> wlan.connect("ssid", "key")
>>> # wait a bit
>>> wlan.isconnected()
>>> # or use function from https://docs.micropython.org/en/latest/esp8266/quickref.html#networking
Then you can configure webrepl with:
>>> import webrepl_setup
…and it will print out an IP that you can connect to and use the REPL from your browser! Very nice.
What I haven't tried yet is using boot.py. From what I know, it will execute on every reset of the ESP, so it's basically how you "program" it, but a lot quicker, since you just place a file on the filesystem.
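If I do, I imagine it looks something like this (an untested sketch, using the mpremote tool, which I haven't used above):
# write a boot.py that blinks the inbuilt LED, then copy it onto the ESP8266
# with mpremote (`pip install mpremote`)
cat > boot.py << 'EOF'
from machine import Pin
import time

p = Pin(2, Pin.OUT)
for _ in range(10):
    p.off()  # the inbuilt LED is active-low, so off() lights it
    time.sleep(0.5)
    p.on()
    time.sleep(0.5)
EOF
mpremote connect /dev/ttyUSB0 fs cp boot.py :boot.py
mpremote connect /dev/ttyUSB0 reset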
I was screwing around on YouTube, and ended up watching a few videos about Rust. Actually, these ones: the first, leading to the second, leading to the third.
These videos are all by noboilerplate, and I got only 1:08 into the third video before I decided to try out Rust myself.
For a long time I've been meaning to make an identicon (think: the default pixelated profile picture on GitHub, etc.) using Lua, after seeing a friend's identicon implementations in several languages. I think, as they do, that making an identicon generator is a very fun and contained way to start experimenting with a new language - you get involved with random numbers, arrays, string formatting, loops, and maybe more.
Anyway, I still haven't made one in Lua, but I did make these three in Rust.
Installing Rust
Installing Rust was super easy, I just used the command from https://rustup.rs/.
Installing VSCodium extensions
Well, first I installed using sudo apt install cargo, but then the VSCodium extension I installed (Rust) suggested I should use rustup, so I uninstalled cargo and used rustup.
Then, I also found out that the VSCodium extension was deprecated in favour of the rust-analyzer extension, so I installed that one instead. I also installed CodeLLDB to allow debugging.
Running Rust
After installing Cargo, I ran cargo and it complained about a missing Cargo.toml, so I guessed I could run…
cargo init
…to create this, and it worked! Neat. It also showed a nice link to the documentation for Cargo.toml. I still haven't opened the Cargo.toml file. Anyway, cargo init also created a "hello world" script:
fn main() {
    println!("Hello, world!");
}
…which I could run with cargo run…
$ cargo run
Hello, world!
At this point, I got stuck trying to make the above identicons. I (naturally) came across a few stumbling blocks, but the errors that the compiler provided were quite nice, so I got along OK.
Here's the final code I ended up with (feel free to tell me that several sections are "bad" or "not Rust-y")
use rand::prelude::*;

const WIDTH: usize = 15;
const HEIGHT: usize = 15;
const SQUARE_SIZE: usize = 50;
const SVG_WIDTH: usize = WIDTH * SQUARE_SIZE;
const SVG_HEIGHT: usize = HEIGHT * SQUARE_SIZE;

fn main() {
    let mut rng = rand::rng();
    // generate one half of the identicon
    // let mut arr: [[bool; 0]; 0] = [];
    let mut arr: Vec<Vec<bool>> = vec![];
    for r in 0..HEIGHT {
        let empty_arr: Vec<bool> = vec![];
        arr.push(empty_arr);
        for _c in 0..((WIDTH + 1) / 2) {
            let random_val = rng.random_bool(0.5);
            arr[r].push(random_val);
        }
    }
    // print the SVG
    println!("<svg version='1.1'
  viewbox='0 0 {} {}'
  xmlns='http://www.w3.org/2000/svg'>", SVG_WIDTH, SVG_HEIGHT);
    println!("<rect width='{}' height='{}' fill='black' />", SVG_WIDTH, SVG_HEIGHT);
    for r in 0..arr.len() {
        let arr_first = arr.first();
        let mut cols = 0;
        if let Some(arr_first) = arr_first {
            cols = arr_first.len();
        }
        for c in 0..cols {
            let xleft = c * SQUARE_SIZE;
            let xright = SVG_WIDTH - xleft - SQUARE_SIZE;
            let y = r * SQUARE_SIZE;
            let filled = arr[r][c];
            let mut colour = "none";
            if filled {
                colour = "red";
            }
            println!(
                "<rect width='50' height='50' fill='{}' x='{}' y='{}' />",
                colour, xleft, y
            );
            println!(
                "<rect width='50' height='50' fill='{}' x='{}' y='{}' />",
                colour, xright, y
            );
        }
    }
    println!(r#"</svg>"#);
}
Sticking points
Two things that I got a bit stuck with were:
Not declaring loads of variables
I wasn't sure how to do a lot of things "in-line", and ended up declaring lots of variables, making the code quite verbose. For example, to push an empty vector to another vector I ended up doing (above) this…
let empty_arr: Vec<bool> = vec![];
arr.push(empty_arr);
…which I'm sure could be done in one line somehow. I don't know how.
Finding the length of an Option
To get the length of an embedded Vec (vector), I wanted to run arr.first().len() in some way, but arr.first() is either a vector or None (i.e., an optional/Option). I wanted to do something like:
if arr.first().is_none() {
    let cols = 0;
} else {
    let cols = arr.first().len();
}
…assuming that the compiler would realise that in the else section, arr.first() was not None, but it didn't. I don't know enough to figure out a way of doing this.
The route is combined from train journeys, ferry journeys, and bus journeys.
Train data
I got the train routing data in .gpx format from https://brouter.damsy.net/, selecting the "Rail" profile in the dropdown. Then, I clicked close to the stations I went to/from/past, got a nice map that looked alright, and exported it.
Bus data
I also used https://brouter.damsy.net/ for this, after I'd found it was good for trains. I just selected one of the "Car" profiles, and set my waypoints, and exported it in the same way.
Ferry data
This was different, as ferries don't use roads or train tracks [citation needed]. But! They are documented well on mapping services. So, I found the route I wanted on https://www.openstreetmap.org/ (OSM) (e.g., the Liepāja to Travemünde Ferry) by using the little question-mark "query features" button, then opened it on https://overpass-turbo.eu/ (a website for querying OSM data) by writing the query (with the correct feature ID):
way(128069455); out geom;
Then, I can click "Export" to get the .gpx (or other format) data out.
Combining
I spent a long time trying to figure out how to combine .gpx files with ogrmerge.
However, I couldn't figure it out. .gpx is confusing, and everyone who uses it seems to use GUI tools like arcgis or qgis, while I prefer to be able to do things with a command, which I can then repeat in future.
In the end, I converted the files to .geojson (my one true love) with ogr2ogr file111.geojson file111.gpx tracks for each file, and then combined them. Handily, I'd already written a note about combining .geojson files! I wish I'd stuck with .geojson the whole time. .gpx gives me headaches.
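Spelled out, that conversion-and-combine step is roughly (the filenames are placeholders, and the jq merge is one way of doing it):
# convert each .gpx to .geojson, keeping only the "tracks" layer
for f in trains.gpx buses.gpx ferries.gpx; do
  ogr2ogr "${f%.gpx}.geojson" "${f}" tracks
done
# merge the FeatureCollections into one file
jq -s '{type: "FeatureCollection", features: map(.features) | add}' \
  trains.geojson buses.geojson ferries.geojson > combined.geojson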
The End
That's it!
I could then load the combined file into https://geojson.io/ to check all was well (it was; I'd expected I might have to "reverse" some paths so they ran "forwards"), and I uploaded it to a new GitHub repository, https://github.com/alifeee/europe-trips/.
I also laser cut a mini Europe with a line for the trip on the map, as a gift for my lover :]