notes by alifeee


here I may post some short, text-only notes, mostly about programming.


creating a desktop overlay to view players on a Minecraft server with conky

2025-06-03 • tags: conky, minecraft, scripting, overlay • 1049 'words', 315 secs @ 200wpm

Currently, I'm hosting a Minecraft server weekly on Tuesdays. Sometimes I even play.

It's Vanilla with a proximity voice chat mod (walk near people to hear them). Proximity voice chat is endlessly fun (see Barotrauma, Factorio, et cetera…)

Today, I wanted to have an overlay (think Discord voice chat overlay, or when you pop-out a video in Firefox, or when you use chat heads on mobile) which showed me who was online on the server.

Querying the Minecraft server status

After seeing an "enable status" option in the server's server.properties file, and searching up what it meant (it allows services to query the status of the server), I remembered that I'd used https://mcsrvstat.us/ before to check the status of the server, which shows you the player list in a browser.

But a local overlay would need a local way to query the server status. So I did some web searching, found a Python script which wasn't great (and written for Python 2), then a self-hostable server status API, which led me to mcstatus, a Python API (with command line tool) for fetching server status.

I installed and tested it with

$ cd ~/temp/minecraft/
$ python3 -m venv env
$ ./env/bin/pip install mcstatus
$ ./env/bin/python -m mcstatus $SERVER_IP json
{"online": true, "kind": "Java", "status": {"players": {"online": 7, "max": 69, "sample": [{"name": "Boldwolf5491", "id": "289qfhj8-a8f2-298g-19ga-897ahwf8uwa8"}, {"name": "……………

Neat!
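Later, the script only needs a couple of fields from that JSON, which jq can pick out. A quick sanity check against a canned (trimmed-down) response of the same shape, with made-up player names:

```shell
# a canned mcstatus-style response (same shape as above, values made up)
json='{"online": true, "kind": "Java", "status": {"players": {"online": 2, "max": 69, "sample": [{"name": "Alice"}, {"name": "Bob"}]}}}'

# player count
echo "$json" | jq -r '.status.players.online'
# player names, one per line
echo "$json" | jq -r '.status.players.sample[].name'
```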

How to make an overlay on Linux

Next, a way of having an overlay. Searching for "linux x simple text overlay" led me to xmessage, which can show simple windows, but they're more like confirmation windows, not like long-lasting status windows (i.e., it's hard to update the text).

I was also led to discover conky, which – if nothing else – has a great name. It's designed to be a "system monitor", i.e., a thing wot shows you your CPU temperature, uptime, RAM usage, et cetera. The configuration is also written in Lua, which is super neat! I still want to get more into Lua.

Using conky

By modifying the default configuration (in /etc/conky/conky.conf) like so:

diff --git a/etc/conky/conky.conf b/.config/conky/conky.conf
index 44053d5..cc319e1 100644
--- a/etc/conky/conky.conf
+++ b/.config/conky/conky.conf
@@ -37,8 +37,9 @@ conky.config = {
     out_to_stderr = false,
     out_to_x = true,
     own_window = true,
+    own_window_title = 'Minecraft',
     own_window_class = 'Conky',
-    own_window_type = 'desktop',
+    own_window_type = 'normal', -- or desktop
     show_graph_range = false,
     show_graph_scale = false,
     stippled_borders = 0,
@@ -48,25 +49,9 @@ conky.config = {
     use_xft = true,
 }
 
 conky.text = [[
-${color grey}Info:$color ${scroll 32 Conky $conky_version - $sysname $nodename $kernel $machine}
-$hr
-${color grey}Uptime:$color $uptime
-${color grey}Frequency (in MHz):$color $freq
-${color grey}Frequency (in GHz):$color $freq_g
-${color grey}RAM Usage:$color $mem/$memmax - $memperc% ${membar 4}
-${color grey}Swap Usage:$color $swap/$swapmax - $swapperc% ${swapbar 4}
-${color grey}CPU Usage:$color $cpu% ${cpubar 4}
-${color grey}Processes:$color $processes  ${color grey}Running:$color $running_processes
-$hr
-${color grey}File systems:
- / $color${fs_used /}/${fs_size /} ${fs_bar 6 /}
-${color grey}Networking:
-Up:$color ${upspeed} ${color grey} - Down:$color ${downspeed}
-$hr
-${color grey}Name              PID     CPU%   MEM%
-${color lightgrey} ${top name 1} ${top pid 1} ${top cpu 1} ${top mem 1}
-${color lightgrey} ${top name 2} ${top pid 2} ${top cpu 2} ${top mem 2}
-${color lightgrey} ${top name 3} ${top pid 3} ${top cpu 3} ${top mem 3}
-${color lightgrey} ${top name 4} ${top pid 4} ${top cpu 4} ${top mem 4}
+${execpi 5 ~/temp/minecraft/check.sh}
 ]]

…when we run conky it opens a small window which contains the output of the script ~/temp/minecraft/check.sh (the 5 after execpi means it runs every 5 seconds). If this script was just echo "hi!" then that conky window looks a bit like:

 ———————+x
 |       |
 |  hi!  |
 |_______|

I use Pop!_OS, which uses Gnome/X for all the windows. With that (by default), I can right click the top bar of a window and click "Always on Top", which effectively makes the little window into an overlay, as it always displays on top of other windows, with the added bonus that I can easily drag it around.

Writing a script for conky to use

Now, I can change the script to use the above Minecraft server status JSON information to output something which conky can use as an input, like:

#!/bin/bash
#~/temp/minecraft/check.sh
json=$(~/temp/minecraft/env/bin/python -m mcstatus $SERVER_IP json)
online=$(echo "${json}" | jq -r '.status.players.online')
players=$(echo "${json}" | jq -r '.status.players.sample[] | .name')

echo '${color aaaa99}'"${online} players online"'${color}'
echo "---"
echo "${players}" \
  | sort \
  | awk '
  BEGIN{
    for(n=0;n<256;n++)ord[sprintf("%c",n)]=n
  }{
    r=0; g=0; b=0;
    split($0, arr, "")
    for (i in arr) {c=arr[i]; n=ord[c]; r+=n*11; g+=n*15; b+=n*21}
    printf "${color %X%X%X}%s\n",
      r%128+128, g%128+128, b%128+128, $0
  }
'

The fancy awk just makes each player name a different colour, generating the colours pseudo-randomly from the ASCII values of the player's username.

The final output

The final output looks like:

 ——————————————————+x
 | 8 players online |
 | ---              |
 | Kick_Flip_Barry  |
 | Blue_Outburst    |
 | Kboy8082         |
 | lele2102         |
 | Compostmelon101  |
 | Nobody808        |
 | Kaithefrog       |
 | BrinnanTheThird  |
 |__________________|

…which I can drag anywhere on my screen. When people join or leave the server, I can see a flash of change out of the corner of my eye.

Conclusions

Is this useful? Should I – instead – just have been playing the game? Do I use too many en-dashes? The world only knows.

Maybe I'll use conky for something else in future… I like to wonder what it could do…


getting my wifi name and password from the terminal

2025-05-25 • tags: scripting, wifi, aliases • 265 'words', 80 secs @ 200wpm

I often want to get my current WiFi name (SSID) and password.

How to get name/password manually

Sometimes, it's for a microcontroller. Sometimes, to share it. This time, it's for setting up an info-beamer device with WiFi.

Before today, I would usually open my phone and go to "share" under the WiFi settings, and copy the password manually, and also copy the SSID manually.

It's finally time to write a way to do it with bash!

How to get name/password with bash

After some web-searching, these commands do what I want:

alias wifi='iwgetid -r'
alias wifipw='sudo cat "/etc/NetworkManager/system-connections/$(wifi).nmconnection" | pcregrep -o1 "^psk=(.*)"'

How to use

…and I can use them like:

$ wifi
the wood raft (2.4G)
$ wifipw
[sudo] password for alifeee: 
**************

Neat!
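An aside: wifipw leans on pcregrep. If that's not installed, GNU grep's -P mode can do the same single-group extraction; \K drops the matched prefix from the output:

```shell
# equivalent extraction with GNU grep instead of pcregrep
# (using a made-up password on stdin rather than the real file)
echo "psk=hunter2" | grep -oP '^psk=\K.*'
```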

Using Atuin aliases

Finally, above I suggested I was using Bash aliases, but I actually created them using Atuin, specifically Atuin dotfile aliases, like:

atuin dotfiles alias set wifi 'iwgetid -r'
atuin dotfiles alias set wifipw 'sudo cat "/etc/NetworkManager/system-connections/$(wifi).nmconnection" | pcregrep -o1 "^psk=(.*)"'

Now, they will automatically be enabled on all my computers that use Atuin. This is actually not… amazingly helpful as my other computers all use ethernet, not WiFi, but… it's mainly about having the aliases all in the same place (and "backed up", if you will).


Getting hackspace Mastodon instances from SpaceAPI

2025-05-22 • tags: scripting, spaceapi, mastodon, hackspaces, json • 622 'words', 187 secs @ 200wpm

We're back on the SpaceAPI grind.

This time, I wanted to see what Mastodon instances different hackspaces used.

The "contact" field in SpaceAPI

SpaceAPI has a "contact" object, which is used for this kind of thing. For example, for Sheffield Hackspace, this is:

$ curl -s "https://www.sheffieldhackspace.org.uk/spaceapi.json" | jq '.contact'
{
  "email": "trustees@sheffieldhackspace.org.uk",
  "twitter": "@shhmakers",
  "facebook": "SHHMakers"
}

Downloading all the SpaceAPI files

Once again, I start by downloading the JSON files, so that (in theory) I can make only one request to each SpaceAPI endpoint, and then work with the data locally (instead of requesting the JSON from the web every time I interact with it).

This script is modified from last time I did it, adding some better feedback of why some endpoints fail.

# download spaces
tot=0; got=0
echo "code,url" > failed.txt
RED='\033[0;31m'; GREEN='\033[0;32m'; YELLOW='\033[0;33m'; NC='\033[0m'
while read double; do
  tot=$(($tot+1))
  name=$(echo "${double}" | awk -F';' '{print $1}');
  url=$(echo "${double}" | awk -F';' '{print $2}');
  fn=$(echo "${name}" | sed 's+/+-+g')
  echo "saving '${name}' - <${url}> to ./spaces/${fn}.json";

  # skip unless manually deleted  
  if [ -f "./spaces/${fn}.json" ]; then
    echo -e "  ${YELLOW}already saved${NC} this URL!" >> /dev/stderr
    got=$(($got+1))
    continue
  fi
  
  # get, skipping if HTTP status >= 400
  code=$(curl -L -s --fail --max-time 5 -o "./spaces/${fn}.json" --write-out "%{http_code}" "${url}")
  if [[ "${?}" -ne 0 ]] || [[ "${code}" -ne 200 ]]; then
    echo "${code},${url}" >> failed.txt
    echo -e "  ${RED}bad${NC} status code (${code}) for this url!"  >> /dev/stderr
    continue
  fi
  
  echo -e "  ${GREEN}fetched${NC}! maybe it's bad :S" >> /dev/stderr
  got=$(($got+1))
done <<<$(cat directory.json | jq -r 'to_entries | .[] | (.key + ";" + .value)')
echo "done, got ${got} of ${tot} files, $(($tot-$got)) failed with HTTP status >= 400"
echo "codes from failed.txt:"
cat failed.txt | awk -F',' 'NR>1{a[$1]+=1} END{printf "  "; for (i in a) {printf "%s (%i) ", i, a[i]}; printf "\n"}'

# some JSON files are malformed (i.e., not JSON) - just remove them
rem=0
for file in spaces/*.json; do
  cat "${file}" | jq > /dev/null
  if [[ "${?}" -ne 0 ]]; then
    echo "=== ${file} does not parse as JSON... removing it... ==="
    rm -v "${file}"
    rem=$(( $rem + 1 ))
  fi
done
echo "removed ${rem} malformed json files"
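An aside on the validity check at the end: jq's empty filter is a marginally tidier way to ask "does this parse?", since it parses the input and prints nothing, leaving only the exit status:

```shell
# jq's `empty` filter parses its input and emits no output,
# so the exit status alone says whether a file is valid JSON
echo '{"ok": true}' > /tmp/good.json
echo 'not json' > /tmp/bad.json

jq empty /tmp/good.json && echo "good.json parses"
jq empty /tmp/bad.json 2>/dev/null || echo "bad.json does not"
```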

Extracting contact information

This is basically copied from last time I did it, changing membership_plans? to contact?, and changing the jq format afterwards.

# parse contact info
for file in spaces/*.json; do
  plans=$(cat "${file}" | jq '.contact?')
  [[ "${plans}" == "null" ]] && continue
  echo "${file}"
  echo "${plans}" | jq -r 'to_entries | .[] | (.key + ": " + (.value|tostring) )'
  echo ""
done > contact.txt

It outputs something like:

$ cat contact.txt | tail -n20 | head -n13
spaces/Westwoodlabs.json
twitter: @Westwoodlabs
irc: ircs://irc.hackint.org:6697/westwoodlabs
email: vorstand@westwoodlabs.de

spaces/xHain.json
phone: +493057714272
email: info@x-hain.de
matrix: #general:x-hain.de
mastodon: @xHain_hackspace@chaos.social

spaces/Zeus WPI.json
email: bestuur@zeus.ugent.be

Calculating Mastodon averages

We can filter this file to only the "mastodon:" lines, and then extract the server with a funky regex, and get a list of which instances are most common.

$ cat contact.txt | grep '^[^:]*mastodon' | pcregrep -o1 '([^:\.@\/]*\.[^\/@]*).*' | sort | uniq -c | sort -n
      1 c3d2.social
      1 caos.social
      1 hachyderm.io
      1 hackerspace.pl
      1 mas.to
      1 social.bau-ha.us
      1 social.flipdot.org
      1 social.okoyono.de
      1 social.saarland
      1 social.schaffenburg.org
      1 telefant.net
      2 social.c3l.lu
      3 mastodon.social
      4 hsnl.social
     39 chaos.social

So… it's mostly chaos.social. Neat.


comparing historical HMO licence data in Sheffield

2025-05-14 • tags: scripting, hmos, open-data • 1321 'words', 396 secs @ 200wpm

What is an HMO licence

Sheffield city council publishes a list of HMO (House in Multiple Occupation) licences on their HMO page, along with other information about HMOs (in brief, an HMO is a shared house/flat with more than 3 non-family members, and it must be licenced if this number is 5 or more).

How accessible is the data on HMO licences

They provide a list of licences as an Excel spreadsheet (.xlsx). I've asked them before if they could (also) provide a CSV, but they told me that was technically impossible. I also asked if they had historical data (i.e., previous spreadsheets), but they said they deleted it every time they uploaded a new one.

Therefore, as I'm interested in private renting in Sheffield, I've been archiving the data in a GitHub repository, as CSVs. I also add additional data like lat/long coordinates (via geocoding), and parse the data into geographical formats like .geojson, .gpx, and .kml (which can be viewed on a map!).

Calculating statistics from the data

What I hadn't done yet was any statistics on the data (I'd only been interested in visualising it on a map) so that's what I've done now.

I spent the afternoon writing some scripts to parse CSV data and calculate things like mean occupants, most common postcodes, number of expiring licences by date, et cetera.

General Statistics

I find shell scripting interesting, but I'm not so sure everyone else does (the script for the interested). So I'm not going to put the scripts here, but I will say that I used these command line tools (CLI tools) this many times:

Anyway, here are the statistics from the script (in text form, as is most shareable):

hmos_2024-09-09.csv
  total licences: 1745
  6.29 mean occupants (IQR 2 [5 - 7]) (median 6)
  amount by postcode:
    S1 (60), S2 (214), S3 (100), S4 (12), S5 (18), S6 (90), S7 (62), 
    S8 (10), S9 (5), S10 (742), S11 (425), S12 (1), S13 (2), S14 (1), S20 (1), S35 (1), S36 (1), 
  streets with most licences: Crookesmoor Road (78), Norfolk Park Road (72), Ecclesall Road (48), Harcourt Road (38), School Road (29), 
  
hmos_2025-01-28.csv
  total licences: 1459
  6.35 mean occupants (IQR 2 [5 - 7]) (median 6)
  amount by postcode:
    S1 (50), S2 (199), S3 (94), S4 (9), S5 (17), S6 (78), S7 (57), 
    S8 (10), S9 (4), S10 (614), S11 (321), S12 (1), S13 (2), S20 (1), S35 (1), S36 (1), 
  streets with most licences: Norfolk Park Road (73), Crookesmoor Road (57), Ecclesall Road (43), Harcourt Road (28), School Road (26), 
  
hmos_2025-03-03.csv
  total licences: 1315
  6.37 mean occupants (IQR 2 [5 - 7]) (median 6)
  amount by postcode:
    S1 (48), S2 (161), S3 (92), S4 (8), S5 (13), S6 (70), S7 (55), 
    S8 (9), S9 (3), S10 (560), S11 (290), S12 (1), S13 (2), S20 (1), S35 (1), S36 (1), 
  streets with most licences: Crookesmoor Road (54), Norfolk Park Road (41), Ecclesall Road (38), Harcourt Road (27), Whitham Road (24), 
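The per-file summaries above are mostly classic awk one-liners. A minimal sketch of the mean-occupants line, using a made-up two-column CSV (the real files have more columns, so the column number here is an assumption):

```shell
# made-up data: occupants in column 2 (an assumption, not the real layout)
printf 'address,occupants\na,6\nb,7\nc,5\n' > /tmp/hmos_example.csv

# skip the header row, sum the column, divide by the row count
awk -F',' 'NR>1 {sum+=$2; n++} END {printf "%.2f mean occupants\n", sum/n}' /tmp/hmos_example.csv
```

which prints "6.00 mean occupants" for this toy data.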

Potential Conclusions

Draw your own conclusions there, but some could be that:

Statistics on issuing and expiry dates

I also did some statistics on the licence issue and expiry dates with a second stats script, which – as it parses nearly 5,000 dates – takes longer than "almost instantly" to run. As above, this used:

The script outputs:

hmos_2024-09-09.csv
  1745 dates in 1745 lines (627 unique issuing dates)
    637 expired
    1108 active
  Licence Issue Dates:
    Sun 06 Jan 2019, Sun 06 Jan 2019, … … … Wed 12 Jun 2024, Tue 09 Jul 2024, 
    Monday (275), Tuesday (440), Wednesday (405), Thursday (352), Friday (256), Saturday (5), Sunday (12), 
    2019 (84), 2020 (311), 2021 (588), 2022 (422), 2023 (183), 2024 (157), 
  Licence Expiry Dates:
    Mon 09 Sep 2024, Mon 09 Sep 2024, … … … Mon 11 Jun 2029, Sun 08 Jul 2029, 
    2024 (159), 2025 (824), 2026 (263), 2027 (225), 2028 (185), 2029 (89), 
    
hmos_2025-01-28.csv
  1459 dates in 1459 lines (561 unique issuing dates)
    334 expired
    1125 active
  Licence Issue Dates:
    Mon 28 Oct 2019, Mon 04 Nov 2019, … … … Mon 06 Jan 2025, Tue 14 Jan 2025, 
    Monday (243), Tuesday (380), Wednesday (338), Thursday (272), Friday (211), Saturday (6), Sunday (9), 
    2019 (2), 2020 (130), 2021 (567), 2022 (406), 2023 (181), 2024 (170), 2025 (3), 
  Licence Expiry Dates:
    Thu 30 Jan 2025, Fri 31 Jan 2025, … … … Mon 22 Oct 2029, Wed 28 Nov 2029, 
    2025 (681), 2026 (264), 2027 (225), 2028 (184), 2029 (105), 
    
hmos_2025-03-03.csv
  1315 dates in 1315 lines (523 unique issuing dates)
    189 expired
    1126 active
  Licence Issue Dates:
    Mon 28 Oct 2019, Mon 04 Nov 2019, … … … Wed 05 Mar 2025, Wed 05 Mar 2025, 
    Monday (217), Tuesday (339), Wednesday (314), Thursday (244), Friday (189), Saturday (4), Sunday (8), 
    2019 (2), 2020 (64), 2021 (494), 2022 (399), 2023 (177), 2024 (170), 2025 (9), 
  Licence Expiry Dates:
    Fri 07 Mar 2025, Fri 07 Mar 2025, … … … Mon 22 Oct 2029, Wed 28 Nov 2029, 
    2025 (533), 2026 (262), 2027 (225), 2028 (184), 2029 (111),
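The weekday tallies above can be reproduced with GNU date's +%A format; a sketch using three issue dates lifted from the output:

```shell
# tally dates by weekday (GNU date)
for d in 2019-01-06 2024-06-12 2024-07-09; do
  date -d "$d" +%A
done | sort | uniq -c
```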

Potential conclusions on dates

Again, draw your own conclusions (homework!), but some could be:

Why is this interesting

I started collecting HMO data originally because I wanted to visualise the licences on a map. Over a short time, I have created my own archive of licence history (as the council do not provide such).

Since I had multiple months of data, I could make some comparison, so I made these statistics. I don't find them incredibly useful, but there could be people who do.

Perhaps as time goes on, the long-term comparison (over years) could be interesting. I think the above data might not be greatly useful as it seems that Sheffield council are experiencing delays over licensing at the moment, so the decline in licences probably doesn't reflect general housing trends.

Plus, I just wanted to do some shell-scripting ;]


taking a small bunch of census data from FindMyPast

2025-04-29 • tags: jq, scripting, web-scraping, census, data • 507 'words', 152 secs @ 200wpm

The 1939 Register was a basically-census taken in 1939. On the National Archives page, it says that it is entirely available online.

However, further down, it lists how to access it, which says:

You can search for and view open records on our partner site Findmypast.co.uk (charges apply). A version of the 1939 Register is also available at Ancestry.co.uk (charges apply), and transcriptions without images are on MyHeritage.com (charges apply). It is free to search for these records, but there is a charge to view full transcriptions and download images of documents. Please note that you can view these records online free of charge in the reading rooms at The National Archives in Kew.

So… charges apply.

Anyway, for a while in April 2025 (until May 8th), FindMyPast is giving free access to the 1939 data.

Of course, family history is hard, and what's much easier is "who lived in my house in 1939". For that you can use:

I created an account with a bogus email address (you're not collecting THIS guy's data) and took a look around at some houses.

Then, I figured I could export my entire street, so I did.

The code and more context is in a GitHub Repository, but in brief, I:

Now it looks like:

AddressStreet,Address,Inhabited,LatLon,FirstName,LastName,BirthDate,ApproxAge,OccupationText,Gender,MaritalStatus,Relationship,Schedule,ScheduleSubNumber,Id
Khartoum Road,"1 Khartoum Road, Sheffield",Y,"53.3701,-1.4943",Constance A,Latch,31 Aug 1904,35,Manageress Restaurant & Canteen,Female,Married,Unknown,172,3,TNA/R39/3506/3506E/003/17
Khartoum Road,"4 Khartoum Road, Sheffield",Y,"53.3701,-1.4943",Catherine,Power,? Feb 1897,42,Music Hall Artists,Female,Married,Unknown,171,8,TNA/R39/3506/3506D/015/37
Khartoum Road,"4 Khartoum Road, Sheffield",Y,"53.3701,-1.4943",Charles F R,Kirby,? Nov 1886,53,Newsagent Canvasser,Male,Married,Head,172,1,TNA/R39/3506/3506D/015/39
Khartoum Road,"4 Khartoum Road, Sheffield",Y,"53.3701,-1.4943",Constance A,Latch,31 Aug 1912,27,Manageress Restairant & Cante,Female,Married,Unknown,172,3,TNA/R39/3506/3506D/015/41

Neat!

Some of my favourite jobs in the sheet of streets I collected are:


setting up OwnTracks on my server

2025-03-28 • tags: owntracks, installation • 389 'words', 117 secs @ 200wpm

OwnTracks is a self-hostable service for tracking locations.

It's like an open-source, self-hostable version of Life360, Apple's Find My Friends, Google's Find My Device or many other apps which let you view the location history of yourself or others.

You can self-host it on a raspberry pi, a server, or otherwise a computer which is always-on and has access to the Internet (with a static IP).

I tried installing it on my server, and this is what I ran. Notably, it wiped my existing nginx configuration, so I suggested a clarification of that in the documentation and made sure to back my config up so I could restore it. Thankfully I was already doing that.

# install files
cd /usr/alifeee/
git clone --depth=1 https://github.com/owntracks/quicksetup
mv quicksetup owntracks
cd /usr/alifeee/owntracks/

# edit configuration
cp configuration.yaml.example configuration.yaml
nano configuration.yaml

# back up nginx.conf
(cd /media/alifeee; sudo ./back-up.sh)

# set up
sudo ./bootstrap.sh

# reset nginx conf (it blanks it)
sudo chmod u+w /etc/nginx/nginx.conf
sudo cp /media/alifeee/20250307T1749/nginx.conf /etc/nginx/nginx.conf
# put this into config
echo '# for owntracks
  map $cookie_otrauth $mysite_hascookie {
    "vhwNkyPGPCvnMCiQRkCs" "off";
    default "My OwnTracks";
  }'

# get passwords
sudo tail /usr/local/owntracks/userdata/*.pass

Files are stored in /usr/local/owntracks. I can then visit https://owntracks.alifeee.net/, log in, and see a setup. I downloaded the Android app and set it up by opening the file from my website with the app, and it set up pretty well!

Then, I went for a walk, and looked at the "frontend", which puts out a view like this, of my location history:

       ___
   ___/   \
  /      alifeee
  |
  /
 |
  \___
      \__

It's pretty neat! And self-hosted! Another service to join the maybe-too-many on my server…

I made an account on it for my friends and we plan to use it to keep track of each other in our hitchhiking adventure next week.


installing Waydroid, an Android emulator, on Linux

2025-03-19 • tags: linux, android, installation • 645 'words', 194 secs @ 200wpm

I wanted to use Android on Linux, so I searched the web and found https://waydro.id/.

Instead of running in some kind of virtual machine, it seems to run Android slightly more natively on Linux (I really don't know how any of this works).

Here is a small adventure at trying to install it:

Official installation guide

Following steps from https://docs.waydro.id/usage/install-on-desktops#ubuntu-debian-and-derivatives (I'm on Pop!_OS which I think is based on Debian).

$ sudo apt install curl ca-certificates
$ curl -s https://repo.waydro.id | sudo bash
$ sudo apt install waydroid
$ waydroid init
[11:27:05] Failed to load binder driver
[11:27:05] modprobe: FATAL: Module binder_linux not found in directory /lib/modules/6.13.0-061300-generic
[11:27:05] ERROR: Binder node "binder" for waydroid not found
[11:27:05] See also: https://github.com/waydroid

I don't really know what this error meant, but after searching, it seemed that Waydroid needed "wayland" and I was using X (which are both Desktop thingies which make pixels appear on the screen). I read things about Pop!_OS not having something necessary installed in the kernel, but I could use "DKMS", meaning Dynamic Kernel Module Support. So I tried installing what I'd found links to with:

git clone https://github.com/choff/anbox-modules
cd anbox-modules && ./INSTALL.sh

Now when I ran waydroid init it worked, but then I got nothing. I wasn't really sure what I was supposed to be doing to "open" it now that it was "init"'d. So I deleted all I could with

$ sudo apt remove waydroid
$ sudo find / -type d -name "waydroid"
/var/lib/waydroid
$ rm -rf /var/lib/waydroid

…and found a helper script to do stuff for me.

Installation script

The script I found was https://github.com/n1lby73/waydroid-installer, which I ran with

git clone https://github.com/n1lby73/waydroid-installer
cd waydroid-installer
sudo bash install_script.sh

The script didn't complete, and complained about modules not installing, specifically that lxd-client could not be installed.

Looking at the script I saw it was trying to run apt install lxd-client but running that myself, it seemed that it didn't exist:

$ sudo apt install lxd-client
E: Unable to locate package lxd-client

After searching, it seems lxd-client provides a command lxc, so I looked for how to install lxc, and found it was possible via snap. I've not really used snap before, and people have complained about it (about filesize and automatic updates), so I was wary of installing it, but I did with:

$ sudo apt install snapd
$ sudo snap install lxd
$ lxc --version
5.21.3 LTS

I removed lxd-client from the install script and re-ran it, and it seemed to work OK. It said it installed a "Wayland" desktop option on my login page if I rebooted.

Opening waydroid

So I rebooted, and on the login screen selected "Pop on Wayland" (I'm still not fully sure what this X/Wayland thing is), and tried starting Waydroid.

Running…

waydroid session start

…and…

waydroid show-full-ui
waydroid app install org.fedorahosted.freeotp_46.apk
waydroid app install "Firefox Fast & Private Browser_136.0.2_APKPure.apk"

…installed some apps and filled one of my screens with a big Android display.

I found the APKs on either F-Droid, which just has them available for download (sweet) or by searching the web and downloading them from sketchy sites.

It seems to work well!

I suppose there's a lot you can do with Waydroid, if you want. I don't think I want.

In some ways, this is an example of the involved-nature of installing things on Linux.


testing micropython on an ESP8266 D1 Mini

2025-03-13 • tags: micropython, microcontrollers • 544 'words', 163 secs @ 200wpm

I've toyed for a while with microcontrollers, and only really used Arduino/C/C++. Sometimes, I've heard talk of MicroPython, but I've never tried it out.

Until today!

I had a little experiment, and it seems promising. I might have a larger experiment soon (maybe try to retry some of my hardware hacking).

I'll share here my initial experiments! I'm running on a Linux computer, on Pop!_OS.

I read the:

…and downloaded the firmware file for the ESP8266 D1 Mini (which I have a few of) from https://micropython.org/download/ESP8266_GENERIC/, and then ran:

# install files and virtual environment
mkdir -p /git/micropython
cd /git/micropython
python -m venv env
. env/bin/activate
pip install esptool

# download firmware
$ ls
ESP8266_GENERIC-20241129-v1.24.1.bin

# at this point I plugged in the ESP but it was not recognised
#   after looking at `tail -f /var/log/syslog`, I saw that `brltty`
#   was doing something spooky. I remembered having this issue before,
#   and that `brltty` was something to help Braille readers. As I don't
#   need that, I...
# disabled brltty
sudo systemctl stop brltty.service
sudo systemctl mask brltty.service
sudo systemctl disable brltty.service
sudo systemctl restart

# now I could see the ESP as a USB device
$ lsusb
$ ls /dev/ | grep "ttyUSB"
ttyUSB0

# flash ESP
esptool.py --port /dev/ttyUSB0 erase_flash
esptool.py --port /dev/ttyUSB0 --baud 1000000 write_flash --flash_size=4MB -fm dio 0 ESP8266_GENERIC-20241129-v1.24.1.bin

That's it installed! Now, using the guides above I found I needed a terminal emulator, so I used picocom. And, to reach the pinnacles of complexity, tried turning the inbuilt LED on and off.

sudo apt install picocom
picocom /dev/ttyUSB0 -b115200
>>> from machine import Pin
>>> p = Pin(2, Pin.OUT)
>>> p.off()
>>> p.on()

It works! Neat! The REPL (Read-Evaluate-Print-Loop) is really nice to quickly debug with. Perhaps nicer than the "waiting-for-20-seconds-for-your-C-code-to-flash-onto-the-device".

I also tried connecting to WiFi and using the Web-REPL, so you can execute Python commands over the air! With...

>>> import network
>>> wlan = network.WLAN(network.WLAN.IF_STA)
>>> wlan.active(True)
>>> wlan.scan()
>>> wlan.isconnected()
>>> wlan.connect("ssid", "key")
>>> # wait a bit
>>> wlan.isconnected()
>>> # or use function from https://docs.micropython.org/en/latest/esp8266/quickref.html#networking

Then you can configure webrepl with:

>>> import webrepl_setup

…and it will print out an IP that you can connect to and use the REPL from your browser! Very nice.

What I haven't tried yet is using boot.py. From what I know it will execute on every reset of the ESP, so basically is how you "program" it, but a lot quicker, since you just place a file on the filesystem.
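For reference, a minimal boot.py might look something like this (a sketch I haven't flashed; the SSID/key are placeholders, and the network.STA_IF constant is an assumption that may differ between firmware versions):

```python
# boot.py - runs on every reset of the ESP8266 (MicroPython)
import network

SSID = "ssid"  # placeholder
KEY = "key"    # placeholder

# bring up the station interface and (re)connect to WiFi
wlan = network.WLAN(network.STA_IF)
wlan.active(True)
if not wlan.isconnected():
    wlan.connect(SSID, KEY)
```

…copied onto the device's filesystem (e.g. with mpremote or the WebREPL file transfer).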

I'll give that a go soon...


testing Rust for the first time by making an identicon

2025-03-13 • tags: rust, identicons • 8802 'words', 2641 secs @ 200wpm

I was screwing around on YouTube, and ended up watching a few videos about Rust. Actually, these ones: the first, leading to the second, leading to the third.

These videos are all by noboilerplate, and I got only 1:08 minutes into the third video before I decided to try out Rust myself.

For a long time I've been meaning to make an identicon (think: default pixelated profile picture for GitHub/etc) using Lua, after seeing a friend's identicon implementations in several languages. I think, as they do, that making an identicon generator is a very fun and contained way to start experimenting with a new language - you get involved with random numbers, arrays, string formatting, loops, and maybe more.

Anyway, I still haven't made one in Lua, but I did make these three in Rust.

Installing Rust

Installing Rust was super easy, I just used the command from https://rustup.rs/.

Installing VSCodium extensions

Well, first I installed using sudo apt install cargo, but then the VSCodium extension I installed (Rust) suggested I should use rustup, so I uninstalled cargo and used rustup.

Then, I also found out that the VSCodium extension was deprecated in favour of the rust-analyzer extension, so I installed that one instead. I also installed CodeLLDB to allow debugging.

Running Rust

After installing Cargo, I ran cargo and it complained about a missing Cargo.toml, so I guessed I could run…

cargo init

…to create this, and it worked! Neat. It also showed a nice link to the documentation for Cargo.toml. I still haven't opened the Cargo.toml file. Anyway, cargo init also created a "hello world" script:

fn main() {
    println!("Hello, world!");
}

…which I could run with cargo run

$ cargo run
Hello, world!

At this point, I got stuck in trying to make the above identicons. I (naturally) came across a few stumbling blocks, but the errors that the compiler provided were quite nice, so I got along OK.

Here's the final code I ended up with (feel free to tell me that several sections are "bad" or "not Rust-y")

use rand::prelude::*;

const WIDTH: usize = 15;
const HEIGHT: usize = 15;
const SQUARE_SIZE: usize = 50;
const SVG_WIDTH: usize = WIDTH * SQUARE_SIZE;
const SVG_HEIGHT: usize = HEIGHT * SQUARE_SIZE;

fn main() {
    let mut rng = rand::rng();

    // generate one half of the identicon
    // let mut arr: [[bool; 0]; 0] = [];
    let mut arr: Vec<Vec<bool>> = vec![];
    for r in 0..HEIGHT {
        let empty_arr: Vec<bool> = vec![];
        arr.push(empty_arr);
        for _c in 0..((WIDTH + 1) / 2) {
            let random_val = rng.random_bool(0.5);
            arr[r].push(random_val);
        }
    }

    // print the SVG
    println!(
        "<svg version='1.1'
     viewBox='0 0 {} {}'
     xmlns='http://www.w3.org/2000/svg'>",
        SVG_WIDTH, SVG_HEIGHT
    );
    println!(
        "<rect width='{}' height='{}' fill='black' />",
        SVG_WIDTH, SVG_HEIGHT
    );
    for r in 0..arr.len() {
        let arr_first = arr.first();
        let mut cols = 0;
        if let Some(arr_first) = arr_first {
            cols = arr_first.len();
        }
        for c in 0..cols {
            let xleft = c * SQUARE_SIZE;
            let xright = SVG_WIDTH - xleft - SQUARE_SIZE;
            let y = r * SQUARE_SIZE;

            let filled = arr[r][c];
            let mut colour = "none";
            if filled {
                colour = "red";
            }

            println!(
                "<rect width='{}' height='{}' fill='{}' x='{}' y='{}' />",
                SQUARE_SIZE, SQUARE_SIZE, colour, xleft, y
            );
            println!(
                "<rect width='{}' height='{}' fill='{}' x='{}' y='{}' />",
                SQUARE_SIZE, SQUARE_SIZE, colour, xright, y
            );
        }
    }
    println!(r#"</svg>"#);
}

Sticking points

Two things that I got a bit stuck with were:

Not declaring loads of variables

I wasn't sure how to do a lot of things "in-line", and ended up declaring lots of variables, making the code quite verbose. For example, to push an empty vector to another vector I ended up doing (above) this…

let empty_arr: Vec<bool> = vec![];
arr.push(empty_arr);

…which I'm sure could be done in one line somehow. I don't know how.
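(Since writing this, my guess at the one-line version - I haven't checked whether it's the most idiomatic form - is that vec![] can be passed straight into push, with the element type inferred from the outer vector:)

```rust
fn main() {
    let mut arr: Vec<Vec<bool>> = vec![];

    // push a fresh empty row without naming a temporary
    arr.push(vec![]);
    // or, equivalently
    arr.push(Vec::new());

    assert_eq!(arr.len(), 2);
    assert!(arr.iter().all(|row| row.is_empty()));
    println!("{} empty rows", arr.len());
}
```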

Finding the length of an Option

To get the length of an embedded Vec (vector), I wanted to run arr.first().len() in some way, but arr.first() doesn't return a vector: it returns an Option, which is either Some(&vector) or None. I wanted to do something like:

if arr.first().is_none() {
  let cols = 0;
} else {
  let cols = arr.first().len();
}

…assuming that the compiler would realise that in the else section, arr.first() was not None, but it didn't. (Each let cols also only lives inside its own branch, so this wouldn't have worked anyway.) I don't know enough to figure out a way of doing this.
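(A later discovery, offered as a sketch rather than what I did at the time: Option has combinator methods, and Option::map_or - which takes a default value and a closure - collapses both branches into one expression:)

```rust
fn main() {
    let arr: Vec<Vec<bool>> = vec![vec![true, false, true]];

    // 0 if the outer vec is empty, otherwise the first row's length
    let cols = arr.first().map_or(0, |row| row.len());
    assert_eq!(cols, 3);

    // with an empty outer vec, we get the default
    let empty: Vec<Vec<bool>> = vec![];
    assert_eq!(empty.first().map_or(0, |row| row.len()), 0);

    println!("cols = {}", cols);
}
```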

The End

It was quite fun using Rust for the first time.

Identicons are a lovely first project.

Perhaps I'll touch Rust again. Perhaps I won't.

back to top

how to get a GPS trace of train and boat journeys

2025-03-10 • tags: travel, geojson, gpx, maps • 510 'words', 153 secs @ 200wpm

I like sustainable travel. I also like interrailing. I also like maps.

Let's combine all three! This winter I went via train from Sheffield to Hamburg (for Chaos Computer Club), and then on to Lapland, and back.

The map

To cut to the chase, I made a coordinates file of the trip, and you can see it here on a map:

https://geojson.io/#data=data:text/x-url,https%3A%2F%2Fraw.githubusercontent.com%2Falifeee%2Feurope-trips%2Frefs%2Fheads%2Fmain%2F2024-12%2520CCC%2Fall.geojson

It's combined from train journeys, ferry journeys, and bus journeys.

Train data

I got the train routing data in .gpx format from https://brouter.damsy.net/, selecting the "Rail" profile in the dropdown. Then, I clicked close to the stations I went to/from/past, got a nice map that looked alright, and exported it.

Bus data

I also used https://brouter.damsy.net/ for this, after I'd found it was good for trains. I just selected one of the "Car" profiles, and set my waypoints, and exported it in the same way.

Ferry data

This was different, as ferries don't use roads or train tracks [citation needed]. But! They are documented well on mapping services. So, I found the route I wanted on https://www.openstreetmap.org/ (OSM) (e.g., the Liepāja to Travemünde ferry) by using the little question mark "query features" button, then opened it on https://overpass-turbo.eu/ (a website for querying OSM data) by writing the query (with the correct feature ID):

way(128069455); out geom;

Then, I can click "Export" to get the .gpx (or other format) data out.

Combining

I spent a long time trying to figure out how to combine .gpx files with ogrmerge.

However, I couldn't figure it out. .gpx is confusing, and everyone who uses it seems to use GUI tools like ArcGIS or QGIS, while I prefer to be able to do things with a command, which I can then repeat in future.

In the end, I converted the files to .geojson (my one true love) with ogr2ogr file111.geojson file111.gpx tracks for each file, and then combined them. Handily, I'd already written a note about combining .geojson files! I wish I'd stuck with .geojson the whole time. .gpx gives me headaches.
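(For reference, a sketch of the combining step with jq. The two stand-in files here are hypothetical stand-ins - in reality each .geojson came out of ogr2ogr as above:)

```shell
# two tiny stand-in FeatureCollections (in reality these come from
# `ogr2ogr leg.geojson leg.gpx tracks` for each journey leg)
printf '%s' '{"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"LineString","coordinates":[[0,0],[1,1]]},"properties":{"name":"train"}}]}' > leg1.geojson
printf '%s' '{"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"LineString","coordinates":[[1,1],[2,2]]},"properties":{"name":"ferry"}}]}' > leg2.geojson

# slurp every file and concatenate their "features" arrays
jq -s '{type: "FeatureCollection", features: map(.features) | add}' \
  leg1.geojson leg2.geojson > all.geojson

jq '.features | length' all.geojson  # 2
```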

The End

That's it!

I could then load the combined file into https://geojson.io/ to check all was well (it was; I'd expected I might have to "reverse" some paths to be "forwards"), and I uploaded it to a new GitHub repository, https://github.com/alifeee/europe-trips/.

I also laser cut a mini Europe with a line for the trip on the map, as a gift for my lover :]

back to top