I'm interested in easy ways to see change. Comparing the old and new webpages by eye is hard, which leads me to text-diffing: I can copy the contents of the website to a file and compare the files instead. Let's do that. I did a similar thing a while ago with computer benchmarks.
I manually create two files by copying the interesting bits of the webpage, called 1 and 2 (because who has time for .txt extensions). Then, I can run:
The latter command turns each into HTML by turning + lines into <ins> ("insert"), - into <del> ("delete"), and removing leading spaces on other lines. Then, I can whack the output into a simple HTML template:
<!DOCTYPE html><html><head><style>
  body { background: black; color: white; }
  pre { padding: 1rem; }
  del { text-decoration: none; color: red; }
  ins { text-decoration: none; color: green; }
</style></head><body><pre>
diff goes here...
<del>del lines will be red</del><ins>ins lines will be green</ins></pre></body></html>
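The pair of commands described above might look something like this sketch; the file contents and exact sed expressions here are my own stand-ins, not necessarily the originals:

```shell
# hypothetical stand-ins for the two copy-pasted files, "1" and "2"
printf 'Valid until\n05 February 2025\n' > 1
printf 'Valid until\n05 February 2035\n' > 2

# unified diff, then: "+" lines -> <ins>, "-" lines -> <del>, strip leading spaces
# (note: this also wraps the +++/--- header lines; good enough for a quick look)
diff -u 1 2 | sed \
  -e 's|^+\(.*\)|<ins>\1</ins>|' \
  -e 's|^-\(.*\)|<del>\1</del>|' \
  -e 's|^ ||'
```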
The final output is something like this (personal information removed; don't doxx me):
Energy rating
D
Valid until
05 February 2025
05 February 2035
Property type
Mid-terrace house
Total floor area
130 square metres
123 square metres
This property’s energy rating is D. It has the potential to be C.
This property’s energy rating is D. It has the potential to be B.
Features in this property
Window Fully double glazed Good
Roof Pitched, no insulation (assumed) Very poor
Roof Roof room(s), no insulation (assumed) Very poor
Roof Roof room(s), insulated (assumed) Good
Lighting Low energy lighting in 64% of fixed outlets Good
Lighting Low energy lighting in all fixed outlets Very good
Secondary heating None N/A
Primary energy use
The primary energy use for this property per year is 303 kilowatt hours per square metre (kWh/m2).
The primary energy use for this property per year is 252 kilowatt hours per square metre (kWh/m2).
Good job on us for having 100% low energy lighting fixtures, I guess...
Really, this is a complicated way to simplify something. I like simple things, so I like this.
I wanted to know which hackspaces published their membership prices using SpaceAPI, and what those rates were. Here are a few bash scripts to do just that:
# get the directory of SpaceAPIs
mkdir -p ~/temp/spaceapi/spaces
cd ~/temp/spaceapi
curl "https://directory.spaceapi.io/" | jq > directory.json

# save (as many as possible of) SpaceAPIs to local computer
tot=0; got=0
while read double; do
  tot=$(($tot+1))
  name=$(echo "${double}" | awk -F';' '{print $1}')
  url=$(echo "${double}" | awk -F';' '{print $2}')
  fn=$(echo "${name}" | sed 's+/+-+g')
  echo "saving '${name}' - <${url}> to ./spaces/${fn}.json"
  # skip unless manually deleted
  if [ -f "./spaces/${fn}.json" ]; then
    echo "already saved!"
    got=$(($got+1))
    continue
  fi
  # get, skipping if HTTP status >= 400
  curl -L -s --fail --max-time 5 "${url}" -o "./spaces/${fn}.json" || continue
  echo "fetched! maybe it's bad :S"
  got=$(($got+1))
done <<< $(cat directory.json | jq -r 'to_entries | .[] | (.key + ";" + .value)')
echo "done, got ${got} of ${tot} files, $(($tot-$got)) failed with HTTP status >= 400"

# some JSON files are malformed (i.e., not JSON) - just remove them
for file in spaces/*.json; do
  cat "${file}" | jq > /dev/null
  if [[ "${?}" -ne 0 ]]; then
    echo "${file} does not parse as JSON... removing it..."
    rm "${file}"
  fi
done

# loop every JSON file, and nicely output any that have a .membership_plans object
for file in spaces/*.json; do
  plans=$(cat "${file}" | jq '.membership_plans?')
  [[ "${plans}" == "null" ]] && continue
  echo "${file}"
  # echo "${plans}" | jq -c
  echo "${plans}" | jq -r '.[] | (.currency_symbol + (.value|tostring) + " " + .currency + " " + .billing_interval + " for " + .name + " (" + .description + ")")'
  echo ""
done
The output of this final loop looks like:
...
spaces/CCC Basel.json
20 CHF monthly for Minimal ()
40 CHF monthly for Recommended ()
60 CHF monthly for Root ()
...
spaces/RevSpace.json
32 EUR monthly for regular ()
20 EUR monthly for junior ()
19.84 EUR monthly for multi2 ()
13.37 EUR monthly for multi3 ()
...
spaces/Sheffield Hackspace.json
£6 GBP monthly for normal membership (regularly attend any of the several open evenings a week)
£21 GBP monthly for keyholder membership (come and go as you please)
...
see full output
spaces/CCC Basel.json
20 CHF monthly for Minimal ()
40 CHF monthly for Recommended ()
60 CHF monthly for Root ()
spaces/ChaosStuff.json
120 EUR yearly for Regular Membership (For people with a regular income)
40 EUR yearly for Student Membership (For pupils and students)
40 EUR yearly for Supporting Membership (For people who want to use the space to work on projects, but don't want to have voting rights an a general assembly.)
1 EUR yearly for Starving Hacker (For people, who cannot afford the membership. Please get in touch with us, before applying.)
spaces/dezentrale.json
16 EUR monthly for Reduced membership ()
32 EUR monthly for Regular membership ()
42 EUR monthly for Nerd membership ()
64 EUR monthly for Nerd membership ()
128 EUR monthly for Nerd membership ()
spaces/Entropia.json
25 EUR yearly for Regular Members (Normale Mitglieder gem. https://entropia.de/Satzung_des_Vereins_Entropia_e.V.#Beitragsordnung)
19 EUR yearly for Members of CCC e.V. (Mitglieder des CCC e.V. gem. https://entropia.de/Satzung_des_Vereins_Entropia_e.V.#Beitragsordnung)
15 EUR yearly for Reduced Fee Members (Schüler, Studenten, Auszubildende und Menschen mit geringem Einkommen gem. https://entropia.de/Satzung_des_Vereins_Entropia_e.V.#Beitragsordnung)
6 EUR yearly for Sustaining Membership (Fördermitglieder gem. https://entropia.de/Satzung_des_Vereins_Entropia_e.V.#Beitragsordnung)
spaces/Hacker Embassy.json
100 USD monthly for Membership ()
spaces/Hackerspace.Gent.json
25 EUR monthly for regular (discount rates and yearly invoice also available)
spaces/Hack Manhattan.json
110 USD monthly for Normal Membership (Membership dues go directly to rent, utilities, and the occasional equipment purchase.)
55 USD monthly for Starving Hacker Membership (Membership dues go directly to rent, utilities, and the occasional equipment purchase. This plan is intended for student/unemployed hackers.)
spaces/Hal9k.json
450 DKK other for Normal membership (Billing is once per quarter)
225 DKK other for Student membership (Billing is once per quarter)
spaces/Leigh Hackspace.json
24 GBP monthly for Member (Our standard membership that allows usage of the hackspace facilities.)
30 GBP monthly for Member+ (Standard membership with an additional donation.)
18 GBP monthly for Concession (A subsidised membership for pensioners, students, and low income earners.)
40 GBP monthly for Family (A discounted family membership for two adults and two children.)
5 GBP daily for Day Pass (Access to the hackspace's facilities for a day.)
5 GBP monthly for Patron (Support the hackspace without being a member.)
spaces/LeineLab.json
120 EUR yearly for Ordentliche Mitgliedschaft ()
30 EUR yearly for Ermäßigte Mitgliedschaft ()
336 EUR yearly for Ordentliche Mitgliedschaft + Werkstatt ()
120 EUR yearly for Ermäßigte Mitgliedschaft + Werkstatt ()
spaces/<name>space Gera.json
spaces/Nerdberg.json
35 EUR monthly for Vollmitgliedschaft (Normal fee, if it is to much for you, contact the leading board, we'll find a solution.)
15 EUR monthly for Fördermitgliedschaft ()
spaces/NYC Resistor.json
115 USD monthly for standard ()
75 USD monthly for teaching ()
spaces/Odenwilusenz.json
0 CHF yearly for Besucher ()
120 CHF yearly for Mitglied ()
480 CHF yearly for Superuser ()
1200 CHF yearly for Co-Worker ()
spaces/RevSpace.json
32 EUR monthly for regular ()
20 EUR monthly for junior ()
19.84 EUR monthly for multi2 ()
13.37 EUR monthly for multi3 ()
spaces/Sheffield Hackspace.json
£6 GBP monthly for normal membership (regularly attend any of the several open evenings a week)
£21 GBP monthly for keyholder membership (come and go as you please)
spaces/TkkrLab.json
30 EUR monthly for Normal member (Member of TkkrLab (https://tkkrlab.nl/deelnemer-worden/))
15 EUR monthly for Student member (Member of TkkrLab, discount for students (https://tkkrlab.nl/deelnemer-worden/))
15 EUR monthly for Student member (Junior member of TkkrLab, discount for people aged 16 or 17 (https://tkkrlab.nl/deelnemer-worden/))
spaces/-usr-space.json
I think a couple have weird names like <name>space or /dev/tal which screw with my script. Oh well, it's for you to improve.
Overall, not that many spaces have published their prices to SpaceAPI. Also, the ones in the US look really expensive. As ever, a good price probably depends on context (size/city/location/etc).
Perhaps I can convince some other spaces to put their membership prices in their SpaceAPI...
I write these notes in Obsidian. To upload them, I could visit https://github.com/alifeee/blog/tree/main/notes, click "add file", and copy and paste the file contents. I probably should do that.
But, instead, I wrote a shell script to upload them. Now, I can press "CTRL+P" to open the Obsidian command palette, type "lint" (to lint the note), then open it again, type "upload", and upload the note. At this point, I could walk away and assume everything went fine, but what I normally do is open the GitHub Actions tab to check that it worked properly.
The process the script undertakes is:
check user inputs are good (all variables exist, file is declared)
check whether the file already exists in GitHub with a curl request
generate a JSON payload for the upload request, including:
commit message
commit author & email
file contents as a base64 encoded string
(if file exists already) sha1 hash of existing file
make a curl request to upload/update the file!
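A minimal sketch of those steps with curl and jq; the environment variable names and commit message here are my own assumptions, and the real script checks its inputs more carefully:

```shell
# assumptions: GITHUB_TOKEN, REPO (e.g. "alifeee/blog"), FOLDER (e.g. "notes"),
# GIT_NAME, GIT_EMAIL, and FILE (path to the local note) come from the environment
if [ -z "${GITHUB_TOKEN}" ] || [ -z "${FILE}" ]; then
  echo "set GITHUB_TOKEN, REPO, FOLDER, GIT_NAME, GIT_EMAIL, and FILE to upload" >&2
else
  url="https://api.github.com/repos/${REPO}/contents/${FOLDER}/$(basename "${FILE}")"

  # 1. does the file already exist? if so, grab its blob sha (needed to update)
  sha=$(curl -s -H "Authorization: Bearer ${GITHUB_TOKEN}" "${url}" | jq -r '.sha // empty')

  # 2. build the JSON payload; file contents must be base64 encoded
  payload=$(jq -n \
    --arg message "upload $(basename "${FILE}")" \
    --arg name "${GIT_NAME}" --arg email "${GIT_EMAIL}" \
    --arg content "$(base64 -w0 "${FILE}")" \
    --arg sha "${sha}" \
    '{message: $message,
      committer: {name: $name, email: $email},
      content: $content}
     + (if $sha != "" then {sha: $sha} else {} end)')

  # 3. make the upload/update request
  curl -s -X PUT -H "Authorization: Bearer ${GITHUB_TOKEN}" \
    -d "${payload}" "${url}" | jq -r '.content.html_url'
fi
```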
As I use it from inside Obsidian, I use an extension called Obsidian shellcommands, which lets you specify several commands. For this, I specify:
…and when run with a file open, it will upload/update that file to my notes folder on GitHub.
This is maybe a strange way of doing it, as the "source of truth" is now "my Obsidian", and the GitHub is really just a place for the files to live. However, I enjoy it.
I've made the script quite generic as you have to supply most information via environment variables. You can use it to upload an arbitrary file to a specific folder in a specific GitHub repository. Or… you can modify it and do what you want with it!
I'm writing a blog about hitchhiking, which involves a load of .geojson files, which look a bit like this:
The .geojson files are generated from .gpx traces that I exported from OSRM's (Open Source Routing Machine) demo (which, at time of writing, seems to be offline, but I believe it's on https://map.project-osrm.org/), one of the routing engines on OpenStreetMap.
I put in a start and end point, exported the .gpx trace, and then converted it to .geojson with, e.g., ogr2ogr "2.1 Tamworth -> Tibshelf Northbound.geojson" "2.1 Tamworth -> Tibshelf Northbound.gpx" tracks, where ogr2ogr is a command-line tool from sudo apt install gdal-bin which converts geographic data between many formats (I like it a lot, it feels nicer than searching the web for "errr, kml to gpx converter?"). I also then semi-manually added some properties (see how).
Originally, I was combining them into one .geojson file using https://github.com/mapbox/geojson-merge, which is a binary that merges .geojson files, but I decided to use jq because I wanted to do something a bit more complex, which was to create a structure like
FeatureCollection
  Features:
    FeatureCollection
      Features (1.1 Tamworth -> Woodall Northbound, 1.2 Woodall Northbound -> Hull)
    FeatureCollection
      Features (2.1 Tamworth -> Tibshelf Northbound, 2.2 Tibshelf Northbound -> Leeds)
    FeatureCollection
      Features (3.1 Frankley Northbound -> Hilton Northbound, 3.2 Hilton Northbound -> Keele Northbound, 3.3 Keele Northbound -> Liverpool)
I spent a while making a quite-complicated jq query, using variables (an "advanced feature"!) and a reduce statement, but when I completed it, I found out that the above structure is not valid .geojson, so I went back to just having:
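For reference, merging everything into one flat FeatureCollection is a one-liner with jq's slurp mode. A sketch, with tiny hypothetical stand-in files (the real inputs are the exported route legs):

```shell
# two tiny stand-in route files
printf '%s' '{"type":"FeatureCollection","features":[{"type":"Feature","properties":{"name":"leg 1"},"geometry":null}]}' > leg1.geojson
printf '%s' '{"type":"FeatureCollection","features":[{"type":"Feature","properties":{"name":"leg 2"},"geometry":null}]}' > leg2.geojson

# slurp (-s) every file into one array, then concatenate their .features
jq -s '{type: "FeatureCollection", features: ([.[].features] | add)}' \
  leg*.geojson > merged.geojson
```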
I moved to Linux [time ago]. One thing I miss from the Windows file explorer is how easy it was to create text files.
With Nautilus (Pop!_OS' default file browser), you can create templates which appear when you right click in an empty folder (I don't remember where the templates file is and I can't find an obvious way to find out, so... search it yourself), but this doesn't work if you're using nested folders.
That is, I use the tree view a lot in Nautilus, which lets you expand folders instead of opening them (similar to most code editors).
But in this view, you can't "right click on empty space inside a folder" to create a new template file, you can only "right click the folder" (or if it's empty, "right click a strange fake-file called (Empty)").
So, I created a script in /home/alifeee/.local/share/nautilus/scripts called new file (folder script) with this content:
#!/bin/bash
# create new file within folder (only works if used on folder)
# notify-send requires libnotify-bin -> `sudo apt install libnotify-bin`
if [ -z "${1}" ]; then
  notify-send "did not get folder name. use script on folder!"
  exit 1
fi
file="${1}/new_file"
i=0
while [ -f "${file}" ]; do
  i=$(($i+1))
  file="${1}/new_file${i}"
done
touch "${file}"
if [ ! -f "${file}" ]; then
  notify-send "tried to create a new file but it doesn't seem to exist"
else
  notify-send "I think I created file all well! it's ${file}"
fi
Now I can right click on a folder, click "scripts > new file" and have a new file that I can subsequently rename. Sweet.
I sure hope that in future I don't want anything slightly more complicated like creating multiple new files at once...
I was given an old computer. I'd quite like to make a computer to use in my studio, and take my tower PC home to play video games (mainly/only local coop games like Wilmot's Warehouse, Towerfall Ascension, or Unrailed, and occasionally Gloomhaven).
It's not the best, and I'd like to know what parts I would want to replace to make it suit my needs (which are vaguely "can use a modern web browser" without being slow).
By searching the web, I found these commands to collect hardware information for a computer:
uname -a # vague computer information
lscpu # cpu information
df -h # hard drive information
sudo dmidecode -t bios # bios information
free -h # memory (RAM) info
lspci -v | grep VGA -A11 # GPU info (1)
sudo lshw -numeric -C display # GPU info (2)
I also found these commands to benchmark some things:
sudo apt install sysbench glmark2
# benchmark CPU
sysbench --test=cpu run
# benchmark memory
sysbench --test=memory run
# benchmark graphics
glmark2
I put the output of all of these commands into text files for each computer, into a directory that looks like:
I often turn lists of coordinates into a geojson file, so they can be easily shared and viewed on a map. See several examples on https://alifeee.co.uk/maps/.
One thing I wanted to do recently was turn a list of points ("places I've been") into a list of straight lines connecting them, to show routes on a map. I made a script using jq to do this, using the same data from my note about making a geojson file from a CSV.
Effectively, I want to turn these coordinates...
latitude,longitude,description,best part
53.74402,-0.34753,Hull,smallest window in the UK
54.779764,-1.581559,Durham,great cathedral
52.47771,-1.89930,Birmingham,best board game café
53.37827,-1.46230,Sheffield,5 rivers!!!
...but in a .geojson format, so I can view them on a map. Since this turns N items into N - 1 items, it sounds like it's time for a reduce (I like using map, filter, and reduce a lot. They're very satisfying. Some would say I should get [more] into Functional Programming).
So, the jq script to "combine" coordinates is: (hopefully you can vaguely see which bits of it do what)
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": { "description": "Hull", "best part": "smallest window in the UK" },
      "geometry": { "type": "Point", "coordinates": [-0.34753, 53.74402] }
    },
    {
      "type": "Feature",
      "properties": { "description": "Durham", "best part": "great cathedral" },
      "geometry": { "type": "Point", "coordinates": [-1.581559, 54.779764] }
    },
    ...
  ]
}
As with the previous post, making this script took a lot of reading man jq (very well-written) in my terminal, and a lot of searching "how to do X in jq".
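As a sketch of the pairing idea (not necessarily the exact script), assuming a FeatureCollection of Points like the one above is saved as points.geojson, a reduce can join each point to the next:

```shell
# stand-in input: a FeatureCollection of Points (as produced by the CSV script)
cat > points.geojson <<'EOF'
{"type":"FeatureCollection","features":[
  {"type":"Feature","properties":{"description":"Hull"},"geometry":{"type":"Point","coordinates":[-0.34753,53.74402]}},
  {"type":"Feature","properties":{"description":"Durham"},"geometry":{"type":"Point","coordinates":[-1.581559,54.779764]}},
  {"type":"Feature","properties":{"description":"Birmingham"},"geometry":{"type":"Point","coordinates":[-1.89930,52.47771]}}
]}
EOF

# pair each point with the next one: N Points -> N-1 LineStrings
jq '{
  type: "FeatureCollection",
  features: (.features as $pts
    | reduce range(1; $pts | length) as $i ([];
        . + [{
          type: "Feature",
          properties: {name: ($pts[$i-1].properties.description + " -> " + $pts[$i].properties.description)},
          geometry: {type: "LineString",
                     coordinates: [$pts[$i-1].geometry.coordinates, $pts[$i].geometry.coordinates]}
        }]))
}' points.geojson > lines.geojson
```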
I've gotten into a habit with map-making: my favourite format is geojson, and I've found some tools to help me screw around with it, namely https://github.com/pvernier/csv2geojson to create a .geojson file from a .csv, and https://geojson.io/ to quickly and nicely view the geojson. geojson.io can also export as KML (used to import into Google Maps).
In attempting to turn a .geojson file from a list of "Point"s to a list of "LineString"s using jq, I figured I could also generate the .geojson file myself using jq, instead of using the csv2geojson Go program above. This is my (successful) attempt:
First, create a CSV file places.csv with coordinates (latitude and longitude columns) and other information. There are many ways to find coordinates; one is to use https://www.openstreetmap.org/, zoom into somewhere, and copy them from the URL. For example, some places I have lived:
latitude,longitude,description,best part
53.74402,-0.34753,Hull,smallest window in the UK
54.779764,-1.581559,Durham,great cathedral
52.47771,-1.89930,Birmingham,best board game café
53.37827,-1.46230,Sheffield,5 rivers!!!
Then, I spent a while (maybe an hour) crafting this jq script to turn that (or a similar CSV) into a geojson file. Perhaps you can vaguely see which parts of it do what.
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": { "description": "Hull", "best part": "smallest window in the UK" },
      "geometry": { "type": "Point", "coordinates": [-0.34753, 53.74402] }
    },
    {
      "type": "Feature",
      "properties": { "description": "Durham", "best part": "great cathedral" },
      "geometry": { "type": "Point", "coordinates": [-1.581559, 54.779764] }
    },
    ...
  ]
}
...which I can then export into https://geojson.io/, or turn into another format with gdal (e.g., with ogr2ogr places.gpx places.geojson).
It's very satisfying for me to use jq. I will definitely be re-using this script in the future to make .geojson files, as well as some of the jq techniques I learnt while making it.
Mostly for help I used man jq in my terminal, the .geojson proposal for the .geojson structure, and a lot of searching the web for "how to do X using jq".
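For the curious, here is a sketch of one way such a CSV-to-geojson script could look (not the exact script above): read the file as raw text with -R, split the header from the rows, and build each Feature. It naively splits on commas, so it would break on quoted fields:

```shell
# stand-in input, a shortened places.csv
cat > places.csv <<'EOF'
latitude,longitude,description,best part
53.74402,-0.34753,Hull,smallest window in the UK
54.779764,-1.581559,Durham,great cathedral
EOF

# -R: read raw text, -s: slurp the whole file into one string
jq -R -s '
  split("\n") | map(select(length > 0))
  | (.[0] | split(",")) as $headers
  | .[1:]
  | map(split(",") as $row
      | ([$headers, $row] | transpose | map({(.[0]): .[1]}) | add) as $obj
      | {type: "Feature",
         properties: ($obj | del(.latitude, .longitude)),
         geometry: {type: "Point",
                    coordinates: [($obj.longitude | tonumber),
                                  ($obj.latitude | tonumber)]}})
  | {type: "FeatureCollection", features: .}
' places.csv > places.geojson
```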
I like markdown. I use Obsidian a lot, and write a lot in GitHub issues. Something I often do is quote other people's words, which in markdown looks like:
The Met office said
> it will definitely snow tonight
>> like... 100%
I found that I can use the command xclip to get/set my clipboard on Linux, and I use sed a lot for word replacement, so I realised I could copy the text
it will definitely snow tonight
like... 100%
and then run this command in my terminal (xclip gets/sets the clipboard, sed replaces ^ (the start of each line) with > )
xclip -selection c -o | sed "s/^/> /" | xclip -selection c
which would get my clipboard, replace the start of each line with a quote, and set the clipboard, setting the clipboard to:
it will definitely snow tonight
like... 100%
I've set aliases for these commands so I can use them quickly in my terminal as:
alias getclip='xclip -selection c -o'
alias setclip='xclip -selection c'
alias quote='getclip | sed "s/^/> /" | setclip'
but also I created a keyboard shortcut in Gnome, CTRL + SUPER + Q, which will quote my clipboard. I had to set the shortcut to run bash -c 'xclip -selection c -o | sed "s/^/> /" | xclip -selection c' as I don't think pipes sit well in shortcuts.
I like maps. I make maps. Mostly from worse maps or data that is not in map form. See some of mine on https://alifeee.co.uk/maps/.
One thing I've been doing for a map recently is geocoding, which is turning an address (e.g., "Showroom Cinema, Paternoster Row, Sheffield") into latitude/longitude coordinates.
$ ./geocode.sh "Showroom Cinema, Paternoster Row, Sheffield"
throttled... retrying...
throttled... retrying...
got response: {"standard":{"stnumber":"1", "addresst":"Paternoster Row", "statename":"England", "postal":"S1", "region":"England", "prov":"UK", "city":"Sheffield", "countryname":"United Kingdom", "confidence":"0.9"}, "longt":"-1.46544", "alt":{}, "elevation":{}, "latt":"53.37756"}
latitude   longitude  confidence  address          state    city       province  country         post code  alt address  alt state  alt city  alt province  alt country  alt postal
53.37756   -1.46544   0.9         Paternoster Row  England  Sheffield  UK        United Kingdom  S1
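That table is just fields pulled out of the JSON response. A sketch of the extraction, with the response stored in a variable as a stand-in for the real API call:

```shell
# stand-in for a saved geocoder response (same shape as the one above)
res='{"standard":{"stnumber":"1","addresst":"Paternoster Row","statename":"England","postal":"S1","region":"England","prov":"UK","city":"Sheffield","countryname":"United Kingdom","confidence":"0.9"},"longt":"-1.46544","alt":{},"elevation":{},"latt":"53.37756"}'

# pull the interesting fields out as one tab-separated row
echo "${res}" | jq -r '
  [.latt, .longt, .standard.confidence, .standard.addresst,
   .standard.statename, .standard.city, .standard.prov,
   .standard.countryname, .standard.postal] | @tsv'
```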
The results are "ok". They're pretty good for street addresses, but I can see a lot of wrong results. I might try and use another API like OpenStreetMap's or (shudders) Google's.