Latest Posts (20 found)
neilzone 6 days ago

decoded.legal's .onion site no longer has TLS / https

tl;dr: As of 2026-02-23, http://dlegal66uj5u2dvcbrev7vv6fjtwnd4moqu7j6jnd42rmbypv3coigyd.onion no longer offers TLS. It just has Tor’s own transport encryption.

I have run .onion sites for a long time. I like the idea of people being able to access resources within the Tor network, without needing to access the clearweb. These .onion services benefit from Tor’s transport encryption.

For the last four years, the decoded.legal onion site ( http://dlegal66uj5u2dvcbrev7vv6fjtwnd4moqu7j6jnd42rmbypv3coigyd.onion ) also had a “normal” TLS certificate. Setting this up was relatively straightforward. However, renewing it is a manual operation and a bit of a faff, which suggests that I am spoiled by Let’s Encrypt. When the certificate came up for renewal this year, I decided to remove it.

Why? Because I’m just not persuaded that the incremental benefit of having TLS over Tor justifies the faff, or the (low) cost. The site still has Tor’s transport encryption. And, if I’m wrong, and I get loads of complaints (of which I am not really expecting a single one), I can also put it back.

I did it this way: A few weeks ago, I turned off auto-redirection within my apache2 configuration, so that requests to the http onion site would no longer redirect automatically to the https onion site. I also changed the headers sent when someone visits the clearweb site ( https://decoded.legal ), in favour of the http, rather than the https, URL for the .onion site. In my Tor configuration, I commented out the line which I had put in place for port 443, and restarted Tor. For apache2, I removed the symlink to the https config file from the enabled sites, and restarted apache2.
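The specific commands and file names were lost in extraction, so this is only a sketch of the steps described, assuming a typical Debian setup (the vhost name is hypothetical):

```shell
# In /etc/tor/torrc: comment out the hidden service mapping for port 443,
# leaving only the port 80 mapping (lines are illustrative):
#   HiddenServicePort 80 127.0.0.1:80
#   # HiddenServicePort 443 127.0.0.1:443

sudo systemctl restart tor

# For apache2: disable the https onion vhost, then restart
sudo a2dissite onion-https   # site name is hypothetical
sudo systemctl restart apache2
```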

neilzone 1 week ago

Updating my TicWatch to AsteroidOS 2.0

I have a TicWatch Pro 2020, running AsteroidOS. I’ve been using it for about three months now, and I’ve been very pleased with it. Sure, it would be great if the battery life was longer than a day-and-a-bit, but this just means that I need to charge it each night, which is not a major hardship. It does everything I want from a smartwatch, and not really anything more.

AsteroidOS launched AsteroidOS v2.0 a few days ago, and I was keen to give it a try. I installed it by following the instructions for the TicWatch (i.e. a new installation, rather than an “update”), and this worked fine. I had to re-pair the watch to GadgetBridge, and then I rebooted it. When it came up, it connected to my phone, and set the time correctly.

I have a feeling that the update has removed the watch face that I was using, and re-installing it would be a faff, so I just picked one of the default faces. Since I don’t have “Always on” enabled (so I see the TicWatch’s secondary LCD most of the time), this is not a big deal for me. I turned off tilt-to-wake (in Display settings), because I don’t want that; I imagine that it would be waking the watch up too often, increasing power consumption.

The “compass” app is quite cool, giving me easy direction finding on my wrist, but I’m not sure I’ll have much use for it. The heart rate sensor works, showing that I do indeed have a pulse, but again, I don’t really need this day to day.

Perhaps because of my incredibly basic use, most of the user-facing changes are not particularly relevant to me. I’ll be interested to see if the battery life improvements apply to my watch though. A simple, successful update, and one which, thankfully, does not get in the way of me using the watch.

neilzone 2 weeks ago

Moving away from Nextcloud

I have used Nextcloud for a long time. In fact, I have used Nextcloud from before it was Nextcloud - before the fork from Owncloud. And while I have not used many of its features - just sync, calendar, and contacts - I’ve been a very happy user for a long time. Until a year or so ago, at least.

I’ve had a worry, at the back of my mind, for a while, that Nextcloud is trying to do too much. A collaborative document editor. An email client. A voice/video conferencing tool. And so on. I’m sure that, in some contexts, this is amazing, and convenient. For me, as someone who typically prefers a piece of software to do one thing well, it left me a bit uneasy. But that was not, in itself, enough of a reason for me to switch.

A year or so ago, I had problem after problem keeping files in sync. I routinely got error messages about the database (or files; I don’t quite remember) being locked. And, for me, file sync was the mainstay of Nextcloud, and indeed the reason why I started to use it in the first place. I tried all sorts of things, including setting up redis, and trying other memcache options, even though I am the only regular user. I could not get it to sync reliably. And I really did try, using the voluminous logs to try to determine what was going wrong. But I failed. And so I started considering other options. Did I actually need Nextcloud at all?

I’ve moved to Syncthing for syncing, and so far, that has been working fine. It is fast, and appears to be reliable. I should probably write about it at some point.

Using Nextcloud to sync photos from my phone was not too bad, but from Sandra’s iPhone, it did not work well. I have switched to Immich for photo sync / gallery, and I’ve been very happy with it.

For contacts and calendar sync - DAV - I am using Radicale. The main annoyance is that Sandra cannot invite me (or anyone) to appointments using the iOS or macOS calendar. For now, I’ve just given Sandra write access to my calendar, so that she can add events directly, but it is far from ideal. I’ve tried using Radicale’s server-side email functionality, and that is not suitable for my needs, as it sends out far too many emails. But, for now, Radicale is tolerable, even if I might try to find another option at some point.

And that just leaves the directories which I share via Nextcloud and mount in my file browser - stuff that I don’t need on my computer, but still want to access. For that, I’m going back to samba. It works. And so, once I’ve finalised this and tested it and given it some time to bed in, I will turn off the Nextcloud server.
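For the samba part, a minimal share definition might look something like this (the share name, path, and user are illustrative assumptions, not the actual config):

```ini
# /etc/samba/smb.conf (fragment)
[archive]
   comment = Files I want to access but not keep on my computer
   path = /srv/archive
   read only = no
   valid users = neil
```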

neilzone 2 weeks ago

Injecting deno into yt-dlp venv via pipx

Running yt-dlp gave me an error message of: “WARNING: [youtube] No supported JavaScript runtime could be found. Only deno is enabled by default; to use another runtime add --js-runtimes RUNTIME[:PATH] to your command/config. YouTube extraction without a JS runtime has been deprecated, and some formats may be missing. See https://github.com/yt-dlp/yt-dlp/wiki/EJS for details on installing one”.

This baffled me for a bit, as it suggested that there was a runtime enabled by default (deno), and yet there was no supported JavaScript runtime. I have yt-dlp installed via pipx (so it lives in its own venv). I did not know about pipx’s “inject” command, which installs an additional package into an existing pipx-managed venv. And bingo. I now have an installed, supported, JavaScript runtime.
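The actual commands did not survive extraction. Based on the post title, the fix was presumably along these lines (assuming the `deno` package on PyPI, which ships the Deno binary, is the right thing to inject):

```shell
# yt-dlp lives in its own pipx-managed venv
pipx install yt-dlp

# Inject deno into that same venv, so yt-dlp can find a JS runtime
pipx inject yt-dlp deno

# Confirm it is now present in the venv
pipx runpip yt-dlp show deno
```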

neilzone 4 weeks ago

Flo Mask Pro thoughts

When I first thought about going to FOSDEM, the thing at the top of my “to purchase” list was a new face mask / respirator. While I wear a mask on public transport, and in busy indoor venues, I’d heard that FOSDEM was going to be incredibly busy, with very poor air quality. And, rightly or wrongly, I felt that it would be sensible to get a better mask.

I received several recommendations from mask-savvy fedizens, and I picked the Flo Mask Pro. It is not cheap, especially once one factors in international delivery, and it requires (well, says that it requires) the filter to be replaced daily. I don’t know how much that is “we want to sell you our custom filters”, but I did change it this morning, before a second day’s use.

I will never be in a position to say how effective it was, or whether it was more effective than my normal masks. I just can’t assess that reliably. What I can say is:

- it is very comfortable, and I wore it for several hours at a time. It is heavier, but the two straps (sort of three, since the top strap splits in two) hold it in place nicely. It does not pull on my ears like my normal masks.
- I have a short beard and, as far as I could tell, it still sealed around my mouth and nose fine. Perhaps it would be better if I were clean shaven, but I am not.
- I can talk through it. I did a few tests with Sandra before I came (essentially, to make sure that I knew how to put it on correctly, and how to change the filter, before I arrived, but also to see what talking was like), and she said that I was clearly audible. I had no problem chatting to people while wearing it, in the quieter indoor areas, but there was no point at all in me trying to have a conversation without shouting in the busier areas, mask or no mask.
- changing the filter is easy, but there is a cost to the filters. It took me about a minute to change the filter this morning, and that was with me having to remind myself how to do it, since I tested it over Christmas and didn’t bring the instructions with me.
- yes, the shape makes you look a bit like a scifi soldier. I think it is the curves in the design. It did not bother me - I’m trying to be safe, not a fashion icon - but it might be a botheration factor for some.
- if the manufacturer stops selling the filters, I guess that it is finished, unless someone else starts selling them.
- it is relatively bulky. When I was not wearing it, I put it back in its small cloth bag, and then into my rucksack. It does fit into a hoodie’s pouch, though it is a bit of a stretch (or I need a bigger hoodie), and into the pocket of my winter coat. I might wear it when I travel back on the Eurostar, in preference to my normal masks, simply because it is more comfortable.
- nobody gave me any grief whatsoever for wearing it. One person commented pleasantly - another person wearing a mask, saying that it was nice to see someone else wearing a mask - and that was it.

Ultimately, yes, it is expensive, but I rather liked it. And if it helps lessen the risk of me getting sick, from whatever it might be, then that is money well spent. But, annoyingly, I have no way of assessing that.

neilzone 4 weeks ago

Reflections on my first day (ever) at FOSDEM

Yesterday was my first day ever at FOSDEM, a large conference about Free and open source software, held in Brussels. It was… okay.

Part of me feels that, rather than sitting in my hotel room and catching up on bits and pieces, I should be rushing back today. I probably will go back for a bit today, mainly because I’ve promised to try to catch up with some people, but I don’t feel particularly compelled to do so. The smart idea would be to wait until everyone has recovered, and do video calls from home, frankly.

FOSDEM is run by volunteers. They do an amazing job. So, while my conclusion is probably that FOSDEM is not for me, that’s no slight on the effort and time put in by the people who make it happen. Some of my feelings about FOSDEM stem from me not liking busy environments / lots of people. And some, I’m sure, come from having never been to a conference quite like this before, with different norms / expected behaviours.

I enjoyed meeting some new people, and catching up with old friends, and meeting “away from keyboard” people with whom I normally chat online. That, for me, was the highlight. I did not get to talk with everyone with whom I wanted to talk, which is a shame, but is perhaps inevitable…

IMHO, there are simply too many people for the venue. The room in which I spent most time was uncomfortably hot, and I was wearing a tshirt. Thank goodness that I was wearing a decent respirator as, even with the windows open - and there is no doubt that the people running the room were doing the best that they could to improve ventilation - air quality was pretty awful. (It was not possible to keep the doors open - the noise from the corridor was loud and continuous, and disruptive to the talks.)

I wanted to have a look at the stalls, and chat with the stall holders, but on the three trips I made to different buildings to do this, there were simply too many people - stalls were three rows deep with people trying to get to them, and it was incredibly loud.

I had brought food in with me, and I’m pleased that I did, as queues for the food trucks - of which there were quite a few - were significant.

I had most conversations with people outside. I found this more comfortable, both in terms of being able to hear what people were saying, and also in terms of temperature and air quality. We were, IMHO, extremely lucky to have good weather yesterday. This meant that people could be outside for at least some of the time. Had it been raining, or too cold to be outside for extended periods, the experience would have been even worse.

There are lots of talks running in parallel. This is not a surprise in itself, but FOSDEM is the first conference that I have ever attended (as far as I can recall, anyway) where there is no consistent, common start and end time. That means that people get up to leave halfway through one talk, or walk in midway through a talk. I saw one poor speaker say “Any questions?”, and half the room promptly stood up and walked out, leaving those of us interested in the Q&A to wait through the banging of desks and seats, and people moving about. I understand that this is completely normal, and basically expected behaviour, but wow, I found that disruptive.

One talk rolls into the next talk, and so on. There is no time to get between talks. And rooms can be several minutes apart. I suspect that the rolling nature is to support as many talks as possible, which obviously has a value, but it also comes at a cost.

All(?) talks are live streamed and recorded. I suspect that I would have had a better experience, as an audience member, just watching the talks from home, either live or as and when I wanted. But, as a speaker, I really value the interactivity with the audience, and seeing, in real time, how they are responding to points that I make. So, from a speaker point of view - well, from mine, anyway - it is preferable to have people in the room.

Perhaps it is just a “me” thing, but I could not connect to the Wi-Fi on either of the devices that I had brought. I did not do extensive debugging, but I could not get an IP address on either network, IPv4 or IPv6. Lots of people said the same, and were using their phone’s cellular connection… which was rather slow, probably because of the number of people using it. Again, no slight on the people who volunteered to run the WLAN but, for me, it was unusable, and a tech conference without good connectivity felt… odd. If I’d wanted to live stream a talk while I was on campus, I don’t think that I’d have been able to do so.

I gave a talk. It was fine, in the circumstances. I was very uncomfortable having to send in my presentation in advance, and present from Someone Else’s Computer, with no opportunity to check that everything was working and looking as it should. I get why the organisers wanted to do it that way and, fortunately, it did all work.

I like to try things. And I tried FOSDEM. I’m not sure that it is for me, though. I’m still pondering heading over there now, but I am not particularly sure why, other than because I wanted to try to buy some t-shirts for some FOSS projects that I enjoy and want to support. But I’d have to deal with the scrum which is the stands. There are people with whom I’d like to chat, but I suspect that I’d be better off waiting until I got home. Right now - and perhaps my feelings will change - I’m not sure that I’d attend again.

neilzone 1 month ago

I played my first 'battle sim' Airsoft game today, and I loved it

I have played Airsoft at Red Alert, in Newbury, quite a few times over the past months, so I know the site, and the staff, pretty well, and I greatly enjoy my time there. Until today, I had only played their normal “skirmish” days, as well as their summer evening target shooting sessions. Today, I played “battlesim lite”, which is essentially a step up from a normal Sunday skirmish. Unlike a “proper” battle / military simulation game, today there were no ammo limits, no weapon restrictions (beyond the normal safety limits), and no required costumes.

There were two games: one before lunch, and one after. On a normal skirmish day, there are three shorter games before lunch and two or three shorter games in the afternoon. I liked the longer games. For a start, there is less sitting around between games. (I don’t take long to sort out my stuff, so I’ve taken to bringing my ereader with me to normal Sunday skirmish days.) But more than that, the longer games just… worked. The format was not complicated: find some cards (smaller than a playing card, hidden around a 34 acre site; there was a lot of walking today), trade those cards for a “bomb” - a plastic case - then take that “bomb” to one of a number of designated places.

I took out six magazines with me, as I was just not sure how many rounds I’d fire. I used two and a bit in the first game, and nearly two in the second game. So I could probably have got away with fewer magazines (and thus had a slightly lighter load-out).

It was fun, and I very much look forward to playing a battle sim again. I might try the “proper” battle sim later in the year too. I also learned how to use the HPA refill point, which was one of my goals for today, as I’m getting very tempted by an HPA-powered RIF.

Some points for me for next time:

- Leaving a water bottle at the spawn point, or having a fancy water carrier with a hose, would have been a good idea. There’s water in the safe zone, and it was not a particular problem for me today, mainly because of the weather, but it would have been sensible to take water into the game with me.
- The “new fort” at Red Alert is a tough spawn point. If it is going to be a spawn point, try to pick it first (i.e. in the morning), so that, after lunch, when legs are more tired, you are starting on the other side, mainly heading downhill.
- I still need to sort out my glasses getting fogged up. I might have to try an “active cooling” unit, as it is a pain when I cannot see properly.

neilzone 1 month ago

Testing Radicale, a self-hosted FOSS CalDAV and CardDAV Server

I am currently using Nextcloud for calendar and contacts syncing. Since I am looking to move away from Nextcloud, I need to find an alternative means of doing caldav and carddav. I’ve had a number of recommendations for Radicale, so I am giving it a go.

I installed Radicale from the Debian stable package. Yes, I get an older version of Radicale (3.5.3), but it means everything is managed through my package manager. I tried - because of the import problems I had, below - using the pip version, but since it did not resolve the problems, I decided to go with the Debian version instead.

The config file is well documented, and easy to use. Other than setting up users, the only other thing that I changed was the logging settings, while I was trying to resolve my import problems. I put it behind a reverse proxy; the official documentation worked fine, but note that what they provide is not, in itself, a valid configuration.

I exported my calendars using Thunderbird, and also via the Nextcloud web interface. I could import neither into Radicale, whether via Thunderbird, the Radicale web UI, or curl. The web interface to Radicale gave no useful error messages, but the debug log (available once I’d adjusted the config file to enable all the options for detailed debug logging) was useful, indicating the specific problematic appointment UIDs.

Some of the calendar entries were invalid. They were either (a) Microsoft-originated invitations which had been updated after sending, or (b) invitations for flights from British Airways, from years ago. I tried to fix them, using the validator at icalendar.org to work out what was wrong, but I struggled to do it. In the end, I deleted the problematic entries from the combined .ics file (one by one, by hand, using vim), until the import worked. I did not bother replacing the entries for old flights, or for old meetings (annoying, but oh well), but I had two appointments in the future which I needed to preserve. I found that, while I could not import them, once I had configured the calendars on my phone (via DAVx), I could copy the appointments from my existing calendar to my new calendar. And those worked just fine, so I don’t know why I could not import them.

I need to decide whether it is a deal breaker or not that Radicale does not offer calendar sharing. I am used to being able to see Sandra’s calendars, and her mine. There is no way to do this within Radicale. There appears to be a fudge workaround, whereby I can symlink my calendars into Sandra’s directory, and hers into mine, and then we can each add the other’s calendars. This should work for our needs (and it is a “should”, because I’ve yet to test it), but it does mean that we can each add and delete entries in the other person’s calendar, which is not ideal. It might still be a deal breaker.
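The symlink workaround could be sketched like this, assuming Radicale’s default filesystem storage backend (the storage path, usernames, and calendar names are all assumptions):

```shell
# With filesystem storage, Radicale keeps each user's collections under
# collection-root/<user>/. Symlinking one user's calendar into another
# user's tree makes it appear as one of that user's collections.
ROOT=/var/lib/radicale/collections/collection-root   # path is an assumption

sudo ln -s "$ROOT/neil/personal"   "$ROOT/sandra/neil-personal"
sudo ln -s "$ROOT/sandra/personal" "$ROOT/neil/sandra-personal"
```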

neilzone 1 month ago

Enabling a user's processes to continue after the user disconnects their ssh session, using loginctl enable-linger

I set up Immich over the weekend, using rootless podman. An annoyance was that, when I disconnected from ssh, podman stopped running too. On an interim basis, I fudged it by opening a new session, and running podman within that. The “correct” solution, as far as I can tell, is to use loginctl enable-linger for that user. Having done this, I can now disconnect from ssh, and the podman containers continue to run.
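A minimal sketch of the command in question (the username is illustrative):

```shell
# Enable lingering, so this user's processes/services survive logout
sudo loginctl enable-linger neil

# Confirm it took effect (should report Linger=yes)
loginctl show-user neil --property=Linger
```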

neilzone 1 month ago

My initial thoughts about Immich, a self-hosted photo gallery

I am looking to move away from Nextcloud (another blogpost at another time), and one of the things that I wanted to replace was a tool for automatically uploading photos from my phone (Graphene OS) and Sandra’s phone (an iPhone). At the moment, they are uploaded automatically to Nextcloud, although I’m not sure how well that works on Sandra’s iPhone. I also take a backup, using rsync to copy photos from the phones to an external drive, when I remember, which frankly is not often enough. But I did one of these backups ahead of installing Immich, which was useful. Lastly, I have a load of photos from the last 20 years, including our wedding photos, saved onto two different drives, and I wanted to add those to the hosted backup.

I had heard a lot about Immich, with happy users, so I decided to give it a go. I did not really look around for alternatives. I already had a machine suitable to run Immich, so that was straightforward.

Immich is, to my slight annoyance, only available via docker. I’m not a fan of docker, as I don’t understand it well enough. I can just about use it, but if there is a problem, I don’t have many skills beyond “bring it down then bring it up again”. I certainly haven’t mastered docker’s networking. But anyway, it is available via docker. So, naturally, I used podman instead. I followed the official instructions, substituting “podman” for “docker”, and it worked fine. Since I am using an external drive to store the photo library, I configured that in the environment file. And there we go. Immich was running.

For TLS termination, I use nginx, which I already had running on that machine anyway. The official suggested nginx configuration worked for http but, for whatever reason, I could not get it to work with certbot to sort out a TLS certificate. I didn’t spend too much time debugging this, as I was not convinced I’d be able to get it to work. Instead, I started from the default nginx config, then used certbot, then put bits back in from the Immich-suggested config. I tested it, to make sure I hadn’t somehow messed it up, and it was working, so fingers crossed.

I set up an admin user, and then two normal users, for Sandra and me. I didn’t do any other specific configuration. Immich does not offer multi-factor authentication, which I find surprising, but ho hum; since it is an internal-only service, it is not a big deal for me. Perhaps I need to make Authelia or similar a future project.

I set up the Android app, via F-Droid, and the iPhone app from the App Store. I decided to upload my Camera library without getting into albums, so there was no particular configuration other than choosing the Camera directory. Uploading ~3000 photos from my phone took a while, but it worked fine. For Sandra’s iPhone, I set it up, and set up the directory to sync, but I didn’t wait for it to upload. Instead - and for the other photos which I wanted to upload to Immich - I used the backup copy I had on an external drive.

I used the command line tool immich-go for this. After creating both a user API key and an admin API key, I just left it to do its thing. I had to restart it a few times (and switch user API key for Sandra’s photos, to upload them to her account), but after a few runs, I had uploaded everything. I managed to upload all of my photos before I went to bed last night and, this morning, the server had created all the thumbnails, and done the facial recognition (not sure how I feel about this). I expect Sandra’s to be ready tomorrow.

I have quite a few photos without a location, because they were taken many years ago on an old-fashioned digital camera. Batch replacement of location is reasonably straightforward, and I have not gone for the exact specific location, but at least the right town / city. I have fixed the date on some photos, but I need to go through more of them. 1 January is a very popular date. As far as I know, there is no way to fix incorrect orientation right now, but I think that that is coming pretty soon.

So far, I am impressed. Despite docker, it was pretty straightforward to get running, and I am willing to chalk my problems with nginx down to me. I will probably use the web interface rather than the app, and that seems pretty good. Sandra will probably use the iOS app and, again, that seems fine. As long as it uploads photos automatically - which is something to test / keep an eye on - I suspect that we will be happy with it.
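For reference, the shape of a reverse proxy config for Immich looks something like this (the server name is illustrative; port 2283 is Immich’s default, and the large client_max_body_size follows the official suggestion to allow big uploads):

```nginx
server {
    listen 80;
    server_name immich.example.internal;   # illustrative

    # Allow large photo/video uploads
    client_max_body_size 50000M;

    location / {
        proxy_pass http://127.0.0.1:2283;  # Immich's default port
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

        # Websocket support, used by the Immich web UI
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```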

neilzone 1 month ago

Dealing with apt's warning 'Policy will reject signature within a year, see --audit for details'

I’ve noticed that an increasing number of apt updates result in the warning “Policy will reject signature within a year, see --audit for details”. Running apt with --audit, as suggested, results in more detail about the key in question.

My understanding is that - as that output suggests - there has been a change in key-handling policy by apt, and that keys which were previously acceptable are (or, rather, will be) no longer acceptable by default. The “correct” way of solving this is for the repository provider to update their signing key to something which is compliant. However, I have no control over what a repository provider does, or when they will do it. For instance, the warning above suggests to me that I will have a problem on 1 February 2026, so under a month away.

I can suppress this warning - and tell apt to accept the key - by adding an option on the command line. Or, to avoid having to add that each time, I can add a slightly-tweaked version of it to a file in apt’s configuration directory. Hopefully, though, repository providers will update their keys (which will then need re-importing).
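The specific option did not survive extraction. As a sketch only - the option name and algorithm list here are assumptions, so check apt-secure(8) and apt.conf(5) on your system - the override looks something like this:

```shell
# One-off, on the command line (option name and value are assumptions):
sudo apt -o 'APT::Key::Assert-Pubkey-Algo=>=rsa1024,ed25519,ed448' update

# Or persistently, via a file in /etc/apt/apt.conf.d/:
echo 'APT::Key::Assert-Pubkey-Algo ">=rsa1024,ed25519,ed448";' |
  sudo tee /etc/apt/apt.conf.d/99-weak-keys
```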

neilzone 1 month ago

Removing .m4v files from my media server when an equivalent .mp4 file exists

For some reason, some of the directories on my media server have both .mp4 and .m4v versions of the same thing. I blame Past Neil for this. To save space, I wanted to delete the duplicate .m4v files. “Duplicate” here just means that the files have the same name but different extensions; this does not do anything clever/safe in terms of checking that the .mp4 file is valid, is the right length, etc. I was willing to take that risk.

The approach was: build two lists of filenames from the directories in question, one with the .mp4 files and one with the .m4v files; strip the extensions, because, to compare the lists, these need to be removed, to make the strings identical; compare the lists to find the files which exist as both .mp4 and .m4v; and so end up with a list of the .m4v files which are to be deleted.

It might be possible to do all of this with a one-liner, of course. But this worked for me, and gave me a chance to eyeball the list of files at each point. I saved about 100GB of space :)
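The original commands did not survive here, so this is a sketch of the approach rather than the author’s actual commands, demonstrated in a scratch directory with illustrative filenames:

```shell
# Demo in a scratch directory with sample files
cd "$(mktemp -d)"
touch a.mp4 a.m4v b.m4v   # a.* is a duplicate pair; b.m4v has no .mp4 twin

# Build name lists with extensions stripped, so the strings are comparable
find . -type f -name '*.mp4' | sed 's/\.mp4$//' | sort > mp4.txt
find . -type f -name '*.m4v' | sed 's/\.m4v$//' | sort > m4v.txt

# Names present in BOTH lists: these .m4v files are the duplicates
comm -12 mp4.txt m4v.txt > both.txt

# Eyeball both.txt, then delete the duplicate .m4v files
while IFS= read -r name; do rm -v -- "$name.m4v"; done < both.txt

ls *.m4v   # only b.m4v should remain
```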

neilzone 1 month ago

yt-dlp's --download-archive flag

Today, I learned about the --download-archive flag for yt-dlp. As the readme explains, when the flag is used, yt-dlp downloads the files as usual, and also adds the ID of each downloaded file to archive.txt (in a format which is probably specific to archiving from YouTube).

This means that, if the download stops working for whatever reason, you have a list of the files from the playlist which have been downloaded already. When you re-run the command, yt-dlp will not attempt to download the files whose IDs are already listed in archive.txt. Very handy!

But what if you have already started downloading a playlist, and did not use the flag? You can create a suitable file from a listing of the directory of your downloads, although exactly how you do this will depend on your preferences for interacting with a computer. In the directory of the downloaded files, I generated a file containing the list of downloaded files, and then used vim’s integrated search-and-replace function to get the format right. (Yes, I could have done it with sed or awk, without vim. I did not.)
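The stripped examples were probably along these lines (the playlist URL is illustrative, and the retrofit step assumes yt-dlp’s default output template of “Title [id].ext”):

```shell
# Usage (not run here): record each completed video's ID in archive.txt
#   yt-dlp --download-archive archive.txt "https://www.youtube.com/playlist?list=EXAMPLE"
# archive.txt lines look like: "youtube <video-id>"

# Retrofitting an archive from existing filenames: if files use the default
# output template "Title [id].ext", the IDs can be pulled out with sed
# (the template is an assumption):
echo 'Some Video [dQw4w9WgXcQ].mp4' |
  sed -E 's/^.*\[([A-Za-z0-9_-]{11})\]\.[^.]+$/youtube \1/'
```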

neilzone 1 month ago

From proxmox to Raspberry Pi 4s again

I’ve been experimenting with proxmox for a few months now. And it was going pretty well, until Something Happened (as yet undiagnosed), which means that I cannot access the proxmox interface or ssh in. The containers on it are still running though. While I’d like to fix it - if only to understand why it crashed - I decided to move the things I cared about back onto Raspberry Pis (because that’s the hardware that I had to hand) for the time being. Restoring the services was easy, thanks to restic backups. This blog is now back onto a Pi, so if you can see this blogpost, it is working. I have one more service to move, and then I can start exploring the proxmox issue. Annoyingly, I have most of the hardware to run up a second instance of proxmox, with a plan for some basic level of failover, but I had not got around to setting it up.
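Restoring a service from restic might look something like this (the repository location and service path are illustrative assumptions, not the actual setup):

```shell
# List available snapshots in the backup repository
restic -r /srv/backups/restic snapshots

# Restore the most recent snapshot of a service's data to a staging area,
# ready to copy onto the Pi
restic -r /srv/backups/restic restore latest \
    --include /var/lib/myservice --target /tmp/restore
```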

neilzone 2 months ago

#FreeSoftwareAdvent 2025

Here are all the toots I posted for this year’s #FreeSoftwareAdvent.

Today’s #FreeSoftwareAdvent - 24 days of toots about Free software that I value - is off to a strong start, since you get not one, not two, not three, but four pieces of software:

yt-dlp: command line tools for downloading from various online sources: https://github.com/yt-dlp/yt-dlp
Handbrake: ripping DVDs you’ve bought: https://handbrake.fr/
SoundJuicer: ripping CDs you’ve bought: https://gitlab.gnome.org/GNOME/sound-juicer
Libation: downloading audiobooks you’ve bought: https://github.com/rmcrackan/Libation

Each is a useful starting point for building your own media library.

Today’s #FreeSoftwareAdvent is Libreboot. Libreboot is a Free and Open Source BIOS/UEFI boot firmware. It is much to my chagrin that I do not have a Free software BIOS/UEFI on my machines :( https://libreboot.org/

Today’s #FreeSoftwareAdvent is Cryptomator. I use Cryptomator for end-to-end encryption of files which I store/sync on Nextcloud. It means that, if my server were compromised, all the attacker would get is encrypted objects. (In my case, my own instance of Nextcloud, which is not publicly accessible, so this might be overkill, but there we go…) https://cryptomator.org/

Today’s #FreeSoftwareAdvent is tmux, the terminal multiplexer. I know that tmux will be second nature to many of you, but I’ve been gradually learning it over the last year or so ( https://neilzone.co.uk/2025/06/more-tinkering-with-tmux/ ), and I find it invaluable. Although I had used screen for years, I wish that I had known about tmux sooner. I use tmux both on local machines, for a convenient terminal environment (from which I can do an awful lot of what I want to do, computing-wise), and also on remote machines for persistent / recoverable terminal sessions. https://github.com/tmux/tmux

Today’s #FreeSoftwareAdvent is the Lyrion Music Server. While there are plenty of great FOSS media services out there, Lyrion solves one particular problem for me: it breathes life back into old Logitech Squeezebox units. What’s not to love about a multi-room music setup using Free software and reusing old hardware! :) https://lyrion.org/

Today’s #FreeSoftwareAdvent is forgejo, a self-hostable code forge. Although I use a headless git server for some things, I’ve been happily experimenting with a public, self-hosted instance of forgejo this year. I look forward to federation functionality in due course! https://forgejo.org/

Today’s #FreeSoftwareAdvent is Vaultwarden. Vaultwarden is a readily self-hosted alternative implementation of Bitwarden, a password / credential manager. It works with the Bitwarden clients. Sandra and I use it to share passwords with each other, as well as keeping our own separate vaults in sync across multiple devices. https://github.com/dani-garcia/vaultwarden

Today’s #FreeSoftwareAdvent is all about XMPP. Snikket ( https://snikket.org/ ) is an easy-to-install, and easy-to-administer, XMPP server. It is designed for families and other small groups. The apps for Android and iOS are great. Dino ( https://dino.im/ ) is my desktop XMPP client of choice. Profanity ( https://profanity-im.github.io/ ) is a terminal / console XMPP client, which is incredibly convenient. Why not have a fun festive project of setting up an XMPP-based chat server for you and your family and friends?

Today’s #FreeSoftwareAdvent is a double whammy of shell-based “checking” utilities. linkchecker ( https://linux.die.net/man/1/linkchecker ; probably get it from your distro) is a tool for checking websites (recursively, as you wish) for broken links. shellcheck ( https://www.shellcheck.net/ ; available in browser or as a terminal tool) is a tool for checking bash and other shell scripts for bugs.

Today’s #FreeSoftwareAdvent (better late than never) is “WG Tunnel”, an application for Android for bringing up and tearing down WireGuard tunnels. I use it to keep a WireGuard tunnel running all the time on my phone. I don’t do anything fancy with it, but if one wanted to use different tunnels for different things, or when connected to different networks, or wanted a SOCKS5 proxy for other apps, it is definitely worth a look. https://wgtunnel.com/ (And available via a custom F-Droid repo.)

Today’s #FreeSoftwareAdvent is another bundle, of four amazing terminal utilities for keeping organised:

mutt, an incredibly versatile email client: http://www.mutt.org/
vdirsyncer, a tool for synchronising calendars and address books: https://vdirsyncer.pimutils.org/en/stable/
khard, for interacting with contacts (vCards): https://github.com/lucc/khard
khal, for interacting with calendars: https://github.com/pimutils/khal

Use with tmux to get all you want to see on one screen / via one command :)

Today’s #FreeSoftwareAdvent is LibreOffice. I know that I’ve included this in previous #FreeSoftwareAdvent toots, but I use LibreOffice, particularly Writer, every day. Better still, this one isn’t just for Linux users, so if you are thinking about moving away from your Microsoft or Apple office suite, you can give LibreOffice a try, for free, alongside whatever you use currently, and give it a good go before you make a decision.

Today’s #FreeSoftwareAdvent is Kimai, a self-hostable time tracking tool, which I use to keep records of all the legal work that I do, as well as time spent on committees etc. It has excellent, and flexible, reporting options - which I use for generating timesheets each month - and there is an integrated invoicing tool (which I do not use). https://www.kimai.org/

Today’s #FreeSoftwareAdvent is another one which might appeal for #freelance work: InvoicePlane, which is - you’ve guessed it - an invoicing tool. I have used InvoicePlane for decoded.legal’s invoicing for coming up on ten years now, and it has been great. Dead easy to back up and restore too, which is always welcome. I think that it is designed to be exposed to customers (i.e. customers interact with it), but I do not do this. Kimai (yesterday’s entry) also offers invoicing, but for whatever reason - I don’t remember - I went with a separate tool when I set things up. (I’ve also heard good things about InvoiceNinja, but I have not used it myself.) https://www.invoiceplane.com/

Today’s #FreeSoftwareAdvent is EspoCRM, a self-hostable customer relationship management tool. I use it to keep track of my clients, their matters, and the tasks for each matter, and I’ve integrated various regulatory compliance requirements into different steps / activities. It took me a fair amount of time to customise to my needs, and I’m really pleased with it. https://www.espocrm.com/

Today’s #FreeSoftwareAdvent is espanso, a text expansion tool. You specify a trigger word/phrase and a replacement, and, when you type that trigger, espanso replaces it automatically with the replacement text. So, for instance, I have a trigger for “.date”, and whenever I type that, it replaces it with the current date (“2025-12-16”). It also has a “forms” tool, so I can trigger a form, fill it in, and then espanso puts in all the content - so great for structured paragraphs with placeholders, basically. I would be absolutely lost without espanso, as it saves me so many keystrokes. https://espanso.org/

Today’s #FreeSoftwareAdvent is the text-mode web browser, links. Because, sometimes, you just want to be able to browse without all the cruft of the modern web. You don’t have to stop using your current GUI browser to give links, or another text-mode browser, a try. I find browsing with links more relaxing, with fewer distractions and annoyances, although it is not a good choice for some sites. Bonus: no-one is trying to force AI into it. https://links.twibright.com/user_en.html

Today’s #FreeSoftwareAdvent is reveal.js, my tool of choice for writing presentations. I write the content in Markdown, and… that’s it. Thanks to CSS, tada, I have a perfectly-formatted, ready-to-go, presentation. It has speaker view, allows for easy PDF exports, and, because it is HTML, CSS, and JavaScript, you can upload the presentation to your website, so that people can click/swipe through it. Plus, no proprietary / vendor lock-in format. I have used reveal.js for a few years now, and I absolutely love it. I can’t imagine faffing around with slides these days. https://revealjs.com/

Today’s #FreeSoftwareAdvent is perhaps a very obvious one: Mastodon. I’ve used Mastodon for a reasonably long time; this account is coming up 8 years old. I run my own instance using the glitch-soc fork ( https://glitch-soc.github.io/docs/ ), and it is my only social networking presence. “Vanilla” Mastodon instructions are here: https://docs.joinmastodon.org/admin/install/ If you want your own instance but don’t want to host it, a few places offer managed hosting.

Today’s #FreeSoftwareAdvent is greenbone/OpenVAS, a vulnerability scanner. I use it to help me spot (mis)configuration issues, out of date packages (even with automatic upgrades), and vulnerabilities. It is, unfortunately, a bit of a pain to install / get working, but this script ( https://github.com/martinboller/gse ) is useful, if you don’t mind sticking (for now, anyway) with Debian Bookworm. https://greenbone.github.io/docs/latest/index.html

Today’s #FreeSoftwareAdvent is F-Droid, a Free software app store for Android. I use the GrapheneOS flavour of Android, and F-Droid is my app installation tool of choice.

F-Droid is an app, which provides an app store interface
F-Droid, the project, also runs the default repository used by the F-Droid app
But anyone can run their own repository, and users can add that repo and install apps from there (e.g. Bitwarden does this)

https://f-droid.org/ / @[email protected]

Today’s #FreeSoftwareAdvent is a set of extensions which I find particularly useful.

For Thunderbird:

Quick Folder Move lets me file email (including into folders in different accounts) via the keyboard. I use this goodness knows how many tens of times a day. https://services.addons.thunderbird.net/en-US/thunderbird/addon/quick-folder-move/

For Firefox:

Consent-o-matic automatically suppresses malicious compliance cookie banners: https://addons.mozilla.org/en-GB/firefox/addon/consent-o-matic/
uBlock Origin blocks all sorts of problematic stuff, and helps make for a much nicer browsing experience: https://addons.mozilla.org/en-GB/firefox/addon/ublock-origin/
Recipe Filter foregrounds the actual recipe on recipe pages: https://addons.mozilla.org/en-GB/firefox/addon/recipe-filter/
Shut Up is a comments blocker, which makes the web far more pleasant, with fewer random people’s opinions: https://addons.mozilla.org/en-GB/firefox/addon/shut-up-comment-blocker/

Today’s #FreeSoftwareAdvent is Elementary OS. If I were looking for a first-time Linux operating system for a friend or family member, I’d happily give Elementary OS a try, as I have heard nothing but good things about it. Plus, supporting small projects doing impressive things FTW. https://elementary.io/ / @[email protected]

Today’s #FreeSoftwareAdvent is paperless-ngx, a key part of keeping us, well, paperless. It is a document management tool, but I use it in a very basic way: it is hooked up to our scanner, and anything we scan gets automatically converted to PDF and OCRd. We then shred the paper. It is particularly useful around tax return time, as it means I can easily get the information I need from stuff which people have posted to us. https://github.com/paperless-ngx/paperless-ngx
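The “.date” trigger described in the espanso entry above could be defined in a match file along these lines, following espanso’s documented match syntax; the file path and variable name are my own illustrative choices:

```yaml
# ~/.config/espanso/match/base.yml
matches:
  # Typing ".date" expands to the current date, e.g. "2025-12-16"
  - trigger: ".date"
    replace: "{{today}}"
    vars:
      - name: today
        type: date
        params:
          format: "%Y-%m-%d"
```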

neilzone 2 months ago

Bringing up a WireGuard tunnel automatically on boot using nmcli

I use WireGuard as my VPN protocol of choice. I want an always-on connection, from my laptop, to one of my WireGuard servers. To do this using nmcli, first list all connections, then modify the relevant connection to enable autoconnect. This will bring the WireGuard tunnel up on boot.
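The two nmcli steps might look like this; “wg0” is an assumed connection name, so substitute whatever your WireGuard connection is called:

```shell
# List all NetworkManager connections and note the WireGuard one's name
nmcli connection show

# "wg0" is an assumed connection name -- use your own
nmcli connection modify wg0 connection.autoconnect yes

# Confirm the setting took effect
nmcli -f connection.autoconnect connection show wg0
```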

neilzone 2 months ago

Calibre, AI, and one size not fitting all

I have used the eBook management tool Calibre for many years, supplementing its core functionality (of which I probably use only a tiny amount) with various plug-ins.

In the release notes for 8.16.2, Calibre introduced:

Allow asking AI questions about any book in your calibre library. Right click the “View” button and choose “Discuss selected book(s) with AI”
AI: Allow asking AI what book to read next by right clicking on a book and using the “Similar books” menu
AI: Add a new backend for “LM Studio” which allows running various AI models locally

I have found the various reactions to the introduction of “AI” most interesting. Some have been accepting, even supportive, liking the idea of being able to search using natural language. Some have welcomed the support for “local” AI, running on a user’s machine, without some of the privacy implications of the as-a-service tools. Others have been of the view that there’s no harm done anyway, since it requires enabling, without which it just appears in a few menus.

Others have been… far less supportive. I’ve seen concerns about “normalising” AI, and concerns from creators who have been burnt by the appropriation of their works for training data. Someone has created a fork of Calibre called Clbre (“because the AI is stripped out”). If nothing else, one has to admire the naming brilliance.

Honestly, I wasn’t particularly keen either. I don’t really see why one would want “AI” in calibre, but that is quite a selfish perspective, for a number of reasons: others may well have use cases for which “AI” is beneficial to them (as opposed to my relatively simple use of calibre, and my preference for tools which do one thing, well), and, frankly, I am simply not the author, and it is the author’s choice.

I would also have preferred that, if AI support was to be implemented, it would be by way of a plug-in, so that users who do not want any kind of AI or AI-related language in Calibre do not get it, while those who want AI do. But, again, I’m not the developer. Adding “AI” to stuff is a divisive thing, and perhaps one cannot please all the people, all the time, but I’d have thought that keeping it separate from core code might have been a worthwhile approach.

I wonder if there is a language aspect to this too, with more care needed to distinguish between different use cases and types of “AI”. Would that have made a difference here? I don’t know.

neilzone 2 months ago

Adding a button to Home Assistant to run a shell command

Now that I have fixed my garage door controller, I wanted to see if I could use it from within Home Assistant, primarily so that I can then have a widget on my phone screen. In this case, the shell command is a simple invocation to a remote machine. (I am not aware of a way to add a button to an Android home screen to run a shell command or bash script.)

Adding a button to run a shell command or bash script in Home Assistant was pretty straightforward, following the Home Assistant documentation for shell commands. To my configuration.yaml file, I added something like:

The quirk here was that reloading the configuration.yaml file from within the Home Assistant UI was insufficient. I needed to completely restart Home Assistant to pick up the changes. Once I had restarted Home Assistant, the shell commands were available.

To add buttons, I needed to create “helpers”. I did this from the Home Assistant UI, via Settings / Devices & services / Helpers / Create helper. One helper for each button. After I had created each helper, I went back into the helper’s settings, to add zone information, so that it appeared in the right place in the dashboard.

Having created the button/helper, and the shell command, I used an automation to link the two together. I did this via the Home Assistant UI, Settings / Automations & scenes. For the button, it was linked to a change in state of the button, with no parameters specified. The “then do” action is the shell command for the door in question.
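The configuration.yaml addition might have looked something like the following, using Home Assistant’s shell_command integration; the command names, host, key path, and remote script are all my own illustrative assumptions, not the author’s actual configuration:

```yaml
# configuration.yaml -- command names, host, and remote script are
# illustrative assumptions
shell_command:
  garage_door_open: ssh -i /config/.ssh/id_ed25519 pi@garagepi '/usr/local/bin/door.sh open'
  garage_door_close: ssh -i /config/.ssh/id_ed25519 pi@garagepi '/usr/local/bin/door.sh close'
```

These then become callable as the services shell_command.garage_door_open and shell_command.garage_door_close, which is what the automation’s “then do” action invokes.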

neilzone 2 months ago

Using gpioset and gpioget to control the gpio pins on a Raspberry Pi with a relay board under Debian Trixie

A couple of years ago, I bodged a web-controlled garage door opener with a Raspberry Pi. It worked fine, until I upgraded the Raspberry Pi in question to Debian Trixie.

I noted that the relevant files in /sys/class/gpio were no longer present, and some further research showed that this was an intentional change: In the upstream kernel, /sys/class/gpio (the sysfs interface) has been deprecated in favor of a device interface, /dev/gpiochipN. The old interface is gone from the kernel in the nightly Debian builds (bookworm/sid) for Raspberry Pi.

So, unsurprisingly, my old way of doing things was not working. The documentation for the relay board has not been updated. Fortunately, after a bit of experimentation, I could get it working again using gpioset and gpioget. I could not find a way of stopping gpioset after a fixed period of time (I was expecting one of its options to do it, but it did not), so I ended up wrapping it in timeout, which is also a bodge. Anyway, this is now what I am using:
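The actual script is not reproduced above, but a sketch of the approach described (gpioset wrapped in timeout) might look like this, assuming libgpiod 2.x tool syntax; the chip name and line offsets are assumptions, so check yours with gpioinfo:

```shell
#!/bin/sh
# Sketch: pulse a relay on GPIO line 17 for two seconds, then release.
# With libgpiod 2.x, gpioset holds the line until it is killed, hence
# wrapping it in timeout(1) to end the pulse. Chip name and line
# offsets are assumptions -- check with `gpioinfo`.
timeout 2 gpioset --chip gpiochip0 17=1

# Read the current value of a line (e.g. a sensor on GPIO 27)
gpioget --chip gpiochip0 27
```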

neilzone 3 months ago

Using a2dismod to disable apache2's mod_status, which exposed information via a .onion / Tor hidden service

Earlier this week, I received a vulnerability report. The report said that, when accessing the site/server via the .onion / Tor hidden service URL, it was possible to view information about the server, and live connections to it, because of mod_status.

mod_status is an apache2 default module, which shows information about the apache2 server on /server-status. It is only available via localhost but, because of the default configuration of a Tor .onion/hidden service, which entails proxying to localhost, it was available.

The report was absolutely valid, and I am grateful for it. Thank you, kind anonymous reporter. It was easily fixed, made all the more annoying because I knew about this issue (it has been discussed for years) but forgot to disable the module when I moved the webserver a few months ago. One to chalk up to experience.

I have had a security.txt file in place on the decoded.legal website for quite a while now, but I’ve never had anyone use it. I asked the person who reported it to me if they had contacted me via it, but no, they had not.
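On Debian-style apache2 installations, the fix can be sketched as follows; the curl check at the end is my own addition for verification:

```shell
# Disable mod_status and reload apache2
sudo a2dismod status
sudo systemctl reload apache2

# Confirm /server-status is no longer served, even from localhost:
# expect a 404 rather than a 200
curl -s -o /dev/null -w '%{http_code}\n' http://localhost/server-status
```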
