neilzone 4 days ago

Using a2dismod to disable apache2's mod_status, which exposed information via a .onion / Tor hidden service

Earlier this week, I received a vulnerability report. The report said that, when accessing the site/server via the .onion / Tor hidden service URL, it was possible to view information about the server, and live connections to it, because of mod_status.

mod_status is an apache2 default module, which shows information about the apache2 server on /server-status. It is only available via localhost but, because the default configuration of a Tor .onion / hidden service entails proxying to localhost, it was available.

The report was absolutely valid, and I am grateful for it. Thank you, kind anonymous reporter. It was easily fixed, made all the more annoying because I knew about this issue (it has been discussed for years) but forgot to disable the module when I moved the webserver a few months ago. One to chalk up to experience.

I have had a security.txt file in place on the decoded.legal website for quite a while now, but I’ve never had anyone use it. I asked the person who reported it to me if they had contacted me via it, but no, they had not.
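The fix itself is a one-liner. A sketch, assuming a stock Debian apache2 install (module and helper names may differ on other distributions):

```shell
# check whether the status module is currently enabled
apachectl -M | grep status

# disable it and reload apache2
sudo a2dismod status
sudo systemctl reload apache2
```

After the reload, /server-status should return a 404, whether requested via localhost or via the hidden service.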

neilzone 2 weeks ago

Perhaps I just stop reading the news?

I have been looking for a while for a reliable, online, text-based source of important (subjective, I know, but to me that doesn’t include sport or celebrities or what is on TV) UK and world news, with a focus on reporting rather than analysis. At this point, I’ve basically given up; I don’t think that what I want exists, paid or free.

But do I need to read “the news” anyway? I wonder what I really get from it, other than an increasing sense of despair and frustration.

I get updates from key primary sources, through a combination of RSS and tools which monitor websites for changes. I’m not concerned about missing a key regulatory or legislative update, which is important to me from a work point of view.

I subscribe to 404Media, which I enjoy, although a more UK-focussed version would be amazing.

I occasionally look at our local news site, when I can stomach the clickbait headlines. I think I’ve got more uBlock Origin filters set up for that site than for any other, in an attempt to make it usable. I’d rather hoped that there was a subscription option which does away with all the advertising, gives actually informative headlines, and the like, but no - it is an app-based offering, with an “ad-lite … experience”.

I can see what people are discussing in the fediverse, where my filters for most party politics are pretty effective. But predominantly I enjoy the fediverse as a place to chat and have fun, not to be exposed to “news”.

Having an appreciation of what is going on in the world, in a geopolitical sense, is also useful for my work, and that is a bit trickier. It is primarily for this that I’ve continued to read the BBC news, despite my increasing dissatisfaction with it.

But perhaps it is time - even for just a test period - for me to stop reading “news sites”, and see how I fare.

neilzone 3 weeks ago

Downgrading Debian from testing to stable (trixie)

I have some machines running Debian testing. However, it looks questionable whether I would get through Cyber Essentials running testing (or sid/unstable) rather than stable. So I could either try to downgrade the machines, or else reinstall the OS.

The official guidance is not to downgrade (“No, it isn’t supported”), but instead to wipe and reinstall. I am fully prepared to wipe and reinstall if needed - in other words, I’ve taken and tested backups - so I didn’t really have much to lose by trying a downgrade. The worst case scenario is that I end up wiping and reinstalling anyway.

I took, and tested, backups first. I then reverted my sources.list entry to point at stable, pinned the stable repos, and ran the downgrade, which took a few minutes. I actually ran it twice.

And… it mostly worked. On one machine, I had to fix a few bits by hand; on another machine, it finished cleanly. I also had to set up my accounts in Thunderbird again, because of a change in profile syntax.

Overall, this went a lot more smoothly than I had expected. (Obviously, perhaps, YMMV…)
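For anyone curious, the steps were along these lines. This is a sketch rather than verbatim configuration (mirror, components, and file names are illustrative); the Pin-Priority of 1001 is what forces apt to prefer stable even where that means downgrading:

```shell
# /etc/apt/sources.list — point at trixie (stable) rather than testing
deb http://deb.debian.org/debian trixie main contrib non-free-firmware
deb http://deb.debian.org/debian-security trixie-security main contrib non-free-firmware

# /etc/apt/preferences.d/downgrade — pin stable above installed versions
Package: *
Pin: release a=trixie
Pin-Priority: 1001

# then the downgrade itself (running it a second time catches stragglers)
sudo apt update
sudo apt full-upgrade --allow-downgrades
```

Once everything is back on stable, the preferences file can be removed.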

neilzone 3 weeks ago

Migrating Mastodon (glitch-soc fork) to another Intel NUC

Some brief notes on a successful migration of my Mastodon server (running the glitch-soc fork) from one Intel NUC to another. Mostly in case I need to do it again!

I set up the machine with Debian 13 stable with LVM and LUKS, and then ran my usual set-up / hardening script.

I followed the official Mastodon installation instructions to get started. Once I had created the mastodon user, I then rsync’d that user’s directory from my old server to my new server, rather than going through the installation. I also dumped and rsync’d from my old machine the postgresql database, and the redis database, following the Mastodon migration instructions.

I moved across the nginx config, and the entirety of my certificates directory, although I could have generated new certs easily enough. I used the new systemd unit files. I then followed the glitch-soc update instructions.

I finally adjusted DNS and added some temporary NAT rules to redirect traffic while the DNS changes were propagating.

When I set up my Mastodon server in early 2018, I started with a Raspberry Pi 3. I moved to a Raspberry Pi 4 (some brief notes) in 2021, I think. When I had about 5,000 followers, my Raspberry Pi-based setup started to struggle, and I moved to an old Intel NUC (i5-3427U, 8GB RAM). I don’t recall exactly when that was. That was great, but it has started to struggle recently (just over 9,000 followers, and a busy feed), and I have been running out of disk space on it, even with regular media purge cycles. So now it is running on a somewhat less old Intel NUC (i7-5557U CPU @ 3.10GHz, 16GB RAM), with some more space for my bad jokes.
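The copying steps, roughly. This is a sketch, not my exact commands; paths, service names, and the database name assume a standard Mastodon install under /home/mastodon, which may not match your setup:

```shell
# on the old machine: stop Mastodon, then dump the database
sudo systemctl stop mastodon-web mastodon-sidekiq mastodon-streaming
sudo -u postgres pg_dump -Fc mastodon_production -f /tmp/mastodon.dump

# copy the application directory and the dump to the new machine
rsync -avz /home/mastodon/live/ newhost:/home/mastodon/live/
rsync -avz /tmp/mastodon.dump newhost:/tmp/

# on the new machine: restore into a freshly-created database
sudo -u postgres pg_restore -d mastodon_production /tmp/mastodon.dump
```

Stopping the services first matters: it avoids the dump and the rsync’d files drifting apart while the copy runs.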

neilzone 3 weeks ago

Using vimwiki as a personal, portable, knowledge base

A while back, I was looking for a tool to act as basically a semi-organised dumping ground for all sorts of notes and thoughts. I wanted it to be Free software, easy to keep in sync / use across multiple devices, usable offline / without a LAN connection, and it should render Markdown nicely.

I looked at logseq, which looked interesting, but decided to give vimwiki a go. I spend a lot of my time in vim already, so this seemed like it would fit into the way I work very easily. And I was right. Since it is “just” a collection of .md files, it appeals to me from a simplicity point of view, and also makes synchronisation and backing up very easy.

There are multiple ways to install vimwiki. I went for a straightforward plugin installation, and then added a couple of lines to my vimrc (although I already had one of them). To add a new wiki with support for Markdown (rather than the default vimwiki syntax), I put the details into my vimrc too.

Then, I opened vim, and used the vimwiki keybinding to open the wiki. On the first use, there was a prompt to create the first page. The basic vimwiki-specific keybindings are indeed the ones I use the most to manage the wiki itself. Otherwise, I just use vim normally, which is a significant part of the appeal for me.

The wiki is just a collection of markdown files, in the directory specified in the “path” field in the configuration. This makes synchronisation easy. I sync my vimwiki directory with Nextcloud, so that it propagates automatically onto my machines, and I can also push it to git, so that I can grab it on my phone. This works for me, and means that I don’t need to configure, secure etc. another sync tool or a dedicated sync system.

There is support for multiple wikis, although I have not experimented much with this. Each wiki gets its own entry in the configuration, and there is a command in vim to select which wiki you want to use.

I really like vimwiki. It is simple but effective, and because it runs in vim, it does not require me to learn a different tool, or adjust my workflow. I just open vim and open my wiki. Prior to vimwiki, I was just dropping .md or .txt files into a directory which got synchronised, so this is not massively different, other than more convenient. Everything is still file-based, but with an increased ease of organisation. For someone who didn’t already use vim, it is probably a more challenging choice.
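For reference, a minimal vimrc for a setup along these lines. The path is illustrative - adjust it to wherever your synced directory lives:

```vim
" vimwiki needs filetype plugins and syntax highlighting enabled
set nocompatible
filetype plugin on
syntax on

" one wiki, stored as Markdown files rather than vimwiki's own syntax
let g:vimwiki_list = [{'path': '~/vimwiki/',
      \ 'syntax': 'markdown', 'ext': '.md'}]
```

With that in place, the default keybinding opens (or creates) the wiki's index page from anywhere in vim.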

neilzone 3 weeks ago

Upgrading our time recording system from Kimai v1 to Kimai v2

I have used Kimai as a FOSS time recording system for probably the best part of 10 years. It is a great piece of software, allowing multiple users to record the time that they spend on different tasks, linked to different customers and projects. I use it for time tracking for decoded.legal, recording all my working time.

I run it on a server which is not accessible from the Internet, so the fact that we were running the now long-outdated v1 of the software did not bother me too much. But, as part of ongoing hygiene / system security stuff, I’ve had my eye on upgrading it to Kimai v2 for a while now, and I’ve finally got round to it. Fortunately, there is a clear upgrade path from v1 to v2, and It Just Worked.

The installation of v2 was itself pretty straightforward, with clear installation instructions. I then imported the data from v1, and the migration/importer tool flagged a couple of issues which needed fixing (e.g. no email address associated with system users, which is now a requirement). The documentation was good in terms of how to deal with those. All in all, it took about 20 minutes to install the new software, sort out DNS, the web server configuration, TLS, and so on, and then import the data from the old installation. I used the export functionality to compare the data in v2 with what I had in v1, to check that there were no (obvious, anyway) disparities. There were not, which was good!

One of the changes in Kimai v2 is the ability to create customised exportable timesheets easily, using the GUI tool. This means that, within a couple of minutes, I had created the kind of timesheet that I provide to clients along with each monthly invoice, so that they can see exactly what I did on which of their matters, and how long I spent on it. For clients who prefer to pay on the basis of time spent, this is important. This is nothing fancy; just a clear summary on the front page, and then a detailed breakdown. I have yet to work out how to group the breakdown on a per-project basis, rather than a single chronological list, but I doubt that this will be much of a problem.

I have yet to investigate the possibility of some automation, particularly around the generation of timesheets at the end of each month, one per customer. I’ll still check each of them by hand, of course, but automating their production would be nice. Or, even if not automated, just one click to produce them all.

As with v1, Kimai v2 stores its data in a MariaDB database, so automating backups is straightforward. Again, there are clear instructions, which is a good sign.
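Backing up a MariaDB-backed install can be as simple as a nightly cron job. A sketch - the database name, credentials handling, and backup path are illustrative, not from the post:

```shell
# dump the Kimai database to a dated file (credentials via ~/.my.cnf);
# --single-transaction gives a consistent snapshot without locking tables
mysqldump --single-transaction kimai2 > "/var/backups/kimai-$(date +%F).sql"

# prune dumps older than 30 days
find /var/backups -name 'kimai-*.sql' -mtime +30 -delete
```

As ever, a backup only counts once a restore from it has been tested.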

neilzone 3 weeks ago

Using LibreOffice and other Free software for documents as a lawyer

I was asked recently about how I get on using LibreOffice for document-related legal work, and I promised to write down some thoughts. The short answer is that I use a mix of LibreOffice and other FOSS tools, and I’m very positive about what I do and how I do it, with no particular concerns. (I’ve written more broadly about how I use Free software for legal work; this blogpost is more specific.)

This is about my experience. Yours might be different. You might not want to, or be able to, use, or try, LibreOffice (or vim, or git, or whatever). And that’s fine. I’m not trying to convert or persuade anyone.

I do a lot of work which entails producing and amending documents, and exchanging documents with others. This includes contracts, policies and procedures, and collaborative report writing. Occasionally, it means filling in other people’s forms. I use LibreOffice’s Writer for this. I use Writer pretty much every day, and have done for several years, with a wide range of clients and counterparties, including large law firms, small companies, and government departments, and I have no concerns, or significant gripes.

I have made templates for my most common types of document, and I have styles set up to make formatting easy and consistent. (I don’t know why people produce documents without styles, but that’s just a personal gripe.)

I have exchanged complex documents, usually with lots of tracked changes and comments, with many, many recipients, and I have had no problems with tracked changes, or people not being able to open documents or see what I have done. I’ve had a document recently where automatic numbering had gone wrong, and one where formatting had been messed up, but these were both documents which started life 5+ years ago, and I have not been able to identify whether this was a LibreOffice Writer issue, or a Word (or whatever tool others involved have been using) issue, or something else. In both cases, I fixed them rapidly and got on with things. I don’t know what Word is like recently, but when I last used it a few years ago, I found automatic numbering and formatting were mostly fine but occasionally a pain back then too, so perhaps this is just par for the course. I found Writer’s recent change to dark mode / theming a bit of a pain, but I seem to have resolved it now.

For version control and documents, I don’t do anything fancy. I have a script which appends a date and timestamp to the beginning of the file’s name, and this works well. I get a directory of drafts, with clear naming / sequencing. I’ve experimented with git and documents, and while it sort of works to a point, it is not the right approach for me at the moment. No doubt there are factors which aid my positive experience.

I do a lot of advisory work, where I produce reports, advice notes, and briefing notes. I don’t tend to use LibreOffice for this, preferring instead to use vim, writing in Markdown. For instance, this is how I prepared the new terms of service for mastodon.social / mastodon.online, and, on a friendly basis outside work, a draft vendor agreement for postmarketOS. This means none of the cruft of a document filetype, and it means that I can use git for version control in a way that actually works (unlike with documents). It also makes it easy to produce diffs. But it doesn’t work well for things like cross-referencing; it is not the right tool for that job. If the output needs to be a nicely-formatted PDF, I use pandoc and typst to convert the Markdown using a template. This makes producing a formatted document very easy, while letting me focus on the content.

Some clients send and receive plain text / .md files (and, yes, you, who likes LaTeX files :)) and share diffs; others prefer documents. Both are fine with me, and I go with whichever works better for each client or each situation.

I do not use Impress, the presentation tool, other than for viewing presentations which are sent to me. Instead, I use reveal.js for presentations, writing in markdown and presenting in my browser. I really like reveal.js. I can easily upload my presentations for people to view, and I can convert them to .pdf for distribution. I’ve not had to work on a collaborative presentation in the last 5+ years; I imagine that I’d have to use Impress, or a client’s hosted tool of choice, if someone wanted that.

I use the spreadsheet tool, Calc, when I need a spreadsheet, which is not very often. It is mostly basic accountancy. For my limited uses, Calc has been absolutely fine, and I’m certainly not qualified to comment on it in any detail.

Some clients want me to use their choice of hosted tools - Microsoft, Google Docs, Cryptpad, Nextcloud, etherpad, and so on. That’s fine; if a client wants to use them, and gives me access, I use them. All the ones that I’ve tried so far work fine in Firefox. I’m also happy to make PRs to, or commit directly into, a client’s git repositories.

Over the past few years, I’ve hosted instances of Collabora (via Nextcloud), Cryptpad, and etherpad. All have had their pros and cons, and perhaps that’s something for a different blogpost. Most recently, I hosted etherpad, but right now, I’m not hosting any of these. I just don’t use them enough.

I don’t depend on any third party plug-ins or integrations. I imagine that, for someone whose work depends on that kind of thing, Writer might not be a good fit. I don’t do litigation, or anything which requires court filings.
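The pandoc + typst conversion mentioned above is a single command. A sketch - the filenames are illustrative, and it assumes a pandoc recent enough (3.x) to support typst as an engine:

```shell
# convert a Markdown advice note to a formatted PDF via typst;
# a custom pandoc template can be supplied with --template if wanted
pandoc advice.md --pdf-engine=typst -o advice.pdf
```

The same Markdown source then also works for git diffs, plain-text exchange, or conversion to other formats.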

neilzone 3 weeks ago

Stopping a kernel module from loading on boot in Debian Trixie using /etc/modules-load.d/

I am going through, upgrading my Debian machines from bookworm to trixie. On the whole, so far anyway, it has been pretty painless. But here’s something which needed some manual intervention.

I have a couple of virtual machines still using VirtualBox. Yes, I know, but now was not the time to solve that part of the problem. They ran fine on a hypervisor running bookworm. After I had upgraded the hypervisor to trixie and started the virtual machine, I got an error. Sure enough, when I checked, the conflicting kernel module was loaded.

Temporarily unloading it did the trick. But that does not survive a reboot. Previously, I would have done something like blacklisting the module directly. It seems that the new (I don’t know how new) way to do things is to put a simple config file in place. (It is a shame that the directive uses “blacklist” in 2025.)

And then I rebuilt the initramfs. On rebooting the machine, that kernel module no longer loads automatically, and I can start my virtual machine.
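The mechanics, as a sketch. I am assuming the conflicting module was kvm_intel (the usual VirtualBox culprit on Intel hardware); the snippet writes to a temporary directory so it can run unprivileged, where the real file would go under /etc/modprobe.d/:

```shell
# the real file would be e.g. /etc/modprobe.d/blacklist-kvm.conf;
# using a temporary directory here so this sketch runs without root
dir=$(mktemp -d)
printf 'blacklist kvm_intel\n' > "$dir/blacklist-kvm.conf"
cat "$dir/blacklist-kvm.conf"

# on the real system, rebuild the initramfs so the change applies at boot:
#   sudo update-initramfs -u
```

Note that a blacklist entry stops automatic loading only; anything which explicitly modprobes the module can still pull it in.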

neilzone 1 month ago

What goes on at a meeting of the Silicon Corridor Linux User Group in 2025

I found this post in my drafts, half completed. I am not really sure why I started it, but I did start it, at some point earlier this year, so now I will finish it.

I am a long time member of our local Linux user group, the curiously named Silicon Corridor Linux User Group (SCLUG). (Its website looks much how you might expect the website of a Linux user group to look.) Given that we’ve only met in Reading for as long as I can remember, I guess that it is really the Reading And Thereabouts Linux User Group. RATLUG.

I first went to a SCLUG meeting in around 2005, when I was back in the area after university. The group had an active email list, which was the primary means of communication. We met at the seating area in the front of a Wetherspoons (urgh). I think because the food was cheap. It certainly wasn’t because it was good. Or a pleasant place conducive to a good chat, given how loud and crowded it was. But it was fun, and it was enjoyable to chat with people developing, supporting, and using Linux (and BSD etc.). Meetings were well attended, and we often struggled for space.

I stopped going for quite a few years, both because I really wasn’t a fan of Wetherspoons, and also life got in the way. I started to go again just before the first Covid lockdown. It was still in Wetherspoons, but oh well. I think that I managed one meeting before everything was shut down.

We moved online during the covid lockdowns, using jitsi as a platform for chatting. I rather enjoyed it. I particularly liked the convenience of being able to join from home, rather than travel all the way to Reading for a couple of hours. But it was not a success from a numbers point of view, and while I liked the idea of people proposing mini-talks (as I like the idea of using the LUG as a place to learn things), that did not catch on.

So now we are in 2025, and SCLUG keeps going. Times have changed, though. The mailing list is almost silent; we have a Signal group instead, but there is relatively little chat between meetings.

We still meet in person, once a month, of a Wednesday evening. We have, finally, moved from Wetherspoons to another pub, thank goodness. The fact that meetings were in Wetherspoons was a significant factor in me not bothering to go, so I was keen to encourage a move to somewhere… better. At the moment, we meet in the covered garden area of The Nag’s Head and, in the warmer and lighter months, it is quite pleasant. We’ve acknowledged that this is not going to be viable for much longer because of the weather, and the pub itself is small and noisy, so I suspect that we are back to looking for another venue.

It is not a big group. I reckon that, on average, there are probably six or seven of us at most meetings. Visitors / drop-ins are very welcome; the Signal group is a good way of finding us, else look for the penguin on the table if I remember to bring it.

“Meetings” sounds a bit formal, since it is just us sitting and chatting. There is no formality to it at all, really; turn up, have a chat, and leave whenever. I tend to be there a bit earlier than the times on the website, and leave not too late in the evening.

The conversation tends to be of a technical bent, although not just Linux by any means. Self-hosting comes up a fair amount, as do people’s experiments with new devices and technologies, and chats about tech and society and politics etc. While I doubt that anyone who didn’t have an interest in such things would enjoy it, there’s certainly no expectation of knowledge/experience/expertise, nor any elitism or snobbery.

I can’t say that I learn a huge amount - for me, it is definitely more social than educational. Even with a small number of people, I have to have enough social spoons left to persuade myself to go into Reading of a Wednesday evening for a chat. We have not done anything like PGP key signing, or helping people install Linux, or anything similar, for as long as I can remember.

Is there still a point to a group like this? Yes, I think so. There are, of course, so many online places where one can go to chat about Linux, and to seek support, that an in-person group is not needed for this. To me, SCLUG is really now a social thing. A pleasant and laid back evening, once a month, to chat with people with complementary interests. It strikes me as one of those things that will continue for as long as there are people willing and able to turn up and chat. Perhaps that will wane at some point…

neilzone 1 month ago

Is now the best time ever for Linux laptops?

As I’ve said, ad nauseam probably, I like my secondhand ThinkPads. But I’m not immune to the charms of other machines and, as far as I can tell, now is an amazing time for Linux laptops. By which I mean companies selling laptops with Linux pre-installed or no OS preinstalled, or aimed at Linux users. Yes, it’s a bit subjective.

There seems to be quite a range of machines, at quite a range of prices, with quite a range of Linux and other non-Windows/macOS operating systems available. This isn’t meant to be a comprehensive list, but just some thoughts on a few of them that have crossed my timeline recently. All have points that I really like but, right now at least, if my current ThinkPad died, I’d probably just buy another eBay ThinkPad…

Update 2025-10-25: This is a list, not recommendations, but personally I won’t be buying a Framework machine: “Framework flame war erupts over support of politically polarizing Linux projects”

I love the idea of the Framework laptops, which a user can repair and upgrade with ease. Moving away from “disposable” IT, into well-built systems which can be updated in line with user needs, and readily repaired, is fantastic. Plus, they have physical switches to disconnect microphone and camera, which I like. I’ve seen more people posting about Framework machines than I have about pretty much all of the others here put together, so my guess is that these are some of the more popular Linux-first machines at the moment. I know a few people who have, or had, one of these. Most seem quite happy. One… not so much. But the fact that multiple people I know have them means, perhaps, sooner rather than later, I’ll get my hands on one temporarily, to see what it is like.

I only heard about Malibal while seeing if there was anything obvious that I’d missed from this post. Their machines appear to start at $4,197, based on what they displayed when I clicked on the link to Linux machines, which felt noteworthy. And some of the stuff on their website seems surprising.

Update 2025-10-25: The link about their reasons for not shipping to Colorado no longer works, nor is it available via archive.org (“This URL has been excluded from the Wayback Machine.”). Again, this is a list, not recommendations, but this thread on Reddit does not make for good reading.

I’m slipping this in because I have a soft spot for Leah’s Minifree range of machines even though, strictly, they are not “Linux-first” laptops, but rather Libreboot machines, which can come with a Linux installation. I massively admire what Leah is doing here, both in terms of funding their software development work, and also helping reduce electronic waste through revitalising used equipment. Of all the machines and companies in this blog post, Minifree’s are, I think, the ones which tempt me the most.

I think the MNT Pocket Reform is a beautiful device, in a sort-of-quirky kind of way. In my head, these are hand-crafted, artisan laptops. Could I see myself using it every day? Honestly, no. The keyboard would concern me, and I am not sure I see the attraction of a trackball. (I’d happily try one though!) But I love the idea of a 7" laptop, and this, for me, is one of its key selling points. If I saw one in person, could I be tempted? Perhaps…

The Pinebook Pro is a cheap ARM laptop. I had one of these, and it has gone to someone who could make better use of it than I could. Even its low price - I paid about £150 for it, I think, because it was sold as “broken” (which it was not) - could not really make up for the fact that I found it underpowered for my needs. This is probably a “me” thing, and perhaps my expectations were simply misaligned. The Pine64 store certainly hints in this direction: “Please do not order the Pinebook Pro if you’re seeking a substitute for your X86 laptop, or are just curious”.

Purism makes a laptop, a tablet, and a mini desktop PC. I love their hardware kill switches for camera and microphone. A camera cover is all well and good, but I’d really like to have a way of physically disconnecting the microphone on my machines. Again, I don’t think I know anyone who has one.

Were it not for a friend of mine, I wouldn’t even be aware of Slimbook. Matija, who wrote up his experiences setting up a Slimbook Pro X 14, is the only person I’ve seen mention them. But there they are, with a range of Linux-centric laptops, at a range of prices.

I could be tempted by a Linux-first tablet, and StarLabs’ StarLite looks much the best of the bunch… But, at £540 + VAT, or thereabouts, with a keyboard, it is far from cheap for something that I don’t think would replace my actual laptop.

I’m aware of System 76, but I’m not sure I know anyone who has one of their machines. As with System 76, I’m aware of Tuxedo, which certainly appears to have an impressive range of machines. But I don’t think I’ve heard or seen of anyone using one.

neilzone 1 month ago

Installing and using Linux (AsteroidOS) on a TicWatch Pro 2020

Last week, at my local Linux User Group meeting, we were discussing alternative operating systems for phones. Those who had a play with it were rather impressed with my OnePlus 6 running postmarketOS. And then someone mentioned that there was a way to get Linux running on (some) smartwatches: AsteroidOS.

I promised, there and then, that I did not need another project, or another device running Linux, and that I absolutely would not be getting involved in it. Just over 100 hours later, having successfully scored a well supported watch (a TicWatch Pro 2020) for a very reasonable price secondhand (and in pristine condition!), I had a watch running AsteroidOS. So much for my ironclad resolve.

Honestly, installing AsteroidOS was so straightforward (for me; to someone who has never tried to flash a mobile device before, it might be a bit more daunting) that I was almost disappointed. But, well, the instructions on their wiki were spot on, and it Just Worked. Before my wife had even sighed and said “Neil, what are you doing now?”, I’d done it. 10 minutes from opening the box with the watch in it, and I was running Linux.

As they suggested, I first went with dual-booting, running AsteroidOS alongside WearOS, which was what came with the watch. (And, yes, we live in a world where one can dual-boot Linux on a watch. Nice.) After a few minutes of testing, I just bit the bullet and installed AsteroidOS as the only OS, and I haven’t looked back.

My first impression - which remains the case - is that a lot of thought has been given to the user experience. Setting up - including setting the date and time on the watch, since it did not have an Internet connection or a device with which to sync at that moment - was easy, and it had a nice demo/tutorial for how to interact with it. Only one of the two buttons is supported, but that has not posed a problem so far.

It paired without a problem with Gadgetbridge. It mirrors the notifications on my phone, which is fine. There is an AsteroidOS specific client but, since I already have Gadgetbridge on my phone, I just used that. I have yet to see if I can pair it with my postmarketOS device; that is a project for the future.

Mostly because I can, I connected it back to my computer, and then ssh’d into it via RNDIS. Again, the documentation was good. I uninstalled one of the default applications, and installed another, just to get the hang of it. I also configured Wi-Fi on the watch and, yes, I could then ssh into my watch over Wi-Fi, without a USB connection. But I quickly turned off Wi-Fi again, as it consumed a lot of battery power.

So far today, I have had the watch on (and thus off the charger) for 12 hours. It has 77% battery remaining. By my rough maths - rather than real world experience - this suggests that it should be good for a couple of days without re-charging, but no more than this. That’s not terrible, but nor is it great. But then I am used to a PineTime, and also a Colmi P80, both of which can go for a week or so without a charge. (When I had Wi-Fi turned on, the battery would have struggled to get through a whole day.)

I have very basic requirements for a “smart” watch: I just want something which will alert me to calls and messages (XMPP, Signal etc), and calendar appointments. Anything else is either a bonus, or unnecessary / unwelcome, depending on what it is.

It tells the time. I quite like the dual screen, having a basic display for the time, and a secondary display for the Linux GUI, which I can activate with a tap or a press of the top button.

I’m used to basic smart watches, and, because of my very limited requirements, both my PineTime and my Colmi P80 did the job just fine. Even secondhand, my TicWatch cost twice as much as either of these did new.

AsteroidOS on the TicWatch supports the heart rate monitor, and it appears to have a step tracker too (although, despite walking around with it on my wrist, it tells me that I haven’t logged any steps today). The Colmi P80 offers more, if that is important, and it gets displayed nicely via Gadgetbridge. The Colmi smart ring that I got a while back does very well in this regard too. I don’t really want/need a fitness tracker, and, after the initial novelty of all the data from the Colmi watch/ring wore off, I didn’t bother checking or syncing it again. So I don’t miss its absence on AsteroidOS. If I did want that, or I wanted the convenience of USB-C charging (as the TicWatch’s charging dock is bulky, and USB-A), the Colmi P80 would be a better choice.

I’ve been using it for two days now. I like it. If I can get it working with postmarketOS, I’ll be even more pleased. It does not look ostentatious, nor so bland that it almost stands out (PineTime). But one thing it does do, which the Colmi watch did not, is that it makes me smile. There is something very appealing to me about having liberated yet another device, to make use of it while running Linux and other Free software. That might just be a Neil thing, but I am Neil, and it is a thing, and I like it.
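The rough battery maths earlier, spelled out: 23% of the battery went in 12 hours, which projects to around 52 hours of runtime - a couple of days, but not much more:

```shell
# 23% of battery used in 12 hours; project total runtime in whole hours
hours=$((100 * 12 / 23))
echo "$hours hours, i.e. roughly two days"
```

A linear projection like this is optimistic, of course; real-world drain varies with notifications, screen use, and so on.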

neilzone 1 month ago

Librephone and the need for more Freedom for mobile computing

The Free Software Foundation has announced a project called “Librephone”, and I am quite excited about it. The scope of the project is narrow, which is superb. It is aimed at tackling the proprietary firmware and binary blobs which are currently necessary, even when one wants to use an alternative OS on a phone. I have a strong preference for Free software, but I have also accepted the proprietary blobs to be able to use existing hardware. I would much rather that they were not there though, and so the idea of a mobile computing device / phone, running only Free software, is attractive to me. I suspect that this is a complicated thing to achieve, even just for one device, let alone the plethora of devices out there. I am enjoying using postmarketOS on a OnePlus 6, and it is the best modern Linux-on-a-mobile-device experience that I have had so far. But I had to pick the phone carefully, and, even then, not all hardware is supported fully. It would be amazing if I could do this on more modern hardware. Fundamentally though, user choice, and software and device freedom, on mobile devices is not merely a technical problem, or the preserve of geeks and nerds. It is an environmental problem, as we - those of us who live on Earth - cannot afford to keep abandoning perfectly good hardware just because its manufacturer has stopped supporting it so that it can sell the latest thing. It is insupportably wasteful. I love the ability to install Linux on an old laptop and continue to use it, once others have discarded it. It would be amazing if the massive pile of now-aged, abandoned BlackBerry devices could be re-used, along with any number of other older but still functional devices. It is a privacy problem too, where one might reasonably wish to turn to Free software to avoid surveillance-based business models, or the unwarranted exercise of control over what one can run on one’s own device. 
And so the problem is also a regulatory one, with scope (indeed, a necessity) for regulatory/legal intervention, to protect consumer interests - in particular, in terms of locked bootloaders, and of organisations seeking control over who can write and distribute software for devices, or over what software a user can choose to install. The Librephone initiative is a great idea, tackling part of a much bigger, urgent, problem, and I wish them much luck.

neilzone 1 month ago

How I interact with PDFs using Free software and Linux in 2025

This blogpost contains some brief thoughts on how I interact with PDFs. It is not an exhaustive list of Free software PDF tools for Linux. I know that there are other options (including Okular, and LibreOffice’s Draw); some I have tried and some I have not. I use Gnome’s default tool, evince, also known as “Document Viewer”. The function for leaving notes on a PDF is pretty useful, although I don’t know how well it works if I wanted to share that file-with-notes with someone else, especially someone else using different software. When I review a contract or an article, I still like to do so with a pen in my hand. I don’t know why. Since I am using a ThinkPad with a touchscreen, and support for a pen, this is no problem. I convert the contract/article into PDF, most often using LibreOffice Writer’s export function. I then open the resulting PDF in Xournal++, which is a superbly useful piece of software. I like writing advice notes in Markdown, but I tend to convert them to PDF before sharing them as a “final” version. I am using pandoc and typst to do this, and it works really well, resulting in a nicely-formatted PDF. If I am converting a document into PDF for ease of scribbling, I use LibreOffice Writer’s export function. For scanning documents to PDF and OCRing them, I use paperless-ngx. I don’t appear to have blogged about it, but the gist is that I have configured our Brother MFC L2750DW to scan to a directory on a server as PDF, and paperless-ngx watches that directory and then ingests the resulting files. It works well, although one day I should move it from a Raspberry Pi 4 to something a little beefier. I use three main tools for “doing stuff” to PDFs. PDFArranger makes adding pages, deleting pages, splitting files, rotating pages, and moving pages around very easy. I use it quite a lot. I discovered Stirling-PDF more recently, and I’ve self-hosted an instance of Stirling-PDF for just under a year now. 
It is a web interface for a range of tools which offer quite a lot of PDF-related functions, and it works well on both desktop and mobile browsers. For batch changes to PDFs, I like PDFtk - apparently, there is a GUI version, but I use the command line tool. I find it especially useful for rotating all the pages in a file because someone has scanned a document poorly. If someone needs me to sign a PDF, in the sense of adding my name to it, I tend to use Xournal++. Rarely do I need a digital representation of my scribbled signature, but when I do, I also use Xournal++, either scribbling on the document or adding an image. For an electronic signature, I’ve experimented with LibreOffice’s functionality, but I didn’t get as far as I wanted. I should look into it some more. I ran an instance of DocuSeal for a while, but frankly I just did not use it enough to justify keeping it going. I was impressed with it and, had I needed its functionality more often, I imagine that I would have kept it. This is, fortunately, not something that I need to do day to day. I have a clunky-but-seems-to-work approach, for those very rare cases when I need it. If there is a better approach to doing this using Free software on Linux, I’d love to hear about it. I haven’t had to do this in a while, but I have a note in my wiki that says that a particular ghostscript incantation has worked for me in the past.
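Two of the command-line workflows mentioned above - pandoc with typst, and PDFtk for bulk rotation - can be sketched as shell one-liners. The filenames here are hypothetical, and typst support requires a recent pandoc (3.x) with typst installed, so treat this as a sketch rather than a recipe:

```shell
# Convert a Markdown advice note to a nicely-formatted PDF via typst
pandoc advice.md -o advice.pdf --pdf-engine=typst

# Rotate every page of a badly-scanned file 90 degrees clockwise
# ("east"); use "south" in place of "east" for a 180-degree rotation
pdftk scanned.pdf cat 1-endeast output rotated.pdf
```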

neilzone 1 month ago

Notes on running postmarketOS on a OnePlus 6

I am experimenting with postmarketOS on a OnePlus 6. This is not my first foray into a Linux-mobile device; I have previously used postmarketOS on an old Samsung tablet, as well as Mobian, and the version of Debian that one could install onto the Planet range of PDAs. And, of course, many years ago, a Sharp Zaurus. Even though I am currently a pretty happy user of GrapheneOS, I’m keen to explore options which are further removed from Google. tl;dr: to make this work for me, I am going to need to make a lot of compromises. I am not sure if I am willing to do that or not. But I am enjoying the experiment. These are mainly some working notes. I wrote up installing postmarketOS with full disk encryption. The postmarketOS wiki is useful. I am avoiding Waydroid. Since one of the goals of this experiment is to see about moving away from Android, using Waydroid seemed like cheating. I might play with it at some point but, for now, I am not looking at Android applications via Waydroid. As part of my preparation for this experiment, I looked at what apps I actually needed, and what services I could use via the web browser instead. Surprisingly, most of the parking apps that I use seem to work fine in a browser. I am not 100% sure of this - I haven’t tried them all yet - but then most of the car parks I use on a regular basis still have payment machines anyway. Banking websites generally suck, but none of my banks requires - as far as I can tell - an app, as long as I’m willing to log in to the website using a card reader. I have not yet uninstalled the banking apps from my other phone, so I don’t know what will happen when a payment requires verification. Home Assistant works fine in the browser, but it does mean that my device location is not available (since this requires the app). So Sandra cannot tell where my phone is (and thus, most likely, where I am), which is a bit of a pain. I can control music around the house using the Lyrion Media Server’s web interface. Easy. 
I haven’t used shopping-related apps for ages, so no change here. I just use the websites. Wi-Fi works fine. I generated a WireGuard config on one of my WireGuard servers, and imported the resulting text file on the OP6. It worked, as expected. To make it connect automatically, I used one command to identify the WireGuard tunnel, and another to enable automatic connection; it now connects automatically following a device reboot. My Bluetooth headphones, and a Bluetooth keyboard, paired, connected, and work. Geary works fine, although it doesn’t appear to support GPG/PGP, which is a shame. Using GNOME’s “Online Accounts” functionality, I could connect to my Nextcloud instance, for contacts and calendar. This worked more easily than I was expecting. I bought a SIM, on the Three network (UK), put it in, and restarted the phone. It detected the SIM card automatically on reboot. SMS arrived immediately. Lots of them. I enabled mobile data from within GNOME settings (just as I do on my laptop, to control the internal modem) and it Just Worked. Traffic gets routed over the WireGuard tunnel. I can make a phone call, and I can receive a phone call, whether the phone is unlocked or not. However, the stack is not particularly stable: sometimes I can make and receive calls, and sometimes I cannot. Bitwarden is in my “sort of working” list. It works, as a Firefox add-on. I’d like to have a “desktop” way of accessing it, and I have yet to work that out. The cameras both work, after a fashion, but not if you want high quality images or video. I bought a small compact camera on eBay a while back, in the expectation that this was going to be a problem. It is nowhere near as convenient, of course. When I cycle or drive somewhere new, I tend to use my phone for navigation. Using the Maps application, I can plan a route, but I don’t get navigation instructions. This is a bit of a pain, and I don’t have a good alternative. 
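Assuming the GNOME image uses NetworkManager (as mine does), the WireGuard import-and-autoconnect steps above look roughly like this. The connection name wg0 is hypothetical - it is whatever `nmcli connection show` reports after the import:

```shell
# Import the WireGuard config file generated on the server
nmcli connection import type wireguard file wg0.conf

# List connections to identify the name of the new tunnel
nmcli connection show

# Make the tunnel come up automatically, including after a reboot
nmcli connection modify wg0 connection.autoconnect yes
```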
The default SMS/XMPP/Matrix application, Chats, supports XMPP, but I cannot log in to my XMPP server (Snikket) using it. I have not dug into why this is the case, as I don’t think that I’ve got anything strange set up. Dino does work for messaging and file transfer, and doesn’t look too bad on the smaller phone screen. But, for some undiagnosed reason, I don’t get notifications for incoming messages, and audio/video does not work, as calls fail to establish. Other people say that audio works for them, so I am not sure why it is not working for me. I need to look into this. I have not found a good option for this. Possibly some kind of Signal/XMPP bridge. I was not expecting this to work. It is a bit of a pain typing in a passphrase each time to use my phone, but oh well. There is one fun bug, still to investigate:

- I receive a call on the OP6
- I answer the call while the OP6 is locked
- I hang up the call on the OP6, or the other end hangs up the call
- I unlock the OP6
- it calls the number which had just called me :)

neilzone 2 months ago

What if I don't want videos of my hobby time available to the entire world?

I am very much enjoying my newly-resurrected hobby of Airsoft. Running around in the woods, firing small plastic pellets at other people, in pursuit of a contrived-to-be-fun mission, turns out to be, well, fun. I have also had to accept that, for some other players, part of that fun comes from making videos of their game days, and uploading them to YouTube. They often have quite impressive setups, with multiple cameras - head, rear-facing from the barrel of the weapon, and scope cam - and clearly put time, money, and effort into doing this. Great! Just like someone taking photos on their holidays, or when out and about, I can see the fun in it. It is the “non-consensually publishing it online for the world to see” aspect which bugs me a bit. In the handful of games that I have played, no-one has ever asked about the consent of other participants. There has been no “put on this purple lanyard if you don’t want to be included in the public version of the video” rule, which I’ve seen work pretty well at conferences I have attended (even if it is opt-out rather than consent-based). I could, I suppose, ask each person that I see with a camera “would you mind not including me in anything you upload, please?”. And, since everyone with whom I’ve spoken at games, so far anyway, has been perfectly pleasant and friendly, I’d be hopeful that they would at least consider my request. I have not done this. The impression I get is that this is just seen as part and parcel of the hobby: by running around in the woods of northern Newbury on a Sunday morning, I need to accept that I may well appear on YouTube, for the world to see. I don’t love it, but it is not a big enough deal for me to make a fuss. I occasionally see people saying “well, if you don’t want to be in photos published online, don’t be in public spaces”. This is nonsense, for a number of reasons. Clearly, one should be able to exist in society, including going outside one’s own home, without needing to accept this kind of thing. 
In any case, here, the issue is somewhat different, since it is a private site, where people engage in a private activity (a hobby). But then I’ve seen the same at (private) conferences, with people saying “Of course I’m free to take photos of identifiable individuals without their consent and publish them online”. Publishing someone’s photo online, without their consent, and without another strong justification, just because they happen to be in view of one’s camera lens, feels wrong to me. This isn’t about what is legal (although, in some cases, claims of legality may be poorly conceived), but about my own perceptions of a private life, and a dislike for the notion that, just because one can publish such things, one should.

neilzone 2 months ago

I am getting better at turning off unneeded self-hosted services

One of the things at which I’ve become better over the years is turning off self-hosted services. I love self-hosting things, and trying out new things, but I am also mindful that each new thing carries its own set of risks, particularly in terms of security. Similarly, while I do my best to secure servers and services to a reasonable standard, and to isolate things in a sensible way on our network, there is still residual risk and, if I am not using something enough, there is no point me tolerating that risk. So, once a month, I take a look at what I am self-hosting, and decide (a) if I really still need each thing, and (b) if I do, whether I am doing the thing in the most sensible (for me) way. Occasionally, I am on the fence, particularly if it is something that I use infrequently but do still use. Those are prime candidates for considering whether I can achieve the same thing in a better way. This is particularly true of services where upgrading them is a bit of a pain - I automate as much as I can, but if I have to interfere with something manually, then it is a greater risk, and I am less likely to keep it around. Overall, though, I’ve become a bit more ruthless in taking things down. I have backups, and, if I take down something which I later decide that I want to use, I can always reinstall it and restore the backup. I also go through and remove the DNS entries, and any firewall configuration, and again I have backups of those. I get the benefit of trying new things, but in a more managed manner.

neilzone 2 months ago

I am not notable enough for Wikipedia. Thank goodness

As of today, I am officially not notable enough for Wikipedia. And I am delighted. A while back - perhaps a year, perhaps a bit longer - someone told me that they’d spotted a page about me on Wikipedia. This came as a surprise. There is absolutely no reason why there should be a page about me in any encyclopaedia. Over time, a few more details were added. None of them were, in themselves, inaccurate, but they were few and far between, and so the resulting picture was patchy. Not, fortunately, in a bad way, just in the way which I suspect is inevitable when people try to find sources to cite for information about someone who doesn’t justify that kind of effort. Three different people got in touch with me by email, explaining that they were Wikipedia editors, and that, for a price to be discussed, they would beef up my page. One noted that my page was proposed for deletion, and that they could sort that out, again for a price. I have no wish to have a page about me on Wikipedia. I am not “notable”, and have no wish to be. Today, finally, that page was deleted, which is entirely as it should be.

neilzone 2 months ago

Sustainable Free and open source software... but not like that

Perhaps I am just a grumpy git. Perhaps I am just an ungrateful grumpy git. I want Free / open source software to be sustainable. I am willing to spend money on this. I do spend money on this, when I can. I understand that developers may have different ways of seeking sustainability. I should - I think - be open to different developers coming up with different approaches. But boy am I (still) struggling with Mozilla. I can kind of understand setting Google as the default search engine; at least I can change it to something more appropriate. But shovelling AI, unwanted, into its browser, Firefox? I’ve had to tinker to turn that off, and I am not convinced that the change will survive a new version / release. And showing “sponsored” content, even though I am darned sure that I turned it off? This really irks me. But perhaps I should just shrug my shoulders, and say “Well, I use Firefox, and if this is the price - if this is how Mozilla chooses to seek funding - then I should suck it up, and just change the things I do not like.” Or change to another browser - but then, as I’ve blogged before, I struggle with the notion of using a Firefox fork, since I’d still be dependent on Mozilla.
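For what it is worth, the AI and sponsored-content features can currently be switched off via about:config, or persistently via a user.js file in the profile directory - though, as noted above, I would not bet on these pref names surviving future releases, so treat this as a sketch of the current state rather than a stable recipe:

```javascript
// user.js - Firefox preference overrides (pref names as of recent
// releases; Mozilla may rename or remove them in future versions)
user_pref("browser.ml.chat.enabled", false);  // AI chatbot sidebar
user_pref("browser.newtabpage.activity-stream.showSponsored", false);          // sponsored stories
user_pref("browser.newtabpage.activity-stream.showSponsoredTopSites", false);  // sponsored shortcuts
```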

neilzone 2 months ago

Thoughts on running my own forgejo (hosted git with web stuff) server

I’ve been experimenting with git for a few months now, and I’ve been finding it both useful and enjoyable. It’s nice to learn something new, with a purpose in mind. For keeping dot files available across machines, in circumstances where Nextcloud is not a great option, I’ve been using git on a locked-down, private server. I commit locally, on whatever machine I make the change, and then push to the remote git server. This is just plain git, with no web front end or anything. Since it is just for me, and since I don’t want to expose it to the Internet, this has been fine, and it works well. I also wanted a git server with a web interface, which I can use for public-facing stuff. For that, I went with forgejo. I am always hesitant to say that installing and running something is easy - that is subjective - but, for me, so far, it has been a very pleasant experience. At the moment, I am only using it for hosting some documentation - some sample terms for fedi instances - rather than code. I like it, and it is easy for me to use, pushing to it from a terminal. I am running it in a tiny container on a proxmox server; I might need to beef it up a little but, so far, it is doing pretty well, even in the face of sustained distributed traffic when I post links to it to the fediverse. I was pleased to see so many options for importing from other forges, including GitHub. As it turns out, I have not bothered to move anything from GitHub, as I’ve nothing there that I actually value or which I think would be of use to others. I’ll probably just delete those repos. But I very much like it that the option is there, and simple. Being my own instance, I am the only user of it, and, sadly, forgejo’s federated functionality - which would enable a user on another forgejo instance to make pull requests, raise issues etc. - is not there yet. 
So, for now, it is a bit isolated: it is there, on the web, but no-one else can raise issues and the like, and that feels like a shame for a collaboration platform. Perhaps I just didn’t think this through well enough when I picked a self-hosted instance of forgejo; I might have been better going with GitLab, or the like. I am always nervous about putting private information on a server accessible from the Internet. I have a Nextcloud instance, but it is not accessible from the Internet, even though I could see advantages to that. I lock down my mail server, so that one cannot log into IMAP over the Internet. I don’t expose a webmail interface. So, for now, I have not experimented with private repositories, and I am sticking with the separate git server (mentioned above) for things which I want to keep to myself. Perhaps I worry too much, and perhaps I should have some more faith in both my own sysadmin skills, and also the security of the services themselves. But, for now, separation seems like a better plan.
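The “plain git, no web front end” dotfiles setup described above amounts to little more than a bare repository that each machine pushes to and pulls from. A minimal local sketch, using a filesystem path where, in practice, an ssh remote (something like user@server:dotfiles.git) would appear:

```shell
# Clean up from any previous run (fixed /tmp paths, illustration only)
rm -rf /tmp/dotfiles.git /tmp/dotfiles

# On the server: a bare repository, nothing else
git init --bare /tmp/dotfiles.git

# On a client machine: commit locally, then push to the remote
mkdir /tmp/dotfiles && cd /tmp/dotfiles
git init
git config user.email "me@example.invalid"
git config user.name "Me"
echo 'set -o vi' > .profile
git add .profile && git commit -m "Track .profile"
git remote add origin /tmp/dotfiles.git
git push -u origin HEAD
```

Each additional machine then just clones from the same remote and pushes its own commits; no daemon, web interface, or forge software is involved.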

neilzone 3 months ago

Ofcom's perhaps inadvertent list of porn sites allegedly without age assurance made me smile

I had to smile today when, reading Ofcom’s website about its enforcement activities under the Online Safety Act 2023, I learned about a few porn sites of which I was not previously aware. Given the nature of Ofcom’s investigations, some of these sites are probably not enforcing age assurance for users in the UK. Ofcom’s website is not (of course; it has no reason to be) behind age assurance, so I wonder at what point it becomes a destination of choice for anyone who does not want to, or is unable to, verify their age to access pornography, as the source of a curated list of probably-still-accessible porn sites. It is right and proper that Ofcom is transparent about the service providers subject to investigations or enforcement activity, but all the same, it tickled me. This reminded me of a journal article that I was reading a while back - sadly, I don’t recall the name, or even the journal - about child access to pornography. The gist of the article was that lots of children were aware of, and (to a lesser extent) had accessed, online pornography. Its research methodology consisted of asking children who were 16 or so some questions about sex and pornography, including giving them a list of porn sites and asking with which of them they were familiar. There were some sites on that list of which I was not aware, and I imagine that that was probably true of the child research participants in the study too. I would love to have read the ethics board approval for that one…
