Latest Posts (20 found)
Andre Garzia 5 days ago

Building your own blogging tools is a fun journey

# Building your own blogging tools is a fun journey

I read a very interesting blog post today: ["So I've Been Thinking About Static Site Generators" by PolyWolf](https://wolfgirl.dev/blog/2026-02-23-so-ive-been-thinking-about-static-site-generators/) in which she goes in depth about her quest to create a 🚀BLAZING🔥 fast [static site generator](https://en.wikipedia.org/wiki/Static_site_generator). It was a very good read and I'm amazed at how fast she got things running. The [conversation about the post on Lobste.rs](https://lobste.rs/s/pgh4ss/so_i_ve_been_thinking_about_static_site) is also full of gems. Seeing so many people pouring energy into the specific problem of making SSGs very fast feels to me pretty much like modders getting the utmost performance out of their CPUs or car engines. It is fun to see how they are doing it and how fast they can make clean and incremental builds go.

> No one will ever complain about their SSG being too fast.

As someone who used [a very slow SSG](https://docs.racket-lang.org/pollen/) for years and eventually migrated to [my own homegrown dynamic site](/2025/03/why-i-choose-lua-for-this-blog.html), I understand how frustrating slow site generation can be. In my own case, I decided to go with an old-school dynamic website using 90s tech such as *cgi-bin* scripts in [Lua](https://lua.org). That eliminates the need for rebuilds of the site, as it is generated at runtime.

One criticism I keep hearing is about the scalability of my approach. People say: *"what if one of your posts goes viral and the site crashes?"* Well, that is ok for me cause if I get a cold or flu I crash too, why would I demand of my site something I don't demand of myself? Jokes aside, the problem of scalability can be dealt with by having some heuristic figure out when a post is getting hot and then generating a static version of that post while keeping posts that are not hot dynamic. I'm not worried about it.
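The hot-post idea above could be sketched roughly like this. This is a toy JavaScript illustration, not code from my blog (which is Lua CGI); the function names, threshold, and time window are all invented:

```javascript
// Hypothetical sketch: decide when a dynamic post is "hot" enough that it
// would be worth serving a pre-rendered static copy instead.
function makeHotTracker({ threshold = 100, windowMs = 60_000 } = {}) {
  const hits = new Map(); // path -> timestamps of recent hits

  return function isHot(path, now = Date.now()) {
    // keep only hits inside the sliding window, then record this one
    const recent = (hits.get(path) || []).filter((t) => now - t < windowMs);
    recent.push(now);
    hits.set(path, recent);
    // hot: render the post once to a static file and serve that until it cools
    return recent.length >= threshold;
  };
}
```

A real version would also need to expire the static copy once traffic dies down, but the core is just counting hits per path over a window.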
Instead of devoting my time to the engineering problem of making my SSG fast, I decided to put my energy elsewhere. A point that is often overlooked by many people developing blogging systems is the editing and posting workflow. They'll have really fast SSGs and then let the user figure out how to write the source files using whatever tool they want. Nothing wrong with that, but I want something better than launching $EDITOR to write my posts. In my case, what prevented me from posting more was not how long my SSG took to rebuild my site, but the friction between wanting to post and having the post written. What tools to use, how to handle file uploads, etc. So I began optimising and developing tools to help me with that.

First, I [made a simple posting interface](/2025/01/creating-a-simple-posting-interface.html). This is not a part of the blogging system, it is an independent tool that shares the code base with the rest of the blog (just so I have my own CGI routines available). Internally it uses [micropub](https://www.w3.org/TR/micropub/) to publish. After that, I made it into a Firefox Add-on. The add-on is built for ad-hoc distribution and not shared on the store, it is just for me. Once installed, I get a sidebar that allows me to edit or post.

![Editor](/2026/02/img/c6e6afba-3141-4eca-a0f4-3425f7bea0d8.png)

This is part of making my web browser of choice not only a web browser but a web making tool. I'm integrating all I need to write posts into the browser itself and thus diminishing the distance between browsing the web and making the web. I added features to the add-on to help me quote posts, get addresses as I browse them, and edit my own posts. It is all there right in the browser.

![quoting a post](/2026/02/img/b6d2abc1-c4ae-42c5-931f-d390fc9b793f.png)

Like PolyWolf, I am passionate about my tools and blogging. I think we should take it upon ourselves to build the tools we need if they're not available already (or just for the fun of it).
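For the curious, the Micropub part is quite simple. Per the W3C spec, creating a post is a form-encoded POST with `h=entry` and an OAuth bearer token, and the server replies `201 Created` with a `Location` header pointing at the new post. A minimal sketch (the endpoint and token here are placeholders, and `buildMicropubRequest` is an invented helper, not my actual code):

```javascript
// Build the request for a Micropub "create post" call, as described in
// https://www.w3.org/TR/micropub/ — form-encoded body with h=entry.
function buildMicropubRequest(endpoint, token, content) {
  const body = new URLSearchParams({ h: "entry", content });
  return {
    url: endpoint,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/x-www-form-urlencoded",
      },
      body: body.toString(),
    },
  };
}

// Usage sketch: const req = buildMicropubRequest(url, token, "Hello!");
// fetch(req.url, req.options);
```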
Even though I'm no longer on the SSG bandwagon, I'm deeply interested in blogging and would like to see more people experimenting with building their own tools, especially if their focus is on interesting UX and writing workflows.

Andre Garzia 1 month ago

Three months of Poncho Wonky

# Three Months of Poncho Wonky

Today marks three months since my first release of Poncho Wonky. In this brief post, I want to chat a bit about what I accomplished during this period.

> Poncho Wonky is a fork of Patchwork, which is a [Secure Scuttlebutt](https://ssb.nz) client. You can find the latest release [here](https://github.com/soapdog/patchwork/releases).

The last release of Patchwork is 3.18.1, so Poncho Wonky's first release was versioned as 4.0.0 and, as of this writing, the current version is 4.4.0. **In these three months I released 10 versions of Poncho Wonky** including patches and minor versions. My strategy is that bug fixes or feature alterations get a point release and new features get a minor version bump. A major version bump is in the works if the feature change is considered ground-breaking.

## Updated dependencies

Patchwork's final version was released in 2021 using:

* Electron 11
* Electron Builder 22

I managed to upgrade these dependencies to:

* Electron 39
* Electron Builder 26

That is a 28-version bump on Electron alone. A major upgrade to the core technology powering Poncho Wonky.

## Releases for new architectures

Patchwork binary releases were made for Windows, macOS, and Linux using x64 for all platforms, with Linux also available as arm64. Poncho Wonky has all that plus arm64 for macOS as well.

## Signed release for macOS

Poncho Wonky is the first ever Patchwork release to ~~give in to Apple's racketeering scheme~~ be signed for macOS. That means that macOS ~~hostages~~ users need to jump through fewer hoops to get the app to run compared to Patchwork.

## Support for `blog` posts

Patchwork supports displaying messages of type `blog` — long form text messages — but didn't support writing them.
Poncho Wonky added writing Blog Posts and also added a new page that helps surface Blog Posts.

![Composing a blog post](/2026/01/img/4c80ae63-e36e-4d80-a5fa-f2685ed344b8.jpg)

![A page to help you find cool blog posts](/2026/01/img/35b81705-44b1-4b97-b343-c56861cab290.jpg)

## Support for `bookclub` messages

Book Club messages allow users to post metadata and reviews about books. Think of it as your peer-to-peer Goodreads. You can find new books to read, organise your book collection, and keep track of your reviews.

![Your own personal goodreads](/2026/01/img/d61c76d2-ff22-4beb-8cca-abf571899c2f.jpg)

Few clients supported book club messages (iirc patchfoo, patchfox, patchbay) but Poncho Wonky is the first one that includes fetching book metadata from the web based on ISBN numbers, which makes it a lot easier to add books to the ecosystem.

![Adding a book](/2026/01/img/fd4148cc-1e56-413c-aef6-649db575c98c.jpg)

## Troubleshooting tools

One of the upgraded dependencies introduced some indexing bugs that are hard to track down and don't happen for all users. To help mitigate that, a new page was created with troubleshooting tools.

![Help you fix those fiddly indexes](/2026/01/img/a48ed9a1-8f5e-4221-af26-d27f64bb64a5.jpg)

## Special support for audio attachments

I always loved when people shared their own music or jams on SSB. Recently, I added a feature that enables a small music player on a separate window so you can queue and play music while browsing SSB.

![A separate music player](/2026/01/img/7d50098f-b5d1-4d92-9e24-b45acc16aee4.png)

----

## Things currently in development

### User Scripts Are Coming To Poncho Wonky

A side effect of being at [P2P Basel 2026](https://p2p-basel.org/) is that I get overexcited about this space and start developing stuff that I shouldn't. This time, I decided to do a major feature: adding user scripts to Poncho Wonky.

> User scripts are small morsels of code that you can craft and share with your friends.
If you place those script files in a special folder, they become active inside Poncho Wonky.

## What user scripts enable for the users

You'll be able to make Poncho Wonky yours. Want to add a niche feature to the app? You can do it without the need to recompile Poncho Wonky's source code. Just create a small file, place it in a folder, and restart Poncho Wonky.

> The feature is similar to what was achieved with [Greasemonkey](https://www.greasespot.net/) and other similar solutions in the browser.

User scripts allow each user to tweak Poncho Wonky to make it more suitable to the experience they desire, and also to share it with their friends.

## Is It Safe?

User scripts are written in [Lua](https://lua.org), a language with a long-standing reputation as an embeddable scripting language for applications. The [Lua engine used is written in JS](http://fengari.io/) and has no access to your files, DOM, or any other personal data. It runs in its own environment and can't crash Poncho Wonky even when it goes wrong.

## Why Lua and not JS?

Lua is a language created for this use case. It is very easy to lock down a Lua environment so that the developer only has access to an approved set of APIs and features. JS engines tend to be much larger and carry a lot more complexity with them. Lua is a smaller language, so it is easier to learn than JS, which has a ton of features.

## Can you show us something?

Yes, I can! The reason I am writing this is that I already got a lot of it working. I don't yet have all the API I want to expose, hence why this is an announcement and not a release, but I can show you something indeed.

How does it work? Poncho Wonky creates an empty `custom-scripts/active/` folder in your Application Support folder. That is where you place scripts you want to activate. It also comes with a folder `custom-scripts/samples/` with example code for you to use and adapt.
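To give a feel for the "approved set of APIs" idea: the host hands each script a small frozen object of allowed functions instead of the full runtime. This is an illustrative JavaScript sketch, not Poncho Wonky's actual code, and the API names in it are invented:

```javascript
// Sketch: build the only surface a user script can touch. Anything not
// listed here simply does not exist from the script's point of view.
function makeScriptEnvironment(host) {
  const approved = {
    fetch: host.fetch,                             // network, mediated by host
    openWindowFromAssets: host.openWindowFromAssets, // open helper windows
    log: (...args) => host.log("[user-script]", ...args),
  };
  // freeze so scripts can't swap the approved functions for others
  return Object.freeze(approved);
}
```

The same shape works whether the guest language is Lua via Fengari or anything else: the sandbox is just "you only get what we give you".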
![Add a file to a folder to activate a script](/2026/01/img/6a3fe567-5620-4b3d-a9bb-fa2c04f2ef75.jpg)

### A script to play music from Worm Blossom blog posts

My first test script allows me to play songs from Worm Blossom blog posts that are shared here. Like the audio player, I wanted the script to add a new action button to the post that, when clicked, would open a music player with the song. That button is only displayed for messages that include a link to the [Worm Blossom blog](https://worm-blossom.org/#y2026w3).

![That button only appears for specific messages](/2026/01/img/abcab22f-a4c1-4f13-a4c3-9a4193459ac5.jpg)

Clicking that button opens a music player:

![A barebones music player](/2026/01/img/dcb67426-8b5d-4454-8c58-aba9d705bd2b.jpg)

### That is cool, show us the script

A user script is a Lua table, think of it as analogous to a JS object:

```lua
local contact = {
  name = "Andre",
  surname = "Garzia"
}
```

A table can contain functions; they can appear inline with the table construction or be added later, such as:

```lua
function contact.sayHi()
  print("Hello!")
end
```

So a user script is a table that contains functions. The function names are hooks: Poncho Wonky will look for functions with specific names and, if it finds them, execute them when needed.

### Adding a button to a message

The functions to add a button to a message are `buttonAction` and `action` (function names might change before I release this).

* `buttonAction`: called for each displayed message. It should return true or false, and also a label if it is true (Lua has multiple return values). If calling it returns `true` and a _label_, then a button is added to that message with the returned _label_.
* `action`: called when the user clicks the button.
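On the host side, driving those hooks is straightforward. Here is a rough JavaScript sketch of the dispatch (not Poncho Wonky's real implementation); Lua's multiple return values are modelled as a two-element array:

```javascript
// Sketch: for each loaded script, ask its buttonAction hook whether this
// message should get a button, and collect the label plus a click handler.
function collectButtons(scripts, msg) {
  const buttons = [];
  for (const script of scripts) {
    if (typeof script.buttonAction !== "function") continue;
    const [show, label] = script.buttonAction(msg);
    if (show) buttons.push({ label, onClick: () => script.action(msg) });
  }
  return buttons;
}
```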
Let me show the `buttonAction` function for that script:

```lua
function wbMusic.buttonAction(msg)
  if msg == nil then
    return false, "no message"
  end

  if msg.value.content.type ~= "post" then
    return false, "message is not post"
  end

  local text = msg.value.content.text
  local url = extractWormBlossomURL(text)

  if url then
    return true, "Open in Worm Blossom Player"
  else
    return false, "no blossom url"
  end
end
```

The `action` function for that script is a bit more involved. This is what it does when it runs:

1. Extracts the URL from the SSB post.
2. Fetches that URL.
3. Finds the `<iframe>` music player inside the post.
4. Extracts the `<iframe>` and peeks at the `src` attribute. The song data is in the source attribute as a parameter called `song`.
5. Opens a new Poncho Wonky window with some HTML and JS to assemble a music player able to play those chiptunes and adds the extracted `song` to it.

Be aware that _fetching a URL_ is an asynchronous operation, so in the Lua script, that function needs to happen inside a _co-routine_. Here is the function:

```lua
function wbMusic.action(msg)
  local text = msg.value.content.text
  local url = extractWormBlossomURL(text)

  -- must be wrapped in a coroutine because fetch is async
  local co = coroutine.create(function()
    local content = fetch(url)
    if content then
      local iframe = querySelect(content, "iframe")
      local src = getAttribute(iframe, "src")
      local song = getParam(src, "song")
      openWindowFromAssets("worm-blossom-music/start.js", {
        data = song,
        width = 400,
        height = 300
      })
    end
  end)

  coroutine.resume(co)
end
```

That is it: a whole new music feature that I'm probably the only person interested in. A feature I can have in my Poncho Wonky without actually changing the Poncho Wonky source code. How would you make Poncho Wonky yours? What scripts would you like to build?

---

_I am pretty happy with all the progress in these last three months. Working with Patchwork makes me happy.
If it makes you happy as well, maybe consider [buying me a coffee](https://ko-fi.com/andreshouldbewriting)._

Andre Garzia 3 months ago

My chocolate truffles recipe

# My Brigadeiro Recipe (chocolate truffles)

This is a very traditional recipe from Brazil that is a staple of birthday parties all over the country. The ingredients are:

* 1 tin of condensed milk (395g)
* 1/2 of a cup of cream (100g)
* 1/2 tablespoon of butter (15g)
* 1 and a half tablespoons of cocoa powder (23g)

Mix everything in a pan until well combined. Turn the heat to medium and mix until you can use a spatula to separate the mix and, when it comes back together, it folds from the top like a wave. Be aware that the mixture will rise in the pot before going down again.

If you have Oreos, you can make little bats for Halloween, as seen in the photo below.

![](/2025/11/img/3f29fa74-c6a2-45c0-8332-fa77772c25ab.jpg)

Andre Garzia 3 months ago

Had a good time at Edinburgh Radical Book Fair

# Had a good time at Edinburgh Radical Book Fair

Last weekend I went to [Edinburgh Radical Book Fair](https://lighthousebookshop.com/events/edinburghs-radical-book-fair-2025-ecosystems-of-change) with a friend and had a wonderful time. I was having a crap time before going, and being there and seeing so much hope and energy for change made my spirit much better.

![](/2025/11/img/43688ec6-4ede-4bc2-a92e-ad1e89d8111f.jpg)

It was good to see so many interesting books covering intersectional topics and cutting through so many hard challenges we face as a society. My favourite section was the anarchism corner I found tucked away with so many delightful little books. My mischievous grin could probably be seen from Glasgow as I browsed them all. In the end I grabbed three books.

![](/2025/11/img/f4def92d-430e-41ff-b675-a5f5cfc41dc8.jpg)

I'm very keen on reading them all. I started with _After The Internet_ by Tiziana Terranova and am already in love with it even though I am barely into the book. From the back blurb:

> The internet is no more. If it still exists, it does so only as a residual technology, still effective in the present but less intelligible as such. After nearly two decades and a couple of financial crises, it has become the almost imperceptible background of today’s Corporate Platform Complex (CPC) — a pervasive planetary technological infrastructure that meshes communication with computation. In the essays collected in this book, Tiziana Terranova bears witness to this monstrous transformation.
>
> After the Internet is neither an apocalyptic lamentation nor a melancholic “rise and fall” story of betrayed great expectations. On the contrary, it looks within the folds of the recent past to unfold the potential futurities that the post-digital computational present still entails.
This page was so good I wanted to highlight it all:

> (the internet) It continues to exist, but interstitially, in ways that are almost hardly ever perceptible to those large and powerful entities that have overtaken it. Standards and protocols developed as part of the project of creating the internet as a public and open network still operate, but they are increasingly buried under a thick layer of corporate ones. The internet’s own native subcultures, such as those that formed in the 1980s and 1990s, have gone underground, assembling in the so-called dark web, in IRC chats, in some forums, in pirate file-sharing networks, in websites with no social plugins, in mesh networks and wikis, and maybe also in the chaotic informational milieus of some secure, encrypted, open source messaging apps.
>
> Reaching out with their data-mining tentacles, the new owners of the digital world have, as Marxists might say, subsumed the internet, that is, transmuted, encompassed, incorporated it, but not necessarily beaten or dissolved it. As a subsumed entity, the internet is not so much dead as undead, a ghostly presence haunting the Corporate Platform Complex with the specters of past hopes and potentials. Thus, whereby the CPC displays an increasing concentration of control, the specter of the internet persists as a much more muted, but perceptible aspiration towards an unprecedented distribution of the power to know, understand, coordinate and decide.

Makes me want to dive again into decentralisation technologies and do some good FOSS work.

Andre Garzia 5 months ago

Linkgraphs are fun

As far as I am aware, link graphs were pioneered by Artemis, a calm web reader; you can read about their take on it as Artemis Link Graph. In their own words:

> Artemis Link Graph is a web extension that tells you when the page you are viewing has been linked to in a post from a website you follow on Artemis. You can use this extension to discover what websites you enjoy are saying about different web pages.
>
> — Source: Artemis Link Graph | Artemis

It is a very neat feature. It works by surfacing posts from sites you follow that mention the page you're currently reading. In an age of algorithmic timelines trying to herd you into content that improves their metrics and the profit for their shareholders, having such organic connections is a blessing. I mentioned in The Web Should Be A Conversation that I want to go back to a human-centric web instead of this agentic bullshit we've been forced into. Linkgraphs seem a wonderful tool in that direction cause they provide immediate, actionable, interesting connections between what interests you and those you subscribed to.

Artemis is a wonderful calm web reader. If you're in the mood to try a different web reading / blog reading experience, I believe you should give it a try. I find it quite cute that two calm web readers with cat mascots — Artemis and BlogCat — are both developed in Scotland; must be something in the water.

Inspired by Artemis, I decided to implement the same feature in BlogCat. Still, credit is due and they did it first. In BlogCat, the linkgraph surfaces as a page action. If you're viewing a page that has been linked in a post from a site you follow, you'll see the little cat popup. If you click the popup, you'll see the list of posts that linked to the page you're viewing. I would love to see other blogging clients (feed readers? Can we settle on a name for these types of apps?) implementing such a feature.
I followed Artemis very closely but didn't use any of their code, cause the underlying implementation of data storage and general workflow is very different between the two clients. Artemis is a SaaS that exposes the linkgraph as JSON that is then read by their extension and saved to session storage. BlogCat is a WebExtension, a full web reader inside Firefox. When you open the reader view, BlogCat fetches the feeds it needs and then calculates the linkgraph by extracting all the links from all the posts and saving them to session storage. The linkgraph is only rebuilt when you open the reader view (BlogCat doesn't do background fetching of feeds).
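The core of such a linkgraph build can be sketched in a few lines of JavaScript: walk every post from every followed feed, pull out its links, and index posts by linked URL. This is an illustration, not BlogCat's actual code; the naive regex extraction here stands in for whatever real link extraction a client would use:

```javascript
// Sketch: build a map from linked URL -> posts that link to it.
function buildLinkgraph(posts) {
  const graph = new Map();
  const linkPattern = /https?:\/\/[^\s"'<>)]+/g;
  for (const post of posts) {
    for (const url of post.html.match(linkPattern) || []) {
      if (!graph.has(url)) graph.set(url, []);
      graph.get(url).push(post);
    }
  }
  return graph;
}

// Usage sketch: when viewing a page, graph.get(currentPageURL) gives the
// posts from followed sites that mention it (undefined means no mentions).
```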

Andre Garzia 6 months ago

Pão de queijo recipe

Brazilian cheese bread is a perennial snack all around the country. They're delicious and made even better if you have some dip or filling like butter or requeijão.

Ingredients:

* 100ml of oil
* 150ml of water
* 150ml of milk
* 400g of cassava starch (sour kind, also known as polvilho azedo in Portuguese)
* 10g of salt
* 200g of cheese (I use cheddar in the UK)

Instructions:

1. Mix the starch and salt in a large bowl.
2. Mix the oil, water, and milk and heat them almost to the boiling point. Be careful cause boiling milk degrades and bubbles all over your stove. Stop before it reaches that point.
3. Slowly add the hot mixture to the starch while mixing. You want to get all of the starch wet and mixed. Be careful to not allow gelification to take place; you don't want gel starch. Just keep mixing and kneading it with a spatula as it cools down. It will become dough-like.
4. Once it is cold enough that it won't cook the eggs, put them in and mix until it becomes a dough and doesn't stick too much.
5. Add the cheese and mix until it feels right and there's cheese bits throughout the dough.
6. Preheat the oven to 160ºC fan-assisted.
7. Make little balls; I like making them roughly 28g each but don't be fussed about it. Put them an inch or two apart on a tray with baking paper.
8. Bake them until they are a bit golden, probably about 20 minutes or a bit more.

Andre Garzia 8 months ago

Making massage bars

I've been making massage bars and soap for a while. They're a fun activity I can do while listening to audiobooks, and the end result makes me super happy. I usually end up with more than I can actually use, so most of the bars become gifts for friends.

This is the recipe I'm using; I got it from this amazing video.

* 70g (35%) beeswax
* 50g (25%) coconut oil
* 50g (25%) cocoa butter
* 30g (15%) shea butter
* 10 drops of peppermint oil
* 10 drops of lavender oil
* 2g of menthol crystals

Andre Garzia 9 months ago

Experimenting with no-build Web Applications

Some years ago I built an eBook generator called Little Webby Press. It would pick a folder with Markdown files and turn them into an EPUB3 non-DRM eBook file and also a zip containing a static website for your book. The cool thing about Little Webby Press is that all the workflow happens on the client-side. There's no back end, no database, no user accounts. You drag and drop a folder, press a button, and it works. I used it to make some niche technical books in the past and, since I don't keep analytics, I have no idea how many people have used it.

I spent recent years without writing any book, but recently, after developing BlogCat and writing The Web Should Be A Conversation, I have an itch to write a book about WebExtensions. Of course, I'd like to use my own eBook generator to write this book, and that is how I nerd-sniped myself into making a new eBook generator.

There was nothing wrong with the old version of Little Webby Press. It was built by leveraging the following stack:

* Svelte 3 for the UI.
* BrowserFS to have a browser-side filesystem so I could leverage NodeJS file handling routines.
* Handlebars for the templates used by both eBook and website generation.
* JsZip to create the EPUB and the zip containing the website.
* Rollup and a gazillion plugins.
* Tailwind and Daisy for CSS.

All in all a very common setup for like five years ago. The problem is that I don't like the current version of Svelte and I really dislike the current web development ecosystem with its reliance on transpilation and too many frameworks and dependencies and build steps. Feels like we're developing in a fantasy version of JS that no engine is really able to run. I hate that. So instead of writing the book I want to write, I decided to rewrite Little Webby Press in a stack that is more suitable to my own enjoyment.

My main objective with the new version was to eliminate the build step. I wanted a no-build system in which the JS I write is exactly what runs on the browser. Instead of implementing everything from scratch, I kept many of the NodeJS-based dependencies but used an importMap to get them from JsDelivr. This was done so I could reuse the majority of the book and website generation code. Don't want to read this blog post? You can go check the source code.

I switched from Svelte to Mithril — which is my favourite JS lib to make UIs — and completely removed BrowserFS. Instead I created my own file handling routines, which have the same arguments as the NodeJS ones but different names. Instead of a full-blown filesystem implementation, it is just an object where the keys are the paths and the property content is the file content. Can't be any more naive than that. I also removed all the CSS stuff and just replaced it with Pico CSS. The code became a lot smaller than the previous version and what is written is exactly what the browser runs. There's no JSX, no Svelte templates, no JS features that the browser doesn't understand.

The repository is the actual app that is run; Github Pages is just set to serve from the root of the repo. Any change there is a new deployment, no Github Actions, no scripts to run. It is so refreshing. Press play to see it in action.

It is a simple Mithril application with a router and three routes: one for the book generation app, one for the documentation, and one for a simple about page. The book generation needs a source folder. You can either drag and drop a folder or use the Load manuscript folder button. That will trigger a routine that reads all files in that folder and creates a gigantic object with all the paths and contents.

Both the ebook and the website generation functions are massive waterfalls (don't we all think that waterfalls are beautiful?) that make multiple loops over all the files, assembling a resulting file tree by copying things around and using Handlebars to generate HTML and other files. Once the resulting file tree is ready, it is added to a zip file using JsZip and that file is downloaded from the browser into the downloads folder. EPUBs are zip files, so both the ebook and the website generation are almost the same waterfall but using different templates. The output can be configured with a TOML control file that has a ton of features and switches.

Both documentation books for Little Webby Press were generated with Little Webby Press; check out Getting Started and Book Configuration Specification. The templates used are in a zip file that is loaded when the application loads. The contents of that zip file are placed in the same enormous object with a prefix for the prop name (remember, we're mimicking a filesystem). Because the template is loaded via XHR, someone wanting to generate a book and a website can actually use their own template as well, by creating a zip with the correct contents and placing it inside the source folder.

Due to not having a build system, you can run this on your machine by simply downloading the files from the repository and using your favourite web server. It is also easy to tweak for your own use in case you need it to work a bit differently in a way that the book configuration and template system can't handle.

What was unexpected was the massive gain in performance. The previous Little Webby Press would take about two and a half seconds to create the EPUB for Moby Dick; the new version took 125 milliseconds. To assemble both the eBook and website, the old version took 4.7 seconds and the new version takes less than half a second. It is so fast right now that I even changed the workflow to autodownload the files, while in the previous version there were two buttons, one to generate and one to download, cause the process took so long. I'm not sure why the old version took so long — especially considering the code is basically unchanged — I can only assume that the BrowserFS code is slower cause it is doing a lot more than simply setting a prop on an object.

Going forward, I think I'm gonna focus on no-build webapps only. The most important aspect of software I'm making and distributing for free is that I need to enjoy working with it, and this ticks all my boxes.
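The "naive filesystem" replacement for BrowserFS described above is essentially a plain object keyed by path. A minimal sketch of the idea (the function names here are invented for illustration; the real routines mirror NodeJS `fs` argument shapes under different names):

```javascript
// Sketch: the whole "filesystem" is one object; paths are keys,
// file contents are values. Setting a prop is the entire write path,
// which is why it is so much faster than a full FS emulation.
function makeMemFs() {
  const files = {};
  return {
    putFile(path, content) { files[path] = content; },
    getFile(path) {
      if (!(path in files)) throw new Error(`no such file: ${path}`);
      return files[path];
    },
    listFiles(prefix = "") {
      return Object.keys(files).filter((p) => p.startsWith(prefix));
    },
  };
}
```

Directory listing becomes a key-prefix filter, which is all a book-generation waterfall needs to walk a manuscript folder.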

Andre Garzia 9 months ago

Some photos taken with Lomochrome Metropolis

Got myself a new camera: a 35mm half-frame Olympus Pen EES-2 with zone focusing. My first roll with this camera has been the Lomochrome Metropolis shot at ASA 400. That is an odd film in my opinion. I enjoy films that do color shifts and make the photogram more interesting, but I can't say I'm fully onboard with that gritty feel. I enjoyed shooting with it but don't think I'm gonna get it again; maybe some time long in the future, but for sure I'll experiment with other film stocks a lot before I circle back to this one.

Andre Garzia 10 months ago

RSS doesn't necessarily mean firehose

Was reading Hacker News today and saw this very interesting post, Reading RSS content is a skilled activity, which raises some really interesting points about feed reading and how much effort one should put into making their experience fit what they want. The discussion on Hacker News is full of people commenting on how their feed reading experience is either overwhelming or stressful cause of the firehose of posts they see in their reader app.

Many feed reading apps and SaaS adopt the river of news paradigm, in which news items are presented in reverse chronological order in an ever-growing list. People often use the moniker firehose for this type of UX approach, in which the user is flooded with content. I suspect that this became the dominant approach for feed readers as they were closely modeled on popular email clients way back when. These apps usually adopt a two or three panel layout, with the first being a navigation panel allowing you to filter the list of posts, the second being the list of posts, and the third being a larger one with the content. It is not hard to understand how some people might feel overwhelmed by this list of posts growing while you read. Let's be fair and honest about it: some users enjoy that approach and want the firehose. Maybe even the majority, but that is not for everyone and some users would prefer a calmer experience.

Feed reading doesn't need to be a firehose. A calmer reading experience is possible. The first app I've seen that explicitly focuses on an experience that is not anxiety-inducing is FraidyCat. From the theme to the overall UX, FraidyCat aims to make sure no publisher can take over the feed by publishing too much. It also puts fetching under the user's control by letting the user decide how often to fetch a given feed. Inspired by FraidyCat and others, a friend built Rad Reader, which provides an even more minimal experience: just a list of recently updated websites and their last three posts.
Clicking on them opens the post in your default browser. Very simple and yet so powerful. Inspired by all of that and some software I created in the past (I have a thing for blogging tools), I made BlogCat. It provides an experience similar to FraidyCat and Rad Reader in which there is no firehose: the user selects when feeds are retrieved and there is no background fetching or notifications. I'm pretty sure there are other apps with similar approaches and even more novel ones. RSS is just the data; how you approach it doesn't need to be a firehose or an email-client lookalike. There are enough apps out there that you can find one that matches what you want, and if you can't, maybe you can develop it :3

Andre Garzia 10 months ago

# Got my first turquoise-tinted film back from the lab

Here are some photos I took with my Pentax Auto 110, which uses 110 film cartridges. For these photos I used a turquoise-tinted film by Lomography. I haven't edited these photos at all; this is how they come out of the camera. I'm loving the results. Might even print some, and will for sure experiment with other tinted films.

Andre Garzia 10 months ago

# The Mozilla I want focuses on people and not AI

I used to be a quite active Mozilla volunteer. I'd been part of Mozilla Reps, Mozilla TechSpeakers, and even contracted with the Foundation for a bit. It was a huge part of my life and identity for a while, and whether that was a healthy thing is up for debate, but it was true. What made me devote so much time and effort to Mozilla was a set of common values and practices that aligned with what I want for the Web. Basically, I saw in Mozilla a community of like-minded individuals who would foster a people-centric Web. In a world of gigantic tech conglomerates like Meta and Google, Mozilla was a breath of fresh air. I am fully aware that as I write these words, many people will be furiously typing their own Mozilla criticisms. I do not share their vitriol. Let's get some stuff out of the way:

- Mozilla does with very little budget what other companies do with much larger sums.
- CEOs for large foundations always get paid in the millions; if you don't pay competitive salaries, you don't get someone with that kind of experience and contacts.
- Many people don't realise the challenges of fundraising, grantwriting, and keeping a foundation afloat.
- Mozilla didn't turn evil; Google doesn't send secret letters telling Mozilla what to do.

Now that we got the usual FUD out of the way, let me resume my blog post. Mozilla used to have a vibrant contributor community: you'd have an event anywhere and there would be Mozilla supporters and volunteers there. It was quite fond of grassroots efforts and it put people first. Mozilla Foundation initiatives such as Teach The Web and Mozilla Corporation events such as App Days were all focused on human connections, people over profit, and fostering a web ecosystem where everyone had a voice and everyone could participate (for not absolute quantities of everyone, cause the world is not perfect). Moving on to the current hellscape of 2025. I barely see any movement in terms of volunteers in events or community management. Whatever is happening now is but a shadow of the Mozilla Reps days of yore. Without a community, you don't have word of mouth, and you will lose goodwill over time as you distance yourself from your userbase. Mozilla used to be a lot closer to its userbase than it is today. We sometimes forget that for all practical effects Mozilla is inserted in the same tech bro ecosystem as all the other large Silicon Valley companies. It is affected by the same stupid trends. FAANG is a shoal that pursues trends together. If Meta, Google, and others start moving in a direction, the rest of the shoal moves too. Mozilla is in the shoal not because it is there as an entity but because many of the decision makers are part of that shoal and they drag Mozilla with them. Volunteers used to wear shirts and cute wrist bracelets saying "Internet by the people, for the people". It used to be *for the people*. It doesn't seem to be the case anymore. The enshittification of the Internet is real. Those massive companies are trying to extract every single iota of profit from their users through whatever predatory means they can. From the massive book theft of Meta to all those LLM generative AI companies stealing artistic data from creatives' portfolios without any regard for the desires of said creatives. It seems like every company under the sun is trying to push generative AI, LLM agents, and other bullshit. Heck, some horrible web SaaS that I need to use for work got three AI features on the main interface and no way to disable them. Gen AI and AI agents are eating the web and software world whether you want it or not. Recently, I lost hours doing coursework online just to realise later that the whole course was an AI hallucination and that the features I was trying to learn were never real and were never part of the software I was using. Someone just published a full course online, with source code examples, for a plugin architecture that doesn't actually exist. That is how bad shit is. Back to Mozilla: I think Mozilla dropped the ball. It was in a unique position to take a stand against this AI-centric Web and go back to putting humans front and centre of the internet. It could take a stand against generative AI. It could have taken a stand against hallucinating agentic interfaces. It could have done all that simply by promoting and pushing features that reward a web made by humans. It could go back to fostering user groups and human connections between volunteers, drawing itself closer to its userbase again. It could bring back RSS, feed reading, and try to promote ways in Firefox to subscribe and keep track of humans you like, thus pushing against algorithmic timelines. It could actually make a choice and promote decentralised platforms and integrate decentralisation properly in the browser, like we dreamed ten years ago. Mozilla should be pushing towards blogging, making it simpler for people to put their dreams, voice, wants, and stories online. It should be making tools that make it easier to host and discover web pages. Not by using AI to figure shit out, but by making sure that it is easy to find your friends, to find your people, and to share with them. I want a Mozilla that champions the small web. One that pushes for decentralisation and takes a stand against massive algorithmic-led social networks. A Firefox that understands and helps me side-step the enshittification of the Web. A browser that is a user agent, not a Silicon Valley serf that sometimes behaves like a brat. I don't want Mozilla AI; let another foundation take care of that. I want a Mozilla Labs pushing the boundaries of the IndieWeb, one that I can trust not to be part of that cursed shoal.

Andre Garzia 11 months ago

# Think I implemented a unique feature in my feed reader

I went through a burnout and spent the last three or so years without working on a development project of my own; then something clicked last month and I started enjoying it again. The result of that was a new add-on for Firefox called BlogCat, a feed reader with some blog posting features as well. It is quite neat, and even though I am the primary audience for it (it is the reader I want), it has a few fans already. The problem is that it is basically done. I'm enjoying working on it, but I've kinda done all the big features I wanted already, which is when I started thinking about novelty features. As a WebExtension, BlogCat does not work on mobile. It could work on Firefox for Android with some tweaks, but I'm using iOS so I wouldn't even have a proper way to test it. For the moment, I'm using another reader on my iPhone and iPad, the venerated NetNewsWire. It is a good app; it has always been a good app. And that is when a problem surfaced on my radar: how to keep the subscription list in sync between different feed readers? I will often subscribe to new sites while browsing the web on my desktop; BlogCat's page action button makes that convenient. Occasionally, I'll subscribe to a site while browsing on my phone or tablet, but that is a bit more rare. That is when I implemented OPML matching. Basically, it makes sure you're subscribed to the same websites in multiple apps by comparing BlogCat's own subscription list to an OPML file exported from another app. After comparing both lists, it lets you subscribe to the missing websites in BlogCat, and it generates another OPML file, containing the sites the other app is missing, that can be imported into that app. That's it, a simple feature but super convenient. I'm not sure any other feed reader has the same feature, maybe they do, but I think it is something other apps should implement. I'm super happy with this feature. It is simple and easy to use and makes my life much easier when using multiple apps.
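The matching itself boils down to a two-way set difference over feed URLs. Here is a rough sketch of the idea in Javascript (illustrative only: this is not BlogCat's actual code, and the function names are mine; a real implementation would use a proper XML parser instead of a regex):

```javascript
// Sketch of two-way OPML matching: given two OPML documents, find which
// feeds each side is missing. Feed URLs are pulled from the xmlUrl
// attributes of <outline> entries.
function feedUrls(opml) {
  const urls = new Set();
  for (const match of opml.matchAll(/xmlUrl="([^"]+)"/g)) {
    urls.add(match[1]);
  }
  return urls;
}

function matchOpml(opmlA, opmlB) {
  const a = feedUrls(opmlA);
  const b = feedUrls(opmlB);
  return {
    missingInA: [...b].filter((url) => !a.has(url)), // subscribe to these in app A
    missingInB: [...a].filter((url) => !b.has(url)), // export these for app B
  };
}
```

From the two "missing" lists, one app can offer subscribe buttons and the other can receive a freshly generated OPML file.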
I'm wondering what other cute niche features I'll dream about next.

Andre Garzia 11 months ago

# Why I choose Lua for this blog

This blog used to run on a stack based on Racket using Pollen and lots of hacks on top of it. At some point I realised that my setup was working against me. The moving parts and workflow I created added too much friction to keep my blog active. That happened mostly because it was a static generator trying to behave as if it were a dynamic website with an editing interface. That can be done really well — cue Grav CMS — but that was not the case for me. Once I decided to rewrite this blog as a simpler system, I faced the dilemma of what stack to choose. The obvious choice for me would be Javascript; it is the language I use most often and one that I am quite comfortable with. Still, I don't think it is a wise choice for the kind of blog I want to maintain. Talking to some friends recently, I noticed that many people I know who have implemented their own blogging systems face many challenges keeping them running over the years. Not because it is hard to keep software running, but because their stack of choice is moving faster than their codebase. This problem is especially prevalent in the Javascript world. It is almost a crime that JS as understood by the browser is this beautiful language with extreme backwards compatibility, while JS as understood and used by the current tooling and workflows is this mess moving at lightspeed. Let me unpack that for a bit. You can open a web page from 1995 in your browser of choice and it will just work, because browser vendors try really hard to make sure they don't break the web. The developers who built the whole ecosystem of NodeJS, NPM, and all those libraries and frameworks don't share the same ethos. They all make a big case of semantic versioning and thus being able to handle breaking changes, but they have breaking changes all the time. You'd be hard-pressed to actually run some JS code from ten years ago based on NodeJS and NPM.
There is a big chance that dependencies might be gone, broken, or incompatible with the current NodeJS. I know this sounds like FUD, and that for many projects, maybe even most projects, that will not be the case. But I've heard from many people that keeping their blogging systems up to date requires a lot more work than they would like to do, and if they don't, then they're screwed. That is also true of other languages, even though many of them move at a slower speed. A friend recently complained about a blogging system he implemented that requires Ruby 2.0 and how much keeping that running sucks. I want a simpler blogging system; one that requires minimal changes over time. Lua is a wonderful and nimble language that is often misunderstood. One characteristic that I love about it is that it evolves very slowly. Lua 5.1 was introduced in 2006; Lua 5.4, the current version, had its initial release in 2020. Yes, there are point releases in between, but you can see how much slower it moves when compared to JS. The differences between Lua 5.1 and Lua 5.4 are minimal when compared with how much other languages changed in the same time period. Lua only requires a C89 compiler to bootstrap itself. It is very easy to make Lua work, and even easier to make it interface with something. JS is a lot larger than Lua; there is more to understand and more to remember. My blog needs are very simple and Lua can handle them with ease. This is an old-school blog. It uses cgi-bin — aka the Common Gateway Interface — scripts to run. It is a dynamic website with a SQLite database holding its data. When you open a page, it fetches the data from the database and assembles the HTML to send to the browser using Mustache templates. One process per request. Like the old days. You might argue that if I went with NodeJS, I'd be able to serve more requests using fewer resources. That is true. I don't need to serve that many requests though.
My peak access was a couple of years ago with 50k visitors in a week; even my old Racket blog could handle that fine. The Lua one should handle it too; and if it breaks, it breaks. I'm a flawed human being, my code can be flawed too, we're in this together, holding hands. Your blog is your place to experiment and program how you want it. You can drop the JS fatigue, you can drop your fancy Haskell types, you can just do whatever you find fun and keep going (and that includes JS and Haskell if that's your thing. You do you). Cause I'm using Lua, I don't have as many libraries and frameworks available to me as JS people have, but I still have quite a large collection via Luarocks. I try not to add many dependencies to my blog. At the moment there are about ten, and that is mostly because Lua is a batteries-not-included language, so you start from a minimal core and build things up to suit your needs. For a lot of things I went with the questionable choice of implementing them myself. I got my own little CGI library. It is 200 lines long and does the bare minimum to make this blog work. I got my own little libraries for many things. Micropub and IndieAuth were all implemented by hand. At the moment I'm ~~despairing~~ ~~frustrated~~ having a lot of fun implementing WebMentions. Doing the Microformats2 ~~exorcism~~ extraction on my own is teaching me a lot of things. What I want to say is that by choosing a small language that moves very slowly, and very few dependencies, I can keep all of my blogging system in my head. I can make sure it will run without too much change for the next ten or twenty years. Lua is a lego set, a toolkit; it adapts to you and your needs. I don't need to keep chasing the new shiny or the latest framework du jour. I can focus on making the features I want and actually understanding how they work.
Instead of installing a single dependency in another language and having it pull in a hundred other small dependencies, all of which were transpiled into something the engine understands, to the point that understanding how all the pieces work and fit together takes more time than learning a new language, I decided to keep things simple. I got 29 Luarocks installed here, and that is for all my Lua projects on this machine: my blog, my game development, my own scripts for my day job. Not even half of those are for my blog. I often see wisdom on websites such as Hacker News and Lobsters around the idea of "choosing boring" because it is proven, safe, and easier to maintain. I think "boring" is not necessarily applicable to my case: I don't find Lua boring at all, but everything those blog posts say about that mindset applies to my own choices here. Next time you're building your own blogging software, consider for a bit how long you want to maintain it. I first started blogging on macOS 8 in 2001. I chose badly many times, and in the end couldn't keep my content moving forward in time with me as the software I used or created became impossible to run. The last two changes, from JS to Racket and from Racket to Lua, have been a lot safer, and I managed to carry all my content forward into increasingly simpler setups and workflows. My blogging system is not becoming more complex over the years; it is becoming smaller, because with each change I select a stack that is more nimble and smaller than the one I had before. I don't think I can go smaller than Lua though. By small I mean:

- A language I can fully understand and keep in my head.
- A language whose engine I know how to build, and can build if needed.
- An engine that requires very few resources and is easy to interface with third-party libraries in native code.

I chose Lua because of all that, and I'm happy with it and hope this engine will see me through the next ten or so years.

Andre Garzia 12 months ago

# The Web Should Be A Conversation

For a very long time, I've defended that the Web should be a conversation, a two-way street instead of a chute just pushing content into us. The Web is the only mass media we currently have where most people can have a voice. I'm not saying all these voices have the same loudness, nor that every single person on our beautiful planet and space stations can actually post to the Web, just that it is the one place where everyone has the potential to be a part of it. Contrast it with streaming services, radio, or even the traditional publishing industry, and you'll see that a person alone with an idea has a lot more obstacles in their way than when just starting a blog. For the last couple of years, there has been a colossal push by Silicon Valley companies towards generative AI. Not only are bots going crazy gobbling all the content they can see, regardless of whether they have the rights to do so, but content farms have been pushing drivel generated by such machines onto the wider Web. I have seen a horrible decline in the quality of my search results, and the social platforms that I'm a part of — the ones with algorithmic timelines such as Instagram and YouTube — have been pushing terrible content towards me, the kind that tries to get a rise out of you. They do this to "toxically foster" engagement: trying to get you so mad that you dive deeper into either an echo chamber or a flame war. The enshittification of the Web is real, but it is happening at a surface level. All the content you love and want is still there. It is just harder to discover, cause FAANG companies got a nuclear-powered shit firehose spraying bullshit all over the place. There are many ways to fight this, and in this blog post I'll outline what I am doing and try to convince you to do the same. Yes, this post has an agenda; a biased human wrote it. TL;DR: We need to get back into blogging. We need to put care and effort into the Blogosphere.
A human-centric Web, in my own opinion, is one that is made by people to be browsed by people. The fine folks at the IndieWeb have been hammering at this for a very long time. On social networks such as Facebook or YouTube, you don't own your platform. You're just feeding a machine that will decide whether or not to show your content to people, depending on how much their shareholders can make out of your work and passion.

> **Your content is yours** — When you post something on the web, it should belong to you, not a corporation. Too many companies have gone out of business and lost all of their users' data. By joining the IndieWeb, your content stays yours and in your control.
>
> **You are better connected** — Your articles and status messages can be distributed to any platform, not just one, allowing you to engage with everyone. Replies and likes on other services can come back to your site so they're all in one place.
>
> **You are in control** — You can post anything you want, in any format you want, with no one monitoring you. In addition, you share simple readable links such as example.com/ideas. These links are permanent and will always work.
>
> — Source: IndieWeb

I'm not advocating for you to stop using these bad social networks. You do whatever you want to do. I'm urging you to also own your own little corner of the Web by making a little blog. What will you post on it? Well, whatever you want. The same stuff you post elsewhere. A blog doesn't need to be anything more complicated than your random scribblings and things you want to share with the world. I know there are many people who treat it as a portfolio to highlight their best self and promote themselves; if that is you too, go forward and do it! If that is not you, you can still have a blog and have fun. There are thousands of ways to start a blog; at the end of this post I list some that I think are a good way to go (and those are just some ways, there are many more). When you start your own blog, you're joining the conversation.
You don't need the blessing of a social network to post your own content online. You certainly don't need to play their algorithm game. Join the conversation as you are, not as these companies want you to be. The Web becomes better when you are your authentic self online. Post about all the things that interest you. It doesn't matter if you're mixing food recipes with development tips. You contain multitudes. Share the blog posts and content creators that you like. Talk about your shared passions on your blog. Create connections. The way to avoid doomscrolling and horrible algorithmic timelines is to curate your own feed subscriptions. Instead of relying on social networks and search engines to surface content for you, you can subscribe to the websites you want to check often. Many websites offer feeds in RSS or Atom formats, and you can use a feed reader to keep track of them. There are many feed readers out there (heck, even I made one, more about it later); I point to some cool ones at the end of this post. Once you're in control of your own feed, you step away from algorithmic timelines. You can use feed readers to subscribe not only to blogs, but to your favourite creators on YouTube and other platforms too. If the website you want to subscribe to does not offer a feed, check out services like rss.app and others to try to convert it into a feed you can use in your feed reader of choice. With time, you'll collect many subscriptions and your Web experience will be filled with people instead of bots. Use OPML exporting and importing in your feed reader to share interesting blogs with your friends and readers. Word of mouth and grassroots connections between people in the blogosphere is how we step out of this shit. Learn a bit of HTML to add a blogroll link to your template. Sharing is caring. As I mentioned before, I have been thinking about this for a long time. I suspect I might have created one of the first blogging clients on MacOS 8 (yeah, the screenshot is from MacOS 9).
I have no idea how many times I have implemented a feed reader, a blogging client, or a little blogging CMS. Even this blog you're reading right now is a homegrown Lua-based blogging CMS I made in an afternoon. BlogCat is my latest experiment. It is an add-on for Firefox that adds blogging features to the browser. It aims to reduce the friction between blogging and Web browsing by making weblogs a first-class citizen inside your user agent. You can subscribe to websites and import and export OPML, all from inside the browser. You can have a calm experience checking the latest posts from the websites you follow. Being a part of the conversation is also easy, cause BlogCat supports posting to Micropub-enabled sites and also microblogging to Bluesky and Mastodon. It uses a handy sidebar so you can compose your post while browsing the web. I've been using it for a couple of weeks now and am enjoying it a lot. Maybe you will enjoy it too. Anyway, this is not a post about BlogCat, but this post is what originally inspired BlogCat. As I drafted this post weeks ago and mused about the Web I want and the features I want in Web Browsers, I realised I knew how to make them. Instead of simply shouting about it, I decided to build it myself. You too can be a part of the conversation. You too can help build the Web you want. Let's walk away from the enshittification of the Web by linking hands across the blogosphere.

Some ways to start a blog:

- Micro.Blog: A simple and powerful blogging platform by people who actually love blogs. You need a subscription for it, but it can be as cheap as one buck per month.
- Jekyll using Github Pages: If you're a developer and already know a bit about Git, you can quickly spin up a blog using Jekyll and Github Pages. That allows you to start a blog for free.
- Wordpress: It pains me to write this one. I don't like Wordpress, but I understand it is an easy way to start blogging for free.
- Blogger: Blogger still exists! A simple way to create a blog.

Some feed readers:

- Feedly: A SaaS that is liked by many. Create an account and subscribe to your blogs from any Web device you got.
- NetNewsWire: Polished macOS app that has been the gold standard for feed readers for more than a decade. It is FOSS.
- Akregator: From our friends at KDE, a FOSS desktop feed reader for Linux and Windows.
- Miniflux: A minimalist feed reader. You can join their SaaS or self-host it.
- Rad Reader: A minimalist desktop reader for macOS, Linux, and Windows.
- BlogCat: Yep, I made this. More about this later. It is an add-on for Firefox that adds blogging features to the browser.
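As a technical aside, the feed autodiscovery that readers and browser add-ons rely on is simple: sites advertise their feeds through `<link rel="alternate">` tags in the page head. A rough Javascript sketch of the idea (regex-based for brevity; real code would use proper DOM parsing, and none of this is taken from any particular app):

```javascript
// Discover feed URLs advertised in a page's markup via tags such as
// <link rel="alternate" type="application/rss+xml" href="...">.
function discoverFeeds(html) {
  const feeds = [];
  const linkTags = html.match(/<link\b[^>]*>/gi) || [];
  for (const tag of linkTags) {
    if (!/rel=["']alternate["']/i.test(tag)) continue;
    if (!/type=["']application\/(rss|atom)\+xml["']/i.test(tag)) continue;
    const href = tag.match(/href=["']([^"']+)["']/i);
    if (href) feeds.push(href[1]); // may be relative to the page URL
  }
  return feeds;
}
```

This is all a feed reader needs to turn "paste a site's address" into "subscribe to its feed".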

Andre Garzia 1 year ago

# Cool RPG links from this week

Here are some RPG links that made for a good read (or watch) this week:

- DnD's lead designer explains how 5e fell apart - YouTube
- GROGNARDIA: Retrospective: Traveller: The New Era
- The Palace of Moliades | Dyson's Dodecahedron

Andre Garzia 1 year ago

# Retaking The Web Browser, One Small Step At A Time

Browsing the web is a 3D tug-of-war between developers, companies, and users. These forces were never balanced, but we have reached quite a low point for users in the last decade. Our beloved web browsers' feature roadmaps have catered more to web developers and the companies behind each browser project than to the users themselves. We used to call those apps User Agents, but they have been less of an agent on behalf of users these days. Removal of beloved features, questionable specs, creating friction in the name of security: it is death by a thousand paper cuts for users, with the browser turning into an appliance for running third-party apps. What about the user's wants and desires? Well, all that is left for the user is to rebel and take the web experience back into their own hands, one small step at a time. In this brief post, I will show you how I am slowly making my browser of choice more suitable for the experience I want by changing little things and making little interventions. TL;DR: I'm gonna gush about bookmarklets and web extensions. This post is a little essay on how I'm using them in my day-to-day browsing to make the browser experience my own. Wikipedia defines bookmarklets as:

> A bookmarklet is a bookmark stored in a web browser that contains JavaScript commands that add new features to the browser.
>
> — Source: Bookmarklet - Wikipedia

Which is a quite dry way of saying that a bookmarklet is a tiny morsel of code that you can click and execute in the context of the page you're currently reading. They used to be a lot more popular but are now a niche more popular among power users and developers. We should all be using them and sharing them around, cause bookmarklets don't need approval from the browser makers or the website owners to work. They are an easy way to spread bespoke features on the web. In a previous blog post, I described how I created a small web interface to post on my blog.
That interface is just a small static client for a micropub API service on my blog. With that in place, I started by adding a bookmark to the editor to my browser bookmark bar, thus allowing one-click access to my blog's posting interface and reducing the friction between web browsing and blogging. Whenever I'm browsing, if inspiration happens, I can start blogging about it with a single click. Sometimes, I forget to add all the tags I need to a blog post. Normally, to fix that, I'd need to open the editor and add them, but since my blog supports micropub, I could simply write some Javascript to ask for a list of tags and submit a micropub update request to my blog, and save that as a bookmarklet. Now, I can update tags on any of my own posts with one click on the bookmark toolbar while I'm reading the post. The Web should be a conversation and not a one-way media like cable. It is the only mass media we have where everyone can be present and share their own ideas, dreams, and cat photos. Blogging is a wonderful way to keep a presence online. Being such a fan of blogging, it is no surprise that I want to make my web browser more friendly to blogging. My web editor supports some URL query parameters that allow me to prefill the interface with content and title. That led me to create another bookmarklet, one that enables me to select text on any web page and compose a blog post quoting that text. Bookmarklets are powerful, especially when you combine them with custom URL schemas for the applications you use the most. Many desktop apps provide a way to use URLs to open them and perform some action. On macOS it is common for apps to not only provide that, but there is also a blossoming ecosystem built around x-callback-url:

> The goal of the x-callback-url specification is to provide a standardized means for iOS developers to expose and document the methods they make available to other apps via custom URL schemes. Using x-callback-url, source apps can launch other apps passing data and context information, and also provide parameters instructing the target app to return data and control back to the source app after executing an action. Specific supported actions will be dependent on the individual apps and will not be discussed in the specification.
>
> — Source: Home | x-callback-url

I use Yojimbo on my mac to keep my notes and other private information. The developers, Bare Bones Software, made some bookmarklets to archive and bookmark sites with Yojimbo. I use them all the time when researching things for my stories and software. By using bookmarklets and custom URL schemes, you can reduce the friction between the vanilla browser experience and the way you actually want to use it. WebExtensions go by many names — Firefox Add-ons, Chrome Extensions, etc — but they are all the same: small pieces of software that you add to your web browser to get features that don't come with it. Chances are that you, dear reader, already got some WebExtensions in your browser and you might think that you don't need to read what I'm about to tell you, but I urge you onwards, cause I want to talk about the whys of WebExtensions and not really the tech behind them. Probably the most common type of WebExtension, the one that most people install when they migrate to a new web browser, is the ad blocker. The Web has been littered with ads and it is out of control. Not only do they distract you from the content you're trying to get, but they also consume resources and track you across site boundaries. The second type of WebExtension that is super common are add-ons that provide features tied to some third-party service, such as grammar checkers, product search, etc. This is my own guesswork, I have no data on that. Let's not talk about those two cases; I want you to think of WebExtensions first in terms of an *intervention*.
When you install a WebExtension, you're essentially telling the browser that you are not happy with the experience it provides and that you want to make it better; you are interfering with that experience and trying to make it more suitable for your taste. Even though each web browser offers thousands of WebExtensions on its official marketplace, you can also make your own so that the browser behaves how you want it to behave. Very often I see a web site for an event that is so far in the future that I'm not sure if I can go. If I decide to trust my memory that I will remember the event closer to the date, I will for sure miss it. So I made a Firefox Add-on to help me remember to go back to a page on a specific date. It works by creating an iCalendar file and downloading it to your computer. Opening that file adds an event to the default calendar application, prefilled with the address of the page and the date I selected. It is very handy. I'm a huge fan of RSS feeds and am making my own blog reading app. While my own app is not usable, I'm still using other apps such as NetNewsWire and Rad Reader. I miss the days when Firefox would surface feeds by adding an RSS icon to the search bar. They made the web worse when they removed that feature, in my opinion; so I added it back with an add-on I made. The "Feed and Blogroll Discovery" add-on will add a little icon to the address bar if it detects a feed or blogroll on the website, and then you can use it to copy the link and paste it into your favourite blog reader. With each little intervention, I'm rebelling against the idea that web browsing is a one-way consumption process. Not only am I reducing the amount of clicks and effort between blogging and reading, but I'm also adding the features I want. Today, I created yet another Firefox Add-on. One that has a target audience of one. It is made just for me; it won't be available on AMO. Firefox Add-ons can open sidebars.
> A sidebar is a pane that is displayed at the side of the browser window, next to the web page. — Source: Sidebars - Mozilla | MDN

A cool aspect of sidebars is that they stay open while you switch tabs, allowing a persistent interface while you browse different websites. I used that to open my blog posting interface as a sidebar, turning Firefox into my own editor for my blog. I can now open and close the editing interface and compose posts while browsing, without moving away from the editor even when I need to switch to different websites to do research.

I also created some other add-ons that are more rebellious towards the experience I'm being provided by third-party websites. An example is how Goodreads provides links to books on Amazon (considering Amazon owns Goodreads, that is understandable). I don't use Goodreads, so I created an add-on that adds a button to Goodreads to search for the same book on Storygraph. I also used to have add-ons to remove sponsored content from Facebook and Twitter, though I have stopped using those platforms lately.

With each intervention, I'm making the browser my own. I'm not creating a product or a service that I will ship to mass markets. I'm tailoring my subjective web browsing experience to suit me. If any of the changes I make seem interesting to other people, I go on and publish them on AMO and other places.

Web browsers could make it even easier for us to add new features and tweaks to them. They don't do it because they're afraid people will simply copy and paste stuff from chatbots and web pages without understanding the associated risks. That is understandable; people will do that and will get in trouble. But that doesn't mean we should not strive to make browsing a pliable experience. In this tug of war, we need more users pulling the rope towards themselves.

We don't really know why, in ancient times, people would leave their handprints on cave walls, but we can say they were making a mark.
They were showing their society and the world that they were there, that they were making that cave their own. We must add our own little handprints to the browser. We need to make it our own. We are the users, and browsers are user agents.

I would love to learn about your tweaks to your browser and your interventions. If this little blog post inspires you to write about this on your own blog, please send a WebMention to this URL. The cave becomes more fun if we're all adding to its walls.
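A postscript for fellow tinkerers: the sidebar trick above needs surprisingly little code. Opening your own page as a sidebar is mostly a matter of a `sidebar_action` entry in the add-on's `manifest.json`; this is a sketch with placeholder names and file paths:

```json
{
  "manifest_version": 2,
  "name": "Blog editor sidebar",
  "version": "1.0",
  "sidebar_action": {
    "default_title": "Post editor",
    "default_panel": "editor.html",
    "default_icon": "icon.png"
  }
}
```

Here `editor.html` would be whatever page you want pinned next to your tabs — in my case, the blog's posting interface.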

Andre Garzia 1 year ago

# Went to a book launch this week

This week I went to Glasgow for Annabel Campbell's debut novel launch. The Outcast Mage is a book that I have been looking forward to ever since Annabel first mentioned it to me at one of the Edinburgh SFF socials.

> In the city of Amoria, where magic rules all, Naila is the ultimate conundrum. A student under the watchful eye of Amoria’s sprawling Academy, Naila is undeniably gifted, yet she has never been able to harness her abilities. And time is running out. If she fails, she’ll be forced into exile, or worse – consumed by her own magic. For decades mages and the magicless Hollows have lived side-by-side peacefully. But now that peace is threatened as old resentments bubble over. A powerful anti-Hollow faction led by Amoria’s most influential mages is determined to cast the Hollows out. With her Hollow background, Naila is in danger of being exiled from everything she knows and everyone she loves if she cannot unlock her power. When a tragic incident threatens her place at the Academy, Naila is saved by Haelius Akana, the most powerful living mage. A scholar and fellow outcast, Haelius is fascinated by Naila’s inability to use magic. Eager to help someone in whom he sees so much of himself, he stakes his position at the Academy on teaching her. Trapped in the deadly schemes of Amoria’s elite, Naila must dig deep to discover the truth of her powers – or watch the city she loves descend into civil war. – Link for the book

Joining her at the event were L. R. Lam and Justin Lee Anderson, and together they had a wonderful conversation about the joy of epic fantasy. The Dragon Scale series by L. R. Lam is a wonderful story with feathered dragons, romance, assassins and much more.

> Long ago, humans betrayed dragons, stealing their magic and banishing them to a dying world. Centuries later, their descendants worship dragons as gods. But the 'gods' remember, and they do not forgive. Thief Arcady scrapes a living on the streets of Vatra. Desperate, Arcady steals a powerful artifact from the bones of the Plaguebringer, the most hated person in Lumet history. Only Arcady knows the artifact's magic holds the key to a new life among the nobles at court and a chance for revenge. The spell connects to Everen, the last male dragon foretold to save his kind, dragging him through the Veil. Disguised as a human, Everen soon learns that to regain his true power and form and fulfil his destiny, he only needs to convince one little thief to trust him enough to bond completely—body, mind, and soul—and then kill them. Yet the closer the two become, the greater the risk both their worlds will shatter. – Link to the book

Justin Lee Anderson's fantasy epic inspired by Edinburgh is also at the top of my TBR. I'm really excited to see how Scottish tales and myths inspired his story.

> The war is over but peace can be hell. Demons continue to burn farmlands, violent mercenaries roam the wilds and a plague is spreading. The country of Eidyn is on its knees. In a society that fears and shuns him, Aranok is the first mage to be named king’s envoy. And his latest task is to restore an exiled foreign queen to her throne. The band of allies he assembles each have their own unique skills. But they are strangers to one another, and at every step across the ravaged land, a new threat emerges, lies are revealed and distrust could destroy everything they are working for. Somehow, Aranok must bring his companions together and uncover the conspiracy that threatens the kingdom – before war returns to the realms again. – Link to the book

After the event, some of us went to The Last Bookstore pub and it was great. Perfect choice for celebrating a book launch. Look at how cool it is.

Andre Garzia 1 year ago

# Creating a gamebook engine

I made this some time ago but never blogged about it. Unfortunately, I lost some of the source code, but it would be easy to rebuild.

I decided to check what the development options were for the Playdate handheld console by Panic after receiving an email from them (I’m on the mailing list for the device). The offering is just too damn polished. Check out the Develop for Playdate page. Like everything Panic does, it is damn well done. You can use the SDK to develop using C or Lua or a combination of both. They also offer a web IDE called Pulp that is similar to a pico-8 development workflow, with tools for crafting fonts, screens, sprites, audio, and scripting.

I went ahead and downloaded the SDK. I already had a license for Nova — which is the fancy development editor they make — and they ship lots of integrations for that editor with the SDK. Everything works out of the box. This is an example source loaded in Nova (using the Playdate theme and the extension). I’m using the editor task system to run the sample in the included emulator. It just works…

I might be coding a gamebook engine for the Playdate… that was not in my strategy post, but the muse calls me and it is rude not to answer her call. And this is what my gamebook editor looks like:

Andre Garzia 1 year ago

# Creating a simple posting interface

I recently noticed that I haven't been posting much on my blog, which surprised me because blogging has always been among my favourite activities. The main obstacle that has prevented me from posting more often is that I didn't have an easy-to-use interface or app for doing so.

When this blog was done with Racket + NodeJS, I implemented a MetaWeblog API server and thus could use apps such as MarsEdit to post to my blog. Once I rebuilt it using Lua, I didn't finish implementing that API — I got it halfway done — and thus couldn't use that app anymore. I implemented the Micropub API in Lua but have yet to find an app I like that supports that spec.

Thankfully, Micropub is such an easy spec to implement that creating a little client for it can be done in hours if not minutes. Today, in about two hours, I made a small single-file HTML editor for my blog. It allows me to create new posts with ease, including file uploads. It is actually the interface I'm using to write this post right now.

It is a simple HTML form with 137 lines of vanilla JavaScript. All the JS does is simple cosmetics, such as disabling buttons while posting or uploading is happening (so I don't press them twice), and using the fetch API to send data to the server. Of course this editor is super simple. There's barely any error checking and most errors will just be console messages, but it is enough for my day-to-day usage. It serves its purpose, which is to provide an easy way for me to make posts. I wonder what new features I'll implement as the week moves on.
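For the curious, the core of a client like this really is tiny. A sketch of a form-encoded Micropub "create entry" request, as described in the Micropub spec — the endpoint URL and token are placeholders, and my editor's actual code differs:

```javascript
// Build the form-encoded body for a Micropub "create entry" request.
function micropubBody(title, content) {
  return new URLSearchParams({ h: "entry", name: title, content }).toString();
}

// POST it to a Micropub endpoint with a bearer token.
// A successful create returns the new post's URL in the Location header.
async function publishPost(endpoint, token, title, content) {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: {
      Authorization: "Bearer " + token,
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: micropubBody(title, content),
  });
  if (!res.ok) throw new Error("Micropub request failed: " + res.status);
  return res.headers.get("Location");
}
```

Everything else in the editor is just a form that collects `title` and `content` and a couple of buttons that get disabled while the request is in flight.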
