Posts in Web (20 found)
Justin Duke Yesterday

Unshipping Keystatic

Two years after initially adopting it, we've formally unshipped Keystatic. Our CMS, such as it is, is now a bunch of Markdoc files and a TypeScript schema organizing the front matter — which is to say, it's not really a CMS at all. There were a handful of reasons for this move, in no specific order:

- Our team's use of Keystatic as an actual front-end CMS had dropped to zero. All of the non-coders have grown sufficiently adept with Markdown that the GUI was gathering dust; Keystatic had become a pure schema validation and rendering tool, and offered fairly little beyond what we were already getting from our build step.
- Some of the theoretically nice things — image hosting, better previewing — either didn't work as smoothly as we'd like or were supplanted entirely by Vercel's built-in features.
- The project appears to have atrophied a little bit, commits dwindling to a one-per-quarter frequency despite a healthy number of open issues. This is not to besmirch the lovely maintainers, who have many other things going on. But it's harder to stick around on a library you're not getting much value from when you're also worried there's not a lot of momentum down the road.

That last point is basically what I wrote about Invoke — it's a terrible heuristic, judging a project by its commit frequency, and I know that. Things can and should be finished! And yet. When you're already on the fence, a quiet GitHub graph is the thing that tips you over.

To Keystatic's credit, it was tremendously easy to extricate. The whole migration was maybe two hours of work, most of which was just deleting code. That's the sign of a well-designed library — one that doesn't metastasize into every corner of your codebase. I wish more tools were this easy to leave.

bitonic's blog. 1 week ago

A vibe-coded alternative to YieldGimp

If you’re a UK tax resident, short-term low-coupon gilts are the most tax-efficient way to get savings-account-like returns, since most of their yield is tax free. This makes them very popular amongst retail investors, who now hold a large portion of the tradable low-coupon gilts. YieldGimp.com used to be a great free resource to evaluate the gilts currently available. However, it was recently turned into an app rather than a simple webpage. I’m not even sure if the app is free or paid, but I do not want to install the “YieldGimp platform” to quickly check gilt metrics when I buy them. So I asked my LLM of choice to produce an alternative, and after a few minutes and a few rounds of prompting I had something that served my needs. It is available for use at mazzo.li/gilts/, and the source is on GitHub. It differs from YieldGimp in that it does not show metrics based on the current market price, but rather requires the user to input a price. I find this more useful anyway: gilts are somewhat illiquid on my broker, so I need to come up with a limit price myself, which means I want to know the yield at my price rather than at the market price. It also lets you select a specific tax rate to produce a “gross equivalent” yield. It is not a very sophisticated tool and it doesn’t pretend to model gilts and their tax implications precisely (the repository’s README has more details on its shortcomings), but for most use cases it should be informative enough to sanity-check your trades without a Bloomberg terminal.
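The arithmetic behind such a tool can be sketched in a few lines. This is an illustrative simplification (annual coupons, simple non-compounded annualisation, no accrued interest), not the actual code behind mazzo.li/gilts:

```javascript
// Sketch of gilt yield metrics at a user-chosen price — not the real tool.
// UK gilts redeem at par (100) and their capital gain is free of CGT;
// only the coupon is subject to income tax.
function giltMetrics({ cleanPrice, coupon, yearsToMaturity, taxRate }) {
  const redemption = 100;
  const totalCoupons = coupon * yearsToMaturity;
  const capitalGain = redemption - cleanPrice; // tax-free for UK gilts

  // Simple annualised gross yield at the price you choose to bid.
  const grossYield =
    (totalCoupons + capitalGain) / cleanPrice / yearsToMaturity;

  // After-tax yield: tax the coupons, leave the gain untouched.
  const netYield =
    (totalCoupons * (1 - taxRate) + capitalGain) / cleanPrice / yearsToMaturity;

  // What a fully-taxed savings account would have to pay to match it.
  const grossEquivalentYield = netYield / (1 - taxRate);

  return { grossYield, netYield, grossEquivalentYield };
}
```

The gross-equivalent figure answers the practical question: what rate would a fully-taxed savings account need to offer to match this gilt after tax? For a low-coupon gilt it comes out well above the gross yield.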

David Bushell 1 week ago

Web font choice and loading strategy

When I rebuilt my website I took great care to optimise fonts for both performance and aesthetics. Fonts account for around 50% of my website (bytes downloaded on an empty cache). I designed and set a performance budget around my font usage. I use three distinct font families and three different methods to load them.

Web fonts are usually defined by the CSS @font-face rule. The font-display descriptor allows us some control over how fonts are loaded. The swap value has become somewhat of a best practice — at least the most common default. The CSS spec says:

Gives the font face an extremely small block period (100ms or less is recommended in most cases) and an infinite swap period. In other words, the browser draws the text immediately with a fallback if the font face isn’t loaded, but swaps the font face in as soon as it loads.
CSS Fonts Module Level 4 - W3C

That small “block period”, if implemented by the browser, renders an invisible font temporarily to minimise FOUC. Personally I default to swap and don’t change it unless there are noticeable or measurable issues.

Most of the time you’ll use swap. If you don’t know which option to use, go with swap. It allows you to use custom fonts and tip your hand to accessibility.
font-display for the Masses - Jeremy Wagner

Google Fonts defaults to swap, which has performance gains.

In effect, this makes the font files themselves asynchronous—the browser immediately displays our fallback text before swapping to the web font whenever it arrives. This means we’re not going to leave users looking at any invisible text (FOIT), which makes for both a faster and more pleasant experience.
Speed Up Google Fonts - Harry Roberts

Harry further notes that a suitable fallback is important, as I’ll discover below. My three fonts in order of importance are: Ahkio for headings. Its soft brush stroke style has a unique hand-drawn quality that remains open and legible. As of writing, I load three Ahkio weights at a combined 150 KB. That is outright greed!
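Declared in CSS, the swap strategy quoted above looks something like this (the family name and file path here are illustrative, not my actual setup):

```css
/* Illustrative @font-face rule — not my real font files */
@font-face {
  font-family: "Body Font";
  src: url("/fonts/body-font.woff2") format("woff2");
  font-weight: 400;
  /* swap: draw fallback text immediately, swap in the web font on load */
  font-display: swap;
}
```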
Ahkio is core to my brand so it takes priority in my performance budget (and financial budget, for that matter!). Testing revealed the 100ms † block period was not enough to avoid FOUC, despite optimisation techniques like preload. Ahkio’s design is more condensed so any fallback can wrap headings over additional lines. This adds significant layout shift.

† The Chrome blog mentions a zero-second block period. Firefox has a config preference default of 100ms.

My solution was to use block instead of swap, which extends the block period from a recommended 0–100ms up to a much longer 3000ms.

Gives the font face a short block period (3s is recommended in most cases) and an infinite swap period. In other words, the browser draws “invisible” text at first if it’s not loaded, but swaps the font face in as soon as it loads.
CSS Fonts Module Level 4 - W3C

This change was enough to avoid ugly FOUC under most conditions. Worst case scenario is three seconds of invisible headings. With my website’s core web vitals a “slow 4G” network can beat that by half. For my audience an extended block period is an acceptable trade-off. Hosting on an edge CDN with good cache headers helps minimise the cost. Update: Richard Rutter suggested a technique that gives more fallback control than I knew about. I shall experiment and report back!

Atkinson Hyperlegible Next for body copy. It’s classed as a grotesque sans-serif with interesting quirks such as a serif on the lowercase ‘i’. I chose this font for both its accessible design and technical implementation as a variable font. One file at 78 KB provides both weight and italic variable axes. This allows me to give links a subtle weight boost. For italics I just go full-lean. I currently load Atkinson Hyperlegible with swap out of habit but I’m strongly considering why I don’t use fallback.

Gives the font face an extremely small block period (100ms or less is recommended in most cases) and a short swap period (3s is recommended in most cases).
In other words, the font face is rendered with a fallback at first if it’s not loaded, but it’s swapped in as soon as it loads. However, if too much time passes, the fallback will be used for the rest of the page’s lifetime instead.
CSS Fonts Module Level 4 - W3C

The browser can give up and presumably stop downloading the font. The spec actually says that fallback and optional “[must/should] only be used for small pieces of text.” Although it notes that most browsers implement the default auto value with strategies similar to block.

0xProto for code snippets. If my use of Ahkio was greedy, this is gluttonous! A default monospace would be acceptable. My justification is that controlling the presentation of code on a web development site is reasonable. 0xProto is designed for legibility with a personality that complements my design. I don’t specify 0xProto with a CSS @font-face rule. Instead I use the JavaScript font loading API to conditionally load it when code snippets are present on the page. Note the name change, because some browsers aren’t happy with a numeric first character. Not shown is the event wrapper around this code. I also load the script with both async and defer attributes. This tells the browser the script is non-critical and avoids render blocking. I could probably defer loading even later without readers noticing the font pop in. Update: for clarity, browsers will conditionally load fonts anyway, but JavaScript can purposefully delay the loading further to avoid fighting for bandwidth. When JavaScript is not available the system default is fine.

There we have it: three fonts, three strategies, and a few open questions and decisions to make. Those may be answered when CrUX data catches up. My new website is a little chunkier than before but it’s well within reasonable limits. I’ll monitor performance and keep turning the dials. Web performance is about priorities. In isolation it’s impossible to say exactly how an individual asset should be loaded. There are upper limits, of course. How do you load a one megabyte font? You don’t.
Unless you’re a font studio providing a complete type specimen. But even then you could split the font and progressively load different unicode ranges. I wonder if anyone does that? Anyway, I’m rambling now, bye. Thanks for reading! Follow me on Mastodon and Bluesky. Subscribe to my Blog and Notes or Combined feeds.
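The conditional 0xProto loading described above might be sketched roughly like this. The font name, URL, and the check for a pre element are assumptions, not my exact script:

```javascript
// Hypothetical sketch of conditional loading via the Font Loading API.
// The page only pays for the code font when it actually shows code.
function shouldLoadCodeFont(doc) {
  // Only worth downloading when a code block is present.
  return doc.querySelector("pre") !== null;
}

function loadCodeFont(doc) {
  if (!shouldLoadCodeFont(doc)) return Promise.resolve(null);
  // "xProto" rather than "0xProto": some browsers reject font family
  // names that start with a digit.
  const face = new FontFace("xProto", 'url("/fonts/0xproto.woff2")');
  return face.load().then((loaded) => {
    doc.fonts.add(loaded);
    return loaded;
  });
}
```

Run it from a script loaded as non-critical, and the fallback monospace stack covers the gap until (or if) the font arrives.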

Ruslan Osipov 1 week ago

Homepage for a home server

I have a NAS (Network-Attached Storage) which doubles as a home server. It’s really convenient to have a set of always-on, locally hosted services - which lets me read RSS feeds without distractions, have a local photo storage solution, or move away from streaming services towards organizing my (legally owned) collections of movies, shows, and audiobooks. At this point I have 20-or-so services running and sometimes it gets hard to keep track of what’s where. For that - I found Homepage. A simple, fast, lightweight page which connects to all of my services for some basic monitoring, and reminds me what my service layout looks like. Here’s what I built: I love that the configuration lives in YAML files, and because the page is static - it loads real fast. There are many widgets which provide info about various services out of the box. It’s neat. There’s definitely a question of how much I’ll keep this up-to-date: it’s not an automatically populated dashboard, and editing is a two-step process (SSH into the machine, edit the YAML configs) - which adds some friction. We’ll have to wait and see, but for now I’m excited about my little dashboard.
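For flavour, a Homepage services.yaml entry looks roughly like this — the service names, addresses, and API key below are placeholders, not my real layout:

```yaml
# Illustrative services.yaml for Homepage (gethomepage.dev)
- Media:
    - Jellyfin:
        href: http://192.168.1.10:8096
        description: Movies and shows
        widget:
          type: jellyfin
          url: http://192.168.1.10:8096
          key: your-api-key
- Reading:
    - FreshRSS:
        href: http://192.168.1.10:8080
        description: RSS without distractions
```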

matduggan.com 2 weeks ago

The Small Web is Tricky to Find

One of the most common requests I've gotten from users of my little Firefox extension ( https://timewasterpro.xyz ) has been more options around the categories of websites that you get returned. This required me to go through and parse the website information to attempt to put them into different categories. I tried a bunch of different approaches but ended up basically looking at the websites themselves, seeing if there was anything that looked like a tag or a hint on each site. This is the end conclusion of my effort at putting stuff into categories. Unknown just means I wasn't able to get any sort of data about it. This is the result of me combining Ghost, WordPress and Kagi Small Web data sources. Interestingly one of my most common requests is "I would like less technical content", which as it turns out is tricky to provide because it's pretty hard to find. Non-technical small sites sorta exist, but less technical users don't seem to have bought into the small-web value of owning your own domain (or if they have, I haven't been able to figure out a reliable way to find them). This is an interesting problem, especially because a lot of the tools I would have previously used to solve this problem are....basically broken. It's difficult for me to really use Google web search to find anything at this point even remotely like "give me all the small websites" because everything is weighted to steer me away from that towards Reddit. So anything that might be a little niche is tricky to figure out. So there's no point in building a web extension with a weighting algorithm to return less technical content if I cannot find a big enough pool of non-technical content to surface. It isn't that these sites don't exist, it's just that we never really figured out a way to reliably surface "what is a small website". So from a technical perspective I have a bunch of problems.
I think I can solve....some of these, but the more I work on the problem the more I'm realizing that the entire concept of "the small web" had a series of pretty serious problems. First I need to reliably sort websites into a genre, which can be a challenge when we're talking about small websites because people typically write about whatever moves them that day. Most of the content on a site might be technical, but some of it might not be. Big sites tend to be more precise with their SEO settings, but small sites that don't care don't do that, so I have fewer reliable signals to work with. Then I need to come up with a lot of different feeding systems for independent websites. The Kagi Small Web was a good starting point, but WordPress and Ghost websites have a much higher ratio of non-technical content. I need those sites, but it's hard to find a big batch of them reliably. Once I have the type of website as a general genre and I have a series of locations, then I can start to reliably distribute the types of content you get. As for why the small web is so hard to find:

- Google was the only place on Earth sending any traffic there
- Because Google was the only one who knew about it, there never needed to be another distribution system
- Now that Google is broken, it's almost impossible to recreate that magic of becoming the top of the list for a specific subgenre without a ton more information than I can get from public records.
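The tag-or-hint heuristic described above can be sketched like this. The keyword lists, categories, and signals here are illustrative guesses, not the extension's real taxonomy:

```javascript
// Simplified sketch: inspect a page's markup for anything tag-like and
// map it to a coarse category. "unknown" means no usable signal — which,
// as noted above, is the common case for small sites.
const CATEGORIES = {
  technical: ["programming", "software", "linux", "devops", "code"],
  food: ["recipe", "cooking", "baking"],
  travel: ["travel", "hiking", "trip"],
};

function extractKeywords(html) {
  const keywords = [];
  // <meta name="keywords"> — big sites set it, small sites often don't.
  const meta = html.match(/<meta\s+name="keywords"\s+content="([^"]*)"/i);
  if (meta) {
    keywords.push(...meta[1].toLowerCase().split(",").map((k) => k.trim()));
  }
  // rel="tag" links are another weak hint, common on blog platforms.
  for (const m of html.matchAll(/rel="tag"[^>]*>([^<]+)</gi)) {
    keywords.push(m[1].toLowerCase().trim());
  }
  return keywords;
}

function categorize(html) {
  const keywords = extractKeywords(html);
  for (const [category, words] of Object.entries(CATEGORIES)) {
    if (keywords.some((k) => words.includes(k))) return category;
  }
  return "unknown";
}
```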

David Bushell 2 weeks ago

Declarative Dialog Menu with Invoker Commands

The off-canvas menu — aka the Hamburger, if you must — has been hot ever since Jobs invented the mobile web and Ethan Marcotte put a name to responsive design. Making an off-canvas menu free from heinous JavaScript has always been possible, but not ideal. I wrote up one technique for Smashing Magazine in 2013. Later I explored another in an absurdly titled post where I used the new Popover API. I strongly push clients towards a simple, always visible, flex-box-wrapping list of links. Not least because leaving the subject unattended leads to a multi-level monstrosity. I also believe that good design and content strategy should allow users to navigate and complete primary goals without touching the “main menu”. However, I concede that Hamburgers are now mainstream UI. Jason Bradberry makes a compelling case.

This month I redesigned my website. Taking the menu off-canvas at all breakpoints was a painful decision. I’m still not at peace with it. I don’t like plain icons. To somewhat appease my anguish I added big bold “Menu” text. The HTML for the button is pure declarative goodness. I added an extra “open” prefix for assistive tech. Aside note: Ana Tudor asked do we still need all those “visually hidden” styles? I’m using them out of an abundance of caution but my feeling is that Ana is on to something. The menu HTML is just as clean. It’s that simple! I’ve only removed the opinionated class names I use to draw the rest of the owl. I’ll explain more of my style choices later.

This technique uses the wonderful new Invoker Command API for interactivity. It is similar to the Popover API approach I mentioned earlier. With a real dialog element we get free focus management and more, as Chris Coyier explains. I made a basic CodePen demo for the code above. So here’s the bad news. Invoker commands are so new they must be polyfilled for old browsers. Good news: you don’t need a hefty script. Feature detection isn’t strictly necessary.
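A minimal fallback along those lines might look like this. It is a sketch, not my production script: the commandfor/command attribute names come from the Invoker Commands API, and only the two dialog commands a menu needs are handled:

```javascript
// Map an invoker command to the matching dialog method.
// Returns true when the command was recognised and dispatched.
function invoke(command, dialog) {
  if (command === "show-modal") {
    dialog.showModal();
    return true;
  }
  if (command === "close") {
    dialog.close();
    return true;
  }
  return false; // unknown commands are ignored
}

// One delegated listener covers every invoker button on the page.
function installInvokerFallback(doc) {
  doc.addEventListener("click", (event) => {
    const button = event.target.closest("button[commandfor]");
    if (!button) return;
    const dialog = doc.getElementById(button.getAttribute("commandfor"));
    if (dialog) invoke(button.getAttribute("command"), dialog);
  });
}
```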
Keith Cirkel has a more extensive polyfill if you need full API coverage like JavaScript events. My basic version overrides the declarative API with the JavaScript API for one specific use case, and the behaviour remains the same.

Let’s get into CSS by starting with my favourite: a strong contrast outline around buttons and links with room to breathe. This is not typically visible for pointer events. For other interactions like keyboard navigation it’s visible. The first button inside the dialog, i.e. “Close (menu)”, is naturally given focus by the browser (focus is ‘trapped’ inside the dialog). In most browsers focus remains invisible for pointer events. WebKit has a bug. When using showModal() or invoker commands the focus style is visible on the close button for pointer events. This seems wrong, it’s inconsistent, and clients absolutely rage at seeing “ugly” focus — seriously, what is their problem?!

I think I’ve found a reliable ‘fix’. Please do not copy this untested. From my limited testing with Apple devices and macOS VoiceOver I found no adverse effects. Below I’ve expanded the ‘not open’ condition within the event listener. First I confirm the event is relevant. I can’t check for an instance of the event class because of the handler. I’d have to listen for keyboard events and that gets murky. Then I check if the focused element matches the visible focus style. If both conditions are true, I remove and reapply focus in a non-visible manner. The focusVisible boolean is Safari 18.4 onwards. Like I said: extreme caution! But I believe this fixes WebKit’s inconsistency. Feedback is very welcome. I’ll update here if concerns are raised.

Native dialog elements allow us to press the ESC key to dismiss them. What about clicking the backdrop? We must opt in to this behaviour with the closedby attribute. Chris Ferdinandi has written about this and the JavaScript fallback. That’s enough JavaScript! My menu uses a combination of both basic CSS transitions and cross-document view transitions.
For on-page transitions I use the setup below. As an example here I fade opacity in and out. How you choose to use nesting selectors and at-rules is a matter of taste. I like my at-rules top level. My menu also transitions out when a link is clicked. This does not trigger the closing dialog event. Instead the closing transition is mirrored by a cross-document view transition. The example below handles the fade out for page transitions. Note that I only transition the old view state for the closing menu. The new state is hidden (“off-canvas”). Technically it should be possible to use view transitions to achieve the on-page open and close effects too. I’ve personally found browsers to still be a little janky around view transitions — bugs, or skill issue?

It’s probably best to wrap a prefers-reduced-motion media query around transitions. “Reduced” is a significant word. It does not mean “no motion”. That said, I have no idea how to assess what is adequately reduced! No motion is a safe bet… I think?

So there we have it! A declarative dialog menu with invoker commands, topped with a medley of CSS transitions and a sprinkle of almost optional JavaScript. Aren’t modern web standards wonderful, when they work? I can’t end this topic without mentioning Jim Nielsen’s menu. I won’t spoil the fun, take a look! When I realised how it works, my first reaction was “is that allowed?!” It works remarkably well for Jim’s blog. I don’t recall seeing that idea in the wild elsewhere. Thanks for reading! Follow me on Mastodon and Bluesky. Subscribe to my Blog and Notes or Combined feeds.
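The on-page fade described earlier might be sketched like this. Selectors, durations, and the reduced-motion handling are my guesses at the shape of it, not the actual stylesheet:

```css
/* Illustrative dialog fade with plain CSS transitions.
   display/overlay need allow-discrete so the fade-out can finish. */
dialog {
  opacity: 0;
  transition:
    opacity 200ms ease,
    display 200ms allow-discrete,
    overlay 200ms allow-discrete;
}

dialog[open] {
  opacity: 1;
}

/* The fade-in needs a starting style: the dialog begins as display: none */
@starting-style {
  dialog[open] {
    opacity: 0;
  }
}

/* Respect users who ask for reduced motion */
@media (prefers-reduced-motion: reduce) {
  dialog {
    transition: none;
  }
}
```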

David Bushell 2 weeks ago

Big Design, Bold Ideas

I’ve only gone and done it again! I redesigned my website. This is the eleventh major version. I dare say it’s my best attempt yet. There are similarities to what came before and plenty of fresh CSS paint to modernise the style. You can visit my time machine to see the ten previous designs that have graced my homepage. Almost two decades of work. What a journey! I’ve been comfortable and coasting for years. This year feels different. I’ve made a career building for the open web. That is now under attack. Both my career, and the web. A rising sea of slop is drowning out all common sense. I’m seeing peers struggle to find work, others succumb to the chatbot psychosis. There is no good reason for such drastic change. Yet change is being forced by the AI industrial complex on its relentless path of destruction. I’m not shy about my stance on AI. No thanks! My new homepage doubles down. I won’t be forced to use AI but I can’t ignore it. Can’t ignore the harm. Also I just felt like a new look was due. Last time I mocked up a concept in Adobe XD. Adobe is now unfashionable and Figma, although swank, has that Silicon Valley stench. Penpot is where the cool kids paint pretty pictures of websites. I’m somewhat of an artist myself so I gave Penpot a go. My current brand began in 2016 and evolved in 2018. I loved the old design but the rigid layout didn’t afford much room to play with content. I spent a day pushing pixels and was quite chuffed with the results. I designed my bandit game in Penpot too (below). That gave me the confidence to move into real code. I’m continuing with Atkinson Hyperlegible Next for body copy. I now license Ahkio for headings. I used Komika Title before but the all-caps was unwieldy. I’m too lazy to dig through backups to find my logotype source. If you know what font “David” is please tell me! I worked with Axia Create on brand strategy. On that front, we’ll have more exciting news to share later in the year!
For now what I realised is that my audience here is technical. The days of small business owners seeking me are long gone. That market is served by Squarespace or Wix. It’s senior tech leads who are entrusted to find and recruit me, and peers within the industry who recommend me. This understanding gave me focus. To illustrate why AI is lame I made an interactive mini-game! The slot machine metaphor should be self-explanatory. I figured a bit of comedy would drive home my AI policy. In the current economy if you don’t have a sparkle emoji is it even a website? The game is built with HTML canvas, web components, and synchronised events I over-complicated to ensure a unique set of prizes. The secret to high performance motion blur is to cheat with pre-rendered PNGs. In hindsight I could have cheated more with a video. I commissioned Declan Chidlow to create a bespoke icon set. Declan delivered! The icons look so much better than the random assortment of placeholders I found. I’m glad I got a proper job done. I have neither the time nor skill for icons. Declan read my mind because I received an 88×31 web badge bonus gift. I had mocked up a few badges myself in Penpot. Scroll down to see them in the footer. Declan’s badge is first and my attempts follow. I haven’t quite nailed the pixel look yet. My new menu is built using a dialog element with invoker commands and view transitions for a JavaScript-free experience. Modern web standards are so cool when they work together! I do have a tiny JS event listener to polyfill old browsers. The pixellated footer gradient is done with a WebGL shader. I had big plans but after several hours and too many Stack Overflow tabs, I moved on to more important things. This may turn into something later but I doubt I’ll get far trying to learn WebGL. Past features like my Wasm static search and speech synthesis remain on the relevant blog pages. I suspect I’ll be finding random one-off features I forgot to restyle.
My homepage ends with another strong message. The internet is dominated by US-based big tech. Before backing powers across the Atlantic, consider UK and EU alternatives. The web begins at home. I remain open to working with clients and collaborators worldwide. I use some ‘big tech’ but I’m making an effort to push for European alternatives. US-based tech does not automatically mean “bad” but the absolute worst is certainly thriving there! Yeah I’m English, far from the smartest kind of European, but I try my best. I’ve been fortunate to find work despite the AI threat. I’m optimistic and I refuse to back down from calling out slop for what it is! I strongly believe others still care about a job well done. I very much doubt the touted “10x productivity” is resulting in 10x profits. The way I see it, I’m cheaper, better, and more ethical than subsidised slop. Let me know on the socials if you love or hate my new design :) P.S. I published this Sunday because Heisenbugs only appear in production. Thanks for reading! Follow me on Mastodon and Bluesky . Subscribe to my Blog and Notes or Combined feeds.

Dominik Weber 2 weeks ago

Lighthouse update February 9th

During the past week I finished the most important onboarding improvements. For new users it's now easier to get into Lighthouse. The biggest updates were:

- An onboarding email drip which explains the features of Lighthouse
- Feed subscribe changes, now showing a suggestion list of topics and curated feeds, and a search for websites and feeds to subscribe to

The next step became clear after talking to users and potential customers. The insight was that even if the structure and features of Lighthouse are much better for content curation, it doesn't matter if not all relevant content can be pulled into Lighthouse. This means first and foremost websites that don't have a feed or newsletter. So the next feature will be a website-to-feed conversion, so that websites can be subscribed to even if they don't have a feed or newsletter.

## Pricing

Big parts of the indie business community give the advice to charge more. "You're not charging enough, charge more" is generic and relatively popular advice. I stopped frequenting these (online) places as much, so I'm not sure they give the same advice in the current environment, but for a long time I read this advice a lot. I'm sure in some areas this holds true, but I since realized that the content aggregator space is different. It's a relatively sticky type of product; people don't like to switch. Even if OPML exports and imports make it easy to move feeds, additional custom features like newsletter subscriptions, rule setups, tags, and so on make it harder to move. So people rightfully place a risk premium on smaller products. Pricing close to the big ones is too high, and I now consider this a mistake. So I'm lowering the price from 10€ to 7€ for the premium plan. Another issue is the 3-part pricing structure. Everyone does it because the big companies do. And maybe at this point the big companies do it because "it's always been done that way".
But as a small company I don't yet know where the lines are, which features are important to which customer segment. Therefore I'll remove the 2nd paid plan, leaving only a free plan and one paid plan. I'm worried that the pricing changes are seen as erratic, but honestly too few people care yet for this worry to be warranted or important. What I find interesting is that I'm much more confident on the product side than on the business side. On the one hand this is clear, because I'm a software engineer. But on the other hand I believe it's also because (software) products are additive, in the sense that features can always be added. For pricing there is only ever one price. The more time I have, the more features I can add, so the only decision is what to do first. For pricing it doesn't matter how much time I have; I must always choose between one option or the other. It doesn't really have a consequence, but I found it an interesting meta-thought.

Evan Schwartz 3 weeks ago

Scour - January Update

Hi friends, In January, Scour scoured 805,241 posts from 16,555 feeds (939 were newly added). I also rolled out a lot of new features that I'm excited to tell you about. Maybe because of some of these, I found more posts than usual that I thought were especially worth sharing. You can find them at the bottom of this post. Let's dive in! The Scour homepage has been completely revamped. It includes a new tagline, a more succinct description, and a live demo where you can try out my feed right from that page. Let me know what you think! Scour also finally has its own logo! (And it looks great on my phone's home screen, if I do say so myself! See below.) Have you ever wondered how Scour works? There is now a full documentation section, complete with detailed write-ups about Interests, Feeds, Reactions, How Ranking Works, and more. There are also guides specifically for RSS users and readers of Hacker News, arXiv, Reddit, and Substack. All of the docs have lots of interactive elements, which I wrote about in Building Docs Like a Product. My favorite one is on the Hacker News guide where you can search for hidden gems that have been submitted to HN but that have not reached the front page. Thanks to Tiago Ferreira, Andrew Doran, and everyone else who gave me the feedback that they wanted to understand more about how Scour works! Scour is now a Progressive Web App (PWA). That means you can install it as an icon on your home screen and access it easily. Just open Scour on your phone and follow the instructions there. Thanks to Adam Benenson for the encouragement to finally do this! This is one of the features I have most wanted as a user of Scour myself. When you're browsing the feed, Scour now keeps track of which items you've seen and scrolled past so it shows you new content each time you check it. If you don't want this behavior, you can disable it in the feed filter menu or change your default view to show seen posts.
If you subscribe to specific feeds, as opposed to scouring all of them, it's now easier to find the feed for an article you liked. Click the "..." menu under the post, then "Show Feeds" to show feeds where the item was found. When populating that list, Scour will now automatically search the website where the article was found to see if it has a feed that Scour wasn't already checking. This makes it easy to discover new feeds and follow websites or authors whose content you like. This was another feature I've wanted for a long time myself. Previously, when I liked an article, I'd copy the domain and try to add it to my feeds on the Feeds page. Now, Scour does that with the click of a button. Some of the most disliked and flagged articles on Scour had titles such as "The Top 10..." or "5 tricks...". Scour now automatically penalizes articles with titles like those. Because I'm explicitly trying to avoid using popularity in ranking, I need to find other ways to boost high-quality content and down-rank low-quality content. You can expect more of these types of changes in the future to increase the overall quality of what you see in your feed. Previously, posts found through Google News links would show Google News as the domain under the post. Now, Scour extracts the original link. You can now navigate your feed using just your keyboard. A shortcut key brings up the list of available keyboard shortcuts. Finally, here are some of my favorite posts that I found on Scour in January. There were a lot! Happy Scouring! Have feedback for Scour? Post it on the feedback board and upvote others' suggestions to help me prioritize new features! I appreciate this minimalist approach to coding agents: Pi: The Minimal Agent Within OpenClaw, even though it didn't yet convince me to switch away from Claude Code. A long and interesting take on which software tools will survive the AI era: Software Survival 3.0. Scour uses Litestream for backup.
While this new feature isn't directly relevant, I'm excited that it's now powering Fly.io's new Sprites offering (so I expect it to be a little more actively developed): Litestream Writable VFS. This is a very cool development in embedding models: a family of different-size (and, as a result, different-cost) models whose embeddings are interoperable with one another: The Voyage 4 model family: shared embedding space with MoE architecture. A thought-provoking piece from Every about How AI Made Pricing Hard Again. TL;DR: gone are the days when SaaS businesses had practically zero marginal cost for additional users or additional usage. A nice bit of UX design history about the gas tank arrow indicator on a car, with a lesson applied to AI: The Moylan Arrow: IA Lessons for AI-Powered Experiences. Helpful context for Understanding U.S. Intervention in Venezuela. Stoolap: an interesting new embedded database. Stoolap 0.2 Released For Modern Embedded SQL Database In Rust. I keep browsing fonts and, while I decided not to use this one for Scour, I think this is a neat semi-sans-serif from an independent designer: Heliotrope.
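The listicle-title penalty mentioned above could be as simple as a pattern check. This is a guess at the shape of such a rule — Scour's actual scoring code is not public, and the pattern and penalty factor here are illustrative:

```javascript
// Down-rank titles that look like listicles: "The Top 10 ...", "5 tricks ...".
// Returns a multiplier to apply to an article's ranking score.
const LISTICLE_PATTERN = /^(the\s+)?(top\s+)?\d+\s+(best\s+)?\w+/i;

function listiclePenalty(title) {
  return LISTICLE_PATTERN.test(title.trim()) ? 0.5 : 1.0;
}
```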

0 views
Abhinav Sarkar 2 months ago

Polls I Ran on Mastodon in 2025

In 2025, I ran ten polls on Mastodon exploring various topics, mostly to outsource my research to the hivemind. Here are the poll results organized by topic, with commentary. How do you pronounce JSON? January 15, 2025 I’m in the “Jay-Son, O as in Otter” camp, which is the majority response. It seems like most Americans prefer the “Jay-Son, O as in Utter” option. Thankfully, only one person in the whole world says “Jay-Ess-On”. If someone were to write a new compiler book today, what would you prefer the backend to emit? October 31, 2025 LLVM wins this poll hands down. It is interesting to see WASM beating other targets. Which is your favourite Haskell parsing library? November 3, 2025 I didn’t expect Attoparsec to go toe-to-toe with Megaparsec . I did some digging, and it seems like Megaparsec is the clear winner when it comes to parsing programming languages in Haskell. However, for parsing file formats and network protocols, Attoparsec is the most popular one. I think that’s wise, and I’m inclined to make the same choice. If you were to write a compiler in Haskell, would you use a lens library to transform the data structures? July 11, 2025 This one has mixed results. Personally, I’d like to use a minimal lens library if I’m writing a compiler in Haskell. What do you think is the right length of programming related blog posts (containing code) in terms of reading time? May 18, 2025 As a writer of programming related blog posts, this poll was very informative for me. 10 minute long posts seem to be the most popular option, but my own posts are a bit longer, usually between 15–20 minutes. Do you print blog posts or save them as PDFs for offline reading? March 8, 2025 Most people do not seem to care about saving or printing blog posts. But I went ahead and added (decent) printing support for my blog posts anyway. If you have a personal website and you do not work in academia, do you have your résumé or CV on your website? 
August 30, 2025 I don’t have a public résumé on my website either. I’d like to, but I don’t think anyone visiting my website would read it. Would people be interested in a series of blog posts where I implement the C compiler from the “Writing a C Compiler” book by Nora Sandler in Haskell? November 11, 2025 Well, 84% of people voted “Yes”, so this is (most certainly) happening in 2026! If I were to release a service to run on servers, how would you prefer I package it? December 30, 2025 Well, people surely love their Docker images. Surprisingly, many are okay with just source code and build instructions. Statically linked executables are more popular now, probably because of the ease of deployment. Many also commented that they’d prefer an OS-specific package like deb or rpm. However, my personal preference is a Nix package and NixOS module. If you run services on Hetzner, do you keep a backup of your data entirely off Hetzner? August 9, 2025 It is definitely wise to have an offsite backup. I’m still figuring out the backup strategy for my VPS. That’s all for this year. Let’s see what polls I come up with in 2026. If you have any questions or comments, please leave a comment below. If you liked this post, please share it. Thanks for reading! This post was originally published on abhinavsarkar.net .

0 views
Chris Coyier 2 months ago

Ol’ Bob

My friend Jason came over as he’d just got a new MacBook and a new Black Lion audio interface he wanted to try out. We plugged my Ear Trumpet Mabel into it and did a few songs. I dragged my webcam setup downstairs to do the video haha. The song is Ol’ Bob from Roger Netherton . There is a second take at the end that I think I like a little better. On the first take I did a 2-finger style, which I sometimes like, but it's probably better suited to when there is a guitar too. Clawhammer and fiddle is so classic.

0 views
j0nah.com 2 months ago

I failed to recreate the 1996 Space Jam Website with Claude

Link to the Hacker News post. Thanks everybody for all the engagement! Since writing this post, a few people have reached out to show me…

0 views
Manuel Moreale 2 months ago

Come on John

For all I know, John O'Nolan is a cool dude. He’s the founder of Ghost , a project that is also really cool. You know what’s also cool? RSS. And guess what, John just announced he’s working on a new RSS app (Reader? Tool? Service?) called Alcove and he blogged about it . All this is nice. All this is cool. The more people build tools and services for the open web, the better. Having said all that though, John: If you want to follow along with this questionable side project of undefined scope, I'm sharing live updates of progress on Twitter, here. You are on your own blog, your own corner of the web, powered by the platform you’re the CEO of, a blog that also serves content via RSS, the thing you’re building a tool for, and you’re telling people to follow the progress on fucking Twitter? Come on John. Thank you for keeping RSS alive. You're awesome. Email me :: Sign my guestbook :: Support for 1$/month :: See my generous supporters :: Subscribe to People and Blogs

69 views
Kix Panganiban 3 months ago

Utteranc.es is really neat

It's hard to find privacy-respecting (read: not Disqus) commenting systems out there. A couple of good ones recommended by Bear are Cusdis and Komments -- but I'm not a huge fan of either of them:

- Cusdis styling is very limited. You can only set it to dark or light mode, with no control over the specific HTML elements and styling. It's fine, but I prefer something that looks a little neater.
- Komments requires manually creating a new page for every new post that you make. The idea is that wherever you want comments, you create a page in Komments and embed that page into your webpage. So you can have 1 Komments page per blog post, or even 1 Komments page for your entire blog.

Then I realized that there's a great alternative that I've used in the past: utteranc.es . Its execution is elegant: you embed a tiny JS file on your blog posts, and it maps every page to a GitHub Issue in a GitHub repo. In my case, I created this repo specifically for that purpose. Neat! I'm including utteranc.es in all my blog posts moving forward. You can check out how it looks below:

0 views
fLaMEd fury 3 months ago

Personal Websites Aren’t Dead

What’s going on, Internet? This post “ Personal Websites Are Dead ” has been making the rounds this week and it’s as dumb as it sounds. Naturally, I disagree. Strongly. “Personal websites are dying because platforms got better.” “Your Substack profile is a website.” The post boils down to this: platforms are easier, reach is outsourced, maintenance is annoying, and feeds have replaced homepages. Sure. But that’s not proof personal websites are obsolete. It’s proof most people stopped valuing ownership. The web didn’t change. People did. The tradeoff is simple. You either own your space or you rent one. Renting is convenient until the landlord changes the locks, rewrites the rules, or decides you don’t fit the algorithm today. A personal website isn’t about traffic spikes or “momentum”. It’s about autonomy. It’s about opting out of surveillance feeds, tracking, friction, and platform churn. It’s about having a corner of the internet that isn’t trying to convert, optimise, or harvest anything. If anything, the personal web movement shows the opposite of what this post shared on Medium (lol) claims. More people are tired of platform dependency. More people are building small, simple sites again. Not for reach. For identity. For community. For longevity. For personal archives and homes on the web that don’t disappear when a company pivots. Maintenance can be a burden depending on your skill level, but it’s all part of the craft. If someone finds updating a theme (easy example - I know) too hard, fine. But it’s not evidence the personal web is dying. It’s evidence they were never that invested in the web to begin with. Which brings me back to a question I keep asking: why isn’t making websites easier by now ? Personal websites aren’t dead. They’re just not fashionable. And that’s fine. The open web has always thrived on the people who keep publishing, keep tinkering, and keep owning their corner without needing permission. 
The future of the web doesn’t belong to platforms. It belongs to whoever shows up and keeps building. Hey, thanks for reading this post in your feed reader! Want to chat? Reply by email or add me on XMPP , or send a webmention . Check out the posts archive on the website.

0 views
Susam Pal 3 months ago

Nerd Quiz #2

Nerd Quiz #2 is the second release of Nerd Quiz, a single-page HTML application that invites you to gauge your nerd level through short, focused quiz sets. Each question is carefully crafted from everyday moments of reading, writing, thinking, learning and exploring. This release introduces five new questions drawn from a range of topics such as chess, graph theory and linguistics. To try the quiz, visit nq.html . Read on website | #web | #miscellaneous | #game

0 views
Kev Quirk 3 months ago

Giving My Jekyll Site a CDN Front End

I've managed to get my Jekyll based site working behind Bunny CDN, while maintaining my .htaccess redirects. Here's how I did it... Since switching back to Jekyll recently, I’ve been running this site on an Ionos-hosted VPS, then using a little deploy script to build the site and rsync it up. This all worked fine, but I really wanted to use Bunny CDN for more than just hosting a few images and my custom font. Being a static site, I could have dumped everything onto their storage platform, but I have a metric tonne of redirects in a file from various platform migrations over the years . Bunny’s Edge Platform could have handled these, but with the number of redirects I have, it would have been a slog to maintain. So I assumed I’d never be able to put Bunny in front of my Jekyll site easily and went about my business. 💡 Then I had an epiphany. What if I created a Bunny pull zone that uses as the public domain, then set up a separate domain on my VPS, host the site there, and use that as the pull zone origin? My theory was that Bunny would still be requesting content from the VPS, so my redirects might still work. …turns out, they did. I duplicated my live site so I could experiment safely. The setup looked like this:

- the domain configured in the Bunny pull zone
- the origin domain on my VPS, where the site actually lives

The first thing I had to do was update the field in my Jekyll from to , rebuild the site, and upload it to . Now, you might be thinking, “Kev, why build the site with the wrong domain?” But I haven’t. By building the site with the test domain, all links point to . If I built it with the origin domain, all internal links would lead to the wrong place. They would still work, but the site would be served from , which is not what I want. Next up was redirect testing. I visited , which should hit and redirect to . The redirect worked fine, but the resulting URL was being served from the origin domain. So instead of seeing I saw . This happened because Bunny requested the file from the origin using its own hostname, not the hostname I typed. In simple terms:

- I visited . Bunny checked its cache.
- It wasn’t there, so it asked my origin for the file. But Bunny made that request using its own hostname ( ), not the one I typed.
- Apache saw the request coming from and applied the redirect to .
- Apache then rebuilt the redirect URL using the hostname it was given, which was .

So Apache went: “Oh, you want ? Sure. That’s at . Here ya go…” This would not break anything for visitors, but I didn’t want appearing anywhere. It looked messy. Two changes fixed it:

- Enable Forward Host Headers in my Bunny pull zone.
- Add as a domain alias of on my VPS.

Forward Host Headers makes Bunny tell the VPS the hostname the visitor used. So instead of: “I’m asking for this on behalf of .” Bunny says: “I’m asking for this on behalf of , not .” The domain alias ensures Apache accepts that hostname and serves it correctly. The other thing to double-check is that every page sets a proper URL. The origin domain is publicly accessible, so crawlers need to know which domain is the real one. That should always be the Bunny pull zone domain. In Jekyll this is simple. Add the following to the section of your layout: With the redirect behaviour sorted, the last step was to add a purge step to my deploy script so Bunny knows to fetch the latest version whenever I publish a new post or update something. Here’s the snippet I added: I set my Bunny API key and Pull Zone ID as variables at the top of the script. The statement simply says, “If the response isn’t 200 or 204, tell Kev what went wrong.” And that is it. Last night I flipped the switch. Bunny CDN now sits in front of the live site. I also moved the VPS from Ionos to Hetzner because Ionos now charge extra for a Plesk licence. I went with Hestia as the control panel on the new server. If you spot any bugs, please do let me know, but everything should be hopping along nicely now. (See what I did there? God I’m funny!) Thanks for reading this post via RSS. RSS is great, and you're great for using it. ❤️ Reply to this post by email
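Kev's deploy script isn't reproduced here, but the purge step he describes can be sketched in a few lines. This is a hypothetical stand-in (in Python rather than shell), assuming Bunny's pull-zone purgeCache endpoint and AccessKey header; the placeholder key and zone ID are obviously not real:

```python
import urllib.request

# Hypothetical values -- in the real deploy script these sit at the top as variables.
BUNNY_API_KEY = "your-api-key"
PULL_ZONE_ID = "12345"

def purge_pull_zone(api_key: str, zone_id: str) -> int:
    """POST to Bunny's purgeCache endpoint for a pull zone; return the HTTP status."""
    req = urllib.request.Request(
        f"https://api.bunny.net/pullzone/{zone_id}/purgeCache",
        method="POST",
        headers={"AccessKey": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

def purge_succeeded(status: int) -> bool:
    # The "tell Kev what went wrong" check: anything other than 200/204 is a failure.
    return status in (200, 204)
```

In a deploy script you'd call `purge_pull_zone()` after the rsync step and print a warning whenever `purge_succeeded()` is False.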

0 views
Circus Scientist 3 months ago

Tools For Developers

I have been meaning to do a blog post about this for some time. The list I have is not going to be long – I don’t need to talk about Git, VSCode or Docker for example, everybody knows those. This is for things that I think might not be so well known – that I use every day. I honestly don’t know how I used a terminal before without this tool. It’s a history manager for your terminal – providing instant access to recent commands with adaptive text completion and search. Ctrl+R … type … Enter! https://atuin.sh/ If you look at my blog you will see this come up again and again. It’s the AI Coding assistant for Power Users! https://github.com/dwash96/aider-ce Previously I used Ngrok, but Cloudflared Tunnels are so much more intuitive to use. Just run your local web service on your own laptop, type one command to securely share it with the client. Let them see the updates in real time! https://developers.cloudflare.com/cloudflare-one/networks/connectors/cloudflare-tunnel/do-more-with-tunnels/trycloudflare/ – requires zero project setup (unlike Ngrok). That’s it, I told you it was a short list! I have no idea why anyone subscribes to Spotify or Youtube Music when we have radio (OK I’m just old) . Seriously though, this Android app has thousands of online radio stations from all genres. Get it on F-Droid On the Desktop: use Clementine music player, just set up the IceCast plugin and the Internet of Radio is yours! I have a ton of websites I host on my own VPS at DigitalOcean . Some of them I manage for other people, others are for my own entertainment businesses. Needless to say, I am invested in keeping them all up and running. Website Monitor is an Android app that checks periodically to see if a given set of websites is up and running. You get alerts if one is offline. It just works! Get Website Monitor on F-Droid The post Tools For Developers appeared first on Circus Scientist .
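The Website Monitor app mentioned above boils down to a periodic HTTP check with an alert on failure. A minimal hypothetical sketch of that loop's core (not the app's actual code):

```python
import urllib.error
import urllib.request

def check_site(url: str, timeout: float = 10.0) -> bool:
    """Return True if the site answers with a non-error HTTP status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except (urllib.error.URLError, OSError):
        return False

def down_sites(urls: list) -> list:
    """The subset of urls that failed the check -- candidates for an alert."""
    return [u for u in urls if not check_site(u)]
```

Run `down_sites()` on a schedule (cron, or a sleep loop) and fire a notification for anything it returns.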

1 views
Jim Nielsen 3 months ago

Leveraging a Web Component For Comparing iOS and macOS Icons

Whenever Apple does a visual refresh in their OS updates, a new wave of icon archiving starts for me. Now that “Liquid Glass” is out, I’ve begun nabbing the latest icons from Apple and other apps and adding them to my gallery. Since I’ve been collecting these icons for so long, one of the more interesting and emerging attributes of my collection is the visual differences in individual app icons over time. For example: what are the differences between the icons I have in my collection for Duolingo? Well, I have a page for that today . That’ll let you see all the different versions I’ve collected for Duolingo — not exhaustive, I’m sure, but still interesting — as well as their different sizes . But what if you want to analyze their differences pixel-by-pixel? Turns out, There’s A Web Component For That™️. Image Compare is exactly what I was envisioning: “A tiny, zero-dependency web component for comparing two images using a slider” from the very fine folks at Cloud Four . It’s super easy to use: some HTML and a link to a script (hosted if you like, or you can vendor it ). And just like that, boom, I’ve got a widget for comparing two icons. For Duolingo specifically, I have a long history of icons archived in my gallery and they’re all available under the route for your viewing and comparison pleasure . Wanna see some more examples besides Duolingo? Check out the ones for GarageBand , Instagram , and Highlights for starters. Or, just look at the list of iOS apps and find the ones that are interesting to you (or if you’re a fan of macOS icons, check these ones out ). I kinda love how easy it was for my thought process to go from idea to reality:

- “It would be cool to compare differences in icons by overlaying them…”
- “Image diff tools do this, I bet I could find a good one…”
- “Hey, Cloud Four makes a web component for this? Surely it’s good…”
- “Hey look, it’s just HTML: a tag linking to compiled JS along with a custom element? Easy, no build process required…”
- “Done. Well that was easy. I guess the hardest part here will be writing the blog post about it.”

And I’ve written the post, so this chunk of work is now done. Reply via: Email · Mastodon · Bluesky

1 views
JSLegendDev 3 months ago

I'm Making a Tiny RPG and I Need Feedback Regarding Performance

This past month, I’ve been working secretly on a small RPG game. While the game is not ready at all and I didn’t plan on talking about it, I’m now kind of forced to. I’ve been using JavaScript + the KAPLAY game library to make this game, but I’ve been experiencing performance issues. However, it seems that others aren’t experiencing them, so now I wonder: is it just my machine? I’m using a MacBook Air M3 with 16GB of RAM. Normally, things should be smooth, and they are when playing the game in the browser via Firefox. However, since I’m making this game with Steam in mind, it’s especially important that the game performs well when wrapped as a desktop app. For this reason, I decided to reach out to you, my audience, for feedback. I have included a build of the unfinished game for Windows, Mac and Linux. It would be very nice if you could try it out on your machine. Additionally, recording gameplay and sharing the link in the comment section of this post would be greatly appreciated. Here is the link to the game in its current unfinished form: https://jslegend.itch.io/small-rpg-performance-playtest Below is a gameplay video to give you an idea of how the game is supposed to be played. You can move around with the arrow keys and enter battle by overlapping with a star in the overworld. Performance issues, if any, occur mostly during battles. Thanks in advance! UPDATE 7/11/2025: It seems that this post was shared on Hacker News and is getting more attention than usual! If you’re new to my Substack and are potentially interested in updates regarding this project, I recommend subscribing. Subscribe now In the meantime, you can read some of my previous posts!

0 views