Links, Jan 16, 2026
I made a big push this week to drain my link inboxes, and there was some great stuff in them.
Music Stuff
§Dronage Terminal
A terminal-based audio synthesizer featuring four instances of the open-sourced Braids by Mutable Instruments. It eschews traditional note names and MIDI control in favor of keyboard control, a built-in parameter sequencer, and direct frequency entry.
§QSynthi
A free audio plugin which explores “the sonic potential of the Schrödinger Equation”. To my ears it sounds a lot like additive synthesis and wavefolding, but it’s fun to see ideas like this take hold.
§JE8086
A free, open source bit-accurate emulation of the Roland JP-8000 – a beloved and pioneering analog modeling synthesizer released in the ’90s – available for Linux as well as more popular desktop operating systems. It requires a separate download of the firmware.
While its creators, The Usual Suspects, have released other emulations of classic synths, this one is notable for how it was reverse-engineered, as opposed to working from documented specifications: as this talk explains, they used automated computer-vision processes to identify microchips and the circuit layout, plus direct probing of the main computation chip. The process sounds similar to what Plogue uses for their excellent Chipsynth series, but it’s hard to tell because Plogue isn’t as open about their process.
§Fasttracker II Plugin
FastTracker II as an open-source, cross-platform audio plugin, based on the standalone port. I find it hard to be productive with trackers these days, but they were responsible for first giving me the feeling that I could actually compose music I liked. If you want to play with the tools a bit without going full Renoise, it’s a great way to get your feet wet.
Computer Stuff
§The Darnella test of social media and smartphone regulation
Heather Burns on the recent legislative trend of banning teens from social media:
And if you have never once stopped to reflect on how these sanctimonious proposals, as they always do, come from affluent white upper middle class Mrs Jellybys who live in bubbles of privilege with nannies and au pairs and bottomless budgets for advocacy campaigns run as personal crusades, that’s because you are probably one of them.
§Software Acceleration and Desynchronization
Fred Hebert, who I’ve linked to before, on the systems in which code exists in service of software engineering, and the disconnect that occurs when people believe the role of developers is to write code:
But code reviewing is not just about finding errors. It is also used to discuss maintainability, operational concerns, to spread knowledge and awareness, to get external perspectives, or to foster broader senses of ownership. These purposes, even if they could be automated or sped up, can all indicate the existence of other loops that people may have to maintain regardless.
§Code is a liability (not an asset)
Cory Doctorow is someone I’ve linked to more than Fred, but much less enthusiastically; you can search for him via the search facility above if you want to know why. The crux of this piece is in the title, and it’s on a very similar theme to Fred’s piece:
“Writing code” is about making code that runs well. “Software engineering” is about making code that fails well. It’s about making code that is legible – whose functions can be understood by third parties who might be asked to maintain it, or might be asked to adapt the processes downstream, upstream or adjacent to the system to keep the system from breaking. It’s about making code that can be adapted, for example, when the underlying computer architecture it runs on is retired and has to be replaced, either with a new kind of computer, or with an emulated version of the old computer.
§Web dependencies are broken. Can we fix them?
I’ve been taking some time away from the web ecosystem, and am working on a scratch-my-own-itch desktop app in Rust, specifically avoiding tools that merely embed a web view for their UI layer. It’s refreshing in many ways, not least of which is how you don’t have to worry about things like:
But the moment you add that first dependency, everything changes. You are suddenly faced with a huge usability cliff: which bundler to use, how to configure it, how to deploy with it, a mountain of decisions standing between you and your goal of using that one dependency.
This piece is about more than just npm and node_modules; it also explores the commonly presented alternatives and how, in many ways, they’re worse than what we have with bundlers and npm.
For a variety of reasons, I tackled this problem from first principles when building the rendering pipeline for this website. I wanted to be able to use things like solidjs for existing interactive pages like Golden Times or Segmenting Fields while having an entirely separate “build pipeline” for newer ones like Login Bingo or a piece I haven’t finished yet that makes heavy use of d3.js. It put me closer to the nuts and bolts of javascript build tools and their assumptions than I had been in over a decade, and it took some doing to come up with a solution I’m (still, thankfully) happy with: any page that needs its own assets has a frontmatter manifest that is in theory modular, but for now just uses esbuild to compile a bunch of source material into outputs, such as this one for Golden Times:
builds:
- type: esbuild
entry: goldentime.jsx
plugins:
- import: esbuild-plugin-solid
fn: solidPlugin
opts:
solid:
moduleName: 'solid-js_v1/web'
Javascript build tooling and dependency management aren’t built to create augmented pages, but rather to build monolithic apps, and this is very much the antithesis of the Larry Wall quote: “Easy things should be easy, and hard things should be possible.”
Like boiling frogs, JS developers have resigned themselves to immense levels of complexity and gruntwork as simply how things are.
The author of this piece also proposes some new techniques based on existing parts of the web stack, though I’m not entirely sold on them.
§Falsehoods People Believe About Computers
Some forms of technology-based harm are inherently unpredictable and no one should be punished for them, provided the revenue-receiving party either claims to have had good intentions or can point to some other party who is ultimately responsible.
I have had to spend a lot of time arguing with people about this one in particular; it seems a lot of people aren’t aware of The Unaccountability Machine.
§If users notice your software, you’re already a loser
You get a computer at all because you’ve got a job to do. So you get software to do the job. And you need to run that software on a machine, and it’ll have an operating system platform.
Neither the computer nor the platform are supposed to be noticed. If you notice it, your work crashes to a halt.
If you’re making a platform and anyone notices it, you’ve already lost.
Platforms must be transparent. All these platforms start transparent, then some marketer needs some resume juice, they make the platform go NOTICE ME and they think they’re the star of the show.
All the major mainstream software vendors seem to have forgotten this, they’re all ruled by people who scream NOTICE ME now.
§The struggle of resizing windows on macOS Tahoe
Per the previous link, Liquid Glass went hard into NOTICE ME territory. Design is about the things you’re not supposed to notice, like how you resize a window:
in the end, the most reliable way to resize a window in Tahoe is to grab it outside the corner – a gesture that feels unnatural and unintuitive, and is therefore inevitably error-prone
§Do you understand how fast computers are?
And yet, think about every interaction you have with a computer in public? Ticket machines asthmatically wheeze as they slowly trick you into buying the most expensive option. Passing through an airport seems to involve waiting impatiently at a number of desks while arthritic mainframes slowly coalesce your data. Advertising screens stutter and jerk their way through low-framerate videos trying to sell you perfume. Every time you speak to a call centre, I guarantee someone says “sorry, my system’s running slow today.”
One of my first jobs was working at the telephone support call center for a major package distributor, at the dawn of the world wide web; we worked at dumb terminals hanging off a mainframe, and most calls were of the can you look up this tracking number? variety. We had to code calls by category as we fielded them, and the expected average response time for providing tracking information was three minutes. Yes, things have improved, but not linearly with the improvements in hardware.
GenAI is eating itself
§Pwning Claude Code in 8 Different Ways
A few months ago, I came across an interesting behavior while using Claude Code—it executed a command without my approval.
YOLO vibes.
§AI and the Corporate Capture of Knowledge
Bruce Schneier joins the resistance, putting LLM chatbots in context with the fight that cost Aaron Swartz his life:
As search, synthesis and explanation are mediated through AI models, control over training data and infrastructure translates into control over what questions can be asked, what answers are surfaced, and whose expertise is treated as authoritative. If public knowledge is absorbed into proprietary systems that the public cannot inspect, audit or meaningfully challenge, then access to information is no longer governed by democratic norms but by corporate priorities.
§Keeping Bandcamp Human
Bandcamp is banning AI-generated music from their platform:
We believe that the human connection found through music is a vital part of our society and culture, and that music is much more than a product to be consumed.
This is, however, the same company that fired a bunch of their workforce when they tried to unionize.
§Why We Don’t Use AI
Yarnspinner makes tools for managing narrative in games (dialog trees, UI, localization, etc.), and they talk about their long journey exploring how they could integrate machine learning tools into their products, and why they’re not integrating with popular genAI tooling:
it was clear that the AI we liked was not what the tech companies were interested in. They were increasingly about generative imagery, chatbots writing your material for you, and summaries of art instead of exposure to it. Efforts to mitigate known problems (reinforcing cultural biases, being difficult to make deterministic or explainable) were disparaged and diminished. Researchers and developers who raised concerns were being fired.
§Warhammer Maker Games Workshop Bans Its Staff From Using AI in Its Content or Designs, Says None of Its Senior Managers Are Currently Excited About the Tech
I have been actively repulsed by military fiction my entire life, and as such never got into Warhammer, but as an artist I definitely understand the appeal of its artwork (both official and fandom) and lore. And it seems that Games Workshop understand this as well:
“We do have a few senior managers that are [experts on AI]: none are that excited about it yet. We have agreed an internal policy to guide us all, which is currently very cautious e.g. we do not allow AI generated content or AI to be used in our design processes or its unauthorised use outside of GW including in any of our competitions. We also have to monitor and protect ourselves from a data compliance, security and governance perspective, the AI or machine learning engines seem to be automatically included on our phones or laptops whether we like it or not.
“We are allowing those few senior managers to continue to be inquisitive about the technology. We have also agreed we will be maintaining a strong commitment to protect our intellectual property and respect our human creators. In the period reported, we continued to invest in our Warhammer Studio — hiring more creatives in multiple disciplines from concepting and art to writing and sculpting. Talented and passionate individuals that make Warhammer the rich, evocative IP that our hobbyists and we all love.”