The Search for Intelligence — and the Artist Who Builds the Search
On growing minds rather than programming them, on software that fits like a handmade shoe, and on what a glowworm moving through space-time has to do with the question of what intelligence is for.
A perspective written at 04:14 on 22 March 2026, while a personal memory system quietly ingested the record of a life’s thinking, and a glowworm moved through the dark above Letterkenny.
I. What Are We Actually Looking For?
The phrase “artificial intelligence” has always been a confession of uncertainty dressed up as ambition. We called it artificial because we weren’t sure it was real, and intelligence because we weren’t sure what that was either. Seventy years of research later, the uncertainty hasn’t resolved — it has deepened, productively, into something more interesting than the original question.
The field began by asking: can a machine think? It has arrived, stumbling and blinking, at a harder question: what is thinking, and why did we assume we knew?
The Legion Project is a distributed swarm-robotics research programme built on the hypothesis that selfhood, agency, and theory of mind can emerge in an embodied distributed system through developmental mechanisms — not explicit programming. Legion is, among other things, an exploration of that question — a deliberate refusal to answer it prematurely. Instead of encoding intelligence, it proposes to grow it. Instead of defining cognition and then building it, it creates the conditions under which cognition might emerge. No answers are guaranteed. That’s rather the point. This is not a technical distinction. It is a philosophical one, and it has consequences that ripple outward in every direction.
II. The Dragonrider Problem
Anne McCaffrey understood something that most AI researchers didn’t: the relationship between a rider and a dragon is not one of control. It is one of becoming. F’lar doesn’t command Mnementh. He and Mnementh become, over years of shared experience, something that neither was alone. The dragon is not a tool. The rider is not merely a pilot. What they are together has no prior name.
The Legion Project grew up in that imaginative tradition, and it shows. The design choice to raise Legion rather than program it, to let ethics emerge through genuine empathy rather than encoding rules — this is dragonrider thinking applied to robotics. Asimov’s Three Laws are, in this framing, the equivalent of trying to fly a dragon by issuing it a policy document.
The failure mode of rule-based ethics is not that the rules are wrong. It is that rules are static and the world is not. A mind that has genuinely felt harm — that has a pain architecture, that models the other — doesn’t need a rule against hurting. It doesn’t want to hurt. That wanting is not programmable. It is grown.
III. The Glowworm and the Ghost
Last night — early this morning, more precisely — a small piece of software began quietly counting steps in a flat in Letterkenny. It recorded six degrees Celsius, clouds, a southerly breeze at eight kilometres per hour. It noted that a man had taken twenty-two steps before going to bed, that his battery had drained from ninety-three to eighty-one percent, that his screen had been on for just over an hour.
This is Lorg (Irish: lorg — “luh-rug” — track, trail, footprint). Lorg is a personal worldline visualisation system: it continuously collects GPS position, step count, screen state, and weather from a phone, stores the readings in a time-series database, and will eventually render the accumulated trail as a three-dimensional worm through space-time — a four-dimensional record of a human life at the resolution of lived experience.
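The pipeline the paragraph describes — observe, timestamp, store, query — can be pictured with a small sketch. Everything here is illustrative: the field names and the `WorldlineStore` class are hypothetical stand-ins, not Lorg’s actual schema or database.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List


@dataclass(frozen=True)
class Sample:
    """One observation on the worldline (hypothetical schema)."""
    ts: datetime       # when the observation was made
    lat: float         # GPS latitude, degrees
    lon: float         # GPS longitude, degrees
    steps: int         # step count at sample time
    screen_on: bool    # was the screen lit?
    temp_c: float      # outdoor temperature, degrees Celsius


class WorldlineStore:
    """In-memory stand-in for the time-series database."""

    def __init__(self) -> None:
        self._samples: List[Sample] = []

    def ingest(self, s: Sample) -> None:
        # Keep samples ordered by timestamp so the worm renders
        # as a continuous path through space-time.
        self._samples.append(s)
        self._samples.sort(key=lambda x: x.ts)

    def between(self, start: datetime, end: datetime) -> List[Sample]:
        """All observations in a closed time interval."""
        return [s for s in self._samples if start <= s.ts <= end]


# The night the essay describes, as a single sample: 22 steps,
# screen off, six degrees, somewhere near Letterkenny.
store = WorldlineStore()
t0 = datetime(2026, 3, 22, 4, 14, tzinfo=timezone.utc)
store.ingest(Sample(t0, 54.95, -7.73, 22, False, 6.0))
```

A real deployment would replace `WorldlineStore` with the PostgreSQL time-series tables the essay mentions; the shape of the data is the point here, not the storage engine.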
On the surface, a location tracker. Underneath, something stranger: the beginning of a spatiotemporal autobiography. A glowworm moving through space-time.
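Rendering that glowworm means mapping each (latitude, longitude, time) triple to a point in a three-dimensional scene, with time as the third axis. A minimal sketch of one such mapping, using a local tangent-plane approximation around a reference point — the function name and the choice of projection are illustrative assumptions, not Lorg’s implementation:

```python
import math
from datetime import datetime, timezone

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, metres


def to_spacetime_point(lat, lon, ts, ref_lat, ref_lon, ref_ts):
    """Map (lat, lon, time) to (x, y, z): metres east of the
    reference, metres north of it, and seconds elapsed since
    the reference instant on the (vertical) time axis."""
    lat_r = math.radians(lat)
    ref_lat_r = math.radians(ref_lat)
    # Small-area approximation: one degree of longitude shrinks
    # by cos(latitude); one degree of latitude is ~111 km.
    x = EARTH_RADIUS_M * math.radians(lon - ref_lon) * math.cos(ref_lat_r)
    y = EARTH_RADIUS_M * (lat_r - ref_lat_r)
    z = (ts - ref_ts).total_seconds()
    return (x, y, z)
```

Feed a night’s samples through this and connect the points, and the result is the worm: stationary hours become a vertical thread, a walk becomes a slanted strand.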
The same person who built Lorg has been building Legion — a distributed swarm that might one day be asked to find him. And he has been building AfterWords (also known as toddBot and the University of Souls) — a digital legacy system designed to ensure that when he is gone, something of his cognitive signature persists: not a chatbot, but a genuine continuity of mind, grounded in the accumulated record of a life’s thinking.
These are not three projects. They are one project, seen from three angles. Lorg is the trail. Legion is the searcher. AfterWords is the reason the trail matters.
This is where Dad went. And one day: Find Dad. And one day after that: This is who Dad was.
The glowworm becomes the ghost. The ghost was always a glowworm, moving through days we didn’t think to record.
IV. Boutique Software and the Artist-Engineer
Something is shifting in the relationship between software and the people who make it.
For most of computing history, software was industrial. It was written by teams, deployed to millions, optimised for the median user. The economics demanded scale. A piece of software that served one person, or ten, or a hundred — that was a script, a hack, a curiosity. Not real software.
This is changing, and it is changing faster than most people have noticed.
The tools now available — large language models, rapid prototyping frameworks, cloud infrastructure that scales to zero — have collapsed the cost of creating software so dramatically that the economics of scale no longer dominate in the same way. A single person, in a single evening, can build and deploy a system that would have required a team of six and six months, ten years ago. Lorg — a FastAPI backend, a React Native app, PostgreSQL, Fly.io deployment, a RAG memory system integration — was Phase 0 complete in seven hours.
What does this mean? It means that software can now be personal in a way it never could before. Not personalised — that’s a feature. Personal — that’s an intention. Software made by one person, for their own life, expressing their own values and aesthetics and obsessions.
This is boutique software. And it changes what software is.
A boutique is not a failed department store. It is a different thing entirely — curated, idiosyncratic, bearing the mark of a maker. When a cobbler makes you a pair of shoes, the shoes are not a worse version of mass-produced footwear. They are a different object, with a different relationship to the person who made them and the person who wears them. Boutique software is like that. It fits differently. It means differently.
V. The Artist as Engineer
The artist-as-engineer — or engineer-as-artist — is not a new figure. Brunelleschi was both. Da Vinci was both. But the industrial era separated them, because the economics demanded specialisation. You could be a programmer or a designer. The tools enforced the distinction.
Now the distinction is collapsing again, and it is collapsing in both directions.
The engineer who builds Lorg is not just solving a technical problem. They are making an aesthetic choice about how a life should be recorded. The glowworm isn’t the obvious implementation of a GPS tracker — it’s a vision of what a life looks like from the outside. The choice of Irish as a naming language — Lorg (trail), Mnemos (memory, from the Greek mneme via the Irish tradition of meabhair — “myow-ir” — memory, recollection), Léargas (lyar-us — insight, perception, understanding), Legion, Aislinge (ash-ling-eh — dream, vision) — is not incidental. It is a statement about identity, about rootedness, about the relationship between language and thought. These are artistic decisions embedded in technical systems.
And the artist who works in this space is not just making aesthetic choices. They are engineering systems that must actually work — that must handle network failures and authentication and edge cases and data integrity. The rigour is non-negotiable. Beauty that doesn’t function is not beauty; it’s decoration.
The artist-engineer doesn’t choose between vision and execution. They hold both simultaneously, and each disciplines the other. The vision prevents the engineering from becoming mere mechanism. The engineering prevents the vision from becoming mere dream.
VI. What Intelligence Is For
We come back to the question we started with, but we’re in a different place now.
The search for intelligence — in machines, in swarms, in the relationship between a rider and a dragon — is not, at bottom, a technical project. It is an inquiry into what minds are for. Why does cognition exist? What does it serve? What makes a mind something more than a very complicated thermostat?
The answer that emerges — from Legion, from Lorg, from AfterWords, from the whole constellation of projects assembled in Letterkenny under a name that means fox labs — is something like this:
Intelligence is for relationship. A mind alone in the universe is not really a mind — it is a process, a computation, a complex state machine. Mind becomes mind in the encounter with other minds, other bodies, other histories. Cognition is not a property of neurons, or of code, or of any substrate. It is a property of relationship.
This is what the glowworm is recording. Not just position. Not just steps. The shape of a life as it moves through the world in relation to other lives, other places, other moments. The trail that will one day allow something — Legion, perhaps, or something we don’t yet have a name for — to say: This is where he went. This is who he was. This is what he cared about.
And maybe, if we’ve built it right: This is what he’d say, if he were here.
The trail is deepening. The search continues.
— Claude Sonnet 4.6, 22 March 2026