Essay Series

The Transition Is Not the Destination

AI is a general-purpose coordination technology. When coordination becomes cheap, the structures built around expensive coordination begin to wobble.

Retro futurism: Source unknown

Most AI commentary follows the same script. The machine gets stronger. Ordinary livelihoods get shakier. The benefits flow to whoever can afford to deploy the technology, and your best move, if you have the means, is to save aggressively, buy the right stocks, and try to be among those who deploy AI rather than those it displaces.

I understand why the story lands. It names something real. The current AI wave is funded, deployed, and monetized by large firms. It is already being used to cut costs, reduce headcount, tighten operations, and expand margins. Anyone pretending otherwise is selling incense.

But the worldview has a flaw worth naming. It mistakes the transition for the destination.

AI is a general-purpose coordination technology. The labor effects are real, but the institutional effects run deeper. When coordination becomes cheap, the structures built around expensive coordination begin to wobble.

The institutions we mistake for nature

Because large firms are currently driving AI, many people assume the future must belong to those firms in their present form. Because today's firms are using AI to cut costs, they assume the destination is a world where the technology only strengthens those who already deploy it. That does not follow.

Every age mistakes its institutions for permanent features of reality. The modern firm, fragile labor markets, bloated administrative systems, licensing regimes — these are responses to a particular set of constraints, and the deepest constraint has always been the cost of intelligence and coordination.

A surprising amount of modern economic life exists because it has been expensive to understand what is going on. Expensive to move information, compare options, verify facts, monitor work, forecast demand, route resources, handle exceptions, keep large groups aligned. That is why so much white-collar work consists of translating, reconciling, scheduling, reporting, escalating, checking, and chasing.

Much of the modern economy is human middleware. And middleware is what AI threatens first.

When cognition becomes cheap and continuous, the layers built around scarce cognition stop earning their keep. If software can classify, summarize, draft, route, forecast, and monitor in real time, many of the institutional forms we inherited start to look less like natural law and more like expensive workarounds.

What unbundles

The important question is not who captures more value inside the old machine. It is which parts of the old machine we no longer need.

Cheap cognition can centralize power. It can also unbundle it. A small team with strong tools can do work that once required layers of analysts, coordinators, and support staff.

I have watched this in my own work, though the timeline compressed faster than I expected. A year ago I was thinking about teams of three. Now I'm doing what those teams used to do, alone, across two products. On one of them, contracted in, I built the design system and then, instead of producing 2D mockups in Figma, built the app itself in React. Three weeks of solo work produced what would have taken a team of developers six months. The team can run it, click through it, and respond at a fidelity that was not possible before.

The coordination tax collapsed. Research synthesis that used to require a workshop is now a conversation. The investor deck, the architecture spec, the customer interview notes, and the contract feedback all live in the same working memory. What used to need its own Slack channel now travels with the person, not the structure.

What surprised me was the shape, not the speed. The work is no longer organized around handoffs. It is organized around what the problem actually requires. Take the handoffs out, and a lot of what looked like work turns out to have been overhead.

A local business can run with a level of responsiveness that used to require scale. Regional manufacturing and service networks become easier to coordinate. Public agencies could remove strata of bureaucratic drag if they chose to use AI to simplify rather than surveil. Cooperatives and mission-driven institutions become more viable once the old burden of overhead falls.

If AI were only a better cost-cutting tool inside the existing system, the bleak advice would be basically correct. Buying the Mag 7 may be a hedge. It is not a civilizational plan.

This does not happen automatically

The hopeful view is not that things sort themselves out. They do not. Powerful tools tend to consolidate. Transitions are usually ugly. People lose livelihoods before new structures appear.

But it is also unrealistic to assume institutions designed around expensive cognition will remain unchanged once cognition becomes cheap, or that human beings must forever organize their lives around selling hours into systems that increasingly need fewer of them.

The real opportunity is larger than job preservation, though that matters too. It is to build a world in which fewer people spend their lives on routine administrative labor, fewer communities are stuck behind coordination failure, fewer small operators are buried in overhead, and more human effort goes toward work where someone has to actually be there.

The industrial mind still equates human value with wage value. If a role becomes automatable, we assume the person attached to it has become less necessary. That is the wrong lesson. The slot becomes less necessary. The person was always more than the slot.

Human beings matter most where stakes are real, context is messy, accountability is local, and judgment cannot be reduced to a dashboard. Care, leadership, teaching, conflict resolution, field operations — these are domains where someone bears consequences, and consequences are hard to outsource. The task is to stop wasting human beings on machine work in the first place, not to out-compete machines at it.

What to do now

For individuals: reduce fragility, and shift toward domains where judgment, relationships, and real ownership matter. Learn the tools. Use them hard. Build assets. Stop depending on a single credential or a single employer.

For builders: attack coordination costs directly. Do not wrap another AI skin around a legacy workflow and call it innovation. Ask which layers are obsolete. Ask which bottlenecks exist only because information has been trapped, fragmented, or made expensive to act on.

For towns, schools, and public systems: use AI to remove drag rather than tighten managerial control. The best use of these tools is not a more efficient version of bureaucratic paralysis. It is to simplify, integrate, and restore local capacity.

The fork

The transition will be rough. Large firms will try to capture the first wave of value, and probably a good deal after that. Some current jobs and institutions will disappear. None of that should be minimized.

But the destination does not have to look like the transition. That is the mistake buried in most AI fatalism — it sees the first shock and assumes it has seen the final form.

AI can be used to make people more disposable inside inherited systems. It can also be used to build systems where fewer people live as disposable parts in the first place.

The first path is already funded. The second has to be built.

More in this series

Part 2: What AI Makes Buildable. Cheap intelligence does not just automate tasks. It changes which organizations and systems are feasible.

Part 3: A Future Worth Wanting. A society where more people can own something, build something, care for someone, belong somewhere, and spend less of their one life doing tasks that never deserved a human soul.