
The Third Enlightenment

January 12, 2026

Tags: AI, work, enlightenment, human flourishing

Picture the old clerical offices—rows of workers at wooden desks, ledger books open, columns of numbers that had to balance by end of day. Each person doing exactly one function: this one adds, that one copies, another one verifies. The organization had learned to unbundle what looked like a single role into specialized tasks. A job wasn’t one thing—it was a bundle.

The same unbundling is happening now, at a scale those offices couldn’t have imagined.


An uncomfortable truth about how most organizations work: they treat people like interchangeable parts.

This isn’t malice. It’s necessity. When you’re running a hospital or a factory or a government office, you need to know that if Sarah quits, Marcus can step in and the work continues. You need standardized roles with defined responsibilities. You need the function to survive the person.

Max Weber called this bureaucracy, and he saw its efficiency as a genuine achievement. Compared to the chaos of patronage systems where your cousin got the job regardless of competence, bureaucracy was progress. Hire the qualified person. Define the role clearly. Measure performance against objective criteria.

But progress has costs. To make roles interchangeable, you have to strip away everything that makes people different. Sarah’s specific gifts, her unusual way of seeing problems, her capacity to notice what others miss—none of that fits in the job description. The job description needs to work for Marcus too. And for whoever comes after Marcus.

So we end up with what Weber also called the “iron cage”—organizations that work because they’ve squeezed the human out of the human.


I used to think the conversation about AI and work was fundamentally about replacement. Machines get smarter, humans get displaced, we need universal basic income to pick up the pieces. The framing assumed that jobs were atomic units. Either you have the job or you don’t. Either the human does it or the machine does.

But jobs aren’t atomic. They’re bundles. The job of “financial analyst” bundles together: reading documents, building models, catching errors, explaining conclusions, noticing what’s missing, maintaining relationships, showing up on time. Some of these tasks require human judgment. Others are mechanical and always were—we just didn’t have machines precise enough to handle them.

AI changes the equation by taking over the mechanical parts of the bundle. The cog-work. The parts that required a human only because we didn’t have an alternative.

What remains is the irreducibly human.


Call this the Third Enlightenment: the synthesis where Eastern awareness meets Western agency. What does that synthesis look like in organizational life?

The efficiency gains of machine-thinking redirected toward human flourishing. The dehumanizing parts of work — the parts where we had to pretend to be machines because no actual machines could do it — handled by actual machines.


Consider what bureaucracy required of people.

Show up at the same time every day, regardless of your energy or creativity. Perform the same functions repeatedly, regardless of your growth or interests. Suppress idiosyncrasy in favor of standardization. Be predictable. Be reliable. Be, in essence, a machine.

We accepted this bargain because the alternative was worse. Organizations needed consistency to function. And human consistency—human reliability over time, across contexts—is expensive to produce. It requires training, monitoring, incentives, cultural reinforcement. The whole apparatus of management exists because humans aren’t naturally consistent in the way organizations require.

But what if consistency stopped being a human job?

The work itself continues. The consistency becomes a machine’s job. The showing-up-the-same-way-every-day. The mechanical reliability that organizations need but humans were never designed to provide.

AI can be the cog. The reliable, consistent, predictable part. This liberates the human to be fully human—to bring judgment instead of procedure, to notice instead of process. To do the kind of connecting that was never mechanical in the first place — the part the bureaucracy couldn’t afford to value.


This is already happening in my own work. I built a system where I drop thoughts into a shared space and AI processing handles the connecting.

The work divides naturally. Some parts are irreducibly mine: noticing what’s worth paying attention to, feeling when something’s off, bringing the accumulated weight of my own history to bear on a problem. Other parts are better suited for the machine: maintaining context across thousands of notes, finding patterns across materials I’d forgotten I read, connecting ideas across domains.

The division falls in a revealing place. I do the parts that require being me. The machine does the parts that require consistency, scale, tireless attention.
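That machine half can be sketched in miniature. The snippet below is not my actual system; it's an illustrative toy that stands in for real embeddings with simple bag-of-words similarity, to show the shape of the division: the human writes the notes and judges the surfaced pairs, the machine does the tireless comparing. The note texts and threshold are made up for the example.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy stand-in for an embedding: lowercase word counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def connect(notes: dict[str, str], threshold: float = 0.3):
    """Surface note pairs whose similarity crosses the threshold.
    The machine compares every pair; the human decides what the
    connections mean."""
    vecs = {name: embed(text) for name, text in notes.items()}
    names = sorted(vecs)
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            score = cosine(vecs[a], vecs[b])
            if score >= threshold:
                pairs.append((a, b, round(score, 2)))
    return sorted(pairs, key=lambda p: -p[2])

notes = {
    "weber": "bureaucracy strips the human out of the human",
    "cogs": "organizations needed humans to act like cogs",
    "ai": "machines can be the cogs so humans can be human",
}
print(connect(notes))  # → [('ai', 'weber', 0.31)]
```

The interesting part isn't the similarity math; it's that the loop never tires, never forgets a note, and never resents the repetition, which is exactly the consistency the essay argues humans were never designed to provide.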

What remains for me is more human than the original job was. Before the partnership, I spent hours on mechanical tasks—filing, searching, remembering where I’d put things, processing inputs that didn’t require judgment but did require time. Now the machine handles that. What’s left is the judgment. The noticing. The being present in a way that no amount of processing power can replicate.


The fear about AI—the justified fear—is that organizations will use it to reduce headcount. And of course they will. Every technology that increases efficiency gets used, at least initially, to do the same work with fewer people.

But there’s another path.

Using AI to make work more human rather than less. Strip away the machine-parts of the job, leaving only what requires genuine human presence.

The financial analyst who no longer spends hours building models from scratch but instead uses that time to understand what the numbers mean for actual humans. The doctor who no longer fights with electronic health records but talks to patients. The teacher freed from grading mechanics to focus on the student in front of them.

This isn’t utopian fantasy. It’s already happening in pockets. The question is whether we’ll recognize it, amplify it, make it the norm rather than the exception.


Think about those clerical workers when the calculator arrived. The ones whose only valued function was precise addition suddenly found their bundle shifted. Some were left with nothing in the bundle the organization still wanted.

This is real. Transitions hurt real people. I’m not pretending the path from here to human flourishing is painless or automatic.

But I also wonder what else those workers might have been. What capacities they had that the job never asked for, never made space for. The bureaucracy needed calculation, so calculation is what they gave it. Everything else—whatever made them irreducibly them—stayed hidden.

The Third Enlightenment looks like organizations that stop demanding consistency from people and start drawing on what their people actually have, because the machine-parts can finally be handled by actual machines.

We built organizations that required humans to act like cogs because we had no other cogs available. Now we do.

If you're thinking about similar questions—or building systems that grapple with them—I'd welcome the conversation.
