What's Left When AI Writes the Code

A year ago, I was faster. Today, I'm slower, and the difference isn't me.
The Moment
Mid-2024, I was maintaining a decentralized finance app built in React. The codebase was about two years old: not terrible, but not great either. I opened Cursor, pointed Gemini at a feature, and waited.
Six to ten attempts just to get the TypeScript to compile. Each fix introduced a different problem. I'd describe what I wanted, the AI would deliver something that looked right, and then the type checker would light up like a Christmas tree. After hours of back and forth I had the same bug I started with, just in a different place.
I closed the tab, went back to writing it myself, and thought: cool, but not useful yet.
That "yet" was the mistake.
End of 2025, different project: a high-frequency trading bot in Python, consuming Binance, Coinbase, and Chainlink data and running a pipeline to backtest different strategies. Built from scratch, well planned, clean component structure. I pointed the AI at it and it went smoothly. Not just functional, but better documented than most code I've seen from senior devs. More tests. More edge cases covered.
The difference wasn't the AI getting smarter, though there's no denying it has. The difference was that the project was planned for AI to write it: clean structure, clear boundaries, well-defined components.
When the input is right, the output is right.
And planning the input is a different skill than writing the output.
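What "clear boundaries, well-defined components" means in practice can be sketched. To be clear, none of this is the actual bot's code; every name here (Tick, PriceFeed, StubBinanceFeed, mid_price) is hypothetical. But the pattern is the one that makes AI-friendly planning work: you write the contracts yourself, and each implementation can then be generated in isolation against them.

```python
from dataclasses import dataclass
from typing import Protocol

# Hypothetical sketch: the human writes the contract, the AI fills in
# the adapters. No name below comes from any real project.

@dataclass(frozen=True)
class Tick:
    symbol: str
    price: float
    ts: float  # unix timestamp

class PriceFeed(Protocol):
    """The contract every exchange adapter must satisfy."""
    def latest(self, symbol: str) -> Tick: ...

class StubBinanceFeed:
    """One concrete adapter. Each of these can be written (or generated)
    against the PriceFeed contract without touching the rest."""
    def __init__(self, prices: dict[str, float]):
        self._prices = prices

    def latest(self, symbol: str) -> Tick:
        return Tick(symbol=symbol, price=self._prices[symbol], ts=0.0)

def mid_price(feeds: list[PriceFeed], symbol: str) -> float:
    """Strategy code depends only on the contract, never on one exchange."""
    ticks = [f.latest(symbol) for f in feeds]
    return sum(t.price for t in ticks) / len(ticks)

feeds = [StubBinanceFeed({"BTCUSDT": 64000.0}),
         StubBinanceFeed({"BTCUSDT": 64100.0})]
print(mid_price(feeds, "BTCUSDT"))  # average of the two stub prices
```

Because the strategy code depends only on the PriceFeed contract, any single adapter can be regenerated or swapped without the type checker lighting up anywhere else. That's the input being right.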
The act of writing code—the skill I spent years sharpening—had become the least important step in the chain.
The Grief
It wasn't fear and it definitely wasn't anger. It was some sort of strange void—staring at a craft you spent years mastering and realizing it no longer separates you from anyone.
Instead of the expected panic I got a thought that was much clearer:
"My work here is done."
That version of my job—the one where the value lived in writing lines of code—was over. Not dying. Done.
There's no negotiating with it.
The Trap
I look at my developer friends. Most of them are using Cursor, Copilot, or Claude to do exactly the same thing they were doing before. Just faster.
That's not adaptation.
That's pretending that the water you feel sloshing around your ankles is meant to be there and the Titanic won't sink.
Blockbuster didn't die because they couldn't build a website. They died because they treated the web as a supplement to the physical store: a way to check inventory or reserve a tape for pickup.
Reserving a movie online was just a faster version of walking the aisles. Netflix understood that the internet was the replacement; they took an "internet native" approach.
Using AI to write the same code you would have written manually is the Blockbuster strategy. Comfortable, familiar, and unfortunately temporary.
If the AI writes the code, what's left of you?
Most devs I know avoid that question. They hide behind "AI still makes mistakes" or "you still need someone to review it." Sure. For now. That window is closing fast—and "I'm the human who checks the AI's homework" is not a career.
What's Left
What's left is what AI cannot replicate: who you are... but what is that exactly?
Not your skills. Not your stack. Not whatever names you've got on your CV. Those are inputs, and inputs get commoditized. What doesn't get commoditized is the specific way you see problems because of the unique life you've lived.
I spent over fifteen years writing code. ERPs at Linx and Bohm. Banking systems at Itaú. SAP consulting. DeFi protocols at Moonwell, close to the Coinbase ecosystem. Different industries, different scales, different kinds of failure.
I watched a ticket reconciliation system at a travel company fail three times in a row. It failed not because the code was bad, but because every architectural decision was wrong. Monolithic when it should have been distributed. Synchronous when it should have been async. Each rewrite fixed one thing and broke the assumptions underneath. We finally containerized it with Docker, split the workers, and then... the pandemic killed the whole division before it shipped.
The code was never the problem. The thinking was.
When I look at a problem today, I don't see the technical steps; I see the decisions behind them. The places where someone is about to optimize for the wrong thing. That pattern recognition comes from fifteen years of watching things go wrong. It's not in any model. It can't be prompted, at least not yet.
A founder can vibe-code an app. A marketing person can build a website and the customer won't know the difference. And that's fine, that's good! The code was never the product. The result was. We just convinced ourselves the craft of writing it mattered more than what we brought to it.
But here's the part nobody wants to think about.
Not everyone has a singular identity to fall back on. Not everyone spent years accumulating a specific combination of experiences that makes their judgment hard to replace; many people optimized for one skill, got good at it, and built their entire career on the scarcity of that skill.
When that skill becomes cheap, or worse free—which it will—what's left?
We're not far from a world where a significant part of the population does work only because a human is still cheaper than the machine. We've seen it happen again and again, from agriculture to automotive to everything in between. It's the economics of every labor transition in history. The difference now is that the transition is faster than any before it... and the category of "cheaper than a machine" keeps shrinking.
The people who will be fine are the ones with something AI doesn't absorb: a perspective, a specific judgment, shaped by a life no one else has lived. The people who won't be fine are the ones who defined themselves entirely by a skill that just became free.
"I am what I code" is already dead. "I am what I've lived, and what I build from it" is the only identity that holds.
The Start
I'm not on the other side of this. I am in the middle, working my way through it.
Every day something shifts. Every day I find a faster way to connect the pieces, a model that handles something I was doing manually last week. I'm not precious about any of it.
The workflow I built yesterday might be obsolete tomorrow.
Things are going to move fast.
The devs still polishing their code like artisans are going to look up one day and realize the person next to them—who can't write a single function—shipped the same product in half the time. And a lot of people who never bothered to ask themselves "what makes me, me... beyond my job title" are going to find out the hard way that the market doesn't care about your identity crisis.
Key takeaways
- Planning beats writing. When the project is structured for AI, the output is right. The scarce skill is designing the input, not typing the code.
- Using AI to do the same thing faster is the Blockbuster strategy. Treating AI as a supplement to "how we've always worked" is comfortable and temporary.
- What doesn't get commoditized is the way you see problems because of the life you've lived—judgment, pattern recognition from failure, the decisions behind the technical steps.
- "I am what I code" is already dead. The identity that holds is: "I am what I've lived, and what I build from it."
- The people who will be fine have a perspective AI can't absorb. The people who won't be fine defined themselves entirely by a skill that just became free.