TL;DR: AI-driven development is transforming software engineering from code writing to AI orchestration. While applications will become “black boxes” we can’t fully understand, engineers remain essential—shifting focus from execution to scoping, architecting and comprehensive testing. The message: embrace AI as a partner or risk becoming irrelevant. Engineers who adapt will find themselves doing more strategic, creative work instead of routine coding.
And I feel sorry for Bambi’s mom.
Look, I get it. Nobody wants to hear that their carefully cultivated career might be about to get steamrolled by a chatbot with delusions of grandeur. Years of taking the bus to school, college debt, a 25-year engineering career, all rendered irrelevant by, what? A $20/month subscription-based code generator? But here we are, folks – living through the most dramatic shift our industry has seen since someone decided “maybe we should use version control.”
Spoiler alert: engineers will be needed for quite some time. Claude is not going to take your job – but befriending it is a must, young padawan.
The rise of AI-Driven Development (AIDD) isn’t just another shiny tool for your already-overflowing toolbox. It’s more like someone just handed every developer on Earth a robot assistant that never sleeps, never complains about your variable naming conventions, and generates code faster than the Feds mint pennies.
To be clear: I am not trying to convince anyone of anything, nor argue a point. I just want to share my thoughts on the future of AIDD, so that if it comes to pass, we don’t look like deer in the headlights.
Well, This Escalated Quickly
Remember when asking ChatGPT-4 to tell a joke felt like black magic? Kids would hide in the hay, granny would grab the holy water, dad would feel for the baseball bat under the bed. That was 2023, but in AI years, it feels like centuries ago.
Now we’re living in a world where AI attempts to refactor your whole code base while you sleep. We went from “wow, it autocompleted my function” to “wow, it just built me an entire service while I was getting coffee.” Sure, it doesn’t fully work, but it still did 80% of the work. And this is the worst it will ever be.
The pace isn’t just fast – it’s absolutely bonkers. Every two weeks a new feature is released that makes the last one look like a pocket calculator – frankly, it’s terrifying, exhilarating and exhausting, all at once. I remember when I tried to introduce my parents to Nirvana, back in the ’90s: they were terrified, their ears were bleeding, but they kept a straight face, so as not to break my … teen spirit.
The biggest challenge of the current environment is that every day there is a new champion, and it is impractical to switch your AI every morning. At the same time, you don’t want to be stuck using the loser of the group. It’s not an easy decision.
From “Cool Trick” to “Existential Necessity”
Right now, if your team is using AI effectively, you’re basically that kid in school who got a calculator while everyone else was still using an abacus. GitHub’s data from nearly a million Copilot users shows developers completing tasks faster, staying in flow 73% of the time, and actually enjoying their work more – who would have thought that possible?!
But here’s the kicker: nearly half of tech leaders say AI is already “fully integrated” into their core business strategy. Sure, let’s call that please-buy-my-stock AI hype, but let’s cut that in half to 25%. Still. That competitive edge you’re enjoying? It’s about to become as basic as having electricity in your office. By early 2026, not having AI in your development workflow will be like trying to compete in Formula 1 with a Topolino.

Let’s say you dislike or distrust AI, you’re old skool, you love the smell of the whale oil lamp. Whatever your reasons, you don’t want to use AI – and I respect that. Your current employer might be fine with it too (for a while anyway), but I can guarantee that one big, fat topic in the interview for your next job will be how much you use AI.
It Loves Me, It Loves Me Not
Currently, we’re in this weird, experimental, collaborative dance where developers accept about a quarter of AI suggestions and rack up a healthy number of completions every day. Think of it as the world’s most productive pair-programming session, except your partner never argues and, well, periodically hallucinates. Microdosing that went macro, maybe?
Microsoft’s research shows it takes about 11 weeks to fully appreciate these tools—which is roughly the same time it takes to accept that your new junior developer might not have much common sense, but it can generate code ridiculously fast.
AI does a good enough job to give you a glimpse of the potential, but makes enough mistakes not to fully win you over. So you’re on the fence, at which point your personality takes over and you’re looking at the classic half-full, half-empty glass. Do you dismiss it as a half-baked feature, because it currently fails a lot? Or do you adopt it and learn to compensate for its weaknesses, while watching it get better every couple of weeks?
Hell, personally, I would take anything that can give me an edge, especially knowing that it will only get better.
The Control Freak is Freaking Out
I remember growing up in the late ’80s in Romania, part of the Soviet Bloc, where we had two hours of television a day (one channel, state-controlled). You could honestly say that you’d watched every minute of TV without missing anything. Everyone remembered all the shows, the presenters, and the movies. Right now, I’m tearing up, remembering how it took Bambi’s mom four weeks to die – because we only got five minutes of cartoons every Saturday.

Today, with thousands of shows available everywhere, there is no way I could keep up. Even if I had all the time in the world, I wouldn’t have the mental capacity to ingest everything. That’s how the engineer will feel when AI creates thousands of lines of production-ready code in a day.
The only reason you can currently understand a block of code is that either A) you wrote it yourself, or B) someone else wrote it, but you spent valuable time deciphering it, rolling your eyes, hanging on every little mistake (we call that “reviewing”). With AIDD you won’t be able to do either of these things anymore, because AI will generate so much code that you won’t be able to keep up. Now you get the Romanian TV analogy, I hope.
We’re heading toward a future where you’ll describe what you want in plain English, and AI will generate thousands of lines of production-ready code faster than you can read your morning Slack messages. Your carefully crafted applications will become AI-coded black boxes: functional, but you might not know what’s inside. Enter AaaBB (Application as a Black Box).
AaaBB (Application as a Black Box)
Just to clarify, what is a Black Box application? Simply put, code that someone else wrote. All you know is that you put oranges in at one end, and you get the (hopefully still orange) juice at the other end. Now, when I say Black Box, it’s more of a scale, from slightly opaque to impenetrable – it all depends on how much time you have on your hands to review the code.
This black box idea is not new. Most things in your life work like that, including things we entrust our lives and our families’ lives to. Think about your phone and its apps. Or the software in your car. Or the air traffic control tower at Newark (too soon?). Any software that you use but have neither written nor reviewed yourself. Do you care about the code and design quality? No. All you care about is whether it does what it’s supposed to do.
So you say, it’s going to be a disaster. How can I tell if the code is bad? Well, you won’t need to. Just like with the applications on your phone, the collective mindset is going to change from focusing on code quality to caring about the final product.
Remember when we cared about elegant code? Beautiful algorithms? That perfect function that made you feel like a programming poet? Well, research shows AI is already creating 4x more copy-paste code patterns, treating codebases like a short-term developer who doesn’t care about your feelings or your architecture.
The new quality metric won’t be “does this code make senior developers weep with joy?” It’ll be “does this thing actually work for our customers?” As Andrej Karpathy (the guy who went from OpenAI to Tesla and probably knows things) puts it: future programmers won’t be maintaining complex repositories or analyzing running times. They’ll be more like AI whisperers than code poets.
If I Can’t Control It, How Can I Be Responsible for It?
To be fair, engineers have all the reasons in the world to hate black boxes. Top of mind:
Security. The code might introduce security vulnerabilities.
Complexity. The code might be needlessly complex, or thousands of lines longer than it should be.
Maintainability. How can I maintain this if I don’t understand it?
Production issues. How can I fix urgent bugs and issues in Production if I’ve never seen this code before?
Does it do what it needs to do? I don’t know man, I just work here. Oh, look, it’s 5 o’clock!
All valid points, if we stay stuck in this exact moment and AI never evolves again, which – short of a nuclear war – is not a realistic scenario.
Let’s do a quick deconstruction:
Security. We already have non-AI tools called vulnerability scanners that look for code patterns that could lead to security holes. We do that today and we will continue to do it with AI-generated code. Nothing changes. (There’s a quick sketch of such a gate right after this list.)
Complexity. Sure, the black box might have more code than you would have written, not as elegant, maybe consuming more resources. But from a business perspective, the speed at which AI will be able to deliver the box will trounce pretty much all the other considerations. Ask your business leader: will they trade complex code for speed to market? Actually, don’t ask, you don’t want to hear the answer.
Maintainability. Aaah, this is a big one, worth its own post. It is also a construct born out of current human limitations. Long story short, with the brute force that AI provides, the concept of maintainability will be replaced by AI creating a new application from scratch: better, faster, more modern. Remember: the only reason we need to “maintain” is that we don’t have the time to rewrite. AI will change that.
Production issues. If AI gets so good that it can write production-ready code and test the hell out of it, production issues will become far rarer. Currently, most production issues happen because engineers spend too much time coding and too little time testing. AI will change that.
Does it do what it needs to do? Well, if it walks like a duck, swims like a duck, and quacks like a duck, then it’s probably a duck. Translation: if we can use an AI specialized in testing, one that makes sure that for every 10 oranges we put in we get a glass of orange juice, then we’ll be fine. Remember, a testing AI can be extremely fast, comprehensive, and – with your help – more effective than a team of 100 QAs.
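To make the Security and duck-test points a bit more concrete, here is a minimal sketch of what verifying a black box without ever reading its code could look like. Everything in it is my own assumption, not a prescription: the generated_app/ folder, the made-up /juice endpoint on localhost, and Bandit standing in for whatever vulnerability scanner your team already runs.

```python
# A minimal sketch of "trust, but verify" for an AI-generated black box.
# Assumptions (mine): the generated code lives in ./generated_app/, the running
# service exposes a hypothetical POST /juice endpoint on localhost:8000, and
# Bandit is installed as the Python vulnerability scanner.
import subprocess

import requests


def gate_on_security_scan(path: str = "generated_app/") -> None:
    """Block the release if the scanner reports high-severity findings."""
    # Bandit normally exits non-zero when it finds issues at or above the
    # requested severity (-lll limits the report to high severity).
    result = subprocess.run(["bandit", "-r", path, "-lll"])
    if result.returncode != 0:
        raise SystemExit("High-severity findings in AI-generated code. No juice today.")


def test_oranges_in_juice_out() -> None:
    """Black-box duck test: 10 oranges in, exactly one glass of juice out."""
    response = requests.post(
        "http://localhost:8000/juice",  # hypothetical endpoint, adjust to taste
        json={"oranges": 10},
        timeout=5,
    )
    assert response.status_code == 200
    assert response.json().get("glasses") == 1      # the only contract we care about
    assert response.elapsed.total_seconds() < 0.5   # crude response-time check


if __name__ == "__main__":
    gate_on_security_scan()
    test_oranges_in_juice_out()
    print("The box is still black, but it quacks like a duck. Ship it.")
```

The specific tools don’t matter; what matters is that neither the security gate nor the duck test ever opens the box.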
To be honest, I would be more concerned with how tightly my 401k is correlated to the success of AI.
So, No More Engineers?
No, no, no, no, no. Not at all. Engineers will always be needed, but their relationship with applications will transform significantly.
Let’s say that the main phases of developing an app are Scoping, Execution, Testing, and Maintenance. Currently it would be fair to say that most of your time goes into Execution (writing the code, iterating, debugging, managing dependencies, creating the infrastructure) and Maintenance (tweaks, upgrades, refactoring, patches). In the upcoming era, AI will do the Execution and Maintenance part for you, while you focus a lot more on Scoping and Testing.
Scoping is telling AI what you want to achieve, giving it business and industry context, providing design and coding guidelines, and specifying technologies. Then AI will get to coding, creating the black box. Because the control freak inside of you doesn’t have any idea what’s inside the box, you will have to do the Testing thing really, really well – you will use a different AI, not the one that wrote the code, to test all the aspects of the final product and ensure that the coder AI actually delivered what you asked it to: complex E2E, latency, stress, edge cases, response time, uptime, anything you can think of that defines a … good product.
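Here is a minimal sketch of what that Scoping artifact could look like in practice. The service, the names and the criteria are all hypothetical; the point is that the spec is concrete enough for a coding AI to build against and for a separate testing AI to verify against, without either of us reading the generated code.

```python
# A hypothetical "scope spec": the part of the job the human still owns.
from dataclasses import dataclass, field


@dataclass
class ScopeSpec:
    goal: str
    business_context: str
    design_guidelines: list[str] = field(default_factory=list)
    technologies: list[str] = field(default_factory=list)
    acceptance_criteria: list[str] = field(default_factory=list)


juice_service = ScopeSpec(
    goal="Turn oranges into juice through a small REST API",
    business_context="B2B citrus wholesaler; traffic spikes every morning at 6 a.m.",
    design_guidelines=[
        "Stateless service, idempotent endpoints",
        "No secrets in code; read them from the environment",
    ],
    technologies=["Python 3.12", "FastAPI", "Postgres"],  # illustrative picks
    acceptance_criteria=[
        "POST /juice with 10 oranges returns exactly 1 glass",
        "p95 response time under 300 ms at 500 concurrent requests",
        "Rejects negative or non-integer orange counts with a 4xx",
        "99.9% uptime over a 30-day window",
    ],
)

# The coding AI gets the whole spec and produces the black box.
# The testing AI gets only the acceptance_criteria (plus access to the running
# service) and tells you whether the box does what you asked for.
```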
Most engineers don’t like it when they are called “developers”, because it implies that they just type code and don’t do much thinking. “Engineering” is a lot more about thinking, but still, most of the effort goes into coding. Fast-forward a couple of years, and the same engineers won’t like it when you call them “engineers”, because it will imply that they are still coding, stuck in the pre-AI era, instead of researching, experimenting, designing and architecting.
Why would anyone still want to write code by hand, when there are so many other, more exciting things to do with your time? Who still wants to navigate by the stars or use a compass to find their way around Los Angeles? Record cassettes? Use a typewriter? Type a phone number? Sleep in the 6 a.m. line at the local DMV to renew their car registration?
You get it. The future is here and it ain’t waiting for no one. Grab it by the tail and don’t you dare let it go.
