The times are changing, and fast.
Just a month ago I wrote about my skepticism, maybe even pessimism, about AI coding tools. My conclusion was that AI was a force multiplier for both good and bad, and I worried about an incoming "slopacalypse". I would characterize my feelings about these tools at the time as generally negative.
Today, just one month later, my feelings about these tools have flipped to positive. I still have roughly the same opinions about force multipliers and incoming slop. But I've seen how these tools can bring nontechnical people to the point of building dashboards extremely quickly. Faster than an engineer would have built the tool in the past. At the same, or even better, quality. People with the right mindset can vibe code and produce quality products. Now I'm worried about my career.
Skill Acquisition
AI coding tools turned a corner in effectiveness late last year. I'd point to Sonnet 4.5 as the first model that I truly trusted to execute a task for me. It took a while to build that trust, and my interactions with the model were still limited in scope. The tooling wasn't quite there, at least for me. The interactions with my codebase weren't quite what I wanted, and the model still fell off track rather easily when confused. So I only trusted it for personal projects; professionally, I still wrote all my code by hand.
Something in the tooling has really turned a corner this calendar year, and it is accelerating.
Maybe we just got used to the models. Maybe the tools themselves got that little bit better at keeping the models on track. Maybe we slowly adopted best practices for the models, and that has now turned a corner. Or maybe the models got that much better. I don't know the answer. It's probably some combination of all of the above. But now all that messaging about 90% of code being written by AI is looking awfully prophetic.
It's not just slop coming at us. It's legitimately good code, well designed and tested and making tradeoffs that align with the goals of the project. My workplace has aggressively adopted AI tools this calendar year, and it's accelerated in the last month. Word on the street is that other workplaces have similarly turned a corner with their usage. This is not the AI enthusiast world that I experienced a year ago. These are real businesses pivoting real workflows over to AI.
The biggest problem with us software engineers is that we love building things. I've been using AI tools enough that I wrote a tool to keep my markdown files sanely formatted. Other people have done much more productive things, like OpenClaw, which coordinates models to operate over much more than just your coding tools (though it's pretty good at that too), or small tools to improve your coding tool's memory capabilities. There's a whole ecosystem of people figuring out how they like to use AI and then building tools to expand that functionality.
It's impressive. We've built the tools that will be orchestrated to lead us to our own demise, at least as the software engineer role is currently understood. Maybe there's a group or skill set that will survive and push forward, intact.
Historically, software engineers have survived pivots in the industry rather well. I missed the transition to Ruby on Rails, which I understand was transformative to the way that we think about developing web applications. I didn't miss the transformation that focused on automated tests for your code; that, too, was a moment where many engineers had to figure out their role in the new world, and quickly. My internship at Palantir was as a point-and-click tester; that role didn't exist a year later.
Other careers did not fare so well. Floor traders in stock exchanges did not survive the digital transition. The industrial revolution put an end to a whole host of careers. It sure feels like we're in a similar position right now with AI coding tools. If you don't use them, your job is at risk. Even if you look at the total output from companies and don't believe that AI is helping people do their job, the perception of increased output and productivity is enough to make you fall behind. That's how powerful these tools seem to be.
So what skills do I need to learn to survive? It's unclear. Unlike some of the previous pivots in software, it's not obvious what the answer is other than "keep using these tools and make sure you're using the best ones". That's a tough pill to swallow when you're working 9-10 hours a day and still have to take the time for your family and your health. I can spend time trying to use new tools and incorporate new thinking into my workflows, but that's effort that I may not be able to muster after handling everything else in my life.
The Career
Now the big question: is my career at risk? The perception is that these tools are already good enough that we don't need to hire junior developers anymore, and I believe the more experienced software engineer roles will soon be at risk too. That's the attitude I'm hearing from the industry. Smaller companies are canceling plans to hire interns. Even senior engineers are being told about changing expectations. Demands are put on people to build more things with AI. But the companies running these tools are losing money, so maybe it's just temporary.
If you read more anti-AI pieces from people like Ed Zitron, you may doubt the economics of these tools. You might compare to other money-burning endeavors of the past. But I wasn't a fully conscious adult for the Uber renaissance, or the AWS push to cloud. I was still finding my way and not really thinking about how companies were operating. I was aware of blockchain technologies enough to stay away from companies but saw some potential in tokens (though I continue to be skeptical). Some of these things worked out in a big way and are obvious now. I don't know what to think about AI. Some of it is hype, for sure. But as I see the value, I see that I would pay more than I currently do to use the tools. Enough to make these foundational model companies economically viable? That, I don't know.
On one hand, at current prices and performance it's just clearly worth it to use these models. The speed at which you can build tools and basic features is just ridiculous. Some might say this means the future of the software career is in code review, not creation. Some people say that writing code was never the bottleneck. I strongly agreed with this sentiment when it was written. Nine months later, I'm not confident.
Who's to say AI isn't better than humans at code review? Given the correct training, why couldn't they design codebases and system architectures more effectively than a human could?
As I mentioned, the biggest problem with software engineers is that we love building. We have, potentially, quite literally built our own demise as an industry. We've trained and improved these tools to the point where they can operate as a reasonably capable version of a human engineer, without anyone needing to take the time to train the human.
Conclusion
So yeah, I'm worried. I don't know if my job will exist in 5 years. It might not exist in 2. Or it might exist, but in such a limited headcount that I never make the cut.
I don't quite know how to approach this existential question, either. I'm not 25 and hungry. I'm motivated to do better, but I'm also 33 now and thinking about what a family might look like. I'm considering what to do when my or my wife's parents need assistance as they age. I'm not thinking about reinventing myself as an AI specialist, or a superuser of these tools.
But maybe I need to.