
Making sense of the AI era

Guy Waldman
January 20, 2026
The start of this post is a bit gloomy, but I recommend reading it through. My goal is to offer some words of comfort at a time when I see many people feeling uncertain.
Also, this post (which is all human-typed! Imagine that) contains opinions that are mine alone and do not reflect those of my employer or anyone else.
You no longer write code by hand, you are now a "product engineer".
Your friends jokingly call you a prompt engineer.
The software industry has been mainly outsourced, so you feel lucky to have a job. Prompts are here to stay at least for the next few years.
So you write Markdown for a living, big deal. Let them laugh.
You sit at your desk.
The tool of your craft is not an IDE; it is a web app that looks like a war room - split screens, each showing streams of generated text flying by, produced by agents.
You are a conductor leading an orchestra of thousands of agents at a time, intervening only on the rare occasion that one hits a roadblock.
There's a rumor that a guy at the office once caught a bug and stopped an agent mid-session; no one really believes it.
You have no daily standup, no scrum.
You start the day by going over tickets on a kanban board an AI prepared for you and clicking "Approve". Sometimes one doesn't make sense and you shift its priority or remove it altogether.
You think the tickets are based on customer feedback and complex industry analysis, but you don't know for sure.
What's Gartner next to SOTA AI models?
You pick up a few tasks and work on them in parallel.
You write a single prompt, and at least 100 agents running Opus 56.5 spit out a hundred iterations of an implementation. It used to be around 10K, but they needed to cut costs.
Then 10 more agents (you vaguely remember the term "mixture of experts" from the early AI days) select the best implementation from those 100 iterations, followed by 50 more agents reviewing the code.
You take a sip of your Diet Coke, click "Approve" again, and then some QA and DevOps agents run some tests and ship it to prod. You don't really know what they do and don't care.
It's getting hard to keep track of all those sessions, and the expectations of you rise every day. It's mentally draining. You consider taking ADHD meds again.
Your computer science degree is now pretty much worthless. Your parents told you computer science is a waste of time, but you went anyway because you liked solving puzzles.
Your dad tells you that he used to be a "real programmer". He talks enthusiastically about something called "vim" and the hundreds of hours he spent getting good at "macros". You don't know what any of it means, but it sounds very outdated. No one really uses the terminal anymore; you only know of it from college. It could just as well have been punchcards slotted into some IBM mainframe.

I think that even a few months ago, describing something like this would have sounded crazy.
Honestly, nowadays it sounds more like reality.
I've been thinking a lot these past few weeks about how fast the AI cycle is moving.
I wrote about it here (2021) and more recently here. I also watched a video where Marques Brownlee shows how tiny transistors have become over the last several decades.
This made me think a lot.
If Opus 4.5 is a transistor from the 1950s - around the size of a fingernail - and the modern-day transistor is on the scale of just thousands of atoms, what does this mean for AI?
Does Moore's Law hold? Do AI scaling laws continue forever?
What about physical limitations? We can't go smaller than an atom, can we? And what about quantum computing? Should I be scared of the day quantum computers efficiently run neural nets?
Do all of those people saying that the era of humans writing code is over really believe it, or do they have an agenda? Or maybe all they're seeing is unfiltered hype?
OpenAI recently partnered with Cerebras, and this makes me think even more - super low-latency AI might be a crazy paradigm shift. Today you carefully craft prompts for a large LLM because you don't want to wait a minute for it to finish the task; if it takes less than a second instead, why care so much about the "perfect prompt" anymore? If it generates 10K tokens/sec, you can just experiment without worrying about wasted time.
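To put rough numbers on that intuition (my own back-of-envelope figures, not benchmarks from either company):

```python
# Back-of-envelope math: how long a single ~2,000-token code-generation
# response takes at different generation speeds. Both the response size
# and the tokens/sec figures are illustrative assumptions, not benchmarks.
response_tokens = 2_000

for label, tokens_per_sec in [("typical large model", 50), ("ultra-low-latency chip", 10_000)]:
    seconds = response_tokens / tokens_per_sec
    print(f"{label}: {seconds:.1f}s per response")

# typical large model: 40.0s per response
# ultra-low-latency chip: 0.2s per response
```

At that speed, ten rounds of trial-and-error cost a couple of seconds instead of most of your coffee break, and the economics of prompting flip.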
With all these questions in mind, I am thinking about myself too! I've been coding since I was around 12; software (with a few adventures in graphic/motion design) was my life. I loved it! And I still do.
I get excited about crafting the perfect system, "connecting legos", taking a huge, complex thing and making it simple and robust. Giving users a delightful experience. Engineering the perfect solution.
Combining beautiful aesthetics from my design/UX days with refined software across the stack.
This is what held me together and kept my passion up.
So what, now some GPUs go brr and this is all now irrelevant?
Yesterday during a meeting about AI dev tooling, I jokingly said that LLMs are like a "genius 6-year-old". Some people laughed. Should we be worried about when this very young genius graduates from university?
Well, I am writing all of this to actually give you some words of comfort. This is my personal take, and my perspective only.
First and foremost - don't worry! No one knows what tomorrow will look like.
Maybe there is less software and every company downsizes. Maybe the grim story I described above arrives much sooner than we think.
Or maybe, just maybe, now that software is easier to write, SWEs "go up a level of abstraction" and there are many more software jobs! Juniors have an easy time finding work, because they can do a lot more and don't cost as much. And if you are an experienced developer, you are in even higher demand, since you have real expertise.
My point is - you simply don't know and shouldn't spend too much energy worrying about all of this.
I like helping "junior" (I don't really like that word) and aspiring software developers learn and land a job.
And my suggestion to them, and to every software developer (which maybe some people won't agree with), is not "quickly adopt AI or stay behind"; it's much simpler - work on becoming a better professional. Stay curious. Learn fundamentals and first principles. Practice coding. Read books. Understand tradeoffs. Take technical courses.
These skills, in my personal opinion, will stay relevant for the foreseeable future.
If you think this is dangerous advice, and that LLMs (which were trained on every piece of data you can possibly find) will do everything better than you, then I would argue we are now in a philosophical discussion about how every knowledge worker is affected. In other words, in the case of "AGI", if your output is text - whether you are a programmer or an accountant - this is a general humanity problem, not a you-as-a-software-developer problem.
I honestly have no idea if the investment in AI will plateau. No idea if this will just become a part of our daily lives (kind of like COVID-19) or keep accelerating.
But I do know that in any profession and in any industry, you need to reinvent yourself.
I would recommend to every person reading this: reinvent yourself as a better professional. As a better person. Follow the trends, leverage AI if you want, but please stay sane - don't stress about the hype. Don't try to control what you can't.
Work to become better. That's all any of us can do in this life, really.
