Everyone loves a good analogy to put something complex into simpler terms. Right now, software industry analogies are everywhere as a way to simplify the tech job market and where software is headed as a career.
I think that some of them kind of work, but maybe not for the intended reasons.
Software has been an anomaly as a career for much of its existence. The idea was that you’d get a college degree or bootcamp certificate, no licenses required, and land a 100-200k starting salary for knowing how to do this thing that others find mysterious and hard. Kids have been sold this dream for at least 15 years now: just “learn to code” and get a job with a fun office and free food.
We don’t know for certain where this career’s headed. But many people are making analogies to help explain the future:
- Home kitchens didn’t kill restaurants, so AI coding won’t kill software. People still order DoorDash or go to fancy restaurants to taste food from a master chef. Now you can vibe code anything you want, and people will still want something built by a master who thinks through edge cases.
- Software is like agriculture. Mass automation makes the “artisanal” version more expensive. Cheap AI code means “very good software” becomes premium.
- Developers are like scribes before the printing press.
- Mobile phones democratized photography like AI democratizes coding. Everyone can do it now, so quality drowns in noise.
My main problem with the analogies is that they are sometimes used in part to argue that the career will be fine if you are good at your job. As I described above, software has been anomalous for a while now:
- Overpaid considering no licensing or liability.
- Artificially scarce, protected by cryptic syntax and tooling that made it seem harder than it was.
- Low regulation.
For the average career in software development, pay was good because supply was lower than demand. It seems clearer now that demand for developers themselves is trending downward (do more with less), while supply keeps growing quickly thanks to even higher accessibility.
So, I feel that many of the analogies actually end up arguing that software is moving toward a restaurant model: most businesses failing (even worse than before), heavy competition, anyone able to do the same work, smaller teams with less capital. Where if you don’t own your own practice, you’re a line cook hoping the place doesn’t close (the average developer). Where a little food poisoning here and there doesn’t really matter, because they probably won’t even know it was your fault (security).
Of course, the analogy isn’t 1:1, but it holds well for the business side. McDonald’s is like Meta: massive margins, serving billions, and nobody pretends the quality is great. In-N-Out is a mid-size SaaS like Linear, with a loyal following and solid results in its niche. The local spot that everyone in the neighborhood loves but that can’t grow beyond it is the solo developer’s tool with 500 users. The restaurant industry has its tiers too, and that didn’t make it a high-paying career for most of its workers.
It’s as if everything that made software an epic way to make quick money is now compounding at once to end its run as a special career. Low regulation has enabled moving fast for software’s whole existence, and as I covered in my last post, breaches may never surface or carry consequences at all. If more people end up consuming software from smaller teams with less capital, security may increasingly just be about keeping the lights on too.
But at the same time, it’s true that people aren’t going to vibe code personal software:
- Users don’t seem to pay much attention to software quality at all. It’s mostly just a means to an end.
- People are pretty lazy too. They’re at least lazier than they are ambitious about having some personalized tool.
- It seems to require a certain type of person to hack away at something, automate it, fix and clean up edge cases, and maintain it, and the average user doesn’t really identify with that.
Think about the apps most people use every day. The experience is often mediocre at best, and few people seem to care, because it gets the job done and it’s convenient. Software quality has almost never been what consumers optimize for. Maybe the “artisanal software” pitch, where indie developers and small teams sell premium, carefully built tools, will exist the way organic food does. It’s a real market, but one that most people won’t pay for regularly. Not enough to sustain the career at the scale it existed before.
So yes, software itself is not going away, but the gold rush era of the career seems to be ending. Especially with the post-2022 headcount corrections still ongoing in Seattle, where software job postings are down more than 60% from pre-COVID levels and job searches take over 12 months. Some will argue that this is just a “return to normalcy,” but it doesn’t look that way, since there are now many more people to fill far fewer openings than before, still with no required licenses or credentials. It looks like we might be reaching normalcy for the first time.
Whether it’s “because of AI” is a separate question, but to me the pressure we’ve seen this year and last is the compounding of overhiring, AI, lower standards, and high supply. What’s left going forward will be the interesting part, because it’s clear that many big enterprises no longer offer nearly enough value, but as we just said, many people don’t care enough to DIY when it comes to software.
Maybe going forward, average users consume more from tiny teams, or the big platforms absorb more with skeleton crews.
The restaurant industry employs millions of people, but almost nobody goes into it expecting the deal that software developers got for the last 20 years. It feels like software is becoming a normal job for the first time.