- AI will make developers irrelevant
- Why?
- Because LLMs can write code
- Do you know what I do for a living?
- Yes, write code?
- Yes, about 2-5% of the time. Less now.
- But you said you are a developer?
- I did
- So what do you do 95-98% of the time?
- I understand things and then apply my ability to formulate solutions
- But I can do that!
- So why aren't you?
The developers who still think their job is about writing code will perhaps not have a job in the future. Brutal as it may sound: I'm fine with that. I'm getting old and I value my remaining time on the planet. Business owners who think they can do without developers because they think LLMs replace developers are fine by me too. Natural selection will take care of them in due course.
Argument B: AI means you don't learn as much, and the single most useful work product of a software engineer is knowing how the code functions, so it's depriving your company of the main benefit of your work. Also, layoffs are terrible business strategy because every lost employee is years of knowledge walking out the door, every new hire is a risk, and red PRs are derisking the business.
Institutional and personal knowledge seem similar, but the implications of each are radically different.
The thing is, software engineering has never been about the code or the programming language. It's always about knowing how to solve digital problems. There's a reason why the programming language used in my university CS courses was "math", and they religiously avoided teaching concrete languages or frameworks or technologies that were popular at the time. If you understand the patterns of how to recognize and solve problems, and how to apply existing research toward them, you can implement that in any language.
That still holds in the AI era. I've found that just knowing "here's how the architecture of the solution should look" and being able to call out the LLM when it goes off track has made me way more effective at LLM-based coding. I can frame the problem quickly, I can specify the architecture, and I can keep the LLM from spinning its wheels or giving plausible but wrong answers. That's a superpower in a world where your coder might hallucinate.
There are also those for whom that percentage is higher, let’s say 6-50%.
> I understand things and then apply my ability to formulate solutions
The AI is coming for that too.
You might just be lucky to be in circumstances that value your contributions or an industry or domain that isn’t well represented in the training data, or problem spaces too complex for AI. Not everyone is, not even the majority of devs.
People knocking out Jira tickets and writing CRUD webapps will often end up with their livelihood taken away. Or bosses will just expect more output for the same or less pay, forcing them to use AI to keep up.
If you mean creating software, well we are creating more software than ever before and the definition of what software is has never been so diverse. I can see many different careers branching off from here.
It also feels like the hiring "signal", which was always weak before, is just completely gone now, when every job you do advertise receives over 500 LLM written applications and cover letters that all look and feel the same.
The pro-athlete comparison in this article is a bit silly IMO. There are obvious physical issues that come with aging if you rely on your muscles etc. to make money. If you compare to other fields of knowledge work, such as say law or medicine, there are loads of examples of very experienced, very sharp operators in their 40s and 50s.
Yes, LLMs might dramatically reduce the amount of code we write by hand. But I'm a lot less convinced they'll solve all of the amorphous, human-interacting aspects of the job.
I certainly don't have the money or time to go back to college and start a new career at the bottom.
Less "pure" programming, but lots more programming in general.
I don't know, maybe in your part of the world, but where I'm from we have a series of robust worker-protection laws that try to limit the damage the work does to you. We generally consider it a bad thing for workers to damage their bodies, and if we could build houses without that, we'd prefer it.
In this specific case we do have techniques to build software without causing damage, so why change that?
This post is arguing that maybe software engineering should start being harmful, even though we know it doesn't have to be. It's a post by a guy begging to be fed into the capitalist meat grinder. Meaningless self-sacrifice.
More AI Soothsaying. Not so hard on the Inevitabilism this time.
Its ability to write code is alright. Sometimes it impresses me, sometimes it leaves me underwhelmed. It certainly can't be left to do things autonomously if you are responsible for its output.
Moderately useful tool, but hellishly expensive when not being subsidized by imbeciles who dream of it undermining labor. A fool and his money should be separated anyway.
What I am really concerned about is the incoming economic disaster being brewed. I suspect things will get very ugly pretty soon.
The first possibility is that AI is achieving human-level expertise and capability, but since models are now increasingly trained on their own output, they are fighting an uphill battle against model collapse. In that case, perhaps AI will just sort of max out at "knowing everything", and maybe agentic coding is just another massive paradigm shift in a long line of technological paradigm shifts: the tooling has changed, but total job-market collapse is unlikely.
The other possibility is that we're going to continue to see escalating AI capability with regard to context, information retrieval, and most importantly "cognition" (whatever that means). Maybe we overcome the challenges of model collapse. Maybe we figure out better methodologies for training that don't end up just producing a chatbot version of Stack Overflow + Wikipedia + Reddit. Maybe we actually start seeing AI create and not just recreate.
If it's the latter, then I think engineers who think they are going to stay ahead of AI sound an awful lot like saddle makers who said "pffft, these new cars can only go 5 miles per hour."
If we truly need to sacrifice our skill to be productive by using LLMs that atrophy us, then the only devs that have a limited lifespan are us. The next ones won't have a skillset to atrophy since they won't have built it through manual work.
Also, I hereby propose to publicly ban the "LLMs generating code are like compilers generating machine code" analogy, it's getting old to reargue the same idea time after time.
if you do that then... likely very replaceable.
Virtually the entire blog is about AI, with a ridiculous publishing rate (https://www.seangoedecke.com/page/5). Funny how I can look at this site's HTML and know right away it was done with AI.
Can we stop upvoting vibe published articles? The arguments are flawed and don't even make sense to anyone who does software
If you believe this about your software career, how do you think you're going to switch into another career as a junior and keep up?
If talking to an AI makes me dumber and limits my career, then all the customer support people who ever existed were in the same or worse position: talking to dumb humans on chat all day, answering tickets about the same topics, and linking the same docs over and over. This makes no sense.
It's a tool for knowledge work.
No carpenter is a specialist in drills.
It seems to me that the best way to navigate a long term career is to have another specialty and use software engineering as a tool within that specialty.
> The AI is coming for that too.
In that case all [1] non manual work is doomed, until robotics has an LLM moment.
[1] With the exception of all fields protected by politics or nepotism.
Because people want to discuss the topic of the headline.
I also see that in the future humans will adapt to AI, instead of the opposite. Why? Because it's a lot easier for humans to adapt to AI than the other way around. It's already happening -- why do companies ask their employees to write complete documentation for AI to consume? This is what I call "adaptation".
I can also imagine that in the near future, when employment plummets, when basic income becomes universal, when governments build massive condos for social housing -- everything new will be required to adapt to AI. The roads, the buildings, everything physical is going to be built with ease of navigation by AI in mind. We don't need a general AI -- that is too expensive and too long-term for the capitalist class to consider. We only need a bunch of AI agents and robots coordinated in an environment that is friendly to them.
The idea that productivity gains which result in more of something being produced also create more demand for the labour that produces it is more often wrong than true, as far as I can tell. In fact, it's quite hard to point to any historical examples of this happening. In general, labour demand significantly decreases when productivity significantly increases, and typically people need to retrain.
Instead, we get an economy that feels like it's on the cusp of a fall or at the very least a roller coaster. Poor tax incentives to hire. ZIRP is long gone. And the hiring managers are overrun with slop.
But bosses are happy to say it's AI because that makes you sound in control.
There aren't any careers, and if there were, you would have to pay. Corporations certainly won't, except in extremely rare situations where they have to in order to compete.
The interesting thing to me is that software engineering will have to evolve. Processes and tools will have to evolve, as they have evolved through the years.
When I was finishing university in 2004, we learned about the "software crisis" era, the waterfall development process, and how new "iterative methods" were starting.
We learned about how spaghetti code gave way to Pascal/C structured programming, which gave way to OOP.
Engineering methods also evolved, with UML being one infamous language, but also formal methods such as the Z language for formal verification, or ABC and cyclomatic-complexity measurements of software complexity.
Which brings me to today: now that computers are writing MOST of the code, the value of current languages and software dev processes is decreasing. Programming languages are made for people (otherwise we would have kept writing in assembler). So now we have to change the abstractions we use to communicate our intent to the computers, and to verify that the final instructions do what we wanted.
I'm very interested to see these new abstractions. I even believe that, given that all the small details of coding will be fully automated, MAYBE we will finally see more engineering (real engineering) rigor in the software engineering profession. Even if there will still be coders, the same way there are non-engineers building and modifying houses (common in Mexico, at least).
s/creating software/typing correspondence/
In a world where software programming/architecting is solved by AI, value will accrue to people with expertise in other domains (who have now been granted the power of 1000 expert developers), not the people whose skillsets have been made redundant by better, faster and cheaper AI tools.
In 2026, both companies decide that AI can accelerate their developers by a factor of 10. I'm not asserting that's reality, it's just a nice round number.
Company 1 fires 90 of their programmers and does the same work with 10.
Company 2 keeps all their programmers and does ten times the work they used to do, and maybe ends up hiring more.
Who wins in the market?
Of course the answer is "it depends" because it always is but I would say the winning space for Company 1 is substantially smaller than Company 2. They need a very precise combination of market circumstances. One that is not so precise that it doesn't exist, but it's a risky bet that you're in one of the exceptions.
While the acceleration is occurring and we haven't settled into the new reality yet, the Company 1 answer seems superficially appealing to the bean counters. But it only takes one defector in a given market going with Company 2's approach to force the rest of the industry to follow suit in order to compete properly.
The value generation by one programmer that can be possibly captured by that programmer's salary is probably not going down in the medium and long term either.
Which is true, but only until it isn't.
The classic example of how drastically this kind of thinking can fail is Malthusian theory: that populations would collapse because food production grew linearly while population grew exponentially. This held for all of history, right up until Malthus actually made the observation.
At a mechanistic level, the "we have always found other jobs" argument misses that the reason we've always found other jobs is that humans have always had an intelligence advantage over automation. Even something as mechanical as human input on an assembly line ultimately depended on the human ability to make tiny, often imperceptible adjustments that a robot couldn't.
But if something approximating AGI does work out, human labor has absolutely no advantage over automation, so it's not clear why the past "automation has created more human jobs" logic should continue to hold.
We saw this pre-AI with Uber and DoorDash. I think as the AI automation wave dies down and most companies are competing at a near-optimal level with the new tools, we'll need humans again in more traditional roles to build the next generation of innovations. And then the whole cycle will repeat.
LLM code generation: "Here is an intent/specification. Invent code that hopefully satisfies it."
Does the compiler analogy provide value under those terms? I don't think it does. In fact, I think it provides negative value.
We don't need to use tortured analogies to express excitement over these tools.
I think the right analogy to calculators and CAD tools is an IDE with IntelliSense for SWE: instead of typing code one character at a time, we can hit tab to automate some part of it.
But I agree with your conclusion -- SWE is changing, whether we like it or not. We need to adapt, or find a niche and grind it out to retirement.
You might need to relocate to a place with much lower costs of living.
This was the idea behind remote working discussed during COVID-19 times:
- the company can pay less money because the employee is living at a much cheaper place than the expensive city where the company is located
- on the other hand, even with a smaller salary, the employee has more money at the end of the month because of the smaller costs of living
So both sides win.
(Also why I think Google and Meta might not make it to the end of the AI 'revolution'...)
We have seen this in other knowledge industries. U.S. legal sector job count is about the same today as it was 20 years ago. But billing rates have exploded and revenues in the 200 largest firms have increased more than 50% after adjusting for inflation. Higher-end law firms have leveraged technology to be able to service much more of the demand and push out smaller regional competitors.
It also isn't true. The story of the last 50 years has been that technology, especially computer and communications technology, has facilitated the concentration of wealth. The skilled work got computerized, or outsourced to India or China. That left U.S. workers with service jobs where they have much lower impact on P&L and thus much less leverage.
In my field, we used to have legal secretaries and law librarians and highly experienced paralegals. They got paid pretty well and had pretty good job security because the people who brought in the revenue interacted with them daily and relied on them. Now, big firms have computerized a lot of that work and consolidated much of the rest into centralized off-site back-office locations. Those folks who got downsized never found comparable work. They didn't, and couldn't, go work for WestLaw to maintain the new electronic tools. The law firms also held on to many of them until retirement or offered them early retirement packages, and then simply never filled the positions. It used to be a pretty solid job for someone with an associate's degree, and it simply doesn't exist anymore.
The only thing keeping the job market together is the explosion in healthcare workers. My Gen-Z brother and sister in law are both going into those fields. In a typical tertiary American city, the largest employers are the local hospital and perhaps a university or community college. Both of those get most of their revenue directly or indirectly from the government. It's not clear to me how that's sustainable.
The Malthusian observation can still come true... it only has to be true once, and the only reason people say it isn't true right now is industrial fertilizers and short memories.
Anecdotally I see a lot of schadenfreude online about tech jobs after a decade or two of lecturing everybody from Appalachians in coal country, to Midwestern autoworkers, that they should just “learn to code.”
It's a lot easier to be early than to be smart or quick.
Education in what, though? No idea. And if there was one answer, and it's true that there will be fewer software developers, you'd likely be competing with many people for few jobs.
The last workhouse closed in the 1930s.
That all started not because people were afraid jobs were going to be replaced by the new loom. People had been using looms for centuries. They were protesting working conditions: low wages, lack of social protections when people were let go, child labor, work houses, etc. There were no labor laws at the time to protect workers... but there were these valuable new machines that the capital owners valued greatly. The machines were destroyed as leverage: a threat.
Since the capitalists ultimately "won" that conflict, it has been written, by technocrats, that technological progress is virtuous and that, while workers will initially be displaced, the benefit to society will be great enough that those displaced will find productive work elsewhere.
But I think even capitalist economists such as Keynes found the idea a bit preposterous. He wrote about how the gains in productivity from technological advances aren't being distributed back to workers: we're not working less, we're working more than ever. While that isn't about the displacement of workers, it is a displacement of value, and the two tend to go hand in hand.
I think asking, "Where do I go?" is a valid question. One that workers have been trying to ask since the Luddites at least. Unfortunately I think it's one that gets brushed under the rug. There doesn't seem to be much political will to provide systems that would make losing a job a non-issue and work optional.
That would give us the most leverage. If I didn't have to work in order to live I could leave a job or get displaced by the latest technological advancement. But I could retrain into anything I wanted and rejoin the work force when I was good and ready. I wouldn't have to risk losing my house, skipping meals, live without insurance, etc.
Like a politician who's asked about this in a town hall, but thinks that "our plan is to do absolutely nothing" doesn't sound very appealing.
Of course, there are jobs that will still require human labour for some time yet, but in reality a lot of jobs that require physical human labour are now done in other parts of the world where labour is cheaper.
Those which cannot be exported, like plumbing or waitressing, only have limited demand. You can't take 50% of the current white-collar workforce, dump them in these careers, and expect them to easily find work or receive a decent wage. The demand simply does not exist.
Additionally, at the same time as white-collar jobs are being lost, an increasing number of "low-skill" manual labour jobs are also being automated. Self-checkout machines mean it's harder to get work in retail, robotaxis and drone delivery will make it harder to find work in delivery and logistics, and robots in warehouses will make it harder to find warehouse jobs.
It seems to me there is an implicit assumption that AI will create a bunch of new well-paid jobs that employers need humans for (which means AI cannot do them) and that cannot be exported abroad for cheaper. What well-paid jobs would even fit the category of being immune to both AI and outsourcing? Are we all going to be really well-paid cleaners or something? It makes no sense.
A lot of the advice we're seeing today about retraining as a construction worker or plumber seems to assume that there's unlimited demand for this labour, which there simply is not. And even if hypothetically there were about to be a huge increase in demand for construction workers, it would take years to put the machinery, supply chain, and infrastructure in place to support millions of people entering construction.
The most likely scenario is that people will lose their jobs and will be stuck in an endless race to the bottom fighting for the limited number of jobs that are left in the domestic economy while everything else is either outsourced or done by robots and AI.
The better advice is to start preparing for this reality. Do not assume the government will or can protect you. When wealth concentrates, corruption becomes almost inevitable, and politicians have families to look after too.
Please take this seriously. Even if I'm wrong, it's better to prepare for the worst than to assume everything will be fine and you'll be able to retrain into a new well-paid career.
If AI does take a lot of white-collar work, is it much comfort that jobs in a very different sector might be better in 20 years?
I'm not discounting ageism in the industry, but how popular of a career was it 30+ years ago compared to now?
If you're a paralegal or an accountant who can't manage their workflows with AI, you're going to be way less productive than someone who can.
And if you're a paralegal or an accountant who can manage a lot of your workflows with AI, you don't need custom software (hence less dedicated software engineers).
There's no category difference between being an expert in carpentry vs masonry and being an expert in drills vs hammers. They are both just areas of expertise.
Going down the path of trying to define what is expert functions and what is "merely" a tool using anything but descriptive technique is nonsense.
Expert functions are just those areas where using a tool is sufficiently difficult to require expertise.
That's great, but its nowhere near the norm, and people have been doing generalist software engineering for decades. There has been a sufficient amount of work for a long time to be performed by generalists that it has been a very reasonable career.
IMO AI is the first thing that has ever actually challenged that.
I think there are trades where tool (or process, if I may be allowed to extend the analogy) specialists exist and are highly valued. My dad is a plumber, so I'll use that example, but I'd trust similar is true for carpentry. There are specialists by task/output (new construction, repairs, boilers, etc.), but also tool-specialist plumbers and companies. For example, drain-clearing equipment, or certain kinds of pipe for handling chemicals other than water, are very specialised, and there are roles for them because the thing they enable, the criticality of the task, and often the cost and complexity of using the tool are high enough to make specialisation valuable.
IMO software has, for the 10 years I've been working in it, been in an unusual position where the tools (languages, engineering practices, tech stacks) were super technical and involved, but could also be applied to a large number of problems. That is the perfect recipe for tool specialists: a complex tool with high value and broad domain/problem-space applicability.
Because of that tool specialisation, we've separated the application of the tool to a problem/domain from the tool use itself. Reducing the complexity of applying these tools to many problems means all domain specialists will use them, relying less on tool specialists.
Imagine a McGuffin tool for attaching any two materials together, but which took a degree to figure out (loose hyperbole here), that suddenly you could use for 5 bucks and a quick glance at the first page of the manual. An industry that used to have lots of McGuffin engineers would be mega disrupted, and you could argue that those tool specialists would have to identify more with what they were building than with the McGuffin they were using.
We've entered a period of single-use-plastic software, piling up and polluting everything, because it's cheaper than the alternative.
Actually, at this point I feel that the value in software engineering is moving from coding to testing and quality assurance.
I’m kind of confused how you might think it wasn’t. Going through a career as a software dev until retirement was very common.
Software engineers didn’t just disappear after age 40.
I don’t think there’s compelling evidence that using AI makes you less intelligent overall[1]. However, it seems pretty obvious that using AI to perform a task means you don’t learn as much about performing that task. Some software engineers think this is a decisive argument against the use of AI. Their argument goes something like this:
I don’t necessarily agree with (2). On the one hand, moving from assembly language to C made programmers less effective in some ways and more effective in others. On the other hand, the transition from writing code by hand to using AI is arguably a bigger shift, so who knows? But it doesn’t matter. Even if we grant that (2) is correct, this is still a bad argument.
Until around 2024, the best way to learn how to do software engineering was just doing software engineering. That was really lucky for us! It meant that we could parlay a coding hobby into a lucrative career, and that the people who really liked the work would just get better and better over time. However, that was never an immutable fact of what software engineering is. It was just a fortunate coincidence.
It would really suck for software engineers if using AI made us worse at our jobs in the long term (or even at general reasoning, though I still don’t believe that’s true). But we might still be obliged to use it, if it provided enough short-term benefits, for the same reason that construction workers are obliged to lift heavy objects: because that’s what we’re being paid to do.
If you work in construction, you need to lift and carry a series of heavy objects in order to be effective. But lifting heavy objects puts long-term wear on your back and joints, making you less effective over time. Construction workers don’t say that being a good construction worker means not lifting heavy objects. They say “too bad, that’s the job”[2].
If AI does turn out to make you dumber, why can’t we just keep writing code by hand? You can! You just might not be able to earn a salary doing so, for the same reason that there aren’t many jobs out there for carpenters who refuse to use power tools. If the models are good enough, you will simply get outcompeted by engineers willing to trade their long-term cognitive ability for a short-term lucrative career[3].
I hope that this isn’t true. It would be really unfortunate for software engineers. But it would be even more unfortunate if it were true and we refused to acknowledge it.
The career of a pro athlete has a maximum lifespan of around fifteen years. You have the opportunity to make a lot of money until around your mid-thirties, at which point your body just can’t keep up with it. A common tragic figure today is the professional athlete who believes the show will go on forever and doesn’t prepare for the day they can’t do it anymore. We may be in the first generation of software engineers in the same position. If so, it’s probably a good idea to plan accordingly.
[1] If you’re thinking “wait, there’s research on this”, you can likely read my take on the paper you’re thinking of here, here or here.
[2] Of course, construction workers do have layers of techniques for avoiding lifting heavy objects when possible (cranes, dollies, forklifts, and so on). There’s a natural analogy here to a set of techniques for staying mentally engaged that software engineers have yet to discover.
[3] In theory labor unions could slow this process down (and have forced employers to slow down this race-to-the-bottom in other industries). But I’m pessimistic about tech labor unions for all the usual reasons: the job is too highly-paid, you can work (and thus scab) from anywhere on the planet, and so on.
Managers can go back to being technical, because they are still interacting with problems that require human thinking. Token farmers don't.
It seems to me like all of these people are flocking now to healthcare fields. That seems totally unsustainable.
This is sarcasm, but it's probably also going to get sold as a feature at some point.
At the end of the '90s and the beginning of the '00s (the dotcom bubble), it was a common saying that if, as a programmer, you didn't have a very successful company by 30 or 40 (and were thus basically set for life), you had basically failed in life; exactly because "everybody" knew that programming was a "young man's game" (i.e. you likely wouldn't get a programming job anymore at, say, 35 or 40 years old).
So,
> Software engineers didn’t just disappear after age 40.
is rather a very recent phenomenon.
This will change for the better if more and more educated people relocate there.
That seemed commonly held among folks participating in the dot-com bubble. Plenty of people had been doing it for decades even as the bubble was growing.
> Software engineers didn’t just disappear after age 40.
>> is rather a very recent phenomenon.
Not really. It's not that they disappeared, it's that they're a small fraction of the overall SWE population as a side-effect of how much that population has grown.