"Amazon announces $35 billion investment in India by 2030 to advance AI innovation, create jobs" https://www.aboutamazon.com/news/company-news/amazon-35-bill... (Dec 9 2025)
What AI does is remove a bunch of the humiliating, boring parts of being junior: hunting for the right API by cargo-culting Stack Overflow, grinding through boilerplate, getting stuck for hours on a missing import. If a half-decent model can collapse that search space for them, you get to spend more of their ramp time on “here’s how our system actually fits together” instead of “here’s how for-loops work in our house style”.
If you take that setup and then decide “cool, now we don’t need juniors at all”, you’re basically saying you want a company with no memory and no farm system – just an ever-shrinking ring of seniors arguing about strategy while no one actually grows into them.
Always love to include a good AI x work thread in my https://hackernewsai.com/ newsletter.
But I don't learn. That's not what I'm trying to do; I'm trying to fix the bug. Hmm.
I'm pretty sure AI is going to lead us to a deskilling crash.
Food for thought.
Pair them with a senior so they can learn engineering best practices.
And now you've also just given your senior engineers some extra experience/insights into how to more effectively leverage AI.
It accelerates the org to have juniors (really: a good mix of all experience levels)
It's like expecting someone to know how to use source control (which at some point wasn't table stakes like it is today).
I kind of agree with this point from the perspective of civilisation.
2. Junior engineers' heavy reliance on AI tools is a problem in itself. AI tools learn from existing code that was written by senior engineers. Too much use of AI by junior engineers will result in a deterioration of engineering skills. It will eventually result in AI learning from AI-generated code. This is true for most other content as well, as more and more content on the internet is AI-generated.
The only relevant point here is keeping a talent pipeline going, because well duh. That it even needs to be said like it's some sort of clever revelation is just another indication of the level of stupid our industry is grappling with.
The. Bubble. Cannot. Burst. Soon. Enough!!
I realized that they are shockingly bad at most basic things. Still, their PRs look really good (on the surface). I assume they use AI to write most of the code.
What they do excel in is a) cultural fit for the company and b) providing long-term context to the AIs for what needs to be done. They are essentially human filters between product/customers and the AI. They QA the AI models' output (to some extent).
I fear that unless you heavily invest in them and follow them, they might be condemned to have decades of junior experience.
> The juniors working this way compress their ramp dramatically. Tasks that used to take days take hours. Not because the AI does the work, but because the AI collapses the search space. Instead of spending three hours figuring out which API to use, they spend twenty minutes evaluating options the AI surfaced. The time freed this way isn’t invested in another unprofitable feature, though, it’s invested in learning. [...]
> If you’re an engineering manager thinking about hiring: The junior bet has gotten better. Not because juniors have changed, but because the genie, used well, accelerates learning.
We no longer need to hire outside senior developers who have to be trained on the codebase with AI, given that the junior developers catch up so quickly they've already replaced the need to hire a senior developer.
So replacing them with AI agents was quite premature, if not completely silly. In fact it makes more sense to hire far fewer senior developers and instead turn juniors directly into senior developers, saving lots of money and onboarding time.
Problem solved.
Now Claude had access to this [2] link and it got the data in the research prompt using web-searcher. But that's not the point. Any junior worth their salt — distributed systems 101 — would know _what_ was obvious; the failure was in not paying attention to the _right_ thing. While there are ideas on prompt optimization out there [3][4], the issue is how many tokens it can burn thinking about these things; coming up with an optimal prompt, and corrections to it, is a very hard problem to solve.
[1] https://github.com/humanlayer/humanlayer/blob/main/.claude/c... [2] https://litestream.io/guides/vfs/#when-to-use-the-vfs [3] https://docs.boundaryml.com/guide/baml-advanced/prompt-optim... [4] https://github.com/gepa-ai/gepa
https://news.ycombinator.com/item?id=44972151
Does this story add anything new?
I do agree with him about AI being a boon to juniors and pragmatic usage of AI is an improvement in productivity, but that's not news, it's been obvious since the very beginnings of LLMs.
I strongly disagree; a senior who cannot ask a "dumb question" is a useless developer to me. Ask away: we want to figure things out, we're not mind readers. Every good senior developer I know asks questions and admits when they don't know things. This is humility, and in my eyes it is a strong marker of a good senior developer. The entire point of our job is to ask questions (most often of ourselves) and figure out the answers (the code).
Juniors could or should ask more questions, and by the time you're a Senior, you're asking key questions no matter how dumb they sound.
I also don’t believe juniors, mids, seniors, staff, principals, or distinguished/fellows should be replaced by AI. I think they WILL be, but they shouldn’t be. AI at the Gemini 3 Flash / Claude Opus 4.5 level is capable, with help and review, of doing a lot of what a lot of devs do currently. It can’t do everything and will fail, but if the business doesn’t care, they’ll cut jobs.
Don’t waste time trying to argue against AI to attempt to save your job. Just learn AI and do your job until you’re no longer needed to even manage it. Or, if you don’t want to, do something else.
One is a "one junior per team" model. I endorse this for exactly the reasons you speak.
Another, as I recently saw, was a 70/30 model of juniors to seniors. You make your seniors task delegators and put all implementation on the junior developers. This creates "up or out" pressure and leaves very few mentorship opportunities. If 70% of your engineers have under 4 years of experience, it can be a rough go.
It’s damned near impossible to figure out where to spend your time wisely to correct an assumption a human made vs. an AI on a blended pull request. All of the learning that happens during PR review is at risk in this way and I’m not sure where we will get it back yet. (Outside of an AI telling you - which, to be fair, there are some good review bots out there)
I was never formally trained so I just keep asking "why" until someone proves it all the way. Sales itself is also a lot about asking questions that won't come up to find the heart of the thing people actually want... which is just another side of the coin.
Junior devs do that naturally (if you have the culture) because they don’t know anything already. It’s great
I see none of that happening - software quality is actually in freefall (but AI is not to blame here, this began even before the LLM era), delivery doesn't seem to be any faster (not a surprise - writing code has basically never been the bottleneck and the push to shove AI everywhere probably slows down delivery across the board) nor cheaper (all the money spent on misguided AI initiatives actually costs more).
It is a super easy bet to take with money: software development is still a big industry, and if you legitimately believe AI will do 90% of a senior engineer's job, you can start a consultancy, undercut everyone else, and pocket the difference. I haven’t heard of any long-term success stories with this approach so far.
Hell, I should probably be studying how to be a carpenter given the level at which companies are pushing vibe coding on their engineers.
That's such a terrible trend.
Reminds me of my peers back in ~2001 who opted not to take a computer science degree even though they loved programming because they thought all the software engineering jobs would be outsourced to countries like India and there wouldn't be any career opportunities for them. A very expensive mistake!
Considering the talk around junior devs lately on HN, there's way too many of them, it would indeed be amusing.
You can describe pre-AI developers like that too. It's probably my biggest complaint about some of my co-workers.
To what?
In my view there's two parts to learning, creation and taste, and both need to be balanced to make progress. Creation is, in essence, the process of forming pathways that enable you to do things, developing taste is the process of pruning and refining pathways to doing things better.
You can't become a chef without cooking, and you can't become a great one without cultivating a taste (pun intended) for what works and what it means for something to be good.
From interactions with our interns and new grads, they lack the taste and rely too much on the AI for generation. The consequence is that when you have conversations with them, they struggle to understand the concepts and tools they are using, because they lack the familiarity that comes with creation, and they lack the skills to refine the produced code into something good.
I hate to be so negative, but one of the biggest problems junior engineers face is that they don't know how to make sense of, or prioritize, the glut of new-to-them information to make decisions. It's not helpful to have an AI reduce the search space, because they still can't narrow down the last step effectively (or possibly independently).
There are junior engineers who seem to inherently have this skill. They might still be poor in finding all necessary information, but when they do, they can make the final, critical decision. Now, with AI, they've largely eliminated the search problem so they can focus more on the decision making.
The problem is it's extremely hard to identify who is what type. It's also something that senior level devs have generally figured out.
For medium or small companies, these guardrails or documentation can be missing. In that case you need experienced people to help out.
I would have agreed with you 100% one year ago. Basically, senior engineers are too complacent to look at AI tools, and ego-driven about it, all while corporate policy disincentivizes them from using anything at all, with maybe a forced Copilot subscription. Junior engineers, meanwhile, will take the risk that corporate monitoring of cloud AI tools isn't that robust.
But now, although many of those organizations are still the same - with more contrived Copilot subscriptions - I think senior engineers are skirting corporate policy too and becoming more familiar with the tools.
I'm also currently in an organization that is a total free for all with as many AI coding and usage tools as necessary to deliver faster. So I could be out of touch already.
Perhaps more complacent firms are the same as they were a year ago.
I think LLMs are a reflection of human intelligence. If we humans become dumber as a result of LLMs, LLMs will also become dumber. I’d like to think that in some dystopian world, LLMs trained on pre-2023 data will be sought after.
It should probably be supplemented with some good old RTFM, but it does get us somewhat beyond the "blind leading the blind" StackOverflow paradigm of most software engineering.
Coding in any sufficiently large organization is never the main part of a senior's time, unless it's some code sweatshop. Juniors can do little to none of all the remaining glue work that takes a project from a quick brainstorming meeting to a live, well-functioning, supported product.
So as for worth: companies can, in non-ideal fashion obviously, work without juniors. I can't imagine them working without seniors, unless it's some sort of quick churn of CRUD apps or e-shops built from templates.
Also, there is this little topic that resonates recently across various research: knowledge gained fast via LLMs is shallow, doesn't last as long, and doesn't go as deep. One example out of many: any time I had to do some more sophisticated regex-based processing, I dived deep into the specs, the implementation, etc., and a few times pushed it to the limits (or realized the task was beyond what regex can do), instead of just being given the result, copy-pasting it, and moving along because some basic test succeeded. Spread this approach across many other complex topics. That's also a view on the long-term future of companies.
I get what you say and I partially agree, but it's a double-edged sword.
That's my thought too. It's going to be a triple whammy:
1. Most developers (Junior and Senior) will be drawn in by the temptation of "let the AI do the work", leading to less experience in the workforce in the long term.
2. Students will be tempted to use AI to do their homework, resulting in new grads who don't know anything. I have observed this happen first hand.
3. AI-generated (slop) code will start to pollute GitHub and other sources used for future LLM training, resulting in a quality collapse.
I'm hoping that we can avoid the collapse somehow, but I don't see a way to stop it.
The thing with juniors is: those who are interested in how stuff works now have tools to help them learn in ways we never did.
And then it's the same as before: some hires will care and improve, others won't. I'm sure that many juniors will be happy to just churn out slop, but the stars will be motivated on their own to build deeper understanding.
I've yet to see that production-grade code written by these production-grade models.
Tell me you've never worked at FAANG without telling me you've never worked at FAANG...
Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 to $500,000 per year.
This might be the professional/career version of "buy when there's blood in the streets."
It's similar to all those people who were hyping up blockchain/crypto/NFTs/web3 as the future, and now that it all came to pass they adapted to the next grift (currently it's AI). He is now toning down his messaging in preparation of a cooldown of the AI hype to appear rational and relevant to whatever comes next.
But for many Jr engineers it’s the hard part. They are not (yet) expected to be responsible for the larger issues.
I would argue a machine that short circuits the process of getting stuck in obtuse documentation is actually harmful long term...
How are we defining "learning" here? The example I like to use is that a student who "learns" what a square root is can calculate the square root of a number on a simple 4-function calculator (×, ÷, +, −), albeit iteratively. Whereas the student who "learns" that the √ key gives them the square root is "stuck" when presented with a 4-function calculator. So did they 'learn' faster when the "genie" surfaced a key that gave them the answer? Or did they just become more dependent on the "genie" to do the work required of them?
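(To make that concrete, here's a minimal sketch of the iterative approach: Newton's method with nothing but the four basic operations. The function name is my own invention; this is just one way a student who understands the concept could do it.)

    #include <stdio.h>

    /* Approximate sqrt(x) for x >= 0 using only +, -, *, /:
       start with a guess and repeatedly average it with x / guess. */
    double sqrt_by_hand(double x) {
        double guess = (x > 1.0) ? x / 2.0 : 1.0;  /* any positive start converges */
        for (int i = 0; i < 40; i++)
            guess = (guess + x / guess) / 2.0;     /* Newton step for g*g - x = 0 */
        return guess;
    }

    int main(void) {
        printf("%f\n", sqrt_by_hand(2.0));  /* prints ~1.414214 */
        return 0;
    }

The same loop can be run by hand on a 4-function calculator: divide, add, halve, repeat.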
This is "the kids will use the AI to learn and understand" level of cope
no, the kids will copy and paste the solution then go back to their preferred dopamine dispenser
Me too. Fire your senior devs. (Ha ha, not ha ha.)
Sorry, what does that mean exactly? Are you claiming that a junior dev knows how to ask the right prompts better than a senior dev?
So if AI gets you iterating faster and testing your assumptions/hypothesis I would say that's a net win. If you're just begging it to solve the problem for you with different wording - then yeah you are reducing yourself to a shitty LLM proxy.
If you read great books all the time, you will find yourself more skilled at identifying good versus bad writing.
If you want to complain about tech companies ruining the environment, look towards policies that force people to come into the office. Pointless commutes are far, far worse for the environment than all data centers combined.
Complaining about the environmental impact of AI is like plastic manufacturers putting recycling labels on plastic that is inherently not recyclable, making it seem like plastic pollution is everyday people's fault for not recycling enough.
AI's impact on the environment is so tiny it's comparable to a rounding error when held up against the output of say, global shipping or air travel.
Why don't people get this upset at airport expansions? They're vastly worse.
I'm a big fan of the "staff engineer" track as a way to avoid this problem. Your 10-15 year engineers who don't vibe with management should be able to continue earning managerial salaries and having the biggest impact possible.
I'm also a fan of leadership without management. Those experienced engineers should absolutely be taking on leadership responsibilities - helping guide the organization, helping coach others, helping build better processes. But they shouldn't be stuck in management tasks like running 1-1s and looking after direct reports and spending a month every year on the annual review process.
The "over" deserves a lot of emphasis. To this day, I save my code at least once per line that I type because of the daily (sometimes hourly) full machine crashes I experienced in the 80s and 90s.
I had a job lined up before graduating. Now make high salary for the area, work remotely 98% of the time and have flexible schedule. I'm so glad I didn't listen to that guy.
At the end of the day, radiologists are still doctors.
So I think that a lot of juniors WILL get replaced by AI not because they are junior necessarily but because a lot of them won't be able to add great value compared to a default AI and companies care about getting the best value from their workers. A junior who understands this and does more than the bare minimum will stand out while the rest will get replaced.
You can either bet on the new unproven thing claiming to change things overnight, or just do the existing thing that's working right now. Even if the new thing succeeds, an overnight success is even more unrealistic. The insight you gain in the meantime is valuable for you to take advantage of what that change brings. You win either way.
I would argue a machine that short-circuits the process of getting stuck in obtuse books is actually harmful long term...
The arguments were similar, too: What will you do if Google goes down? What if Google gives the wrong answer? What if you become dependent on Google? Yet I'm willing to bet that everyone reading this uses search engines as a tool to find what they need quickly on a daily basis.
Any task has “core difficulty” and “incidental difficulty”. Struggling with docs is incidental difficulty, it’s a tax on energy and focus.
Your argument is an argument against the use of Google or StackOverflow.
It’s diamond age and a half - you just need to continue to be curious and perhaps slow your shipping speed sometimes to make sure you budget time for learning as well.
If you can just get to the answer immediately, what’s the value of the struggle?
Research isn’t time coding. So it’s not making the developer less familiar with the code base she’s responsible for. Which is the usual worry with AI.
Cannot wait for the 'Oh dear god everything is on fire, where is the senior dev?' return pay packages.
(also I’m just talking out of my ass on a tech forum under a pseudonym instead of going to well-publicized interviews)
We do too, don't worry.
There might be value in learning from failure, but my guess is that there's more value in learning from success, and if the LLM doesn't need me to succeed my time is better spent pushing into territory where it fails so I can add real value.
Overall I don't quite agree. Personally this applies to me: I've been using vim for the last decade, so any AI tooling that wants me to run some Electron app is a non-starter. But many of my senior peers coming from VS Code have no such barriers.

AWS CEO Matt Garman outlined 3 solid reasons why companies should not focus on cutting junior developer roles, noting that they “are actually the most experienced with the AI tools”.
In a tech world obsessed with AI replacing human workers, Matt Garman, CEO of Amazon Web Services (AWS), is pushing back against one of the industry’s most popular cost-cutting ideas.
Speaking on WIRED’s The Big Interview podcast, Garman has a bold message for companies racing to cut costs with AI.

He was asked to explain why he once called replacing junior employees with AI “one of the dumbest ideas” he’d ever heard, and to expand on how he believes agentic AI will actually change the workplace in the coming years.
First, junior employees are often better with AI tools than senior staff.
“Number one, my experience is that many of the most junior folks are actually the most experienced with the AI tools. So they're actually most able to get the most out of them.”
Fresh grads have grown up with new technology, so they can adapt quickly. Many of them learn AI-powered tools while studying or during internships. They tend to explore new features, find quick methods to write code, and figure out how to get the best results from AI agents.
According to the 2025 Stack Overflow Developer Survey, 55.5% of early-career developers reported using AI tools daily in their development process, higher than for the experienced folks.
This comfort with new tools allows them to work more efficiently. In contrast, senior developers have established workflows and may take more time to adopt. Recent research shows that over half of Gen Z employees are actually helping senior colleagues upskill in AI.
Second, junior staff are usually the least expensive employees.
“Number two, they're usually the least expensive because they're right out of college, and they generally make less. So if you're thinking about cost optimization, they're not the only people you would want to optimize around.”
Junior employees usually get much less in salary and benefits, so removing them does not deliver huge savings. If a company is trying to save money, it doesn’t make that much financial sense.
So, when companies talk about increasing profit margins, junior employees should not be the default or only target. Real cost-cutting means looking at the whole company, because there are plenty of other places where expenses can be trimmed.
In fact, 30% of companies that laid off workers expecting savings ended up increasing expenses, and many had to rehire later.
Third, companies need fresh talent.
“Three, at some point, that whole thing explodes on itself. If you have no talent pipeline that you're building and no junior people that you're mentoring and bringing up through the company, we often find that that's where we get some of the best ideas.”
Think of a company like a sports team. If you only keep veteran players and never recruit rookies, what happens when those veterans retire? You are left with no one who knows how to play the game.
Also, hiring people straight out of college brings new ways of thinking into the workplace. They have fresh ideas shaped by the latest trends and the motivation to innovate.
More importantly, they form the foundation of a company’s future workforce. If a company decides to stop hiring junior employees altogether, it cuts off its own talent pipeline. Over time, that leads to fewer leaders to promote from within.
A Deloitte report also notes that the tech workforce is expected to grow at roughly twice the rate of the overall U.S. workforce, highlighting the demand for tech talent. Without a strong pipeline of junior developers coming in, companies might face a tech talent shortage.
When there are not enough junior hires being trained today, teams struggle to fill roles tomorrow, especially as projects scale.
This isn’t just corporate talk. As the leader of one of the world’s largest cloud computing platforms, serving everyone from Netflix to the U.S. intelligence agencies, Garman has a front-row seat to how companies are actually using AI.
And what he is seeing makes him worried that short-term thinking could damage businesses for years to come. Garman’s point is grounded in long-term strategy. A company that relies solely on AI to handle tasks without training new talent could find itself short of people.
Still, Garman admits the next few years will be bumpy. “Your job is going to change,” he said. He believes AI will make both companies and their employees more productive.
When technology makes something easier, people want more of it. AI enables the creation of software faster, allowing companies to develop more products, enter new markets, and serve more customers.
Developers will be responsible for more than just writing code, with faster adaptation to new technologies becoming essential. But he has a hopeful message in the end.
That’s why Geoffrey Hinton has advised that Computer Science degrees remain essential. This directly supports Matt Garman’s point. Fresh talent with a strong understanding of core fundamentals becomes crucial for filling these higher-value roles of the future.
“I’m very confident in the medium to longer term that AI will definitely create more jobs than it removes at first,” Garman said.
There can sometimes be too much competition, but often there is only the illusion of too much if you don't look at quality. You can find a lot of cheap engineers in India, but if you want a good quality product you will have to pay a lot more.
But these are the things people learn through experience and exposure, and I still think AI can help by at least condensing the numerous books out there around technology leadership into some useful summaries.
Maybe. The naturally curious will also typically be slower to arrive at a solution due to their curiosity and interest in making certain they have all the facts.
If everyone else is racing ahead, will the slowpokes be rewarded for their comprehension or punished for their poor metrics?
There is such a thing as software engineering skill and it is not domain knowledge, nor knowledge of a specific codebase. It is good taste, an abstract ability to create/identify good solutions to a difficult problem.
Good luck maintaining that.
AI, on the other hand...
Of course no-one's stopping a junior from doing it the old way, but no-one's teaching them they can, either.
Also the difference between using it to find information versus delegating executive-function.
I'm afraid there will be a portion of workers who crutch heavily on "Now what do I do next, Robot Soulmate?"
Microsoft docs are a really good example of this where just looking through the ToC on the left usually exposes me to some capability or feature of the tooling that 1) I was not previously aware of and 2) I was not explicitly searching for.
The point is that the path to a singular answer can often include discovery of unrelated insight along the way. When you only get the answer to what you are asking, you lose that process of organic discovery of the broader surface area of the tooling or platform you are operating in.
I would liken AI search/summaries to visiting only the well-known, touristy spots. Sure, you can get shuttled to that restaurant or that spot that everyone visits and posts on socials, but in traveling that way, you will miss all of the other amazing food, shops, and sights along the way that you might encounter by walking instead. Reading the docs is more like exploring the random nooks and crannies and finding experiences you weren't expecting and ultimately knowing more about the place you visited than if you had only visited the major tourist destinations.
As a senior-dev, I have generally a good idea of what to ask for because I have built many systems and learned many things along the way. A junior dev? They may not know what to ask for and therefore, may never discover those "detours" that would yield additional insights to tuck into the manifolds of their brains for future reference. For the junior dev, it's like the only trip they will experience is one where they just go to the well known tourist traps instead of exploring and discovering.
Complaining about docs is like complaining that a research article is not written like an elementary school textbook.
Just an example. I've been in so many code bases over the years… I had a newer engineer come aboard who, when he saw some code I'd recently written with labels (!), kind of blanched. He thought "goto == BAD". (We're talking C here.)
But this was code that dealt with Apple's CoreFoundation. More or less every call to CF can fail (which means returning NULL in the CF API). And (relevant) passing NULL to a CF call, like when trying to append a CF object to a CF array, was a hard crash. CF does no param checking. (Why, that would slow it down—you, dear reader, are to do all the sanity checking.)
So you might have code similar to:
    CFMutableDictionaryRef dict = NULL;

    dict = CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                     &kCFTypeDictionaryKeyCallBacks,
                                     &kCFTypeDictionaryValueCallBacks);
    if (!dict)
        goto bail;
You would likely continue to create arrays, etc—insert them into your dictionary, maybe return the dictionary at the end. And again, you checked for NULL at every call to CF, with a goto bail if needed. Down past 'bail' you could CFRelease() all the non-NULL instances that you do not return. This was how we collected our own garbage. :-)
In any event, goto labels made the code cleaner: your NULL-checking if-statements did not have to nest crazy deep.
The new engineer admitted surprise that there might be a place for labels. (Or, you know, CF could have been more NULL-tolerant and simply exited gracefully.)
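(For the curious, here's a minimal sketch of the whole pattern as I remember it. The function and variable names are made up for illustration; the CF calls themselves are the real ones.)

    #include <CoreFoundation/CoreFoundation.h>

    /* Illustrative goto-bail cleanup: every CF call is NULL-checked,
       and a single label releases whatever was actually created. */
    static CFDictionaryRef CreateConfigDict(void) {
        CFMutableDictionaryRef dict = NULL;
        CFMutableArrayRef values = NULL;
        CFStringRef key = NULL;

        dict = CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                         &kCFTypeDictionaryKeyCallBacks,
                                         &kCFTypeDictionaryValueCallBacks);
        if (!dict)
            goto bail;

        values = CFArrayCreateMutable(kCFAllocatorDefault, 0, &kCFTypeArrayCallBacks);
        if (!values)
            goto bail;

        key = CFStringCreateWithCString(kCFAllocatorDefault, "values",
                                        kCFStringEncodingUTF8);
        if (!key)
            goto bail;

        CFDictionarySetValue(dict, key, values);  /* the dictionary retains both */
        CFRelease(key);
        CFRelease(values);
        return dict;  /* caller releases */

    bail:
        /* No deeply nested ifs: just release the non-NULL instances. */
        if (key)    CFRelease(key);
        if (values) CFRelease(values);
        if (dict)   CFRelease(dict);
        return NULL;
    }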
You won’t need Vim except to review changes and tweak some things if you feel like it.
Pointing out that it wasn’t always that will make you seem “negative.”
This is an example of a book on Common Lisp
https://gigamonkeys.com/book/practical-a-simple-database
What you usually do is follow the book's instructions and get some result, then go do some exploration on your own. There's no walking in the dark trying to figure out your own path.
Once you learn what works, and what does not, you'll have a solid foundation to tackle more complex subjects. That's the benefit of having a good book and/or a good teacher to guide you on the path to mastery. Using a slot machine is more tortuous than that.
"bespoke, hand generated content straight to your best readers"
Conversations like this are always well intentioned, and friction truly is super useful to learning. But the '…' in these conversations always seems to imply that we should inject friction.
There’s no need. I have peers who aren’t interested in learning at all. Adding friction to their process doesn’t force them to learn. Meanwhile adding friction to the process of my buddies who are avidly researching just sucks.
If your junior isn’t learning it likely has more to do with them just not being interested (which, hey, I get it) than some flaw in your process.
Start asking prospective hires what their favorite books are. It’s the easiest way to find folks who care.
Also, for a lot of things, that is how people learn because there aren't good textbooks available.
Please read it as: "who knows what you'll find if you take a stop by the library and just browse!"
I was helping a few people get started with an Android development bootcamp, and just being able to run the default example and get their bearings around the IDE was interesting to them. And I remember when I was first learning Python: just doing basic variable declaration and arithmetic was interesting. Same with learning C and being able to write tic-tac-toe.
I think a lot of harm is being done by giving beginners expectations that would befit people with years of experience. Like telling someone who doesn't even know Linux exists, or has never encountered the word POSIX, that they can learn Docker in 2 months.
Please do read the following article: https://www.norvig.com/21-days.html