There are lies, damned lies, and then: there are statistics.
You have to adjust the growth in jobs for how many new people there are to take them, the locations they're in, and, somewhat weirdly, other jobs.
Plenty of people feel so dejected at the current state of things that they leave computer work entirely, creating "openings" where there isn't actually any growth.
Like anything you try to understand, a single datapoint, when averaged, is like trying to calculate the heat of the sun by looking through a telescope at Jupiter. It gives you one far-off, tiny facet of data that only makes sense when coalesced with a hundred others.
I'd like to use this on my website and also see if I can create variations for some of the major EU markets.
This makes sense given both automation and the US's role in the global economy, but it runs somewhat contrary to standard ideas of class and inequality.
Apple, a very successful company, makes 300B/y revenue? (ish)
~10% is all you need to be Apple.
And, it can work by taking all of 10% of the jobs and collecting the whole salary (the AI employee -- dubious proposition),
or by taking 10% of everyone's salary and automating part of everyone's job (the AI "tool" -- much more plausible).
If "part" being automated is >10%, we all win in the long run, every company gets productivity growth without cost growth, etc etc.
If you add in data center costs, and multiple competing AI companies, and then expand the TAM to all white collar work worldwide, you can make everyone successful beyond their wildest dreams with a "20% of work for 20% of the cost" model. Again, how you distribute that 20% remains to be seen (20% new unemployment, or 0% new unemployment with "tools").
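The arithmetic behind that model is easy to sketch. The payroll figure below is my own illustrative assumption (roughly 4M US software developers at an average of $120k), not a number from the comment:

```python
# "X% of the work for Y% of the cost": an AI vendor's revenue is the slice
# of payroll whose work it takes over, times the fraction of that human
# cost it charges for.

def vendor_revenue(total_payroll: float, work_share: float, cost_share: float) -> float:
    """Revenue captured by doing `work_share` of the work while charging
    `cost_share` of the displaced payroll."""
    return total_payroll * work_share * cost_share

# Illustrative assumption: ~4M US software developers x ~$120k salary.
us_dev_payroll = 4_000_000 * 120_000  # $480B/year

# "20% of work for 20% of the cost" on that payroll alone:
print(f"${vendor_revenue(us_dev_payroll, 0.20, 0.20) / 1e9:.1f}B")
```

The point of the sketch is that even a modest work share times a modest cost share, applied to all white collar payroll worldwide rather than one occupation, gets you to revenue figures in Apple's league.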
I formalized my thoughts here: https://jodavaho.io/posts/ai-jobpocolypse.html
Yay!
> Computer Programmers: -6%
Oh no
Whats the outlook like?
Thank you!
BLS forward looking guidance means nothing when technology revolutionizes the nature of work.
Needs
- [utility] add filter by keyword / substring match, e.g. the majority of visualized reports are unlabeled, requiring hovering with a mouse pointer
- [improve discovery] add sort by demographic / population impact, e.g. the largest block is 7M ('Hand laborers and movers') and is sorted to the bottom-left by default
I guess that was to be expected...
https://apnews.com/article/trump-jobs-firing-f00e9bf96d01105...
Started my career in the decade of offshoring and didn't think we'd have anything close to an "AI" taking our jobs before we potentially unionized or had a government that would protect its labor force from being replaced by literal robots.
2020-2022 felt like the US tech ship was finally growing into something really great. All gone now.
When I worked in devops I always worried that my job was automating away other engineers. It definitely had a "when will this come for me" feeling, because it really was; now the dev and the ops are both getting automated away.
This is my first time looking at HN in practically a year. Tech is just so uninteresting to me now. Nobody is hiring SDE/SWE/SREs except for the problem makers, like Anthropic, Meta, etc. Anthropic has pages and pages of $300k-$600k roles open right now. But do you go help the rest of your colleagues lose their jobs?
I guess let's talk about kubernetes or something...
Chief Executives is actually a specific sub-category of it and is, obviously, much smaller.
Can you elaborate?
I think AI outcomes distribute to contexts where it is used, and produce a change in how we work, what work we take on. Competition takes care of taking those surpluses and investing them in new structure, which becomes load bearing and we can't do without it anymore.
In the end it looks like we are treading water, just like when computers got 1M times faster in a couple of decades but we felt very little improvement in earnings or reduction in work.
Surplus becomes structure and the changed structure is something you can't function without. Like the cell and mitochondrion, after they merged they can't be apart, can't pay their costs individually anymore. Surplus is absorbed into the baseline cost.
Stand in front with a gun while mobs come to burn down the data center that took their jobs.
(I think I'm half joking).
It's not great for them, but it's a definite advantage for people who are already in the mindset of distinguishing and discriminating information and sources on merit, instead of running an "AI bad" rubric as part of their filter.
AI has already won. It's taking over. It might be a year or two, or five, or ten, but AI isn't slowing down, nobody is going to pause, and there's a whole shit ton of work people do that won't be meaningful or economically relevant in the very near term. Jevons paradox isn't relevant to cognitive surplus - you need a very different model to capture what's going to happen.
It's time to surf or drown, because it doesn't look like any of the people in charge have the slightest clue about how to handle what's coming.
> Rate the occupation's overall AI Exposure on a scale from 0 to 10.
The sad part isn't that this is low-effort AI slop, but that intelligent people and policy makers are going to see it and probably make important decisions impacting themselves and others based on these numbers.
> Taxi Drivers, Shuttle Drivers, and Chauffeurs
> Overall employment of taxi drivers, shuttle drivers, and chauffeurs is projected to grow 9 percent from 2024 to 2034, much faster than the average for all occupations.
...word?
A -4.0% hit to cashiers may have less of an impact than -4.0% to lawyers or another category that is propping up the middle of the economy with spending.
Apparently "top executive" median pay is $105,350 per year: https://www.bls.gov/ooh/management/top-executives.htm
For a business, the question is whether you can make more money by doing more ambitious things.
Agriculture is a good example of that: http://www.johnhearfield.com/History/Breadt.htm
I think this is a very important point. The hedonic treadmill means real gains are discounted. The novelty information cycle is like an Osborne effect for improvements, like the semi-annual Popular Mechanics flying-car covers, where an enticing future is perpetually nearly here and at the same time disappointingly never materializes.
This time the jobs most in the crosshairs of AI are the paper-pushing jobs that constitute the overhead of modern society. Instead of $1 widgets from China replacing $2 domestic widgets, it's gonna be $1 AI services replacing $2 services that require a real human.
This is hard to reason about because people tend to consume these kinds of services in big multi-hundred or multi-thousand dollar increments, but in practice it means that when you have to engage an accountant or an engineer, or have something planned out in accordance with some standard, it will be substantially cheaper because of the reduced professional labor component.
And of course, as usual, the string-pulling and investor class will get fabulously wealthy along the way.
Published AI generated code is a mild negative signal for quality, but certainly not a fatal one.
Published AI generated English writing is worthless and should be automatically ignored.
My friends and I who have a bachelor's degree in CS make more money than my friends who have or are working towards master's degrees in CS, because the former are working in the private sector and the latter are in academia making peanuts.
Edit: Another possible reason is that Master's degrees were less common in the past, so the Bachelor's pay statistics skew towards people with more work experience in their higher-earning years, whereas the Master's pay statistics skew towards younger people with less work experience.
It's also understated, because the real value of AI is not in replacing work, but making new products possible either because it's finally cheap enough to make them, or because -- AI.
Given the state of AI (LLMs), they still need a very skilled human driver to operate.
Potable water is far more important than AI or iPads ever will be, but the world's most valuable water company only does about 5B/year in revenue: https://en.wikipedia.org/wiki/American_Water_Works
Reason for hope
They're saying that Programmers will be declining, while Developers and, crucially, Testers and QA people will be increasing. That testers and QA become more important sounds plausible to me in a hypothetical future world of ubiquitous AI.
All of that doesn't necessarily imply that the Developer class of employees will grow at the same rate as the Tester and QA classes of employees.
Putting aside the slop facade placed atop the data... why would we trust the data?
(don't forget to "allow pasting" in [chrome] console first)
But given that the stock market hasn't panicked, this must mean at least one of these premises is false:
1. Economic activity is relatively flat.
2. AI makes us a million billion zillion times more productive than we used to be.
3. The stock market is rooted in reality.
The 1%'s pockets. That's where the vast majority of the extra productivity that computers/internet/automation brought has gone for the last 50 years: https://www.epi.org/productivity-pay-gap/
(Source: https://www.bls.gov/ooh/computer-and-information-technology/...)
Computer Programmers median pay according to BLS: $98,670 per year
(Source: https://www.bls.gov/ooh/computer-and-information-technology/...)
Software developers typically do the following:
- Analyze users’ needs and then design and develop software to meet those needs
- Recommend software upgrades for customers’ existing programs and systems
- Design each piece of an application or system and plan how the pieces will work together
- Create a variety of models and diagrams showing programmers the software code needed for an application
- Ensure that a program continues to function normally through software maintenance and testing
- Document every aspect of an application or system as a reference for future maintenance and upgrades
(Source: https://www.bls.gov/ooh/computer-and-information-technology/...)
Computer programmers typically do the following:
- Write programs in a variety of computer languages, such as C++ and Java
- Update and expand existing programs
- Test programs for errors and fix the faulty lines of computer code
- Create, modify, and test code or scripts in software that simplifies development
(Source: https://www.bls.gov/ooh/computer-and-information-technology/...)
On second thought, client service folks might do extremely well here!
No, AI has not "already" won. And phrasing it as you do, "It's taking over. It might be a year or two, or five, or ten" is an admission of that.
People may indeed not pause, but there's never any guarantee that the next step of progress is possible; whatever we reach may be all we can do, and we'll only find out when we get there. Or it might go hyperbolic and give us everything.
I'm not certain, but I suspect Jevons paradox is probably the wrong thing to bring up here, that's about cheaper stuff revealing more latent demand, and sure, that's possible and it may reveal a latent demand for everyone to build their own 1:1 scale model of the USS Enterprise (any of them) as a personal home, but we may also find that AI ends the economic incentives for consumerism which in turn remove a big driver to constantly have more stuff and demand goes down to something closer to a home being a living yurt made out of genetically modified photovoltaic vines that also give us unlimited free food.
(I mean, if we're talking about the AI future, why not push it?)
What I do think is worth bringing up is comparative advantage. Again, this is just an "I think", I'm absolutely not certain here, but if AI can supply all demand at unlimited volumes*, I think the assumptions behind comparative advantage break.
> It's time to surf or drown, because it doesn't look like any of the people in charge have the slightest clue about how to handle what's coming.
Yes, and I think they've also not even managed to figure out the internet yet.
* and AI may well be able to, even if all models collectively "only" reach the equivalent of a fully-rounded human of IQ 115; and yes I know IQ tests are dodgy, but we all know what they approximate, by "fully rounded" I mean that thing their steel-man form tries to approach, not test passing itself which would have the AI already beat that IQ score despite struggling with handling plates in a dishwasher.
Maybe it was linked from a comment somewhere on HN but just today I saw a post saying “Microwaves are the future of all food: if you don’t think so, you better get out of the kitchen”
Microwaves have already won. There will be a microwave in every home over the next few years.
It’s time to start microwave cooking or drown
This was already obvious, the more important question is what are we (collectively, society & our governments) going to do about it?
We (should have) already known most of our jobs were bullshit jobs, especially white collar jobs. The difference is now we might have something coming that will eliminate the bullshit jobs.
But society will always need bullshit jobs or the whole system collapses. Not everyone can go dig ditches, so what do we do?
This is a research tool that visualizes 342 occupations from the Bureau of Labor Statistics Occupational Outlook Handbook, covering 143M jobs across the US economy. Each rectangle's area is proportional to total employment. Color shows the selected metric — toggle between BLS projected growth outlook, median pay, education requirements, and AI exposure. Click any tile to view its full BLS page. This is not a report, a paper, or a serious economic publication — it is a development tool for exploring BLS data visually.
LLM-powered coloring: The source code includes scrapers, parsers, and a pipeline for writing custom LLM prompts to score and color occupations by any criteria. You write a prompt, the LLM scores each occupation, and the treemap colors accordingly. The "Digital AI Exposure" option is one example — it estimates how much current AI (which is primarily digital) will reshape each occupation. But you could write a different prompt for any question — e.g. exposure to humanoid robotics, offshoring risk, climate impact — and re-run the pipeline to get a different coloring.
View the Digital AI Exposure scoring prompt (example)
You are an expert analyst evaluating how exposed different occupations are to AI. You will be given a detailed description of an occupation from the Bureau of Labor Statistics. Rate the occupation's overall AI Exposure on a scale from 0 to 10.

AI Exposure measures: how much will AI reshape this occupation? Consider both direct effects (AI automating tasks currently done by humans) and indirect effects (AI making each worker so productive that fewer are needed). A key signal is whether the job's work product is fundamentally digital. If the job can be done entirely from a home office on a computer — writing, coding, analyzing, communicating — then AI exposure is inherently high (7+), because AI capabilities in digital domains are advancing rapidly. Even if today's AI can't handle every aspect of such a job, the trajectory is steep and the ceiling is very high. Conversely, jobs requiring physical presence, manual skill, or real-time human interaction in the physical world have a natural barrier to AI exposure.

Use these anchors to calibrate your score:
- 0–1: Minimal exposure. The work is almost entirely physical, hands-on, or requires real-time human presence in unpredictable environments. AI has essentially no impact on daily work. Examples: roofer, landscaper, commercial diver.
- 2–3: Low exposure. Mostly physical or interpersonal work. AI might help with minor peripheral tasks (scheduling, paperwork) but doesn't touch the core job. Examples: electrician, plumber, firefighter, dental hygienist.
- 4–5: Moderate exposure. A mix of physical/interpersonal work and knowledge work. AI can meaningfully assist with the information-processing parts but a substantial share of the job still requires human presence. Examples: registered nurse, police officer, veterinarian.
- 6–7: High exposure. Predominantly knowledge work with some need for human judgment, relationships, or physical presence. AI tools are already useful and workers using AI may be substantially more productive. Examples: teacher, manager, accountant, journalist.
- 8–9: Very high exposure. The job is almost entirely done on a computer. All core tasks — writing, coding, analyzing, designing, communicating — are in domains where AI is rapidly improving. The occupation faces major restructuring. Examples: software developer, graphic designer, translator, data analyst, paralegal, copywriter.
- 10: Maximum exposure. Routine information processing, fully digital, with no physical component. AI can already do most of it today. Examples: data entry clerk, telemarketer.

Respond with ONLY a JSON object in this exact format, no other text: {"exposure": <0-10>, "rationale": "<2-3 sentences explaining the key factors>"}
Caveat on Digital AI Exposure scores: These are rough LLM estimates, not rigorous predictions. A high score does not predict the job will disappear. Software developers score 9/10 because AI is transforming their work — but demand for software could easily grow as each developer becomes more productive. The score does not account for demand elasticity, latent demand, regulatory barriers, or social preferences for human workers. Many high-exposure jobs will be reshaped, not replaced.
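The scoring-and-coloring pipeline described above could be sketched roughly like this. The function names, the stubbed LLM call, and the green-to-red ramp are my own illustrative assumptions, not the site's actual code; only the JSON reply format comes from the prompt:

```python
import json

def score_occupation(llm_call, scoring_prompt: str, occupation_text: str) -> dict:
    """Send one BLS occupation description through the scoring prompt and
    parse the JSON-only reply: {"exposure": <0-10>, "rationale": "..."}."""
    raw = llm_call(scoring_prompt + "\n\n" + occupation_text)
    result = json.loads(raw)
    if not 0 <= result["exposure"] <= 10:
        raise ValueError(f"exposure out of range: {result['exposure']}")
    return result

def exposure_to_color(exposure: float) -> str:
    """Map a 0-10 score onto a simple green-to-red hex ramp for treemap tiles."""
    t = exposure / 10
    red, green = int(255 * t), int(255 * (1 - t))
    return f"#{red:02x}{green:02x}40"

# Stubbed LLM call for illustration; a real pipeline would hit an API here.
stub = lambda _prompt: '{"exposure": 9, "rationale": "Fully digital work product."}'
score = score_occupation(stub, "Rate the occupation...", "Software Developers: ...")
tile_color = exposure_to_color(score["exposure"])
```

Swapping in a different prompt (robotics exposure, offshoring risk) changes only the `scoring_prompt` string; the scoring loop and coloring stay the same.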
Frequently seen as a big fun number in pitch decks. "The TAM for our new Coca-Cola killer is $1.6T: all humans who imbibe liquids on a regular basis. You simply MUST invest."
1) The salaries of corporate employees
2) Shareholders and capital owners
Regarding number 2: "Shareholders" would include anyone who owns any stock at all, including a lot of middle class people with a simple S&P 500 ETF in their portfolio.
And the increase in productivity allowed more people to become capital owners, AKA entrepreneurs. The explosion in software entrepreneurs, for example.
What you mention here is exactly why my earlier relationship went bust, because I didn't have any of these; then the children arrived :-X
Now just think of the comp levels in sectors like government, education, etc.
It's annoying that the dishes still have some pooled water in them when the cycle finishes; it doesn't always get everything perfectly clean; I have to know not to put the knives or the wooden stuff or anything fancy in it. But in spite of all of that, I use it every day, it's a huge productivity boost, and I'd hate to be without it.
You think "there's a whole shit ton of work people do that won't be meaningful or economically relevant in the very near term" is wrong?
Ah, the classic, forever-untestable "it's just around the corner" hypothesis.
I've lived through multiple "it's gonna be over in 12-18 months" arguments since November 2022. It's a truism for any technology to say that it's going to get better over time. But if you're convinced that "AI has already won", why not make a specific prediction? What jobs are going to be obsolete by when?
Jevons paradox was never relevant to cognitive surplus. That isn't what it's about.
Cognitive surplus only strengthens Jevons paradox. Humans are a competitive advantage for businesses in a world dominated by human needs.
You'd probably put me into that bucket, although I'd disagree. I'm not at all against using AI to do something like: type up a high level summary of a product featureset for an executive that doesn't require deep technical accuracy.
What I AM against is: "summarize these million datapoints into an output I can consume".
Why? Because the number of times I've already witnessed, in the last year, someone using AI to build out their QBR deck or financial forecast only to find out the AI completely hallucinated the numbers makes my brain break. If I can't trust it to build an accurate graph of hard numbers without literally double-checking all of its work, why would I bother in the first place?
In the same way, if you tell me you've got this amazing dataset that AI has built for you, my first thought is: I trust that about as much as the Iraqi Information Minister, because I've seen first hand the garbage output from supposedly the best AI platforms in the world.
*And to be clear: I absolutely think businesses across the board are replacing people with AI, and they can do so. And I also think it'll take 18+ months for someone to start asking questions only for them to figure out they've been directing the future of their company on garbage numbers that don't reflect reality.
There is definitely an impact on software engineering jobs at the moment: interns/juniors are struggling to find jobs, and companies are squeezing every bit of dev slack time to produce more stuff with AI.
Because no matter what fairy tales you want to believe, your $20 "invested" in Palantir won't make you a "shareholder" lmao
It feels like the intent was that "Programmers" were the ones doing the routine / lower skill tasks while the Developers were the ones that did the specification and architecture.
Those got juggled around, and largely the count of people listed as "Computer Programmer" is going down as companies relist them as Software Developers.
This is also part of the confusion of "Web Developer" which is also in there.
It reflects what the government thought management thought titles and roles were some years ago.
Fridge OTOH, not so much.
It's the combination of tech and big or fast growing companies.
People who operate in FAANG or Silicon Valley bubbles (or who spend too much time on Blind) can lose track of what salaries look like in the rest of the world.
I often share Buffer's open salary page because their compensation is actually pretty normal from all of the data I've seen and hiring I've done: https://buffer.com/salaries
Every time it gets posted there are comments from people aghast that the software engineers "only" make $200K and in disbelief that the CEO's salary is "only" $300K.
If you click the link it mentions "general and operations managers". They're tossing a lot of different roles into the category.
That really only makes sense for households with a toaster oven, single adults, childless couples, and retired people. A toaster oven makes a lot more sense for small meals, in part because it can heat up much faster than a full oven.
Otherwise, a daily family meal isn't a special occasion.
Ovens are a special occasion thing in my house because our oven is huge and I can usually do the same thing in the air fryer, which is just a small convection oven.
A programmer is like a translator; somebody else came up with what to do, and you're doing the mechanical work of converting words into C++.
Developer involves coming up with what to do.
Hence programmer is a lower-paid position.
LLMs require a lot more effort.
I’m from northern Europe. I might use the micro to heat up leftovers or a cup of water for tea or whatever in a pinch, but in this household (and at all my friends’), the stove and the oven cooks the food. I know literally no-one who could say they cook most meals in the micro.
I didn’t have a microwave oven before we bought a house. It took up too much space to justify, for such a relatively rarely-used appliance.
Although, the analogy seems sort of useless, in that the food preparation ecosystem is really not any less complex than the program creation ecosystem, so it doesn’t offer any simplification.
This is an incredible self-report. If you consider microwaved meals to be your default method of cooking and not something primarily for reheating leftovers or defrosting frozen meat, I sincerely hope you've gotten your cholesterol and blood pressure checked recently. That is not normal.
Thankfully there is real data if we want to know how microwaves are used. Survey below says they are used a bit more than ovens, but half as much as cooktops/stoves. Varies by cohort and meal.
Source: https://indoor.lbl.gov/publications/residential-cooking-beha...
(The original phrase was not just made up, it was sourced from actual news articles and marketing about microwave ovens, that’s why it feels relevant to a hype cycle like this)
You also see this kind of naive optimism if you go look at illustrations from the early 1900s. People believed everything would eventually be a machine: a machine would feed you, wake you up in the morning, physically move everything within your home, etc. And yeah, those things are possible to do, but in reality they aren't practical, and we don't actually use machines for everything because it has costs.
If you turn on the color filters in accessibility settings in macOS you can see what the contrast could look like to a colorblind person.
The general trick is you can rely on differences in color lightness, patterns, text and icons, but not differences in color hue. The page should be usable in grayscale.
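The "usable in grayscale" rule can be sanity-checked in code. This is a sketch using the standard Rec. 601 luma weights; the 50-point gap threshold is my own arbitrary assumption, not an accessibility standard:

```python
def luma(hex_color: str) -> float:
    """Perceived lightness (0-255) of an RGB hex color, Rec. 601 weights."""
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (1, 3, 5))
    return 0.299 * r + 0.587 * g + 0.114 * b

def grayscale_distinguishable(c1: str, c2: str, min_gap: float = 50.0) -> bool:
    """True if the two colors still differ once hue is thrown away."""
    return abs(luma(c1) - luma(c2)) >= min_gap

# Pure red vs. a green of similar lightness: distinct hues, but nearly
# identical in grayscale, so a hue-only palette fails the check.
print(grayscale_distinguishable("#ff0000", "#008300"))  # False
print(grayscale_distinguishable("#222222", "#dddddd"))  # True
```

A palette whose adjacent steps all pass a check like this stays readable for colorblind viewers and in grayscale print alike.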
Just because it was wrong once doesn't mean it's never wrong. And was it really that wrong? The internet is great, but would it be the worst thing in the world if we didn't live our lives around it?
All the "research" on the site comes from a single LLM prompt.
In addition, little work is done to separate the classes. He has probation officers in the same node as teachers, completely separate from law enforcement.
Here are some much better examples:
- https://www.washingtonpost.com/nation/2022/05/04/abortion-nu...
- https://flowingdata.com/2015/04/02/how-we-spend-our-money-a-...
1. Brick and mortar is dead.
2. The internet will die.
3. What is the business model? (this one still seems to exist to this day to some extent, lol)
Reality fell between 1 and 2.
Is that notion supported by this content? The BLS outlook for most software engineering jobs is mostly in the "much faster than average" growth range.
Lots of middle class people have graduated into upper-middle class: https://www.aei.org/research-products/report/the-middle-clas...
Wealth inequality is still a problem. But it's not just the people at the very top benefitting.
There's no functional difference between a 'software developer' and a 'programmer'. They're just synonyms that sometimes pay differently.
I can tell you that I didn't observe a single hand-wash-only holdout.
Perhaps such holdouts existed at a point, but a restaurant can only flatter the ego of their performatively-unproductive seniors for so long. Competition exists.
This is nuts! I use an oven every day, dude - so it's a special occasion, is it?
The default method for cooking is using an oven or using a stove. Microwaving is for heating up left-overs for the most part.
One of the dangers of people who are too close to programming is that they think of life as binary.
I've lived without a microwave for a long time and it's only a little bit inconvenient because things take longer to reheat.
Hand-washing dishes also, from what I understand, uses more energy and water than the dishwasher does.
I think OP is just an outlier.
So, you know how looking at one pattern and then just saying "this one will be like that one" without considering the similarities and differences is similar to what people complain about AIs doing?
Consider: Unlike my Microwave, Claude can work on Claude. Unlike my Microwave, Claude gets better at more things. Unlike my microwave, we do not know what causes Claude to work so well. My Microwave cannot improve the process that makes my microwave.
Also, um.
I'm not sure if you noticed?
But machines are everywhere.
I'm typing on one while another one (a microwave, in fact!) heats my breakfast, while another one washes my clothes, while another one vacuums my floor, while another one purifies the air in my room, while another one heats the air in my room, while another one monitors my doors and windows for unauthorized entry and another one keeps my food cool and another one pumps the Radon gas out of my basement and another one scoops my cat's poop.
What would be useful is tracking the change in minimum pay per hour from legitimate job listings, now that there are quite a few states that require posting pay ranges on job listings.
If I were in need of hard analytics you can be damn sure I'd have it build a tool with a solid suite of tests following a rigorous process to ensure the outputs are sound. That's the difference between engineering and vibing.
And for whatever reason a lot of people in startup/tech seem to have a huge Dunning-Kruger effect blind spot where they believe knowing a lot about one thing makes them an expert in everything.
This used to just be funny, but when it started to intersect with politics it began to actively contribute to destroying society. It isn't funny anymore.
(I don't think Karpathy's job data here is destroying society, this is a more generalized observation).
* Yes, software engineering jobs can grow - by increasing demand for custom software, thanks to what coding agents unlock
* AI can impact them - by turning software engineers into LLM code approvers
https://images.seattletimes.com/wp-content/uploads/2017/12/9...
https://www.peoplespolicyproject.org/wp-content/uploads/2020...
https://datawrapper.dwcdn.net/CvQar/full.png
https://static.guim.co.uk/ni/1415721490539/Wealth_line-chart...
Correct, more energy, detergent, and water. Dishwashers are more efficient than what you can do by hand because they effectively manage their water usage.
A modern dishwasher will use 3 to 4 gallons on a run. By comparison, my kitchen sink holds about 10 gallons of water on each side. When I wash by hand, I'll fill one side with soapy water and rinse each dish individually. Easily more than 10 gallons of water get used in the whole process.
Dishwashers are so efficient because they rinse everything off the dishes with about a gallon of water, drain it, then use detergent in a second run, which gets off the tougher food stains, with another gallon of water. Then they rinse with another gallon.
Dishwashers maximize getting food particulates into dirty water in a way that you can't really sanely do by hand.
Modern high-efficiency dishwashers probably beat the most efficient humans now, but that's relatively recent and not a huge margin (and may not get the same results).
I use the time I spend to hand-wash my dishes as a time to pause and to let my mind wander. Having the hands in water is soothing.
And it's a pleasant feeling where cleaning is part of the food workflow: I cook, I eat, I clean (the kitchen, the dishes, my teeth).
I hate home dishwashers: you have to play Tetris after each meal to fill them, trying not to get your hands/arms dirty, then you have to let it do the work, and now you have to spend a few minutes to get the dishes out and store them where they should be, even though most of them are not linked to a meal you just had. Maybe worse, you could unload the dishwasher at a time completely unrelated to food, so that breaks the link.
On the other hand, having worked in restaurants, industrial dishwashers are awesome.
Or you're batch cooking.
I’d also say that while I like my air fryer oven, I would prefer to do some of the bigger things like a whole bird in the oven. It’s cheaper to buy a whole bird for meal prep.
You’re kind of missing the point a bit. Yes, machines are everywhere but the details are very different.
The machines don’t magically do that stuff for you. You have to buy them, plug them in, turn them on and off. Lots of people don’t have any at all. They can’t do most things unsupervised. There are still lots and lots of tasks for which a machine exists that people will still do entirely manually.
There is a naivety to these predictions that is chipped away by the mundane details of having to exist in the real world. Cost, effort etc
It is significantly less productive to do both, and yet…
The food has been cooked in industrial ovens at the factory.
It's especially(!) common for people who made an exit and are now "wealthy" - sure, they can afford to have an opinion on everything, but very often they are just talking bullshit, thinking: "hey, I made it in field X, so why not try field Y".
Especially the "MBA crowd" is famous for this: for whatever reason they think they are more intelligent than an engineer who filed a patent, for example (while most of the MBA bobos would fail just at acquiring all the documents required for it).
Other example: if you wrote a book once and it got traction, even if you are not a proven expert you will be invited to television shows etc. (and MORE often than the people who are real experts with a proven track record).
I think I need more patience -- I seem to fall into a certain tone due to my low expectations, and it's likely a self-fulfilling process which I am complicit in.
Upper-middle class is people making ~$200k/year.
A lot of people have moved from middle class to upper middle class over the last decade. Both those categories are outside the 1%.
This is the equivalent of telling a designer they can't create infographics on anything but principled design subjects -- or else they're out of line. Any research or data they might use isn't relevant because they're not experts? lol?
If I hand wash, I wash as I go. It takes maybe 5 minutes to wash up dishes from breakfast or lunch, maybe a little more for a big dinner, maybe not.
Dishwashers let you accumulate dirty dishes for a day or two, which is the real advantage in water savings. But I've noticed a lot of people pre-wash by hand and then load the dishwasher. I don't understand that; if I'm going to "pre-wash" anything, I'll just wash it completely and put it away.
I'm pro-dishwasher, but you could use much less water handwashing.
If I don't have a dishwasher, my normal method is to stopper one side of my sink, squirt some dish soap on the first few dishes, and run just enough water to wet the dishes. Then I scrub some dishes, run the water (into the stoppered sink) just to rinse them as I transfer to the dish rack, then turn off the water and repeat. The dirtiest dishes that have the most food stuck on get done last so they get the most time soaking in the soapy rinse water from the rest of the dishes. I can do a full dishwasher load with one side of my sink maybe 1/4 full of water.
It is a website that visualizes the output of an LLM prompt and passes it off as data. Big difference between the two.
5 minutes of most sinks running is 10 gallons of water. (Most kitchen faucets run about 2 gallons per minute.)
> Dishwashers let you accumulate dirty dishes for a day or two which is the real advantage in water savings.
I agree. If you aren't filling the dishwasher then you are probably wasting water. However, a full dishwasher is going to be a real water/energy saver. Especially if you aren't washing the dishes before putting them in the dishwasher. (I know a decent number of people do that. It's a hard habit to break).
(The "normal" cycle is specced for 11.0-27.7 litres but uses more electricity, which is more expensive than water.)
My wife and her family :D. Water conservation mentality is a battle.