2. Are there kids like that still?? I would love to think so. None of the kids in my circle of parents are. There is one teenager who's going into computer science because they are smart and love math, which is great, but they have never built, explored, or been curious about anything on their computer as far as I can tell. There is a big ecosystem of wish fulfilment and instant gratification, and I think the right limitations, like the author insinuated, are part of the allure.
bro its your blog just say sex
I still made it work. I got pretty good at reading the waveform preview, and was able to use that to figure out where to do cuts. I would apply effects and walk through frame by frame with the arrow keys to see how it looked. It usually took all night (and sometimes a bit of the next day) to render videos into 1080i, but it would render and the resulting videos would be fine.
Eventually I got a job and saved up and bought a decent CPU and GPU and editing got 10x easier, but I still kind of look back on the time of me having to make my shitty computer work with a certain degree of fondness. When you have a decent job with decent money you can buy the equipment you need to do most tasks, but there's sort of a purity in doing a task that you really don't have the equipment you need.
Personally I think the Steam Machine will have a better chance of sneaking a general computing device into the home of someone not looking for one. The Neo gives me hope on price point.
I'm in the same boat as the author; I cut my teeth on a hand-me-down 2005 eMac, then a hand-me-down 2008 Macbook, before finally getting my own 2011 iMac. I think this is overly harsh on Chromebooks given they belong to the cheaper end of the market - you can still put linux on them and go for gold, you're just going to hit earlier resource limits.
I think when you're younger and building an aptitude for computers, it's the limitations of what you have that drive an off-the-shelf challenge: doing what you can with what you've got. That can vary from just trying to play the same video games as your friends (love what /r/lowendgaming does), to usage restrictions (e.g. locked-down school-issued laptops), to running professional tooling (very slowly) just like the author.
When IT caught my interest, I did all of the above - on Mac, Windows and Linux, on completely garbage machines. The Macbook Neo is an awesome machine for its cost/value, but I don't think it's hugely special in the respect described beyond making more power available at a more accessible price point.
“Mrs. Jonson, the results are back. Your son has autism.”
Or they learn to enable developer mode, unlock the bootloader, and install Linux, or use the officially supported Crostini, or so on. There's like 3 different ways to run Linux desktop apps on a modern Chromebook.
The Macbooks don't have an officially supported path to unlocking the bootloader (edit: yes, I'm aware of Asahi Linux, which lives on the edge of what Apple allows) and installing your own OS. The Chromebooks do. I don't think that comparison plays as favorably as you think.
What I feel a bit sad about is, I was that kid. Growing up in a 3rd world country, running games that I didn't own on hardware that ought not run them, debugging why those games didn't work, rooting my phone and installing custom OSs just for the heck of it. Man, I had so much time to tinker.
Now I have amazing gaming hardware but I barely touch games. When I do, it's on Steam. I've swapped out the endless tinkerability of Android for the vanilla "it just works"-ness of the iPhone. That curiosity took me far, but I seem to have lost it along the way.
I had replaced all the Windows sounds and cursors to customize the system so it looked and sounded like a Sci Fi system. I even patched the boot screen to be a humorous screen of "MS Broken Windows". It also was quite broken from messing with system files I didn't understand.
It was a magical period and I learned so much.
This is a $599 computer with purpose-built architecture for (barely) running (small, underpowered, near-useless) LLMs. There are children saving pennies for this machine that will do great, horrifying, dangerous things with these computers. I can’t wait to see the results.
As someone who lived on a Chromebook for fun, because it was a cheap way to get a browser machine that also had Linux access, I don't really get this. You can run Blender on a Chromebook as soon as you turn on the Linux container. It will run even better if you install Linux on it after a quick firmware flash.
If it's locked down by a school, that's not really the Chromebook's fault; schools are gonna lock down Macbook Neos via management policy the exact same way.
BTW a laptop is definitely the only choice if the kid wants mobility, though.
10 year old me identifies with this so much.
I managed to get the computer to display 256 colours instead of the 16 it had been set up with. Everyone was impressed and this meant I was now allowed to take the computer apart and put it back together again without anyone being scared.
Brilliant. Thank you for that.
That Cyrix machine was already miles ahead of the 386 that was handed down to me to play text based games on and learn dos through hard knocks. I remember leafing through old hard drives that had 10mb of capacity and realizing they had no value despite not being that old.
Later in college, I had the confidence to build my first desktop with parts cobbled together from sketchy resellers. Athlon A1, single core, 1GHz. Man, that thing could fly.
> But just using the Neo, without any consideration that it’s memory limited, I haven’t noticed a single hitch. I’m not quitting apps I otherwise wouldn’t quit, or closing Safari tabs I wouldn’t otherwise close. I’m just working — with an even dozen apps open as I type this sentence — and everything feels snappy.
https://daringfireball.net/2026/03/the_macbook_neo
I think people are assuming it's going to be a worse experience than it actually is. I don't know how it does it with 8GB of RAM either, but apparently it does; I suppose my guess would be that the SSD and bus are fast enough that swapping on app change is no longer so disruptive? (I don't know if improvements in virtual memory swap logic could also be a thing that matters or not; this is not my area.)
These things were the ones that led me to this passion and that today, with LLMs and almost only business applications to develop for the sake of living, feel like the magic is finally fading.
God I miss the old times! (I'm 36 yo but I feel like 70 in this specific moment!)
I'm still this person today :)
Edit: for true self embarrassment purposes you can see some of the films we made here:
https://youtu.be/FRQv7VUauWs?t=447&si=lCu3rp28XfKWN5ch
And also here:
https://youtu.be/ytKIG802baw?t=615&si=FJI8Cm9yYaXKMZdS
I set the start times to my movies. But there are others as well. lol.
I bought a 16GB M2 MacBook Air after I was Amazoned to work on a side contract when I was between jobs. I used it for four weeks and the only thing I ran on it was VSCode, Safari and Zoom. I would have been fine with the MacBook Neo. Right now with a job, it’s about the same - we use GSuite in a browser.
But it was mine, I tinkered with it forever, learned databases enough to turn Access into a basic quasi-Excel for my needs, cataloged things that really didn't need to be tabulated, and generally learned as much as that little machine would let me.
That was a limited computer, one that couldn't possibly have let me do what I needed to do when I hit university. But it got me started, taught me to tinker, and I'm pretty sure pushed me to learn more than a state-of-the-art-for-the-time computer would have.
And so I do wonder, at times, if it's the nostalgic look back at early computing that makes people inclined to say "my god that would have been an amazing computer to start out with" when you look at an entry-level computer. I'm inclined, even, to say man that's going to be an epic $100 computer on the second-hand market in a half decade or less.
When at the same time, it's actually a solid machine for more of us than we geeks, with our inflated expectations of computers, would like to accept. That, too, is pretty cool.
If I'd see that on my kids' computer it would fill me with pride.
I do miss the days that a (Linux) computer was like this for me. I say Linux because I had a similar obsession with FOSS and what it meant in a broad sense. But it doesn't matter; before that I de-compiled some program to make the text on the Windows START button different. Re-installed Win 2000 about every week, often after f-ing it up. Before that I changed some lines in DOS' autoexec.bat so it would ask for a password (which was just 2 input parameters readable in the autoexec.bat, but that is some mighty fine security (through obscurity) in a normie family).
This is why you should grow your own trees and wait a million years and melt the silicon into nvidia gpus and install linux.
learn linux.
only sanfran idealists dreaming of a world that destroys them use apple products.
May all the hackers out there, old and young, discover the beauty of the personal computer.
Different computers definitely give me different feelings when I use them. Some inspire creativity and the desire to make something meaningful or beautiful, others feel like machines made for work or for play. All of the boundaries are fuzzy but, for me, computers definitely have an emotional valence.
When computing was niche, you really only got into it if you had a real interest in it (I assume - I wasn't around back then). That's changed, but it doesn't mean that that category doesn't exist anymore. If anything, it's probably way larger in absolute terms, even if a smaller proportion of people who work on software in general.
I'd install Photoshop and Illustrator on my shitty computer that I put together from spare parts from my dad's business computers that he no longer had use for. It was horribly slow, but I kinda made it work, slowly.
The thing is, I think this is what made me think a bit differently: since everything was slowed down and took more time than I would want it to, I had to make deliberate decisions on what to add/edit. I still work the same way today, to a point, but that's because I'm both faster and more experienced, and computers have gotten more performant (and because I can afford better devices, sure).
When I look at my half-brother and his teenage generation I wonder if they can still have such an experience. The personal devices have gotten better and faster, most things are really convenient and you sometimes even don't have to think a lot to do something also because they're cheap to do... they probably won't have the experience of "grinding it out" just for the sake of producing something they like...maybe sports is the closest...no idea, but have been thinking about this quite a lot recently...
Chromebooks have a Linux VM where you can install anything, including GUI apps, and doing that is much more straightforward than installing something from the web on a Mac. Download, right-click, install on Linux. No scary warnings. No need to go to system settings.
Love the spirit of the post.
As a high school dropout, with a GED, I’ve spent my entire adult life, looking up noses. I chose a career jammed to bursting, with sheepskins, because I really enjoy doing tech. Not because I wanted to make money, or because I wanted to be a big shot.
My first ever program, was in the 1970s, some time. It was a Heathkit programmable calculator. My first ever “serious” program, was Machine Code, typed into a 6800-based STD card, nailed to a piece of wood, with a hex keypad, and an 8-digit LED display. My first personal computer, was a VIC-20, with 3KB of RAM. My first Apple computer was a Mac Plus, with 4MB of RAM, and an external 20MB SCSI hard drive.
Learning on limited resources helps us to become frugal and efficient. It also helps us to become tough as hell. Some of the best engineers I ever worked with, had rough backgrounds.
These days, I use a pretty maxed-out Mini, and an LG ultrawide screen. I’m spoiled.
This hits home. Not because I did it as a kid; I'm a bit old for that. But because I've done this exact thing two or three times. You stare and know, just know, that somewhere in this byzantine interface there is the raw power to do lots of cool 3D stuff. But damn. It's quite an interface.
> That is not a bug in how he’s using the computer. That is the entire mechanism by which a kid becomes a developer. Or a designer. Or a filmmaker. Or whatever it is that comes after spending thousands of hours alone in a room with a machine that was never quite right for what you were asking of it.
Yeah. For me it was an old, beat-up 286 that I couldn't get anyone to upgrade, and a loving devotion to MS-DOS, old EGA Sierra games, TSR programs, TUIs, GeoWorks, and just not being able to get enough of it.
When I finally saved up enough to buy a 486 motherboard, I installed Linux because it seemed cool (and was cool) and never looked back. But that 286 sparked my obsession with computers that has influenced almost every aspect of my life.
Anywhere in the world, the kind of kid that does all this and installs Blender on it is WAY more likely to save up for any janky terrible half-working PC laptop with a bit better specs (memory in particular), or a desktop computer if possible, because A. games B. Linux C. piracy and more software D. he does not care about it being Apple or "just working", in the words of the author himself. I don't know how the US kids in particular feel about this since the reality distortion field is so strong, but anywhere else it's like this.
That kid will be much better off with a used laptop and Linux or BSD.
Is that even possible now? Probably not. Years ago I tried to get my kids interested in playing with their own Raspberry Pi when they came out, that they could do whatever they wanted with on the side, to little effect. Not even the idea of setting one up as their own Minecraft server (they were heavily into it at the time) piqued their interest. Oh well.
With a clear feedback loop and the insane motivation of a child, I learnt to make games/software in BASIC, which ended up defining my life.
Sometimes we overthink it, all a child needs is a safe environment to fool around and letting them be obsessed about things.
When I was working my most recent corporate job (as a people manager, natch) there were new hires even in 2019 that had never owned a computer that wasn't a phone, and just used whatever laptop or other system was supplied by their school or (now) work. This experience blackpilled me a little, I will say.
I haven't used a computer more recent than 2016. As far as I can tell, the only thing I'm missing is AAA gaming (RTX looks cool), and local LLMs.
I did a bunch of game jams on it. Even won one! (Of course, even 2010s hardware is overkill for 2d games :)
I also did some basic video editing on it but it was a bit slow to render.
I won't say I'm not missing out — I'm certainly looking forward to an upgrade! — but that you can get surprisingly far with surprisingly little.
It is not the proverbial gift horse. You are paying fresh $ for it. So, it is only reasonable to have some baseline expectations on redeeming value from it.
Also, an important point of the MacBook Neo criticism is that because of its cut-down features, a Neo may never graduate to a "hand-me-down computer", but instead head straight to the e-waste pile.
As a hacker and tinkerer I could never justify the cost of purchasing one of their machines, but I see people around me trying to sell their old Apple machines and phones at absurd prices (I do live in a peripheral economy and Apple stuff is even more expensive here).
So it makes sense for Apple to segment its market like that. It makes sense for their audience too.
When I shop for a car, I find the same issue: most analyses have very little to do with the car's technical attributes and there's a lot of gibberish about design and lifestyle.
8-bit. 16KiB of RAM. BASIC as the programming language. 640x256 resolution in 8 colours.
I could make that thing sing in an hour. It was hard to get it to do much, but then the difficulty was the fun thing.
By the time we got to the early 2000s and I could buy something with more RAM, CPU and storage than I could ever reasonably max out for the problems I was interested in at the time, I lost something.
Working within constraints teaches you something, I think. Doing more with less makes you appreciate the "more" you eventually end up with. You develop intuitions and instincts and whole skillsets that others never had to develop. You get an advantage.
I don't think we should be going back to 8-bit days any time soon, but in the context of this post, I want novices to try and build software on an A18 chip, I want learners to be curious enough to build a small word game (Hangman will do at first, but the A18 will let them push way, way past that into the limits of something that starts to feel hard all of a sudden), to develop the intuition of writing code on a system that isn't quite big enough for their ideas. It'll make them thirsty for more, and better at using it when they get it.
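The Hangman idea above really is the right size for a first project. A minimal sketch of the core logic might look like this in Python (the names `mask` and `play` are my own, just for illustration, and the I/O loop is deliberately left as the learner's exercise):

```python
def mask(word, guesses):
    """Show the word with unguessed letters hidden, e.g. 'la____'."""
    return "".join(c if c in guesses else "_" for c in word)

def play(word, guesses, max_misses=6):
    """Return 'won', 'lost', or 'playing' for a set of guessed letters."""
    misses = sum(1 for g in guesses if g not in word)
    if misses >= max_misses:
        return "lost"
    if all(c in guesses for c in word):
        return "won"
    return "playing"
```

Wrapping that in an `input()` loop, then adding a word list, then ASCII art, then a two-player mode, is exactly the kind of gradient that makes a machine start to feel too small in the best possible way.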
when you're young, time is infinite, money is scarce.
Older, and time seems to take over. The limitations are - when can you free up the time? Is relaxing allowed?
It's not locked down in any way, this is a fully featured machine, unlike Chromebooks, which in my experience don't cost a cent less than equivalently specced Windows laptops. Due to this I never considered buying those.
I'm not even a Mac fan, I just think they make nice computers and their OS has the least amount of downsides atm.
I always wonder what the world would be like in a battle between Google and M$ rather than M$ and Apple. Obviously less advertisement, more focus on function and less form.
Then I closed it for a year. Opened it up again one day, followed a box-modeling tutorial (from the documentation PDF linked in the Help menu!), and I was hooked. Spline modeling, rigging, walk cycles, texturing, lighting experiments, every spare minute for the rest of high school.
I still remember the whole-body panic of accidentally turning on "adaptive degradation", which replaced all meshes with their bounding cubes when rotating the viewport camera, and thinking I had broken my video card.
It absolutely does. But every system has constraints; even when provided with massive resources, humans tend to try things that exceed those resources, as evidenced by Parkinson's Law of data https://en.wikipedia.org/wiki/Parkinson%27s_law
I have a typical yuppie software job with decent pay, so generally I will buy the right tools for a job now instead of trying to make do with whatever I can scrap together. I'm not that busy of a person, but I certainly have more obligations than I did when I was sixteen, and now sometimes it really is worth it to spend an extra grand on something rather than spend a week hacking together something from my existing stuff.
Still, I look back at the hours I spent making terrible YouTube videos with my terrible computer really fondly. I was proud of myself for making things work, I was proud of the little workarounds I found.
I think it's the same reason I love reading about classic computing (80's-90's era). Computers in the 80's were objectively terrible compared to anything we have now, and people still figured out how to squeeze every little bit of juice possible to make really awesome games and programs. The Commodore 64 and Amiga demos are fun to play around with because people will figure out the coolest tricks to make these computers do things that they have no business doing. I mean, the fact that Bad Apple has been "ported" to pretty much everything is something I cannot stop being fascinated by. [1] [2] [3] [4]
[1] https://youtu.be/2vPe452cegU
[2] https://youtu.be/qRdGhHEoj3o
> trackpad is only incrementally better than what was available in Chromebooks and cheap PCs
Did you use a touchpad of an old cheap PC? Apple would not dare to use one comparable to that in their wildest nightmares.
That's definitely valuable, but not for a child in my opinion, it's the type of luxury equivalent to a Mercedes over a Renault. Perfectly defensible but, just like a Mercedes is hardly a starter car, I don't think an MBP is that fit for a starter PC. It's also mostly useless if you're not traveling for work regularly.
That said, does any of that even matter any more? People were learning Blender, programming and whatever else 15 years ago on low to mid range machines already. The equivalently priced - or dirt cheap second hand - machines of today are multiple times more capable at everything. Stick Linux and a $5 mouse in it and you're 90% of the way to a macbook pro in terms of user experience.
That's to say, I agree with the core of the article: kids will make the most out of the least. But I disagree that this particular laptop is a necessity or a boon for that. If anything, it's a hindrance for being a mac.
It was a Sony Vaio, and the only thing I really remember about the hardware/specs is that it had a physical scroll widget under the touchpad on the edge of the case. Software-wise, it was running some relatively locked down version of Windows. I installed Arch on it and used it to rebuild and manage the non-profit's website.
The other thing that I remember from it is that it was my entrance into using the terminal as my primary interface - the first place I used Vim regularly, and the first time I'd installed tmux. One day I was trying to test a dropdown or something on their website, and discovered that my touchpad didn't work. It turned out to have been broken by an Arch update, which wasn't terribly surprising. What was surprising is that once I'd traced down the issue and corrected it, I realized that it had been broken for almost two weeks. I'd used that computer every day and hadn't needed to use a mouse even once.
That's basically the reason I learned Linux initially, and those hours debugging video driver issues would serve me well later on.
It brought back memories of when I first started using a Unix time share at university, and exhaustively read all the man pages. Didn’t know why, just wanted to discover everything.
The bootloader kids get my deep respect. I think I'd rather give my kid a Neo to begin with.
I feel this, and on the whole I've done the same thing. I'm deep in the Apple ecosystem because it all just works together without me having to tinker with it. I think this is mostly a reaction to now doing that stuff professionally - 4 days a week, whether I feel like it or not, I'm required to make computers do things they couldn't do before I started.
When I get to the end of the work day, or out of bed on a Sunday morning, I might get the urge to tinker with things but I refuse to have tinkering with things to make them work be a requirement for my rest time. Leisure tinkering must be on my terms, because if I'm forced to tinker with something just to do what I really wanted to do that's not tinkering, that's the thing fucking with me, and I will swear profusely at it throughout.
Now, 20+ years later all my home computers are running Linux (Debian though), and my kids grew up using Linux.
But I'm going to send my teenager to college with Windows or a Mac. They're going to be 1200 miles away, and they're going to need to get support for their computer and I won't be there.
Yes, I like Linux 1000x better than Windows or Mac, but Linux demands a different relationship with the admin. This kid hasn't wanted that relationship with tech, and will rely on friends to help get Office or Zoom or whatever installed.
I'm still deciding between Mac and Windows now. I'll probably end up getting a quality used business laptop from FB marketplace, but the Neo is interesting too.
I’m sure most other applications are less Mac-optimized, though (software development, 3d/graphics editing, gaming, …).
Cheap computers with hardware constraints have been around for decades. Now Apple ships one with pretty damn good performance, and they've invented "cheap computers with hardware constraints." HA!
My first computer was a Commodore 64 I found in a pile of trash a few years after they came out. My first PC was a 33MHz Cyrix Instead I bought off my first college roommate. Now there are some real hardware constraints!
But yeah, necessity is the mother of invention. No doubt about it. Just not seeing how a $600 polished and performant laptop fits that bill ;)
The kid’s parents - and the kid - all have iPhones, so it’s familiar.
The kid’s school requires Windows or Mac for their WiFi and won’t let the kid use Linux because they don’t trust it.
There’s plenty of reasons why Linux isn’t the answer in current climate.
And they can do the former in a VM anyway. Install Linux, or a BSD, and go. With the bonus that you can experiment fearlessly because you've got snapshots and the worst-case for experimentation still leaves you with an entirely functioning laptop. Or use a cheap VPS, remotely.
The game Elite did something extremely evil and clever: it was actually able to switch between modes partway through each frame, so that it could display higher-resolution wireframe graphics in the upper part of the screen and lower-resolution more-colourful stuff for the radar/status display further down.
There's adults nowadays that do their taxes on their phone, cut videos on their phone, and edit spreadsheets on their phone.
And while smartphones and chromebooks are great at accomplishing your desired tasks, they offer no opportunities for growth. You can't change and play around with the system, become a power user, modify your system, look behind the curtain, and gain real understanding.
There's an excellent blogpost on this topic, "The Slow Death of the Power User": https://fireborn.mataroa.blog/blog/the-slow-death-of-the-pow...
of course it will praise the product like it's golden, turning disadvantages into "that's actually the good part"
Chrome books and phones teach nothing.
Uh... if you need to compile for iOS, sure.
But outside of that, no its not.
You are literally just paying the Apple tax that they deliberately choose.
So this is only for the kids who are obsessed not just with computers, but brand-new computers. Which is a different demographic.
As somebody who identified a lot with what's in that article I can say that I haven't just made peace with having been "different" but I love it and wouldn't want it any different comparing my life today with that of the arrogant non-nerds who made fun of us back in school.
Oh so all our hypothetical child has to do to discover what computers can actually do is completely rebuild one's software from scratch with no prior knowledge.
Next you'll tell me F1 drivers in their teens just have to LS swap a Saturn SC2 and book time at a track.
I wrote about the Mac in general since that’s what I know, but I imagine if I grew up in the Windows world and liked Windows more, I would have a similar experience with my dad’s old ThinkPad or something.
And macOS frankly provides a far better Unix experience than ChromeOS, in my experience, having actually used both (including for development, though only for a short time on ChromeOS because it was horrible).
It shouldn't be, except that the author chose to make every single paragraph about Mac, Apple ecosystem and bashing Chromebook.
As for piracy, it's just as easy on a Mac, and MacOS has more quality software than any other platform - unless you're talking about software used in factories and such.
But how many kids actually "save up" for a computer vs being given one by their parents or getting a hand-me-down from relatives? I would suspect that many parents would be more than happy to buy a Macbook for their kid if they showed that kind of interest.
True, and suffering through the limitations of the Apple platform will show the kid why Linux is better.
There is a certain kind of computer review that is really a permission slip. It tells you what you’re allowed to want. It locates you in a taxonomy — student, creative, professional, power user — and assigns you a product. It is helpful. It is responsible. It has very little interest in what you might become.
The MacBook Neo has attracted a lot of these reviews.
The consensus is reasonable: $599, A18 Pro, 8GB RAM, stripped-down I/O. A Chromebook killer, a first laptop, a sensible machine for sensible tasks. “If you are thinking about Xcode or Final Cut, this is not the computer for you.” The people saying this are not wrong. It is also not the point.
Nobody starts in the right place. You don’t begin with the correct tool and work sensibly within its constraints until you organically graduate to a more capable one. That is not how obsession works. Obsession works by taking whatever is available and pressing on it until it either breaks or reveals something. The machine’s limits become a map of the territory. You learn what computing actually costs by paying too much of it on hardware that can barely afford it.
I know this because I was running Final Cut Pro X on a 2006 Core 2 Duo iMac with 3GB RAM and 120GB of spinning rust. I was nine. I had no business doing this. I did it every day after school until my parents made me go to bed.
The machine came as a hand-me-down from my nana. She’d wiped it, set it up in her kitchen in Massachusetts. It was one software update away from getting the axe from Apple. I torrented Adobe CS5 the same week. Downloaded Xcode and dragged buttons and controls around in Interface Builder with no understanding of what I was looking at. I edited SystemVersion.plist to make the “About this Mac” window say it was running Mac OS 69, which is the s*x number, which is very funny. I faked being sick to watch WWDC 2011 — Steve Jobs’ last keynote — and clapped alone in my room when the audience clapped, and rebuilt his slides in Keynote afterward because I wanted to understand how he’d made them feel that way.
I knew the machine was wrong for what I wanted to do with it. I didn’t care. Every limitation was just the edge of something I hadn’t figured out yet. It was green fields and blue skies.
I thought about all of this when I opened the Neo for the first time.
What Apple put inside the Neo is the complete behavioral contract of the Mac. Not a Mac Lite. Not a browser in a laptop costume. The same macOS, the same APIs, the same Neural Engine, the same weird byzantine AppKit controls that haven’t meaningfully changed since the NeXT era. The ability to disable SIP and install some fuck-ass system modification you saw in a YouTube tutorial. All of it, at $599.
They cut the things that are, apparently, not the Mac. MagSafe. ProMotion. M-series silicon. Port bandwidth. Configurable memory. What remains is the Retina display, the aluminum, the keyboard, and the full software platform. I held it and thought, “yep, still a Mac.”
Yes, you will hit the limits of this machine. 8GB of RAM and a phone chip will see to that. But the limits you hit on the Neo are resource limits — memory is finite, silicon has a clock speed, processes cost something. You are learning physics. A Chromebook doesn’t teach you that. A Chromebook’s ceiling is made of web browser, and the things you run into are not the edges of computing but the edges of a product category designed to save you from yourself. The kid who tries to run Blender on a Chromebook doesn’t learn that his machine can’t handle it. He learns that Google decided he’s not allowed to. Those are completely different lessons.
Somewhere a kid is saving up for this. He has read every review. Watched the introduction video four or five times. Looked up every spec, every benchmark, every footnote. He has probably walked into an Apple Store and interrogated an employee about it ad nauseam. He knows the consensus. He knows it’s probably not the right tool for everything he wants to do.
He has decided he’ll be fine.
This computer is not for the people writing those reviews — people who already have the MacBook Pro, who have the professional context, who are optimizing at the margin. This computer is for the kid who doesn’t have a margin to optimize. Who can’t wait for the right tool to materialize. Who is going to take what’s available and push it until it breaks and learn something permanent from the breaking.
He is going to go through System Settings, panel by panel, and adjust everything he can adjust just to see how he likes it. He is going to make a folder called “Projects” with nothing in it. He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes. He is going to open GarageBand and make something that is not a song. He is going to take screenshots of fonts he likes and put them in a folder called “cool fonts” and not know why. Then he is going to have Blender and GarageBand and Safari and Xcode all open at once, not because he’s working in all of them but because he doesn’t know you’re not supposed to do that, and the machine is going to get hot and slow and he is going to learn what the spinning beachball cursor means. None of this will look, from the outside, like the beginning of anything. But one of those things is going to stick longer than the others. He won’t know which one until later. He’ll just know he keeps opening it.
That is not a bug in how he’s using the computer. That is the entire mechanism by which a kid becomes a developer. Or a designer. Or a filmmaker. Or whatever it is that comes after spending thousands of hours alone in a room with a machine that was never quite right for what you were asking of it.
I was that kid.
He knows it’s probably not the right tool. It doesn’t matter. It never did.
The reviews can tell you what a computer is for. They have very little interest in what you might become because of one.
“Tomorrow's World: Nellie the School Computer 15 February 1969 - BBC”
The idea that constraints aid in creativity is not new.
It was junk. The Eee PC was cheaper, lasted much longer, and had Debian out of the box.
The school thing is different, but also hardly unique. A school-issued MacBook is often similarly locked down and unusable as a dev machine, since the student lacks permission to install anything the school deems dangerous.
And when I go to the grocery store, I am paying the Safeway tax that they deliberately choose, and when I go to the gas station I am paying the Exxon tax that they deliberately choose, and so on.
My 12-year-old wanted a laptop to build mods for games. I got her a new M1 MacBook with 8GB - $425 from Walmart, refurbished.
My 17-year-old wanted a laptop for college, but wasn't sure what she needed or wanted yet. I gave her my 2020 M1 MBP.
If either of those situations arose today, I'd get them a Neo.
(yes, I know we're all on the spectrum somewhere, but a diagnosis requires that the traits impact your life significantly, and I think many people would say that I have many autistic traits that are negative).
I think folks want to hate Apple more than they want to admit that Chromebooks kinda suck.
What would have been trivial porting work with documentation becomes extremely time-consuming, hard work without it.
That is why Asahi Linux lags by several years with the support for Apple computers, and it is unlikely that this lag time will ever be reduced. Even for the old Apple computers the hardware support is only partial, so such computers are never as useful for running Linux as AMD/Intel based computers.
That said, I quickly upgraded to a 4 year used Thinkpad and that was a huge difference.
When a PC was expected to boot to an OS and not much else, we had all the freedom - by necessity - to tinker and learn. Hardware was barely enough for most day-to-day usage, so we upgraded relatively frequently and got to know the physical innards as well.
This is all so streamlined today that even computers can be smartphones with "apps", or even just a browser that gets you to google slides and everything else (or the MS equivalents). It was probably a necessity that, as computers became infrastructure, they would become simplified, so 90% of the population can indeed file their tax return online (and the remaining 10% have their younger family members do it).
This also means that people nowadays simply don't know that they can walk into any second hand store and get a $200 PC with a warranty that'll be much more productive than any smartphone if they have the knowledge to use it properly. But was there really a loss? These are, for the most part, people that would not have been able to hop on the internet wagon if it'd relied on maintaining a linux distro at all. That's regarding adults; children now do indeed grow up with walled systems for the most part, and that might be a loss.
Are there even any x86 Chromebooks left at that price point? They're the only ones still capable of chrooting into Linux; ARM Chromebooks remain locked up.
If I got a Chromebook as a personal machine as a kid, I probably would’ve rooted it and seen what I could do, but growing up, the beauty of the Mac (in that Snow Leopard era) was progressive disclosure. I could start on the happy path and have a perfectly stable machine, then customize the behaviors through the terminal, see what it does, mess with the system files, see what breaks, revert it, then go back to using iMovie like normal.
In my (admittedly limited) time using a rooted Chromebook, it’s much more like a switch flip. You go from mandatory water wings directly into getting pushed into the ocean and Google shouting “Good luck!!”
It's still the best desktop UNIX experience, especially since cheap PC laptops (and until very recently expensive ones) almost always have horrible build quality. It's also within only the last few years that PC trackpads came anywhere near the trackpads on Apple machines. Sometimes what you call a "tax" is literally some of us wanting quality.
Though let's be realistic, here: $600 is much more than the typical school-assigned Chromebook.
These use the A series chip, and even supporting new M chip revisions has been enough of an undertaking that I wouldn't really expect this to get Asahi linux anytime soon....
And apple can lock down the bootloader to be closer to the iPad/iPhone at any time with no notice, and based on their past actions, it would be quite in-line with their character to do so.
5 seconds of googling will get you an answer to "install blender on a Chromebook"
Honestly I’m pretty convinced that this "open" bootloader was just there to avoid criticism and bad press from specialized outlets when they presented the M1, because, for once, they needed those outlets to benchmark the M1's performance and not have anything bad to say about anything else.
They constantly break everything year after year without documenting any changes, which effectively makes Asahi unusable on anything recent.
I’m betting that they are just patiently waiting for Asahi to die by letting it fall several years behind (which is already the case), then they'll announce "the most secure Mac ever" and silently ship a closed bootloader when nobody, especially the press, will care anymore.
Don’t get me wrong, I love Asahi and I even have it installed on my M2 Air, the project is doing incredible quality work. But I don’t believe it will last long. Hope I’m wrong, though.
I don't think the author is exactly bashing the Chromebook. I read it as the author praising an open ecosystem where you have flexibility and the choice to "take off the guard rails" and go where the device was not originally made to take you.
There's an argument to be made that it is ironic that Apple is the example of this, but to me that shows why I _still_ like MacOS, when all the other variants (iPadOS and iOS) are entirely locked down.
>and MacOS has more quality software than any other platform
This is simply untrue, and not something a tinkerer cares about on a general-purpose machine anyway (with my niece and son as n=2).
Meanwhile, some other kid in your area probably got scolded for installing F-Droid. Oh well...
Which is what makes a used Lexus so compelling. Toyota engineering under the hood, "Luxury" design on top.
(And I'm not trying to say anything is special about the laptop I'm using. I adore using trackpoint (so much that I brought my own trackpoint keyboard in to work to use there) so would gladly trade for an old thinkpad if what I had didn't already do what I need it to do).
But also, not every kid is interested in that anyway.
The cooling will be terrible so that every 30 seconds the fans kick in at full speed, thermal throttling takes hold, and then it decides all is fine 30 seconds after that so you're working on a machine that's constantly cycling fan noise. The trackpad will suck, meaning you need to have a mouse precariously balanced somewhere anytime you're using it. It'll have some irritating BIOS feature you can't work out how to turn off that flashes a giant icon on the screen whenever you hit the caps-lock button. The casing will be plastic and snap where the screen hinges are screwed in after a year of light use. It may even be a Chromebook, making it a glorified tablet with an attached keyboard and (terrible) trackpad.
The thing Apple have done here isn't to release a $599 laptop, it's to release a $599 laptop where the compromises are ones the average user buying a $599 laptop isn't going to notice every day, while not compromising on the things they do notice. It's made of metal, the trackpad feels good to use, the keyboard is pleasant, and the battery is large enough that you're not carrying around a charger that weighs as much as the laptop itself.
It can do most anything. It may not be amazing, but people get by. And they may be OK with it.
I saw tons of comments in the original post about the Neo from people who talked about how they used extremely old hand-me-down/used laptops to learn to start programming and fall in love with computers.
I was just watching a video from ETA PRIME who tests lots of small computers to see how good they are for gaming.
He was playing RoboCop on it, and it ran pretty well. 45-ish FPS. It was using 11 gigs of RAM at the time. So it was obviously in swap.
Is that ideal? No. But it works.
Now it's much easier to start using a computer, but going beyond that has become so much harder.
I don't know if this is particularly current or what, or if it's easy to setup to run another OS or whatever, but it meets your price and architecture criteria.
https://www.bestbuy.com/product/hp-14-chromebook-intel-celer...
If the school is managing these Macs, including laptops sent home with a student, then unless it's for a specific purpose they aren't allowing you modify files, you probably aren't allowed to open a terminal or system settings, and you definitely aren't disabling SIP. You might not even be able to access the open internet if they've hard-configured it into a VPN. No different from a managed Chromebook.
Likewise, even older and lower-end unmanaged Chromebooks can enable a full Linux environment that runs a terminal in a browser tab. Doing so doesn't require root or developer mode, and it doesn't change or sacrifice any of the rest of the ChromeOS environment (for which your core assertion, that an unmanaged Mac is a computer and an unmanaged Chromebook is a thin-client appliance, still fundamentally holds). You can install Blender and have it running in a window by about 1 minute into watching a YouTube video titled "How To Download Or Update ANY VERSION Of Blender On Chromebook".
Gaining root on a Chromebook is mostly just a prerequisite to modifying things specific to ChromeOS, but the easier-to-access, more featureful, and safer Linux development environment is still an entire operating system that you can tinker with, screw up, overload, blow up, and reset to zero, all without losing the happy path of opening up Canva (or, more likely, CapCut on their phone/iPad) and editing videos or whatever.
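For what it's worth, once the Linux environment is toggled on in ChromeOS settings, getting something like Blender running really is just a couple of commands in the Debian container ChromeOS provides. A rough sketch (the apt-packaged Blender often lags upstream, so Flatpak is the common alternative):

```shell
# Inside the ChromeOS "Linux" terminal (a Debian-based container).
# Option 1: the Debian-packaged Blender (frequently an older release):
sudo apt update
sudo apt install -y blender

# Option 2: a current Blender via Flatpak, which the container also supports:
sudo apt install -y flatpak
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
flatpak install -y flathub org.blender.Blender
flatpak run org.blender.Blender
```

No root or developer mode required for any of this, which is the point being made above: the sandboxed Linux environment is the tinkering surface, and resetting it costs nothing.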
Your parent comment is arguing against perpetuating the wrong negative connotations and lack of understanding of autism.
Not to say the original author was doing it maliciously, I don’t think they were.
The Macbook Neo will be no different if the school is actually managing them properly.
Fedora is the best OS humanity has ever made. No exaggeration. There needs to be a best, and it's Fedora.
Linux gets a bad reputation because 20-ish years ago Ubuntu sent out free CDs and became the dominant distro. Ubuntu/Mint are part of the Debian family: outdated Linux. They call outdated Linux 'stable', but it's not stable like a table. It's version-frozen. Bugs that are fixed today won't get those fixes for two years. Not to mention, a new mouse you buy from Amazon, an Nvidia card, or a web video player won't work due to the outdated nature of these distros. (And yes, I know you can do surgery to update it, but no one likes that.)
Fedora is not Arch. Fedora is the consumer grade Red Hat.
But we know there’s lots of other models that they’re already working on. We don’t know how similar or different it is from an OS perspective.
> There's an argument to be made that it is ironic that Apple is the example of this, but to me that shows why I _still_ like MacOS, when all the other variants (iPadOS and iOS) are entirely locked down.
This was such a natural and common thing that I never even questioned if others were having a different experience with computers. This sounds crazy now, but it felt as if everyone was either going to learn to program or already had, not as a career choice but as an essential form of literacy. I mean even the calculators were programmable!
To me, Macs were just "the boring computers" we had at school and what my grandparents bought. They seemed locked down and weird like an appliance. I have no idea what my life would be like now if I had grown up in a different time and with a Mac.
This isn't to hate on Macs, but to tell the story of the dominance of Microsoft at the time and how much culture shifted towards more "dumb" consumerism. By the time the first iPod came out I realized the adults had no interest in any of this more progressive future. Then the iPhone and Windows Vista confirmed it.
I installed Ubuntu on the ThinkPad I had in high school and never really looked back. To this day, I am still baffled by the obsessions people have with AI "replacing jobs" and Apple devices as status symbols. I think those people miss the point entirely and worry about their incomplete worldview being passed down to younger generations. What I see is the masses refusing to participate and technofeudalists taking advantage of them.
Is that gonna be before or after the iphone with no usb port?
That doesn't mean that the engineers will necessarily ship something more flexible than what the PMs asked for. Often not.
But sometimes they will.
Not entirely sure about that. With the coupons/deals HP does, I landed a brand new HP 17" (17t-cn500) laptop with an Intel Core Ultra 5 225U, 16GB of RAM, and the 1920x1080 FHD screen upgrade for $588, tax included. Obviously not the same form factor as the Neo, but this is also a laptop for my kitchen that will spend most of its life streaming media and displaying recipes. I'm sure you can find something closer to the size of the Neo with better specs for a similar price.
Oh my god yes. I've read way too many discussions that completely overlook this aspect.
So many people get so fixated on meaningless labels such as "smartphone CPU" (meaning it's bad) and things to pick on like "ew, no HDMI or Ethernet," as if it were a life-saving thought-terminator that preserves the worldview in which absolutely nothing under the Apple brand can be in any way good.
Of course not. I could do it in a coma. I've also been using computers since 2004, and you're probably similar.
You just need to recognize that not everybody aspires to be competent with lower-levels of hardware and software that Apple makes that much more difficult. Most Apple users are content to use apps written by others and that is as far as their interest goes.
An analogy is the car market. Most people don't care how the car works, etc. They just want to get to places. If you only need to drive to the shops and do minimal errands, you don't even need a truck - a sedan will do just fine. Same with computers, lots of different market segments with distinct needs and expectations.
But at the time there was nothing, because Apple Silicon wasn't a platform anyone but them was targeting, because they had just created it.
So they built the infrastructure, and then waited for someone to actually start taking advantage of it, before bothering to acknowledge it.
And because that "someone" isn't a bigcorp (i.e. Microsoft) wanting to do a co-marketing push, but just FOSS people gradually building something but never quite "launching" a 1.0 of it — Apple just "acknowledged" it quietly, at developer conferences, exposing it only via developer-centric CLI tooling, rather than with the sort of polished UI experience they would need if Microsoft was trying to convince Joe Excel User to dual-boot Windows on their Apple Silicon MBP.
> announce « The most secure Mac ever » silently releasing with closed bootloader
That's extremely unlikely to happen, as Apple's hardware and OS developers build Macs and macOS (and all the other hardware + OSes) using Macs and macOS. And those engineers (and engineers working at Apple's hardware and accessory manufacturing partners) will always need to be able to diddle around with the kernel and extensions "in anger" without needing to go through a three-day-turnaround code-signing process.
There's a whole proprietary, distributed kernel development and QC flow for macOS, that looks a lot like the Linux one (i.e. with all the same bigcorps involved making sure their stuff works), but all happening behind closed doors. But all the same stuff still needs to happen regardless, to ensure that buggy drivers don't ship. Thus macOS kernel development mode being just one reboot-and-toggle away.
But for Linux, the creative software simply isn't there in many cases for a kid to start learning. Unless it's programming, which is not everybody's talent.
A kid tinkering with any kind of creative software learns and absorbs important skills which they can build on later if they want to. These things are much more valuable than system troubleshooting or becoming skilled in a game.
I don’t want one, it doesn’t do what I need. But I can definitely see the use cases especially at that price point.
I've been an Ubuntu user for 20 years, and RedHat and Suse prior to that. Ubuntu just worked. Debian had packages for everything, including from 3rd party vendors. It lets me focus on my work, and not worry about the OS, or compiling packages, or finding installers. When I had issues (rare), the large user base meant that someone had already figured out a solution to the problem.
The flavor of Linux doesn't matter so much in my opinion.
Ubuntu, Mint, PopOS, and others with Debian as an upstream are not Debian. They build their own packages on their own schedules.
Fedora is not “consumer grade RedHat”. It’s the rolling release upstream of RHEL, much like Debian Testing is upstream of Debian Stable.
The main reason Linux got a bad reputation was the tribalism of people going off half-cocked talking about their personal preferences without actually working with the alternatives and starting this sort of holy war diatribe.
i have never once successfully installed fedora. probably just hardware stuff, but as often as i've wanted to try it and opensuse, they have never booted post-install for me. on machines i've successfully installed Debian and openBSD. go figure. i know i'm an outlier here. maybe it's just bad luck.
but reading your post, it sounds like a club i don't want to be a part of. linux is linux. distros don't matter. you can get nearly anything to work if you spend enough time on it. GUI OS installers that fail are not worth my time.
Never underestimate the time investment and frugality of a "technically curious" young person... Myself, I would have been a happy end-user, loading/playing games, running software - except, I bought a cheap modem - with physical IRQ jumpers - and no documentation - and its default jumper settings conflicted with my mouse in Windows. If it hadn't been for that cheap/frugal purchase and then having to invest the time to troubleshoot, I wouldn't have become "technical" and moved on to greater and greater challenges and learning experiences. Most people would have just returned it and got an external modem instead, or given up on even the possibility of connecting to BBS's...
What is fundamentally different from the late 80's/early 90's, is now there is a tremendous wealth of knowledge on the internet to actually facilitate that troubleshooting type of learning activity. Is that better? Well - there will always be a "known solution", but what I find many people do now, is follow whatever the first set of instructions they find, treating them like a "magical spell", without knowing/learning "why/how"... [And if the first set of instructions doesn't work, the majority just "give up"]
Overall - in my experience, the percentage of people who are truly "technically curious" is about the same as it ever was - single-digits... It ultimately depends on whether or not their interests/passions/blockers align with being forced to go "beyond" their comfort zone.
And, yes, I have an overlaid Linux-Libre kernel in SilverBlue.
Reverse engineering needs a lot of time and hard work, which may not be worthwhile.
Sometimes someone does this work, and everyone may benefit from it, but you should never count on this happening, unless you do the work yourself.
You don't really need that to use Linux.
People should stop copy/pasting urban myths or stories from the late 90s. We are in 2026, and one can perfectly well buy a laptop preinstalled with Linux, with full support, and just find the apps they need in an "app store" which in this case is just a frontend for the Flatpak and package managers. Picking an app from Gnome Software is no different than installing an app from the Play/Apple/Microsoft store.
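As a concrete illustration (not from the comment itself, just the standard Flatpak CLI that Gnome Software fronts), installing an app outside the GUI is a couple of lines once the Flathub remote is configured; GIMP's application ID is used here as the example:

```shell
# Register the Flathub remote once (most preinstalled-Linux laptops ship with it):
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo

# Install and launch an app by its reverse-DNS application ID:
flatpak install -y flathub org.gimp.GIMP
flatpak run org.gimp.GIMP
```

Gnome Software does exactly this under the hood, which is why the store experience is now comparable to the Play/Apple/Microsoft equivalents.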
And I prefer my Mac to this day as my main machine.
Consumer user or Linux hacker is a false dichotomy people sometimes like to try to slot people into (not accusing you GianFabien).
But anyway, I find it funny that the author implies that if a kid gets a MacBook Neo they will explore all the possibilities to use and customize it, but somehow the same kid won't try to push a Chromebook to its limits. It 100% matches my stereotype of how Apple fans view machines.
I imagine the young hackers of today just find other things with which to tinker.
I've seen kids build some amazing things in Minecraft. Is this really all that different from modifying the source to a game you copied-by-hand from BYTE magazine?
I haven't tried gaming, but I feel like it'll suck for almost anything that's not natively ARM64. Steam doesn't have an ARM64 based client yet, AFAIK.
Are these not creative software? Perhaps not industry standard, but what is industry going to look like in a couple of decades anyway?
It's also important to remember that Microsoft was in the middle of their Qualcomm exclusivity deal at the time of the M1's release, and thus Windows for ARM wasn't available on anything other than a few select devices or unofficial use of Insider builds.
That deal didn't actually expire until 2024[1], at which point Windows for ARM finally started to be sold in an official capacity with stable builds widely available.
It's entirely possible, though unconfirmed, that Apple was intentionally leaving the door open for "Boot Camp 2", and Microsoft simply never took them up on the offer, either because they were stuck in a deal made prior to the M1's release that prevented it, or because they no longer saw a financial benefit to being able to sell Windows to Mac users (possibly since Windows license sales are effectively a rounding error to Microsoft at this point; they make way more off of subscription services and/or Office, all of which are already available on macOS without having to dual-boot Windows).
[1]: https://www.tomshardware.com/pc-components/cpus/windows-on-a...
These days there are two things I use my "real" non-server computers for in my personal life:
1) Media piracy, only because I've been too lazy to set up a headless torrent system on my media server w/ VPN connection (why does the media server exist? Again, purely for piracy, remove that use case and it'd be gone as a waste of electricity and space the very next day). I could fix this in a Saturday and remove this entire use case, improving my process significantly at the same time, just haven't yet.
2) Video games. Really hoping that new Steam console doesn't end up being sky-high expensive, because I'm excited about "consolizing" this use case and ditching my last "real" PC tower (other than one server that I go months and months without directly messing-with; and that "real" PC tower is already running Bazzite to make it Steamdeck-like, it's just janky as fuck because it's Linux on frankensteined hardware so of course it's janky as fuck, so I'm still eager to swap it for something designed and with support for that actual use case).
Everything actually-important happens on iOS, and usually on a phone.
In my personal life, I've concluded that I just have no idea what more-useful-than-the-time-it-takes stuff I'd even do with a "real" computer, despite spending absolute fuckloads of my time screwing around with them from ages like 7-30. What I learned in that time was "how to computer" to a pretty advanced level, and luckily that per se has paid the bills and then some, but in all that time the actually-directly-useful-to-me stuff I've done with it has amounted to very little.
To me, personally, I've realized PCs are a solution looking for a problem, and that so rarely does my trying to bridge that gap result in an actual net-benefit (usually, not even close) that I've mostly stopped trying. That impulse got me where I am, can't complain, that "wasted" time over literal decades gained me a bunch of skills that are (it turns out) almost entirely useless to me personally but that others are willing to pay for, great, but I find myself a computer nerd who doesn't actually know WTF to do with a "real" computer. I just use my damn phone.
I suppose I'd still find them a lot more useful if iPads and iPhones didn't exist and so I needed to do banking and reading and such on some other computery device, but those do exist, so... yeah, not a lot of motivation to even own a "real" PC anymore, to include a MacBook—I doubt I'll replace my M1 Air when it finally dies.
To sum up, as a lifelong computer dork, I don't even know why I need a "real" computer any more, let alone why anyone else does.
How about Qubes OS? Also the parent never said anything about isolation and roll-backs. Nobody mentioned Silverblue except you. The discussion is about ordinary users, not hackers.
You can't really compare Audacity to Garage Band or GIMP to Affinity (which is now free).
AFAICT, the way Microsoft wants things to work, is that "Windows" is the native fat-client platform / SDK that ISVs are supposed to use/target when building fat-client apps that interact with (i.e. generate spend on) Azure-based backend systems. The #1 way Microsoft makes money at this point isn't from direct consumer or even volume-licensed subscriptions; it's from providing paid backend infra to dev shops who had long since locked themselves into the Microsoft/Windows development ecosystem, and who therefore saw Azure as the only valid cloud backend to integrate with when "cloud-enabling" their software (and/or, where the compliance story of integrating their previously native-and-local-syncing software with Azure, was 100x simpler than with integrating with any other cloud, due to Azure+Windows being able to act as a trusted principal-agent pair that can enforce policy-based security via a shared "cloud domain" identity [Entra ID] baked right into the OS ACL layer.)
Until recently, though, Microsoft thought of the Windows "platform" the same way Apple do of the Mac "platform": that "Windows"-the-platform-SDK was the same thing as Windows-the-OS. Which necessarily meant that consumers must be pushed with all conceivable effort toward using Windows-the-OS on their machines, so that these dev shops who had targeted Windows-the-SDK could reach them with their software (so that those dev shops would in turn spend more on Azure.)
But I think this equivalence is going away!
From what I've seen of discussions in various Microsoft-aligned sources recently, it feels to me like some part of what Windows 12 may mean by calling itself a "modular OS", is that Microsoft may be establishing some kind of very clean boundary layer between Windows-the-OS and Windows-the-platform/SDK.
---
What would that look like? I don't know for sure, but here's some spitballing:
Picture Mono, but as a complete UWP projection, shipping with all the native libraries that are built into Windows.
Or, if you'd prefer, picture Wine/Proton; but rather than black-box-reverse-engineered equivalents to Windows DLLs, it is all the DLLs that come with Windows. Except, now rebuilt from the ground up so that they compile against NTOS or Mach or Linux syscalls.
Basically, "the complete Windows platform" as a JVM-like runtime you "get for free" when installing Windows-the-OS, but can install on top of macOS and Linux. (Probably in various runtime profiles, as with Embedded vs Server vs Desktop JREs. You don't need D3D on your server.)
This would be likely to take 100% of the wind out of the sails of the Wine/Proton projects overnight. And maybe kill Mono itself, too. After all, why bother with half-assed third-party implementations of the Windows Platform, when you can just install the "real" Windows Platform, and get guaranteed bug-for-bug compatibility with existing Windows software (relying on the same databases of app shims and fixes Windows-the-OS has shipped with for ~forever)?
SteamOS would be reduced to "a Linux distro that preinstalls the Windows Platform." ReactOS might or might not (depends on stubbornness) be reduced to "a clean-room-implemented NTOSKRNL-compatible base OS, that preinstalls the Windows target of the Windows Platform."
Wine/Proton themselves would, if they even bothered to keep going, end up rebranded as "an alternative ground-up Windows Platform runtime." (If the official "Windows Platform runtime" was then open-sourced, then likely Wine/Proton would fully fade into obscurity, as anyone who wanted to maintain their own libre Windows Platform runtime would just start by forking Microsoft's. Very similar to the situation with OpenJDK.)
---
In any case, regardless of how they do it, any move in this direction would make it blindingly obvious why Microsoft wouldn't care about enabling something like a "Boot Camp 2" feature on Macs any more: they no longer care if you install Windows-the-OS on a Mac; they rather want you to install the Windows Platform runtime under macOS. And then they'll have you as a consumer of Windows Platform products all the same.
(Actually, even better, as they'll have you far more of the time. In the Boot Camp business strategy, any time you spend booted into macOS puts you out of Microsoft's reach, save for the few first-party apps Microsoft has ported to macOS + sells on the App Store. In the Windows Platform business strategy, meanwhile, you can be running arbitrary Windows Platform apps on your Mac [and so generating Azure spend for some ISV somewhere] 100% of the time you're using it!)
That sounds flippant, but it's the visceral reaction I have to trying to do everything on iOS (iPads are a little better than iPhone in this regard, but still have the "everything through a keyhole" feeling).
That said, tiny viewports are not really the main problem, since obviously a modern iPhone has far more resolution than almost any monitor available in 1990, or even 2000. It's more that most exploration and creation is not doable on an iPhone, by design.
My original point, though, was that seeing new grad software developers who have never had any non-externally-directed interaction with a computer made me realize why we have to show them how to use the shell in a terminal, and why they seemed to have no particular curiosity about any software thing not directly related to the task that they're doing. For new grads who aren't curious, finding that something they're doing doesn't work due to architectural issues or some nonobvious combination of bugs prompts neither asking for help nor a deep dive into what the problem could be. Asking for help is basically cheating, to them, and they've never before encountered real problems that weren't explained in the text, handled by their group partner, or trivially stack-exchangable (remember, this realization was 2019, and therefore before common LLM usage). At standup after standup, they report making progress on working through the moderately complex ticket they picked up, but in fact, nothing useful is happening. Sometimes people like this can be shown the way, and then become functioning and capable developers and troubleshooters. Sometimes not.
Anyway, /old-man-rant.
So the only real benefit compared to my MacBook Air is that the screen is a bit nicer, and I can keep more Firefox tabs open, because it has more RAM.
But honestly I did not like Silverblue. I had a 13-year-old gaming computer I installed it on, and I couldn't get the ancient GPU drivers installed due to the way things are containerized. This would have been a few commands otherwise.
Maybe it's fine for chromebook-like things. I might have picked a bad test case.
The terminal app is not iTerm, but Apple's own Terminal.app.
And no, there's no built-in package manager, but there are Homebrew and MacPorts.
The most important one is that an app's lifecycle can be different from a web browser's. You don't always keep a web browser open, but you might want to keep Discord open regardless of what you do with the browser. Managing that kind of lifecycle by hand can be tedious and frustrating for a regular user.
Discord's Electron app has many features that its web app doesn't, such as "Minimize to system tray", "Run at startup", "Game/media detection", "In-game overlays", etc.
Even PWAs can't have most of these features, so that's why we have to deploy an entire browser suite per app nowadays.
I've been migrating my workflow to this approach and I'm an embedded dev! My closet does have hardware strewn about, but once you set it up so you don't have to touch wires, it's super convenient.
My one gripe with MacBook Airs up to the M4 was support for only one external monitor. But the M4 fixed this.
RDP: Every time a native notification pops up, I get disconnected (usually the notification is about something I've been doing, such as starting a self-hosted server or running winget via UniGetUI). It also randomly disconnects after I've been connected for more than a few minutes, even when there isn't a notification. All of this so far seems to be limited to Android's RDP client (the Windows App). For the Windows built-in RDP client, my issue is that there's no way to make it resize the remote desktop when you resize the window, the way vmconnect does (and no way to proxy vmconnect connections easily for home use: I do not want to enable WinRM for the full system and figure out how to secure it, I just want a single PC on the LAN to be able to access a single VM conveniently, preferably able to log in as different users).
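For what it's worth, .rdp connection files do expose a couple of sizing-related settings; whether "dynamic resolution" actually gives vmconnect-style behavior depends on the client and Windows version, and the address below is made up:

```
// sketch of a .rdp file — "full address" is a hypothetical LAN host
full address:s:192.168.1.50
// ask the client to update the remote resolution when the window is resized
dynamic resolution:i:1
// smart sizing merely scales the existing resolution to fit; leave it off here
smart sizing:i:0
```

Save the snippet as a `.rdp` file and open it with mstsc; this only affects the Windows built-in client, not third-party Android clients.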
But there are issues with ssh (and likely WinRM/PowerShell remoting, though I haven't used it) as well: on Linux you elevate explicitly with sudo, but on Windows there's apparently no CLI equivalent; ssh sessions just run elevated (though apparently you can change this; I make do with connecting to a running psmux session that isn't elevated). As far as I know, there's no way to elevate without the GUI being involved (admittedly I haven't looked since I started using ssh with Windows).
Linux? Connecting to Linux works perfectly. I can use xrdp or ssh or VNC or X11 forwarding over ssh or [other] and they all work perfectly. I used to use x2go before Wayland, and despite the pain of actually getting it working, even that worked better; XDMCP required some amount of setup, but it was awesome (too bad there's nothing that efficient for Wayland); xpra looks great, but it either didn't exist or I was unaware of it at the time. The only issues with Linux remoting are, again, Windows-related (it's seemingly impossible to get vmconnect's enhanced session to work properly with Linux at all on Fedora 43; the fixes I've found online don't seem to work for me).
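The X11-forwarding setup mentioned above can live entirely in the client-side ssh config; a minimal sketch, where the host alias and address are made up:

```
# ~/.ssh/config — X11 forwarding to a Linux box ("devbox" is hypothetical)
Host devbox
    HostName 192.168.1.20
    User me
    # forward X11 so remote GUI apps render on the local X server
    ForwardX11 yes
    # untrusted forwarding is the safer default; some apps need Trusted instead
    ForwardX11Trusted no
```

With that in place, `ssh devbox` followed by launching any X client on the remote end just works, provided an X server is running locally (on macOS that means installing XQuartz first).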