I've got to say, I am continually amazed at how much Musk is allowed to get away with. I know he can get some things done and he is, apparently, a skilled manager, fundraiser and bs'er of epic proportions, but I have a hard time understanding how all this hasn't caught up with him yet.
Class actions in the Netherlands mostly favor lawyers.
It's run by the person mentioned in the article, and unsurprisingly the domain is Dutch, but it seems the same thing will apply in lots of countries if FSD rolls out there too, not just the Netherlands.
People don't talk about these cars driving themselves enough imho
I think this is a calculation to determine whether an upgrade from HW3 to HW4 actually solves the problem, or whether HW3 must instead be upgraded to HW5.
One upgrade is more economical than two, but I would be annoyed for sure as well.
The math doesn't work out. It should be €20,000,000 in FSD purchases, no?
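A quick sanity check of the figures, assuming every signatory claims the full €6,800 the collective claim seeks per owner (actual purchase prices varied by year and market, which may explain the lower total the article reports):

```python
# Rough check of the collective-claim numbers quoted in the article.
# Assumption: all signatories claim the full €6,800 per-owner amount;
# real FSD purchase prices varied by year and market.
signatories = 3_000
claim_per_owner_eur = 6_800

total_eur = signatories * claim_per_owner_eur
print(f"€{total_eur:,}")  # €20,400,000 -- roughly triple the ~€6.5M figure reported
```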
After paying the full cost, they're stuck on old software, despite the promise that the car already had the hardware required for it.
HW5 is unlikely to solve FSD.
You can use FSD with HW3 in other countries like Canada.
Also, the EU adopted laws restricting self-driving behavior, making FSD far less capable there. For example, the software cannot exert a lateral acceleration of more than 3 m/s². It must also cancel a lane change 5 seconds after the turn signal is engaged. Tesla gimped their self-driving features in the EU & Australia because of this.[1]
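A minimal sketch of what limits like these look like as software checks. This is purely illustrative: the constants are the figures quoted above, and the function is hypothetical, not anything from Tesla's or a regulator's codebase:

```python
# Illustrative checks for the EU-style limits quoted above.
# Hypothetical helper, not real vehicle code.
LATERAL_ACCEL_LIMIT = 3.0   # m/s^2, maximum lateral acceleration
LANE_CHANGE_TIMEOUT = 5.0   # seconds after the turn signal engages

def lane_change_allowed(lateral_accel: float, seconds_since_signal: float) -> bool:
    """Return True only if the maneuver stays within both limits."""
    if abs(lateral_accel) > LATERAL_ACCEL_LIMIT:
        return False   # steering input too aggressive
    if seconds_since_signal > LANE_CHANGE_TIMEOUT:
        return False   # past the window: the maneuver must be cancelled
    return True

print(lane_change_allowed(2.0, 3.0))  # True
print(lane_change_allowed(2.0, 6.0))  # False: past the 5-second window
```

The second check is what makes lane changes feel hesitant in practice: any maneuver that hasn't completed within the window gets aborted.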
It’s only the latest version of FSD (which only runs on HW4) that lacks these restrictions and has been approved for use in the Netherlands. Even then, it requires you to pay attention to the road, so it's not what he paid for.
1. https://electrek.co/2019/05/17/tesla-nerfs-autopilot-europe-...
If I can't go to sleep lying down on the seat as the sole occupant, it's not yet self-driving.
On the other hand, when I got FSD trials in the model 3 in the last year or so, it never managed to get more than ~a mile without me having to disengage.
What's worse is this is all going to end up happening again when HW5 comes out and all of the HW4 cars start getting a trimmed down version of the FSD software from HW5, like HW3 is currently receiving.
It’s because driving on the freeway isn’t FSD; it’s a better version of cruise control, and other companies offer similar capabilities. Within a city, the thing is a shitshow. It does random things all the time, and it’s almost a larger cognitive burden to constantly be on the lookout for it to make a mistake where I have to take over than to just drive the car myself. For me specifically, it’s simply unusable, because it fails to recognize curved streets and a couple of other irregularities just within blocks of where I live.
I thought the driver was supposed to keep their hands on the wheel in case something goes wrong.
That's why Tesla fans buy those weighted gizmos to fool the computer into thinking they're still holding the steering wheel.
Other brands have had self driving features for years now. Some even operate at a higher level of automation.
https://www.faistgroup.com/site/assets/files/1657/j3016-leve...
While FSD's manipulation of controls is impressive -- it is missing a very critical component that is required for self driving: the ability to guarantee whether or not it can make a safe decision. Tesla's FSD still offloads this task to the human driver. Once they can do this more than zero percent of the time, they will have achieved level 3.
I do not want to be the 'manager of my car'. That'd be a downgrade from being an actual driver.
Lane assist, auto stop-start, and cruise control are enough for me; they've mostly been available for decades and require a similar amount of attention.
FSD is a busted flush and I can't believe those who got conned by it aren't more vocal.
On a freeway it’s only kind of usable. It switches lanes far too aggressively and for no reason, to the point that it makes the ride uncomfortable.
What I really want is auto steer with lane switching when I signal, which for some reason I could never get working in any mode. It either doesn’t change lanes at all, or changes them arbitrarily of its own volition. And if I change lanes manually it turns off autosteer, which is too irritating to use in practice.
Tesla self driving, in any mode, is a bad product. And I say this as a Tesla fan.
It always had the feeling of being outside with your toddler by the pool. I can look away, but I have 50/50 odds of a dead toddler if I do it for too long.
Then, on the way home it drove me home on the wrong side of the street and I had to take over. Such a silly mistake.
Similar to what you said; from there on out, it was more trouble than it's worth because you can't let your guard down.
1. https://www.tesla.com/customer-stories/cross-country-trip-fu...
If you’re driving, your brain can automatically prioritize the importance of the things you see. But since a computer fails in different ways than a human, you lose all that automatic prioritization.
FSD is amazing. Any notion it takes more effort to use it than driving is made up.
A "self-driving" tesla is an adversary you need to supervise to make sure it doesn't take actions you wouldn't expect of a normal car.
As other posters have pointed out, it's like running an LLM with `--dangerously-skip-permissions`: I wouldn't `rm -rf /` my computer (or in the case of tesla, my life), but an AI might.
One such study is "Performance consequences of automation-induced 'complacency'" (Parasuraman, Molloy & Singh, 1993) https://www.pacdeff.com/pdfs/Automation%20Induced%20Complace...
Previous studies had found that a human and a computer performed markedly better than either a human alone or a computer alone - but in those studies failures were quite common, so they didn't give the humans time to get bored or distracted.
When researchers got test subjects to perform a simulated flying task, monitoring a system with 99%+ reliability, they found the humans were proportionally much worse at stepping in than they were on less reliable systems.
Swimming pool lifeguards will often change posts every 15-20 minutes and get a 10-15 minute break every hour, to keep things interesting enough that they can pay attention. Good luck getting drivers to do that.
I find it less cognitive load to drive it myself. It's easier to predict what other vehicles will do than what my own will. Boo.
I have sympathy with the challenges as I worked in the field.
Triggering psychosis is not difficult, and the LLM is easily capable of doing it. With a person, others soon get freaked out and are likely to summon help: "Johnny started acting crazy and I'm not sure what to do, please come." But the LLM isn't a person. Johnny needs to know more about the CIA's programme to cross-breed Venusians with Hollywood stars? Here's an itinerary with the address of a real hotel in LA and an entirely hallucinated CIA officer's schedule.
Next thing you know, Johnny is shot dead by officers responding to a maniac with a fire axe who broke into an LA hotel and was screaming about space aliens.
FWIW: My 2026 Hyundai's driver assistance is better than my old 2018 Tesla Model 3's Enhanced Autopilot.
And that was actual hands-free, while Teslas at the time required you to keep putting torque on the wheel to lie to the system.
Even then, my 2017 Hyundai did practically everything but steer. Get it on the highway, turn on ACC, and it would handle the traffic; you just had to keep it in the lane. It even handled all the stop-and-go traffic.
Funny, I was going to mention exactly that. I'm a private pilot with a modern autopilot and flying is exhausting. Partly because the piston engine is rattling your brain the entire time but also because you're on high alert the entire time. You're always making sure the autopilot is keeping the plane on the blue (or green) line and is being predictable. And my smartwatch shows my heart rate is usually more elevated on autopilot than not.
I heard the same thing in 2019: HW3 solved all the issues, it finally just works as advertised. And that was after HW2 was guaranteed to ship with all the hardware needed for FSD a decade ago, for real this time.
I'll probably wait for HW5; then you'll tell me it's really there. This time it won't even run people over, and it'll actually stop at stop signs more than just 98% of the time.
Personally I try and avoid systems that drive people in front of trains. https://www.youtube.com/watch?v=vMqTmOTtft4

The Dutch Tesla owner who launched a collective claim against Tesla over FSD on HW3 cars called Tesla to ask about the €6,400 he paid for “Full Self-Driving” in 2019. After 7 years of waiting, Tesla’s answer was to “just be patient.”
It’s an almost comically tone-deaf response that perfectly encapsulates Tesla’s approach to the HW3 problem — and it’s only going to fuel the growing legal pressure in Europe.
Mischa Sigtermans, the Dutch Model 3 owner who launched the HW3 collective claim site we reported on earlier this week, called Tesla today and recorded the entire conversation. He posted the details in a thread on X.
Sigtermans paid €6,400 for FSD when he bought one of the first Model 3s in the Netherlands in 2019. Last week, the Dutch vehicle authority RDW granted Tesla type approval for FSD Supervised — the first in the EU. But the approved build only runs on Tesla’s newer AI4 computer. HW3 cars like his get nothing.
So he called Tesla. His first question: when does FSD come to HW3 cars?
Tesla’s answer: “No information about when it comes, or if it comes at all.”
Not when. If.
Sigtermans then asked what exactly he paid for. Tesla told him he paid for “the full self-drive capability.” As he pointed out, that’s what’s on his 2019 invoice — “capability.” Not “supervised.” Not “lite.” The full capability.
When he brought up Musk’s admission that HW3 isn’t enough for unsupervised FSD, Tesla said it had “no information about this.” When he asked about the promised free hardware upgrade, Tesla said there was “no information within Europe.” When he asked how Tesla plans to handle all the Europeans who bought FSD on HW3, Tesla said: “We share whatever information is available at that moment.” The information available: none.
Sigtermans then told the agent about the 3,000 HW3 owners from 29 countries who signed up to his claim site — representing €6.5 million in FSD purchases. He asked to speak to a spokesperson about finding a solution. The agent put him on hold, checked with his manager, and came back with the final answer: “You just have to be patient.”
After Sigtermans hung up, Tesla immediately closed his case. He received an automated email: “Your question is closed” — with a link to book a test drive.
The full context here makes Tesla’s “be patient” response even more absurd. Here’s what HW3 owners have been told over the years:
In 2019, when Sigtermans and hundreds of thousands of other owners purchased FSD, Tesla sold it as a package that would enable full autonomy through software updates alone. The hardware was supposedly sufficient.
By August 2024, Tesla VP of AI Ashok Elluswamy acknowledged that HW3 runs a “relatively smaller model” than AI4 with workarounds. The gap between HW3 and HW4 was widening, not closing.
In January 2025, Elon Musk finally admitted what many had long suspected: Tesla would “need to replace all HW3 computers in vehicles where FSD was purchased.” On the Q4 2024 earnings call, he called the hardware replacement “painful and difficult” and said he was “kind of glad that not that many people bought the FSD package.”
Tesla even filed a patent describing a “math trick” to squeeze a modern FSD model onto HW3. The patent itself acknowledges that this workaround can render perception units “inoperable.”
Now, 15 months after Musk’s admission, Tesla still has no hardware retrofit program, no refund policy, and no concrete timeline. The company has vaguely promised a stripped-down “v14 Lite” for HW3 sometime in Q2 2026, but that’s a fundamentally different product than what was sold. It’s a diet version of a system that itself is still only Level 2 driver assistance — not the autonomous driving Tesla originally promised.
And when an owner who has waited since 2019 calls to ask about it, the answer is: be patient.
Sigtermans isn’t just venting on X. He launched hw3claim.nl, a site to bundle HW3 + FSD owners across the EU into a collective claim against Tesla, seeking €6,800 per owner. In one week, 3,000 owners from 29 countries signed up — representing over €6 million in FSD purchases.
The timing is significant. FSD launching in Europe was always going to be the moment the HW3 problem stopped being abstract and became a concrete, quantifiable harm. European owners can now see exactly what they’re missing — their neighbors with AI4 cars are getting FSD Supervised, while they get nothing despite paying thousands of euros for the same promise.
EU consumer protection law is considerably stronger than what Tesla faces in the US. Buyers have robust rights around conformity with advertised features, and countries like the Netherlands, Germany, and France have mature collective-redress frameworks.
This isn’t the first legal action either. In October 2025, thousands of Tesla owners joined a class-action lawsuit in Australia alleging Tesla misrepresented FSD capabilities. That action was directly triggered by Musk’s HW3 admission.
“Be patient” is an extraordinary thing to tell someone who paid you €6,400 seven years ago for a product you now admit you can’t deliver on their hardware.
We’ve been covering the HW3 saga for years, and this phone call perfectly captures the core problem: Tesla has no answer. Not a bad answer — no answer. The company hasn’t announced a retrofit program, hasn’t offered refunds, hasn’t set a timeline. All it can offer is the same thing it’s been offering since 2019: wait.
The difference now is that the waiting has an endpoint, and it’s not the one Tesla promised. FSD launched in Europe last week, and HW3 owners are locked out. The harm isn’t theoretical anymore — it’s their neighbor driving with FSD while they stare at the same “coming soon” message they’ve had for seven years.
Sigtermans’ collective claim is going to grow. EU consumer law is built for exactly this scenario: a company that sold a capability it cannot deliver. Tesla’s own CEO admitted HW3 can’t support self-driving. Tesla’s own patent describes workarounds that can render the system “inoperable.” That’s not a he-said-she-said — that’s Tesla’s own paper trail.
I’m increasingly convinced this will end up in court. And when it does, “be patient” is going to look very bad in front of a European judge.
1. https://www.thedrive.com/opinion/40604/five-things-my-roomba...
2. https://www.thedrive.com/news/a-tesla-actually-drove-itself-...
Totally fully self driving even though you need not one, not two, but three autonomous driving experts with you. And be sure to have a second car with you when your first autonomous vehicle strands you. Sure sounds like a reliable system ready for the masses to use on public roadways!
Have Tesla and its fanboys overstated FSD’s capabilities? Absolutely. And I’m not saying that FSD is currently good enough that one should expect thousands of miles between interventions. I’m trying to convince someone that it has been done. The reason I’m trying to do this is that the same cannot be said for any other self-driving technology available in a consumer vehicle today, so claiming that FSD is no better than competing offerings is not accurate. FSD overhyped? Sure. Late? Extremely. Fraudulent, bordering on criminal? I could see that. But it’s still in a league of its own in terms of what it can do.
I’m not making any claims about FSD’s safety or how ready it is for mass usage on public roads. I am trying to figure out what information would convince you that someone has used FSD for thousands of miles without intervening. Does this count or not? If not, why?