This argument is inherently anti-progress. It's like saying humans had been using sextants to navigate for hundreds of years, so why GPS?
A more sensible question is, why not?
The article starts out without saying it but my takeaway at the end is "Not $200" and "Not in the near future"?
I always thought the argument that humans are adequate drivers and hence only cameras was not great. Why not actually be better than humans at sensing and driving?
>"This misleading article contains numerous factual errors regarding automotive lidar. Here are the most glaring:
There are multiple manufacturers, including Hesai, that use mechanical means for at least one scan axis and are already sold for a fraction of the "$10k - $20k" price noted by the author. Luminar itself built this class of scanners before going bankrupt.
Per Microvision's own website, the Movia-S does not use a phased array and also does not have a range anywhere near 200m.
Velodyne and Luminar do not even exist as companies anymore. Both have gone bankrupt and been acquired by competitors."
Is there no way a sensor can tell whether a signal came from its own emitter?
Guessing any signal should be treated as untrusted until verified, but I suspect coders won't be doing that unless it's easy.
Waymo benefits from Google's unparalleled geospatial data. Waymo also has a support architecture that doesn't depend on real time remote operation, which can't be implemented reliably in almost all cases. You can't be following your supposedly unsupervised cars with a supervisor in a chase car. You can't even be driving remotely. Your driver software has to be able to drive independently in all cases, even those where it needs to ask a human how to proceed.
The difference between level two and level three driver assist and level four autonomy is like the difference between suborbital flight and putting a payload in orbit. What looks like a next logical step actually takes 10X or more effort, scale, and testing.
I personally find it convincing that the problem with self-driving is mostly that the models aren't intelligent enough, and that adding LiDAR wouldn't be enough to achieve the reliability required. But I don't know, I don't really work in that field so maybe engineers who have more experience with self driving might say otherwise.
Cepton Technologies offers Nova [0], Nova-Ultra [1] sensors both at a sub-$100 price point [2]. These feature a 120°(H) x 90°(V) FOV at 50m, with 2.7M points per second sampling.
Velodyne introduced the Velabit in 2021, for $100, boasting a 100m range and a 60-degree horizontal x 10-degree vertical FoV.
The article claims that:
> What distinguishes current claims is the explicit focus on sub-$200 pricing tied to production volume rather than future prototypes or limited pilot runs.
which is simply not true. Cepton (currently offering) and Velodyne (acquired by Ouster in 2023) have done this for years.
[0]: https://www.cepton.com/products/nova
[1]: https://www.cepton.com/products/nova-ultra
[2]: https://www.cepton.com/announcements/ceptons-nova-lidar-named-as-ces-2022-innovation-awards-honoree
[3]: https://lidarmag.com/2020/01/07/velodyne-lidar-introduces-velabit/
https://www.fccidlookup.com/report/tesla-new-millimeter-wave...
Of course, ambitious pricing like this is all about economies of scale - sensors that are used in production vehicles are ordered by the million, and that lowers the costs massively. When the huge orders didn't materialise, the economies of scale and low prices didn't materialise either.
[1] https://web.archive.org/web/20161013165833/http://content.us...
I have been watching the sensor space for a while. Cheap LIDAR units could open up weird DIY uses, not just cars. Also, regulatory and mapping integration will matter. I tried to work with public datasets and it's messy. The hardware is only one part! But it's exciting to see multiple vendors in the space. Competition might push vendors to refine the software stack as well as the hardware. However, I'm keeping an eye on how these systems handle edge cases in bad weather. I don't think we have seen enough data yet...
> phased-array
I'm not well versed in RF physics. I had the impression that light-wave coherency in lasers had to be created at a single source (or amplified as it passes by). This is the first time I've heard of phased-array lasers.
Can someone knowledgeable chime in on this?
> pricing below US $200. That’s less than half of typical prices now, and it’s not even the full extent of the company’s ambition.
This means there are sensors available for like $500 or more. At 4 per car, this is still just $2000, which is a very reasonable cost add even for a midrange car.
And with price comparisons like this, I'm sure Chinese competitors aren't factored in; the Chinese surely have cheaper options.
So affordable lidar is not a limitation. Despite that, self-driving doesn't really exist outside of Waymo, which leads people to assume that lidar is its killer advantage. But with other cars getting lidar, I think that might not turn out to be the deciding factor.
Glad to see someone lowering the cost of this technology, and hope to see lots of engineers using this tech as a result.
We might even see a boom in LIDAR tech as a result
Is there any actual technical reason why automotive lidar should be expensive? Just combine visual processing with a single-point sampler that feeds it points of interest, and an accurate model of the surroundings can be built.
It looks like these sensors have just enough range to be effective for lidar terrain scanning. I would buy a Movia S right now just to try it out.
If the pros of having a camera are monumental, then couldn't the video and lidar be combined to be even greater?
(Insert old man rant “Why are everyone’s headlights so gosh darn bright these days?!”)
Note: I have not had the pleasure of riding in one yet, but from what my friend in SJ says, it’s very convenient and confidence-inspiring.
Biggest risk is that a beam steering element stops while the emitters are running. Basically impossible with a phased array emitter like the article discusses.
And you'd probably have to be staring into the laser at close range while it was doing that.
The laser beams usually aren't tiny points like your laser pointer. Several centimeters across is more typical, especially at typical road distances. Your pupil is very small in comparison.
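To make the pupil-versus-beam point concrete: for a beam wider than the pupil, the fraction of emitted power that can enter the eye scales with the area ratio. A rough sketch treating the beam as a uniform disk (real beams have non-uniform profiles, and all numbers here are illustrative, not from any actual hazard calculation):

```python
def pupil_power_fraction(beam_diameter_m, pupil_diameter_m=0.007):
    """Fraction of a uniform beam's power entering a dark-adapted ~7 mm pupil."""
    if pupil_diameter_m >= beam_diameter_m:
        return 1.0  # pupil captures the whole beam
    # Area ratio of two disks reduces to the squared diameter ratio.
    return (pupil_diameter_m / beam_diameter_m) ** 2

# A 5 cm beam at road distance: only ~2% of the power can enter the eye.
frac = pupil_power_fraction(0.05)   # → ~0.0196
```

This is only the geometric part; a real eye-safety analysis (IEC 60825 class limits) also accounts for pulse energy, wavelength, and exposure duration.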
The optical hazard calculations are a very early part of the design of a LIDAR system, and all of this does get considered. Or should anyway.
Biggest risks are for people involved in R&D, where beams may be static and very close to personnel.
Automotive LIDARs are like 128x64 px for production models, or 1920x1080 px for experimental models with GbE and/or industry equivalents of HDMI as outputs. Totally different technologies.
- Setting up infrastructure and support for consumers is expensive and hard to do well, especially if that's not your main industry.
- Some products are only economical if mass produced, and that requires large, guaranteed buyers.
Consider an exhaust condensation cloud coming from a vehicle's tail pipe -- it could be opaque to a camera/computer-vision system. Can you model your way out of that? Or is it also useful to do sensor fusion of vision data with radar data (the cloud is transparent to radar) and others like lidar, etc.? A multi-modal sensor feed is going to simplify the model, which in the end translates into lower compute load.
Even if it’s an intelligence problem, it’s possible that machine intelligence will not get to the point where it can resolve anytime soon, whereas more sensors might circumvent the issue completely. It’s like with Musk’s big claim (that humans use camera only to drive); the question is not if a good enough brain will be able to drive vision-only, but if Tesla can make that brain.
I am skeptical that Tesla has this solved, but I'm interested in seeing how it goes as they move to expand their robotaxi service.
Sensors or intelligence, at the end of the day it’s an engineering problem which doesn’t require pure solutions. Sometimes sensors break and cameras get covered in mud.
The problem is maintaining an acceptable level of quality at the lowest possible price, and at some point you spend more money on clever algorithms and researchers than a lidar.
Basically they're saying "we can catch up to China by 2028/2029" ||so please subsidize us||
Even mid-range sensors used in ADAS systems only cost $600-750. The long-range stuff that's needed for trucking or robotaxis is $1,500–6,000
Of course MicroVision is only claiming their LIDAR to be suitable for advanced driver assist, but ADAS encompasses a wide array of capabilities: basically everything between cruise control and robotaxis. So there's no definition of how much LIDAR you need to do the job, just however much you feel like. Tesla feels like none at all.
In practice, this can be done with phase change materials (heat/cool materials to change their index), or micro ring resonators (to divert light from one wave guide to another).
The beam then self-interferes, and the resulting interference pattern (constructive/destructive depending on the direction) is used to modulate the beam orientation.
You are right that a single source is needed, though I imagine that you can also use a laser source and shine it at another "pumped" material to have it emit more coherent light.
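The steering described above is the standard phased-array picture: a linear phase ramp across coherent emitters moves the constructive-interference peak to a chosen angle. A minimal numerical sketch (all values are illustrative, not from any specific product):

```python
import numpy as np

wavelength = 905e-9            # 905 nm, the wavelength cited in the article
d = wavelength / 2             # half-wavelength element spacing (no grating lobes)
N = 32                         # number of coherent emitters
k = 2 * np.pi / wavelength
theta0 = np.deg2rad(20.0)      # desired steering angle

n = np.arange(N)
phases = -n * k * d * np.sin(theta0)   # per-element phase offsets (the "ramp")

# Coherent sum of unit-amplitude emitters at each observation angle:
theta = np.deg2rad(np.linspace(-90.0, 90.0, 3601))
field = np.exp(1j * (np.outer(np.sin(theta), n * k * d) + phases))
intensity = np.abs(field.sum(axis=1))

peak_angle = np.rad2deg(theta[np.argmax(intensity)])   # peak lands near 20°
```

At the steered angle the per-element phases cancel exactly, so the amplitudes add to N; everywhere else they partially cancel.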
I've been thinking about possible use-cases for this technology besides LIDAR,. Point to point laser communication could be an interesting application: satellite-to-satellite communication, or drone-to-drone in high-EMI settings (battlefield with jammers). This would make mounting laser designators on small drones a lot easier. Here you go, free startup ideas ;)
There might be something cute you can do with interference patterns but no idea about that. We do sort of similar things with astronomic observations.
NB: just my layman's understanding
For lidar you transmit a pulse from a single source and receive its reflection at multiple points. Mentioning phased array with lidar almost always means receiving.
Please don't post insinuations about astroturfing, shilling, brigading, foreign agents, and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email hn@ycombinator.com and we'll look at the data.
Also, range is probably a factor. In a living room, you probably need something like 20m max. Your car should "see" farther.
And here's one of Elon's mentions (he also has talked about it quite a bit in various spots).
https://xcancel.com/elonmusk/status/1959831831668228450?s=20
Edit: My personal view is that LiDAR and other sensors are extremely useful, but I worked on aircraft, not cars.
Human eyes do not have distance information, either, but derive it well enough from spatial (by ‘comparing’ inputs from 2 eyes) or temporal parallax (by ‘comparing’ inputs from one eye at different points in time) to drive cars.
One can also argue that detecting absolute distance isn’t necessary to drive a car. Time to-contact may be more useful. Even only detecting “change in bearing” can be sufficient to avoid collision (https://eoceanic.com/sailing/tips/27/179/how_to_tell_if_you_...)
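The parallax argument has a standard formula behind it: for two horizontally separated viewpoints, depth is focal length times baseline divided by disparity, which also shows why depth estimates degrade at range. A sketch with illustrative numbers (roughly human-eye baseline):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from binocular disparity: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# 1000 px focal length, 6.5 cm baseline, 5 px disparity:
z = stereo_depth(1000.0, 0.065, 5.0)   # → 13.0 m

# Halving the disparity (a farther object) doubles the depth estimate,
# so a fixed ±0.5 px matching error costs far more accuracy at long range.
far = stereo_depth(1000.0, 0.065, 2.5)  # → 26.0 m
```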
Having said that, LiDAR works better than vision in mild fog, and if it’s possible to add a decent absolute distance sensor for little extra cost, why wouldn’t you?
How much of Waymo's training data is based on LIDAR mapping versus satellite/aerial/street view imagery? Before Waymo deploys in a new city, it deploys a huge fleet of cars that spend months of driving completely supervised, presumably to construct a detailed LIDAR map of the city. The fact that this needs to happen suggests Google's geospatial data moat is not as wide as it seems.
If LIDAR becomes cheap, you could imagine other car manufacturers would add it cars, initially and ostensibly to help with L2 driver aids, but with the ulterior motive of making a continuously updated map of the roads. If LIDAR were cheap enough that it could be added to every new Toyota or Ford as an afterthought, it would generate a hell of a lot more continuous mapping data than Waymo will ever have.
But suborbital flight and putting a payload in orbit differ much less than you might think.
The delta-v is not that significantly different. The scale is almost the same, and with a little more power and a second stage, your payload is hurtling around the Earth instead of falling like a ballistic missile, which is what their suborbital predecessors were.
That's true, and they have a huge headstart, but I wonder if all these cubesat companies can bring the price down on data enough that others will be able to compete.
And from there it's easy to think: couldn't the car also detect white lines and stay within them? It doesn't have to be perfect; it can be cruise control++. If it errs a little, I can save it. But otherwise, this is a function I'd love to use if it was available, for a sub $1000 price point.
Where? How? I'm only seeing the Nova on ebay for between $4000 and $5000.
Interestingly, there are already some comparatively cheap LIDAR units on the market.
In the automotive market, ideally you need a 200m+ range (or whatever the stopping distance of your vehicle is) and you need to operate in bright direct sunlight (good luck making an eye-safe laser that doesn't get washed out by the sun) and you need more than one scanning plane (for when the car goes over bumps).
On the other hand, for indoor robotics where a 10m range is enough and there's much less direct sunlight? Your local robotics stockist probably already has something <$400
https://www.fleetowner.com/technology/article/55316670
The ~$75k per sensor in 2015 refers to the long-range sensors. 99% of production is from 4 Chinese companies: Hesai, RoboSense, Huawei, and Seyond.
The cheap RADAR devices you're talking about usually only output range and velocity, sometimes for a handful of rather large azimuth slices. That doesn't compete with LIDAR at all.
As for the range, again, pretty powerful lasers are sold for sub-10-USD prices at retail. I am sure there must be higher calibration and precision requirements as the distance increases, but are they really orders of magnitude higher? A 120-meter laser measurer with 1 cm accuracy is 15 euros on Temu, and that thing is a handheld device with an LCD screen and a battery. How much distance do you actually need?
Single human eyes do resolve depth. Not as well as binocular vision, but you don't lose all depth perception if you lose an eye.
Also, regulators gather statistics, and if cars with something do better, they will mandate it.
It may just be faster to make lidar cheap. And lidar can do things humans can't.
Computer vision does not work exactly like human vision, closely equating the two has tended to work out poorly in extreme circumstances.
High performance fully automated driving that relies solely on vision is a losing bet.
Also, military sensor use shows the best answer is to have as many different types of sensors as possible and then do sensor fusion. So machine vision, lidar, radar, etc.
That way you pick up things that are missed by one or more sensor types, catches problems and errors from any of them, and end up with the most accurate ‘view’ of the world - even better than a normal human would.
It’s what Waymo is doing, and they also unsurprisingly, have the best self driving right now.
https://electrek.co/2026/02/17/tesla-robotaxi-adds-5-more-cr...
- cost (no longer a problem)
- too much code needed and it bloats the data pipelines. Does anyone have any actual evidence of this being the case? Like yes, code would be needed, but why is that innately a bad thing? Bloated data pipelines feels like another hand-wave when I think if you do it right it’s fine. As proven by Waymo.
Really curious if any Tesla engineers feel like this is still the best way forward or if it’s just a matter of having to listen to the big guy musk.
I’ve always felt that relying on vision only would be a detriment because even humans with good vision get into circumstances where they get hurt because of temporary vision hindrances. Think heavy snow, heavy rain, heavy fog, even just when you crest a hill at a certain time of day and the sun flashes you
They don’t focus on safety or effectiveness except to say that vision should be ‘sufficient’. Which is damning with faint praise imho.
If that link was to try and argue that the removal of sensors makes perfect sense i have to point out that anyone that reads that would likely have their negative viewpoint hardened. It was done to reduce cost (back when the sensors were 1000’s) and out of a ridiculous desire by Musk for minimalism. It’s the same desire that removed the indicator stalk i might add.
The reasoning was simply that LIDAR was (and incorrectly predicted to always be) significantly more expensive than cameras, and hypothetically that should be fine because, well, humans drive with only two eyes.
Musk miscalculated on 1) cost reduction in LIDAR and 2) how incredible the human brain is compared to computers.
Having similar sensors certainly doesn't guarantee your accidents look the same, so I don't think your logic is even internally sound.
It's not fair to say that vision-based models will "make the same mistakes people do," as >99% of the mistakes people make would be avoidable if those issues were addressed. And a computer can easily address all those issues.
Neither do cameras, or eyeballs.
“Just buy FSD” isn’t a reasonable answer to a problem literally no other automaker suffers from.
MicroVision, a solid-state sensor technology company located in Redmond, Wash., says it has designed a solid-state automotive lidar sensor intended to reach production pricing below US $200. That’s less than half of typical prices now, and it’s not even the full extent of the company’s ambition: the company says its longer-term goal is $100 per unit. The claim, if realized, would place lidar within reach of advanced driver-assistance systems (ADAS) rather than limiting it to high-end autonomous-vehicle programs. Lidar’s limited market penetration comes down to one issue: cost.
Comparable mechanical lidars from multiple suppliers now sell in the $10,000 to $20,000 range. That roughly tenfold drop, from about $80,000, helps explain why suppliers are now hopeful that another steep price reduction is on the horizon.
For solid-state devices, “it is feasible to bring the cost down even more when manufacturing at high volume,” says Hayder Radha, a professor of electrical and computer engineering at Michigan State University and director of the school’s Connected & Autonomous Networked Vehicles for Active Safety program. With demand expanding beyond fully autonomous vehicles into driver-assistance applications, “one order or even two orders of magnitude reduction in cost are feasible.”
“We are focused on delivering automotive-grade lidar that can actually be deployed at scale,” says MicroVision CEO Glen DeVos. “That means designing for cost, manufacturability, and integration from the start—not treating price as an afterthought.”
Tesla CEO Elon Musk famously dismissed lidar in 2019 as “a fool’s errand,” arguing that cameras and radar alone were sufficient for automated driving. A credible path to sub-$200 pricing would fundamentally alter the calculus of autonomous-car design by lowering the cost of adding precise three-dimensional sensing to mainstream vehicles. The shift reflects a broader industry trend toward solid-state lidar designs optimized for low-cost, high-volume manufacturing rather than maximum range or resolution.
Before those economics can be evaluated, however, it’s important to understand what MicroVision is proposing to build.
The company’s Movia S is a solid-state lidar. Mounted at the corners of a vehicle, the sensor sends out 905-nanometer-wavelength laser pulses and measures how long it takes for light reflected from the surfaces of nearby objects to return. The arrangement of the beam emitters and receivers provides a fixed field of view designed for 180-degree horizontal coverage rather than the full 360-degree scanning typical of traditional mechanical units. The company says the unit can detect objects at distances of up to roughly 200 meters under favorable weather conditions—compared with the roughly 300-meter radius scanned by mechanical systems—and supports frame rates suitable for real-time perception in driver-assistance systems. Earlier mechanical lidars used spinning components to steer their beams, but the Movia S is a phased-array system: it controls the amplitude and phase of the signals across an array of antenna elements to steer the beam. The unit is designed to meet automotive requirements for vibration tolerance, temperature range, and environmental sealing.
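The time-of-flight measurement described above reduces to one formula: range is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s):
    """Range from a lidar pulse's round-trip time: d = c * t / 2."""
    return C * round_trip_s / 2.0

# A target at the claimed 200 m maximum returns light in about 1.33 microseconds:
t = 2.0 * 200.0 / C
d = tof_distance_m(t)   # → 200.0
```

The microsecond-scale round trips are why timing precision, not raw laser power, dominates range accuracy: 1 cm of range corresponds to roughly 67 picoseconds of round-trip time.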
MicroVision’s pricing targets might sound aggressive, but they are not without precedent. The lidar industry has already experienced one major cost reset over the past decade.
“Automakers are not buying a single sensor in isolation... They are designing a perception system, and cost only matters if the system as a whole is viable.” –Glen DeVos, MicroVision
Around 2016 and 2017, mechanical lidar systems used in early autonomous driving research often sold for close to $100,000. Those units relied on spinning assemblies to sweep laser beams across a full 360 degrees, which made them expensive to build and difficult to ruggedize for consumer vehicles.
“Back then, a 64-beam Velodyne lidar cost around $80,000,” says Radha.
Lower cost, however, does not come for free. The same design choices that enable solid-state lidar to scale also introduce new constraints.
“Unlike mechanical lidars, which provide full 360-degree coverage, solid-state lidars tend to have a much smaller field of view,” Radha says. Many cover 180 degrees or less.
That limitation shifts the burden from the sensor to the system. Automakers will need to deploy three or four solid-state lidars around a vehicle to achieve full coverage. Even so, Radha notes, the total cost can still undercut that of a single mechanical unit.
What changes is integration. Multiple sensors must be aligned, calibrated, and synchronized so their data can be fused accurately. The engineering is manageable, but it adds complexity that price targets alone do not capture.
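The alignment step amounts to applying each sensor's extrinsic calibration: every point cloud is rotated and translated from the sensor's mounting pose into a common vehicle frame before fusion. A minimal sketch, yaw-only for brevity (a real calibration uses a full 6-DOF pose; the mounting numbers are hypothetical):

```python
import numpy as np

def to_vehicle_frame(points, yaw_rad, mount_xyz):
    """Rotate sensor-frame points by the sensor's mounting yaw, then translate.

    points: (N, 3) array in the sensor frame. Yaw-only rotation is a
    simplification of the full rotation + translation extrinsic.
    """
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return points @ R.T + np.asarray(mount_xyz)

# Hypothetical corner sensor: mounted front-left, rotated 90° to the left.
pts = np.array([[10.0, 0.0, 0.0]])          # a point 10 m ahead of the sensor
fused = to_vehicle_frame(pts, np.pi / 2, [2.0, 1.0, 0.5])
# Lands at [2.0, 11.0, 0.5] in the vehicle frame.
```

Once every sensor's output lives in the same frame (and the same timestamp), the fused cloud can be handed to downstream perception as if it came from one sensor.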
DeVos says MicroVision’s design choices reflect that reality. “Automakers are not buying a single sensor in isolation,” he says. “They are designing a perception system, and cost only matters if the system as a whole is viable.”
Those system-level tradeoffs help explain where low-cost lidar is most likely to appear first.
Most advanced driver assistance systems today rely on cameras and radar, which are significantly cheaper than lidar. Cameras provide dense visual information, while radar offers reliable range and velocity data, particularly in poor weather. Radha estimates that lidar remains roughly an order of magnitude more expensive than automotive radar.
But at prices in the $100 to $200 range, that gap narrows enough to change design decisions.
“At that point, lidar becomes appealing because of its superior capability in precise 3D detection and tracking,” Radha says.
Rather than replacing existing sensors, lower-cost lidar would likely augment them, adding redundancy and improving performance in complex environments that are challenging for electronic perception systems. That incremental improvement aligns more closely with how ADAS features are deployed today than with the leap to full vehicle autonomy.
MicroVision is not alone in pursuing solid-state lidar; several suppliers, including Chinese firms Hesai and RoboSense as well as Luminar and Velodyne, have announced long-term cost targets below $500. What distinguishes current claims is the explicit focus on sub-$200 pricing tied to production volume rather than future prototypes or limited pilot runs.
Some competitors continue to prioritize long-range performance for autonomous vehicles, which pushes cost upward. Others have avoided aggressive pricing claims until they secure firm production commitments from automakers.
That caution reflects a structural challenge: Reaching consumer-level pricing requires large, predictable demand. Without it, few suppliers can justify the manufacturing investments needed to achieve true economies of scale.
Even if low-cost lidar becomes manufacturable, another question remains: How should its performance be judged?
From a systems-engineering perspective, Radha says cost milestones often overshadow safety metrics.
“The key objective of ADAS and autonomous systems is improving safety,” he says. Yet there is no universally adopted metric that directly expresses safety gains from a given sensor configuration.
Researchers instead rely on perception benchmarks such as mean Average Precision, or mAP, which measures how accurately a system detects and tracks objects in its environment. Including such metrics alongside cost targets, says Radha, would clarify what performance is preserved or sacrificed as prices fall.
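mAP is the mean, over object classes (and often over IoU thresholds), of per-class average precision: the area under the precision-recall curve of confidence-ranked detections. A simplified per-class sketch, omitting the IoU-based matching of detections to ground truth:

```python
import numpy as np

def average_precision(scores, is_true_positive, num_ground_truth):
    """AP for one class: area under the precision-recall curve.

    scores: confidence per detection; is_true_positive: whether each detection
    matched a ground-truth object (normally decided by an IoU threshold).
    """
    order = np.argsort(scores)[::-1]              # rank by descending confidence
    tp = np.asarray(is_true_positive, float)[order]
    cum_tp = np.cumsum(tp)
    precision = cum_tp / np.arange(1, len(tp) + 1)
    # Sum precision at each rank where recall increases:
    return float(np.sum(precision * tp) / num_ground_truth)

# Two ground-truth objects; three detections, the middle one a false positive.
ap = average_precision([0.9, 0.8, 0.7], [True, False, True], num_ground_truth=2)
# Precisions at the two true positives are 1/1 and 2/3, so AP = (1 + 2/3)/2 ≈ 0.833
```

Reporting a number like this alongside a price target is what Radha is asking for: it makes visible how much detection quality a cheaper sensor gives up.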
IEEE Spectrum has covered lidar extensively, often focusing on technical advances in scanning, range, and resolution. What distinguishes the current moment is the renewed focus on economics rather than raw capability.
If solid-state lidar can reliably reach sub-$200 pricing, it will not invalidate Elon Musk’s skepticism—but it will weaken one of its strongest foundations. When cost stops being the dominant objection, automakers will have to decide whether leaving lidar out is a technical judgment or a strategic one.
That decision, more than any single price claim, may determine whether lidar finally becomes a routine component of vehicle safety systems.
You can solve this by adding an emitter next to the camera that does something useful, be it beaconing lights, noise patterns, or phase-synced laser pulses. And those "active cameras" are what everyone calls LIDARs.
"Necessary"? Seems like a straw man, don't you think? I strive to argue against the strongest reasonable claim someone is making.
Lots of reasonable people suggest LIDAR is helpful to fill in gaps when vision is compromised, degraded, or less capable.
People running businesses, of course, will make economic trade-offs. That's fine. But don't confuse, say, Elon's economic tradeoff with the full explanation of reality which must include an awareness that different sensors have different strengths in different contexts.
So, when one thinks about what sensor mix is best for a given application, one would be wise to ask (and answer) such questions as:
- What is the quality bar?
- What sensors are available?
- How well do various combinations of sensors work across the range of conditions that matter for the quality bar?
- WRT "quality bar": who gets to decide "what matters"? The company making the cars? The people that drive them? regulators that care about public safety. The answer: it is a complex combination.
It is time to dismiss any claim (or implication) that "technology good, regulation bad". That might be the dumbest excuse for a philosophy I've ever heard. It is the modern-day analogue of "Brawndo's got what plants crave." Smart people won't make this argument outright, but unfortunately, their claims sometimes reduce to this level of absurdity. Neither innovation nor regulation are inherently good nor bad. There are deeper principles in play.
Yes, some individuals would use their self-proclaimed freedom to e.g. drive without seatbelts at 100 mph at night with headlights off. An extreme example, but it is the logical extension of pure individualism run amok. Regulators and anyone who cares about public safety will draw a line somewhere and say "No. Individual stupidity has a limit." Even those same people would eventually come to their senses after they kill someone, but by then it is too late.
Not entirely true. From their recent "road trips" last year, the trend is they just deploy fewer than 10 cars in a city for a few weeks (3-4 weeks from what I recall) for mapping and validating. Then they come back after a few months to set up infrastructure for ride hailing (depot, charging, maintenance, etc.) and start service.
The only reason not to have more sensors of different types is cost (equipment and processing costs). Those costs are coming down fast.
Even Tesla used to have radar and ultrasonic in their cars until relatively recently. And they use lidar (from Luminar) in their mapping fleet.
Note that humans do not rely strictly on our eyes as cameras to measure distances. There is a huge amount of inference about the world, based on our internal world models, that goes into vision. For example, if you put us in a false-perspective or otherwise highly artificial environment, our visual acuity goes down significantly; conversely, people with a single eye (so no parallax-based measurement ability) still have quite decent depth perception compared to what you'd naively expect. Not to mention, our eyes are kept very clean, and maintain their alignment to a very high degree of precision.
People on here used to buy servers themselves (very few of us still do), most now rent via cloud.
Why should transportation be different?
The drive was delightful and felt really safe. It handled the SF terrain, traffic and mixed traffic like trams very well.
I wouldn't trust a self-driving Tesla (or any camera-only system) though!
They might not use them for autopilot, but maybe for some emergency braking stuff, when everything else failed.
In all seriousness though, Tesla is producing Cybercabs now which are a tenth of the price of Waymo's and can drive autonomously anywhere in the world. I think we can see where this is going. (Hint: not well for Waymo)
Also the article is speculative 'MicroVision says its sensor could one day break the $100 barrier'. One day...
I mean, it doesn't. If you actually look at it, comma.ai proves that level two doesn't require lidar. That's not the same as full-speed safe autonomy.
Whilst it is possible to drive vision-only (assuming the right array of cameras, i.e. not the way Tesla has done it), lidar gives you a low-latency source of depth that can correct vision mistakes. It's also much less energy-intensive to work out if an object is dangerous and on a collision course.
To do that in vision, you need to work out what the object is (i.e., is it a shadow?), and then you have to triangulate it. That requires continuous camera calibration, and isn't all that easy. If you have a depth "prior" (yes it's real, yes it's large, and yes it's going to collide), it's much simpler to use vision to work out what to do.
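The "depth prior makes it simple" point can be made concrete: with a direct range and closing-rate measurement, which lidar provides nearly for free, the collision-urgency check is a one-line division, with no classification or triangulation required first. A minimal sketch:

```python
def time_to_collision_s(range_m, closing_speed_mps):
    """Time to collision from a measured range and closing rate.

    A vision-only pipeline must first classify the object (shadow vs.
    obstacle) and triangulate its distance before it can compute this.
    """
    if closing_speed_mps <= 0:
        return float("inf")  # not closing; no collision course
    return range_m / closing_speed_mps

# An object 50 m ahead, closing at 20 m/s, gives 2.5 s to react:
ttc = time_to_collision_s(50.0, 20.0)
```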
I have no proof of course and it might be coincidence, or just difference of mindset between US citizens and Europe citizens. It happened a few times already and to me looks sus.
But if they actually read and not just ctrl+f <company name>, then of course not writing the company name, but hinting at it in an obvious way is no more helpful either.
This conversational disconnect is as old as the hills:
1. Person 1 asks "what's wrong" (if it ain't broke don't fix it)
2. Person 2 wants to make something better
My meta-goal here on HN (and many places where people converse) is for people to step back and recognize the conversational context and not fall into the predictable patterns that prevent us from making sense of the world as best as we can.
It's frustrating to still see it repeated over a decade later. It was always bullshit. It was always a lie.
They might have flipped a switch after that, causing this.
This is a difficult problem to solve, and perhaps a pragmatic approach was/is to make your life as simple as possible to help get to a fully working solution, even if it's more expensive; then you can reduce cost and optimise.
It's also recently gotten much worse at lane departure sensing, often confused by snow or slightly faded road markers. Not pleasant to have the alarms go off while calmly and safely driving.
Not sure if the ld06 is a scanner like this or if it's just a line (like you'd use for a cheaper robot vac).
Later, improved units based on the same principle became ubiquitous in Chinese robot vacuums [2]. Such LIDARs, and similar-looking more conventional time-of-flight units, are sold for anywhere between $20 and $200, depending on the details of the design.
[1] https://scholar.google.com/scholar?q=%22A+Low-Cost+Laser+Dis... [2] https://github.com/kaiaai/awesome-2d-lidars/blob/main/README...
Thanks! What a headache
No no it's the cabal...
Several companies, most notably Tesla, have done this well enough to drive in all manner of traffic. I'm not going to comment about if lidar is strictly needed or not to achieve better-than-human safety, that's yet to be proven one way or another by anyone. The point is that cameras + local inference can do a pretty good job at distance estimation
I would argue that yes, we do use vision but we get that "lidar depth" from our stereo vision. And that used to be why I thought cameras weren't enough.
But then look at all the work with gaussian splatting (where you can take multiple 2d samples and build a 3d world out of it). So you could probably get 80% there with just that.
The ethos of many Musk companies (you'll hear this from many engineers that work there) is simplify, simplify, simplify. If something isn't needed, take it out. Question everything that might be needed.
To me, LIDAR is just one of those things in that general pattern of "if it isn't absolutely needed, take it out" – and the fact that FSD works so well without it proves that it isn't required. It's probably a nice to have, but maybe not required.
But I think costs were just part of the reason why Elon decided against Lidar. Apparently, they interfere with each other once the market saturates and you have many such cars on the same streets at the same time. Haven't heard yet how the Lidar proponents are planning to address that.
I assume Musk, et al are acting in best faith in trying to find the right compromises.
It is pretty incredible but people will (rightly so?) hold automated drivers to an ultra high standard. If automated driving systems cause accidents at anywhere near the human rate, it'll be outlawed pretty quickly.
And, less excusable, ignorant of how incredible human eyes are compared to small-sensor cameras. In particular, high dynamic range in low light, with fast motion. Every photographer knows this.
Given that Musk has a history of driving lower costs, it's unlikely he overestimated the long-term cost floor. He just thought we were close to self-driving in 2014.
Another factor is Andrej Karpathy, who was the primary architect for the vision-only approach. Musk wanted fewer parts, and Karpathy believed he could deliver that. Karpathy is still an advocate of vision-only.
He wanted (needed?) to get on the hype train for self driving to pump up the stock price, knew that at the time there was zero chance they could sell it at the price point lidar required at the time - or even effective other sensors (like radar) - and sold it anyway at the price point that people would buy it at, even though it was not plausibly going to ever work at the level that was being promised.
There is a word for that. But I’m sure there are many lawyers that will say it was ‘mere fluffery’ or the like. And I’m sure he’ll get away with it, because more than enough people are complicit in the mess.
Miscalculation assumes there was a mistake somewhere, but near as I can tell, it is playing out as any reasonable person expected it too, given what was known at the time.
Then again, it's good that we have self-driving companies with lidar and without — we will find out which approach wins.
I've been in zero-road-speed whiteout conditions several times. The only move to make is to the side of the road without getting stuck, and turning on your flashers.
Low-light cameras would not have worked. Sonar would not have worked. Infrared would not have worked.
It turns out it’s the sensors that are easily damaged by high powered lidar lasers.
https://spectrum.ieee.org/amp/keeping-lidars-from-zapping-ca...
I would imagine, even with safe dosages, there would be some form of cumulative effect in terms of retinal phototoxicity.
More so if we consider the scenario that this becomes a standard COTS feature in cars and we are walking around a city centre with a fleet of hundreds of thousands of these laser sources.
Pros and cons. :/
It'll never happen, but we need a bill of rights for privacy. The laypeople aren't well-versed or pained enough to ask for this, and big interest donors oppose it.
Maybe the EU and states like California will pioneer something here, though?
Edit: in general, I'm far more excited by cheap lidar tech than I am afraid of the downsides. We just need to be vigilant.
Suborbital "trips" straight up, beyond the atmosphere, are very cheap.
Strategy: The move brings Tesla's sensor approach closer to competitors like Ford, GM, and Rivian, who utilize multi-modal systems (cameras plus radar) for their driver-assistance features.
Potential: This 'HD radar' could provide critical redundancy and data needed for achieving higher levels of driving automation and improving system performance in all conditions.
https://www.forbes.com/sites/bradtempleton/2025/03/17/youtub...
Lidar struggles with things like rain and snow way worse than cameras do.
We're always getting closer at emulating this, but we're still a ways off from matching it.
Stereo based depth mapping is kind of bad, especially so if it is not IR assisted. The quality you get from Lidar out of the box is crazy good in comparison.
What you can do is train a model using both the camera and Lidar data to produce a good disparity and depth map but this just means you're using more Lidar not less.
>In all seriousness though, Tesla is producing Cybercabs now that are a tenth the price of Waymo's and can drive autonomously anywhere in the world. I think we can see where this is going. (Hint: not well for Waymo)
This feels like a highly misleading claim that might technically be true in the sense that there are fewer restrictions, but a reduction in restrictions doesn't imply an increase in capability.
The comment about Waymo seems to be particularly myopic. Waymo has self driving technology and is operating as a financially successful business. There is no conceivable situation where the mere existence of competition with almost the same capabilities would shake that up. Why isn't it companies like Uber, who have significantly fallen behind, that are in trouble?
>Also the article is speculative 'MicroVision says its sensor could one day break the $100 barrier'. One day...
And so is the comment about Tesla cyber cabs.
Humans don't have wheels and cannot go 70MPH. Humans also don't have rear view cameras and cannot process video feeds from 8 cameras simultaneously. The point of these machines is to be better than humans for transportation. If adding LIDAR means that these vehicles can see better than humans and avoid accidents that humans do get into, then I for one want them in my vehicle.
It's not only failing, it's causing false positives.
They also have several cameras all around providing constant 360° vision.
The reports that Tesla submits on Austin Robotaxis include several of them hitting fixed objects. This is the same behavior that has been reported on for prior versions of their software of Teslas not seeing objects, including for the incident for which they had a $250M verdict against them reaffirmed this past week. That this is occurring in an extensively mapped environment and with a safety driver on board leads me to the opposite conclusion that you have reached.
Google doesn't do retail other than Chromecast and Pixel phones, and that is already annoying to them as it is because it involves something Google is notoriously bad at - actual customer support.
Starting up a car brand is orders of magnitude worse.
For one, people actually need to trust your brand to survive for at least five to ten years - cars are an investment, and a car that I can't trust to get safety-relevant spare parts (brake rotors, brake pads, axle bearings) all of a sudden is essentially an oversized paperweight. For a company such as Google, this alone (remember Killed By Google) is a huge obstacle to overcome.
Then, you need production. Sure, you can go to Magna or other contract manufacturers, or have an established large brand build vehicles for you, or you say you have to go the Tesla route and build everything from scratch. Either way has associated pros and cons.
And then, you need a nationwide network of spare parts, dealerships, repair shops and technicians that can fix the issues that people will get alone because the wide masses abuse cars in ways you might not even dare think about while testing, or because other people run into your cars and so your cars need repairs.
Even being a derivative of an established car brand can be a royal PITA. Let's take Mercedes Benz as an example with the 2003-2009 Mercedes-Benz SLR McLaren. On paper, it's a Mercedes vehicle, with a lot of the parts actually originating from stock Mercedes cars - but most dealerships will refuse to work on it. Either because they lack the support to even properly jack the car up, or because they lack the specialized tools for the AMG engine, or because they cannot even order the parts as Mercedes gates repairs for that thing to special shops. Or, again Mercedes, with Maybach luxury cars. The situation isn't as bad as with the McLaren, but their cars are challenging in another way - the S 650 Pullman weighs around 3 metric tons empty and is 6.50 meters long. Good luck finding a jack even capable of lifting that beast, most Mercedes sports-car shops don't carry jacks that are normally used to lift Mercedes Vito transporters!
Even Tesla, and they've been at it for the better part of two decades, still struggles with that. Their shitty spare parts logistics actually drive up not just insurance prices for their own customers, but for everyone - hit a Tesla with your Dodge and be at fault, and now your insurance has to pay out for months of a rental car because Tesla can't be arsed to provide the body shop the Tesla ends up at with spare parts in any reasonable time.
Established car brands however have all of that ironed out for many, many decades now. American, Asian, European, doesn't matter. And the spare parts don't even have to be made for cars: ask your local Volkswagen dealer to order a few pieces of "199 398 500 A" and one piece of "199 398 500 B" and you'll probably have a lead time of less than a day, at least in Germany - for the uninitiated: that part number belongs to the famous sausage, the second one to the accompanying curry ketchup, with more sausages being sold each year than actual cars.
And established car brands also bring something to the table: their own experiences with integrating smart technology. Yes, particularly German carmakers are notoriously bad in that regard, but for example Mercedes Benz was the first car brand in the world to get a certified Level 3 system on the road [1] and are now working on a Level 4 certification [2]. That kind of experience in navigating bureaucracy, integration and testing cannot be paid for in money.
tl;dr: I see no way in which Waymo goes to general availability regarding selling cars. They will run their own autonomous car fleets in select markets where they can fully control everything, but seeing Waymo tech generally available will be as part of established car brands.
[1] https://group.mercedes-benz.com/technologie/autonomes-fahren...
[2] https://group.mercedes-benz.com/technologie/autonomes-fahren...
My understanding is that cyber cabs still need safety drivers to operate, is that not the case?
The Tesla FSD system has... well, sure, a few more cameras, but they're low resolution, and in inconveniently fixed locations.
My alley has an occlusion at the corner where it connects to the main road: a very tall, very ample bush that basically makes it impossible to authoritatively check oncoming traffic to my left. I, a human, can determine that if I see the light flicker even slightly as it filters through the bushes, that the path is not clear: a car is likely causing that very slight change in light. My Tesla has no clue at all that that's happening. And worse, the perpendicular camera responsible for checking cross-traffic is mounted _behind my head_ on the b-pillar, in a fixed location that means that without nosing my car _into_ the travel lane, there is literally no way for it to be sure the path is clear.
This edge case is navigated near-perfectly by Waymo, since its roof-mounted lidar can see above and beyond the bush and determine that the path is clear. And to hit back on the "Tesla is making cheaper cars that can drive autonomously anywhere in the world": I mean, they still aren't? Not authoritatively. Not authoritatively enough that they aren't seeing all sorts of interventions in the few "driverless" trials they're doing in Austin. Not authoritatively enough in my own experience with my Tesla's FSD. It works well enough on the fat part of the bell curve, but those edges will get you, and a vision-only system is extremely brittle in certain conditions and with certain failure modes that a lidar/radar backup would help cover.
Moreover, Waymo has brought lidar development in-house, they're working to dramatically reduce their vehicle platform cost by reducing some redundant sensors, and they can now simulate a ground truth model of an absurd number of edge cases and odd scenarios, as well as simulate different conditions for real-world locations in parallel with their new world modeling systems.
None of which reads to me as "not going well for Waymo." Waymo completes over 450,000 fully autonomous rides per week right now. They're dramatically lowering their own barriers to new cities/geographies/conditions, and they're pushing down the cost per unit substantially. Yeah, it won't get to be as cheap as Tesla owning the entire means of production, but I'm still extremely bullish on Waymo being the frontrunner for autonomous driving for the foreseeable future.
Wait what? when did they actually enter mass production?
> I mean humans have Lidar sensors
Real-time SLAM is actually pretty good; the hard part is reliable object detection using just vision. Tesla's forward-facing cameras are effectively monocular, which means it's much, much harder to get depth (it's not impossible, but moving objects are much more difficult to observe if you only have cameras aligned on the same plane with no real parallax).
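To illustrate why the lack of parallax matters, here is a toy sketch of classic stereo triangulation (all numbers below are made up for illustration): depth is recovered as focal length × baseline / disparity, so as the effective baseline between viewpoints shrinks toward zero, the disparity signal vanishes and depth becomes unobservable.

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Classic stereo triangulation: depth = f * B / d.
    With baseline B near zero (cameras on the same axis, no
    real parallax), disparity tends to zero for all depths and
    the estimate degenerates."""
    if disparity_px <= 0:
        raise ValueError("no measurable parallax: depth is unobservable")
    return focal_px * baseline_m / disparity_px

# 1000 px focal length, 30 cm baseline, 6 px disparity -> 50 m
print(depth_from_disparity(1000.0, 0.30, 6.0))
```

Note also that the depth error grows with the square of distance for a fixed disparity error, which is part of why pure camera depth gets shaky exactly where braking decisions matter most.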
Ultimately Musk is right: you probably don't need lidar to drive safely. But it's far simpler and easier to do if you have lidar. It's also safer. Musk said "lidars are a crutch" not because he is some sort of genius; it's been obvious that SLAM-only driving is the way forward since the mid-00s (if not earlier). The reason he said it is that he thought he could save money by not having lidar. The problem for him is that he didn't do the research to see how far away proper machine perception is from the last 1% in accuracy needed to make vision-only safe and reliable.
You're listening to the road and car sounds around you. You're feeling vibration from the road. You're feeling feedback through the steering wheel. You're using a combination of monocular and binocular depth perception; plus, your eyes are not fixed-focal-length "cameras". You're moving your head to change the perspective you see the road at. Your inner ear is telling you about your acceleration and orientation.
Sufficient to build something close to human performance. But self driving cars will be held to a much higher standard by society. A standard only achievable by having sensors like LiDAR.
Now you might say "use a depth model to estimate metric depth" and I think if you spend 5 minutes thinking about why a magic math box that pretends to recover real depth from a single 2D image is a very very sketchy proposition when you need it to be correct for emergency braking versus some TikTok bokeh filter you will see that also doesn't get you far.
A lot of folks are relearning lessons on this front in Cloud right now.
The response to the challenge shouldn't be whittling down your sensor-suite to a single type, but to get good at sensor fusion.
One of Udacity's first courses was on self-driving, taught by Sebastian Thrun who later cofounded Waymo. He went through some Bayesian math that takes a collection of lidar points, where each point contributes to a probabilistic assessment of what's really going on. It's fine if different points seem to contradict each other, because you're looking for the most likely scenario that could produce that combined sensor data. Transformers can do the same sort of thing, and even with different sensor types it's still the same sort of problem.
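The Bayesian idea described above can be sketched as a standard log-odds occupancy update: each lidar return nudges a cell's occupancy probability, and contradictory observations simply shift the posterior rather than breaking anything. (The 0.7/0.4 inverse-sensor-model values below are illustrative assumptions, not figures from the course.)

```python
import math

def logit(p: float) -> float:
    return math.log(p / (1 - p))

def sigmoid(l: float) -> float:
    return 1 / (1 + math.exp(-l))

# Assumed inverse sensor model: a lidar "hit" makes a cell more
# likely occupied, a "miss" (beam passed through) less likely.
L_HIT, L_MISS = logit(0.7), logit(0.4)

def update_cell(log_odds: float, hit: bool) -> float:
    """Fuse one lidar observation into a cell's occupancy log-odds."""
    return log_odds + (L_HIT if hit else L_MISS)

# Three hits and one contradictory miss still yield "probably occupied":
l = 0.0  # prior p = 0.5
for obs in [True, True, False, True]:
    l = update_cell(l, obs)
print(round(sigmoid(l), 3))  # well above 0.5
```

The same additive-update structure extends naturally to fusing different sensor types: each modality just contributes its own evidence term, which is the "most likely scenario given the combined data" framing in the comment above.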
We have lots of evidence of similar strategies being used in other domains, this seems like an especially life-critical domain that ought to have high rigor and standards applied.
Of course you do, you're driving at much higher speeds and so is the surrounding traffic. You can't just guess what you might be looking at, you have to make clear decisions promptly. Lidar is excellent in that case.
It's significant that a truly hard problem like autonomous driving doesn't respond to a "brute force" management style. Rockets aren't in this category because the required knowledge and theory is fairly complete, whereas real autonomous driving is completely novel.
If we could make sensors that lets an autonomous vehicle drive reliably in any snow/rain where a human could drive (although carefully) then we're good. But we are a long way from that. Especially since a lot of sensor tech like cameras tend to fail in 2 ways, both through their performance being worse in adverse condition but also simply failing to function at all if they are covered in ice/snow/water.
The EU requires every new car to have Autonomous Emergency Braking. If LiDAR becomes cheaper than radar, this is a potential market of millions.
It's not safe just because it's infrared. And the claims that it's safe because of the exposure time is highly questionable, would you be okay with that for any other laser?
* I have no way to estimate installation costs, but smartphones show that manufacturing at this scale doesn't need to increase total cost 10x more than the B.o.M.
There are SLAM cameras that only select "interesting" points, which are privacy preserving. They are also very low power.
Right, but how likely is it that there will be LIDAR and no cameras (especially given the low cost of the latter)?
If the data were positive for Tesla, Tesla would publish it
They do not, so one can infer it is not flattering
(Before you post the "Miles driven with FSD" chart, you should know upfront (as Tesla must) that chart doesn't normalize by age of vehicle or driving conditions and is therefore meaningless/presumably designed to deceive)
Good question, and for many it will not be, and rentals are acceptable.
But also for many, renting a car has a huge ICK factor. It is one thing while traveling to rent from an agency who has (purportedly) thoroughly cleaned and inspected the car before you get it. It would be quite another to rent cars like scooters, where the previous user likely smoked, left wrappers and food debris, and who knows what else, even damage. Plus, most people who own cars keep a fair amount of stuff in the car for their specific convenience, and have their own settings, etc.
The fact that the likes of Zipcar, Turo, and the lot have not entirely taken over urban transport but instead remain niche players shows the extent of this preference.
For suburban and rural markets, it just gets more extreme. How quickly could a rental service deliver a car; could it reliably do it in less than 5-10 minutes for people to run an errand? If not, unless they are insanely cheap, people will likely want to own their own. Perhaps it'll be more of a hybrid, with households owning one car and renting a spare for specific trips?
This is evidently false. Robotaxi crash rates exceed human drivers', but there's not an effective regulatory agency to outlaw them!
https://futurism.com/advanced-transport/tesla-robotaxis-cras...
Tesla is spending upwards of $6B/year to Waymo’s $1.5B. Only one of these companies makes an autonomous robotaxi that’s actually autonomous.
Radar is just cheaper than the number of cameras and compute, it's also not really a strict requirement.
Look at how the current cars fuck up, it's mostly navigation, context understanding, and tight manoeuvres. Lidar gives you very little in these areas
There was someone who had his eyes damaged by sitting next to a heater.
The grandparent comment is about camera lenses with little to no near infrared cutoff filter. Some older iPhones were like that and that was the original breaking story.
So they don't care if that breaks my phone camera? Wtf?
They're just fancy cameras with synced flashes. Not Star Trek material-informational converting transporters. Sometimes they rotate, sometimes not. Often monochrome, but that's where Bayer color filters come in. There's nothing fundamentally privacy preserving or anything about LIDARs.
Maybe their navigation system will be better than the competition due to real-time traffic data from Google Maps users, but I don't think it'll be so much better as to be an unbeatable advantage.
It occurs to me there is an opportunity here. Passive lidar detectors sampling fleets of vehicles in the real world, measuring compliance and detecting outliers, would be interesting. A well placed, stationary device could sample thousands of vehicles every day. Patterns will emerge among manufacturers. Failure modes will be seen.
Cursory queries on this reveal nothing. Apparently, no one is doing this. We're all relying on front end certification and compliance. No thought given to the real world of design flaws, damage, faulty repairs, unanticipated failure modes, etc.
Apparently there are lidar jammers. I bet those are rigorously compliant with Class 1 safety regs... No one manufacturing those is ever going to think; "hey, why not a 50W pulse train?"
There is also flagging abuse which effectively kills the comment /post.
Part of that is that humans are distractible, and their performance can be degraded in many ways, and that silicon thinks faster than meat.
But part of it is the sensor suite. Look at Waymo vs Tesla robotaxi accident rates.
But is it going to rise to a level of concern? I don't think we're going to see a ton of cars with blinding lasers installed, unless they are installed to intentionally blind people.
If you have used Face ID, or someone has used face detection on a modern smartphone near you, or if you've pulled up to a modern intersection, you've been blasted with lasers. The day may come when that's the largest concern, but today it's not my primary problem, and investing in FUD isn't going to bring any benefits.
A good safety system requires multiple of these failures to occur together to become unacceptable in risk.
This is why we create regulations and inspectors.
Like the difference between "what can we do with an LLM on my maxxed-out laptop with an RTX 5090 card" vs. "what can we do with a Mac mini." The self-driving-car version.
There aren't a million Teslas with FSD active in the US. According to Tesla in their latest earnings report there are 1.1 million people worldwide with FSD.
Notably, human perception is effectively monocular in driving situations at distances of 60 feet or farther. It's best in the area where your limbs can reach.
We don't need stereoscopic vision to drive.
Sensor fusion is not far simpler: when the sensors disagree, and they often will, you have to pick which one to trust.
It is amazing to see how many people here are confident they know the one true way to build autonomous systems based on nothing but wanting to confirm their biases
It has a wide angle camera in front that you usually can never see outside service menu. It should cover that case.
"mass" is a strong word but the first one came off their production line 5 days ago
ramp to high volume will probably be extremely slow
https://www.reddit.com/r/SelfDrivingCars/comments/1mdl5zn/tw...
https://www.reddit.com/r/waymo/comments/1pggtpu/two_waymos_m...
As far as distinguishing shadows on the road, that's what radar is for. Shadows on the road as seen by the vision system don't show up on radar as something the vehicle will run into.
I own a Tesla and paid about $10K for the full self driving capability a few years ago. Yeah, I would not trust a Tesla to drive me from airport to my house. There is a reason Tesla is still stuck at level 2 autonomy certification and not 3, 4 or 5.
XPeng's system is sensor fusion; it is not camera-only. Waymo is even clearer: for them, LiDAR is not optional. aiMotive has now started to market camera-only, but it's experimental, with no production deployments.
Those bits should be easy, unless the OEM was tragically stupid. Where you'll get into trouble is when you need replacement computer bits; those are often tricky for mainstream brands, but if your niche brand ECUs all fail around the same time (wouldn't be the first time for a Google product), and the OEM isn't around to make new ones or make it right, off to the junkyard with all of them. If it's just normal failure rates, you can probably scavenge from totaled vehicles at junkyards even after new parts become unobtainium.
OEM style lighting will also probably get hard to find. Ideally a niche maker would lean towards standard parts there, but that's not the fashion of the times.
Tesla did it, and is more valuable than most other car brands added together. They had a novel product: a good EV that was fun to drive. Is that a unique situation? Could a truly autonomous car launch do it?
Your arguments make sense in themselves, but maybe underestimate the revolutionary value that a level 4 car would provide.
However, there is also a lot of interaction between our perceptual system and cognition. Just for depth perception, we're doing a lot of temporal analysis. We track moving objects and infer distance from assumptions about scale and object permanence. We don't just repeatedly make depth maps from 2D imagery.
The brute-force approach is something like training visual language models (VLMs). E.g. you could train on lots of movies and be able to predict "what happens next" in the imaging world.
But, compared to LLMs, there is a bigger gap between the model and the application domain with VLMs. It may seem like LLMs are being applied to lots of domains, but most are just tiny variations on the same task of "writing what comes next", which is exactly what they were trained on. Unfortunately, driving is not "painting what comes next" in the same way as all these LLM writing hacks. There is still a big gap between that predictive layer, planning, and executing. Our giant corpus of movies does not really provide the ready-made training data to go after those bigger problems.
We often greatly underestimate / undervalue the role of our ears relative to vision. As my film director friend says, 80% of the impact in a movie is in the sound
Whether that's worth completely throwing away LiDAR is a different question, but your argument is just obviously false.
> Moving to a longer wavelength that does not penetrate the human eye allows new lidars to fire more powerful pulses and stretch their range beyond 200 meters, far enough for stopping faster cars. Now a claim of lidar damage to the charge-coupled-device (CCD) sensor on a photographer's electronic camera has raised concern that new eye-safe long-wavelength lidars might endanger electronic eyes.
> Producers of laser light shows are well aware that laser beams can damage electronic eyes. “Camera sensors are, in general, more susceptible to damage than the human eye,” warns the International Laser Display Association
"doesn't penetrate the human eye" seems a bit hand-wavy, but I take it to mean "pulses of this length at this wavelength are tuned so the power is not enough to damage the eye". Camera lenses may not have the same level of IR filtering/gathering area, or, if they do, nothing implies the image sensor has the exact same tolerances as the inside of the eye. From the same:
> Sensor vulnerability to infrared damage would depend on the design of the infrared filters
A heater usually damages the eyes through drying out/heating up the outside layer with constant high intensity, not by causing damage to the retina (post filtering). https://hps.org/publicinformation/ate/q12691/
> Furthermore, since the eye blocks the IRR, the eye begins to overheat leading to eye damage and possible blindness. Because of this, you should not look at the heater for an extended period of time.
Enough intensity at any wavelength is enough to damage any camera or eye, of course, but the scenario here seems to be built around that question for the eye. Similarly, I've heard of Waymos causing 6 mph accidents but no reports of eye damage from any car LiDAR. Despite that, in the above YouTube clip Marques Brownlee shows his camera being clearly damaged as it's moved around.
Shame that perverts had to ruin that for us, it was kinda neat to point a TV remote as the camera and see the bulb light up.
The SAE autonomy scale is about dividing responsibility between the driver and the assistance system. The lowest level represents full responsibility on the driver and the highest level represents full responsibility on the system.
If there is a geofenced transportation system like the Vegas loop and the cars can drive without a human driver, then that is a level 5 system. By the way, geofencing is not an "SAE level 5" requirement. Geofencing is a tool to make it easier to reach requirements by reducing the scope of what full autonomy represents.
Tesla FSD is not accurately described as a "no LIDAR at all" approach, and claiming it as such is technically misleading.
https://electrek.co/2026/01/22/tesla-didnt-remove-the-robota...
It would be funny, but tbh it's just sad.
Everything for the stock pump
Well... just look at Tesla. A lot of their parts don't come from the classic supplier-OEM delivery chain model, but Tesla makes as much as they can on their own. It saves them a bunch of money, both when it comes to the profit margin of the supplier, and being at the whims of their supplier, but it is nasty for the customers when there simply is no parts OEM that one could go to when the vehicle manufacturer goes out of business or refuses to support the car any further.
> Where you'll get into trouble is when you need replacement computer bits
Oh hell yes. New EU law is particularly to blame here. OBD diagnosis always was nasty enough, you virtually always need to buy expensive diagnosis software and hardware (e.g. Mercedes XENTRY, VW ODIS, BMW ICOM)... but the newest requirements enforce live digital signatures and anti-tamper checks. Nasty as hell. And the buses itself... it's no longer just one CAN bus doing everything, not since the Kia Boys, it's multiple buses of different speeds, some using encryption on the wire, all making diagnosis, troubleshoots and repairs much more difficult than it used to be.
And that is before getting into the replacement parts issue itself that you wrote up.
Tesla ""autopilot"" fatalities: 65
Waymo fatalities: 0
Absorbing the laser isn't necessarily any good. Very hypothetically it could lead to cataracts.
Humans have always done mass surveillance on each other. You don't need technology for that.
Half of Tesla's value is hopium; the rest of it is pure trust that the current government will continue propping Elon up (even if he personally ran afoul of the Dear Leader). A lot of the promises Elon made, particularly when it comes to FSD, had to be walked back, and I don't see them ever coming to fruition - at least not for the cars that don't have LIDAR hardware.
https://waymo.com/blog/2024/08/meet-the-6th-generation-waymo...
This company claims their LIDAR works conservatively at 250m, and up to 750m depending on reflectivity
https://www.cepton.com/driving-lidar/reading-lidar-specs-par...
Deciding to crash faster, or telling the human to take over really fast, is NOT better.
If we are just talking about smart cruise control, most cars are using cameras and radar, not lidar yet. But Tesla is special since it doesn’t even use radar for its smart cruise control implementation, so that could make it less safe than other new cars with smart cruise control, but Autopilot was never competing with Waymo.
Do you actually own a Tesla? I do. With FSD. And let me assure you, you are very wrong.
By some measures Waymo is actually at -1 fatalities. There has been one confirmed birth of a child in a Waymo. https://apnews.com/article/baby-born-waymo-san-francisco-6bd...
Tesla notes:
> These assumptions may contain limitations with respect to reporting criteria, unreported incident estimations (e.g., NHTSA estimates that 60% of property damage-only crashes and 32% of injury crashes are not reported to police)
https://www.researchgate.net/publication/378671275/figure/fi...
Scale matters.
While the lack of anonymity in small towns certainly puts a damper on one's ability to deviate too far from social norms, the list of things and subjects that could get you subjected to government violence without creating a victimized party was infinitely shorter. Things that get state or state-deputized enforcers on your case today were, 150+ years ago, matters of "yeah, that's distasteful, he'll have to settle that with God," or would simply come back to bite you when something happened, because society did not have the surplus to justify paying nearly as many people to go around looking for deviance that could be leveraged to extract money. Those people had far more practical day-to-day freedom to run and better their lives than we do now, even if constrained by the fact that they had substantially less wealth to leverage to that effect.
> Modern Homeowners Associations prove that localized oversight is often the most intrusive form of management
And they almost exclusively deal in things that historical societies didn't even bother to regulate.
You're beyond delusional if you think running afoul of an HOA is worse than running afoul of the local, state, or federal government. Yeah, they can screech and send you scary letters with scary numbers, but they don't get the buddy treatment from courts that "real" governments do (to the great injustice of their victims), and their procedural avenues for screwing their victims on multiple axes are far more limited.
Seriously, go get into a pissing match with a municipality over just where the line for "requires a permit" is and get back to me. Unless you want to do something that is more than petty cosmetic stuff and is unambiguously in violation of the rules, an HOA is a paper tiger for the most part (not to say that they don't suck).
Why is it clearly false? It might be false, but clearly? I would definitely like to see evidence either way.
> I think it's far more likely that humans don't report most minor collisions to insurance, and that both Robotaxis and Waymo are safer than human drivers on average.
That sounds like you are trying to find reasons to get the conclusion you want.
We find that the cases where lidar really helps are gathering training data, parking, and, if focused enough, some long-distance precision.
None of these have been instrumental in a final product; personally I suspect that many of the cars including lidar use it for data collection and edge cases more than as part of the driving perception model.
I also wonder if the smaller sensor size on phones contributes, since the energy is being focused onto a smaller spot.
Either way, for that to happen he was filming the LIDAR while active, for a decent amount of time, from right next to the car. I assume under normal conditions it wouldn't be running constantly while the vehicle is stationary?
Top 5 fines:
1 - Meta - Ireland - €1.2 billion
2 - Amazon Europe - Luxembourg - €746 million
3 - WhatsApp - Ireland - €225 million
4 - British Airways - UK - £183 million
5 - Google - France - €60 million
I wish every law barely got enforced this way.
And note that there is evidence for cities of tens of thousands of inhabitants from 3000 BCE, while Rome reached 1,000,000 residents by 1 CE. Again, without becoming some Hobbesian nightmare.
People don't hesitate to be aggressive even when they're not anonymous and there's a threat of accountability - see, all crime, or people just acting shitty toward others.
Mass surveillance does not cause everyone to magically get along.
If you go to the NHTSA's page regarding their Standing General Order[2] and download the CSV of all ADS incidents[3], you can filter where the reporting entity is Waymo and find 520 rows. If you filter where the vehicle was stopped or parked, you'll find 318 crashes. If you scan through the narrative column, you'll see things like a Waymo yielding to pedestrians in a crosswalk and getting rear-ended, or waiting for a red light to change and getting rear-ended, or yielding to a pickup truck that then shifted into reverse and backed into the Waymo. In other words: the majority of Waymo collisions are due to human drivers.
So either Waymos are ridiculously unlucky, or when these sorts of things happen between two human driven cars, it's rarely reported to insurance. In my experience, if there's only minor damage, both parties exchange contact info and don't involve the authorities. Maybe one compensates the other for damage, or maybe neither party cares enough about a minor dent or scrape to deal with it. I've done this when someone rear-ended me, and I know my parents have done it when they've had collisions.
If human driven vehicles really did average 229k miles between any collision of any kind, we'd see many more pristine older vehicles. But if you pay attention to other cars on the road or in parking lots, you'll see far more dents and scratches than would be expected from that statistic. And that's not even counting the damage that gets repaired!
1. See page 13 of https://www.nhtsa.gov/sites/nhtsa.gov/files/2025-04/third-am...
2. https://www.nhtsa.gov/laws-regulations/standing-general-orde...
3. https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_In...
Let me guess, you heard this from Elon?
Waymo has driven tens of millions of autonomous miles with a serious injury/fatality rate dramatically lower than human drivers. The actual data shows the technology works. Tesla FSD still requires active driver supervision and is not legally or technically a robotaxi system. Comparing them as if they're at parity is wrong.
LIDAR gives direct metric depth with no inference required. Camera-only systems must infer depth from 2D images using neural networks, which introduces failure modes LIDAR doesn't have. Radar is very valuable when LIDAR and cameras give ambiguous data.
On what metrics has Tesla overtaken Waymo? Deployed robotaxi revenue miles? No. Disengagement rates? No published comparable data. Safety per mile in driverless operation? No.
But also kinda weird. There seems to be a lot of fines for hospitals for example.
Some Portuguese hospital was fined €400,000 for ‘Insufficient technical and organisational measures to ensure information security’
Waymo uses LIDAR in the realtime control loop. It combines LiDAR, camera, and radar data in real time to build a 3D representation of the environment, which is constantly updated.
I fundamentally don't trust any level 4 system that doesn't use LIDAR
Laptops aren't generally being used in the same areas as cars though, so you wouldn't expect to see as many cases involving Windows Hello compatible laptops/cameras.
Drivers can and do misuse adaptive cruise control systems, sometimes with fatal consequences. Memes aside, there is no strong evidence that fatal misuse occurs more frequently by owners of Tesla cars than with comparable systems from other brands.
This perception reflects the Baader–Meinhof phenomenon, more commonly known as the frequency illusion. Nobody is collecting statistics for other brands, so it’s assumed the phenomenon doesn’t occur.
A similar pattern occurred with media coverage of EV fires. Except in this case, good statistics exist which prove the opposite: ICE vehicles catch fire more often than EVs.
So, you do not own a Tesla.
Hmm. Is it ragebaiting to respond to a tired and wrong statement by saying that it's tired and wrong and that the situation is merely the product of piss poor management decisions? People get understandably frustrated seeing the same wrong talking point that people with domain knowledge in computer vision and robotics have repeatedly explained is wrong in extremely fundamental ways.
> I don't own a Tesla.
n.b. The shoe/foot comment was not about you. It was about Musk. It wouldn't make any idiomatic sense for the expression to be about you given what you said and what you were responding to. If they'd said "pot, meet kettle", then it would have been about you. In that context, saying that you don't own a Tesla feels like a weird thing for you to insert in your comment. It potentially comes across as suspiciously defensive.
By edge cases I mean scenarios like the lights going out in an underground garage; low vision due to colourful smoke or dust, or things like optical illusions or occlusion that a human would just need to remember.
Lidar can help, but not really enough to be worth it.
You don't need the mm precision of lidar very often; we find that it offers nothing at speed over radar; and in tight manoeuvres the cameras we need for human park assist and ultrasonics do well enough.
It is not more accurate, but it is more precise; that doesn't really matter much, though. (Radar gives you relative speed directly, and that is more important than a very precise point at highway speeds.)
I don't know about you, but on that income I would certainly not brush off such a fine as a "cost of doing business". Would it cause me financial trouble, or would it force me to sacrifice other expenses? Absolutely not. But would I feel frustrated at having to pay it, feel stupid for my mistake, and do my best to avoid it in the future? Absolutely yes.
Anyway I'm curious why - despite having less anonymity than at any point in history, at least from the perspective of law enforcement - we still see high crime rates, from fraud to murders?
* someone parking carefully, misjudges depth perception, bumps an object
* person driving at night, their eyes failed to perceive a poorly lit feature of the road/markings/obstacles
* person driving and suddenly blinded by bright object (the sun, bright lights at night)
* person pulling out in traffic who misinterprets their depth perception and therefore misjudges the speed of approaching traffic
* people can only focus their eyes at one distance at a time, and it takes time to focus at a different distance. It is neither unsafe nor unexpected for humans to check their instruments while driving -- but it can take the human eye hundreds of milliseconds to focus under normal circumstances -- If you look down, focus, look back up, and focus, as quick as you can at highway speeds, you will have travelled quite a long distance.
These types of failures happen not as a result of poor decision making, but of poor perception.
"people have driven coast to coast without a single intervention on FSD including parking spot to parking spot"
I find this claim very dubious. Prove it. Teslas never drive empty for a very good reason.
Writing this and linking to fake Wikipedia is actually hilarious.
They do not. They have a very small number of them open to a select number of people, not the general public. And they are limited to even smaller areas. You need to understand that Musk is NOT an engineer, he is more of a con man desperate to inflate tesla stock price. If he says self driving cars don't need LIDAR then they must actually need it.
https://futurism.com/future-society/polymarket-fortune-betti...
Polymarket user David Bensoussan has made $36,000 by betting against Musk's wildly optimistic self driving predictions.
linking to grokipedia feels like intentional rage-baiting.
There isn't a trend of increasing fines, nor has any fine even reached the cap, let alone been applied multiple times for recurring violations. And that's even less likely to change given the current US administration's foreign policy towards the EU.
While GDPR as a law is fine, with the exception of enforcement limitations, enforcement so far has been a complete joke.
What's wrong with Grokipedia? It's a bit less woke/far-left-wing, more balanced.
https://www.forbes.com/sites/alanohnsman/2025/08/20/elon-mus...
https://futurism.com/leaked-elon-musk-self-driving
For nearly a decade Elon Musk has claimed Teslas can truly drive themselves. They can’t. Now California regulators, a Miami jury and a new class action suit are calling him on it.
https://en.wikipedia.org/wiki/List_of_predictions_for_autono...