Tesla has to pay historic $243M judgement over Autopilot crash, judge says

105 points | posted 5 hours ago
by jeffbee

125 Comments

tass

3 hours ago

I’m not usually an apologist, and I’d agree with this judgement if the car had been left to its own devices, but the driver of the car held his foot on the accelerator, which is why it blew through those stop signs and lights.

Regarding the Autopilot branding: would a reasonable person expect a plane on autopilot to fly safely if the pilot suddenly took over and pointed it at the ground?

jrjeksjd8d

3 hours ago

The average person does not know how to fly a plane or what a plane autopilot does. It's a ridiculous superficial comparison. Planes have professional pilots who understand the capabilities and limits of aviation autopilot technology.

Tesla has had it both ways for ages - their stock price was based on "self-driving cars" and their liability was based on "asterisk asterisk the car cannot drive itself".

seanmcdirmid

a minute ago

Autopilots are kind of dumb, which is why Tesla doesn’t use the name as branding for its full self driving software. People at least know that much.

nitinreddy88

3 hours ago

By your analogy, certified pilot = certified driving-license holder. It's not like Tesla is advertising that an unlicensed or otherwise ineligible person can drive using Autopilot. I wonder how you can even justify your statement.

tapoxi

3 hours ago

Autopilot is part of a private pilot's license, and the systems are approved by the FAA. Tesla Autopilot isn't part of a driving license, nor did it undergo review by the NHTSA prior to launch, because Elon considered it "legal by default".

nickff

3 hours ago

If the average person does not know what an autopilot does, why would they expect Tesla's 'autopilot' to take such good care of them? I am reminded of a case many years ago when a man turned on the cruise control in his RV and went to the back to make himself lunch, after which the RV went off some sort of hill or cliff.

Rudimentary 'autopilots' on aircraft have existed for about a century now, and the earlier versions (before transistorization) only controlled heading and attitude (if conditions and other settings allowed it), with little indication of failure.

tass

43 minutes ago

This would be more like they enabled cruise control, hit the brakes, and sued the manufacturer because they were rear-ended.

D-Coder

2 hours ago

> If the average person does not know what an autopilot does

The average person does know what an autopilot does, they're just wrong.

I think the example you provided supports that.

user

3 hours ago

[deleted]

zadikian

36 minutes ago

Not sure what "autopilot" means in a car. Is the self-parking feature called "landing gear"?

Starman_Jones

29 minutes ago

The original judgement held that the driver was 2/3 responsible, Tesla 1/3 responsible, which seems reasonable. The $243 million wasn't for causing the accident, but was a punitive amount for doing things that looked an awful lot like lying to the court and withholding evidence.

carefree-bob

20 minutes ago

This makes a lot of sense and makes the verdict seem reasonable, thanks for providing the context.

Gud

2 hours ago

A “reasonable person” in a cockpit is not the same as a “reasonable person” behind the steering wheel.

Pilots undergo rigorous training with exam after exam they must pass.

No one is handed the keys to a Boeing 747 after some weekly evening course and an hour's driving test.

tass

an hour ago

I don't mean a reasonable pilot. Would a reasonable person expect autopilot in a plane to prevent the plane from crashing into something the pilot was accelerating toward while physically overriding the controls? The claim is that Autopilot should not have been able to crash even with the driver actively overriding it and accelerating into that crash.

To me, it's reasonable to assume that the "autopilot" in a car I drive (especially back in 2019) is going to defer to any input override that I provide. I wouldn't want it any other way.

xiphias2

4 hours ago

This case will make settlement amounts higher, which is the main thing car companies care about when making decisions about driving features/marketing.

With Robotaxi it will get even higher, since it will clearly be 100% the company's fault.

1970-01-01

4 hours ago

Fight Club 2.0: You pay to retrain it only if the AI will kill more people than our settlement fund can pay out.

coredog64

3 hours ago

You're already downvoted, but this quote from Fight Club always annoyed me as it misunderstands how recalls work.

1. Insurance companies price in the risk, and insurance pricing absolutely influences manufacturers (see the absolute crap that the Big 3 sold in the 70s).

2. The government can force a recall based on a flaw whether or not the manufacturer agrees.

1970-01-01

3 hours ago

v2.0: Tesla drivers insure with Tesla, and the recalls are all OTA software fixes.

jqpabc123

5 hours ago

Tesla Apologists: The judge/jury agreed that Tesla was "Full Self Driving" all the way to the scene of the crash.

dekhn

4 hours ago

If I read the article correctly, it says Autopilot, not FSD.

palmotea

3 hours ago

> If I read the article correctly, it says Autopilot, not FSD.

What's the difference? And does it matter?

Both are misleadingly named, per the OP:

> In December 2025, a California judge ruled that Tesla’s use of “Autopilot” in its marketing was misleading and violated state law, calling “Full Self-Driving” a name that is “actually, unambiguously false.”

> Just this week, Tesla avoided a 30-day California sales suspension only by agreeing to drop the “Autopilot” branding entirely. Tesla has since discontinued Autopilot as a standalone product in the U.S. and Canada.

> This lends weight to one of the main arguments used in lawsuits since the landmark case: Tesla has been misleading customers into thinking that its driver assist features (Autopilot and FSD) are more capable than they are – leading drivers to pay less attention.

dekhn

3 hours ago

Autopilot is similar to cruise control that is aware of other cars, and lane keeping. I would fully expect the sort of accident that happened here (driver drops phone, stops controlling the vehicle, and the car continues through an intersection).
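A toy sketch of what that combination amounts to (the 2-second gap rule and all the numbers here are my illustrative assumptions, not Tesla's actual logic):

    # Toy traffic-aware cruise control: hold a set speed, but slow down
    # to keep a time gap to the lead car. Illustrative assumptions only.

    def acc_target_speed(set_speed: float, gap_m: float, lead_speed: float,
                         desired_gap_s: float = 2.0) -> float:
        """Return a target speed (m/s): cruise unless the gap is too short."""
        safe_gap_m = desired_gap_s * max(lead_speed, 1.0)  # ~2-second rule
        if gap_m < safe_gap_m:
            return min(set_speed, lead_speed)  # match the slower lead car
        return set_speed

    # Lead car at 20 m/s only 35 m ahead: back off from the 30 m/s setpoint.
    print(acc_target_speed(set_speed=30.0, gap_m=35.0, lead_speed=20.0))  # 20.0

Note that nothing in this kind of logic has any concept of stop signs or traffic lights.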

FSD has much more sophisticated features, explicitly handling traffic stops and lights. I would not expect the sort of accident to happen with FSD.

The fact that Tesla misleads consumers is a different issue from Autopilot and FSD being different.

jqpabc123

2 hours ago

> Autopilot is similar to cruise control that is aware of other cars, and lane keeping.

Thanks for explaining why labeling it "Autopilot" is misleading and deceptive.

omnimus

2 hours ago

This is not even funny anymore. You reap what you sow.

FireBeyond

16 minutes ago

> FSD has much more sophisticated features, explicitly handling traffic stops and lights. I would not expect the sort of accident to happen with FSD.

FSD at one point had settings for whether it could roll through stop signs, or how much it could exceed the speed limit by. I've watched it interpret a railroad crossing as a weirdly malfunctioning red light with a convoy of intermittent trucks rolling by. It took the clearly delineated lanes of a roundabout as mere suggestions and has tried to barrel through them in a straight line.

I'd love to know where your confidence stems from.

dekhn

11 minutes ago

My confidence comes only from what I hear people doing with the system. I have zero experience with it and consider most of the PR from Tesla to be junk.

"would expect" is the way a cautious person demonstrates a lack of confidence.

atonse

3 hours ago

I remember having this argument with a friend.

My argument was that the idea that the name Autopilot is misleading comes not from Tesla naming it wrong, but from what most people think "autopilots" on an aircraft do. (And that is probably good enough to argue in court: it doesn't matter what's factually correct, it matters what people understand based on their knowledge.)

Autopilot on a Tesla historically did two things: traffic-aware cruise control (keeps a gap from the car in front of you) and staying in its lane. If you tell it to, it can suggest and change lanes. In some cases, it'll also take an exit ramp (which was called Navigate on Autopilot).

Autopilots on planes do roughly the same. They keep speed and heading, and will also change heading to follow a GPS flight plan. Pilots still take off and land the plane (like Tesla drivers still get you on and off the highway).

Full Self Driving is a different AI model that even stops at traffic lights, navigates parking lots, everything. (They've now added the word "Supervised", probably from court cases, but it was always quite obvious that it was supervised: you had to keep shaking the steering wheel to prove you were alert, same as with Autopilot, btw.) That's the true "summon my car from LA to NY" dream, at least.

So to answer your question, "What's the difference" – it's huge. And I think they've covered that in earlier court cases.

But one could argue that maybe they should've restricted it to highways only (fewer traffic lights, no intersections)? I don't know the details of each recent crash, though.

Retric

3 hours ago

Autopilots do a lot more than that because flying an aircraft safely is a lot more complicated than turning a steering wheel left and right and accelerating or braking.

Tesla’s Autopilot being unable to swap from one road to another makes it way less capable than decades-old civilian autopilots, which will get you to any arbitrary location as long as you have fuel. Calling the current FSD "Autopilot" would be overstating its capabilities, but reasonably fitting.

NikolaNovak

40 minutes ago

>"Autopilots do a lot more than that because flying an aircraft safely is a lot more complicated than turning a steering wheel left and right and accelerating or breaking."

Can you elaborate? My knowledge is limited, but it's of very real airplane autopilots in little Cessnas and Pipers, and they are in fact far simpler than cars: a basic control feedback loop that maintains altitude and heading, and that's it. You can crash into the ground, a mountain, or other traffic quite cheerfully. I would not be surprised to find that adaptive cruise control in cars is a far more complex system than a basic aircraft "autopilot".
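For a sense of scale, a single-axis "heading hold" really can be a few lines of proportional feedback. A minimal sketch, where the gains, limits, and the fake airframe response are purely illustrative assumptions, not real avionics values:

    # Minimal heading-hold feedback loop of the kind described above.
    # Toy gains and a crude "airframe" model; not real avionics values.

    def heading_hold_step(target_deg: float, current_deg: float,
                          kp: float = 0.8, max_bank_deg: float = 20.0) -> float:
        """Proportional control: turn heading error into a bank command."""
        error = (target_deg - current_deg + 180.0) % 360.0 - 180.0  # shortest turn
        bank = kp * error
        return max(-max_bank_deg, min(max_bank_deg, bank))  # clamp authority

    # Toy simulation: heading drifts toward the commanded bank each tick.
    heading = 350.0
    for _ in range(30):
        bank_cmd = heading_hold_step(target_deg=30.0, current_deg=heading)
        heading = (heading + 0.2 * bank_cmd) % 360.0  # crude aircraft response
    print(round(heading, 1))  # converges toward 30.0

A loop like this knows nothing about terrain, traffic, or whether the pilot is fighting it.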

beering

3 hours ago

Doesn’t basic airplane autopilot just maintain flight level, speed, and heading? What are some other things it can do?

Retric

3 hours ago

Recovering from upsets is the big one. Maintaining flight level, speed, and heading while upside down isn't acceptable.

Levels of safety are another consideration: car autopilots don't use multiple levels of redundancy on everything, because a car can stop without falling out of the sky.

ErroneousBosh

2 hours ago

That's still massively simpler than making a self-driving car.

It's trivially easy to fly a plane in straight level flight, to the extent that you don't actually need any automation at all to do it. You simply trim the aircraft to fly in the attitude you want and over a reasonable timescale it will do just that.

Retric

2 hours ago

> It's trivially easy to fly a plane in straight level flight, to the extent that you don't actually need any automation at all to do it. You simply trim the aircraft to fly in the attitude

That just shifts the difficulty from the autopilot to the airframe. And it's not actually good enough: it doesn't keep an aircraft flying when it's missing a large chunk of wing, for example. https://taskandpurpose.com/tech-tactics/1983-negev-mid-air-c...

Instead, you’re talking about the happy path, and if we accept the happy path as enough, there are weekend-project equivalents of self-driving cars built with minimal effort. Being production-worthy is about more than being occasionally useful.

Autopilot is difficult because you need to do several things well or people will definitely die. Self-driving cars are far more forgiving of occasional mistakes, but again, it's the "or people die" bits that make it difficult. Tesla isn't actually ahead of the game; they are just willing to take more risks with their customers' and the general public's lives.

keeganpoppen

3 hours ago

Well, the other person in the comments said the guy literally held his accelerator to the floor the entire time. Is that actually a reasonable standard, or are you preemptively out for blood because you would never let reality get in the way of a good agenda? Ironic, given that you go out of your way to accuse others of this. Methinks you doth protest too much?

selridge

3 hours ago

Hope he sees this, bro

randyrand

an hour ago

Jeeze, save some for the rest of us!

joshfraser

2 hours ago

For every story like this, there are 10 stories of people who died the old-fashioned way behind the wheel.

Accidents like this are obviously tragic, but let's remember that self-driving software is already ~10x safer than a human. Unfortunately, lawsuits like this will slow down the rollout of this life-saving technology, resulting in a greater loss of life.

bathtub365

an hour ago

Companies that roll it out responsibly aren’t having issues. Tesla deserves these judgments and should not be allowed to roll this software out in the irresponsible way they have been.

zadikian

an hour ago

Possibly, but this wasn't a self-driving car. The ruling gave Tesla some blame for falsely marketing it as one.

sonofhans

an hour ago

> self-driving software is already ~10x safer than a human.

This is, technically speaking, pure bullshit. You have no proof because none exists.

motbus3

2 hours ago

This is 0.005 of what Musk allegedly has? He might be sad.

hnburnsy

2 hours ago

We are all paying these large verdicts through higher product costs, lower salaries, lower stock market returns, and higher insurance rates.

alamortsubite

13 minutes ago

100% correct. Also, with 40,000 deaths from car crashes each year in the U.S. (and 2 million+ severe injuries not resulting in death), I'd consider it a drop in the bucket.

robotnikman

3 hours ago

Will this have any effect on other companies developing self-driving tech? It sets a very high precedent for fines and may discourage companies from doing further work on such tech.

janalsncm

3 hours ago

Developing, no, but once companies start releasing vehicles onto our shared public streets I have a lot less tolerance for launching science experiments that end up killing bystanders.

I can understand the argument that in the abstract over-regulation kills innovation but at the same time in the US the pendulum has swung so far in the other direction that it’s time for a correction.

Zababa

2 hours ago

I have no tolerance for bystanders being killed in general. If the science experiments kill fewer bystanders on average, I'm all for them; if they don't, they should be stopped until made safer.

mmooss

3 hours ago

That's an old argument by corporations against liability. Should they not be fully liable?

It should discourage them from making unsafe products. If it's not economical for them to make safe products, it's good that they go bankrupt and the economic resources - talent, money - go to someone else. Bankruptcy and business failure are just as fundamental to capitalism as profit.

nickff

3 hours ago

These product-liability lawsuits are out of control; perhaps this judgement is directionally correct, but the punitive damages seem insane. This reminds me of the lawsuits which drove Instant Pot bankrupt, where the users were clearly doing very stupid things, and suffered injuries because they were able to physically overpower the safety mechanisms on their pressure-cookers.

mmooss

3 hours ago

> These product-liability lawsuits are out of control

Businesses also claim that, all the time. We need some evidence.

I remember doctors claiming that malpractice lawsuits were out of control; research I read said that it wasn't an economic issue for doctors and that malpractice was out of control.

nickff

2 hours ago

I invite you to read both the claims and the judgements related to the Instant Pot lawsuits yourself; they're all quite clear, and you can come to your own decision about how reasonable they are.

My read is that people overpowered the safety interlock, after which the lid (predictably) flew off, and they were injured (mostly by the hot steam and bits of food). I think it's ridiculous for people to expect safety mechanisms to be impossible to bypass, but maybe you disagree!

bsimpson

3 hours ago

Good

It seems clear that "autopilot" was a boisterous overclaim of its capabilities that led to people dying.

It may be minorly absurd to win founder-IPO-level wealth in a lawsuit, but it's also clear that smaller numbers don't act as an effective deterrent to people like Elon Musk.

ramses0

2 hours ago

I've always thought of it more as "Co-Pilot", but formally "Autopilot" might truly be the better definition (lane-keeping, distance-keeping), whereas a "co-pilot" (in aviation) implies more active control, i.e., pulling you up from a nose dive.

So... informally, "Tesla Co-Pilot" => "You're still the pilot but you have a helper", vs "Tesla Autopilot" => "Whelp, guess I can wash my hands and walk away b/c it's AuToMaTiC!"

...it's tough messaging for sure, especially putting these power tools into people's hands with no formal training required. Woulda-coulda-shoulda: similar to the 737 MAX crashes, should "pilots" of Teslas have required training in the safety and navigation systems before they were "licensed" to use them?

eYrKEC2

3 hours ago

Right! We demand engineering perfection! No autopilot until we guarantee it will NEVER kill a soul. Don't worry that human drivers kill humans all the time. The rubric is not better than a human driver, it is an Angelic Driver. Perfection is what we demand.

bsimpson

2 hours ago

Waymo drives better than most people.

Tesla Autopilot seems to mostly drive hubris. The fine print says you're still supposed to maintain control. They don't have sensors as sophisticated as their competitors' because Elon decreed "humans don't have LiDAR, so we don't need to pay for it."

Nobody is saying it has to be perfect, but Tesla hasn't demonstrated that it's even trying.

zadikian

an hour ago

I can see where they're coming from with the video-only concept, but even they admit it's not self-driving yet, so just don't call it self-driving (or "FSD**" or "autopilot") until it is.

tehjoker

3 hours ago

It's crazy that they weren't reined in by a regulator and it had to go all the way through the court system. People are dead. A court judgement can't change that. Preemptive action would have.

palmotea

4 hours ago

> Tesla also claimed that references to CEO Elon Musk’s statements about Autopilot during the trial misled the jury....

> The company essentially argued that references to Elon Musk’s own public claims about Autopilot, claims that Tesla actively used to sell the feature for years, were somehow unfair to present to a jury. Judge Bloom was right to reject that argument.

Of course, since Elon Musk has lied and over-promised a lot about Tesla's self-driving technology. It's an interesting defense to admit your CEO is a liar and can't be trusted.

standardUser

4 hours ago

I'm not clear on what Tesla is doing these days. They've been left in the dust on autonomous driving, they've failed to update their successful car models, and their only new model was a spectacular failure.

mey

4 hours ago

Ask the CEO? Based on recent incentives and acquisitions, are they planning to remain a car company?

palmotea

3 hours ago

>> I'm not clear on what Tesla is doing these days.

> Ask the CEO? Based on recent incentives and acquisitions, are they planning to remain a car company?

I believe Musk wants to hype humanoid robots, because he can't get away with irrationally hyping electric cars or self-driving technology like he used to.

Tesla was never a car company, their real product is sci-fi dreams.

blackjack_

3 hours ago

Agreed, and he’s already behind in humanoid robots, so the hype there won't last long. The problem is that China is obliterating him at every turn, because they actually build things that work instead of just hyping things and citing fake numbers for how much money it could be if every human on the planet bought 20.

Almondsetat

3 hours ago

By which metrics has Tesla been left in the dust wrt autonomous driving? Right now they are the only brand where you can buy a car and have it do basically 90% (or sometimes 100%) of your daily driving. Sure, it's supervised, but the alternatives are literally extremely geofenced taxis.

paxys

2 hours ago

> By which metrics has Tesla been left in the dust wrt autonomous driving

By the fact that they don't have autonomous driving. And this very judgement demonstrates that.

If you have to keep your full attention on the road at all times and constantly watch for the 10% of cases where the autopilot may spectacularly fail, that instantly turns off the vast majority of prospective users.

Funny enough, the tech that Musk's tweets and the Tesla hype machine have been promising for the last decade is actually on the streets today. It's just being rolled out by Waymo.

bigyabai

3 hours ago

It's not free, is it? You buy the car, subscribe to their arbitrarily-priced subscription service, and then it does 90% of your driving.

That's like paying for a "self-juicing juicer" that only works with proprietary juice packages sold through an overpriced subscription.

Edit: Mostly a criticism. I have no bone to pick with Elon, but subscription slopware is the reason why Chinese EVs are more desirable to average Joes like me.

WarmWash

3 hours ago

Tesla has a level 3 system that it's willing to gamble won't need intervention for a handful of miles, for a handful of Tesla fanboys. It's very telling that their "level 4" robotaxis are basically unicorns and only exist (existed? it's not clear they're even available anymore) in a single-neighborhood subsection of the level 3 robotaxis' full area in Austin.

Waymo on the other hand has a level 4 system, and has for many years, in many cities, with large service areas.

Tesla is unquestionably in the dust here, and the delusional, er, faithful are holding out for this mythical switch flip where Elon snaps his fingers and every Tesla turns into a level 4 robotaxi (despite the compute power in these cars being on the level of an RTX 5090, and the robotaxis having custom hardware loadouts).

Almondsetat

3 hours ago

I don't understand the point of your reply. Waymo is geofenced taxis. You cannot buy a Waymo. It cannot drive basically wherever you want. Teslas mostly can. So, again, how is Tesla the one left in the dust?

WarmWash

2 hours ago

Yes, Tesla leads on level 3 driving.

If you want to call this "autonomous" well then we are arguing semantics. But I think colloquially, autonomous means "no human".

user

3 hours ago

[deleted]

mr_00ff00

3 hours ago

Yet Tesla is trading near its all time high.

bryanlarsen

3 hours ago

And as the Lunar New Year demo dance shows, China is leaving them in the dust building humanoid robots.

m463

4 hours ago

My question too.

Though they did update the Model Y (looks like a duck), they just cancelled the Model S and X.

WarmWash

3 hours ago

Optimus robots!

In 2 years Tesla will be replacing most factory workers with fully autonomous robots that will do most of the work. This will generate trillions in revenue and is totally definitely trust me bro possible.

Expect huge updates on this coming in the near future, soon. Tesla will be the most valuable company on Earth. Get in the stock now.

(Cars, solar panels, energy storage, and robotaxis are no longer part of the roadmap because Optimus bots will bring in so much money in 2 years, definitely, so these things won't matter, so don't ask about them or think about them, thanks.)

slowmovintarget

2 hours ago

https://www.jdpower.com/business/press-releases/2026-us-elec...

Tesla Model 3 highest overall in owner satisfaction.

"Left in the dust?"

protimewaster

an hour ago

I don't understand how that's a retort against the claim that they've "been left in the dust on autonomous driving". Are you contending that autonomous driving is the only reason that Tesla owners would like their cars?

mrguyorama

33 minutes ago

J.D. Power is a company you pay to give you an award.

That's why Chevy has a bunch from them, including "Highest initial quality"

WalterBright

4 hours ago

[flagged]

palmotea

4 hours ago

> It's an absurd judgement.

> Consider also how many lives have been saved by the autopilot.

> Be careful what you wish for.

How many? Tell me.

breakyerself

4 hours ago

It saved my life. I was standing on the Golden gate bridge looking over the edge. A Tesla model 3 pulled over and started playing baby I need your lovin at full volume. I cried and climbed into the car and it turned on the seat heater.

1970-01-01

3 hours ago

There's legit dashcam video showing Autopilot preventing severe crashes. Go on YouTube instead of balking. Here are a few to get your algorithm working:

https://www.youtube.com/watch?v=A3K410O_9Nc

https://www.youtube.com/watch?v=Qy6SplEn4hQ

https://www.youtube.com/watch?v=GcfgIltPyOA

https://www.youtube.com/watch?v=Tu2N8f3nEYc

palmotea

3 hours ago

> There's legit dashcam video showing Autopilot preventing severe crashes. Go on YouTube instead of balking. Here's a few to get your algorithm working:

So? I just watched those. They don't prove anything about "lives [that] have been saved by the autopilot." They all look like scenarios a human driver could handle (and I, personally, have handled situations similar to some of those). If Autopilot is saving lives, you have to show, statistically, that it's better than human drivers in comparable conditions.

Also the last one appears to be of a Tesla fanboy who had just left a Tesla shareholder meeting, and seems pretty biased. I'd say his Cybertruck actually reacted pretty late to the danger. It was pretty obvious from the dashcam that something was wrong several seconds before the car reacted to it at the last second.

ethmarks

3 hours ago

I can't speak for Tesla's FSD specifically, but Waymo did a study on the collision rate of their autonomous cars compared to human drivers: https://waymo.com/safety/impact/. They found that Waymos get into about 81% fewer crashes per mile. Compared to a statistical human driver, Waymo prevented around 411 collisions that would have resulted in any injury, and 27 collisions that would have resulted in serious injury or death. It seems like for Waymo specifically, self-driving cars are demonstrably safer than human drivers. Not sure if that generalizes to Tesla FSD, though.
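For intuition on how a "collisions prevented" number falls out of a rate comparison, here is a back-of-envelope sketch; the baseline crash rate and fleet mileage below are made-up placeholders, not Waymo's actual figures:

    # How a "prevented collisions" estimate is typically derived:
    # expected crashes at the human rate minus observed crashes.
    human_rate_per_million_mi = 4.0   # placeholder baseline, not Waymo's data
    reduction = 0.81                  # the ~81% figure from the study
    miles_driven_millions = 50.0      # placeholder fleet mileage

    expected_human = human_rate_per_million_mi * miles_driven_millions  # 200
    observed = expected_human * (1 - reduction)                         # 38
    print(expected_human - observed)  # 162 crashes "prevented"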

palmotea

3 hours ago

> I can't speak for Tesla's FSD specifically, but Waymo did a study on the collision rate of their autonomous cars compared to human drivers: https://waymo.com/safety/impact/. They found that Waymos get into about 81% fewer crashes per mile.

I think that's true. Though I recall that Waymo limits their cars to safer, more easily handled conditions. That's totally the right thing to do, but it probably means the statistic needs an asterisk.

> Not sure if that generalizes to Tesla FSD, though.

I don't think it does. They're two totally different systems.

1970-01-01

3 hours ago

The last link is literally a man stating he could not handle the situation without FSD driving him. You're experiencing cognitive dissonance with the evidence.

palmotea

3 hours ago

> The last link is literally a man stating he could not handle the situation without FSD driving him. You're experiencing cognitive dissonance with the evidence.

And I have doubts about that man's reliability.

WalterBright

3 hours ago

Statistics aren't collected on that. But I've read anecdotes where individuals recounted the autopilot saving them from a severe accident.

You can also google "how many lives has tesla autopilot saved?" and the results suggest that the autopilot is safer than human drivers.

triceratops

3 hours ago

That doesn't make any sense. If I, a human, hit the brakes in time and avoid an accident today, then hit someone tomorrow, should I not be held responsible for the second incident?

WalterBright

3 hours ago

The point is to compare the deaths from not using the autopilot against the deaths caused by the autopilot.

If you accidentally kill someone with your car, do you think you should have to pay $243m?

triceratops

an hour ago

> If you accidentally kill someone with your car, do you think you should have to pay $243m?

It would be a large portion of my net worth for sure. Maybe also prison time. Can we put Autopilot or Tesla in a prison?

recursivecaveat

an hour ago

The judgement is only so high because of punitive damages for misleading marketing. Their actual liability for the collision itself is relatively low (and indeed the jury found them only 33% at fault).

palmotea

2 hours ago

> If you accidentally kill someone with your car, do you think you should have to pay $243m?

If you have billions of dollars, and somehow can't go to prison, yes you should. If not in compensation to the victim, then in some kind of fine scaled to wealth. The amount paid needs to be high enough to be a deterrent to the individual who did wrong.

WalterBright

2 hours ago

If this extreme judgement causes Tesla to abandon autodrive, there will be more deaths on the road. It just isn't rational.

Besides, nobody makes you turn on the autodrive.

breve

an hour ago

You haven't established that it has saved any lives, beyond vague, hand-waving anecdotes.

What is autodrive? Are you talking about basic autopilot, enhanced autopilot, or full self-driving? They are separate modes:

https://en.wikipedia.org/wiki/Tesla_Autopilot#Driving_featur...

Which revision of the hardware and software is the "good one"? Remember that Tesla claimed in 2016 that all Teslas in production "have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver". But that was, of course, a lie:

https://web.archive.org/web/20240730071548/https://tesla.com...

https://electrek.co/2025/10/22/tesla-changes-all-cars-have-s...

What Tesla used to claim was "full autonomy" is now called "Full Self-Driving (Supervised)", whatever that's supposed to mean. How many times has "Full Self-Driving (Supervised)" gone dangerously wrong but been stopped? How many times was supervision not enough:

https://electrek.co/2025/05/23/tesla-full-self-driving-veers...

Show me some concrete numbers to back your claims. If you can't do that then I think you've fallen victim to Tesla's dishonest marketing.

josefritzishere

4 hours ago

Tesla would benefit from the board replacing the CEO. It's increasingly clear that there is a problem, and it's not talent; it's decision-making.

dolphinscorpion

4 hours ago

Their stock would crash to $10 without the hype machine

mmooss

3 hours ago

Après moi, le déluge. ("After me, the flood.")

breakyerself

4 hours ago

It will be zero if they keep doing the same shit

pm90

3 hours ago

Ultimately, I believe it will take something catastrophic to oust Musk or change leadership. And by that point, it's questionable whether anything worthwhile will be left to salvage.

My current bet is that Optimus will fail spectacularly and Tesla gets left far behind as Rivian's R2 replaces it.

One thing I will note: I know folks who work at TSLA. Musk is more of a distraction. If he goes and competent leadership is brought in, there are still enough people and momentum to make something happen...

maxdo

3 hours ago

This is literally one of the 1-3 companies with a decent strategy in the age of AI; the rest are pretending the changes won't affect them. Even with this judgement: the guy decided to pick up his phone while driving a car not capable of red-light detection. It could have been any other car with similar auto-steer capabilities. Right now the same car, with OTA updates, would keep him alive. Sure, they are doing something wrong.

madeofpalk

3 hours ago

Will it actually? Has the market sent any signal that it won't tolerate Musk?

You’re a lot more optimistic about this than I am.

josefritzishere

an hour ago

Tesla has made some great cars, but their CEO is not making sound decisions. I really think a new CEO could turn Tesla around. It doesn't need to hit rock bottom first. Every major auto company has been through this.

DoesntMatter22

4 hours ago

This will continually be appealed until it’s reduced.

DannyBee

4 hours ago

They claim to have a pretrial agreement to reduce it to 3x compensatory damages (which would make the total judgement $160 million instead of $243 million).

Appealing is expensive because they have to post a bond with 100% collateral, and they pay for it yearly. In this case, probably around $8 million a year.

So in general it's not worth appealing for 5 years unless they think they will knock 25-30% off the judgement.
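Back-of-envelope on that, treating the ~$8M/year bond cost and the $160M post-cap figure above as given (they're the parent comment's estimates, not actual court figures):

    # Sketch of the appeal math described above; all inputs are the
    # comment's estimates, not actual court figures.
    judgment = 160e6           # claimed total after the 3x-compensatory cap
    bond_cost_per_year = 8e6   # estimated yearly cost of the appeal bond
    years_on_appeal = 5

    total_bond_cost = bond_cost_per_year * years_on_appeal  # $40M over 5 years
    breakeven = total_bond_cost / judgment
    print(f"{breakeven:.0%}")  # 25%: the appeal only pays off above roughly this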

Here it's the first case of its kind, so I'm sure they will appeal, but if they lose those appeals, most companies that aren't insane would cut their losses instead of trying to fight everything.

LeoPanthera

4 hours ago

This was the appeal.

DannyBee

4 hours ago

No it wasn't, it was a motion to set aside the verdict, made before the trial judge.

The appeal will go to the 11th circuit.

tiahura

4 hours ago

No it wasn't. This was the trial judge deciding not to reduce it. $43 million in compensatory damages is unusually high for a wrongful death.

jeffbee

4 hours ago

$43 million does not seem spectacularly high compensation for killing someone at the age of 22.

dekhn

4 hours ago

When my spouse worked in the area of determining "the value of an individual" (economically, not morally), it was computed as the present value of lifetime earnings: the cumulative income of the individual, discounted back to its current value (using some sort of inflation model). IIRC, the PVLE averaged out to about $1-10M.
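The computation itself is simple discounting. A rough sketch, where the flat earnings profile is an illustrative assumption and the 3% rate mirrors the study quoted downthread:

    # Present value of lifetime earnings (PVLE): each future year's
    # income discounted back to today. Toy inputs, not a real model.
    def pvle(annual_earnings: list[float], discount_rate: float = 0.03) -> float:
        return sum(e / (1 + discount_rate) ** t
                   for t, e in enumerate(annual_earnings, start=1))

    # 45 working years at a flat $60k/year, discounted at 3%:
    print(round(pvle([60_000.0] * 45)))  # ~$1.47M, inside the $1-10M range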

nradov

4 hours ago

You shouldn't be down voted. Regardless of the moral or technical issues involved, there are established formulas used to calculate damages in wrongful death civil suits. Your range is generally correct although certain factors can push it higher. (Punitive damages are a separate issue.)

jeffbee

3 hours ago

There are not "established formulas", or, to the extent that there are, the coefficients and exponents are not settled. The parties always argue about the discount rates and whatnot.

dekhn

3 hours ago

Sure, no argument there, I was just referring to research like this: https://escholarship.org/uc/item/82d0550k

"""Results. At a discount rate of 3 percent, males and females aged 20-24 have the highest PVLE — $1,517,045 and $1,085,188 respectively. Lifetime earnings for men are higher than for women. Higher discount rates yield lower values at all ages."""

dekhn

3 hours ago

I generally don't complain about being downvoted, but it is always puzzling when I post a neutral fact without any judgement.

EastSmith

22 minutes ago

Why the f was this downvoted? Literally from the article:

> Tesla has indicated it will appeal the verdict to a higher court.

Hamuko

3 hours ago

Does the American legal system have infinite appeals?

maxdo

3 hours ago

I'm so lost. The guy decided to pick up the phone from the floor while driving the car at high speed.

1. It could have been ANY car with similar (at the time) auto-steer capabilities.

2. Why the hate? Because of some false promise? Because as of today, the same car would save the guy in the exact same situation, since FSD now handles red lights perfectly. Far better and safer vs ANY other tech included in the average car price of the same segment ($40-50k).

madsmith

3 hours ago

Not sure if it’s using the same FSD decision matrix, but my Model S chimed at me to drive into the intersection while sitting at a red light last night, with absolutely zero possibility that it saw a green light anywhere in the intersection.

"Perfectly" isn’t a descriptor I would use. But this is just anecdotal.

BugsJustFindMe

3 hours ago

> Why the hate? Because of some false promise?

Another name for "false promise" when made for capital gain is "fraud". And when the fraud is in the context of vehicular autonomy, it becomes "fraud with reckless endangerment". And when it leads to someone's death, that makes it "proximate cause to manslaughter".

SpicyLemonZest

2 hours ago

As the source article says, the jury did agree that the driver was mostly liable. They found Tesla partially liable because they felt that Tesla's false promise led to the driver picking up his phone. If they'd been more honest about the limitations of their Autopilot system, as other companies are about their assisted driving functionalities, the driver might have realized that he needed to stop the car before picking up his phone.

user

3 hours ago

[deleted]