krisoft
3 months ago
To be honest, I think this is one of the strengths of autonomous cars.
With humans, when they do this, at most we can punish that individual. To increase population-wide compliance we can run a safety awareness campaign, ramp up enforcement, or ramp up the fines. But all of these cost a lot of money, take a while to have an effect, need to be repeated or kept up, and only help statistically.
With a robot driver we can develop a fix and roll it out on all of them. Problem solved. They were doing the wrong thing, now they are doing the right thing. If we add a regression test we can even make sure that the problem won't be reintroduced in the future. Try to do that with human drivers.
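To make the regression-test idea concrete, here's a minimal sketch in Python. The scenario type, the planner rule, and the test are all hypothetical illustrations, not anyone's actual stack:

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    school_bus_ahead: bool
    stop_arm_deployed: bool

def plan_action(scene: Scenario) -> str:
    # Hypothetical planner rule: remain stopped while a school bus
    # has its stop arm deployed; otherwise proceed.
    if scene.school_bus_ahead and scene.stop_arm_deployed:
        return "stop"
    return "proceed"

def test_never_passes_stopped_school_bus():
    # Regression test pinning the fixed behaviour so it can't
    # silently regress in a future release.
    assert plan_action(Scenario(True, True)) == "stop"
    assert plan_action(Scenario(True, False)) == "proceed"
```

Once a test like this is in the fleet's CI, every future software version is checked against the scenario before it ships — the "adjust all drivers' heads at once" step that has no human equivalent.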
themafia
3 months ago
> we can develop a fix and roll it out on all of them.
You have to know what you're fixing first. You're going to write a lot of code in blood this way.
It's not that people are particularly bad at driving; it's that the road is exceptionally dynamic, with many different users and use cases all trying to operate in a synchronized fashion, with a dash of strong regulation sprinkled in.
krisoft
3 months ago
> You have to know what you're fixing first.
In this case the expected behaviour is clearly spelled out in the law.
> You're going to write a lot of code in blood this way.
Do note that in this case nobody died or got hurt. People observed that the autonomous vehicles did not follow the rules, the company got notified of this fact and they are working on a fix. No blood was spilled to achieve this result.
Also note that we spill much blood on our roads already. And we do that without much hope of learning from individual accidents. When George runs over John there is no way to turn that into a lesson for all drivers. There is no way to understand what went wrong in George's head, and then no way to adjust all drivers' heads so that particular problem won't happen again.
jfoster
3 months ago
And "much blood" is (globally) to the tune of ~1.2 million lives lost, and many more injuries.
Compared to that, autonomous vehicles have barely harmed anyone. Also they will probably save most of those lives once they become good.
The "least harm" approach is to scale autonomous vehicles as quickly as possible even if they do have accidents sometimes.
Earw0rm
3 months ago
That's true at least once their collision rate per driver-mile drops below that of human drivers under equivalent conditions.
It seems like we're pretty close to that point, but the numbers need to be treated with care for various reasons. (Robotaxis aren't dealing with the same proportions of conditions - city vs suburban vs freeway - and we should probably exclude collisions caused by human bad-actors which should have fallen within the remit of law enforcement - drink/drugs, grossly excessive speed and so on).
sdenton4
3 months ago
Why should we exclude the cases of human bad-actors? That's explicitly a major case solved by getting rid of the human behind the wheel...
manwe150
3 months ago
At least some of them will likely still occur, as those people may decide to override the robot driver's safer choices to save 30 seconds or have fun.
inglor_cz
3 months ago
This is a tradeoff, in which the original case might have been the less dangerous one.
Autonomous fleets have a major potential flaw too, in form of a malicious hacker gaining control over multiple vehicles at once and wreaking havoc.
Imagine if every model XY suddenly got a malicious OTA update and started actively chasing pedestrians.
sdenton4
3 months ago
Hm, so you would put a hypothetical scenario on the same footing as thousands of actual deaths caused by drunk drivers each year? 30% of US road fatalities involve a drunk driver each year...
I seriously doubt that the "mass takeover and murder" scenario would ever actually happen, and further doubt that it would cause anywhere near 10k deaths if it did occur.
inglor_cz
3 months ago
"I seriously doubt that the "mass takeover and murder" scenario would ever actually happen"
OK, so you are optimistic. My own specialization is encryption/security, so I am not. State actors can do such things too, and we've already had a small wave of conventional physical-world sabotage in Europe that everyone suspects Russia of.
"further doubt that it would cause anywhere near 10k deaths"
This is something I can agree upon, but you have to take into account that human societies don't work on a purely arithmetic/statistical basis. Mass casualty events have their own political and cultural gravitas, doubly so if they were intentional.
The sinking of the Titanic shocked the whole world, and it is still a frequent subject for artists 100 years later, even though 1500 deaths aren't objectively that many. I don't doubt that far more than 1500 people drowned in individual accidents worldwide in April 1912 alone, but the general public didn't care about those deaths.
And a terrorist attack with merely 3000 dead put the US on a war footing for more than a decade and made it spend a trillion dollars on military campaigns, even though drunk American drivers manage the same carnage in five months or so.
Earw0rm
3 months ago
Because the baseline of human-operated safety is "get law enforcement to do their job of getting rid of the bad actors."
sdenton4
3 months ago
Why is that the baseline? Actual human performance as it exists today gives us tens of thousands of road fatalities per year in the US. We have not solved that problem despite decades of opportunity to introduce regulations and enforcement. Getting rid of human drivers looks like a very promising way forward.
Earw0rm
3 months ago
Because failure to solve it is political and intellectual laziness and cowardice.
It's like jets falling out of the sky because the guy who bolts the wings on is only half doing his job; we can all see it and know about it, and yet... nobody wants to speak up.
rightbyte
3 months ago
I don't think we are better off putting Elon Musk behind every wheel.
TOMDM
3 months ago
Good thing no one is suggesting that
rightbyte
3 months ago
I was a bit hyperbolic, but having Teslas steer by wire with remote code execution is close enough to an Elon Musk behind every wheel. What was the name of the movie, "Leave the World Behind"?
harperlee
3 months ago
Not sure about a movie but that reminded me of the "Driver" short story in the "Valuable Humans In Transit and Other Stories" tome by QNTM (https://qntm.org/vhitaos).
I'd recommend to buy the book, but here's an early draft of that particular story:
Earw0rm
3 months ago
There are ways, but our individualistic, consumerist, convenience-first society is reluctant to implement them - as, same as gun control, they're incompatible with certain notions of freedom.
MindSpunk
3 months ago
> You have to know what you're fixing first. You're going to write a lot of code in blood this way.
This is exactly how the aviation industry works, and it's one of the safest ways to travel in the world. Autonomous driving enables 'identify problem -> widely deployed and followed solutions' in a way human drivers just can't. Things won't be perfect at first but there's an upper limit on safety with human drivers that autonomous driving is capable of reaching past.
It's tragic, but people die on roads every day, all that changes is accountability gets muddier and there's a chance things might improve every time something goes wrong.
ninalanyon
3 months ago
But other countries have far fewer accidents than the US, so it isn't quite so black and white. The gain from autonomous vehicles will be much smaller in the UK, for instance.
If you really want to reduce accident rates you need to improve road design and encourage more use of public transport and cycling. This requires no new vehicles, no new software, no driver training, and doesn't need autonomous vehicles at all.
fendy3002
3 months ago
But you still don't have autonomous flying, even though the case is much simpler than driving: take off, ascend, cruise, land.
It isn't easy to fix autonomous driving, and not because the problem isn't identified. Sometimes two conflicting scenarios can happen on the road such that no matter how good the autonomous system is, it won't be enough.
Though I agree that having a different kind of human instead would not make it any safer.
inetknght
3 months ago
> But you still don't have autonomous flying, even though the case is much simpler than driving: take off, ascend, cruise, land.
Flying is actually a lot more complicated than just driving. When you're driving you can "just come to a stop". When you're flying... you can't. And a hell of a lot can go wrong.
In any case, we do have autonomous flying. They're called drones. There are even prototypes that ferry humans around.
Dylan16807
3 months ago
Being unable to abort a flight with a moment's notice does add complication, but not so much that flying is "a lot more complicated" than driving. The baseline for cars is very hard. And cars also face significant trouble when stopping. A hell of a lot can go wrong with either.
JumpCrisscross
3 months ago
> When you're driving you can "just come to a stop". When you're flying... you can't
Would note that this is the same issue that made autonomous freeway driving so difficult.
When we solve one, we'll solve the other. And it increasingly looks like they'll both be solved in the next half decade.
fendy3002
3 months ago
A bit unclear from my statement before, but that's the point. Something that feels easy is actually much more complicated than that: weather, runway condition, plane condition, wind speed/direction, ongoing incidents at the airport, etc. Managing all those scenarios is not easy.
Similar things also apply in driving, especially with obstacles and emergencies: floods, the recent sinkhole in Bangkok, etc.
Spooky23
3 months ago
Flying is the “easy” part. There’s a lot more wood behind the arrow for a safe flight. The pilot is (an important) part of an integrated system. The aviation industry looks at everything from the pilot to the supplier of lightbulbs.
With a car, deferred or shoddy maintenance is highly probable and low impact. With an aircraft, if a mechanic torques a bolt wrong, 400 people are dead.
Nextgrid
3 months ago
At least one reason for intentionally not having fully autonomous flying is that you want the human pilots to keep their skills sharp (so they are available in case of an emergency).
gazook89
3 months ago
Also, humans will intentionally act counter to regulations just to be contrarian or send a message. Look at “rolling coal”, or people who race through speed meters to see if they can get a big number. Or recently near me they converted a lane into a dedicated bus lane, which is now a “drive fast to pass every rule follower” lane.
Earw0rm
3 months ago
For some reason law enforcement seems particularly reluctant to deal with this kind of overt dumbfuckery when it involves automobiles.
If you try something equivalent with building regs or tax authorities, they will come for you. Presumably because the coal-rolling dumbasses are drawn from the same social milieu as cops.
heavyset_go
3 months ago
Planes maintain vertical and lateral separation away from literally everything. Autonomy is easier in relatively controlled environments, navigating streets is more unlike flying than it is similar.
jefftk
3 months ago
> You're going to write a lot of code in blood this way.
Waymo has been doing a lot of driving without any blood. They seem to be using a combination of (a) learning a lot from close calls like this one, where no one was hurt even though it still behaved incorrectly, and (b) being cautious, so that even when it does something it shouldn't, the risk is very low because it's moving slowly.
warkdarrior
3 months ago
Waymo operates in San Francisco, Phoenix, Los Angeles, Austin, and Atlanta, so I am sure they have encountered school buses by now and learned from those encounters.
themafia
3 months ago
Waymo operates in a very limited scope and area. I would not attempt to extrapolate anything from their current performance.
andoando
3 months ago
Very limited scope and area is now the whole of a few major cities.
https://support.google.com/waymo/answer/9059119?authuser=1
This is actually the one technology I am excited about. Especially with the Zoox/minibus/carpool model, I can see these things replacing personal cars entirely, which is going to be a godsend for cost, safety and traffic.
kelnos
3 months ago
I absolutely would, since operating in a slowly growing limited scope and area is a part of the safety strategy.
seanmcdirmid
3 months ago
> Waymo operates in a very limited scope and area. I would not attempt to extrapolate anything from their current performance.
This is less and less true every year. Yes, it doesn't drive in the snow yet; no, I don't drive in the snow either. I'm ok with that.
mastax
3 months ago
If you were trying to evaluate that code deployed willy nilly in the wider world, sure. But that code exists within a framework which is deliberately limiting rollout in order to reduce risk. What matters is the performance of the combined code and risk management framework, which has proven to be quite good.
Airbus A320s wouldn’t be very safe if we let Joe Schmo off the street fly them however he likes, but we don’t. An A320 piloted within a regulated commercial aviation regime is very safe.
What matters is the safety of the entire system including the non-technological parts.
dmix
3 months ago
I'm just curious to see how they handle highways more broadly, which is where the real danger is and where Tesla got in trouble in the early days. Waymo avoided doing that until late last year, and even then it's a very controlled freeway test in Phoenix, not random highways.
https://waymo.com/blog/2024/01/from-surface-streets-to-freew...
lclarkmichalek
3 months ago
Highways are pretty safe. The road is designed from start to finish to minimise the harm from collisions. That’s not true of urban streets
thechao
3 months ago
They drive on highways here, in Austin, all the time. They do just fine. My kids love to wave to Waymo.
programjames
3 months ago
The human traffic code is also written in blood. But humans are worse at applying the patch universally.
hamdingers
3 months ago
We don't even try. In the US you demonstrate that you know the rules at one point in time and that's it, as long as you never get a DUI you're good.
For instance, the 2003 California Driver's Handbook[1] first introduced the concept of "bike lanes" to driver education, but contains the advice "You may park in the bike lane unless signs say “NO PARKING.”" which is now illegal. Anyone who took their test in the early 2000s is likely unaware that changed.
It also lacks any instruction whatsoever on common modern roadway features like roundabouts or shark teeth yield lines, but we still consider drivers who only ever studied this book over 20 years ago to be qualified on modern roads.
1. https://dn720706.ca.archive.org/0/items/B-001-001-944/B-001-...
Natsu
3 months ago
Some places will dismiss a traffic ticket if you attend a driver's education class to get updates, though you can only do this once every few years. So at least there have been some attempts to get people to update their learning.
hamdingers
3 months ago
This only happens if you get a traffic ticket, which is rare and getting rarer.
Ironically this means the people with the cleanest driving record are least likely to know the current ruleset.
Detrytus
3 months ago
Which, ironically, would mean that knowing the current rule set is not needed to drive safe.
hamdingers
3 months ago
Not getting tickets does not mean you are a safe driver. No amount of crashing results in traffic school, just certain kinds of tickets.
dragonwriter
3 months ago
> No amount of crashing results in traffic school, just certain kinds of tickets.
Well, sufficient at-fault crashing will suspend your license, and among the requirements for restoring the license may be traffic school, DUI school, or some other program depending on the reason for suspension, so this is not strictly correct. You can't use optional voluntary traffic school to clear points from a collision from your record BEFORE getting a suspension, the way you can with minor moving violations without a collision, but that doesn’t mean collisions won’t force you into traffic school.
smcin
3 months ago
Which states/counties/cities? IME that rarely happens; tickets are often used for revenue-raising. And some recent laws, e.g. the 2008, 2009, and 2025 CA cellphone-use laws, cannot be discharged by traffic school, AFAIK.
Natsu
3 months ago
Phoenix, Arizona
kelnos
3 months ago
> Anyone who took their test in the early 2000s is likely unaware that changed.
That's silly. People become aware of new laws all the time without having to attend a training course or read an updated handbook.
I took the CA driver's written test for the first time in 2004 when I moved here from another state. I don't recall whether or not there was anything in the handbook about bike lanes, but I certainly found out independently when it became illegal to park in one.
kevincox
3 months ago
I don't doubt that many people are aware of many of the new laws. But I strongly suspect that a very significant number of drivers are unaware of many new laws.
kelnos
3 months ago
> You're going to write a lot of code in blood this way.
Maybe? In this particular case, it sounds like no one was injured, and even though the Waymos didn't follow the law around stopping for school buses, they exercised care when passing them. Not great, certainly! But I'd wager a hell of a lot better than a human driver intentionally performing the same violation. And presumably the problem will be fixed with the next update to the cars' software. So... fixed, and no blood.
seanmcdirmid
3 months ago
I haven't dealt with a school bus in... maybe 20 years, and it would definitely be an exception if I had to deal with one tomorrow. I kind of know what I should do, but it isn't instinct at this point.
A Waymo, even if it drove for 20 years in urban Seattle where school buses aren't common, would know what to do if presented with the exception tomorrow (assuming it was trained/programmed correctly); it wouldn't forget.
kiba
3 months ago
Road designs play an important role as well, it's not just enforcing the law.
Some roads are going to be safer simply because drivers don't feel safe driving fast. Others are safer simply because there are fewer opportunities to get into a collision.
Wide streets in cities encourage faster driving, which doesn't really save much time while making the streets more dangerous, for example.
quickthrowman
3 months ago
After seeing how road design calmed traffic speeds in multiple areas in my city, by reducing lanes and using medians that ‘narrow’ the apparent width of the roadway, and in some places just with road paint (it’s weird but it works), I believe that road design is the only way to control how fast people drive. Speed limits are useless for controlling speed; people drive the speed the road is designed for.
stuaxo
3 months ago
Most US roads do not have pedestrian safety in mind at all.
trollbridge
3 months ago
… assuming the GiantCorp running the robotaxis cares about complying with the law, and doesn’t just pay a fine that means nothing to them.
bluGill
3 months ago
The first fines should be meaningless to the company. If the issue isn't fixed the fines should get higher and higher. If the company fixes one issue but there is a second discovered quickly we should assume they don't care about safety and the second issue should have a higher fine than the first even though it is unrelated.
Companies (and people) have an obligation to do the right thing.
AlotOfReading
3 months ago
What do you mean by "second issue"? A second instance of the same underlying problem, or a different underlying problem? The way you phrase it as unrelated suggests the latter to me.
It's pretty wild to jump straight to "they don't care about safety" here. Building a perfect system without real world testing is impossible, for exactly the same reason it's impossible to write bug-free code on the first try. That's not a suggestion to be lax, just that we need to be realistic about what's achievable if we agree that some form of this technology could be beneficial.
bluGill
3 months ago
The courts get to decide that. Often it is an "I know it when I see it". The real question is: did they do enough to fix all possible safety issues before this new, different one happened? If they did "enough" (something I'm not defining!) then they can start over.
lawlessone
3 months ago
>The first fines should be meaningless to the company.
Why?
platevoltage
3 months ago
Fines are completely useless if they are small enough to be considered "the price of doing business".
lawlessone
3 months ago
I agree.
hyghjiyhu
3 months ago
The goal should be to make them appropriately cautious. Not careless but also not paralyzed by fear. Escalating fines have the property that they are self-tuning. They basically say "go ahead and try it! But if there are issues you have to fix them promptly"
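As a toy illustration of the self-tuning property (the base amount and growth factor are made up, not any real penalty schedule):

```python
def escalating_fine(base: float, prior_offenses: int, factor: float = 2.0) -> float:
    # Each repeat of an unfixed issue multiplies the fine, so a first
    # incident stays cheap while the cost of ignoring the problem
    # grows geometrically.
    return base * factor ** prior_offenses
```

A company that fixes issues promptly only ever pays the small base fine; one that lets the same violation recur watches `escalating_fine(10_000, 3)` reach 80,000 and keep doubling from there.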
Dylan16807
3 months ago
Because 100 million dollars isn't a reasonable fee for a traffic violation.
xbar
3 months ago
Waymo seems more interested in delivering a true solution than I have seen elsewhere.
whimsicalism
3 months ago
the discourse around “corporations” has gotten absolutely ridiculous at this point, especially on this website.
d4mi3n
3 months ago
It’s not an unreasonable take given historic behavior. Rather than decrying the cynicism, what steps can we take to ensure companies like Tesla/Waymo/etc are held accountable and incentivized to prioritize safety?
Do we need harsher fines? Give auto regulators as much teeth as the FAA used to have during accident investigations?
Genuinely curious to see how addressing reasonable concerns in these areas can be done.
terminalshort
3 months ago
Why isn't allowing people to sue when they get hurt and general bad PR around safety enough? Did you see what happened to Boeing's stock price after those 737 crashes?
d4mi3n
3 months ago
I’d counter that with the Equifax breach that raised thei stock prices when it became clear they weren’t being fined into oblivion. Suing is also generally only a realistic option if you have money for a lawyer.
trollbridge
3 months ago
Right. We have a precedent for how to have a ridiculously safe transportation system: accidents are investigated by the NTSB, and every accident is treated as an opportunity to make sure that particular failure never happens again.
dmix
3 months ago
South Park had a good satire on this sort of generic anti-corporation comment. paraphrasing
"Corporations are bad"
"Why?"
"Because, you know, they act all corporate-y."
https://www.tiktok.com/@plutotvuk/video/7311643257383963937 (sorry, Google's first result was TikTok)
varenc
3 months ago
It's ironic given this forum began as a place for aspiring startup (Delaware C-Corp) founders.
paganel
3 months ago
Some of us came here because we found programming.reddit.com too mainstream (after all, this thing was written in Arc! of which almost no one knew any details for sure, but it was Lisp, so it was cool); we certainly weren't visiting this place in order to become millionaires.
Even though I agree, there was a time and a place (I'd say 2008-2010) when this forum was mostly populated by "I want to get rich!" people; maybe that is still the case and they've only learned to hide it better, I wouldn't know.
saalweachter
3 months ago
I feel like crypto has absorbed a lot of the "I want to get rich!", so that instead of posts about "Look at my burrito-as-a-service React app!" it's all "Invest in my Burritocoin!", which all kind of fades into the background like the ad banners our eyes pass over without seeing them.
sagarm
3 months ago
Corporations are bad because the diffusion of responsibility and legal liability shield turns them, to a first approximation, into amoral paperclip maximizers that will employ slave labor to save nickels and dimes.
The invariably sociopathic leadership is a symptom, not a cause: they're the least encumbered by ethics, and the best fit for corporate leadership.
The cure is a strong regulatory state.
terminalshort
3 months ago
It really has turned into a bitter losers bitch fest in here
paganel
3 months ago
I agree; many of the people here are still way too lenient when it comes to big corps.
renewiltord
3 months ago
We feared the advent of LLMs since they could be used as convincing spam tools. Little did we know that humans would often do the same.
daseiner1
3 months ago
> a fine that means nothing to them
Yes, this is often the case. In this instance, though, endangering children is just about the worst PR possible. That's strong leverage.
tanseydavid
3 months ago
This^^^ -- the impact of positive vs negative PR is unusually huge with this type of tech.
tgv
3 months ago
> With a robot driver we can develop a fix and roll it out on all of them. Problem solved.
I find that extremely optimistic. It's almost as if you've never developed software.
I am curious about Waymo's testing. Even "adding a regression test" can't be simple. There is no well defined set of conditions and outputs.
> Try to do that with human drivers.
At least where I live, the number of cars and car-based trips keeps increasing, but the number of traffic deaths keeps falling.
krisoft
3 months ago
> It's almost as if you've never developed software.
I do develop software. In fact I do develop self driving car software.
Yes, it is not easy. Just talking about this particular case: are the cars not remaining stationary because the legally prescribed behaviour is not coded down? Or are they going around school buses because the "is_school_bus" classifier or the "is_stop_arm_deployed" classifier has false-negative issues? If we fix/implement those classifiers, will we see issues caused by false positives? Will we cause issues where the vehicles suddenly stop when they think they see a stop arm but there isn't one? Will we cause issues if a bus deploys a stop arm as we are overtaking it? What about if it deploys the stop arm while we are 10 meters behind? 20? 30? 40? 100?
And that's just one feature. How does this feature interact with other features? Will we block emergency vehicles sometimes? What should we do if a police officer is signalling us to proceed, but the school bus's stop arm is stopping us? If we add this one more classifier, will the GPU run out of VRAM? Will we cause thread thrashing? Surely not, unless we implement it wrong. In which case, definitely. Did we implement it right? Do we have enough labeled data about school bus stop arms? Is our sensor resolution good enough to see them from far enough away? Even in darkness? What about fog? Or blinding light? Does every state/country use the same rules about school buses?
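One standard way to keep a flickering detector from causing the sudden-stop/sudden-go oscillation described above is hysteresis on the classifier confidence. A toy sketch — the function name and thresholds are invented for illustration, not anyone's production values:

```python
def should_stop_for_bus(confidence: float,
                        currently_stopped: bool,
                        enter_threshold: float = 0.7,
                        exit_threshold: float = 0.3) -> bool:
    """Hysteresis gate over a stop-arm classifier's confidence score.

    Requires high confidence to *start* stopping, but only low
    confidence to *keep* stopping, so a detection that flickers
    around one threshold doesn't make the vehicle lurch between
    braking and proceeding.
    """
    if currently_stopped:
        # Already stopped: stay stopped unless confidence drops low.
        return confidence > exit_threshold
    # Not stopped yet: demand strong evidence before braking.
    return confidence > enter_threshold
```

Note the asymmetry: a borderline score of 0.5 keeps an already-stopped vehicle stopped, but won't trigger a fresh stop — exactly the kind of false-positive/false-negative trade-off the questions above are probing.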
> I am curious about Waymo's testing
They do publish a lot. This one is a nice overview, though not too technical: https://downloads.ctfassets.net/sv23gofxcuiz/4gZ7ZUxd4SRj1D1...
Or if you want more juicy details read their papers: https://waymo.com/safety/research/
Spooky23
3 months ago
I disagree about the fixing, because ultimately self-driving services will have the political power to cap their liability. Once they dial in the costs and become scaled, self-sustaining operations, the incentive will be reduced opex.
I think the net improvements will come from the quantitative aspect of lots and lots of video. We don’t have good facts about these friction points on the road, and we rely on anecdotal information, police data (which sucks), and time/motion-style studies.
JumpCrisscross
3 months ago
> ultimately self driving services will have political power to cap their liability
You're fighting an objectively safer future on the basis of a hypothetical?
Also, we already have capped liability with driving: uninsured and underinsured drivers.
Spooky23
3 months ago
The real cap is that the operator is ultimately accountable.
When a software defect kills a bunch of people, the robot operator’s owners will be subject to a far lower level of liability. Airlines have international treaties that do this.
An objectively safer future is common carriers operating mass transit. Robot taxis will create a monster that prices out private ownership in the long term. "Objectively safer" remains to be seen, and will require a nationwide government regulatory body that won’t exist for many years.
JumpCrisscross
3 months ago
> real cap is the operator ultimately is accountable
Which is in practice lower than what a large operator would pay, particularly if they also write the software.
> will require a nationwide government regulatory body
It doesn’t require any such thing. That would be nice. But states are more than capable of regulating their roads.
Spooky23
3 months ago
Spoken by someone without knowledge of motor carrier regulation. That’s not a dig, few people are.
States are incredibly bad at regulating commercial entities. The Federal DOT contracts with a few universities (or at least it did) to use evidence-based sampling and enforcement, fulfilled by state authorities for trucks and buses in their scope. Only states like California, New York, and Texas would have the resources to do it, and it would be really difficult to do anything effective when there are 53 or more flavors.
bobthepanda
3 months ago
even if we had good data, the major problem in the US is that the funding liabilities of transportation agencies generally massively outweighs revenues, particularly if legislators keep earmarking already limited funds for yet more road expansion in their districts.
1970-01-01
3 months ago
It's a strength if you catch the bug and fix it before it injures anyone. If anything, this proves edge-cases can take years to manifest.
tehjoker
3 months ago
Why accept the company's say so without any proof being offered or even a description of the fix? If it's been years and this kind of thing, described in regulations so clearly some attention was paid by engineers, still happens, then maybe fixing it isn't trivial.
kelnos
3 months ago
Sure, perhaps we shouldn't accept the company's say-so, but this seems like a fairly easy thing for a third party to verify. If that's not being done, that's not Waymo's fault; lobby the local regulatory body or legislature to get that sort of thing required.
tehjoker
3 months ago
Maybe verifying isn't trivial either? Sometimes bugs only appear with a lot of interactions.
heavyset_go
3 months ago
Unless there is one car that everyone drives, it will never be this easy.
And if there is one car that everyone drives, it's equally easy for a single bug to harm people on a scale that's inconceivable to me.
goobatrooba
3 months ago
Maps and routing errors would likely lead to mass deaths — an entire motorway's worth of vehicles rather than individuals not paying attention.
Like the various "unfinished/broken bridge" deaths that have happened with Google Maps involved (not saying it's to blame... but certainly not innocent either): https://www.bbc.com/news/world-us-canada-66873982 https://www.bbc.com/news/articles/cly23yknjy9o
manwe150
3 months ago
Granted, having all the warning signs and barricades removed by vandals seems like the bigger issue there; drivers usually do pay attention to those.
falcor84
3 months ago
Well, maybe not "this easy", but if we can all agree on an extensive test suite that all autonomous cars have to follow to be allowed on the road, it'd be almost like that, without the risk of a single bug taking down all of them.
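A shared conformance suite could be as simple as scenario/required-action pairs run against each vendor's planner. A toy sketch — the scenarios and the planner-as-a-function interface are invented for illustration, not an actual regulatory format:

```python
# Each entry pairs an input scenario with the action the law requires.
CONFORMANCE_SUITE = [
    ({"school_bus_ahead": True, "stop_arm_deployed": True}, "stop"),
    ({"school_bus_ahead": True, "stop_arm_deployed": False}, "proceed"),
    ({"school_bus_ahead": False, "stop_arm_deployed": False}, "proceed"),
]

def passes_suite(planner) -> bool:
    # A vendor's planner is admitted to the road only if it produces
    # the required action for every scenario in the shared suite.
    return all(planner(scene) == expected
               for scene, expected in CONFORMANCE_SUITE)
```

Because each vendor ships its own independent implementation against the common suite, a bug in one codebase doesn't propagate to the rest of the fleet — which is the point of the comment above.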
dangus
3 months ago
As a counterpoint, a large fine or jail time as a deterrent actually has meaning to an individual.
For a company, it's a financial calculation.
https://en.wikipedia.org/wiki/Grimshaw_v._Ford_Motor_Co.
(Add the period to the end of the link, HN won't do it)
userbinator
3 months ago
What a dystopian view.