To be honest, I think this is one of the strengths of autonomous cars.
With humans, when one of them does this, at most we can punish that individual. To increase population-wide compliance we can run a safety awareness campaign, ramp up enforcement, or raise the fines. But all of these cost a lot of money, take a while to have an effect, need to be repeated or kept up, and only help statistically.
With a robot driver we can develop a fix and roll it out on all of them. Problem solved. They were doing the wrong thing, now they are doing the right thing. If we add a regression test we can even make sure that the problem won't be reintroduced in the future. Try to do that with human drivers.
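A minimal sketch of what such a regression test could look like. The function name and its signature are hypothetical stand-ins, not anything from a real driving stack:

```python
# Hypothetical regression test: once the school-bus bug is fixed,
# this test keeps the fix from being silently reintroduced.

def should_stop_for_school_bus(red_lights_flashing, stop_arm_deployed):
    """Toy stand-in for the real driving policy's decision."""
    return red_lights_flashing or stop_arm_deployed

def test_stops_for_flashing_school_bus():
    # The fixed behavior: stop if either signal is active.
    assert should_stop_for_school_bus(True, True)
    assert should_stop_for_school_bus(True, False)
    assert should_stop_for_school_bus(False, True)

test_stops_for_flashing_school_bus()
```

The real policy is of course vastly more complex, but the point stands: the fixed behavior becomes a permanent, automatically checked requirement.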
… assuming the GiantCorp running the robotaxis cares about complying with the law, and doesn’t just pay a fine that means nothing to them.
The first fines should be meaningless to the company. If the issue isn't fixed the fines should get higher and higher. If the company fixes one issue but there is a second discovered quickly we should assume they don't care about safety and the second issue should have a higher fine than the first even though it is unrelated.
Companies (and people) have an obligation to do the right thing.
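A toy sketch of that escalation schedule. The base amount, the doubling per repeat, and the 10x jump per prior distinct issue are all my own assumptions, not from any actual regulation:

```python
def next_fine(base, repeats_of_this_issue, prior_distinct_issues):
    """Fine grows while the same issue stays unfixed, and each new
    distinct issue starts above where the previous one started."""
    # Assumed rules: double per repeat of the same unfixed issue;
    # each prior distinct issue raises the starting point 10x.
    return base * (2 ** repeats_of_this_issue) * (10 ** prior_distinct_issues)

# First issue: starts trivially small, doubles while unfixed.
print(next_fine(1000, 0, 0))  # 1000
print(next_fine(1000, 3, 0))  # 8000
# A second, unrelated issue starts higher than the first did.
print(next_fine(1000, 0, 1))  # 10000
```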
What do you mean by "second issue"? A second instance of the same underlying problem, or a different underlying problem? The way you phrase it as unrelated suggests the latter to me.
It's pretty wild to jump straight to "they don't care about safety" here. Building a perfect system without real world testing is impossible, for exactly the same reason it's impossible to write bug-free code on the first try. That's not a suggestion to be lax, just that we need to be realistic about what's achievable if we agree that some form of this technology could be beneficial.
The courts get to decide that. Often it's a case of "I know it when I see it". The real question is whether they did enough to fix all possible safety issues before this new, unrelated one happened. If they did "enough" (something I'm not defining!) then they can start over.
> a fine that means nothing to them
Yes, this is often the case. In this instance, though, endangering children is just about the worst PR possible. That's strong leverage.
Waymo seems more interested in delivering a true solution than I have seen elsewhere.
the discourse around “corporations” has gotten absolutely ridiculous at this point, especially on this website.
It’s not an unreasonable take given historic behavior. Rather than decrying the cynicism, what steps can we take to ensure companies like Tesla/Waymo/etc are held accountable and incentivized to prioritize safety?
Do we need harsher fines? Should auto regulators get as many teeth as the FAA used to have during accident investigations?
Genuinely curious to see how addressing reasonable concerns in these areas can be done.
It's ironic given this forum began as a place for aspiring startup (Delaware C-Corp) founders.
We feared the advent of LLMs since they could be used as convincing spam tools. Little did we know that humans would often do the same.
I’m also curious about school zones. The one near my house has a sign, “School”
“Speed Limit 35”
“7:00AM to 4:00PM School Days”
Now, how does a robotaxi comply with that? Does it go to the district website and look up the current school year calendar? Or does it work like a human, and simply observe the patterns of the school traffic, and assume the general school calendar?
I suspect it continues in Mad Max mode.
Wait, how does that work? Every person in your city needs to know the exact calendar of that school?
Yes, that's how it works in Alberta. It's particularly confusing because not all schools have the same academic calendar (e.g., most schools have a summer break, but a few have summer classes).
Unlike the sibling comment, there are no lights or indications of when school is in session. You must memorize the academic calendar of every school you drive past in order to know the speed limit. In practice, this means being conservative and driving more slowly in unfamiliar areas.
the sign says the hours for the reduced speed limit or, more commonly in my experience, has a light that activates during school hours.
The light and often even the sign itself are typically considered informational aids rather than strict determinants of legality. The driver is expected to comply with all the nitpicky details of the law regardless of whether the bulb is burned out or the school schedule changes.
Needless to say, most people regularly violate some kind of traffic law, we just don't enforce it.
of course. i'm confident slowing down near a school is pretty intuitive for the vast majority of drivers, though.
Sure, but the context here is a discussion about how a computer can know all of these "intuitive" rules humans follow.
The answer is encoded in the map data in this case, but it's an interesting category of problems for autonomous vehicles.
Now I’m imagining the Waymo Driver calling out to Gemini to determine "school hours" by looking it up on the Internet, and wondering about the nature of life.
Aren’t they supposed to read signs? Otherwise they’d also ignore the overhead speed limits on the highway for traffic jams / air quality adjustments during the day.
They should just always observe the lower speed limit.
The difference is usually 5 or maybe 10 mph.
Which over the distance of a school zone is nothing.
Are school days ever Sundays? If not, perhaps drivers just treat every non-Sunday as a school day. If they are, drivers probably just slow down every day.
I'm not complaining, but like..maybe also do this for the vast majority of human drivers who also flout these rules.
I mean, we do. The problem is that you need to be physically present to catch and deal with those people, and you can only really deal with one party (others will do their thing while a police officer is dealing with the first driver they stop). Not to mention that drivers change their behavior if they see the police around, so it's harder to catch them in the act. So for a variety of reasons it's harder to solve the human driver problem.
With how cheap high definition cameras are, I don’t see why society needs a person to be physically present.
Hence the high definition camera. Most states make tinted windshields and dark tints on front windows illegal. Also, the license plate is all that's needed: ticket the owner and they will readily give up the driver.
Other countries have no issues with camera based traffic law enforcement.
I'm going to call bullshit on this. Most human drivers do not flout these rules.
No kidding. Try doing this once or twice and the driver will record your information and you’ll get a nice visit from the police.
Out here in rural nowhere it most certainly does. The school bus driver will record your number plate, and school buses have the equivalent of dashcams now.
I think maybe they meant that the majority of vehicles that flout the rules are human-driven.
Human drivers can be seen and stopped by police and given an appropriate punishment. Self-driving cars have nobody to take accountability like that so you need to go back to the source.
Yeah but in many cases they're not, traffic enforcement went way down during Covid and it's still down.
In most large cities I've lived in, general traffic enforcement essentially only exists on that month's or quarter's designated ticket-writing day, i.e., when highway patrol and city police just write speeding tickets all day to juice revenue.
Off-topic... what poor writing:
> a Waymo did not remain stationary when approaching a school bus with its red lights flashing and stop arm deployed.
Because it's physically possible to approach something while remaining stationary?