Cameras could do it if:
* they were positioned for stereopsis like the human visual system
* had 6 degrees of freedom of motion like the human visual system
* were hyper-adaptive to lighting conditions like the human visual system
* had a significantly higher density of pixels per degree of arc in the focus region like the human visual system
* and were backed by a system capable of intuiting object inertia like the human visual system.
Tesla does none of those.
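To make the stereopsis and pixel-density points concrete: depth from a stereo pair follows depth = f·B/d, so baseline and angular resolution directly bound how well you can range distant objects. A toy sketch below; all camera numbers are made-up assumptions, not any real vehicle's specs.

```python
# Toy depth-from-disparity for a stereo camera pair.
# All numbers are illustrative assumptions, not real camera specs.

def stereo_depth_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo relation: depth = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("point must be visible in both cameras with positive disparity")
    return focal_length_px * baseline_m / disparity_px

# Assumed: 1000 px focal length, 0.12 m baseline (roughly eye-like spacing).
print(stereo_depth_m(1000.0, 0.12, 2.0))  # 60.0 m
print(stereo_depth_m(1000.0, 0.12, 1.0))  # 120.0 m -- a 1 px disparity error at range
                                          # swings the estimate by tens of metres,
                                          # which is why pixels-per-degree matters.
```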
Even if cameras can do it, it feels wrong not to use tech that can do it better than vision alone. Even if it costs more, it should still be used, because these are machines that can kill people, and life is too valuable. If it can't be done the best we can, then maybe it shouldn't be done at all yet.
Some people live in brick houses and others in trailers or worse; demanding the best for everyone just isn't how the world works. The reality is we're lucky to even have the luxury of a warm trailer compared to the chaos and pain of nature.
> feels wrong
Reductio ad absurdum.
EVs are safer; why haven't we banned ICE vehicles yet?
How about screening every food item in the supermarket for chemicals and pathogens? Surely that would minimize excess deaths?
Lots of things could be done "the best we can"; ask yourself why they aren't done yet.
The corporation's risk-assessment department has calculated that it's more cost-effective to deny and then fight the consequences in court than to spend the extra money up front?
Volvo just abandoned LIDAR.
On a different note, TIL that Tesla uses the raw camera sensor data to build an occupancy network, instead of running object detection on images, which makes it feel to me like Tesla isn't really doing vision.
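For what it's worth, an occupancy network still consumes camera pixels; it just predicts which cells of 3D space are occupied instead of emitting labelled boxes. Here's a toy sketch of that output representation; this is the generic occupancy-grid idea with made-up sizes and points, not Tesla's actual model.

```python
import numpy as np

# Toy occupancy grid -- the kind of representation an occupancy network
# predicts, as opposed to the labelled boxes a classic object detector emits.
# Grid extent, resolution, and the example points are arbitrary assumptions.

GRID_SHAPE = (200, 200, 16)                 # cells along x, y, z
CELL_SIZE_M = 0.5                           # each cell is 0.5 m on a side
ORIGIN_M = np.array([-50.0, -50.0, -2.0])   # grid origin relative to the car

occupancy = np.zeros(GRID_SHAPE, dtype=np.float32)

def mark_occupied(points_m: np.ndarray) -> None:
    """Mark grid cells containing the given 3D points (car frame) as occupied.

    A real network predicts these probabilities directly from images;
    here we just rasterise some example points to show the data structure.
    """
    idx = np.floor((points_m - ORIGIN_M) / CELL_SIZE_M).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(GRID_SHAPE)), axis=1)
    idx = idx[inside]
    occupancy[idx[:, 0], idx[:, 1], idx[:, 2]] = 1.0

# Example: an unclassified obstacle 10 m ahead. The grid says "something is
# here" without needing to know whether it's a car, a couch, or a deer.
mark_occupied(np.array([[10.0, 0.0, 0.5], [10.0, 0.5, 0.5], [10.5, 0.0, 1.0]]))
print(occupancy.sum(), "cells occupied")
```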
As I understand it, Volvo's relationship with Luminar broke down, not that there was an obvious problem with the technology itself.
Volvo does have tech problems...
https://insideevs.com/news/773202/volvo-ex90-software-issues...
-- Volvo is upgrading the central computer on all 2025 EX90s for free.
-- The company has spent over a year trying to squash software bugs in the EX90, but owners are still reporting serious issues and glitches.
-- One owner told InsideEVs that her EX90 has been a "dumpster fire inside a train wreck."
Cameras can do it, but more instruments are better. Planes use more than the eyes of the pilot.
That's why we put those bird stickers on transparent surfaces outside, like the noise-cancelling walls along roads or at bus stations.
Sadly, Morgan Stanley suggests that the release of this information may invoke the "Osborne effect." Osborne Computer, decades ago, released information about a great new future model. This resulted in customers refraining from buying the current Osborne model, leading to steep losses and bankruptcy. Not to say bankruptcy will happen to Rivian, but many customers may refrain from buying the current R1 model and instead opt for the future R2 model, which may result in a bad quarter or two for Rivian. FWIW, Rivian seems to have amazing technology in the works, but investors may be in for a bumpy ride until a successful R2 rollout ... according to Morgan Stanley (which may be conflicted by its business with Tesla).
I'm not sure how informed most consumers are these days.
I was going to say the opposite: that unlike back in the Osbourne days, consumers today understand that there will always be “something better” announced soon, and they’re used to making purchase decisions anyway.
>>Morgan Stanley suggests that the release of this information may invoke the "Osborne effect." Osborne Computer, decades ago, released information about a great new future model. This resulted in customers refraining from buying the current Osborne model, leading to steep losses and bankruptcy.
Sadly that did not work on Tesla, despite its promises of FSD "next year"... for the last 10 years...
FSD is a software feature, planned for support on existing hardware. Over-the-air hardware updates are not so easy.
It’s literally called the FSD computer, and the software is called "supervised". So if anything they’re directly claiming it’s a hardware accomplishment while the software is the lesser part.
If you think FSD is limited solely by the lack of LIDAR, I have some news for you: Waymo does most of its driving on cameras.
Their production stack is explicitly multi-sensor: LiDAR is a primary source for metric 3D geometry and localization, while cameras are mainly for semantics. Waymo documents the Waymo Driver as LiDAR plus cameras plus radar.
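To make that division of labour concrete, here's a toy "point painting" sketch: LiDAR supplies metric 3D positions, and a camera segmentation map supplies per-point semantics. The intrinsics, label image, and points are made-up placeholders; this illustrates the generic technique, not Waymo's actual pipeline.

```python
import numpy as np

# Toy "point painting": attach camera semantics to LiDAR geometry.
# Intrinsics, label image, and points are made-up placeholders.

# Assumed pinhole intrinsics for a 1280x720 camera.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

def paint_points(points_cam: np.ndarray, label_img: np.ndarray) -> np.ndarray:
    """Project LiDAR points (already in the camera frame) into the image
    and copy the per-pixel class label onto each point.

    Returns an (N, 4) array: x, y, z from LiDAR, class id from the camera.
    """
    uvw = points_cam @ K.T                       # pinhole projection
    uv = (uvw[:, :2] / uvw[:, 2:3]).astype(int)  # pixel coordinates
    h, w = label_img.shape
    valid = (points_cam[:, 2] > 0) & \
            (uv[:, 0] >= 0) & (uv[:, 0] < w) & \
            (uv[:, 1] >= 0) & (uv[:, 1] < h)
    labels = np.full(len(points_cam), -1, dtype=int)  # -1 = not visible in camera
    labels[valid] = label_img[uv[valid, 1], uv[valid, 0]]
    return np.column_stack([points_cam, labels])

# Example: a fake segmentation map where class 3 means "pedestrian".
label_img = np.zeros((720, 1280), dtype=int)
label_img[300:420, 600:680] = 3
points = np.array([[0.0, 0.0, 20.0],    # straight ahead, 20 m out -> painted 3
                   [5.0, 1.0, 15.0]])   # off to the side -> painted 0 (background)
print(paint_points(points, label_img))
```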
> cameras are mainly for semantics
Which is the most important part...
'Different article' from the 11th, same discussion.
How on earth are 3 links to GitHub Actions price increases relevant here?
Stop derailing the discussion...