Why The Tesla/Mobileye Fight Defines An Industry-Wide Schism
Mobileye and Tesla have begun trading barbs over the real reason behind their split. But these attacks mask an as-yet undiscussed schism in the sector, one that transcends their public statements.
"[Tesla's Autopilot] is not designed to cover all possible crash situations in a safe manner," said Amnon Shashua, chairman and CTO of Mobileye, the Israel-based maker of collision detection and driver assistance systems. "[Tesla] was pushing the envelope in terms of safety."
Tesla's response? "When Tesla refused to cancel its own vision development activities and plans for deployment, Mobileye discontinued hardware support for future platforms and released public statements implying that this discontinuance was motivated by safety concerns."
These statements highlight a distinct but unspoken truth in the burgeoning self-driving car sector: Mobileye, the company whose technology underlies the majority of ADAS (Advanced Driver Assistance Systems) and semi-autonomous driving suites on the market, may not be at the cutting edge of the technology on which it has built its reputation.
https://www.yahoo.com/news/why-tesla-mobileye-fight-defines-202306123.html
BlueStreak (8,377 posts)
None of this has been demonstrated as a totally driverless solution. All anybody has managed to demonstrate so far are systems that are able to stay within lanes on heavily mapped roads and not hit other cars, pedestrians, cyclists, and large dogs most of the time. There are thousands of complications that come up in everyday driving that are easily a decade away from being solved.
Just for a few really obvious examples, there has been no demonstration of any car capable of autonomously handling these conditions:
* Cop directing traffic at an accident site
* Funeral procession requiring other cars to disregard traffic signals
* Autonomous car pulling over when signaled to stop by a police officer
* Black ice, slush, a snow bank, or a mudslide blocking or obscuring lane markings
* Traffic temporarily routed the wrong direction on a one-way street, such as after a concert or sporting event
These are just a few. If one simply pays a tiny bit of attention to daily driving, one will quickly conclude the list of complications is very long, and none of them are handled at all by the current technology.
The idea that cars are close to thinking like humans is a lie -- a massive con. It is the latest flight of fancy, and pretty soon the press will move on to the next ridiculous proposition.
The issue with Mobileye is not whether anybody else has technology that is marginally better than Mobileye's. That hardly matters, because everyone is decades away from safely piloting driverless cars in all circumstances. The issue is that Tesla lied to Mobileye about how it would use and represent Mobileye's technology. Tesla deployed it in a dangerous way that has already caused at least two fatalities. And then, after causing these fatalities, Elon Musk tried to blame the deaths on Mobileye. Mobileye decided Tesla wasn't worth the risk to its reputation, because Tesla could not be trusted to partner in a responsible way. That's the issue, plain and simple.
yurbud (39,405 posts)
That would be fine if drivers stayed at least half paying attention.
But if there's a way to screw up, people will do it.
BlueStreak (8,377 posts)
And therein lies the fatal (literally) dilemma. If you are in a vehicle that can handle only 99% or even 99.9% of driving actions, the driver is less likely to be paying attention and ready to take control. And the more situations the technology can handle, the worse that problem becomes. Say you only have to take over once in 100 miles: that one time could be deadly if you have not been paying attention for the rest of the drive.
And likewise, if you only have to take over once in a month of driving, what are the chances you are paying close attention? And how are your skills and reflexes if you haven't done any driving for a month?
Here is the paradox: these things are far MORE dangerous than human drivers until they really can do 100% of the driving. And the closer you get to 100% (without actually getting there), the MORE dangerous the vehicle becomes.
And that doesn't even get into the inherent moral-hazard questions. If the vehicle is truly driverless, what does it do when it is thrust into a situation with only two options: run over a baby carriage, or run over an old man?
yurbud (39,405 posts)
I see a lot of stupid, reckless, dangerous drivers every day (I have a long commute).
I'd rather see a self-driving car that can't handle 0.1% of situations than a human driver whose error rate can be anywhere from 1% to 100%, depending on their mood, level of intoxication, who they are talking to on their phone, and whether there's a full moon.