By Joshua Dowling
After a number of deaths in the US linked to Tesla owners placing too much faith in autonomous driving systems – and amid ongoing investigations by road safety regulators there – Elon Musk says mastering the tech is more difficult than he expected.
Tesla boss Elon Musk has finally admitted what many automotive industry experts have long realised: autonomous vehicle technology is a lot more difficult than it seems, due to the countless variables a car must learn, detect, and avoid while on the move.
In addition to being able to detect lanes, other vehicles, traffic signals and pedestrian movement, fully autonomous cars must learn to pre-empt the behaviour of other drivers and pedestrians in the same way a human would – an almost impossible task.
On social media platform Twitter this week – in response to a query about Tesla’s next update on autonomous tech – Elon Musk wrote: “FSD 9 (Full Self Driving 9) beta is shipping soon I swear! Generalized (autonomous) self-driving is a hard problem, as it requires solving a large part of real-world AI (artificial intelligence).
“Didn’t expect it to be so hard, but the difficulty is obvious in retrospect. Nothing has more degrees of freedom than reality.”
The admission by Elon Musk has come as a relief to industry observers, as car companies have grappled with the roll-out of the technology and, in the US at least, regulators are keeping an increasingly close eye on real-world driving tests.
In October 2017 a high-ranking autonomous vehicle expert at US car giant General Motors said Tesla’s claim that it already had the technology for total hands-free driving was “full of crap”.
At the time, Elon Musk said Tesla cars “already have the hardware needed for a full self-driving capability”, known in the industry as a “Level Five” engineering standard.
However, in a briefing with Australian media in Detroit at the time, Scott Miller, General Motors’ director of autonomous vehicle integration, said: “I think he’s full of crap,” when asked what he thought about Musk’s claim.
“If you think you can see everything you need for a Level Five autonomous (car) with cameras and radar, I don’t know how you do that,” Mr Miller told media in October 2017.
“To be what an SAE (Society of Automotive Engineers) Level Five full autonomous system is, I don’t think he (Elon Musk) has the content to do that. Do you really want to trust one sensor measuring the speed of a car coming into an intersection before you pull out? I think you need some confirmation.”
The car industry is now of the view that each sensor in an autonomous vehicle needs confirmation from one or more other sensors before the car makes a critical decision, a redundancy that adds cost and requires processing speeds not previously seen in automobiles.
While not backing away from his push for autonomous cars, Musk appears to be developing a more level-headed approach to the technology, especially in the wake of fatalities in the US involving Tesla drivers who placed too much reliance on its self-driving systems.
In a YouTube interview with US automotive industry expert Sandy Munro in February 2021, Musk said: “For self-driving (technology), even if the road is painted completely wrong and a UFO lands in the middle of the road, the car still cannot crash and still needs to do the right thing.”
He continued: “The prime directive for the autopilot system is ‘don’t crash’. That really overrides everything, no matter what the lines say or how the road is done … minimise the probability of impact while getting you to your destination conveniently and comfortably.”
Musk said at the time that while it would “certainly be helpful to have roads with accurate markings”, the car needed to be smarter than that.
“For self-driving, it’s got to be (that) even if someone tries to trick the car, they do not succeed in tricking the car, because people will do weird things. It’s got to maintain safety no matter what … don’t let yourself (Tesla autonomous tech) get tricked.”