Elon Musk says that Tesla’s FSD system is “essential” or the company is “worth basically zero”
The NHTSA reports that Teslas using driver-assist systems were involved in 273 crashes over 9 months; Elon Musk says that without the system, Tesla is worthless.
Elon Musk said in a recent interview that it’s “essential” that Tesla solve its Full Self-Driving (FSD) technology. Musk feels that it should be the “overwhelming focus” for the carmaker because “it’s really the difference between Tesla being worth a lot of money or worth basically zero.”
Meanwhile, the driver-assist system has come under increased scrutiny from the National Highway Traffic Safety Administration (NHTSA). The auto-safety agency expanded an investigation into Teslas and on Wednesday released a report that found Teslas using driver-assist systems were involved in 273 crashes over 9 months.
Tesla drivers are expected to be fully attentive
Tesla has come under criticism for calling its driver-assist software “Full Self-Driving”. The head of the US National Transportation Safety Board (NTSB), Jennifer Homendy, told CNBC’s “Squawk Box” that the name is “misleading” and that “people are misusing the vehicles and the technology” because of the way it’s being marketed.
Tesla’s support page for Autopilot and Full Self-Driving Capability states that the systems “are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.” The company is working on making the features more capable over time, but “the currently enabled features do not make the vehicle autonomous.”
Tesla could have a long road ahead to reaching “full self-driving” cars
The CEO of the electric car company has stated that the FSD program could potentially achieve Level 4 autonomy by the end of 2022. His optimism, though, most likely exceeds what Tesla’s engineers can deliver; he has made the same prediction several times in the past.
Because the responsibility lies with the driver and not with Tesla’s system, it is classified as a Level 2 advanced driver-assist system. For the system to be considered Level 4, or high driving automation, no human intervention in the vehicle’s operation would be required. A Level 4 vehicle could even forgo a steering wheel and pedals, being programmed to stop itself safely in the event of a system failure.
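To make the distinction concrete, here is a minimal, purely illustrative Python sketch of the two levels discussed above; the class, names, and descriptions are assumptions made for this example, not an official SAE J3016 or Tesla data model.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Illustrative subset of the SAE J3016 driving-automation levels."""
    LEVEL_2 = 2  # Partial automation: the human driver must supervise at all times
    LEVEL_4 = 4  # High automation: no human intervention needed within the design domain

# Hypothetical summary of who is responsible at each level (illustration only).
RESPONSIBILITY = {
    SAELevel.LEVEL_2: "Human driver remains responsible and must be ready to take over.",
    SAELevel.LEVEL_4: "Vehicle handles all driving and can bring itself to a safe stop on failure.",
}

if __name__ == "__main__":
    for level, duty in RESPONSIBILITY.items():
        print(f"SAE Level {int(level)}: {duty}")
```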
Achieving full autonomy will require enormous amounts of data to train the algorithms a car needs to drive itself. The RAND Corporation has calculated that it would take around 400 years for a fleet of 100 self-driving cars driving 24 hours a day to cover the roughly 8.8 billion miles needed to demonstrate a system 20 percent safer than a human driver.
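As a rough check of that arithmetic, the short Python sketch below reproduces the figure under the RAND scenario’s stated assumptions (a fleet of 100 cars driving around the clock at an assumed average speed of 25 mph); the constants come from that scenario, not from any Tesla fleet data.

```python
# Back-of-the-envelope check of the fleet-mileage arithmetic (RAND scenario assumptions).
FLEET_SIZE = 100        # cars in the test fleet
HOURS_PER_DAY = 24      # continuous, around-the-clock driving
DAYS_PER_YEAR = 365
AVG_SPEED_MPH = 25      # assumed average speed in the RAND scenario
TARGET_MILES = 8.8e9    # ~8.8 billion miles to demonstrate a 20% safety improvement

miles_per_year = FLEET_SIZE * HOURS_PER_DAY * DAYS_PER_YEAR * AVG_SPEED_MPH
years_needed = TARGET_MILES / miles_per_year

print(f"Fleet miles per year: {miles_per_year:,.0f}")                   # ~21.9 million
print(f"Years to cover {TARGET_MILES:,.0f} miles: {years_needed:.0f}")  # ~400
```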
Some companies are scaling back their programs because of the daunting nature of the task; Uber, for example, sold off its unit developing self-driving cars.
There were over 2.3 million Teslas on the road at the end of 2021, and Musk intends to see that number grow to 4 million by the end of this year. All new cars built by the electric vehicle manufacturer are equipped with the hardware for Autopilot and FSD capability.
However, to get access to the FSD features, owners need to fork over $12,000 for the package or sign up for a monthly subscription. And not all Tesla drivers who have opted for the FSD system participate in the beta version the company is using to test new features. Musk has said that there were 100,000 owners testing the technology in real-world situations, and that number may now be double.
NHTSA could be preparing for a recall of Tesla’s self-driving system
Last year the NHTSA put out an order requiring manufacturers and operators of Automated Driving Systems (ADS) and Advanced Driver Assistance Systems (ADAS) to report crashes to the auto-safety agency. On Wednesday it released its initial round of data, which found there have been 367 crashes in the US involving advanced driver-assistance technologies.
Tesla cars were involved in 273 of the incidents; however, NHTSA administrator Steven Cliff advised “caution before attempting to draw conclusions based only on the data.” Tesla’s technology is also the most widely used, as it comes standard on all new models.
However, some of the data the NHTSA collected in its investigation of Teslas rear-ending stationary emergency vehicles could raise questions about how responsive drivers are while the driver-assist system is activated. In half of the 106 accidents analyzed, the evidence suggests drivers weren’t fully attentive.
In 43 crashes where detailed car log data was available, “37 indicated that the driver’s hands were on the steering wheel in the last second prior to the collision.” Tesla has installed systems to observe whether drivers have their eyes on the road, in addition to sensors that detect hands on the steering wheel.
However, if the NHTSA finds that these systems are defective, it could press the electric carmaker to issue a recall to correct the problem.