DETROIT (AP) — Elon Musk’s tweet indicating that Tesla may allow some owners who are testing a “Full Self-Driving” system to turn off a warning reminding them to keep their hands on the wheel has attracted the attention of US regulators.
The National Highway Traffic Safety Administration said it has asked Tesla for more information about the tweet. The agency said last week that the issue is now part of a broader investigation into at least 14 Tesla vehicles that crashed into emergency vehicles while using the Autopilot driver assistance system.
Since 2021, Tesla has been beta testing “Full Self-Driving” with owners who have not been trained on the system but are actively monitored by the company. Tesla said earlier this year that 160,000 vehicles were taking part, roughly 15% of the Tesla vehicles currently on US roads. Wider distribution of the software was supposed to start at the end of 2022.
Despite the name, Tesla still says on its website that its cars cannot drive themselves. Teslas using “Full Self-Driving” can navigate roads on their own in many cases, but experts say the system can make mistakes. “We’re not saying it’s quite ready to have no one behind the wheel,” CEO Musk said in October.
On New Year’s Eve, one of Musk’s most ardent fans tweeted that drivers who have completed more than 10,000 miles of “Full Self-Driving” testing should be able to turn off the “steering wheel nag,” a warning that tells drivers to keep their hands on the wheel.
Musk replied: “I agree, the update will be in January.”
It’s not clear from the tweets exactly what Tesla will do. But disabling the driver monitoring system on any vehicle that automates speed and steering could pose a risk to other drivers on the road, said Jake Fisher, senior director of Consumer Reports’ auto testing division.
“By using the beta version of FSD, you become part of the experiment,” Fisher said. “The problem is that other road users near you didn’t sign up to participate in this experiment.”
Tesla did not respond to a message asking for comment on the tweet or driver monitoring.
Automotive safety advocates and government investigators have long criticized Tesla’s monitoring system as inadequate. Three years ago, the National Transportation Safety Board cited poor monitoring as a contributing factor to the fatal 2018 Tesla crash in California. The board recommended a better system but said Tesla did not respond.
The Tesla system measures steering torque to make sure drivers are paying attention. Many Teslas have cameras that track the driver’s gaze. But Fisher says these cameras aren’t infrared like some competitors’ driver-assistance systems, so they can’t see at night or if the driver is wearing sunglasses.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, argues that Tesla is contradicting itself in ways that can confuse drivers. “They are trying to make customers happy by letting them take their hands off the wheel, even though the owner’s manual says, ‘Don’t do this.’”
Indeed, Tesla’s website says that Autopilot and the more sophisticated “Full Self-Driving” system are intended to be used by “a fully attentive driver who has their hands on the wheel and is ready to take control at a moment’s notice.” It states that the systems are not fully autonomous.
The NHTSA noted in its documents that there have been numerous Tesla crashes in which drivers had their hands on the wheel but still were not paying attention. The agency said that Autopilot is being used in areas where its capabilities are limited and that many drivers fail to take action to avoid crashes despite vehicle warnings.
Tesla’s partially automated systems have been under investigation by the NHTSA since June 2016, when a driver using Autopilot was killed after his Tesla was hit by a tractor-trailer crossing its path in Florida. A separate investigation into Teslas that were using Autopilot when they crashed into emergency vehicles began in August 2021.
Including the Florida crash, the NHTSA has sent investigators to 35 Tesla crashes in which automated systems were suspected of being in use. Nineteen people died in those crashes.
Consumer Reports has tested Tesla’s monitoring system, which changes frequently with over-the-air software updates. Initially, the system did not alert a driver with hands off the steering wheel for three minutes. Lately, however, warnings come after as little as 15 seconds. Fisher said he’s not sure how long a driver can keep their hands off the wheel before the system slows the vehicle or shuts down completely.
If it disables the steering wheel nag, Tesla could switch to using the camera to monitor drivers, Fisher said, but that isn’t clear.
Despite the implication in the names that Autopilot and “Full Self-Driving” can drive themselves, Fisher said, it’s clear that Tesla expects owners to still be drivers. But the NTSB says human drivers may end up letting their guard down and relying too much on the systems while looking elsewhere or performing other tasks.
Those using “Full Self-Driving” are likely to be more vigilant while driving, Fisher said, because the system makes mistakes.
“I wouldn’t take my hands off the wheel using this system just because it can do things unexpectedly,” he said.
Koopman said he doesn’t see much of a safety risk in disabling the steering wheel nag, because Tesla’s monitoring system is so flawed that disabling it doesn’t necessarily make Teslas any more dangerous.
The NHTSA has enough evidence to take action and force Tesla to install a better monitoring system, he said.
The agency says it does not comment on open investigations.