
Tesla Autopilot involved in hundreds of crashes, has ‘serious safety gap’: NHTSA


The federal government said there were “serious safety gaps” in Tesla’s Autopilot system, which contributed to at least 467 crashes, 13 resulting in deaths and “many others” resulting in serious injuries.

The findings come from analysis by the National Highway Traffic Safety Administration of the 956 crashes in which Tesla Autopilot was believed to have been used. The results of the nearly three-year investigation were released Friday.

The NHTSA report said Tesla’s Autopilot design “led to foreseeable misuse and avoidable crashes.” The system did not “ensure sufficient driver attention and appropriate use.”

The agency also said it was opening a new investigation into the effectiveness of a software update that Tesla previously issued as part of a December recall. That update was intended to fix the issues with Autopilot that NHTSA identified as part of this same investigation.

The voluntary recall via over-the-air software update covers 2 million Tesla vehicles in the US and is said to specifically improve the driver monitoring system in Autopilot-equipped Teslas.

NHTSA suggested in Friday’s report that the software update may have been insufficient, as more Autopilot-related crashes continue to be reported.

In one recent example, a Tesla driver in Snohomish County, Washington, struck and killed a motorcyclist on April 19, according to records obtained by CNBC and NBC News. The driver told police he was using Autopilot at the time of the crash.

NHTSA’s findings are the most recent in a series of reports by regulators and watchdogs questioning the safety of Tesla’s Autopilot technology, which the company has promoted as a key differentiator from other automakers.

On its website, Tesla says Autopilot is designed to reduce the “workload” for drivers through advanced steering and cruise control technology.

Tesla did not provide a response to the NHTSA report on Friday and also did not respond to requests for comment sent to Tesla’s press inbox, its investor relations team and the company’s vice president of vehicle engineering, Lars Moravy.

Following the release of the NHTSA report, Sens. Edward J. Markey, D-Mass., and Richard Blumenthal, D-Conn., issued a statement calling on federal regulators to require Tesla to restrict its Autopilot feature “to the roads it was designed for.”

On its user guide website, Tesla warns drivers not to operate Autopilot’s Autosteer function “in areas where there may be cyclists or pedestrians,” along with a series of other warnings.

“We urge the agency to take all necessary actions to prevent these vehicles from endangering lives,” the senators said.

Earlier this month, Tesla resolved a lawsuit from the family of Walter Huang, an Apple engineer and father of two, who died in a crash while driving his Tesla. Tesla has sought to keep the terms of the settlement from the public.

Despite these events, Tesla and CEO Elon Musk signaled this week that they are betting the company’s future on autonomous driving.

“If someone doesn’t believe that Tesla is going to solve the autonomy problem, I don’t think they should be an investor in the company,” Musk said during Tesla’s earnings call Tuesday. He added, “We will and we are.”

Musk has for years promised customers and shareholders that Tesla would be able to turn its existing cars into self-driving cars with just a software update. However, the company only provides driver assistance systems and has not produced self-driving cars to date.

He also made safety claims about Tesla’s driver assistance systems without allowing third parties to review the company’s data.

For example, in 2021, Elon Musk declared in a social media post: “Tesla with autopilot now has 10 times lower accident risk than a conventional vehicle.”

Philip Koopman, an auto safety researcher and Carnegie Mellon University associate professor of computer engineering, said he considers Tesla’s marketing and claims to be “autonowashing.” He also said in response to the NHTSA report that he hopes Tesla will take the agency’s concerns seriously going forward.

“People are dying due to misplaced confidence in Tesla Autopilot’s capabilities. Even simple steps could improve safety,” Koopman said. “Tesla could automatically restrict Autopilot use to intended roads based on map data already in the vehicle. Tesla could improve monitoring so drivers can’t routinely become absorbed in their cell phones while Autopilot is in use.”

A version of this story was posted on NBCNews.com.
