Tesla recalls ‘Full Self-Driving’ software that runs stop signs



Tesla’s most recent update to its so-called Full Self-Driving (Beta) software includes an “Assertive” mode that allows vehicles to roll through stop signs at up to 5.6 mph without coming to a complete stop. It turns out – not surprisingly, we may add – that this feature runs afoul of National Highway Traffic Safety Administration regulations. According to documents posted by NHTSA, “Failing to stop at a stop sign may increase the risk of a collision.”

The resulting recall covers 53,822 vehicles, including 2016 to 2022 Model S sedans and Model X SUVs, as well as 2017 to 2022 Model 3 sedans and 2020 to 2022 Model Y SUVs. Tesla says it is not aware of any crashes or injuries caused by the feature. An over-the-air firmware update to disable rolling stops is expected to ship in early February, and owners will receive the required notification letter by March 28.

As we always point out when reporting on Tesla’s Full Self-Driving and Autopilot technologies, they are not fully autonomous or self-driving. These are not legitimate SAE Level 4 autonomy systems, and drivers should not expect their Teslas to drive themselves without human supervision.

Tesla reportedly agreed to disable rolling stops with a software update on January 20, following meetings with NHTSA on January 10 and 19.

The rolling-stop feature allows Teslas to pass through stop signs on certain roads if the owner has enabled it. According to documents posted by NHTSA, vehicles must be traveling below 5.6 mph when approaching the intersection, and the system must not detect any “relevant” cars, pedestrians, or cyclists moving nearby. All roads leading to the intersection must have a speed limit of 30 mph or less. If those conditions are met, the car is allowed to proceed through the intersection at 0.1 mph to 5.6 mph without coming to a complete stop.

Safety advocates complain that Tesla should not be allowed to test its vehicles with untrained drivers on public roads, and that the software could malfunction, endangering other motorists and pedestrians. Most other automakers test similar software with trained human safety drivers.

Alain Kornhauser, faculty chair of autonomous vehicle engineering at Princeton University, said the recall is an example of NHTSA doing its job as the nation’s road safety watchdog. Recalls like this “show that they can be effective, even if Tesla should have been more responsible in the first place,” he said.

In November, NHTSA said it was reviewing a complaint from a Tesla driver that the “Full Self-Driving” software had caused a crash. The driver complained to the agency that a Model Y entered the wrong lane and was struck by another vehicle. The SUV gave the driver a warning midway through the turn, and the driver tried to steer to avoid other traffic, according to the complaint. But the car took control and “forced itself into the incorrect lane,” the driver reported. No one was injured in the November 3 crash in Brea, California.

In December, Tesla agreed to update its less sophisticated “Autopilot” driver-assist system after NHTSA opened an investigation. The company agreed to stop allowing video games to be played on the center touchscreen while its vehicles are moving.

The agency is also investigating why Teslas on Autopilot repeatedly crashed into emergency vehicles parked on the roadway.

Materials from the Associated Press were used in this report.
