
Tesla driver charged in a fatal crash involving Autopilot: NPR

California prosecutors filed two counts of vehicular manslaughter against the driver of a Tesla on Autopilot that ran a red light, crashed into another car and killed two people in 2019.

David Zalubowski / AP



DETROIT – California prosecutors have filed two counts of vehicular manslaughter against the driver of a Tesla on Autopilot that ran a red light, crashed into another car and killed two people in 2019.

The defendant appears to be the first person charged with a felony in the United States for a fatal crash involving a driver using a partially automated driving system. Los Angeles County prosecutors filed the charges in October, but they only came to light last week.

The driver, Kevin George Aziz Riad, 27, has pleaded not guilty. Riad, a limousine service driver, is free on bail while the case is pending.

The misuse of Autopilot, which can control steering, speed and braking, has occurred on numerous occasions and is the subject of investigations by two federal agencies. The filing of charges in the California crash could serve notice to drivers using systems like Autopilot that they cannot rely on them to control their vehicles.

The criminal charges are not the first to involve an automated driving system, but they are the first to involve widely used driver-assistance technology. Authorities in Arizona filed a charge of negligent homicide in 2020 against a driver Uber had hired to take part in testing a fully autonomous vehicle on public roads. The Uber vehicle, an SUV with the human backup driver on board, hit and killed a pedestrian.

In contrast, Autopilot and other driver-assistance systems are widely used on roads around the world. An estimated 765,000 Tesla vehicles are equipped with Autopilot in the United States alone.

In the Tesla crash, police say a Model S was traveling at high speed when it left a freeway, ran a red light in the Los Angeles suburb of Gardena and struck a Honda Civic at an intersection on December 29, 2019. The Civic's two occupants, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, died at the scene. Riad and a woman in the Tesla were hospitalized with non-life-threatening injuries.

The criminal charging documents do not mention Autopilot. However, the National Highway Traffic Safety Administration, which sent investigators to the crash, confirmed last week that Autopilot was in use in the Tesla at the time of the crash.

Riad's defense attorney did not respond to requests for comment last week, and the Los Angeles County District Attorney's Office declined to discuss the case. Riad's preliminary hearing is scheduled for February 23.

Federal safety regulators are investigating misuse of Autopilot

NHTSA and the National Transportation Safety Board have been looking into widespread misuse of Autopilot by drivers, whose overconfidence and inattention have been blamed for multiple crashes, including fatal ones. In one crash report, the NTSB referred to the misuse as "automation complacency."

In a 2018 crash in Culver City, California, in which a Tesla hit a parked fire truck, the agency said the design of the Autopilot system had "allowed the driver to disengage from the driving task." No one was injured in that crash.

Last May, a California man was arrested after officers noticed his Tesla was traveling down the highway with the man in the back seat and no one behind the wheel.

Teslas using Autopilot have also crashed into a highway barrier and into a tractor-trailer crossing a road. NHTSA has dispatched teams to investigate 26 crashes involving Autopilot since 2016, with at least 11 deaths.

Messages were left seeking comment from Tesla, which has disbanded its media relations department. Since the crashes, Tesla has updated its software to try to make it harder for drivers to misuse Autopilot and to improve the system's ability to detect emergency vehicles.

The company has said that Autopilot and a more sophisticated "Full Self-Driving" system cannot drive themselves and that drivers must pay attention and be ready to react at all times. "Full Self-Driving" is being tested by hundreds of Tesla owners on public roads in the U.S.

Bryant Walker Smith, a law professor at the University of South Carolina who studies automated vehicles, said this is the first U.S. case he is aware of in which serious criminal charges were filed over a fatal crash involving a partially automated driver-assistance system. Tesla, he said, could be "criminally, civilly or morally culpable" if it is found to have put a dangerous technology on the road.

Donald Slavik, a Colorado attorney who has served as a consultant in automotive technology lawsuits, including many against Tesla, said he is also unaware of any felony charges previously filed against a U.S. driver who was using partially automated driver technology at the time of a fatal crash.

The families of Lopez and Nieves-Lopez have sued Tesla and Riad in separate lawsuits. They accuse Riad of negligence and accuse Tesla of selling defective vehicles that can accelerate suddenly and that lack an effective automatic emergency braking system. A joint trial is scheduled for mid-2023.

Lopez's family alleges in court documents that the car "suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed." Nieves-Lopez's family further asserts that Riad was an unsafe driver, with multiple moving violations on his record, and could not handle the high-performance Tesla.

Separately, NHTSA is investigating a dozen crashes in which Teslas on Autopilot struck parked emergency vehicles. In the crashes under investigation, at least 17 people were injured and one was killed.

Asked about the manslaughter charges against Riad, the agency issued a statement saying that no vehicles on sale today can drive themselves. And whether or not a car is using a partially automated system, the agency said, "every vehicle requires a human driver to be in control at all times."

NHTSA added that every state holds drivers accountable for the operation of their vehicles. While automated systems can help drivers avoid crashes, the agency said, the technology must be used responsibly.

Rafaela Vasquez, the driver in the Uber autonomous test vehicle, was charged with negligent homicide in 2020 after the SUV hit and killed a pedestrian in suburban Phoenix in 2018. Vasquez has pleaded not guilty. Arizona prosecutors declined to file criminal charges against Uber itself.
