
Should cars drive like humans or robots? Tesla forces the question


A Tesla Model Y electric vehicle is displayed on a showroom floor at the Miami Design District on Oct. 21, 2021, in Miami, Florida.

Joe Raedle | Getty Images

Matt Smith didn’t necessarily mind that the software inside his Tesla would occasionally skirt a traffic law.

For a while, his Tesla Model Y was programmed to automatically roll through stop signs at up to 5.6 miles per hour when it sensed the coast was clear of pedestrians and other road users. If anything, Tesla’s experimental driver-assistance features could seem a little conservative to him.

“Sometimes it would stop for five seconds at a time and then slowly creep forward,” said Smith, a 35-year-old investment manager who lives in suburban Detroit. “You and I feel comfortable rolling at 5 miles per hour or so if we feel that it’s safe to go.”

Exactly when Tesla’s software started performing rolling stops isn’t entirely clear. Last September, a Tesla driver posted a video on social media of a rolling stop. And in January, Tesla released an “assertive mode” version of its “full self-driving beta,” a premium driver assistance option that featured rolling stops along with “smaller following distance” and a propensity to “not exit passing lanes.”

Tesla recently removed the rolling-stops feature with a software update, but the automaker has opened a question that the average driver may not have thought about: Should cars robotically obey traffic laws, even when human drivers sometimes break them for convenience?

For Tesla critics, the updates are evidence that the company, led by CEO Elon Musk, operates with little regard for rules or for others on the road, including pedestrians, even as it promotes the potential safety benefits of a driverless future.

Musk said Thursday at the opening of a Tesla vehicle assembly plant in Austin, Texas, that FSD Beta, the company’s “full self-driving” program, will roll out by the end of this year to almost all Tesla owners in North America who have the option.

“You said they would be perfect drivers. Why are you teaching them bad human habits?” said Phil Koopman, an engineering professor at Carnegie Mellon University and an expert in advanced driver assistance systems and autonomous vehicle technology.

Tesla executives have defended the company’s choices, saying in a letter to Congress last month and on social media that their vehicles are safe.

“There were no safety issues,” Musk tweeted in February after Tesla disabled automatic rolling stops. He said the cars simply slowed to about 2 miles per hour and continued forward if the view was clear with no cars or pedestrians present.
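To make the behavior described above concrete, here is a minimal, purely illustrative sketch in Python of that kind of decision rule: creep through a stop sign below a small speed cap only when no pedestrians or cross traffic are detected, and otherwise come to a complete stop. The thresholds (the 5.6 mph cap reported earlier and the roughly 2 mph creep speed Musk described) and every name in the code are assumptions for illustration; none of this is Tesla’s actual implementation.

```python
from dataclasses import dataclass

# Purely illustrative toy model of the rolling-stop rule described in the
# article; thresholds and names are assumptions, not Tesla's actual code.

ROLLING_STOP_MAX_MPH = 5.6   # upper speed cap reported for the rolling-stop behavior
CREEP_SPEED_MPH = 2.0        # approximate creep speed Musk described


@dataclass
class Perception:
    pedestrians_detected: bool
    cross_traffic_detected: bool


def target_speed_at_stop_sign(speed_mph: float, scene: Perception) -> float:
    """Return a target speed (mph) at a stop sign for this toy model."""
    intersection_clear = not (scene.pedestrians_detected or scene.cross_traffic_detected)
    if intersection_clear and speed_mph <= ROLLING_STOP_MAX_MPH:
        # The behavior critics objected to: slow to a creep and keep moving.
        return CREEP_SPEED_MPH
    # Otherwise behave conservatively and come to a complete stop.
    return 0.0


if __name__ == "__main__":
    print(target_speed_at_stop_sign(4.0, Perception(False, False)))  # 2.0 (creep through)
    print(target_speed_at_stop_sign(4.0, Perception(True, False)))   # 0.0 (full stop)
```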

Tesla did not respond to requests for an interview or for comment on how driver-assistance features should interact with traffic laws.

Smith, the Tesla driver who manages a fund that owns shares in the company, said he’s torn on Tesla’s approach: in the short term, a feature such as rolling stops could damage public perception of the overall technology, even if automated vehicles might one day be safer than humans.

“They are pushing the boundaries,” said Smith, who is part of the company’s FSD Beta program, in which, Tesla says, nearly 60,000 customers are testing new driver assistance features on public roads before they are fully debugged. He said the features are improving quickly, including with a software update this week.

Customers have to notch a high score on Tesla’s in-vehicle safety rating app to gain access, and they must already have the company’s premium driver assistance option installed in their car. Tesla says it monitors drivers with sensors in the steering wheel and an in-cabin camera to ensure they are paying attention while using the features, though tests by Consumer Reports found the company’s driver monitoring system to be inadequate.

In recent weeks, Tesla started offering FSD Beta access to drivers in Canada, and Musk said that the experimental software would be available in Europe as early as this summer, pending regulatory approvals.

Growing oversight

The oversight mechanism for human drivers is pretty familiar: flashing lights, a police officer and a pricey ticket. It’s not as clear for automated vehicles.

The idea that cars can now include systems designed to intentionally violate traffic laws presents a challenge for regulators at all levels of government, from federal officials who write and enforce safety standards to state and local authorities who handle road signs, licensing and the rules of the road.

“We need laws that clarify, and regulators that intervene and hold manufacturers accountable when their systems fail to live up to the promises they make,” said Daniel Hinkle, senior state affairs counsel for the American Association for Justice, a trade group for plaintiffs’ lawyers.

Hinkle said only five states have regulations in place for developmental driving systems such as Tesla’s FSD Beta, or robotaxis from Cruise, Waymo and others. The states are California, Nevada, New York, Vermont and Washington, plus Washington, D.C. Other states are weighing new rules.

For experts and regulators, features that sidestep traffic laws also pose complicated questions about transparency in how these proprietary systems work and about how much oversight regulators can even have.

Koopman said it’s impossible to say what traffic laws, if any, Tesla has designed its software to violate. Even if someone were able to independently review the car’s computer code, that wouldn’t be enough, he said.

“Code review wouldn’t really help you. It’s all machine-learning. How do you review that?” he said. “There’s no way to know what it will do until you see what happens.”

Many drivers misunderstand the limits of technology already on the road today. The public is confused about what “self-driving” means, for example, as driver-assistance systems become more common and more sophisticated. In a survey last year by the analyst firm J.D. Power, only 37 percent of respondents picked the correct definition of self-driving cars.

Neither Tesla nor any other company is selling a self-driving, or autonomous, vehicle capable of driving itself in a wide array of locations and circumstances without a human ready to take over.

Nonetheless, Tesla markets its driver assistance systems in the U.S. with names that regulators and safety experts say are misleading, such as Autopilot for the standard package and Full Self-Driving for the premium package.

At the same time, Tesla warns drivers in owners’ manuals that it is their responsibility to use the features safely and that they must be prepared to take over the driving task at any moment, with eyes on the road and hands on the wheel.

The difficulty of navigating an unpredictable environment is one reason truly self-driving cars haven’t happened yet.

“An autonomous vehicle has to be better and more nimble than the driver it is replacing, not worse,” said William S. Lerner, a transportation safety expert and delegate to the International Organization for Standardization, a group that sets global industrial standards.

“I wish we were there yet, but we are not, barring straight highways with typical entrance and exit ramps that have been mapped,” he said.

‘Caught in the cookie jar’

Tesla’s rolling-stop feature was around for months before it drew much notice. Chris, who chronicles the good and the bad of Tesla’s latest features on YouTube under the name DirtyTesla, said his Tesla did automatic rolling stops for over a year before Tesla disabled the feature. He agreed to be interviewed on the condition that only his first name be used due to privacy concerns.

Scrutiny picked up this year. Regulators at the National Highway Traffic Safety Administration asked Tesla about the feature, and in January, the automaker initiated an “over-the-air” software update to disable it. NHTSA classified the software update as an official safety recall.

Critics were taken aback not only by the choice to design software that way but also by Tesla’s decision to test out the features using customers, not professional test drivers.

Safety advocates said they didn’t know of any U.S. jurisdiction where rolling stops are lawful, and they couldn’t determine any safety justification for allowing them.

“They’re very transparently violating the letter of the law, and that is completely corrosive of the trust that they’re trying to get from the public,” said William Widen, a law professor at the University of Miami who has written about autonomous vehicle regulation.

“I would be upfront about it,” Widen said, “as opposed to getting their hand caught in the cookie jar.”

Safety advocates also questioned two entertainment features unrelated to autonomous driving that they said sidestepped safety laws. One, called Passenger Play, allowed drivers to play video games while moving. Another, called Boombox, let drivers blast music or other audio out of their cars while in motion, a possible danger for pedestrians, including blind people.

Tesla recently pushed software updates to restrict both of those features, and NHTSA opened an investigation into Passenger Play.

Tesla, the top-selling electric vehicle maker, has not called the features a mistake or acknowledged that they may have created safety risks. Instead, Musk denied that rolling stops could be unsafe and called federal automotive safety officials “the fun police” for objecting to Boombox.

Separately, NHTSA is investigating Tesla for possible safety defects in Autopilot, its standard driver assistance system, after a string of crashes in which Tesla vehicles, with the systems engaged, crashed into stationary first-responder vehicles. Tesla has faced lawsuits and accusations that Autopilot is unsafe because it can’t always detect other vehicles or obstacles in the road. Tesla has generally denied the claims made in lawsuits, including in a case in Florida where it said in court papers that the driver was at fault for a pedestrian death.

NHTSA declined an interview request.

It’s not clear what state or local regulators may do to adjust to the reality that Tesla is trying to create.

“All vehicles operated on California’s public roads are expected to comply with the California Vehicle Code and local traffic laws,” the California Department of Motor Vehicles said in a statement.

The agency added that automated vehicle technology should be deployed in a manner that both “encourages innovation” and “addresses public safety” — two goals that may be in conflict if innovation means purposely breaking traffic laws. Officials there declined an interview request.

Musk, like most proponents of self-driving technology, has focused on the number of deaths that result from current human-operated vehicles. He has said his priority is to bring about a self-driving future as quickly as possible in a theoretical bid to reduce the 1.35 million annual traffic deaths worldwide. However, there’s no way to measure how safe a truly self-driving vehicle would be, and even comparing Teslas to other vehicles is difficult because of factors such as different vehicle ages.

Industry pledges

At least one other company has faced an allegation of purposefully violating traffic laws, but with a different result than in Tesla’s case.

Last year, San Francisco city officials expressed concern that Cruise, which is majority-owned by General Motors, had programmed its vehicles to make stops in travel lanes in violation of the California vehicle code. Cruise’s developmental driverless vehicles are used in a robotaxi service that picks up and drops off passengers with no driver behind the wheel.

Cruise responded with something that Tesla hasn’t yet offered: a pledge to obey the law.

“Our vehicles are programmed to follow all traffic laws and regulations,” Cruise spokesperson Aaron Mclear said in a statement.

Another company pursuing self-driving technology, Waymo, has programmed its cars to break traffic laws only when they’re in conflict with each other, such as crossing a double yellow line to give more space to a cyclist, Waymo spokesperson Julianne McGoldrick said.

“We prioritize safety and compliance with traffic laws over how familiar a behavior might be for other drivers. For example, we do not program the vehicle to exceed the speed limit because that is familiar to other drivers,” she said in a statement.

A third company, Mercedes, said it was willing to be held liable for accidents that occur in situations where it promised that its driver assistance system, Drive Pilot, would be safe and adhere to traffic laws.

Mercedes did not respond to a request for information about its approach to automated vehicles and whether they should ever skirt traffic laws.

Safety experts aren’t ready to give Tesla or anyone else a pass to break the law.

“At a time when pedestrian deaths are at a 40-year high, we should not be loosening the rules,” said Leah Shahum, director of the Vision Zero Network, an organization trying to eliminate traffic deaths in the U.S.

“We need to be thinking about higher goals — not to have a system that’s no worse than today. It should be dramatically better,” Shahum said.




