Tesla Recalls Autopilot Software in 2 Million Vehicles

Tesla’s reputation for making technologically advanced cars suffered a blow on Tuesday when the company, under pressure from regulators, recalled more than two million vehicles. U.S. officials said the automaker had not done enough to ensure that drivers remained attentive when using a system that can steer, accelerate and brake cars automatically.

The recall by Tesla, the world’s dominant maker of electric vehicles, was its fourth in less than two years and the most significant to date. It covers nearly all cars the company has manufactured in the United States since 2012, including its most popular, the Model Y sport-utility vehicle.

Tesla accounts for about half of the electric passenger cars sold in the United States, but its market share has been slipping as General Motors, Hyundai, Ford Motor and other automakers have begun selling electric models. In addition, recent public statements by Elon Musk, Tesla’s chief executive, have been widely interpreted as antisemitic and offended some customers. The recall amounts to another dent in the company’s image.

“There’s no question” that the company’s brand “has taken a hit this year,” Gary Black, managing partner of the Future Fund, who is generally positive about Tesla, said on the social media site X, which is owned by Mr. Musk.

The recall follows an investigation into Tesla’s driver-assistance system, Autopilot, which the National Highway Traffic Safety Administration began in August 2021 after a series of accidents, some fatal, involving the technology. Autopilot is designed to control vehicles on its own when on highways. Tesla’s owner’s manuals tell drivers that they should keep their hands on the wheel and take over if anything goes wrong.

The recall reflects regulators’ concern that Tesla did not do enough to prevent drivers from misusing the system, including by turning it on while driving on local roads and by becoming distracted because they assumed that their car could drive itself.

What Tesla calls Autopilot is a collection of features allowing the car’s technology to take over a vehicle’s control to varying degrees. The feature singled out in the recall, Autosteer, can keep a car in a lane without driver intervention.

There may be “increased risk of a crash,” the safety administration said, when Autosteer is engaged and drivers do not “maintain responsibility for vehicle operation.”

To address that problem, Tesla said it would wirelessly update its cars to add new, more prominent visual alerts and checks when Autosteer is engaged to remind drivers to keep their hands on the wheel and pay attention to the road. Mr. Musk did not respond to a request for comment.

Tesla said it did not agree with the agency’s assessment of the system. The company has maintained that Autopilot makes its cars safer.

Manufacturers are responsible for preventing foreseeable misuse of their technology, the transportation secretary, Pete Buttigieg, who oversees the auto safety agency, said during a meeting with reporters and editors at The New York Times on Wednesday.

“They’re going to say that it’s safer with these systems,” Mr. Buttigieg said, referring to Tesla. “That could be true. But you know, a car with an airbag is safer than a car without an airbag. If the airbag blows up, we’re still going to recall it.”

On Monday, Tesla said on the X platform, “It is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury,” a response to an article in The Washington Post about the technology’s flaws.

Some experts question whether Autopilot makes driving safer. Philip Koopman, an associate professor at Carnegie Mellon University who studies self-driving software, said that improvements in safety come almost entirely from one feature, automatic emergency braking, which is standard on new vehicles.

“Autopilot is not a safety feature, it’s a convenience feature,” Mr. Koopman said.

The regulator said its investigation would continue.

The recall was “egregiously overdue,” two Democratic senators — Edward Markey of Massachusetts and Richard Blumenthal of Connecticut — said in a statement. “We urge N.H.T.S.A. to continue its investigations to spur necessary recalls, and Tesla to stop misleading drivers and putting the public in great danger.”

The investigation is the most prominent example of a wider push and pull between government regulators and companies developing technologies that allow vehicles to drive on their own in certain situations.

In October, California regulators ordered Cruise, a G.M. subsidiary, to stop its driverless taxi service in San Francisco after a series of traffic mishaps, including one in which a Cruise car dragged a pedestrian 20 feet after a crash. The company has since suspended its driverless car operations across the country.

Mr. Buttigieg said that some companies have pushed out self-driving technology before it was proved to be safe for broad use. These companies, he said, were motivated in part by the large number of highway accidents and deaths. More than 30,000 people were killed in traffic accidents in the first nine months of this year, down 4.5 percent from a year earlier but about 6,000 more than in the same period in 2013.

“I think the culture in a lot of these companies is around getting to that promised land, where the faster we build this technology and the sooner we get it out there, the better off everybody will be because human drivers have a murderous track record,” he said.

The auto safety agency said that in August 2021 it began investigating 11 incidents involving Tesla vehicles that were operating with Autosteer engaged. A series of meetings between the agency and Tesla followed, and the company decided this month to voluntarily conduct a recall.

Over the course of its investigation, the safety agency said, it has reviewed 956 crashes in which Autopilot was engaged. Then it focused on 322 crashes, including frontal collisions and situations where Autopilot may have been accidentally engaged.

Tesla began issuing wireless software updates to certain vehicles this week, safety officials said. The remaining vehicles will receive updates later. Tesla has for years updated the software in its cars using cellular networks, usually overnight when cars are parked.

Depending on the hardware on a car, some updated vehicles will feature more prominent visual alerts as well as additional checks when using Autosteer. The feature will also be suspended if drivers repeatedly fail to use it responsibly.

Letters to Tesla owners notifying them of the update are expected to be mailed in February.

Tesla’s recall this week is the latest episode to heighten public scrutiny of the automaker. In October, a California jury found that the company’s driver-assistance software was not at fault in a crash that killed a Tesla owner and seriously injured two passengers.

Several similar cases are being litigated around the country. One involves a 2019 crash in Florida in which a Tesla operating on Autopilot on a country road ran through a stop sign and hit a parked car, killing Naibel Benavides, 22, and severely injuring her companion.

Todd Poses, a lawyer representing the family of Ms. Benavides, said the recall showed Tesla was aware Autopilot was being used on roads where it wasn’t safe, and failed to restrict where it could be turned on.

“This accident never should have happened because it doesn’t work on roads like this,” Mr. Poses said on Wednesday, referring to Autopilot.

The driver of the Tesla had dropped his phone and bent down to look for it, trusting Autopilot to steer his Tesla Model S. But the car failed to stop at a T-intersection with a flashing red light.

“This technology is not safe, it needs to be off the roads,” said Dillon Angulo, who was with Ms. Benavides and suffered a traumatic brain injury, a broken pelvis and a broken jaw.

Tesla has faced several other recalls. In May, China ordered Tesla to recall 1.1 million vehicles, citing an issue with the acceleration and braking systems of certain models manufactured in China and abroad.

A few months earlier, Tesla recalled more than 362,000 cars equipped with its Full Self Driving driver-assistance system, a more advanced technology than Autopilot, after U.S. regulators found it increased the risk of accidents.

The more advanced system allows vehicles to travel above legal speed limits and through intersections in “an unlawful and unpredictable manner,” safety officials said.

And in early 2022, Tesla recalled 54,000 cars equipped with its Full Self Driving software to disable a feature that in certain conditions let the vehicles roll slowly through intersections without making required stops.

Tesla sells Full Self Driving separately from Autopilot. But the two services are underpinned by the same technologies. In the past, drivers who have not purchased the more advanced system have been able to use Autopilot on roads that are not highways.

The company’s recall notice says that drivers will be alerted when they are using Autopilot outside of roadways where the technology is intended to operate. But it is unclear whether they will still be permitted to use the technology in those situations.

“N.H.T.S.A. has forced Tesla to focus on the right issues,” said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies. “But everything depends on the details.”

Neal E. Boudette and Eric Lipton contributed reporting.