Why Is Tesla Recalling Almost All Its Cars in the U.S.?

by TexasDigitalMagazine.com



On Wednesday, Tesla announced that it would recall more than 2 million cars — almost all of its vehicles on the road in the U.S. — over issues with its “Autopilot” system, one of the electric carmaker’s central features. The recall was spurred by a yearslong investigation by the National Highway Traffic Safety Administration (NHTSA), which found that Autopilot, the system that helps drivers steer, accelerate, and brake, has been involved in a large number of accidents. (Tesla disputes that characterization.) The recall is a blow for the Elon Musk–led company, which has lately been struggling with production, sales, and other problems. Below is a rundown of the known safety issues with Autopilot and what the recall will mean.

According to the NHTSA, the recall is meant to fix defects in how Teslas verify that drivers are paying attention while the vehicle’s Autopilot advanced driver-assistance system is in use. It comes after meetings between the agency and Tesla, beginning in October, during which investigators detailed the safety issues they believed needed to be fixed. Tesla then agreed to a voluntary recall implementing some software changes, though it did not agree with the NHTSA’s analysis. The recall follows a two-year investigation by the agency into crashes — including some with fatalities — that occurred while Tesla vehicles had the Autopilot system engaged.

The recall covers virtually all of the Tesla vehicles currently on the road in the U.S., excluding only those that don’t support the company’s Autopilot system: Model Y, S, 3, and X vehicles produced since October 2012.

Tesla’s Autopilot is a “hands-on” advanced driver-assistance system that includes the Autosteer and Traffic-Aware Cruise Control features. The system, which comes standard on all the company’s vehicles, uses sensors and cameras to enable semi-autonomous driving.

Tesla owners can also pay extra for Enhanced Autopilot, which adds features such as automatic navigation assistance, lane changes, and self-parking. Autopilot is not the same thing as the company’s Full Self-Driving Capability, which costs even more and includes all of the above plus a few more features intended to let the car “drive itself almost anywhere with minimal driver intervention.”

The recall takes the form of an automatic over-the-air software update pushed out to the affected vehicles. Tesla began rolling out the update this week, and it will come at no cost to Tesla owners.

Simply put, the software update will add additional driver monitoring to Tesla’s Autopilot system: new driver-engagement checks and alerts aimed at keeping Tesla drivers from taking their attention off the road while using Autopilot. For instance, the updated software will alert drivers if they try to use Autosteer in driving conditions it wasn’t designed for (i.e., anywhere other than limited-access highways and expressways that have on-off ramps, a center divider, and no cross traffic). It will also disable Autosteer under certain circumstances, including, per the NHTSA, “if the driver repeatedly fails to demonstrate continuous and sustained driving responsibility while the feature is engaged.” Driver engagement is monitored using torque sensors on the steering wheel that detect a driver’s hands and a camera inside the vehicle that tracks the driver’s head movements.
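For readers curious what those driver-engagement checks amount to in practice, the following is a minimal, hypothetical sketch (in Python) of the alert-escalate-disable pattern the update describes. Every name, field, and threshold here is invented for illustration and does not come from Tesla’s actual software.

```python
# A minimal, hypothetical sketch of the alert -> escalate -> disable pattern
# described above. Class names, fields, and the strike threshold are invented
# for illustration only; they do not reflect Tesla's actual software.

from dataclasses import dataclass


@dataclass
class DriverState:
    hands_on_wheel: bool           # e.g., inferred from steering-wheel torque
    eyes_on_road: bool             # e.g., inferred from the cabin camera
    road_is_limited_access: bool   # divided highway with on/off ramps


class EngagementMonitor:
    MAX_STRIKES = 3  # illustrative threshold, not a real Tesla parameter

    def __init__(self) -> None:
        self.strikes = 0
        self.autosteer_enabled = True

    def check(self, state: DriverState) -> str:
        if not self.autosteer_enabled:
            return "Autosteer locked out for this drive"
        if not state.road_is_limited_access:
            # The update warns when Autosteer is engaged on road types
            # it wasn't designed for.
            return "warn: Autosteer is not designed for this road type"
        if state.hands_on_wheel and state.eyes_on_road:
            self.strikes = 0
            return "ok"
        # Driver appears inattentive: escalate, then disable after repeats.
        self.strikes += 1
        if self.strikes >= self.MAX_STRIKES:
            self.autosteer_enabled = False
            return "Autosteer disabled after repeated inattention"
        return "alert: apply steering torque and watch the road"


# Example: an inattentive driver on a highway triggers escalating alerts.
monitor = EngagementMonitor()
for _ in range(4):
    print(monitor.check(DriverState(hands_on_wheel=False, eyes_on_road=False,
                                    road_is_limited_access=True)))
```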

A recent Washington Post investigation that combed through NHTSA data found that since 2019, Teslas in Autopilot mode have been involved in 736 crashes. The Post reported that the number of accidents has surged over the last four years, which reflects “the hazards associated with increasing use of Tesla’s driver-assistance technology as well as the growing presence of the cars on the nation’s roadways.” Autopilot appears to be particularly hazardous for motorcyclists; four motorcyclists have been killed since 2019 in Autopilot-related crashes, out of 17 fatal collisions overall.

Autopilot has also drawn federal scrutiny before, over a feature that allowed drivers to take their hands off the wheel.

The problem with the question of overall safety is one of comparison. Tesla claims Autopilot prevents many accidents that would otherwise occur and that drivers get into more accidents when they don’t use the system than when they do. But as the New York Times noted in 2022, that statistic is misleading, since Autopilot is generally used for highway driving, which tends to be safer than driving in suburban or rural areas. And there’s a serious lack of data comparing Autopilot’s accident statistics with those of similar systems from other automakers:

Tesla has not provided data that would allow a comparison of Autopilot’s safety on the same kinds of roads. Neither have other carmakers that offer similar systems. Autopilot has been on public roads since 2015. General Motors introduced Super Cruise in 2017, and Ford Motor brought out BlueCruise last year. But publicly available data that reliably measures the safety of these technologies is scant. American drivers — whether using these systems or sharing the road with them — are effectively guinea pigs in an experiment whose results have not yet been revealed.

Bloomberg notes that the recall could prompt or bolster lawsuits alleging that the use of Tesla’s Autopilot led to crashes:

Half a dozen lawsuits headed to trial in the next year in Florida, California and Texas allege that Tesla allowed Autopilot to be used on roads for which it wasn’t designed and that the technology failed to send sufficient warnings when drivers became disengaged. Lawyers leading the cases say these very issues are mirrored in the recall.

As the Wall Street Journal reports, Tesla has also been accused of overpromising what Autopilot is capable of:

Tesla, which didn’t respond to requests for comment, has previously argued in a court filing that its statements about its driver-assistance technology are legally protected forecasts, truthful opinions or “inactionable corporate puffery.” …

Tesla’s promotion of Autopilot has for years sparked criticism that the company has provided drivers with an inflated sense of the technology’s capabilities and created confusion over what constitutes safe use. The U.S. Justice Department and Securities and Exchange Commission have opened investigations into whether Tesla misled the public in how it marketed Autopilot. Neither has brought any enforcement actions against Tesla in connection with the investigations. In a continuing private case in Florida involving a 2019 fatal crash, a judge ruled in November that the plaintiff could seek punitive damages, saying there was evidence Tesla overstated Autopilot’s performance.

Critics say the recall won’t fix the underlying problem, though safety experts say it’s at least a start. As The Verge’s Andrew J. Hawkins notes, the software update will make it harder to misuse Autopilot, but not impossible:

“It’s progress,” said Mary “Missy” Cummings, a robotics expert who wrote a 2020 paper evaluating the risks of Tesla’s Autopilot system, “but minimal progress.” Cummings said the National Highway Traffic Safety Administration missed an opportunity to force the company to address concerns around Tesla owners using Autopilot on roads where it wasn’t intended to work. … “It’s very vague,” she said.

Another expert told The Verge that Tesla drivers will still be able to fool their car’s monitoring system if they want to:

Allowing Tesla to push an over-the-air software update ignores many of the structural defects with Autopilot, said Sam Abuelsamid, principal research analyst at Guidehouse Insights. The torque sensors are prone to false positives, such as when drivers try to trick the system by adding a weight to the steering wheel that counteracts automatic movements, and false negatives, like when the wheel fails to detect a driver’s hands if they are holding it steady. …

Meanwhile, the camera, which only went into use for Autopilot driver monitoring in 2021, doesn’t work in low-light conditions, he noted. Other automakers use infrared sensors that can detect depth and work in low-light situations. Consumer Reports demonstrated recently that Tesla’s cameras could be tricked into thinking there was someone in the driver’s seat when there wasn’t.

“This absolutely could have gone another way,” Abuelsamid said. “NHTSA could do its job and actually force Tesla to do a recall and install robust driver eye and hand monitoring and true geofencing of the system or disable Autosteer altogether if they cannot do a hardware update.”
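To make the failure modes Abuelsamid describes more concrete, here is a small, hypothetical Python sketch of why a check based only on steering-wheel torque can be gamed in both directions. The threshold and readings are invented for illustration and are not Tesla’s.

```python
# A hypothetical illustration of the torque-sensor weakness described above.
# A check that only looks for steering-wheel torque above a threshold cannot
# tell a hand from a hung weight, and can miss hands holding the wheel steady.
# The threshold and readings below are invented for illustration.

TORQUE_THRESHOLD_NM = 0.3  # newton-meters; not a real Tesla value


def hands_detected(measured_torque_nm: float) -> bool:
    """Naive hands-on-wheel check based solely on resisting torque."""
    return abs(measured_torque_nm) >= TORQUE_THRESHOLD_NM


# False positive: a weight strapped to the wheel resists Autosteer's movements,
# so the check reports hands on the wheel even though no one is holding it.
print(hands_detected(0.5))   # True

# False negative: a driver holding the wheel perfectly steady applies almost
# no resisting torque, so the check reports no hands even though they're there.
print(hands_detected(0.05))  # False
```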

This post has been updated.


