Tesla's Autopilot feature raises concerns from safety experts

Three crashes involving Teslas that killed three people last month have increased scrutiny of the company's Autopilot system, just months before CEO Elon Musk plans to put self-driving cars on the streets.

A Tesla Model S left a freeway in California, ran a red light and struck a Honda Civic, killing two people inside, police said. A Tesla Model 3 hit a parked firetruck on an Indiana freeway, killing a passenger in the Tesla. Finally, on December 7, another Model 3 struck a police cruiser on a Connecticut highway.

The National Highway Traffic Safety Administration is looking into the California crash, but hasn't decided whether it will review the Indiana crash. In both cases, authorities haven't determined if Tesla's Autopilot was used. NHTSA also is investigating the Connecticut crash, in which the driver told police the car was on Autopilot. 

Autopilot is a Tesla system designed to keep a car in its lane and a safe distance from other vehicles. In April, Musk said he expected to start converting Tesla's electric cars into self-driving vehicles this year. Tesla has said its Autopilot system is designed to assist drivers, who must pay attention and be ready to intervene. The company contends Teslas with Autopilot are safer than vehicles without it, but cautions that the system doesn't prevent all crashes.

Record sales

The Autopilot concerns come at a time when Tesla appears to be gaining momentum. 

The company reported Friday that it delivered about 112,000 vehicles in the fourth quarter, a record. Tesla's Model 3 accounted for 83 percent of fourth-quarter sales at 92,550. The Model S sedan and Model X made up the rest. The sales increase should bode well for Tesla's fourth-quarter earnings next month. In the third quarter, Tesla posted a surprising $143 million profit.

Shares of Tesla rose more than 4% in late morning trading, gaining $17.67 to $447.93.

"If Tesla is able to sustain this level of profitability and demand for the company going forward, especially in Europe and China, then the stock will open up a new chapter of growth and multiple expansion in our opinion," analysts from Wedbush wrote in a Friday note.

Too reliant on Autopilot?

Still, safety advocates say the Tesla crashes raise questions about whether drivers are too reliant on Autopilot and whether the company does enough to ensure drivers pay attention. Critics have said it's time for NHTSA to stop investigating and take action.

NHTSA is investigating 13 Tesla crashes dating back to at least 2016 in which the agency believes Autopilot was engaged. The agency has yet to issue regulations.

"At some point, the question becomes: How much evidence is needed to determine that the way this technology is being used is unsafe?" said Jason Levine, executive director of the nonprofit Center for Auto Safety. "In this instance, hopefully these tragedies will not be in vain and will lead to something more than an investigation by NHTSA."

Levine and others have called on the agency to require Tesla to limit the use of Autopilot mainly to four-lane divided highways without cross traffic. They also want Tesla to install a better system to make sure drivers are paying attention. Tesla's system requires drivers to keep their hands on the steering wheel, but federal investigators have found that Autopilot still lets drivers zone out.

Doubts about Tesla's Autopilot system have long persisted. In September, the National Transportation Safety Board reported that a design flaw in Autopilot and driver inattention caused a Tesla Model S to slam into a parked firetruck near Los Angeles in January 2018. The board determined the driver was overly reliant on Autopilot. 

David Friedman, vice president of advocacy for Consumer Reports and a former acting NHTSA administrator, said the agency should have declared Autopilot defective and sought a recall after a 2016 crash in Florida that killed a driver. Instead, NHTSA closed its investigation of that crash without seeking a recall; Friedman said the agency determined the problem didn't occur frequently enough to warrant action.

In a statement, NHTSA said it relies on data to make decisions, and if it finds any vehicle poses an unreasonable safety risk, "the agency will not hesitate to take action." 
