Tesla’s Autopilot is under investigation by the National Highway Traffic Safety Administration (NHTSA) after reports of nearly a dozen collisions with emergency vehicles and first responders.

Detailing 11 crashes that resulted in 17 injuries and one death, the NHTSA’s report states that in each accident the vehicle was using Tesla’s Autopilot or the similar Traffic Aware Cruise Control system.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” the investigation report read.

This investigation into Tesla’s Autopilot marks the third time the company has come under fire from the NHTSA within the past year. In January, a separate investigation was carried out following reports of “sudden unintended acceleration,” covering more than 110 crashes and 52 injuries. A third investigation in March led to a recall of more than 5,530 Tesla vehicles over front seat belts that were not properly fastened and could come loose, increasing the risk of injury.

Tesla stock fell 5% the morning the probe was announced. Earlier in the week, Elon Musk and Tesla had also been left off the guest list for a White House event promoting electric cars.

According to Tesla, although the Autopilot system is designed to handle acceleration, minor steering adjustments, and navigation, drivers are warned never to take their hands off the wheel. When a driver engages Autopilot, a visual reminder on the screen instructs them to “keep your hands on the wheel.” As the NHTSA report notes, “With the Advanced Driver Assistance System (ADAS) active, the driver still holds primary responsibility for Object and Event Detection and Response,” should an emergency or irregularity become too complex for the vehicle.

Despite referring to the function as “Autopilot and Full Self-Driving Capability,” Tesla’s website lists the dangers of using the technology, stating that it “does not turn a Tesla into a self-driving car nor does it make a car autonomous.” Tesla writes that the Autopilot feature is limited by factors such as poor visibility, bright lights, and interference from other vehicles.

The investigation into Tesla Autopilot will “assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” and will determine whether Autopilot was primarily to blame for the vehicles’ collisions with emergency responders.

Just days before the NHTSA announced its investigation, Tesla tweeted an unconfirmed statistic: “In 2020, a Tesla with Autopilot engaged experienced 0.2 accidents per million miles driven, while the US average was 9x higher.”

Critics have long described Tesla’s Autopilot as unsafe, citing its unpredictability on the road. In a review for Road & Track, the site called the function “laughably bad and potentially dangerous.” Industry Week said in 2018 that the car was “unsafe at every speed,” and Consumer Reports published a study comparing Autopilot’s movements to those of a drunk driver.

Jake Fisher, senior director of auto testing at Consumer Reports, told USA Today in April that “there’s certainly nothing anywhere close to self-driving that is in production right now.” The comments came after a crash involving a Tesla in Houston that killed two passengers.

Last year, the National Transportation Safety Board (NTSB) issued a letter to the NHTSA urging the agency to regulate self-driving cars more strictly. In the letter, NTSB chair Robert Sumwalt wrote that “Tesla is testing on public roads a highly automated AV technology but with limited oversight or reporting requirements.”

Tesla has yet to comment on the latest investigation into its Autopilot system.