Tesla Autopilot Leads to Another Crash Into a Police Car

Lisa Steuer McArdle
It's early morning on a Florida Saturday. A Mercedes-Benz SUV is experiencing trouble and is stopped on the side of the road. A police officer stops to provide assistance when suddenly both vehicles are side-swiped in the gray dawn light by a car on Autopilot, a Tesla Model 3.
Fortunately, none of the people involved were seriously injured, but all three vehicles were severely damaged in a crash that never should have occurred.
A Tesla Model 3 using autopilot was involved in an accident with a police car.

Tesla driver on autopilot crashes into stopped police car

The crash occurred at 5 AM on the side of I-4. The SUV had become disabled, and the police officer was offering assistance when the Tesla struck both vehicles. Thankfully, the driver of the SUV, the police officer, and the Tesla Model 3 driver were not injured, but images of the smashed vehicles are a reminder of how much worse it could have been.
The driver of the Tesla Model 3 said that her vehicle was using Tesla's Autopilot driver-assist system.
This isn't the first time that Teslas on autopilot have hit stopped emergency vehicles on the side of the road. In the last few years, Teslas have hit police cars, fire trucks, and other emergency vehicles—specifically when they were stopped to assist a disabled vehicle.

Regulators investigate Tesla's Autopilot safety

Tesla has long been criticized for marketing driving modes as self-driving when they are not. In the fine print, Tesla tells drivers that while Autopilot features are handy, they should keep both hands on the wheel and pay close attention to the road while the system is engaged.
Why? Because sometimes Autopilot gets it wrong. When the computer makes a mistake, the driver needs to be alert enough to take over, and marketing the system as fully self-driving is a premature move that puts both Tesla and its customers at risk of computer driving errors.
This crash occurred only two weeks after the National Highway Traffic Safety Administration opened an official investigation into Autopilot's safety. The investigation focuses specifically on incidents in which Teslas on Autopilot have run into emergency vehicles stopped to assist disabled cars. This pattern of crashes with little to no driver intervention is troubling, despite the system's otherwise capable performance.

What this means for Tesla autopilot

Tesla vehicles have delivered many safe, mostly-automated miles. The driver-assist features handle lane keeping, use cameras to detect surrounding traffic, and can even automate parts of highway navigation.
However, Tesla's Autopilot system is not yet ready to operate without a human behind the wheel. The dream of kicking back to watch a movie while your car handles the commute is still in the future for Tesla owners, and even Waymo has yet to fully phase out "safety drivers" from its automated taxi service.
The lesson to take from this incident and others like it is that self-driving is not quite ready to be driver-free. It's still important to pay attention behind the wheel of a mostly-automated vehicle, because crashes like this one do still happen.
