Crashes involving Tesla’s autopilot feature have been highlighted heavily by the news media. There has always been controversy surrounding these accidents, but whether faulty technology or driver error was to blame has been difficult to pin down. Thanks to a comprehensive story published by Consumer Reports, however, it seems that one fatal flaw in Tesla’s autopilot system could bring about a whole new wave of safety regulations for driver assistance systems. And, considering that automated vehicle (AV) technology for everyday use is looming on the horizon, implementing safeguards for driver assistance systems might not be such a bad idea.
A recent tragedy involving a Tesla
The accident that has motivated the latest push for Tesla’s autopilot system (and those offered by competitors) to be more strictly regulated occurred on April 17th, 2021, when a Tesla Model S veered off a winding road outside of Houston, TX, and struck a tree. Both occupants of the vehicle were killed in the crash; police reports have alleged that neither was driving.
Tesla has stated that further investigation into the accident found the warped shape of the steering wheel suggested someone was in the driver’s seat at the time of the wreck, but that theory seems unlikely and has not yet been supported by local authorities.
Consumer Reports investigates Tesla’s autopilot system
In the wake of April’s accident, the nonprofit organization Consumer Reports (CR) conducted its own investigation of Tesla’s autopilot system. Using a Model Y, CR engineers found that the car would in fact operate in autopilot mode without anyone sitting in, or applying pressure to, the driver’s seat. The Model Y also issued no warning to passengers that the driver’s seat was empty.
What would autopilot regulation look like?
It’s currently hard to say what AV regulations will look like in the future. However, the issue is on the minds of many in the House and Senate. The National Conference of State Legislatures (NCSL) reports that since 2012, 41 states have considered legislation to regulate the safety and precautionary measures used in driver-assist technologies. Of those 41 states, 29 have actually enacted laws to protect drivers and roadways against faulty assisted driving technology. The recent crash in Texas has only increased the demand for stricter regulatory measures.
Another reason there’s not much regulation going on? The National Highway Traffic Safety Administration (NHTSA) has nothing in place for semi-autonomous driving tech like Tesla’s autopilot mode. In fact, the NHTSA doesn’t have any regulations or safety standards for completely autonomous vehicles, either. These gaps will have to be closed soon. Otherwise, we’ll continue to see tragedies like the one that occurred in April popping up across news headlines. And it isn’t as if autopilot systems would disappear if they were regulated more closely; they’d just be safer.
Until our laws catch up with this new frontier in the automotive industry, Tesla’s autopilot mode, and assisted driver technology in general, will continue to operate in the wild, wild west.