Ethical Issues That Will Come With Self-Driving Cars

Genevieve Fraser
Updated on Jun 27, 2022 · 4 min read
Emerging technologies used in self-driving cars can prevent injuries and save lives. In fact, a 2015 report by McKinsey & Company estimates that self-driving cars could reduce accident rates by up to 90%.
Such technologies can warn occupants of their proximity to other vehicles, prevent risky lane changes, and automatically brake in reaction to sudden changes in the environment. This technology continues to evolve, with the ultimate goal of maximizing safety for everyone.
However, despite the benefits of driverless cars, social scientists argue that they could raise important ethical issues that negatively impact the public and the environment.

Accident scenarios pose near-impossible challenge

Self-driving cars may be the future, but at what cost?
As the Good Men Project explains, self-driving cars introduce an unprecedented dilemma: when an accident is unavoidable, whose safety will the vehicle prioritize? How will they navigate unfamiliar roads, extreme weather, and crowded areas? The short answer is that we don't know yet. Neither the current infrastructure nor the technology is sophisticated enough to provide answers, at least ones that suffice for all self-driving cars.
It certainly seems ethically fraught for technology to make decisions about life and death, and to prioritize one person's safety over another's. This dilemma is a version of a classic thought experiment in ethics known as the "trolley problem."
As Forbes explains, it goes something like this: imagine five people in the path of your autonomous vehicle, which swerves to avoid them, but in doing so strikes a pedestrian on its new path. Was the right decision made?
This is an impossible question that automakers and engineers must face when designing the executive decision-making functions of the technology.
Wired reports that driverless technology is taught to distinguish vulnerable from non-vulnerable beings. Unprotected humans would naturally be considered most vulnerable, as opposed to, say, a parked car. But what if there were people in that parked car? It is a complex problem indeed, and one that needs extensive investigation.
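To make the Wired description concrete, here is a greatly simplified, purely illustrative sketch of how a perception system might rank detected road users by vulnerability. The categories and weights are invented for this example; real autonomous-driving stacks use far more sophisticated perception and planning, and this is not any automaker's actual logic.

```python
# Hypothetical sketch: ranking detected road users by vulnerability.
# Categories and weights are invented for illustration only.

VULNERABILITY = {
    "pedestrian": 3,       # unprotected human: most vulnerable
    "cyclist": 2,          # lightly protected human
    "occupied_vehicle": 1, # humans protected by the vehicle body
    "parked_vehicle": 0,   # assumed empty: least vulnerable
}

def most_vulnerable(detected_objects):
    """Sort detected objects from most to least vulnerable."""
    return sorted(detected_objects,
                  key=lambda obj: VULNERABILITY.get(obj, 0),
                  reverse=True)

ranking = most_vulnerable(["parked_vehicle", "pedestrian", "cyclist"])
print(ranking)  # pedestrian first, parked vehicle last
```

The sketch also shows exactly where the article's objection bites: a "parked_vehicle" is ranked least vulnerable on the assumption that it is empty, and the classification silently breaks if people are inside it.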

Hackers could control self-driving cars

In some cases, it could take only an internet connection and some know-how for ill-intentioned individuals to gain access to your vehicle's controls, which rely heavily on software to operate.
Full access would mean that the hacker could control the speed and direction of the vehicle, with potentially devastating consequences.
Although automakers do their best to secure their software, it can still be vulnerable to attacks, and hackers often adapt their methods to defeat upgraded security measures.
The fact that self-driving cars are usually connected to multiple devices (other autonomous vehicles, networks, and infrastructure) also inherently exposes them to hacks.

Private data is at risk

Self-driving cars rely on a huge amount of data that they build and learn from as they navigate their surroundings. While this is supposed to make them super smart, it could come at your expense.
Smartphone integrations, vehicle networks, and internet connectivity mean that the vehicle is privy to the data on connected devices, which could be ruinous if sensitive financial, identity-related, or other personal information got into the wrong hands.

How can we mitigate these ethical issues?

Automakers and authorities are working hard to develop solutions to these problems in order to make self-driving cars a safe and effective reality for all.
Forbes calls for greater transparency between the auto industry and the public, which would equip consumers with increased awareness of the benefits and potential drawbacks of driverless technology.
They note that many of these ethical considerations shift the responsibility from humans to technology, and in turn, onto automakers who need to be accountable for their programming decisions.
The National Highway Traffic Safety Administration (NHTSA) is beginning to address the "trolley problem" by asking automakers for letters of safety assessment, which are meant to address ethical considerations.
The NHTSA specifically provides information on issues surrounding hacking and cybersecurity, stating that it is working with the auto industry to develop risk-based processes to quickly identify, measure, and mitigate security threats.
To address privacy concerns regarding the collection of personal data by autonomous vehicles, Celantur explains that regulations such as the Driver Privacy Act of 2015 are in place. This Act upholds lessee and owner rights by protecting the information collected by event data recorders, except in circumstances of legal, maintenance, or safety concerns.
Celantur also notes that European lawmakers suggest using a "data minimization method" of collecting only absolutely necessary information.
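The data-minimization idea Celantur describes can be sketched in a few lines: keep only the fields strictly needed for the driving task and drop everything else before storage or transmission. The field names below are invented for illustration and do not reflect any real vehicle's telemetry schema.

```python
# Hypothetical sketch of data minimization: retain only fields strictly
# needed for the task. Field names are invented for illustration.

REQUIRED_FIELDS = {"timestamp", "speed", "obstacle_distance"}

def minimize(record):
    """Strip a telemetry record down to the required fields only."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

full_record = {
    "timestamp": "2022-06-27T10:00:00Z",
    "speed": 48.2,
    "obstacle_distance": 12.5,
    "driver_name": "Jane Doe",    # personal data: dropped
    "phone_contacts": ["..."],    # personal data: dropped
}
print(minimize(full_record))  # personal fields are gone before storage
```

The design choice is an allow-list rather than a block-list: anything not explicitly required is discarded, so newly added personal fields are excluded by default.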
Despite the ethical concerns presented by self-driving cars, it could be argued that the benefits are too great to ignore. After all, the NHTSA says that human error causes 94% of serious crashes. The hope is that with effective technology in charge, this number could drop drastically, but first there is clearly a lot of work to do.
