Some estimates suggest that self-driving cars could reduce accident rates by up to 90%.
Such technologies can warn occupants of their proximity to other vehicles, prevent risky lane changes, and automatically brake in reaction to sudden changes in the environment. This technology continues to evolve, with the ultimate goal of maximizing safety for everyone.
However, despite the benefits of driverless cars, social scientists argue that they could raise important ethical issues that negatively impact the public and the environment.
The technology is not yet sophisticated enough to provide the answers, at least ones that suffice for all self-driving cars.
It certainly seems highly unethical for technology to make decisions about life and death, and to prioritize one individual's safety over another's. This dilemma is captured by an ethical thought experiment known as the "trolley problem."
The problem goes something like this: imagine that five people are in the path of your vehicle, and your autonomous vehicle swerves to avoid them. But in the process, it strikes another pedestrian in its new path. Was the right decision made?
This is an impossible question that automakers and engineers must face when designing the executive decision-making functions of the technology.
Driverless technology is reportedly taught to distinguish between vulnerable and non-vulnerable road users. Unprotected humans would naturally be considered most vulnerable, as opposed to, say, a parked car.
But what if there were people in that parked car? It is a complex problem indeed, and one that needs extensive investigation.
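To make the idea of vulnerability ranking concrete, here is a toy sketch in Python. It is purely illustrative, not any automaker's actual logic; the categories and scores are assumptions invented for this example.

```python
# Toy illustration of vulnerability-based prioritization.
# The categories and numeric scores below are invented assumptions,
# not a real autonomous-vehicle classification scheme.

VULNERABILITY = {
    "pedestrian": 3,    # unprotected human: most vulnerable
    "cyclist": 2,
    "occupied_car": 1,  # humans inside, but shielded by the vehicle body
    "parked_car": 0,    # assumed empty: least vulnerable
}

def rank_by_vulnerability(detected_objects):
    """Order detected objects from most to least vulnerable."""
    return sorted(detected_objects, key=lambda o: VULNERABILITY[o], reverse=True)

print(rank_by_vulnerability(["parked_car", "pedestrian", "occupied_car"]))
# -> ['pedestrian', 'occupied_car', 'parked_car']
```

Note that the parked car only ranks lowest because it is *assumed* empty; a parked car with people inside would need to be re-classified as occupied, and reliably detecting that is exactly the kind of hard perception problem the article describes.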
Vehicles are vulnerable to hacking
In some cases, it could take only an internet connection and some know-how for ill-intentioned individuals to gain access to your vehicle's controls, which depend heavily on software to operate.
Full access would mean that the hacker could control the speed and direction of the vehicle, with potentially devastating consequences.
Although automakers do their best to secure their software, it can still be vulnerable to attacks, and hackers often adapt their methods to defeat upgraded security measures.
The fact that self-driving cars are usually connected to multiple devices (other autonomous vehicles, networks, and infrastructure) also inherently exposes them to hacks.
Private data is at risk
Self-driving cars rely on a huge amount of data that they build and learn from as they navigate their surroundings. While this is supposed to make them super smart, it could be at your expense.
Smartphone integrations, vehicle networks, and internet connectivity mean that the vehicle is privy to the data on these devices, which could be ruinous if sensitive financial, identity-related, or other personal information got into the wrong hands.
How can we mitigate these ethical issues?
Automakers and authorities are working hard to develop solutions to these problems in order to make self-driving cars a safe and effective reality for all.
Forbes calls for greater transparency between the auto industry and the public, which would equip consumers with increased awareness of the benefits and potential drawbacks of driverless technology.
The publication notes that many of these ethical considerations shift responsibility from humans to technology and, in turn, onto the automakers who must be accountable for their programming decisions.
The National Highway Traffic Safety Administration (NHTSA) is beginning to address the "trolley problem" by asking automakers for safety assessment letters that are meant to address ethics.
The agency also provides information on issues surrounding hacking and cybersecurity, stating that it is working in conjunction with the auto industry to develop risk-based processes to quickly identify, measure, and mitigate security threats.
To address privacy concerns regarding the collection of personal data by autonomous vehicles, legislation is in place that upholds lessee and owner rights by protecting the information collected by event data recorders, except in circumstances of legal, maintenance, or safety concerns.
Celantur also notes that European lawmakers suggest using a "data minimization method" of collecting only absolutely necessary information.
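The data minimization idea can be sketched in a few lines of Python: before a telemetry record is stored or transmitted, everything outside an allow-list of strictly necessary fields is dropped. The field names here are illustrative assumptions, not a real vehicle data schema.

```python
# Toy sketch of data minimization: keep only fields strictly needed
# for driving, and discard personally identifiable extras.
# Field names are invented for illustration, not a real vehicle schema.

REQUIRED_FIELDS = {"speed", "heading", "obstacle_distance"}

def minimize(record):
    """Strip a telemetry record down to the allow-listed fields."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

raw = {
    "speed": 48.0,
    "heading": 92,
    "obstacle_distance": 12.5,
    "driver_name": "Jane Doe",  # personal data: discarded
}
print(minimize(raw))
# -> {'speed': 48.0, 'heading': 92, 'obstacle_distance': 12.5}
```

The design choice is an allow-list rather than a block-list: anything not explicitly deemed necessary is dropped by default, which is the conservative reading of "collecting only absolutely necessary information."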
Despite the ethical concerns presented by self-driving cars, it could be argued that the benefits are too great to ignore. After all, human error is estimated to cause 94% of serious crashes. The hope is that with effective technology in charge, that number could drop drastically, but first, there is clearly a lot of work to do.