The New Moral and Legal Issues with Self-Driving Cars

In 2017, self-driving cars are one of the hottest technology innovations that both consumers and businesses are talking about. Big names including Apple, Tesla, and Uber are all working tirelessly to bring driverless cars to market. The benefits are easy to see: driverless cars offer a luxurious experience, reduce the stress of driving, eliminate drunk driving, can earn money as a taxi when the owner isn’t using them, and much more. However, driverless cars also raise unique moral and legal issues that have never been faced before, and so far it is unclear how these big companies will meet those challenges.

First, there is the issue of programming morality into the decision-making software of driverless cars. For instance, if a driverless car must choose between crashing into another car or swerving out of the way and potentially hitting pedestrians, which decision is the right one? Does the car have a responsibility to protect its passenger? After all, the passenger is the company’s customer. Or does the car have a responsibility to protect the pedestrians? There are many such difficult scenarios that the companies building these cars need to consider with great care.

Second, is it okay for these companies to decide such important outcomes without input from the government or the public? How are these decisions being made? So far, there has been minimal transparency about which experts are advising companies in this area and how the decision-making process works. Because private industry has never dealt with this type of issue before, there is no regulatory framework to follow. Future driverless car customers, along with bystanders who have no interest in driverless cars at all, will have to wait and see how they could be affected.

Third, beyond the tough moral decisions related to self-driving cars, there lies an even bigger elephant in the room: legal outcomes. If and when a driverless car causes an accident, there is no precedent for who should be liable. Should the consumer who bought the car be liable, or the company that maintains the software that actually drives it? Will a consumer go to jail for an accident he or she had no control over? Considering that the consumer doesn’t actually drive, it seems logical that companies like Tesla and Apple will need to assume at least partial liability for accidents. This could add huge costs to manufacturing driverless cars. And even if a company is found liable for a death, who would serve prison time? Nobody. Is that fair to the victim’s family? For many people, financial compensation will not be enough to feel that justice has been served. The first cases involving driverless cars will set precedents for years to come, and they will also create a new niche for personal injury attorneys to specialize in.

Overall, self-driving cars will impact our society in ways that the technologies of the past twenty years have not. Only time will tell whether morality can be successfully programmed into decision-making software, or who will be prosecuted for the wrongful death of an innocent person. Cars and driving are such a large part of American law and society that it could take decades to transition to an environment in which driverless cars are widely accepted. Until the technology hits the streets and there are real cases to base regulations on, the only thing to do is wait and see what happens.