Personal injury law hinges on the concept of negligence: a person can be held legally and financially liable for careless or reckless behavior that results in an accident. Auto accident law rests on the personal responsibility that accompanies the right to drive on public roads. When we drive, we have a duty to protect ourselves, our passengers, and other drivers on the road.
A Forbes reporter recently rode in one of Google’s driverless cars, observing the car’s reaction to bicyclists on the road, a crossing senior citizen, and a woman with a dog. These scenarios are frequently encountered by everyday drivers, but when it comes to a driverless car, programmers determine how the car will react.
Many tout driverless cars as the wave of the future. Some claim that these safer vehicles will make personal injury lawyers and insurance companies obsolete. However, the programmers who determine how these cars navigate complex road hazards will have to confront the same questions drivers face when making split-second decisions.
For instance, if you’re driving down the road on a rainy day and a deer runs into the highway, your choices will often depend on many factors, including whether there are cars in other lanes and whether you have a toddler in the car. But what happens if a person jumps in front of your car while a truck occupies the oncoming lane? A driver in this scenario faces a difficult split-second decision: swerve into the oncoming lane and risk a head-on collision, or take no action, avoid the head-on collision, and almost certainly strike the pedestrian.
A case like this would be challenging enough to litigate in civil court. But what should a programmer tell a driverless car to do? And if the programmer tells the car to take no action, does the pedestrian have the right to sue the passenger for the purchase decision, or the programmer for the programming decision?
The Atlantic recently published an article about the ethics of driverless car programming. While we are more likely to forgive a driver for making the split-second decision to avoid a head-on collision in favor of hitting a pedestrian—as terrible as this scenario is—we may be less likely to forgive a programmer who, from the comfort of his cubicle, programs a car to kill.
The situation resembles a famous ethics puzzle known as the trolley problem. A runaway trolley is about to hit five people. You can throw a switch that diverts the trolley onto another track, where it will strike only one person. Do you throw the switch, or do you do nothing?
Driverless cars raise ethical questions that will have to be answered before the cars take to the road. These questions are also pertinent to personal injury law in Minneapolis, Minnesota: when a driver takes deadly or injurious action to prevent other injuries or greater damage, who, really, is at fault?
The Law Office of Martin T. Montilino helps clients answer these questions every day. If you’ve been injured in an auto accident, our firm may be able to assist you in receiving money for your injuries, lost wages, and pain and suffering.