Man vs Computer
Self-driving cars will be programmed with algorithms that help the car drive safely and stay on the road, but when buying a self-driving car, should you be given the choice to decide what decisions your car makes? It depends on the kind of decision. Decisions like where the car will take you and which roads it should use belong to the driver. Ethical choices, however, are far tougher and should not be made by the driver. The car is built to keep itself and its occupants safe. You should not be able to choose whether to hit a dog or swerve off and hit a pole; those choices should be made by the algorithms, and if you believe that is unfair, then you should not buy a self-driving car.
Everyone drives differently; no two drivers are the same, and it would be very difficult for manufacturers to come up with enough different algorithms to satisfy each person's moral choices. Iyad Rahwan, a computer scientist at the Massachusetts Institute of Technology, says, "People who think about machine ethics make it sound like you can come up with a perfect set of rules for robots, and what we show here with data is that there are no universal rules." It is not possible for companies to build algorithms that match everyone's moral choices. The car will face many competing priorities: will it limit damage to the car, limit damage to the occupants, or limit damage to the things around it, such as animals and pedestrians, even at some cost to the occupants and the vehicle? These choices are just too complex, and the buyer of the car should not be given the option to choose what the car does in these ethical situations.
In the building and manufacturing of self-driving cars, these decisions should be made by the manufacturers and the specialized teams they have in place to make the car as safe as possible. These teams include some of the smartest people in the world, and they know what they are talking about. In an article posted on Towards Data Science, Andy Lau states, "The intent of the inventors is to create a better society for drivers and the planet. In addition, self-driving cars have proven to be significantly safer than having an actual driver; this has been shown by numerous studies and data collected from them. In the long run, autonomous cars will increase efficiency and productivity for people around the world. For more people to feel at ease with self-driving cars, companies, and self-driving car owners should understand they are responsible for the safety of all stakeholders. Risk management techniques can be used to quantify probabilistic risk in a way that is transparent and flexible. To create ethical vehicles, developers should continue to learn from past experiences in risk management and morally challenging situations." Why should we let the buyer of a self-driving car decide what the car should do in ethical situations, when scientists and years of research have gone into building these cars to make the roads safer and prevent crashes? We should trust the choices of the manufacturers: they would not make a car that does not value people, and they would not make a car that is unsafe and will not protect the consumer. They will put the right algorithms together to allow for safer roads and a more efficient world. Car owners should leave the decision-making about what the car should do to the people who make the cars, and if they do not like that, they can drive the car themselves and make the choices on their own.
Waymo is another big competitor in the self-driving car world, and its team has put the first self-driving car on the road. The team at Waymo has designed the car to be fully autonomous and is training it to drive like a human; they are not giving the buyer the choice of what the car should do. Waymo is working every day to make the car able to share the road with human drivers, fixing the small things that will allow it to drive smoothly and freely. Waymo is the leading manufacturer of self-driving cars; an article from The Verge said, "Waymo already has a huge lead over its competitors in the field of autonomous driving. It has driven the most miles — 6 million on public roads, and 5 billion in simulation — and has collected vast stores of valuable data in the process."
Another major issue with letting the owner choose what the car does is legal liability. If the owner tells the car what to do in a situation, does that make the owner responsible rather than the car, since the car is only doing what the human said? If the owner does not tell the car what to do, then only the manufacturers could be at fault for legal issues that happen with the car.
While you would love to know what your car will do in any situation, and you might wish you had a say in it, that right now just does not seem to be an option. It is much safer for the people who have studied most of their lives and put hours of work into these algorithms to be the ones who decide what the car should do. They know what is best for their cars and what is best for their consumers.
Hawkins, Andrew J. “Inside Waymo’s Strategy to Grow the Best Brains for Self-Driving Cars.” The Verge, The Verge, 9 May 2018, http://www.theverge.com/2018/5/9/17307156/google-waymo-driverless-cars-deep-learning-neural-net-interview.
Maxmen, Amy. “Self-Driving Car Dilemmas Reveal That Moral Choices Are Not Universal.” Nature News, Nature Publishing Group, 24 Oct. 2018, http://www.nature.com/articles/d41586-018-07135-0.
Lau, Andy. "The Ethics of Self-Driving Cars." Towards Data Science, Medium, 13 Aug. 2020, towardsdatascience.com/the-ethics-of-self-driving-cars-efaaaaf9e320.