Who Gets The Say
From a young age, many of us watched movies with self-driving or flying cars and wondered when they would arrive. Cars have evolved over the last hundred years or so: they started with wooden wheels, then came the stick shift, and years later the automatic transmission made the manual a thing of the past. Around 2015, Elon Musk, along with many other companies, began racing to be the first to offer self-driving cars. Everyone was excited about the news, but few people considered the many unanswered questions and unsolved problems with self-driving cars. When will every car on the road be self-driving? Will it be five years from now, or ten? Will you let a computer drive you and your family around? There are so many questions to answer, and it seems like more news about self-driving cars comes out every day.
Self-driving cars seem to be the next big thing for our generation. As of 2019, all of Tesla's cars come standard with Autopilot, which includes Traffic-Aware Cruise Control and Autosteer. Tesla also offers a "Full Self-Driving" package, but that costs extra and does not come standard. The real question about self-driving cars is how they can make the right decision, and if they can't, how often they get it right.
To be autonomous means to have the freedom to govern oneself or control one's own affairs. Many people dream of having a self-driving car, but few have ever ridden in one. It is a totally different feeling to be in a car driven by nothing but algorithms and cameras that determine whether you reach your destination safely. If all cars were self-driving, the roads would be somewhat safer, because the same algorithms can be programmed from car to car; but as long as self-driving cars share the road with human drivers, there will always be problems. A study done in the UK asked citizens how they would feel about self-driving cars on the road. The article states, "The UK government has stated its vision to have driverless cars on UK roads by 2021, but nearly a quarter of the UK public said they felt apprehensive about the prospect of self-driving cars on the roads in the next three years, and a fifth of respondents felt fearful." If an intoxicated driver suddenly swerved into the side of a self-driving car, would the car be able to see it coming and avoid it? We don't know. There are many ethical and moral decisions facing self-driving cars, and what they will actually do in a life-or-death situation remains an open question.
Do self-driving cars have morals? Will a self-driving car swerve off the road into some bushes to avoid a dog, or will it hit the dog and stay on the road? According to a PBS article on how these cars will make life-or-death decisions, "But some coders say that while these hypothetical situations are interesting, they are misleading because autonomous cars do not make judgments based on value, they make them based on protocol. While moral decisions will come into play when programmers decide how to use which algorithms, an assistant professor in computer science at Carnegie Mellon University said the car itself does not have a moral agency." In other words, the car will already know what to do when that situation arises. It will know the outcome before you do, because a group of programmers wrote algorithms that guide the car toward the best decision for the car and its passenger.
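To make "decision by protocol, not value judgment" concrete, here is a minimal sketch of what a fixed rule set could look like. Every name, weight, and rule here is an illustrative assumption, not any manufacturer's real logic: the point is only that the car ranks its options with pre-programmed priorities rather than moral reasoning in the moment.

```python
# Hypothetical protocol-based maneuver selection.
# The priorities (people > animals > staying on the road) are
# assumptions for illustration, not a real system's rules.

def choose_maneuver(options):
    """Pick the maneuver with the lowest (best) protocol score.

    Each option is a dict like:
    {"name": "swerve", "hits_person": False,
     "hits_animal": False, "leaves_road": True}
    """
    def score(option):
        # Fixed priorities encoded as weights: harming a person is
        # weighted far above harming an animal, which is weighted
        # above merely leaving the road.
        return (option["hits_person"] * 100
                + option["hits_animal"] * 10
                + option["leaves_road"] * 1)
    return min(options, key=score)

options = [
    {"name": "stay_course", "hits_person": False, "hits_animal": True,  "leaves_road": False},
    {"name": "swerve",      "hits_person": False, "hits_animal": False, "leaves_road": True},
]
print(choose_maneuver(options)["name"])  # swerve: leaving the road scores better than hitting the dog
```

The outcome is decided entirely by the weights the programmers chose ahead of time, which is exactly the sense in which the car itself has no moral agency.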
In addition to these questions, there are problems that will affect people once self-driving cars hit the roads in numbers. These cars will be fully electric, which means far less use for gas stations and gas station workers. Self-driving cars are also expected to last a long time, so many people won't need to buy a new car every three to seven years the way most people do now. A lot of small problems are rising alongside the self-driving car. Hopefully none of these come to pass, and the self-driving car actually boosts job opportunities.
There are also major benefits that could help save the earth from pollution. Self-driving cars run on batteries, so there would be no more exhaust from the millions of cars driving every day. Cars could be charged by plugging them in at home, and that home could itself be powered by solar panels, helping the planet a bit more.
It seems inevitable that the world will eventually move to all self-driving cars, and the days of humans behind the wheel will become a thing of the past. The only thing stopping this from happening sooner is the difficulty of perfecting the technology.
In a rapidly changing world, many new things are coming in the near future. The word "autonomous" doesn't ring a bell for most people; they know it by its other name, "self-driving." Self-driving cars have been in the works for many years, and many people are excited and can't wait for them to arrive. But should you be excited, or worried?
Autonomous vehicles are the thing of the future, and they seem like they will be amazing, with many benefits for our world. The trouble is that not many people think about the smaller questions that automotive manufacturers don't seem to have answers to yet, because not enough research has been done. Many moral questions come with getting into a self-driving car. Is the car going to protect you because you paid for it and it's supposed to do everything in its power to keep you safe? What if you're driving down the road and a kid jumps out into the street: will the car swerve away from the child into oncoming traffic, saving the child's life while risking yours and maybe another driver's, or will it hit the kid and protect you and the car in the other lane? Decisions like this loom large when thinking about buying a self-driving car. In an emergency, will you be able to grab the wheel in a split second? Probably not.
Many studies have tried to figure out what people would want the car to do when faced with difficult situations that require hard choices. When you buy a self-driving car, you are buying it knowing it will make those decisions for you; should you be allowed to say what the car does in those situations? An article from the Washington Post describes one such study: "The study, published in Nature, identified a few preferences that were strongest: People opt to save people over pets, to spare the many over the few and to save children and pregnant women over older people. But it also found other preferences for sparing women over men, athletes over obese people and higher status people, such as executives, instead of homeless people or criminals. There were also cultural differences in the degree, for example, that people would prefer to save younger people over the elderly in a cluster of mostly Asian countries." These studies showed that people tend to save a younger person's life over an older person's, reasoning that the younger person has more life left to live. And when only an animal is at risk, many people would rather swerve off the road, even if it means hitting a pole. What would you do if you were driving? You would probably try to spare the animal, slamming on the brakes or even swerving off the road.
Volvo, a competitor in the self-driving car market, was described in an article published in Scientific American: "Self-driving pioneers, in fact, are starting to make the switch. Last October, Volvo declared that it would pay for any injuries or property damage caused by its fully autonomous IntelliSafe Autopilot system, which is scheduled to debut in the company's cars by 2020. The thinking behind the decision, explains Erik Coelingh, Volvo's senior technical leader for safety and driver-support technologies, is that Autopilot will include so many redundant and backup systems—duplicate cameras, radars, batteries, brakes, computers, steering actuators—that a human driver will never need to intervene and thus cannot be at fault. 'Whatever system fails, the car should still have the ability to bring itself to a safe stop.'" If no human can be at fault for a crash, this policy should be adopted by every self-driving car manufacturer. Crashes are all but inevitable, and no matter what safety protocols are in place, there will always be some type of accident.
In a perfect world, every road would be safe and crash-free, but we do not live in a perfect world, and building these self-driving cars is a step closer to safer roads. A Consumer Reports article put it this way: "In the far distant future, there's little debate that self-driving cars have the potential to drastically reduce, or possibly even eliminate, crashes. In the interim, as self-driving cars navigate traffic alongside unpredictable human drivers, things will be murky." This is the big factor: if everyone in the world rode in a self-driving car, crashes would be nearly impossible, but with self-driving cars sharing the road with human drivers, there will still be accidents. Most crashes are the result of human error, and until all cars are self-driving, the roads will not be as safe as they could be. The algorithms in the car predict what the cars around it should be doing, but if a human driver swerves at an unpredictable moment, it can cause a crash. If the road carried only self-driving cars, they would work on the same wavelength: running the same algorithms, each car could predict what every other car was going to do.
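The prediction problem described above can be sketched with a toy model. Real autonomous cars use far more sophisticated predictors, so treat this constant-velocity extrapolation, and the function name, purely as an illustrative assumption: it shows why a steady driver is easy to anticipate and a sudden swerve is not.

```python
# Toy constant-velocity prediction: extrapolate another vehicle's
# last observed motion forward in time. Illustrative only — not any
# real autonomous vehicle's prediction model.

def predict_position(positions, steps_ahead):
    """Guess a future (x, y) position from recent samples.

    positions: list of (x, y) samples, one per time step.
    """
    (x1, y1), (x2, y2) = positions[-2], positions[-1]
    vx, vy = x2 - x1, y2 - y1  # velocity estimated from last two samples
    return (x2 + vx * steps_ahead, y2 + vy * steps_ahead)

# A car moving steadily right at 1 unit per step is easy to predict.
steady = [(0, 0), (1, 0), (2, 0)]
print(predict_position(steady, 3))  # (5, 0)

# A car that swerves on the final sample breaks the assumption:
# the model projects the swerve forward and the estimate degrades.
swerving = [(0, 0), (1, 0), (2, 2)]
print(predict_position(swerving, 3))  # (5, 8)
```

This is the gap the essay describes: the model works when other drivers behave as expected, and fails exactly when a human does something unpredictable.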
When it comes down to buying a self-driving car, you should know what you are buying into and how your car will protect you. You should not buy something without knowing how it will keep you safe. For the safest roads, you need to know that your car will protect both you and everyone riding with you.
Self-driving cars will come loaded with algorithms that keep the car driving safely and on the road, but when buying one, should you be given the choice to decide what decisions your car makes? It depends on the kind of decision. Choices like where the car takes you and which roads it uses should belong to the driver. Ethical choices are another matter: those are very tough and should not be made by the driver. The car is built to keep itself and its occupants safe. You should not be able to choose whether to hit a dog or swerve off and hit a pole; those choices should be made by the algorithms, and if you believe that is not fair, then you should not buy a self-driving car.
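The split argued for above — owners control navigation, manufacturers control safety logic — could be pictured as a settings interface that simply refuses to expose ethical behavior. All of the option names below are hypothetical, invented for this sketch.

```python
# Hypothetical owner settings: routing preferences are configurable,
# safety/ethics logic is locked to the manufacturer's algorithms.
# All option names here are illustrative assumptions.

OWNER_CONFIGURABLE = {"destination", "route_preference", "avoid_tolls"}
MANUFACTURER_LOCKED = {"collision_response", "emergency_braking", "swerve_policy"}

def set_option(settings, key, value):
    """Apply an owner setting, refusing to touch locked safety logic."""
    if key in MANUFACTURER_LOCKED:
        raise PermissionError(f"{key} is decided by the manufacturer's algorithms")
    if key not in OWNER_CONFIGURABLE:
        raise KeyError(f"unknown option: {key}")
    settings[key] = value
    return settings

settings = {}
set_option(settings, "avoid_tolls", True)  # fine: just a routing preference
try:
    set_option(settings, "swerve_policy", "protect_occupants_first")
except PermissionError as e:
    print(e)  # ethical choices are not owner-configurable
```

The design choice this illustrates is the essay's position in miniature: the interface itself draws the line, so the owner never gets to encode their own moral preferences.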
Everyone drives differently; no two drivers are the same, and it would be very difficult for manufacturers to come up with enough different algorithms to satisfy each person's moral choices. Iyad Rahwan, a computer scientist at the Massachusetts Institute of Technology, says, "People who think about machine ethics make it sound like you can come up with a perfect set of rules for robots, and what we show here with data is that there are no universal rules." It is simply not possible for companies to build algorithms that match everyone's morals. There are so many choices the car might have to make: limit damage to the car, limit harm to the occupants, or limit harm to everything around the car, like animals and pedestrians, at the cost of the occupants and the car. These choices are too complex, and the buyer should not be given the option to decide what the car does in these ethical situations.
When it comes to building and manufacturing self-driving cars, the decisions should be made by the manufacturers and the specialized teams they have in place to make the car as safe as possible. These teams at the big manufacturers include some of the smartest people in the world, and they know what they are talking about. In an article posted on Towards Data Science, Andy Lau states, "The intent of the inventors is to create a better society for drivers and the planet. In addition, self-driving cars have proven to be significantly safer than having an actual driver; this has been shown by numerous studies and data collected from them. In the long run, autonomous cars will increase efficiency and productivity for people around the world. For more people to feel at ease with self-driving cars, companies, and self-driving car owners should understand they are responsible for the safety of all stakeholders. Risk management techniques can be used to quantify probabilistic risk in a way that is transparent and flexible. To create ethical vehicles, developers should continue to learn from past experiences in risk management and morally challenging situations." Why should we let the buyer decide what the car does in ethical situations, when scientists and years of research go into building these cars to make the roads safe and prevent crashes? We should trust the manufacturers' choices: they would not make a car that does not value people, and they would not make a car that is unsafe or fails to protect the consumer. They will put the right algorithms together for a safer road and a more efficient world. Owners should leave the decision-making to the people who make the cars, and if they don't like that, they can drive themselves and make the choices on their own.
Waymo is another big competitor in the self-driving car world, and its teams put one of the first self-driving cars on the road. The team at Waymo has designed the car to be fully autonomous and is training it to drive like a human; they are not giving the buyer a choice about what the car should do. Waymo works every day to make the car able to share the road with human drivers, fixing the small things that let it drive smoothly and freely. Waymo is the leading manufacturer of self-driving cars. An article from The Verge reported, "Waymo already has a huge lead over its competitors in the field of autonomous driving. It has driven the most miles — 6 million on public roads, and 5 billion in simulation — and has collected vast stores of valuable data in the process."
Another major issue with letting the owner choose what the car does is legal liability. If the owner tells the car what to do in a situation, that makes the owner responsible, not the car, because the car is doing what the human said. If the owner does not tell the car what to do, then only the manufacturer can be at fault for whatever legal issues arise.
You would love to know what your car will do in any situation, and you might wish you had a say in it, but right now that just isn't on the table. It is much safer to let the people who have spent most of their lives studying, and who have put hours of work into these algorithms, be the ones who say what the car should do. They know what is best for their cars and what is best for their consumers.
Kelkar, K. "How Will Driverless Cars Make Life-or-Death Decisions?" PBS NewsHour, 28 May 2016, https://www.pbs.org/newshour/nation/how-will-driverless-cars-make-life-or-death-decisions.
Johnson, C. "Self-Driving Cars Will Have to Decide Who Should Live and Who Should Die. Here's Who Humans Would Kill." The Washington Post, 24 Oct. 2018, https://www.washingtonpost.com/science/2018/10/24/self-driving-cars-will-have-decide-who-should-live-who-should-die-heres-who-humans-would-kill/.
Monticello, M. "Will Self-Driving Cars Make Our Roads Safer?" Consumer Reports, https://www.consumerreports.org/self-driving-cars/will-self-driving-cars-make-our-roads-safer/.
Hawkins, Andrew J. "Inside Waymo's Strategy to Grow the Best Brains for Self-Driving Cars." The Verge, 9 May 2018, www.theverge.com/2018/5/9/17307156/google-waymo-driverless-cars-deep-learning-neural-net-interview.
Maxmen, Amy. "Self-Driving Car Dilemmas Reveal That Moral Choices Are Not Universal." Nature News, 24 Oct. 2018, http://www.nature.com/articles/d41586-018-07135-0.
Lau, Andy. "The Ethics of Self-Driving Cars." Towards Data Science, 13 Aug. 2020, towardsdatascience.com/the-ethics-of-self-driving-cars-efaaaaf9e320.
Whitwam, Ryan. "How Google's Self-Driving Cars Detect and Avoid Obstacles." ExtremeTech, 8 Sept. 2014, http://www.extremetech.com/extreme/189486-how-googles-self-driving-cars-detect-and-avoid-obstacles.
Burke, Katie. "How Do Self-Driving Cars Make Decisions?" The Official NVIDIA Blog, 7 May 2019, blogs.nvidia.com/blog/2019/05/07/self-driving-cars-make-decisions/.
DeBord, Matthew. "Elon Musk Promises an Autopilot 'Quantum Leap' in the Next Few Weeks. Here's How Tesla's One-of-a-Kind Bet on Self-Driving Tech Works." Business Insider, 18 Aug. 2020, http://www.businessinsider.com/tesla-self-driving-technology-compared-to-everyone-see-how-it-works-2020-7.
Schmelzer, Ron. "What Happens When Self-Driving Cars Kill People?" Forbes, 26 Sept. 2019, http://www.forbes.com/sites/cognitiveworld/2019/09/26/what-happens-with-self-driving-cars-kill-people/.