Self-driving cars are becoming more of a reality each day as companies like Tesla Motors gather extensive data to program full autonomy. Unfortunately, this new technology raises many issues.
Ethics and morals are concerns for engineers, drivers, and pedestrians alike. The Observer provided an interesting article here. How does an autopilot decide who lives or dies in the scenario presented to the artificial intelligence? That is one of, if not the most, important questions surrounding the new tech. Put simply, the vehicle is probably going to choose in favor of whoever has the higher chance of surviving an accident.
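To make that "higher chance to survive" idea concrete, here is a minimal sketch of a survival-probability heuristic. This is purely hypothetical: the class names, probabilities, and scoring rule are invented for illustration and do not come from Tesla's Autopilot or any real system.

```python
# Hypothetical illustration only: a toy scoring heuristic, not how any
# real autopilot works. All names and numbers below are invented.
from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    survival_probabilities: list[float]  # one entry per person affected

def expected_survivors(outcome: Outcome) -> float:
    # The sum of survival probabilities is the expected number of survivors.
    return sum(outcome.survival_probabilities)

def choose_outcome(outcomes: list[Outcome]) -> Outcome:
    # Pick the maneuver whose expected survivor count is highest.
    return max(outcomes, key=expected_survivors)

# Two occupants hit a barrier vs. occupants braking hard near a pedestrian.
swerve = Outcome("swerve into barrier", [0.6, 0.6])
brake = Outcome("brake in lane", [0.9, 0.9, 0.3])
print(choose_outcome([swerve, brake]).description)
```

Even this toy version shows why the ethics question is hard: a pure "maximize expected survivors" rule happily trades one person's safety against another's, which is exactly the kind of choice people disagree about.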
Researchers at MIT stated that they wanted “[to] cover all the complexities of traffic laws, where pedestrians are either jaywalking or following the walk signal. That affects whether you kill more or less people.” This matters because, in different parts of the country, or the world for that matter, people violate all kinds of traffic laws on a regular basis. Where else do you think we got the "California roll" through a stop sign?
Additionally, who is going to be liable for an accident: the manufacturer, regulators, or the driver? This is a multifaceted issue with many levels of responsibility, and we are going to have to settle it before the masses begin to use autonomous driving. Fortunately, the more quickly everyone adopts the technology, the more accidents the cars can simply avoid.