Should driverless cars be taught ethics?
It takes me half an hour to cover my 9 km commute to the office. In the evening, when I drive back home, the distance seems to have grown, because it takes me a little more than an hour to cover the same stretch. The drive is stressful, partly because I am a poor driver. I have to slam the brakes to avoid a pedestrian who suddenly decides to sprint across the road, or a bus driver who ignores the presence of every other vehicle. I wonder if driverless cars will solve my problem.

With more than one death and four injuries every minute, India reports the highest number of road fatalities in the world, and the prime reason is "driver's fault". The causes range from poor driving skills and faulty brakes to drunk driving and, most of all, the general urge to speed up whenever possible. On most city roads in India, one can barely drive faster than the competing cyclists. So the moment a clear patch appears, the driver feels an uncontrollable urge to speed up for the next 200 metres or until the next traffic snarl, whichever comes first. Will driverless cars make roads safer?
Getting there
High-end cars already have built-in sensors that warn the driver of obstructions while reversing, and cameras and sensors now come standard in many models. BMW's new i3 can park itself, as the company demonstrated to awestruck visitors at the recent Consumer Electronics Show in Las Vegas. Several other companies, such as Audi, Mercedes-Benz and Honda, have already demonstrated their driverless cars.

https://www.youtube.com/watch?v=KPx5icIAklQ

Driverless cars are inevitable. At some point in the not-too-distant future you will see one in India – perhaps some industrialist or film star will own one. Perhaps one of these driverless cars will run over a pedestrian, and the police will be confounded about whom to blame. After all, they have been known not to assign blame when someone rich and powerful mows down homeless people sleeping on the footpath. But will we feel more enraged if a "cute" driverless car causes an accident?
Ethical dilemmas
Isaac Asimov invented the "Three Laws of Robotics" to serve as a hierarchical ethical code for the robots in his stories: first, never harm a human being through action or inaction; second, obey human orders; last, protect oneself. You could argue that we could program driverless cars with a set of instructions that follows these principles. Will that solve the problem?

Imagine that you are in a railway trolley hurtling down a track. In its path are five trapped people who will surely die when the trolley hits them. Fortunately, you have a switch in the trolley that can divert it down a fork onto another track. But on that track, too, a man is trapped; he cannot escape and will die when the trolley hits him. Would you flip the switch, killing one person to save five?

Now imagine this scenario on a crowded road, with a driverless car that you have programmed. What will the car do in that split second? Whose sense of ethics will the manufacturer use to teach these robots to navigate such complexities? The possibilities in the real world are infinite and complex. If the driverless car must choose between hurting an old man on the road and saving a baby, what should it do? If the choice is between saving a Nobel laureate and a famous artist, does that make the decision easier or harder?

Choices in the human world are rarely binary and never free of consequences. Human beings also struggle with these choices – unless they are psychopaths, who feel no guilt or shame. In that sense, psychopaths are partly robotic. Cars have already been programmed to signal automatically and change lanes. Now we will need robot-ethicists before we let them loose on the streets.

Should we wait until we have thought through these dilemmas? Leave your answer in the comments.

------------

How will you score on the psychopath scale? Take the test <click here>

Join me on Twitter @AbhijitBhaduri