Monday, December 21, 2015

In An Accident, Who Will A Driverless Car Be Programmed To Kill?


"When ethics are automated, life-and-death decisions may be in Google's hands." A somewhat scary excerpt:
When an accident happens or is about to happen, your car needs to do something, and what it does will be determined by how it is programmed by its makers. Lin points out that there is an ethical difference between a human driver’s reaction and a driverless car’s decision: the latter is a premeditated programming choice to value one life over another, even if the precise owner of that life is not known at the time. Even if the car reacts in exactly the same way a human would in the same situation, Lin says the decision could be viewed as "premeditated homicide." 
If your car is programmed to "minimize harm" by choosing to swerve into a helmet-wearing cyclist instead of the helmetless cyclist on the other side, then aren't responsible people being penalized? In that world, cyclists might skip wearing helmets to avoid becoming victims of robo-cars.
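To make the perverse incentive concrete, here is a minimal, purely hypothetical sketch of what a naive "minimize harm" rule could look like. The injury scores and the choose_swerve_target function are invented for illustration; nothing here reflects how Google or any automaker actually programs its cars.

    # Hypothetical sketch of a naive "minimize expected harm" swerve rule.
    # All names and numbers are invented; no real autonomous-vehicle
    # system is known to work this way.

    def injury_score(cyclist):
        """Estimated injury severity if struck (lower is 'better')."""
        # A helmet lowers expected injury severity -- which, under this
        # rule, perversely makes the helmeted rider the preferred target.
        return 0.4 if cyclist["helmet"] else 0.9

    def choose_swerve_target(cyclists):
        """Pick the cyclist whose expected injury is lowest."""
        return min(cyclists, key=injury_score)

    riders = [
        {"name": "helmeted rider", "helmet": True},
        {"name": "helmetless rider", "helmet": False},
    ]
    print(choose_swerve_target(riders)["name"])  # -> "helmeted rider"

Under this rule the safety-conscious rider is systematically chosen, which is exactly the incentive problem the excerpt describes.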
Read more in FastCo.

Why Women Aren’t C.E.O.s, According to Women Who Almost Were

"It’s not a pipeline problem. It’s about loneliness, competition and deeply rooted barriers." Read more in the NYT .