
Life, death and the limits of crowdsourcing morality

Today I have been both murderous and merciful. I have deliberately mown down pensioners and a pack of dogs. I have ploughed into the homeless, slain a couple of athletes and run over the obese. But I have always tried to save the children.

As I finish my session on the Moral Machine — a public experiment being run by the Massachusetts Institute of Technology — I learn that my moral outlook is not universally shared. Some argue that aggregating public opinions on ethical dilemmas is an effective way to endow intelligent machines, such as driverless cars, with limited moral reasoning capacity. Yet after my experience, I am not convinced that crowdsourcing is the best way to develop what is essentially the ethics of killing people. The question is not purely academic: Tesla is being sued in China over the death of a driver of a car equipped with its “semi-autonomous” autopilot. Tesla denies the technology was at fault.

Anyone with a computer and a coffee break can contribute to MIT’s mass experiment, which imagines the brakes failing on a fully autonomous vehicle. The vehicle is packed with passengers, and heading towards pedestrians. The experiment depicts 13 variations of the “trolley problem” — a classic dilemma in ethics that involves deciding who will die under the wheels of a runaway tram.
