An international debate is needed on the use of autonomous military robots, a leading academic has said.

Noel Sharkey of the University of Sheffield said that a push toward more robotic technology used in warfare would put civilian life at grave risk.

Technology capable of reliably distinguishing friend from foe was at least 50 years away, he added.

However, he said that for the first time, US forces had mentioned resolving such ethical concerns in their plans.

"Robots that can decide where to kill, who to kill and when to kill is high on all the military agendas," Professor Sharkey said at a meeting in London.

"The problem is that this is all based on artificial intelligence, and the military have a strange view of artificial intelligence based on science fiction."

'Odd way'

Professor Sharkey, a professor of artificial intelligence and robotics, has long drawn attention to the psychological distance from the horrors of war that is maintained by operators who pilot unmanned aerial vehicles (UAVs), often from thousands of miles away.

"These guys who are driving them sit there all day... they go home and eat dinner with their families at night," he said.

"It's kind of a very odd way of fighting a war - it's changing the character of war dramatically."

The rise in technology has not helped in terms of limiting collateral damage, Professor Sharkey said, because the military intelligence behind attacks was not keeping pace.

Between January 2006 and April 2009, he estimated, 60 such "drone" attacks were carried out in Pakistan. While 14 al-Qaeda members were killed, some 687 civilian deaths also occurred, he said.

That physical distance from the actual theatre of war, he said, led naturally to a far greater concern: the push toward unmanned planes and ground robots that make their decisions without the help of human operators at all.

The problem, he said, was that robots could not fulfil two of the basic tenets of warfare: discriminating friend from foe, and "proportionality" - determining a reasonable amount of force to gain a given military advantage.

"Robots do not have the necessary discriminatory ability," he explained. "They're not bright enough to be called stupid - they can't discriminate between civilians and non-civilians; it's hard enough for soldiers to do that.

"And forget about proportionality; there's no software that can make a robot proportional," he added. "There's no objective calculus of proportionality - it's just a decision that people make."

Policy in practice

Current rules of engagement to which the UK subscribes prohibit the use of lethal force without human intervention.

Nigel Mills is aerial technology director at defence contractor QinetiQ, which makes a number of UAVs and ground robots for the armed forces.

He told BBC News that building autonomy into the systems required assurances of the importance of human input.

"The more autonomous a system is, the more effort you have to put into the human/machine interface because of the rules of engagement.

"Complete autonomy - where you send a UAV off on a mission and you don't interact with it - is not compatible with our current rules of engagement, so we're not working on such systems."

The US Air Force published its "Unmanned Aircraft Systems Flight Plan 2009-2047" in July, predicting the deployment of fully autonomous attack planes.
The document suggests that humans will play more of a role "monitoring the execution of decisions" than actually making the decisions.

"Advances in AI will enable systems to make combat decisions and act within legal and policy constraints without necessarily requiring human input," says the report.

However, it concedes that "authorising a machine to make lethal combat decisions is contingent upon political and military leaders resolving legal and ethical questions".

"Ethical discussions and policy decisions must take place in the near term in order to guide the development of future UAS capabilities, rather than allowing the development to take its own path apart from this critical guidance," it continues.

While the US's plans are vague, Professor Sharkey said the mere mention of ethical issues was significant.

"I'm glad they've picked up on that, because if you look at any previous plan, they hadn't done so," he told BBC News.

However, he warned that work toward ever more autonomous killing machines was carrying on, noting the deployment of Israel's Harpy - a fully autonomous UAV that dive-bombs radar systems with no human intervention.

He cautioned that an international debate was necessary before further developments in decision-making robots could unfold.