The United States vice-chairman of the Joint Chiefs of Staff, General Paul Selva, has called it "The Terminator conundrum".
That is, should we empower machines to kill? It's a question the Pentagon is already grappling with as drone technology improves. And with Canberra's defence chiefs keen to acquire armed drones, a plan confirmed by the recent Defence White Paper, Australia is also dipping its toe into this tricky ethical terrain.
Anticipating Australia's purchase of armed drones by the start of the next decade, US firm General Atomics, which makes the famed Predator drone, opened an office in Canberra in November.
The market dominance of the Predator - and its latest version known as the Reaper - is overwhelming, with 60 to 70 of these remotely piloted aircraft in the sky at any moment. They have been the Obama administration's weapon of choice in the war against Islamic extremists in the Middle East, and have been bought by Britain, France and Spain. The world's militaries agree that unmanned combat systems will play a bigger role in the future of warfare. As the technology evolves, so do the ethical questions.
Asked how drone technology will develop over the next two decades, Ken Loving, General Atomics' strategic development manager, says the inevitable answer is that drones will "get smarter".
He stresses this doesn't mean artificially intelligent machines will be sent out to act independently. Rather, it means certain decisions will be made by the machines themselves, though still governed by strict algorithms. This makes sense, he says, given that the drone's satellite link to its operator is its greatest vulnerability.
For instance, NASA has written algorithms for General Atomics drones to avoid collisions if they lose contact with their pilots on the ground. The drone would be able to take over and fly itself clear of another aircraft - a small step towards autonomy, but one that nevertheless takes detailed work and is still a couple of years away.
Drones will also increasingly filter data so they don't bombard the operators on the ground with irrelevant information.
Mr Loving says he doesn't see drones being allowed to make the decision to kill any time soon. But he says some greater autonomy could be driven by the reality of new weapons.
Futuristic weapons such as hypersonic missiles and directed energy weapons travel so fast that human pilots' brains - in the plane or on the ground - won't be able to react in time. Only a machine could think fast enough to detect the threat and take countermeasures.