Only A Good Killer Robot With A Gun Can Stop A Bad Killer Robot With A Gun


The people who build the robots that will build the future are worried that a runaway robotic arms race will doom humanity instead.

Led by billionaire space and electric vehicle entrepreneur Elon Musk, a group of 116 roboticists has pressed the United Nations to ban weaponized artificial intelligence.

In an open letter offering to advise the UN’s new Group of Governmental Experts on Lethal Autonomous Weapon Systems, the group warns that killer robots “threaten to become the third revolution in warfare.”

Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.

Of course, the true threat of killer robots is not that they will exist — this is probably inevitable — but that they will become as widespread and available as small arms. Killer robots can potentially be built by almost anyone, to say nothing of an entity with the resources of a nation-state. Once designed and tested, a killer robot changes the game, and then everyone must have one, or at least have one available.

If Pyongyang’s dictator wants an autonomous killing machine, nothing in the world can prevent him from having one. The best the world can possibly do, short of using killer robots to strike all of North Korea out of the blue, is slow that future Kim’s killer robot development program for a while.

Now just think of the non-state actors who might like to have an AI killer, or a whole armada of them. The world is filled with bright minds that build robots as well as evil minds with guns; the two often find each other wherever the money is.

Furthermore, because robots are basically just programming and electronics, the means of building them come pre-democratized. Our future will involve 3D printing, e-commerce, and dark-web markets that make killer robot ownership easy. People will order them on their Amazon Echo Dot™ with two-day shipping by drone delivery.

My point here is that killer robots are probably happening. Our likely future is a series of ads from the National Robot Association explaining the virtue of killer robot ownership and self-defense in the name of freedom. Its president will pay politicians to defend this right to killer robots in Congress and legislatures — even after some very angry person inevitably programs their killer robot for a mass shooting at a daycare center.

At that point, the NRA will say that killer robots don’t kill people, that instead, people use killer robots to kill people. They will have a point, since all robots are programmed, and humans decide what missions the killer robots should perform. As long as humans hold the trigger, they bear the blame, so why not killer robots for everyone? If killer robots are outlawed, only outlaws will have killer robots, etc.

It’s all as inevitable as the next solar eclipse.




Copyright 2017 DeepStateNation.com