Such systems are currently being researched by a number of militaries.
Broadly defined, military robots date back to World War II and the Cold War in the form of the German Goliath tracked mines and the Soviet teletanks. With the MQ-1 Predator drone, "CIA officers began to see the first practical returns on their decade-old fantasy of using aerial robots to collect intelligence".
The use of robots in warfare, although traditionally a topic for science fiction, is being researched as a possible future means of fighting wars. Several military robots have already been developed by various armies.
Some believe the future of modern warfare will be fought by automated weapons systems. The U.S. military is investing heavily in research and development toward testing and deploying increasingly automated systems. The most prominent systems currently in use are unmanned aerial vehicles (IAI Pioneer and RQ-1 Predator), which can be armed with air-to-ground missiles and remotely operated from a command center in reconnaissance roles. DARPA hosted competitions in 2004 and 2005 that challenged private companies and universities to develop unmanned ground vehicles capable of navigating rough terrain in the Mojave Desert, for a final prize of $2 million.
Artillery has seen promising research with an experimental weapons system named "Dragon Fire II", which automates the loading and ballistics calculations required for accurate predicted fire, providing a 12-second response time to fire support requests. However, military weapons are prevented from being fully autonomous: they require human input at certain intervention points to ensure that targets are not within restricted fire areas as defined by the Geneva Conventions and the laws of war.
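The kind of ballistics calculation such a system automates can be illustrated with the textbook vacuum-trajectory solution. This is a deliberate simplification (real fire-control software also models drag, wind, meteorology, and projectile spin), and the function name and parameters here are illustrative, not taken from any fielded system:

```python
import math

def firing_solutions(muzzle_velocity, target_range, g=9.81):
    """Return the (low, high) elevation angles in radians that hit a
    target at `target_range` meters on flat ground, ignoring drag,
    or None if the target is beyond maximum range (v^2 / g)."""
    x = g * target_range / muzzle_velocity ** 2
    if x > 1.0:
        return None  # out of reach even at the optimal 45-degree angle
    low = 0.5 * math.asin(x)
    # The same range is reachable on a high, mortar-style arc.
    return low, math.pi / 2 - low
```

Both returned angles reproduce the requested range via R = v² sin(2θ)/g; indirect-fire weapons such as mortars typically use the high-arc solution.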
There have been some developments toward autonomous fighter jets and bombers. The use of autonomous fighters and bombers to destroy enemy targets is especially promising: robotic pilots require no training, autonomous planes can perform maneuvers that human pilots could not withstand (due to high g-forces), the airframes do not require a life-support system, and the loss of a plane does not mean the loss of a pilot. However, the largest drawback to robotics is its inability to accommodate non-standard conditions. Advances in artificial intelligence may help to rectify this in the near future.
In current use
- US Mechatronics has produced a working automated sentry gun and is currently developing it further for commercial and military use.
- MIDARS, a four-wheeled robot outfitted with several cameras, radar, and possibly a firearm, that automatically performs random or preprogrammed patrols around a military base or other government installation. It alerts a human overseer when it detects movement in unauthorized areas, or under other programmed conditions. The operator can then instruct the robot to ignore the event, or can take over remote control to deal with an intruder or to get better camera views of an emergency. The robot also regularly scans radio frequency identification (RFID) tags placed on stored inventory as it passes, and reports any missing items.
- Tactical Autonomous Combatant (TAC) units, described in Project Alpha study 'Unmanned Effects: Taking the Human out of the Loop'
- Autonomous Rotorcraft Sniper System is an experimental robotic weapons system being developed by the U.S. Army since 2005. It consists of a remotely operated sniper rifle attached to an unmanned autonomous helicopter. It is intended for use in urban combat or for several other missions requiring snipers. Flight tests are scheduled to begin in Summer 2009.
- The "Mobile Autonomous Robot Software" research program was started in December 2003 by the Pentagon, which purchased 15 Segways in an attempt to develop more advanced military robots. It was part of a $26 million Pentagon program to develop software for autonomous systems.
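The MIDARS-style patrol-and-alert behavior described above can be sketched as a simple control loop. This is a hypothetical illustration (the class and method names are invented here), not the actual MIDARS software:

```python
import random

class PatrolRobot:
    """Sketch of a patrol robot that follows random or preprogrammed
    routes and alerts a human overseer rather than acting on its own."""

    def __init__(self, waypoints, restricted_zones):
        self.waypoints = list(waypoints)
        self.restricted_zones = set(restricted_zones)
        self.alerts = []

    def next_waypoint(self, randomize=False):
        # Random patrol, or cycle through the preprogrammed route.
        if randomize:
            return random.choice(self.waypoints)
        wp = self.waypoints.pop(0)
        self.waypoints.append(wp)
        return wp

    def observe(self, zone, movement_detected):
        # Movement in an unauthorized area is escalated to the operator,
        # who decides whether to ignore it or take remote control.
        if movement_detected and zone in self.restricted_zones:
            self.alerts.append(zone)
            return "alert_operator"
        return "continue_patrol"
```

The key design point mirrored from the description is that the robot itself only detects and reports; the decision to intervene stays with the human operator.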
Effects and impact
Autonomous robotics would save and preserve human life by removing soldiers, who might otherwise be killed, from the battlefield. Lt. Gen. Richard Lynch of the United States Army Installation Management Command, and assistant Army chief of staff for installation management, stated at a conference:
As I think about what’s happening on the battlefield today ... I contend there are things we could do to improve the survivability of our service members. And you all know that’s true.
Major Kenneth Rose of the US Army's Training and Doctrine Command outlined some of the advantages of robotic technology in warfare:
Machines don't get tired. They don't close their eyes. They don't hide under trees when it rains and they don't talk to their friends ... A human's attention to detail on guard duty drops dramatically in the first 30 minutes ... Machines know no fear.
Increasing attention is also being paid to making robots more autonomous, with a view to eventually allowing them to operate on their own for extended periods of time, possibly behind enemy lines. For such functions, systems like the Energetically Autonomous Tactical Robot are being tried; it is intended to gain its own energy by foraging for plant matter. The majority of military robots are tele-operated and not equipped with weapons; they are used for reconnaissance, surveillance, sniper detection, neutralizing explosive devices, and similar tasks. Current robots that are equipped with weapons are tele-operated, so they are not capable of taking lives autonomously. The lack of emotion and passion in robotic combat is also considered a beneficial factor that could significantly reduce instances of unethical behavior in wartime. The goal is not to create "truly ethical" robots, but ones that comply with the laws of war (LOW) and rules of engagement (ROE). The fatigue, stress, emotion, and adrenaline that drive a human soldier's rash decisions are thus removed; there would be no effect on the battlefield caused by rash decisions made by individuals.
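A LOW/ROE compliance constraint of the kind described here can be sketched as a deny-by-default gate: the system refuses to engage inside restricted fire areas and still requires explicit human authorization. All names below are illustrative assumptions, not the API of any fielded weapons system:

```python
from dataclasses import dataclass

@dataclass
class CircularZone:
    """A restricted fire area, modeled here as a simple circle."""
    x: float
    y: float
    radius_m: float

    def contains(self, pos):
        px, py = pos
        return (px - self.x) ** 2 + (py - self.y) ** 2 <= self.radius_m ** 2

def engagement_permitted(target_pos, restricted_zones, human_authorized):
    """Deny-by-default ROE gate: refuse any target inside a restricted
    fire area, and require human-in-the-loop authorization otherwise."""
    if any(zone.contains(target_pos) for zone in restricted_zones):
        return False
    return bool(human_authorized)
```

The design choice worth noting is that both checks are conjunctive vetoes: a clear geofence result alone never fires the weapon, matching the article's point that current armed robots are tele-operated rather than autonomous.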
In 2009, academics and technical experts attended a conference to discuss the hypothetical possibility that robots and computers could become self-sufficient and able to make their own decisions, also known as the singularity. They discussed the possibility and extent to which computers and robots might acquire any level of autonomy, and to what degree they could use such abilities to pose a threat or hazard. They noted that some robots have acquired various forms of semi-autonomy, including the ability to find power sources on their own and to independently choose targets to attack with weapons. They also noted that some computer viruses can evade elimination and have achieved "cockroach intelligence".
Some experts and academics have questioned the use of robots for military combat, especially when such robots are given some degree of autonomous functions. The US Navy has funded a report which indicates that as military robots become more complex, there should be greater attention paid to implications of their ability to make autonomous decisions.
American soldiers have been known to name the robots that serve alongside them, sometimes after human friends, family members, celebrities, or pets, or simply after themselves. The 'gender' assigned to a robot may be related to the marital status of its operator.
Some have affixed fictitious medals to battle-hardened robots, and have even held funerals for destroyed robots. Interviews with 23 explosive ordnance disposal team members show that while they feel it is better to lose a robot than a human, they also feel anger and a sense of loss when a robot is destroyed. A survey of 746 people in the military showed that 80% either 'liked' or 'loved' their military robots, with more affection shown toward ground robots than aerial ones. Surviving dangerous combat situations together increases the level of bonding between soldier and robot, and current and future advances in artificial intelligence may further intensify that bond.
Human rights groups and NGOs such as Human Rights Watch and the Campaign to Stop Killer Robots have begun urging governments and the United Nations to issue policies outlawing the development of so-called "lethal autonomous weapons systems" (LAWS). The United Kingdom has opposed such campaigns, with the Foreign Office declaring that "international humanitarian law already provides sufficient regulation for this area".
In July 2015, over 1,000 experts in artificial intelligence signed a letter warning of the threat of an arms race in military artificial intelligence and calling for a ban on autonomous weapons. The letter was presented in Buenos Aires at the 24th International Joint Conference on Artificial Intelligence (IJCAI-15) and was co-signed by Stephen Hawking, Elon Musk, Steve Wozniak, Noam Chomsky, Skype co-founder Jaan Tallinn and Google DeepMind co-founder Demis Hassabis, among others.
See also
- Robot combat
- Unmanned combat air vehicle
- Powered exoskeleton
- Network-centric warfare
- DARPA Grand Challenge
- Multi Autonomous Ground-robotic International Challenge
- Three Laws of Robotics
- Nyagudi, Nyagudi Musandu. "Humanitarian Algorithms : A Codified Key Safety Switch Protocol for Lethal Autonomy" (PDF). Cornell University Library. Retrieved 27 October 2015.
- Nyagudi, Nyagudi Musandu. "Post-Westgate SWAT : C4ISTAR Architectural Framework for Autonomous Network Integrated Multifaceted Warfighting Solutions Version 1.0 : A Peer-Reviewed Monograph" (PDF). Cornell University Library. Retrieved 27 October 2015.
- Steve Coll, Ghost Wars (Penguin, 2005 edn), pp.529 and 658 note 6.
- Robots and Robotics at the Space and Naval Warfare Systems Center Pacific
- DARPA Grand Challenge: Home
- Technology Review: The Ascent of the Robotic Attack Jet
- Guardium Military robot
- Korean gun bots theregister.co.uk
- Schafer, Ron (July 29, 2003). "Robotics to play major role in future warfighting". United States Joint Forces Command. Retrieved 2013-04-30.
- Page, Lewis (21 April 2009). "Flying-rifle robocopter: Hovering sniper backup for US troops". The Register. Retrieved 2009-04-21.
- "U.S. Army Tests Flying Robot Sniper". Fox News. 2009-04-22. Retrieved 2009-04-23.
- Hambling, David (May 2009). "UAV Helicopter Brings Finesse to Airstrikes". Popular Mechanics. Retrieved 2009-04-21.
- Hambling, David (April 21, 2009). "Army Tests Flying Robo-Sniper". Wired, "Danger Room" blog. Retrieved 2009-04-21.
- "Military wants to transform Segway scooters into robots". seattlepi.com. 2003-12-02. Retrieved 2009-04-24.
- Pellerin, Cheryl (August 17, 2011). DoD News article. American Forces Press Service, U.S. Department of Defense, Washington. Retrieved 2015-07-28.
- "Robot soldiers". BBC News. 2002-04-12. Retrieved 2010-05-12.
- Hellström, Thomas (June 2013). "On the moral responsibility of military robots". Ethics and Information Technology. 15 (2): 99–107. doi:10.1007/s10676-012-9301-2.
- Lin, Patrick; Bekey, George; Abney, Keith (2009). "Robots in War: Issues of Risk and Ethics".
- Palmer, Jason (8/3/09). "Call for debate on killer robots". BBC News.
- New Navy-funded Report Warns of War Robots Going "Terminator", by Jason Mick (Blog), dailytech.com, February 17, 2009.
- Navy report warns of robot uprising, suggests a strong moral compass, by Joseph L. Flatley engadget.com, Feb 18, 2009.
- Nidhi Subbaraman. "Soldiers <3 robots: Military bots get awards, nicknames ... funerals". NBC News.
- Bowcott, Owen. "UN urged to ban 'killer robots' before they can be developed". The Guardian. Retrieved 2015-07-28.
- Bowcott, Owen. "UK opposes international ban on developing 'killer robots'". The Guardian. Retrieved 2015-07-28.
- Gibbs, Samuel. "Musk, Wozniak and Hawking urge ban on warfare AI and autonomous weapons". The Guardian. Retrieved 2015-07-28.
- "Musk, Hawking Warn of Artificial Intelligence Weapons". WSJ Blogs - Digits. 2015-07-27. Retrieved 2015-07-28.
- "Biomass military robot in development"
- EATR: Energetically Autonomous Tactical Robot - Phase II Project
Ethical and legal concerns
- Interviews on ethical and legal aspects of Robotic Combat Systems
- Public Say It's Illegal to Target Americans Abroad as Some Question CIA Drone Attacks, according to Fairleigh Dickinson University PublicMind poll - February 7, 2013
- The future of warfare: Why we should all be very afraid (2014-07-21), Rory Tolan, Salon
- Archive on air wars, Geographical Imaginations
- Logical Limitations to Machine Ethics, with Consequences to Lethal Autonomous Weapons. Also discussed in: Does the Halting Problem Mean No Moral Robots?
- Robots in War: Issues of Risk and Ethics - 2009
- United States Joint Forces Command website: "Leading the transformation of the U.S. military"
- irobot.com, builder of the PackBot and the R-Gator systems
- Boston Dynamics, builder of BigDog
News articles/press releases
- USJFC: 'Robotics to play major role in future warfighting'
- "From bomb disposal to waste disposal" Robots help deal with hazards, emergencies and disasters (International Electrotechnical Commission, July 2011)
- "War robots still in Iraq", DailyTech, April 17, 2008
- New Model Army Soldier Rolls Closer to Battle (SWORDS)
- TALON Small Mobile Robot
- TWG Military Robots
- Carnegie Mellon University's snooping robot going to Iraq
- PackBot Battlefield robotic Platform
- Miniature Unmanned Aerial Systems - UAV
- Guardium Autonomous Security Vehicle
- Unmanned Ground Systems from Israel
- High-Tech Military in Due Course
- Launching a new kind of warfare
- Gerry J. Gilmore (January 24, 2006). "Army's Veteran Bomb-Disposal Robot Now 'Packs Heat'". American Forces Press Service. Retrieved 2008-02-02.
- As Wars End, Robot Field Faces Reboot April 11, 2012