(Editor’s Note: Mad Scientist Laboratory is pleased to present the second guest blog post by Dr. Richard Nabors, Associate Director for Strategic Planning and Deputy Director, Operations Division, U.S. Army Research, Development and Engineering Command (RDECOM) Communications-Electronics Research, Development and Engineering Center (CERDEC), addressing how Augmented and Mixed Reality are the critical elements required for integrated sensor systems to become truly operational and support Soldiers’ needs in complex environments.
Dr. Nabors’ previous guest post addressed how the proliferation of sensors, integrated via the Internet of Battlefield Things [IoBT], will provide Future Soldiers with the requisite situational awareness to fight and win in increasingly complex and advanced battlespaces.)
Speed has always been, and will remain, a critical component in assuring military dominance. Historically, the military has sought to increase the speed of its jets, ships, tanks, and missiles. However, one of the greatest leaps is still to come: the ability to significantly increase the speed of decision-making by individuals at the small-unit level.
To maximize individual and small unit initiative to think and act flexibly, Soldiers must receive as much relevant information as possible, as quickly as possible. Integrated sensor technologies can provide situational awareness by collecting and sorting real-time data and sending a fusion of information to the point of need, but that information must be processed quickly in order to be operationally effective. Augmented Reality (AR) and Mixed Reality (MR) are two of the most promising solutions to this challenge facing the military and will eventually make it possible for Soldiers to instantaneously respond to an actively changing environment.
AR and MR function in real-time, bringing the elements of the digital world into a Soldier’s perceived real world, resulting in optimal, timely, and relevant decisions and actions. AR and MR allow for the overlay of information and sensor data into the physical space in a way that is intuitive, serves the point of need, and requires minimal training to interpret. AR and MR will enable the U.S. military to survive in complex environments by decentralizing decision-making from mission command and placing substantial capabilities in Soldiers’ hands in a manner that does not overwhelm them with information.
On a Soldier’s display, AR can render useful battlefield data in the form of camera imaging and virtual maps, aiding a Soldier’s navigation and battlefield perspective. Special indicators can mark people and various objects to warn of potential dangers. Soldier-borne, palm-size reconnaissance copters with sensors and video can be directed and tasked instantaneously on the battlefield. Information can be gathered by unattended ground sensors and transmitted to a command center, with AR and MR serving as a networked communication system between military leaders and the individual Soldier. Used in this way, AR and MR increase Soldier safety and lethality.
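The idea of marking people and objects on a Soldier's display without overwhelming the wearer can be illustrated with a minimal sketch. The code below is purely hypothetical (the `Contact` fields, the scoring formula, and the marker cap are all illustrative assumptions, not any fielded system's design): it ranks sensor-derived contacts by a simple nearness-and-threat score and keeps only the top few for rendering.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Contact:
    """A sensor-derived object that could be rendered as an AR marker."""
    label: str    # e.g. "person", "vehicle" (illustrative categories)
    x: float      # meters east of the Soldier
    y: float      # meters north of the Soldier
    threat: float # 0.0 (benign) .. 1.0 (hostile), from upstream sensor fusion

def overlay_priority(contact: Contact) -> float:
    """Score a contact: nearer and more threatening means higher priority.
    The 100 m scale factor is an arbitrary illustrative choice."""
    distance = hypot(contact.x, contact.y)
    return contact.threat / (1.0 + distance / 100.0)

def select_markers(contacts, max_markers=3):
    """Keep only the highest-priority markers so the display stays uncluttered."""
    ranked = sorted(contacts, key=overlay_priority, reverse=True)
    return ranked[:max_markers]

contacts = [
    Contact("person", 20.0, 10.0, threat=0.9),
    Contact("vehicle", 300.0, 50.0, threat=0.8),
    Contact("person", 15.0, -5.0, threat=0.1),
    Contact("drone", 80.0, 40.0, threat=0.7),
]
top = select_markers(contacts, max_markers=2)
print([c.label for c in top])  # the nearby hostile and the drone win out
```

A real system would weigh far more factors (mission, terrain, unit state), but the design point survives the simplification: the filtering happens before the information reaches the Soldier, which is what keeps the overlay intuitive rather than overwhelming.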
In the near-term, the Army Research and Development (R&D) community is investing in the following areas:
• Reliable position tracking devices that self-calibrate for head orientation of head-worn sensors.
• Ultralight, ultrabright, ultra-transparent display eyewear with wide field of view.
• Three-dimensional viewers with battlefield terrain visualization, incorporating real-time data from unmanned aerial vehicles, etc.
In the mid-term, R&D activities are focusing on:
• Manned vehicles that use onboard sensors and processing to move autonomously, tasked with Soldier protection.
• Robotic assets, tele-operated, semi-autonomous, or autonomous and imbued with intelligence, with limbs that can keep pace with Soldiers and act as teammates.
• Robotic systems with multiple sensors that respond to environmental factors affecting the mission, or with self-deploying camouflage that stays deployed while executing maneuvers.
• Enhanced reconnaissance through deep-penetration mapping of building layouts, cyber activity, and subterranean infrastructure.
Once AR and MR prototypes and systems have seen widespread use, the far term focus will be on automation that could track and react to a Soldier’s changing situation by tailoring the augmentation the Soldier receives and by coordinating across the unit.
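The far-term notion of automation that tailors augmentation to a Soldier's changing situation can be sketched as a simple rule-based filter. Everything below is a hypothetical illustration (the layer names, priorities, and situation labels are invented for this sketch): as the assessed situation escalates, lower-priority overlay layers are automatically dropped.

```python
# Hypothetical sketch: tailor which AR overlay layers a Soldier receives
# based on the current situation, shedding low-priority layers under stress.

LAYER_PRIORITY = {
    "threat_markers": 3,    # highest priority: always shown
    "navigation": 2,
    "unit_positions": 2,
    "logistics_status": 1,  # lowest: shown only when the situation is calm
}

# Minimum priority a layer needs in order to be displayed in each situation.
SITUATION_THRESHOLD = {
    "patrol": 1,      # calm: show everything
    "movement": 2,    # en route: drop administrative layers
    "contact": 3,     # in contact: threat markers only
}

def tailor_overlay(situation: str) -> list[str]:
    """Return the overlay layers to render for the given situation."""
    threshold = SITUATION_THRESHOLD[situation]
    return sorted(
        layer for layer, prio in LAYER_PRIORITY.items() if prio >= threshold
    )

print(tailor_overlay("contact"))  # only the threat markers survive
```

In practice the "situation" input would itself come from sensors and unit-level coordination rather than a label, but the sketch shows the shape of the automation: the system, not the Soldier, decides moment to moment which augmentation is worth attention.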
In addition, AR and MR will revolutionize training, empowering Soldiers to train as they fight. Soldiers will be able to use real-time sensor data from unmanned aerial vehicles to visualize battlefield terrain with geographic awareness of roads, buildings, and other structures before conducting their missions. They will be able to rehearse courses of action and analyze them before execution to improve situational awareness. AR and MR are increasingly valuable aids to tactical training in preparation for combat in complex and congested environments.
AR and MR are the critical elements required for integrated sensor systems to become truly operational and support Soldiers’ needs in complex environments. Solving the challenge of how and where to use AR and MR will enable the military to get full value from its investments in complex integrated sensor systems.
For more information on how the convergence of technologies will enhance Soldiers on future battlefields, see:
– The discussion on advanced decision-making in An Advanced Engagement Battlespace: Tactical, Operational and Strategic Implications for the Future Operational Environment, published by our colleagues at Small Wars Journal.
– Dr. James Canton’s presentation from the Mad Scientist Robotics, Artificial Intelligence, & Autonomy Conference at Georgia Tech Research Institute last March.
– Dr. Rob Smith’s Mad Scientist Speaker Series presentation on Operationalizing Big Data, where he addresses the applicability of AR to sports and games training as an analogy to combat training (noting “Serious sport is war minus the shooting” — George Orwell).
Dr. Richard Nabors is Associate Director for Strategic Planning, US Army CERDEC Night Vision and Electronic Sensors Directorate.
4 Replies to “48. Warfare at the Speed of Thought”
From a “thought grenade” aspect, perhaps AR/MR would provide a measure of negative/positive feedback for the troops, such as the “catch-up” behavior of the power loader used by Ripley in the movie “Aliens”: a foot moves forward in the harness and loses contact with the machine, so the machine responds by “catching up” to Ripley’s foot, always seeking to maintain contact with the human.
Another aspect this post jogged loose in my mind was the movie “Ready Player One,” which I just saw the other night: AR driven by a massive AI generating avatars, plus feedback “suits” worn by the players that are sensitive enough to transmit touch and attack (but cut off if the player is ‘killed’), from an interactive/training standpoint.
And a really good movie!
Re: information and small units, Linda Nagata has a couple of books about a “Linked Combat Squad”. Each soldier in the squad has some small brain implants. These allow communications within the squad, with analysts back home, and with local UAS (which also serve as the radio relay).
She also makes the point that the soldiers can become dependent on the implants because it helps regulate their brain chemistry, which I think is a good example of a possible side effect that those of us in the technology business should keep in mind.
I am heartened by the discussion on speeding up decision-making through the exploration of AR/MR solutions. However, decision-making is more than technology – it requires a paradigm shift in thinking. Our leaders must first explore the limits/blinders of metacognition in the Department of Defense (DoD). DARPA and service level equivalents (e.g., the Marine Corps Warfighting Lab) are actively working technical solutions to help commanders “control” the battlefield of the future. But what we need is more research on how to best “command” the battlefield of the future.
Brain-Computer/Machine Interface and Whole Brain Emulation will help the commander and operator control the battlespace, but only when we advance our cognitive abilities to rapidly process the data into knowledge will we achieve “warfare at the speed of thought.”
I’m interested in how we can reduce cognitive load by migrating information inputs from the visual and auditory channels to haptics.