89. “The Queue”

[Editor’s Note: Mad Scientist Laboratory is pleased to present our September edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]


1. “Can you tell a fake video from a real one?” and “How hard is it to make a believable deepfake?” by Tim Leslie, Nathan Hoad, and Ben Spraggon, Australian Broadcasting Corporation (ABC) News, 26 and 27 September 2018, respectively.


Deep Video Portraits by Hyeongwoo Kim, Pablo Garrido, Ayush Tewari, Weipeng Xu, Justus Thies, Matthias Nießner, Patrick Perez, Christian Richardt, Michael Zollhöfer, and Christian Theobalt, YouTube, 17 May 2018.

Mad Scientist has previously sounded the alarm regarding the advent and potential impact of “DeepFakes” – deceptive files created using artificial neural networks and graphics processors that yield nearly undetectably fake imagery and videos. When distributed via Social Media, these files have the potential to “go viral” — duping, deceiving, and manipulating whole populations of viewers.

ABC’s first news piece provides several video samples, enabling you to test your skill at detecting which of the videos are real and which are fake. ABC then goes on to warn that “We are careening toward an infocalypse,” where we may soon find ourselves living in “a world without truth.”

Source: ABC News

In their second piece, ABC delves into the step-by-step mechanics of how DeepFakes are created, using former Australian PM Malcolm Turnbull as a use case, and posits placing this fabricated imagery into different, possibly compromising, scenes, manipulating reality for a credulous public.

The Deep Video Portraits YouTube video (snippets of which were used to illustrate both of the aforementioned ABC news pieces) was presented at SIGGRAPH 2018 (themed “Generations”), convened in Vancouver, BC, on 12-16 August 2018. In conjunction with the ABC articles, the combined narration and video in Deep Video Portraits provide a comprehensive primer on how photorealistic, yet completely synthetic (i.e., fictional), re-animations can be accomplished using source and target videos.

Source: Deep Video Portraits – SIGGRAPH 2018 via YouTube /
Christian Theobalt

When combined with the ubiquity of Social Media, these public domain AI algorithms (e.g., FakeApp, DerpFakes, DeepFakes) are democratizing an incredibly disruptive capability. The U.S. must develop and implement means (e.g., education) to “inoculate” its citizenry and mitigate this potentially devastating Gray Zone weapon.

“Attacking an adversary’s most important center of gravity — the spirit of its people — no longer requires massive bombing runs or reams of propaganda. All it takes is a smartphone and a few idle seconds. And anyone can do it.” — P.W. Singer and Emerson T. Brooking in LikeWar – The Weaponization of Social Media


2. “The first ‘social network’ of brains lets three people transmit thoughts to each other’s heads,” by Emerging Technology from the arXiv, MIT Technology Review, 29 September 2018.

In 2015, scientists at the University of Washington in Seattle connected two people via a brain-to-brain interface, enabling them to play a game of 20 questions. Now these scientists have announced the first group brain-to-brain network, dubbed “BrainNet,” whose participants were able to play a collaborative Tetris-like game.

Source: BrainNet: A Multi-Person Brain-to-Brain Interface for
Direct Collaboration Between Brains / https://arxiv.org/pdf/1809.08632.pdf

To date, our exploration of the Future Operational Environment has described the exploding Internet of Things and even the emerging concept of an Internet of Battle Things. The common idea is connecting things – sensors, weapons, and AI – to a human in or on the loop. Adding the brain to this network opens up incredible opportunities and vulnerabilities, and we should start asking ourselves questions about it: 1) Could humans control connected sensors and weapons with thought alone? 2) Could this be a form of secure communications in the future? 3) Could the brain be hacked, and what vulnerabilities does this add? (Read Battle of the Brain.) There are many more questions, but for now perhaps we should broaden our ideas about connectivity to the Internet of Everything and Everyone.


3. “Scientists get funding to grow neural networks in petri dishes,” Lehigh University, 14 September 2018.

An overview of running image recognition on living neuron testbed / Source: Xiaochen Guo / Lehigh University

The future of computing may not necessarily be silicon- or quantum-based — it may be biological! The National Science Foundation (NSF) recently awarded an interdisciplinary team of biologists and computer engineers a $500,000 grant in support of Understanding the Brain and the BRAIN Initiative, a coordinated research effort that seeks to accelerate the development of new neurotechnologies. The intent is to help computer engineers develop new ways to think about the design of solid-state machines, and the work may influence other brain-related research using optogenetics, a biological technique that uses light to control cells. Input will be encoded as “spike train stimuli” — similar to a two-dimensional bar code — and then optically applied to a group of networked in vitro neurons with optogenetic labels. This hybrid project could lead to a better understanding of how organic computers and brains function, and it suggests a radically different vision of future computing in which, potentially, everything from buildings to computers could be grown in much the same way that we “grow” plants or animals today.
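As a toy illustration of the encoding idea only (not the Lehigh team's actual method), the sketch below converts a small grayscale image into binary spike trains, producing the kind of two-dimensional, bar-code-like pattern described above. The function name, quantization scheme, and parameter choices are assumptions for illustration.

```python
import numpy as np

def image_to_spike_trains(image, levels=8):
    """Encode a grayscale image as binary spike trains.

    Each pixel's intensity is quantized into a spike count, and each
    count is expanded into a short binary pulse pattern. Stacking the
    patterns row by row yields a 2-D, bar-code-like array, loosely
    analogous to the optical "spike train stimuli" described above.
    """
    img = np.asarray(image, dtype=float)
    # Normalize intensities to [0, 1]
    img = (img - img.min()) / (np.ptp(img) + 1e-12)
    # Quantize each pixel to an integer spike count in 0..levels
    counts = np.round(img * levels).astype(int)
    # Expand each pixel into a binary train: `count` spikes, then silence
    trains = np.zeros(counts.shape + (levels,), dtype=int)
    for n in range(1, levels + 1):
        trains[..., n - 1] = (counts >= n).astype(int)
    # Flatten the per-pixel trains so each image row is one long train
    return trains.reshape(img.shape[0], -1)

# A 2x2 test "image": dark-to-bright gradient
demo = image_to_spike_trains([[0, 85], [170, 255]], levels=4)
```

Each row of the result is one long train of 0s and 1s, with brighter pixels firing more spikes; in the experiment described above, such a pattern would drive the light source stimulating the labeled neurons.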


4. “These ‘Robotic Skins’ Turn Everyday Objects into Robots,” by Rachael Lallensack, Smithsonian.com, 19 September 2018, reviewed by Ms. Marie Murphy.

Source: Yale via Smithsonian.com

A team of roboticists at Yale University has published a report announcing the development of OmniSkins: pliable, sensor-embedded sheets that animate ordinary, inert objects on contact, effectively turning them into robots. These removable sheets can be reused and reconfigured for a variety of functions, from making foam tubes crawl like worms to turning static foam arms into a device that can grab and hold objects. Initially developed for NASA, OmniSkins have been demonstrated making a stuffed animal walk when wrapped around its legs and correcting a person’s posture when embedded in their shirt. While these are fun examples, the realistic military applications are vast and varied. OmniSkins could represent a new development in performance-enhancing exoskeletons, enabling them to be lighter and more flexible. These sheets can turn ordinary objects into useful machines in a matter of minutes and can be fitted with cameras or other sensors to suit the needs of the mission.


5. “Movement-enhancing exoskeletons may impair decision making,” by Jennifer Chu, MIT, 5 October 2018.

PowerWalk / Source: Bionic Power Inc. via MIT

Researchers from MIT have discovered that using exoskeletons to enhance speed, power, and endurance could have a negative effect on attention, decision-making, and cognition. The researchers had subjects wearing an exoskeleton negotiate an obstacle course while performing several cognitive tasks, from responding to visual signals to following their squad leader at a defined distance. Seven of the twelve subjects (more than half) showed a marked decline in reaction time and performed worse on these cognitive tests while wearing the exoskeleton. This presents an interesting challenge for technology developers. Does a positive solution in one area negatively affect another, seemingly unrelated, area? Would the subjects have performed better with prolonged exoskeleton training as opposed to a few days? If so, this presents an additional burden and training demand on Soldiers and the Army. Will trade studies involving not just physical measures but cognitive ones now need to be integrated into all new Army technology developments, and what does this do to the development timeline?


6. “Researchers Create ‘Spray On’ 2-D Antennas,” by Michael Koziol, IEEE Spectrum, 21 September 2018.

Drexel’s MXene “Antenna Spray Paint” / Source: YouTube via IEEE Spectrum

Researchers from Drexel University have developed a novel solution for reducing the size and weight of traditional antennas. Using MXene, a two-dimensional material in which a metal such as titanium or molybdenum is bonded with carbides or nitrides, they were able to produce a spray-on antenna. By dissolving the MXene in water and using a commercial off-the-shelf spray gun, one can rapidly design, customize, and deploy a working antenna. The spray-on antenna is 100 nm thick (versus roughly 3,000 nm for a traditional copper antenna) and has a higher conductivity than carbon nanotubes – a previous solution to the small, thin antenna problem. On a hyperactive battlefield where Soldiers may need on-demand solutions on a compressed timeline, MXene spray-on antennas may be a potential game changer. How much time, materials, and processing could be saved in an operational environment if a Soldier could quickly produce a low-profile antenna to a custom specification? What does this mean for logistics if repair parts for antennas no longer need to be shipped from outside the theater of operations?


7. “NASA’s Asteroid-Sampling Spacecraft Begins Its Science Work Today,” by Mike Wall, Space.com, 11 September 2018.

NASA Infographic on the OSIRIS-REx Mission / Source: https://www.space.com/11808-nasa-asteroid-mission-osiris-rex-1999-rq36-infographic.html

NASA’s OSIRIS-REx (short for Origins, Spectral Interpretation, Resource Identification, Security – Regolith Explorer) spacecraft commenced studying near-Earth asteroid Bennu’s dust plumes from afar on 11 September 2018. Once the probe achieves orbit around the ~500m-wide space rock on 31 December 2018, it will further explore that body’s dust, dirt, and gravel. Then, in mid-2020, OSIRIS-REx will swoop down to the surface to collect a sample of material and return it to Earth in a special return capsule. While this piece represents very cool extraterrestrial science, it is also significant for what it bodes for the Future Operational Environment, Multi-Domain Operations in the Space Domain, and the proposed Space Force.

“The $800 million OSIRIS-REx mission will … contribute to planetary-defense efforts. For example, the probe’s observations should help researchers better understand the forces that shape potentially dangerous asteroids’ paths through space… (Bennu itself is potentially hazardous; there’s a very small chance that it could hit Earth in the late 22nd century.)”

OSIRIS-REx is not the only probe sampling asteroids – Japan’s Hayabusa2 spacecraft is preparing to touch down on the asteroid Ryugu this month. NASA has estimated that the total value of resources locked in asteroids is equivalent to $100 billion for every man, woman, and child on Earth.

This century’s new space race to capitalize on and exploit our solar system’s heretofore untapped mineral wealth, while defending critical space assets, will demand that the U.S. budgets for, develops, and maintains future space-based capabilities (initially unmanned, but eventually manned, as required by mission) to protect and defend our national and industrial space interests.


8. “Soldiers who obliterate enemy fighters with drones will be guided on the morality of their actions by specially trained army chaplains,” by Roy Tingle, Daily Mail Online, 25 September 2018.

Source: Defense Visual Information Distribution Service (DIVIDS)

In possibly an all-time record for the worst news article title, it has been revealed that the British Army is training ethicists to teach soldiers about the morality of killing with drones. Chaplains will spend one year studying for a Master’s degree in Ethics at Cardiff University so that they can instruct officers on the moral dilemmas involved in killing an enemy from thousands of miles away. Officials have long been concerned about the emotional trauma suffered by drone pilots, as well as the risk that they will be more likely to use deadly force when the confrontation plays out on a computer screen. At issue are the speed of future combat and the decisive action that will be required on tomorrow’s battlefield. War will remain a human endeavor, but our Soldiers will be stressed to exercise judgment and fight at ever-increasing machine speed. The Army must be prepared to enter new ethical territory and make difficult decisions about the creation and employment of cutting-edge technologies. While the Army holds itself to a high ethical standard, new converging technologies may come at an ethical cost. Guidance, policy, and law must be updated to keep pace with what is employed on the battlefield. Many of these ethical dilemmas and questions lack definitive answers, and they are considerations that most of our future adversaries are unlikely to entertain.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

87. LikeWar — The Weaponization of Social Media

[Editor’s Note: Regular readers will note that one of our enduring themes is the Internet’s emergence as a central disruptive innovation. With the publication of proclaimed Mad Scientist P.W. Singer and co-author Emerson T. Brooking’s LikeWar – The Weaponization of Social Media, Mad Scientist Laboratory addresses what is arguably the most powerful manifestation of the internet — Social Media — and how it is inextricably linked to the future of warfare. Messrs. Singer and Brooking’s new book is essential reading if today’s Leaders (both in and out of uniform) are to understand, defend against, and ultimately wield the non-kinetic, yet violently manipulative effects of Social Media.]

“The modern internet is not just a network, but an ecosystem of 4 billion souls…. Those who can manipulate this swirling tide, steer its direction and flow, can… accomplish astonishing evil. They can foment violence, stoke hate, sow falsehoods, incite wars, and even erode the pillars of democracy itself.”

As noted in The Operational Environment and the Changing Character of Future Warfare, Social Media and the Internet of Things have spawned a revolution that has connected “all aspects of human engagement where cognition, ideas, and perceptions, are almost instantaneously available.” While this connectivity has been a powerfully beneficial global change agent, it has also amplified human foibles and biases. Authors Singer and Brooking note that humans are by nature social creatures who tend to gravitate into like-minded groups. We “Like” and share things online that resonate with our own beliefs. We also tend to believe what resonates with us and our community of friends.

“Whether the cause is dangerous (support for a terrorist group), mundane (support for a political party), or inane (belief that the earth is flat), social media guarantees that you can find others who share your views and even be steered to them by the platforms’ own algorithms… As groups of like-minded people clump together, they grow to resemble fanatical tribes, trapped in echo chambers of their own design.”

Weaponization of Information

The advent of Social Media less than 20 years ago has changed how we wage war.

“Attacking an adversary’s most important center of gravity — the spirit of its people — no longer requires massive bombing runs or reams of propaganda. All it takes is a smartphone and a few idle seconds. And anyone can do it.”

Nation states and non-state actors alike are leveraging social media to manipulate like-minded populations’ cognitive biases to influence the dynamics of conflict. This continuous on-line fight for your mind represents “not a single information war but thousands and potentially millions of them.”


LikeWar provides a host of examples describing how contemporary belligerents are weaponizing Social Media to augment their operations in the physical domain. Regarding the battle to defeat ISIS and re-take Mosul, authors Singer and Brooking note that:

Social media had changed not just the message, but the dynamics of conflict. How information was being accessed, manipulated, and spread had taken on new power. Who was involved in the fight, where they were located, and even how they achieved victory had been twisted and transformed. Indeed, if what was online could swing the course of a battle — or eliminate the need for battle entirely — what, exactly, could be considered ‘war’ at all?

Even American gang members are entering the fray as super-empowered individuals, leveraging social media to instigate killings via “Facebook drilling” in Chicago or “wallbanging” in Los Angeles.

And it is only “a handful of Silicon Valley engineers,” along with their brother and sister technocrats in Beijing, St. Petersburg, and a few other global hubs of Twenty-first Century innovation, who are forging and then unleashing the code that is democratizing this virtual warfare.

Artificial Intelligence (AI)-Enabled Information Operations

Seeing is believing, right? Not anymore! Clumsy photoshopped images, grainy fabricated videos, and poorly executed CGI have given way to sophisticated Deepfakes: AI algorithms that create nearly undetectable fake images, videos, and audio tracks, which then go viral on-line to dupe, deceive, and manipulate. This year, FakeApp was launched as free software, enabling anyone with an artificial neural network and a graphics processor to create and share bogus videos via Social Media. Each Deepfake video that:

“… you watch, like, or share represents a tiny ripple on the information battlefield, privileging one side at the expense of others. Your online attention and actions are thus both targets and ammunition in an unending series of skirmishes.”

Just as AI is facilitating these distortions in reality, the race is on to harness AI to detect and delete these fakes and prevent “the end of truth.”

If you enjoyed this post:

– Listen to the accompanying playlist composed by P.W. Singer while reading LikeWar.

– Watch P.W. Singer’s presentation on Meta Trends – Technology, and a New Kind of Race from Day 2 of the Mad Scientist Strategic Security Environment in 2025 and Beyond Conference at Georgetown University, 9 August 2016.

– Read more about virtual warfare in the following Mad Scientist Laboratory blog posts:

— MAJ Chris Telley’s Influence at Machine Speed: The Coming of AI-Powered Propaganda

— COL(R) Stefan J. Banach’s Virtual War – A Revolution in Human Affairs (Parts I and II)

— Mad Scientist Initiative’s Personalized Warfare

— Ms. Marie Murphy’s Virtual Nations: An Emerging Supranational Cyber Trend

— Lt Col Jennifer Snow’s Alternet: What Happens When the Internet is No Longer Trusted?

55. Influence at Machine Speed: The Coming of AI-Powered Propaganda

[Editor’s Note: Mad Scientist Laboratory is pleased to present the following guest blog post by MAJ Chris Telley, U.S. Army, assigned to the Naval Postgraduate School, addressing how Artificial Intelligence (AI) must be understood as an Information Operations (IO) tool if U.S. defense professionals are to develop effective countermeasures and ensure our resilience to its employment by potential adversaries.]

AI-enabled IO present a more pressing strategic threat than the physical hazards of slaughter-bots or even algorithmically-escalated nuclear war. IO are efforts to “influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries;” here, we’re talking about using AI to do so. AI-guided IO tools can empathize with an audience to say anything, in any way needed, to change the perceptions that drive those physical weapons. Future IO systems will be able to individually monitor and affect tens of thousands of people at once. Defense professionals must understand the fundamental influence potential of these technologies if they are to drive security institutions to counter malign AI use in the information environment.

Source: Peter Adamis / Abalinx.com

Programmatic marketing, which uses consumers’ data habits to drive real-time automated bidding on personalized advertising, has been in use for several years now. Cambridge Analytica’s Facebook targeting made international headlines using similar techniques, but digital electioneering is just the tip of the iceberg. An AI trained on data from users’ social media accounts, economic media interactions (Uber, Apple Pay, etc.), and their devices’ positional data can infer predictive knowledge of its targets. With that knowledge, emerging tools — like Replika — can truly befriend a person, allowing it to train that individual, for good or ill.

Source: Getty Creative

Substantive feedback is required to train an individual’s response; humans tend to respond best to content and feedback with which they agree. That content can be algorithmically mass produced. For years, Narrative Science tools have helped writers create sports stories and stock summaries, but it’s just as easy to use them to create disinformation. That’s just text, though; today, the AI can create fake video. A recent warning, ostensibly from former President Obama, provides an entertaining yet frightening demonstration of how Deepfakes will challenge our presumptions about truth in the coming years. The Defense Advanced Research Projects Agency (DARPA) is funding a project this summer to determine whether AI-generated Deepfakes will become impossible to distinguish from the real thing, even using other AI systems.

Malign actors can now employ AI to lie “at machine speed,” but they still have to get the story to an audience. Russian bot armies continue to make headlines doing this very thing. The New York Times maintains about a dozen Twitter feeds and produces around 300 tweets a day; Russia’s Internet Research Agency (IRA) regularly puts out 25,000 tweets in the same twenty-four hours. The IRA’s bots are really just low-tech curators: they collect, interpret, and display desired information to promote the Kremlin’s narratives.

Source: Josep Lago/AFP/Getty Images

Next-generation bot armies will employ far faster computing techniques and profit from an order-of-magnitude greater network speed once 5G services are fielded. If “Repetition is a key tenet of IO execution,” then this machine-gun-like ability to fire information at an audience will, with empathetic precision and custom content, provide the means to change a decisive audience’s very reality. No breakthrough science is needed, no bureaucratic project office required. The pieces are already there, waiting for an adversary to put them together.

The DoD is looking at AI but remains focused on image classification and swarming quadcopters while ignoring the convergent possibilities of predictive audience understanding, tailored content production, and massive scale dissemination. What little digital IO we’ve done, sometimes called social media “WebOps,” has been contractor heavy and prone to naïve missteps. However, groups like USSOCOM’s SOFWERX and the students at the Naval Postgraduate School are advancing the state of our art. At NPS, future senior leaders are working on AI, now. A half-dozen of the school’s departments have stood up classes and events specifically aimed at operationalizing advanced computing. The young defense professionals currently working on AI should grapple with emerging influence tools and form the foundation of the DoD’s future institutional capabilities.

MAJ Chris Telley is an Army information operations officer assigned to the Naval Postgraduate School. His assignments have included theater engagement at U.S. Army Japan and advanced technology integration with the U.S. Air Force. Chris commanded in Afghanistan and served in Iraq as a United States Marine. He tweets at @chris_telley.

This blog post represents the opinions of the author and does not reflect the position of the Army or the United States Government.