251. The Convergence: The Next Iteration of Warfare with Lisa Kaplan

[Editor’s Note: Mad Scientist Laboratory is pleased to announce our latest episode of “The Convergence” podcast, featuring Lisa Kaplan discussing weaponized information as a national security problem, algorithmic silos created by social media, and disinformation as the next iteration of warfare. Please note that this podcast and several of the embedded links below are best accessed via a non-DoD network due to network priorities for teleworking — Enjoy!]

In this latest episode of “The Convergence,” we talk with Lisa Kaplan, who founded Alethea Group to help organizations navigate the new digital reality and protect themselves against disinformation. Ms. Kaplan served as digital director for Senator Angus King’s 2018 campaign, where she designed and executed a strategy to identify, understand, and respond to disinformation. She is one of the few people with firsthand experience combating disinformation on the campaign trail. Ms. Kaplan has also briefed U.S., NATO, EU, and G-7 policymakers and officials on disinformation. Previously, she consulted with PwC for the U.S. State Department and served as a U.S. Senate aide.

In this episode, we talk with Ms. Kaplan about weaponized information as a national security problem, algorithmic silos created by social media, and disinformation as the next iteration of warfare. Some of the highlights from our interview include the following:

      • Disinformation is a national security problem manifesting itself in politics. Open source information can be leveraged to create effective digital strategies to counter this rapidly proliferating threat.
      • Social media algorithms create algorithmic silos: personal echo chambers that create individual realities for users. This retention-driven design is producing increasingly polarized information spaces, and as algorithms grow more powerful over time, the problem will only intensify.
      • Disinformation will become the next iteration of warfare, as it is comparatively inexpensive and easy to use. Bad actors can leverage algorithmic silos to target their disinformation at vulnerable populations. As a result, the government should identify vulnerable populations and develop plans to support them.
      • The proliferation of fringe and conspiracy media outlets will make it difficult to know which information to trust. We should begin examining the long-term impacts on children growing up in this environment, particularly their attitudes toward U.S. competitors.
      • We are all targets of disinformation, so we can all combat it. Thinking before you share, reading critically, verifying sources and authors, and avoiding sensationalized media can reduce the impact of disinformation. Remember, you are likely a trusted source to those around you.
      • Conversations about disinformation trends are an important part of combating this threat. The U.S. Government has unmatched capacity to address disinformation, but needs to work towards legislation that will allow it to act in this space.

Stay tuned to the Mad Scientist Laboratory for our next podcast on 23 July 2020, featuring LTC Arnel David, U.S. Army, and Maj Aaron Moore, British Army, as they discuss Fight Club, the current revolution in Professional Military Education, and the role of Artificial Intelligence in future military operations!

If you enjoyed this post, check out the following related posts:

The Convergence: True Lies – The Fight Against Disinformation with Cindy Otis

GEN Z and the OE: 2020 Final Findings

The Death of Authenticity: New Era Information Warfare

LikeWar — The Weaponization of Social Media

… and videos [via a non-DoD network]:

Weaponization of Social Media and Fictional Intelligence (FICINT), with Peter W. Singer and August Cole

AI and Manufacturing Reality, with Drs. Marek Posard and Christopher Paul

The Storm After the Flood Virtual Wargame, moderated by Dr. Gary Ackerman

>>> REMINDER 1: The Mad Scientist Initiative will facilitate the next webinar in our Weaponized Information Virtual Events series next week on Wednesday, 15 July 2020:

AI Speeding Up Disinformation — This virtual panel will be moderated by Lt Gen John D. Bansemer (USAF-Ret.), and will feature Dr. Margarita Konaev, Katerina Sedova, and Tim Hwang, all from Georgetown University’s Center for Security and Emerging Technology.

In order to participate in this virtual webinar, you must first register here [via a non-DoD network].

>>> REMINDER 2: We will facilitate our Weaponized Information Virtual Conference, co-sponsored by Georgetown University and the Center for Advanced Red Teaming, University at Albany, SUNY, on Tuesday, 21 July 2020. The draft agenda for this event can be viewed here. In order to participate in this virtual conference, you must first register here [via a non-DoD network].

>>> REMINDER 3: If you missed participating in any of the previous webinars in our Mad Scientist Weaponized Information Virtual Events series — no worries! You can watch them again here [via a non-DoD network] and explore all of the associated content (presenter biographies, slide decks, scenarios, and notes from each of the presentations) here.
