Phoenix Challenge® London 2025

Abstract Submission

2025 Phoenix Challenge® - The Disinformation Kill Chain
Tuesday, March 4th through Thursday, March 6th
IET London: Savoy

A “kill chain” is a military term for a process composed of ordered steps or stages used to analyze an attack. First used by cyber defense professionals, the concept is now increasingly applied by counter-disinformation practitioners as a means to disrupt potential disinformation attackers during the following six “kill chain” stages:

  1. Reconnaissance: malign actors evaluate the operational environment to find weaknesses in the audience they are trying to manipulate.
  2. Asset Development: malign actors mobilize relevant groups and communities; create believable personas; develop social media accounts and websites; and pre-position other influence assets.
  3. Content Development: malign actors develop compelling content such as memes, photos and videos, often using artificial intelligence, to increase the volume of content and generate synthetic content, such as deepfakes.
  4. Deployment: malign actors use assets to deploy content on the internet or offline.
  5. Amplification: malign actors use bots, social engineering techniques, and coordinated networks, and exploit social media algorithms, to spread relevant content. Malign actors may “troll,” harass, or provoke unwitting users to promote sustained content engagement.
  6. Actualization: malign actors achieve their goals, changing the minds of target audiences or even mobilizing them to action.


Each of these stages presents an opportunity to counter disinformation actors. Phoenix Challenge® invites abstracts that focus on any of the relevant elements below, which can help stop or impede adversaries along the different stages of the kill chain.

  • Deterrence. How can we deter malign actors from conducting their disinformation or foreign malign influence operations at the time of their choosing?
  • Detection. How can we better detect false, misleading, inauthentic, or deepfake content, or coordinated inauthentic behavior?
  • Current and Future Capabilities. How can we leverage current and emerging technologies or capabilities, or use them in novel ways, to counter disinformation more effectively? How are adversaries using these emerging technologies and capabilities?
  • Countering Disinformation with OIE. How can we use operations in the information environment (OIE) capabilities to counter disinformation and foreign malign influence? What best practices and lessons learned should be applied now?
  • Strengthening Resilience. How can practitioners legally and effectively strengthen resilience to disinformation by foreign malign actors?
  • Assessments. How can practitioners accurately assess the effect of disinformation on national security? How do we accurately assess the effectiveness of our own counter disinformation efforts?

Continuing the 2024 Global Information Conference theme “from strategy to action”, we are especially interested in tangible or actionable recommendations or solutions. Applicants are strongly encouraged to include vignettes and scenarios of how existing cognitive and technical information capabilities can be employed against adversaries, as well as innovative concepts that reimagine how a current capability could be applied in the future.

Industry presentations should focus on operational constructs and concepts rather than solely on industry products and services.

Submission Guidelines

Tell us what you would like to share in 500 words or less! Abstracts are due by midnight Eastern on December 15, 2024, with notification of acceptance or rejection by January 10.

You can present a technical talk/session (40 minutes total including audience questions) or propose a panel discussion.

Bio about you (1000 characters or less)



Submit Abstract


DoD conference ID #20250302934