AI Impersonation of Secretary Marco Rubio Sparks Diplomatic Security Alert
In a startling demonstration of artificial intelligence’s potential for misuse, an unknown individual used AI-generated voice technology to impersonate U.S. Secretary of State Marco Rubio, targeting high-profile officials at home and abroad. The episode unfolded in June 2025 and has since led the U.S. State Department to issue a security advisory to all diplomatic and consular posts worldwide.
How the Deception Unfolded
The impersonator leveraged the encrypted messaging platform Signal to contact three foreign ministers, a U.S. state governor, and a member of Congress. Using an account that displayed the spoofed address Marco.Rubio@state.gov, created in mid-June, the actor sent text messages and left voicemails crafted with AI-generated speech closely mimicking Rubio’s voice.
According to a diplomatic cable first reported by The Washington Post and subsequently confirmed by Reuters, these communications were designed to draw recipients into conversations under false pretenses, possibly to extract sensitive information or gain unauthorized access to their accounts.
- The impersonator left voicemails for at least two targets.
- In one case, a text message invited further communication on Signal.
- Despite these attempts, no breach of State Department IT systems was detected.
State Department’s Response and Ongoing Investigation
The State Department promptly alerted its global network, urging diplomats and external partners to remain vigilant against potential AI-driven impersonations. A senior department official, speaking on condition of anonymity, said there was no evidence of a direct cyber threat to the department but emphasized the risk that targeted individuals could unknowingly expose sensitive information.
“We take our responsibility to protect sensitive information seriously and are continuously enhancing our cybersecurity measures to counter evolving threats,” the official told Reuters.
The Associated Press noted that this specific impersonation attempt was ultimately unsuccessful and not deemed highly sophisticated. Secretary Rubio has not made a public statement regarding the incident.
Broader Concerns About AI in Political Deception
This incident is part of an escalating pattern of AI-facilitated disinformation and fraud that U.S. authorities have begun to address more aggressively. The FBI issued warnings in May 2025 about malicious actors exploiting AI capabilities to mimic government officials’ voices and texts, aiming to infiltrate personal and official accounts for data theft or financial gain.
Earlier cases include a January 2024 robocall that used an AI-cloned voice of President Joe Biden to urge New Hampshire Democrats to skip the state’s presidential primary, an act officials described as a deliberate attempt to undermine the democratic process.
What This Means for Diplomatic Security and Public Trust
The Rubio impersonation episode underscores the urgent need for robust safeguards against AI-enabled social engineering attacks. Experts warn that as artificial intelligence continues to advance, verification protocols for official communications must evolve correspondingly to preserve trust and information security.
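To make the idea of stronger verification concrete, here is a minimal Python sketch, using the open-source pyca/cryptography library, of one widely used building block: digital signatures. The keys, message, and workflow below are illustrative assumptions, not a description of any actual State Department system; the point is simply that a valid signature proves possession of a private key, which a cloned voice or spoofed address cannot fake.

```python
# A minimal sketch of signature-based message verification,
# using the open-source pyca/cryptography library.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical setup: an agency would publish each official's public key
# through a trusted directory. Here we generate a key pair locally for demo.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"Please call me on Signal to discuss the upcoming summit."

# The genuine sender signs the message with a private key only they hold.
signature = private_key.sign(message)

# Any recipient can check the signature against the published public key.
# An impersonator cannot produce a valid signature without the private key.
try:
    public_key.verify(signature, message)
    print("Signature valid: message originated from the key holder.")
except InvalidSignature:
    print("Signature invalid: treat the message as untrusted.")
```

In practice, the hard part is not the cryptography but key distribution and usability: recipients must obtain public keys through a channel the impersonator cannot control, which is precisely where verification protocols for official communications would need to evolve.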
American policymakers face a critical challenge: balancing the innovative benefits of AI technologies with comprehensive strategies to combat their weaponization in diplomacy, election integrity, and national security.
Editor’s Note
This incident raises critical questions about how governments and institutions worldwide can adapt to the rapidly changing technological landscape. While no breach occurred this time, the potential for AI-driven impersonation to cause diplomatic crises or compromise sensitive information is real and growing.
Readers are encouraged to monitor how cybersecurity frameworks and diplomatic protocols evolve in response, particularly regarding AI’s role in misinformation campaigns. The story also invites a broader public conversation about technological literacy and how official communications can be trusted and verified in an era of digital deception.