Russian intelligence agencies have begun employing deepfake technology to fabricate evidence aimed at discrediting foreign politicians. The Robert Lansing Institute reported on September 9, 2025, that Moscow is using the Turkish newspaper Aydınlık to target U.S. Senator Lindsey Graham through fabricated videos. The operation aims to undermine American support for Ukraine ahead of Senate elections by fabricating footage of supposed negotiations between Ukrainian officials and Graham. The article begins:
Russian intelligence agencies responsible for influence operations have begun to employ artificial intelligence, particularly deepfake technology, to fabricate “evidence” aimed at discrediting foreign politicians. One recent example appeared in the Turkish newspaper Aydınlık, known for its sensationalist leanings. The outlet published a video purporting to show Andriy Yermak, the head of Ukraine’s presidential office, Ukraine’s commander-in-chief Oleksandr Syrsky, and U.S. Senator Lindsey Graham. Even Aydınlık acknowledged that the authenticity of the footage could not be confirmed.
Key Points
- The fabricated video bears hallmarks of Russian disinformation, including the absence of metadata, Russian-language interface elements, unnatural facial movements, and delayed eye-mouth synchronization typical of AI manipulation (a minimal metadata check is sketched after these points).
- Aydınlık newspaper is closely linked to the Vatan Party of Doğu Perinçek, a long-time advocate of “Eurasianism” who has appeared on Russian media platforms and participated in nationalist forums in Moscow.
- This continues a pattern from 2023, when Russian state media circulated a manipulated video of Graham’s meeting with Zelensky, splicing his words to suggest he said “the Russians are dying — the best money we’ve ever spent”.
- The journalist behind the piece, Yiğit Saner, attended the International Russophile Movement Congress in Moscow in 2024 and later visited occupied Mariupol, with his reporting frequently amplified by Russian state media.
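To illustrate the first hallmark above, the absence of container metadata, the following is a minimal sketch of how such a check might be scripted. It is not drawn from the Robert Lansing Institute report: it assumes FFmpeg’s ffprobe tool is installed, and the file name `suspect_clip.mp4` is a placeholder rather than the actual footage.

```python
# Minimal sketch: flag a video whose container carries no creation/encoder
# metadata, one of the deepfake hallmarks listed above. Assumes FFmpeg's
# ffprobe is on PATH; the file name is a placeholder, not the actual clip.
import json
import subprocess
import sys

COMMON_TAGS = ("creation_time", "encoder", "com.apple.quicktime.make")


def probe(path: str) -> dict:
    """Return ffprobe's JSON description of the file's format and streams."""
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)


def report(path: str) -> None:
    info = probe(path)
    tags = info.get("format", {}).get("tags", {})
    present = [t for t in COMMON_TAGS if t in tags]
    if not tags:
        print(f"{path}: container carries no metadata tags at all")
    elif not present:
        print(f"{path}: no creation/encoder tags found; tags present: {sorted(tags)}")
    else:
        for t in present:
            print(f"{path}: {t} = {tags[t]}")


if __name__ == "__main__":
    report(sys.argv[1] if len(sys.argv) > 1 else "suspect_clip.mp4")
```

A stripped container by itself proves little, since re-encoding by messaging apps and social platforms also removes tags, which is why the report lists missing metadata alongside visual cues such as unnatural facial movement and delayed eye-mouth synchronization.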
Deepfakes and Influence Operations: A Rising Tool of Disinformation
Deepfake technology has become a flexible weapon in global influence operations, undermining public trust in institutions and reshaping political discourse. Russian operators recently spoofed Western outlets with AI-generated articles and synthetic celebrity voice-overs to influence audiences around major international events. A coordinated strategy known as Operation Undercut used fabricated video and audio deepfakes to erode Western support for Ukraine by mimicking credible news sources.
In Asia, manipulated clips circulated in the Philippine electoral context to sway voter sentiment. Even European politicians were duped by a staged video call impersonating Navalny’s chief of staff, illustrating how synthetic media can infiltrate diplomacy. Beyond these cases, researchers warn that AI-enabled influence operations are becoming scalable across elections worldwide, with models capable of generating persuasive text, images, and videos tailored to local contexts.
Campaigns in India already demonstrate how millions of deepfakes can flood social platforms to distort political discourse, while U.S. analysts point to the risk of foreign actors weaponizing such tools in the 2024 election cycle. Together, these developments reveal how authoritarian and opportunistic actors are embedding synthetic media into disinformation arsenals to fracture democratic processes, polarize societies, and undermine media credibility.
External References:
- AI-Enabled Influence Operations: Safeguarding Future Elections
- Indian Voters Are Being Bombarded With Millions of Deepfakes. Political Candidates Approve
- Foreign Influence Operations in the 2024 Elections
Disclaimer
The Global Influence Operations Report (GIOR) employs AI throughout the posting process, including generating summaries of news items, the introduction, key points, and often the “context” section. We recommend verifying all information before use. Additionally, images are AI-generated and intended solely for illustrative purposes. While they represent the events or individuals discussed, they should not be interpreted as real-world photography.