Poland was hit by an unprecedented disinformation attack following a Russian drone incursion on September 10: a wave of approximately 200,000 social media messages originating from Russian and Belarusian accounts blamed Ukraine or NATO for the incident. On 8 October 2025, Le Monde reported that the provocation, during which around 20 Russian drones violated Polish airspace, was followed by what cybersecurity experts described as a tsunami of disinformation coordinated with the military action. The article begins:
Cybersecurity and disinformation experts in Poland choose their words carefully when they speak of a state of war. The war waged by Russia and Belarus against the European Union and North Atlantic Treaty Organization began in 2014 with the invasion of Crimea and the Donbas. The conflict, with its own history and distinct phases, escalated during the night of September 10 to 11, when around 20 Russian drones violated Polish airspace. NATO, for the first time since its founding in 1949, was forced to open fire on enemy flying objects in European airspace. That night, the Polish internet exploded. Many experts described it as a tsunami of disinformation.
Read more: https://www.lemonde.fr/en/international/article/2025/10/08/poland-hit-by-unprecedented-disinformation-attack-following-russian-drone-incursion_6746208_4.html [paywall]
Key Points
- Michal Fedorowicz, president of the Res Futura collective, said experts analyzed around 200,000 mentions spreading the Russian narrative over the course of that night, equivalent to 200 to 300 mentions per minute.
- Of all the comments analyzed by Res Futura, 38 percent blamed Ukrainians for the incident, 34 percent blamed Russians, and a significant share blamed NATO, with traces of the influence operation found in France, Germany, and Romania.
- The Polish ministry in charge of digital affairs issued a statement raising the alarm about the disinformation wave as early as the morning of September 11, while NASK, the national institute responsible for civilian cybersecurity, quickly released a report analyzing the malicious posts.
- Filip Glowacz of NASK said that despite requests to platforms such as Facebook, X, and TikTok, it is difficult to get them to remove fake content because doing so would cut into their business model.
Russian Influence Operations in Poland: From Ghostwriter to 2025 Election Interference
Russian influence operations in Poland have evolved from early cyber-enabled campaigns targeting NATO narratives into sophisticated hybrid warfare aimed at disrupting democratic processes. The Ghostwriter campaign targeted audiences in Lithuania, Latvia, and Poland with narratives critical of NATO’s presence in Eastern Europe, using spear-phishing attacks against government, military, and media organizations for credential harvesting and malware delivery. Active since at least 2017, the operation later expanded beyond anti-NATO messaging, compromising the social media accounts of Polish officials to create domestic political disruption. Technical evidence linked the campaign to UNC1151, a suspected state-sponsored cyber espionage group, with attributions pointing to both Russian and Belarusian involvement.
Beyond cyber operations, Russia has employed diverse influence tactics. Entities associated with Russian operative Yevgeny Prigozhin convened a Baltic Sea Region Strategic Dialogue conference in Berlin that addressed economic and environmental issues affecting Poland and neighboring states, an example of “textbook tradecraft”: starting with politically uncontroversial topics before advancing Kremlin interests. Similarly, Belarus’s KGB used fake accounts posing as journalists and activists to inflame tensions over the migrant crisis at the Belarus-Poland border, prompting Meta to remove 41 Facebook accounts that used AI-generated profile pictures. Polish authorities described the migrant influx as low-intensity hybrid warfare, demonstrating coordinated pressure from Russia and its ally.
During the 2025 presidential election, Polish intelligence documented over 50 incidents linked to Russian interference targeting the May 18 vote, with Interior Minister Tomasz Siemoniak warning of online manipulation aimed at undermining trust in the government. Operations sought to inflame tensions between Polish citizens and Ukrainian refugees, while 22 Polish-language Telegram channels with over 150,000 subscribers spread pro-Kremlin propaganda, including staged videos and false claims denying documented war crimes. These channels replicated content from the banned Russian state media outlets RT and Sputnik, and several were operated by former Ukrainian security officers collaborating with Russia.
Despite the scale of Russian efforts, most Kremlin operations failed to resonate with Polish voters: GLOBSEC polling showed that 86% of Poles considered Russia a threat. Operation Doppelganger bot networks gained little traction in a country where 92% of Poles supported increased defense, and the pro-Putin candidate Maciej Maciak secured only 0.19% of the vote despite extensive Russian state media coverage. Remarkably, post-election analysis found that interference attempts were significantly more limited than government warnings had suggested, with experts noting the absence of the aggressive tactics seen elsewhere. Poland’s diversified social media landscape and “election umbrella” strategy proved effective in mitigating foreign interference.
However, nationalist candidate Karol Nawrocki’s narrow victory was aided by foreign-funded Facebook ads and TikTok manipulation involving over 2,400 fake accounts, a result that has created institutional paralysis and threatens democratic reforms.
External References:
— The Kremlin’s Double Game: Russian Attempts to Influence Poland’s 2025 Election — GLOBSEC
— Ghostwriter Update: Cyber Espionage Group UNC1151 Likely Conducts Ghostwriter Influence Activity — Mandiant
— Poland fights digital interference ahead of final round of presidential vote — France 24
The Global Influence Operations Report (GIOR) utilizes AI throughout the posting process, including the generation of summaries for news items, introductions, key points, and, often, the “context” section. We recommend verifying all information before use. Additionally, all images are generated using AI and are intended solely for illustrative purposes. While they represent the events or individuals discussed, they should not be interpreted as real-world photography.