Russia | September 1, 2025, 7:22 am

Russian AI-Generated Fake News Spoofs US Media

A Russian fake news operation has escalated its disinformation campaign by impersonating major news outlets with AI-generated content. On August 17, 2025, Politico reported that the pro-Russian propaganda group Storm-1679 has been spoofing ABC News, BBC, and POLITICO using deepfake technology, including a fake Netflix documentary narrated by an AI-generated Tom Cruise voice that targeted the 2024 Paris Olympics. The article begins:

A pro-Russian propaganda group is taking advantage of high-profile news events to spread disinformation, and it’s spoofing reputable organizations — including news outlets, nonprofits and government agencies — to do so. According to misinformation tracker NewsGuard, the campaign — which has been tracked by Microsoft’s Threat Analysis Center as Storm-1679 since at least 2022 — takes advantage of high-profile events to pump out fabricated content from various publications, including ABC News, BBC and most recently POLITICO.

Read more: https://www.politico.com/news/2025/08/17/russia-fake-news-content-00175112

Key Points

  • The operation achieved viral success with a fake E! News video claiming USAID paid celebrities to visit Ukraine, which Donald Trump Jr. and Elon Musk shared to millions of followers before it was debunked.
  • Storm-1679 combines AI-generated audio impersonations of celebrity voices with fake video content, targeting high-profile events like elections, sporting events, and wars to maximize impact.
  • The campaign focuses on flooding the internet with pro-Kremlin content around German snap elections, Moldovan parliamentary votes, and Ukraine war narratives ahead of Trump-Putin meetings.
  • The Trump administration has scaled back federal agencies fighting disinformation, shuttering the State Department’s Counter Foreign Information Manipulation office and halting CISA’s domestic misinformation efforts.

Russian Fake News & AI Disinformation Campaigns Target Western Democracies

Russian influence operations have become increasingly sophisticated in their efforts to destabilize Western democracies and polarize public opinion, blending ideological messaging, covert funding, and advanced digital tactics. Leaked documents expose systematic Moscow-financed media fronts across Europe and the Balkans, supporting a vast ecosystem of pro-Kremlin outlets and influence campaigns that amplify divisive narratives and undercut support for Ukraine.

These efforts are now intensified by AI-driven operations such as “Operation Undercut,” which deploys AI to mimic news organizations and flood social media with hyper-targeted disinformation, exploiting debates on Ukraine, the Middle East, and even U.S. elections to deepen societal fractures. German intelligence has also documented how Russia coordinates disinformation to undermine elections, employing a four-phase model that includes cloning reputable news sites, recruiting influencers, and circulating AI-manipulated videos in an attempt to erode trust in democratic institutions, amplify divisions, and ultimately influence policy decisions.

Campaigns now use AI-generated posts to target specific linguistic communities, such as French speakers in Africa, with tailored disinformation delivered through social media and messaging apps. Consumer-grade AI tools have enabled the mass production of convincing fake images, videos, and counterfeit news sites, dramatically increasing the scale and believability of fabricated narratives. Russian operations have also deployed networks of AI-generated social media accounts in Western countries to automatically spread divisive messages and manufacture counterfeit grassroots support for pro-Kremlin positions.

While the immediate effectiveness of these campaigns can be limited, their cumulative effect — spreading confusion, polarizing societies, and weakening democratic resilience — aligns with Russia’s long-term ambition to reshape the global information order.

External References:

  1. Russia targeted French speakers in Africa with AI-generated posts, says French watchdog
  2. A Pro-Russia Disinformation Campaign Is Using Free AI Tools
  3. A Russian Bot Farm Used AI to Lie to Americans. What Now?

Disclaimer

The Global Influence Operations Report (GIOR) employs AI throughout the posting process, including generating summaries of news items, the introduction, key points, and often the “context” section. We recommend verifying all information before use. Additionally, images are AI-generated and intended solely for illustrative purposes. While they represent the events or individuals discussed, they should not be interpreted as real-world photography.