Disinformation is a frequently used word. It means information that is untrue, circulated by people who know it to be untrue, with the intention that the information be acted upon.
It has become part of common parlance – frequently used to dismiss information that does not conform to the orthodox narrative, without any explanation of why it is “disinformation”. The nature of the incorrect information is rarely, if ever, unpicked. Indeed, it is rare that the detail of the information is disclosed. Merely an assertion is made that XYZ has circulated disinformation, or that disinformation has emerged from Russian troll farms.
Occasionally there is a closer study of disinformation. The Disinformation Project used to do this but, like so many others, was not specific about the source of the disinformation and failed to analyse in any detail the statements that were alleged to be “disinformational”.
There have been other outlets that have discussed the problem of disinformation but, like the Disinformation Project, they neither identify with any particularity the statements that are disinformational nor identify the source of the statements so that they can be independently verified.
And herein arises the paradox. How can an independent reader make a judgement about whether or not a statement is disinformational if he or she cannot assess and evaluate the material personally? To make things worse, if the person alleging that material is disinformation fails to identify or make available its source so that it can be independently verified, must we take the allegation on trust?
Because these allegations so often arise in mainstream media, whose trust levels in the minds of many citizens are not high, there could justifiably be some concern that allegations of unspecified and unsourced disinformation amount to a form of disinformation – or at best misinformation – themselves. And so, like the Worm Ouroboros, the paradox is circular and it seems insoluble.
In this article I shall discuss a recent piece that appeared in the NZ Herald authored by David Fisher, a senior and respected journalist. This article demonstrates the paradox which could have been avoided with full and frank disclosure.
The article was published in the Herald Online on 15 March 2025 and discusses a Russian disinformation campaign targeting New Zealand through two news sites, including one in te reo Māori.
The campaign, named "Portal Kombat," involves over 100 subdomains worldwide aimed at dividing communities. Experts believe the network uses AI translations and targets countries ahead of elections. The content on these sites includes material from Russia’s state-owned Sputnik news agency and other sources, promoting pro-Russian narratives and attempting to sow division in Western countries.
The article also highlights differing opinions on the quality of the te reo Māori translations, with some experts suggesting AI involvement and others believing a native speaker was involved.
The campaign is described as a form of "hybrid warfare" intended to weaken the West and portray Russia as a victim.
This is a very general summary of Fisher’s article but the assertions need to be carefully examined.
The Portal Kombat Campaign
The article states that the 'Portal Kombat' campaign is a global disinformation effort involving more than 100 subdomains worldwide, aimed at dividing communities.
It uses Russian media brand Pravda to promote country-specific news sites, including two in New Zealand, one of which is in te reo Māori.
The campaign aggregates Russian media and social media feeds, presenting them with a favourable push on issues such as the war in Ukraine.
It targets countries ahead of elections, using AI translations to reach a wider audience and circumvent sanctions. The campaign was first exposed by the French government agency VIGINUM and is operated by a company called TigerWeb based in Crimea.
Who is VIGINUM?
Fisher’s article does not develop any information about VIGINUM but some research on the matter has been fruitful. VIGINUM, as Fisher says, is a French government organisation.
Its full name is the Service for Vigilance and Protection against Foreign Digital Interference and some brief background information can be found here. It was created on 13 July 2021 and is attached to the SGDSN (General Secretariat for Defence and National Security). It is tasked with protecting France and its interests against foreign interference. More detail about VIGINUM can be found here; although it is in French, it can be translated.
On 5 February 2025 VIGINUM published a report on egregious examples of information manipulation seen in the 2024 Romanian presidential election. The report analyzes the methods used on TikTok to artificially promote certain content and exploit influencers. It assesses the risk of such methods being used in France.
The report is available here; it is in French but can be translated.
The report provides information on the manipulation of information targeting the 2024 Romanian presidential election, the results of the first round of which were annulled by the Romanian Constitutional Court on December 6.
The main elements presented come from an analysis carried out between November 25 and December 20, 2024, which focuses in particular on the operating methods observed on the TikTok platform, intended to artificially promote certain content, as well as the instrumentalization of influencers. It then assesses the risk of their transposition in France.
The description of the manipulation observed on TikTok in Romania is based primarily on third-party analysis available in open sources. This includes, in particular, declassified notes shared by the Romanian administration, reports from observers and organizations specialized in combating disinformation, reports provided by TikTok, as well as investigations by Romanian and French media.
This report also includes additional technical information from VIGINUM's investigations into organizations that contacted influencers covertly. It does not in any way indicate foreign digital interference.
By publishing the information contained in this report, VIGINUM aims to alert Internet users to the risk of manipulation of content recommendation systems on platforms. The service also aims to raise awareness among content creators with a large online community about the risks of exploitation to which they could be subjected by malicious actors.
The term “disinformation” is used in the document on a number of occasions. For example, it mentions "structures spécialisées dans la lutte contre la désinformation" (organisations specialised in combating disinformation) and "les rapports fournis par TikTok, ainsi que des enquêtes de médias roumains et français" (the reports provided by TikTok, as well as investigations by Romanian and French media).
Additionally, it refers to "la lutte contre la désinformation" (the fight against disinformation) and "produisant de la mésinformation politique de masse" (producing mass political misinformation).
The paper also addresses disinformation risks in the following ways:
1. Detection and Characterization: VIGINUM's mission includes detecting and characterizing operations of foreign digital interference by analyzing publicly accessible content on platforms and online media.
2. Documentation and Anticipation: VIGINUM documents operational modes observed in other contexts to anticipate threats and protect the French public debate from information manipulation involving foreign actors.
3. Sensitization: VIGINUM aims to alert internet users about the risks of manipulation of content recommendation systems on platforms. It also seeks to raise awareness among content creators with large online communities about the risks of being instrumentalized by malicious actors.
4. Technical Investigations: The service conducts technical investigations into structures that have approached influencers non-transparently.
5. Publication of Reports: By publishing reports, VIGINUM informs the public about specific cases of information manipulation, such as the one targeting the Romanian presidential election, and evaluates the risk of similar operations being transposed to France.
6. Collaboration with Authorities: VIGINUM works in collaboration with other authorities and platforms to address disinformation, as seen in the Romanian case where TikTok and other entities were involved.
These efforts collectively aim to protect the integrity of the public debate and prevent foreign digital interference.
On 7 February 2025 VIGINUM published a report entitled “Challenges and opportunities of artificial intelligence in the fight against information manipulation”.
The report, with contributions from various international entities, addresses the challenges and opportunities of artificial intelligence (AI) in combating information manipulation.
It explores how AI technologies, particularly generative AI (GenAI), are used by foreign actors to manipulate information and the potential impact on public perception and democracy.
The report also highlights the use of AI to enhance defences against such manipulations, aiming to raise public awareness, share best practices internationally, and encourage cooperation among institutional players, civil society, academia, and private sectors to develop innovative solutions.
The principal objectives of the report are threefold:
1. Public Information and Awareness: To raise the level of knowledge about cases of malicious use of AI for information manipulation.
2. Highlighting Opportunities: To showcase the opportunities offered by AI in the fight against information manipulation and promote the international sharing of best practices.
3. Encouraging Cooperation: To foster cooperation between institutional players, civil society, academia, and private actors to accelerate the development of innovative solutions for combating information manipulation.
Whilst it is not expected that a newspaper report would go into the level of detail exemplified by the VIGINUM reports, nevertheless these reports provide concrete examples and cross references to the examples of information manipulation referred to. Thus it is possible for the reader to verify the claims and make an assessment of the reliability of the findings of the papers.
VIGINUM and Portal Kombat
On 12 February 2024 VIGINUM published the first results of its investigations into the "Portal Kombat" network, a vast scheme made up of 193 digital "information portals" with similar characteristics, which disseminates pro-Russian content and targets several Western countries including France. Part 2 of the Report followed on 14 February 2025.
What is Tigerweb?
Established in 2015, TigerWeb is a web development company whose founder, Yevgeny Shevchenko, has been developing and maintaining websites since at least 2013.
VIGINUM also noted that some of the operating methods and content disseminated have strong similarities with those of the Inforos network, which has been subject to EU sanctions since July 2023, making it possible to infer that TigerWeb could be serving as a service provider for Russian influence operators.
The InfoRos Connection
Research over a number of years reveals that InfoRos wove a large web of sites designed to support the Russian government’s strategic interests, and regularly evolved its objectives, its discourse, and its methods of influence.
Despite the numerous publications on its malicious activities, the Russian agency currently continues to register domain names and to exploit social networks to manipulate the narrative in the media, more particularly by trying to discredit the Kyiv government and the assistance it receives from Western countries.
A report by French organisation OpenFacto from January 2023 details InfoRos’ efforts to spread patriotic and anti-Western rhetoric through a network of Russian-speaking websites.
The report covers various aspects of InfoRos' operations, including its influence in the post-Soviet space (Ukraine, Georgia, Transnistria), its diplomatic efforts, security issues, and commercial activities.
It also discusses the agency's reaction to investigations, its methods of influence, and the evolution of its strategies and tactics over the years. It includes a detailed analysis of InfoRos' domain names, the geographical distribution of its local portals, and its involvement in commercial contracts. Fifteen domain names are identified as examples and, in addition, the report mentions that InfoRos has registered nearly 600 new portals since 2000, bringing the total number of domain names linked to the agency – as at January 2023 – to 1,945.
Identifying Information Manipulation and Bad Actors
The French Government released a press kit entitled “Russian Disinformation: The Better We Know It, the Better We Can Respond”.
The document discusses the use of disinformation as a weapon in international conflicts, focusing specifically on Russian disinformation campaigns. It highlights various incidents and strategies employed by Russia to manipulate information, sow doubt, and create tensions in societies, particularly in France and other European countries.
It details France's response to these threats, including national and international cooperation, specialized agencies, and task forces. The press kit also emphasizes the importance of understanding and publicizing disinformation campaigns to effectively counter them.
The press kit also provides specific examples of Russian disinformation campaigns, such as:
Stars of David Graffiti in Paris: In November 2023, Russia used the RRN/Doppelgänger network to spread and amplify images of Stars of David tagged on walls in Paris. This campaign aimed to stir up hatred and create tensions in French society.
RRN (Reliable Recent News) Campaign: This campaign aimed to discredit Western support for Ukraine through various methods:
Dissemination of pro-Russian content criticizing Ukrainian leaders.
Spoofing websites of news outlets and government entities using "typosquatting."
Creation of French-language news websites sharing controversial content.
Use of fake websites and social media accounts to spread content.
Portal Kombat Network: Between September and December 2023, VIGINUM analyzed a network of "digital news sites" disseminating pro-Russian content. This network, initially covering local news in Russia and Ukraine, expanded to Western countries like France, Germany, and Poland. The sites relay content from social media accounts of Russian actors, Russian news agencies, and official websites, aiming to present Russia's actions positively and denigrate Ukraine.
But as well as identifying the campaigns the press kit details the steps that were taken in the campaign. For example, the Stars of David Incident unfolded as follows:
Initial Posting: Photos of Stars of David graffiti in the 10th arrondissement of Paris were first authentically posted on X (formerly Twitter) on 30 October 2023 at 7:37 PM.
Amplification by Bots: VIGINUM, France's technical and operational agency responsible for monitoring and protecting against foreign interference online, detected the involvement of a network of 1,095 bots on X. These bots made 2,589 posts amplifying the controversy surrounding the Stars of David graffiti.
Early Posting by Bots: VIGINUM found that the photos were first posted on the RRN bot network at 7:24 PM on 28 October 2023, 48 hours before the authentic posts.
Artificial Amplification: The bots affiliated with the RRN scheme amplified the photos and controversy, redirecting readers to the websites associated with the scheme.
Condemnation: France condemned the involvement of the RRN/Doppelgänger network in artificially amplifying the photos and being the first to post them on social media shortly after the massacres perpetrated by Hamas on 7 October 2023.
This incident reflects Russia's strategy of exploiting international crises to create strategic advantage.
Detail of the RRN Campaign was provided as well.
This information manipulation campaign aimed to discredit Western support for Ukraine.
Known as RRN due to the central role of the so-called Reliable Recent News media outlet, this campaign had four components:
— Dissemination of pro-Russian content related to the war in Ukraine, particularly criticizing the country’s leaders;
— Spoofing websites of news outlets, and government and EU websites, using “typosquatting” to imitate their domain name;
— Creation of French-language news websites sharing controversial content in order to use French national news for their own ends;
— Use of combined inauthentic resources, such as fake websites or social media accounts, to spread content.
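The “typosquatting” mentioned above relies on registering domains that differ from a legitimate one by a single, easily overlooked change. As a minimal illustrative sketch (not VIGINUM's actual detection method, and with all domain names invented for illustration), the kinds of look-alike candidates involved can be generated – and therefore checked for – along these lines:

```python
# Generate simple typosquat candidates for a domain name:
# single-character deletions, adjacent swaps, and common
# look-alike ("homoglyph") substitutions.
# Illustrative only; real campaigns use many more variations.

HOMOGLYPHS = {"o": "0", "l": "1", "i": "1", "e": "3", "m": "rn"}

def typosquat_candidates(domain: str) -> set[str]:
    name, dot, tld = domain.partition(".")
    variants = set()

    # Single-character deletions: "lemonde" -> "lmonde", ...
    for i in range(len(name)):
        variants.add(name[:i] + name[i + 1:])

    # Adjacent-character swaps: "lemonde" -> "elmonde", ...
    for i in range(len(name) - 1):
        variants.add(name[:i] + name[i + 1] + name[i] + name[i + 2:])

    # Homoglyph substitutions: "lemonde" -> "lem0nde", ...
    for i, ch in enumerate(name):
        if ch in HOMOGLYPHS:
            variants.add(name[:i] + HOMOGLYPHS[ch] + name[i + 1:])

    variants.discard(name)  # exclude the genuine name itself
    return {v + dot + tld for v in variants}

candidates = typosquat_candidates("lemonde.fr")
```

A defender can compare newly registered domains against such a candidate list; a reader, conversely, can see how close an imitation domain can sit to the genuine one without being identical.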
To do so, the RRN campaign used a series of inauthentic narratives with four main themes.
They aimed to sow division and artificially arouse mistrust between civil society and its leaders:
— The alleged ineffectiveness of sanctions targeting Russia, which are allegedly negatively impacting European States and citizens the most;
— The alleged Russophobia of Western countries;
— The alleged barbaric acts committed by the Ukrainian armed forces, and the neo-Nazi ideology that is supposedly rampant among Ukrainian leaders;
— The negative effects that Ukrainian refugees are allegedly having on European States.
Some 355 domain names imitating media outlets were detected by VIGINUM, four of them targeting French speakers more specifically and copying the graphic identity of the French daily newspapers 20 Minutes, Le Monde, Le Parisien and Le Figaro. At least 58 articles were published via these channels.
In the course of its open source investigation, VIGINUM detected the involvement of Russian or Russian-speaking individuals and several Russian companies.
In late May 2023, the RRN campaign went further than ever before in spoofing the website of the French Ministry for Europe and Foreign Affairs.
Bringing the Information Together
The information that I have provided regarding the activities of the French in dealing with information manipulation started with signposts in Mr Fisher’s article; some research rapidly enabled me to locate official publications and details of the various campaigns that have been monitored and that have targeted France.
What is significant is the amount of detail provided about the campaigns, the nature of the information manipulation, the methodology of the various bad actors, and the sources and means by which the campaigns were identified. Although the campaigns are summarised to a degree, it is easy to trace back and obtain the primary information that exemplifies information manipulation or disinformation.
Mr Fisher’s article, by contrast, skates over the surface of the disinformation campaigns it discusses. Certainly, it identifies the Russian actors involved, details of which I have discussed above. But as to the location of the manipulated information and a summary of what is alleged, there is little if any detail, and no way in which the information can be verified.
For example, early on in the article the following statement is made:
“Some material appears to have been lifted from a Pasifika social media channel, in which China is highlighted as serving the Pacific better than New Zealand.
It includes the statement: “Beijing secures Pacific supply chains. New Zealand secures… Five Eyes surveillance for the CIA?”
The following questions arise:
- What was the New Zealand site referred to?
- What was the “address” of that site?
- What was the material lifted?
- What was the social media channel that was allegedly the source of the “lifted” material?
- What was the full text of the statement about supply chains and Five Eyes, so that context can be properly assessed?
The answers to these questions would give a reader more granular information about the nature of the information manipulation, because the partial misuse of a quotation would seem to be the issue here, rather than outright disinformation.
Mr Fisher then goes on to state:
“Academics have told the Herald it appears to be a modern version of classic Russian disinformation campaigns aimed at seeding division among communities in Western countries.”
Which academics? They should have been named. Or has their expertise been questioned or discredited?
The article makes reference to DRF Lab but does not explain the acronym. In fact, what is referred to is the DFR Lab – the Digital Forensic Research Lab at the Atlantic Council. The Atlantic Council, which had its origins in 1961, states that it
“promotes constructive leadership and engagement in international affairs based on the Atlantic Community’s central role in meeting global challenges. The Council provides an essential forum for navigating the dramatic economic and political changes defining the twenty-first century by informing and galvanizing its uniquely influential network of global leaders. The Atlantic Council—through the papers it publishes, the ideas it generates, the future leaders it develops, and the communities it builds—shapes policy choices and strategies to create a more free, secure, and prosperous world.”
Mr Fisher then goes on to refer to two NZ sites involved in an information campaign. He states:
“The two New Zealand sites are among 140 country-specific domains producing what DRF (sic) Lab called an “information campaign”, with sites emerging in a number of countries ahead of elections.”
Again, more questions:
- What are the NZ sites?
- What are their URLs or addresses?
- What are some examples of the other “140 country-specific domains”?
Without this information the reader cannot cross check the allegations.
The article then goes on to discuss the use of te reo on the sites and whether this was generated by a fluent te reo speaker or by artificial intelligence. A Wellington academic – Dr Michael Daub – comments on the use of AI by Russian campaigns.
Dr Daub said that
“the best response was information literacy - schooling the wider population on recognising misinformation and disinformation online.”
Jonathan Ayling of the Free Speech Union endorses this view. (More on this later).
Mr Fisher then cites Professor Stephen Hoadley, who discusses the nature of Russian information manipulation and notes that the use of “local domains likely reflected a belief people in New Zealand would be more likely to accept and believe content that appeared local”.
Finally, the article cites a source from the New Zealand Security Intelligence Service (NZSIS) commenting on how states, including Russia, use misinformation and disinformation to achieve strategic advantage. The comment is made:
“Foreign interference is an act by a foreign state, often through a proxy, which is intended to influence, disrupt or subvert New Zealand’s national interests by deceptive, corruptive or coercive means.”
Information Literacy
From a general point of view, Mr Fisher’s article makes some important points. It is part of Russia’s strategy to use information manipulation, misinformation and disinformation to sow seeds of discord and to obtain a long-term strategic advantage.
These efforts have been well publicised in Europe, where perhaps the Russian threat is more immediate than in New Zealand. But the material from Europe is far more specific, detailed and, importantly, verifiable than the information contained in Mr Fisher’s article.
Arising from that is the issue of information literacy. The public need to be educated in recognising information manipulation campaigns. But the question that flows from that is this: how is a citizen to recognise information manipulation if it is neither pointed out nor made clear?
And therein lies the disinformation paradox. It resembles Donald Rumsfeld’s famous “unknown unknowns” paradox, stated at a Department of Defense briefing on 12 February 2002.
“..as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don't know we don't know.”
Similarly – how do we know or identify information manipulation if we do not know what it is and have not had it exemplified?
Ideally we should be given examples of the statement and why it is that it amounts to information manipulation.
Ideally we should be advised of the suspect websites, domain names, TikTok or Telegram channels so we can seek out examples of misinformation and undertake our own fact checking or verification.
But as matters stand, despite the calls for information literacy, the building blocks that underpin such a skill are denied us. It would have been helpful if Mr Fisher had provided concrete data enabling readers to see and understand why the material said to be disinformation or information manipulation was just that.
Otherwise the paradox rules. We can’t identify information manipulation because we don’t know what it looks like – it is an unknown unknown.
And sadly, incomplete information, such as that contained in Mr Fisher’s article, does not assist in resolving the problem. Hypertext links, specific references, identification of sources, and examples of information manipulation would have given the article substance and the weight of reliability. But that information was missing.
And sadly, in these troubled times of information manipulation it is not enough to say “trust me.” People demand more substance than that.