The conversation with Elina Lange-Ionatamishvili (Senior Expert at NATO StratCom COE) touched on issues related to researching and countering disinformation. As the interviewee of Piotr Jaszczuk noted, today even a small number of people can influence the spread of disinformation on a massive scale.
An example was given of a case described by our editorial staff, in which someone created a fake Facebook profile of the former Chief of the General Staff of the Polish Armed Forces, Gen. Jarosław Mika.
"One of the bigger problems observed during this incident was that some active soldiers, generals, journalists, experts and politicians decided that the account was real and accepted friend requests sent from this account," said Jaszczuk.
"Another issue was Facebook's reaction. A couple of weeks have passed before it has decided to delete the fake profile" – he continued.
According to both Lange-Ionatamishvili and Jaszczuk, this case shows that training on information verification should be held regularly and that it ought to be addressed not only to commanders but also to soldiers. The Command of the Polish Cyberspace Defense Forces was given as an example, as it also provides substantial training to its civilian staff.
Here it is worth mentioning the example described by CNN, which looked at an experiment conducted by the NATO Strategic Communications Centre of Excellence. These studies showed that platforms such as Facebook or Instagram could be used to effectively influence the behaviour and thinking of soldiers.
"The details that we have discovered enabled us to instill undesirable behaviour during training" - the researchers said.
The conversation also concerned NATO and how its command can deal with the problem of deepfakes. Latvia was given as an example: it has difficulty reaching the large Russian minority living there. In that country, the activities of groups operating on Telegram primarily affect younger audiences and significantly hinder the fight against disinformation. Following the closure of Russian TV broadcasts in Latvia due to the war in Ukraine, older-generation Russian speakers are also moving to social networks in search of information. That is 'territory' where traditional media do not have sufficient reach.
"The problem is that state institutions and national traditional media have less influence on this audience," said Lange-Ionatamishvili, "and Russia is reaching them via TikTok and Telegram, sometimes luring audiences from open TikTok channels to closed Telegram groups".
Furthermore, she drew attention to the language used by individual governments towards their citizens. In her opinion, this language is often "distanced from citizens, and this makes it easier for populists to reach them with their message." This in turn exacerbates the problem of disinformation.
"Populists use normal, understandable, emotional language. Governments often have little understanding of their audience. Polls don't tell everything. We need more in-depth, qualitative studies," - she added.
How to prepare for the election?
The conversation also touched on the topic of the Polish parliamentary election. Is there a golden rule about what needs to be done to ensure that disinformation has as little impact on the election as possible?
According to the representative of the NATO expert centre, a good solution is cooperation between governments and the media to prepare the public for possible problems. The Scandinavian countries were given as an example here: their pre-emptive decision reached the Kremlin and discouraged attempts to interfere in electoral processes.
"The society was prepared, so possible large expenditures by Russia could have little effect" - we heard from the representative.
"The basic and first step that can be taken is to make the public aware that someone will try to spread disinformation. This has also been recognised by the U.S. defence establishment as an important step to discourage hostile actors: speak publicly about the possible threats," she said, also discussing other ideas to fight fake news.
Among the actions that can be taken to combat disinformation, there are, for example, public campaigns aimed at both traditional media and the internet. It is also important to show a wider audience how deepfakes work and how to identify them.
Interestingly, hackathons organized by the NATO StratCom COE can also contribute to the fight against disinformation.
"Their winners are later engaged, for example, in creating educational games aimed at training in fact-checking. The so-called "fake news games" (https://thenewshero.org/) can later find their way to the wider community," she said.
The difficulty of fighting disinformation
In the opinion of Lange-Ionatamishvili, the fight against rampant disinformation is difficult for another reason.
"Even a simple manipulation, not necessarily a deepfake, if it touches our emotions, if it grabs our attention, it can reach us. Our eyes and ears are something we trust strongly (...)" – said the interviewee of Piotr Jaszczuk.
"The goal of those creating disinformation is not only to make fun of us, but also to scare us and, above all, to sow doubt in us about what is true and what is not" - she concluded.
Michał Górski / in cooperation with Piotr Jaszczuk