
Can we win the information war?

A conversation with Mattia Caniglia, Roman Osadchuk and Ruslan Trad, disinformation experts with the Atlantic Council’s Digital Forensic Research Lab (DFRLab). Interviewers: Maciej Makulski and Adam Reichardt

April 29, 2023 - Adam Reichardt and Maciej Makulski - Interviews - Issue 2/2023

MACIEJ MAKULSKI: I would like to start by asking you to paint a general picture of where we are regarding this information war and the counter-measures used to address this problem. We decided to go back into history a little and mark Russia’s war against Ukraine as a kind of breakthrough point, although we know that all such lines are a bit artificial. It was interesting for me to think about how, before that, we were rather discussing the problem of post-truth, which was a buzzword at the time. After February 24th 2022, I have the impression that a whole industry fighting disinformation has developed even further; that we are in a different place because the answer to the threats is more systemic, coherent and consistent. So, what I would like to ask you first is how you would characterise the position we are in right now and what major trends are worth discussing?

ROMAN OSADCHUK: If you take 2013 and 2014 as a starting point, a lot of things have changed, but in essence some things have not. In the Ukrainian context, Russia’s overall message to the world was that Ukraine is not a reliable partner, that you should not have anything to do with Ukrainians, that it is Russia’s sphere of influence and that you should not engage in any way. Basically, this was the message for the West, the United States and all of Ukraine’s allies. I think it is safe to say the Russian narrative was that Ukraine is run by “Satanists”, “Nazis” or anything else you could imagine. All those things began to show up in the information space in 2014 in one way or another. So, in this sense, the overall direction of the messages is much the same. What has changed are the methods and tools they use. They are not just focusing on one blog post; they are simultaneously using Telegram, VKontakte (Russian social media), Facebook, ads, basically everything they can in order to promote their narrative. Secondly, they are borrowing techniques directly from the people who fight disinformation and do fact-checking. The notorious “War on Fakes” initiative, which presents itself as a fight against “disinformation”, was actually created to spread propaganda on behalf of Russia. The material it produces, such as denials of the Bucha massacre or of the bombing of the maternity hospital in Mariupol, was widely amplified by the Kremlin, Russian ministries and embassies. The use of Russian embassies to spread disinformation is another development. If you look at the social media handle of any Russian embassy anywhere in the world, you will be able to find disinformation very quickly.

RUSLAN TRAD: We could speak for hours on this topic, but I think we should mention that some of the trends in disinformation existed well before the first invasion of Ukraine, before 2014. Generally, the field changed with the illegal annexation of Crimea and, after that, with the intervention in Syria. In that context our field has changed and we now see disinformation taking place in other areas like the Balkans and South-Eastern Europe. Some of the narratives we have tracked go all the way back to the first mass protest movement in Ukraine in 2004. They have just changed some keywords, but they are using almost the same tools. Right now, they have recycled some of the approaches they used when supporting forces on the ground. In many cases, spreading disinformation in Europe goes together with forces on the ground. This was the case in Donbas, with the so-called proxy forces. This is one of the first such stories sold by the Russian Federation, that there are “separatists” in Ukraine, which of course is a fairy tale. We know that these forces are directly supported by Russia and that some of the soldiers are even from the Russian Federation. The second story is about mercenaries and how they developed in other regions like Africa and the Middle East, also in the context of Syria, Libya and Ukraine. So, in general, they use pre-existing tools and trends, some even from the Soviet toolbox, but adapted to the current context. This means Telegram, social media and a focus on public opinion in Europe, which is still one of the most important targets of Russian information operations.

MATTIA CANIGLIA: These two answers give a pretty accurate picture. Since 2014, and more recently over the past three years, we have seen fewer big operations and many more smaller ones, across different platforms and using different techniques. Another important trend is that campaigns are sometimes much less sophisticated, which does not necessarily mean they are sloppy. But they are not at the level of sophistication we have seen before, and the narratives that both Roman and Ruslan referred to show that the Russians have not reinvented the wheel when it comes to narratives. They are just reusing them. In the case of Africa, for instance, they have been using the same ones since the 1960s. It is just that the world is back in a place where these narratives work again. In terms of counter-measures, I think there are a lot of illusions because of everything that has been going on around memes, like Saint Javelin or the Ukrainian farmers towing away tanks, and this gives the impression that the Ukrainians are winning the information war. And don’t get me wrong, they do an incredible job. But if we look at the information space more widely, it is a lot more contested. The counter-measures work, but at the same time we are in a position where we need to do more. In Europe you have two distinct sensitivities, and these two sensitivities have resulted in two different outcomes. You have countries like the Baltic states, Poland and the Nordics, which were ready to meet this challenge. And you had other countries that were simply not ready, largely because there is no strategic sensitivity to this, for instance Italy, Spain and Germany. In these countries we see that Russian disinformation campaigns are not always sophisticated, because they do not need to be. In Italy, for instance, politicians go on state television at 8:30pm and simply repeat narratives that are the same as Russian propaganda. It’s that simple. One last thing: I think it is worth stressing what Roman said about the role of the embassies. Disinformation has become a much more official affair than before. Interestingly, this is a trend we also observe with Chinese disinformation operations. This is probably because Russia does not have the resources to play at so many tables at the same time. They try to optimise what they have, and they have been pretty good at that.

ADAM REICHARDT: If I may stick with the counter-measures discussion briefly. Could you explain which counter-measures you think work well in the fight against disinformation? Are you talking about debunking, fact-checking and these kinds of things, or are there other counter-measures that can be taken on an even broader level to fight disinformation?

MC: If you look at examples like the pre-emptive approach, this is what has been working well. In this sense, the part of the work that we do is becoming more and more important, and we can see this when speaking with actors within Europe and elsewhere in the world. Efforts like media literacy training on disinformation are extremely important because they create the first line of defence. All the other things that we can do, like debunking and fact-checking, especially in a crisis situation, are just not enough. Having said that, there are a number of things happening in the European information environment, also in terms of legislation, that will help make our system more resilient to disinformation. But, for me, we should start with key steps like training on media literacy, disinformation and OSINT (open-source intelligence). We need to work towards a world in which, for instance, people know how to spot a bot or a picture generated by artificial intelligence. If you are not able to, you might not be able to be a responsible citizen; and if you are not able to be a responsible citizen, well, we are in danger.

RT: Just to respond to what Mattia was saying, because he has a lot of experience with bureaucratic efforts and meetings with decision-makers, and I am sure he has already seen how difficult it is to organise one united European answer, with all the difficulties, diplomatic problems and even internal sabotage. Unfortunately, in Bulgaria, where I am based, we see that the president is an actual actor in sabotaging these efforts. We conduct workshops and training, but the topics seem far removed from society. Yes, we have some journalists trained in OSINT, but their knowledge is outdated. They are not up to date with the latest trends. And editors and publishers are not trained in content moderation. This is just one story, but last year I realised that I had travelled around Europe ten times, flying from one place to another, to speak and meet with my colleagues. I realised, first of all, that we are a very small group of people. Between us, Mattia, Roman and I probably know almost 100 per cent of the people in this field. This is a problem because I do not see new faces. Second, I realised how much funding is spent on just one person: plane tickets, hotels, meetings, food, etc. Thousands of euros for just one person, and we are already running behind China and Russia. So how do we solve this? Right now we have no quick answers. Mattia is right about the direct efforts we can continue right now, because everything else that has been mentioned is for the long term and we have no time for that.

RO: It is absolutely true that education is important, especially in areas like OSINT and media literacy. But basic education that teaches critical thinking is also a great thing. Of course, the Ukrainian context is slightly different. We got a pretty painful “vaccine”. In 2014 it was quite chaotic, with the annexation of Crimea, the war in eastern Ukraine and all the disinformation that accompanied them. Since then, NGOs and even governmental institutions have begun fighting disinformation. One of them is the Centre for Strategic Communication under the culture ministry. Another is the Centre for Countering Disinformation under the National Security and Defence Council. But even that is not enough, because the wider population is not informed about all these things. There are other initiatives, and teachers are passing this knowledge on to children. But again, it has not reached the whole country. Of course, the invasion has changed a lot. When Russia is attacking you and telling you about chemical weapons or other false pretences to explain the invasion, you can just open your window and hear or see the explosions and air-raid sirens. That is why not many people fell for those messages. So, for Ukrainians there is no question about who is attacking whom. But the avalanche of different messages causes confusion. Preparation and an overall system built through education are important. I know that Finland has great experience with school programmes. They have different courses for different parts of society. In other words, we need a robust approach to build up resilience in the long term.

MM: I would like to highlight this link between OSINT and technology, because I fully agree with you that all those steps that Mattia outlined mean, in short, education. We need education and it is a time-consuming approach, but also a part of this education might be learning new technologies. Can you talk more about the role of new technologies in fighting disinformation?

RT: This is an ongoing question. We do not yet have enough data on how to bring this knowledge first into institutions and then into schools, for example. We have ongoing conversations about the role of artificial intelligence because authoritarian regimes are also using such tools. So it is important to train people on these tools so that they can help spread and support democratic values, and also apply them in the fight against disinformation. This is an ongoing debate. In the Balkans and Central Europe, it will be difficult. Maybe the Czech Republic or Poland are on the right path, but the countries in the region are falling behind many countries in Western Europe. Our school system is a disaster; so yes, many young people already know these trends, but it is useful to think about how we can use these new technologies in real life, and how institutions and experts can implement these tools in practice.

RO: I would add that the technologies are like a game of cat and mouse. The tools to debunk something are created only after somebody has spotted the disinformation. For example, if you want to see who is behind a website, you can look it up in the “Whois” database, but much of that registration data is now hidden behind privacy protection, so the tool does not work as well as it used to. Today there are things like ChatGPT and other AI text-generating technologies, which are also being used to produce disinformation. Of course, there are tools to detect whether a text was created by a machine, but again there are additional AIs that can rewrite the text, making it difficult to detect. This is what I mean when I say it is a constant game, in which we are trying to catch up and we are on the losing side because we are always behind. That is why the main thing we should do is teach a critical approach to the information we consume. The technologies are important and we have to understand how they can be used to create disinformation content. It is a constant struggle.
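To illustrate the kind of lookup Osadchuk mentions, the sketch below (not part of the interview) sends a raw query to a public WHOIS server over the standard protocol on TCP port 43. The domain and server names are only examples; in practice registrant details are often redacted by privacy services, which is exactly the limitation he describes.

```python
# Minimal WHOIS lookup sketch over the standard protocol (TCP port 43).
# Registrant data is frequently redacted, so the response may reveal
# little about who actually operates a site.
import socket


def whois_lookup(domain: str, server: str = "whois.iana.org") -> str:
    """Send a WHOIS query to `server` and return the raw text response."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode())
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode(errors="replace")


if __name__ == "__main__":
    # IANA replies with a referral to the registry responsible for the
    # top-level domain; querying that registry returns the record itself.
    print(whois_lookup("example.com"))
```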

MC: Roman is essentially suggesting a more humanistic approach to these technologies, arguing that we need to teach human beings to exercise critical thinking, which is absolutely right. But I want to add two dimensions to these reflections. The first is how topics change according to technological innovation. Sometimes technological innovation also restricts data access, and that is what we are seeing with Twitter. There is going to be a need to protect that data access and to protect OSINT for what it is. This goes together with how competitive the OSINT environment is becoming. We have witnessed two milestone moments in OSINT in recent history: the war in Syria and the war in Ukraine. In these two moments we have seen how much OSINT can do. But the OSINT done in Syria and in the Russian war against Ukraine was mostly done by journalists, by associations like DFRLab and by organisations like Bellingcat. It is always open source, always meant to be published. Now things are changing because governments understand that OSINT is a powerful tool they can use for intelligence purposes. They are creating agencies and recruiting people to work for the government. That would also mean a paradigm shift, in which the research of OSINT experts would no longer be public but reserved for certain audiences of decision-makers. This is not necessarily a bad or a good thing, but it could definitely change the environment. It also means that access to data may change. Facebook and Twitter are already starting to decide that if you want to access data, you have to pay. Some technologies will become paid because the demand is there, and because people understand the value they will try to monetise data. This could be the start of a new fight while we conduct the other fights we were talking about in the previous question. Technological developments need to preserve access to certain data, and that is the core need we have for conducting our work.

AR: Roman, could you give us your perspective on what disinformation looks like during wartime? How has the field of disinformation changed in the last 12 months in the Ukrainian context?

RO: What actually changed this year is that in the beginning there was quite a lot of chaos and misunderstanding about what was happening, especially in the first days, when people were evacuating and nobody knew for sure what was going on: for example, where missiles were falling or where spies were being caught. There was a lot of disinformation around that. Once things became clearer, the Russians understood that their explanations for the attack were not being accepted. The stories about Russia coming to save Ukrainians from Zelenskyy didn’t work. So they started to discredit the Ukrainian government with a lot of different messages and approaches. They try to promote stories such as the Ukrainian government selling western weapons to African countries. They have tried to discredit Zelenskyy by saying that he fled Ukraine, multiple times; in fact, I have lost count of how many times. They even promoted a story that he was using a green screen while filming his video on Bankova Street in Kyiv. They continue to create divisions, which is what they did even before 2014, largely between eastern and western Ukraine. They try to portray Ukraine as split between different kinds of people. They push the language issue and claim that Russian-speaking people are being attacked, which is not true. They are also spreading false claims about the reasons for the war, trying to discourage any resistance to the Russians.

This works in the physical realm as well. From Belarus they launch aircraft that trigger air-raid alerts across the whole of Ukrainian territory. It is a kind of intimidation: there might be an air strike and there might not be one. They use it constantly. A lot of people stopped reacting to the air-raid alerts simply because there were too many. They also try to blame atrocities on Ukraine’s armed forces, to discredit them as well. Really, a lot has happened in the last 12 months.

AR: Yes, that is a lot. If I could just follow up briefly. I am going to ask a perhaps difficult question, but it is based on your knowledge and your research. As you recall, there was a scandal with Amnesty International, which reported that both sides were committing war crimes. A lot of people claimed that this was disinformation without much evidence. Of course, maybe there are some individual cases, but they were not documented. How did you see this case overall? I know it is sensitive…

RO: It is sensitive. I guess the main reason it was received this way is that it lost all sense of proportion. There is overwhelming evidence of what the Russians have done, and there is some evidence against the Ukrainian side too. Basically, the report was received so badly because it was out of proportion. It presented the cases as if the Ukrainians were the ones doing the horrible things, despite the fact that it is the Russians who are actually erasing cities from the face of the earth. Obviously, that is the main problem. I have not dedicated a lot of time to reading the report myself, because it is not my specialisation, but I think that is why it was received that way. A UN report published months earlier was much more balanced.

MC: I think it is also a problem linked to the news cycle. As Roman said, the report had a problem of proportion, and I agree that was the main issue. But it is also related to the news cycle. When the report came out, the news was not “the Russians are committing human rights violations”; the news was that the Ukrainians are committing them too. It gets hyped up, similar to how conflicts in places like Africa are covered. When French forces strike 15 jihadists and five civilians are caught in the crossfire, the headlines will, understandably, focus on the civilians. That is understandable, but it leaves out the necessary context of fighting terrorism to make the wider community safer. At the same time, the growing evidence of human rights abuses committed by the Wagner Group in the Sahel does not necessarily make headlines. So there is a certain element of double standards in how our media treats this kind of news and information.

MM: My last question might be a little more philosophical, because when we think about war, we usually think about something that has a beginning and an end. So, one big question that we asked ourselves when we decided to prepare this special issue of New Eastern Europe is: can the information war be won? Maybe the maximum we can achieve is managing the risks, and maybe we are making a mistake by calling it an information war because we are sending the wrong signal. But what I would like to ask all of you is: what are your reflections on this and what would victory in the information war look like?

RT: I will be very brief, because I think Roman probably has a lot to say about this. Just one point: believe it or not, the so-called “Protocols of the Elders of Zion” (a fake antisemitic text created in 1903 to popularise belief in a Jewish conspiracy) is still an actual presence in Bulgaria today. Many people still believe in this fabrication created in imperial Russia. When we speak about the information war and how to counter it, I come back to Mattia and Roman’s conclusion that education and critical thinking are key. If a forged document from more than 100 years ago is still considered true, the question of education and institutions becomes very relevant. I think in many cases disinformation is successful because of pre-existing prejudices. This is a question of the principles and values of the audience. That question is bigger than this discussion.

MC: On the information war, I think this is partially our fault. I come from the counter-terrorism domain, and that is exactly what has happened with the “war on terror”. When we needed attention from policymakers and stakeholders on terrorism, we inserted the word “war”. I think the real war is what we are seeing in Ukraine, which has been caused by the Russian invasion. The information war is a part of that context. But outside that I am not sure it is a war. I think we call it this way because we want to grab the attention of policymakers and we want to tell them that this is important, and we have to act on it. There is an old saying that “democracy needs constant vigilance”, and that is what is going to happen with the information environment. We will need constant vigilance and I think what we try to say in different ways today is that we need more people to actually engage in this vigilance and we need to equip them with the right tools for them to do so.

RO: I do not know the perfect term for it. But from what we have seen of Russia’s actions and approaches, it seems that they perceive the information space as an element of war. And this does not just mean Ukraine. As we know, the Russians have been active in many information environments, from the MH17 case in the Netherlands to Brexit and the US elections, especially in 2016. These kinds of things are happening in multiple domains, including cyber. In some sense, it is a constant conflict, a new version of the marketplace of ideas. It is a constant struggle between different contesting ideas, and the information spaces are also very interrelated. Russians may publish something on Telegram and in 20 minutes it will be in Polish on Twitter. In 40 minutes it will be on Facebook in French, and the next day it will be in the mainstream media in Latin America. There are no defined borders, and that is why it feels like a war. That is what Russia is doing.

Regarding your question as to whether it is possible to win: I do not think so. Instead, it is a constant process. What can be done is to provide audiences with knowledge about the risks and about how they might be better prepared. In other words, the education and skills we talked about earlier. But that is an ideal state we cannot fully reach. We cannot say that 100 per cent of the population is already there. That is why, unfortunately, I think it is impossible to say we can win. It is not something that can be won. There might be relative successes and failures, but no definitive win or loss.

Mattia Caniglia is an associate director at the Atlantic Council’s Digital Forensic Research Lab where he coordinates training and capacity-building activities on open-source intelligence techniques, disinformation and digital resilience. Mattia is also an affiliate lecturer at the University of Glasgow and has extensive field experience on the ground in Europe, Africa, the Balkans, and Southeast Asia.

Roman Osadchuk is a research associate for the Eurasia region at the Atlantic Council’s Digital Forensic Research Lab. Roman researches disinformation narratives and the technologies used to spread them in the region. He is interested in information policy and the role of the media cycle in the spread of disinformation.

Ruslan Trad is a resident fellow for security research at the Atlantic Council’s Digital Forensic Research Lab (DFRLab). Ruslan is interested in Eurasia, Syria, conflicts, hybrid warfare and mercenary groups. Before joining DFRLab, he worked as a risk analyst, consultant and freelance journalist.

Maciej Makulski is a contributing editor with New Eastern Europe.

Adam Reichardt is the editor in chief of New Eastern Europe and co-host of the Talk Eastern Europe podcast.

