
“We see you, Russia, and we know how to counter your practices.”

An interview with Martyna Bildziukiewicz, head of the European External Action Service’s East StratCom Task Force. Interviewer: Maciej Makulski.

May 24, 2023 - Maciej Makulski Martyna Bildziukiewicz - Interviews

Martyna Bildziukiewicz. Photo: Priv

MACIEJ MAKULSKI: Five years ago, the European Union adopted the “Action Plan Against Disinformation”, which marked the beginning of a more coherent and consistent response to this threat. How do you assess the general development of the EU in tackling hostile online activities undertaken by Russia and other actors? 

MARTYNA BILDZIUKIEWICZ: Almost five years after the plan was adopted, we can say it was definitely a game-changer. It was the first document ever in which all the major European institutions said we need to work together to tackle the challenge of disinformation. I remember very well how it started, because the document was adopted on my third day at work in the East StratCom Task Force team. I remember the conference of four different commissioners announcing the document. Each of them had a role to play in bringing it into existence.

So the importance of the document cannot be overstated, because it defines the entire work of my team and of other people who contribute to the fight against disinformation in the EU. It was also the first official document that stated that we need to work very closely with social media platforms.

It was the moment at which, we can say, the EU recognised the threat. I think it was very telling that this was done just several months before the European Parliament election of 2019.

Has the Action Plan impacted the work of your team?

It is significant for my daily work. One part of the plan encompasses the EU’s commitment to hiring 27 local “Strategic Communications Officers” in different parts of the world. They work for the EU delegations and help them understand how disinformation works in different countries, and what can be done locally to address the threat accordingly. It of course took a while before those 27 officers were hired, but they already work in three different regions: the Eastern Partnership countries, the Western Balkans, and the MENA countries.

Does this mean that these Strategic Communication Officers are present in each of the targeted countries?

In most cases yes, but there are cases where one officer is responsible for more than one country. They usually come from the country or region they are focusing on. This is important because they speak the language, and they know the local context. The establishment of the officer network was the major development as far as the Action Plan is concerned.

Was the Action Plan followed by other major developments at the EU level?

There were other documents worth mentioning here. One of them was the “European Democracy Action Plan” promoted mostly by Commissioner Věra Jourová. It also deals with the issue of disinformation in the context of elections and freedom of speech. It proved to be very important when Russia attacked Ukraine last year.

Already during the pandemic, there was also communication from the European Commission and European External Action Service (EEAS) about COVID-related disinformation. This showed recognition of the risk presented by disinformation and how it might be used to weaponise discourse and affect people’s health. We saw this on a very large scale during the first year of the pandemic.

How have the social media platforms that you mentioned before reacted to all these developments? 

Here we come to another game-changer, which is the Digital Services Act adopted in 2022. It brings a completely new set of rules for social media platforms that encourages them to be more responsible and take ownership, at least partially, when it comes to the problem of information manipulation.

So gradually, we have been able to set up a whole new framework for our fight against disinformation. And if we speak about social media platforms it is worth mentioning the Code of Practice, which was adopted a few months after the Action Plan on fighting disinformation came into effect. The code works as a form of cooperation between the EU Commission and social media platforms but is based on the voluntary decisions of the platforms. Based on the code, the platforms report regularly on how they fight against disinformation. So something difficult to imagine some time ago is now taking place and it has also become an inspiration for similar solutions in other regions of the world.

So how many platforms are committed to complying with the code’s recommendations? 

The list is available online. All the major players in this business are on board.

I think one of the challenges we have had so far is that we each use somewhat different language when we talk to one another. By “we” I mean the EU as the regulator on one side, and the social media platforms and big tech companies on the other.

What does it mean in practice?

For example, when a platform such as Google tells us that it has identified disinformation or misinformation incidents (some platforms prefer the latter term), it may mean something different from what Facebook tells us about the disinformation campaigns it has encountered and what it defines as coordinated inauthentic behaviour.

So one of the main conclusions from this initial phase of cooperation was that we really need to use the same vocabulary. We need a set of notions that we agree on and use in the same way.

And your team has a solution?

My colleagues from the EEAS came up with the proposal to replace the term disinformation with “Foreign Information Manipulation and Interference” or FIMI. More and more people and institutions are starting to use it. I think using this term would be better than defining and describing the nature of the problem every time we are dealing with it.

What is the difference between FIMI and disinformation?

FIMI is a much broader term and disinformation can be a part of FIMI. There are different actors, including states, out there and they use very different tactics and means of manipulation. Therefore, we need a more comprehensive term. I know it sounds like something that only an EU institution could come up with and that we are just adding another acronym. But we feel it was necessary because we often deal with words. Words that hurt people. So if we do not speak the same language we can’t take any action.

It feels like the legal/bureaucratic framework for tackling disinformation (or FIMI) is fixed. What then should be the next steps?

The most recent components of this framework, which appeared after Russia attacked Ukraine last year, are sanctions. Sanctions for disinformation and information manipulation. This is something that had never happened before, but we have seen how Russia has used its whole ecosystem of disinformation and information manipulation time and time again to justify its war of aggression and its war crimes. We first had the decisions from the EU Commission and the EEAS, which then led to the idea being adopted by the European Council. All member states unanimously supported the decision to block RT and Sputnik. These are the two biggest information manipulation operations of Russia outside of the country. This was followed by sanctioning seven other outlets, as well as sanctions against several dozen individuals who are directly connected to the Kremlin’s information ecosystem. This is a major change that would not have been possible even a month before the invasion.

Why was it so difficult to introduce sanctions before the invasion?

I remember many discussions about it, including those in the public domain, and the main argument was always freedom of speech, meaning we cannot ban any media outlets because of our values and principles. And we do have and protect these values! Nobody questions that. The point is that the Russian outlets I mentioned are not media organisations. They do not follow any journalistic standards. I do not even like using the term “media” to describe them; I prefer to call them “outlets”. We finally figured out that freedom of speech is also about protecting our information space from attempts to manipulate it.

How about the resources that are available at the EU level to fight disinformation? There was an argument that the financial and human means available are not enough. Has the development of the legal framework you described earlier been accompanied by a strengthening of the budgets and manpower of teams such as yours?

I certainly do not want to send the message that everything is perfect. We should have more resources. But as long as we are fighting against the behemoth, there will always be a need for more. Still, in comparison to 2018 my team is much stronger. There are more funds and more people. My team comprises thirteen people, who deal mostly with Russia as an information manipulation actor, the Eastern Partnership and the Central Asian countries. It is not a huge number, but when I started work here (December 2018) as a team member, I only had a few colleagues and we were part of a Strategic Communication division that was much smaller than it is right now. Currently, the division is over forty people strong and it deals not only with Russia, as our team does, but also with China and other actors. There is another team that deals specifically with data analysis, and there are teams dealing with specific regions which are not necessarily seen as disinformation actors but are also targets of it. So I think we now have a much more global perspective and more tools to help us.

Do you believe that more people are aware of the danger posed by disinformation than before?

Since last year, awareness at the highest political level of what Russia and China are doing to justify the war has become much higher. It also concerns the wider audience. We see it on all our online channels. There was a spike in interest in our work after Russia attacked Ukraine again. So the general awareness is higher but, unfortunately, it took the pandemic and the war for many people to realise that disinformation can literally kill.

The crucial part and probably the most difficult is the impact that all these initiatives have. Do you have any tools to measure it and make sure that your work reaches the people that should be reached?

The question of how to measure the impact of such work is the million-dollar question. What you can do, when you are in the strategic communication field, is observe, for example, whether you have more followers and how often people react to what you have to say. And here, again, we see at least a tenfold increase in interest in our work compared to before the war. To amplify our message we cooperate with different actors, such as civil society organisations, across the whole like-minded community. But measuring the impact is also a challenge because we are talking about operations that affect the human brain. It means that we are often dealing with deep convictions that are difficult to research and measure. One thing we try to do is polling in the Eastern Partnership countries, which helps us to test certain disinformation narratives and check whether people believe them or not, and why. This shows at least whether disinformation is working and whether the interventions of the anti-disinformation community are having any effect.

Let’s focus on how to respond to disinformation. When someone visits the website of your team, she or he might get the impression that the main tool is debunking. And as far as we know this is needed, but we are all aware of its limitations. This is partially because of these deep convictions you mentioned and the fact that more reliable information or narrative never reaches the same audience as manipulation. That’s why, and I think this is commonplace, we need other tools that increase the effectiveness of fighting disinformation. What tool would you put as an example here?

I think there is no silver bullet here. There will never be a single solution that fits everyone. That is why there is a wide array of tools that we need to use together. Only then does the work make sense. It is also important to consider who uses these tools. We can talk about every one of us who is in the information space and who can do some work. It means, for instance, reflecting on what information you consume and whether you have to consume that much, or checking the source of information, whether it can be cross-referenced, and so on. This is something that the fact-checking community and journalists always remind us about.

Going from the individual level through to the civil society, state and international levels, everyone has a role to play here. That includes the private sector, social media, too. From this perspective, fact-checking is one of the key tools, and I agree there are many challenges in this business. As you pointed out, fact-checked information is never as popular as the manipulated narrative. Also, the work of fact-checkers is not widely recognised. There was research published in Poland before Russia’s invasion of Ukraine that showed a discrepancy between the people saying “I do support fact-checkers and I realise how important their work is” and the percentage of people (which was very low) who could name at least one fact-checking organisation. So I think there is a lot to be done to promote this type of work. In addition to fact-checking, there is a database available on our website which contains the disinformation messages that we have been collecting over the last eight years. The database can be seen as a fact-checking tool, but more important is the fact that we keep a record. We can identify when Russia has used certain narratives, such as those about Ukrainians as “Nazis”, and show that some narratives are repetitive tools in Russian tactics. I could name many other examples, including the biolabs allegedly located in Ukraine, but now I do not have to, because there is this database and everyone can access it.

Is the goal to have dispersed tools that can make everyone a debunker?

Yes, and this tool must be available in different languages. So whenever a person sees something on her or his feed that might be suspicious they can use our tools, but also other organisations’ tools, to check it and figure out what are the facts.

You listed raising awareness, calling out disinformation, and keeping a record. What else?

Another important part of this picture, one that concerns mainly state and international-level institutions, is something that we call “situational awareness”. It might sound basic, but you need to understand what is happening to be able to act. We are already able to monitor a large chunk of the information space, although obviously we will never be able to monitor everything that is happening in it. But we can currently do it in many languages, to understand what the trends are, what conclusions can be drawn from them, and how we can act against them. We also use these insights to inform the wider public.

The most important part of the response is resilience building. It means we must invest in our capacity to respond. You can rely on fact-checkers to support you in this as an individual. However, we need to be aware constantly of the new tactics and techniques that Russia and other actors are using to manipulate.

How can we increase resilience?

By resilience, I mean, for example, the trainings we do for journalists and civil society organisations. We provide them with tools for open-source intelligence and data analysis that they will be able to use to trace a disinformation narrative or understand how it is spreading. We perceive it as a long-term investment. The assumption is that when journalists or NGOs are equipped with this kind of knowledge, they will bring it back to their communities and the message will spread further. As a result, we will all become more resilient.

Let’s come back to the national and international levels. What else can state and international organisations do?

One key tool that is at the disposal of states and international organisations is the joint diplomatic response. There are statements from the G7 which call out Russia for its disinformation practices. Today, this may sound like a very soft measure but if you consider that a few years ago this was not possible then it is also an interesting evolution. As a like-minded community, we send the message that “we see you, Russia, and we know how to counter your practices.” I also want to mention disruption, which is the newest part of our response. Disruption means that we try to make disinforming as difficult as possible. Sanctions are a good example of disruption.

Do you have any final reflections on why fighting disinformation is so difficult?    

It is worth mentioning that disinformation is not illegal. Most of the practices that Russia and other actors are applying to manipulate us are legal. It is just about testing and exploiting the rules. We need to do whatever is in our power to at least make it more costly and more difficult if we can’t stop it completely, and we know Russia will keep disinforming for as long as possible. 

After describing all the changes and processes that happened within the anti-disinformation industry at the EU level and beyond, how do you see the next steps in strengthening the response to this threat and what could be another milestone to achieve?

For me, the most logical next step is to tighten up our activities. The process has started, and we have to focus on implementation rather than on discovering something new. I know it does not sound very interesting, especially when you talk to the media and there is an expectation of big things. This is natural. But I would not mind entering a more boring phase in which we can see how the different tools work and keep improving them.

Martyna Bildziukiewicz is the head of the East StratCom Task Force that coordinates the “EUvsDisinfo” project. She holds a PhD in Political Science.

Maciej Makulski is a contributing editor with New Eastern Europe.




