
How to get the most out of the DSA and the Code of Practice: Cooperation is the key

June 25, 2024 | by: point

Representatives of regulators, civil society and platforms dived deeper into the Digital Services Act and the Code of Practice in the Balkans during the Point 12 Conference.

Photo: Vanja Čerimagić

In order to ensure an accountable online environment, the Digital Services Act (DSA) consists of rules and regulations that govern digital services across the EU. Its main goal, as stated by the European Commission, is to prevent illegal and harmful activities online and the spread of disinformation. The regulation has applied in full across all EU member states since February 2024. Moderated by Maida Culahovic of CA “Why not”, the panel focused on the very specific position of the Balkans in implementing the DSA into their legal frameworks, as the region is not part of the territory where the European Commission enforces the rules on very large online platforms.

Focusing on the “vulnerable” election period, Lara Levet, EU Affairs Public Policy Manager at Meta, talked about the effects the DSA and the Code of Practice could have in encouraging election integrity standards outside the EU, and referred to the white paper recently issued by the Meta Oversight Board, which concludes that companies must set global platform standards for elections everywhere. Asked whether this region can expect some specific treatment, Levet said the line between what they do for compliance and what they do everywhere is really interesting.

– We have five big pillars of election readiness, and they would apply here in the Balkans, and elsewhere. But every election is different, so the way that each of those pillars manifests itself for a specific election will be different. There’ll be political ads on the one hand, and transparency; influence operations, and managing those threat actors; misinformation, and working with fact-checkers; media literacy campaigns, and empowering voters with election day reminders. It’s also something you would likely see here, Levet emphasized.

DSA or no DSA

She also mentioned very specific community standards around elections, called the “voter interference policy”.

– Content on how, where, and when to vote that would be misleading is banned and removed from our platform. All of these things would apply here the same way they would anywhere else – DSA or no DSA. The DSA has this risk assessment and risk mitigation framework. Ahead of an election, we measure our business as usual – what we have in place any day, on fact-checking, on content moderation – and evaluate whether that’s enough to tackle the new risks that elections represent. Sometimes it is, and sometimes it’s not, but that doesn’t mean that those five pillars would not be applied outside of the DSA framework, she said.

The other thing the DSA and the Code of Practice have done is formalize some previously informal cooperation.

– The Code of Practice on Disinformation formalizes fact-checking cooperation. We have had the fact-checking network since 2016. Thankfully, we didn’t wait for the DSA, but it does formalize it. So, I don’t think that election readiness is insufficient if you don’t have the DSA; there are many other ways to do it, Levet said.

Carlos Hernández-Echevarría of Fundación Maldita, chairman of the European Fact-Checking Standards Network (EFCSN), said the most challenging thing related to the DSA is what risk mitigation could look like.

– When we are talking about disinformation, most of this content is not illegal, but it still needs to be addressed through risk mitigation. So, for me, and, I guess, for most fact-checkers, the idea is for this to be an effective tool to change some of the realities we have been seeing forever on some major online platforms. The framework in that sense was pretty powerful. The Code of Practice contained, particularly in the fact-checking chapter, many of the key issues, like the fact that there was a lot of fact-checking being done, and that fact-checking needed to be put in front of users in certain circumstances so they could benefit from it, Hernández-Echevarría stated.

He also reflected on how major online platforms treat connections between fact-checkers and civil society.

– We should see real improvement in these platforms. Meta has a long history of working with fact-checkers; it’s also, from my point of view, the platform that has responded to disinformation, in and outside the electoral process itself, more forcefully and with a higher degree of success. But I see many other very large online platforms that are exactly at the same point they were two years ago, if not worse, the EFCSN chairman said, noting that one of them is YouTube, which is – paradoxically – a signatory of the Code of Practice. He recalled the letter fact-checkers from all around the world addressed to YouTube three years ago – with no response from the video platform.

– There were high expectations in and outside the European Union. I was already talking about these issues here three years ago, and we were all thinking there was a chance to have an effect even outside the European Union. But at this point, for some of the bigger actors, I fail to see the impact even inside the Union, let alone in the Balkans, Hernández-Echevarría said, far from optimistic.

Talking about the effects of the DSA and the Code of Practice, and lessons learned in the EU so far, Tímea Červeňová from the Slovak Council for Media Services remarked that the DSA is not yet fully implemented in Slovakia. The European Commission has opened numerous investigations into very large online platforms, but, as she noted, the Slovak regulator is small, “and it can be a challenge for small regulators to enforce such a file as the DSA”. She emphasized that they had a “stress DSA test”, describing the activities the Council for Media Services carried out during the Slovak parliamentary elections in September 2023.

– Since we do not have direct leverage on the very large online platforms, because they are regulated by the European Commission, we need to find different hooks for how to implement the DSA, and how to enforce some of its provisions as well. That included preparatory meetings with the European Commission and platforms such as Meta, Google and TikTok; establishing monitoring of political advertising and of election-related commitments in the Code of Practice, such as the promotion of authoritative sources, media literacy activities etc., Červeňová said.

Tips and tricks

Asked to share tips for future digital services coordinators from this region on how to pursue the most meaningful cooperation, whether in liaising between platforms and civil society or in other areas, Červeňová said cooperation and coordination are the key points.

– Cooperation with researchers, for example, because, as regulators, we have limited access to data. Also, it is important to get in touch with the platforms – this is something on which local regulators can work a bit more. It’s better to be around the table discussing the issues than just to pinpoint what’s wrong and what’s not working, she said, noting European contacts are also useful.

Katarzyna Szymielewicz, a lawyer, activist, and co-founder and president of the Panoptykon Foundation – the only NGO in Poland tackling problems related to human rights and new technologies – said that, from her perspective, the DSA is a huge piece of legislation that covers everything discussed at the panel.

– That’s good news, in principle, even though it’s a compromise. I think it’s the first time our movement has had legislation that acknowledges the social responsibility of very large online platforms for harms that, so far, have been presented as collateral damage, Szymielewicz noted.


She offered some tips on how regional civil society organizations can successfully advocate for inclusion in these processes and in the conversation about being part of Europe-wide enforcement.

– We have to feed in the evidence – the Commission wants more evidence of the harm itself. Quality information is not pushed by algorithms, so we need to change how platforms optimize their algorithms in order to promote quality journalism, Szymielewicz concluded.

Author: Aldijana Zorlak