The "EU Code of Practice on Disinformation" is a voluntary rulebook initiated by the European Commission in 2018, after light was shed on the immense impact that coordinated disinformation campaigns may have had on the outcome of the UK's Brexit referendum and the 2016 US presidential election. Such "influence operations" have since been identified as a significant threat to democratic systems by the EC, national governments and some leading tech companies. The latter, by signing the code, committed to vaguely worded self-obligations relating to ad placement, political advertising and the support of scientific research into disinformation and its effects.
Despite these efforts, however, the situation has hardly improved: disinformation remains a material problem in coping with current crises such as the Covid-19 pandemic and the war in Ukraine.
On 16 June 2022, the EC announced that the 34 signatories, which include some of the largest tech companies and key players in the ad-tech sector (e.g. Adobe, Google, IAB, Meta, Microsoft, TikTok, Amazon-owned Twitch and Twitter), had agreed on a revised version of the code. The revision most notably expands the list of commitments and adds more specific measures, following the objectives set out in the "European Commission Guidance on Strengthening the Code of Practice on Disinformation".
Compared to the previous version, the renewed code contains "stronger and more granular commitments and measures, which build on the operational lessons learnt in the past years", according to the EC. In particular, the signatories have committed to cut financial incentives for spreading disinformation, address new manipulative behaviours such as deepfakes and bots, provide better tools for users to identify disinformation more easily, expand fact-checking efforts by ensuring appropriate remuneration for fact-checkers, implement self-monitoring measures, and provide more transparency in political advertising.
While these commitments and measures sound promising in principle, participation in the code ultimately remains voluntary for companies. Additional motivation may come from the Digital Services Act, which will probably come into force within the next two years: under it, "Very Large Online Platforms" will be required to put in place reasonable, proportionate and effective risk mitigation measures tailored to their specific systemic risks.
The EC now appears to be aiming for the "EU Code of Practice on Disinformation" to be recognised as a "Code of Conduct" and thus as a "risk mitigation measure" within the meaning of Article 27 of the Digital Services Act. Since violations of the DSA can result in penalties of up to 6% of annual global turnover, this prospect may fuel the ambition to adopt the code, at least as far as VLOPs are concerned.