Google, Twitter, Meta Agree to Adopt Tougher Measures Against Disinformation Under EU's Updated Code of Practice


Meta, Alphabet unit Google, Twitter and Microsoft on Thursday, June 16, agreed to take a tougher line against disinformation, including deepfakes and fake accounts, under an updated EU code of practice or face hefty fines.

More than 30 signatories including advertising bodies have signed up to the updated Code of Practice on disinformation, the European Commission said.

The signatories agree to do more to tackle deepfakes, fake accounts and political advertising, while non-compliance can lead to fines of as much as 6 percent of a company’s global turnover, the EU executive said, confirming a Reuters report last week.

The companies have six months to comply with their pledges, with a progress report due at the beginning of 2023.

Previously, it was reported that Alphabet unit Google, Facebook, Twitter and other tech companies would have to take measures to counter deepfakes and fake accounts on their platforms or risk hefty fines under an updated European Union code of practice, according to an EU document seen by Reuters.

The European Commission was expected to publish the updated code of practice on disinformation on Thursday as part of its crackdown against fake news.

Introduced in 2018, the voluntary code will now become a co-regulation scheme, with responsibility shared between the regulators and signatories to the code.

The updated code spells out examples of manipulative behaviour, such as deepfakes and fake accounts, which the signatories will have to tackle.

“Relevant signatories will adopt, reinforce and implement clear policies regarding impermissible manipulative behaviours and practices on their services, based on the latest evidence on the conducts and tactics, techniques and procedures (TTPs) employed by malicious actors,” the document said.

Deepfakes are hyperrealistic forgeries created with computer techniques; they have triggered alarm worldwide, particularly when used in a political context.

© Thomson Reuters 2022