Facebook, Twitter and TikTok face fines of up to 10 percent of turnover if they fail to remove and limit the spread of harmful or illegal content, under new laws proposed in the UK on Tuesday.
Tech platforms will also be required to do more to ensure the online safety of children and shield them from exposure to grooming, bullying and pornography, the Government said.
“We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry and to enshrine in law safeguards for free speech,” said Oliver Dowden, the secretary of state for digital, culture, media and sport.
Governments around the world are wrestling with the problem of finding ways to better control the spread of illegal or dangerous content on social media. The EU, for example, is set to unveil its plans this week.
Sites that breach Britain’s new rules, which will be introduced in legislation next year, could be blocked and senior managers held liable. Popular platforms will also be required to have clear policies in place for dealing with content — such as the dissemination of misinformation about COVID-19 vaccines — that, while not illegal, might cause harm. Dowden said the new legal framework would provide large digital businesses with “robust rules” to follow.
The UK’s media regulator, Ofcom, will be given the power to fine companies up to £18 million ($24 million) or 10 percent of their global turnover, whichever is higher, for breaking the rules. It will also be able to block access in Britain to non-compliant services.
Online journalism and comments from readers on media companies’ websites will be exempt from the new rules to safeguard freedom of expression.
Facebook and Google said in February that they would work with the Government on the regulations. Both companies insist that they take safety extremely seriously and have already changed their policies and operations to better tackle these issues.