The UK implements stringent online safety regulations, allowing tech companies three months to comply.
- Ofcom released its initial guidelines for tech companies to combat illegal harms on their platforms.
- Under the Online Safety Act, the regulator's first set of duties requires tech platforms to take measures to combat illegal content online.
- The safety duties officially entered into force on Monday, although the act passed into law in October 2023.
On Monday, the U.K. brought its comprehensive online safety law into force, increasing oversight of harmful online content and exposing tech giants such as Facebook, Google, and TikTok to significant penalties.
The British media and telecommunications watchdog, Ofcom, released its initial guidelines for tech companies on how to address illegal activities such as terrorism, hate speech, fraud, and child sexual abuse on their platforms.
Under the Online Safety Act, the regulator's first set of duties requires tech platforms to take measures to combat illegal content online.
Tech companies are legally obligated to take responsibility for harmful content on their platforms under the Online Safety Act.
The safety duties officially entered into force on Monday, although the act passed into law in October 2023.
By March 16, 2025, tech platforms must complete risk assessments for illegal harms, giving them three months to comply with the rules, as stated by Ofcom.
Ofcom said that once the deadline passes, platforms must implement measures to mitigate the risks of illegal harms, such as improved moderation, easier reporting, and built-in safety tests.
Ofcom Chief Executive Melanie Dawes said Monday that the regulator will closely monitor the industry to ensure firms adhere to the strict safety standards set under its first codes and guidance, with additional requirements to follow promptly in the first half of next year.
Risk of huge fines, service suspensions
If a company breaches the rules under the Online Safety Act, Ofcom has the power to impose fines of up to 10% of their global annual revenues.
In the U.K., senior managers who repeatedly breach Ofcom regulations could face legal consequences, including imprisonment, while in severe cases Ofcom may seek a court order to block access to a service or restrict its access to payment providers or advertisers.
Earlier this year, disinformation spread on social media contributed to far-right riots in the U.K., prompting Ofcom to consider strengthening the law.
Ofcom announced that its duties will encompass social media companies, search engines, messaging, gaming, and dating apps, in addition to pornography and file-sharing websites.
To improve user experience, firms must simplify access to reporting and complaint functions under the first-edition code. For high-risk platforms, a technology called hash-matching will be mandated to detect and remove child sexual abuse material (CSAM).
Social media platforms use hash-matching tools to identify and remove CSAM by comparing digital fingerprints (hashes) of known illegal images with those of newly uploaded content.
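The matching step described above can be sketched roughly as follows. This is an illustrative simplification: production systems use perceptual hashes (such as Microsoft's PhotoDNA), which match visually similar images even after resizing or re-encoding, whereas the plain cryptographic hash used here only catches byte-identical files. All names and data below are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest acting as the file's digital fingerprint."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of known illegal images,
# standing in for an industry hash list maintained by safety bodies.
known_hashes = {fingerprint(b"known-bad-image-bytes")}

def is_known_match(upload: bytes) -> bool:
    """Compare an upload's fingerprint against the known-hash database."""
    return fingerprint(upload) in known_hashes
```

Because only hashes are compared, the platform never needs to redistribute the original illegal images to check new uploads against them.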
Ofcom emphasized that the codes released on Monday were only the initial set and that the regulator would consult on additional codes in spring 2025, including blocking accounts that share CSAM and using AI to combat illegal harms.
British Technology Minister Peter Kyle stated on Monday that Ofcom's illegal content codes represent a significant shift in online safety, requiring platforms to actively remove terrorist material, child and intimate image abuse, and other forms of illegal content, thereby aligning online safety with offline laws.
"If platforms do not take action, I will support the regulator in using its full powers, such as imposing fines and requesting court intervention to restrict access to sites," Kyle stated.