Social media companies are under increasing pressure as Ofcom, the UK’s communications regulator, prepares to enforce new rules designed to keep children safe online. With the Online Safety Act coming into effect early next year, platforms like Facebook, Instagram, and WhatsApp will face significant consequences if they fail to protect young users from harmful content.
Key Responsibilities for Social Media Platforms
Ofcom’s chief executive has made it clear that the responsibility for ensuring online safety lies squarely with the companies, not with parents or children. Under the new Online Safety Act, firms will be required to safeguard children from exposure to harmful material, including content promoting self-harm, pornography, and violence. Companies that fall short could face hefty fines or even have their services blocked in the UK.
The Act gives social media platforms three months from the finalisation of Ofcom’s guidance to conduct risk assessments and implement changes that make their services safer, particularly for younger audiences. Despite this looming deadline, many feel the pace of enforcement is too slow.
Ofcom’s Enforcement Powers
Once the Online Safety Act becomes enforceable, Ofcom will have the authority to fine non-compliant companies up to 10% of their global revenue. In the most serious cases, it could move to have their services blocked in the UK entirely. Ofcom has already been in contact with the major social media platforms.
Social media firms are expected to introduce a range of changes to enhance safety. Instagram, for example, has already rolled out features designed to combat sextortion, and other platforms may follow with measures such as letting users leave group chats discreetly.
As the deadline for compliance looms, all eyes are on social media giants to see how they respond. Will they make the necessary changes to protect young users, or will they face the consequences Ofcom is prepared to hand down?
While the introduction of these regulations is a positive step, the challenge now lies in ensuring they are implemented effectively and quickly enough to make a real difference.
A Phased Enforcement Timeline
Ofcom is taking a phased approach to enforcing the Online Safety Act, gradually introducing rules to protect users, especially children, from harmful online content. The Act requires companies with a significant UK user base to assess risks and take action to safeguard their users.
The timeline unfolds in stages:
- December 2024: Ofcom issues its codes of practice and guidance on addressing illegal content, with compliance becoming enforceable by March 2025.
- Early 2025: stricter age-verification rules for pornographic content and child-protection measures come into effect, requiring companies to conduct risk assessments. Services that disproportionately affect women and girls will also face specific guidelines.
- Mid-2025: a small number of high-risk online services take on additional transparency and safety duties.
Ofcom will continue to monitor and enforce compliance across all platforms throughout.