
SOCIAL MEDIA

New rules for social media companies could change everything

A ban on under-18s using social media platforms could be introduced if big tech companies don’t take more responsibility for protecting users.


Could we really turn back the clock on some aspects of social media that so many youngsters take for granted these days? That is one of the questions being posed after Ofcom, the United Kingdom’s communications regulator, announced new draft codes of practice for online safety, a move that signals a significant shift in the regulatory landscape and could transform the way social media platforms operate.

While social media has become an integral part of daily life, ensuring the safety of its users, especially minors, has become a paramount concern. And the rest of the world is taking note.

Could social media rules see under-18s banned?

Under these new rules, social media companies face stringent requirements aimed at protecting young users from harmful content. Ofcom’s draft codes mandate that tech firms implement more robust age verification measures and reformulate their algorithms to steer children away from what has been termed “toxic” material. Failure to comply could result in being named and shamed by the media regulator, with the possibility that under-18s could be banned from their platforms.

The urgency behind these regulations is underscored by heartbreaking tragedies in which children were exposed to harmful online content, with devastating consequences. Central to the new rules is the requirement for tech companies to recalibrate their algorithms to filter harmful content out of children’s feeds and reduce the visibility of such material. Additionally, companies will be compelled to conduct more rigorous age verification and implement stronger content moderation measures, including the introduction of a “safe search” function to restrict inappropriate material.

Ofcom’s CEO, Dame Melanie Dawes, describes these rules as a “big moment,” emphasizing the need to challenge the normalization of harmful content in young people’s online experiences. The timeline for implementation is set for the second half of 2025, with a consultation period allowing stakeholders to provide feedback until 17 July. Following this, Ofcom aims to publish final versions of the codes within a year, after which companies will have three months to assess risks and implement mitigations.

Support for social media change

The government views these measures as a fundamental change in how children experience the online world and urges tech giants to take them seriously. Michelle Donelan, the Technology Secretary, emphasizes the importance of proactive engagement from platforms to meet their responsibilities and avoid enforcement measures and fines.

However, challenges lie ahead, particularly in improving the technology for age verification. Bruce Daisley, former UK boss at Twitter and YouTube, highlights the need for better age verification methods if the proposed measures are to be implemented effectively. While some tech companies have expressed support for the aims of the Online Safety Act, others have remained silent or declined to comment. Snapchat and Meta, among the platforms catering to young users, have reiterated their commitment to creating a safe environment and enforcing rules against harmful content. The draft codes appear to be an important step in the right direction, but much remains to be implemented before their intended benefits can be realized.
