Is the Online Safety Act 2023 Creating the 'Safest Place in the World'?
Introduction
Aylo, the parent company of Pornhub, has declared its support for the new child safety standards that pornography sites must adhere to. Aylo's announcement followed news that OnlyFans has come under scrutiny from Ofcom. The subscription service has been accused of failing to implement adequate measures to prevent minors from accessing sexual content on the platform.
The changing standards for adult-content sites are a result of the Online Safety Act 2023: the UK government's attempt to improve online safety by setting out the responsibilities that social media platforms hold towards their users.
While the Act, and its potential consequences, were heavily debated during the drafting process, the realities of the Act are becoming increasingly visible.
Background
The Online Safety Act has sought to revolutionise protection in two ways. Firstly, by holding online platforms responsible for moderating the illegal content published on their sites. Secondly, by imposing a duty of care to prevent children from viewing harmful content.
OnlyFans under scrutiny
Before the Act was introduced, an investigator was able to find 10 child abuse images on OnlyFans within an hour. Despite these findings, the platform continued to argue that it was the safest social media site in the world.
The Online Safety Act 2023 now requires platforms that show pornographic content to perform age verifications (section 81). In response, OnlyFans implemented facial scanning technology and now requires users to enter their payment card details.
Despite this change in attitude and acceptance of responsibility, OnlyFans has come under investigation over its measures. For example, the platform claimed that the face scanner's age threshold was set to 23 years old when, in reality, it was set to 20. While this still exceeds the age of 18, the false claim and the increased risk to minors have led to the platform being investigated.
What does this mean for platforms hosting public content?
This is not the first investigation by Ofcom following the 2023 Act. TikTok was investigated at the end of 2023 for inaccurate information regarding parental controls.
It is clear that expectations have increased for platforms that allow users to share content publicly. Social media platforms can no longer deny responsibility for the safety of young users, as Ofcom is enforcing the new standards strictly.
Companies such as Aylo are already reacting to the changing expectations. As more companies begin to uphold this duty of care, it is likely that they will be faced with higher expectations of content moderation.
It is of utmost importance that tech companies adhere to their new responsibilities. Otherwise, they may face a hefty fine of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. Furthermore, named executives may even face criminal liability.
What does this mean for other types of data-sharing platforms?
At the end of April 2024, it was revealed that thousands of young children in Tyneside had been involved in WhatsApp groups engaging in behaviour that is criminalised under the Online Safety Act. This included distributing photographs of genitals (section 187) and the promotion of self-harm (section 184).
The NSPCC has argued that WhatsApp's decision to reduce its age limit – without ensuring effective protections were in place for vulnerable users – constitutes a violation of the Act. However, Ofcom has not yet scrutinised WhatsApp on this basis. WhatsApp has also denied responsibility, reiterating that young users are responsible for protecting themselves against harmful private messages. This suggests a denial of the duty of care imposed by the Act.
Therefore, while platforms hosting public content have had to accept responsibility for the safety of younger audiences, the same cannot yet be said of other data-sharing platforms, such as private messaging services.
Conclusion
While social media platforms should be taking their responsibility for younger users seriously, enforcement against other types of data-sharing platforms has been significantly weaker. Consequently, those platforms' attitude to accepting responsibility is largely unchanged.
By Abigail Eggleston