
The Impact of the Online Safety Act on UK Businesses

Introduced to tackle growing concerns over the safety of internet users – particularly children and vulnerable groups – the Online Safety Act (OSA) marks a significant shift in the regulatory landscape for businesses operating online platforms in the UK.

Passed in October 2023 and being enforced in phases, the Act introduces a wide range of new obligations, imposing stricter requirements for transparency, age verification and content moderation to create a safer online environment.

Under the Act, businesses operating online must now ensure transparency by regularly publishing their safety measures and reporting on their efforts to regulators. This means not only creating new policies where needed, but also providing evidence that these policies effectively mitigate risks associated with harmful content. The Act places specific emphasis on platforms accessed by children, requiring additional safeguards and age-appropriate design features.

To comply with these new regulations, digital platforms will be required to implement more stringent risk mitigation policies and to collaborate with Ofcom, the UK’s communications regulator, which will oversee the implementation of the Act and enforce penalties for non-compliance. Businesses must also maintain detailed compliance records, continuously updating and improving their safety measures to keep up with evolving risks.

Effective Age Verification and Safeguards for Children

One of the most critical elements of the OSA is the focus on protecting children and young people when they access the internet. From 2025, online platforms accessible to minors will be required to implement age checks that accurately determine whether users are children.

Ofcom will publish final guidance in early 2025; in the meantime, it is clear that basic or outdated age-check systems – such as a simple ‘yes/no’ checkbox or self-declared age – will not suffice, and highly effective age assurance measures must be used instead. Innovative technologies that verify users’ ages while protecting their privacy are not a pipedream; they are available and ready to be deployed.

Platforms will also be expected to integrate further age-appropriate design features that reduce the risk of children encountering harmful content. This means filtering out explicit material, protecting personal data, and setting limitations on interactions with adults, all while maintaining a user-friendly experience. For example, social media platforms will need to assess how they moderate conversations, regulate social interactions, and structure the visibility of certain types of content.

The Need for Content Moderation and Transparency

Encouraging effective content moderation is another key element of the Online Safety Act. Businesses are obligated to implement systems to moderate harmful content – including hate speech, violence, and other inappropriate material that could harm users, particularly minors. To achieve this, platforms must adopt proactive rather than reactive measures, preventing harmful content from being uploaded or spreading before it reaches users. Content moderation efforts must also be transparent, with businesses documenting and publishing their policies, the actions taken, and the results.

The Act is designed to hold platforms accountable, not just for the safety measures they put in place, but also for how well the measures work in practice. Companies failing to demonstrate robust content moderation could face legal repercussions or fines from UK regulator Ofcom.

Technologies to Make the Internet Safer

Safety technology solution providers have been continuously innovating and developing solutions to keep up with the ever-changing and challenging online environment. In the age assurance space, technological advancements and the introduction of AI-driven techniques mean that safety tech providers can now offer a range of highly accurate, privacy-preserving age assurance methods that minimise friction and ensure compliance with ever-evolving regulations.

While some methods require user interaction, such as uploading an image of an ID document or taking a short selfie video, other methods use existing user data. This data, such as an email address, can often be collected as part of the account creation process or during the checkout process on online marketplaces, and can be deployed in the background with no further user interaction required. Email address age estimation can accurately determine a user’s age without requiring sensitive personal information, allowing businesses to maintain compliance while protecting user privacy.
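As a rough illustration of that background flow, the sketch below shows in Python how an email-based age check might gate account access. Everything here is hypothetical: the `estimate_age_from_email` stub stands in for a real age estimation provider, and the confidence threshold is an arbitrary placeholder, not a figure from the Act or from Ofcom guidance.

```python
from dataclasses import dataclass

@dataclass
class AgeCheckResult:
    estimated_adult: bool  # provider's best guess: is this user an adult?
    confidence: float      # provider's confidence in that guess, 0..1

def estimate_age_from_email(email: str) -> AgeCheckResult:
    # Hypothetical stand-in for a real age estimation provider, which
    # would score signals such as how long the address has appeared in
    # commercial datasets. Returns a fixed low-confidence result here
    # purely to illustrate the control flow.
    return AgeCheckResult(estimated_adult=False, confidence=0.3)

def gate_access(email: str, threshold: float = 0.9) -> str:
    """Decide whether the background check alone is sufficient."""
    result = estimate_age_from_email(email)
    if result.estimated_adult and result.confidence >= threshold:
        return "allow"      # high-confidence adult: no extra friction
    return "escalate"       # step up to an active check (ID or selfie)

print(gate_access("user@example.com"))  # -> escalate
```

The point of the sketch is the escalation path: a frictionless background check where confidence is high, and a step-up to document or selfie verification where it is not.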

Within content moderation, Artificial Intelligence (AI) will play a critical role in helping platforms maintain a safer environment. The technology can be utilised alongside human moderators to add an additional layer of support, quickly removing harmful material at a scale human teams alone could not match.
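One common way to combine the two is score-based triage, sketched below under assumed thresholds (the score ranges and routing labels are illustrative, not drawn from the Act): a classifier scores each item for harm, clear-cut cases are actioned automatically, and uncertain cases are queued for a human moderator.

```python
def triage(harm_score: float,
           remove_at: float = 0.95,
           review_at: float = 0.60) -> str:
    """Route content by a hypothetical harm-classifier score in [0, 1]."""
    if harm_score >= remove_at:
        return "auto_remove"   # high-confidence harm: act immediately, at scale
    if harm_score >= review_at:
        return "human_review"  # uncertain: queue for a human moderator
    return "publish"           # low risk: allow through

for score in (0.99, 0.70, 0.10):
    print(score, "->", triage(score))
```

The design choice worth noting is that automation handles only the unambiguous cases; the middle band is deliberately routed to people, which is where the human-plus-AI pairing described above earns its keep.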

An Opportunity for UK Businesses

For UK businesses, OSA is not just another regulation to follow but a hugely important opportunity to make the internet safer. By adopting cutting-edge safety measures and prioritising transparency, businesses can build trust with their users and demonstrate a commitment to protecting children when they venture online.

Businesses that proactively implement effective age verification and content moderation will also be able to avoid regulatory fines and adapt quickly to future regulatory changes. Given the fast-paced nature of the internet, companies that stay ahead of regulatory requirements now will be better positioned to thrive and grow in the years to come.

As a new piece of legislation, the OSA naturally requires businesses to change how they operate, which may initially prove challenging. However, by staying up to date on regulatory changes and leveraging cutting-edge technologies effectively, businesses can position themselves as a trusted voice in their space and, ultimately, better protect children and young people online.
