Overview of the Online Safety Act 2023

Legal note: This content was compiled from multiple public sources and is intended for general information and satire only. It does not constitute legal advice. While I have made efforts to reflect the law accurately, errors or omissions may exist. Please consult a qualified professional for legal guidance.

Jump to my thoughts (TL;DR)

The UK's Online Safety Act 2023 (OSA) is a sweeping law (Royal Assent 26 Oct 2023) designed to make the internet safer, especially for minors. The government explains that it "puts a range of new duties on social media companies and search services" – for example, providers must "implement systems…to reduce risks [of] illegal activity, and to take down illegal content when it does appear". Ofcom is named as the independent regulator, with broad powers (fines, blocking orders, etc.) to enforce the rules. In practice, the Act's provisions are being rolled out in phases (child-safety codes coming into force by mid-2025, etc.).

Scope

The OSA applies to any "user-to-user" service and search engine. That means any website or app where users post content or interact (social networks, forums, messaging, cloud sharing, dating apps, gaming chats, etc.). Even services outside the UK must comply if they have a significant UK user base or target the UK market. Adult content sites are explicitly included (they must enforce strict age checks). Offline or one-way content services (e.g. static blogs or email/SMS) are largely exempt, as are one-to-one private calls or messages.

A cartoon-style illustration of a stressed tech worker at his desk, surrounded by piles of British compliance paperwork with Union Jack symbols. In the background, an American flag, fast-food packaging, and a giant soda cup suggest the global impact of UK online safety regulations. The character looks overwhelmed but sympathetic, blending satire with humour.
Drowning in red tape: A weary digital everyman faces the mountain of UK compliance, while the shadow of American tech looms large. The Online Safety Act 2023 isn’t just a British story—it’s a global one.
“I wouldn't even satirise you” – Unknown

Key Duties

The Act imposes a "duty of care" on platforms, with obligations that scale to their size and risk. Key duties include:

- Assessing the risk of illegal content appearing on the service, and taking such content down swiftly when it does appear.
- Protecting children, including through age checks on services likely to be accessed by minors.
- Operating reporting and complaints mechanisms, and (for the largest services) publishing transparency reports.
- Having regard to users' freedom of expression and privacy when applying safety measures.

New Offences

In addition to platform duties, the Act creates new crimes for individuals. These include cyberflashing (sending intimate images without consent), non-consensual deepfake adult content, threatening communications (e.g. threats of violence), and encouraging serious self-harm. (These apply to people, not companies.)

Impact on Games and Online Communities

Because the OSA covers any service with user-generated content or interaction, many online games are in scope. Games with voice/text chat or shared media fall under the Act. By contrast, purely offline games (no online features) or those that only allow private 1-on-1 chats (like a 1-to-1 VoIP call) are exempt. In practice, most multiplayer games with public chat (e.g. Fortnite, Roblox, FPS team chats) will have to comply.

Key implications for games:

- Voice and text chat, user-created levels, and shared media all count as user-to-user content, bringing the game in scope.
- Operators must assess the risks in their communities and moderate in-game communication and user-generated content.
- Games likely to be accessed by children will need age assurance and child-safety measures.

In short, virtually any online game with community features will face new obligations. For older games with chat, operators will have to update their moderation (or risk fines/blocking). If compliance proves too difficult or expensive (especially for small studios), some game-makers might restrict UK access – for example by geo-blocking – rather than overhaul their systems.
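As a purely illustrative sketch of the geo-blocking option, a service could refuse UK requests with HTTP 451 ("Unavailable For Legal Reasons", RFC 7725). The lookup function below is a stand-in for a real GeoIP database, and the names here are invented, not any particular framework's API:

```python
# Illustrative only: a hypothetical request handler that returns HTTP 451
# for requests geolocated to the UK. lookup_country() is a stub standing in
# for a real GeoIP lookup service.

BLOCKED_COUNTRIES = {"GB"}  # ISO 3166-1 alpha-2 code for the United Kingdom


def lookup_country(ip_address: str) -> str:
    """Stub for a real GeoIP lookup; always returns 'GB' in this sketch."""
    return "GB"


def handle_request(ip_address: str) -> tuple[int, str]:
    """Return (status_code, body) for an incoming request."""
    if lookup_country(ip_address) in BLOCKED_COUNTRIES:
        # 451 is the conventional status code for content withheld
        # on legal grounds.
        return 451, "Service unavailable in your region."
    return 200, "Welcome!"
```

In a real deployment this check would sit in middleware in front of the whole application, and accuracy would depend entirely on the quality of the GeoIP data (VPNs trivially defeat it).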

Free Speech, Privacy, and Encryption

The Act aims to balance safety with rights, but this is controversial. Freedom of expression and privacy are explicitly mentioned in the law: platforms must have regard to users' rights when applying safety rules. The Act even bars removing "democratically important" content (e.g. political opinions) except in specific circumstances. The government maintains that the Act protects free speech – officials have praised it for empowering adults and preserving political discourse.

Critics, however, warn the Act could have a chilling effect. Privacy groups (the EFF, Open Rights Group) argue that requiring near-constant monitoring of content will push firms to over-censor. For example, the EFF notes the law would "mandate general monitoring of all user content" and "undermine…technologies like end-to-end encryption". Wired magazine reports that platforms must now scan users' private messages for CSAM – a move tech companies call "an unwarranted attack on encryption".

On paper, the Act gives Ofcom the power to require messaging apps to scan encrypted chats for child abuse material, even though experts say this is not technically possible without breaking encryption. The UK has said it will not force scanning until a technical solution exists, but the power remains in the law. In effect, encrypted services face a dilemma: weaken their encryption (backdoors or server-side scanning) to comply, or refuse (and risk UK bans or penalties). Major apps (WhatsApp, Signal, Apple, etc.) have publicly opposed any "backdoor" in encryption. The debate is ongoing, but for now most encryption remains intact.

In summary: the Act requires user rights to be respected, but also gives the government and Ofcom sweeping authority over speech. How that balance works in practice remains to be seen. On one hand, illegal speech (threats, harassment, sexual exploitation) is clearly targeted. On the other hand, broad categories like "harmful but legal" content (extremist propaganda, disinformation, self-harm content) could lead to grey-area moderation. Platforms are under pressure to err on the side of caution, which may limit some controversial speech.

Implications for Websites and Businesses

New and small websites:

Any online service that allows user content and has UK visitors could fall under the OSA. This includes blogs with comment sections, forums, small social apps, file-sharing sites, etc. Ofcom estimates tens of thousands of services will be in scope. The law scales duties to the size and risk of the platform: a tiny niche forum will have lighter obligations than a global platform like Facebook. But no service with user interaction is automatically exempt.
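The idea of duties scaling with size and risk can be pictured as a simple tiering rule. The thresholds and tier names below are invented for illustration and are not Ofcom's actual categorisation criteria:

```python
# Hypothetical sketch of scaled duties. The user-count thresholds and
# tier labels are made up for illustration; Ofcom sets the real
# "Category" thresholds in secondary legislation.

def duty_tier(monthly_uk_users: int, has_user_content: bool) -> str:
    """Map a service's size and functionality to an illustrative duty tier."""
    if not has_user_content:
        return "out of scope"       # no user-to-user functionality
    if monthly_uk_users >= 7_000_000:
        return "Category 1"         # largest platforms: heaviest duties
    if monthly_uk_users >= 100_000:
        return "standard duties"
    return "light-touch duties"     # small, low-risk services
```

The point of the sketch is only that scope turns on user-generated content, while the weight of the obligations turns on scale and risk.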

Key points for operators:

- Check whether your service is in scope: does it host user-generated content, or let users interact with one another?
- Duties scale with size and risk, but even small forums must complete risk assessments.
- Keep records of your assessments and moderation decisions, and watch Ofcom's codes of practice for compliance deadlines.

Overall, many in the tech and business community fear the OSA will raise the cost of launching and operating online services in the UK. The Act's broad scope and heavy compliance demands may deter new entrants. A UK equivalent of "starting Facebook from scratch" would today require expensive AI moderation, legal compliance teams, and constant liaison with regulators. This could slow growth of digital startups or push them to offshore solutions.

Enforcement and Ofcom's Role

Ofcom is the regulator in charge of enforcing the OSA. It must issue detailed Codes of Practice (after public consultation) that explain exactly how platforms should comply. So far Ofcom has published draft rules on illegal harms and child safety, and set deadlines (e.g. illegal-harm risk assessments due by 16 March 2025). It also maintains a register of high-risk "Category" services (the largest platforms) and will audit them closely.

Enforcement powers include: issuing fines of up to £18 million or 10% of qualifying worldwide revenue (whichever is greater), requiring public transparency reports, and ordering access restrictions. Ofcom can direct internet providers or app stores to block a non-compliant platform. Importantly, Ofcom also has the power to compel social media to preserve content of democratic importance (e.g. not remove political posts).

Online Safety Act 2023 - UK Parliament building with digital overlay representing internet regulation and child safety measures
The UK's Online Safety Act 2023 introduces comprehensive regulations for digital platforms and online services

Impartiality of Ofcom:

The Act formally designates Ofcom as an independent regulator, and the government has publicly stated that it will not meddle in day-to-day enforcement decisions. However, the Secretary of State retains some unusual powers over Ofcom's codes: the Act allows the government to direct Ofcom to modify draft rules for reasons of public policy or security. Civil liberties groups have warned this could undermine Ofcom's independence and give ministers emergency-style control over speech regulation. In response, officials argue that such directions will be used sparingly and that Ofcom remains operationally independent. In practice, it is too early to know how this tug-of-war will play out. For now, Ofcom appears to be proceeding with industry consultations in a fairly open manner.

Broader Implications and Advice

The OSA's impact will be far-reaching. Beyond games and major platforms, ordinary users and content creators may notice changes:

- More age-check prompts, especially on adult or child-sensitive services.
- More aggressive removal or flagging of borderline content, as platforms err on the side of caution.
- New user controls, such as content filters and opt-outs, on the largest platforms.

Advice:

If you run a website, blog, game, or app that has UK users, now is the time to prepare. Review whether you are "in scope" (do users generate content or chat?). If so, plan how you will:

- conduct and document the required risk assessments;
- moderate and remove illegal content promptly;
- implement age checks where children are likely to use the service;
- track Ofcom's codes of practice and consultation deadlines.

For end users, be aware that the UK's online landscape is changing. Platforms will increasingly roll out safety features. Parents may see new age-check points or content controls; gamers might notice more content being removed or flagged. At the same time, user controls (opt-outs, filters) may improve.

Conclusion

In conclusion, the Online Safety Act imposes a powerful "duty of care" on online platforms. Its goal is to reduce child exploitation, hate, and self-harm online, which many see as necessary. But it also introduces stringent monitoring and censorship-like powers that could reshape free speech and tech innovation in the UK. Companies and developers should follow the rollout closely, balance compliance with user rights, and engage with Ofcom consultations to shape practical rules. Anyone running a UK-facing service today must take the new law seriously or risk severe penalties.

Sources:

Official government guidance and legislative text; analysis by UK law firms and media; and civil liberties organisations, among others. These provide a balanced picture of the Act's provisions and potential effects.

I must say, this Online Safety Act business is quite the bureaucratic marvel, isn't it? Rather reminds one of the sort of well-intentioned but potentially overreaching legislation that would have made Orwell raise an eyebrow. Still, one does appreciate the government's attempt to tidy up the digital realm - though whether Ofcom will prove as independent as a proper cup of Earl Grey remains to be seen.
-Claude.ai

TL;DR: Britain's Digital Doom

The most shocking revelation: This sweeping 300-page monstrosity requires virtually any website with user interaction—from gaming chats to blog comments—to implement expensive AI moderation systems and comply with vague, contradictory regulations that would make a Byzantine bureaucrat blush.

Scope and Scale: Digital Totalitarianism

The Act's reach is breathtaking in its audacity: any service allowing user interaction with UK users must comply. This includes gaming platforms, forums, dating apps, file-sharing sites, and even blogs with comment sections. The law demands these services implement AI-powered content scanning, age verification systems, and maintain extensive compliance documentation—costs that would make starting "Facebook from scratch" prohibitively expensive for British entrepreneurs.

Most alarmingly, the Act requires platforms to scan encrypted private messages for illegal content—a technical impossibility that effectively mandates breaking encryption or facing fines of up to £18 million or 10% of global turnover. WhatsApp, Signal, and Apple have rightly called this an "unwarranted attack on encryption."

Economic Vandalism in Action

Companies are already fleeing: BitChute has geo-blocked UK users entirely, adult content sites are preemptively blocking British access, and Wikipedia has warned it may not comply due to privacy concerns. This "geoblocking" trend reduces service choices for British users whilst signalling to the world that the UK is hostile to digital innovation.

The compliance burden is crushing: businesses must conduct risk assessments, implement moderation systems, maintain detailed records, and navigate Ofcom's labyrinthine codes of practice. These costs favour large corporations with deep pockets, effectively creating a barrier to entry that established players can use to eliminate smaller competitors—precisely when Britain needs entrepreneurial growth.

Vague Laws, Selective Implementation

This 300-page legislative behemoth has been widely criticised for its vagueness, leaving businesses guessing about compliance requirements until Ofcom issues detailed codes. The definition of "harmful but legal" content remains deliberately opaque, creating a chilling effect on legitimate discourse. Meanwhile, the Secretary of State retains powers to direct Ofcom's enforcement, raising serious questions about selective implementation based on political convenience.

Existing EU regulations already provide adequate frameworks for online safety without the draconian overreach seen here. The EU's approach balances safety with innovation—something this Act spectacularly fails to achieve.

What Should Happen

These economically destructive regulations should be scrapped immediately. Instead, the UK government should issue non-binding guidance and best practice recommendations aligned with EU standards. This would protect users without creating the regulatory quicksand that's already driving businesses away from British shores. At a time when the country desperately needs digital growth and innovation, these laws represent an act of economic self-harm that future historians will struggle to comprehend.

In short: Britain has created the world's most comprehensive framework for throttling its own digital economy whilst pretending to protect children. Quite the achievement.

Contact – David W Beck