A groundbreaking and comprehensive new bill aimed at protecting children and adults on the internet was passed yesterday by Parliament. The Online Safety Bill will make user-to-user platforms legally responsible for protecting their users from harmful content.
The Online Safety Bill, which was drafted in 2021, began its life as a 2019 white paper that delved into growing concerns about online safety. It has grown in scope over time to become the massive 300-page bill that was passed this week. The bill completed its passage through Parliament on September 19, 2023, and must now be given Royal Assent before becoming law.
The Online Safety Act will be enforced by Ofcom, the regulator of communications services in the U.K., giving this watchdog agency the power to issue stiff penalties for non-compliance.
Here’s what payment providers and platforms need to know:
Who must comply with the Online Safety Bill?
The Online Safety Bill will impact user-to-user services such as social media platforms, video-sharing sites, and technology companies hosting user-generated content available to viewers in the U.K. It will also impact search engines that generate search results displaying or promoting content.
This bill also impacts ancillary services that support the entities above, for example, services that enable funds to be transferred to regulated platforms and companies. By this definition, payment service providers, payment processors, banks, acquirers, and even SaaS-based tech platforms or fintechs will be within scope and expected to comply.
What are the requirements of the Online Safety Bill?
The new legislation will require:
- More robust age verification of users, especially on sites with adult content. This follows the trend of some U.S. states, including Arkansas, Mississippi, Utah, and Louisiana, that have passed laws to prevent underage users from accessing online porn sites.
- Tools for adults to filter certain content/users, so that they can have more control over what they do or don’t see.
- Increased moderation/removal of harmful content
- Removal of illegal content associated with:
- Child sexual abuse material (CSAM)
- Controlling or coercive behavior
- Extreme sexual violence
- Fraud/scams (including paid-for advertisements and user-generated scams)
- Hate crime
- Inciting violence
- Illegal immigration and people smuggling
- Promoting or facilitating suicide
- Promoting self-harm
- Revenge porn
- Selling illegal drugs or weapons
- Sexual exploitation
- Terrorism
Steep costs of Online Safety Bill non-compliance
Non-compliance with the new law can have severe consequences, resulting in fines of £18 million or 10 percent of annual global turnover, whichever is greater. Platforms will also have to show that they have processes in place to meet the requirements set out by the bill. Ofcom will check how effective those processes are at protecting internet users from harm, especially when anonymous or repeat offenders are involved. In the most extreme cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers, and internet service providers to stop working with a site, preventing it from generating money or being accessed from the U.K.
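To see how the "whichever is greater" rule scales with company size, the maximum exposure can be sketched in a few lines. This is only an illustration of the arithmetic stated in the bill; the function name and example turnover figures are ours, not part of the legislation.

```python
def max_online_safety_fine(annual_global_turnover_gbp: float) -> float:
    """Maximum fine under the Online Safety Bill: GBP 18 million
    or 10 percent of annual global turnover, whichever is greater."""
    FLAT_CAP_GBP = 18_000_000
    return max(FLAT_CAP_GBP, 0.10 * annual_global_turnover_gbp)

# Hypothetical platform with GBP 100M turnover: 10% is GBP 10M,
# so the GBP 18M floor applies.
print(max_online_safety_fine(100_000_000))

# Hypothetical platform with GBP 1B turnover: 10% is GBP 100M,
# which exceeds the floor.
print(max_online_safety_fine(1_000_000_000))
```

In practice this means the £18 million figure is a floor for smaller firms, while large global platforms face turnover-based exposure that can run far higher.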
Public opinion on the bill
While the bill has many supporters, social media platforms and tech companies have opposed it, including Wikimedia, WhatsApp, Meta Platforms (Facebook's owner), and Apple. Some groups fear it will give social media platforms too much power to censor content.
Whatever the opinions on the Online Safety Bill, it is a clear indicator of the trends we’ve been tracking at EverC. Regulatory burdens are shifting, enforcement is increasing, and the sphere of responsibility is widening.
Payment providers and marketplace platforms, take note.