UK issues safety-focused rules for video-sharing platforms like TikTok

Video-sharing platforms offering a service in the UK must comply with new regulations designed to protect users, and particularly under-18s, from harmful content such as hate speech and videos or advertisements likely to incite violence against protected groups.

Ofcom, the national regulator for communications, broadcasting and – in a growing role – internet content, today released the guidelines for platforms including TikTok, Snapchat, Vimeo and Twitch.

Among the requirements for the relevant services is that they must take “appropriate measures” to protect users from harmful content.

Terrorist content, child sexual abuse material, racism and xenophobia also fall into the category of “harmful content”.

In a press release, the regulator said its research shows that a third of UK internet users say they have witnessed or experienced hateful content; a quarter say they have been exposed to violent or disturbing content; and one in five have been exposed to videos or content promoting racism.

There is no prescriptive list of measures video-sharing platforms must employ to prevent users from being exposed to such content.

But there are a number of recommendations, such as clauses in terms and conditions; a feature allowing uploaders to declare whether their content contains advertising; user-friendly mechanisms for viewers to flag or report harmful content; and transparent complaints procedures.

Age assurance schemes are also recommended, as are parental controls, since the regulations specifically aim to protect under-18s from viewing videos and advertisements containing restricted material.

Ofcom is also recommending “robust” age verification for video-sharing platforms that host pornography to prevent those under 18 from viewing adult material.

A list of video-sharing platforms that have notified Ofcom that they fall within the scope of the regulations can be found here. (In addition to the aforementioned platform giants, it also includes OnlyFans, Triller, and Recast.)

“We recommend that providers have systematic risk management processes in place to help providers identify and implement feasible and proportionate measures,” Ofcom continues in its guidance for video-sharing platforms.

“While we recognize that harmful material may not be completely eradicated from a platform, we expect providers to make meaningful efforts to prevent users from encountering it,” it adds.

“The VSP [aka video-sharing platform] regime is about platforms’ safety systems and processes, not the regulation of individual videos, but evidence of a prevalence of harmful material on a platform may trigger further investigation.”

The regulator says it will want to understand the measures platforms have in place, as well as their effectiveness in protecting users, and “any processes that informed a provider’s decisions on what safeguards to use.” Platforms will therefore have to document their decisions and be able to justify them should the regulator come asking, for example following a complaint.

Monitoring technology platforms for compliance with the new requirements will be a key new role for Ofcom, and a taste of what is to come under incoming, much broader safety-focused digital regulation.

“As well as engaging with providers themselves, we hope to inform our understanding of effective user protection, for example by monitoring complaints and engaging with interested parties such as charities, NGOs and tech safety groups,” Ofcom also writes, adding that this engagement will play an important role in informing its decisions about “areas of interest.”

Ofcom’s role as a regulator of internet content will be fleshed out in the years to come, as the government works to pass legislation imposing an extensive duty of care on digital service providers of all stripes, requiring them to manage user-generated content in a way that prevents people, and especially children, from being exposed to illegal and/or harmful content.

A key appointment, that of Ofcom chair, has been delayed after the government decided to re-run the competition for the post.

Reports suggest the government wants former Daily Mail editor Paul Dacre to take the job, but an independent panel involved in the initial selection process rejected him as an unsuitable candidate earlier this year. (It’s unclear whether the government will keep trying to parachute Dacre into the post.)

Ofcom, meanwhile, has been regulating video-on-demand services in the UK since 2010.

But the video-sharing framework is a separate regulatory instrument, intended to reflect the difference in the level of control involved: video-sharing platforms provide tools for users to upload their own content, rather than curating a catalog themselves.

However, this framework is itself expected to be superseded by legislation under the incoming online safety regulatory framework.

So these regulations for video-sharing platforms are something of a stopgap, and a taster of what’s to come, as UK lawmakers work to establish more comprehensive online safety rules that will apply far more widely.

Yet in the guidance, Ofcom describes the VSP regime as “an important precursor to future online safety legislation”, adding: “Given the two regimes’ shared aim of improving user safety by requiring services to protect users through the adoption of appropriate systems and processes, Ofcom believes that compliance with the VSP regime will help services prepare to comply with the online safety regime as set out by the government in the draft Online Safety Bill.”

UK data protection regulation also already applies a set of ‘age-appropriate’ design requirements to digital services that are likely to be accessed by children.