Ofcom has published its upcoming censorship rules for video-sharing platforms and invites public responses until 2nd June 2021. As a bit of self-justification for its censorship, Ofcom has commissioned a survey finding that users of YouTube and the like are calling out for Ofcom censorship. Ofcom writes:
A third of people who use online video-sharing services have come across hateful content in the last three months, according to a new study by Ofcom.
The news comes as Ofcom
proposes new guidance for sites and apps known as 'video-sharing platforms' (VSPs), setting out
practical steps to protect users from harmful material.
VSPs are a type of online video service where users can upload and share videos with other members of the public. They allow people to engage with a wide range of content and social features.
Under laws introduced by Parliament last year, VSPs established in the UK must take measures to protect under-18s from potentially harmful video content, and all users from videos likely to incite violence or hatred, as well as certain types of criminal content. Ofcom's job is to enforce these rules and hold VSPs to account.
The draft guidance is designed to help these companies understand what is expected of them under the new
rules, and to explain how they might meet their obligations in relation to protecting users from harm.
Harmful experiences uncovered
To inform our approach, we have researched how people in the UK use VSPs, and their claimed exposure to potentially harmful content. Our major findings are:
Hate speech. A third of users (32%) say they have witnessed or experienced hateful content. Hateful content was most often directed towards a racial group (59%), followed by religious groups (28%), transgender people (25%)
and those of a particular sexual orientation (23%).
Bullying, abuse and violence. A quarter (26%) of users claim to have been exposed to bullying, abusive behaviour and threats, and the same proportion came across
violent or disturbing content.
Racist content. One in five users (21%) say they witnessed or experienced racist content, with levels of exposure higher among users from minority ethnic backgrounds (40%), compared to
users from a white background (19%).
Most users encounter potentially harmful videos of some sort. Most VSP users (70%) say they have been exposed to a potentially harmful experience in the last three months,
rising to 79% among 13-17 year-olds.
Low awareness of safety measures. Six in 10 VSP users are unaware of platforms' safety and protection measures, while only a quarter have ever flagged or reported harmful content.
Guidance for protecting users
As Ofcom begins its new role regulating video-sharing platforms, we recognise that the online world is different to other regulated sectors. Reflecting the nature of video-sharing platforms, the new laws in this area focus on measures providers must consider taking to protect their users, and they afford companies flexibility in how they do that.
The massive volume of online content means it
is impossible to prevent every instance of harm. Instead, we expect VSPs to take active measures against harmful material on their platforms. Ofcom's new guidance is designed to assist them in making judgements about how best to protect their users. In
line with the legislation, our guidance proposes that all video-sharing platforms should provide:
Clear rules around uploading content. VSPs should have clear, visible terms and conditions which prohibit users from uploading the types of harmful content set out in law. These should be enforced effectively.
Easy flagging and complaints for users. Companies should implement tools that allow users to quickly and effectively report or flag harmful videos, signpost how quickly they will respond, and be open about any action taken.
Providers should offer a route for users to formally raise issues or concerns with the platform, and to challenge decisions through dispute resolution. This is vital to protect the rights and interests of users who upload and share content.
Restricting access to adult sites. VSPs with a high prevalence of pornographic material should put in place effective age-verification systems to restrict under-18s' access to these sites and apps.
Enforcing the rules
Ofcom's approach to enforcing the new rules will build on our track record of protecting audiences from harm, while upholding freedom of expression. We will consider the unique
characteristics of user-generated video content, alongside the rights and interests of users and service providers, and the general public interest.
If we find a VSP provider has breached its obligations to take appropriate measures to protect users, we have the power to investigate and take action against a platform. This could include fines, requiring the provider to take specific action, or, in the most serious cases, suspending or restricting the service. Consistent with our general approach to enforcement, we may, where appropriate, seek to resolve or investigate issues informally before taking any formal enforcement action.
We are inviting
all interested parties to comment on our proposed draft guidance, particularly services which may fall within scope of the regulation, the wider industry and third-sector bodies. The deadline for responses is 2 June 2021. Subject to feedback, we plan to
issue our final guidance later this year. We will also report annually on the steps taken by VSPs to comply with their duties to protect users.
Ofcom has been given new powers to regulate
UK-established VSPs. VSP regulation sets out to protect users of VSP services from specific types of harmful material in videos. Harmful material falls into two broad categories under the VSP Framework, which are defined as:
Restricted Material, which refers to videos that have been, or would likely be, given an R18 certificate, or that have been, or would likely be, refused a certificate. It also includes other material that might impair the physical, mental or moral development of under-18s.
Relevant Harmful Material, which refers to any material likely to incite violence or hatred against a group of persons, or a member of a group of persons, based on particular grounds. It also refers to material the inclusion of which would be a criminal offence under laws relating to terrorism; child sexual abuse material; and racism and xenophobia.
The Communications Act sets out the criteria for determining the jurisdiction of VSPs, which are closely modelled on the provisions of the Audiovisual Media Services Directive. A VSP will be within UK jurisdiction if it has the required connection with the UK. It is for service providers to assess whether a service meets the criteria and to notify Ofcom that they fall within scope of the regulation. We recently published guidance about the criteria to assist them in making this assessment. In December 2020, the Government confirmed its intention to appoint Ofcom as the regulator of the future online harms regime. It restated its intention for the VSP Framework to be superseded by the regulatory framework in new Online Safety legislation.