
Ofcom Video Sharing Censors


Video on Demand and video sharing


 

Offsite Article: Age/ID Verified by Google...


Link Here 14th January 2024
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing
Ofcom speaks of behind-the-scenes discussions on international age verification

See article from ofcom.org.uk

 

 

More censors...

Ofcom investigates BitChute as an early test case of internet censorship


Link Here 7th October 2023
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing
BitChute is a British based video sharing platform that is particularly well known for hosting content that has been banned from more censorial websites, notably YouTube.

The name was conceived from a portmanteau of the words 'bit', a unit of information in computing, and 'parachute'. At the time of the site's launch, founder Ray Vahey described BitChute as an alternative to mainstream platforms; he believed these platforms had demonstrated increased levels of censorship over the previous few years by banning and demonetising users (barring them from receiving advertising revenue), and tweaking algorithms to send certain content into obscurity. In 2018, the creators of BitChute described themselves as a small team making a stand against Internet censorship 'because we believe it is the right thing to do'.

Of course right-leaning opinion does not sit well with the British establishment, so it isn't a surprise to see BitChute as an early example of interest from the UK internet censor, Ofcom. Note that censorship powers have already been granted to Ofcom for video sharing platforms stupid enough to be based in Britain. So these platforms are an interesting forerunner to how Ofcom will censor the wider internet when powers from the Online Censorship Bill are enacted.

Ofcom writes:

Investigation into BitChute Limited

Case considered: 3 October 2023

Summary

Compliance assurances from BitChute regarding its obligations under Part 4B of the Communications Act 2003: Improvements to the measures BitChute has in place to protect users from videos containing harmful material.

Ofcom's role is to ensure video-sharing platforms (VSPs) based in the UK have appropriate systems and processes in place as required under Part 4B of the Act to effectively protect their users from harmful video content in scope of the VSP regime.

In May 2022, a far-right extremist carried out a racially motivated attack in Buffalo, New York. Ofcom conducted analysis of the measures in place to protect users from harmful material on several VSPs, including BitChute, in light of this incident.

Our analysis of BitChute's platform raised concerns that some of its measures were not effectively protecting users from encountering videos related to terrorism and other harmful material prohibited under the VSP regime.

Following a period of close engagement with BitChute to discuss its compliance with its obligations under Part 4B of the Communications Act 2003, it has made some important changes and also committed to further improvements to protect users from harmful material.

Background

On 14 May 2022, a far-right extremist carried out a racially motivated attack in Buffalo, New York, killing ten people and wounding three others. The attacker livestreamed the shooting online, and versions of the footage were distributed on multiple online services, including BitChute and other UK-based VSPs that we currently regulate. This resulted in UK users being potentially exposed to harmful material related to terrorism and material likely to incite violence and hatred.

Ofcom's role is to ensure VSPs have appropriate systems and processes in place as required under Part 4B of the Act to effectively protect their users from harmful video content in scope of the VSP regime. Our approach to securing compliance focuses on oversight, accountability, and transparency, working with the industry where possible to drive improvements, as well as taking formal enforcement action where appropriate.

Our concerns

In the weeks following the Buffalo attack, we engaged with relevant VSPs and the wider industry to learn more about how platforms can set up internal systems and processes to prevent the livestreaming of such attacks and protect users from the sharing of associated video content. In October 2022, Ofcom published a report on the Buffalo attack that explored how footage of the attack, and related material, came to be disseminated online, and the implications of this for platforms' efforts to keep people safe online.

Our analysis raised concerns that BitChute's reporting and flagging measures were not effectively protecting users from encountering videos related to terrorism and other harmful material prohibited under the VSP regime. In particular, Ofcom was concerned that BitChute's reporting function was not open to non-registered users, and that the capacity and coverage of BitChute's content moderation team was insufficient to enable it to respond promptly to reports of harmful content.

BitChute's commitments

In response to our concerns, BitChute has made some important changes to its reporting and flagging mechanisms and to the content moderation processes which underpin these, as well as committing to further changes.

1. Coverage and capacity of content moderation

In our 2022 VSP Report, published in October, we found that all VSPs, including BitChute, have adequate terms and conditions that prohibit material that would come within the scope of laws relating to terrorism, racism, and xenophobia, as well as material likely to incite violence or hatred.

However, the Buffalo attack exposed key deficiencies in BitChute's ability to effectively enforce its terms and conditions relating to hate and terror content: footage of the shooting was easily accessible on the platform in the days after the attack, and we learnt that the platform's content moderation team was modest in size and limited to certain working hours. This restricted BitChute's ability to respond quickly to reports that footage was on the platform following the attack.

BitChute has committed to triple the capacity of its moderation team by taking on more human moderators. It is also extending the coverage of its moderation team by increasing the number of hours that moderators are available to review reports and has committed to having a safety team operational 24/7 in autumn 2023.

2. User reporting and flagging mechanisms

Prior to the Buffalo attack, BitChute had reporting and flagging mechanisms in place to allow users to report potentially harmful content. However, on-platform flagging was only available to users who had a registered BitChute account. While all users (registered or unregistered) were able to report content by sending an email to BitChute, we were concerned that requiring non-registered users to email the platform, rather than click a reporting button next to the video, introduces a layer of friction to the reporting process that could disincentivise the user from making a report and increase the time taken to respond to reports.

As a result of our remediation work, BitChute has changed the design of its platform to allow non-registered users to directly report potentially harmful content. It has also updated its user-facing guidelines to set out more clearly what registered and non-registered users can expect from the flagging and reporting process.
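As a rough sketch only, the snippet below illustrates the kind of single reporting path this remediation work points towards: registered and non-registered users flag a video through the same mechanism, with no email fallback adding friction. All names here (Report, handle_report, REPORT_QUEUE) are hypothetical and are not drawn from BitChute's actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Hypothetical sketch: one reporting path shared by registered and
# unregistered users, so flagging a video never requires an account or an email.

@dataclass
class Report:
    video_id: str
    reason: str
    reporter_id: Optional[str] = None   # None for non-registered users
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

REPORT_QUEUE: list[Report] = []

def handle_report(video_id: str, reason: str, reporter_id: Optional[str] = None) -> Report:
    """Accept a report from any visitor; registration only adds attribution."""
    report = Report(video_id=video_id, reason=reason, reporter_id=reporter_id)
    REPORT_QUEUE.append(report)          # picked up by the moderation team
    return report

# Both calls follow the same path -- no extra friction for non-registered users.
handle_report("abc123", "terror-related content", reporter_id="user42")
handle_report("abc123", "terror-related content")   # anonymous visitor
```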

3. Measuring effectiveness

BitChute has also committed to collecting additional metrics to measure the impact of changes made to its systems and processes, including the volume of content review reports raised each day and average response time in minutes for content reports. These metrics will help BitChute and Ofcom to evaluate the effectiveness of the platform's measures more easily.

We have also encouraged BitChute to implement additional reporting on risk metrics, which measure the risk of harmful material being encountered on the platform, and process metrics, which measure the effectiveness of BitChute's moderation systems.
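To make concrete the metrics BitChute has committed to collect (the volume of content reports raised each day and the average response time in minutes), here is a minimal sketch of how they could be computed from a log of report timestamps. The log format is assumed for illustration; it is not BitChute's or Ofcom's actual schema.

```python
from collections import Counter
from datetime import datetime

# Hypothetical report log: (submitted_at, resolved_at) pairs.
report_log = [
    (datetime(2023, 9, 1, 10, 0), datetime(2023, 9, 1, 10, 45)),
    (datetime(2023, 9, 1, 14, 30), datetime(2023, 9, 1, 16, 0)),
    (datetime(2023, 9, 2, 9, 15), datetime(2023, 9, 2, 9, 40)),
]

# Volume of content reports raised each day.
reports_per_day = Counter(submitted.date() for submitted, _ in report_log)

# Average response time in minutes across resolved reports.
response_minutes = [(resolved - submitted).total_seconds() / 60
                    for submitted, resolved in report_log]
average_response = sum(response_minutes) / len(response_minutes)

print(dict(reports_per_day))        # {datetime.date(2023, 9, 1): 2, datetime.date(2023, 9, 2): 1}
print(round(average_response, 1))   # 53.3 (minutes)
```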

Our response

Taking into account BitChute's willingness to make timely improvements to its systems and processes to directly address the concerns we identified following the Buffalo incident, and our desire to work with industry to secure changes that protect users, we have decided not to open an investigation against BitChute into its compliance with its duties under Part 4B of the Communications Act 2003 at this time. We will, however, closely monitor the implementation of the proposed changes and the impact these changes have on user safety.

We also note that, on 14 June 2023, BitChute became a member of the Global Internet Forum to Counter Terrorism (GIFCT). GIFCT is a cross-industry initiative designed to prevent terrorists and violent extremists from exploiting digital platforms. Whilst we do not consider this an indicator of compliance, it is an encouraging step -- GIFCT has rigorous standards for membership, including demonstrating "a desire to explore new technical solutions to counter terrorist and violent extremist activity online" and "support for expanding the capacity of civil society organisations to challenge terrorism and violent extremism".

While we welcome BitChute's commitments to further improvements and measuring their effectiveness, we are aware of reports -- some of which have been communicated to us directly -- alleging that content likely to incite violence and hatred continues to be uploaded to BitChute, can be accessed easily, and may pose significant risks to users.

It is important to note that the VSP regime is a systems and processes regime, meaning the presence of harmful video content on a service is not in itself a breach of the rules. Accordingly, Ofcom's focus is to drive improvements to platforms' systems and processes to minimise the risks of users encountering harmful videos online in the first place.

However, such content can be indicative of an underlying issue with the user protections in place, and we will therefore continue to monitor BitChute closely to assess whether the changes it has made to its user reporting and content moderation systems result in tangible improvements to user safety. If we find that, despite BitChute's assurances and improvements, users are not being adequately protected from the categories of harmful material covered by the VSP regime, we will not hesitate to take further action, including formal enforcement action if necessary.

 

 

Sharing censorship news...

Ofcom warns adult video sharing websites that are stupid enough to be based in Britain that it will soon be enforcing age/identity verification


Link Here 15th January 2023
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing

One of our priorities for the second year of the video-sharing platform (VSP) regime is to promote the implementation of robust age assurance, so that children are protected from the most harmful content. In October 2022, we published our report on the first year of VSP regulation . The report highlighted that many platforms that specialise in videos containing pornographic material (or "adult VSPs") do not appear to have measures that are robust enough to stop children accessing pornographic material.

Today Ofcom is opening an enforcement programme into age assurance measures across the adult VSP sector.

Our objectives for this programme are:

  • to assess the age assurance measures implemented by notified adult VSPs, to ensure they are sufficiently robust to prevent under-18s from accessing videos containing pornographic material;

  • to identify whether there are other platforms in the adult VSP sector that may fall in scope of the VSP regime but:

    • have not yet notified their service to Ofcom, as required under the VSP framework (see more below); and

    • may not have appropriate measures in place to protect under-18s from pornographic content; and

  • to understand from providers of adult VSP services the challenges they have faced when considering implementing any age assurance measures. This will also help us build a picture of what measures work and are proportionate to expect from different VSPs, in line with our strategic priority of driving forward the implementation of robust age assurance.

The programme will seek to determine the scale of any compliance concerns in respect of notified and non-notified adult VSPs. We will then decide whether any further action (including enforcement) is needed, and how best to address potential harm.

 

 

Harming British industry...

UK's internet censor threatens that the few adult video sharing websites that are stupid enough to be based in Britain should introduce onerous age verification


Link Here 21st October 2022
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing

UK adult sites not doing enough to protect children

Smaller adult video-sharing sites based in the UK do not have sufficiently robust access control measures in place to stop children accessing pornography, Ofcom has found in a new report.

Ahead of our future duties in the Online Safety Bill, Ofcom already has some powers to regulate video-sharing platforms (VSPs) established in the UK, which are required by law to take measures to protect people using their sites and apps from harmful videos.

Nineteen companies have notified us that they fall within our jurisdiction. They include TikTok, Snapchat, Twitch, Vimeo, OnlyFans and BitChute; as well as several smaller platforms, including adult sites.

Ofcom is concerned that smaller UK-based adult sites do not have robust measures in place to prevent children accessing pornography. They all have age verification measures in place when users sign up to post content. However, users can generally access adult content just by self-declaring that they are over 18.

One smaller adult platform told us that it had considered implementing age verification, but had decided not to as it would reduce the profitability of the business.

However, the largest UK-based site with adult content, OnlyFans, has responded to regulation by adopting age verification for all new UK subscribers, using third-party tools provided by Yoti and Ondato.
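The contrast Ofcom draws here, between access gated only by a self-declaration tick-box and access gated by a check from a third-party provider, can be sketched roughly as below. The AgeVerifier interface is a hypothetical stand-in and is not the actual Yoti or Ondato API.

```python
from typing import Protocol

class AgeVerifier(Protocol):
    """Hypothetical third-party age-assurance provider (a stand-in, not a real API)."""
    def verify(self, user_token: str) -> bool: ...

def self_declared_access(claims_over_18: bool) -> bool:
    # The approach Ofcom criticises: access depends only on the user's own claim.
    return claims_over_18

def verified_access(verifier: AgeVerifier, user_token: str) -> bool:
    # The approach described for OnlyFans: an independent third-party check
    # gates access to adult content for new UK subscribers.
    return verifier.verify(user_token)
```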

According to new research we have published today, most people (81%) do not mind proving their age online in general, with a majority (78%) expecting to have to do so for certain online activities. A similar proportion (80%) feel internet users should be required to verify their age when accessing pornography online, especially on dedicated adult sites.

Over the next year, adult sites that we already regulate must have in place a clear roadmap to implementing robust age verification measures. If they don't, they could face enforcement action. Under future online safety laws, Ofcom will have broader powers to ensure that many more services are protecting children from adult content.

Some progress protecting users, but more to be done

We have seen some companies make positive changes more broadly to protect users from harmful content online, including as a direct result of being regulated under the existing laws. For example:

TikTok now categorises content that may be unsuitable for younger users, to prevent them from viewing it. It has also established an Online Safety Oversight Committee, which provides executive oversight of content and safety compliance specifically within the UK and EU.

Snapchat recently launched a parental control feature, Family Center, which allows parents and guardians to view a list of their child's conversations without seeing the content of the message.

Vimeo now allows only material rated 'all audiences' to be visible to users without an account. Content rated 'mature' or unrated is now automatically put behind the login screen.

BitChute has updated its terms and conditions and increased the number of people overseeing and -- if necessary -- removing content.

However, it is clear that many platforms are not sufficiently equipped, prepared and resourced for regulation. We have recently opened a formal investigation into one firm, Tapnet Ltd -- which operates adult site RevealMe -- in relation to its response to our information request.

We also found that companies are not prioritising risk assessments of their platforms, which we consider fundamental to proactively identifying and mitigating risks to users. This will be a requirement on all regulated services under future online safety laws.

Over the next twelve months, we expect companies to set and enforce effective terms and conditions for their users, and quickly remove or restrict harmful content when they become aware of it. We will review the tools provided by platforms to their users for controlling their experience, and expect them to set out clear plans for protecting children from the most harmful online content, including pornography.

 

 

Unrevealing...

Ofcom picks RevealMe.com seemingly as a start point to enforce ID verification requirements for adult content


Link Here 30th September 2022
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing
RevealMe.com is a streaming service along the lines of OnlyFans that allows models to provide adult streaming videos and other content to subscribing fans.

Ofcom has announced that it is investigating the company for failing to provide information as to how it is implementing age/ID verification ('protecting users' in Ofcom speak).

Ofcom writes:

Ofcom has been the regulator of UK established video sharing platforms (VSPs) since November 2020. Earlier this year, Ofcom issued a number of information requests to VSPs to obtain information on the measures taken by VSPs to protect users.

On 29 September 2022, Ofcom opened an investigation into Tapnet Ltd, which provides the VSP RevealMe.

This investigation concerns Tapnet's compliance with an information request notice, issued on 6 June 2022 under section 368Z10 of the Communications Act 2003. Tapnet was required to respond to the Notice by no later than 4 July 2022. As of 29 September 2022, Tapnet had not provided a response to the Notice.

The Notice explained that the reason for requesting the information was to understand and monitor the measures VSPs have in place to protect users and to publish a report under section 368Z11 of the Act.

Ofcom's investigation will examine whether there are reasonable grounds for believing that Tapnet has failed to comply with its statutory duties in relation to Ofcom's information request. Ofcom will provide updates on this page as we progress this investigation.

 

 

Ofcom thinks it can 'regulate' cancel culture, PC lynch mobs and the kangaroo courts of wokeness...

The new internet censor sets out its stall for the censorship of video sharing platforms


Link Here 24th March 2021
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing
Ofcom has published its upcoming censorship rules for video sharing platforms and invites public responses up until 2nd June 2021. For a bit of self-justification for its censorship, Ofcom has commissioned a survey to find that YouTube users and the like are calling out for Ofcom censorship. Ofcom writes:

A third of people who use online video-sharing services have come across hateful content in the last three months, according to a new study by Ofcom.

The news comes as Ofcom proposes new guidance for sites and apps known as 'video-sharing platforms' (VSPs), setting out practical steps to protect users from harmful material.

VSPs are a type of online video service where users can upload and share videos with other members of the public. They allow people to engage with a wide range of content and social features.

Under laws introduced by Parliament last year, VSPs established in the UK must take measures to protect under-18s from potentially harmful video content; and all users from videos likely to incite violence or hatred, as well as certain types of criminal content. Ofcom's job is to enforce these rules and hold VSPs to account.

The draft guidance is designed to help these companies understand what is expected of them under the new rules, and to explain how they might meet their obligations in relation to protecting users from harm.

Harmful experiences uncovered

To inform our approach, Ofcom has researched how people in the UK use VSPs, and their claimed exposure to potentially harmful content. Our major findings are: 

  • Hate speech. A third of users (32%) say they have witnessed or experienced hateful content. Hateful content was most often directed towards a racial group (59%), followed by religious groups (28%), transgender people (25%) and those of a particular sexual orientation (23%).

  • Bullying, abuse and violence. A quarter (26%) of users claim to have been exposed to bullying, abusive behaviour and threats, and the same proportion came across violent or disturbing content.

  • Racist content. One in five users (21%) say they witnessed or experienced racist content, with levels of exposure higher among users from minority ethnic backgrounds (40%), compared to users from a white background (19%). 

  • Most users encounter potentially harmful videos of some sort. Most VSP users (70%) say they have been exposed to a potentially harmful experience in the last three months, rising to 79% among 13-17 year-olds.

  • Low awareness of safety measures. Six in 10 VSP users are unaware of platforms' safety and protection measures, while only a quarter have ever flagged or reported harmful content.

Guidance for protecting users

As Ofcom begins its new role regulating video-sharing platforms, we recognise that the online world is different to other regulated sectors. Reflecting the nature of video-sharing platforms, the new laws in this area focus on measures providers must consider taking to protect their users -- and they afford companies flexibility in how they do that.

The massive volume of online content means it is impossible to prevent every instance of harm. Instead, we expect VSPs to take active measures against harmful material on their platforms. Ofcom's new guidance is designed to assist them in making judgements about how best to protect their users. In line with the legislation, our guidance proposes that all video-sharing platforms should provide:

  • Clear rules around uploading content. VSPs should have clear, visible terms and conditions which prohibit users from uploading the types of harmful content set out in law. These should be enforced effectively.

  • Easy flagging and complaints for users. Companies should implement tools that allow users to quickly and effectively report or flag harmful videos, signpost how quickly they will respond, and be open about any action taken. Providers should offer a route for users to formally raise issues or concerns with the platform, and to challenge decisions through dispute resolution. This is vital to protect the rights and interests of users who upload and share content.

  • Restricting access to adult sites. VSPs with a high prevalence of pornographic material should put in place effective age-verification systems to restrict under-18s' access to these sites and apps.

Enforcing the rules

Ofcom's approach to enforcing the new rules will build on our track record of protecting audiences from harm, while upholding freedom of expression. We will consider the unique characteristics of user-generated video content, alongside the rights and interests of users and service providers, and the general public interest.

If we find a VSP provider has breached its obligations to take appropriate measures to protect users, we have the power to investigate and take action against a platform. This could include fines, requiring the provider to take specific action, or -- in the most serious cases -- suspending or restricting the service. Consistent with our general approach to enforcement, we may, where appropriate, seek to resolve or investigate issues informally first, before taking any formal enforcement action.

Next steps

We are inviting all interested parties to comment on our proposed draft guidance, particularly services which may fall within scope of the regulation, the wider industry and third-sector bodies. The deadline for responses is 2 June 2021. Subject to feedback, we plan to issue our final guidance later this year. We will also report annually on the steps taken by VSPs to comply with their duties to protect users.

NOTES

Ofcom has been given new powers to regulate UK-established VSPs. VSP regulation sets out to protect users of VSP services from specific types of harmful material in videos. Harmful material falls into two broad categories under the VSP Framework, which are defined as:

  • Restricted Material, which refers to videos which have or would be likely to be given an R18 certificate, or which have been or would likely be refused a certificate. It also includes other material that might impair the physical, mental or moral development of under-18s.

  • Relevant Harmful Material, which refers to any material likely to incite violence or hatred against a group of persons or a member of a group of persons based on particular grounds. It also refers to material the inclusion of which would be a criminal offence under laws relating to terrorism; child sexual abuse material; and racism and xenophobia.

The Communications Act sets out the criteria for determining jurisdiction of VSPs, which are closely modelled on the provisions of the Audiovisual Media Services Directive. A VSP will be within UK jurisdiction if it has the required connection with the UK. It is for service providers to assess whether a service meets the criteria and notify to Ofcom that they fall within scope of the regulation. We recently published guidance about the criteria to assist them in making this assessment. In December 2020, the Government confirmed its intention to appoint Ofcom as the regulator of the future online harms regime. It re-stated its intention for the VSP Framework to be superseded by the regulatory framework in new Online Safety legislation.

 

 

Advantaging foreign companies...

If anyone is stupid enough to base a video sharing internet service in the UK, then you will have to sign up for censorship by Ofcom before 6th May 2021. After a year you will have to pay for the privilege too


Link Here 10th March 2021
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing

Ofcom has published guidance to help providers self-assess whether they need to notify to Ofcom as UK-established video-sharing platforms.

Video-sharing platforms (VSPs) are a type of online video service which allow users to upload and share videos with the public.

Under the new VSP regulations, there are specific legal criteria which determine whether a service meets the definition of a VSP, and whether it falls within UK jurisdiction. Platforms must self-assess whether they meet these criteria, and those that do will be formally required to notify to Ofcom between 6 April and 6 May 2021. Following consultation, we have today published our final guidance to help service providers to make this assessment.
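Purely as an illustration of that self-assessment step, a provider's internal triage might look something like the sketch below. The criteria names are paraphrased for readability; the statutory tests in the Communications Act 2003 and Ofcom's published guidance are what actually determine whether a service must notify.

```python
def may_need_to_notify(provides_videos_to_the_public: bool,
                       video_sharing_is_principal_purpose_or_essential_functionality: bool,
                       has_required_connection_with_the_uk: bool) -> bool:
    """Rough first-pass triage of whether a service could be a UK-established VSP
    that must notify Ofcom. Paraphrased criteria for illustration only."""
    return (provides_videos_to_the_public
            and video_sharing_is_principal_purpose_or_essential_functionality
            and has_required_connection_with_the_uk)

# Example: a UK-based site whose main purpose is sharing user-uploaded videos.
print(may_need_to_notify(True, True, True))   # True -> check the full criteria and guidance
```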

 

 

Update: The New ATVOD...

Ofcom's first response to complaints about the content of internet adult videos


Link Here 25th February 2016
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing
Someone has been trolling around the adult videos on the clips4sale website and has fired off a few complaints about the content.

However, Ofcom has responded in the latest Complaints Bulletin by listing them as outside of its remit.

The complaints were listed under content complaints in the category: Prohibited material / notification. The services were named as:

  • Balls for Kicking (clips4sale)
  • More Balls to Kick (clips4sale)
  • Sit and Smother (clips4sale)
  • Uniformed Women in Control (clips4sale)
  • Women Rule (clips4sale)
  • Womens World (clips4sale)

 

 

Offsite Article: From ATVOD to Ofcom...


Link Here 21st February 2016
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing
Out of the Frying Pan, Into the Fire. By Jerry Barnett

See article from sexandcensorship.org

 

 

On-demand and online research: consumption and concerns...

Ofcom solicits 'concerns' of VoD viewers


Link Here 8th February 2016
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing

Ofcom regulates on-demand programme services (ODPS) that are notified and based in the UK, to ensure that providers apply the relevant standards. Ofcom also has a duty to advise the Government on the need for protection of consumers and citizens in their consumption of audio-visual services, and in particular the need to protect children.

Ofcom seeks to understand people's use of, and concerns about, notified ODPS in the broader context of all on-demand and online audio-visual services in the UK, and has therefore carried out quantitative consumer research for this purpose. Comparisons are made to the 2014 data throughout this report where relevant.

This survey covers the full range of audio-visual content that is available on demand and online: sourced either directly via the internet, via an app, or via a provider of a service; for example, programmes on BBC iPlayer, clips on YouTube and films provided by on-demand services such as Netflix.

In this report we examine online and on-demand consumption of audio-visual content among adults and teens, and their concerns regarding that content.

On the subject of viewer 'concerns', the report adds:

The top mentions in 2015 among all concerned adults include: violence (50%), welfare of children/young people (32%), bullying/victimising (31%), racism (30%), discrimination (29%), bad language (28%) and pornography (24%). Concerns regarding violence, bullying and racism have significantly increased among adults since 2014, while concerns regarding sexually explicit content have decreased.




 
