
Ofcom Watch



Propaganda victory...

Chinese propaganda channel CGTN works round Ofcom's ban and will now again be available across Europe


Link Here 10th April 2021
Full story: Diplomatic Censorship at Ofcom...Ofcom get caught up in international relations
China's state propaganda channel CGTN could soon be back on British TV screens, as French authorities have agreed to regulate the channel, overriding the decision by the UK TV censor Ofcom to ban it.

Ofcom decided to pull CGTN off air in February after finding it unacceptable that the channel is editorially controlled by the Chinese Communist Party.

France does not have rules that prohibit state-controlled broadcasters from airing in the country.

But now that the channel is officially regulated by another Council of Europe country, Ofcom is bound by treaty to accept that CGTN can broadcast to Britain.

The treaty between members of the Council of Europe, a 47-member organisation that is separate from the EU and therefore not affected by Brexit, mandates that an international broadcaster can beam into the territories of signatories as long as it falls under the jurisdiction of one member.

That said, it is not yet clear whether Sky will include the channel in its package, although Sky currently carries the channel on its networks in Italy and Germany.

 

 

Religious nonsense...

Ofcom fines Loveworld religious TV channel for broadcasting nonsense theories about coronavirus


Link Here 2nd April 2021
Full story: Ofcom on Religion...ofcom keep religious extremism in check
Ofcom has imposed a fine of £125,000 on Loveworld Limited after a programme broadcast on its religious service Loveworld Television Network featured inaccurate and supposedly potentially harmful claims about the Coronavirus without providing adequate protection for viewers. This was the second time in a year that the broadcaster has broken rules on accuracy in news, and on harm, in its coverage of the Coronavirus.

Ofcom considered these breaches to be serious, repeated and reckless, warranting the imposition of a statutory sanction beyond the direction to broadcast a statement of its findings that Ofcom issued in its Decision published on 15 January 2021.

Ofcom's investigation found that the 29-hour programme, Global Day of Prayer, included statements claiming that the pandemic is a planned event created by the deep state for nefarious purposes, and that the vaccine is a sinister means of administering nanochips to control and harm people. Some statements claimed that fraudulent testing had been carried out to deceive the public about the existence of the virus and the scale of the pandemic. Others linked the cause of Covid-19 to the roll out of 5G technology.

Ofcom was particularly concerned that this breach followed previous, similar breaches in 2020, during the investigation of which Loveworld Limited gave Ofcom a number of assurances as to how it would improve its compliance procedures.

 

 

Ofcom thinks it can 'regulate' cancel culture, PC lynch mobs and the kangaroo courts of wokeness...

The new internet censor sets out its stall for the censorship of video sharing platforms


Link Here 24th March 2021
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing
Ofcom has published its upcoming censorship rules for video sharing platforms and invites public responses until 2nd June 2021. As a bit of self-justification for its censorship, Ofcom has commissioned a survey finding that users of YouTube and the like are calling out for Ofcom censorship. Ofcom writes:

A third of people who use online video-sharing services have come across hateful content in the last three months, according to a new study by Ofcom.

The news comes as Ofcom proposes new guidance for sites and apps known as 'video-sharing platforms' (VSPs), setting out practical steps to protect users from harmful material.

VSPs are a type of online video service where users can upload and share videos with other members of the public. They allow people to engage with a wide range of content and social features.

Under laws introduced by Parliament last year, VSPs established in the UK must take measures to protect under-18s from potentially harmful video content; and all users from videos likely to incite violence or hatred, as well as certain types of criminal content. Ofcom's job is to enforce these rules and hold VSPs to account.

The draft guidance is designed to help these companies understand what is expected of them under the new rules, and to explain how they might meet their obligations in relation to protecting users from harm.

Harmful experiences uncovered

To inform our approach, Ofcom has researched how people in the UK use VSPs, and their claimed exposure to potentially harmful content. Our major findings are: 

  • Hate speech. A third of users (32%) say they have witnessed or experienced hateful content. Hateful content was most often directed towards a racial group (59%), followed by religious groups (28%), transgender people (25%) and those of a particular sexual orientation (23%).

  • Bullying, abuse and violence. A quarter (26%) of users claim to have been exposed to bullying, abusive behaviour and threats, and the same proportion came across violent or disturbing content.

  • Racist content. One in five users (21%) say they witnessed or experienced racist content, with levels of exposure higher among users from minority ethnic backgrounds (40%), compared to users from a white background (19%). 

  • Most users encounter potentially harmful videos of some sort. Most VSP users (70%) say they have been exposed to a potentially harmful experience in the last three months, rising to 79% among 13-17 year-olds.

  • Low awareness of safety measures. Six in 10 VSP users are unaware of platforms' safety and protection measures, while only a quarter have ever flagged or reported harmful content.

Guidance for protecting users

As Ofcom begins its new role regulating video-sharing platforms, we recognise that the online world is different to other regulated sectors. Reflecting the nature of video-sharing platforms, the new laws in this area focus on measures providers must consider taking to protect their users -- and they afford companies flexibility in how they do that.

The massive volume of online content means it is impossible to prevent every instance of harm. Instead, we expect VSPs to take active measures against harmful material on their platforms. Ofcom's new guidance is designed to assist them in making judgements about how best to protect their users. In line with the legislation, our guidance proposes that all video-sharing platforms should provide:

  • Clear rules around uploading content. VSPs should have clear, visible terms and conditions which prohibit users from uploading the types of harmful content set out in law. These should be enforced effectively.

  • Easy flagging and complaints for users. Companies should implement tools that allow users to quickly and effectively report or flag harmful videos, signpost how quickly they will respond, and be open about any action taken. Providers should offer a route for users to formally raise issues or concerns with the platform, and to challenge decisions through dispute resolution. This is vital to protect the rights and interests of users who upload and share content.

  • Restricting access to adult sites. VSPs with a high prevalence of pornographic material should put in place effective age-verification systems to restrict under-18s' access to these sites and apps.

Enforcing the rules

Ofcom's approach to enforcing the new rules will build on our track record of protecting audiences from harm, while upholding freedom of expression. We will consider the unique characteristics of user-generated video content, alongside the rights and interests of users and service providers, and the general public interest.

If we find a VSP provider has breached its obligations to take appropriate measures to protect users, we have the power to investigate and take action against a platform. This could include fines, requiring the provider to take specific action, or -- in the most serious cases -- suspending or restricting the service. Consistent with our general approach to enforcement, we may, where appropriate, seek to resolve or investigate issues informally first, before taking any formal enforcement action.

Next steps

We are inviting all interested parties to comment on our proposed draft guidance, particularly services which may fall within scope of the regulation, the wider industry and third-sector bodies. The deadline for responses is 2 June 2021. Subject to feedback, we plan to issue our final guidance later this year. We will also report annually on the steps taken by VSPs to comply with their duties to protect users.

NOTES

Ofcom has been given new powers to regulate UK-established VSPs. VSP regulation sets out to protect users of VSP services from specific types of harmful material in videos. Harmful material falls into two broad categories under the VSP Framework, which are defined as:

  • Restricted Material, which refers to videos which have been, or would be likely to be, given an R18 certificate, or which have been or would likely be refused a certificate. It also includes other material that might impair the physical, mental or moral development of under-18s.

  • Relevant Harmful Material, which refers to any material likely to incite violence or hatred against a group of persons or a member of a group of persons based on particular grounds. It also refers to material the inclusion of which would be a criminal offence under laws relating to terrorism; child sexual abuse material; and racism and xenophobia.

The Communications Act sets out the criteria for determining jurisdiction of VSPs, which are closely modelled on the provisions of the Audiovisual Media Services Directive. A VSP will be within UK jurisdiction if it has the required connection with the UK. It is for service providers to assess whether a service meets the criteria and to notify Ofcom that they fall within scope of the regulation. We recently published guidance about the criteria to assist them in making this assessment. In December 2020, the Government confirmed its intention to appoint Ofcom as the regulator of the future online harms regime. It re-stated its intention for the VSP Framework to be superseded by the regulatory framework in new Online Safety legislation.

 

 

Updated: All men are rapists...

So peer Floella Benjamin attempts to revive porn age verification censorship because porn viewing is just one step away from park murder


Link Here 17th March 2021
The pro-censorship member of the House of Lords has tabled the following amendment to the Domestic Abuse Bill to reintroduce the internet porn censorship and age verification requirements previously dropped by the government in October 2019.

Amendment 87a introduces a new clause:

Impact of online pornography on domestic abuse

  1. Within three months of the day on which this Act is passed, the Secretary of State must commission a person appointed by the Secretary of State to investigate the impact of access to online pornography by children on domestic abuse.

  2. Within three months of their appointment, the appointed person must publish a report on the investigation which may include recommendations for the Secretary of State.

  3. As part of the investigation, the appointed person must consider the extent to which the implementation of Part 3 of the Digital Economy Act 2017 (online pornography) would prevent domestic abuse, and may make recommendations to the Secretary of State accordingly.

  4. Within three months of receiving the report, the Secretary of State must publish a response to the recommendations of the appointed person.

  5. If the appointed person recommends that Part 3 of the Digital Economy Act 2017 should be commenced, the Secretary of State must appoint a day for the coming into force of that Part under section 118(6) of the Act within the timeframe recommended by the appointed person."

Member's explanatory statement

This amendment would require an investigation into any link between online pornography and domestic abuse with a view to implementing recommendations to bring into effect the age verification regime in the Digital Economy Act 2017 as a means of preventing domestic abuse.

Update: Defeated

17th March 2021. See article from votes.parliament.uk

The amendment designed to resurrect the Age Verification clauses of the Digital Economy Act 2017 was defeated by 242 votes to 125 in the House of Lords.

The government minister concluding the debate noted that the new censorship measures included in the Online Harms Bill are more comprehensive than the measures under the Digital Economy Act 2017. He also noted that, although the upcoming censorship measures would take significant time to implement, reviving the old censorship measures would also take time.

In passing, the minister also explained that one of the main failings of the act was that site blocking would not prove effective, because porn viewers can easily evade ISP blocks by switching to encrypted DNS servers via DNS over HTTPS (DoH). Presumably government internet snooping agencies don't fancy losing the ability to snoop on the browsing habits of all those wanting to continue viewing a blocked porn site such as Pornhub.
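As a rough illustration of why DoH undermines DNS-based ISP blocking (a sketch, not anything presented in the debate): the short Python example below resolves a hostname via Cloudflare's public DNS-over-HTTPS JSON endpoint. The ISP's own DNS resolvers, where blocklists are typically applied, never see the query; all that is visible is an ordinary encrypted HTTPS connection to the third-party resolver. The hostname used is purely illustrative.

import requests

def doh_lookup(name, record_type="A"):
    # Resolve a hostname via Cloudflare's DNS-over-HTTPS JSON API,
    # bypassing whatever DNS servers the ISP provides.
    resp = requests.get(
        "https://cloudflare-dns.com/dns-query",
        params={"name": name, "type": record_type},
        headers={"accept": "application/dns-json"},
        timeout=10,
    )
    resp.raise_for_status()
    # Each answer record carries the resolved address in its "data" field.
    return [answer["data"] for answer in resp.json().get("Answer", [])]

# The ISP sees only an HTTPS connection to cloudflare-dns.com, not the name
# being looked up, so a DNS-level blocklist entry for a site has no effect.
print(doh_lookup("example.com"))

DNS filtering is, of course, only one of several easily bypassed control points; a determined viewer could equally use a VPN, which is presumably part of why the minister considered the old site-blocking scheme ineffective.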

