Ofcom confirm a new broadcasting code that will ban Jews from hating Nazis, religions from hating gays, feminists from hating men, progressive commentators from hating Trump, and the BBC from hating Brexiteers
31st December 2020
See article from ofcom.org.uk
report [pdf] from ofcom.org.uk
Ofcom have released a statement about new TV censorship arrangements following Brexit. Ofcom writes:
Ofcom is today confirming changes to our Broadcasting Code and Code on the Scheduling of Television
Advertising following consultation.
The changes reflect new requirements on broadcasters under the revised Audiovisual Media Services Regulations 2020, and also take account of legislative changes
following the end of the transition period for the UK's withdrawal from the European Union.
In brief, we are amending:
the definition of hate speech in Section Three (Crime, disorder, hatred and abuse) of the Broadcasting Code;
Section Nine (Commercial references on TV) of the Broadcasting Code, to reflect new
product placement provisions; and
the Code on the Scheduling of Television Advertising (COSTA), to reflect advertising provisions under the European Convention on Transfrontier Television.
We are also making other minor and administrative updates to the Broadcasting Code.
Both the revised Broadcasting Code and the revised COSTA will take effect from 23:00, 31 December 2020, when the
Brexit transition period ends.
In fact the definition of 'hate speech' is incredibly wide and seemingly covers many instances where 'hate' is currently totally acceptable, or even encouraged. Ofcom's definition is:
Meaning of "hate speech": all forms of expression which spread, incite, promote or justify hatred based on intolerance on the grounds of disability, ethnicity, social origin, sex, gender, gender reassignment, nationality,
race, religion or belief, sexual orientation, colour, genetic features, language, political or any other opinion, membership of a national minority, property, birth or age.
Ofcom also details the legal aspects of the changes:
The UK statutory framework that shapes the regulation of UK television services is changing.
The Audiovisual Media Services Regulations came into force on 1 November 2020. The AVMS Regulations
implement the revised Audiovisual Media Services Directive (AVMSD) into UK law. They amend Section 319 of the Communications Act 2003, which sets the standards objectives that underpin Ofcom's Broadcasting Code.
From 1 January
2021, the AVMS Directive itself and the country of origin principle will no longer apply as they did to UK television services that broadcast into the EU. However, the content rules set by the AVMSD prior to that date will still apply. This means both
the rules that already existed, and the ones on which we have been consulting to implement the AVMSD, will still apply, and our rules which implemented the AVMSD will be interpreted as they were before.
In addition, the European
Convention on Transfrontier Television (ECTT) framework will still apply and the legislation requires Ofcom to implement it. This means that services established in the UK and that broadcast to ECTT countries are required to comply with broadcast
standards set out in the ECTT, which include those on the amount of advertising broadcasters can transmit and where this is scheduled.
On 24 November 2020, Ofcom published a consultation on proposals for amendments to the
Broadcasting Code and COSTA resulting from the legislative changes. This statement sets out the amendments we are making in light of stakeholders' responses.
Ofcom's top programmes as judged by the number of complainers wound up
27th December 2020
See article from dailystar.co.uk
The Daily Star writes:
2020 has been a year like no other, with Ofcom receiving record complaints about some of the UK's biggest shows.
Throughout the year in telly, the British public took offence
to everything from explosive interviews, to pre-watershed violence and scenes of puking.
Britain's Got Talent received more complaints from viewers than any rival show. Raking in major viewing figures as usual, the show made
headlines on September 5 when dance troupe Diversity performed a routine to show solidarity with the Black Lives Matter movement.
Almost 28,000 people reportedly complained about the very political dance routine. Ofcom did not agree with the wrong sort of
complaints though. Good Morning Britain also racked up a hefty number of Ofcom complaints throughout the year.
By October, the show had received 9,000 complaints from viewers, with Piers Morgan at the centre of a number of concerns. In
particular, Piers' April interview with Conservative Health and Social Care Minister Helen Whately received over 3,200 calls, with the presenter being accused of bullying.
Morgan's interview with Health Secretary Matt Hancock also drew in hundreds
of complaints, as did an appearance from MP Victoria Atkins. When Piers compared the PM, Boris Johnson, to Worzel Gummidge -- a scarecrow from a kids' TV series -- another 390 picked up their phones to vent.
This Morning presenters Ruth
Langsford and Eamonn Holmes hosted a segment titled Should chemists tell their customers they are fat?
Ofcom confirmed the show had received 3,496 complaints about the discussion about pharmacists.
Sky News received 840
complaints from viewers. In August, one Sky News report was met by fury after it filmed a live broadcast of migrants crossing the Channel by sea. The broadcast was slammed by Labour MP Zarah Sultana, who said it reminded her of a grotesque reality TV
show. We should ensure people don't drown crossing the Channel, not film them as if it were some grotesque reality TV show, she said.
Emmerdale chipped in with a remark made in the lockdown special which caused a backlash, when Jimmy King
thanked the deadly virus for suspending his parenting duties. Ofcom confirmed 75 disgruntled fans had got in touch to raise their concerns.
Ofcom fines Indian channel Republic TV for hate speech against Pakistan
24th December 2020
See article from theguardian.com
A right-leaning Indian news channel known for its strong pro-government stance and firebrand host has been fined by the UK TV censor Ofcom for broadcasting hate speech about Pakistan.
Republic TV was fined £20,000 for airing a segment on its UK
service, which conveyed the view that all Pakistani people are terrorists, including their scientists, doctors, their leaders, politicians [...] Even their sports people.
The primetime show Poochta Hai Bharat aired on 6 September 2019 on
the Hindi-language version of the channel, Republic Bharat. Republic TV is one of the most widely watched channels in India, with news anchor and founder Arnab Goswami hosting aggressive current affairs debates, which regularly air rightwing opinions
while pointing and shouting at viewers down the camera.
Ofcom said it had received multiple complaints from viewers about the highly pejorative references to members of the Pakistani community (eg continually referring to them as 'filthy') on Goswami's show.
Ofcom summarised that the show failed to comply with UK broadcasting rules as it had spread, incited, promoted and justified such intolerance towards Pakistani people among viewers.
The Government outlines its final plans to introduce new and wide ranging internet censorship laws
See press release from gov.uk
also full government response to the Online Harms White Paper consultation
Digital Secretary Oliver Dowden and Home Secretary Priti Patel have announced the government's final decisions on new internet censorship laws.
New rules to be introduced for nearly all tech firms that allow users to post their own content or interact
Firms failing to protect people face fines of up to ten per cent of turnover or the blocking
of their sites and the government will reserve the power for senior managers to be held liable
Popular platforms to be held responsible for tackling both legal and illegal harms
All platforms will
have a duty of care to protect children using their services
Laws will not affect articles and comments sections on news websites, and there will be additional measures to protect free speech
The full government response to the Online Harms White Paper
consultation sets out how the proposed legal duty of care on online companies will work in practice and gives them new responsibilities towards their users. The safety of children is at the heart of the measures.
Social media sites, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content. The
Government is also progressing work with the Law Commission on whether the promotion of self harm should be made illegal.
Tech platforms will need to do far more to protect children from being exposed to harmful content or
activity such as grooming, bullying and pornography. This will help make sure future generations enjoy the full benefits of the internet with better protections in place to reduce the risk of harm.
The most popular social media
sites, with the largest audiences and high-risk features, will need to go further by setting and enforcing clear terms and conditions which explicitly state how they will handle content which is legal but could cause significant physical or psychological
harm to adults. This includes dangerous disinformation and misinformation about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice.
Ofcom is now confirmed as the
regulator with the power to fine companies failing in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher. It will have the power to block non-compliant services from being accessed in the UK.
The legislation includes provisions to impose criminal sanctions on senior managers. The government will not hesitate to bring these powers into force should companies fail to take the new rules seriously - for example, if they do not
respond fully, accurately and in a timely manner to information requests from Ofcom. This power would be introduced by Parliament via secondary legislation, and reserving the power to compel compliance follows similar approaches in other sectors such as
financial services regulation.
The government plans to bring the laws forward in an Online Safety Bill next year and set the global standard for proportionate yet effective regulation. This will safeguard people's rights online
and empower adult users to keep themselves safe while preventing companies arbitrarily removing content. It will defend freedom of expression and the invaluable role of a free press, while driving a new wave of digital growth by building trust in technology.
The new regulations will apply to any company in the world hosting user-generated content online accessible by people in the UK or enabling them to privately or publicly
interact with others online.
It includes social media, video sharing and instant messaging platforms, online forums, dating apps, commercial pornography websites, as well as online marketplaces, peer-to-peer services, consumer
cloud storage sites and video games which allow online interaction. Search engines will also be subject to the new regulations.
The legislation will include safeguards for freedom of expression and pluralism online - protecting
people's rights to participate in society and engage in robust debate.
Online journalism from news publishers' websites will be exempt, as will reader comments on such sites. Specific measures will be included in the legislation
to make sure journalistic content is still protected when it is reshared on social media platforms.
Companies will have different responsibilities for different categories of
content and activity, under an approach focused on the sites, apps and platforms where the risk of harm is greatest.
All companies will need to take appropriate steps to address illegal content and activity such as terrorism and
child sexual abuse. They will also be required to assess the likelihood of children accessing their services and, if so, provide additional protections for them. This could be, for example, by using tools that give age assurance to ensure children are
not accessing platforms which are not suitable for them.
The government will make clear in the legislation the harmful content and activity that the regulations will cover and Ofcom will set out how companies can fulfil their duty
of care in codes of practice.
A small group of companies with the largest online presences and high-risk features, likely to include Facebook, TikTok, Instagram and Twitter, will be in Category 1.
These companies will need to assess the risk of legal content or activity on their services with "a reasonably foreseeable risk of causing significant physical or psychological harm to adults". They will then need to make clear what type of
"legal but harmful" content is acceptable on their platforms in their terms and conditions and enforce this transparently and consistently.
All companies will need mechanisms so people can easily report harmful content
or activity while also being able to appeal the takedown of content. Category 1 companies will be required to publish transparency reports about the steps they are taking to tackle online harms.
Examples of Category 2 services are
platforms which host dating services or pornography and private messaging apps. Less than three per cent of UK businesses will fall within the scope of the legislation and the vast majority of companies will be Category 2 services.
Financial harms will be excluded from this framework, including fraud and the sale of unsafe goods. This will mean the regulations are clear and manageable for businesses, focus action where
there will be most impact, and avoid duplicating existing regulation.
Where appropriate, lower-risk services will be exempt from the duty of care to avoid putting disproportionate demands on businesses. This includes exemptions
for retailers who only offer product and service reviews and software used internally by businesses. Email services will also be exempt.
Some types of advertising, including organic and influencer adverts that appear on social
media platforms, will be in scope. Adverts placed on an in-scope service through a direct contract between an advertiser and an advertising service, such as Facebook or Google Ads, will be exempt because this is covered by existing regulation.
The response will set out how the regulations will apply to communication channels and services where users expect a greater degree of privacy - for example online instant
messaging services and closed social media groups which are still in scope.
Companies will need to consider the impact on user privacy and that they understand how company systems and processes affect people's privacy, but firms
could, for example, be required to make services safer by design by limiting the ability for anonymous adults to contact children.
Given the severity of the threat on these services, the legislation will enable Ofcom to require
companies to use technology to monitor, identify and remove tightly defined categories of illegal material relating to child sexual exploitation and abuse. Recognising the potential impact on user privacy, the government will ensure this is only used as
a last resort where alternative measures are not working. It will be subject to stringent legal safeguards to protect user rights.
Ofcom consults about its plans to tool up for its new roles as the UK internet censor
See article from ofcom.org.uk
Ofcom work plan [pdf] from ofcom.org.uk
Ofcom has opened a consultation on its plan to get ready for its likely role as the UK internet censor under the Government's Online Harms legislation. Ofcom writes:
The consultation ends
on 5th February 2021.
We have today published our plan of work for 2021/22. This
consultation sets out our goals for the next financial year, and how we plan to achieve them.
We are consulting on this plan of work to encourage discussion with companies, governments and the public.
As part of the Plan of Work publication, we are also holding some virtual events to invite feedback on our proposed plan. These free events are open to everyone, and offer an opportunity to comment and ask questions.
The key areas referencing internet censorship are:
Preparing to regulate online harms
3.26 The UK Government has given Ofcom new duties as the regulator for UK-established video-sharing platforms (VSPs) through the transposition of the European-wide Audiovisual Media Services Directive. VSPs are a type of online video service where users can upload and share videos with members of the public, such as YouTube and TikTok. Ofcom will not be responsible for regulating all VSPs as our duties only apply to services established in the UK and as such, we anticipate that a relatively small number of services fall within our jurisdiction. Under the new regulations, which came into force on 1 November 2020, VSPs must have appropriate measures in place to protect children from potentially harmful content and all users from criminal content and incitement to hatred and violence. VSPs will also need to make sure certain advertising standards are met.
3.27 As well as appointing Ofcom as the regulator of UK-established VSPs, the Government has announced that it is minded to appoint Ofcom as the future regulator responsible for protecting users from harmful online content. With this in mind we are undertaking the following work:
Video-sharing platforms regulation. We have issued a short guide to the new requirements. On 19 November 2020 we issued draft scope and jurisdiction guidance for consultation to help providers self-assess whether they need to notify to Ofcom as a VSP under the statutory rules from April 2021. We will also consult in early 2021 on further guidance on the risk of harms and appropriate measures as well as proposals for a co-regulatory relationship with the Advertising Standards Authority (ASA) with regards to VSP advertising. We intend to issue final versions of the guidance in summer 2021.
Preparing for the online harms regime. The UK Government has set out that it intends to put in place a regime to keep people safe online. In February 2020 it published an initial response to the 2019 White Paper setting out how it intends to develop the regime, which stated that it was minded to appoint Ofcom as the future regulator of online harms. If confirmed, these proposed new responsibilities would constitute a significant expansion to our remit, and preparing for them would be a major area of focus in 2021/22. We will continue to provide technical advice to the UK Government on its policy development process, and we will engage with Parliament as it considers legislative proposals.
3.29 We will continue work to deepen our understanding of online harms through a range of work:
Our Making Sense of Media programme. This programme will continue to provide insights on the needs, behaviours and attitudes of people online. Our other initiatives to research online markets and technologies will further
our understanding of how online harms can be mitigated.
Stepping up our collaboration with other regulators. As discussed in the Developing strong partnerships section, we will continue our joint work through the
Digital Regulators Cooperation Forum and strengthen our collaboration with regulators around the world who are also considering online harms.
Understanding VSPs. The introduction of regulation to UK-established VSPs
will provide a solid foundation to inform and develop the broader future online harms regulatory framework. This interim regime is more limited in terms of the number of regulated companies and will cover a narrower range of harms compared to the online
harms white paper proposals. However, should Ofcom be confirmed as the regulator, through our work on VSPs we will develop on-the-job experience working with newly regulated online services, developing the evidence base of online harm, and building our
internal skills and expertise.
Ofcom warns Abu Dhabi TV that it will be considered for a fine for airing a 'confession' extracted under duress in prison
23rd November 2020
See article from gulf-times.com
Ofcom's decision [pdf] from ofcom.org.uk
Qatar's National Human Rights Committee (NHRC) has welcomed the decision of the UK's TV censor, Ofcom, condemning the Abu Dhabi TV channel for broadcasting an interview that it claimed contained confessions of Qatari citizen Hamad al-Hammadi during his arbitrary
arrest and detention in Abu Dhabi prisons in 2013.
Ofcom said that the channel, a subsidiary of Abu Dhabi Media Company (ADMC), which has a licence from Ofcom, broadcast an interview on June 22, 2017 alleged to contain the confessions of a Qatari
intelligence agent, who was discrediting the UAE. Ofcom said that broadcasting the interview against al-Hammadi's will, who was tortured and ill-treated in prison, was a severe breach of the principles of fairness and privacy set out in the Ofcom
Broadcasting Code. Ofcom found that Mr Al-Hammadi was treated unjustly or unfairly in the programme as broadcast and that his privacy was unwarrantably infringed both in the obtaining of the footage of him and in its broadcast.
Ofcom considers that the breaches of Rules 7.1 and 8.1 of the Code are serious and Ofcom is therefore putting the Licensee on notice that Ofcom intends to consider the breaches for the imposition of a statutory sanction.
Ofcom fines Islam Channel for religious hate speech
6th November 2020
See report [pdf] from ofcom.org.uk
also Islam Channel slams UK MP for linking it to terror from aa.com.tr
Islam Channel is an English language satellite television channel broadcast in 136 countries worldwide, including the UK. Its output includes religious instruction programmes, current affairs, documentaries and entertainment programmes, all from an Islamic perspective.
On 11 November 2018 at 23:00 Islam Channel broadcast an episode of The Rightly Guided Khalifas, a religious education series on the history of the Qur'an, detailing its origins, its written compilation
and the measures used to preserve its original wording.
A segment of the programme ascribed a perpetually negative characteristic to Jewish people; namely corrupting Holy Books and seeking the destruction of Islam in both ancient
and more recent times. It conflated Israel and Jewish people, characterising Jewish people as tyrannical and having an evil mind. The programme also used further negative and stereotypical terms to describe Jewish people.
In Ofcom's Decision published on 7 October 2019 in issue 388 of the Broadcast and On Demand Bulletin, Ofcom's Executive found that this programme contained uncontextualised hate speech and breached Rules 2.3, 3.2 and 3.3 of the Code.
Rule 2.3: In applying generally accepted standards broadcasters must ensure that material which may cause offence is justified by the context...Such material may include, but is not limited to...discriminatory treatment or language (for example on the grounds of...race, religion or belief...).
Rule 3.2: Material which contains hate speech must not be included in television...programmes...except where it is justified by the context.
Rule 3.3: Material which contains abusive or derogatory treatment of individuals, groups, religions or communities, must not be included in television...services...except where it is justified by the context....
Ofcom put the Licensee on notice that it considered these breaches to be serious, and that it would consider them for the imposition of a statutory sanction.
Ofcom's Decision is to impose a financial penalty of
£20,000, to direct the Licensee to broadcast a statement of Ofcom's findings on a date and in a form to be determined by Ofcom and not to repeat the programme without edits to remove content in breach of the Code.
Ofcom publishes its censorship guidelines to be applied to UK based video sharing platforms
21st October 2020
See article from ofcom.org.uk
censorship guidelines [pdf] from ofcom.org.uk
Ofcom has published its burdensome censorship rules that will apply to video sharing platforms that are stupid enough to be based in the UK. In particular the rules are quite vague about age verification requirements for the two adult video sharing sites
that remain in the UK. Maybe Ofcom is a bit shy about requiring onerous and unviable red tape of British companies trying to compete with large numbers of foreign companies that operate with a massive commercial advantage of not having age verification.
Ofcom do however note that these censorship rules are a stop gap until a wider-scoped 'online harms' censorship regime starts up in the next couple of years.
Video-sharing platforms (VSPs) are a type of online video service which allows users to upload and share videos with members of the public.
From 1 November 2020, UK-established VSPs will be required to comply with new rules around protecting users from harmful content.
The main purpose of the new regulatory regime is to protect consumers who engage with VSPs from the risk of viewing harmful content. Providers must have appropriate measures in place to protect minors from content
which might impair their physical, mental or moral development; and to protect the general public from criminal content and material likely to incite violence or hatred.
Ofcom has published a short guide outlining the new
statutory requirements on providers. The guide is intended to assist platforms to determine whether they fall in scope of the new regime and to understand what providers need to do to ensure their services are compliant.
It also explains how Ofcom expects to approach its new duties in the period leading up to the publication of further guidance on the risk of harms and appropriate measures, which we will consult on in early 2021.
Ofcom will also be
consulting on guidance on scope and jurisdiction later in 2020. VSP providers will be required to notify their services to Ofcom from 6 April 2021 and we expect to have the final guidance in place ahead of this time.
Ofcom allows the word 'nigger' when used in a daytime TV debate about the word
12th October 2020
See article from ofcom.org.uk
Good Morning Britain
ITV 22 June 2020, 08:15
Good Morning Britain (GMB) is a weekday morning news programme broadcast on ITV.
At 08:15 on 22 June 2020, GMB featured a live discussion about plans by the
Rugby Football Union (RFU) to review the use of the song Swing Low, Sweet Chariot at England rugby matches because of its association with slavery.
Alongside GMB's regular presenters, Piers Morgan and Susanna Reid, the two
guests contributing to the discussion were the Deputy Editor of Spiked Online, Tom Slater, and lawyer and political activist, Dr Shola Mos-Shogbamimu. The discussion was wide-ranging and both contributors provided their views on topics including whether
there was sufficient awareness in society about the background of historic figures that are memorialised in public statues and the origins of certain songs. The item also featured discussion on whether a focus on songs and statues was a distraction from
the central issues of institutional and systemic racism, and whether freedom of speech was under threat.
During part of the discussion Piers Morgan described the offensive word as the n-word and in her response to the question Dr
Mos-Shogbamimu said the word in full twice, eg:
Now I don't use the 'n-word', and when I say 'n-word' I mean the 'nigger' word, but I understand that it has become, to your point, rap stars and black youths have
almost taken that word and turned it on its head and use it either, you know as friends use to each other and also use it in a way that is not necessarily friendly.
ITV explained that given the context in which the
word had been used and the fact it was used by a suitably qualified expert guest to make a serious point in the public interest, we felt that an apology in the programme would be disrespectful to Dr Mos-Shogbamimu, and that it risked causing as much
offence to viewers as the word itself. We therefore did not consider that an on-air apology was required or appropriate in the circumstances. For the same reason we decided that the word did not need to be removed from the ITV+1 service or from the
version of the programme broadcast on the ITV Hub. However, ITV decided to add some guidance text on the ITV Hub to signpost the language for viewers.
Ofcom Response: Complaint not pursued
We took careful
account of the rationale for the two instances in which the word was used in full. We considered the first instance was to clarify the exact term Dr Mos-Shogbamimu was referring to and the second instance was to illustrate her view that there is a clear
distinction between a Black person using the word and a White person using it. Given this, we did not consider this amounted to frequent or gratuitous use. We particularly noted the way in which the language was used by this guest, that the word was not
directed at any particular person and was not used in a pejorative way by Dr Mos-Shogbamimu.
We considered the content did not raise any issues under the Code which warranted investigation.