BBFC News

 2016: Oct-Dec



 Update: Embracing Political Correctness...

Why has the BBFC deleted 'nudity' from the consumer advice for the feminist documentary, Embrace?


Link Here 24th December 2016  full story: Embrace...Feminist movie subject of a ratings review in Australia
Embrace is a 2016 Australia / Canada / Dominican Republic / Germany / USA / UK documentary by Taryn Brumfitt.
Starring Renee Airya, Jade Beall and Taryn Brumfitt. BBFC link IMDb

When Body Image Activist Taryn Brumfitt posted an unconventional before-and-after photo in 2013 it was seen by more than 100 million people worldwide and sparked an international media frenzy. EMBRACE follows Taryn's crusade as she explores the global issue of body loathing, inspiring us to change the way we feel about ourselves and think about our bodies.

A few days ago the BBFC entry for the film read:

UK: Passed 15 uncut for strong language, nudity, brief surgical detail for:

  • 2016 cinema release

The entry has now been updated to:

UK: Passed 15 uncut for strong language for:

  • 2016 cinema release

There is no mention of cuts and the running time remains the same. The nudity and surgical detail could have been pixellated out, but it seems more likely that feminists have dreamt up a new rule of political correctness: nudity does not count in the context of a feminist film.

Perhaps the BBFC advice should read: strong language, positive body image, negative surgical body augmentation.

 

 Update: 15 rated body images...

Feminist documentary gets advisory 15/16 ratings in Australia and New Zealand but gets a hard 15 in the UK


Link Here 21st December 2016  full story: Embrace...Feminist movie subject of a ratings review in Australia
Embrace is a 2016 Australia / Canada / Dominican Republic / Germany / USA / UK feminist documentary by Taryn Brumfitt.
Starring Renee Airya, Jade Beall and Taryn Brumfitt. BBFC link IMDb

When Body Image Activist Taryn Brumfitt posted an unconventional before-and-after photo in 2013 it was seen by more than 100 million people worldwide and sparked an international media frenzy. EMBRACE follows Taryn's crusade as she explores the global issue of body loathing, inspiring us to change the way we feel about ourselves and think about our bodies.

The film was never cut by censors, but it made the news in Australia after the director successfully appealed against an MA 15+ rating and won an M rating instead.

In Australia, the original MA 15+ (15A) rating was downrated to M (PG-15) for nudity on appeal. The Review Board explained:

A three-member panel of the Classification Review Board has unanimously determined that the film Embrace is classified M (Mature) with the consumer advice 'Nudity'.

The National Classification Code and Classification Guidelines allows for nudity to occur at the M level if it is justified by context. In the Classification Review Board's opinion Embrace warrants an M classification because the scenes of nudity and of women's breasts and genitals in the film are justified by the context of the documentary approach to women's body image and their impact is no higher than moderate.

Now the BBFC have passed the film 15 uncut for cinema for strong language, nudity, brief surgical detail.

 

  The BBFC is set to ban all online porn...

Murray Perkins of the BBFC explains how all the world's major porn websites will have to be totally banned in Britain (even if they set up age verification systems) under the censorship rules contained in the Digital Economy Bill


Link Here 14th December 2016
The BBFC currently cuts about 15% of all R18 porn films on their way to totally ordinary mainstream porn shops. These are not niche or speciality films; they are totally middle-of-the-road porn, representing the sort of content on all the world's major porn sites. Most of the cuts are ludicrous, but Murray Perkins, a senior examiner of the BBFC, points out that they are all considered either to be harmful, or else are still prohibited by the police or the government for reasons that have long since passed their sell-by date.

So about a sixth of all the world's adult films are therefore considered prohibited by the British authorities, and so any website containing such films will have to be banned, as there is no practical way to cut out the bits that wind up censors, police or government. And this mainstream but prohibited content appears on just about all the world's major porn sites, free or paid.

The main prohibitions that will cause a website to be blocked (even before considering whether it will set up strict age verification) cover such mainstream content as female ejaculation, urine play, gagging during blow jobs, rough sex, incest storylines (currently a major genre of porn), use of the word 'teen' and verbal references to under-18s.

Murray Perkins has picked up the job of explaining this catch-all ban. He explains it well, but he tries to throw readers off track by citing examples of prohibitions being justifiable because they apply to violent porn, whilst not mentioning that they apply equally well to trivia such as female squirting.

Perkins writes in the Huffington Post:

Recent media reports highlighting what content will be defined as prohibited material under the terms of the Digital Economy Bill could have given an inaccurate impression of the serious nature of the harmful material that the BBFC generally refuses to classify. The BBFC works only to the BBFC Classification Guidelines and UK law, with guidance from the Crown Prosecution Service (CPS) and enforcement bodies, and not to any other lists.

The Digital Economy Bill aims to reduce the risk of children and young people accessing, or stumbling across, pornographic content online. It proposes that the BBFC check whether

(i) robust age verification is in place on websites containing pornographic content and

(ii) whether the website or app contains pornographic content that is prohibited.

An amendment to the Digital Economy Bill, passed in the House of Commons, would also permit the BBFC to ask Internet Service Providers (ISPs) to block pornographic websites that refuse to offer effective age verification or contain prohibited material such as sexually violent pornography.

In making any assessment of content, the BBFC will apply the standards used to classify pornography that is distributed offline. Under the Video Recordings Act 1984 the BBFC is obliged to consider harm when classifying any content, including 18 and R18 rated sex works. Examples of material that the BBFC refuses to classify include pornographic works that: depict and encourage rape, including gang rape; depict non-consensual violent abuse against women; promote an interest in incestuous behaviour; and promote an interest in sex with children. [Perkins misleadingly neglects to include squirting, gagging, and urine play in his examples here]. The Digital Economy Bill defines this type of unclassifiable material as 'prohibited'.

Under its letters of designation the BBFC may not classify anything that may breach criminal law, including the Obscene Publications Act (OPA) as currently interpreted by the Crown Prosecution Service (CPS). The CPS provides guidance on acts which are most commonly prosecuted under the OPA. The BBFC is required to follow this guidance when classifying content offline and will be required to do the same under the Digital Economy Bill. In 2015, 12% of all cuts made to pornographic works classified by the BBFC were compulsory cuts under the OPA. The majority of these cuts were to scenes involving urolagnia which is in breach of CPS guidance and could be subject to prosecution.

 

 Offsite Article: BBFC Podcast Episode 62...


Link Here 10th December 2016
Chief Executive David Austin in a Q&A in which he reveals that he enjoyed horror films and was keen to seek out uncut versions

See article from bbfc.co.uk

 

  A censorial understanding...

Government selects the BBFC as the internet porn censor


Link Here 27th November 2016
The BBFC has issued the following press release:

Digital Economy Bill Age Verification Letters of Understanding

On 06 October 2016, the BBFC exchanged letters of understanding with DCMS confirming DCMS's intention, in principle, to appoint the BBFC to take on a regulatory role in the age verification of pornographic content online, as proposed in the Digital Economy Bill. These letters are available below.

The Digital Economy Bill contains measures to establish the same standard of protection online as currently exists offline with the aim of reducing the risk of children and young people accessing, or stumbling across, pornographic content online.

The BBFC's proposed role in the age verification of pornographic content online, as laid out in the Digital Economy Bill, is subject to designation by both Houses of Parliament.

The Letter of Understanding from Baroness Shields of the DCMS to David Austin reads:

I would like to thank you for the British Board of Film Classification's continuous help and support in developing the Government's manifesto commitment to introduce Age Verification (AV) checks for online pornography.

As you know, the AV clauses contained in the Digital Economy Bill have been designed to ensure that pornographic material must not normally be accessible online to users in the UK on a commercial basis without appropriate age verification checks. We appreciate BBFC's ongoing support especially in helping develop effective options for Stages 1-3 of the proposed regulatory framework. I understand you have worked with my officials in thinking through these proposals and had a productive meeting on 16 September to discuss your role in more detail.

We are committed to this policy and aim to introduce an effective regulatory framework to enable its smooth delivery. BBFC's experience in making effective editorial judgements is important to the success of the policy. I would like to invite the BBFC to take on a regulatory role within the proposed framework, subject to the particulars of the proposed designation being laid in both Houses of Parliament. In working together, it is our intention that:

  • Both DCMS and the BBFC are committed to working openly and transparently to establish an effective regulatory framework for the age verification of pornographic content online;

  • That the BBFC will create a proportionate, accountable, independent and expert regulatory function, that would seek among its aims to promote voluntary compliance and advise Her Majesty's Government (HMG) more widely on reducing the risk of pornography being made readily available to children;

  • That the BBFC will be responsible for Stages 1-3 of the proposed regulatory framework and that any enforcement function under the current Bill Clauses 20 and 21 will be carried out by another regulator that will have equal status to the BBFC;

  • DCMS will fund the BBFC's start-up costs, including those already incurred, subject to final agreement once legislative approvals are in place.

Please note, this letter is non-binding and constitutes an indication of intent rather than creating a liability or obligation of any nature whatsoever on DCMS or the BBFC.

I look forward to hearing from you very soon and would like to thank you once again for your valuable contribution and ongoing co-operation.

 

 Offsite Article: Paying the price of censorship...


Link Here 22nd November 2016
BBFC increases its fees by 1%. I wonder how much the new internet censorship regime will cost?

See article from bbfc.co.uk

 

  Censor cuts...

BBFC examiners to be downgraded to compliance officers with a 20k salary cut


Link Here 2nd November 2016
The Belfast Telegraph reports that the BBFC wants to get rid of five of its current examiners by the end of the year and replace them with younger, less experienced, cheaper compliance officers.

The trade union Unite has responded with the unlikely claim that the staff economies would risk material slipping through the censorship process. Unite's general secretary Len McCluskey has written to the BBFC's president Patrick Swaffer about the planned staff changes. He wrote:

It has always been my impression that the BBFC has maintained the trust of the public, particularly in relation to its child protection responsibilities, through the recruitment of mature and experienced individuals who have come from a variety of backgrounds, both personal and professional.

It seems to me that to replace those individuals with young, inexperienced graduates is both unfortunate in terms of the BBFC's public persona, and, quite possibly, a case of age discrimination.

Furthermore, I do not believe the public's trust, and especially that of many parents, will be enhanced by the knowledge that the BBFC is willing to lose the few examiners who view material on a day-to-day basis who are themselves parents, a status that brings an unimpeachable knowledge and understanding of child development.

The examiners are being given a choice of leaving on voluntary severance terms or being redeployed as compliance officers with a reduction in status and £20,000-a-year drop in salary.

Unite is arguing that the cost savings are not necessary because the BBFC's most recent accounts revealed an operating surplus of more than £1.2 million, with turnover up by 2% and operating costs down by the same amount. The union's regional officer Rose Keeping said:

You can't put a price on protecting children and young people from the tidal wave of sexually explicit and very violent films and videos that are available in 2016.

With less experienced examiners, there is an increased possibility of an unacceptable sex scene and/or one of extreme violence sneaking past the censors' net - this would be detrimental to the promotion of child protection that the Government is actively supporting.

We are also investigating whether what the BBFC is proposing for our members contravenes the age discrimination provisions in the 2010 Equality Act.

The BBFC responded in a press release saying:

The BBFC's classification standards protect children and empower families.

In making classification decisions, the BBFC has in place a structure that ensures consistency of approach and is based on published Classification Guidelines that are founded on large-scale public consultation.

The BBFC is currently in consultation with Unite in relation to this phase of the reorganisation of its examining and compliance functions, which began in 2013. The BBFC must respect the privacy of the ongoing formal consultation process.

 

 Updated: BBFC Podcast: Episode 59...

David Austin explains BBFC guidelines for depictions of drugs and introduces the new term 'drug misuse'


Link Here 1st November 2016
Podcast 59 gives BBFC boss David Austin a chance to outline classification guidelines for films that depict drug use, e.g. Now is Good, Project X and 13.

Austin also took the opportunity to speak about a slight change in BBFC terminology in the various forms of consumer advice. Previously the BBFC used the term 'drug use', but it will replace this with the term 'drug misuse'. Austin cited the example that taking paracetamol for a headache is 'drug use' and so does not always imply a classification issue.

Of course the term 'drug misuse' is also a bit confusing if the drug is intended for use as a recreational drug. E.g. does a beer drinker 'misuse' alcohol, or how do you 'misuse' a spliff? Stick it up your bum or something?

 

  A blackmailers', hackers', spammers' and phishers' charter...

Open Rights Group notes that the Digital Economy Bill offers no meaningful protections for the ID data handed over to porn sites or age verifiers.


Link Here 20th October 2016
The Digital Economy Bill mandates that pornographic websites must verify the age of their customers. Are there any powers to protect user privacy?

Yesterday we published a blog detailing the lack of privacy safeguards for Age Verification systems mandated in the Digital Economy Bill. Since then, we have been offered two explanations as to why the regulator designate, the BBFC, may think that privacy can be regulated.

The first and most important claim is that Clause 15 may allow the regulation of AV services, in an open-ended and non-specific way:

15 Internet pornography: requirement to prevent access by persons under the age of 18  

  1. A person must not make pornographic material available on the internet on a commercial basis to persons in the United Kingdom except in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18
  2. [snip]
  3. The age-verification regulator (see section 17) must publish guidance about--

    (a) types of arrangements for making pornographic material available that the regulator will treat as complying with subsection (1);

However, this clause seems to regulate publishers who "make pornographic material available on the internet", and what is regulated in 15 (3) (a) is the "arrangements for making pornographic material available". They do not mention age verification systems, which are not really an "arrangement for making pornography available" except inasmuch as they are used by the publisher to verify age correctly.

AV systems are not "making pornography available".

The argument however runs that the BBFC could under 15 (3) (a) tell websites what kind of AV systems with which privacy standards they can use.

If the BBFC sought to regulate providers of age verification systems via this means, we could expect them to be subject to legal challenge for exceeding their powers. It may seem unfair to a court for the BBFC to start imposing new privacy and security requirements on AV providers or website publishers that are not spelled out and when they are subject to separate legal regimes such as data protection and e-privacy.

This clause does not provide the BBFC with enough power to guarantee a high standard of privacy for end users, as any potential requirements are undefined. The bill should spell out what the standards are, in order to meet an 'accordance with the law' test for intrusions on the fundamental right to privacy.

The second fig leaf towards privacy is the draft standard for age verification technologies drafted by the Digital Policy Alliance. This is being edited by the British Standards Institution as PAS 1296. It has been touted as the means by which commercial outlets will produce a workable system.

The government may believe that PAS 1296 could, via Clause 15 (3) (a), be stipulated as a standard that Age Verification providers abide by in order to supply publishers, thereby giving a higher standard of protection than data protection law alone.

PAS 1296 provides general guidance and has no means of strong enforcement towards companies that adopt it. It is a soft design guide that provides broad principles to adopt when producing these systems.

Contrast this, for instance, with the hard and fast contractual arrangements the government's Verify system has in place with its providers, alongside firmly specified protocols. Or card payment processors, who must abide by strict terms and conditions set by the card companies, where bad actors rapidly get switched off.

The result is that PAS 1296 says little about security requirements, data protection standards, or anything else we are concerned about. It stipulates that the age verification systems cannot be sued for losing your data. Rather, you must sue the website owner, i.e. the porn site which contracted with the age verifier.

There are also several terminological gaffes, such as referring to PII (personally identifying information), which is a US legal concept, rather than the EU and UK's 'personal data'; this suggests that PAS 1296 is very much a draft, and in fact it appears to have been hastily cobbled together.

However you look at it, the proposed PAS 1296 standard is very generic, lacks meaningful enforcement and is designed to tackle situations where the user has some control and choice, and can provide meaningful consent. This is not the case with this duty for pornographic publishers. Users have no choice but to use age verification to access the content, and the publishers are forced to provide such tools.

Pornography companies meanwhile have every reason to do age verification as cheaply as possible, and possibly to harvest as much user data as they can, to track and profile users, especially where that data may in future, at the flip of a switch, be used for other purposes such as advertising-tracking. This combination of poor incentives has plenty of potential for disastrous consequences.

What is needed is clear, spelt-out, legally binding duties for the regulator to provide security, privacy and anonymity protections for end users. To be clear, the AV Regulator, or BBFC, does not need to be the organisation that enforces these standards. There are powers in the Bill for it to delegate the regulator's responsibilities. But we have a very dangerous situation if these duties do not exist.

 

 

 Update: A database of the UK's porn habits. What could possibly go wrong?...

The Government wants people who view pornography to show that they are over 18, via Age Verification systems. By Jim Killock of Open Rights Group


Link Here 19th October 2016  full story: David Cameron's Internet Porn Ban...Attempting to ban everything on the internet

The Government wants people who view pornography to show that they are over 18, via Age Verification systems. This is aimed at reducing the likelihood of children accessing inappropriate content.

To this end the Digital Economy Bill creates a regulator that will seek to ensure that adult content websites verify the age of users or face monetary penalties; in the case of overseas sites, it can ask payment providers such as VISA to refuse to process UK payments for non-compliant providers.

There are obvious problems with this, which we detail elsewhere.

However, the worst risks are worth going into in some detail, not least from the perspective of the Bill Committee who want the Age Verification system to succeed.

As David Austin, from the BBFC, which will likely become the Age Verification Regulator, said:

Privacy is one of the most important things to get right in relation to this regime. As a regulator, we are not interested in identity at all. The only thing that we are interested in is age, and the only thing that a porn website should be interested in is age. The simple question that should be returned to the pornographic website or app is, "Is this person 18 or over?" The answer should be either yes or no. No other personal details are necessary.
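The exchange Austin describes is, in effect, a data-minimising attribute check: the verifier may see identity evidence, but only a single over-18 flag should ever reach the website. A minimal sketch of that separation (all names here are hypothetical, not any real AV provider's API) might look like:

```python
from datetime import date

# Hypothetical sketch of the data-minimising check Austin describes.
# The AV provider sees the identity evidence (here just a date of birth);
# the website receives only a single yes/no attribute and nothing else.

def verify_age(date_of_birth: str, today: str = "2016-10-19") -> dict:
    """AV provider side: inspect evidence, return only an over-18 flag."""
    dob = date.fromisoformat(date_of_birth)
    now = date.fromisoformat(today)
    # Standard birthday arithmetic: subtract one if the birthday
    # has not yet occurred this year.
    age = now.year - dob.year - ((now.month, now.day) < (dob.month, dob.day))
    # Only the boolean crosses the boundary to the website -- no name,
    # no date of birth, no other personal details.
    return {"over_18": age >= 18}
```

On this model the website stores no identity data at all, which is precisely the property Open Rights Group argues the Bill fails to mandate.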

However, the Age Verification Regulator has no duties in relation to the Age Verification systems. They will make sites verify age, or issue penalties, but they are given no duty to protect people's privacy, security or defend against cyber security risks that may emerge from the Age Verification systems themselves.

David Austin's expectations are unfortunately entirely out of his hands.

Instead, the government appears to assume that Data Protection law will be adequate to deal with the privacy and security risks. Meanwhile, the market will provide the tools.

The market has a plethora of possible means to solve this problem. Some involve vast data trawls through Facebook and social media. Others plan to link people's identity across web services and will provide a way to profile people's porn viewing habits. Still others attempt to piggyback upon payment providers and risk confusing their defences against fraud. Many appear to encourage people to submit sensitive information to services that the users, and the regulator, will have little or no understanding of.

And yet with all the risks that these solutions pose, all of these solutions may be entirely data protection compliant. This is because data protection allows people to share pretty much whatever they agree to share, on the basis that they are free to make agreements with whoever they wish, by providing 'consent'.

In other words: Data protection law is simply not designed to govern situations where the user is forced to agree to the use of highly intrusive tools against themselves.

What makes this proposal more dangerous is that the incentives for the industry are poor and lead in the wrong direction. They have no desire for large costs, but would benefit vastly from acquiring user data.

If the government wants to have Age Verification in place, it must mandate a system that increases the privacy and safety of end users, since the users will be compelled to use Age Verification tools. Also, any and all Age Verification solutions must not make Britain's cybersecurity worse overall, e.g. by building databases of the nation's porn-surfing habits which might later appear on Wikileaks.

The Digital Economy Bill's impact on the privacy of users should, in human rights law, be properly spelled out ("in accordance with the law") and be designed to minimise the impacts on people (necessary and proportionate). Thus, failure to provide protections places the entire system under threat of potential legal challenges.

User data in these systems will be especially sensitive, being linked to private sexual preferences and potentially impacting particularly badly on sexual minorities if it goes wrong, through data breaches or simple chilling effects. This data is regarded as particularly sensitive in law.

The Government, in fact, has at its hands a system called Verify which could provide age verification in a privacy-friendly manner. The Government ought to explain why the high standards of its own Verify system are not being applied to Age Verification, or indeed why it is not prepared to use its own systems to minimise the impacts.

As with web filtering, there is no evidence that Age Verification will prevent an even slightly determined teenager from accessing pornography, nor reduce demand for it among young people. The Government appears to be looking for an easy fix to a complex social problem. The Internet has given young people unprecedented access to adult content but it's education rather than tech solutions that are most likely to address problems arising from this. Serious questions about the efficacy and therefore proportionality of this measure remain.

However, legislating for the Age Verification problem to be "solved" without any specific regulation for any private sector operator who wants to "help" is simply to throw the privacy of the UK's adult population to the mercy of the porn industry. With this in mind, we have drafted an amendment to introduce the duties necessary to minimise the privacy impacts, which could also reduce if not remove the free expression harms to adults.

 

 Update: It's a shitty job but someone's happy to do it...

BBFC designated as the porn censor tasked with banning everybody's free porn


Link Here 12th October 2016

The BBFC has signed an agreement with the U.K. government to act as the country's new internet porn censor.

BBFC Director David Austin explained the censor's new role regulating online adult entertainment to a committee in Parliament weighing the 2016 Digital Economy Bill. Austin discussed how the BBFC will approach those sites found to be in contravention of U.K. law with regard to verifying that adult content cannot be accessed by under-18s.

Austin said that the 2016 Digital Economy Bill now being weighed will achieve a great deal for the BBFC's new role as the age-verification enforcer. The piece of legislation, if given the OK, could impose financial penalties of up to £250,000 on non-complying adult entertainment sites.

Austin said that the BBFC will methodically start focusing on the largest offending websites, including foreign ones, and notifying them for breaches in the U.K.'s mandatory age-verification laws. Austin said that offending sites will face a notification process that may include the filing of sanctions against sites' business partners, such as payment providers and others that supply ancillary services. Austin also mentioned that sanctioned sites could find web properties blocked by IP address and de-indexed from search engines.

Digital Economy Bill 2nd Sitting

See article from hansard.parliament.uk


David Austin : My name is David Austin. I am the chief executive of the British Board of Film Classification.

Alan Wardle: I am Alan Wardle, head of policy and public affairs at the National Society for the Prevention of Cruelty to Children.

Louise Haigh (Sheffield, Heeley) (Lab)  

Q David, am I right in interpreting the amendments that the Government tabled last night as meaning that you are intended to be the age verification regulator?

David Austin: That is correct. We reached heads of agreement with the Government last week to take on stages 1 to 3 of the regulation.

Louise Haigh   Q Are you sufficiently resourced to take on that role?  

David Austin: We will be, yes. We have plenty of time to gear up, and we will have sufficient resource.  

Louise Haigh   Q Will it involve a levy on the porn industry?  

David Austin: It will involve the Government paying us the money to do the job on our usual not-for-profit basis.  

Louise Haigh   Q What risks do you envisage in people handing over their personal data to the pornographic industry?  

David Austin: Privacy is one of the most important things to get right in relation to this regime. As a regulator, we are not interested in identity at all. The only thing that we are interested in is age, and the only thing that a porn website should be interested in is age. The simple question that should be returned to the pornographic website or app is, "Is this person 18 or over?" The answer should be either yes or no. No other personal details are necessary.  

We should bear in mind that this is not a new system. Age verification already exists, and we have experience of it in our work with the mobile network operators, where it works quite effectively--you can age verify your mobile phone, for example. It is also worth bearing in mind that an entire industry is developing around improving age verification. Research conducted by a UK adult company in relation to age verification on their online content shows that the public is becoming much more accepting of age verification.

Back in July 2015, for example, this company found that more than 50% of users were deterred when they were asked to age verify. As of September, so just a few weeks ago, that figure had gone down to 2.3%. It is established technology, it is getting better and people are getting used to it, but you are absolutely right that privacy is paramount.

Louise Haigh   Q Are you suggesting that it will literally just be a question--"Is the user aged 18?"--and their ticking a box to say yes or no? How else could you disaggregate identity from age verification?  

David Austin: There are a number of third-party organisations. I have experience with mobile phones. When you take out a mobile phone contract, the adult filters are automatically turned on and the BBFC's role is to regulate what content goes in front of or behind the adult filters. If you want to access adult content--and it is not just pornography; it could be depictions of self-harm or the promotion of other things that are inappropriate for children--you can go to your operator, such as EE, O2 or Vodafone, with proof that you are 18 or over. It is then on the record that that phone is age verified. That phone can then be used in other contexts to access content.  

Louise Haigh   Q But how can that be disaggregated from identity? That person's personal data is associated with that phone and is still going to be part of the contract.  

David Austin: It is known by the mobile network operator, but beyond that it does not need to be known at all.  

Louise Haigh   Q And is that the only form of age verification that you have so far looked into? 

David Austin: The only form of age verification that we, as the BBFC, have experience of is age verification on mobile phones, but there are other methods and there are new methods coming on line. The Digital Policy Alliance, which I believe had a meeting here yesterday to demonstrate new types of age verification, is working on a number of initiatives.  

Claire Perry (Devizes) (Con)   Q May I say what great comfort it is to know that the BBFC will be involved in the regulatory role? It suggests that this will move in the right direction. We all feel very strongly that the Bill is a brilliant step in the right direction: things that were considered inconceivable four or five years ago can now be debated and legislated for.  

The fundamental question for me comes down to enforcement. We know that it is difficult to enforce anything against offshore content providers; that is why in the original campaign we went for internet service providers that were British companies, for whom enforcement could work. What reassurance can you give us that enforcement, if you have the role of enforcement, could be carried out against foreign entities? Would it not be more appropriate to have a mandatory take-down regime if we found that a company was breaking British law by not asking for age verification, as defined in the Bill?

David Austin: The BBFC heads of agreement with the Government does not cover enforcement. We made clear that we would not be prepared to enforce the legislation in clauses 20 and 21 as they currently stand. Our role is focused much more on notification; we think we can use the notification process and get some quite significant results.  

We would notify any commercially-operated pornographic website or app if we found them acting in contravention of the law and ask them to comply. We believe that some will and some, probably, will not, so as a second backstop we would then be able to contact and notify payment providers and ancillary service providers and request that they withdraw services from those pornographic websites. So it is a two-tier process.

We have indications from some major players in the adult industry that they want to comply--PornHub, for instance, is on record on the BBC News as having said that it is prepared to comply. But you are quite right that there will still be gaps in the regime, I imagine, after we have been through the notification process, no matter how much we can achieve that way, so the power to fine is essentially the only real power the regulator will have, whoever the regulator is for stage 4.

For UK-based websites and apps, that is fine, but it would be extremely challenging for any UK regulator to pursue foreign-based websites or apps through a foreign jurisdiction to uphold a UK law. So we suggested, in our submission of evidence to the consultation back in the spring, that ISP blocking ought to be part of the regulator's arsenal. We think that that would be effective.

Claire Perry   Q Am I right in thinking that, for sites that are providing illegally copyrighted material, there is currently a take-down and blocking regime that does operate in the UK, regardless of their jurisdiction?  

David Austin: Yes; ISPs do block website content that is pirated. There was research published earlier this year in the US that found that it drove traffic to pirated websites down by about 90%. Another tool that has been used in relation to IP protection is de-indexing, whereby a search engine removes the infringing website from any search results. We also see that as a potential way forward.

Thangam Debbonaire (Bristol West) (Lab)   Q First, can I verify that you both support adding in the power to require ISPs to block non-compliant sites?  

David Austin: Yes.  

Alan Wardle: Yes, we support that.  

Thangam Debbonaire   Q Good. That was quick. I just wanted to make sure that was there. What are your comments on widening the scope, so that age verification could be enforced for matters other than pornography, such as violent films or other content that we would not allow in the offline world? I am talking about things such as pro-anorexia websites. We know that this is possible to do in certain formats, because it is done for other things, such as copyright infringement. What are your views on widening the scope and the sanctions applying to that?  

Alan Wardle: We would support that. We think the Bill is a really great step forward, although some things, such as enforcement, need to be strengthened. We think this is an opportunity to see how you can give children parity of protection in the online and the offline worlds.  

It is very good, from our perspective, that the BBFC is doing this, because they have got that expertise. Pornography is not the only form of harm that children see online. We know from our research at the NSPCC that there are things like graphic violence. You mentioned some of the pro-anorexia and pro-suicide sites, and they are the kind of things that ought to be dealt with. We are supporting developing a code of practice with industry to work out what those harms are--and that is very much a staged approach.

We take it for granted that when, for instance, a child goes to a youth group or something like that, we make sure there are protections there, and that the staff are CRB checked. Somehow it seems that for children going on to the internet it is a bit like the wild west. There are very few protections. Some of the content really is upsetting and distressing to children. This is not about adults being blocked from seeing adult content. That is absolutely fine; we have no problem with that at all. But it is about protecting children from seeing content that is inappropriate for them. We would certainly support that widening, but obviously doing it in a staged way so that the regulator does not take on too much at once. We would certainly support that.

David Austin: I would echo what Alan says. We see this Bill as a significant step forward in terms of child protection. We absolutely agree with the principle of protecting children from a wider range of content--indeed, that is what we do in other areas: for example, with the mobile network operators and their adult filters. Like Alan, I think we see it in terms of more of a staged approach. The BBFC taking on this role is a significant new area of work--quite a challenge to take on board. I think there is a potential risk of overloading the Bill if we try to put too much on it, so I would very much support the NSPCC's phased approach.  

Thangam Debbonaire   Q Is there anything further that you think needs to be added to the Bill to make the sanctions regime work? I am also thinking--at the risk of going against what you just said, Mr Austin--about whether or not we should be considering sites that are not designed for commercial purposes but where pornography or other harmful material is available on a non-commercial basis; or things not designed for porn at all, such as Twitter timelines or Tumblr and other social media, where the main purpose may not be pornography or other harmful material, but it is available. Do you think the Bill has enough sanctions in it to cope with all of that, or should that be added? Is there anything else you would like to add?  

David Austin: There were a few questions. I will try to answer them all, but if I miss any of them please come back to me. In terms of sanctions, I have talked about ISP blocking and de-indexing. We think those could be potentially effective steps. In terms of commercial pornography, we have been working on devising a test of what that is. The Bill states explicitly that the pornography could be free and still provided on a commercial basis. I do not think it is narrowing the scope of the regulation an awful lot by specifying commercial pornography. If there are adverts, if the owner is a corporate entity, if there are other aspects--if the site is exploiting data, for example: there are all sorts of indications that a site is operating on a commercial basis. So I do not see that as a real problem.  

In relation to Twitter, which you mentioned, what the Bill says the regulator should do is define what it sees as ancillary service providers. Those are organisations whose work facilitates and enables the pornography to be distributed. There is certainly a case to argue that social media such as Twitter are ancillary service providers. There are Twitter account holders who provide pornography on Twitter so I think you could definitely argue that.

I would argue that Twitter is an ancillary service provider, as are search engines and ISPs. One of the things that we plan to do in the next weeks and months is to engage with everyone that we think is an ancillary service provider, and see what we can achieve together, to try and achieve the maximum protection we can through the notification regime that we are taking on under part 3 of the Bill.

The Chair: Just before we move on, shall we see if Mr Wardle also wants to contribute to things that should be in the Bill?

Alan Wardle: On that point, I think it is important for us that there is clarification--and I would agree with David about this--in terms of ensuring that sites that may for instance be commercial but that are not profiting from pornography are covered. Again, Twitter is an example. We know that there are porn stars with Twitter accounts who have lots of people following them and lots of content, so it is important that that is covered.  

It is important that the legislation is future-proofed. We are seeing at the NSPCC through Childline that sexual content or pornography is increasingly live-streamed through social media sites, and there is self-generated content, too. It is important that that is covered, as well as the traditional--what you might call commercial--porn. We know from our research at the NSPCC that children often stumble across pornography, or it is sent to them. We think that streamed feeds for over-18s and under-18s should be possible so that sort of content is not available to children. It can still be there for adults, but not for children.

Nigel Adams   Q Can you give us your perspective on the scale of the problem of under-18s' access to this sort of inappropriate content? I guess it is difficult to do a study into it but, through the schools network and education departments, do you have any idea of the scale of the issue?

Alan Wardle: We did research earlier this year with the University of Middlesex into this issue. We asked young people--under 18s--whether they had seen pornography and when. Between the ages of 11 and 18, about half of them had seen pornography. Obviously, when you get to older children--16 and 17-year-old boys in particular--it was much higher. Some 90% of those 11 to 18-year-olds had seen it by the age of 14. It was striking--I had not expected this--that, of the children who had seen it, about half had searched for it but the other half had stumbled across it through pop-ups or by being sent stuff on social media that they did not want to see.

It is a prevalent problem. If a determined 17-year-old boy wants to see pornography, undoubtedly he will find a way of doing it, but of particular concern to us is when you have got eight, nine or 10-year-old children stumbling across this stuff and being sent things that they find distressing. Through Childline, we are getting an increasing number of calls from children who have seen pornographic content that has upset them.

Nigel Adams   Q Has there been any follow-on, in terms of assaults perpetrated by youngsters as a result of being exposed to this?  

Alan Wardle: It is interesting to note that there has been an exponential rise in the number of reports of sexual assaults against children in the past three or four years. I think it has gone up by about 84% in the past three years.  

Nigel Adams   Q By children?  

Alan Wardle: Against children. Part of that, we think, is what you might call the Savile effect--since the Savile scandal there has been a much greater awareness of child abuse and children are more likely to come forward, which we think is a good thing. But Chief Constable Simon Bailey, who is the national lead on child protection, believes that a significant proportion of that is due to the internet. Predators are able to cast their net very widely through social networking sites and gaming sites, fishing for vulnerable children to groom and abuse.  

We believe that, in developing the code of practice that I talked about earlier, that sort of thing needs to be built in to ensure that children are protected from that sort of behaviour in such spaces. The internet is a great thing but, as with everything, it can be used for darker purposes. We think there is increasing evidence--Simon Bailey has said this, and more research needs to be done into the scale of it--that children, as well as seeing adult content, are increasingly being groomed for sex online.

Nigel Adams   Q Mr Austin, what constructive conversations and meetings have you had with ISPs thus far, in terms of the potential for blocking those sites--especially the sites generated abroad?

David Austin: We have not had any conversations yet, because we signed the exchange of letters with the Government only last Thursday and it was made public only today that we are taking on this role. We have relationships with ISPs--particularly the mobile network operators, with which we have been working for a number of years to bring forward child protection on mobile devices.  

Our plan is to engage with ISPs, search engines, social media--the range of people we think are ancillary service providers under the Bill--over the next few weeks and months to see what we can achieve together. We will also be talking to the adult industry. As we have been regulating pornography in the offline space and, to an extent, in the online space for a number of years, we have good contacts with the adult industry so we will engage with them.

Many companies in the adult industry are prepared to work with us. Playboy, for instance, works with us on a purely voluntary basis online. There is no law obliging it to work with us, but it wants to ensure that all the pornography it provides is fully legal and compliant with British Board of Film Classification standards, and is provided to adults only. We are already working in this space with a number of players.

Nigel Huddleston   Q Obviously, the BBFC is very experienced at classifying films according to certain classifications and categories. I am sure it is no easy task, but is it possible to use an objective set of criteria to define what is pornographic or disturbing, or is it subjective? How do you get that balance?

David Austin: The test of whether something is pornographic is a test that we apply every single day, and have done since the 1980s when we first started regulating that content under the Video Recordings Act 1984. The test is whether the primary purpose of the work is to arouse sexually. If it is, it is pornography. We are familiar with that test and use it all the time.  

Nigel Huddleston   Q In terms of skills and resources, are you confident you will be able to get the right people in to do the job properly? I am sure that it is quite a disturbing job in some cases.  

David Austin: Yes. We already have people who have been viewing pornographic content for a number of years. We may well need to recruit one or two extra people, but we certainly have the expertise and we are pretty confident that we already have the resources. We have time between now and the measures in the Bill coming into force to ensure that we have a fully effective system up and running.  

The Minister for Digital and Culture (Matt Hancock)   Q I just want to put on the record that we are delighted that the BBFC has signed the heads of agreement to regulate this area. I cannot think of a better organisation with the expertise and the experience to make it work. What proportion of viewed material do you think will be readily covered by the proposed mechanism in the Bill that you will be regulating the decision over but not the enforcement of?  

David Austin: I am not sure that I understand the question.  

Matt Hancock   Q I am thinking about the scale of the problem--the number of views by under-18s of material that you deem to be pornographic. What proportion of the problem do you think the Bill, with your work, will fix?  

David Austin: So we are talking about the amount of pornography that is online?  

Matt Hancock   Q And what is accessed.  

David Austin: Okay. As you all know, there is masses of pornography online. There are 1.5 million new pornographic URLs coming on stream every year. However, the way in which people access pornography in this country is quite limited. Some 70% of users go to the 50 most popular websites. With children, that percentage is even greater; the data evidence suggests that they focus on a relatively small number of sites.  

We would devise a proportionality test and work out what the targets are in order to achieve the greatest possible level of child protection. We would focus on the most popular websites and apps accessed by children--those data do exist. We would have the greatest possible impact by going after those big ones to start with and then moving down the list.

Matt Hancock   Q So you would be confident of being able to deal with the vast majority of the problem.  

David Austin: Yes. We would be confident in dealing with the sites and apps that most people access. Have I answered the question?  

Matt Hancock   Q Yes. Given that there is a big problem that is hard to tackle and complicated, I was just trying to get a feel for how much of the problem you think, with your expertise and the Bill, we can fix.  

David Austin: We can fix a great deal of the problem. We cannot fix everything. The Bill is not a panacea but it can achieve a great deal, and we believe we can achieve a great deal working as the regulator for stages 1 to 3.  

Louise Haigh   Q My question follows on neatly from that. While I am sure that the regulation will tackle those top 50 sites, it obviously comes nowhere near tackling the problems that Mr Wardle outlined, and the crimes, such as grooming, that can flow from those problems. There was a lot of discussion on Second Reading about peer-to-peer and social media sites that you have called "ancillary". No regulation in the world is going to stop that. Surely, the most important way to tackle that is compulsory sex education at school.  

Alan Wardle: Yes. In terms of online safety, a whole range of things are needed and a whole lot of players. This will help the problem. We would agree with and want to work with the BBFC on a proportionality test, identifying where the biggest risks to children are, and for that to keep developing. But that is not the only solution.

Yes, we believe that statutory personal, social and health education and sexual relationships education is an important part of that. Giving parents the skills and understanding of how to keep their children safe is also really important. But there is a role for industry. Any time I have a conversation with an MP or parliamentarian about this and they have a child in their lives--whether their own, or nieces or nephews--we quickly come to the point that it is a bit of a nightmare. They say, "We try our best to keep our children safe but there is so much, we don't know who they are speaking to" and all the rest of it.

How do we ensure that when children are online they are as safe as they are when offline? Of course, things happen in the real world as well and no solution is going to be perfect. Just as, in terms of content, we would not let a seven-year-old walk into the multiplex and say, "Here is 'Finding Nemo' over here and here is hard core porn--off you go."

We need to build those protections in online so we know what children are seeing and whom they are speaking to, and also skilling up children themselves through school and helping parents. But we believe the industry, as well as Government, has an important part to play in regulating and ensuring that spaces where children are online are as safe as they can be.

Christian Matheson (City of Chester) (Lab)   Q To follow on from the Minister's question, you feel you are able to tackle roughly the top 50 most visited sites. Is there a danger that you then replace those with the next top 50 that are perhaps less regulated and less co-operative? How might we deal with that particular problem, if it exists?  

David Austin: When I said "the top 50", I was talking in terms of the statistics showing that 70% of people go to the top 50. We would start with the top 50 and work our way through those, but we would not stop there. We would look to get new data every quarter, for example. As you say, sites will come in and out of popularity. We will keep up to date and focus on those most popular sites for children.  

We would also create something that we have, again, done with the mobile operators. We would create an ability for members of the public--a parent, for example--to contact us about a particular website if that is concerning them. If an organisation such as the NSPCC is getting information about a particular website or app that is causing problems in terms of under-age access, we would take a look at that as well. In creating this proportionality test, what we must not do is be so explicit as to say that we will look only at the top 50.

First, that is not what we would do. Secondly, we do not want anyone to think, "Okay, we don't need to worry about the regulator because we are not on their radar screen." It is very important to keep up to date with the most popular sites and, therefore, be most effective in dealing with under-age access, as well as dealing with complaints from members of the public and organisations such as the NSPCC.

Alan Wardle: I think that is why the enforcement part is so important as well, so that people know that if they do not put these mechanisms in place there will be fines and enforcement notices, the flow of money will be stopped and, crucially, there is that backstop power to block if they do not operate as we think they should in this country. The enforcement mechanisms are really important to ensure that the BBFC can do their job properly and people are not just slipping from one place to the next.  

Claire Perry   Q Of those top 50 sites, do we know how many are UK-based?

David Austin: I would guess, none of them. I do not know for sure, but that would be my understanding.  

Claire Perry   Q Secondly, I want to turn briefly to the issue of the UK's video on demand content. My reading around clause 15 suggests that, although foreign-made videos on demand will be captured by the new provisions, UK-based services will continue to be caught by Communications Act 2003 provisions. Do you think that is adequate?

David Austin: That is my understanding as well. We work very closely with Ofcom. Ofcom regulates UK on-demand programme services as the Authority for Television On Demand, but it applies our standards in doing so. That is a partnership that works pretty effectively and Ofcom has done an effective job in dealing with that type of content. That is one bit that is carved out from the Bill and already dealt with by Ofcom.

Claire Perry: It is already done. Okay. Thank you.

 

  Age of censorship...

The Open Rights Group has provided written evidence to parliament highlighting the serious flaws in the Digital Economy Bill that will employ the BBFC to try and snuff out internet porn


Link Here 12th October 2016

Open Rights Group has submitted written evidence to the House of Commons Public Bill Committee on the Digital Economy Bill. The following are the group's views on some of the worst aspects of the Age Verification requirements for 18 rated adult internet porn:

Open Rights Group (ORG) is the United Kingdom's only campaigning organisation dedicated to working to protect the rights to privacy and free speech online. With 3,200 active supporters, we are a grassroots organisation with local groups across the UK. We believe people have the right to control their technology, and oppose the use of technology to control people.

Age Verification

23. We believe the aim of restricting children's access to inappropriate material is a reasonable one; however, placing age verification requirements on adults to access legal material throws up a number of concerns which are not easily resolved.

24. Our concerns include: whether these proposals will work; the impact on privacy and freedom of expression; and how pornography is defined.

Lack of privacy safeguards

25. New age verification systems will enable the collection of data about people accessing pornographic websites, potentially across different providers or websites. Accessing legal pornographic material creates sensitive information that may be linked to a real life identity. The current wording of the draft Bill means that this data could be vulnerable to "Ashley Madison-style" leaks.

26. MindGeek (the largest global adult entertainment operator) estimates there are 20 to 25 million adults in the UK who access adult content regularly. That is over 20 million people who will have to reveal attributes of their identity to a pornography website or a third party company.

27. Current proposals for age-verification systems suggest using people's emails, social media accounts, bank details, credit and electoral information, biometrics and mobile phone details. The use of any of this information exposes pornography website users to threats of data mining, identity theft and unsolicited marketing.

28. The currently proposed age-verification systems have minimal regard for the security of the data they will collect.

29. The Bill does not contain provisions to secure the privacy and anonymity of users of pornographic sites. These must be included in the Bill, not merely in guidance issued by the age-verification regulator. They should ensure that the age-verification system, by default, must not be able to identify a user to the pornographic site by leaving persistent data trails. The user information that pornography websites are allowed to store without additional consent should be strictly limited.
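One way to read paragraph 29's "no persistent data trails" requirement is an attestation that is signed, short-lived and identity-free. The sketch below is purely illustrative (the scheme, field names and key handling are assumptions, and a real deployment would use asymmetric signatures rather than a shared HMAC key held in one process):

```python
import hashlib
import hmac
import json
import secrets
import time

# Hypothetical sketch: an age attestation carrying only a boolean, an expiry
# and a random nonce -- nothing that identifies the person or links visits.
# SECRET stands in for a key held by the verification service alone.
SECRET = b"age-verifier-signing-key"

def issue_attestation(ttl_seconds: int = 300) -> dict:
    """Issued after an out-of-band age check; contains no identity fields."""
    body = {"over18": True,
            "exp": int(time.time()) + ttl_seconds,
            "nonce": secrets.token_hex(8)}
    payload = json.dumps(body, sort_keys=True).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"body": body, "sig": sig}

def verify_attestation(att: dict) -> bool:
    """Accept only untampered, unexpired attestations."""
    payload = json.dumps(att["body"], sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, att["sig"])
            and att["body"]["exp"] > time.time())
```

Because the token expires quickly and carries a fresh nonce instead of any stable identifier, the site cannot use it to build the persistent profile of a user's visits that the paragraph warns about.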

Will age verification work?

30. The objective of these proposals is child safety rather than age verification. Policy makers should not measure success by the number of adults using age verification. It is highly likely that children will be able to continue accessing pornographic material, meaning that the policy will struggle to meet its true goal.

31. The Bill does not outline an effective system to administer age verification. It sets out a difficult task: regulating foreign pornography publishers. This will be difficult to enforce. Even if access to pornographic material hosted abroad is blocked in the UK, bypassing website blocks is very easy - for example through the use of VPNs. VPNs are not technically difficult to use and could easily be employed by teenagers to circumvent age verification.

32. Young people will still be able to access pornographic materials through some mainstream social media websites that are not subject to age verification, and from peer-to-peer networks.

33. As with ISP and mobile phone filters, age verification may prevent young children from accidentally finding pornographic material but it is unlikely to restrict a tech-savvy teenager.

Discrimination against sexual minorities and small business

34. The age verification systems will impose disproportionate costs on small publishers. No effective and efficient age verification system has been presented and it is very likely that the costs imposed on smaller publishers will cause them to go out of business.

35. Smaller publishers of adult materials often cater for sexual minorities or people with special needs. The costs associated with implementing age verification systems threaten the existence of these sites and thus the ability of particular groups to express their sexuality by using the services of smaller pornographic publishers.

36. It is unclear whether adults will trust age verification systems, especially if they appear to identify them to the sites. It is possible that there will be a dissuasive effect on adults wishing to receive legal material. This would be a negative impact on free expression, and would be likely to disproportionately impact people from sexual minorities.

Definition of pornographic material

37. The definitions of pornographic material included in the Bill are much broader than what is socially accepted as harmful pornography. The Bill not only covers R18 materials typically described as "hardcore pornography", which offline can only be acquired in licensed sex shops, but also 18-rated materials of a sexual nature. The boundaries of the 18 classification are dynamic and reflect social consensus on what is acceptable with some restrictions. Today this would include popular films such as Fifty Shades of Grey. This extension of the definition of pornography to cover all "erotic" 18 rated films also raises questions as to why violent - but not sexual - materials rated as 18 should then be accessible online.

38. Hiding some of these materials or making them more difficult to access puts unjustifiable restrictions on people's freedom of expression. Placing 18-rated materials behind the age-verification wall, under the same category as hardcore pornography, will discourage people from exploring topics related to their sexuality.

Suggestions for improvement

39. The online age verification proposed in the Bill is unworkable and will not deliver what the Government set out to do. We urge the Government to find more effective solutions to deliver their objectives on age verification. Online age verification should be dropped from the Bill in its current version.

40. The updated version of age verification should incorporate:

41. 1) Privacy safeguards

The regulator should have specific duties to ensure the systems are low risk. For instance, age verification should not be in place unless privacy safeguards are strong. Any age verification system should not create wider security risks, for instance to credit card systems, or through habituating UK Internet users into poor security practices.

42. Users of adult websites should have clarity on liability in the event of data breaches and on what personal data is at risk.

43. 2) Safeguards for sexual minorities

Requirements should be proportionate to the resources available and the likelihood of access by minors. Small websites that cater for sexual minorities may fall under the commercial threshold.

44. 3) Remove 18-rated materials from the definition of pornographic materials

Placing all materials of a sexual nature under the definition of pornography is not helpful and will greatly increase the impact of these measures on the human right to impart and receive information, including for older children and young adults.

Open Rights Group make equally valid arguments against the criminalisation of file sharing and the introduction of many features of an ID card to tie together vast amounts of personal data held in a variety of government databases.

Read the full article from openrightsgroup.org