No matter how much governments spout bollox about mass snooping being used only to detect the likes of terrorism, the authorities end up sharing the data with Tom, Dick and Harry for the most trivial of reasons.
Is the government misleading the Lords about blocking Twitter?
Last week we reported that the UK government expects the BBFC to ask social media providers, such as Twitter, to block the use of their service by accounts that are associated with porn sites that fail to verify the age of their users.
The Bill is even worse than we illustrated. The definition of a "pornographic website" in Clause 15 (2) is simply any site that operates on a "commercial basis". This could catch any site--including Twitter, Reddit, Tumblr--where pornography can be found. The practical limit would therefore rest purely on the discretion of the regulator, the BBFC, as to which commercial sites they wanted to force to use age verification. However, the BBFC does not seem to want to require Twitter or Reddit to apply age verification--at least, not yet.
However, we also got one part wrong last week. In relation to Twitter, Reddit and other websites where porn sites might promote their content, the Bill contains a power to notify these "ancillary services" but has no specific power to enforce the notifications.
In other words, they expect Twitter, Google, Facebook, Tumblr and other companies to voluntarily block accounts within the UK, without a specific legal basis for their action.
This would create a toxic situation for these companies. If they fail to "act" on the "notifications", these services will leave themselves open to the accusation that they are failing to protect children, or actively
"supplying" pornography to minors.
On the other hand, if they act on these notices, they will rightly be accused by us and by those who are censored of acting in an unaccountable, arbitrary manner. They will not have been legally obliged to act by a court; similar content will remain unblocked; and there will be no clear remedy for someone who wishes to contest a "notification". Liability for the blocks would remain with the company, rather than the BBFC.
The government has not been clear with the Lords that this highly unclear situation is the likely result of notifications to Twitter--rather than account blocks, as they have suggested.
There are very good reasons not to block accounts after a mere notification. For instance, in this case, although sites can contest a classification at the BBFC, and an internal appeals process will exist, there is no external appeal available, other than embarking on an expensive judicial review. It is not clear that a classification as pornography should automatically lead to action by ancillary services, not least because a site that complies with age verification will still make the same content available. To be clear, the bill does not aim to remove pornography from Twitter, Reddit or search engines.
Why, then, has the government drafted a bill with this power to notify "ancillary services", but no method to enforce it? The reason appears to be that payment providers in particular have a long-standing agreement amongst themselves that they will halt payments when they are notified that someone is taking payments for unlawful activity. Large online ad networks operate a similar process of accepting notifications.
There is therefore no need to create enforcement mechanisms for these two kinds of "ancillary providers". (There are pitfalls with this approach--it can lead to censorship and unwarranted damage to businesses--but let us leave that debate aside for now.)
It seems clear that, when the bill was written, there was no expectation that "ancillary providers" would include Twitter, Yahoo, or Google, so no enforcement power was created.
The government, in its haste, has agreed with the BBFC that it should be able to notify Twitter, Google, Yahoo and other platforms. They have agreed that the BBFC need not take on a role of enforcement through court orders.
The key point is that the Lords are being misled by the government as things stand. Neither the BBFC nor the government has explored with Parliamentarians what the consequences of expanding the notion of "ancillary providers" are.
The Lords need to be told that this change means that:
the notices are unenforceable against Internet platforms;
they will lead to public disputes with the companies;
they make the BBFC's decisions relating to ancillary providers highly unaccountable, as legal responsibility for account blocks rests with the platforms.
It appears that the BBFC do not wish to be cast in the role of "national censor". They believe that their role is one of classification, rather than enforcement. However, the fact that they also wish to directly block websites via ISPs
rather flies in the face of their self-perception, as censorship is most clearly what they will be engaging in. Their self-perception is also not a reason to pass the legal buck onto Internet platforms who have no role in deciding whether a site
fails to meet regulatory requirements.
This mess is the result of rushing to legislate without understanding the problems involved. The obvious thing to do is to limit the impact of the "ancillary services" approach by narrowing the definition to exclude all but payment
providers and ad networks. The alternative--to create enforcement powers against a range of organisations--would need to establish full accountability for the duties imposed on ancillary providers in a court, something that the BBFC seems to wish to avoid.
Or of course, the government could try to roll back its mistaken approach entirely, and give up on censorship as a punishment: that would be the right thing to do. Please sign our petition if you agree.
The European Court of Justice has passed judgment on several linked cases in Europe concerning requirements that ISPs retain extensive records of all phone and internet communications. These include a challenge by Labour's Tom Watson. The court wrote in its press release:
The Member States may not impose a general obligation to retain data on providers of electronic communications services
EU law precludes a general and indiscriminate retention of traffic data and location data, but it is open to Member States to make provision, as a preventive measure, for targeted retention of that data solely for the purpose of fighting serious crime, provided that such retention is, with respect to the categories of data to be retained, the means of communication affected, the persons concerned and the chosen duration of retention, limited to what is strictly necessary. Access of the national authorities to the retained data must be subject to conditions, including prior review by an independent authority and the data being retained within the EU.
In today's judgment, the Court's answer is that EU law precludes national legislation that prescribes general and indiscriminate retention of data.
The Court confirms first that the national measures at issue fall within the scope of the directive. The protection of the confidentiality of electronic communications and related traffic data guaranteed by the directive, applies to the measures
taken by all persons other than users, whether by private persons or bodies, or by State bodies.
Next, the Court finds that while that directive enables Member States to restrict the scope of the obligation to ensure the confidentiality of communications and related traffic data, it cannot justify the exception to that obligation, and in
particular to the prohibition on storage of data laid down by that directive, becoming the rule.
Further, the Court states that, in accordance with its settled case-law, the protection of the fundamental right to respect for private life requires that derogations from the protection of personal data should apply only in so far as is
strictly necessary. The Court applies that case-law to the rules governing the retention of data and those governing access to the retained data.
The Court states that, with respect to retention, the retained data, taken as a whole, is liable to allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained.
The interference by national legislation that provides for the retention of traffic data and location data with that right must therefore be considered to be particularly serious. The fact that the data is retained without the users of
electronic communications services being informed of the fact is likely to cause the persons concerned to feel that their private lives are the subject of constant surveillance. Consequently, only the objective of fighting serious crime is
capable of justifying such interference.
The Court states that legislation prescribing a general and indiscriminate retention of data does not require there to be any relationship between the data which must be retained and a threat to public security and is not restricted to, inter
alia, providing for retention of data pertaining to a particular time period and/or geographical area and/or a group of persons likely to be involved in a serious crime. Such national legislation therefore exceeds the limits of what is strictly
necessary and cannot be considered to be justified within a democratic society, as required by the directive, read in the light of the Charter.
The Court makes clear however that the directive does not preclude national legislation from imposing a targeted retention of data for the purpose of fighting serious crime, provided that such retention of data is, with respect to the categories
of data to be retained, the means of communication affected, the persons concerned and the retention period adopted, limited to what is strictly necessary. The Court states that any national legislation to that effect must be clear and precise
and must provide for sufficient guarantees of the protection of data against risks of misuse. The legislation must indicate in what circumstances and under which conditions a data retention measure may, as a preventive measure, be adopted,
thereby ensuring that the scope of that measure is, in practice, actually limited to what is strictly necessary. In particular, such legislation must be based on objective evidence which makes it possible to identify the persons whose data is
likely to reveal a link with serious criminal offences, to contribute to fighting serious crime or to preventing a serious risk to public security.
As regards the access of the competent national authorities to the retained data, the Court confirms that the national legislation concerned cannot be limited to requiring that access should be for one of the objectives referred to in the
directive, even if that objective is to fight serious crime, but must also lay down the substantive and procedural conditions governing the access of the competent national authorities to the retained data. That legislation must be based on
objective criteria in order to define the circumstances and conditions under which the competent national authorities are to be granted access to the data. Access can, as a general rule, be granted, in relation to the objective of fighting
crime, only to the data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime. However, in particular situations, where for example vital national
security, defence or public security interests are threatened by terrorist activities, access to the data of other persons might also be granted where there is objective evidence from which it can be inferred that that data might, in a specific
case, make an effective contribution to combating such activities.
Further, the Court considers that it is essential that access to retained data should, except in cases of urgency, be subject to prior review carried out by either a court or an independent body. In addition, the competent national authorities
to whom access to retained data has been granted must notify the persons concerned of that fact.
Given the quantity of retained data, the sensitivity of that data and the risk of unlawful access to it, the national legislation must make provision for that data to be retained within the EU and for the irreversible destruction of the data at
the end of the retention period.
The view of the authorities
David Anderson, the Independent Reviewer of Terrorism Legislation, gives a lucid response outlining the government's case for mass surveillance. The official justification is easily summarised: mass retention clearly assists in the detection of serious crime. He simply does not mention that the government, having justified grabbing the data on grounds of serious crime detection, will share it willy-nilly with all sorts of government departments for their own convenience, way beyond the reasons set out in the official justification.
And when the authorities talk about their fight against 'serious' crime, recent governments have been updating legislation to redefine practically all crimes as 'serious' crimes. E.g. possessing a single spliff may in practice be a trivial crime, but the law on possession carries a high maximum sentence that qualifies it as a 'serious' crime. It does not become trivial until it goes to court and a trivial punishment has been handed down. So using mass snooping data to track down trivial drug users would be easily justified.
The judgment relates to a case brought by Deputy Leader of the Labour Party, Tom Watson MP, over intrusive data retention powers. The ruling says that:
- Blanket data retention is not permissible
- Access to data must be authorised by an independent body
- Only data belonging to people who are suspected of serious crimes can be accessed
- Individuals need to be notified if their data is accessed.
At present, none of these conditions are met by UK law.
Open Rights Group intervened in the case together with Privacy International, arguing that the Data Retention and Investigatory Powers Act (DRIPA), rushed through parliament in 2014, was incompatible with EU law. While the judgment will no longer affect DRIPA, which expires at the end of 2016, it has major implications for the Investigatory Powers Act.
Executive Director Jim Killock said:
The CJEU has sent a clear message to the UK Government: blanket surveillance of our communications is intrusive and unacceptable in a democracy.
The Government knew this judgment was coming but Theresa May was determined to push through her snoopers' charter regardless. The Government must act quickly to re-write the IPA or be prepared to go to court again.
Data retention powers in the Investigatory Powers Act will come into effect on 30 Dec 2016. These mean that ISPs and mobile phone providers can be obliged to keep data about our communications, including a record of the
websites we visit and the apps we use. This data can be accessed by the police but also a wide range of organisations like the Food Standards Agency, the Health and Safety Executive and the Department of Health.
The Lords had their first debate on the Digital Economy Bill, which includes laws to require age verification as well as extension of outdated police and BBFC censorship rules to the internet.
Lords inevitably queued up to support the age verification requirements. However, a couple of the lords made cautionary remarks about the privacy issues of websites being able to build up dangerous databases of the personal ID information of porn users.
A couple of lords also spoke out against the BBFC/police/government censorship prohibitions being included in the bill. It was noted that these rules are outdated and disproportionate, and perhaps require further debate in another bill.
As an example of these points, the Earl of Erroll (cross bencher) said:
My Lords, I welcome the Bill because it has some very useful stuff in it -- but, like everything else, it might benefit from some tweaking. Many other speakers mentioned the tweaks that need to be made, and if that happens I think that we may
end up with quite a good Bill.
I will concentrate on age verification because I have been working on this issue with a group for about a year and three-quarters. We spotted that its profile was going to be raised because so many people were worried about it. We were the first
group to bring together the people who run adult content websites -- porn websites -- with those who want to protect children. The interesting thing to come out quite quickly from the meetings was that, believe it or not, the people who run porn
sites are not interested in corrupting children because they want to make money. What they want are adult, middle-aged people, with credit cards from whom they can extract money, preferably on a subscription basis or whatever. The stuff that
children are getting access to is what are called teaser adverts. They are designed to draw people in to the harder stuff inside, you might say. The providers would be delighted to offer age verification right up front so long as all the others
have to comply as well -- otherwise they will get all the traffic. Children use up bandwidth. It costs the providers money and wastes their time, so they are very happy to go along with it. They will even help police it, for the simple reason
that it will block the opposition. It is one of the few times I approve of the larger companies getting a competitive advantage in helping to police the smaller sites that try not to comply.
One of the things that became apparent early on was that we will not be able to do anything about foreign sites. They will not answer mail or do anything, so blocking is probably the only thing that will work. We are delighted that the
Government has gone for that at this stage. Things need to get blocked fast or sites will get around it. So it is a case of block first, appeal later, and we will need a simple appeals system. I am sure that the BBFC will do a fine job, but we
need something just in case.
Another thing that came back from the ISPs is that they want more clarity about what should be blocked, how it will be done and what they will have to do. There also needs to be indemnity. When the ISPs block something for intellectual property
and copyright reasons, they are indemnified. They would need to have it for this as well, or there will be a great deal of reluctance, which will cause problems.
The next thing that came up was censorship. The whole point of this is we want to enforce online what is already illegal offline. We are not trying to increase censorship or censor new material. If it is illegal offline, it should be illegal online
and we should be able to do something about it. This is about children viewing adult material and pornography online. I am afraid this is where I slightly disagree with the noble Baroness, Lady Kidron. We should decide what should be blocked
elsewhere; we should not use the Bill to block other content that adults probably should not be watching either. It is a separate issue. The Bill is about protecting children. The challenge is that the Obscene Publications Act has some
definitions and there is ATVOD stuff as well. They are supposed to be involved with time. CPS guidelines are out of step with current case law as a result of one of the quite recent cases -- so there is a bit of a mess that needs clearing up.
This is not the Bill to do it. We probably need to address it quite soon and keep the pressure on; that is the next step. But this Bill is about keeping children away from such material.
The noble Baroness, Lady Benjamin, made a very good point about social platforms. They are commercial. There are loopholes that will get exploited. It is probably unrealistic to block the whole of Twitter -- it would make us look like idiots. On the other hand, there are other things we can do. This brings me to the point that other noble Lords made about ancillary service providers. If we start to make the payment service providers comply and help, they will make it less easy for
those sites to make money. They will not be able to do certain things. I do not know what enforcement is possible. All these sites have to sign up to terms and conditions. Big retail websites such as Amazon sell films that would certainly come
under this category. They should put an age check in front of the webpage. It is not difficult to do; they could easily comply.
We will probably need an enforcer as well. The BBFC is happy to be a regulator, and I think it is also happy to inform ISPs which sites should be blocked, but other enforcement stuff might need to be done. There is provision for it in the Bill.
The Government may need to start looking for an enforcer.
Another point that has come up is about anonymity and privacy, which is paramount. Imagine the fallout if some hacker found a list of senior politicians who had had to go through an age-verification process on one of these websites, which would
mean they had accessed them. They could bring down the Government or the Opposition overnight. Noble Lords could all go to the MindGeek website and look at the statistics, where there is a breakdown of which age groups and genders are accessing
these websites. I have not dared to do so because it will show I have been to that website, which I am sure would show up somewhere on one of these investigatory powers web searches and could be dangerous.
One of the things the Digital Policy Alliance, which I chair, has done is sponsor a publicly available specification, which the BSI is behind as well. There is a lot of privacy-enforcing stuff in that. It is not totally obvious; it is not finished
yet, and it is being highlighted a bit more. One thing we came up with is that websites should not store the identity of the people whom they age-check. In fact, in most cases, they will bounce straight off the website and be sent to someone
called an attribute provider, who will check the age. They will probably know who the person is, but they will send back to the website only an encrypted token which says, We've checked this person that you sent to us. Store this token. This
person is over 18 -- or under 18, or whatever age they have asked to be confirmed. On their side, they will just keep a record of the token but will not say to which website they have issued it -- they will not store that, either. The link
is the token, so if a regulator or social service had to track it down, they could physically take the token from the porn site to where it came from, the attribute provider, and say, Can you check this person's really over 18, because we
think someone breached the security? What went wrong with your procedures? They can then reverse it and find out who the person was -- but they could still perhaps not be told by the regulator which site it was. So there should be a security
cut-out in there. A lot of work went into this because we all knew the danger.
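The token flow Lord Erroll describes -- the site bouncing a visitor to an attribute provider, which returns only an opaque over-18 token, with a reverse audit path available to a regulator -- can be sketched roughly as follows. This is an illustrative sketch only, not the actual specification: the names `AttributeProvider`, `Site`, `verify_age` and `audit` are hypothetical.

```python
# Illustrative sketch of the attribute-provider token scheme described above.
# Class and method names are hypothetical; not drawn from any real specification.
import secrets


class AttributeProvider:
    """Checks a person's age and issues an opaque token.

    Records which person a token was issued for, but deliberately
    does NOT record which website requested the check.
    """

    def __init__(self):
        self._issued = {}  # token -> (person_id, over_18)

    def verify_age(self, person_id, over_18):
        # In reality the provider would check documents or records;
        # here the result is passed in for illustration.
        token = secrets.token_hex(16)  # opaque, unguessable token
        self._issued[token] = (person_id, over_18)
        return token, over_18

    def audit(self, token):
        """Regulator-only reverse lookup: who was this token issued for?"""
        return self._issued[token]


class Site:
    """Stores only the token -- never the visitor's identity."""

    def __init__(self):
        self._tokens = set()

    def admit(self, token, over_18):
        if over_18:
            self._tokens.add(token)  # keep the token as proof of the check
        return over_18
```

The key property is the deliberate split of knowledge: the provider knows who was checked but not where the token was used; the site holds the token but not the identity; only by physically carrying a token from the site back to the provider can a regulator reconstruct the link, which matches the "security cut-out" Lord Erroll describes.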
This is where I agree entirely with the Open Rights Group, which thinks that such a measure should be mandated. Although the publicly available specification, which is almost like a British standard, says that privacy should be mandated under
general data protection regulation out of Europe, which we all subscribe to, I am not sure that that is enough. It is a guideline at the end of the day and it depends on how much emphasis the BBFC decides to put on it. I am not sure that we
should not just put something in the Bill to mandate that a website cannot keep a person's identity. If the person, after they have proved that they are 18, then decides to subscribe to the website freely and to give it credit card details and stuff like that, that is a different problem -- I am not worried about that. That is something else. That should be kept extremely securely and I personally would not give my ID to such a site -- but at the age verification end, it must be anonymous.
There are some other funny things behind the scenes that I have been briefed on, such as the EU VAT reporting requirements under the VAT Mini One Stop Shop, which requires sites to keep some information which might make a person identifiable.
That could apply if someone was using one of the attribute providers that uses a credit card to provide that check or if the website itself was doing that. There may be some things that people will have to be careful of. There are some perfectly
good age-checking providers out there who can do it without you having to give your details. So it is a good idea; I think that it will help. Let us then worry about the point that the noble Baroness, Lady Kidron, made so well about what goes
The universal service obligation should be territorial; it has to cover the country and not just everyone's homes. With the internet of things coming along -- which I am also involved in because I am chair of the Hypercat Alliance, which is
about resource discovery over the internet of things -- one of the big problems is that we are going to need it everywhere: to do traffic monitoring, people flows and all the useful things we need. We cannot have little not-spots, or the
Government will not be able to get the information on which to run all sorts of helpful control systems. The noble Lord, Lord Gordon of Strathblane, referred to mast sharing. The problem with it is that they then do not put masts in the not-spots; they just keep the money and work off just one mast -- you still get the not-spots. If someone shares a mast, they should be forced to put a mast somewhere else, which they then share as well.
On broadband take-up, people say, Oh, well, people aren't asking for it . It is chicken and egg: until it is there, you do not know what it is good for. Once it is there and suddenly it is all useful, the applications will flow. We have
to look to the future; we have to have some vision. Let us get chicken or the egg out there and the chicken will follow -- I cannot remember which way round it is.
I agree entirely with the noble Lord, Lord Mitchell, that the problem with Openreach is that it will always be controlled by its holding company, which takes the investment, redirects it and decides where the money goes. That is the challenge
with having it overseeing.
I do not want to waste much time, because I know that it is getting late-ish. On jobs, a huge number of jobs will be created in the early days of installing and maintaining internet of things sensors all over the place -- though that will change. On the
gigabit stuff, it will save travel, energy and all sorts of things -- we might even do remote-control hip operations, so you send the device and the surgeon then does it remotely, once we get super-duper superfast broadband.
I want to say one thing about IP. The Open Rights Group raised having thresholds of seriousness. It is quite important that we do not start prosecuting people on charges with 10-year sentences for trivial things. But it is also sad how
interesting documentaries can disappear terribly quickly. The catch-up services cover only a month or so and if you are interested, it is quite nice being able to find these things out there on the internet a year or two later. There should
somehow be a publicly available archive for all the people who produce interesting documentaries. I do not know whether they should make a small charge for it, but it should be out there.
The Open Rights Group also highlighted the bulk sharing of data. Some of the stuff will be very useful -- the briefing on free school meals is interesting -- but if you are the only person who really knows what might be leaked, it is very
dangerous. If someone were to beat you up, an ordinary register could leak your address across without realising that at that point you are about to go into witness protection. There can be lots of problems with bulk data sharing, so be careful;
that is why the insurance database was killed off a few years ago. Apart from that, I thank your Lordships for listening and say that, in general, this is a good effort.
Murray Perkins of the BBFC explains how all the world's major porn websites will have to be totally banned in Britain (even if they set up age verification systems) under the censorship rules contained in the Digital Economy Bill
The BBFC currently cuts about 15% of all R18 porn films on their way to totally ordinary mainstream porn shops. These are not niche or speciality films; they are totally middle-of-the-road porn, representing the sort of content on all the world's major porn sites. Most of the cuts are ludicrous, but Murray Perkins, a senior examiner of the BBFC, points out that they are all considered either to be harmful, or else are still prohibited by the police or the government for reasons that have long since passed their sell-by date.
So about a sixth of all the world's adult films are therefore considered prohibited by the British authorities, and so any website containing such films will have to be banned, as there is no practical way to cut out the bits that wind up censors, police or government. And this mainstream but prohibited content appears on just about all the world's major porn sites, free or paid.
The main prohibitions that will cause a website to be blocked (even before considering whether it will set up strict age verification) cover such mainstream content as female ejaculation, urine play, gagging during blow jobs, rough sex, incest story lines (which are a major genre of porn at the moment), use of the word 'teen' and verbal references to under-18s.
Murray Perkins has picked up the job of explaining this catch-all ban. He explains it well, but he tries to throw readers off track by citing examples of prohibitions being justifiable because they apply to violent porn, whilst not mentioning that they apply equally well to trivia such as female squirting.
Perkins writes in the Huffington Post:
Recent media reports highlighting what content will be defined as prohibited material under the terms of the Digital Economy Bill could have given an inaccurate impression of the serious nature of the harmful material that the BBFC generally
refuses to classify. The BBFC works only to the BBFC Classification Guidelines and UK law, with guidance from the Crown Prosecution Service (CPS) and enforcement bodies, and not to any other lists.
The Digital Economy Bill aims to reduce the risk of children and young people accessing, or stumbling across, pornographic content online. It proposes that the BBFC check whether
(i) robust age verification is in place on websites containing pornographic content and
(ii) whether the website or app contains pornographic content that is prohibited.
An amendment to the Digital Economy Bill, passed in the House of Commons, would also permit the BBFC to ask Internet Service Providers (ISPs) to block pornographic websites that refuse to offer effective age verification or contain prohibited
material such as sexually violent pornography.
In making any assessment of content, the BBFC will apply the standards used to classify pornography that is distributed offline. Under the Video Recordings Act 1984 the BBFC is obliged to consider harm when classifying any content including 18
and R18 rated sex works. Examples of material that the BBFC refuses to classify include pornographic works that: depict and encourage rape, including gang rape; depict non-consensual violent abuse against women; promote an interest in incestuous
behaviour; and promote an interest in sex with children. [Perkins misleadingly neglects to include squirting, gagging, and urine play in his examples here]. The Digital Economy Bill defines this type of unclassifiable material as "prohibited".
Under its letters of designation the BBFC may not classify anything that may breach criminal law, including the Obscene Publications Act (OPA) as currently interpreted by the Crown Prosecution Service (CPS). The CPS provides guidance on acts
which are most commonly prosecuted under the OPA. The BBFC is required to follow this guidance when classifying content offline and will be required to do the same under the Digital Economy Bill. In 2015, 12% of all cuts made to pornographic
works classified by the BBFC were compulsory cuts under the OPA. The majority of these cuts were to scenes involving urolagnia, which is in breach of CPS guidance and could be subject to prosecution.
It was Conservative MP and former minister John Whittingdale who introduced the bill. But now, the BBC is reporting that he's worried it might not actually work. He told Parliament:
One of the main ways in which young people are now exposed to pornography is through social media such as Twitter, and I do not really see that the bill will do anything to stop that happening.
This gets neatly at a key problem with the porn filter: The internet is not neatly divided into pornography and non-pornography. As I wrote last week, it's technically simple to block dedicated fetish websites. But plenty of sites mix porn with
non-pornographic content, or include both conventional and non-conventional material -- raising serious questions as to how the filter could ever work in practice.
Warning: Fake News Alert: When did politicians ever care about a robust evidence base when issues of morality are at stake?
In July the Home Affairs Committee said soliciting for sex in England and Wales should no longer be a criminal offence. MPs also suggested sex workers should be able to share premises rather than risk working alone.
However, such policies are way too liberal for the government and so they have commissioned another research report, no doubt hoping that it will reach a more proscriptive solution. After all there are still lots of men to jail for the heinous
crime of simply trying to enjoy the pleasures of life.
Home Secretary Amber Rudd has said that a robust evidence base was needed before policy changes were addressed. And so another Home Office research project has been commissioned and will report back next June. Rudd commented that any
government response should include:
Ensuring those involved in prostitution and sex work are safeguarded, that traffickers and those who exploit vulnerable people can be effectively targeted, and ensuring that community concerns about prostitution and sex work can be addressed.
Among the many unpleasant things in the Investigatory Powers Act that was officially signed into law this week, one that has not gained as much attention is the apparent ability for the UK government to undermine encryption and demand the removal of electronic protection on encrypted communications.
As the bill was passing through Parliament, several organizations noted their alarm at section 217, which obliges ISPs, telcos and other communications providers to let the government know in advance of any new products and services being deployed
and allow the government to demand technical changes to software and systems.
Communications Service Providers (CSP) subject to a technical capacity notice must notify the Government of new products and services in advance of their launch, in order to allow consideration of whether it is necessary and proportionate to
require the CSP to provide a technical capability on the new service.
As per the final wording of the law, comms providers on the receiving end of a technical capacity notice will be obliged to do various things on demand for government snoops -- such as disclosing details of any system upgrades and removing
electronic protection on encrypted communications.
The Liberal Democrats are to oppose plans to censor internet porn sites in the name of 'protecting the children'. Brian Paddick, Liberal Democrat Shadow Home Secretary, said:
Liberal Democrats will do everything possible to ensure that our privacy is not further eroded by this Tory government.
Clamping down on perfectly legal material is something we would expect from the Russian or Chinese governments, not our own. Of course the internet cannot be an ungoverned space, but banning legal material for consenting adults is not the right approach.
The Internet Service Provider Association has also said moves to force providers to block adult sites that do not age verify has the potential to significantly harm the digital economy . ISPA chair James Blessing said:
The Digital Economy Bill is all about ensuring the UK continues to be a digital world leader, including in relation to internet safety. This is why ISPA supported the government's original age verification policy for addressing the problem of
underage access of adult sites at source.
Instead of rushing through this significant policy change, we are calling on government to pause and have a substantive discussion on how any legal and regulatory change will impact the UK's dynamic digital economy and the expectations and
rights of UK Internet users.
In placing the BBFC as official guardians of morality, alternative depictions of sexuality such as that by the growing feminist pornography movement and the BDSM community are threatened. By Vonny Moyes
Digital Economy Bill Age Verification Letters of Understanding
On 06 October 2016, the BBFC exchanged letters of understanding with DCMS confirming DCMS's intention, in principle, to appoint the BBFC to take on a regulatory role in the age verification of pornographic content online, as proposed in the
Digital Economy Bill. These letters are available below.
The Digital Economy Bill contains measures to establish the same standard of protection online as currently exists offline with the aim of reducing the risk of children and young people accessing, or stumbling across, pornographic content online.
The BBFC's proposed role in the age verification of pornographic content online, as laid out in the Digital Economy Bill, is subject to designation by both Houses of Parliament.
The Letter of Understanding from Baroness Shields of the DCMS to David Austin reads:
I would like to thank you for the British Board of Film Classification's continuous help and support in developing the Government's manifesto commitment to introduce Age Verification (AV) checks for online pornography.
As you know, the AV clauses contained in the Digital Economy Bill have been designed to ensure that pornographic material must not normally be accessible online to users in the UK on a commercial basis without appropriate age verification
checks. We appreciate BBFC's ongoing support especially in helping develop effective options for Stages 1-3 of the proposed regulatory framework. I understand you have worked with my officials in thinking through these proposals and had a
productive meeting on 16 September to discuss your role in more detail.
We are committed to this policy and aim to introduce an effective regulatory framework to enable its smooth delivery. BBFC's experience in making effective editorial judgements is important to the success of the policy. I would like to invite
the BBFC to take on a regulatory role within the proposed framework, subject to the particulars of the proposed designation being laid in both Houses of Parliament. In working together, it is our intention that:
Both DCMS and the BBFC are committed to working openly and transparently to establish an effective regulatory framework for the age verification of pornographic content online;
That the BBFC will create a proportionate, accountable, independent and expert regulatory function, that would seek among its aims to promote voluntary compliance and advise Her Majesty's Government (HMG) more widely on reducing the risk of
pornography being made readily available to children;
That the BBFC will be responsible for Stages 1-3 of the proposed regulatory framework and that any enforcement function under the current Bill Clauses 20 and 21 will be carried out by another regulator that will have equal status to the BBFC;
DCMS will fund the BBFC's start-up costs, and those already incurred, subject to final agreement once legislative approvals are in place.
Please note, this letter is non-binding and constitutes an indication of intent rather than creating a liability or obligation of any nature whatsoever on DCMS or the BBFC.
I look forward to hearing from you very soon and would like to thank you once again for your valuable contribution and ongoing co-operation.
When you legislate at break-neck speed, and fail to consult, things will go wrong. This is absolutely the case with Age Verification (AV) in the Digital Economy Bill, which now seems set to include website blocking to bolster use of AV
technologies. This is likely to lead to high risks of credit card fraud and privacy abuse.
Currently the BBFC are pinning their hopes on being able to specify some kind of privacy and safety standard through their ability to regulate arrangements that deliver age verified material. Sites must deliver pornographic material:
in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18
The regulator can then issue guidance for:
types of arrangements for making pornographic material available that the regulator will treat as complying
The claim is that this mechanism allows the guidance to specify what kind of AV is private and secure.
However, if the BBFC are told to block non-compliant websites, in practice they will have to accept any system that websites use that verifies age. To do otherwise would be highly unfair: why should a site with legal material, that uses
its own AV system, end up blocked by the BBFC?
This will especially apply to systems that require registration / credit card tests. There are plenty of paysites already of course. These are not privacy friendly, as they strongly identify the user to the website - and they have to do this to
minimise fraudulent payment card transactions. That's alright as a matter of choice of course, but dangerous when it is done purely as a means of age verification.
If asking for credit card details becomes common or permissible, and a credible ask in the minds of UK citizens, then the government will have created a gold mine for criminals to operate scam porn sites targeted at the UK, inviting people to
supply their credit cards to scam sites for Age Verification. In fact you could see this being extended to all manner of sites that a criminal could claim were blocked until you prove you're over 18.
[Image: Verified by Visa fraud]
Once credit card details are harvested, in return for some minimal/copyright infringing porn access at a scam porn site, then criminals can of course resell them for fraud. Another easy to understand example of a criminal abusing this system is
that you could see criminals typo-squatting on relevant domain names such as youporm.com and asking for a credit card to gain access. Anything that normalises the entry of credit card details into pages where the user isn't making a payment will
increase the fraudulent use of such cards. And if a website is validating credit cards to prove age, but not verifying them, then the internationally agreed standards to protect credit card data are unlikely to apply to them.
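The gap between "validating" and "verifying" a card is worth spelling out. A validation can be as weak as checking that the number is well-formed under the Luhn checksum, which proves nothing about whether the card is real, active, or held by an adult. A minimal illustrative sketch (not any AV provider's actual check):

```python
def luhn_valid(card_number: str) -> bool:
    """Check whether a card number passes the Luhn checksum.

    Passing proves only that the digits are well-formed -- not that
    the card exists, is active, or belongs to someone over 18.
    """
    digits = [int(ch) for ch in card_number if ch.isdigit()]
    total = 0
    # Double every second digit from the right; subtract 9 if the result exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d = d * 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# The well-known Visa test number passes, despite not being a real card.
print(luhn_valid("4111 1111 1111 1111"))  # True
```

A scam site performing only this kind of check can plausibly claim to be "age verifying" while harvesting the full card number for resale.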
Website blocking makes these scams more likely because the BBFC is likely to have to sacrifice control of the AV systems that are permissible, and a diversity of AV systems makes it hard for users to understand what is safe to do. During the
committee stage of the Digital Economy Bill, we argued that the AV regulator should be highly specific about the privacy and anonymity protections, alongside the cyber security consequences. We argued for a single system with perhaps multiple
providers, that would be verifiable and trusted. The government on the other hand believes that market-led solutions should be allowed to proliferate. This makes it hard for users to know which are safe or genuine.
If website blocking becomes part of the enforcement armoury, then websites that employ unsafe but effective, or novel and unknown, AV systems will be able to argue that they should not be blocked. The BBFC is likely to have to err on the side of
caution - it would be an extreme step to block an age-verifying website just because it hadn't employed an approved system.
The amount of website blocking that takes place will add to the scamming problem and open up new opportunities for innovative criminals. The BBFC seems to be set to have an administrative power to order ISPs to block. If this is the case,
the policy would appear to be designed to block many websites, rather than a small number. The more blocking of sites that users encounter, the more they will get used to the idea that age verification is in use for pornography or anything that
could possibly be perceived as age-restricted, and therefore trust the systems they are presented with. If this system is not always the same, but varies wildly, then there are plenty of opportunities for scams and criminal compromise of
poorly-run Age Verification systems.
Security and privacy problems can be minimised, but are very, very hard to avoid if the government goes down the website blocking route. What MPs need to know right now is that they are moving too fast to predict the scale of the problems they
are opening up.
The Digital Economy Bill is primarily reprehensible for introducing mass internet censorship, but don't forget it also enables the rapid sharing of government databases to more or less any official who makes a request
Well, Part 5 of the Bill will fundamentally change the way our personal information is handled, shared and controlled whenever we hand it over to government.
That means that whenever we file a tax return, apply for a driving licence, register a birth, death or marriage, apply for benefits or deal with a council, court or other public authority, we will have no control over any of the data we share.
Because if Part 5 of the Bill becomes law:
As soon as you share anything with the government, you will be blocked from having any further control over how your personal information and sensitive data is shared around government, with councils, other government bodies and business.
You will not be allowed to change your data if there is a mistake or error.
You will not be asked permission or informed if an official shares, uses or looks at your data.
You will not be allowed to opt out of your data being shared.
Your birth, death, marriage and civil registration documents will be shared in bulk without your consent.
Data sharing is a fact of life and a great deal of good can come from the sharing of data, but as soon as our data is digitised it is insecure and open to exploitation.
We see this every time we read of a big company suffering a data breach or data hack. And government aren't immune: in 2014/15 government experienced 9,000 data breaches, possibly down to poor data sharing practice, certainly down to not
understanding data protection laws.
Our data is us -- it is who we are, what we do, how we live and who we know. If we don't know where it is going, who it is shared with, why it is used and what we can do to control access to it, the future of all our personal information is at risk.
If you are worried please write to your MP this week and tell them, because without challenge this Bill will pass and control of our personal information will be lost to Government forever.
Reasons why the government's plan 'to protect children online' is not just dreadful but extremely alarming.
21st November 2016
From a Melon Farmers reader
It's already been announced that the government are to press ahead with their controversial plans to create a huge database of all the activities of every internet user in the UK. Every time you visit any website,
the time and date and the name of the website will be recorded. There are no exemptions.
Such a system of blanket surveillance has not been used or proposed in any other country.
You might think then, that after such an announcement, they would have been a little muted for a short while in proposing yet more heavy handed legislation aimed at the internet. Not a bit of it. Now they really seem to have the bit between their
teeth and are charging full steam ahead with, if possible, even more draconian powers.
In the 1980s, as a result of the backlash against video nasties, the government handed complete censorship of all video media to the British Board of Film Censors, now renamed the British Board of Film Classification (because they don't
like to be thought of as censors). A bit like the way the ministry of propaganda preferred to be called the Ministry of Truth in George Orwell's 1984. Appropriately enough, this bill was made law in 1984.
Now, the latest proposal is to effectively hand censorship of the entire internet over to the same people!
The argument is that if a website which is unsuitable for children does not have adequate checks in place to verify the users age, the BBFC will be able to block it. This might sound reasonable in theory but in practice it will culminate in a
monstrous invasion of internet freedom and dangers for internet users. Here's why:
Most people know that such controls can be effectively by-passed with the use of a proxy server, or, on a phone or tablet, a simple app which redirects internet traffic through a secure unfiltered connection. The problem with this is that it
introduces a whole new level of risk and exposure to criminality. Traffic can be routed, without the user knowing, via servers which are known to contain criminal content thus giving the appearance that the user has been accessing child
pornography, terrorist information or other material which could incriminate them.
Amongst the honest firms who run proxy servers there are con-men and criminals waiting to catch the unwary. Ransom demands and other criminal activity are often the actual business which is sitting behind a link for what appears to be a
proxy server. If you don't believe me, please do your own research.
Identification will be a nightmare. Making porn or other websites take credit or debit card details as a check of age is preposterous. Very few people would want to trust giving their credit or debit card details to a website just to even see
what is on it.
It's even been suggested that these websites could cross check the UK electoral roll. How's that supposed to work? Presumably not so anybody can give the name and address of someone they dislike and that goes down on the government's list of
names and addresses of people who've visited dodgy websites?
The BBFC can not just censor but entirely block any web site that contains anything they disagree with! For example if the site contains anything which they would not allow in a BBFC certificated video. They would argue that it was their
duty . Since a website containing any nudity at all, or discussion of sex, or any other thing which is not suitable for children , should be behind an age protected barrier, this will allow them to block any web site they wish. If a
site with discussion about something which is not suitable for a small child, say in the US or Canada, cannot be bothered to deal with the BBFC, it can simply be blocked completely in the UK if the owners do not cravenly submit to the demands
of a government censor in another country! Not that the websites will probably care, having written off internet users in the UK the same way as they would people who are blocked from access by any other dictatorial government around the world.
In addition to websites being blocked, if a server contains a small amount of anything which is unsuitable for children, the domain itself, containing many other web sites, can be blocked. Because most countries in the world are more broad
minded and less adamant about state control of what people see than the UK, nobody else will have noticed that UK users are being blocked from access to perfectly normal information just because their domain has been blacklisted.
Who is going to pay for this work to be done? The BBFC can currently pay for their video censorship work because the Video Recordings Act requires that firms in the UK have no option by law but to pay their fees, running to several thousand pounds for each video submitted.
How do you think the BBFC is going to get on with the owners of foreign websites?
Ah, hello Mr Dirty Website Owner, this is the BBFC here, we want you to follow our regulations and pay us our fees or I'm afraid I'll have to inform you that Her Majesty's government will block UK users from access to your website.
Mr Dirty Website Owner's response is something which you can probably imagine yourself. It probably involves some rather colourful language telling the BBFC where they can stick their regulations and fees.
The government has already required ISPs to provide filtered child friendly internet connections for anyone who wants it. However, since the population have generally been less than enthusiastic about uptake of filtered internet connections the government has decided that this is not good enough and so you *will* have a censored internet connection *and like it* even though 70% of households in the UK have no children.
If this truly was a matter of protecting children, then the problem would lie with the 10 to 15% of homes with children where the adults have not switched on the filters. It would be far more sensible to amend the law to require homes where
children are present to have the filters switched on. But this just proves that it *isn't* just a matter of protecting children: what they really want is *total* control, and you don't get that with an opt-in scheme. The plan is to censor the
internet to the extent that these filtered connections are no longer required.
Going back to proxy servers again, since this is such an easy way to avoid the censorship, and since, unfortunately, proxy servers allow access to anything, even stuff 99.9% of people really don't want to see, this will give the government a
*perfect excuse* to ban proxy servers as well. And there you have it: TOTAL INTERNET CENSORSHIP. You could probably still download and install a proxy server, but if you are detected using it you could be marched down to the local police
station for questioning, and since there is no excuse to be using a proxy server as they will be illegal, they can assume you were planning a terrorist attack or watching child pornography and throw you in jail. Sorry, I mean detain you in a
cell pending trial, for the public good.
WAKE UP BRITAIN! Please don't allow the control freaks to take over your country. Print this article out, send it to your MP - don't let MPs simply be carried along by misguided nanny state meddling in basic democratic freedom under the guise of
protecting the children . The onus should be on parents to switch on the filters that have already been provided, not treat every adult in the UK as a child.
This proposed legislation is a continuation of the very slippery slope towards total state surveillance and control which has already been approved. If you don't stand up to this next level of state control, what will they think they can get
away with next?
Don't take this warning lightly: unless enough people object they will steamroller ahead with it and you will lose your freedom. Unless you want your internet to be suitable for a pre-school toddler with a vast number of other
harmless pages and websites blocked as a result, send this article to your MP now and ask for his or her comments.
Britain's minister of censorship culture has said that the government will move to block the vast majority of internet porn, both domestic and foreign.
Culture Secretary Karen Bradley threatened:
We made a promise to keep children safe from harmful pornographic content online and that is exactly what we are doing. Only adults should be allowed to view such content and we have appointed a regulator, BBFC, to make sure the right age checks
are in place to make that happen. If sites refuse to comply, they should be blocked.
In fulfilling this manifesto commitment and working closely with people like (MPs) Claire Perry and Kit Malthouse who have worked tirelessly on internet safety issues, we are protecting children from the consequences of harmful content.
The powers will be brought forward in amendments to the Digital Economy Bill later this month.
Porn websites will be allowed to stay open if they adopt onerous age validation but as yet no one has come up with a solution that is accurate, cheap, convenient and secure enough to be viable. The only currently acceptable method is to
allow porn only to those willing to pay with credit cards (debit cards not allowed). Not only do you have to go through the hassle of filling in credit card details, you have to trust potentially dodgy foreign websites with your ID information,
you have to pay before being able to see what is on offer. Needless to say, the UK adult online trade that has been subjected to this suffocating censorship regime has been forced to either go bankrupt or go abroad.
The British Board of Film Classification (BBFC), will be given powers to make ISPs censor porn sites which do not put age checks in place to make them inaccessible to children.
On a slightly more positive note, the BBFC said any verification mechanism must provide assurances around data protection and it would consider those that already exist and ones currently being developed. It is understood the government is
working with the BBFC to determine the best mechanism that confirms eligibility rather than identifying the user.
In the Digital Economy Bill, the Government wants erotica and pornography websites to make sure their users are over 18. This could threaten our privacy by collecting data on everyone in the UK who visits erotica and pornography sites. Making
sure all porn sites go along with it is unworkable. So a group of MPs want Internet Service Providers to block websites that don't comply. Sign our petition to say no to censorship of legal content.
MPs are putting pressure on the Government to add measures to the Bill that would force Internet Service Providers to block erotica and pornography websites that don't verify the age of their users.
This equates to censorship of legal content - potentially affecting tens of thousands of websites and millions of people.
Blocking websites is a disproportionate, technical response to a complex, social issue. The UK's children need education, not censorship, to keep them safe.
The UK government has introduced an amendment to the Investigatory Powers Bill currently going through Parliament, to ensure that data retention orders cannot require ISPs to collect and retain third party data. The Home Office had
previously said that they didn't need powers to force ISPs to collect third party data, but until now refused to provide guarantees in law.
Third party data is defined as communications data (sender, receiver, date, time etc) for messages sent within a website as opposed to messages sent by more direct methods such as email. It is obviously a bit tricky for ISPs to try and decode
what is going on within websites as messaging data formats are generally proprietary, and in the general case, simply not decipherable by ISPs.
The Government will therefore snoop on messages sent, for example via Facebook, by demanding the communication details from Facebook themselves.
The Digital Economy Bill mandates that pornographic websites must verify the age of their customers. Are there any powers to protect user privacy?
Yesterday we published a blog
detailing the lack of privacy safeguards for Age Verification systems mandated in the Digital Economy Bill. Since then, we have been offered two explanations as to why the regulator designate, the BBFC, may think that privacy can be protected.
The first and most important claim is that Clause 15 may allow the regulation of AV services, in an open-ended and non-specific way:
15 Internet pornography: requirement to prevent access by persons under the age of 18
A person must not make pornographic material available on the internet on a commercial basis to persons in the United Kingdom except in a way that secures that, at any given time, the material is not normally accessible by persons under the
age of 18
The age-verification regulator (see section 17) must publish guidance about--
(a) types of arrangements for making pornographic material available that the regulator will treat as complying with subsection (1);
However, this clause seems to regulate publishers who "make pornographic material available on the internet" and what is regulated in 15 (3) (a) is the "arrangements for making pornographic material available". They do not mention age
verification systems, which are not really an "arrangement for making pornography available" except inasmuch as they are used by the publisher to verify age correctly.
AV systems are not "making pornography available".
The argument however runs that the BBFC could under 15 (3) (a) tell websites what kind of AV systems with which privacy standards they can use.
If the BBFC sought to regulate providers of age verification systems via this means, we could expect them to be subject to legal challenge for exceeding their powers. It may seem unfair to a court for the BBFC to start imposing new privacy and
security requirements on AV providers or website publishers that are not spelled out and when they are subject to separate legal regimes such as data protection and e-privacy.
This clause does not provide the BBFC with enough power to guarantee a high standard of privacy for end users, as any potential requirements are undefined. The bill should spell out what the standards are, in order to meet an 'accordance with the
law' test for intrusions on the fundamental right to privacy.
The second fig leaf towards privacy is the draft standard for age verification technologies
produced by the Digital Policy Alliance. This is being edited by the British Standards Institution, as
PAS 1296. It has been touted as the means by which commercial outlets will produce a workable system.
The government may believe that PAS 1296 could, via Clause 15 (3) (a), be stipulated as a standard that Age Verification providers abide by in order to supply publishers, thereby giving a higher standard of protection than data protection law alone.
PAS 1296 provides general guidance and has no means of strong enforcement towards companies that adopt it. It is a soft design guide that provides broad principles to adopt when producing these systems.
Contrast this, for instance, with the hard and fast contractual arrangements the government's Verify system has in place with its providers, alongside firmly specified protocols. Or card payment processors, who must abide by strict terms and
conditions set by the card companies, where bad actors rapidly get switched off.
The result is that PAS 1296
says little about security requirements , data protection standards, or anything else we are concerned about. It stipulates that the age verification systems cannot be sued for losing your data. Rather, you must sue the website owner, i.e.
the porn site which contracted with the age verifier.
There are also several terminological gaffes, such as referring to PII (personally identifying information), which is a US legal concept, rather than the EU and UK's 'personal data'; this suggests that PAS 1296 is very much a draft, and in fact appears to
have been hastily cobbled together.
However you look at it, the proposed PAS 1296 standard is very generic, lacks meaningful enforcement and is designed to tackle situations where the user has some control and choice, and can provide meaningful consent. This is not the case with
this duty for pornographic publishers. Users have no choice but to use age verification to access the content, and the publishers are forced to provide such tools.
Pornography companies meanwhile have every reason to do age verification as cheaply as possible, and possibly to harvest as much user data as they can, to track and profile users, especially where that data may in future, at the flick of a switch, be used for other purposes such as advertising-tracking. This combination of poor incentives has plenty of potential for disastrous consequences.
The Government wants people who view pornography to show that they are over 18, via Age Verification systems. This is aimed at reducing the likelihood of children accessing inappropriate content.
To this end the Digital Economy Bill creates a regulator that will seek to ensure that adult content websites verify the age of their users. Non-compliant sites will face monetary penalties or, in the case of overseas sites, the regulator will ask payment providers such as VISA to refuse to process UK payments for them.
There are obvious problems with this, which we detail below. However, the worst risks are worth going into in some detail, not least from the perspective of the Bill Committee, which wants the Age Verification system to succeed.
As David Austin of the BBFC, which will likely become the Age Verification Regulator, put it:
Privacy is one of the most important things to get right in relation to this regime. As a regulator, we are not interested in identity at all. The only thing that we are interested in is age, and the only thing that a porn website should be
interested in is age. The simple question that should be returned to the pornographic website or app is, "Is this person 18 or over?" The answer should be either yes or no. No other personal details are necessary.
However, the Age Verification Regulator has no duties in relation to the Age Verification systems. It will make sites verify age, or issue penalties, but it is given no duty to protect people's privacy or security, or to defend against the cybersecurity risks that may emerge from the Age Verification systems themselves.
David Austin's expectations are unfortunately entirely out of his hands.
Instead, the government appears to assume that Data Protection law will be adequate to deal with the privacy and security risks. Meanwhile, the market will provide the tools.
The market has a plethora of possible means to solve this problem. Some involve vast data trawls through Facebook and social media. Others plan to link people's identity across web services and will provide a way to profile people's porn viewing habits. Still others attempt to piggyback upon payment providers and risk confusing their defences against fraud. Many appear to encourage people to submit sensitive information to services over which the users, and the regulator, will have little or no control.
And yet, for all the risks that these solutions pose, all of them may be entirely data protection compliant. This is because data protection law allows people to share pretty much whatever they agree to share, on the basis that they are free to make agreements with whomever they wish, by providing 'consent'.
In other words: Data protection law is simply not designed to govern situations where the user is forced to agree to the use of highly intrusive tools against themselves.
What makes this proposal more dangerous is that the incentives for the industry are poor and lead in the wrong direction. They have no desire for large costs, but would benefit vastly from acquiring user data.
If the government wants to have Age Verification in place, it must mandate a system that increases the privacy and safety of end users, since the users will be compelled to use Age Verification tools. Also, any and all Age Verification solutions
must not make Britain's cybersecurity worse overall, e.g. by building databases of the nation's porn-surfing habits which might later appear on Wikileaks.
The Digital Economy Bill's impact on the privacy of users should, in human rights law, be properly spelled out ("in accordance with the law") and be designed to minimise the impact on people (necessary and proportionate). A failure to provide these protections places the entire system under threat of legal challenge.
User data in these systems will be especially sensitive, being linked to private sexual preferences and potentially impacting particularly badly on sexual minorities if it goes wrong, through data breaches or simple chilling effects. This data is
regarded as particularly sensitive in law.
The Government, in fact, has at its hands a system called Verify which could provide age verification in a privacy-friendly manner. The Government ought to explain why the high standards of its own Verify system are not being applied to Age Verification, or indeed why it is not prepared to use its own systems to minimise the impacts.
As with web filtering, there is no evidence that Age Verification will prevent an even slightly determined teenager from accessing pornography, nor reduce demand for it among young people. The Government appears to be looking for an easy fix to a
complex social problem. The Internet has given young people unprecedented access to adult content but it's education rather than tech solutions that are most likely to address problems arising from this. Serious questions about the efficacy and
therefore proportionality of this measure remain.
However, legislating for the Age Verification problem to be "solved" without any specific regulation of the private sector operators who want to "help" is simply to throw the privacy of the UK's adult population to the mercy of the porn industry. With this in mind, we have drafted an amendment to introduce the duties necessary to minimise the privacy impacts, which could also reduce, if not remove, the free expression harms to adults.
The bill talks of 'disclosing' personal data to gas and electricity companies, yet there are no details about access limitations, data security, ethical use of data, nor of a trust framework to protect the privacy and security of citizens.
The BBFC has signed an agreement with the U.K. government to act as the country's new internet porn censor.
BBFC Director David Austin explained the censor's new role regulating online adult entertainment to a committee in Parliament weighing the 2016 Digital Economy Bill. Austin discussed how the BBFC will approach sites found to be in contravention of U.K. law with regard to verifying that adult content cannot be accessed by under-18s.
Austin said that the 2016 Digital Economy Bill now being weighed will achieve a great deal for the BBFC's new role as the age-verification enforcer. The piece of legislation, if given the OK, could impose financial penalties of up to £250,000 on noncomplying adult entertainment sites.
Austin said that the BBFC will methodically focus first on the largest offending websites, including foreign ones, notifying them of breaches of the U.K.'s mandatory age-verification laws. Austin said that offending sites will face a notification process that may include the filing of sanctions against sites' business partners, such as payment providers and others that supply ancillary services. Austin also mentioned that sanctioned sites could find their web properties blocked by IP address and de-indexed from search engines.
David Austin : My name is David Austin. I am the chief executive of the British Board of Film Classification.
Alan Wardle: I am Alan Wardle, head of policy and public affairs at the National Society for the Prevention of Cruelty to Children.
Louise Haigh (Sheffield, Heeley) (Lab)
Q David, am I right in interpreting the amendments that the Government tabled last night as meaning that you are intended to be the age verification regulator?
David Austin: That is correct. We reached heads of agreement with the Government last week to take on stages 1 to 3 of the regulation.
Louise Haigh Q Are you sufficiently resourced to take on that role?
David Austin: We will be, yes. We have plenty of time to gear up, and we will have sufficient resource.
Louise Haigh Q Will it involve a levy on the porn industry?
David Austin: It will involve the Government paying us the money to do the job on our usual not-for-profit basis.
Louise Haigh Q What risks do you envisage in people handing over their personal data to the pornographic industry?
David Austin: Privacy is one of the most important things to get right in relation to this regime. As a regulator, we are not interested in identity at all. The only thing that we are interested in is age, and the only thing that a
porn website should be interested in is age. The simple question that should be returned to the pornographic website or app is, "Is this person 18 or over?" The answer should be either yes or no. No other personal details are necessary.
We should bear in mind that this is not a new system. Age verification already exists, and we have experience of it in our work with the mobile network operators, where it works quite effectively--you can age verify your mobile phone, for
example. It is also worth bearing in mind that an entire industry is developing around improving age verification. Research conducted by a UK adult company in relation to age verification on their online content shows that the public is becoming
much more accepting of age verification.
Back in July 2015, for example, this company found that more than 50% of users were deterred when they were asked to age verify. As of September, so just a few weeks ago, that figure had gone down to 2.3%. It is established technology, it is
getting better and people are getting used to it, but you are absolutely right that privacy is paramount.
Louise Haigh Q Are you suggesting that it will literally just be a question--"Is the user aged 18?"--and their ticking a box to say yes or no? How else could you disaggregate identity from age verification?
David Austin: There are a number of third-party organisations. I have experience with mobile phones. When you take out a mobile phone contract, the adult filters are automatically turned on and the BBFC's role is to regulate what
content goes in front of or behind the adult filters. If you want to access adult content--and it is not just pornography; it could be depictions of self-harm or the promotion of other things that are inappropriate for children--you can go to
your operator, such as EE, O2 or Vodafone, with proof that you are 18 or over. It is then on the record that that phone is age verified. That phone can then be used in other contexts to access content.
Louise Haigh Q But how can that be disaggregated from identity? That person's personal data is associated with that phone and is still going to be part of the contract.
David Austin: It is known by the mobile network operator, but beyond that it does not need to be known at all.
Louise Haigh Q And is that the only form of age verification that you have so far looked into?
David Austin: The only form of age verification that we, as the BBFC, have experience of is age verification on mobile phones, but there are other methods and there are new methods coming on line. The Digital Policy Alliance, which
I believe had a meeting here yesterday to demonstrate new types of age verification, is working on a number of initiatives.
Claire Perry (Devizes) (Con) Q May I say what great comfort it is to know that the BBFC will be involved in the regulatory role? It suggests that this will move in the right direction. We all feel very strongly that the Bill
is a brilliant step in the right direction: things that were considered inconceivable four or five years ago can now be debated and legislated for.
The fundamental question for me comes down to enforcement. We know that it is difficult to enforce anything against offshore content providers; that is why in the original campaign we went for internet service providers that were British
companies, for whom enforcement could work. What reassurance can you give us that enforcement, if you have the role of enforcement, could be carried out against foreign entities? Would it not be more appropriate to have a mandatory take-down
regime if we found that a company was breaking British law by not asking for age verification, as defined in the Bill?
David Austin: The BBFC heads of agreement with the Government does not cover enforcement. We made clear that we would not be prepared to enforce the legislation in clauses 20 and 21 as they currently stand. Our role is focused much
more on notification; we think we can use the notification process and get some quite significant results.
We would notify any commercially-operated pornographic website or app if we found them acting in contravention of the law and ask them to comply. We believe that some will and some, probably, will not, so as a second backstop we would then be
able to contact and notify payment providers and ancillary service providers and request that they withdraw services from those pornographic websites. So it is a two-tier process.
We have indications from some major players in the adult industry that they want to comply--PornHub, for instance, is on record on the BBC News as having said that it is prepared to comply. But you are quite right that there will still be gaps in
the regime, I imagine, after we have been through the notification process, no matter how much we can achieve that way, so the power to fine is essentially the only real power the regulator will have, whoever the regulator is for stage 4.
For UK-based websites and apps, that is fine, but it would be extremely challenging for any UK regulator to pursue foreign-based websites or apps through a foreign jurisdiction to uphold a UK law. So we suggested, in our submission of evidence to
the consultation back in the spring, that ISP blocking ought to be part of the regulator's arsenal. We think that that would be effective.
Claire Perry Q Am I right in thinking that, for sites that are providing illegally copyrighted material, there is currently a take-down and blocking regime that does operate in the UK, regardless of their jurisdiction?
David Austin: Yes; ISPs do block website content that is pirated. There was research published earlier this year in the US that found that it drove traffic to pirated websites down by about 90%. Another tool that has been used in relation to IP protection is de-indexing, whereby a search engine removes the infringing website from any search results. We also see that as a potential way forward.
Thangam Debbonaire (Bristol West) (Lab) Q First, can I verify that you both support adding in the power to require ISPs to block non-compliant sites?
David Austin: Yes.
Alan Wardle: Yes, we support that.
Thangam Debbonaire Q Good. That was quick. I just wanted to make sure that was there. What are your comments on widening the scope, so that age verification could be enforced for matters other than pornography, such as
violent films or other content that we would not allow in the offline world? I am talking about things such as pro-anorexia websites. We know that this is possible to do in certain formats, because it is done for other things, such as copyright
infringement. What are your views on widening the scope and the sanctions applying to that?
Alan Wardle: We would support that. We think the Bill is a really great step forward, although some things, such as enforcement, need to be strengthened. We think this is an opportunity to see how you can give children parity of
protection in the online and the offline worlds.
It is very good, from our perspective, that the BBFC is doing this, because they have got that expertise. Pornography is not the only form of harm that children see online. We know from our research at the NSPCC that there are things like graphic
violence. You mentioned some of the pro-anorexia and pro-suicide sites, and they are the kind of things that ought to be dealt with. We are supporting developing a code of practice with industry to work out what those harms are--and that is very
much a staged approach.
We take it for granted that when, for instance, a child goes to a youth group or something like that, we make sure there are protections there, and that the staff are CRB checked. Somehow it seems that for children going on to the internet it is
a bit like the wild west. There are very few protections. Some of the content really is upsetting and distressing to children. This is not about adults being blocked from seeing adult content. That is absolutely fine; we have no problem with that
at all. But it is about protecting children from seeing content that is inappropriate for them. We would certainly support that widening, but obviously doing it in a staged way so that the regulator does not take on too much at once. We would
certainly support that.
David Austin: I would echo what Alan says. We see this Bill as a significant step forward in terms of child protection. We absolutely agree with the principle of protecting children from a wider range of content--indeed, that is
what we do in other areas: for example, with the mobile network operators and their adult filters. Like Alan, I think we see it in terms of more of a staged approach. The BBFC taking on this role is a significant new area of work--quite a
challenge to take on board. I think there is a potential risk of overloading the Bill if we try to put too much on it, so I would very much support the NSPCC's phased approach.
Thangam Debbonaire Q Is there anything further that you think needs to be added to the Bill to make the sanctions regime work? I am also thinking--at the risk of going against what you just said, Mr Austin--about whether or
not we should be considering sites that are not designed for commercial purposes but where pornography or other harmful material is available on a non-commercial basis; or things not designed for porn at all, such as Twitter timelines or Tumblr
and other social media, where the main purpose may not be pornography or other harmful material, but it is available. Do you think the Bill has enough sanctions in it to cope with all of that, or should that be added? Is there anything else you
would like to add?
David Austin: There were a few questions. I will try to answer them all, but if I miss any of them please come back to me. In terms of sanctions, I have talked about ISP blocking and de-indexing. We think those could be potentially
effective steps. In terms of commercial pornography, we have been working on devising a test of what that is. The Bill states explicitly that the pornography could be free and still provided on a commercial basis. I do not think it is narrowing
the scope of the regulation an awful lot by specifying commercial pornography. If there are adverts, if the owner is a corporate entity, if there are other aspects--if the site is exploiting data, for example: there are all sorts of indications
that a site is operating on a commercial basis. So I do not see that as a real problem.
In relation to Twitter, which you mentioned, what the Bill says the regulator should do is define what it sees as ancillary service providers. Those are organisations whose work facilitates and enables the pornography to be distributed. There is
certainly a case to argue that social media such as Twitter are ancillary service providers. There are Twitter account holders who provide pornography on Twitter so I think you could definitely argue that.
I would argue that Twitter is an ancillary service provider, as are search engines and ISPs. One of the things that we plan to do in the next weeks and months would be to engage with everyone that we think is an ancillary service provider, and
see what we can achieve together, to try and achieve the maximum protection we can through the notification regime that we are taking on as part 3 of the Bill.
The Chair: Just before we move on, shall we see if Mr Wardle also wants to contribute to things that should be in the Bill?
Alan Wardle: On that point, I think it is important for us that there is clarification--and I would agree with David about this--in terms of ensuring that sites that may for instance be commercial but that are not profiting from
pornography are covered. Again, Twitter is an example. We know that there are porn stars with Twitter accounts who have lots of people following them and lots of content, so it is important that that is covered.
It is important that the legislation is future-proofed. We are seeing at the NSPCC through Childline that sexual content or pornography are increasingly live-streamed through social media sites, and there is self-generated content, too. It is
important that that is covered, as well as the traditional--what you might call commercial--porn. We know from our research at the NSPCC that children often stumble across pornography, or it is sent to them. We think that streamed feeds for
over-18s and under-18s should be possible so that sort of content is not available to children. It can still be there for adults, but not for children.
Nigel Adams Q Can you give us your perspective on the scale of the problem of under-18s' access to this sort of inappropriate content? I guess it is difficult to do a study into it but, through the schools network and education departments, do you have any idea of the scale of the issue?
Alan Wardle: We did research earlier this year with the University of Middlesex into this issue. We asked young people--under 18s--whether they had seen pornography and when. Between the ages of 11 and 18, about half of them had
seen pornography. Obviously, when you get to older children--16 and 17-year-old-boys in particular--it was much higher. Some 90% of those 11 to 18-year-olds had seen it by the age of 14. It was striking--I had not expected this--that, of the
children who had seen it, about half had searched for it but the other half had stumbled across it through pop-ups or by being sent stuff on social media that they did not want to see.
It is a prevalent problem. If a determined 17-year-old boy wants to see pornography, undoubtedly he will find a way of doing it, but of particular concern to us is when you have got eight, nine or 10-year-old children stumbling across this stuff
and being sent things that they find distressing. Through Childline, we are getting an increasing number of calls from children who have seen pornographic content that has upset them.
Nigel Adams Q Has there been any follow-on, in terms of assaults perpetrated by youngsters as a result of being exposed to this?
Alan Wardle: It is interesting to note that there has been an exponential rise in the number of reports of sexual assaults against children in the past three or four years. I think it has gone up by about 84% in the past three years.
Nigel Adams Q By children?
Alan Wardle: Against children. Part of that, we think, is what you might call the Savile effect--since the Savile scandal there has been a much greater awareness of child abuse and children are more likely to come forward, which we
think is a good thing. But Chief Constable Simon Bailey, who is the national lead on child protection, believes that a significant proportion of that is due to the internet. Predators are able to cast their net very widely through social
networking sites and gaming sites, fishing for vulnerable children to groom and abuse.
We believe that, in developing the code of practice that I talked about earlier, that sort of thing needs to be built in to ensure that children are protected from that sort of behaviour in such spaces. The internet is a great thing but, as with
everything, it can be used for darker purposes. We think there is increasing evidence--Simon Bailey has said this, and more research needs to be done into the scale of it--that children, as well as seeing adult content, are increasingly being
groomed for sex online.
Nigel Adams Q Mr Austin, what constructive conversations and meetings have you had with ISPs thus far, in terms of the potential for blocking those sites--especially the sites generated abroad?
David Austin: We have not had any conversations yet, because we signed the exchange of letters with the Government only last Thursday and it was made public only today that we are taking on this role. We have relationships with
ISPs--particularly the mobile network operators, with which we have been working for a number of years to bring forward child protection on mobile devices.
Our plan is to engage with ISPs, search engines, social media--the range of people we think are ancillary service providers under the Bill--over the next few weeks and months to see what we can achieve together. We will also be talking to the
adult industry. As we have been regulating pornography in the offline space and, to an extent, in the online space for a number of years, we have good contacts with the adult industry so we will engage with them.
Many companies in the adult industry are prepared to work with us. Playboy, for instance, works with us on a purely voluntary basis online. There is no law obliging it to work with us, but it wants to ensure that all the pornography it provides is fully legal and compliant with British Board of Film Classification standards, and is provided to adults only. We are already working in this space with a number of players.
Nigel Huddleston Q Obviously, the BBFC is very experienced at classifying films according to certain classifications and categories. I am sure it is no easy task, but it is possible to use an objective set of criteria to
define what is pornographic or disturbing, or is it subjective? How do you get that balance?
David Austin: The test of whether something is pornographic is a test that we apply every single day, and have done since the 1980s when we first started regulating that content under the Video Recordings Act 1984. The test is
whether the primary purpose of the work is to arouse sexually. If it is, it is pornography. We are familiar with that test and use it all the time.
Nigel Huddleston Q In terms of skills and resources, are you confident you will be able to get the right people in to do the job properly? I am sure that it is quite a disturbing job in some cases.
David Austin: Yes. We already have people who have been viewing pornographic content for a number of years. We may well need to recruit one or two extra people, but we certainly have the expertise and we are pretty confident that we
already have the resources. We have time between now and the measures in the Bill coming into force to ensure that we have a fully effective system up and running.
The Minister for Digital and Culture (Matt Hancock) Q I just want to put on the record that we are delighted that the BBFC has signed the heads of agreement to regulate this area. I cannot think of a better organisation with
the expertise and the experience to make it work. What proportion of viewed material do you think will be readily covered by the proposed mechanism in the Bill that you will be regulating the decision over but not the enforcement of?
David Austin: I am not sure that I understand the question.
Matt Hancock Q I am thinking about the scale of the problem--the number of views by under-18s of material that you deem to be pornographic. What proportion of the problem do you think the Bill, with your work, will fix?
David Austin: So we are talking about the amount of pornography that is online?
Matt Hancock Q And what is accessed.
David Austin: Okay. As you all know, there is masses of pornography online. There are 1.5 million new pornographic URLs coming on stream every year. However, the way in which people access pornography in this country is quite
limited. Some 70% of users go to the 50 most popular websites. With children, that percentage is even greater; the data evidence suggests that they focus on a relatively small number of sites.
We would devise a proportionality test and work out what the targets are in order to achieve the greatest possible level of child protection. We would focus on the most popular websites and apps accessed by children--those data do exist. We would
have the greatest possible impact by going after those big ones to start with and then moving down the list.
Matt Hancock Q So you would be confident of being able to deal with the vast majority of the problem.
David Austin: Yes. We would be confident in dealing with the sites and apps that most people access. Have I answered the question?
Matt Hancock Q Yes. Given that there is a big problem that is hard to tackle and complicated, I was just trying to get a feel for how much of the problem you think, with your expertise and the Bill, we can fix.
David Austin: We can fix a great deal of the problem. We cannot fix everything. The Bill is not a panacea but it can achieve a great deal, and we believe we can achieve a great deal working as the regulator for stages 1 to 3.
Louise Haigh Q My question follows on neatly from that. While I am sure that the regulation will tackle those top 50 sites, it obviously comes nowhere near tackling the problems that Mr Wardle outlined, and the crimes, such
as grooming, that can flow from those problems. There was a lot of discussion on Second Reading about peer-to-peer and social media sites that you have called "ancillary". No regulation in the world is going to stop that. Surely, the
most important way to tackle that is compulsory sex education at school.
Alan Wardle: Yes. In terms of online safety, a whole range of things are needed and a whole lot of players. This will help the problem. We would agree and want to work with BBFC about a proportionality test and identifying where the
biggest risks are to children, and for that to be developing. That is not the only solution.
Yes, we believe that statutory personal, social and health education and sexual relationships education is an important part of that. Giving parents the skills and understanding of how to keep their children safe is also really important. But
there is a role for industry. Any time I have a conversation with an MP or parliamentarian about this and they have a child in their lives--whether their own, or nieces or nephews--we quickly come to the point that it is a bit of a nightmare. They say, "We try our best to keep our children safe but there is so much, we don't know who they are speaking to" and all the rest of it.
How do we ensure that when children are online they are as safe as they are when offline? Of course, things happen in the real world as well and no solution is going to be perfect. Just as, in terms of content, we would not let a seven-year-old
walk into the multiplex and say, "Here is 'Finding Nemo' over here and here is hard core porn--off you go."
We need to build those protections in online so we know what children are seeing and whom they are speaking to, and also skill up children themselves through school and help parents. But we believe the industry has an important part to play, with Government, in regulating and ensuring that spaces where children are online are as safe as they can be.
Christian Matheson (City of Chester) (Lab) Q To follow on from the Minister's question, you feel you are able to tackle roughly the top 50 most visited sites. Is there a danger that you then replace those with the next top 50
that are perhaps less regulated and less co-operative? How might we deal with that particular problem, if it exists?
David Austin: When I said "the top 50", I was talking in terms of the statistics showing that 70% of people go to the top 50. We would start with the top 50 and work our way through those, but we would not stop there. We
would look to get new data every quarter, for example. As you say, sites will come in and out of popularity. We will keep up to date and focus on those most popular sites for children.
We would also create something that we have, again, done with the mobile operators. We would create an ability for members of the public--a parent, for example--to contact us about a particular website if that is concerning them. If an
organisation such as the NSPCC is getting information about a particular website or app that is causing problems in terms of under-age access, we would take a look at that as well. In creating this proportionality test, what we must not do is be so explicit as to say that we will look only at the top 50.
First, that is not what we would do. Secondly, we do not want anyone to think, "Okay, we don't need to worry about the regulator because we are not on their radar screen." It is very important to keep up to date with what the most popular sites are and, therefore, to be most effective in dealing with under-age regulation, dealing with complaints from members of the public and organisations such as the NSPCC.
Alan Wardle: I think that is why the enforcement part is so important as well, so that people know that if they do not put these mechanisms in place there will be fines and enforcement notices, the flow of money will be stopped and,
crucially, there is that backstop power to block if they do not operate as we think they should in this country. The enforcement mechanisms are really important to ensure that the BBFC can do their job properly and people are not just slipping
from one place to the next.
Claire Perry Q Of those top 50 sites, do we know how many are UK-based?
David Austin: I would guess, none of them. I do not know for sure, but that would be my understanding.
Claire Perry Q Secondly, I want to turn briefly to the issue of the UK's video on demand content. My reading around clause 15 suggests that, although foreign-made videos on demand will be captured by the new provisions,
UK-based services will continue to be caught by the Communications Act 2003 provisions. Do you think that is adequate?
David Austin: That is my understanding as well. We work very closely with Ofcom. Ofcom regulates UK on-demand programme services as the Authority for Television On Demand, but it applies our standards in doing so. That is a partnership that works pretty effectively, and Ofcom has done an effective job in dealing with that type of content. That is one bit that is carved out from the Bill and already dealt with by Ofcom.
Open Rights Group has submitted written evidence to the House of Commons Public Bill Committee on the Digital Economy Bill. The following is the group's views on some of the worst aspects of the Age Verification requirements for 18-rated adult content.
Open Rights Group (ORG) is the United Kingdom's only campaigning organisation dedicated to working to protect the rights to privacy and free speech online. With 3,200 active supporters, we are a grassroots organisation with local groups across
the UK. We believe people have the right to control their technology, and oppose the use of technology to control people.
23. We believe the aim of restricting children's access to inappropriate material is a reasonable one; however, placing age verification requirements on adults to access legal material throws up a number of concerns which are not easily resolved.
24. Our concerns include: whether these proposals will work; the impact on privacy and freedom of expression; and how pornography is defined.
Lack of privacy safeguards
25. New age verification systems will enable the collection of data about people accessing pornographic websites, potentially across different providers or websites. Accessing legal pornographic material creates sensitive information that may be linked to a real-life identity. The current wording of the draft Bill means that this data could be vulnerable to "Ashley Madison"-style leaks.
26. MindGeek (the largest global adult entertainment operator) estimates there are 20 to 25 million adults in the UK who access adult content regularly. That is over 20 million people who will have to reveal attributes of their identity to a pornography website or a third-party company.
27. Current proposals for age-verification systems suggest using people's emails, social media accounts, bank details, credit and electoral information, biometrics and mobile phone details. The use of any of this information exposes pornography website users to threats of data mining, identity theft and unsolicited marketing.
28. The currently proposed age-verification systems have minimal regard for the security of the data they will collect.
29. The Bill does not contain provisions to secure the privacy and anonymity of users of pornographic sites. These must be included in the Bill, not merely in guidance issued by the age-verification regulator. They should ensure that the age-verification system, by default, must not be able to identify a user to the pornographic site by leaving persistent data trails. The user information that pornography websites are allowed to store without additional consent should be strictly limited.
Will age verification work?
30. The objective of these proposals is child safety rather than age verification. Policy makers should not measure success by the number of adults using age verification. It is highly likely that children will be able to continue accessing pornographic material, meaning that the policy will struggle to meet its true goal.
31. The Bill does not outline an effective system to administer age verification. It sets the regulator the difficult task of policing foreign pornography publishers, which will be hard to enforce. Even if access to pornographic material hosted abroad is blocked in the UK, bypassing website blocks is very easy - for example, through the use of VPNs. Using a VPN is not technically difficult and could easily be done by teenagers to circumvent age verification.
32. Young people will still be able to access pornographic materials through some mainstream social media websites that are not subject to age verification, and from peer-to-peer networks.
33. As with ISP and mobile phone filters, age verification may prevent young children from accidentally finding pornographic material but it is unlikely to restrict a tech-savvy teenager.
Discrimination against sexual minorities and small business
34. The age verification systems will impose disproportionate costs on small publishers. No effective and efficient age verification system has been presented, and it is very likely that the costs imposed on smaller publishers will cause them to go out of business.
35. Smaller publishers of adult materials often cater for sexual minorities or people with special needs. The costs associated with implementing age verification systems threaten the existence of these sites and thus the ability of particular groups to express their sexuality by using the services of smaller pornographic publishers.
36. It is unclear whether adults will trust age verification systems, especially if they appear to identify them to the sites. It is possible that there will be a dissuasive effect on adults wishing to receive legal material. This would be a negative impact on free expression, and would be likely to disproportionately impact people from sexual minorities.
Definition of pornographic material
37. The definitions of pornographic material included in the Bill are much broader than what is socially accepted as harmful pornography. The Bill covers not only R18 materials typically described as "hardcore pornography", which offline can only be acquired in licensed sex shops, but also 18-rated materials of a sexual nature. The boundaries of the 18 classification are dynamic and reflect social consensus on what is acceptable, with some restrictions. Today this would include popular films such as Fifty Shades of Grey. This extension of the definition of pornography to cover all "erotic" 18-rated films also raises the question of why violent - but not sexual - materials rated 18 should then remain accessible.
38. Hiding some of these materials or making them more difficult to access puts unjustifiable restrictions on people's freedom of expression. Placing 18-rated materials behind the age-verification wall, in the same category as hardcore pornography, will discourage people from exploring topics related to their sexuality.
Suggestions for improvement
39. The online age verification proposed in the Bill is unworkable and will not deliver what the Government set out to do. We urge the Government to find more effective solutions to deliver their objectives on age verification. Online age verification should be dropped from the Bill in its current version.
40. An updated version of age verification should incorporate:
41. 1) Privacy safeguards
The regulator should have specific duties to ensure the systems are low risk. For instance, age verification should not be in place unless privacy safeguards are strong. Any age verification system should not create wider security risks, for instance to credit card systems, or through habituating UK Internet users into poor security practices.
Users of adult websites should have clarity on the liability for data breaches and what personal data is at risk.
42. 2) Safeguards for sexual minorities
Requirements should be proportionate to the resources available and the likelihood of access by minors. Small websites that cater for sexual minorities may fall under the commercial threshold.
43. 3) Remove 18-rated materials from the definition of pornographic materials
Placing all materials of a sexual nature under the definition of pornography is not helpful and will greatly increase the impact of these measures on the human right to impart and receive information, including that of older children and young adults.
Open Rights Group make equally valid arguments against the criminalisation of file sharing and against the introduction of many features of an ID card to tie together vast amounts of personal data held in a variety of government databases.
Social media users who encourage flame wars or retweet the doxing (revealing identifying information with malicious intent) of others are set to be punished more severely by British prosecutors.
The latest Crown Prosecution Service (CPS) guidelines on prosecuting cases involving communications sent via social media target doxing, online mobs, fake social media profiles and other social media misbehaviour.
Also included in the latest version of the guidance is a specific encouragement to prosecutors to charge those who egg on others to break social media speech laws. Those who encourage others to commit a communications offence may be charged
with encouraging an offence under the Serious Crime Act 2007, warns the guidance.
In a Kafka-esque twist, the guidance also includes this chilling line, discussing how prosecutors can prove the criminal offence of sending a grossly offensive message, under section 127 of the Communications Act 2003:
The offence is committed by sending the message. There is no requirement that any person sees the message or be offended by it.
Another nasty touch is that the CPS will allow victims to decide whether crimes are deemed to be 'hate crimes' and therefore attract more severe penalties. The CPS policy consultation defines race/religion hate crimes as follows:
Crimes involving hostility on the basis of race or religion
The reporting and prosecution of hate crime are shaped by two definitions; one is subjective and is based on the perception of the victim and the other is objective and relies on supporting evidence.
Both the subjective and objective definitions refer to hostility, not hatred. There is no statutory definition of hostility and the everyday or dictionary definition is applied, encompassing a broad spectrum of behaviour.
We have an agreed definition with the police for identifying and flagging cases involving hostility on the basis of race or religion. The joint definition is:
Any criminal offence which is perceived by the victim or any other person, to be motivated by a hostility or prejudice based on a person's race or religion or perceived race or religion.
The equivalent paragraph on disability hate crime adds an explanation of how the CPS has waved its hands and extended the scope:
This definition is wider than the statutory definition, to ensure we capture all relevant cases:
The guidance also encourages prosecutors to treat social media crimes committed against persons serving the public more seriously than nasty words directed against their fellow members of the public. Similarly, coordinated attacks by
different people should also attract greater prosecutorial attention.
Prosecution in all cases is said to be less likely if swift and effective action has been taken by the suspect and/or others, for example service providers, to remove the communication.