Age verification and UK internet porn censorship
31st March 2019
29th March 2019. See article from news.sky.com , article from metro.co.uk and article from theregister.co.uk
The Government has been very secretive about its progress towards the start of internet censorship for porn in the UK. Meanwhile the appointed internet porn censor, the BBFC, has withdrawn into its shell to hide from the flak. It has uttered hardly a helpful word on the subject in the last six months, just at a time when newspapers have been printing uninformed news items based on old guesstimates of when the scheme will start. The last target date was specified months ago, when DCMS minister Margot James suggested that it was intended to get the scheme going around Easter 2019. That date was not achieved, but the newspapers seem to have jumped to the conclusion that the scheme would start on 1st April 2019. The only official response to this false news is that the DCMS will now be announcing the start date shortly.

So what has been going on? Well, it seems that maybe the government realised that asking porn websites and age verification services to demand that porn users identify themselves, without any real legal protection on how that data can be used, is perhaps not the wisest thing to do. Jim Killock of Open Rights Group explains that the delays are due to serious concerns about privacy and data collection:
When they consulted about the shape of age verification last summer they were surprised to find that nearly everyone who wrote back to them in that consultation said this was a privacy disaster and they need to make sure
people's data doesn't get leaked out. Because if it does it could be that people are outed, have their relationships break down, their careers could be damaged, even for looking at legal material. The
delays have been very much to do with the fact that privacy has been considered at the last minute and they're having to try to find some way to make these services a bit safer. It's introduced a policy to certify some of the products as better for
privacy (than others) but it's not compulsory and anybody who chooses one of those products might find they (the companies behind the sites) opt out of the privacy scheme at some point in the future. And there are huge commercial
pressures to do this because as we know with Facebook and Google user data is extremely valuable, it tells you lots about what somebody likes or dislikes or might want or not want. So those commercial pressures will kick in and
they'll try to start to monetise that data and all of that data if it leaked out would be very damaging to people so it should simply never be collected.
So the government has been working on a voluntary kite mark scheme to approve age verifiers that can demonstrate to an auditor that they will keep user data safe. This scheme seems to be in its early stages, as the audit policy was first outlined to age verifiers on 13th March 2019. AVSecure reported on Twitter:

Friday saw several AV companies meet with the BBFC & the accreditation firm, who presented the framework & details of the proposed scheme. Whilst the scheme itself seems very deep & comprehensive, there were several questions asked that we are all awaiting answers on.

The Register reports that AgeID has already commissioned a data security audit from the information security company NCC Group. Perhaps that company can therefore be rapidly approved by the official auditor, whose identity seems to be being kept secret. So the implementation schedule must presumably be that the age verifiers get audited over the next couple of months, and then the government can give websites the official 3 months' notice required to implement the now accredited age verification schemes. The commencement date will perhaps be about 5 or 6 months from now.

Update: Announcement this week

31st March 2019. See article from thetimes.co.uk . The government is expected to announce a timetable on Wednesday for the long-awaited measure to force commercial providers of online porn to check users' ages.
The BBFC has made a pretty poor show of setting out guidelines for the technical implementation of age verification, and now the Stop Age Verification campaign has pointed out that the BBFC has made legal errors about text porn
25th March 2019
See article from stopageverification.org.uk
The BBFC seems a little behind the curve in its role as porn censor. The initial draft of its guidelines showed absolutely no concern for the safety and well-being of porn users. The BBFC spoke of incredibly sensitive identity and browsing data being entrusted to adult websites and age verifiers, purely on the forlorn hope that these companies would follow 'best practice' voluntary guidelines to keep the data safe. The BBFC offered next to no guidelines defining how age verification should work and what it really needs to do. As time has moved on, it has obviously occurred to the BBFC or the government that this was simply not good enough, so we are now waiting on the implementation of some sort of kite marking scheme to try to provide at least a modicum of trust in age verifiers to keep this sensitive data safe.

But even in this period of rework, the BBFC hasn't been keeping interested parties informed of what's going on. The BBFC seem very reluctant to advise or inform anyone of anything. Perhaps the rework is being driven by the government and maybe the BBFC isn't in a position to be any more helpful.

Anyway, it is interesting to note that in an article from stopageverification.org.uk , the BBFC has been reported to be overstepping the remit of the age verification laws contained in the Digital Economy Act. The BBFC posts this on its age verification website:
All types of pornographic content are within the scope of the legislation. The legislation does not exclude audio or text from its definition of pornography. All providers of commercial online pornography to persons in the UK are required to comply with the age-verification requirement.

Except that's not what the legislation says:

Pornographic material is defined in s.15 of the act. This sets out nine categories of material. Material is defined in that section (s.15(2)) as: material means (a) a series of visual images shown as a moving picture, with or without sound; (b) a still image or series of still images, with or without sound; or (c) sound.

It clearly doesn't mention text. The BBFC need to be clear in their role as age verification regulator. They can only apply the law as enacted by Parliament. If they seek to go beyond that, they could be at risk of court action.
The Guardian suggests that the start of internet porn censorship will be timed to help heal the government's reputational wounds after the Brexit debacle
25th March 2019
24th March 2019. See article from theguardian.com by Jamie Doward
The Observer today published an article generally supporting the upcoming porn censorship and age verification regime. It did have one interesting point to note though:

Brexit's impact on the pornography industry has gone unnoticed. But the chaos caused by the UK's disorderly exit from the European Union even stretches into the grubbier parts of cyberspace. A new law forcing pornography users to prove that they are adults was supposed to be introduced early next month. But sources told the Observer that it may not be unveiled until after the Brexit impasse is resolved, as the government, desperate for other things to talk about, believes it will be a good news story that will play well with the public when it is eventually unveiled.
Comment: The illiberal Observer

25th March 2019. Thanks to Alan

Bloody hell! Have you seen this fuckwittage from the purportedly liberal Observer? Posh-boy churnalist Jamie (definitely not Jim) Doward regurgitates the bile of authoritarian feminist Gail Dines about the crackpot attempt to stop children accessing a bit of porn.

This is total bollox. It's getting on for sixty years since I spotted that my girl contemporaries were taking on a different and interesting shape - a phenomenon I researched by reference to two bodies of literature: those helpful little books for the amateur and professional photographer, in which each photo of a lady was accompanied by F number and exposure time, and those periodicals devoted to naturism. This involved no greater subterfuge than taking off my school cap and turning up my raincoat collar to hide my school tie. I would fervently hope that today's lads can run rings round parental controls and similar nonsense.
21st March 2019
Backlash speculates that the UK's upcoming porn censorship will play into the hands of foreign tube site monopolies. See article from backlash.org.uk
20th March 2019
We should be stripping away curbs on speech -- not adding more. By Andrew Tettenborn See article from spiked-online.com
Parliamentary group calls for Ofcom to become the UK internet censor
18th March 2019
See article from rsph.org.uk and report [pdf] from rsph.org.uk
An informal group of MPs, the All Party Parliamentary Group on Social Media and Young People's Mental Health and Wellbeing, has published a report calling for the establishment of an internet censor. The report claims:
- 80% of the UK public believe tighter regulation is needed to address the impact of social media on the health and wellbeing of young people.
- 63% of young people reported social media to be a good source of health information.
- However, children who spend more than three hours a day using social media are twice as likely to display symptoms of mental ill health.
- Pressure to conform to beauty standards perpetuated and praised online can encourage harmful behaviours to achieve "results", including body shame and disordered eating, with 46% of girls, compared to 38% of all young people, reporting that social media has a negative impact on their self-esteem.
The report, titled #NewFilters to manage the impact of social media on young people's mental health and wellbeing , puts forward a number of policy recommendations, including:
- Establish a duty of care on all social media companies with registered UK users aged 24 and under in the form of a statutory code of conduct, with Ofcom to act as regulator.
- Create a Social Media Health Alliance, funded by a 0.5% levy on the
profits of social media companies, to fund research, educational initiatives and establish clearer guidance for the public.
- Review whether the "addictive" nature of social media is sufficient for official disease classification.
- Urgently commission robust, longitudinal research, into understanding the extent to which the impact of social media on young people's mental health and wellbeing is one of cause or correlation.
Chris Elmore MP, Chair of the APPG on Social Media and Young People's Mental Health and Wellbeing, said: "I truly think our report is the wake-up call needed to ensure - finally - that meaningful action is taken
to lessen the negative impact social media is having on young people's mental health. For far too long social media companies have been allowed to operate in an online Wild West. And it is in this lawless landscape that our
children currently work and play online. This cannot continue. As the report makes clear, now is the time for the government to take action. The recommendations from our Inquiry are both sensible and reasonable; they would make a
huge difference to the current mental health crisis among our young people. I hope to work constructively with the UK Government in the coming weeks and months to ensure we see real changes to tackle the issues highlighted in the
report at the earliest opportunity."
18th March 2019
The Daily Mail highlights the dangers of identity checks for porn viewers and notes that the start date will be announced in April, but it could well be several months before the scheme is fully implemented. See article from dailymail.co.uk
16th March 2019
Get a VPN. The Guardian outlines some of the dangers of getting age verified for porn See article from theguardian.com
14th March 2019
This is how age verification will work under the UK's porn censorship law. See article from wired.co.uk
13th March 2019
UK porn censorship risks creating sex tape black market on Twitter, WhatsApp and even USB sticks. See article from thescottishsun.co.uk
Lords committee supports the creation of a UK internet censor
10th March 2019
See press release from parliament.uk and report [pdf] from publications.parliament.uk
The House of Lords Communications Committee has called for a new, overarching censorship framework so that services in the digital world are held accountable to an enforceable set of government rules. The Lords Communications Committee writes:

Background

In its report 'Regulating in a digital world' the committee notes that over a dozen UK regulators have a remit covering the digital world, but there is no body which has complete oversight. As a result, regulation of the digital environment is fragmented, with gaps and overlaps. Big tech companies have failed to adequately tackle online harms. Responses to growing public concern have been piecemeal and inadequate. The Committee recommends a new Digital Authority, guided by 10 principles to inform regulation of the digital world.

Chairman's Comments

The chairman of the committee, Lord Gilbert of Panteg, said:
"The Government should not just be responding to news headlines but looking ahead so that the services that constitute the digital world can be held accountable to an agreed set of principles.
Self-regulation by online platforms is clearly failing. The current regulatory framework is out of date. The evidence we heard made a compelling and urgent case for a new approach to regulation. Without intervention, the largest tech
companies are likely to gain ever more control of technologies which extract personal data and make decisions affecting people's lives. Our proposals will ensure that rights are protected online as they are offline while keeping the internet open to
innovation and creativity, with a new culture of ethical behaviour embedded in the design of services."
Recommendations for a new regulatory approach

Digital Authority

A new 'Digital Authority' should be established to co-ordinate regulators, continually assess regulation and make recommendations on which additional powers are necessary to fill gaps. The Digital Authority should play a key role in providing the public, the Government and Parliament with the latest information. It should report to a new joint committee of both Houses of Parliament, whose remit would be to consider all matters related to the digital world.

10 principles for regulation

The 10 principles identified in the committee's report should guide all regulation of the internet. They include accountability, transparency, respect for privacy and freedom of expression. The principles will help the industry, regulators, the Government and users work towards a common goal of making the internet a better, more respectful environment which is beneficial to all. If rights are infringed, those responsible should be held accountable in a fair and transparent way.
Recommendations for specific action

Online harms and a duty of care

A duty of care should be imposed on online services which host and curate content which can openly be uploaded and accessed by the public. Given the urgent need to address online harms, Ofcom's remit should expand to include responsibility for enforcing the duty of care. Online platforms should make community standards clearer through a new classification framework akin to that of the British Board of Film Classification. Major platforms should invest in more effective moderation systems to uphold their community standards.
Ethical technology
Users should have greater control over the collection of personal data. Maximum privacy and safety settings should be the default. Data controllers and data processors should be required to publish an
annual data transparency statement detailing which forms of behavioural data they generate or purchase from third parties, how they are stored, for how long, and how they are used and transferred. The Government should
empower the Information Commissioner's Office to conduct impact-based audits where risks associated with using algorithms are greatest. Businesses should be required to explain how they use personal data and what their algorithms do.
Market concentration
The modern internet is characterised by the concentration of market power in a small number of companies which operate online platforms. Greater use of data portability might help, but this will require more interoperability.
The Government should consider creating a public-interest test for data-driven mergers and acquisitions. Regulation should recognise the inherent power of intermediaries.
10th March 2019
At least somebody will do well out of porn censorship. See article from vpncompare.co.uk
8th March 2019
The Daily Mail reports on vague details about a proposal from the Information Commissioner to require age verification for any website that hoovers up personal details. See article from dailymail.co.uk
Open Rights Group met to discuss the BBFC's age verification scheme with its voluntary privacy protection
28th February 2019
See CC article from openrightsgroup.org
We met to discuss BBFC's voluntary age verification privacy scheme, but BBFC did not attend.

Open Rights Group met a number of age verification providers to discuss the privacy standards that they will be meeting when the scheme launches, slated for April. Up to 20 million UK adults are expected to sign up to these products. We invited all the AV providers we know about, and most importantly, the BBFC, at the start of February.

BBFC are about to launch a voluntary privacy standard which some of the providers will sign up to. Unfortunately, BBFC have not committed to any public consultation about the scheme, relying instead on a commercial provider to draft the contents with providers, but without wider feedback from privacy experts and people who are concerned about users.

We held the meeting close to the BBFC's offices in order that it would be convenient for them to send someone who might be able to discuss this with us. We have been asking for meetings with BBFC about the privacy issues in the new code since October 2018, but have not received any reply or acknowledgement of our requests until this morning, when BBFC said they would be unable to attend today's roundtable. This is very disappointing.

BBFC's failure to consult the public about this standard, or even to meet us to discuss our concerns, is alarming. We can understand that BBFC is cautious and does not wish to overstep its relationship with its new masters at DCMS. BBFC may be worried about ORG's attitude towards the scheme: and we certainly are critical. However, it is not responsible for a regulator to fail to talk to its potential critics.

We are very clear about our objectives. We are acting to do our best to ensure the risks to adult users of age verification technologies are minimised. We do not pose a threat to the scheme as a whole: listening to us can only make the pornographic age verification scheme more likely to succeed, and, for instance, help it avoid catastrophic failures.

Privacy concerns appear to have been recognised by BBFC and DCMS as a result of consultation responses from ORG supporters and others, which resulted in the voluntary privacy standard. These concerns have also been highlighted by Parliament, whose regulatory committee expressed surprise that the Digital Economy Act 2017 had contained no provision to deal with the privacy implications of pornographic age verification.

Today's meeting was held to discuss:
- What the scheme is likely to cover, and what it ideally should cover;
- Whether there is any prospect of making the scheme compulsory;
- What should be done about non-compliant services;
- What the governance of the scheme should be in the long term, for instance whether it might be suitable to become an ICO-backed code, or to complement such a code.
As we communicated to BBFC in December 2018, we have considerable worries about the lack of consultation over the standard they are writing, which appears to be truncated in order to meet the artificial deadline of April this year. This is what we explained to BBFC in our email:

- Security requires as many perspectives to be considered as possible.
- The best security standards, e.g. PCI DSS, are developed in the open and iterated.
- The standards will be best if those with most to lose are involved in the design. For PCI DSS, the banks and their customers have more to lose than the processors. For age verification, site users have more to lose than the processors; however, only the processors seem likely to be involved in setting the standard.
We look forward to BBFC agreeing to meet us to discuss the outcome of the roundtable we held about their scheme, and to discuss our concerns about the new voluntary privacy standard. Meanwhile, we will produce a note from the meeting,
which we believe was useful. It covered the concerns above, and issues around timing, as well as strategies for getting government to adjust their view of the absence of compulsory standards, which many of the providers want. In this, BBFC are a critical
actor. ORG also intends as a result of the meeting to start to produce a note explaining what an effective privacy scheme would cover, in terms of scope, risks to mitigate, governance and enforcement for participants.
Chelsea Russell's ridiculous conviction for quoting rap lyrics quashed on appeal
27th February 2019
See article from spiked-online.com
In 2017, Chelsea Russell, a Liverpool teenager with Asperger's syndrome, paid tribute on her Instagram profile to a 13-year-old friend who died when he was hit by a car. She quoted the lyrics of a rap song, I'm Trippin' by Snap Dogg, alongside the phrase 'RIP Frankie Murphy'. Many other teenagers used the lyrics to pay tribute to Murphy.

A year later, Russell's profile came to the attention of the police, who decided to arrest her and have her charged. The lyrics she quoted, Kill a snitch nigga, rob a rich nigga , were found in court to be grossly offensive and Russell was convicted of a hate crime. For nothing more than quoting rap lyrics, she was placed on an eight-week, 8am-to-8pm curfew, fitted with an ankle tag, and fined £585.

Last week, the conviction was overturned on appeal. Russell's defence lawyer slammed the initial verdict as ridiculous, akin to the actions of a totalitarian state.

Offsite Comment: Chelsea Russell and the depravity of PC

27th February 2019. See article from spiked-online.com by Fraser Myers
CPS relaxes its pornography guidelines so that fisting, golden showers, female ejaculation and many more can now be legally published in the UK
31st January 2019
See CPS Statement on Consultation Response on the Proposed CPS Guidance relating to the [pdf] from cps.gov.uk
See The revised Legal Guidance from cps.gov.uk
See article from bbc.co.uk
See article from uk.news.yahoo.com
The upcoming UK internet porn censorship regime being introduced later this year has set the UK authorities thinking about a more rational set of laws governing what porn is legal and what porn is illegal in the UK. It makes a lot of sense to get the UK's stall straight before the commencement of the new censorship regime.

The most contradictory area of porn law is that often referred to as 'beyond R18' porn. This includes material historically banned by the Crown Prosecution Service (CPS) on grounds of obscenity, i.e. fisting, golden showers, BDSM, female ejaculation, and, famously from a recent anti-censorship campaign, face sitting/breath play. Such material is currently cut from R18s, as censored and approved by the BBFC.

When the age verification law first came before parliament, 'beyond R18' porn was set to be banned outright. However, as some of these categories are commonplace in worldwide porn, the BBFC would have had to block practically all the porn websites in the world, leaving hardly any that stuck to R18 guidelines that would be acceptable for viewing after age verification. So the lawmakers dropped the prohibition, and this 'beyond R18' material will now be acceptable for viewing after age verification.

This leaves the rather clear contradiction that the likes of fisting and female ejaculation would be banned or cut by the BBFC for sale in UK sex shops, but would have to be allowed by the BBFC for viewing online. This contradiction has now been squared by the government deciding that 'beyond R18' pornography is now legal for sale in the UK. So the BBFC will now have a unified set of rules, specified by the CPS, covering both the censorship of porn sales in the UK and the blocking of foreign websites.

This legalisation of 'beyond R18' porn will surely disappoint a few censorial politicians in the House of Lords, notably Elspeth Howe. She has already tabled a private member's bill to restore the ban on any foreign websites including 'beyond R18' porn. Her bill has now been rendered mostly irrelevant.

However, there is still one genre of pornography that is sticking out of line, and that is cartoon porn featuring under-age characters. Such porn is widespread in anime but strictly banned under UK law. So given the large amounts of Japanese hentai porn on the most popular tube sites in the world, those videos could still be an issue for the viability of the age classification regime and could still end up with all the major porn sites in the world banned.

The new CPS censorship rules

The new rules have already come into force, starting on 28th January 2019. A CPS spokesperson confirmed the change, saying:
It is not for the CPS to decide what is considered good taste or objectionable. We do not propose to bring charges based on material that depicts consensual and legal activity between adults, where no serious harm is caused and the likely audience is over the age of 18. The CPS will, however, continue to robustly apply the law to anything which crosses the line into criminal conduct and serious harm.

It seems a little bit rich for the CPS to claim that 'it is not for the CPS to decide what is considered good taste or objectionable', when they have happily been doing exactly that for the last 30 years.

The CPS originally outlined the new rules in a public consultation that started in July 2018. The key proposals read:

When considering whether the content of an article is "obscene", prosecutors should distinguish between:
- Content showing or realistically depicting criminal conduct (whether non-consensual activity, or consensual activity where serious harm is caused), which is likely to be obscene;
- Content showing or realistically depicting other conduct which is lawful, which is unlikely to be obscene.

Do consultees agree or disagree with the guidance that prosecutors must exercise real caution when dealing with the moral nature of acts not criminalized by law, and that the showing or realistic depiction of sexual activity / pornography which does not constitute acts or conduct contrary to the criminal law is unlikely to be obscene?

The following conduct (notwithstanding previous guidance indicating otherwise) will not likely fall to be prosecuted under the Act:

- Fisting
- Activity involving bodily substances (including urine, vomit, blood and faeces)
- Infliction of pain / torture
- Bondage / restraint
- Placing objects into the urethra
- Any other sexual activity not prohibited by law

provided that:

- It is consensual;
- No serious harm is caused;
- It is not otherwise inextricably linked with other criminality; and
- The likely audience is not under 18 or otherwise vulnerable.
The CPS has now issued a document summarising the responses received and how the CPS has taken some of these responses on board. The CPS has already updated its rules in the Revised Legal Guidance from cps.gov.uk . The key rules are now:

When considering whether the content of an article is "obscene", prosecutors should distinguish between:

- Content relating to criminal conduct (whether non-consensual activity, or consensual activity where serious harm is caused, or otherwise inextricably linked to criminality), which is likely to be obscene;
- Content relating to other non-criminal conduct, which is unlikely to be obscene, provided the audience is not young or otherwise vulnerable.
Conduct will not likely fall to be prosecuted under the Act provided that:

- It is consensual (focusing on full and freely exercised consent, and also where the provision of consent is made clear where such consent may not be easily determined from the material itself); and
- No serious harm is caused (whether physical or other, and applying the guidance above at paragraph 17); and
- It is not otherwise inextricably linked with other criminality (so as to encourage emulation or fuelling interest or normalisation of criminality); and
- The likely audience is not under 18 (having particular regard to where measures have been taken to ensure that the audience is not under 18) or otherwise vulnerable (as a result of their physical or mental health, the circumstances in which they may come to view the material, the circumstances which may cause the subject matter to have a particular impact or resonance or any other relevant circumstance).
Note that extreme pornography is considered illegal, so will likely be considered obscene too. But the CPS adds a few additional notes on harmful porn that will continue to be illegal:

Publications which show or depict the infliction of serious harm may be considered to be obscene publications because they show criminal assault notwithstanding the consent of the victim. This includes dismemberment and graphic mutilation. It includes asphyxiation causing unconsciousness, which is more than transient and trifling, and given its danger is serious.

So it seems that breath play will be allowed as long as it doesn't lead to unconsciousness. Another specific rule is that gags do not in themselves imply a lack of consent:

Non-consent for adults must be distinguished from consent to relinquish control. The presence of a gag or other forms of bondage does not, without more, suffice to confirm that sexual activity was non-consensual.
The BBFC changes its R18 rules

The BBFC has several roles: it works in an advisory role when classifying cinema films; it works as an independent and mandatory censor when classifying mainstream videos; but it works directly under government rules when censoring pornographic films. And in this last role, it uses unpublished guidelines based on rules provided by the CPS. The BBFC has informed BBC News that it will indeed use the updated CPS guidelines when censoring porn. The BBC explains:

The BBFC's guidelines forbid material judged to be obscene under the current interpretation of the Obscene Publications Act. A spokeswoman told the BBC:

Because the Obscene Publications Act does not define what types of material are likely to be considered obscene, we rely upon guidance from the Crown Prosecution Service (CPS) as to what classes of material they consider likely to be suitable for prosecution. We are aware that the CPS have updated their guidance on Obscene Publications today and we have now adjusted our own internal policies to reflect that revised guidance.
Myles Jackman and Pandora Blake

And a thank you to two of the leading campaigners calling for the CPS to lighten up on its censorship rules.

Obscenity lawyer Myles Jackman, who has campaigned for these changes for a number of years, told Yahoo News UK that the change had wider implications for the law. He said: "It is very impressive that they've introduced the idea of full and freely exercised consent in the law. Even for people with no interest in pornography this is very important for consent and bodily autonomy."

Activist and queer porn filmmaker Pandora Blake, who also campaigned to have the ban on the depiction of certain sex acts overturned, called the news a 'welcome improvement'. They said: "This is a happy day for queer, feminist and fetish porn." Acts that were banned but can now be depicted include those listed in the CPS guidance above.
|
|
Status report on the government's plans to introduce an internet censor for social media
|
|
|
 | 30th January
2019
|
|
| See article from politico.eu See
also Matt Hancock tells social media giants to remove suicide and self-harm material from telegraph.co.uk |
The U.K. government is rushing to finalise a draft internet censorship law particularly targeting social media, but key details of the proposal have yet to be settled amid concerns about stifling innovation. Government officials have been meeting
with industry players, MPs, peers and other groups over the past month as they try to finalise their proposals. People involved in those discussions said there is now broad agreement about the need to impose a new duty of care on big tech
companies, as well as the need to back up their terms and conditions with the force of law. A white paper is due to be published by the end of winter. But the Department for Digital, Culture, Media and Sport, which is partly responsible for writing
up the new rules alongside the Home Office, is still deliberating over key aspects with just weeks to go until the government said it would unveil an outline of its proposals. Among the sticking points are worries that regulation could stifle
innovation in one of the U.K. economy's most thriving sectors and concerns over whether it can keep pace with rapid technological change. Another is ensuring sufficient political support to pass the law despite likely opposition from parts of the
Conservative Party. A third, and major unresolved, question is what censorship body would ultimately be responsible for enforcing the so-called Internet Safety Law, which could
expose big tech companies to greater liability for hosted content, a prospect that firms including Google and Facebook have fought at the European level. Several people who spoke to POLITICO said the government does not appear to have settled on
who would be the censor. The communications regulator Ofcom is very much in the mix, although there are concerns that Ofcom is already growing too big. |
|
InternetMatters.org publishes a survey showing that 83% of parents support age verification for porn
|
|
|
 |
23rd January 2019
|
|
| See article from
internetmatters.org |
InternetMatters.org is a group funded by UK internet and telecoms companies with the aim of promoting their role in internet safety. The group has now published a survey supporting the government's upcoming introduction of age verification requirements
for porn websites. The results reveal:
- 83% feel that commercial porn sites should demand users verify their age before they're able to access content.
- 76% of UK parents feel there should be greater restrictions online to stop kids seeing adult content.
- 69% of parents of
children aged four to 16 say they're confident the government's new ID restrictions will make a difference.
However 17% disagreed with commercial porn sites requiring ID from their users. And the use of data was the biggest obstacle for those parents opposed to the plans. Of those parents who are anti-age verification, 30% said they wouldn't trust
age-verification companies with their personal data, while 18% of parents expect their kids would find a way to get around age-verification, and a further 13% are unsure that it would actually reduce the number of children accessing
pornography. Age-verification supported by experts |
|
Gay website closes as user fears of being outed via age verification makes the site too dangerous for it to be viable
|
|
|
 | 17th January 2019
|
|
| See article from
gaystarnews.com |
gaystarnews.com has published an article outlining the dangers of porn viewers
submitting their identity data and browsing history to age verifiers and their websites. The article explains that the dangers for gay porn viewers are even more pronounced than for straight viewers. The article illustrates this with an example:
David Bridle, the publisher of Dirty Boyz , announced in October that last month's issue of the magazine would be its last. He said: Following the Conservative government's
decision ... to press ahead with new regulations forcing websites which make money from adult content to carry an age verification system ... Dirtyboyz and its website dirtyboyz.xxx have made the decision to close. The new age
verification system will be mostly run by large adult content companies which themselves host major "Tube" style porn sites. It would force online readers of Dirtyboyz to publicly declare themselves.
Open
Rights Group executive director, Jim Killock, told GSN the privacy of users needs protecting: The issue with age verification systems is that they need to know it's you. This means there's a strong likelihood that it
will basically track you and know what you're watching. And that's data that could be very harmful to people. It could cause issues in relationships. Or it could see children outed to their parents. It could mean people are
subjected to scams and blackmail if that data falls into criminal hands. Government response
A spokesperson for the Department of Culture, Media and Sport (DCMS) told Gay Star News:
Pornographic websites and age verification services will be subject to the UK's existing high standard of data protection legislation. The Data Protection Act 2018 provides a comprehensive and modern framework for data protection, with strong sanctions
for malpractice and enforced by the Information Commissioner's Office.
But this is bollox: the likes of Facebook and Google are allowed to sell browsing data, eg for targeted
advertising, within the remit of GDPR. And targeted advertising could be enough in itself to out porn viewers.
|
|
British porn viewers are reported to be building up their collections ahead of the introduction of censorship and age verification
|
|
|
 |
13th January 2019
|
|
| See article from theregister.co.uk
|
UK-based porn viewers seem to be filling their boots before the government's age checks kick in, as traffic to xHamster rose 6% in 2018. According to xHamster's Alex Hawkins, the trend is typical of countries in which plans to block online pornography
become national news. It seems the more you talk about it, the more people feel invested in it as a right, he said. The government has promised a minimum of three months for industry and the public to prepare for age verification, meaning the checks
are likely to come into force around Easter. However this is a little unfair to websites, as the BBFC has not yet established the process by which age verification services will be kitemarked and approved as promising to keep porn viewers' identity and/or
browsing history acceptably safe. For the moment websites do not know which services will be deemed acceptable. Countries that have restrictions already in place showed, unsurprisingly, a decline in visitors. Traffic from China fell 81% this year,
which xHamster put down to the nation's ban on VPNs and $80,000 cash rewards for people who shopped sites hosting illegal content, like porn. Elsewhere, the report showed an increase in the number of female visitors to the site -- up 42% in the US
and 12.3% worldwide -- a trend Hawkins predicted would continue into 2019. |
|
UK internet porn censorship marches on with the publication of a new law supporting age verification
|
|
|
 |
11th January 2019
|
|
| See article from news.sky.com See
Online Pornography (Commercial Basis) Regulations 2019 from legislation.gov.uk |
The government has published Online Pornography (Commercial Basis) Regulations 2019 which defines which websites get caught up in upcoming internet porn censorship requirements and how social media websites are excused from the censorship. These new
laws will come into force on the day that subsection (1) of section 14 of the Digital Economy Act 2017 comes fully into force. This is the section that introduces porn censorship and age verification requirements. This date has not yet been announced but
the government has promised to give at least 3 months notice. So websites which comprise more than one-third pornographic content, or which promote themselves as pornographic, will be obliged to verify the age of UK visitors. However
the law does not provide any specific protection for porn viewers' data beyond the GDPR requirements to obtain nominal consent before using the data obtained for any purpose the websites may desire. The BBFC and ICO will initiate a voluntary
kitemark scheme so that porn websites and age verification providers can be audited as holding porn browsing data and identity details responsibly. This scheme has not yet produced any audited providers so it seems a little unfair to demand that websites
choose age verification technology before service providers are checked out. It all seems extraordinarily dangerous for porn users to submit their identity to adult websites or age verification providers without any protection under law. The BBFC
has offered worthless calls for these companies to handle data responsibly, but so many of the world's major website companies have proven themselves to be untrustworthy, and hackers, spammers, scammers, blackmailers and identity thieves are hardly
likely to take note of the BBFC's fine words, eg suggesting 'best practice' when implementing age verification. Neil Brown, the MD of law firm decoded.legal told Sky News: It is not clear how this age
verification will be done, and whether it can be done without also having to prove identity, and there are concerns about the lack of specific privacy and security safeguards. Even though this legislation has received quite a lot of
attention, I doubt most internet users will be aware of what looks like an imminent requirement to obtain a 'porn licence' before watching pornography online. The government's own impact assessment recognises that it is not
guaranteed to succeed, and I suspect we will see an increase in advertising from providers in the near future.
It would seem particularly stupid to open oneself up to the dangers of having one's browsing and identity tracked, so surely it is time
to get protected with a VPN, which enables one to continue accessing porn without having to hand over identity details. |
|
A chair has been appointed for the independent appeals panel for age verification
|
|
|
 | 9th January 2019
|
|
| See article from twitter.com |
Kirsty Brimelow QC is the new chairwoman of the independent appeals panel for the age verification regime of the British Board of Film Classification. The panel will oversee attempts to prevent children gaining access to adult content online. The
initial term in the post is for 3 years.
|
|
A parliamentary committee suggests that perhaps the government ought to monitor how age verification requirements endanger porn viewers
|
|
|
 | 6th January 2019
|
|
| See article from theregister.co.uk See
Regulatory Policy Committee report
[pdf] from assets.publishing.service.gov.uk |
Parliament's Regulatory Policy Committee (RPC) has reported that the government's approach to internet porn censorship and age verification is fit for purpose, but raises a few important questions about how safe it is for porn viewers. The RPC
was originally set up a decade ago to help cut red tape by independently checking government estimates of how much complying with new laws and regulations would cost the private sector. Of course, all it has achieved is to watch the western world suffocate
itself in accelerating red tape, to the point that the west seems set on a permanent course towards diminishing wealth and popular unrest. One has to ask if the committee itself is fit for purpose? Anyway, on the subject of endangering porn
users by setting them up for identity thieves, blackmailers and scammers, the authors write: Risks and wider impacts. The Impact Assessment (IA) makes only limited reference to risks and wider impacts of the measure.
These include the risk that adults and children may be pushed towards the dark web or related systems to avoid AV, where they could be exposed to illegal activities and extreme material that they otherwise would never have come into contact with. The IA
also recognises numerous other wider impacts, including privacy/fraud concerns linked to inputting ID data into sites and apps. Given the potential severity of such risks and wider impacts, the RPC believes that a more thorough
consideration of each, and of the potential means to mitigate them, would have been appropriate. The RPC therefore recommends that the Department ensures that it robustly monitors these risks and wider impacts, post-implementation.
|
|
|