SNP adopts a resolution not to use Scotland's archaic blasphemy law to prosecute anyone; there are plenty of modern equivalents to use instead
30th March 2018

See article from scottishlegal.com
The Scottish National Party has passed a resolution that Scotland's centuries-old blasphemy law should be abolished, or at least never be used to prosecute anyone. Blasphemy is outlawed under the Confession of Faith Ratification Act 1690 and was last used in 1843 to convict the Edinburgh bookseller Thomas Paterson, who was imprisoned for selling blasphemous literature. The motion said Scotland lags behind other European countries by still having the law on the statute books, and called for the abolition of the archaic common law crimes of blasphemy, heresy and profanity to the extent that they remain law in Scotland.
So when even the most senior internet figures can't keep our data safe, why does Matt Hancock want to force us to hand over our porn browsing history to the Mindgeek Gang?
29th March 2018

20th March 2018. See article from dailymail.co.uk. See The Cambridge Analytica scandal isn't a scandal: this is how Facebook works from independent.co.uk
The Culture Secretary has vowed to end the Wild West for tech giants amid anger at claims data from Facebook users was harvested to be used by political campaigns. Matt Hancock warned social media companies that they could be slapped with new
rules and regulations to rein them in. It comes amid fury at claims the Facebook data of around 50 million users was taken without their permission and used by Cambridge Analytica. The firm played a key role in mapping out the behaviour of
voters in the run-up to the 2016 US election and the EU referendum campaign earlier that year. Tory MP Damian Collins, chairman of the Culture select committee, has said he wants to haul Mark Zuckerberg to Parliament to explain himself. Hancock said:
Tech companies store the data of billions of people around the world - giving an unparalleled insight into the lives and thoughts of people. And they must do more to show they are storing the data responsibly.
Update: They'll have to put a price on his head if they want Zuckerberg hauled in front of a judge 28th March 2018. See article from theguardian.com

Mark Zuckerberg has turned down the request to appear in front of a UK parliamentary committee for a good grilling. In response to the request, Facebook has suggested one of two executives could speak to parliament: Chris Cox, the company's chief product officer, who is in charge of the Facebook news feed, or Mike Schroepfer, the chief technology officer, who heads up the developer platform. The Culture select committee chair, Damian Collins, said: It is absolutely astonishing that Mark
Zuckerberg is not prepared to submit himself to questioning in front of a parliamentary or congressional hearing, given these are questions of fundamental importance and concern to his users, as well as to this inquiry. I would certainly urge him to
think again if he has any care for people that use his company's services.
Update: Pulling the big data plug 29th March 2018. See
article from reuters.com Facebook said on Wednesday it would end its
partnerships with several large data brokers who help advertisers target people on the social network. Facebook has for years given advertisers the option of targeting their ads based on data collected by companies such as Acxiom Corp and Experian PLC.
Facebook has also adjusted the privacy settings on its service, giving users control over their personal information in fewer taps. This move also reflects new European privacy laws soon to come into force. Update:
Facebook's listening 29th March 2018. See article from
dailymail.co.uk
Christopher Wylie, the whistleblower who revealed lots of interesting stuff about Facebook and Cambridge Analytica, has been speaking to the Commons Digital, Culture, Media and Sport Committee about what Facebook gets up to. He told the committee
that he believes the social media giant is able to decipher whether someone is out in a crowd of people, in the office or at home. Asked by Conservative MP Damian Collins whether Facebook can listen to what people are saying to shape their
advertising, Wylie said they use the smartphone app microphone for environmental purposes. My understanding generally of how companies use it... not just Facebook, but generally other apps that pull audio, is for
environmental context. So if, for example, you have a television playing versus if you're in a busy place with a lot of people talking versus a work environment. It's not to say they're listening to what
you're saying. It's not natural language processing. That would be hard to scale.
It is interesting to note that he said companies don't listen in on conversations because they can't, for the moment. But the explanation is phrased to suggest that they will listen to conversations just as soon as the technology allows.
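To make the idea of 'environmental context' concrete, here is a minimal sketch of how coarse audio classification might work, assuming nothing about Facebook's actual implementation: a couple of cheap signal statistics are enough to guess 'TV playing' versus 'busy room' versus 'quiet office' without any speech recognition. The thresholds, labels and function names below are invented for illustration.

    # Illustrative sketch only: coarse audio 'environment' detection from cheap
    # signal statistics, with no speech recognition. Thresholds and labels are
    # invented for this sketch and do not reflect any real product.
    import numpy as np

    def frame_features(samples, rate, frame_ms=50):
        # Split audio into short frames; compute RMS energy and spectral flatness.
        frame_len = int(rate * frame_ms / 1000)
        frames = samples[:len(samples) // frame_len * frame_len].reshape(-1, frame_len)
        rms = np.sqrt(np.mean(frames ** 2, axis=1))
        spectrum = np.abs(np.fft.rfft(frames, axis=1)) + 1e-12
        # Flatness is near 1 for noise-like audio (crowd chatter) and lower
        # for tonal audio such as television or music.
        flatness = np.exp(np.mean(np.log(spectrum), axis=1)) / np.mean(spectrum, axis=1)
        return rms.mean(), flatness.mean()

    def guess_environment(samples, rate):
        # Pure heuristic mapping from average features to a coarse label.
        rms, flatness = frame_features(samples, rate)
        if rms < 0.01:
            return "quiet room or office"
        if flatness > 0.5:
            return "busy place, many people talking"
        return "television or music playing"

    # One second of fake audio at 16 kHz stands in for a microphone capture.
    print(guess_environment(np.random.randn(16000) * 0.05, 16000))

The point of the quote is that features like these scale cheaply across millions of users, whereas transcribing everything people say would not.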
22nd March 2018

The Register investigates, touching on the dark web, smut monopolies and moral outrage. See article from theregister.co.uk
But will a porn site with an unadvertised Russian connection follow these laws, or will it pass on IDs and browsing histories straight to the 'dirt digging' department of the KGB?
17th March 2018

See article from theregister.co.uk
The Open Rights Group, Myles Jackman and Pandora Blake have done a magnificent job in highlighting the dangers of mandating that porn companies verify the age of their customers.

Worst case scenario

In the worst case scenario, foreign porn companies will demand official ID from porn viewers and then be able to maintain a database of the complete browsing history of those officially identified viewers. And surely much to the alarm of the government and the newly appointed internet porn censors at the BBFC, this worst case scenario seems to be the clear favourite to be implemented. In particular MindGeek, with a near monopoly on free porn tube sites, is taking the lead with its AgeID scheme. Now for some bizarre reason, the government saw no need for its age verification to offer any specific protection for porn viewers, beyond that offered by existing and upcoming data protection laws. Given some of the things that Google and Facebook do with personal data, these laws look woefully inadequate for the context of porn viewing. For safety and national security reasons, data identifying porn users should be kept under total lock and key, and not used for any commercial reason whatsoever.

A big flaw

But therein lies the flaw of the law. The government is mandating that all websites, including those based abroad, should verify their users without specifying any data protection requirements beyond the law of the land. The flaw is that foreign websites are simply not obliged to respect British data protection laws. So, as a topical example, there would be nothing to prevent a Russian porn site (maybe not identifying itself as Russian) from requiring ID and then passing the ID and subsequent porn browsing history straight over to its dirty tricks department. Anyway the government has made a total pig's ear of the concept with its conservative 'leave it to industry to find a solution' approach. The porn industry simply does not have the safety and security of its customers at heart. Perhaps the government should have invested in its own solution first; the national security implications might then have pushed it into at least considering user safety and security.

Where we are at

As mentioned above, campaigners have done a fine job in identifying the dangers of the government plan and these have been picked up by practically all newspapers. These seem to have chimed with readers and the entire idea seems to be accepted as dangerous. In fact I haven't spotted anyone, not even the 'think of the children' charities, pushing for 'let's just get on with it'. And so now it's over to the authorities to try and convince people that they have a safe solution somewhere.

The Digital Policy Alliance
Perhaps as part of a propaganda campaign to win over the people, parliament's Digital Policy Alliance is just about to publish guidance on age verification policies. The alliance is a cross-party group that includes Merlin Hay, the Earl of Erroll, who made some good points about privacy concerns whilst the bill was being forced through the House of Lords. He said that a Publicly Available Specification (PAS) numbered 1296 is due to be published on 19 March. This will set out for the age check
providers what they should do and what records they keep. The document is expected to include a discussion on the background to age verification, set out the rules in accordance with the Digital Economy Act, and give a detailed look at the
technology, with annexes on anonymity and how the system should work. However the document will carry no authority and is not set to become an official British standard. He explained: We can't put rules about
data protection into the PAS... That is in the Data Protection Bill, he said. So we refer to them, but we can't mandate them inside this PAS -- but it's in there as 'you must obey the law'...
But of course Hay did not mention that Russian websites don't have to obey British data protection law.

And next the BBFC will have a crack at reducing people's fears

Elsewhere in the discussion, Hay suggested the British Board of Film and Internet Censorship could mandate that each site had to offer more than one age-verification provider, which would give consumers more choice. Next the BBFC will have a crack at minimising people's fears about age verification. It will publish its own guidance document towards the end of the month, and launch a public consultation about it.
14th March 2018
A commendably negative take from The Sun. A legal expert has revealed the hidden dangers of strict new porn laws, which will force Brits to hand over personal info in exchange for access to XXX videos. See article from thesun.co.uk
Is it just me, or is Matt Hancock a little too keen to advocate ID checks just so the state can control 'screen time'? Are we sure that such snooping wouldn't be abused for other reasons of state control?
13th March 2018

See article from alphr.com
It's no secret the UK government has a vendetta against the internet and social media. Now Matt Hancock, the secretary of state for Digital, Culture, Media and Sport (DCMS), wants to push that further and enforce screen time cutoffs for UK children on Facebook, Instagram and Snapchat. Talking to the Sunday Times, Hancock explained that the negative impacts of social media need to be dealt with, and he laid out his idea for an age-verification system to apply more widely than just porn viewing. He outlined that age-verification could be handled similarly to film classifications, with sites like YouTube being restricted to those over 18. The worrying thing, however, is his plan to create mandatory screen time cutoffs for all children. Referencing the porn restrictions he said: People said 'How are you going to police that?' I said if you don't have it, we will take down your website in Britain. The end result is that the big porn sites are introducing this globally, so we are leading the way. ...Read the full article from alphr.com

Advocating internet censorship

See article from gov.uk

Whenever politicians speak of 'balance' it inevitably means that the balance will soon swing from people's rights towards state control. Matt Hancock more or less announced further internet censorship in a speech at the Oxford Media Convention. He said: Our schools and our curriculum have a
valuable role to play so students can tell fact from fiction and think critically about the news that they read and watch. But it is not easy for our children, or indeed for anyone who reads news online. Although we have robust
mechanisms to address disinformation in the broadcast and press industries, this is simply not the case online. Take the example of three different organisations posting a video online. If a broadcaster
published it on their on demand service, the content would be a matter for Ofcom. If a newspaper posted it, it would be a matter for IPSO. If an individual published it online, it would be untouched by
media regulation. Now I am passionate in my belief in a free and open Internet ....BUT... freedom does not mean the freedom to harm others. Freedom can only exist within a framework. Digital
platforms need to step up and play their part in establishing online rules and working for the benefit of the public that uses them. We've seen some positive first steps from Google, Facebook and Twitter recently, but even tech
companies recognise that more needs to be done. We are looking at the legal liability that social media companies have for the content shared on their sites. Because it's a fact on the web that online platforms are no longer just
passive hosts. But this is not simply about applying publisher or broadcaster standards of liability to online platforms. There are those who argue that every word on every platform should be the full legal
responsibility of the platform. But then how could anyone ever let me post anything, even though I'm an extremely responsible adult? This is new ground and we are exploring a range of ideas... including
where we can tighten current rules to tackle illegal content online... and where platforms should still qualify for 'host' category protections. We will strike the right balance between addressing issues
with content online and allowing the digital economy to flourish. This is part of the thinking behind our Digital Charter. We will work with publishers, tech companies, civil society and others to establish a new framework...
A change of heart on press censorship

It was only a few years ago that the government was all in favour of creating a press censor. However new fears such as Russian interference and fake news have turned the mainstream press into the champions of trustworthy news. And so previous plans for a press censor have been put on hold. Hancock said in the Oxford speech: Sustaining high quality journalism is a vital public policy goal. The scrutiny, the
accountability, the uncovering of wrongs and the fuelling of debate is mission critical to a healthy democracy. After all, journalists helped bring Stephen Lawrence's killers to justice and have given their lives reporting from
places where many of us would fear to go. And while I've not always enjoyed every article written about me, that's not what it's there for. I tremble at the thought of a media regulated by the state in a
time of malevolent forces in politics. Get this wrong and I fear for the future of our liberal democracy. We must get this right. I want publications to be able to choose their own path, making decisions like how to make the most
out of online advertising and whether to use paywalls. After all, it's your copy, it's your IP. The removal of Google's 'first click free' policy has been a welcome move for the news sector. But I ask the question - if someone is
protecting their intellectual property with a paywall, shouldn't that be promoted, not just neutral in the search algorithm? I've watched the industry grapple with the challenge of how to monetise content online, with different
models of paywalls and subscriptions. Some of these have been successful, and all of them have evolved over time. I've been interested in recent ideas to take this further and develop new subscription models for the industry.
Our job in Government is to provide the framework for a market that works, without state regulation of the press.
A few more details from the point of view of British adult websites
12th March 2018

See article from wired.co.uk
The Government announces a new timetable for the introduction of internet porn censorship, now set to be in force by the end of 2018
11th March 2018

See press release from gov.uk
In a press release the DCMS describes its digital strategy including a delayed introduction of internet porn censorship. The press release states: The Strategy also reflects the Government's ambition to make the internet
safer for children by requiring age verification for access to commercial pornographic websites in the UK. In February, the British Board of Film Classification (BBFC) was formally designated as the age verification regulator. Our
priority is to make the internet safer for children and we believe this is best achieved by taking time to get the implementation of the policy right. We will therefore allow time for the BBFC as regulator to undertake a public consultation on its draft
guidance which will be launched later this month. For the public and the industry to prepare for and comply with age verification, the Government will also ensure a period of up to three months after the BBFC guidance has been
cleared by Parliament before the law comes into force. It is anticipated age verification will be enforceable by the end of the year.
UK censorship minister seems in favour of state controls on screen time for children
10th March 2018

See article from stv.tv
Children could have time limits imposed when they are on social media sites, the secretary of state for digital, culture, media and censorship has suggested. Matt Hancock told The Times that an age-verification system could be used to restrict screen
time. He said: There is a genuine concern about the amount of screen time young people are clocking up and the negative impact it could have on their lives. For an adult I wouldn't want to
restrict the amount of time you are on a platform but for different ages it might be right to have different time cut-offs.
So how come the BBFC are saying virtually nothing about internet porn censorship and seem happy for newspapers to point out the incredibly dangerous privacy concerns of letting porn websites hold browsing records?

7th March 2018

See article from bbc.com
The BBC seems to have done a good job voicing the privacy concerns of the Open Rights Group, as the article has been picked up by most of the British press. The Open Rights Group says it fears a data breach is inevitable as the deadline approaches for a controversial change in the way people in the UK access online pornography. Myles Jackman, legal director of the Open Rights Group, said while MindGeek had said it would not hold or store data, it was not clear who would - and by signing in
people would be revealing their sexual preferences. If the age verification process continues in its current fashion, it's a once-in-a-lifetime treasure trove of private information, he said. If it gets hacked,
can British citizens ever trust the government again with their data? The big issues here are privacy and security.
Jackman said it would drive more people to use virtual private networks (VPNs) - which mask a
device's geographical location to circumvent local restrictions - or the anonymous web browser Tor. He commented: It is brutally ironic that when the government is trying to break all encryption in order to combat
extremism, it is now forcing people to turn towards the dark web.
MindGeek, which runs sites including PornHub, YouPorn and RedTube, said its AgeID age verification tool had been in use in Germany since 2015. It said its software
would use third-party age-verification companies to authenticate the age of those signing in. AgeID spokesman James Clark told the BBC there were multiple verification methods that could be used - including credit card, mobile SMS, passport and
driving licence - but that it was not yet clear which would be compliant with the law. For something that is supposed to be coming in April, and requires software updates by websites, it is surely about time that the government and/or the BBFC actually told people about the detailed rules for when age verification is required and what methods will be acceptable to the censors. The start date has not actually been confirmed yet and the BBFC haven't even acknowledged that they have accepted the job as the UK porn censor. The BBFC boss David Austin spouted some nonsense to the BBC claiming that age verification was already in place for other services, including some video-on-demand sites. In fact 'other' services such as gambling sites have totally different privacy issues and aren't really relevant to porn. The only method in place so far is to demand credit cards to access porn, and the only thing that this has proved is that it is totally unviable for the businesses involved, and is hardly relevant to how the dominant tube sites work. In fact a total absence of input from the BBFC is already leading to some alarming takes on the privacy issues of handing over people's porn viewing records to porn companies. Surely the BBFC would be expected to provide official state propaganda trying to convince the worried masses that they have nothing to fear and that porn websites have people's best interests at heart. For instance, the Telegraph follow-up report writes (see article from telegraph.co.uk):
People who watch pornography online are at risk of having their sexual tastes exposed by the incoming age verification checks, a privacy expert has warned. The Government has given the all clear for one of the largest pornography companies to organise the arrangements for verification, but experts claim that handing this power to the porn industry could put more people at risk. Those viewing porn will no longer be anonymous and their sexual tastes may be easily revealed through a cache of the
websites they have visited, according to Jim Killock, director of Open Rights Group. He warned: These are the most sensitive, embarrassing viewing habits that have potentially life-changing consequences if they become
public. In order for it to work, the company will end up with a list of every webpage of all of the big pornographic products someone has visited. Just like Google and Facebook, companies want to profile you and send you
advertisements based on what you are searching for. So what are AgeID going to do now that they have been given unparalleled access to people's pornographic tastes? They are going to decide what people's sexual tastes are and the
logic of that is impossible to resist. Even if they give reassurances, I just cannot see why they wouldn't. A database with someone's sexual preferences, highlighted by the web pages visited and geographically traceable through
the IP address, would be a target for hackers who could use them for blackmail or simply to cause humiliation. Imagine if you are a teacher and the pornography that you looked at - completely legally - became public? It would be
devastating for someone's career.
Politicians, censors and campaigners scent blood in getting Facebook and Google to censor their pet peeves, in this case pop-up brothels
5th March 2018

See article from thesun.co.uk
Google and Facebook accused of supposedly profiting from pop-up brothels and sex clubs sweeping Britain

Ministers are reportedly considering new laws to make internet giants liable when sex workers use their sites to organise business. The National Crime Agency (NCA) are supporting the propaganda and claim Google and Facebook are making profits from sex trafficking, according to the Times. Pop-up sex clubs have been discovered in Cornwall, Cambridge, Swindon and holiday cottages in the Peak District. Will Kerr, the NCA's 'head of vulnerabilities', claimed: People are using the internet and social media sites to enable sexual exploitation and trafficking. It is clear that the internet platforms which host and make a profit out of this type of material need to do more to identify and stop these forms of exploitation. Government figures want internet giants like Facebook to be held accountable, eyeing new US laws that are set to overturn more than 20 years of blanket immunity for sites for content posted by users. It will make firms liable if they knowingly assist, support or facilitate content that leads to trafficking. Downing Street and the Department for Digital, Culture, Media and Sport said they are looking at whether and how to replicate the action in the UK.
5th March 2018

How Bad Laws Happen. By David Flint. See article from reprobatemagazine.uk
The government abandons the disgraceful and unjust press censorship laws associated with Leveson
2nd March 2018

See article from gov.uk
Matt Hancock said in a statement to Parliament: Over many centuries in Britain, our press has held the powerful to account and been free to report and investigate without fear or favour. These principles underpin our democracy and are
integral to the freedom of our nation. Today in a world of the Internet and clickbait, our press face critical challenges that threaten their livelihood and sustainability - with declining circulations and a changing media
landscape. Mr Speaker, it is in this context that we approach the Leveson Inquiry, which was set up seven years ago in 2011, and reported six years ago in 2012, in response to events over a decade ago. The
Leveson Inquiry was a diligent and thorough examination of the culture, practices and ethics of our press in response to illegal and improper press intrusion. There were far too many cases of terrible behaviour and having met some
of the victims, I understand the impact this had. I want, from the start, to thank Sir Brian for his work. The Inquiry lasted over a year and heard evidence from more than 300 people including journalists,
editors and victims. Three major police investigations examined a wide range of offences, and more than 40 people were convicted. The Inquiry and investigations were comprehensive. And since it was set up, the terms of reference for a Part 2 of the Inquiry have largely been met.
There have also been extensive reforms to policing practices and significant changes to press self-regulation. IPSO has been established and now regulates 95% of national newspapers by circulation. It has
taken significant steps to demonstrate its independence as a regulator. And in 2016, Sir Joseph Pilling concluded that IPSO largely complied with Leveson's recommendations. There have been further improvements since and I hope
more to come. In November last year, IPSO introduced a new system of low-cost arbitration. It has processed more than 40,000 complaints in its first three years of operation; and has ordered multiple front
page corrections or clarifications. Newspapers have also made improvements to their governance frameworks to improve internal controls, standards and compliance. And one regulator, IMPRESS, has been
recognised under the Royal Charter. Extensive reforms to policing practices have been made. The College of Policing has published a code of ethics and developed national guidance for police officers on how
to engage with the press. And reforms in the Policing and Crime Act have strengthened protections for police whistleblowers. So it is clear that we have seen significant progress, from publications, from
the police and also from the newly formed regulator. And Mr Speaker, the media landscape today is markedly different from that which Sir Brian looked at in 2011. The way we consume news has changed
dramatically. Newspaper circulation has fallen by around 30 per cent since the conclusion of the Leveson Inquiry. And although digital circulation is rising, publishers are finding it much harder to
generate revenue online. In 2015, for every 100 pounds newspapers lost in print revenue they gained only 3 pounds in digital revenue. Our local papers, in particular, are under severe pressure. Local papers
help to bring together local voices and shine a light on important local issues - in communities, in courtrooms, in council chambers. And as we devolve power further to local communities, they will become even more important.
And yet, over 200 local newspapers have closed since 2015, including two in my own constituency. There are also new challenges, that were only in their infancy back in 2011. We have seen
the dramatic and continued rise of social media, which is largely unregulated. And issues like clickbait, fake news, malicious disinformation and online abuse, which threaten high quality journalism. A
foundation of any successful democracy is a sound basis for democratic discourse. This is under threat from these new forces that require urgent attention. These are today's challenges and this is where we need to focus.
Especially as over 48 million pounds was spent on the police investigations and the Inquiry. During the consultation, 12% of direct respondents were in favour of reopening the Leveson Inquiry, with 66% against.
We agree and that is the position that we set out in our Manifesto. Sir Brian, who I thank for his service, agrees that the Inquiry should not proceed on the current terms of reference but believes that it should continue in an
amended form. We do not believe that reopening this costly and time-consuming public inquiry is the right way forward. Considering all of the factors that I have outlined to the House today, I have informed
Sir Brian that we will be formally closing the Inquiry. But we will take action to safeguard the lifeblood of our democratic discourse, and tackle the challenges our media face today, not a decade ago. During the consultation, we also found serious concerns that Section 40 of the Crime and Courts Act 2013 would exacerbate the problems the press face rather than solve them.
Respondents were worried that it would impose further financial burdens, especially on the local press. One high profile figure put it very clearly. He said: 'Newspapers...are
already operating in a tough environment. These proposals will make it tougher and add to the risk of self-censorship'. 'The threat of having to pay both sides' costs - no matter what the challenge - would have the effect of
leaving journalists questioning every report that named an individual or included the most innocuous data about them.' He went on to say that Section 40 risks 'damaging the future of a paper that you love' and that the impact will
be to 'make it much more difficult for papers...to survive'. These are not my words Mr Speaker, but the words of Alastair Campbell talking about the chilling threat of Section 40. [political content removed]
Only 7 per cent of direct respondents favoured full commencement of Section 40. By contrast, 79 per cent favoured full repeal. Mr Speaker, we have decided not to commence Section 40 of the Crime and Courts Act
2013 and to seek repeal at the earliest opportunity. Action is needed. Not based on what might have been needed years ago - but action now to address today's problems. Our new Digital Charter sets out the
overarching programme of work to agree norms and rules for the online world and put them into practice. Under the Digital Charter, our Internet Safety Strategy is looking at online behaviour and we will firmly tackle the problems
of online abuse. And our review into the sustainability of high quality journalism will address concerns about the impact of the Internet on our news and media. It will do this in a forward looking way, so
we can respond to the challenges of today, not the challenges of yesterday. Mr Speaker, the future of a vibrant press matters to us all. There has been a huge public response to our consultation. I would
like to thank every one of the 174,000 respondents as well as all those who signed petitions. We have carefully considered all of the evidence we received. We have consulted widely, with regulators, publications and victims of
press intrusion. The world has changed since the Leveson Inquiry was established in 2011. Since then we have seen seismic changes to the media landscape. The work of the Leveson
Inquiry, and the reforms since, have had a huge impact on public life. We thank Sir Brian Leveson for lending his dedication and expertise to the undertaking of this Inquiry. At national and local levels, a press that can hold the
powerful to account remains an essential component of our democracy. Britain needs high-quality journalism to thrive in the new digital world. We seek a press - a media - that is robust, and independently
regulated. That reports without fear or favour. The steps I have set out today will help give Britain a vibrant, independent and free press that holds the powerful to account and rises to the challenges of our times.
The BBC notes that it is only a few weeks until age verification is required for porn sites yet neither the government nor the BBFC has been able to provide details to the BBC about how it will work.

28th February 2018

See article from bbc.com
The BBC writes: A few weeks before a major change to the way in which UK viewers access online pornography, neither the government nor the appointed regulator has been able to provide details to the BBC about how it will work.
From April 2018, people accessing porn sites will have to prove they are aged 18 or over. Both bodies said more information would be available soon. The British Board of Film
Classification (BBFC) was named by parliament as the regulator in December 2017. (But wasn't actually appointed until 21st February 2018. However the BBFC has been working on its censorship procedures for many months already but has refused to speak
about this until formally appointed). The porn industry has been left to develop its own age verification tools. Prof Alan Woodward, cybersecurity expert at Surrey University, told the BBC this presented
porn sites with a dilemma - needing to comply with the regulation but not wanting to make it difficult for their customers to access content: I can't imagine many porn-site visitors will be happy uploading copies of passports and driving licences to such a site. And the site operators know that.
The government has now officially appointed the BBFC as its internet porn censor
23rd February 2018

See letter [pdf] from gov.uk
The DCMS has published a letter dated 21st February 2018 that officially appoints the BBFC as its internet porn censor. It euphemistically describes the role as an age verification regulator. Presumably a few press releases will follow, and now the BBFC can at least be expected to comment on how the censorship will be implemented. The enforcement has previously been noted as starting around late April or early May, but this does not seem to give sufficient time for the required software to be implemented by websites.
The UK reveals a tool to detect uploads of jihadi videos
15th February 2018

13th February 2018. See article from bbc.com
The UK government has unveiled a tool it says can accurately detect jihadist content and block it from being viewed. Home Secretary Amber Rudd told the BBC she would not rule out forcing technology companies to use it by law. Rudd is visiting the
US to meet tech companies to discuss the idea, as well as other efforts to tackle extremism. The government provided £600,000 of public funds towards the creation of the tool by an artificial intelligence company based in London. Thousands
of hours of content posted by the Islamic State group were run past the tool in order to train it to automatically spot extremist material. ASI Data Science said the software can be configured to detect 94% of IS video uploads. Anything the software identifies as potential IS material would be flagged up for a human decision to be taken. The company said it typically flagged 0.005% of non-IS video uploads. But this figure is meaningless without an indication of how many uploads contain any content with any connection to jihadis in the first place. In London, reporters were given an off-the-record briefing detailing how ASI's software worked, but were asked not to share its precise methodology. However, in simple terms, it is an algorithm that draws on characteristics typical of IS and its online activity. It sounds like the tool is more about analysing data about the uploading account - geographical origin, time of day, name of poster, etc. - rather than analysing the video itself.
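To see why the base rate matters, here is a minimal worked example; the upload volumes are invented purely for illustration, and only the two quoted rates (94% detection, 0.005% false positives) come from the article. Even a tiny false positive rate can swamp the true detections when jihadi videos are a minute fraction of uploads.

    # Worked base-rate arithmetic, as a sketch. Only detection_rate and
    # false_positive_rate come from the article; the volumes are hypothetical.
    detection_rate = 0.94          # share of IS videos correctly flagged
    false_positive_rate = 0.00005  # share of innocent videos wrongly flagged

    daily_uploads = 5_000_000      # hypothetical platform volume
    is_uploads = 500               # hypothetical number of actual IS videos

    true_flags = detection_rate * is_uploads                          # ~470
    false_flags = false_positive_rate * (daily_uploads - is_uploads)  # ~250

    precision = true_flags / (true_flags + false_flags)
    print(f"flagged: {true_flags + false_flags:.0f}, "
          f"of which genuinely IS: {precision:.0%}")

On these assumptions roughly a third of the flagged videos are innocent, and every flag still needs the human decision mentioned above; halve the number of genuine IS uploads and the innocent share rises above a half.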
Comment: Even extremist takedowns require accountability 15th February 2018. See article from openrightsgroup.org
Can extremist material be identified at 99.99% certainty as Amber Rudd claims today? And how does she intend to ensure that there is legal accountability for content removal? The Government is very keen to ensure that
extremist material is removed from private platforms, like Facebook, Twitter and Youtube. It has urged use of machine learning and algorithmic identification by the companies, and threatened fines for failing to remove content swiftly.
Today Amber Rudd claims to have developed a tool to identify extremist content, based on a database of known material. Such tools can have a role to play in identifying unwanted material, but we need to understand that there are some
important caveats to what these tools are doing, with implications about how they are used, particularly around accountability. We list these below. Before we proceed, we should also recognise that this is often about computers
(bots) posting vast volumes of material with a very small audience. Amber Rudd's new machine may then potentially clean some of it up. It is in many ways a propaganda battle between extremists claiming to be internet savvy and exaggerating their impact,
while our own government claims that they are going to clean up the internet. Both sides benefit from the apparent conflict. The real world impact of all this activity may not be as great as is being claimed. We should be given
much more information about what exactly is being posted and removed. For instance the UK police remove over 100,000 pieces of extremist content by notice to companies: we currently get just this headline figure. We know nothing more about these takedowns. They might have never been viewed, except by the police, or they might have been very influential. The results of the government's campaign to remove extremist material may be to push extremists towards more private or
censor-proof platforms. That may impact the ability of the authorities to surveil criminals and to remove material in the future. We may regret chasing extremists off major platforms, where their activities are in full view and easily used to identify
activity and actors. Whatever the wisdom of proceeding down this path, we need to be worried about the unwanted consequences of machine takedowns. Firstly, we are pushing companies to be the judges of legal and illegal. Secondly,
all systems make mistakes and require accountability for them; mistakes need to be minimised, but also rectified. Here is our list of questions that need to be resolved. 1 What really is the accuracy of
this system? Small error rates translate into very large numbers of errors at scale. We see this with more general internet filters in the UK, where our blocked.org.uk project regularly uncovers and reports errors.
How are the accuracy rates determined? Is there any external review of its decisions? The government appears to recognise the technology has limitations. In order to claim a high accuracy rate, they say at
least 6% of extremist video content has to be missed. On large platforms that would be a great deal of material needing human review. The government's own tool shows the limitations of their prior demands that technology "solve" this problem.
Islamic extremists are operating rather like spammers when they post their material. Just like spammers, their techniques change to avoid filtering. The system will need constant updating to keep a given level of accuracy.
2 Machines are not determining meaning Machines can only attempt to pattern match, with the assumption that content and form imply purpose and meaning. This explains how errors can occur, particularly in
missing new material. 3 Context is everything The same content can, in different circumstances, be legal or illegal. The law defines extremist material as promoting or glorifying terrorism. This is a
vague concept. The same underlying material, with small changes, can become news, satire or commentary. Machines cannot easily determine the difference. 4 The learning is only as good as the underlying material
The underlying database is used to train machines to pattern match. Therefore the quality of the initial database is very important. It is unclear how the material in the database has been deemed illegal, but it is likely that these
are police determinations rather than legal ones, meaning that inaccuracies or biases in police assumptions will be repeated in any machine learning. 5 Machines are making no legal judgment The
machines are not making a legal determination. This means a company's decision to act on what the machine says is absent of clear knowledge. At the very least, if material is "machine determined" to be illegal, the poster, and users who attempt
to see the material, need to be told that a machine determination has been made. 6 Humans and courts need to be able to review complaints Anyone who posts material must be able to get human review,
and recourse to courts if necessary. 7 Whose decision is this exactly? The government wants small companies to use the database to identify and remove material. If material is incorrectly removed,
perhaps appealed, who is responsible for reviewing any mistake? It may be too complicated for the small company. Since it is the database product making the mistake, the designers need to act to correct it so that it is less
likely to be repeated elsewhere. If the government want people to use their tool, there is a strong case that the government should review mistakes and ensure that there is an independent appeals process.
8 How do we know about errors? Any takedown system tends towards overzealous takedowns. We hope the identification system is built for accuracy and prefers to miss material rather than remove the wrong things, however
errors will often go unreported. There are strong incentives for legitimate posters of news, commentary, or satire to simply accept the removal of their content. To complain about a takedown would take serious nerve, given that you risk being flagged as
a terrorist sympathiser, or perhaps having to enter formal legal proceedings. We need a much stronger conversation about the accountability of these systems. So far, in every context, this is a question the government has
ignored. If this is a fight for the rule of law and against tyranny, then we must not create arbitrary, unaccountable, extra-legal censorship systems.
Matt Hancock rules out creating a UK social media censor
10th February 2018

See article from tech.newstatesman.com
The UK's digital and culture secretary, Matt Hancock, has ruled out creating a new internet censor targeting social media such as Facebook and Twitter. In an interview on the BBC's Media Show, Hancock said he was not inclined in that direction
and instead wanted to ensure existing regulation is fit for purpose. He said: If you tried to bring in a new regulator you'd end up having to regulate everything. But that doesn't mean that we don't need to make sure
that the regulations ensure that markets work properly and people are protected.
Meanwhile the Electoral Commission and the Department for Digital, Culture, Media and Sport select committee are now investigating whether Russian groups
used the platforms to interfere in the Brexit referendum in 2016. The DCMS select committee is in the US this week to grill tech executives about their role in spreading fake news. In a committee hearing in Washington yesterday, YouTube's policy chief
said the site had found no evidence of Russian-linked accounts purchasing ads to interfere in the Brexit referendum.
Government outlines next steps to make the UK the most censored place to be online
7th February 2018

See press release from gov.uk
Government outlines next steps to make the UK the safest place to be online

The Prime Minister has announced plans to review laws and make sure that what is illegal offline is illegal online as the Government marks Safer
Internet Day. The Law Commission will launch a review of current legislation on offensive online communications to ensure that laws are up to date with technology. As set out in the
Internet Safety Strategy Green Paper, the Government is clear that abusive and threatening behaviour online is totally
unacceptable. This work will determine whether laws are effective enough in ensuring parity between the treatment of offensive behaviour that happens offline and online. The Prime Minister has also announced:
- That the Government will introduce a comprehensive new social media code of practice this year, setting out clearly the minimum expectations on social media companies
- The introduction of an annual internet safety transparency report - providing UK data on offensive online content and what action is being taken to remove it.
Other announcements made today by Secretary of State for Digital, Culture, Media and Sport (DCMS) Matt Hancock include:
- A new online safety guide for those working with children, including school leaders and teachers, to prepare young people for digital life
- A commitment from major online platforms including Google, Facebook and Twitter to put in place specific support during election campaigns to ensure abusive content can be dealt with quickly -- and that they will provide advice and guidance to Parliamentary candidates on how to remain safe and secure online
DCMS Secretary of State Matt Hancock said: We want to make the UK the safest place in the world to be online and having listened to the views of parents, communities and industry, we are delivering
on the ambitions set out in our Internet Safety Strategy.
Not only are we seeing if the law needs updating to better tackle online harms, we are moving forward with our plans for online platforms to have
tailored protections in place - giving the UK public standards of internet safety unparalleled anywhere else in the world.
Law Commissioner Professor David Ormerod QC said: There
are laws in place to stop abuse but we've moved on from the age of green ink and poison pens. The digital world throws up new questions and we need to make sure that the law is robust and flexible enough to answer them.
If we are to be safe both on and off line, the criminal law must offer appropriate protection in both spaces. By studying the law and identifying any problems we can give government the full picture as it works to make the UK the
safest place to be online. The latest announcements follow the publication of the Government's
Internet Safety Strategy Green Paper last year which outlined plans for a social media code of practice. The aim is to prevent abusive behaviour online, introduce more effective reporting mechanisms to tackle bullying or harmful content, and give
better guidance for users to identify and report illegal content. The Government will be outlining further steps on the strategy, including more detail on the code of practice and transparency reports, in the spring. To support
this work, people working with children including teachers and school leaders will be given a new guide for online safety, to help educate young people in safe internet use. Developed by the UK Council for Child Internet Safety (UKCCIS), the toolkit describes the knowledge and skills for staying safe online that children and young people should have at different stages of their lives. Major online platforms including Google, Facebook and Twitter have also agreed to take forward a recommendation from the Committee on Standards in Public Life (CSPL) to provide specific support for Parliamentary candidates so that they can remain safe and secure while on these sites during election campaigns. These are important steps in safeguarding the free and open elections which are a key part of our democracy.

Notes
Included in the Law Commission's scope for their review will be the Malicious Communications Act and the Communications Act. It will consider whether difficult concepts need to be reconsidered in the light of technological
change - for example, whether the definition of who a 'sender' is needs to be updated. The Government will bring forward an Annual Internet Safety Transparency report, as proposed in our Internet Safety Strategy green paper. The
reporting will show:
- the amount of harmful content reported to companies
- the volume and proportion of this material that is taken down
- how social media companies are handling and responding to complaints
- how each online platform moderates harmful and abusive behaviour and the policies they have in place to tackle it.
Annual reporting will help to set baselines against which to benchmark companies' progress, and encourage the sharing of best practice between companies. The new social media code of practice will outline
standards and norms expected from online platforms. It will cover:
- The development, enforcement and review of robust community guidelines for the content uploaded by users and their conduct online
- The prevention of abusive behaviour online and the misuse of social media platforms -- including action to identify and stop users who are persistently abusing services
- The reporting mechanisms that companies have in place for inappropriate, bullying and harmful content, and ensuring they have clear policies and performance metrics for taking this content down
- The guidance social media companies offer to help users identify illegal content and contact online, and advise them on how to report it to the authorities, to ensure this is as clear as possible
- The policies and practices companies apply around privacy issues.
Comment: Preventing protest 7th February 2018. See article from indexoncensorship.org
The UK Prime Minister's proposals for possible new laws to stop intimidation against politicians have the potential to prevent legal protests and free speech that are at the core of our democracy, says Index on Censorship. One hundred years after the
suffragette demonstrations won the right for women to have the vote for the first time, a law that potentially silences angry voices calling for change would be a retrograde step. No one should be threatened with violence, or
subjected to violence, for doing their job, said Index chief executive Jodie Ginsberg. However, the UK already has a host of laws dealing with harassment of individuals both off and online that cover the kind of abuse politicians receive on social media
and elsewhere. A loosely defined offence of 'intimidation' could cover a raft of perfectly legitimate criticism of political candidates and politicians -- including public protest.
Appeals court finds that the Government's snooping law is an abuse of rights
31st January 2018

See article from theguardian.com
See article from openrightsgroup.org
The UK's mass digital surveillance regime preceding the snooper's charter has been found to be illegal by an appeals court. The case was brought by the Labour deputy leader, Tom Watson, in conjunction with Liberty, the human rights campaign group. The three judges said the Data Retention and Investigatory Powers Act 2014 (Dripa), which paved the way for the snooper's charter legislation, did not restrict the accessing of confidential personal phone and web browsing records to investigations of
serious crime, and allowed police and other public bodies to authorise their own access without adequate oversight. The judges said Dripa was inconsistent with EU law because of this lack of safeguards, including the absence of prior review by a court or
independent administrative authority. Responding to the ruling, Watson said: This legislation was flawed from the start. It was rushed through parliament just before recess without proper parliamentary scrutiny.
The government must now bring forward changes to the Investigatory Powers Act to ensure that hundreds of thousands of people, many of whom are innocent victims or witnesses to crime, are protected by a system of independent approval for access to
communications data. I'm proud to have played my part in safeguarding citizens' fundamental rights.
Martha Spurrier, the director of Liberty, said: Yet again a UK court has ruled the government's
extreme mass surveillance regime unlawful. This judgement tells ministers in crystal clear terms that they are breaching the public's human rights. She said no politician was above the law. When will the government stop bartering with judges and start
drawing up a surveillance law that upholds our democratic freedoms?
Matthew Rice of the Open Rights Group responded: Once again, another UK court has found another piece of Government surveillance
legislation to be unlawful. The Government needs to admit their legislation is flawed and make the necessary changes to the Investigatory Powers Act to protect the public's fundamental rights. The Investigatory Powers Act carves a
gaping hole in the public's rights. Public bodies able to access data without proper oversight, and access to that data for reasons other than fighting serious crime. These practices must stop, the courts have now confirmed it. The ball is firmly in the
Government's court to set it right.
Government sets up propaganda and fake news unit to counter Russian propaganda
25th January 2018

24th January 2018. See article from telegraph.co.uk
Theresa May is creating a new national security unit to counter supposed fake news and disinformation spread by Russia and other foreign powers, Downing Street has announced. The Prime Minister's official spokesman said the new national security
communications unit would build on existing capabilities and would be tasked with combating disinformation by state actors and others. The spokesman said: We are living in an era of fake news and competing narratives.
The government will respond with more and better use of national security communications to tackle these interconnected, complex challenges. To do this we will build on existing capabilities by creating a dedicated national
security communications unit. This will be tasked with combating disinformation by state actors and others.
Update: The new unit has already been dubbed the Ministry of Truth.
The government publishes its guidance to the new UK porn censor about notifying websites that they are to be censored, asking payment providers and advertisers to end their service, recourse to ISP blocks and an appeals process

22nd January 2018

See Guidance from the Secretary of State for Digital, Culture, Media and Sport to the Age-Verification Regulator for Online Pornography [pdf] from gov.uk
A few extracts from the document:

Introduction
- A person contravenes Part 3 of the Digital Economy Act 2017 if they make
pornographic material available on the internet on a commercial basis to persons in the United Kingdom without ensuring that the material is not
normally accessible to persons under the age of 18. Contravention could lead to a range of measures being taken by the age-verification regulator in relation to that person, including blocking by internet service providers (ISPs).
- Part 3 also gives the age-verification regulator powers to act where a person
makes extreme pornographic material (as defined in section 22 of the Digital Economy Act 2017) available on the internet to persons in the
United Kingdom.
Purpose

This guidance has been written to provide the framework for the operation of the age-verification regulatory regime in the following areas:

● Regulator's approach to the exercise of its powers;
● Age-verification arrangements;
● Appeals;
● Payment-services Providers and Ancillary Service Providers;
● Internet Service Provider blocking; and
● Reporting.

Enforcement principles

This guidance balances two overarching principles in the regulator's application of its powers under
sections 19, 21 and 23 - that it should apply its powers in the way which it thinks will be most effective in ensuring compliance on a case-by-case basis and that it should take a proportionate approach. As set out in
this guidance, it is expected that the regulator, in taking a proportionate approach, will first seek to engage with the non-compliant person to encourage them to comply, before considering issuing a notice under section 19, 21 or 23, unless there are
reasons as to why the regulator does not think that is appropriate in a given case.

Regulator's approach to the exercise of its powers

The age-verification consultation Child
Safety Online: Age verification for pornography identified that an extremely large number of websites contain pornographic content - circa 5 million sites or parts of sites. All providers of online pornography, who are making available pornographic
material to persons in the United Kingdom on a commercial basis, will be required to comply with the age-verification requirement . In exercising its powers, the regulator should take a proportionate approach. Section
26(1) specifically provides that the regulator may, if it thinks fit, choose to exercise its powers principally in relation to persons who, in the age-verification regulator's opinion:
- (a) make pornographic material or extreme pornographic material available on the internet on a commercial basis to a large number of persons, or a large number of persons under the age of 18, in the United Kingdom; or
- (b) generate a large amount of turnover by doing so.
In taking a proportionate approach, the regulator should have regard to the following:
- a. As set out in section 19, before making a determination that a person is contravening section 14(1), the regulator must allow that person an opportunity to make representations about why the determination should not be made. To ensure clarity and discourage evasion, the regulator should specify a prompt timeframe for compliance and, if it considers it appropriate, set out the steps that it considers that the person needs to take to comply.
- b. When considering whether to exercise its powers (whether under section 19, 21 or 23), including considering what type of notice to issue, the regulator should consider, in any given case, which intervention will be most effective in encouraging compliance, while balancing this against the need to act in a proportionate manner.
- c. Before issuing a notice to require internet service providers to block access to material, the regulator must always first consider whether issuing civil proceedings or giving notice to ancillary service providers and payment-services providers might have a sufficient effect on the non-complying person's behaviour.
To help ensure transparency, the regulator should publish on its website details of any notices under sections 19, 21 and 23.
Age-verification arrangements
Section 25(1) provides that the regulator must publish guidance about the types of arrangements for making pornographic material available that the regulator will treat as complying with section 14(1). This guidance is subject to a Parliamentary procedure. A person making pornographic material available on a commercial basis to persons in the United Kingdom must have an effective process in place to verify a user is 18 or over. There are various methods for verifying whether someone is 18 or over (and it is expected that new age-verification technologies will develop over time). As such, the Secretary of State considers that rather than setting out a closed list of age-verification arrangements, the regulator's guidance should specify the criteria by which it will assess, in any given case, that a person has met this requirement. The regulator's guidance should also outline good practice in relation to age verification to encourage consumer choice and the use of mechanisms which confirm age, rather than identity. The regulator is not required to approve individual age-verification solutions. There are various ways to age verify online and the industry is developing at pace. Providers are innovating and providing choice to consumers. The process of verifying age for adults should be concerned only with the need to establish that the user is aged 18 or above. The privacy of adult users of pornographic sites should be maintained and the potential for fraud or misuse of personal data should be safeguarded. The key focus of many age-verification providers is on privacy and specifically providing verification, rather than identification of the individual.
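As an aside, the guidance's distinction between verifying age and verifying identity can be sketched in code. The following is a minimal illustration under assumed conventions: the token format, function names and shared secret are all invented here, and nothing in it comes from the BBFC or any real age-verification vendor.

```python
# A minimal sketch of "verification, not identification": a (hypothetical)
# AV provider signs an assertion that says only "over 18", and the site
# checks the signature without ever learning who the user is.
# The token format, names and shared secret are all invented for illustration.
import hmac
import hashlib
import json
import time

AV_PROVIDER_SECRET = b"demo-secret-not-for-production"  # hypothetical shared key

def issue_over18_token(secret: bytes) -> str:
    """AV provider side: sign a claim that carries no identity data at all."""
    claim = json.dumps({"over18": True, "issued": int(time.time())})
    sig = hmac.new(secret, claim.encode(), hashlib.sha256).hexdigest()
    return f"{claim}|{sig}"

def verify_over18_token(token: str, secret: bytes, max_age_s: int = 3600) -> bool:
    """Site side: accept only a fresh, correctly signed over-18 claim."""
    claim, _, sig = token.rpartition("|")
    expected = hmac.new(secret, claim.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    payload = json.loads(claim)
    is_fresh = int(time.time()) - payload["issued"] <= max_age_s
    return payload.get("over18") is True and is_fresh

token = issue_over18_token(AV_PROVIDER_SECRET)
print(verify_over18_token(token, AV_PROVIDER_SECRET))  # True
```

In a real deployment the token would travel in a cookie or redirect parameter, and an asymmetric signature would let sites verify tokens without holding a secret shared with the provider. The point of the sketch is simply that an over-18 assertion need not carry any identity data at all.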
Payment-services providers and ancillary service providers
There is no requirement in the Digital Economy Act for payment-services providers or ancillary service providers to take any action on receipt of such a notice. However, Government expects that responsible companies will wish to withdraw services from those who are in breach of UK legislation by making pornographic material accessible online to children or by making extreme pornographic material available. The regulator should consider on a case-by-case basis the effectiveness of notifying different ancillary service providers (and payment-services providers). There is a wide range of providers whose services may be used by pornography providers to enable or facilitate making pornography available online and who may therefore fall under the definition of ancillary service provider in section 21(5)(a). Such a service is not limited to where a direct financial relationship is in place between the service and the pornography provider. Section 21(5)(b) identifies those who advertise commercially on such sites as ancillary service providers. In addition, others include, but are not limited to:
- a. Platforms which enable pornographic content or extreme pornographic material to be uploaded;
- b. Search engines which facilitate access to pornographic content or extreme pornographic
material;
- c. Discussion fora and communities in which users post links;
- d. 'Cyberlockers' and cloud storage services on which pornographic content or extreme pornographic material may be stored;
- e. Services including websites and App marketplaces that enable users to download Apps;
- f. Hosting services which enable access to websites, Apps or App marketplaces;
- g. Domain name registrars.
- h. Set-top boxes, mobile applications and other devices that can connect directly to streaming servers.
Internet Service Provider blocking
The regulator should only issue a notice to an internet service provider having had regard to Chapter 2 of this guidance. The regulator should take a proportionate approach and consider all actions (Chapter 2.4) before issuing a notice to internet service providers. In determining those ISPs that will be subject to notification, the regulator should take into consideration the number and the nature of customers, with a focus on suppliers of home and mobile broadband services. The regulator should consider any ISP that promotes its services on the basis of pornography being accessible without age verification, irrespective of other considerations. The regulator should take into account the child safety impact that will be achieved by notifying a supplier with a small number of subscribers and ensure a proportionate approach. Additionally, it is not anticipated that ISPs will be expected to block services to business customers, unless a specific need is identified.
Reporting
In order to assist with the ongoing review of the effectiveness of the new regime and the regulator's functions, the Secretary of State considers that it would be good practice for the regulator to submit to the Secretary of State an annual report on the exercise of its functions and their effectiveness. |
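Notably, the guidance never specifies how an ISP block should actually be implemented. One common approach is DNS-level filtering, roughly sketched below; the blocklist entry, the stand-in addresses and the resolve() function are invented for illustration, and real ISP deployments (DNS filtering, IP blackholing, proxy filtering) vary widely in detail.

```python
# An illustrative sketch of DNS-level blocking of the kind an ISP might
# apply after receiving a blocking notice. The blocklist entry, the
# stand-in addresses and this resolve() function are all invented here;
# real deployments (DNS filtering, IP blackholing, proxies) vary widely.
BLOCKED_DOMAINS = {"example-noncompliant-site.com"}  # hypothetical entry

def resolve(domain: str) -> str:
    """Answer a DNS query, diverting blocked domains to a block page."""
    parts = domain.lower().rstrip(".").split(".")
    # A domain is blocked if it, or any parent domain, is on the list.
    suffixes = {".".join(parts[i:]) for i in range(len(parts))}
    if suffixes & BLOCKED_DOMAINS:
        return "203.0.113.1"   # documentation-range IP standing in for a block page
    return "198.51.100.7"      # stand-in for a genuine upstream lookup

print(resolve("example-noncompliant-site.com"))      # diverted to block page
print(resolve("www.example-noncompliant-site.com"))  # subdomains diverted too
print(resolve("unblocked.example"))                  # resolved normally
```

The obvious weakness is that a user who switches to a third-party resolver or a VPN never touches the ISP's resolver at all, a point the government's own impact assessment later concedes when it quotes a filter-bypass figure of around 1%.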
|
Murray Perkins of the BBFC travels the world to explain how everyone else will have to comply with UK internet censorship
|
|
|
| 22nd January
2018
|
|
| See article from xbiz.com |
The US adult trade group, the Free Speech Coalition, introduced Murray Perkins at its inaugural Leadership Conference on Thursday. Perkins leads efforts for the UK's new age-verification censorship regime under the Digital Economy Act, and is the principal adviser for the BBFC, which last year signed on to assume the role of internet porn censor. Perkins traveled to the XBIZ Show on an informational trip specifically to offer education on the Digital Economy Act's regulatory powers; he continues on to Las Vegas next week and Australia the following week to speak with online adult entertainment operators. Perkins said:
The reason why I am here is to be visible, to give people an opportunity to ask questions about what is happening. I firmly believe that the only way to make this work is to work with, and not against, the adult entertainment industry. This is a challenge; there is no template, but we will figure it out. I am reasonably optimistic [the legislation] will work.
A team of classification examiners will start screening content for potential violations in the spring. (In a separate discussion with XBIZ, Perkins said that his army of examiners will total 15.) Perkins showed himself to be a bit naive, a bit insensitive, or a bit of an idiot when he spouted:
The Digital Economy Act will affect everyone in this room, one way or the other. However, the Digital Economy Act is not anti-porn -- it is not intended to disrupt an adult's journey or access to their content. [...BUT... it is likely to totally devastate the UK adult industry and hand over all remaining business to the foreign internet giant Mindgeek, who will become the Facebook/Google/Amazon of porn. Not to mention the Brits served on a platter to scammers, blackmailers and identity thieves].
|
|
Government is giving HMRC the right to break in and search your house without needing a warrant
|
|
|
| 14th January 2018
|
|
| See article from telegraph.co.uk
|
Customs officers are to gain permission to enter and search people's homes without a warrant, in a law change that a minister warns would give them more powers than the police. Kit Malthouse, a Conservative MP who became a minister in this week's reshuffle, said he is concerned about new powers for HM Revenue and Customs in the Finance Bill, which is currently going through Parliament. The changes were an extension of the old excise men's powers to deal with smugglers in ports and airports, he said, questioning whether such powers are appropriate today. He said: I hope that Ministers will think carefully about whether it might be more appropriate for a warrant to be obtained to access someone's premises, in the same way that the police do when they have suspicions.
|
|
Matt Hancock has been appointed the new secretary of state for censorship at the DCMS
|
|
|
| 9th January 2018
|
|
| See article from gov.uk |
Matt Hancock MP was appointed Secretary of State for Digital, Censorship, Media and Sport on 8 January 2018. He was previously Minister of State for Digital from July 2016 to January 2018. Matt Hancock is the MP for West Suffolk, having been elected in the 2010 general election. Since July 2016 he has served at DCMS as Minister of State for Digital, responsible for broadband, broadcasting, creative industries, cyber and the tech industry. The Secretary of State has overall responsibility for strategy and policy across the Department for Culture, Media and Sport. The department's main policy areas are:
- arts and culture
- broadcasting
- creative industries
- cultural property, heritage and the historic environment
- gambling and racing
- libraries
- media ownership and mergers
- museums and galleries
- the National Lottery
- sport
- telecommunications and online
- tourism
Hancock has already been working on the new law to serve up porn viewers on a platter to scammers, fraudsters, blackmailers and identity thieves, so there is unlikely to be a change of direction there. |
|
The government outlines the expected harms to people and businesses associated with its upcoming porn censorship law
|
|
|
| 5th January 2018
|
|
| See article
from theregister.co.uk See also Government risk assessment [pdf] from gov.uk
|
The UK government slipped out its impact assessment of the upcoming porn censorship law during the Christmas break. The new law requires porn websites to be blocked in the UK if they don't implement age verification. The measures are currently due to come into force in May, but that seems a tight schedule as even the rules for acceptable age verification systems have not yet been published. The report contains some interesting costings and an assessment of the expected harms to be inflicted on porn viewers and British adult businesses. The document notes the unpopularity of the age verification requirements, with a public consultation finding that 54% of respondents did not support the introduction of a law to require age verification. However, the government has forged ahead, with the aim of stopping kids accessing porn on the grounds that such content could distress them or harm their development. The government's censorship rules will be enforced by the BBFC, in its new role as the UK porn censor, although it prefers the descriptor: age-verification regulator. The government states that the censorship job will initially be funded by the government, at an assumed cost of £4.5 million, based upon a range of estimates from £1 million to £8 million. The government has bizarrely assumed that the BBFC will ban just 1 to 60 sites in a year. The additional work for ISPs to block these sites is estimated at £100,000 to £500,000 per ISP. This will probably be absorbed by larger companies, but will be an expensive problem for smaller companies that do not currently implement any blocking systems. Interestingly, the government claims that there won't be any impact on UK adult businesses, notionally because they should have already implemented age verification under ATVOD and Ofcom censorship rules. In reality it will have little impact on UK businesses because they have already been decimated by the ATVOD and Ofcom rules and have mostly closed down or moved abroad. The key section of the document summarising expected harms is as follows. The policy option set out above also gives rise to the following risks:
- Deterring adults from consuming content as a result of privacy/ fraud concerns linked to inputting ID data into sites and apps, also some adults may not be able to prove their age online;
- Development of alternative payment systems and technological work-arounds could mean porn providers do not comply with new law, and enforcement is impossible as they are based overseas, so the policy goal would not be achieved;
- The assumption that ISPs will comply with the direction of the regulator;
- Reputational risks including Government censorship, over-regulation, freedom of speech and freedom of
expression.
- The potential for online fraud could rise significantly, as criminals adapt approaches in order to make use of false AV systems / spoof websites and access user data;
-
The potential ability of children, particularly older children, to bypass age verification controls is a risk. However, whilst no system will be perfect, and alternative routes such as virtual private networks and peer-to-peer
sharing of content may enable some under-18s to see this content, Ofcom research indicates that the numbers of children bypassing network level filters, for example, is very low (ca. 1%).
- Adults (and some children) may be pushed towards using Tor and related systems to avoid AV, where they could be exposed to illegal and extreme material that they otherwise would never have come into contact with.
The list does not seem to include the potential for blackmail arising from user data sold by porn firms, or else stolen by hackers. And mischievously, politicians could be one of the groups most open to blackmail for money or favours. Another notable omission is that the government does not seem overly concerned about mass VPN usage. I would have thought that the secret services wanting to monitor terrorists would not be pleased if a couple of million people started to use encrypted VPNs. Perhaps it shows that the likes of GCHQ can already see into what goes on behind VPNs. |
|
Ludicrous government minister seems to think that the entire security bill for combating terrorism should be footed by social media companies because they don't take down extremist posts quickly enough
|
|
|
|
1st January 2018
|
|
| See article from thenational.ae
|
Britain's security minister Ben Wallace has threatened technology firms such as Facebook, YouTube and Google with punitive taxation if they fail to cooperate with the government on fighting online extremism. Ben Wallace said that Britain was spending hundreds of millions of pounds on human surveillance and de-radicalisation programmes because tech giants were failing to remove extremist content online quickly enough. Wallace said the companies were ruthless profiteers, despite sitting on beanbags in T-shirts, who sold on details of their users to loan companies but would fail to give the same information to the government. Because of encryption and because of radicalisation, the cost of that is heaped on law enforcement agencies, Wallace told the Sunday Times. I have to have more human surveillance. It's costing hundreds of millions of pounds. If they [tech firms] continue to be less than co-operative, we should look at things like tax as a way of incentivising them or compensating for their inaction. Because content is not taken down as quickly as they could do, we're having to de-radicalise people who have been radicalised. That's costing millions. They [the firms] can't get away with that and we should look at all options, including tax. Maybe it's a good idea to extract a significantly higher tax take from the vast sums of money being siphoned out of the UK economy straight into the hands of American big business. But it seems a little hopeful to claim that quicker blocking of terrorist-related material will 'solve' the UK's terrorism problem. One suspects that terrorism is a little more entrenched in society, and that it will continue pretty much unabated even if the government gets its way with quicker takedowns. There might even be scope for some very expensive legal bluff-calling, should expensive censorship measures be taken and the government's blame conjecture turn out to be provably wrong. |
|
|