
UK Government Watch


2020: Jan-March


 

Take your medicine, stay home for 3 months, and don't worry about the depression...

UK government to censor quack cures for coronavirus


Link Here 31st March 2020
Full story: Coronavirus...Internet censorship and surveillance
The UK government is reported to be actively working with social media to remove coronavirus fake news and harmful content.

Social media companies have responded by introducing several sweeping rule changes that crack down on any dissenting opinions and push users to what they deem to be authoritative or credible sources of information. And now the BBC is reporting that the UK government will be working with these social media companies to remove what it deems to be fake news, harmful content, and misinformation related to the coronavirus.

The report doesn't specify how the UK government will determine what qualifies as fake news or harmful content.

Twitter has updated its rules around the coronavirus, targeting users who deny expert guidance. The company has also forced some users to remove jokes about the virus.

 

 

Extract: Online harms harms trade negotiations...

Eye-watering fines, or jailing directors, for not protecting kids from perceived online social media harms isn't sitting comfortably with negotiating a free trade deal with the US


Link Here 23rd February 2020

A Times article has popped up under the headline Boris Johnson set to water down curbs on tech giants.

It had all the hallmarks of an insider briefing, opening with the following:

The prime minister is preparing to soften plans for sanctions on social media companies amid concerns about a backlash from tech giants.

There is a very pro-tech lobby in No 10, a well-placed source said. They got spooked by some of the coverage around online harms and raised concerns about the reaction of the technology companies. There is a real nervousness about it.

Read the full article from johncarr.blog

 

 

Offsite Article: This is state censorship of the internet...


Link Here 20th February 2020
UK government plans to tackle online harms pose a huge threat to free speech. By Andrew Tettenborn

See article from spiked-online.com

 

 

The DCMS announces its full censorship team line up...

Captain Oliver Dowden leads centre forward Matt Warman and own goal kicker Nigel Huddleston. Taking offence will be Caroline Dinenage. John Whittingdale is on the right wing and Diana Barran takes the other place


Link Here 18th February 2020
The Department for Digital, Culture, Media and Sport (DCMS) has welcomed a number of new and returning ministers, following appointments made by Prime Minister Boris Johnson.

Oliver Dowden, Secretary of State for Digital, Culture, Media and Sport.

The Secretary of State has overall responsibility for strategy and policy across the department, as well as the department's management of Brexit.
Caroline Dinenage, Minister of State for Digital and Culture:
  • Online Harms and Security
  • Digital and Tech Policy including Digital Skills
  • Creative Industries
  • Arts and Libraries
  • Museums and Cultural Property
  • Festival 2022
John Whittingdale, Minister of State for Media and Data:
  • Media
  • Oversight of EU negotiations
  • Overall international strategy including approach to future trade deals
  • Data and the National Archives
  • Public Appointments
Matt Warman, Parliamentary Under Secretary of State for Digital Infrastructure:
  • Broadband Delivery UK (BDUK)
  • Gigabit delivery programme
  • Mobile coverage
  • Telecoms supply chain
  • Cyber Security
Nigel Huddleston, Parliamentary Under Secretary of State for Sport, Tourism and Heritage:
  • Sport
  • Commonwealth Games
  • Gambling and Lotteries
  • Tourism and Heritage
  • Lead Secondary Legislation Minister (including EU Exit SIs)
Diana Barran, DCMS Lords Minister and Parliamentary Under Secretary of State for Civil Society:
  • All DCMS business in the House of Lords
  • Ceremonials
  • Youth and Social Action
  • Office for Civil Society
  • Loneliness

 

 

New government internet censors...

Oliver Dowden takes over as the Culture Secretary, Julian Knight takes over the chair of the DCMS Select Committee and Ofcom is appointed as the AVMS internet censor


Link Here 16th February 2020
Oliver Dowden was appointed Secretary of State for Digital, Culture, Media and Sport on 13 February 2020.

He was previously Paymaster General and Minister for the Cabinet Office, and before that, Parliamentary Secretary at the Cabinet Office. He was elected Conservative MP for Hertsmere in May 2015.

The previous Culture Secretary Nicky Morgan will now be spending more time with her family.

There have been no suggestions that Dowden will diverge from the government's path of setting out a new internet censorship regime as outlined in its Online Harms white paper.

Perhaps another parliamentary appointment that may be relevant is that Julian Knight has taken over the Chair of the DCMS Select Committee, the Parliamentary scrutiny body overseeing the DCMS.

Knight seems quite keen on the internet censorship idea and will surely be spurring on the DCMS.

And finally, one more censorship appointment was announced by the Government: Ofcom is to regulate video-sharing platforms under the Audiovisual Media Services Directive.

Matt Warman, the Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport, announced:

We also yesterday appointed Ofcom to regulate video-sharing platforms under the audiovisual media services directive, which aims to reduce harmful content on these sites. That will provide quicker protection for some harms and activities and will act as a stepping stone to the full online harms regulatory framework.

In fact this censorship process is set to start in September 2020, and Ofcom has already produced a solution that shadows the age verification requirements of the Digital Economy Act. It may now need rethinking, though, as some of the enforcement mechanisms, such as ISP blocking, are no longer on the table. The mechanism also only applies to British-based online adult companies providing online video, of which there are hardly any left after being destroyed by the ATVOD regime.

 

 

Offsite Article: The Porn Block Failed...


Link Here 15th February 2020
Now the Next Ofcom Censorship Bandwagon Begins. By Jerry Barnett

See article from sexandcensorship.org

 

 

The fundamental online harm is for British people to speak freely amongst themselves...

The Government will effectively ban British websites from having forums or comment sections by imposing onerous, vague and expensive censorship requirements on those that defiantly continue.


Link Here 12th February 2020
The Government has signalled its approach to introducing internet censorship in a government response to consultation contributions about the Online Harms white paper. A more detailed paper will follow in the spring.

The Government has outlined onerous, vague and expensive censorship requirements on any British website that lets its users post content including speech. Any website that takes down its forums and comment sections etc will escape the nastiness of the new law.

The idea seems to be to force all speech onto a few US and Chinese social media websites that can handle the extensive censorship requirements of the British Government. No doubt this will give a market opportunity for the US and Chinese internet giants to start charging for forcibly moderated and censored interaction.

The Government has more or less committed to appointing Ofcom as the state internet censor who will be able to impose massive fines on companies and their fall guy directors who allow speech that the government doesn't like.

On a slightly more positive note, the government seems to have narrowed its censorship scope from any conceivable thing that could be considered a harm to someone somewhere down to a more manageable set that can be defined as harms to children.

The introductory sections of the document read:

Executive summary

1. The Online Harms White Paper set out the intention to improve protections for users online through the introduction of a new duty of care on companies and an independent regulator responsible for overseeing this framework. The White Paper proposed that this regulation follow a proportionate and risk-based approach, and that the duty of care be designed to ensure that all companies have appropriate systems and processes in place to react to concerns over harmful content and improve the safety of their users - from effective complaint mechanisms to transparent decision-making over actions taken in response to reports of harm.

2. The consultation ran from 8 April 2019 to 1 July 2019. It received over 2,400 responses ranging from companies in the technology industry including large tech giants and small and medium sized enterprises, academics, think tanks, children's charities, rights groups, publishers, governmental organisations and individuals. In parallel to the consultation process, we have undertaken extensive engagement over the last 12 months with representatives from industry, civil society and others. This engagement is reflected in the response.

3. This initial government response provides an overview of the consultation responses and wider engagement on the proposals in the White Paper. It includes an in-depth breakdown of the responses to each of the 18 consultation questions asked in relation to the White Paper proposals, and an overview of the feedback in response to our engagement with stakeholders. This document forms an iterative part of the policy development process. We are committed to taking a deliberative and open approach to ensure that we get the detail of this complex and novel policy right. While it does not provide a detailed update on all policy proposals, it does give an indication of our direction of travel in a number of key areas raised as overarching concern across some responses.

4. In particular, while the risk-based and proportionate approach proposed by the White Paper was positively received by those we consulted with, written responses and our engagement highlighted questions over a number of areas, including freedom of expression and the businesses in scope of the duty of care. Having carefully considered the information gained during this process, we have made a number of developments to our policies. These are clarified in the 'Our Response' section below.

5. This consultation has been a critical part of the development of this policy and we are grateful to those who took part. This feedback is being factored into the development of this policy, and we will continue to engage with users, industry and civil society as we continue to refine our policies ahead of publication of the full policy response. We believe that an agile and proportionate approach to regulation, developed in collaboration with stakeholders, will strengthen a free and open internet by providing a framework that builds public trust, while encouraging innovation and providing confidence to investors.

Our response Freedom of expression

1. The consultation responses indicated that some respondents were concerned that the proposals could impact freedom of expression online. We recognise the critical importance of freedom of expression, both as a fundamental right in itself and as an essential enabler of the full range of other human rights protected by UK and international law. As a result, the overarching principle of the regulation of online harms is to protect users' rights online, including the rights of children and freedom of expression. Safeguards for freedom of expression have been built in throughout the framework. Rather than requiring the removal of specific pieces of legal content, regulation will focus on the wider systems and processes that platforms have in place to deal with online harms, while maintaining a proportionate and risk-based approach.

2. To ensure protections for freedom of expression, regulation will establish differentiated expectations on companies for illegal content and activity, versus conduct that is not illegal but has the potential to cause harm. Regulation will therefore not force companies to remove specific pieces of legal content. The new regulatory framework will instead require companies, where relevant, to explicitly state what content and behaviour they deem to be acceptable on their sites and enforce this consistently and transparently. All companies in scope will need to ensure a higher level of protection for children, and take reasonable steps to protect them from inappropriate or harmful content.

3. Services in scope of the regulation will need to ensure that illegal content is removed expeditiously and that the risk of it appearing is minimised by effective systems. Reflecting the threat to national security and the physical safety of children, companies will be required to take particularly robust action to tackle terrorist content and online child sexual exploitation and abuse.

4. Recognising concerns about freedom of expression, the regulator will not investigate or adjudicate on individual complaints. Companies will be able to decide what type of legal content or behaviour is acceptable on their services, but must take reasonable steps to protect children from harm. They will need to set this out in clear and accessible terms and conditions and enforce these effectively, consistently and transparently. The proposed approach will improve transparency for users about which content is and is not acceptable on different platforms, and will enhance users' ability to challenge removal of content where this occurs.

5. Companies will be required to have effective and proportionate user redress mechanisms which will enable users to report harmful content and to challenge content takedown where necessary. This will give users clearer, more effective and more accessible avenues to question content takedown, which is an important safeguard for the right to freedom of expression. These processes will need to be transparent, in line with terms and conditions, and consistently applied.

Ensuring clarity for businesses

6. We recognise the need for businesses to have certainty, and will ensure that guidance is provided to help businesses understand potential risks arising from different types of service, and the actions that businesses would need to take to comply with the duty of care as a result. We will ensure that the regulator consults with relevant stakeholders to ensure the guidance is clear and practicable.

Businesses in scope

7. The legislation will only apply to companies that provide services or use functionality on their websites which facilitate the sharing of user generated content or user interactions, for example through comments, forums or video sharing. Our assessment is that only a very small proportion of UK businesses (estimated to account for less than 5%) fit within that definition. To ensure clarity, guidance will be provided by the regulator to help businesses understand whether or not the services they provide or functionality contained on their website would fall into the scope of the regulation.

8. Just because a business has a social media page that does not bring it in scope of regulation. Equally, a business would not be brought in scope purely by providing referral or discount codes on its website to be shared with other potential customers on social media. It would be the social media platform hosting the content that is in scope, not the business using its services to advertise or promote their company. To be in scope, a business would have to operate its own website with the functionality to enable sharing of user-generated content, or user interactions. We will introduce this legislation proportionately, minimising the regulatory burden on small businesses. Most small businesses where there is a lower risk of harm occurring will not have to make disproportionately burdensome changes to their service to be compliant with the proposed regulation.

9. Regulation must be proportionate and based on evidence of risk of harm and what can feasibly be expected of companies. We anticipate that the regulator would assess the business impacts of any new requirements it introduces. Final policy positions on proportionality will, therefore, align with the evidence of risk of harm and impact to business. Business-to-business services have very limited opportunities to prevent harm occurring to individuals and as such will be out of scope of regulation.

Identity of the regulator

11. We are minded to make Ofcom the new regulator, in preference to giving this function to a new body or to another existing organisation. This preference is based on its organisational experience, robustness, and experience of delivering challenging, high-profile remits across a range of sectors. Ofcom is a well-established and experienced regulator, recently assuming high profile roles such as regulation of the BBC. Ofcom's focus on the communications sector means it already has relationships with many of the major players in the online arena, and its spectrum licensing duties mean that it is practised at dealing with large numbers of small businesses.

12. We judge that such a role is best served by an existing regulator with a proven track record of experience, expertise and credibility. We think that the best fit for this role is Ofcom, both in terms of policy alignment and organisational experience - for instance, in their existing work, Ofcom already takes the risk-based approach that we expect the online harms regulator will need to employ.

Transparency

13. Effective transparency reporting will help ensure that content removal is well-founded and freedom of expression is protected. In particular, increasing transparency around the reasons behind, and prevalence of, content removal may address concerns about some companies' existing processes for removing content. Companies' existing processes have in some cases been criticised for being opaque and hard to challenge.

14. The government is committed to ensuring that conversations about this policy are ongoing, and that stakeholders are being engaged to mitigate concerns. In order to achieve this, we have recently established a multi-stakeholder Transparency Working Group chaired by the Minister for Digital and Broadband which includes representation from all sides of the debate, including from industry and civil society. This group will feed into the government's transparency report, which was announced in the Online Harms White Paper and which we intend to publish in the coming months.

15. Some stakeholders expressed concerns about a potential 'one size fits all' approach to transparency, and the material costs for companies associated with reporting. In line with the overarching principles of the regulatory framework, the reporting requirements that a company may have to comply with will also vary in proportion with the type of service that is being provided, and the risk factors involved. To maintain a proportionate and risk-based approach, the regulator will apply minimum thresholds in determining the level of detail that an in-scope business would need to provide in its transparency reporting, or whether it would need to produce reports at all.

Ensuring that the regulator acts proportionately

16. The consideration of freedom of expression is at the heart of our policy development, and we will ensure that appropriate safeguards are included throughout the legislation. By taking action to address harmful online behaviours, we are confident that our approach will support more people to enjoy their right to freedom of expression and participate in online discussions.

17. At the same time, we also remain confident that proposals will not place an undue burden on business. Companies will be expected to take reasonable and proportionate steps to protect users. This will vary according to the organisation's associated risk, first and foremost, size and the resources available to it, as well as by the risk associated with the service provided. To ensure clarity about how the duty of care could be fulfilled, we will ensure there is sufficient clarity in the regulation and codes of practice about the applicable expectations on business, including where businesses are exempt from certain requirements due to their size or risk.

18. This will help companies to comply with the legislation, and to feel confident that they have done so appropriately.

Enforcement

19. We recognise the importance of the regulator having a range of enforcement powers that it uses in a fair, proportionate and transparent way. It is equally essential that company executives are sufficiently incentivised to take online safety seriously and that the regulator can take action when they fail to do so. We are considering the responses to the consultation on senior management liability and business disruption measures and will set out our final policy position in the Spring.

Protection of children

20. Under our proposals we expect companies to use a proportionate range of tools including age assurance, and age verification technologies to prevent children from accessing age-inappropriate content and to protect them from other harms. This would achieve our objective of protecting children from online pornography, and would also fulfil the aims of the Digital Economy Act.

 

 

I don't believe the government's new internet harm vaccine will work!...

The UK government has been briefing the press about its upcoming internet censorship bill


Link Here 6th February 2020
The UK government has hinted at its thoughts on its internet censorship plans and has also been giving clues about the schedule.

A first announcement seems to be due this month, with the government planning a summer bill and implementation within about 18 months.

The plans are set to be discussed in Cabinet on Thursday and are due to be launched to coincide with Safer Internet Day next Tuesday when Baroness Morgan will also publish results of a consultation on last year's White Paper on online harms.

The unelected Nicky Morgan proposes that the new regime should mirror regulation in the financial sector, known as senior management liability, where firms have to appoint a fall guy director to take personal responsibility for ensuring they meet their legal duties. Such directors face fines and criminal prosecution for breaches.

Ofcom will advise on potential sanctions against the directors, ranging from enforcement notices and professional disqualification to fines and criminal prosecution. Under the plans, Ofcom will also draw up legally enforceable codes of practice setting out what the social media firms will be expected to do to protect users from loosely defined online harms that may not even be illegal.

Other legal harms to be covered by codes are expected to include disinformation that causes public harm such as anti-vaccine propaganda, self-harm, harassment, cyberbullying, violence and pornography where there will be tougher rules on age verification to bar children.

Tellingly, proposals to include real and actual financial harms, such as fraud, in the codes have been dropped.

Ministers have yet to decide whether to give the internet censor the power to block website access for UK internet users, but this option seems out of favour, maybe because it would result in massive numbers of people moving to the encrypted internet, making it harder for the authorities to snoop on people's internet activity.

 

 

The ethics of censorship...

DCMS group calls for new law in the Online Harms Bill to give the government oversight into algorithms used by social media companies


Link Here 4th February 2020
The Centre for Data Ethics and Innovation is part of the Department for Digital, Culture, Media & Sport. It is tasked by the Government with connecting policymakers, industry, civil society, and the public to develop the 'right' governance regime for data-driven technologies.

The group has just published its final report into the control of social media companies and their 'algorithms', in time for its suggestions to be incorporated into the government's upcoming internet censorship bill.

Maybe the term 'algorithm' has been used to imply some sort of manipulative menace that secretly drives social media. In fact the algorithm isn't likely to be far away from: Give them more of what they like, and maybe also try them with what their mates like. No doubt the government would prefer something more like: Give them more of what the government likes.
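For what it's worth, that mundane recipe fits in a few lines of code. The sketch below is purely a hypothetical caricature (the topics, weights and function names are all invented for illustration), not any platform's actual system:

    from collections import Counter

    def rank_feed(posts, user_likes, friend_likes, friend_weight=0.5):
        # Score each candidate post by how much the user engages with its
        # topic, plus a smaller weight for what their mates engage with.
        # All names and weights here are hypothetical illustration.
        def score(post):
            _, topic = post
            return user_likes[topic] + friend_weight * friend_likes[topic]
        return sorted(posts, key=score, reverse=True)

    # Example: the user likes football; their mates like politics.
    posts = [(1, "football"), (2, "politics"), (3, "gardening")]
    user = Counter({"football": 10, "gardening": 1})
    friends = Counter({"politics": 8, "football": 2})
    print(rank_feed(posts, user, friends))  # football first, then politics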

Anyway the press release reads:

The CDEI publishes recommendations to make online platforms more accountable, increase transparency, and empower users to take control of how they are targeted. These include:

  • New systemic regulation of the online targeting systems that promote and recommend content like posts, videos and adverts.

  • Powers to require platforms to allow independent researchers secure access to their data to build an evidence base on issues of public concern - from the potential links between social media use and declining mental health, to its role in incentivising the spread of misinformation

  • Platforms to host publicly accessible online archives for 'high-risk' adverts, including politics, 'opportunities' (e.g. jobs, housing, credit) and age-restricted products.

  • Steps to encourage long-term wholesale reform of online targeting to give individuals greater control over how their online experiences are personalised.

The CDEI recommendations come as the government develops proposals for online harms regulation.

The Centre for Data Ethics and Innovation (CDEI), the UK's independent advisory body on the ethical use of AI and data-driven technology, has warned that people are being left in the dark about the way that major platforms target information at their users, in its first report to the government.

The CDEI's year-long review of online targeting systems - which use personal information about users to decide which posts, videos and adverts to show them - has found that existing regulation is out of step with the public's expectations.

A major new analysis of public attitudes towards online targeting, conducted with Ipsos MORI, finds that people welcome the convenience of targeting systems, but are concerned that platforms are unaccountable for the way their systems could cause harm to individuals and society, such as by increasing discrimination and harming the vulnerable. The research highlighted that most concern related to social media platforms.

The analysis found that only 28% of people trust platforms to target them in a responsible way, and when they try to change settings, only one-third (33%) of people trust these companies to do what they ask. 61% of people favoured greater regulatory oversight of online targeting, compared with 17% of people who support self-regulation.

The CDEI's recommendations to the government would increase the accountability of platforms, improve transparency and give users more meaningful control of their online experience.

The recommendations strike a balance by protecting users from the potential harms of online targeting, without inhibiting the kind of personalisation of the online experience that the public find useful. Clear governance will support the development and take-up of socially beneficial applications of online targeting, including by the public sector.

The report calls for internet regulation to be developed in a way that promotes human rights-based international norms, and recommends that the online harms regulator should have a statutory duty to protect and respect freedom of expression and privacy.

And from the report:

Key recommendations

Accountability

The government's new online harms regulator should be required to provide regulatory oversight of targeting:

  • The regulator should take a "systemic" approach, with a code of practice to set standards, and require online platforms to assess and explain the impacts of their systems.

  • To ensure compliance, the regulator needs information gathering powers. This should include the power to give independent experts secure access to platform data to undertake audits.

  • The regulator's duties should explicitly include protecting rights to freedom of expression and privacy.

  • Regulation of online targeting should encompass all types of content, including advertising.

  • The regulatory landscape should be coherent and efficient. The online harms regulator, ICO, and CMA should develop formal coordination mechanisms.

The government should develop a code for public sector use of online targeting to promote safe, trustworthy innovation in the delivery of personalised advice and support.

Transparency

  • The regulator should have the power to require platforms to give independent researchers secure access to their data where this is needed for research of significant potential importance to public policy.

  • Platforms should be required to host publicly accessible archives for online political advertising, "opportunity" advertising (jobs, credit and housing), and adverts for age-restricted products.

  • The government should consider formal mechanisms for collaboration to tackle "coordinated inauthentic behaviour" on online platforms.

User empowerment

Regulation should encourage platforms to provide people with more information and control:

  • We support the CMA's proposed "Fairness by Design" duty on online platforms.

  • The government's plans for labels on online electoral adverts should make paid-for content easy to identify, and give users some basic information to show that the content they are seeing has been targeted at them.

  • Regulators should increase coordination of their digital literacy campaigns.

  • The emergence of "data intermediaries" could improve data governance and rebalance power towards users. Government and regulatory policy should support their development.

 

 

Offsite Article: Harmful government...


Link Here 26th January 2020
Internet regulation is necessary but an overzealous Online Harms bill could harm our rights. By Michael Drury and Julian Hayes

See article from euronews.com

 

 

A Brexit bounty...

UK Government wisely decides not to adopt the EU's disgraceful Copyright Directive that requires YouTube and Facebook to censor people's uploads if they contain even a snippet of copyrighted material


Link Here 25th January 2020
Full story: Copyright in the EU...Copyright law for Europe
Universities and Science Minister Chris Skidmore has said that the UK will not implement the EU Copyright Directive after the country leaves the EU.

Several companies have criticised the disgraceful EU law, which would hold them accountable for not removing copyrighted content uploaded by users.

EU member states have until 7 June 2021 to implement the new reforms, but the UK will have left the EU by then.

It was Article 13 which prompted fears over the future of memes and GIFs - stills, animated or short video clips that go viral - since they mainly rely on copyrighted scenes from TV and film. Critics noted that Article 13 would make it nearly impossible to upload even the tiniest part of a copyrighted work to Facebook, YouTube, or any other site.

Other articles give the news industry total copyright control over news material that has previously been widely used in people's blogs and posts commenting on the news.

Prime Minister Boris Johnson criticised the law in March, claiming that it was terrible for the internet.

Google had campaigned fiercely against the changes, arguing they would harm Europe's creative and digital industries and change the web as we know it. YouTube boss Susan Wojcicki had also warned that users in the EU could be cut off from the video platform.

 

 

Unsafe law...

Elspeth Howe introduces a House of Lords Bill Private Member's Bill to resurrect the deeply flawed and unsafe age verification requirements for adult porn websites


Link Here 22nd January 2020
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
The age verification for porn requirements included in the 2017 Digital Economy Act were formally cancelled in October 2019. The 2017 Act was deeply flawed in its omission of any effective requirements to keep porn users' identity and browsing data safe. In addition, the regime of enforcing the rules through BBFC censorship and ISP blocking was proving troublesome and expensive.

It is also interesting to note that the upcoming Online Harms bill has been stripped of its ISP blocking enforcement options. I suspect that the police and security forces would rather not see half the population hiding their internet usage behind Tor and VPNs just so they can continue accessing porn.

For whatever reasons, the government quite rightly considered that it would be a whole lot easier just to fine companies when they get it wrong and leave all the expensive technical details of how to do this to the websites themselves. (This approach has worked well for gambling websites, where AV has been achieved without having to employ censors to make them toe the line.)

So I don't think the government will be interested in supporting the virtue-signalling lords, and the bill will not be given time to get anywhere.

Elspeth Howe's short bill was introduced in the House of Lords on 21st January 2020 and reads:

Digital Economy Act 2017 (Commencement of Part 3) Bill

A bill to bring into force the remaining sections of Part 3 of the Digital Economy Act 2017.

1 The Secretary of State must make regulations under section 118(6) (commencement) of the Digital Economy Act 2017 to ensure that all provisions under Part 3 (online pornography) of that Act have come into force before 4 January 2021.

2 Extent, commencement and short title:

  1. This Act extends to England and Wales, Scotland and Northern Ireland.

  2. This Act comes into force on the day on which it is passed.

  3. This Act may be cited as the Digital Economy Act 2017 (Commencement of Part 3) Act 2020.

 

 

Commented: Verified as out of pocket...

Four companies hoping to profit from cancelled porn age verification go to court seeking compensation from the government


Link Here 18th January 2020
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
Four age verification companies have launched legal action to challenge the government's decision to cancel the censorship scheme requiring age verification for access to internet porn. The companies lodged a judicial review at the High Court on Thursday.

The Telegraph understands the companies are arguing the decision was an abuse of power as the move had been approved by parliament. They are also claiming damages, understood to be in the region of £3 million, for losses sustained developing age verification technology.

The four companies behind the judicial review - AgeChecked Ltd, VeriMe, AVYourself and AVSecure - are arguing the secretary of state only had power to choose when the scheme came into force, not scrap it in the form passed by Parliament.

The legal action has been backed by campaigners from the Children's Charities' Coalition for Internet Safety (CCCIS), which represents organisations including the NSPCC and Barnardo's.

The CEO of AVSecure, Stuart Lawley, a British tech entrepreneur who made his fortune in the dotcom boom, said he had personally lost millions creating the technology. He said the company, which is behind other parental control apps such as Ageblock, had been preparing for up to 10 million people signing up for the service on day one.

Comment: Age Verification Judicial Review endangers UK citizens' privacy

18th January 2020. See article from openrightsgroup.org

Reacting to the judicial review launched by tech companies to force age verification for adult content to be implemented, Jim Killock, Executive Director of the Open Rights Group, said:

These companies are asking us to trust them with records of millions of people's sexual preferences, with huge commercial reasons to use that data for profiling and advertising.

The adult industry has a terrible record on data security. We're being asked to hope they don't repeat the many, many times they have lost personal data, with the result that blackmail scams and worse proliferate.

The government did the responsible thing when it admitted its plans were not ready to proceed. Age Verification must not be pushed forward until there is compulsory privacy regulation put in place.

The companies behind the legal action are not subject to tight privacy regulations. Instead, the government can only ask for voluntary privacy commitments.

General data protection law is not sufficient for this industry as data breaches of this nature cannot be fixed by fines. They need to be prevented by the toughest and most specific regulation available.

 

 

Retirement age verified...

The official letter putting an end to the BBFC's designation as the UK internet porn censor


Link Here 8th January 2020
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
I was just wondering whether the ICO's Age Appropriate Design documentation had been published anywhere on the parliamentary website when I spotted this official document marking the end of the BBFC's tenure as the UK's porn censor.

Matt Warman, a minister at the DCMS, signed the revocation of the BBFC's internet censorship powers on 31st October 2019. The notice reads:

Department for Digital, Culture Media & Sport

NOTICE TO REVOKE DESIGNATION OF AGE-VERIFICATION REGULATOR UNDER DIGITAL ECONOMY ACT 2017

The Secretary of State for Digital, Culture, Media and Sport hereby revokes, in exercise of the power in section 16(3)(a) of the Digital Economy Act 2017 ("the 2017 Act"), the designation of the British Board of Film Classification as the age verification regulator made under section 16(1) of the 2017 Act on 21 February 2018 for the purposes of the following functions:

  • a. section 18 (regulator's power to require information);

  • b. section 19(2) and (11) (enforcement by regulator of section 14);

  • c. section 21 (notice by regulator to payment-services providers and ancillary service providers);

  • d. section 23 (regulator's power to require internet service providers to block access to material), subject to section 24 (no power to give notice under section 23(1) where detrimental to national security etc);

  • e. section 25 (guidance to be published by regulator);

  • f. section 26 (exercise of functions by regulator); and

  • g. section 28 (requirements for notices given by regulator under this Part) (to the extent that this applies in relation to the giving of notices by the British Board of Film Classification under the provisions listed in this paragraph).

Matt Warman
Minister for Digital and Broadband on behalf of the Secretary of State
31st October 2019.

 

 

Offsite Article: Britain's Digital Nanny State...


Link Here 7th January 2020
The way in which the UK is approaching the regulation of social media will undermine privacy and freedom of expression and have a chilling effect on Internet use by everyone in Britain. By Bill Dutton

See article from billdutton.me

