Google, Facebook, YouTube and other sites would be required by law to take down extremist material within 24 hours of receiving an
official complaint under an amendment put forward for inclusion in new counter-terror legislation.
The Labour MP Stephen Doughty's amendment echoes censorship laws that came into effect in Germany last year. However, the effect of the German law has been to enable no-questions-asked censorship of anything the government doesn't like. Social media
companies have no interest in challenging unfair censorship and find that the easiest and cheapest way to comply is to err on the side of the government, taking down anything they are asked to regardless of the merits of the case.
The counter-terrorism strategy unveiled by the home secretary, Sajid Javid, this month, said the Home Office would place a renewed emphasis on engagement with internet providers and work with the tech industry to seek more investment in
technologies that automatically identify and remove terrorist content before it is accessible to all.
But Doughty, a member of the home affairs select committee, said his amendment was needed because the voluntary approach was failing. He said a wide variety of extremist content remained online despite repeated warnings.
If these companies can remove copyrighted video or music content from companies like Disney within a matter of hours, there is no excuse for them to be failing to do so for extremist material.
Doughty's amendment would also require tech companies to proactively check content for extremist material and take it down within six hours of it being identified.
The proactive check of content alludes to the censorship machines being introduced by the EU to scan uploads for copyrighted material. Extending these to detect terrorist material, coupled with the err-on-the-side-of-caution approach, would
inevitably lead to the automatic censorship of any content that even uses the vocabulary of terrorism, regardless of whether it is news reporting, satire or criticism.
A row over Tory MP Christopher Chope blocking a backbench attempt to ban people from taking lewd photographs up women's
skirts has spurred the government to adopt the bill.
Speaking at PMQs this week, Theresa May confirmed the Government was taking on the upskirting campaign. She said:
Upskirting is a hideous invasion of privacy.
It leaves victims feeling degraded and distressed.
We will adopt this as a Government Bill, we will introduce the Bill this Thursday with a second reading before the summer recess.
But we are not stopping there. We will also ensure that the most serious offenders are added to the sex offenders register, and victims will be in no doubt their complaints will be taken very seriously and perpetrators will be punished.
And now extremist feminists are attempting to massively extend the bill to cover their own pet peeves.
Feminist campaigner and academic Clare McGlynn claimed the draft law created an opportunity to tackle so-called deepfake pornography. She said:
It would be easy to extend the bill so that it covers images which have been altered too and clearly criminalise a practice that victims say they find incredibly distressing.
And Labour MP Stella Creasy demanded that misogyny be made a hate crime to ensure the law keeps up with abuse of women. She claimed outlawing hatred of women - by bringing it into line with race and equality laws - would be more effective than one-off bans
for offences like upskirting.
Who is liable if a user posts copyrighted music to YouTube without authority? Is it the user or is it YouTube? The answer is of course that it is the user who would be held liable should copyright holders seek compensation. YouTube would be held
responsible only if they were informed of the infringement and refused to take it down.
This is the practical compromise that lets the internet work.
So what would happen if the government changed the liability laws so that YouTube was held liable for unauthorised music as soon as it was posted? There may be millions of views before it is spotted. If YouTube were immediately liable they might
have to pay millions in court judgements against them.
There is a lot of blather about YouTube having magic Artificial Intelligence that can detect copyrighted music and block it before it is uploaded. But this is nonsense: music is copyrighted by default, even a piece that has never been published and
is not held in any computer database.
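The limitation can be illustrated with a toy sketch. This is not how a real system like Content ID works internally (real systems use robust audio fingerprints rather than exact hashes, and the catalogue and track names below are invented for illustration), but the principle is the same: matching can only flag works that a rights holder has already registered, so an unpublished recording, though copyrighted by default, produces no match.

```python
import hashlib

def fingerprint(audio_bytes: bytes) -> str:
    # Stand-in for a real audio fingerprint: a plain hash of the raw bytes.
    return hashlib.sha256(audio_bytes).hexdigest()

# Hypothetical catalogue of works that rights holders have registered.
registered_catalogue = {
    fingerprint(b"label-registered-track"): "Registered Track (Big Label)",
}

def check_upload(audio_bytes: bytes) -> str:
    match = registered_catalogue.get(fingerprint(audio_bytes))
    if match:
        return f"match: {match}"
    # An unpublished work is still copyrighted by default, but the system
    # has nothing to compare it against, so it sails through undetected.
    return "no match: not in any database"

print(check_upload(b"label-registered-track"))        # flagged
print(check_upload(b"bedroom-demo-never-published"))  # passes undetected
```

Whatever the sophistication of the matching, the reference database is the hard limit: anything outside it is invisible to the system.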
YouTube does not have a database that contains all the licensing and authorisation details showing who exactly is allowed to post copyrighted material. Even big companies lie, so how could YouTube really know what could be posted and what could not?
If the law were to be changed, and YouTube were held responsible for the copyright infringement of their posters, then the only possible outcome would be for YouTube to use its AI to detect any music at all and block all videos which contain
music. The only music allowed to be published would be from the music companies themselves, and even then after providing YouTube with paperwork to prove that they had the necessary authorisation.
So when the government speaks of changes to liability law they are speaking of a massive step up in internet censorship as the likely outcome.
In fact the censorship power of such liability tweaks has been proven in the US. The recently passed FOSTA law changed liability law so that internet companies are now held liable for user posts facilitating sex trafficking. The law was sold
as a 'tweak' just to take action against trafficking. But it resulted in the immediate and almost total internet censorship of all user postings facilitating adult consensual sex work, and a fair amount of personal small ads and dating services as well.
The rub was that sex traffickers do not in any way specify that their sex workers have been trafficked, their adverts are exactly the same as for adult consensual sex workers. With all the artificial intelligence in the world, there is no way that
internet companies can distinguish between the two.
When they are told they are liable for sex trafficking adverts, then the only possible way to comply is to ban all adverts or services that feature anything to do with sex or personal hook ups. Which is of course exactly what happened.
So when UK politicians speak of internet liability changes and sex trafficking then they are talking about big time, large scale internet censorship.
And Theresa May said today via a government press release as reported in the Daily Mail:
Web giants such as Facebook and Twitter must automatically remove vile abuse aimed at women, Theresa May will demand today.
The Prime Minister will urge companies to utilise the same technology used to take down terrorist propaganda to remove rape threats and harassment.
Speaking at the G7 summit in Quebec, Mrs May will call on firms to do more to tackle content promoting and depicting violence against women and girls, including illegal violent pornography.
She will also demand the automatic removal of adverts that are linked to people-trafficking.
May will argue they must ensure women can use the web without fear of online rape threats, harassment, cyberstalking, blackmail or vile comments.
She will say: We know that technology plays a crucial part in advancing gender equality and empowering women and girls, but these benefits are being undermined by vile forms of online violence, abuse and harassment.
What is illegal offline is illegal online and I am calling on world leaders to take serious action to deal with this, just like we are doing in the UK with our commitment to legislate on online harms such as cyber-stalking and harassment.
In a world that is being ripped apart by identitarian intolerance of everyone else, it seems particularly unfair that men should be expected to happily put up with the fear of online threats, harassment, cyberstalking, blackmail or vile comments.
Surely laws should be written so that all people are treated totally equally.
Online platforms need to take responsibility for the content they host. They need to proactively tackle harmful behaviours and content. Progress has been made in removing illegal content, particularly terrorist material, but more needs to be done
to reduce the amount of damaging content online, legal and illegal.
We are developing options for increasing the liability online platforms have for illegal content on their services. This includes examining how we can make existing frameworks and definitions work better, as well as what the liability regime
should look like in the long-run.
Terms and Conditions
Platforms use their terms and conditions to set out key information about who can use the service, what content is acceptable and what action can be taken if users don't comply with the terms. We know that users frequently break these rules. In
such circumstances, the platforms' terms state that they can take action, for example they can remove the offending content or stop providing services to the user. However, we do not see companies proactively doing this on a routine basis. Too
often companies simply do not enforce their own terms and conditions.
Government wants companies to set out clear expectations of what is acceptable on their platforms in their terms, and then enforce these rules using sanctions when necessary. By doing so, companies will be helping users understand what is and is not acceptable.
We believe that it is right for Government to set out clear standards for social media platforms, and to hold them to account if they fail to live up to these. DCMS and Home Office will jointly work on the White Paper which will set out our
proposals for forthcoming legislation. We will focus on proposals which will bring into force real protections for users that will cover both harmful and illegal content and behaviours. In parallel, we are currently
assessing legislative options to modify the online liability regime in the UK, including both the smaller changes consistent with the EU's eCommerce directive, and the larger changes that may be possible when we leave the EU.
Culture Secretary Matt Hancock has issued the following press release from the Department for Digital, Culture, Media and Sport:
New laws to make social media safer
New laws will be created to make sure that the UK is the safest place in the world to be online, Digital Secretary Matt Hancock has announced.
The move is part of a series of measures included in the government's response to the Internet Safety Strategy green paper, published today.
The Government has been clear that much more needs to be done to tackle the full range of online harm.
Our consultation revealed users feel powerless to address safety issues online and that technology companies operate without sufficient oversight or transparency. Six in ten people said they had witnessed inappropriate or harmful content online.
The Government is already working with social media companies to protect users and while several of the tech giants have taken important and positive steps, the performance of the industry overall has been mixed.
The UK Government will therefore take the lead, working collaboratively with tech companies, children's charities and other stakeholders to develop the detail of the new legislation.
Matt Hancock, DCMS Secretary of State said:
Digital technology is overwhelmingly a force for good
across the world and we must always champion innovation and change for the better. At the same time I have been clear that we have to address the Wild West elements of the Internet through legislation, in a way that supports innovation. We
strongly support technology companies to start up and grow, and we want to work with them to keep our citizens safe.
People increasingly live their lives through online platforms so it's more important than ever that people are safe and parents can have confidence they can keep their children from harm. The measures we're taking forward today will help make
sure children are protected online and balance the need for safety with the great freedoms the internet brings just as we have to strike this balance offline.
DCMS and Home Office will jointly work on a White Paper with other government departments, to be published later this year. This will set out legislation to be brought forward that tackles a range of both legal and illegal harms, from
cyberbullying to online child sexual exploitation. The Government will continue to collaborate closely with industry on this work, to ensure it builds on progress already made.
Home Secretary Sajid Javid said:
Criminals are using the internet to further their exploitation and abuse of children, while terrorists are abusing these platforms to recruit people and incite atrocities. We need to protect our communities from these heinous crimes and vile
propaganda and that is why this Government has been taking the lead on this issue.
But more needs to be done and this is why we will continue to work with the companies and the public to do everything we can to stop the misuse of these platforms. Only by working together can we defeat those who seek to do us harm.
The Government will be considering where legislation will have the strongest impact, for example whether transparency or a code of practice should be underwritten by legislation, but also a range of other options to address both legal and illegal harms.
We will work closely with industry to provide clarity on the roles and responsibilities of companies that operate online in the UK to keep users safe.
The Government will also work with regulators, platforms and advertising companies to ensure that the principles that govern advertising in traditional media -- such as preventing companies targeting unsuitable advertisements at children -- also
apply and are enforced online.
It seems that the latest call for internet censorship is driven by some sort of revenge for having been snubbed by the internet companies.
The culture secretary said he does not have enough power to police social media firms after admitting only four of 14 invited to talks showed up.
Matt Hancock told the BBC it had given him a big impetus to introduce new laws to tackle what he has called the internet's Wild West culture.
He said self-policing had not worked and legislation was needed.
He told BBC One's Andrew Marr Show, presented by Emma Barnett, that the government just doesn't know how many of the millions of children using social media are not old enough for an account, and that he was very worried about age
verification. He told the programme he hopes we get to a position where all users of social media have to have their age verified.
Two government departments are working on a White Paper expected to be brought forward later this year. Asked about the same issue on ITV's Peston on Sunday, Hancock said the government would be legislating in the next couple of years
because we want to get the details right.
Update: Internet safety just means internet censorship
This week, Matt Hancock, Secretary of State for Digital, Culture, Media and Sport, announced the launch of a consultation on
new legislative measures to clean up the Wild West elements of the Internet. In response, music group BPI says the government should use the opportunity to tackle piracy with advanced site-blocking measures, repeat infringer policies, and new
responsibilities for service providers.
This week, the Government published its response to the Internet Safety Strategy green paper , stating unequivocally that more needs to be done to tackle online harm. As a result, the Government will now carry through with its threat to introduce
new legislation, albeit with the assistance of technology companies, children's charities and other stakeholders.
While emphasis is being placed on hot-button topics such as cyberbullying and online child exploitation, the Government is clear that it wishes to tackle the full range of online harms. That has been greeted by UK music group BPI with a request
that the Government introduces new measures to tackle Internet piracy.
In a statement issued this week, BPI chief executive Geoff Taylor welcomed the move towards legislative change and urged the Government to encompass the music industry and beyond. He said:
This is a vital opportunity to protect consumers and boost the UK's music and creative industries. The BPI has long pressed for internet intermediaries and online platforms to take responsibility for the content that they promote to users.
Government should now take the power in legislation to require online giants to take effective, proactive measures to clean illegal content from their sites and services. This will keep fans away from dodgy sites full of harmful content and
prevent criminals from undermining creative businesses that create UK jobs.
The BPI has published four initial requests, each of which provides food for thought.
The demand to establish a new fast-track process for blocking illegal sites is not entirely unexpected, particularly given the expense of launching applications for blocking injunctions at the High Court.
The BPI has taken a large number of actions against individual websites -- 63 injunctions are in place against sites that are wholly or mainly infringing and whose business is simply to profit from criminal activity, the BPI says.
Those injunctions can be expanded fairly easily to include new sites operating under similar banners or facilitating access to those already covered, but it's clear the BPI would like something more streamlined. Voluntary schemes, such as the one
in place in Portugal, could be an option but it's unclear how troublesome that could be for ISPs. New legislation could solve that dilemma, however.
Another big thorn in the side for groups like the BPI are people and entities that post infringing content. The BPI is very good at taking these listings down from sites and search engines in particular (more than 600 million requests to date) but
it's a game of whac-a-mole the group would rather not engage in.
With that in mind, the BPI would like the Government to impose new rules that would compel online platforms to stop content from being re-posted after it's been taken down while removing the accounts of repeat infringers.
Thirdly, the BPI would like the Government to introduce penalties for online operators who do not provide transparent contact and ownership information. The music group isn't any more specific than that, but the suggestion is that operators of
some sites have a tendency to hide in the shadows, something which frustrates enforcement activity.
Finally, and perhaps most interestingly, the BPI is calling on the Government to legislate for a new duty of care for online intermediaries and platforms. Specifically, the BPI wants effective action taken against businesses that use the Internet
to encourage consumers to access content illegally.
While this could easily encompass pirate sites and services themselves, this proposal has the breadth to include a wide range of offenders, from people posting piracy-focused tutorials on monetized YouTube channels to those selling fully-loaded
Kodi devices on eBay or social media.
Overall, the BPI clearly wants to place pressure on intermediaries to take action against piracy when they're in a position to do so, and particularly those who may not have shown much enthusiasm towards industry collaboration in the past.
Legislation in this Bill, to take powers to intervene with respect to operators that do not co-operate, would bring focus to the roundtable process and ensure that intermediaries take their responsibilities seriously, the BPI says.
The Culture Secretary Matt Hancock has warned that addictive video games have a negative and damaging impact on children's lives.
The comments were prompted by the phenomenal success of the survival shooter Fortnite. It has been downloaded more than 40 million times and has been endorsed by stars such as footballer Dele Alli and rapper Drake.
Hancock has also said that too much screen time is damaging to the lives of children. Matt Hancock told The Daily Telegraph: Too much screen time could have a damaging impact on our children's lives. Whether it's social media or video games,
children should enjoy them safely and as part of a lifestyle that includes exercise and socialising in the real world.
He also confirmed that his department is working alongside game developers to improve online safety.
It seems that Hancock is trying to dream up a few ideas designed to support the notion of requiring ID for internet users.
Nigel Huddleston, a Tory MP and parliamentary private secretary to Mr Hancock, also called on gaming companies to take more responsibility over addictive games. He also said he wouldn't want his own 12-year-old son playing the game because of
concerns it could lead to addiction.
High Court judges have given the UK government six months to revise parts of its Investigatory Powers Act. The government has been given a deadline of
1 November this year to make the changes to its Snooper's Charter.
Rules governing the British surveillance system must be changed quickly because they are incompatible with European laws, said the judges.
The court decision came out of legal action by human rights group Liberty. It started its legal challenge to the Act saying clauses that allow personal data to be gathered and scrutinised violated citizens' basic rights to privacy.
The court did not agree that the Investigatory Powers Act called for a general and indiscriminate retention of data on individuals, as Liberty claimed. However in late 2017, government ministers accepted that its Act did not align with European
law which only allows data to be gathered and accessed for the purposes of tackling serious crime. By contrast, the UK law would see the data gathered and held for more mundane purposes and without significant oversight.
One proposed change to tackle the problems was to create an Office for Communications Data Authorisations that would oversee requests for data from police and other organisations.
The government said it planned to revise the law by April 2019 but Friday's ruling means it now has only six months to complete the task.
Martha Spurrier, director of Liberty, said the powers to grab data in the Act put sensitive information at huge risk.
Javier Ruiz, policy director at the Open Rights Group which campaigns on digital issues, said:
We are disappointed the court decided to narrowly focus on access to records but did not challenge the general and indiscriminate retention of communications data.
This is so wrong on so many levels. Britain would undergo a mass tantrum.
How are parents supposed to entertain their kids if they can't spend all day on YouTube?
And what about all the privacy implications of letting social media companies have complete identity details of their users? It will be like Cambridge Analytica on speed.
Jeremy Hunt wrote to the social media companies:
Thank you for participating in the working group on children and young people's mental health and social media with officials from my
Department and DCMS. We appreciate your time and engagement, and your willingness to continue discussions and potentially support a communications campaign in this area, but I am disappointed by the lack of voluntary progress in those discussions.
We set three very clear challenges relating to protecting children and young people's mental health: age verification, screen time limits and cyber-bullying. As I understand it, participants have focused more on promoting work already underway and
explaining the challenges with taking further action, rather than offering innovative solutions or tangible progress.
In particular, progress on age verification is not good enough. I am concerned that your companies seem content with a situation where thousands of users breach your own terms and conditions on the minimum user age. I fear that you are
collectively turning a blind eye to a whole generation of children being exposed to the harmful emotional side effects of social media prematurely; this is both morally wrong and deeply unfair on parents, who are faced with the invidious choice of
allowing children to use platforms they are too young to access, or excluding them from social interaction that often the majority of their peers are engaging in. It is unacceptable and irresponsible for you to put parents in this position.
This is not a blanket criticism and I am aware that these aren't easy issues to solve. I am encouraged that a number of you have developed products to help parents control what their children can access online in response to Government's concerns
about child online protection, including Google's Family Link. And I recognise that your products and services are aimed at different audiences, so different solutions will be required. This is clear from the submissions you've sent to my
officials about the work you are delivering to address some of these challenges.
However, it is clear to me that the voluntary joint approach has not delivered the safeguards we need to protect our children's mental health. In May, the Department
for Digital, Culture, Media and Sport will publish the Government response to the Internet Safety Strategy consultation, and I will be working with the Secretary of State to explore what other avenues are open to us to pursue the reforms we need.
We will not rule out legislation where it is needed.
In terms of immediate next steps, I appreciate the information that you provided our officials with last month but would be grateful if you would set out in writing your companies' formal responses, on the three challenges we posed in November. In
particular, I would like to know what additional new steps you have taken to protect children and young people since November in each of the specific categories we raised: age verification, screen time limits and cyber-bullying. I invite you to
respond by the end of this month, in order to inform the Internet Safety Strategy response. It would also be helpful if you can set out any ideas or further plans you have to make progress in these areas.
During the working group meetings I understand you have pointed to the lack of conclusive evidence in this area — a concern which I also share. In order to address this, I have asked the Chief Medical Officer to undertake an evidence review on the
impact of technology on children and young people's mental health, including on healthy screen time. I will also be working closely with DCMS and UKRI to commission research into all these questions, to ensure we have the best possible empirical
basis on which to make policy. This will inform the Government's approach as we move forwards.
Your industry boasts some of the brightest minds and biggest budgets globally. While these issues may be difficult, I do not believe that solutions on these issues are outside your reach; I do question whether there is sufficient will to reach them.
I am keen to work with you to make technology a force for good in protecting the next generation. However, if you prove unwilling to do so, we will not be deterred from making progress.
A survey commissioned by the Royal Society for Public Health has claimed that four in five people want social media firms to be regulated
to ensure they do more to protect kids' mental health. Presumably the questions were somewhat designed to favour the wishes of the campaigners.
Some 45% say the sites should be self-regulated with a code of conduct but 36% want rules enforced by Government.
The Royal Society for Public Health, which surveyed 2,000 adults, warned social media can cause significant problems if left unchecked.
Health Secretary Jeremy Hunt has previously claimed that social media could pose as great a threat to children's health as smoking and obesity. And he has accused them of developing seductive products aimed at ever younger children.
The survey comes as MPs and Peers today launch an All Party Parliamentary Group (APPG) that will probe the effect of social media on young people's mental health. It will hear evidence over the coming year from users, experts and industry, with the
aim of drawing up practical solutions, including a proposed industry Code of Conduct. The APPG will be chaired by Labour MP Chris Elmore.
Ofcom has today opened seven new investigations into the due impartiality of news and current
affairs programmes on the RT news channel.
The investigations (PDF, 240.5 KB) form part of an Ofcom update, published today, into the licences held by TV Novosti, the company that broadcasts RT.
Until recently, TV Novosti's overall compliance record has not been materially out of line with other broadcasters.
However, since the events in Salisbury, we have observed a significant increase in the number of programmes on the RT service that warrant investigation as potential breaches of the Ofcom Broadcasting Code.
We will announce the outcome of these investigations as soon as possible. In relation to our fit and proper duty, we will consider all relevant new evidence, including the outcome of these investigations and the future conduct of the licensee.
UK Censorship Culture Secretary Matt Hancock met Facebook executives to warn them the social network is not above law.
Hancock told US-based Vice President of Global Policy Management Monika Bickert, and Global Deputy Chief Privacy Officer Stephen Deadman he would hold their feet to the fire over the privacy of British users.
Hancock pressed Facebook on accountability, transparency, micro-targeting and data protection. He also sought assurances that UK citizens' data was no longer at risk and that Facebook would be giving citizens more control over their data going forward.
Following the talks, Hancock said:
Social media companies are not above the law and will not be allowed to shirk their responsibilities to our citizens. We will do what is needed to ensure that people's data is protected and don't rule anything out - that includes further
regulation in the future.
The government has announced a new Offensive Weapons Bill, which will be brought forward within weeks. It will ban the sale of the most
dangerous corrosive products to under-18s and introduce restrictions on online sales of knives. It will also make it illegal to possess certain offensive weapons like zombie knives and knuckle-dusters in private.
The government notes that the new legislation will form part of the government's Serious Violence Strategy, which will be launched tomorrow.
Along with other issues the Serious Violence Strategy will examine how social media usage can drive violent crime and focus on building on the progress and relationships made with social media providers and the police to identify where we can take
further preventative action relevant to tackling serious violence.
When the strategy is launched tomorrow, the Home Secretary will call on social media companies to do more to tackle gang material hosted on their sites and to make an explicit reference to not allowing violent gang material including music and
video on their platforms.