The IWF is advised to drop its remit for investigating illegal adult porn on human rights grounds. It is not a law enforcement agency, and is not well placed to adjudicate on what is illegal and what is not.
At the Internet Watch Foundation's (IWF) annual general meeting, former Director of Public Prosecutions Lord Ken MacDonald reported on the results of his review into the human rights implications of the IWF's activities.
The independent review was commissioned by the IWF following suggestions that its activities might contravene the rights enshrined in the Human Rights Act.
Lord MacDonald put these fears partially to rest, praising the IWF for the respect and sensitivity with which it had balanced the rights to privacy and freedom of expression with the important task of combating the distribution of child abuse
images online. Lord MacDonald argued the IWF's activities in this area do impact on the rights to privacy and freedom of expression, but in a way which is proportionate and justifiable.
However, MacDonald found that IWF activity in other areas may be at risk of contravening human rights.
Firstly, he suggested that the IWF should explicitly limit its scope to child abuse content, removing other potentially illegal content from its remit. Child abuse content is unique in that it is universally condemned by the public and it
is relatively easy for IWF analysts to identify. Other illegal content, on the other hand, is a much more complicated and controversial area of law, where legal defences are potentially available to publishers that the IWF would not be well placed
to adjudicate. IWF decisions to remove or block such content would run a much higher risk of impinging unacceptably on human rights.
Secondly, MacDonald argued that the IWF should limit its scope to identifying and removing child abuse content, as opposed to investigating perpetrators, which is the proper role of law enforcement. In particular, he recommended abandoning for the
time being plans to investigate and disrupt the distribution of child abuse content over peer-to-peer networks. Investigation of peer-to-peer networks, argued MacDonald, inevitably involves a degree of intrusion that, while appropriate in the
context of a criminal investigation, is not an appropriate role for a private body such as the IWF.
The MacDonald report is currently before the IWF board, which will decide whether to accept or reject its recommendations. The IWF has promised to release the report publicly once a decision has been made.
The Internet Watch Foundation does not currently pursue images and videos on so-called peer-to-peer networks because it lacks permission from the Home Office. But it was announced on Monday that the watchdog would begin a six-month pilot scheme in collaboration with Google, Microsoft and the Child Exploitation and Online Protection agency (Ceop), so that the IWF can develop procedures to identify and blacklist links to child abuse material on P2P services.
Separately, David Cameron said the 'dark net', a general term for areas of the internet not accessible through search engines, was policeable. He also said that the government listening service GCHQ would be brought in to tackle child abuse images. Cameron told the BBC's Jeremy Vine:
There's been a lot in the news recently about the techniques, ability and brilliance of the people involved in the intelligence community, in GCHQ and the NSA in America. That expertise is going to be brought to bear to go after these revolting
people sharing these images [of child abuse] on the dark net, and making them available more widely.
A No 10 spokesperson said the details of the project had yet to be confirmed but roles and responsibilities between IWF and the National Crime Agency would be clarified in due course.
Jim Gamble, former head of Ceop, said, rather contradicting the need for all the Google work to block searches:
Nobody actually knows how much child abuse material is on the dark net, but the vast majority is shared on P2P. I think the government is masking the problem by not investing in real human resource.
IWF itself currently has only five staff monitoring the internet, though it has been given approval by its 110 industry members for a bigger budget, following a large donation by Google, and more resources from April 2014.
Culture Secretary Maria Miller announced that the Internet Watch Foundation (IWF) will be asked for the first time to actively
seek out illegal images of child abuse on the internet, working closely with the Child Exploitation and Online Protection (CEOP) Centre. At a summit of major internet service providers (ISPs), search engines, mobile operators and social media
companies, an agreement was reached that the IWF should, for the first time, work with CEOP to search for and block child sexual abuse images.
The UK's leading ISPs -- Virgin Media, BSkyB, BT and TalkTalk -- committed to provide a further £1 million to help fund this new proactive approach and to help tackle the creation and distribution of child sexual abuse material online.
Additionally, all the companies present signed up to a zero tolerance pledge on child sexual abuse imagery.
This will be the first time the IWF has been asked to take on a proactive approach to detect and act against criminal material. The IWF, working alongside CEOP and the wider internet industry, will ensure the UK leads the way in the global battle against child sexual abuse. New funding will allow more to be done to actively search for, block and remove child sexual abuse images.
This is a fundamental change in the way that child sexual abuse content will be tackled. It is estimated that there are one million unique images of child abuse online, yet only 40,000 reports are made to the IWF each year. The IWF will no longer have to wait for illegal material to be reported before it can take action, but will work with CEOP to take the fight to those behind child sexual abuse images.
It was agreed at the summit that:
A new proactive role would be taken on by the IWF, working with CEOP -- industry funding will increase to reflect this new role, with £1 million more provided by the four major ISPs over the next four years to tackle child sexual abuse;
Any relevant organisation which does not yet operate splash pages will introduce them by the end of the month, so that when someone tries to access a page blocked by the IWF, they will see a warning message (a 'splash page') stating that the page may contain indecent or illegal content;
All present would sign up to a 'zero tolerance' pledge towards child sexual abuse content on the internet;
The industry will report to the Culture Secretary within a month on how they can work to support the new proactive approach being taken on this issue through the use of their technology and expertise.
The summit also reviewed the considerable progress that has been made to protect children from harmful or inappropriate content online, including:
The four main ISPs are now offering an active choice on parental controls to all new customers;
The main public Wi-Fi providers have pledged to offer family friendly Wi-Fi in public places where children are likely to be;
The main ISPs have committed to delivering home network parental controls by the end of the year, allowing restrictions to be set - simply and quickly - on all devices in the home;
Internet providers are now regularly telling customers about parental controls through emails and their bills;
ISPs will email account holders when any filter settings are changed to ensure the change is approved by an adult.
The Culture Secretary will convene a further meeting, once the industry has reported on what more it can do to support this proactive approach, to ensure that real action is taking place.
Notes to Editors
The companies attending the summit were Yahoo, Google, Microsoft, Twitter, Facebook, BT, BSkyB, Virgin Media, TalkTalk, Vodafone, O2, EE and Three. They were joined by CEOP and the IWF.
A ComRes poll conducted among a representative sample of 2,058 British adults for the Internet Watch Foundation (IWF) shows the vast majority of people in Britain think that child sexual abuse content ('child pornography') should be removed from the internet (91%), as should computer-generated images or cartoons of child sexual abuse (85%).
83% are 'concerned' about child pornography, with 74% saying they are 'very concerned'.
77% are 'concerned' about computer generated images or cartoons of child sexual abuse;
73% are 'concerned' about terrorist websites;
68% are 'concerned' about very extreme/violent pornography;
62% are 'concerned' about hate websites (racist or homophobic);
61% are 'concerned' about suicide websites;
51% are 'concerned' about eating disorder websites.
The survey also revealed some differences in views between men and women, with women being more concerned than men across all categories of material:
Photographic child sexual abuse
Computer-generated images or cartoons of child sexual abuse
Very extreme/violent pornography
Hate websites, eg, racist/homophobic
Websites promoting suicide
Websites encouraging eating disorders
Throughout the whole of 2012, the Internet Watch Foundation logged just 73 UK webpages hosting child sexual abuse images or videos. This compares to 9,477 hosted in other countries around the world.