A loose moral climate fed the paranoia and fear that allowed Nu Labour to flourish:
Former Home Secretary David Blunkett has called for internet providers to block pornography, ludicrously warning against a descent into Sodom and Gomorrah.
Blunkett backed an opt-in system of censorship, claiming the Lib Dems had been wrong to reject it at their party conference last week.
He was speaking at a Demos fringe meeting at the Labour conference:
The Lib Dems in Glasgow debated this and decided they were against automatic protection unless people chose to over-ride it, in terms of pornography on the internet and the protection of children. I think they were wrong.
I think we have a job in this country, in a civilised, free, open democracy, to protect ourselves from the most bestial activities and from dangers that would undermine a civilised nation.
Drawing a parallel with Germany before the rise of the Nazis, he suggested a loose moral climate had fed the paranoia and fear that had allowed Adolf Hitler to flourish:
In the late 1920s and early 1930s, Berlin came as near as dammit to Sodom and Gomorrah. There was a disintegration of what you might call any kind of social order.
People fed on that - they fed people's fears of it. They encouraged their paranoia. They developed hate about people who had differences, who were minorities.
There always has had to be some balance, in terms of the freedom of what we want to do for ourselves, and the mutual respect and the duty we owe to each other in a collective society.
ATVOD publishes determinations that 10 adult video on demand services breached statutory rules requiring UK video on demand providers to keep hardcore pornographic content out of reach of children, and announces a summit with the financial industry on blocking payments to non-UK porn services which fail to protect children
The findings by the Authority for Television On Demand (ATVOD) bring to almost 30 the number of porn operators against whom the regulator has acted over the past two years.
The 10 online video on demand services - Absolute Cruelty, Belted by Beauty, Bitch Slapped, CFNM, CMNM, Frankie and Friends, Jessica Pressley, The British Institution, The Casting Room and Young Dommes - were held to be in breach of a statutory rule which requires that material which might seriously impair those under 18 can only be made available if access is blocked to children.
The ten services offered any user on-demand access to explicit hardcore porn videos. The content of the videos was equivalent to that which could be sold only to adults in licensed sex shops if supplied on DVD.
The services each broke the statutory rules in two ways. Firstly, they allowed any visitor free, unrestricted access to hardcore pornographic or BDSM video promos/trailers or still images featuring real sex in explicit detail or strong BDSM activity. Secondly, access to the full videos was open to any visitor who paid a fee. As the services accepted commonplace payment methods - such as debit cards - which can be used by under-18s, the payment step did not amount to an effective restriction on access by children.
[ATVOD don't mention the fact that it is almost impossible to run a business that cannot accept commonplace payment systems and cannot show the products it is selling before purchase. Who's going to risk giving credit card details to a porn website that you can't see? It would be a scammers' charter.]
ATVOD counsels against complacency as most websites which allow UK children to access hardcore pornography operate from outside the UK and therefore fall outside ATVOD's remit.
ATVOD notes that Crown Prosecution Service guidance on the Obscene Publications Act makes clear that non-UK websites which offer unrestricted access to hardcore pornography and which can be accessed from the UK are likely to be considered to be operating
in breach of UK law. Such websites offer free content as a shop window to attract subscriptions mainly paid by credit and debit card. ATVOD has therefore questioned whether it can be right for businesses which are likely to be operating illegally
to draw revenues from UK bank and credit card accounts.
[But this legal argument relies on under-18s being depraved and corrupted by viewing hardcore porn. Campaigners insist that massive numbers of children access porn, yet the vast majority are clearly neither depraved nor corrupted, nor seriously harmed for that matter.]
ATVOD has raised this issue directly with those involved in facilitating such payments and is holding a summit with the UK Cards Association, the Payments Council, the British Bankers' Association, and leading payment scheme operators on 10 October to
discuss how the financial industry's response to the problem might evolve.
The latest ATVOD victims, with the full text of the ATVOD determinations, are:
Gender extremists say they will protest at a three-day conference for adult website operators which began in London today, with talks including State of the Industry: The War on Porn.
The US adult trade group and conference organiser, XBIZ, said the debate would look at the Government's plans.
The extremists from London Feminist Network and Object will wear overalls and masks alluding to their view that the adult industry is toxic. Julia Long from the London Feminist Network said:
At the very moment we are having a national debate on the harms of pornography, and not least the enormous amount of porn in teenagers' and children's lives, XBIZ is holding sessions specifically aimed at combating any attempts to curb access to internet
pornography. Pornographers don't care about the damage their industry does. Their only concern is profit.
Industry lawyer Myles Jackman told the conference website:
Successive governments have mounted a sustained campaign against the UK porn industry and now's the time to fight back.
The UK's internet video censor, ATVOD, wants to convince banks to withhold card payment services from hardcore websites that don't follow its overcautious restrictions on access to hardcore material by under-18s. ATVOD demands that only credit card holders (not debit card holders) should be allowed to access hardcore material.
ATVOD claims to have powers to fine British companies for making unrestricted hardcore material available but this unilateral approach is suffocating British business. The ATVOD approach to the card companies is an attempt to penalise foreign companies
outside of British control.
Financial organisations and the video censor will meet next month to discuss ATVOD's requests.
The Daily Telegraph reports that it is hoped a voluntary deal will be agreed with credit card firms, perhaps heeding the warning that: Government sources have made it clear that ministers would be prepared to consider legislation, if necessary.
ATVOD is expected to announce details of action it has taken against British businesses next week.
A summit will be held with the UK Cards Association, the British Bankers Association, the Payments Council and the leading credit card companies early next month, the paper reported.
Johnson said that the financial services firms had given a very positive response to the proposal. But surely the card companies will be aware that if they agree to ATVOD's requests, they will face an endless stream of moralists calling for the same treatment of their pet prohibitions.
Liberal Democrats have resoundingly rejected plans for an automatic block on internet pornography.
The motion, proposed by Floella Benjamin, the former children's television presenter, suggested all computers should block out pornography unless a user specifically opts to receive it.
The rejection of the motion will be seen as a warning to Nick Clegg, who has already signed up to coalition proposals that mean people will automatically have an anti-pornography filter on new computers unless they switch it off.
Despite the leadership's position, several Liberal Democrat speakers stood up to oppose the motion, with one arguing it was counter to all liberal instincts.
Jess Palmer was cheered by members after saying a pornography filter would have prevented her from discovering fan fiction with some adult themes and finding out about asexuality.
Julian Huppert, MP for Cambridge, successfully asked for the motion to be referred back to the party's policy committee for a rethink. He said there are some problems with children accessing internet pornography but this is not the solution.
Update: Crappy website blocking algorithms cited in vote against internet censorship policy
Cllr Sarah Brown, who represents Petersfield on Cambridge City Council, was successful in urging Liberal Democrat members to
reject calls for software companies to filter out pornography.
Adults would have had to opt in to view pornography under the proposals, which will now go back to party bosses for redrafting. The vote put the party at odds with the Conservatives.
Cllr Brown, who campaigns for transgender rights, said filters on public computers had stopped her from visiting her own blog - as well as websites on issues such as safe working conditions for prostitutes. She said:
As an equality campaigner I have seen first hand the effects of Internet censorship. I have been frustrated when trying to access LGBT news sites, or reading blogs of people campaigning for equality, sex education, breast feeding, safer working conditions for those involved in sex work, drugs information, and so on.
I have even been disallowed access to my own blog, which, by the way, was shortlisted for a Lib Dem Voice award this year, because, apparently, it contains adult content.
Perhaps campaigning for equal rights for vulnerable and abused minorities is adult content, but so-called porn filters shouldn't be blocking it.
She said the motion had good intentions, but argued:
In seeking to protect children from porn, automated filters will block political campaigners, satire, support sites for victims of homophobic bullying, sexual abuse and eating disorders, breast feeding campaigners and the blogs of members of this party.
It is profoundly illiberal and will cause real harm to things of value.
Nominet has launched a review of its registration policy for .uk domain names. The scope of the review focuses on whether there should be restrictions
on the words and expressions permitted in .uk domain name registrations.
Nominet has had an open policy on domain registrations since 1996, which has played a key role in promoting a dynamic and open internet in the UK.
However, concerns over this approach have been raised by an internet safety commentator and subsequently reported in the Daily Mail. Nominet was also contacted by the Department for Culture, Media and Sport in relation to this issue and is keeping the department informed of its actions.
The review is to be independently chaired by former Director of Public Prosecutions Ken Macdonald QC.
Lord Macdonald will work with Nominet's policy team to conduct a series of meetings with key stakeholders, and to review and assess wider contributions from the internet community, which should be received by 4 November 2013. The goal is to deliver a
report to Nominet's board in December of this year, which will be published shortly thereafter.
Nominet are now seeking contributions from the public via this online form
in relation to this policy review.
The ISP TalkTalk has told education minister Sarah Teather that the government should downplay its focus on pornography blocking and try to stop suicide sites instead, the Guardian has learned.
In a meeting with Teather, who as an MP led a campaign against the sexualisation of children, TalkTalk chief executive Dido Harding said:
Suicide is more important to parents than porn, so why mandate [filters against] porn and not suicide?
According to notes from the meeting in May 2012, released under a Freedom of Information request, Harding was critical of the government's plan to make users choose whether to opt in or out of being able to access sites designated as porn.
David Cameron has said that he wants to see ISPs being more proactive over pornography. But a source at one ISP criticised Cameron's overt focus on pornography and claimed politicians and the media are absolutely obsessed with it .
A TalkTalk spokesperson said suicide was the most commonly blocked subject matter by customers using its Homesafe content filtering software, followed by self-harm, pornography, weapons and violence in that order.
By 2010, suicide had become the single biggest cause of death for those aged 15-49 in the developed world, according to the Institute for Health Metrics and Evaluation.
Today the BBFC becomes the new regulator of mobile content, replacing the Independent Mobile Classification Body, which had regulated
this content since 2004. From 2 September, the BBFC will provide the UK mobile network operators EE, O2, Three and Vodafone with a new independent Classification Framework for content accessed via their mobile networks. Mobile operators will use this as a basis for their code of practice for content, meaning content that would be age rated 18 by the BBFC can be put behind access filters.
The Classification Framework designed by the BBFC allows mobile operators to classify their own commercial content and to calibrate the filters they use to restrict content accessible by children via a mobile operator's Internet access service. Such
content will include pornography and other adult sexual content, pro-ana websites and content which promotes or glorifies discrimination or real life violence.
The BBFC's new partnership will better enable EE, O2, Three and Vodafone to make consistent, evidence based and transparent decisions about the use of Internet filters and will make a significant contribution to protecting children from unsuitable and
even harmful content accessed through their mobile devices.
It seems that the BBFC have just patched up their film classification guidelines and ignored the consequences of trying to apply this to a much broader medium such as a large website.
Back in 2004 when the IMCB were in charge, the rules were envisaged to control video clips and the like provided by mobile phone companies, but David Cooke's introduction seems to suggest that the scope has been extended to take in internet websites in general.
It makes sense to speak of 'repeated' use of the word 'cunt' for a 90 minute film or 15 minute video clip, but how does this apply to a massive website such as the Guardian newspaper? It will have many uses of the word 'cunt' spread thinly throughout thousands of pages. Is it OK to use asterisked spellings such as 'c**t'?
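The per-work threshold problem can be sketched in a few lines. A hypothetical 'repeated use' cut-off that makes sense for a single film script fires just as readily on diffuse usage aggregated across a site's pages. The threshold value, stand-in word and page texts below are invented for illustration, not taken from any BBFC guideline:

```python
# Illustrative sketch: a per-work frequency threshold (plausible for one
# film script) misfires when naively applied to a large website's pages
# in aggregate. Threshold, stand-in word and texts are invented examples.
import re

THRESHOLD = 4  # hypothetical "repeated use" cut-off per classified work


def count_strong_language(text):
    """Count occurrences of a (stand-in) strong word, ignoring case."""
    return len(re.findall(r"\bdamn\b", text, re.IGNORECASE))


def classify(texts):
    """Rate '18' if the total strong-language count meets the threshold."""
    total = sum(count_strong_language(t) for t in texts)
    return "18" if total >= THRESHOLD else "PG"


film_script = ["Damn! Damn it all. Damn. Damn."]  # one work, 4 concentrated uses
site_pages = ["Damn weather today."] * 4          # 4 uses spread across pages

print(classify(film_script))  # concentrated usage trips the threshold
print(classify(site_pages))   # diffuse site-wide usage trips it identically
```

Both inputs are rated 18: a count that distinguishes films cannot tell four uses in ninety minutes from four uses spread across thousands of pages.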
The BBFC speak of references to porn terms being 18 rated but how would this apply to a list of R18 DVDs with titles and cuts using explicit porn terminology?
The BBFC opts out of discussing how effective age restrictions are, such as the self-declared age check used on the BBFC's own website. Does such an age gate mean that the BBFC's porn terms don't trigger an 18 rating? (Can other websites use this same technique?)
The BBFC doesn't mention anything about links to other websites. Does a link to a porn website mean that the linking website is 18 rated?
And what about pixellated nudity in sex scenes?
The questions are endless and the BBFC document is woeful at answering even the most basic of them.
Perhaps the BBFC could provide a few illustrations such as how it classifies its own website, or how it would classify the Daily Mail website? Enquiring minds need to know.
BT has sought greater legal clarity from the Government in relation to the implementation of website blocking as mandated by the government's internet filtering plans.
According to the Financial Times:
BT executives met with Oliver Letwin MP recently to discuss a range of policy issues, a BT spokesperson said. During this meeting the issue of filters came up and we expressed a view that greater legal clarity would be welcome given external
legal advice we have received. We have made this point several times during the past year as it is important that any plans are practical and not unintentionally derailed.
Under the Regulation of Investigatory Powers Act (RIPA), the interception of communications is generally prohibited. It is only legal to intrude on private communications if you have a warrant or both the sender and recipient of information
consent to the activity, even if the interception is done unintentionally.
Telecoms firms are, however, allowed to intercept communications in line with RIPA if it takes place for purposes connected with the provision or operation of that service or with the enforcement, in relation to that service, of any
enactment relating to the use of postal services or telecommunications services.
Perhaps BT should also consider the legal liability for businesses trashed by their websites being blocked by cheapo keyword checking algorithms. To date these have a long history of failure resulting in unfair and negligent blocks. ISPs have probably got away with it in the past because the algorithms have been used for requested child protection, where users were probably happy with a 'better safe than sorry' approach. But in the next round users will be expecting ISPs to block only what they sign up for.
The current crop of website blocking options wildly overblock, with a safety-first approach that blocks websites over totally trivial uses of, e.g., strong language. It is not clear if the Government or ISPs are intending to upgrade their filters to prevent businesses being trashed over negligent website blocking decisions by automated software. But presumably they will stick with the current crap. A related issue is that educational websites are likewise being blocked merely for using words associated with sexuality.
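The kind of cheapo keyword checking criticised above is easy to reproduce. A minimal sketch, with an invented blocklist and invented example pages (not any ISP's actual list), shows a bare-substring filter treating a health charity and a cookery blog exactly like a porn site:

```python
# Minimal sketch of substring-based keyword blocking of the kind
# criticised above. Blocklist and pages are invented examples.

BLOCKLIST = {"sex", "porn", "breast"}


def is_blocked(page_text):
    """Block the page if any blocklisted keyword appears anywhere in it."""
    text = page_text.lower()
    return any(word in text for word in BLOCKLIST)


pages = {
    "porn-site": "Hardcore porn videos, join now",
    "sex-education": "Sex education for teenagers: consent and contraception",
    "health-charity": "Breast cancer screening: what to expect",
    "cookery-blog": "Roast chicken breast with tarragon",
}

for name, text in pages.items():
    print(name, "BLOCKED" if is_blocked(text) else "allowed")
```

Every one of the example pages comes back blocked, including the cookery blog, because the filter matches substrings with no notion of context.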
David Cameron's plan for UK households to block internet porn with default search filters will be very damaging for LGBT people and vulnerable adults who could be denied access to legitimate sexual health and education sites, a group of
authors and journalists has warned.
In an open letter to the Prime Minister, prominent figures including the Belle de Jour writer Brooke Magnanti and feminist blogger and author Zoe Margolis, warned that the Government was taking:
A dangerous and misguided approach to internet safety. Focusing on a default 'on' filter ignores the importance of sex and relationship education and sexual health. Worse, you are giving parents the impression that if they install Internet
filters they can consider their work is done.
They point out that faults with existing internet service provider filters have been reported numerous times and warn that any default filters could:
Unintentionally block important sites related to sexual health, LGBT issues, or sex and relationship education. This will be very damaging for LGBT young people, for example, or vulnerable adults who may be cut off from important support and
advice, in particular those with abusive partners who are also the Internet account holder.
Lee Maguire, technical officer at the civil liberties organisation the Open Rights Group, said that filters could never distinguish:
Between sites that seek to titillate and those with frank discussion of sexuality.
Sites dealing with issues surrounding sexuality are likely to fall foul of miscategorisation as they often contain certain keywords that filters see as inappropriate for children. Even when humans categorise sites, categories will often be set by
individuals with their own cultural values.
The open letter, which was also signed by the science-fiction writer Charles Stross and the New Statesman journalist Laurie Penny, said that by promising families one click to protect the whole family, the Prime Minister was:
Giving parents the impression that if they install Internet filters they can consider their work is done. We urge you instead to invest in a programme of sex and relationship education that empowers young people and to revisit the need for this
topic to be mandatory in schools. Please drop shallow headline grabbing proposals and pursue serious and demonstrably effective policies to tackle abuse of young people.
The parliamentary Culture, Media and Sport Committee has announced another inquiry into Online Safety:
Despite technological innovation and an increase in public understanding of dangers, the online world continues to pose hazards, from images of child abuse to trolling. These dangers are the corollary of the immense benefits provided by unimpeded communication and free speech, so any attempts to mitigate harms have to be proportionate and, where possible, avoid disadvantageous consequences.
The Culture, Media and Sport Committee has decided to investigate a number of aspects of online safety that are currently raising concerns, in particular:
How best to protect minors from accessing adult content;
Filtering out extremist material, including images of child abuse and material intended to promote terrorism or other acts of violence;
Preventing abusive or threatening comments on social media.
The Committee invites written evidence from those who wish to contribute to the inquiry.
Social networking site Ask.fm has unveiled changes to make its site safer after recent online bullying cases.
It said it would view all reports within 24 hours, make the report button more visible, and include bullying and harassment as a category for a report. It said some of the changes would be live on the site by September.
Ask.fm said it would:
Hire more staff, including a safety officer, to moderate comments on the site
Create a bullying/harassment category for reported comments, alongside spam or scam, hate speech, violence and pornographic content
Raise the visibility of a function to opt out of receiving anonymous questions
Limit the number of features unregistered users were able to access, and
require an email address upon sign-up for registered users
The UK Safer Internet Centre, which promotes the safe use of technology, said it was delighted by Ask.fm's proposed changes, and added the increased visibility of the anonymous opt-out option was an important development. We
strongly advise users, especially children, to switch off anonymous questions, and to report any abuse they see on the site, the group said.
The web inevitably makes available some content which is unsuitable or inappropriate for children to access. Some of this will be illegal, but much more will not, or may be suitable, say, for over-13s or over-16s only. A traffic light system may therefore struggle to distinguish between these and runs the risk of imposing the strictest warning on masses of content.
A greater concern however, is how the new system will guard against becoming a tool to enable prejudices of one kind or another to be played out. The system can only operate if it is the crowd's decision which counts - the reason this is even
being considered is because there is too much content for a regulator or platform to consider. Relying on the crowd assumes that a collective consciousness emerges from the great mass of web users and their shared values, rather than a set of
subjective reactions. This is a dangerous assumption. As a recent MIT study reported in Science suggests, the wisdom of the crowd may be a myth, its mentality more akin to that of a mob or herd.
A man using the British Library's wi-fi network was denied access to an online version of Shakespeare's Hamlet because the text contained violent content.
Author Mark Forsyth was writing his book in the library, and needed to check a line from the famous play. He revealed on his blog that the filter had logged his attempt to access the page.
The British Library said the fault was caused by a newly installed wi-fi service from a third-party provider.
A spokesperson for the British Library said Hamlet had since been made accessible.
Internet filters have recently come under increased scrutiny, after the government announced that pornography will be automatically blocked by UK internet providers unless customers choose otherwise. In general, the merest hint of a few words alluding to adult content can be enough to trigger a block. The software errs on the side of caution and unfairly blocks many websites.
And of course these companies show little concern about legitimate businesses that suffer as a result.
Prof Ross Anderson, a security expert at Cambridge University, told the BBC that internet filters were pointless and that it was completely inappropriate to have one in the British Library. He added:
Everything that is legal should be available over the library's wi-fi network. The only things they should block are the few dozen books against which there are court judgements in the UK. One of the functions of deposit libraries is to keep
everything, including smut.
Meanwhile one filter maker has a bit of a Gerald Ratner moment:
Some customers of newly filtered ISPs are finding that porn is still getting through, but bona fide sites are being blocked. That's because filter algorithms struggle to distinguish between porn and legitimate sites, like lingerie retailers.
None of these systems are perfect, says George Anderson from online filtering security firm Webroot:
If you're an underwear site that's pretty close [to a porn site] and you get blocked because of this ban, that's going to cause issues.
David Cameron's plan to protect children from obscene material online has been dismissed as absolutely ridiculous by
Wikipedia co-founder Jimmy Wales. He said:
It's an absolutely ridiculous idea. It won't work. The software you would use to implement this doesn't work.
My view is that instead of spending literally billions of pounds, billions of dollars, snooping on ordinary people and gathering up all of this data in an apparently fruitless search for terrorists, we should devote a significant proportion of
that to dealing with the real criminal issues online - people stealing credit card numbers, hacking into websites and things like that.
Unfortunately we're not seeing a lot of that. We see a lot of flash and a lot of snooping. But this is, at the end of the day, going to take an investment in real, solid police work.
Wales said problems like online child abuse, hacking social media sites and abusive or threatening messages could be tackled without the introduction of new legislation.
Wales also spoke of the issue of abusive tweets. He suggested that Twitter should make it easier for users to report abuse, but rejected calls for tighter censorship of the social network. He said:
When you think about rules about verbal threats, human society has a long history of rules and laws around this, and those rules and laws are very well thought-out. They deal with complicated cases.
I do think that Twitter has needed in the past to do more to give people more control of the environment, to allow faster means for people to complain and to have people behaving badly exposed, blocked or arrested as necessary.
But it is not like we don't have a law against threatening people. We do, and people are quite rightly being called up on this.
The DCMS has published an official wide ranging paper on internet and communications policy. Many of the censorship aspects have already been described by David Cameron in his recent speech. Here are a few paragraphs fleshing out some of the
proposed censorship ideas:
Material Promoting Terrorism
The Prime Minister has convened an Extremism Task Force which will be looking closely, in the coming months, at the role the communications industry can and should play in reducing the availability of material promoting terrorism online.
A watershed for internet TV
We want to ensure that the living room remains a safe space for children.
TV remains central to our lives, with people in the UK watching on average more than four hours of broadcast TV every day. Families still get together to sit around the television and watch the latest period drama, talent competition, or catch
the latest episode of their favourite soap.
But increasingly, set-top boxes and TVs connected to the internet enable programmes and films to be viewed on-demand, to fit viewing around our own schedules. These can fall outside of regulatory frameworks. People tend to consider connected
TVs to be a TV-like experience and expect to be more protected than they are from content accessed through PCs and laptops. Yet, the technology means that it is easy to flick between regulated and unregulated spaces. Since this is not
always clear, this increases the risk of people inadvertently accessing content that may be offensive, inappropriate, or harmful to children.
The technology is already available to enable people to be provided with more information about programmes, and for locks to be put in place to prevent post- watershed programmes from being viewed by children on-demand. But more needs to
be done to make sure that these practices are adopted more widely, and to make sure that tools, like pin-protection, are straightforward and easy for people to use.
We also want it to be clear to people when they are watching TV in a protected, regulated space, and when they move with just a few clicks to an unregulated area of the internet. We want industry, broadcasters, manufacturers and platform
providers, to lead the development of consumer tools in this area, working with regulators to consider what mechanisms can be applied to clearly label regulated and unregulated content. One such mechanism may be, for example, using the electronic programme guide itself to define the protected space. We will work with industry to ensure that best practice is developed and can be shared and standardised. Given this is an area where we are seeing rapid developments, we will keep progress under close review, and if necessary, we will consider the case for legislation to ensure that audiences are protected to the level that they choose.
R18 on internet TV
The popularity of video-on-demand services (VoD) has grown dramatically in recent years, providing consumers with great new choices about what they want to watch when and where. But with this new opportunity comes risk, and this is particularly
the case when it comes to harmful content that is now more readily available. In hard copy, content rated R18 by the British Board of Film Classification (BBFC) is only available in licensed sex shops, and content that is even stronger is banned outright. The VoD regulations in this area do not currently provide the same level of certainty and protection as on the high street. As on-demand services become increasingly prevalent we want to make sure that regulation of on-demand content is as robust as regulation of content on a DVD, bringing the online world into line with the high street.
We will legislate to ensure that material that would be rated R18 by the British Board of Film Classification is put behind access controls on regulated services and we will ban outright content on regulated services that is illegal even in
licensed sex shops.
More Dangerous Pictures
We will also close a loophole in the Criminal Justice and Immigration Act 2008, so that it is a criminal offence to possess extreme pornography that depicts rape.
We are seeing good progress in this area:
Where children could be accessing the internet, we need good filters that are preselected to be on, and we need parents aware and engaged in the setting of those filters. By the end of this year, when someone sets up a new broadband account,
the settings to install family friendly filters will be automatically selected; if you just click next or enter, then the filters are automatically on.
By the end of next year ISPs will have prompted all existing customers to make an unavoidable decision about whether to apply family friendly filters.
Only adult account holders will be able to change these filters once applied.
All mobile phone operators will apply adult filters to their phones. [Does this allow adults to turn off the blocking?]
90% of public Wi-Fi in places where children are likely to be present will have family friendly filters applied.
Ofcom will regularly review the efficacy of these filters.
But we are clear that industry must go further:
We expect the smaller ISPs to follow the lead being set by the larger providers.
We want industry to continue to refine and improve their filters to ensure they do not, even unintentionally, filter out legitimate content.
We want to see mobile network operators develop their child safety services further; for example, filtering by handset rather than by contract would provide greater flexibility for parents as they work to keep their children safe online.
Paying for PC advert censorship
The UK benefits from a healthy and successful advertising sector, underpinned by an exemplar of successful self-regulation, the Advertising Standards Authority (ASA). The ASA administers a system which is flexible and responsive, and is industry
funded, through a 0.1% levy on non-broadcast advertising spend levied by the Advertising Standards Board of Finance (ASBOF). This levy is voluntary, but is well supported by industry; however, it will be important to ensure that this continues to be
sustainable in the future. The relatively recent extension of the ASA's online remit to cover marketing on companies' own websites and on social media demonstrates the increasing importance of online advertising, and advertising spend in the
future is likely to increase its focus on these online markets. Therefore, it will be important to ensure that this self-regulatory, industry-funded model remains sustainable for the future, and that the regulation of online and offline
advertising alike can continue to be supported by the industry levy. Some concerns have been raised over the degree to which collection of the levy in the digital world has kept pace with the rate at which advertisers are now operating there.
We think it is incumbent upon all parts of the industry, including the digital media, to safeguard this continued funding by playing their part in the collection of the levy.
Respondents were asked in our latest survey whether or not they supported David Cameron's proposals on the internet and pornography.
57 per cent said that they do.
27 per cent said that they don't.
15 per cent said that they have no view.
This represents decisive support for the Prime Minister's proposals, which have been strongly driven by the Culture Department. It's worth adding that at this stage this is very much support in principle: we have yet to
see the detail.
However Tory MPs don't seem to be lining up to make their pro-censorship views known to the electorate. Perhaps there are too many votes to be lost.
Everyone agrees that we should try to protect children from harmful content. But asking everyone to sleepwalk into censorship does more harm than good.
Filters won't stop children seeing adult content and risk giving parents a false sense of security. They will stop people finding advice on sexual health, sexuality and relationships. This isn't just about pornography. Filters will block any site deemed
unsuitable for under 18s.
What are the problems with switching on adult filtering by default?
"Set it and forget it" is the wrong message to send to parents. Filters will not stop children seeing adult content.
Adult filters will not just block pornography. They also restrict access to sites deemed unsuitable for under 18s including information on alcohol and other drugs, forums, YouTube and controversial political views.
When adult filters are in place, mistakes are made. Adult filtering can stop people accessing crucial advice on sexual health, sexuality and relationships.
Adult filtering amounts to censoring legal content. The UK would be the only modern democratic society to do this. This sets a terrible example to other countries with interests in suppressing information.
Microsoft has introduced a pop-up warning on its Bing search engine that tells UK users that they are searching for illegal child abuse images. Yahoo will also introduce them in the coming weeks, but Google has no plans to.
Microsoft announced that anyone using its search engine to look for material that shows the sexual abuse of children will trigger a Bing Notification Platform warning message telling them that the content they are looking for is against the law. The
notification will provide a link to a counselling service.
A Microsoft spokesman said:
If someone in the UK tries to use search terms on Bing which can only indicate they are looking for illegal child abuse content, they will activate the Bing Notification Platform which will produce an on-screen notification telling them that child abuse
content is illegal. The notification will also contain a link to Stopitnow.org who will be able to provide them with counselling.
The Bing Notification Platform is triggered by search terms on a list provided by the Child Exploitation and Online Protection Centre (CEOP).
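The mechanism described amounts to matching a query against a closed watch list and showing a notice on a hit. The real CEOP list and Bing's matching logic are not public, so the phrases, normalisation rule and function names below are invented purely to illustrate the idea:

```python
# Hypothetical sketch of a search-term warning trigger, loosely modelled on
# the behaviour described above. The watch-list phrases are placeholders;
# the real CEOP list and Bing's matching rules are not public.

WATCH_LIST = {"example banned phrase", "another flagged term"}  # invented entries

HELP_URL = "https://www.stopitnow.org.uk"

def check_query(query: str):
    """Return a warning notice if the query contains a flagged phrase, else None."""
    # Collapse whitespace and lower-case so trivial variations still match.
    normalised = " ".join(query.lower().split())
    for phrase in WATCH_LIST:
        if phrase in normalised:
            return ("Warning: searching for this material is illegal in the UK. "
                    "Confidential help is available at " + HELP_URL)
    return None  # query passes through to normal search results
```

A simple substring match like this is easy to evade, which is one reason critics quoted later in this piece doubt such warnings will deter determined offenders.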
Please give a big hand (up the arse, naturally) for Dave, the Gutter Press Glove Puppet.
It's astounding! All Daily Mail editor Paul Dacre has to do is wiggle his fingers and Dave obeys his every command!
The Daily Mail writer, Jan Moir, asks:
David Cameron has been mocked for his proposals that all households will have to opt out of automatic porn filters. From next year, the filters would come as standard with internet broadband and cover all devices in a home.
Those who want porn will have to tick the box marked naughty, very naughty and ouch, that's gotta hurt.
No doubt the family-friendly porn sifter will lead to some awkward discussions between husbands and wives making a joint decision. Shall we, shan't we, darling? You mean you didn't even realise that I did?
Possession of the most extreme forms of adult pornography will become an offence, while online content will have the same restrictions as DVDs sold in sex shops.
Something to cheer? A step in the right direction? You would think so.
Yet some, particularly on the Left, dismiss Cameron's plans as an example of the nanny state running wild, spoiling all the fun, taking away our porny freedoms. Others huff that the Prime Minister is a philistine and that his imbecilic plans will censor...
The Guardian is continuing its quest to become the new Daily Mail. Deborah Orr takes inspiration from the Daily Mail's Jan Moir:
A roar of libertarian outrage greeted David Cameron's announcement this week that the government was going to talk to internet service providers about installing opt-in rather than opt-out filters for pornography, as if computer access to hot and cold
running arousal aids was some kind of basic human right. Is this really such a big deal?
Comment: Jan Moir's scaremongering about children and pornography makes things worse, not better
Meanwhile the Telegraph does a little fact checking on some of the 'evidence' offered by Jan Moir:
I was shocked. No, scratch that... I was shocked and appalled to discover that 11-year-olds are addicted to internet porn and that an academic study has confirmed the epidemic. I was also grateful to Jan Moir for highlighting the issue in a column that
castigates those on the Left and Right who have criticised the Government's plans for internet filters. She writes: A study conducted last year by Plymouth University warned that 11-year-old boys were becoming addicted to internet porn. And after
regularly viewing hard-core pornography at an early age, children go on to develop unrealistic or warped expectations of sex... As impertinent as it may be for me to question such a respected expert in the field of public indignation, I decided to
seek out the study.
Reading the executive summary, I was shocked for a second time to discover that it said the exact opposite of what Moir had suggested.
The adult content blocking system championed by David Cameron is controlled by the controversial Chinese company Huawei, the BBC has learned.
UK-based employees at the firm are able to decide which sites TalkTalk's service blocks.
Politicians in both the UK and US have raised concerns about alleged close ties between Huawei and the Chinese government.
Even customers who do not want filtering still have their traffic routed through the system, but matches to Huawei's database are dismissed rather than acted upon.
One expert insisted that private companies should not hold power over blacklists, and that the responsibility should lie with an independent group. Dr Martyn Thomas, chair of the IT policy panel at the Institution of Engineering and Technology,
told the BBC:
It needs to be run by an organisation accountable to a minister so it can be challenged in Parliament,
There's certainly a concern about the process of how a web address gets added to a blacklist - who knows about it, and who has an opportunity to appeal against it.
You could easily imagine a commercial organisation finding itself on that blacklist wrongly, and where they actually lost a lot of web traffic completely silently and suffered commercial damage. The issue is who gets to choose who's on that blocking
list, and what accountability do they have?
Huawei's position was recently the subject of an Intelligence and Security Committee (ISC) report. It criticised the lack of ministerial oversight over the firm's rapid expansion in the UK. The committee said:
The alleged links between Huawei and the Chinese State are concerning, as they generate suspicion as to whether Huawei's intentions are strictly commercial or are more political.
In the US, intelligence committees have gone further, branding Huawei a threat to national security.
Initially, TalkTalk told the BBC that it was US security firm Symantec that was responsible for maintaining its blacklist, and that Huawei only provided the hardware, as previously reported. However, Symantec said that while it had been in a joint
venture with Huawei to run Homesafe in its early stages, it had not been involved for over a year.
TalkTalk later confirmed it is Huawei that monitors activity, checking requests against its blacklist of over 65 million web addresses, and denying access if there is a match.
The contents of this list are largely determined by an automated process, but both Huawei and TalkTalk employees are able to add or remove sites independently.
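The reported behaviour — all traffic routed through the filter, with matches only enforced for opted-in subscribers — can be sketched as a hostname lookup against a blacklist. The real Homesafe matching and normalisation rules are not public; the entries and normalisation below are assumptions for illustration:

```python
# Minimal sketch of checking a requested URL against a hostname blacklist,
# as a system like the one described might. Blacklist entries are invented;
# the real list reportedly holds over 65 million addresses.
from urllib.parse import urlsplit

BLACKLIST = {"blocked.example.com", "adult.example.net"}  # illustrative entries

def is_blocked(url: str, opted_in: bool) -> bool:
    """All traffic passes through the filter; a match is only enforced
    for subscribers who have filtering enabled (matches for opted-out
    subscribers are checked but dismissed, as the article describes)."""
    host = (urlsplit(url).hostname or "").lower().removeprefix("www.")
    matched = host in BLACKLIST
    return matched and opted_in
```

Note that even in this toy version the opted-out subscriber's request is still inspected before being waved through, which is exactly the privacy concern raised above.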
After brief conversations with some of the ISPs that will be implementing the UK's "pornwall" we've established a little bit about what it will be doing.
The essential detail is that they will assume you want filters enabled across a wide range of content, and unless you un-tick the option, network filters will be enabled. As we've said repeatedly, it's not just about hardcore pornography.
You'll encounter something like this:
(1) Screen one
Do you want to install / enable parental controls
[ticked box] yes
[unticked box] no
(2) Screen two [if you have left the box on screen 1 ticked]
Do you want to block
[ticked box] pornography
[ticked box] violent material
[ticked box] extremist and terrorist related content
[ticked box] anorexia and eating disorder websites
[ticked box] suicide related websites
[ticked box] alcohol
[ticked box] smoking
[ticked box] web forums
[ticked box] esoteric material
[ticked box] web blocking circumvention tools
You can opt back in at any time
The precise pre-ticked options may vary from service to service.
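The sign-up flow above is classic choice architecture: every category defaults to blocked, and a subscriber who just clicks Next keeps all the defaults. A minimal sketch, using the category names from the list above (the function and data structure are invented, not any ISP's actual API):

```python
# Sketch of the "pre-ticked" sign-up flow described above: every category
# defaults to blocked, and only boxes the subscriber actively un-ticks
# are exempted. Category names follow the list above; the API is invented.

DEFAULT_CATEGORIES = [
    "pornography",
    "violent material",
    "extremist and terrorist related content",
    "anorexia and eating disorder websites",
    "suicide related websites",
    "alcohol",
    "smoking",
    "web forums",
    "esoteric material",
    "web blocking circumvention tools",
]

def signup_choices(unticked=frozenset()):
    """Return the filtering settings after sign-up.

    `unticked` is the set of boxes the subscriber actively cleared;
    anything untouched stays at its pre-ticked (blocked) default.
    """
    return {cat: (cat not in unticked) for cat in DEFAULT_CATEGORIES}
```

The point the article makes about defaults falls straight out of this: `signup_choices()` with no arguments — the click-through case — blocks everything, so inertia alone produces maximal filtering.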
What's clear here is that David Cameron wants people to sleepwalk into censorship. We know that people stick with defaults: this is part of the idea behind 'nudge theory' and 'choice architecture' that is popular with Cameron.
The implication is that filtering is good, or at least harmless, for anyone, whether adult or child. Of course, this is not true; there's not just the question of false positives for web users, but the effect on a network economy of excluding a
proportion of a legitimate website's audience.
There comes a point that it is simply better to place your sales through Amazon and ebay, and circulate your news and promotions exclusively through Facebook and Twitter, as you know none of these will ever be filtered.
Meanwhile ISPs face the unenviable customer relations threat of increased complaints as customers who hadn't paid much attention find websites unexpectedly blocked.
Just as bad, filters installed with no thought cannot be expected to be set appropriately for children of different ages.
Offsite Comment: I'm Sorry to Have to Say This, But It Should Not Be a Crime to Fantasise About Raping a Woman
In a civilised society, we recognise that a distinction must be made between what people think and what people do. We insist that while it is very often legitimate to punish people for their actions - particularly their violent actions - it is
unacceptable to punish them for their thoughts and their fantasies, however perverse they might be.
Offsite Comment: Please, Prime Minister, do your porn research
David Cameron is using a legitimate crusade against child abuse images to infiltrate policy on adult content per se, while demonstrating that he doesn't understand either what porn is, or how the internet works.
On the Jeremy Vine show this lunchtime, the PM demonstrated just how ignorant he is. Vine quite simply asked Cameron to define pornography. He couldn't -- or wouldn't -- and told Vine that that was up to the internet service providers to decide.
So the Prime Minister wants to block access to something he can't even relay in layman's terms, and expects global businesses and millions of adults up and down the country to agree to this undemocratic, miasmic proposal. (Did we really democratically
elect this man? Well, I didn't -- but someone must have).
Daily Mail Dave is facing criticisms and serious questions over how his plan for automatic internet porn filters in every
British home would work.
The former head of the Child Exploitation and Online Protection centre (CEOP), Jim Gamble, said Cameron's plan to tackle child abuse images by removing results from search engines like Google would be laughed at by paedophiles:
There are 50,000 predators...downloading abusive images on peer-to-peer, not from Google. Yet from CEOP intelligence only 192 were arrested last year. That's simply not good enough.
We've got to attack the root cause, invest with new money, real investment in child protection teams, victim support and policing on the ground. Let's create a real deterrent. Not a pop-up that paedophiles will laugh at.
In interviews after his speech, Cameron seemed unclear of exactly which legal sites should be banned by the new filters - and accepted that the technology still had weaknesses. Speaking on the BBC's Jeremy Vine programme, Cameron said what would be
included in the filters would evolve over time:
The companies themselves are going to design what is automatically blocked, but the assumption is they will start with blocking pornographic sites and also perhaps self-harming sites
It will depend on how the companies choose how to do it. It doesn't mean, for instance, it will block access to a newspaper like The Sun, it wouldn't block that - but it would block pornography.
Cameron said he did not believe written pornography, such as erotic novel Fifty Shades of Grey, would be blocked under the plans. But he added: It will depend on how the filters work.
He also admitted it could lead to some interesting conversations in families. Asked if the opt in system meant a husband would have to fess up to his partner if he wanted to look at porn, he finally said: Yes, it does. He then added:
I'm not saying we've thought of everything and there will be many problems down the line as we deal with this, but we're trying to crunch through these problems and work out what you can do and can't do.
Cameron was even attacked by one of his former female MPs, Louise Mensch, for attempting to ban video containing rape simulation. She suggested such fantasies were common in more than half of all women. She wrote on Twitter:
It is not for our government to police consensual simulation, between adults, of one of women's most common fantasies,
Padraig Reidy, of the Index on Censorship, said people should not have to opt out of the filters:
If we have, as the Prime Minister is suggesting, an opt-out filter we have a kind of default censorship in place.
Families should be able to choose if they want to opt in to censorship. If a filter is set up as a default then it can really restrict what people can see legitimately. Sites about sexual health, about sexuality and so on, will get caught up in the same
filters as pornography. It will really restrict people's experience on the web, including children's.
Dr Paul Bernal, from the University of East Anglia's law school, suggested Cameron's crackdown on child abuse images was also inadequate:
Plans like these, worthy though they may appear, do not, to me, seem likely to be in any way effective. The real 'bad guys' will find ways around them, the material will still exist, will keep being created, and we'll pretend to have solved the problem
-- and at the same time put in a structure to allow censorship, create a deeply vulnerable database of 'untrustworthy people', and potentially alienate many of the most important companies on the internet. I'm not convinced it's a good idea.
Back in May, following up the conviction of Stuart Hazell for the murder of 12 year old Tia Sharp, Amanda Platell of the
Daily Mail wrote a piece
claiming that child porn could be readily found using Google search terms that were noted in the trial.
Of course it was all bollox and the 'child porn' noted by Platell was found
to be a commercial adult video. The supposed 'child' was either 18 or 19 depending on which month her birthday fell. Her age was properly recorded and is available for checking as required by US law.
But the damage was already done and Daily Mail readers and campaigners were easily convinced by Platell's bollox piece. And so a new evil was born, easy to find child porn just waiting to be revealed by a few search terms in Google.
And now it appears that David Cameron was one of those who believes everything he reads in the Daily Mail.
In a press release David Cameron announced a series of censorship measures to placate the Daily Mail and its readers.
All internet users will be contacted by their service providers and given an unavoidable choice on whether to use website blocking. The changes will be introduced by the end of next year. As a first step, the system will be mandated for new
customers by the end of 2013. The subscriber making the choices will be subject to age verification and further updates to the blocking options may only be made by the account holder.
Website blocking to be applied to all new mobile phones
Prohibited possession of extreme pornography will be extended to scenes of simulated rape.
The Child Exploitation and Online Protection Centre (CEOP) is to draw up a blacklist of 'abhorrent' internet search terms to supposedly prevent paedophiles searching for illegal material.
All police forces will work with a single secure database of illegal images of children.
Videos streamed online are to be subject to the same R18 censorship rules as those sold in shops.
There will be stronger powers for watchdogs to investigate the hidden internet -- heavily encrypted forums and pages that allow abusers to cover their tracks
Adult content will be banned on public WiFi
Ofcom to oversee this implementation of these measures.
In a separate move, Twitter is to use Microsoft's PhotoDNA system to check all uploaded pictures against a database of known child abuse images.
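The workflow here is a lookup of each upload's fingerprint against a database of known images. PhotoDNA itself is a proprietary robust hash designed to survive resizing and re-encoding; as a stand-in, this sketch uses SHA-256, which only matches byte-identical files — so it illustrates the workflow, not Microsoft's actual technology:

```python
# Sketch of screening an upload against a database of known-image hashes,
# as the PhotoDNA integration described above would. SHA-256 is a crude
# stand-in: unlike a perceptual hash, it misses any re-encoded or resized
# copy. The database entry here is an invented placeholder.
import hashlib

KNOWN_HASHES = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known image and should be rejected."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

The gap between an exact hash and a robust one is the whole point of PhotoDNA: a matching system is only useful at this scale if trivial alterations to an image don't change its fingerprint.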
Cameron will say:
There are certain types of pornography that can only be described as 'extreme' ... that is violent, and that depicts simulated rape. These images normalise sexual violence against women -- and they are quite simply poisonous to the young people who see them.
The government today has made a significant step forward in preventing rapists using rape pornography to legitimise and strategise their crimes and, more broadly, in challenging the eroticisation of violence against women and girls.
I have a very clear message for Google, Bing, Yahoo and the rest. You have a duty to act on this -- and it is a moral duty. If there are technical obstacles to acting on [search engines], don't just stand by and say nothing can be done; use your great
brains to help overcome them.
You're the people who have worked out how to map almost every inch of the Earth from space; who have developed algorithms that make sense of vast quantities of information. Set your greatest brains to work on this. You are not separate from our society,
you are part of our society, and you must play a responsible role in it.
We are already looking at the legislative options we have. This is quite simply about obliterating this disgusting material from the net -- and we will do whatever it takes.
Offsite Comment: Cameron becomes a bit of an embarrassment on the world stage
Cameron's Bizarre Warning To Google, Bing and Yahoo Over Child Pornography
There are times when I'm not sure that the British Prime Minister, David Cameron, actually understands this technology stuff. An example is this threat in a TV interview in England today. He's huffing and puffing that if the search engine companies don't
do what they're told then they'll be forced to by law.
Over the past few weeks the Government has held meetings with Internet companies about child protection online. These are designed to prompt more action to protect children, on the assumption that these companies could and should be doing more.
Sadly the Government has seemed keen to appear as if they are taking tough action, and not so keen on thinking carefully about what their action should be.
Policy makers who are pushing for more Internet filtering for child protection do not take the related practical and technical questions seriously. They tend to throw about ideas for technical interventions such as internet filtering without considering
how these would work, or what unintended consequences they might have.
They simply want more done. What that more is, or what it will achieve, seems to be an irrelevant detail. This is despite the Government having run a consultation last year, after which they settled on a fairly reasonable policy of helping
parents make the right choices about filtering. They seem determined to edge towards a stricter default-on regime.
We have seen no evidence that during the meetings with internet companies the Government has taken account of any of the broader public policy questions related to the implementation of Internet filtering systems. Along with Index on Censorship, English
PEN and Big Brother Watch, we wrote to the Culture Secretary Maria Miller asking her to invite us to the discussions so these issues could be raised. The Department has subsequently set up a meeting between us and the Minister Ed Vaizey MP.
The details are very important. Internet filtering can easily block more content than it is designed to -- for example, if people do not understand what is being blocked and why, or if sites are incorrectly categorised. People may also easily get around
blocking. It can give people a false sense of security. Making Internet filtering fit multiple devices, ages or beliefs within a household or other setting is almost impossible. And there are other consequences, such as the speed of access or an impact
on privacy where traffic or blocking events are logged.
That's why we are putting these questions to ISPs. We will be sending the questions and replies to the relevant policy makers, and will hope to explain to them why we think these are important questions.
Twenty questions for ISPs on Internet filtering systems
A. On how the technology works
Under the Internet filtering system set up following discussions with the Government about online safety and child protection:
1. Is any traffic of users who are not opted in to filtering inspected and / or logged? If so, is it logged in a way that links the traffic to a subscriber? What logging will there be of blocking events? How does this work?
2. Is filtering applied to all forms of connection offered by the ISP (dialup, ADSL, cable, fast fibre connections etc)?
3. Have you estimated the impact of the through-put of filtering technology on the speed of users' internet access (both for those who are opted in and opted out)?
4. We are concerned about the impact on Internet applications in general as well as web traffic. Does filtering take place only of HTTP traffic on port 80, or will other traffic be affected? What steps will be taken to avoid interfering with non-HTTP
traffic on port 80, for example non-HTTP applications that use this port in order to bypass firewall restrictions?
5. What impact does the filtering have on end-to-end security measures such as SSL or DNSSEC?
6. Can you guarantee that your networks will not be susceptible to mistaken blocking as a result of using specific IP addresses for forwarding filtered traffic, for example as seemed to happen in a case involving Wikipedia?
7. Have you made any estimates on the impact of filtering systems on infrastructure upgrades?
B. On setting up the filtering
8. Are users faced with pre-ticked boxes when choosing to activate filtering? What is the impact on customers who do not have access to or who do not use a web browser on a network such as a home broadband connection that is only used for Smart TV video
on demand applications? (ie who will not be presented with a web-based set up screen?)
9. How granular are the available choices? Will a household be able to cater for:
a. Multiple ages or a variety of beliefs?
b. Can specific sites be unblocked by a user?
10. Have you done user-testing for your opt-in systems?
11. What information about the filtering is available at the point of sign up? Does it include:
a. Detailed information about what types of content are blocked, with examples?
b. The providers of their filtering tools, if a third party is involved?
c. Information about the possible problems with and limitations of blocking, with information about how to report problems?
12. What age-verification processes will be in place? How will this work?
13. Is a customer's decision not to activate filtering a one-off decision, or will it have to be periodically repeated?
C. On managing problems and mistakes
14. When a site is blocked, what information is supplied to the end-user about why and how it has been blocked?
15. Are there easy ways to report mistaken blocks, either over-blocking or under-blocking? Are these clear when users encounter a block?
16. Are there easy ways for people to check if URLs are blocked, and will this include a reporting tool for requesting corrections and reclassifications?
17. How will complaints, from both your subscribers and from owners of sites that are blocked, be dealt with?
a. Are there plans in place to train customer service staff for dealing with these reports?
b. Are there targets for dealing with mistakes in a timely manner, or estimates of how long responding to and correcting mistakes will take?
c. Will you share error reports and corrections with other ISPs?
18. Have you specified acceptable error rates to suppliers of filtering services? If so, what are they?
19. Have you sought legal opinions relating to liability for incorrect blocks, including both false positives and false negatives? Do you have plans to offer compensation for businesses harmed by blocking errors, for example when potential customers are
unable to access the site?
20. Are there or will there be systematic reviews of the effectiveness and quality of filtering, including reporting on problems and complaints? Is there a process for review and improvement? Is there or will there be an ombudsman or other oversight body
to handle disputes and review performance?