The UK parliament's Culture, Media and Sport Committee has announced a new inquiry into the potential risks from harmful material on the Internet and in videogames.
The CMS Committee wants to consider the benefits and opportunities offered to consumers, including children and young people, and the economy by technologies such as the Internet, videogames and mobile phones.
At the same time, it will look at potential risks to consumers from exposure to harmful content on the Internet or in videogames, considering the "effectiveness of the existing regulatory regime" in helping to manage those potential risks.
The committee is particularly interested in the potential risks posed by "cyberbullying", according to a statement calling for written submissions from interested parties.
While the CMS Committee will accept responses to the Byron Review, it intends that its inquiry be broader in scope, as it will examine the impact of content on consumers in general rather than focusing solely on the impact on children and young people.
Submissions are due by the end of January, with oral evidence sessions planned for February and March of 2008.
The inquiry was first mentioned in August 2007 after a series of YouTube videos highlighting youth violence:
John Whittingdale, the chairman of the cross-party Commons culture, media and sport select committee, said he was "very interested" in an investigation into how to limit access to unsuitable material across the internet.
Concern about the way gangs promote themselves by placing violent video clips - including scenes with guns - on the internet has grown since the shooting of Rhys Jones in Croxteth.
The inquiry was mentioned again in reports of a related case:
A teenager who boasted of his criminal exploits on Bebo has been handed an anti-social behaviour order. The 17-year-old, from Norfolk, posted comments and photographs on the social networking site glorifying his criminal
exploits including drug-taking, according to the police.
Appearing at Norwich Youth Court, district judge Philip Browning banned the youth from using the internet to publish material that is "threatening or abusive" and "promotes criminal activity".
The court heard a police investigation found the boy had also made offensive comments against officers on his web page.
Fears that criminal gangs use internet sites to recruit members, organise fights and glorify their activities prompted MPs to launch an investigation into how to shield young people from such material.
The Culture, Media and Sport Committee has announced a new inquiry into the potential risks from harmful material on the Internet and in video games, with the following terms of reference:
The benefits and opportunities offered to consumers, including children and young people, and the economy by technologies such as the Internet, video games and mobile phones.
The potential risks to consumers, including children and young people, from exposure to harmful content on the Internet or in video games. The Committee is particularly interested in the potential risks posed by:
user generated content, including content that glorifies guns and gang violence
the availability of personal information on social networking sites
content that incites racial hatred, extremism or terrorism
content that exhibits extreme pornography or violence
The tools available to consumers and industry to protect people from potentially harmful content on the Internet and in video games.
The effectiveness of the existing regulatory regime in helping to manage the potential risks from harmful content on the Internet and in video games.
The Committee will accept as submissions (or as part of submissions) responses to the Byron Review of children and new technology.
The Committee, however, intends that its inquiry be broader in scope than the Byron Review as the Committee will examine the impact of content on consumers in general, rather than focusing solely on the impact on children and young people.
It is expected that oral evidence sessions will be held in February and March 2008.
Written submissions are invited from interested parties; these should be sent to Daniel Dyball, Committee Specialist, at the address below by Wednesday 30 January 2008.
Our strong preference is for submissions to be in Word or rich text format (not as a PDF document) and sent by e-mail to email@example.com, although letters will also be accepted. Submissions sent by post should be sent to Daniel Dyball,
Committee Specialist, Culture, Media and Sport Committee, House of Commons, 7 Millbank, London SW1P 3JA. Please include a contact name, postal address and telephone number in the body of the e-mail or in the letter.
The Culture, Media and Sport Committee held its first oral evidence session as part of its inquiry into harmful content on 26th February 2008.
Videogame developers should disincentivise long periods of play by allowing players to achieve the highest-scoring aspects of a title early in the game's life cycle. That's according to John Carr, executive secretary at the
Children's Charities Coalition for Internet Safety.
He raised the argument that there were a number of concerns over videogames, other than the issue of violence – including reports of children "dying at their consoles" – that need to be addressed.
Fellow panellist Professor Sonia Livingstone, from the London School of Economics, pointed out that while there is no clear evidence that videogames benefit children, there is no clear evidence that they harm children either - though there was evidence suggesting that repetition of actions could be a problem.
Professor Livingstone also raised the subject of age ratings in games, and highlighted reports that large numbers of children played games at home that according to the ratings were not appropriate.
Carr then added his belief that some parents misunderstood the nature of age ratings, believing them to relate more to a general skill level suggestion, instead of advice on potentially damaging content.
The consensus among the panel was that parents needed more help and better tools to educate themselves and their children about the potential dangers online.
Matt Lambert, Microsoft's head of corporate affairs in the UK, stated his belief that the PEGI ratings system was better than the BBFC version.
When committee chairman John Whittingdale asked Lambert about the apparent confusion for parents over age ratings for videogames – particularly the belief that they represented skill levels instead - Lambert replied that he hadn't seen any
evidence of such confusion, and that internal research indicated that 96% of parents were in fact aware of the presence of age ratings.
Instead he pointed to anecdotal evidence which led him to believe that parents simply weren't concerned about applying those ratings. And on the question of which of the UK's two ratings systems was preferable, Lambert indicated that he believed PEGI was more effective.
If there's going to be one ratings system, it should be PEGI. With PEGI, they think very carefully about age appropriacy…but the BBFC is set up to rate films, and it takes that approach for games when a different approach is required.
PEGI breaks it down to a different level. If there's bad language it will give you a specific symbol, if there's gambling there's another symbol, and some games will have a whole raft of symbols on the back. It's a different depth, it's more
sensible, and it also has a European aspect to it.
The chairman then responded to the answer by pointing out that the BBFC itself would contradict such a view – that it believes the PEGI methodology to be inferior, and employs specialists who look at hours of gameplay when coming to a decision:
I'm not saying that's wrong, and I apologise if I gave the impression that that's not what they do - though they would say that they are the best. But I do believe that the BBFC's thinking clearly comes from the world of film [and not games],
that's definitely true.
The BBFC has hit back at suggestions that its ratings system is less effective than the PEGI version, as made by Microsoft's UK head of corporate affairs Matt Lambert at a CMS Select Committee hearing yesterday.
Speaking to GamesIndustry.biz, the BBFC rejected those claims, stating that while the body uses the same symbols as for films in order to enable a greater understanding of the level of content to be expected in games, it doesn't classify games in the same way that it classifies films, because "we physically play the game".
The fact is, we provide consumer advice about the content - and extended information - on our Parents website about exactly the sort of things you can expect to encounter in the game, in all of the games we classify - and we do it in words,
which people understand, they don't understand the pictograms.
We know this - in January we did research and the public really couldn't get their heads around what a spider meant. That is not sufficient information for them to make a decision.
What people think about the PEGI system is that it's a difficulty rating, said the spokesperson. One of the parents in our research groups was complaining that she had bought a game with a 3+ on thinking it was suitable for her child,
and it turned out to be a complicated sports game - whereas if they see a PG12, they know it's going to have the sort of content (and here you can argue that the system is similar) as they would expect from a 12-rated film.
Just like when they get a film that's an 18, and says 'Strong bloody violence' they have an idea of what that is, because they've seen it in 18-rated films…The fact is, sticking a spider on the back of a box is not going to help a person make
the kind of decision that they ought to be making about games.
The BBFC also underlined that during its review process it employs people who actually play through the games, and noted the contrast with the PEGI methodology.
Unlike the PEGI system, which is purely a tick-box system filled in by the distributor themselves, the BBFC has very well-qualified games examiners - who are games fans themselves - to play the games right through all the levels, with the
cheat codes, and spend a lot of time playing them so that they know what the content is.
MPs on the Commons culture, media and sport select committee asked industry experts about filtering and user-generated content websites.
John Carr, the executive secretary of the Children's Charities Coalition for Internet Safety, said that the industry could not be expected to be some sort of "moral arbiters" or "priests" for the public, deciding which content
should be screened.
In school the headteacher sets the standards surrounding internet content, Carr added. It should be the same in the home ... there is no way we can legislate from the centre. The public policy challenge is in helping parents to
understand the internet and in turn help children. Parents feel at sea about what to do. Safety software should be pre-installed and set to a high level.
Asked what he thought of the idea, Matt Lambert, head of corporate affairs at Microsoft, admitted that internet content filtering technology already provided by the company as standard with its software products was "not widely used".
But Lambert rejected the idea of a mandatory setting of content filters to a high security level, arguing that it would block too much content that posed no risk to children. Lambert said a better solution would be for parents to be better educated about what their children are looking at online and what content filters are available: Setting [filtering controls] at a high level is the equivalent of blocking the internet ... it would be living in the dark ages in my view.
Stephen Carrick Davies, the chief executive of Childnet International, a charitable body that promotes online safety for children, told the committee that one problem with policing the internet is that the concept of harmful content is difficult
to define, unlike obviously illegal content such as child abuse images: Illegal content is easy [to define and regulate] while harmful is difficult. We need to recognise there is 'grey'. There is black and white but also grey.
He also pointed out that legislation against such a "grey" area could result in curbs of freedom of expression and that in a web 2.0 world of user-generated content it can often be young people themselves - those often seen as
"passive victims" - who can perpetrate cyber bullying online.
Davies suggested the answer might lie in a three-pronged approach. He said this strategy would involve self-regulation by the industry; empowering, supporting and educating schools; and making sure that parents help children so they are as savvy and well equipped online as they are when they walk down the high street.
UK Xbox boss Neil Thompson has said he reckons PEGI would do a better job of rating videogames than the British Board of Film Classification.
There's been much talk about whether the UK should have a single ratings system lately. (Sometimes we talk about it in the office. "Do you think the UK should have a single ratings system?" "I don't care. It's your turn to make the
tea.") It's thought that Tanya Byron could make such a recommendation in her forthcoming Government review on violence in games, though nothing has been decided. Two sugars.
"We made it very clear to the Byron Report team that we, both as an industry and as Microsoft, strongly believe that PEGI has a lot more benefits for customers, parents and for everyone involved in the industry really," Thompson said.
"PEGI has been established for quite a few years now as the industry standard, so the industry has got behind it and invested a lot of time and effort in it, and it offers a level of in-depth information as well as a level of expertise to be
honest, that the BBFC doesn't."
According to Thompson, PEGI rated nearly 2000 games last year - while the BBFC managed just 100. That's not including Manhunt 2, which was refused a rating by the BBFC for being likely to turn us all into homicidal maniacs.
"There's just a scale difference in terms of industry knowledge and industry insight that goes into these things," Thompson observed.
The BBFC has claimed the symbols used by PEGI aren't meaningful enough, but Thompson reckons they help consumers to quickly ascertain which age groups games are suitable for. The key, he argues, is for the industry and Government to educate
parents about ratings.
To read the full interview with Thompson, visit GamesIndustry.biz - where freshly squeezed information and organically grown fact are whisked up in the blender of truth to produce piping hot news soup.
Google resisted calls to screen videos before they appeared on YouTube, despite admitting it had been too slow to take down a clip which showed a 25-year-old mother being gang-raped.
The search giant was attacked by MPs after admitting it was "clearly a mistake" that a video showing the woman being raped was watched 600 times before being removed from YouTube, the video-sharing site it owns.
Giving evidence before a Commons select committee, Google's general counsel, Kent Walker, said it would go against the spirit of the internet to require all videos to be screened and resisted calls for tighter regulation of sites like YouTube.
Asked about the site's failure to take down the footage - which showed the mother being sexually assaulted by three boys after her drink had been spiked - more quickly, Walker told MPs: I do not know exactly what happened but it was a mistake.
Walker was giving evidence to the Culture, Media and Sport committee, which is investigating the dangers posed by the internet to children. He told the committee that YouTube's reviewers looked through "a huge amount" of material. He
added that, of the offensive videos that were flagged to the site, more than 50 per cent were removed within half an hour, and a large majority within an hour.
Walker came under heavy fire from MPs, who said his inability to disclose how many staff were employed by Google to monitor footage flagged on YouTube suggested his defence was "incredible". Do you know how absurd you are sounding?
asked Paul Farrelly, the Labour MP for Newcastle-under-Lyme.
Walker said, however, that it would be "neither efficient nor effective" for YouTube to screen the entirety of the content uploaded by its users - about 10 hours of footage every minute - before it was made public: That would burden the process of creativity. You do not have a policeman on every street corner to stop things from happening, you have policemen responding very quickly when things do happen.
Plans to widen the use of cinema-style ratings for computer games are at risk of failing, amid predictions that there will soon be too many for the censors to regulate.
Games industry bosses told MPs on the Culture Select Committee, who are examining harmful content on the internet and in video games, that an explosion in online gaming would mean up to 100,000 games appearing a year - far more than the 1,750 titles rated at present.
Paul Jackson, director-general of Elspa, the games industry trade body, said it would need to fill a tower block with censors to make the system work. He was responding to questions from John Whittingdale, the Conservative chairman of the committee.
Jackson's comments imply that government plans, announced this month, to introduce compulsory ratings for all games that would attract a 12 certificate and above could collapse because the BBFC could not cope: We are concerned about plans to introduce a hybrid system. On the face of it, it means classifying another 500 games a year. But will they be able to rate 100,000 games and game elements in five years' time?
Comment: Future Proofing Games Ratings
Paul Jackson's comments are better explained in an interview with TechRadar
Paul Jackson: Our concern is this – the games industry needs to be reassured that the British Board of Film Classification would be capable of delivering against a new remit. There are two broad areas of concern.
Firstly, it looks as though the PEGI system currently delivers a harsher rating on games than (historically) the BBFC has – and we want to understand why that is happening and, if it's not right, how we can fix it.
The second area of concern is about 'future-proofing'. We know that our industry is going online and we know that the methodologies used with PEGI allow complete flexibility, because it is generated from within the industry. Every product has got
a product manager, so every product can be self-assessed. And then the checks and balances that are so important come into play after that.
The BBFC system, which has been developed since the 1930s, is based around individual censors reviewing each and every product. Now what does that mean in a world where there are perhaps a million online elements a year which need to be classified? I don't know. That is where we need to make sure that we understand how the BBFC would be capable of delivering against that remit.
TechRadar: The BBFC told TechRadar recently that they were more than happy and confident to take on what they estimate to be an extra three to five hundred games a year.
Paul Jackson: Yes, and at the level of three-to-five hundred, who would question that? The question really is – 'what happens in that online space?'
As the industry goes online over the next three to ten years what we don't want to do, including the BBFC, I'm sure – and this is why we keep talking about 'future proofing' – is we don't want to invest in a system that effectively becomes
redundant in a few years' time.
TechRadar: Why would it become redundant?
Paul Jackson: Well if – and there are many 'ifs' in this which is why we want to work with government and with the BBFC over the next 18 months – if, for instance, one scenario is that the games industry moves almost
exclusively online and then the products that we are selling, many of those products fragment… So, The Sims would be a good example here. If you look at The Sims as a product, it's a £30 purchase at the point of display and
then just look at the number of items that are already available to purchase online for The Sims. Every one of those in future will need to be referenced and classified. How will that be done?
Those are the areas of concern we have got, because we are certainly not talking five to six hundred 'elements' per year over the next ten years. We're talking about hundreds of thousands, millions, who knows?
We've tried to word our concern very clearly. We are concerned because we don't understand how that is going to work. And if it doesn't work, if we've not 'future proofed' then we just have a system that's going to last us the next three years.
Which is not what any of us want.
Ofcom has dismissed claims by a group of MPs that the 9pm watershed is failing to protect young children because they can now access television online.
Giving evidence at a culture, media and sport committee hearing today, the Ofcom chief executive, Ed Richards, denied the regulator had put itself in an "impossible and absurd position" by not doing more to regulate objectionable
content on the web.
Richards was responding to claims made by Nigel Evans, a Conservative MP, who argued that Ofcom's powers over broadcasting should be more rigorously applied to internet content.
It's important to remember that the watershed isn't dead, Richards said: Despite the internet, television remains remarkably resilient as a medium. The watershed is still very important and I think it will remain so for several years.
The cross-party group of MPs raised concerns about services such as the BBC iPlayer, which make it possible for anyone to view post-watershed content at any time of the day.
The Ofcom partner for content and standards, Stuart Purvis, said a lot of the responsibility rested with parents to make sure their children were not watching inappropriate material: If you look at the iPlayer, it immediately asks you if you
are over 16. The question that arises is: Are children going to understand that or are they going to override it?
He added that new technology had in a sense disadvantaged parents who might not necessarily know how to use access locks to protect children from post-watershed content.
However, both Purvis and Richards dismissed suggestions that it was the role of Ofcom on its own to encourage parents to become more aware of their children's online activities.
Richards said: We are definitely not the right body to deliver a mass campaign to promote media literacy. We are not qualified enough to do it. We don't have the skills to do it. I think somebody does have to do that, but it's not the
duty of Ofcom. That sort of mass campaign to bring parents understanding of literacy issues is not appropriate for us.
15th May 2008
Back bench Labour MP Margaret Moran has introduced a private members bill in the House of Commons calling for online retailers to take reasonable steps to establish the age of their customers when selling adult goods and services.
The Online Purchasing Of Goods And Services (Age Verification) Bill gets its second reading on 16th May.
Update: No Mention
21st May 2008
There was no mention of the Bill in Hansard on 16th May, so presumably parliament didn't find time to debate it and the Bill is no more.
The internet industry must take more responsibility for protecting young people from the "dark side" of digital content relating to abuse, violence and suicide, according to a committee of MPs.
The investigation recommended the establishment of a self-regulatory body to create better online safeguards to protect children from being exposed to unsuitable material. The body would police websites, adjudicate on complaints and could help
crack down on piracy and illegal file-sharing in Britain.
The culture, media and sport committee report, on harmful content on the internet and video games, said that leaving individual companies to introduce their own measures to protect users had resulted in an unsatisfactory piecemeal approach
which lacks consistency and transparency.
The committee chairman, John Whittingdale, criticised YouTube for not going far enough with proactive measures, beyond a pledge to take down material when it is "flagged" up by users: We had a lively debate with YouTube [who said
they have] millions of users who act as regulators. They understandably say they can't look at all the material uploaded.
The report recommends a "proactive review of content" as standard practice for sites hosting user-generated content. The idea would be to introduce technological tools to "quarantine" material which potentially violates
terms and conditions of use until ... reviewed by staff.
The report recommended a host of measures including improving the "shocking" industry-accepted standard takedown time of 24 hours for the removal of child abuse content. Whittingdale said a key concern was that many young people did not realise that when they put information on social networking websites such as Bebo and Facebook it was being "made available to the world".
The report recommends a default setting for social networking website user profiles with heavily restricted access that would require a "deliberate decision" to display personal information. The increasingly worrying role of the
influence of suicide websites was also highlighted in the report. It said that it could be possible to look at blocking such websites on a voluntary basis, in the same way that ISPs already do for child sex abuse websites with the Internet Watch Foundation.
The report also agrees that parents need to take on a greater responsibility to protect their children, and recommends that the BBFC rating system be extended to computer games.
A select committee has called for more regulation and greater safety on the Internet. But politicians should be careful what they wish for.
It would be nice to think that the latest call to ‘do something’ about online content from the Culture, Media and Sport Select Committee was grounded in some new development that made it trivial for websites to identify adult-oriented
content, an online identity system which reliably linked social network profiles with age verification for all users, or the release of a user-friendly but unbreakable watermarking scheme that could identify copyrighted material whenever it
appeared on an Internet-accessible computer.
Because the alternative would be that a bunch of MPs has decided the best way to get some publicity at the start of the summer recess, when newspaper editors are starved of ‘serious’ stories, is to announce that the Internet is like
the Wild West, and children are constantly exposed to unsuitable material on YouTube, reveal intimate personal details on Bebo and surf the web looking for pro-anorexia or suicide support sites.
Sadly, it seems that John Whittingdale and his committee members have not been poring over the technical details of IPv6 and OpenID, so what we’ve got in their report is yet more condemnation of the dark side of today’s Internet and a
few poorly-grounded suggestions as to what might be done, most of which seem to comprise a call for Internet service providers and web hosts to become the net’s new morality police.
Reading the Committee on Culture, Media and Sport Report into Harmful Content on the Internet and in Video Games, there is an evil, ill-thought-out recommendation which should be thoroughly condemned:
Controlling conduct-based risks and cyberbullying
138. We note that mobile phone call records would make it possible to establish that a particular phone had been used to upload content onto a video-sharing website at a particular time but would not necessarily identify the images uploaded or
the person who had used the phone to upload them.
Given that images or videos taken by mobile devices may be uploaded to social networking sites or video sharing sites on impulse, it would seem important to be able to have a record of the nature of content handled, should
it be offensive, harmful or even illegal.
It may be that the mobile phone industry could develop technology which would allow images uploaded by mobile devices to be viewed, thereby helping in the process of assembling evidence if inappropriate conduct has taken place.
We recommend that network operators and manufacturers of mobile devices should assess whether it is technically possible to enable images sent from mobile devices to be traced and viewed by law enforcement officers with the appropriate legal authority.
If such currently non-existent technology is developed in the UK, presumably by magic, since the Committee has not come up with any research and development funding, what will prevent this selfsame mobile phone image tracking technology from
being abused in, say, China, Russia, Zimbabwe, Burma, Pakistan etc. to hunt down political dissidents and opponents of those authoritarian regimes?
Innocent photographers in the UK already suffer from illegal harassment by Police Constables, Police Community Support Officers and private security guards. Why should they welcome their mobile phones' retained Communications Traffic Data being trawled, just in case their copyrighted images might be of interest in a Police investigation?
Why should mobile phone photographers be hunted down and identified, if the Police or shyster lawyers representing rich and powerful people or organisations, try to suppress their images?
The dreadful dictatorship appeasing commercial monopoly of the International Olympic Committee springs to mind. They already seem set to inflict Beijing 2008 style monopoly enforcement on the London 2012 Olympic Games.
Parliament has announced another inquiry into online child safety, to be headed by Conservative MP and anti-porn campaigner Claire Perry. She got noticed due to her impractical campaign to force ISPs to block porn unless customers opt in to receive it.
According to a press release on Claire Perry's constituency website, the inquiry will seek:
1) To understand better the extent to which children access on-line pornography and the potential for harm that this may cause
2) To determine what British Internet Service Providers have done to date to protect children online and the extent and possible impact of their future plans in this area
3) To determine what additional tools parents require to protect children from inappropriate content
4) To establish the arguments for and against network level filtering of content that would require an 18 rating in other forms of media
5) To recommend to Government the possible form of regulation required if ISPs fail to meet Recommendation no.5 from the Bailey Review.
Public evidence sessions will take place in Committee Room 7, House of Commons between 14:00 and 16:00 on September 8th and October 18th.
The inquiry will include approximately 60 MPs and gather feedback from ISPs as well as parents and many others [but probably not those who actually enjoy adult material on the internet].
The Parliamentary Inquiry into Online Child Protection has begun to take comments from a rather predictably selective group.
The committee has heard comments from the Lucy Faithfull Foundation, the Mothers' Union, YoungMinds, the Marie Collins Foundation, Sonia Livingstone, Professor of Social Psychology at LSE, Jacqui Smith, the Sun's agony aunt Deidre Sanders and
Jerry Barnett, managing director of the UK's largest adult VOD site.
Jacqui Smith, the disgraced former Home Secretary, had a few ideas that caught the interest. She told the Inquiry that online pornography should be made harder to access in Britain, but that the quid pro quo for helping the industry to
remain profitable might be that it could help fund sex education programmes for children.
She said that the online pornography industry is not illegal, and it is being impacted by free and unregulated content on the internet. She proposed that if all adult content were only accessible to customers who specifically opted in to
it through their internet service providers, then the adult industry might see its profits improved. Online porn has suffered economically in the wake of free YouTube-style sites.
She added after the inquiry: If there are restrictions put on to what people can see, that will have a beneficial effect on the industry. If government or ISPs put in place restrictions that does enable the mainstream industry to [recover
economically], that would be the point at which you could apply pressure.
Smith was keen to stress that she did not propose limiting or censoring legal pornography, but that she wanted to make sure only people who were allowed to see it could do so. I genuinely don't think mainstream pornographers want young people
to see their material because it risks limiting what they can make for adults, she said. She conceded that her proposal may be technically challenging.
She said that the adult industry was already in a parlous state and that it would be unlikely to be able to fund education programmes at the moment. She said that although the chances of her proposals coming to fruition are not great, there are reasonable people in the porn industry.
The committee will take evidence from ISPs next month.
Sittingbourne and Sheppey Tory MP Gordon Henderson said unrestricted access to the web and a lack of parental responsibility had created an "everything is free" mentality among a minority of young people.
He is one of more than 60 members of a cross-party group involved in a Parliamentary inquiry into online child protection.
There's a risk of children being groomed by strangers on the internet but it's a relatively low risk because most young people have the nous to not get sucked in. The danger of the internet is more insidious than that.
It's the slow seeping of access to porn images that then slowly erodes the moral fibre of young people, which in turn adds to the social problems we currently face. Much of what we saw with the rioting and looting was due
to a breakdown in morality among young people.
Easy access to the internet just reinforces the message that everything is free and you never have to work for anything. That's got to change.
There's the possibility we overreact and I'm not a great believer in censorship or an internet clampdown. Most children are sensible enough to not put themselves in dangerous situations ...BUT... there are
others who are vulnerable and need protection.
The inquiry has got to look more at parental responsibility and access to the internet rather than a censorship of the internet itself.