Luke Evans being touched up to look good in a photo
Conservative MP Luke Evans has drawn up a Private Member's Bill that would require celebrities to label images which have been digitally altered to change how they look.
Evans, a GP and a member of the Health and Social Care Committee, claimed
that edited photos on social media were fuelling a mental health crisis by creating a warped view of beauty. Celebrities such as Lauren Goodger and Khloe Kardashian have been criticised for doctoring their photos on Instagram.
Former culture secretary Jeremy Wright is setting up an All-Party Parliamentary Group (APPG) to campaign for internet censorship
Wright, who drew up the Government's white paper proposing strict sanctions on tech platforms that fail
to protect users under a duty of care, is particularly calling for censorship powers to block, ban, fine or restrict apps and websites considered undesirable by the proposed internet censor, Ofcom. Wright said:
There needs to be a lot more clubs in the bag for the regulator than just fines, he said. I do think we need to consider criminal liability for individual (tech company) directors where it can be demonstrated.
He also felt the regulator
should have powers of ISP blocking, which effectively bar an app from the UK, in cases of companies repeatedly and egregiously refusing to comply with censorship rules. He said:
I do accept the chances of
WhatsApp being turned off are remote. Although frankly, there may be circumstances where that may be the right thing to do and we shouldn't take it off the table.
Wright is founding the APPG alongside crossbench peer and children's
digital rights campaigner Beeban Kidron, and the group has already attracted supporters, including three other former culture secretaries: Baroness Nicky Morgan, Karen Bradley and Maria Miller, as well as former Health and Foreign Secretary Jeremy Hunt.
More censorship legislation is needed to protect people online after social media giants' failure to tackle hate speech on their websites, claims the Labour Party.
Jo Stevens, shadow secretary of state for digital, culture, media and sport, claimed
the UK desperately needed legislation forcing platforms to act because self-regulation isn't working.
The Labour party is accusing the Government of delaying the introduction of an online harms bill to protect Internet users. It comes after
politicians and campaigners condemned Twitter for being too slow to remove anti-Semitic tweets by rapper Wiley.
The Mayor of London Sadiq Khan said he has written to Instagram and Twitter to make it clear that they need to act immediately to
remove social media posts that Labour does not like.
A parliamentary committee looking into supposed 'fake news' is calling for more internet censorship. It writes:
The Digital, Culture, Media and Sport Committee calls for Government to appoint new Online Harms Regulator
Online misinformation about Covid-19 was allowed to spread virulently across social media without the protections offered by legislation promised by the Government 15 months ago.
The Misinformation in
the COVID-19 Infodemic Report details evidence on a range of harms from dangerous hoax treatments to conspiracy theories that led to attacks on 5G engineers.
The Online Harms White Paper, published in April 2019, proposed a duty
of care on tech companies and an independent Online Harms Regulator, both key recommendations from the predecessor DCMS Committee.
MPs voice new concerns that the delayed legislation will not address the harms caused by
misinformation and disinformation, a serious omission that would ignore the lessons of the Covid crisis.
The Report finds that tech companies use business models that disincentivise action against misinformation while affording
opportunities to bad actors to monetise misleading content. As a result the public is reliant on the good will of tech companies or the bad press they attract to compel them to act.
The DCMS Committee calls for the Government to
make a final decision on the appointment of the regulator now.
The report summary reads:
In February, the World Health Organisation warned that, alongside the outbreak of COVID-19, the world faced
an infodemic, an unprecedented overabundance of information, both accurate and false, that prevented people from accessing authoritative, reliable guidance about the virus. The infodemic has allowed for harmful misinformation, disinformation, scams and
cybercrime to spread. False narratives have resulted in people harming themselves by resorting to dangerous hoax cures or forgoing medical treatment altogether. There have been attacks on frontline workers and critical national infrastructure as a result
of alarmist conspiracy theories.
The UK Government is currently developing proposals for online harms legislation that would impose a duty of care on tech companies. Whilst not a silver bullet in addressing harmful content, this
legislation is expected to give a new online harms regulator the power to investigate and sanction tech companies. Even so, legislation has been delayed. As yet, the Government has not produced the final response to its consultation (which closed over a
year ago), voluntary interim codes of practice, or a media literacy strategy. Moreover, there are concerns that the proposed legislation will not address the harms caused by misinformation and disinformation and will not contain necessary sanctions for
tech companies who fail in their duty of care.
We have conducted an inquiry into the impact of misinformation about COVID-19, and the efforts of tech companies and relevant public sector bodies to tackle it. This has presented an
opportunity to scrutinise how online harms proposals might work in practice. Whilst tech companies have introduced new ways of tackling misinformation through the introduction of warning labels and tools to correct the record, these innovations have been
applied inconsistently, particularly in the case of high-profile accounts. Platform policies have also been too slow to adapt, while automated content moderation at the expense of human review and user reporting has had limited effectiveness. The
business models of tech companies themselves disincentivise action against misinformation while affording opportunities to bad actors to monetise misleading content. At least until well-drafted, robust legislation is brought forward, the public is
reliant on the goodwill of tech companies, or the bad press they attract, to compel them to act.
During the crisis the public have turned to public service broadcasting as the main and most trusted source of information. Beyond
broadcasting, public service broadcasters (PSBs) have contributed through fact-checking and media literacy initiatives and through engagement with tech companies. The Government has also acted against misinformation by reforming its Counter
Disinformation Unit to co-ordinate its response and tasked its Rapid Response Unit with refuting seventy pieces of misinformation a week. We have raised concerns, however, that the Government has been duplicating the efforts of other organisations in
this field and could have taken a more active role in resourcing an offline, digital literacy-focused response. Finally, we have considered the work of Ofcom, as the Government's current preferred candidate for online harms regulator, as part of our
discussion of online harms proposals. We call on the Government to make a final decision now on the online harms regulator to begin laying the groundwork for legislation to come into effect.
The House of Lords Gambling Committee claims that video game loot boxes should be regulated under gambling laws.
The Lords claim that loot boxes should be classified as games of chance, which would bring them under the Gambling Act 2005. If a
product looks like gambling and feels like gambling, it should be regulated as gambling, their report says. And they warn that such a change should not wait.
In reality the regulation of gambling is an entirely different kettle of fish, concerned with
adult entertainment and significant sums of money being lost. Surely the monetising of games through loot boxes would be better dealt with by those with expertise in child psychology.
Ex-BBC boss Michael Grade, chairman of the committee,
told BBC Breakfast that lots of other countries have already started to regulate loot boxes because they can see the danger of teaching kids to gamble. He said the Gambling Act was way behind what was actually happening in the market, but added
that the overwhelming majority of the report's recommendations could be enacted today as they don't require legislation.
Gambling Harms: Time for Action Report: Key recommendations
Committee sets out a range of recommendations across different areas to reduce gambling-related harm.
The gambling industry offers a variety of products to consumers, including some which can be highly addictive. The Gambling Commission should create a system for testing all new games against a series of harm indicators,
including their addictiveness and whether they will appeal to children. A game which scores too highly on the harm indicators must not be approved.
The equalisation of speed of play and spin, so that no game can be
played quicker online than in a casino, bookmaker or bingo hall.
The Gambling Commission must explain the minimum steps which operators should take when considering customer affordability, and make clear that it is
for the operator to take the steps which will enable them to identify customers who are betting more than they can afford.
The creation of a statutory independent Gambling Ombudsman Service, modelled on the Financial
Ombudsman Service, to settle disputes between gambling operators and gamblers.
The Government must act immediately to bring loot boxes within the remit of gambling legislation and regulation.
Gambling operators should no longer be allowed to advertise on the shirts of sports teams or any other visible part of their kit. There should also be no gambling advertising in or near any sports grounds or sports venues.
Problem gambling is a common mental health disorder, and the NHS has the same duty to treat it as to treat any other disorder. Last year the NHS promised to open 15 new clinics. It should do this before 2023 and establish a
comparable number within the following few years.