Facebook is launching a UK initiative to train and fund local organisations it hopes will combat extremism and hate speech. The UK Online Civil
Courage Initiative's initial partners include Imams Online and the Jo Cox Foundation.
The recent terror attacks in London and Manchester - like violence anywhere - are absolutely heartbreaking. No-one should have to live in fear of terrorism - and we all have a part to play in stopping violent extremism from spreading. We know we
have more to do - but through our platform, our partners and our community we will continue to learn to keep violence and extremism off Facebook.
Last week Facebook outlined its technical measures to remove terrorist-related content from its site. The company told the BBC it was using artificial intelligence to spot images, videos and text related to terrorism, as well as clusters of fake accounts.
Facebook explained that it was aiming to detect terrorist content immediately as it is posted and before other Facebook users see it. If someone tries to upload a terrorist photo or video, the systems look to see if this matches previous known
extremist content to stop it going up in the first place.
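The matching step described above can be thought of as comparing a fingerprint of the new upload against a database of fingerprints of previously removed material. The sketch below is purely illustrative and not Facebook's actual system: it uses an exact cryptographic hash for simplicity, whereas production systems use perceptual hashes that still match after re-encoding or cropping; the database contents here are invented.

```python
import hashlib

# Hypothetical database of fingerprints of previously removed content.
KNOWN_EXTREMIST_HASHES = {
    hashlib.sha256(b"previously-removed-video-bytes").hexdigest(),
}

def fingerprint(media_bytes: bytes) -> str:
    """Compute a content fingerprint (SHA-256 here; real systems use perceptual hashes)."""
    return hashlib.sha256(media_bytes).hexdigest()

def should_block_upload(media_bytes: bytes) -> bool:
    """Block the upload before it goes up if it matches known removed content."""
    return fingerprint(media_bytes) in KNOWN_EXTREMIST_HASHES
```

An exact hash only catches byte-identical re-uploads, which is why systems of this kind rely on perceptual hashing instead.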
A second area of work is experimenting with AI to understand text that might be advocating terrorism. This involves analysing text previously removed for praising or supporting a group such as IS, and trying to work out text-based signals that such content may be terrorist propaganda.
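The idea of text-based signals can be sketched as a scorer that weights phrases learned from previously removed posts. This is a toy stand-in for the trained machine-learning classifiers the article describes; the phrases, weights and threshold below are all invented for illustration.

```python
# Invented signal weights standing in for features a trained classifier would learn.
SIGNAL_WEIGHTS = {
    "join": 0.2,
    "martyr": 0.5,
    "caliphate": 0.6,
    "pledge allegiance": 0.7,
}
THRESHOLD = 0.8  # hypothetical cutoff for flagging a post

def propaganda_score(text: str) -> float:
    """Sum the weights of known text signals found in the post."""
    lowered = text.lower()
    return sum(w for phrase, w in SIGNAL_WEIGHTS.items() if phrase in lowered)

def flag_for_review(text: str) -> bool:
    """Flag a post whose combined signal score crosses the threshold."""
    return propaganda_score(text) >= THRESHOLD
```

A real classifier would also weigh context, which is exactly the hard part noted later: the same words appear in news reporting and counter-extremism material.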
The company says it is also using algorithms to detect clusters of accounts or images relating to support for terrorism. This will involve looking for signals such as whether an account is friends with a high number of accounts that have been
disabled for supporting terrorism. The company also says it is working on ways to keep pace with repeat offenders who create accounts just to post terrorist material and look for ways of circumventing existing systems and controls.
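The friendship signal mentioned above reduces to a simple ratio check: what fraction of an account's friends have already been disabled for supporting terrorism? The sketch below illustrates that one signal in isolation; the threshold is hypothetical, as the article does not say what cutoff, or combination of signals, is actually used.

```python
def disabled_friend_ratio(friend_ids, disabled_ids) -> float:
    """Fraction of an account's friends that were disabled for supporting terrorism."""
    friends = set(friend_ids)
    if not friends:
        return 0.0
    return len(friends & set(disabled_ids)) / len(friends)

def is_suspicious(friend_ids, disabled_ids, threshold=0.5) -> bool:
    # Hypothetical threshold; in practice this would be one signal among many.
    return disabled_friend_ratio(friend_ids, disabled_ids) >= threshold
```

In a production system a signal like this would feed into a broader model rather than trigger removal on its own.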
Facebook has previously announced it is adding 3,000 employees to review content flagged by users. But it also says that already more than half of the accounts that it removes for supporting terrorism are ones that it finds itself. Facebook
says it has also grown its team of specialists so that it now has 150 people working on counter-terrorism specifically, including academic experts on counterterrorism, former prosecutors, former law enforcement agents and analysts, and engineers.
One of the major challenges in automating the process is the risk of taking down material relating to terrorism but not actually supporting it - such as news articles referring to an IS propaganda video that might feature its text or images. An
image relating to terrorism - such as an IS member waving a flag - can be used to glorify an act in one context or be used as part of a counter-extremism campaign in another.
A song accusing Theresa May of being a liar has reached number three in the iTunes chart and the top 10 radio charts. Liar Liar GE2017, produced and performed by Captain Ska, skewers the Prime Minister on the NHS, education and poverty,
and her party's several recent U-turns including calling the snap election.
The easy-to-sing-along melody and chorus, "She's a liar, liar, you can't trust her, no no no no", have helped the song to number three. Profits generated from downloads between 26 May and 8 June will be split between food banks in the UK and the
People's Assembly Against Austerity.
Radio stations have refused to play the song. The Big Top 40 Show on Capital FM and Heart announced the song had made the Top 10, but skipped over the song, which was in ninth position.
A petition calling on radio stations to play the song and oppose censorship has been signed by about 3,000 people.
A Surrey council has introduced a policy allowing parents with babies to attend 15- and 18-rated films at cinemas in the district.
Although BBFC 15 and 18 certificates specify that nobody under that age can attend cinema screenings, councils are the ultimate authority for specifying rules and licensing conditions for cinemas in their areas.
Parents are now being offered the chance to watch 15- and 18-rated films with their young children under Tandridge District Council rules.
Some mothers and fathers in the council area had expressed their wish to watch more adult films in parent and baby cinema club screenings.
Tandridge Council has decided to enable this, in theory giving parents the opportunity to watch Quentin Tarantino's Pulp Fiction or Stanley Kubrick's A Clockwork Orange with their children.
However, council officers will decide what is and isn't appropriate viewing on a case-by-case basis. The council said:
It is anticipated that scenes of strong violence and gore, sex and strong threat will lead to greater concern around viewing by children of that age than will strong language, mild nudity and discriminatory content.
This approach will only apply for screenings advertised and restricted to 'parent and baby' only.
A police unit to censor online insults and hate crime has been launched by London's mayor, Sadiq Khan.
The Online Hate Crime Hub is made up of five Met police officers who will try to identify, prevent and investigate online abuse. Sadiq Khan said officers would work with community 'experts' to develop the police's understanding of online hate .
The unit will cost £1.7m over two years. It is being funded by the Met and the Mayor's Office for Policing and Crime (MOPAC), with £452,000 also being contributed by the Home Office Police Innovation Fund.
Any online insults and hate crimes on the likes of Twitter and Facebook will be looked into by the unit.
City Hall said discussions were also under way with social media companies to develop appropriate online sanctions for perpetrators of online hate .
Offsite Comment: All hail Sadiq Khan's new Ministry of Truth
We're calling on social networks to be regulated and fined when they fail to protect children, after it was revealed that four out of five children feel social media companies aren't doing enough to protect them.
Out of 1,696 children and young people who took part in our Net Aware research, 1,380 thought social media sites needed to do more to protect them from inappropriate or harmful content. When asked about what they were coming across
on social media sites, children reported seeing bullying and hatred.
We're calling on Government to draw up minimum standards that internet companies must meet to safeguard children. These standards must include:
age-ratings in line with those for films set by the British Board of Film Classification
safe accounts automatically offered to under-18s -- with default privacy settings, proactive filtering of harmful content and mechanisms to guard against grooming
The Portman Group is a drinks industry panel which investigates complaints about the marketing of alcoholic drinks. In the latest case:
A complaint about the packaging of Old English Gin promoting excessive drinking has not been upheld by the Independent Complaints Panel (Panel).
The complainant believed that because the product is sealed with a wine-style cork it is less practical than a more usual spirit closure, and will encourage purchasers to drink the bottle more quickly than they would otherwise.
The Panel was presented with a sealed bottle of Old English Gin to gauge whether it could be opened easily. The Panel proceeded to open the bottle and reseal it with the cork. Whilst disappointed with the company's short response, the Panel found that the bottle was straightforward to reseal, with the brand name etched upside down on the cork so that when it was inserted into the neck the writing was the right way up. The Panel noted this design feature, and that the product was unlikely to deteriorate quickly and therefore would not encourage consumers to drink the product more quickly. The Panel therefore concluded that the product did not breach the Code.