Melon Farmers Original Version

Internet News


2019: February


 

The BBFC didn't turn up...

Open Rights Group met to discuss the BBFC's age verification scheme with its voluntary privacy protection


Link Here 28th February 2019

We met to discuss BBFC's voluntary age verification privacy scheme, but BBFC did not attend. Open Rights Group met a number of age verification providers to discuss the privacy standards that they will be meeting when the scheme launches, slated for April. Up to 20 million UK adults are expected to sign up to these products.

We invited all the AV providers we know about, and most importantly, the BBFC, at the start of February. BBFC are about to launch a voluntary privacy standard which some of the providers will sign up to. Unfortunately, BBFC have not committed to any public consultation about the scheme, relying instead on a commercial provider to draft the contents with providers, but without wider feedback from privacy experts and people who are concerned about users.

We held the meeting close to the BBFC's offices so that it would be convenient for them to send someone who might be able to discuss this with us. We have been asking for meetings with BBFC about the privacy issues in the new code since October 2018, but received no reply or acknowledgement of our requests until this morning, when BBFC said they would be unable to attend today's roundtable. This is very disappointing.

BBFC's failure to consult the public about this standard, or even to meet us to discuss our concerns, is alarming. We can understand that BBFC is cautious and does not wish to overstep its relationship with its new masters at DCMS. BBFC may be worried about ORG's attitude towards the scheme: and we certainly are critical. However, it is irresponsible for a regulator to refuse to talk to its potential critics.

We are very clear about our objectives. We are acting to do our best to ensure that the risks to adult users of age verification technologies are minimised. We do not pose a threat to the scheme as a whole: listening to us can only make the pornographic age verification scheme more likely to succeed, and, for instance, help it avoid catastrophic failures.

Privacy concerns appear to have been recognised by BBFC and DCMS as a result of consultation responses from ORG supporters and others, which resulted in the voluntary privacy standard. These concerns have also been highlighted by Parliament, whose regulatory committee expressed surprise that the Digital Economy Act 2017 had contained no provision to deal with the privacy implications of pornographic age verification.

Today's meeting was held to discuss:

  • What the scheme is likely to cover; and what it ideally should cover;

  • Whether there is any prospect of making the scheme compulsory;

  • What should be done about non-compliant services;

  • What the governance of the scheme should be in the long term, for instance whether it might be suitable to become an ICO-backed code, or to complement such a code.

As we communicated to BBFC in December 2018, we have considerable worries about the lack of consultation over the standard they are writing, which appears to be truncated in order to meet the artificial deadline of April this year. This is what we explained to BBFC in our email:

  • Security requires as many perspectives to be considered as possible.

  • The best security standards, e.g. PCI DSS, are developed in the open and iterated.

  • The standards will be best if those with most to lose are involved in the design.
     

    • For PCI DSS, the banks and their customers have more to lose than the processors.

    • For Age Verification, site users have more to lose than the processors; however, only the processors seem likely to be involved in setting the standard.

We look forward to BBFC agreeing to meet us to discuss the outcome of the roundtable we held about their scheme, and to discuss our concerns about the new voluntary privacy standard. Meanwhile, we will produce a note from the meeting, which we believe was useful. It covered the concerns above, and issues around timing, as well as strategies for getting government to adjust their view of the absence of compulsory standards, which many of the providers want. In this, BBFC are a critical actor. As a result of the meeting, ORG also intends to produce a note explaining what an effective privacy scheme would cover, in terms of scope, risks to mitigate, governance and enforcement for participants.

 

 

Akin to the actions of a totalitarian state....

Chelsea Russell's ridiculous conviction for quoting rap lyrics quashed on appeal


Link Here 27th February 2019
Full story: Insulting UK Law...UK prosecutions of jokes and insults on social media

In 2017, Chelsea Russell, a Liverpool teenager with Asperger's syndrome, paid tribute on her Instagram profile to a 13-year-old friend who died when he was hit by a car. She quoted the lyrics of a rap song, I'm Trippin' by Snap Dogg, alongside the phrase 'RIP Frankie Murphy'. Many other teenagers used the lyrics to pay tribute to Murphy.

A year later, Russell's profile came to the attention of the police, who decided to arrest her and have her charged. The lyrics she quoted, 'Kill a snitch nigga, rob a rich nigga', were found in court to be grossly offensive and Russell was convicted of a hate crime. For nothing more than quoting rap lyrics, she was placed on an eight-week, 8am-to-8pm curfew, fitted with an ankle tag, and fined £585.

Last week, the conviction was overturned on appeal. Russell's defence lawyer slammed the initial verdict as ridiculous, akin to the actions of a totalitarian state.

Offsite Comment: Chelsea Russell and the depravity of PC

27th February 2019. See article from spiked-online.com by Fraser Myers

 

 

Updated: Right wronged...

Facebook censors Tommy Robinson's page


Link Here 27th February 2019
Full story: Facebook Censorship...Facebook quick to censor
Tommy Robinson has been permanently banned from Facebook and sister website Instagram. In a blogpost, Facebook said:

When ideas and opinions cross the line and amount to hate speech that may create an environment of intimidation and exclusion for certain groups in society -- in some cases with potentially dangerous offline implications -- we take action. Tommy Robinson's Facebook page has repeatedly broken these standards, posting material that uses dehumanizing language and calls for violence targeted at Muslims. He has also behaved in ways that violate our policies around organized hate.

Robinson is already banned from Twitter and the decision to cut him off from Instagram and Facebook will leave him reliant on YouTube as the only major online platform to provide him with a presence.

The ban comes a month after Facebook issued a final written warning against Robinson, warning him that he would be removed from its platform permanently if he continued to break the company's hate speech policies.

Mainstream outlets have struggled to deal with Robinson. When he was interviewed by Sky News last year, Robinson responded by uploading an unedited video of the discussion showing that Sky News did in fact mislead viewers by mixing and matching questions to answers to make Robinson look bad. The video became an online success and was shared far more widely online than the original interview.

Robinson adopted a similar tactic with the BBC's Panorama, which is investigating the far-right activist. Two weeks ago, Robinson agreed to be interviewed by the programme, only to turn the tables on reporter John Sweeney by revealing he had sent an associate undercover to film the BBC reporter.

Several other accounts were removed from Facebook on Tuesday, including one belonging to former Breitbart London editor Raheem Kassam.

Update: BBC receives complaints about Panorama

27th February 2019. See article from bbc.co.uk

Complaint

We received complaints following the third party release of secretly recorded material related to a BBC Panorama investigation.

BBC Response

 BBC Panorama is investigating Tommy Robinson, whose real name is Stephen Yaxley-Lennon. The BBC strongly rejects any suggestion that our journalism is faked or biased. Any programme we broadcast will adhere to the BBC's strict editorial guidelines. BBC Panorama's investigation will continue.

John Sweeney made some offensive and inappropriate remarks whilst being secretly recorded, for which he apologises. The BBC has a strict expenses policy and the drinks bill in this video was paid for in full by John.

Offsite Comment: Why Tommy Robinson should not be banned

27th February 2019. See article from spiked-online.com by Brendan O'Neill

Facebook and Instagram's ban confirms that corporate censorship is out of control.

 

 

Updated: Destroying European livelihoods...

The text of Article 13 and the EU Copyright Directive has just been finalised in its worst form yet


Link Here 27th February 2019
Full story: Copyright in the EU...Copyright law for Europe

On the evening of February 13, negotiators from the European Parliament and the Council concluded the trilogue negotiations with a final text for the new EU Copyright Directive.

For two years we've debated different drafts and versions of the controversial Articles 11 and 13. Now, there is no more ambiguity: This law will fundamentally change the internet as we know it -- if it is adopted in the upcoming final vote. But we can still prevent that!

Please click the links to take a look at the final wording of Article 11 and Article 13. Here's my summary:

Article 13: Upload filters

Parliament negotiator Axel Voss accepted the deal between France and Germany I laid out in a recent blog post:

  • Commercial sites and apps where users can post material must make "best efforts" to preemptively buy licences for anything that users may possibly upload -- that is: all copyrighted content in the world. An impossible feat.

  • In addition, all but very few sites (those both tiny and very new) will need to do everything in their power to prevent anything from ever going online that may be an unauthorised copy of a work that a rightsholder has registered with the platform. They will have no choice but to deploy upload filters, which are by their nature both expensive and error-prone.

  • Should a court ever find their licensing or filtering efforts not fierce enough, sites are directly liable for infringements as if they had committed them themselves. This massive threat will lead platforms to over-comply with these rules to stay on the safe side, further worsening the impact on our freedom of speech.

Article 11: The "link tax"

The final version of this extra copyright for news sites closely resembles the version that already failed in Germany -- only this time not limited to search engines and news aggregators, meaning it will do damage to a lot more websites.

  • Reproducing more than "single words or very short extracts" of news stories will require a licence. That will likely cover many of the snippets commonly shown alongside links today in order to give you an idea of what they lead to. We will have to wait and see how courts interpret what "very short" means in practice -- until then, hyperlinking (with snippets) will be mired in legal uncertainty.

  • No exceptions are made even for services run by individuals, small companies or non-profits, which probably includes any monetised blogs or websites.

Other provisions

The project to allow Europeans to conduct Text and Data Mining, crucial for modern research and the development of artificial intelligence, has been obstructed with too many caveats and requirements. Rightsholders can opt out of having their works datamined by anyone except research organisations.

Authors' rights: The Parliament's proposal that authors should have a right to proportionate remuneration has been severely watered down: Total buy-out contracts will continue to be the norm.

Minor improvements for access to cultural heritage: Libraries will be able to publish out-of-commerce works online and museums will no longer be able to claim copyright on photographs of centuries-old paintings.

 

How we got here

Former Digital Commissioner Oettinger proposed the law.

The history of this law is a shameful one. From the very beginning, the purpose of Articles 11 and 13 was never to solve clearly-defined issues in copyright law with well-assessed measures, but to serve powerful special interests, with hardly any concern for the collateral damage caused.

In the relentless pursuit of this goal, concerns by independent academics, fundamental rights defenders, independent publishers, startups and many others were ignored. At times, confusion was spread about crystal-clear contrary evidence. Parliament negotiator Axel Voss defamed the unprecedented protest of millions of internet users as "built on lies".

In his conservative EPP group, the driving force behind this law, dissenters were marginalised. The work of their initially-appointed representative was thrown out after the conclusions she reached were too sensible. Mr Voss then voted so blindly in favour of any and all restrictive measures that he was caught by surprise by some of the nonsense he had gotten approved. His party, the German CDU/CSU, nonchalantly violated the coalition agreement they had signed (which rejected upload filters), paying no mind to their own minister for digital issues.

It took efforts equally herculean and sisyphean across party lines to prevent the text from turning out even worse than it now is.

In the end, a closed-door horse trade between France and Germany was enough to outweigh the objections... so far.

What's important to note, though: It's not "the EU" in general that is to blame -- but those who put special interests above fundamental rights who currently hold considerable power. You can change that at the polls! The anti-EU far right is trying to seize this opportunity to promote their narrow-minded nationalist agenda -- when in fact without the persistent support of the far-right ENF Group (dominated by the Rassemblement/Front National) the law could have been stopped in the crucial Legal Affairs Committee and in general would not be as extreme as it is today.

 

We can still stop this law

Our best chance to stop the EU copyright law: The upcoming Parliament vote.

The Parliament and Council negotiators who agreed on the final text now return to their institutions seeking approval of the result. If it passes both votes unchanged, it becomes EU law, which member states are forced to implement into national law.

In both bodies, there is resistance.

The Parliament's process starts with the approval by the Legal Affairs Committee -- which is likely to be given on Monday, February 18.

Next, at a date to be announced, the EU member state governments will vote in the Council. The law can be stopped here either by 13 member state governments or by any number of governments who together represent 35% of the EU population (calculator). Last time, 8 countries representing 27% of the population were opposed. Either a large country like Germany or several small ones would need to change their minds: this is the less likely way to stop it.
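The Council blocking-minority rule described above can be sketched as a simple check: the law is stopped if at least 13 governments object, or if the objectors together represent at least 35% of the EU population. The function below is an illustrative sketch only; the population shares are placeholders, not official figures.

```python
def can_block(objecting_shares):
    """objecting_shares: population shares (fractions of the EU total)
    of the member states voting against. Returns True if they form a
    blocking minority under either test described in the article."""
    return len(objecting_shares) >= 13 or sum(objecting_shares) >= 0.35

# Last time: 8 countries representing 27% of the population -- not enough.
print(can_block([0.27 / 8] * 8))   # 8 < 13 states and 27% < 35%
```

As the article notes, one large country switching sides can matter more than several small ones, because the 35% test weighs population rather than counting states.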

Our best bet: The final vote in the plenary of the European Parliament, when all 751 MEPs, directly elected to represent the people, have a vote. This will take place either between March 25 and 28, on April 4 or between April 15 and 18. We've already demonstrated last July that a majority against a bad copyright proposal is achievable.

The plenary can vote to kill the bill -- or to make changes, like removing Articles 11 and 13. In the latter case, it's up to the Council to decide whether to accept these changes (the Directive then becomes law without these articles) or to shelve the project until after the EU elections in May, which will reshuffle all the cards.

This is where you come in

The final Parliament vote will happen mere weeks before the EU elections. Most MEPs -- and certainly all parties -- are going to be seeking reelection. Articles 11 and 13 will be defeated if enough voters make these issues relevant to the campaigns. (Here's how to vote in the EU elections -- change the language to one of your country's official ones for specific information.)

It is up to you to make clear to your representatives: Their vote on whether to break the internet with Articles 11 and 13 will make or break your vote in the EU elections. Be insistent -- but please always stay polite.

Together, we can still stop this law.

Update: What next?

27th February 2019. Via Julia Reda on Twitter

For the record the draft legislation was approved by 16 votes to 9 with no abstentions.

The final vote in Parliament will take place during the March II plenary session, 25th-28th March.

 

 

Wrongs righted...

Facebook restores Russia Today page that it recently censored


Link Here 26th February 2019
Full story: Facebook Censorship...Facebook quick to censor
Facebook has restored several RT-linked pages a week after it blocked them without prior notice. The pages were only freed-up after their administrators posted data about their management and funding.

The Facebook pages of InTheNow, Soapbox, Back Then and Waste-Ed -- all operated by the Germany-based company Maffick Media -- were made accessible again as of Monday evening.

Facebook said in a statement at the time of the ban that it wants the pages' administrators to reveal their ties to Russia to their audience in the name of greater transparency. Facebook's measure was taken following a CNN report, which ludicrously accused the pages of concealing their ties to the Kremlin, even though their administrators had never actually made a secret of their relations to Ruptly and RT. In fact RT is, very blatantly, a propaganda channel supporting Russia.

Maffick CEO Anissa Naouai revealed that the social media giant agreed to unblock the pages, but only after their administration "updated our 'About' section, in a manner NO other page has been required to do". The accounts now indeed feature information related to their funding and management, visible under the pages' logos.

 

 

Updated: Waging war...

Bangladesh ISPs block porn and gambling on orders of the government.


Link Here 26th February 2019
Bangladesh internet censors have blocked almost 20,000 websites as part of an anti-pornography campaign, a minister has reported.

ISPs have blocked pornography and gambling websites in the past week under orders from the telecommunications censor. Mustafa Jabbar, the posts and telecommunications minister, told AFP:

I want to create a safe and secure internet for all Bangladeshis, including children. And this is my war against pornography. And this will be a continuous war.

Popular social media apps such as TikTok and Bigo - which authorities believe are misused - have also been blocked.

Update: And of course the government took the opportunity to ban a few other things too

26th February 2019. See article from dw.com

While most of the blocked sites are foreign, a few local websites and social media platforms have also been targeted by the government censorship. One of these websites, somewhereinblog.net, is the largest Bengali-language community blog platform in the world.

The post and telecommunications minister blamed the site for spreading atheism in Bangladesh.

A group of 33 Bangladeshi university teachers, journalists, bloggers, and activists have demanded that the government lift the ban on the blog platform immediately.

 

 

Throwing Creators Under the Bus...

Cory Doctorow explains why EU censorship measures will spell disaster for the livelihoods of European creators


Link Here 25th February 2019
Full story: Copyright in the EU...Copyright law for Europe

Article 13 is the on-again / off-again controversial proposal to make virtually every online community, service, and platform legally liable for any infringing material posted by their users, even very briefly, even if there was no conceivable way for the online service provider to know that a copyright infringement had taken place.

This will require unimaginable sums of money to even attempt, and the attempt will fail. The outcome of Article 13 will be a radical contraction of alternatives to the U.S. Big Tech platforms and the giant media conglomerates. That means that media companies will be able to pay creators less for their work, because creators will have no alternative to the multinational entertainment giants.

Throwing Creators Under the Bus

The media companies lured creators' groups into supporting Article 13 by arguing that media companies and the creators they distribute have the same interests. But in the endgame of Article 13, the media companies threw their creator colleagues under the bus, calling for the deletion of clauses that protect artists' rights to fair compensation from media companies, prompting entirely justifiable howls of outrage from those betrayed artists' rights groups.

But the reality is that Article 13 was always going to be bad for creators. At best, all Article 13 could hope for was to move a few euros from Big Tech's balance-sheet to Big Content's balance-sheet (and that would likely be a temporary situation). Because Article 13 would reduce the options for creators by crushing independent media and tech companies, any windfalls that media companies made would go to their executives and shareholders, not to the artists who would have no alternative but to suck it up and take what they're offered.

After all: when was the last time a media company celebrated a particularly profitable year by increasing their royalty rates?

It Was Always Going to Be Filters

The initial versions of Article 13 required companies to build copyright filters, modeled after YouTube's "Content ID" system: YouTube invites a select group of trusted rightsholders to upload samples of works they claim as their copyright, and then blocks (or diverts revenue from) any user's video that seems to match these copyright claims.

There are many problems with this system. On the one hand, giant media companies complain that such filters are far too easy for dedicated infringers to defeat; and on the other hand, Content ID ensnares all kinds of legitimate forms of expression, including silence, birdsong, and music uploaded by the actual artist for distribution on YouTube. Sometimes, this is because a rightsholder has falsely claimed copyrights that don't belong to them; sometimes, it's because Content ID generated a "false positive" (that is, made a mistake); and sometimes it's because software just can't tell the difference between an infringing use of a copyrighted work and a use that falls under "fair dealing," like criticism, commentary, parody, etc. No one has trained an algorithm to recognise parody, and no one is likely to do so any time soon (it would be great if we could train humans to reliably recognise parody!).

Copyright filters are a terrible idea. Google has spent a reported $100 million (and counting) to build a very limited copyright filter that only looks at videos and only blocks submissions from a select group of pre-vetted rightsholders. Article 13 covers all possible copyrighted works: text, audio, video, still photographs, software, translations. And some versions of Article 13 have required platforms to block infringing publications of every copyrighted work, even those that no one has told them about: somehow, your community message-board for dog-fanciers is going to have to block its users from plagiarising 50-year-old newspaper articles, posts from other message-boards, photos downloaded from social media, etc. Even the milder "compromise" versions of Article 13 required online services to block publication of anything they'd been informed about, with dire penalties for failing to honour a claim, and no penalties for bogus claims.

But even as filters block things that aren't copyright infringement, they still allow dedicated infringers to operate with few hindrances. That's because filters use relatively simple, static techniques to inspect user uploads, and infringers can probe the filters' blind-spots for free, trying different techniques until they hit on ways to get around them. For example, some image filters can be bypassed by flipping the picture from left to right, or rendering it in black-and-white instead of color. Filters are "black boxes" that can be repeatedly tested by dedicated infringers to see what gets through.
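The flip-the-image bypass can be illustrated with a deliberately naive sketch (this is not how Content ID actually works, which is far more sophisticated): a filter that fingerprints exact pixel data produces a completely different hash once the image is mirrored, so the trivially altered copy sails past the blocklist.

```python
# Toy demonstration: an exact hash over pixel data changes completely
# when the image is mirrored left-to-right, so a naive filter misses it.
import hashlib

def fingerprint(pixels):
    """Hash a tiny 'image' given as rows of greyscale byte values (0-255)."""
    flat = bytes(v for row in pixels for v in row)
    return hashlib.sha256(flat).hexdigest()

original = [[10, 20, 30],
            [40, 50, 60]]
mirrored = [list(reversed(row)) for row in original]  # flip left-to-right

blocklist = {fingerprint(original)}  # the registered work

print(fingerprint(original) in blocklist)  # True: the exact copy is caught
print(fingerprint(mirrored) in blocklist)  # False: the flip slips through
```

Real filters use perceptual rather than exact matching, but the underlying asymmetry the article describes remains: infringers can probe a static black box at leisure, while wrongly blocked users get no such feedback loop.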

For non-infringers -- the dolphins caught in copyright's tuna-nets -- there is no underground of tipsters who will share defeat-techniques to help get your content unstuck. If you're an AIDS researcher whose videos have been falsely claimed by AIDS deniers in order to censor them, or police brutality activists whose bodycam videos have been blocked by police departments looking to evade criticism, you are already operating at the limit of your abilities, just pursuing your own cause. You can try to become a filter-busting expert in addition to your research, activism, or communications, but there are only so many hours in a day, and the overlap between people with something to say and people who can figure out how to evade overzealous (or corrupted) copyright filters just isn't very large.

All of this put filters into such bad odor that mention of them was purged from Article 13, but despite obfuscation, it was clear that Article 13's purpose was to mandate filters: there's just no way to imagine that every tweet, Facebook update, message-board comment, social media photo, and other piece of user-generated content could be evaluated for copyright compliance without an automated system. And once you make online forums liable for their users' infringement, they have to find some way to evaluate everything their users post.

Just Because Artists Support Media Companies, It Doesn't Mean Media Companies Support Artists

Spending hundreds of millions of euros to build filters that don't stop infringers but do improperly censor legitimate materials (whether due to malice, incompetence, or sloppiness) will not put any money in artists' pockets.

Which is not to say that these won't tilt the balance towards media companies (at least for a while). Because filters will always fail at least some of the time, and because Article 13 doesn't exempt companies from liability when this happens, Big Tech will have to come to some kind of accommodation with the biggest media companies -- Get Out Of Jail cards, along with back-channels that media companies can use to get their own material unstuck when it is mistakenly blocked by a filter. (It's amazing how often one part of a large media conglomerate will take down its own content, uploaded by another part of the same sprawling giant.)

But it's pretty naive to imagine that transferring money from Big Tech to Big Content will enrich artists. Indeed, since there's no way that smaller European tech companies can afford to comply with Article 13, artists will have no alternative but to sign up with the major media companies, even if they don't like the deal they're offered.

Smaller companies play an important role today in the EU tech ecosystem. There are national alternatives to Instagram, Google, and Facebook that outperform U.S. Big Tech in their countries of origin. These will not survive contact with Article 13. Article 13's tiny exemptions for smaller tech companies were always mere ornaments, and the latest version of Article 13 renders them useless.

Smaller tech companies will also be unable to manage the inevitable flood of claims by copyright trolls and petty grifters who see an opportunity.

Smaller media companies -- often run by independent artists to market their own creations, or those of a few friends -- will likewise find themselves without a seat at the table with Big Tech, whose focus will be entirely on keeping the media giants from using Article 13's provisions to put them out of business altogether.

Meanwhile, "filters for everything" will be a bonanza for fraudsters and crooks who prey on artists. Article 13 will force these systems to err on the side of over-blocking potential copyright violations, and that's a godsend for blackmailers, who can use bogus copyright claims to shut down artists' feeds, and demand money to rescind the claims. In theory, artists victimised in this way can try to get the platforms to recognise the scam, but without the shelter of a big media company with its back-channels into the big tech companies, these artists will have to get in line behind millions of other people who have been unjustly filtered to plead their case.

If You Think Big Tech Is Bad Now...

In the short term, Article 13 tilts the field toward media companies, but that advantage will quickly evaporate.

Without the need to buy or crush upstart competitors in Europe, the American tech giants will only grow bigger and harder to tame. Even the aggressive antitrust work of the European Commission will do little to encourage competition if competing against Big Tech requires hundreds of millions for copyright compliance as part of doing business -- costs that Big Tech never had to bear while it was growing, and that would have crushed the tech companies before they could grow.

Ten years after Article 13 passes, Big Tech will be bigger than ever and more crucial to the operation of media companies. The Big Tech companies will not treat this power as a public trust to be equitably managed for all: they will treat it as a commercial advantage to be exploited in every imaginable way. When the day comes that FIFA or Universal or Sky needs Google or Facebook or Apple much more than the tech companies need the media companies, the tech companies will squeeze, and squeeze, and squeeze.

This will, of course, harm the media companies' bottom line. But you know who else it will hurt? Artists.

Because media giants, like other companies who have a buyer's market for their raw materials -- that is, art and other creative works -- do not share their windfalls with their suppliers, but they absolutely expect their suppliers to share their pain.

When media companies starve, they take artists with them. When artists have no other option, the media companies squeeze them even harder.

What Is To Be Done?

Neither media giants nor tech giants have artists' interests at heart.

Both kinds of company are full of people who care about artists, but institutionally, they act for their shareholders, and every cent they give to an artist is a cent they can't return to those investors.

One important check on this dynamic is competition. Antitrust regulators have many tools at their disposal, and those tools have been largely idle for more than a generation. Companies have been allowed to grow by merger, or by acquiring nascent competitors, leaving artists with fewer media companies and fewer tech companies, which means more chokepoints where they are shaken down for their share of the money from their work.

Another important mechanism could be genuine copyright reform, such as re-organizing the existing regulatory framework for copyright, or encouraging new revenue-sharing schemes such as voluntary blanket licenses, which could allow artists to opt into a pool of copyrights in exchange for royalties.

Any such scheme must be designed to fight historic forms of corruption, such as collecting societies that unfairly share out license payments, or media companies that claim these. That's the sort of future-proof reform that the Copyright Directive could have explored, before it got hijacked by vested interests.

In the absence of these policies, we may end up enriching the media companies, but not the artists whose works they sell. In an unfair marketplace, simply handing more copyrights to artists is like giving your bullied kid extra lunch-money: the bullies will just take the extra money, too, and your kid will still go hungry.

Artists Should Be On the Side of Free Expression

It's easy to focus on media and art when thinking about Article 13, but that's not where its primary effect will be felt.

The platforms that Article 13 targets aren't primarily entertainment systems: they are used for everything, from romance to family life, employment to entertainment, health to leisure, politics and civics, and more besides.

Copyright filters will impact all of these activities, because they will all face the same problems of false-positives, censorship, fraud and more.

The arts have always championed free expression for all, not just for artists. Big Tech and Big Media already exert enormous control over our public and civic lives. Dialing that control up is bad for all of us, not just those of us in the arts.

Artists and audiences share an interest in promoting the fortunes of artists: people don't buy books or music or movies because they want to support media companies, they do it to support creators. As always, the right side for artists to be on is the side of the public: the side of free expression, without corporate gatekeepers of any kind.

Take Action: Stop Article 13

 

 

Putting Zuckerberg behind bars...

The Telegraph reports on the latest government thoughts about setting up a social media censor


Link Here23rd February 2019

Social media companies face criminal sanctions for failing to protect children from online harms, according to drafts of the Government's White Paper circulating in Whitehall.

Civil servants are proposing a new corporate offence as an option in the White Paper plans for a tough new censor with the power to force social media firms to take down illegal content and to police legal but harmful material.

"They see criminal sanctions as desirable and as an important part of a regulatory regime," said one source, who added: "There's a recognition, particularly on the Home Office side, that this needs to be a regulator with teeth. The main issue they need to satisfy ministers on is extra-territoriality, that is, can you apply this to non-UK companies like Facebook and YouTube? The belief is that you can."

The White Paper, which is due to be published in mid-March followed by a summer consultation, is not expected to lay out as definitive a plan as previously thought. A decision on whether to create a brand new censor or use Ofcom is expected to be left open. A Whitehall source said:

Criminal sanctions are going to be put into the White Paper as an option. We are not necessarily saying we are going to do it, but these are things that are open to us. They will be allied to a system of fines amounting to 4% of global turnover or €20m, whichever is higher.
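That "whichever is higher" formula mirrors the GDPR penalty cap. As a quick illustration (the function name is ours; the figures are those quoted above):

```python
def maximum_fine(global_turnover_eur):
    # 4% of global turnover or EUR 20m, whichever is higher,
    # per the Whitehall source quoted above.
    return max(0.04 * global_turnover_eur, 20_000_000)

print(maximum_fine(1_000_000_000))  # 40000000.0 -- the 4% figure dominates
print(maximum_fine(100_000_000))    # 20000000   -- the EUR 20m floor applies
```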

Government minister Jeremy Wright told the Telegraph this week he was especially focused on ensuring that technology companies enforce minimum age standards. He also indicated the Government would fulfil a manifesto commitment to a levy on social media firms, which could fund the new censor.

 

 

Driving the internet into dark corners...

The IWF warns the government to think about unintended consequences when creating a UK internet censor


Link Here 22nd February 2019

Internet Watch Foundation's (IWF) CEO, Susie Hargreaves OBE, puts forward a voice of reason by urging politicians and policy makers to take a balanced approach to internet regulation which avoids a heavy cost to the victims of child sexual abuse.

IWF has set out its views on internet regulation ahead of the publication of the Government's Online Harms White Paper. It suggests that traditional approaches to regulation cannot apply to the internet and that human rights should play a big role in any regulatory approach.

The IWF, as part of the UK Safer Internet Centre, supports the Government's ambition to make the UK the safest place in the world to go online, and the best place to start a digital business.

IWF has a world-leading reputation in identifying and removing child sexual abuse images and videos from the internet. It takes a co-regulatory approach to combating child sexual abuse images and videos by working in partnership with the internet industry, law enforcement and governments around the world. It offers a suite of tools and services to the online industry to keep their networks safer. In the past 22 years, the internet watchdog has assessed -- with human eyes -- more than 1 million reports.

Ms Hargreaves said:

Tackling criminal child sexual abuse material requires a global multi-stakeholder effort. We'll use our 22 years' experience in this area to help the government and policy makers to shape a regulatory framework which is sustainable and puts victims at its heart. In order to do this, any regulation in this area should be developed with industry and other key stakeholders rather than imposed on them.

We recommend an outcomes-based approach where the outcomes are clearly defined and the government should provide clarity over the results it seeks in dealing with any harm. There also needs to be a process to monitor this and for any results to be transparently communicated.

But, warns Ms Hargreaves, any solutions should be tested with users, including understanding impacts on victims: "The UK already leads the world at tackling online child sexual abuse images and videos but there is definitely more that can be done, particularly in relation to tackling grooming and livestreaming, and of course, regulating harmful content is important.

"My worries, however, are about rushing into knee-jerk regulation which creates perverse incentives or unintended consequences for victims and could undo all the successful work accomplished to date. Ultimately, we must avoid a heavy cost to victims of online sexual abuse."

 

 

Wider definition of harm can be manipulated to restrict media freedom...

Index on Censorship responds to government plans to create a UK internet censor


Link Here22nd February 2019

Index on Censorship welcomes a report by the House of Commons Digital, Culture, Media and Sport select committee into disinformation and fake news that calls for greater transparency on social media companies' decision-making processes, on who posts political advertising, and on the use of personal data. However, we remain concerned about attempts by government to establish systems that would regulate harmful content online, given that there remains no agreed definition of harm in this context beyond that which is already illegal.

Despite a number of reports, including the government's Internet Safety Strategy green paper, that have examined the issue over the past year, none have yet been able to come up with a definition of harmful content that goes beyond definitions of speech and expression that are already illegal. DCMS recognises this in its report when it quotes the Secretary of State Jeremy Wright discussing the difficulties surrounding the definition. Despite acknowledging this, the report's authors nevertheless expect technical experts to be able to set out what constitutes harmful content that will be overseen by an independent regulator.

International experience shows that in practice it is extremely difficult to define harmful content in such a way that would target only bad speech. Last year, for example, activists in Vietnam wrote an open letter to Facebook complaining that Facebook's system of automatically pulling content if enough people complained could silence human rights activists and citizen journalists in Vietnam, while Facebook has shut down the livestreams of people in the United States using the platform as a tool to document their experiences of police violence.

Index on Censorship chief executive Jodie Ginsberg said:

It is vital that any new system created for regulating social media protects freedom of expression, rather than introducing new restrictions on speech by the back door. We already have laws to deal with harassment, incitement to violence, and incitement to hatred. Even well-intentioned laws meant to tackle hateful views online often end up hurting the minority groups they are meant to protect, stifle public debate, and limit the public's ability to hold the powerful to account.

The select committee report provides the example of Germany as a country that has legislated against harmful content on tech platforms. However, it fails to mention that the German Network Enforcement Act legislated on content that was already illegal, or the widespread criticism of the law from, among others, the UN rapporteur on freedom of expression and groups such as Human Rights Watch. It also cites the fact that one in six of Facebook's moderators now works in Germany as practical evidence that legislation can work. Ginsberg said:

The existence of more moderators is not evidence that the laws work. Evidence would be if more harmful content had been removed and if lawful speech flourished. Given that there is no effective mechanism for challenging decisions made by operators, it is impossible to tell how much lawful content is being removed in Germany. But the fact that Russia, Singapore and the Philippines have all cited the German law as a positive example of ways to restrict content online should give us pause.

Index has reported on various examples of the German law being applied incorrectly, including the removal of a tweet of journalist Martin Eimermacher criticising the double standards of tabloid newspaper Bild Zeitung and the blocking of the Twitter account of German satirical magazine Titanic. The Association of German Journalists (DJV) has said the Twitter move amounted to censorship, adding it had warned of this danger when the German law was drawn up.

Index is also concerned about the continued calls for tools to distinguish between quality journalism and unreliable sources, most recently in the Cairncross Review. While we recognise that the ability to do this as individuals and through education is key to democracy, we are worried that a reliance on a labelling system could create false positives, and mean that smaller or newer journalism outfits would find themselves rejected by the system.

 

 

Eavesdropping on Eve's porn habits...

Korea steps up its internet censorship by breaking into https packets


Link Here22nd February 2019
Full story: Internet Censorship in South Korea...Repressive new internet censorship law
South Korea will expand its site blocking measures with SNI eavesdropping, so HTTPS sites can be blocked as well. The new measure, which will also affect pirate sites, has generated widespread opposition. While it's more effective than standard DNS blocking, it's certainly not impossible to circumvent.

When it comes to pirate site blocking, South Korea is one of the most proactive countries in the Asia-Pacific region. Pirate website blocking orders are sanctioned by the Korean Communications Standards Commission (KCSC), which also oversees other blocking efforts, including those targeted at porn or illegal gambling sites.

While the ISP blockades work well for regular HTTP sites, they are fairly easy to bypass on HTTPS connections, something most sites offer today. For this reason, the Korean authorities are now stepping up their blocking game. This week the Government announced that it will start eavesdropping on SNI (Server Name Indication) fields, which identify the hostname of the target server during the TLS handshake. This allows ISPs to see which HTTPS sites users are trying to access, so these can be blocked if they're on the Korean blocklist.
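The mechanism is easy to demonstrate: the SNI hostname travels unencrypted in the very first TLS handshake message, so anyone on the wire can read it. This sketch (the function name is ours) uses Python's standard ssl module to generate a ClientHello in memory and show the hostname sitting in it as plaintext:

```python
import ssl

def client_hello_bytes(hostname):
    # Drive a TLS handshake into memory BIOs so the raw ClientHello
    # can be inspected without any network traffic.
    ctx = ssl.create_default_context()
    incoming = ssl.MemoryBIO()
    outgoing = ssl.MemoryBIO()
    tls = ctx.wrap_bio(incoming, outgoing, server_hostname=hostname)
    try:
        tls.do_handshake()
    except ssl.SSLWantReadError:
        pass  # expected: no server is answering yet
    return outgoing.read()  # the ClientHello record, SNI included

hello = client_hello_bytes("example.com")
print(b"example.com" in hello)  # True: the hostname is sent in the clear
```

This plaintext field is exactly what SNI-based filtering inspects, and it is the field that encrypted SNI (ESNI) hides from on-path observers.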

The new measures will apply to 895 foreign websites that are linked to porn, gambling or copyright infringement.

The new blocking policy is meeting quite a bit of resistance locally. A petition that was launched earlier this week has been signed by over 180,000 people already and this number is growing rapidly. The petition warns that this type of censorship is limiting freedom of expression. At the same time, however, it notes that people will find ways to bypass the blockades.

SNI eavesdropping and blocking is useless when people use a VPN. In addition, more modern browsers and companies such as Cloudflare increasingly support encrypted SNI (ESNI). This prevents ISPs from snooping on SNI handshakes.

 

 

 

Offsite Article: Good to see the Daily Mail reporting on protests against EU internet censorship...


Link Here 22nd February 2019
Full story: Copyright in the EU...Copyright law for Europe
EU backs copyright law that could force Google and Facebook to block huge amounts of posts sparking protests at censorship of the web

See article from dailymail.co.uk

 

 

Taking to the Streets...

The Worst Possible Version of the EU Copyright Directive Has Sparked a German Uprising


Link Here19th February 2019
Full story: Copyright in the EU...Copyright law for Europe

Last week's publication of the final draft of the new EU Copyright Directive baffled and infuriated almost everyone, including the massive entertainment companies that lobbied for it in the first place; the artists' groups who endorsed it only to have their interests stripped out of the final document; and the millions and millions of Europeans who had publicly called on lawmakers to fix grave deficiencies in the earlier drafts, only to find these deficiencies made even worse.

Take Action: Stop Article 13

Thankfully, Europeans aren't taking this lying down. With the final vote expected to come during the March 25-28 session, mere weeks before European elections, European activists are pouring the pressure onto their Members of the European Parliament (MEPs), letting them know that their vote on this dreadful mess will be on everyone's mind during the election campaigns.

The epicenter of the uprising is Germany, which is only fitting, given that German MEP Axel Voss is almost singlehandedly responsible for poisoning the Directive with rules that will lead to mass surveillance and mass censorship, not to mention undermining much of Europe's tech sector.

The German Consumer Association was swift to condemn the Directive, stating: "The reform of copyright law in this form does not benefit anyone, let alone consumers. MEPs are now obliged to act. Since the outcome of the trilogue falls short of the EU Parliament's positions at key points, they should refuse to give their consent."

A viral video of Axel Voss being confronted by activists has been picked up by politicians campaigning against Voss's Christian Democratic Party in the upcoming elections, spreading to Germany's top TV personalities, like Jan Böhmermann.

Things are just getting started. On Saturday, with just two days of organizing, hundreds of Europeans marched on the streets of Cologne against Article 13. A day of action on March 23, just before the first possible voting date for MEPs, is being planned, with EU-wide events.

In the meantime, the petition to save Europe from the Directive, already the largest in EU history, keeps racking up more signatures, and is on track to be the largest petition in the history of the world.

Take Action : Stop Article 13

 

 

Offsite Article: Artificial Unintelligence...


Link Here19th February 2019
Full story: Internet Censorship in EU...EU introduces swathes of internet censorship law
EU proposal pushes tech companies to tackle terrorist content with AI, despite implications for war crimes evidence

See article from advox.globalvoices.org

 

 

Parliamentary committee publishes report laying into Facebook for flagrant data abuse...

But inevitably concludes that the UK needs a new social media censor


Link Here18th February 2019
Full story: Fake news in the UK...Government sets up fake news unit

The Digital, Culture, Media and Sport Committee has published its final report on Disinformation and 'fake news'. The report calls for:

  • Compulsory Code of Ethics for tech companies overseen by independent regulator

  • Regulator given powers to launch legal action against companies breaching code

  • Government to reform current electoral communications laws and rules on overseas involvement in UK elections

  • Social media companies obliged to take down known sources of harmful content, including proven sources of disinformation

Further finds that:

  • Electoral law 'not fit for purpose'

  • Facebook intentionally and knowingly violated both data privacy and anti-competition laws

Chair's comment

Damian Collins MP, Chair of the DCMS Committee said:

"Our inquiry over the last year has identified three big threats to our society. The challenge for the year ahead is to start to fix them; we cannot delay any longer.

"Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised 'dark adverts' from unidentifiable sources, delivered through the major social media platforms we use every day. Much of this is directed from agencies working in foreign countries, including Russia.

"The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.

"Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.

"These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the 'move fast and break things' culture often seems to be that it is better to apologise than ask permission.

"We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self-regulation must come to an end. The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.

"We also have to accept that our electoral regulations are hopelessly out of date for the internet age. We need reform so that the same principles of transparency of political communications apply online, just as they do in the real world. More needs to be done to require major donors to clearly establish the source of their funds.

"Much of the evidence we have scrutinised during our inquiry has focused on the business practices of Facebook; before, during and after the Cambridge Analytica data breach scandal.

"We believe that in its evidence to the Committee Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.

"Even if Mark Zuckerberg doesn't believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world. Evidence uncovered by my Committee shows he still has questions to answer yet he's continued to duck them, refusing to respond to our invitations directly or sending representatives who don't have the right information. Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world's biggest companies.

"We also repeat our call to the Government to make a statement about how many investigations are currently being carried out into Russian interference in UK politics. We want to find out what was the impact of disinformation and voter manipulation on past elections including the UK Referendum in 2016 and are calling on the Government to launch an independent investigation."

Final Report

This Final Report on Disinformation and 'Fake News' repeats a number of recommendations from the interim report published last summer. The Committee calls for the Government to reconsider a number of recommendations to which it did not respond and to include concrete proposals for action in its forthcoming White Paper on online harms.

Independent regulation of social media companies

The Report repeats a recommendation from the Interim Report for clear legal liabilities to be established for tech companies to act against harmful or illegal content on their sites, and the report calls for a compulsory Code of Ethics defining what constitutes harmful content. An independent regulator should be responsible for monitoring tech companies, backed by statutory powers to launch legal action against companies in breach of the code.

Companies failing obligations on harmful or illegal content would face hefty fines. MPs conclude: "Social media companies cannot hide behind the claim of being merely a 'platform' and maintain that they have no responsibility themselves in regulating the content of their sites."

The Report's recommendation chimes with recent statements by Ministers indicating the Government is prepared to regulate social media companies following the death of teenager Molly Russell. The Committee hopes to see firm recommendations for legislation in the White Paper to create a regulatory system for online content that is as effective as that for offline content.

It repeats its recommendation for new independent regulation to be funded by a levy on tech companies operating in the UK.

Data use and data targeting

The Report highlights Facebook documents obtained by the Committee and published in December 2018 relating to a Californian court case brought by app developer Six4Three. Through scrutiny of internal Facebook emails between 2011 and 2015, the Report finds evidence to indicate that the company was willing to override its users' privacy settings in order to transfer data to some app developers; to charge some developers high prices in advertising in exchange for data; and to starve other developers, such as Six4Three, of that data, contributing to the loss of their business. MPs conclude: "It is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws."

It recommends that the ICO carries out a detailed investigation into the practices of the Facebook platform, its use of users' and users' friends' data, and the use of 'reciprocity' of the sharing of data. The CMA (Competition and Markets Authority) should conduct a comprehensive audit of the advertising market on social media and investigate whether Facebook has been involved in anti-competitive practices.

MPs note that Facebook, in particular, is unwilling to be accountable to regulators around the world: "By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has shown contempt towards both our Committee and the 'International Grand Committee' involving members from nine legislatures around the world."

 

 

Perhaps they were addicted to their research...

Despite a ban on porn, doctors in India note a 75% increase in porn viewing and rather simplistically correlate this with an increase in the divorce rate


Link Here18th February 2019
Full story: Internet Censorship in India...India considers blanket ban on internet porn
According to a recent analysis, people in Hyderabad have taken an avid interest in viewing porn even though it has been banned. With the Union government banning 827 porn sites across the country, an increase of 75% has been seen in porn viewing in Hyderabad.

Hyderabad is among the many Indian cities which have seen an increase in porn viewership. A medical study claimed that the increasing number of divorces can be attributed to the psychological effects of porn addiction.

In a survey published by DocOnline and conducted by city doctors, it was inferred that the obsession with pornography is affecting the sexual health of viewers. Dr Syed Abrar Kareem, a physician, stated that porn gives rise to impractical sexual expectations which, when not met, result in psycho-somatic disorders. Of the 5,000 people chosen for the survey, 3,500 men and 1,500 women confessed to watching porn regularly.

A rise of 31% has been recorded in divorces and break-ups. Allegedly, the doctors have also seen an increase in impotency cases brought to them, attributed to extreme involvement in virtual sex.

 

 

Offsite Article: The EU sees itself as a white knight slaying the Google dragon...


Link Here 18th February 2019
Full story: Copyright in the EU...Copyright law for Europe
But its choice of weapon is the atom bomb and it doesn't care about destroying the freedom and livelihoods of the European people relying on the Google ecosystem

See article from torrentfreak.com

 

 

Algorithmic propaganda...

The Council of Europe calls on European countries to control social media 'algorithms', presumably meaning that people will be force-fed what the establishment wants them to read


Link Here16th February 2019
The Council of Europe is a wider organisation of European countries than the EU and is known best for being the grouping behind the European Court of Human Rights.

The council's Committee of Ministers has issued a statement criticising the algorithmic nature of social media. It calls on member countries to address its concerns. The Committee writes:

- draws attention to the growing threat to the right of human beings to form opinions and take decisions independently of automated systems, which emanates from advanced digital technologies. Attention should be paid particularly to their capacity to use personal and non-personal data to sort and micro-target people, to identify individual vulnerabilities and exploit accurate predictive knowledge, and to reconfigure social environments in order to meet specific goals and vested interests;

- encourages member States to assume their responsibility to address this threat by

a) ensuring that adequate priority attention is paid at senior level to this inter-disciplinary concern that often falls in between established mandates of relevant authorities;

b) considering the need for additional protective frameworks related to data that go beyond current notions of personal data protection and privacy and address the significant impacts of the targeted use of data on societies and on the exercise of human rights more broadly;

c) initiating, within appropriate institutional frameworks, open-ended, informed and inclusive public debates with a view to providing guidance on where to draw the line between forms of permissible persuasion and unacceptable manipulation. The latter may take the form of influence that is subliminal, exploits existing vulnerabilities or cognitive biases, and/or encroaches on the independence and authenticity of individual decision-making;

d) taking appropriate and proportionate measures to ensure that effective legal guarantees are in place against such forms of illegitimate interference; and

e) empowering users by promoting critical digital literacy skills and robustly enhancing public awareness of how many data are generated and processed by personal devices, networks, and platforms through algorithmic processes that are trained for data exploitation. Specifically, public awareness should be enhanced of the fact that algorithmic tools are widely used for commercial purposes and, increasingly, for political reasons, as well as for ambitions of anti- or undemocratic power gain, warfare, or to inflict direct harm;

Of course, if one strips away the jargon, the fundamental algorithm is simply to give people more of what they seem to have enjoyed reading. And of course the establishment's preferred algorithm is to give people what the state would like them to read.

 

 

One click and you're jailed...

The government's Counter-Terrorism Act has received royal assent and makes it an offence to accidentally click on a link before you even know what it contains


Link Here13th February 2019
Full story: Extremism in the UK...UK government introduces wide ranging ban on extremism
It will be an offence to view terrorist material online just once -- and could incur a prison sentence of up to 15 years -- under a new UK law.

The Counter-Terrorism and Border Security Bill has just been granted Royal Assent, updating a previous Act and bringing new powers to law enforcement to tackle terrorism.

But a controversial inclusion was to update the offence of obtaining information likely to be useful to a person committing or preparing an act of terrorism so that it now covers viewing or streaming content online.

Originally, the proposal had been to make it an offence for someone to view material three or more times -- but the three strikes idea has been dropped from the final Act.

The government said that the existing laws didn't capture the nuance in changing methods for distribution and consumption of terrorist content -- and so added a new clause into the 2019 Act making it an offence to view (or otherwise access) any terrorist material online. This means that, technically, anyone who clicked on a link to such material could be caught by the law.

 

 

Offence rules...

Conviction of holocaust denial being grossly offensive upheld on appeal


Link Here13th February 2019
Full story: Insulting UK Law...UK prosecutions of jokes and insults on social media
A musician found guilty of broadcasting grossly offensive anti-Semitic songs has had her conviction upheld.

Alison Chabloz has written many politically incorrect, humorous and insulting songs, often targeted at Jews but also more generally against the PC establishment. The songs have been published on many internet platforms, including YouTube.

In May she was convicted of three charges relating to the songs and was given a suspended jail sentence by magistrates which she appealed against.

A judge at Southwark Crown Court has upheld her conviction ruling the content was particularly repellent. In the songs Chabloz suggested the Holocaust was a bunch of lies and referred to Auschwitz as a theme park.

Chabloz was convicted of two counts of sending an offensive, indecent or menacing message through a public communications network and a third charge relating to a song on YouTube.

She was sentenced to 20 weeks' imprisonment, suspended for two years and banned from social media for 12 months.

During the appeal Adrian Davies, defending, told judge Christopher Hehir: It would be a very, very strong thing to say that a criminal penalty should be imposed on someone for singing in polemical terms about matters on which she feels so strongly.

The case started as a private prosecution by the Campaign Against Anti-Semitism before the Crown Prosecution Service took over. The group's chairman, Gideon Falter, said: This is the first conviction in the UK over Holocaust denial on social media.

 

 

Javid threatens to cut social media down to size...

Delete gang related content or else!


Link Here13th February 2019
Full story: Knives in UK Media...Blaming video games showing knives
Social media giants will face tough new laws to prevent the spread of knife crime, the Home Secretary threatened, as he spoke of fears for his own children's safety.

Sajid Javid said it was time for a legal crackdown on social media images promoting gang culture, in the same way that child sex abuse images and terrorist propaganda have already been outlawed.

In a warning to online firms, he said:

My message to these companies is we are going to legislate and how far we go depends on what you decide to do now. At the moment we don't have the legislation for these types of [knife crime-related] content.

I have it for terrorist content and child sexual abuse images.

Google is among several firms which have been criticised for hosting content glamorising gang culture. Rappers using its YouTube video platform post so-called drill music videos to boast about the number of people they have stabbed or shot, using street terms. The platform has taken down dozens of videos by drill artists, after warnings from the Metropolitan Police that they were raising the risk of violence.

 

 

The Cairncross Review...

Online platforms should have a 'news quality obligation' to improve trust in news they host, overseen by a censor


Link Here12th February 2019

The Cairncross Review into the future of the UK news industry has delivered its final report, with recommendations on how to safeguard the future sustainability of the UK press.

  • Online platforms should have a 'news quality obligation' to improve trust in news they host, overseen by a regulator

  • Government should explore direct funding for local news and new tax reliefs to support public interest journalism

  • A new Institute for Public Interest News should focus on the future of local and regional press and oversee a new innovation fund

The independent review, undertaken by Frances Cairncross, was tasked by the Prime Minister in 2018 with investigating the sustainability of the production and distribution of high-quality journalism. It comes as significant changes to technology and consumer behaviour are posing problems for high-quality journalism, both in the UK and globally.

Cairncross was advised by a panel from the local and national press, digital and physical publishers and advertising. Her recommendations include measures to tackle the uneven balance of power between news publishers and the online platforms that distribute their content, and to address the growing risks to the future provision of public-interest news.

It also concludes that intervention may be needed to improve people's ability to assess the quality of online news, and to measure their engagement with public interest news. The key recommendations are:

  • New codes of conduct to rebalance the relationship between publishers and online platforms;

  • The Competition and Markets Authority to investigate the online advertising market to ensure fair competition;

  • Online platforms' efforts to improve their users' news experience should be placed under regulatory supervision;

  • Ofcom should explore the market impact of BBC News, and whether it inappropriately steps into areas better served by commercial news providers;

  • The BBC should do more to help local publishers and think further about how its news provision can act as a complement to commercial news;

  • A new independent Institute should be created to ensure the future provision of public interest news;

  • A new Innovation Fund should be launched, aiming to improve the supply of public interest news;

  • New forms of tax reliefs to encourage payments for online news content and support local and investigative journalism;

  • Expanding financial support for local news by extending the BBC's Local Democracy Reporting Service;

  • Developing a media literacy strategy alongside Ofcom, industry and stakeholders.

The Government will now consider all of the recommendations in more detail. To inform this, the Culture Secretary will write immediately to the Competition and Markets Authority, Ofcom and the Chair of the Charity Commission to open discussions about how best to take forward the recommendations which fall within their remits. The Government will respond fully to the report later this year.

DCMS Secretary of State Jeremy Wright said:

A healthy democracy needs high quality journalism to thrive and this report sets out the challenges to putting our news media on a stronger and more sustainable footing, in the face of changing technology and rising disinformation. There are some things we can take action on immediately while others will need further careful consideration with stakeholders on the best way forward.

A Mediatique report, Overview of recent market dynamics in the UK press, April 2018, commissioned by DCMS as part of the Cairncross Review found:

  • Print advertising revenues have dropped by more than two-thirds in the ten years to 2017;

  • Print circulation of national papers fell from 11.5 million daily copies in 2008 to 5.8 million in 2018 and for local papers from 63.4 million weekly in 2007 to 31.4 million weekly in 2017;

  • Sales of both national and local printed papers fell by roughly half between 2007 and 2017, and are still declining;

  • The number of full-time frontline journalists in the UK has dropped from an estimated 23,000 in 2007, to just 17,000 today, and the numbers are still declining.

A report, Online Advertising in the UK by Plum Consulting, commissioned by DCMS as part of the Cairncross Review (and available as an annex to the Review) found:

  • UK internet advertising expenditure increased from £3.5 billion in 2008 to £11.5 billion in 2017, a compound annual growth rate of 14%.

  • Publishers rely on display advertising for their revenue online - which in the last decade has transformed into a complex, automated system known as programmatic advertising.

  • An estimated average of £0.62 of every £1 spent on programmatic advertising goes to the publisher - though this can range from £0.43 to £0.72.

  • Collectively, Facebook and Google were estimated to have accounted for over half (54%) of all UK online advertising revenues in 2017.

  • The major online platforms collect multiple first-party datasets from large numbers of logged-in users. Generally, they do not share this data with third parties, including publishers.
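The programmatic split reported above can be illustrated with a short sketch (the function name is hypothetical; the £0.62 average and £0.43-£0.72 range are the Plum Consulting estimates):

```python
# Rough illustration of the programmatic advertising split described
# in the Plum Consulting figures: of each GBP 1 spent, an estimated
# average of GBP 0.62 reaches the publisher, with the remainder
# absorbed by the advertising-technology supply chain.

def publisher_revenue(ad_spend_gbp: float, publisher_share: float = 0.62) -> float:
    """Return the estimated amount reaching the publisher."""
    if not 0.43 <= publisher_share <= 0.72:
        raise ValueError("share outside the reported 0.43-0.72 range")
    return round(ad_spend_gbp * publisher_share, 2)

# For the GBP 11.5bn 2017 UK online ad market (not all of it is
# programmatic, so this is an upper-bound illustration only):
spend = 11.5e9
print(publisher_revenue(spend))        # average case
print(publisher_revenue(spend, 0.43))  # worst case for publishers
```

The range matters: at the bottom of it, barely two-fifths of advertiser spend reaches the publisher whose content attracted the audience.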

Dame Frances Cairncross is a former economic journalist, author and academic administrator. She is currently Chair of the Court of Heriot-Watt University and a Trustee at the Natural History Museum. Dame Frances was Rector of Exeter College, Oxford University; a senior editor on The Economist; and principal economic columnist for the Guardian. In 2014 she was made a Dame of the British Empire for services to education. She is the author of a number of books, including "The Death of Distance: How the Communications Revolution is Changing our Lives" and "Costing the Earth: The Challenge for Governments, the Opportunities for Business". Dame Frances is married to financial journalist Hamish McRae.

The BBC comments on some of the ideas not included in the report's recommendations

See article from bbc.co.uk

The report falls short of requiring Facebook, Google and other tech giants to pay for the news they distribute via their platforms. Cairncross told the BBC's media editor Amol Rajan that "draconian and risky" measures could result in firms such as Google withdrawing their news services altogether:

There are a number of ways we have suggested technology companies could behave differently and could be made to behave differently. But they are mostly ways that don't immediately involve legislation.

Frances Cairncross earned widespread respect as a journalist for her hard-headed and pragmatic approach to economics. That pragmatism is the very reason the government commissioned her to look at the future of high-quality news - and also the reason many in local and regional media will be disappointed by her recommendations.

What is most notable about her review is what it doesn't do.

  • It doesn't suggest all social media should be regulated in the UK
  • It doesn't suggest social media companies pay for the privilege of using news content
  • It doesn't suggest social media companies be treated as publishers, with legal liability for all that appears on their platform

This is because the practicalities of doing these things are difficult, and experience shows that the likes of Google will simply pull out of markets that don't suit them.

Ultimately, as this report acknowledges, when it comes to news, convenience is king. The speed, versatility and zero cost of so much news now means that, even if it is of poor quality, a generation of consumers has fallen out of the habit of paying for news. But quality costs. If quality news has a future, consumers will have to pay. That's the main lesson of this report.

 

 

UK Internet Regulation...

Open Rights Group publishes a report outlining a new wave of Internet censorship on the horizon


Link Here12th February 2019

2018 was a pivotal year for data protection. First the Cambridge Analytica scandal put a spotlight on Facebook's questionable privacy practices. Then the new Data Protection Act and the General Data Protection Regulation (GDPR) forced businesses to better handle personal data.

As these events continue to develop, 2019 is shaping up to be a similarly consequential year for free speech online as new forms of digital censorship assert themselves in the UK and EU.

Of chief concern in the UK are several initiatives within the Government's grand plan to "make Britain the safest place in the world to be online", known as the Digital Charter. Its founding document proclaims "the same rights that people have offline must be protected online." That sounds a lot like Open Rights Group's mission! What's not to like?

Well, just as surveillance programmes created in the name of national security proved detrimental to privacy rights, new Internet regulations targeting "harmful content" risk curtailing free expression.

The Digital Charter's remit is staggeringly broad. It addresses just about every conceivable evil on the Internet from bullying and hate speech to copyright infringement, child pornography and terrorist propaganda. With so many initiatives developing simultaneously it can be easy to get lost.

To gain clarity, Open Rights Group published a report surveying the current state of digital censorship in the UK. The report is broken up into two main sections - formal censorship practices like copyright and pornography blocking, and informal censorship practices including ISP filtering and counter terrorism activity. The report shows how authorities, while often engaging in important work, can be prone to mistakes and unaccountable takedowns that lack independent means of redress.

Over the coming weeks we'll post a series of excerpts from the report covering the following:

Formal censorship practices

  • Copyright blocking injunctions

  • BBFC pornography blocking

  • BBFC requests to "Ancillary Service Providers"

Informal censorship practices

  • Nominet domain suspensions

  • The Counter Terrorism Internet Referral Unit (CTIRU)

  • The Internet Watch Foundation (IWF)

  • ISP content filtering

The big picture

Take a step back from the many measures encompassed within the Digital Charter and a clear pattern emerges. When it comes to web blocking, the same rules do not apply online as offline. Many powers and practices the government employs to remove online content would be deemed unacceptable and arbitrary if they were applied to offline publications.

Part II of our report is in the works and will focus on threats to free speech within yet another branch of the Digital Charter known as the Internet Safety Strategy.

 

 

Article 13 is Not Just Criminally Irresponsible...

It's Irresponsibly Criminal. By Glyn Moody


Link Here12th February 2019
Full story: Copyright in the EU...Copyright law for Europe

 

 

 

Offsite Article: Obscenity law liberalised...


Link Here12th February 2019
The Adam Smith Institute comments on the UK liberalisation of its obscenity law. By Nick Cowen

See article from adamsmith.org

 

 

Offsite Article: Be careful who you share with...


Link Here11th February 2019
China sets its sights on buying up western social media starting with reddit

See article from theguardian.com

 

 

Fake free speech...

Russian anti-fake news bill rushed through parliament despite vocal opposition


Link Here10th February 2019
Full story: Internet Censorship in Russia...Russia and its repressive state control of media

The Russian State Duma is considering multiple bills of law that would further stifle free speech in Russia's already heavily restricted internet environment.

One targets expressions of willful disregard towards the state. Another targets disinformation. All of them echo increasingly global concerns among governments about the political implications of disinformation -- and unbridled criticism -- on the internet. And all have been heavily criticized by Russian civil society groups, experts, users and even the government's own ministers. Yet these bills promoting possible further crackdown on free speech still trudge on through the legislative system.

The first bill, a sovereign internet initiative, which is yet to reach the floor of the lower chamber of Russia's bicameral parliament, seeks to establish state-regulated internet exchange points that would allow for increased monitoring and control over internet traffic moving into and out of the country.

Under the anti-fake news bill, individuals, officials or organizations accused of spreading fake news disguised as genuine public announcements which are found to promote public disorder or other serious disturbances could be fined up to a million rubles (slightly above USD $15,000), unless they remove the violating content within a day. The bill also provides measures through which Roskomnadzor, Russia's media watchdog, will order ISPs to block websites hosting the offending content.

The bill passed its first reading in late January with flying colours, receiving 336 votes in its favor and only 44 against, thanks to the 2016 landslide which guaranteed the ruling United Russia party an absolute voting majority.

The anti-fake news bill will be reviewed again by the Duma in February, conditioned on the revision of some of its most contentious points. The bill pushed through by Putin's party was met with a rare response of significant opposition, even among the normally acquiescent branches of Russia's highly centralized and executive-biased power structure. The attorney general's office, among others, criticized the bill's vague definitions as potentially damaging to citizens' civil rights.

The second bill, which came up for review alongside the fake news-busting proposal, is seen as being even more controversial. It seeks to punish vulgar expressions of wilful disregard towards the state, its symbols and organs of its power with fines of up to 5,000 rubles (around USD $76) and detention for up to 15 days. The bill also passed in the first reading on the same day, despite vocal criticism from both government members (Deputy Communications Minister Alexey Volin said that calmly accepting criticism was an obligation for state officials, adding that they weren't made of sugar) and opposition parties.

 

 

Offsite Article: The new tech totalitarianism...


Link Here10th February 2019
A book review of The Age of Surveillance Capitalism by Professor Shoshana Zuboff

See article from newstatesman.com

 

 

Duty of care: an empty concept...

The Open Rights Group comments on government moves to create a social media censor


Link Here9th February 2019

There is every reason to believe that the government and opposition are moving to a consensus on introducing a duty of care for social media companies to reduce harm and risk to their users. This may be backed by an Internet regulator, who might decide what kind of mitigating actions are appropriate to address the risks to users on different platforms.

This idea originated from a series of papers by Will Perrin and Lorna Woods and has been mentioned most recently in a Science and Technology Committee report and by NGOs including children's charity 5Rights.

A duty of care has some obvious merits: it could be based on objective risks, based on evidence, and ensure that mitigations are proportionate to those risks. It could take some of the politicisation out of the current debate.

However, it also has obvious problems. For a start, it focuses on risk rather than process. It moves attention away from the fact that interventions are regulating social media users just as much as platforms. It does not by itself tell us that free expression impacts will be considered, tracked or mitigated.

Furthermore, the lack of focus that a duty of care model gives to process means that platform decisions that have nothing to do with risky content are not necessarily based on better decisions, independent appeals and so on. Rather, as has happened with German regulation, processes can remain unaffected when they are outside a duty of care.

In practice, a lot of content which is disturbing or offensive is already banned on online platforms. Much of this would not be in scope under a duty of care but it is precisely these kinds of material which users often complain about, when it is either not removed when they want it gone, or is removed incorrectly. Any model of social media regulation needs to improve these issues, but a duty of care is unlikely to touch these problems.

There are very many questions about the kinds of risk, whether to individuals in general, vulnerable groups, or society at large; and the evidence required to create action. The truth is that a duty of care, if cast sensibly and narrowly, will not satisfy many of the people who are demanding action; equally, if the threshold to act is low, then it will quickly be seen to be a mechanism for wide-scale Internet censorship.

It is also a simple fact that many decisions that platforms make about legal content which is not risky are not the business of government to regulate. This includes decisions about what legal content is promoted and why. For this reason, we believe that a better approach might be to require independent self-regulation of major platforms across all of their content decisions. This requirement could be a legislative one, but the regulator would need to be independent of government and platforms.

Independent self-regulation has not been truly tried. Instead, voluntary agreements have filled its place. We should be cautious about moving straight to government regulation of social media and social media users. The government refuses to regulate the press in this way because it doesn't wish to be seen to be controlling print media. It is pretty curious that neither the media nor the government are spelling out the risks of state regulation of the speech of millions of British citizens.

That we are in this place is of course largely the fault of the social media platforms themselves, who have failed to understand the need and value of transparent and accountable systems to ensure they are acting properly. That, however, just demonstrates the problem: politically weak platforms who have created monopoly positions based on data silos are now being sliced and diced at the policy table for their wider errors. It's imperative that as these government proposals progress we keep focus on the simple fact that it is end users whose speech will ultimately be regulated.

 

 

A tax burden shared...

Uganda's social media tax is leaving people either disconnected or else relying on shared wifi access or VPNs


Link Here9th February 2019

The number of people using the internet in Uganda has dropped by 26% since July 2018, when the country's social media tax was put into force. Prior to the tax's implementation, 47.4% of people in Uganda were using the internet. Three months after the tax was put in place, that number had fallen to 35%.

ISPs charge an additional 200 Ugandan shillings (UGX) per day in social media tax on top of the ISP access fees and standard sales tax. This is nominally 5.4 US cents but is a significant portion of typical Ugandan incomes.

President Yoweri Museveni and several government officials said this was intended to curb online rumor-mongering and to generate more tax revenue.

The tax was the subject of large-scale public protests in July and August 2018. During one protest against the tax, key opposition leader, activist and musician Bobi Wine noted that the tax was enforced to oppress the young generation.

The government expected to collect about UGX 24 billion in revenue from the tax every quarter. But in the first quarter after the tax's implementation, they collected UGX 20 billion. In the second quarter, ending December 2018, they had collected only UGX 16 billion.

While some people have gone offline altogether, others are simply finding different and more affordable ways to connect. People are creating shared access points where one device pays the tax and tethers the rest as a WiFi hotspot, or relying on workplace and public area WiFi networks to access the services.

Other Ugandans are using Virtual Private Network (VPN) applications to bypass the tax. In a statement for The Daily Monitor, the Uganda Revenue Authority's Ian Rumanyika argued that people could not use the VPNs forever, but that doesn't seem to be the case.

In addition to leaving Ugandans with less access to communication and diminished abilities to express themselves online, it has also affected economic and commercial sectors, where mobile money and online marketing are essential components of daily business.

 

 

What have Donald Trump and Donald Tusk got in common?...

Apart from the name Donald, and securing a place in hell, both put American corporate interests above European livelihoods. The Council of the EU approves copyright law that will suffocate European businesses and livelihoods


Link Here 8th February 2019
Full story: Copyright in the EU...Copyright law for Europe

The Council of the EU, headed by Donald Tusk, has just adopted as its common position the deal struck by France and Germany on the controversial EU Copyright Directive that was leaked earlier this week.

While Italy, Poland, the Netherlands, Sweden, Finland and Luxembourg maintained their opposition to the text and were newly joined by Malta and Slovakia, Germany's support of the "compromise" secretly negotiated with France over the last weeks has broken the previous deadlock .

This new Council position is actually more extreme than previous versions, requiring all platforms older than 3 years to automatically censor all their users' uploads, and putting unreasonable burdens even on the newest companies.

The German Conservative--Social Democrat government is now in blatant violation of its own coalition agreement, which rejects upload filters against copyright infringement as disproportionate. This breach of coalition promises will not go down well with many young voters just ahead of the European elections in May. Meanwhile, prominent members of both German government parties have joined the protests against upload filters.

The deal in Council paves the way for a final round of negotiations with the Parliament over the course of next week, before the entire European Parliament and the Council vote on the final agreement. It is now up to you to contact your MEPs, call their offices in their constituencies and visit as many of their election campaign events as you can! Ask them to reject a copyright deal that will violate your rights to share legal creations like parodies and reviews online, and includes measures like the link tax that will limit your access to the news and drive small online newspapers out of business.

Right before the European elections, your voices cannot be ignored! Join the over 4.6 million signatories to the largest European petition ever and tell your representatives: If you break the Internet and accept Article 13, we won't reelect you!

 

 

Updated: Lawmakers with a weak constitution...

New York State lawmaker proposes a 2 dollar tax on all online or offline porn purchases


Link Here8th February 2019
Full story: Pole Tax...Discriminatory taxes on adult entertainment in USA
A new bill introduced late last month in the New York State legislature marks the latest attempt to impose a user tax on porn, or for that matter any sexually oriented media. The proposed bill would slap an extra $2 on to every porn download.

The charge would also apply to offline sexually oriented media, adding the two-buck fee to each magazine or DVD classified as sexually oriented. In fact, the language of New York Assembly Bill AO3417 is so broad that it apparently would apply not only to porn, but even to R-rated movies and TV programs airing on pay cable networks such as HBO or Showtime.

That's because the law as written by Assistant Assembly Speaker Felix W. Ortiz defines sexually oriented as any media that features nude pictures or nude performances. And nude does not even mean completely nude under the bill's wording: exposed breasts or buttocks are enough.

The language of the bill is also unclear on whether the $2 surcharge would apply to free porn downloads, such as on Pornhub and similar tube sites.

Update: Blocking blocked in South Dakota

8th February 2019. See  article from eu.argusleader.com

An attempt to block pornography and other obscene material on all personal devices in South Dakota, then charge users a $20 access fee, was voted down Friday by state lawmakers.

House Bill 1154, written by out-of-state authors, raised serious concerns with lobbyists representing South Dakota retailers and telecommunication companies, who opposed the measure in a meeting of the House Judiciary Committee Friday morning.

 

 

Offsite Article: A Lord Chamberlain for the internet?...


Link Here 8th February 2019
Thanks, but no thanks. By Graham Smith

See article from cyberleagle.com

 

 

Searching for compliance...

Google agrees to censor search in Russia as dictated by the authorities


Link Here7th February 2019
Full story: Internet Censorship in Russia...Russia and its repressive state control of media

Google has agreed to censor search results in Russia as dictated by the country's internet censor. This will allow Google to continue operations in Russia.

Google is one of a few search engines that does not adhere to an official list of banned websites that should not be included in search results. However, Google already deletes 70% of links from its search results to websites that internet censor Roskomnadzor has banned.

In December of 2018, Roskomnadzor charged Google a fine of 500,000 rubles ($7,590) for refusing to subscribe to the banned list. The company did not challenge the agency's decision and chose to pay the fine. The Russian law that made the fine possible does not allow Roskomnadzor to block sites that do not comply with its censorship demands, but that did not stop Roskomnadzor from threatening to block Google within Russian borders regardless.

 

 

Offsite Article: Banned in 1964 and again in 2019...


Link Here7th February 2019
Full story: Facebook Censorship...Facebook quick to censor
An article on US censorship history about the 1964 obscenity case against the avant-garde movie Flaming Creatures, stills from which were banned by Facebook in 2019

See article from indiewire.com

 

 

Offsite Article: Monopolistic data grabbers...


Link Here7th February 2019
Full story: Facebook Privacy...Facebook criticised for discouraging privacy
German Competition watchdog bans Facebook from processing so much data without explicit permissions

See article from bbc.co.uk

 

 

The EU's Curb on the Dissemination of Terrorist Content Will Have a Chilling Effect on Speech...

Is there a special place in hell for EU legislators who promote censorship without even a sketch of a plan of the likely consequences?


Link Here6th February 2019
Full story: Internet Censorship in EU...EU introduces swathes of internet censorship law

Governments around the world are grappling with the threat of terrorism, but their efforts aimed at curbing the dissemination of terrorist content online all too often result in censorship. Over the past five years, we've seen a number of governments--from the US Congress to that of France and now the European Commission (EC)--seek to implement measures that place an undue burden on technology companies to remove terrorist speech or face financial liability.

This is why EFF has joined forces with dozens of organizations to call on members of the European Parliament to oppose the EC's proposed regulation, which would require companies to take down terrorist content within one hour. We've added our voice to two letters--one from Witness and another organized by the Center for Democracy and Technology--asking that MEPs consider the serious consequences that the passing of this regulation could have on human rights defenders and on freedom of expression.

We share the concerns of dozens of allies that requiring the use of proactive measures such as use of the terrorism hash database (already voluntarily in use by a number of companies) will restrict expression and have a disproportionate impact on marginalized groups. We know from years of experience that filters just don't work.

Furthermore, the proposed requirement that companies must respond to reports of terrorist speech within an hour is, to put it bluntly, absurd. As the letter organized by Witness states, this regulation essentially forces companies to bypass due process and make rapid and unaccountable decisions on expression through automated means and furthermore doesn't reflect the realities of how violent groups recruit and share information online.

We echo these and other calls from defenders of human rights and civil liberties for MEPs to reject proactive filtering obligations and to refrain from enacting laws that will have unintended consequences for freedom of expression.

 

 

Updated: As always increased red tape benefits the largest (ie US) companies...

Daily Mail reports on government discussion about a new internet censor, codenamed Ofweb


Link Here 6th February 2019
Wrangling in Whitehall has held up plans to set up a social media censor dubbed Ofweb, The Mail on Sunday reveals.

The Government was due to publish a White Paper this winter on censorship of tech giants but the Mail has learnt it is still far from ready. Culture Secretary Jeremy Wright said it would be published within a month, but a Cabinet source said that timeline was wholly unrealistic. Other senior Government sources went further and said the policy document is unlikely to surface before the Spring.

Key details on how a new censor would work have yet to be decided while funding from the Treasury has not yet been secured. Another problem is that some Ministers believe the proposed clampdown is too draconian and are preparing to try to block or water down the plan.

There are also concerns that technically difficult requirements would benefit the largest US companies as smaller European companies and start ups would not be able to afford the technology and development required.

The Mail on Sunday understands Jeremy Wright has postponed a visit to Facebook HQ in California to discuss the measures, as key details are still up in the air.

Update: The Conservatives don't have a monopoly on internet censorship...Labour agrees

6th February 2019. See  article from ft.com

Labour has called for a new entity capable of taking on the likes of Facebook and Google. Tom Watson, the shadow digital secretary, will on Wednesday say a regulator should also have responsibility for competition policy and be able to refer cases to the Competition and Markets Authority.

According to Watson, any duty of care would only be effective with penalties that seriously affect companies' bottom lines. He has referred to regulators' ability to fine companies up to 4% of global turnover, or €20m, whichever is higher, for worst-case breaches of the EU-wide General Data Protection Regulation.
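The GDPR fine ceiling Watson refers to is simply the larger of two quantities; a minimal sketch (the function name is hypothetical; the 4% and €20m thresholds are those stated in the regulation):

```python
# Sketch of the GDPR worst-case fine rule for the most serious
# breaches: up to 4% of global annual turnover or EUR 20m,
# whichever is higher.

def max_gdpr_fine(global_turnover_eur: float) -> float:
    """Return the upper bound of a fine for the worst-case breach tier."""
    return max(0.04 * global_turnover_eur, 20e6)

# A company turning over EUR 100m faces a cap of EUR 20m (the fixed
# floor dominates); one turning over EUR 10bn faces a cap of EUR 400m.
print(max_gdpr_fine(100e6))  # 20000000.0
print(max_gdpr_fine(10e9))   # 400000000.0
```

The fixed €20m floor is what gives the rule teeth against smaller firms, while the percentage term scales it up for the largest platforms.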

 

 

Commented: Censorship machines roll on to lay waste European business...

Article 13 is back on -- and it got worse, not better


Link Here 6th February 2019
Full story: Copyright in the EU...Copyright law for Europe

Contrary to some reports, Article 13 was not shelved solely because EU governments listened to the unprecedented public opposition and understood that upload filters are costly, error-prone and threaten fundamental rights.

Without doubt, the consistent public opposition contributed to 11 member state governments voting against the mandate, instead of just 6 last year, but ultimately the reform hinges on agreement between France and Germany, who due to their size can make or break blocking minorities. The deadlock is the direct result of their disagreement, which was not about whether to have upload filters at all; they just couldn't agree on exactly who should be forced to install those faulty filters:

The deadlock hinged on a disagreement between France and Germany

  • France's position: Article 13 is great and must apply to all platforms, regardless of size. They must demonstrate that they have done all they possibly could to prevent uploads of copyrighted material. In the case of small businesses, that may or may not mean using upload filters -- ultimately, a court would have to make that call. (This was previously the majority position among EU governments, supported by France, before Italy's newly elected government retracted their support for Article 13 altogether.)

  • Germany's position: Article 13 is great, but it should not apply to everyone. Companies with a turnover below €20 million per year should be excluded outright, so as not to harm European internet startups and SMEs. (This was closer to the European Parliament's current position, which calls for the exclusion of companies with a turnover below €10 million and fewer than 50 employees.)

What brought France and Germany together:

Making Article 13 even worse: in the Franco-German deal, which leaked today, Article 13 does apply to all for-profit platforms. Upload filters must be installed by everyone except those services which fit all three of the following extremely narrow criteria:

  • Available to the public for less than 3 years

  • Annual turnover below €10 million

  • Fewer than 5 million unique monthly users

Countless apps and sites that do not meet all these criteria would need to install upload filters, burdening their users and operators, even when copyright infringement is not at all currently a problem for them. Some examples:

  • Discussion boards on for-profit sites, such as the Ars Technica or Heise.de forums (older than 3 years)

  • Patreon , a platform with the sole purpose of helping authors get paid (fails to meet any of the three criteria)

  • Niche social networks like GetReeled , a platform for anglers (well below 5 million users, but older than 3 years)

  • Small European competitors to larger US brands like wykop, a Polish news sharing platform similar to reddit (well below €10 million turnover, but may be above 5 million users depending on the calculation method)

On top of that, even the smallest and newest platforms, which do meet all three criteria, must still demonstrate they have undertaken "best efforts" to obtain licenses from rightholders such as record labels, book publishers and stock photo databases for anything their users might possibly upload -- an impossible task. In practice, all sites and apps where users may upload material will likely be forced to accept any license a rightholder offers them, no matter how bad the terms, and no matter whether they actually want their copyrighted material to be available on the platform or not, to avoid the massive legal risk of coming into conflict with Article 13.

In summary: France's and Germany's compromise on Article 13 still calls for nearly everything we post or share online to require prior permission by "censorship machines", algorithms that are fundamentally unable to distinguish between copyright infringement and legal works such as parody and critique. It would change the web from a place where we can all freely express ourselves into one where big corporate rightholders are the gatekeepers of what can and can't be published. It would allow these rightholders to bully any for-profit site or app that includes an upload function. European innovation on the web would be discouraged by the new costs and legal risks for startups -- even if they only apply when platforms become successful, or turn 3 years old. Foreign sites and apps would be incentivised to just geoblock all EU users to be on the safe side.
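Because the exemption requires all three criteria simultaneously, failing any one of them pulls a platform back under the filtering obligation. The rule can be read as a simple conjunction; a minimal sketch in Python (function name and the specific figures for each example are our own illustrative assumptions):

```python
def exempt_from_upload_filters(years_public, annual_turnover_eur, monthly_users):
    """A service escapes the upload-filter duty only if it meets
    ALL THREE of the leaked deal's criteria at once."""
    return (years_public < 3
            and annual_turnover_eur < 10_000_000
            and monthly_users < 5_000_000)

# A brand-new, tiny startup: exempt.
print(exempt_from_upload_filters(1, 200_000, 50_000))        # True
# A niche forum older than 3 years fails the age test alone.
print(exempt_from_upload_filters(10, 200_000, 50_000))       # False
# A young platform with high turnover fails the turnover test.
print(exempt_from_upload_filters(2, 50_000_000, 1_000_000))  # False
```

This is why the article's examples (Patreon, long-running forums, niche networks) all end up covered: one failed criterion is enough, and the three-year clock guarantees every surviving service eventually fails it.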

Now everything hinges on the European Parliament

With this roadblock out of the way, the trilogue negotiations to finish the new EU copyright law are back on. With no time to lose, there will be massive pressure to reach an overall agreement within the next few days and pass the law in March or April. The most likely next steps will be a rubber-stamping of the new Council position cooked up by Germany and France on Friday, 8 February, and a final trilogue on Monday, 11 February.
MEPs, most of whom are fighting for re-election, will get one final say. Last September, a narrow majority for Article 13 could only be found in the Parliament after a small business exception was included that was much stronger than the foul deal France and Germany are now proposing -- but I don't have high hopes that Parliament negotiator Axel Voss will insist on this point. Whether MEPs will reject this harmful version of Article 13 (like they initially did last July) or bow to the pressure will depend on whether all of us make clear to them: If you break the internet and enact Article 13, we won't re-elect you.

Update: Now made even worse

6th February 2019. See  article from eff.org by Cory Doctorow

As the German Government Abandons Small Businesses, the Worst Parts of the EU Copyright Directive Come Roaring Back, Made Even Worse

 

 

Artificial intelligence with an IQ of 5...

Facebook bans comedy show advert because the word 'Brexit' means Facebook thinks that it is a political advert


Link Here5th February 2019
Facebook has been caught out censoring a poster for a comedy show because Facebook's simplistic algorithms couldn't distinguish a jokey use of the word 'Brexit' from a political advert.

The social media site has taken drastic action to clamp down on political advertising in a bid to tackle a backlash over secret Russian interference. But it was accused of over-reacting after a comedian was told he couldn't promote his show Brexit Through The Gift Shop.

Comedian Matt Forde was told his stand-up show's title breached new rules on ads about politics or issues of national importance. Facebook told him: There's no way around this other than not using the word Brexit.

The comedian told The Sun that it was incredible that Facebook allowed tech firms to harvest the data of millions without telling them but stopped him from advertising a comedy show. Forde added:

I'm flattered that they think I'm a greater threat to their users than the collapse of global democracy. Obviously what I forgot to do was offer Facebook the personal data of my friends and family.

 

 

Wrangling on Twitter...

Jeremy Clarkson on The Grand Tour winds up gay campaigner joking about gay associations with the Wrangler Jeep


Link Here4th February 2019
Full story: Top Gear and the Grand Tour...Top Gear and Jeremy Clarkson wind up whingers
The Grand Tour presenter Jeremy Clarkson has pushed back at claims of homophobia from gay singer Will Young by joking about enjoying lesbian porn.

LGBT+ campaigner and musician Will Young had hit out at Clarkson after a recent episode of the Amazon motoring show included a running gag alluding to a Jeep Wrangler being gay. The January 27 episode of the Amazon show also saw Clarkson ask whether LGBT stands for lesbian, bacon, transgender.

Young tweeted:

I'm afraid 3 heterosexual men SO uncomfortable with their sexuality that they reference in some lame way a Wrangler Jeep being a Gay mans car

.... and then Hammond and May's quips to Clarkson wearing chaps, a pink shirt, he should get some moisturiser. It's f**king pathetic and actually homophobic.

Clarkson responded to Young also on Twitter:

...I will apologise to Will for causing him some upset and reassure him that I know I'm not homophobic as I very much enjoy watching lesbians on the internet.

 

 

Morality in Media recommends...

Morality in Media rant about NC-17 films on Netflix and kindly publish a list of the sexiest films available to stream


Link Here4th February 2019
Full story: Morality in Media...Miserable campaigners for censorship
Netflix has hundreds of TV shows and movies to choose from. That selection also includes many titles that contain extreme graphic sexual content, with no automatic barrier to accessing them.

Furthermore, Netflix has a flimsy ratings system at best, only very recently adding content ratings to the opening screens of any selection. The lack of descriptive content warnings employed by cable television and other streaming services means that a movie rated "TV-MA" could have anything from a few swear words to gratuitous and explicit sex scenes.

Due to the nature of streaming, many of Netflix's titles do not have the industry-standard MPAA ratings such as PG or R. Instead, many programs fall under the umbrella of TV-MA, meaning "Mature Audiences". Again, Netflix does not require any other content warnings to be included besides this vague description. Netflix currently has several films that fall under the category of TV-MA, and some are even rated NC-17. Many of the films rated TV-MA were originally released as NC-17 or otherwise have extremely graphic sexual content.

The only way to block this content is for parental controls to be turned on, requiring a PIN for either specific titles or ratings such as TV-MA. Read more about Netflix parental controls here.

If movie theaters don't allow those under 17 to see NC-17 (or even R) rated movies, then why is Netflix making these films available to anyone regardless of age?

Below is a list of films currently offered on Netflix that have either a NC-17 or TV-MA rating for graphic sexual content:

  • Love
  • Nymphomaniac I and II
  • Lust Stories
  • Blue is the Warmest Color
  • Y Tu Mamá También
  • Sex and Lucía
  • Below Her Mouth
  • Sex Doll
  • Fragments of Love
  • Love Steaks
  • The Little Death
  • Immoral Tales
  • Desire
  • Newness

 

 

Offsite Article: The age of market dominance...


Link Here4th February 2019
The Register talks to Pornhub's Age ID about progress of UK Age verification schemes

See article from theregister.co.uk

 

 

Offsite Article: Core censorship...


Link Here2nd February 2019
Full story: Internet Censorship in China...All pervading Chinese internet censorship
New Website Exposes How Apple Censors Apps in China

See article from theintercept.com

 

 

Offsite Article: How can internet companies compile databases of our browsing habits without seeking consent?...


Link Here1st February 2019
ICO are asked to investigate by the Open Rights Group and others

See article from theregister.co.uk

