
Copyright and Control Freaks


2020: April-June


 

Netherlands puts copyright controls above human rights...

Dutch Law Proposes a Wholesale Jettisoning of Human Rights Considerations in Copyright Enforcement. By Cory Doctorow


Link Here 30th June 2020
Full story: Copyright in the EU...Copyright law for Europe

With the passage of last year's Copyright Directive, the EU demanded that member states pass laws that reduce copyright infringement by internet users, while also requiring that they safeguard users' fundamental rights (such as the right to free expression) and the limitations to copyright. These safeguards must include protections for the new EU-wide exemption for commentary and criticism. Meanwhile, states are also required to uphold the GDPR, which safeguards users against mass, indiscriminate surveillance, while somehow monitoring everything every user posts to decide whether it infringes copyright.

Serving these competing goals means that when EU member states turn the Directive into their national laws (the "transposition" process), their governments will have to decide which parts of the Directive to give more weight, and courts will have to figure out whether the resulting laws pass constitutional muster while satisfying the requirement that EU members follow the Directive's rules.

The initial forays into transposition were catastrophic. First came France's disastrous proposal, which "balanced" copyright enforcement with Europeans' fundamental rights to fairness, free expression, and privacy by simply ignoring those public rights.

Now, the Dutch Parliament has landed in the same untenable legislative cul-de-sac as their French counterparts, proposing a Made-in-Holland version of the Copyright Directive that omits:

  • Legally sufficient protections for users unjustly censored due to false accusations of copyright infringement;

  • Legally sufficient protection for users whose work makes use of the mandatory, statutory exemptions for parody and criticism;

  • A ban on "general monitoring" -- that is, continuous, mass surveillance;

  • Legally sufficient protection for "legitimate uses" of copyright works.

These are not optional elements of the Copyright Directive. These protections were enshrined in the Directive as part of the bargain meant to balance the fundamental rights of Europeans against the commercial interests of entertainment corporations. The Dutch Parliament's willingness to brush aside these human rights-preserving measures as legislative inconveniences is a grim harbinger of other EU nations' pending lawmaking, and an indictment of its commitment to human rights.

EFF was pleased to lead a coalition of libraries, human rights NGOs, and users' rights organizations in an open letter to the EU Commission asking it to monitor national implementations and ensure that they respect human rights.

In April, we followed this letter with a note to the EC's Copyright Stakeholder Dialogue Team, setting out the impossibility of squaring the Copyright Directive with the GDPR's rules protecting Europeans from "general monitoring," and calling on them to direct member-states to create test suites that can evaluate whether companies' responses to their laws live up to their human rights obligations.

Today, we renew these and other demands, and we ask that Dutch Parliamentarians do their job in transposing the Copyright Directive, with the understanding that the provisions that protect Europeans' rights are not mere ornaments, and any law that fails to uphold those provisions is on a collision course with years of painful, costly litigation.

 

 

Offsite Article: A tweak too far...


Link Here 16th June 2020
Microsoft claims that tweaking Windows 10 to remove annoying features is a DMCA copyright violation

See article from torrentfreak.com

 

 

Offsite Article: EFF Petitions the Appeals Court:...


Link Here 23rd May 2020
Reverse Legal Gotchas on Ordinary Internet Activities Using RSS. By Mitch Stoltz

See article from eff.org

 

 

Updated: Googling for newspaper revenue...

Australia accelerates its link tax proposals


Link Here 23rd April 2020
The Australian Competition and Consumer Commission (ACCC) is accelerating its proposals to require social media companies to share revenue obtained from sharing or linking to Australian media sources.

A mandatory code being developed by the ACCC will include penalties for Google, Facebook and other media platforms that share news content.

The code, originally scheduled for November 2020, is being brought forward as newspapers struggle for income during the coronavirus lockdown.

The code originally required internet companies to negotiate in good faith on how to pay news media for use of their content, advise news media in advance of algorithm changes that would affect content rankings, favour original source news content in search page results, and share data with media companies. But of course, limited success in early negotiations between the platforms and the news industry has led to a more mandatory approach.

Now a draft code will be finalised by the end of July.

Update: Britain too

23rd April 2020. See article from dailymail.co.uk

Facebook and Google should be made to pay for news content generated by the UK media to avoid the death of the industry, UK ministers were told today.

Ex-Culture Committee chair Damian Collins is urging the government to follow the example of Australia, where new rules are being brought in to help prop up publications amid coronavirus turmoil.

 

 

Calling for perpetual internet domination licenses for YouTube and Facebook...

The EFF responds to a petition calling for EU censorship machines to be required in the US too. By Katharine Trendacosta and Corynne McSherry


Link Here 22nd April 2020

Right now, we really are living our everyday lives online. Teachers are trying to teach classes online, librarians are trying to host digital readings, and trainers are trying to offer home classes.

With more people entering the online world, more people are encountering the barriers created by copyright. Now is no time to make those barriers higher, but a new petition directed at tech companies does exactly that, and in the process tries to do for the US what Article 17 of last year's European Copyright Directive is doing for Europe -- create a rule requiring online service providers to send everything we post to the Internet to black-box machine learning filters that will block anything that the filters classify as "copyright infringement."

The petition, from musical artists, calls on companies to "reduce copyright infringement by establishing 'standard technical measures.'" The argument is that, because of COVID-19, music labels and musicians cannot tour and, therefore, are having a harder time making up for losses due to online copyright infringement. So, the argument goes, the platforms must do more to prevent that infringement.

Musical artists are certainly facing grave financial harm due to COVID-19, so it's understandable that they'd like to increase their revenue wherever they can. But there are at least three problems with this approach, and each would harm Internet users without achieving the ends musicians are seeking.

First, the Big Tech companies targeted by the petition already employ a wide variety of technical measures in the name of blocking infringement, and long experience with these systems has proven them to be utterly toxic to lawful online speech. YouTube even warned that the current crisis would prompt even more mistakes, since human review and appeals were going to be reduced or delayed. It has, at least, decided not to issue strikes except where it has "high confidence" that there was some violation of YouTube policy. In a situation where more people than ever are relying on these platforms to share their legally protected expression, we should, if anything, be looking to lessen the burden on users, not increase it. We should be looking to make these systems fairer and more transparent, and appeals more accessible, not adding more technological barriers.

YouTube's Content ID tool has flagged everything from someone speaking into a mic to check the audio to a synthesizer test. Scribd's filter caught and removed a duplicate upload of the Mueller report, despite the fact that anything created by a federal government employee as part of their work can't even be copyrighted. Facebook's Rights Manager keeps flagging its users' performances of classical music composed hundreds of years ago. Filters can't distinguish lawful from unlawful content. Human beings need to review these matches.

But they don't. Or if they do, they aren't trained to distinguish lawful uses. Five rightsholders were happy to monetize ten hours of static because Content ID matched it. Sony rejected the dispute filed by one Bach performer, who only got his video unblocked after leveraging public outrage. A video explaining how musicologists determine whether one song infringes on another was taken down by Content ID, and the system was so confusing that law professors who are experts in intellectual property couldn't figure out what effect the claim would have on their account if they disputed it. They only got the video restored because they were able to get in touch with YouTube via their connections. Private connections, public outrage, and press coverage often get these bad matches undone, but they are not a substitute for a fair system.

Second, adding more restrictions will make creating and sharing our common culture harder at a time when, if anything, it needs to be easier. We should not require everyone online to become an expert in law and in the specific labyrinthine policies of a company or industry just when whole new groups of people are transferring their lives, livelihoods, and communities to the Internet.

If there's one lesson recent history has taught us, it's that "temporary, emergency measures" have a way of sticking around after the crisis passes, becoming a new normal. For the same reason that we should be worried about contact tracing apps becoming a permanent means for governments to track and control whole populations, we should be alarmed at the thought that all our online lives (which, during the quarantine, are almost our whole lives) will be subjected to automated surveillance, judgment and censorship by a system of unaccountable algorithms operated by massive corporations where it's impossible to get anyone to answer an email.

Third, this petition appears to be a dangerous step toward the content industry's Holy Grail: manufacturing an industry consensus on standard technical measures (STMs) to police copyright infringement. According to Section 512 of the Digital Millennium Copyright Act (DMCA), service providers must accommodate STMs in order to receive the safe harbor protections from the risk of crippling copyright liability. To qualify as an STM, a measure must (1) have been developed pursuant to a broad consensus in an "open, fair, voluntary, multi-industry standards process"; (2) be available on reasonable and nondiscriminatory terms; and (3) cannot impose substantial costs on service providers. Nothing has ever met all three requirements, not least because no "open, fair, voluntary, multi-industry standards process" exists.

Many in the content industries would like to change that, and some would like to see the U.S. follow the EU in adopting mandatory copyright filtering. The EU's Copyright Directive -- of which Article 17 is the most controversial part -- passed a year ago, but only one country has made progress towards implementing it [pdf]. Even before the current crisis, countries were having trouble reconciling the rights of users, the rights of copyright holders, and the obligations of platforms into workable law. The United Kingdom took Brexit as a chance not to implement it. And requiring automated filters in the EU runs into the problem that the EU has recognized the danger of algorithms by giving users the right not to be subject to decisions made by automated tools.

Put simply, the regime envisioned by Article 17 would end up being too complicated and expensive for most platforms to build and operate. YouTube's Content ID alone has cost $100,000,000 to date, and it just filters videos for one service. Musicians are 100 percent right to complain about the size and influence of YouTube and Facebook, but mandatory filtering creates a world in which only YouTube and Facebook can afford to operate. Cementing Big Tech's dominance is not in the interests of musicians or users. Mandatory copyright filters aren't a way to control Big Tech: they're a way for Big Tech to buy Perpetual Internet Domination licenses that guarantee that they need never fear a competitor.

Musicians are under real financial stress due to COVID-19, and they are not incorrect to see something wrong with just how much of the world is in the hands of Big Tech. But things will not get better for them or for users by entrenching its position or making it harder to share work online.

 

 

Offsite Article: The EU has a copyright on impossible to comply with censorship law...


Link Here 20th April 2020
Full story: Copyright in the EU...Copyright law for Europe
The internet industry is still scratching its head about an upcoming EU copyright law requiring social media to block uploads of illegal content whilst requiring that they do not over-block legal content

See article from euractiv.com

 

 

Justice shared...

US court rules that posting images on Instagram effectively grants third-party websites copyright permission to embed those images


Link Here 15th April 2020
A US court ruled yesterday that Mashable can embed a professional photographer's photo without breaking copyright law, thanks to Instagram's terms of service.

The New York district court determined that Stephanie Sinclair granted a valid, sublicensable license to the photograph when she posted it publicly on Instagram.

The case stems from a 2016 Mashable post on female photographers, which included Sinclair and embedded an image from her Instagram feed. Mashable had previously failed to license the image directly, and Sinclair sued parent company Ziff Davis for using Instagram embedding as a workaround.

Judge Kimba Wood noted that Instagram reserves a fully paid and royalty-free, transferable, sub-licensable right to photos on its service. If a photo is posted publicly, Instagram also offers embedding as an option -- which, in Wood's estimation, effectively grants a sublicense to display the picture. "The user who initially uploaded the content has already granted Instagram the authority to sublicense the use of 'public' content to users who share it," Wood wrote. That makes copyright questions moot: "By posting the photograph to her public Instagram account, Plaintiff made her choice."

 

 

Offsite Article: Opening gambit...


Link Here 10th April 2020
Full story: Copyright in the EU...Copyright law for Europe
France reports that its implementation of the EU Copyright Directive requires Google to pay for links to French news sources

See article from politico.eu

 

 

Guilty until it provides proof of innocence...

Google is accused in court of downranking competitors in searches. A judge says that it must reveal its algorithms to prove its unlikely denial of the accusation


Link Here 5th April 2020
Full story: Google Transparency Reports...Google reveals the scale of copyright claims
