Melon Farmers Original Version

EU Censorship News


Latest

 2006   2007   2008   2009   2010   2011   2012   2013   2014   2015   2016   2017   2018   2019   2020   2021   2022   2023   2024   Latest 

 

AI porn sniffing censor...

German Government Now Exporting Anti-Porn Surveillance Tool


Link Here 14th February 2024
Full story: Age Verification in Germany...Requiring age verification for adult websites
A moral campaigner who has been waging a one-man War on Porn in Germany, and who developed an AI tool that scans online content to identify porn images, has now exported that technology for use by a Belgian media censor.

Tobias Schmid, director of the State Media Authority of North Rhine-Westphalia, announced the tool after supervising its development himself. He named it KIVI, a word play combining KI (the German abbreviation for artificial intelligence) with the Latin vigilare, to watch over.

A spokeswoman for the State Media Authority of North Rhine-Westphalia confirmed to NetzPolitik that there were exploratory talks taking place regarding expanding the use of KIVI across Europe. Last week, it was confirmed that Belgium's Superior Audiovisual Council (CSA) is also automatically searching the Internet, looking for freely accessible pornography, among other things.

KIVI was developed for Schmid by Berlin-based Condat AG and is currently being used by all 14 state media authorities in Germany. In addition to pornography, KIVI is also trained to detect categories like extremism, hate speech, swastikas or the glorification of drugs.

Belgium's CSA is now scanning X.com for adult content, NetzPolitik's Meineck reported, noting: From September to December 2023, around 5,000 suspicious activity reports were collected. Examiners viewed around a fifth of them, and around 90% of this content was 'clearly' pornographic, and thus should not be accessible without strict age controls.

 

 

Coining it in...

Spain announces a plan to require age/identity verification for online porn viewers


Link Here 16th December 2023
The government of Spanish Prime Minister Pedro Sánchez intends to implement age verification to access adult content on the internet across the board to prevent minors from viewing age-restricted websites. Spain's data regulator Agencia Española de Protección de Datos (AEPD) is developing a process to require web users to utilize a digital ID card.

The Royal Spanish Mint will be directed to develop the digital ID technology following recommendations from the AEPD. One format floated by the agency would have users install an app on their mobile device, present a QR code, or use some other type of digital document verifying their age through a government ID, health or residence card, a driver's license, or a passport. AEPD claims that this approach minimizes the risk of a data breach since third parties--such as a private sector age verification software vendor or a regulated platform--will not be able to access a user's sensitive personally identifiable information.

Unfortunately, there is no guarantee of sensitive personally identifiable information being safe in the hands of a government agency or private company. Consider a case that occurred in Louisiana, which was the first U.S. state to require an ID to view adult content. Seeking to comply with the law, the tube site Pornhub adopted an age verification solution that integrated with the state's digital identification app, LA Wallet.

Months after Pornhub's deployment of LA Wallet, the company and the agency administering the digital wallet program were victims of a data breach. A local news report indicates that over 6 million records from the Louisiana Office of Motor Vehicles were exposed by hackers in June 2023. Names, addresses, ID numbers, social security numbers, heights, weights and eye colors were exposed in a breach of the agency's file transfer software.

Even with the best intentions and risk mitigation, the AEPD will not be able to completely prevent a breach of data. That is one major concern among critics of age verification.

 

 

Testing old ideas...

Italy introduces network level blocking for SIM cards registered to under 18s


Link Here 19th November 2023
Full story: Internet Censorship in Italy...Censorship affecting bloggers and the press in Italy
Italy will begin enforcing a new, experimental directive from the country's internet censor requiring all phone providers to install a default filter for adult content on SIM cards registered to minors.

The directive from the Italian Communications Regulatory Authority (AGCOM) was approved in January and published on Feb. 21, allowing telecom companies nine months for full implementation.

AGCOM Commissioner Massimiliano Capitanio told Italian media that the measure is a testing ground to verify the real desire of adults to take an active part in the digital education of their children.

Adult content categorized for filtering includes all websites for an adult audience, showing full or partial nudity in a pornographic sexual context, sexual accessories, sexually oriented activities, and sites that support the online purchase of such goods and services.

Besides adult content, other material designated for filtering includes sites related to gambling, weapons sales, violence, self-injury or suicide; sites that display scenes of gratuitous, sustained or brutal violence; and sites promoting hatred or intolerance toward any individual or group, or promoting practices that can damage health, like anorexia or bulimia, or the use of drugs, alcohol or tobacco.

Another blocked category is sites that provide tools and methods to make online activity untraceable, including VPNs.

 

 

EU snoopers foiled...

European Parliament votes against an EU Commission proposal for mass scanning of all internet communication


Link Here 16th November 2023
Full story: Internet Encryption in the EU...Encryption is legal for the moment but the authorities are seeking to end this

On 14th November, Members of the European Parliament's Civil Liberties committee voted against attempts from EU Home Affairs officials to roll out mass scanning of private and encrypted messages across Europe. It was a clear-cut vote, with a significant majority of MEPs supporting the proposed position.

A political deal struck by the Parliament's seven political groups at the end of October meant that this outcome was expected. Nevertheless, this is an important and welcome milestone, as Parliamentarians demand that EU laws are based on objective evidence and scientific reality, and respect human rights law.

This vote signals major improvements compared to the Commission's original draft law (coined 'Chat Control'), which has courted controversy. The process around the legislation has faced allegations of conflicts of interest and illegal advert micro-targeting, and rulings of "maladministration". The proposal has also been widely criticised for failing to meet EU requirements of proportionality -- with lawyers for the EU member states making the unprecedented critique that the proposal likely violates the essence of the right to privacy.

In particular, the vote shows the strong political will of the Parliament to remove the most dangerous parts of this law -- mass scanning, undermining digital security and mandating widespread age verification. Parliamentarians have recognised that no matter how important the aim of a law, it must be pursued using only lawful and legitimate measures.

At the same time, there are parts of their position which still concern us, and which would need to be addressed if any final law were to be acceptable from a digital rights point of view. Coupled with mass surveillance plans from the Council of member states and attempts from the Commission to manipulate the process, we remain sceptical about the chances of a good final outcome.

Civil liberties MEPs also voted for this position to become the official position of the European Parliament. On 20th November, the rest of the house will be notified about the intention to permit negotiators to move forward without an additional vote. Only after that point will the position voted on today be confirmed as the European Parliament's mandate for the CSA Regulation.

Mass scanning (detection orders)

The European Parliament's position firmly rejects the premise that in order to search for child sexual abuse material (CSAM), all people's messages may be scanned (Articles 7-11). Instead, MEPs require that specific suspicion must be required -- a similar principle to warrants. This is a vital change which would resolve one of the most notorious parts of the law. The position also introduces judicial oversight of hash lists (Article 44.3), which we welcome. However, it unfortunately does not distinguish between basic hashing (which is generally seen as more robust) and perceptual hashing (which is less reliable).
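The distinction between basic and perceptual hashing matters here. A minimal illustrative sketch (not the actual tooling used under the Regulation, and using a toy average-hash rather than a production algorithm such as PDQ or PhotoDNA): a cryptographic hash changes completely when even one byte of the input changes, so it can only match exact copies of known material, while a perceptual hash is designed to stay stable under small changes -- which also means it can misfire on unrelated but similar-looking images.

```python
import hashlib

def exact_hash(data: bytes) -> str:
    # Cryptographic hash: any change to the input, however small,
    # produces a completely different digest.
    return hashlib.sha256(data).hexdigest()

def average_hash(pixels) -> str:
    # Toy perceptual "average hash": one bit per pixel, set when the
    # pixel is brighter than the image's mean brightness. Visually
    # similar images yield identical or near-identical bit strings.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

# A tiny 4x4 grayscale "image" and a near-duplicate with one pixel nudged.
img = [[10, 200, 10, 200],
       [200, 10, 200, 10],
       [10, 200, 10, 200],
       [200, 10, 200, 10]]
tweaked = [row[:] for row in img]
tweaked[0][0] = 12  # an imperceptible brightness change

orig_bytes = bytes(p for row in img for p in row)
tweak_bytes = bytes(p for row in tweaked for p in row)

print(exact_hash(orig_bytes) == exact_hash(tweak_bytes))  # False: exact match fails
print(average_hash(img) == average_hash(tweaked))         # True: perceptual match holds
```

The robustness that lets a perceptual hash survive re-encoding or cropping is exactly what makes it less reliable as evidence: distinct images can collide, which is why oversight of how hash lists are compiled and matched is significant.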

At the same time, the wording also needs improvement to ensure legal certainty. The Parliament position rightly confirms that scanning must be "targeted and specified and limited to individual users, [or] a specific group of users" (Article 7.1). This means that there must be "reasonable grounds of suspicion a link [...] with child sexual abuse material" (Articles 7.1. and 7.2.(a)). However, despite attempts in Recital (21) to interpret the "specific group of users" narrowly, we are concerned that the phrasing "as subscribers to a specific channel of communications" (Article 7.1.) is too broad and too open to interpretation. The concept of "an indirect link" is also ambiguous in the context of private messages, and should be deleted or clarified.

The Parliament's position deletes solicitation (grooming) detection from the scope of detection orders, recognising the unreliability of such tools. However, the fact that solicitation remains in the scope of risk assessment (Articles 3 and 4) still poses a risk of incentivising overly-restrictive measures.

End-to-end encryption

The European Parliament's position states that end-to-end encrypted private message services -- like WhatsApp, Signal or ProtonMail -- are not subject to scanning technologies (Articles 7.1 and 10.3). This is a strong and clear protection to stop encrypted message services from being weakened in a way that could harm everyone who relies on them -- a key demand of civil society and technologists.

Several other provisions throughout the text, such as a horizontal protection of encrypted services (Article 1.3a and Recital 9a), give further confirmation of the Parliament's will to protect one of the only ways we all have to keep our digital information safe.

There is a potential (likely unintended) loophole in the Parliament's position on end-to-end encryption, however, which must be addressed in future negotiations. Whilst encrypted 'interpersonal communications services' (private messages) are protected, there is no explicit protection for other kinds of encrypted services ('hosting services').

It would therefore be important to amend Article 1.3a. to ensure that hosting providers, such as providers of personal cloud backups, cannot be required to circumvent the security and confidentiality of their services with methods designed to access encrypted information, and to amend Article 7.1. so that it is not limited to interpersonal communications.

Age verification & other risk mitigation measures

The European Parliament's position is mixed when it comes to age verification and other risk mitigation measures. EDRi has been clear that mandatory age verification at EU level would be very risky -- and we are glad to see that these concerns have been acted upon. The European Parliament's position protects people's anonymity online by removing mandatory age verification for private message services and app stores, and adds a series of strong safeguards for its optional use (Article 4.3.a.(a)-(k)). This is a positive and important set of measures.

On the other hand, we are disappointed that the Parliament's position makes age verification mandatory for porn platforms (Article 4a.) -- a step that is not coherent with the overall intention of the law. What's more, the cumulative nature of the risk mitigation measures for services directly targeting children in the Parliament's position (Article 4.1.(aa)) needs further attention.

This is because there is no exception given for cases where the measures might not be right for a particular service, and could instead risk platforms or services deciding to exclude young people from their services to avoid these requirements.

We recommend that there should not be mandatory age verification for porn platforms, and that risk mitigation measures should oblige providers to achieve a specific outcome, rather than creating overly-detailed (and sometimes misguided) service design requirements. We also warn that the overall CSA Regulation framework should not incentivise the use of age verification tools.

Voluntary scanning

The European Parliament's position does not include a permanent voluntary scanning regime, despite some MEPs calling for such an addition. This is an important legal point: if co-legislators agree that targeted scanning measures are a necessary and proportionate limitation on people's fundamental human rights, then they cannot leave such measures to the discretion of private entities. The Parliament's position does, however, extend the currently-in-force interim derogation by nine months (Article 88.2).


