
Americas Censorship News



Deserving of insult...

Canadian politician introduces legislation to ban politicians being insulted online


Link Here 12th April 2021
Full story: Internet Censorship in Canada...Proposal for opt-in internet blocking
A Canadian politician has proposed banning 'hurtful' language directed at politicians online. The provision is to be included in the forthcoming internet censorship bill, due to be discussed in parliament in the next few weeks.

Steven Guilbeault, a 'Liberal' member of parliament, has often been the subject of controversy for favoring internet censorship. He said in a recent podcast:

We have seen too many examples of public officials retreating from public service due to the hateful online content targeted towards themselves.

If the bill passes, social media companies will have to remove posts containing hurtful words targeted at Canadian politicians. The provision is a danger to free speech not only in Canada but also in the rest of the world, as other governments will surely try to get away with similar censorship laws.

 

 

Age of censorship...

An internet porn age verification bill progresses in Canada


Link Here 19th March 2021
Full story: Internet Censorship in Canada...Proposal for opt-in internet blocking
A bill that would require porn websites to implement age verification for users has passed second reading in the Canadian Senate.

Bill S-203, An Act to restrict young persons' online access to sexually explicit material, will now be referred to the Standing Senate Committee on Legal and Constitutional Affairs.

 

 

Updated: Ethical porn...

Pornhub comes under scrutiny in the Canadian Parliament


Link Here 6th February 2021
Full story: Pornhub...An ongoing target of censors
The Committee on Access to Information, Privacy and Ethics in Canada's House of Commons held a hearing concerning allegations made against Pornhub's content moderation policies. The allegations featured in a New York Times article by Nicholas Kristof and were based on the religious group Exodus Cry's Traffickinghub campaign against the tube site and its parent company MindGeek.

MindGeek is headquartered in Luxembourg, although many of its operations are run from Montreal and the two people identified by the New York Times as owners are Canadian nationals.

The committee heard from a witness who retold her story of having difficulty getting Pornhub to remove a video she had shot of herself as a teenager, which she then sent to a boyfriend and which was allegedly repeatedly uploaded to the tube site by unidentified third parties.

The committee also heard from New York lawyer Michael Bowe, who has previously represented disgraced evangelist Jerry Falwell Jr. and Donald Trump. Bowe made repeated claims about a supposed conspiracy, masterminded by MindGeek and its agents and allies, to gaslight public opinion about the organized international campaign against Pornhub. Bowe also asked Canada to change its laws to make MindGeek accountable, and stated that in his opinion the company had committed criminal offenses.

Update: Pornhub Releases Statement About Content Moderation Changes

6th February 2021. See statement from help.pornhub.com

Going forward, we will only allow properly identified users to upload content. We have banned downloads. We have made some key expansions to our moderation process, and we recently launched a Trusted Flagger Program with dozens of non-profit organizations. Earlier this year, we also partnered with the National Center for Missing & Exploited Children, and next year we will issue our first transparency report. Full details on our expanded policies can be found below.

If you wish to report any content that violates our terms of service, including CSAM or other illegal content, please click this link.

1. Verified Uploaders Only

Effective immediately, only content partners and people within the Model Program will be able to upload content to Pornhub. In the new year, we will implement a verification process so that any user can upload content upon successful completion of identification protocol.

2. Banning Downloads

Effective immediately, we have removed the ability for users to download content from Pornhub, with the exception of paid downloads within the verified Model Program. In tandem with our fingerprinting technology, this will mitigate the ability for content already removed from the platform to be able to return.

3. Expanded Moderation

We have worked to create comprehensive measures that help protect our community from illegal content. In recent months we deployed an additional layer of moderation. The newly established "Red Team" will be dedicated solely to self-auditing the platform for potentially illegal material. The Red Team provides an extra layer of protection on top of the existing protocol, proactively sweeping content already uploaded for potential violations and identifying any breakdowns in the moderation process that could allow a piece of content that violates the Terms of Service to slip through. Additionally, while the list of banned keywords on Pornhub is already extensive, we will continue to identify additional keywords for removal on an ongoing basis. We will also regularly monitor search terms within the platform for increases in phrasings that attempt to bypass the safeguards in place.

Pornhub's current content moderation includes an extensive team of human moderators dedicated to manually reviewing every single upload, a thorough system for flagging, reviewing and removing illegal material, robust parental controls, and utilization of a variety of automated detection technologies. These technologies include:

  • CSAI Match, YouTube's proprietary technology for combating Child Sexual Abuse Imagery online

  • Content Safety API, Google's artificial intelligence tool that helps detect illegal imagery

  • PhotoDNA, Microsoft's technology that aids in finding and removing known images of child exploitation

  • Vobile, a fingerprinting software that scans any new uploads for potential matches to unauthorized materials to protect against banned videos being re-uploaded to the platform.

If a user encounters a piece of content they think may violate the Terms of Service, we encourage them to immediately flag the video or fill out the Content Removal Request Form, which is linked on every page.

Our policy is to immediately disable any content reported in the Content Removal Request Form for review.

4. Trusted Flagger Program

We recently launched a Trusted Flagger Program, a new initiative empowering non-profit partners to alert us of content they think may violate our Terms of Service. The Trusted Flagger Program consists of more than 40 leading non-profit organizations in the space of internet and child safety. Our partners have a direct line of access to our moderation team, and any content identified by a Trusted Flagger is immediately disabled. Partners include: Cyber Civil Rights Initiative (United States of America), National Center for Missing & Exploited Children (United States of America), Internet Watch Foundation (United Kingdom), Stopline (Austria), Child Focus (Belgium), Safenet (Bulgaria), Te Protejo Hotline - I Protect You Hotline (Colombia), CZ.NIC - Stop Online (Czech Republic), Point de Contact (France), Eco-Association of the Internet Industry (Germany), Safeline (Greece), Save the Children (Iceland), Latvian Internet Association (Latvia), Meldpunt Kinderporno - Child Pornography Reporting Point (Netherlands), Centre for Safer Internet Slovenia (Slovenia), FPB Hotline - Film and Publication Board (South Africa), ECPAT (Sweden), ECPAT (Taiwan).

5. NCMEC Partnership

Last year, we voluntarily partnered with the National Center for Missing & Exploited Children (NCMEC) in order to transparently report and limit incidents of CSAM on our platform. In early 2021, NCMEC will release our total number of reported CSAM incidents alongside numbers from other major social and content platforms. We will also continue to work with law enforcement globally to report and curb any issues of illegal content.

6. Transparency Report

In 2021, we will release a Transparency Report detailing our content moderation results from 2020. This will identify not just the full number of reports filed with NCMEC, but also other key details related to the trust and safety of our platform. Much like Facebook, Instagram, Twitter and other tech platforms, Pornhub seeks to be fully transparent about the content that should and should not appear on the platform. This will make us the only adult content platform to release such a report.

7. Independent Review

As part of our commitment, in April 2020 we hired the law firm of Kaplan Hecker & Fink LLP to conduct an independent review of our content compliance function, with a focus on meeting legal standards and eliminating all non-consensual content, CSAM and any other content uploaded without the meaningful consent of all parties. We requested that the goal of the independent review be to identify the requisite steps to achieve a "best-in-class" content compliance program that sets the standard for the technology industry. Kaplan Hecker & Fink LLP is continuing its review, but has already identified and categorized a comprehensive inventory of remedial recommendations, supported by dozens of additional sub-recommendations, in addition to the steps identified above, based on an evaluation and assessment of our current policies and practices. Kaplan Hecker & Fink LLP is soliciting information to assist with its review and in developing recommendations regarding our compliance policies and procedures. If you would like to provide compliance suggestions, you can do so here.
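For background: the statement above refers several times to fingerprinting and hash-matching tools (Vobile, PhotoDNA, CSAI Match) without describing how they work. The common idea is to compute a fingerprint of every new upload and compare it against a database of fingerprints taken from content that has already been removed or appears on a known illegal-content list. The short Python sketch below illustrates only that matching step; the function names and the use of a plain SHA-256 hash are illustrative assumptions and do not reflect any vendor's actual implementation, which relies on perceptual fingerprints robust to re-encoding and cropping.

    import hashlib

    # Hypothetical store of fingerprints taken from previously removed or
    # known illegal content. Real systems (PhotoDNA, CSAI Match, Vobile)
    # use perceptual fingerprints so that re-encoded or slightly altered
    # copies still match; a plain SHA-256 hash is used here only to
    # illustrate the matching step.
    blocked_fingerprints = set()

    def fingerprint(data: bytes) -> str:
        # Toy fingerprint: SHA-256 of the raw bytes (illustration only).
        return hashlib.sha256(data).hexdigest()

    def register_removed_content(data: bytes) -> None:
        # Record the fingerprint of content that has been taken down.
        blocked_fingerprints.add(fingerprint(data))

    def screen_upload(data: bytes) -> bool:
        # Return True if the upload may proceed, False if it matches
        # previously removed or known illegal content.
        return fingerprint(data) not in blocked_fingerprints

On this hypothetical model, banning downloads (point 2 above) matters because every re-upload of a removed video must then pass back through screen_upload, where its fingerprint can be caught.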

 

 

Harming democratic expression...

The misleadingly titled Canadian Commission on 'Democratic Expression' bizarrely calls for internet censorship


Link Here 27th January 2021
Full story: Internet Censorship in Canada...Proposal for opt-in internet blocking

Following nine months of study and deliberations, the Canadian Commission on Democratic Expression has settled on a series of principles and recommendations that can lead to a practical course of action. What we set forth is a series of functional steps to enable citizens, governments and platforms to deal with the matter of harmful speech in a free and democratic, rights-based society like Canada. We recognize the complexity of the issues at play and offer these as a path forward and with the knowledge they will be subject to further debate and molding.

PRINCIPLES

  • Free speech is fundamental to a democratic society, and the internet enables more people to participate in public discussions and debates.

  • The rise of hatred, disinformation, conspiracies, bullying and other harmful communications online is undermining these gains and having a corrosive impact on democratic expression in Canada.

  • The status quo of leaving content moderation to the sole discretion of platforms has failed to stem the spread of these harms, and platform companies can find themselves in conflict between their private interests and the public good.

  • We find fault with the notion that platforms are neutral disseminators of information. Platforms curate content to serve their commercial interests and so must assume greater responsibility for the harms they amplify and spread.

  • Government must play a more active role in furthering the cause of democratic expression and protecting Canadians from online harms.

  • Any policy response must put citizens first, reduce online harms and guard against the potential for over-censorship of content in putting forth remedies. This requires a balanced and multi-pronged approach.

These principles have led the Commission to an integrated program of six scaffolding recommendations.

RECOMMENDATIONS

1. A new legislated duty on platforms to act responsibly.

Establishment by Parliament of a statutory Duty to Act Responsibly imposing an affirmative requirement on platforms under legislation and regulation, including social media companies, large messaging groups, search engines and other internet operators involved in the dissemination of user-generated and third-party content. In addressing harms, the details of this duty must take account of principles such as the fundamental nature of free speech.

2. A new regulator to oversee and enforce the Duty to Act Responsibly.

Creation of a new regulatory body, operating within legislated guidelines, that represents the public interest and moves content moderation and platform governance beyond the exclusive preserve of private sector companies. The regulator would oversee a Code of Conduct to guide the actions of parties under its supervision, while recognizing that not all platforms can be treated in precisely the same manner. Regulatory decisions will be judicially made, based in the rule of law and subject to a process of review.

3. A Social Media Council to serve as an accessible forum in reducing harms and improving democratic expression on the internet.

Ensuring an inclusive dialogue on ongoing platform governance policies and practices, including content moderation, through a broadly based social media council that places platforms, civil society, citizens and other interested parties around the same table.

4. A world-leading transparency regime to provide the flow of necessary information to the regulator and Social Media Council.

Embedding significant, world-leading transparency mechanisms at the core of the mandate for the regulator and Social Media Council -- on data, ads, bots and the right to compel information. This will also assist researchers, journalists and members of the public with access to the information required for a publicly accountable system.

5. Avenues to enable individuals and groups to deal with complaints of harmful content in an expeditious manner. An e-tribunal to facilitate and expedite dispute resolution and a process for addressing complaints swiftly and lightly before they become disputes.

Creating rapid and accessible recourse to content-based dispute settlement by a dedicated e-tribunal charged with addressing online content disputes in a timely manner. And creating a process that enables targets of harms to compel platforms to make creators aware of a complaint.

6. A mechanism to quickly remove content that presents an imminent threat to a person.

Development of a quick-response system under the authority of the regulator to ensure the rapid removal of content -- even temporarily -- that creates a reasonable apprehension of an imminent threat to the health and safety of the targeted party.

The Commission considered imposing takedown requirements on platforms as some nations have done. These generally identify offending categories of content, provide a fixed window, such as 24 hours, for it to be removed and may levy significant penalties. We are concerned that such systems could create incentives for over-censorship by platform companies. Our recommendations do less to circumscribe speech in recognition of the fact that harmful speech and the unjustified denial of freedom of expression are both problematic.

The standards of the Duty to Act Responsibly are purposely left vague at this point to give government, the regulator, and the Social Media Council an opportunity to flesh it out as part of a Code of Conduct. To be sure, we are not recommending the creation of a self-standing new tort as the basis for a cause of action, but rather imposing affirmative requirements on the platforms to be developed under legislation and regulation. We expect the Code will evolve given the recentness of the problem and the rapid evolution of the internet.

