Indonesia unblocks Tumblr now the porn has been censored

26th December 2018

See article from coconuts.co
Indonesia's Ministry of Communications and Information (Kominfo) has announced that the social media website Tumblr is once again accessible in Indonesia now that its adult content has been removed. Kominfo spokesperson Ferdinandus Setu stated in a press release that the ministry lifted its restrictions on Tumblr after the site stopped allowing adult content on its platform from 17th December. The site had been blocked by Kominfo in March of this year under the country's repressive anti-pornography laws.

26th December 2018

From state bans to corporate censorship, free speech is in trouble. By Andrew Doyle. See article from spiked-online.com

24th December 2018

A bumper crop of pending litigation and legislative initiatives for the coming year (without even thinking about Brexit). See article from cyberleagle.com

New York State is considering legislation that demands gun licence applicants hand over social media and Google passwords so that these accounts can be checked for political correctness

21st December 2018

See petition from actionnetwork.org
See bill from legislation.nysenate.gov
A bill recently introduced to the New York State Senate by Senator Kevin Parker and Brooklyn Borough President Eric Adams would require gun licence applicants to hand over their social media passwords, three years of social media history and a year of search history for review by the State. Regardless of how you feel about gun rights, this is a clear violation of privacy; a request like this in any context is completely inappropriate and totally unconstitutional. Background checks are one thing, but the process outlined in this bill goes way too far. This isn't about gun rights, it is about privacy rights. The authorities intend to check that all licence applicants are totally politically correct. The relevant text of the bill reads:
In order to ascertain whether any social media account or search engine history of an applicant presents any good cause for the denial of a license, the investigating officer shall, after obtaining the applicant's consent pursuant to subdivision three of
this section, and obtaining any log-in name, password or other means for accessing a personal account, service, or electronic communications device necessary to review such applicant's social media accounts and search engine history, review an
applicant's social media accounts for the previous three years and search engine history for the previous year and investigate an applicant's posts or searches related to:
(i) commonly known profane slurs or biased language used to describe the race, color, national origin, ancestry, gender, religion, religious practice, age, disability or sexual orientation of a person;
(ii) threatening the health or safety of another person;
(iii) an act of terrorism; or
(iv) any other issue deemed necessary by the investigating officer.
For the purposes of this subdivision, "social media accounts" shall only include facebook, snapchat, twitter and instagram, and "search engine" shall only include google, yahoo and bing.
Security experts have long warned that it is extremely dangerous to give your password to anyone, including your local police department. It not only exposes you to unreasonably intrusive analysis, but also exposes private details of everyone you have ever communicated with online. If your friend wants to buy a gun, does that mean the police should get to read every message you've ever sent them? The best thing we can do is reject these ideas right now to prevent bad privacy practices from becoming normalized. It makes perfect sense to require background checks and other vetting before allowing someone to purchase a weapon, but setting any precedent that allows the government to demand social media passwords is extremely dangerous. If you care about privacy, and keeping a close eye on overreaching state power, please sign this petition and tell the NY State Senate that you oppose bill S9191. Sign the petition from actionnetwork.org

MPs nod through the BBFC internet porn censorship guidelines

19th December 2018

See parliamentary transcription from theyworkforyou.com
See TV recording from parliamentlive.tv
The House of Commons approved the upcoming internet porn censorship scheme to be implemented by the BBFC from about Easter 2019. The debate was scheduled as three sessions, one to approve each of the three documents defining the BBFC censorship guidelines, with each allotted 90 minutes of detailed debate on how the BBFC would proceed. However, following a Brexit debate, it was curtailed to a single 90-minute session covering all three documents. It didn't matter much, as the debate consisted only of MPs with a feminist agenda saying that the scope of the censorship didn't go far enough. Even the government spokeswoman leading the debate didn't understand why the rules didn't go further in extending censorship to social media, and why the range of porn to be banned outright wasn't more extensive. Hardly a word said was relevant to the topic of examining the BBFC guidelines. Issues of practicality, privacy, and the endangerment of porn viewers through fraud, outing and blackmail are clearly of no interest to MPs. The MPs duly nodded their approval of the BBFC regime, so it will soon be announced when the censorship will commence.

One age verification service provider was quick to follow up with a press release extolling the virtues of its porn viewing card approach. Several newspapers obligingly published articles using it, eg: Porn sites 'will all require proof of age from April 2019' -- here's how it'll work from metro.co.uk

19th December 2018

Turning Off Facebook Location Tracking Doesn't Stop It From Tracking Your Location. See article from gizmodo.com.au

19th December 2018

Tumblr Bloggers Trying To Fool The Censor Bots With These Tricks. See article from valuewalk.com

16th December 2018

As usual the EU's cunning plan to try to make Google fund newspapers will end up suffocating European small businesses whilst making US internet giants even more powerful. See article from politico.eu

16th December 2018

And the corollary is that all encryption apps which continue to operate in Australia have backdoors and so are unsafe to use. See article from motherboard.vice.com

The upcoming porn censorship regime has been approved by the Lords

14th December 2018

See article from xbiz.com
See transcript of the Lords debate from theyworkforyou.com
On Tuesday the House of Lords approved the BBFC's scheme to implement internet porn censorship in the UK. Approval will now be sought from the House of Commons. The debate in the Lords mentioned a few issues in passing, but the lords seemed to be avoiding talking about some of the horrors of the scheme. The Digital Economy Act defining the law behind the scheme contains no legal requirement for age verification providers to restrict how they can use porn viewers' data. Lords mentioned that the data is protected under GDPR rules, but those rules still let companies do whatever they like with data, with the proviso that they ask for consent. Of course, consent is effectively mandatory when signing up for age verification, and some of the biggest internet companies in the world have set the precedent that wide-ranging use of data can be justified by claiming it will be used, say, to improve the customer experience.

Even if the lords didn't push very hard, people at the DCMS or BBFC have been considering this deficiency and have come up with the idea that data use should be voluntarily restricted according to a kite mark scheme. Age verification services will have their privacy protections audited by some independent group and, if they pass, can display a gold star. Porn viewers are then expected to trust age verification services with a gold star. Unfortunately it sounds a little like the sort of process that decided that cladding was safe for high-rise blocks of flats.

The lords were much more concerned about the age verification requirements for social media and search engines, notably Twitter and Google Images. Clearly a scheme for checking whether users are 13 will be technically very different from an 18-only check. So the Government explained that these wider issues will be addressed in a new censorship white paper to be published in 2019.
The lords were also a bit perturbed that the definition of banned material wasn't wide enough for their own preferences. Under the current scheme the BBFC will be expected to totally ban any websites featuring child porn or extreme porn. The lords wondered why this wasn't extended to cartoon porn and beyond-R18 porn, presumably thinking of fisting, golden showers and the like. In reality, if the definition of bannable porn were extended, then every major porn website in the world would have to be banned by the BBFC. And anyway, the government is changing its censorship rules such that fisting and golden showers are, or will soon be, allowable at R18.

The debate revealed that banks and payment providers have already agreed to block payments to websites banned by the BBFC. The government also confirmed its intention to get the scheme up and running by April. That said, it would seem a little unfair for websites' 3-month implementation period to be set running before their age verification options are accredited with their gold stars; otherwise some websites would waste time and money implementing schemes that may later be declared unacceptable. Next, a motion to approve the draft legislation behind the UK's age-verification regulations will be debated in the House of Commons. Stephen Winyard, AVSecure's chief marketing officer, told XBIZ:

We are particularly pleased that the prime minister is set to approve the draft guidance for the age-verification law on Monday. From this, the Department for Digital, Culture, Media and Sport will issue the effective start date and that will be around Easter.

But maybe the prime minister has a few more urgent issues on her mind at the moment.

This will deprive European creators of their livelihood in favour of mostly American corporations. And yet these corporations are complaining that the law does not go far enough

14th December 2018

See article from eff.org
See petition from change.org
See article from eff.org
4,000,000 Europeans have signed a petition opposing Article 13 of the new Copyright in the Single Market Directive. They oppose it for two main reasons: because it will inevitably lead to the creation of algorithmic copyright filters that only US Big
Tech companies can afford (making the field less competitive and thus harder for working artists to negotiate better deals in) and because these filters will censor enormous quantities of legitimate material, thanks to inevitable algorithmic errors and
abuse. On Monday, a delegation from the signatories officially presented the Trilogue negotiators with the names of 4,000,000+ Europeans who oppose Article 13. These 4,000,000 are in esteemed company: Article 13 is also opposed by
the father of the Internet, Vint Cerf, and the creator of the Web, Tim Berners-Lee and more than 70 of the Internet's top technical experts, not to mention Europe's largest sports leagues and film studios. Burgeoning movements opposing the measure have
sprung up in Italy and Poland.

But no matter how much damage the proposed EU law will do to European businesses and creators, it does not go far enough for the large corporates. This leaves a tricky negotiation for the EU power brokers of the EU Commission and EU Council of Ministers. The law is widely opposed by European people, but now the US corporates are whingeing that they don't like a few concessions made to get the bill through the European Parliament. They want the full horror of censorship machines resurrected. The EFF reports on a delay to proceedings:

This week EU negotiators in Strasbourg struggled to craft the final language of the Copyright in the Single Digital Market Directive, in their last possible meeting for 2018. They failed, thanks in large part to the Directive's two most controversial clauses: Article 11, which requires paid licenses for linking to news stories while including more than a word or two; and Article 13, which will lead to the creation of error-prone copyright censorship algorithms that will block users from posting anything that has been identified as a copyrighted work -- even if that posting is lawful. This means that the Directive will not be completed, as was expected, under Austria's presidency of the European Union. The negotiations between the European Parliament, representatives of the member states, and the European Commission (called "trilogues") will continue under the Romanian presidency, in late January.

The controversy over Article 13 and Article 11 has not diminished since millions of Europeans voiced their opposition to the proposals and their effect on the Internet earlier this year. Even supporters and notional
beneficiaries have now grown critical of the proposals. An open letter signed by major rightsholder groups, including movie companies and sports leagues, asks the EU to exempt their products from Article 13 altogether, and suggests it should only apply to the music industry's works. Meanwhile, the music industry wrote their own open letter, saying that the latest proposed text on Article 13 won't solve their problems. These rightsholders join the world's most eminent computer scientists,
including the inventors of the Internet and the Web, who denounced the whole approach and warned of the
irreparable harm it will do to free expression and the hope of a fair, open Internet. The collective opposition is unsurprising. Months of closed-door negotiations and corporate lobbying
have actually made the proposals worse: even less coherent, and more riddled with irreconcilable contradictions. The way that the system apportions liability (with stiff penalties for allowing a user to post something that infringes copyright, and no consequences for censoring legitimate materials) leads inexorably to filters. And as recent experiences
with Tumblr's attempt to filter adult material have shown, algorithms are simply not very
good at figuring out when a user has broken a rule, let alone a rule as technical and fact-intensive as copyright. What is worse, the Directive will only reinforce the power of US Big Tech companies by inhibiting the emergence of
European competitors. That's because only the biggest tech companies have the millions of euros it will cost to deploy the filters Article 13 requires. Proponents of Article 13 stress that the dominance of platforms like Google and Facebook leaves them
with insufficient bargaining leverage and say this leads to a systematic undervaluing of their products. But Article 13 will actually reduce that leverage even further by preventing the emergence of alternative platforms.

Compromises suggested by the negotiators to limit the damage are proving unlikely to help. Prior to the Trilogue, Article 13 was imposed on all online platforms save those businesses with less than 10 million euros in annual turnover. Some parties, realising that this will limit the EU tech sector, have suggested changing the figure, but doubling that figure to 20 million doesn't help. If you own a European tech company that you hope will compete with Google someday, you will have to do something Google never had to face: the day you make the leap from 20 million euros in annual turnover to 20,000,001 euros, you will have to find
hundreds of millions of euros to implement an Article 13 copyright filter. Others have proposed a "notice-and-staydown" system to reassure rightsholders that they will not have to invest their own resources in
maintaining the copyright filters. But creating this model for copyright complaints extinguishes any hope of moderating the harms Article 13 will do to small European companies. Earlier drafts of Article 13 spoke of case-by-case assessments for mid-sized
platforms, which would exempt them from implementing filters if they were judged to be engaged in good faith attempts to limit infringement. But notice-and-staydown (the idea that once a platform has been notified of a user's copyright violation, it must
prevent every other user from making such a violation, ever) necessarily requires filters. Others in the negotiation are now arguing that microenterprises should have to bear the burden, and are pressing for even these small and mid-sized business
exemptions to be deleted from the text. With European internet users, small business people, legal experts, technical experts, human rights and free speech experts all opposed to these proposals, we had hoped that they would be
struck from the Trilogue's final draft. Now, they are blocking the passage of other important copyright reforms. Even Article 13 and 11's original advocates are realising how much they depend on a working Internet, and a remuneration system that might
have a chance of working. Still, the lobbying will continue over the holiday break. Some of the world's biggest entertainment and Internet companies will be throwing their weight around the EU to find a "compromise" that
will keep no-one happy, and will exclude the needs and rights of individual Internet users, and European innovators. Read more about the Directive, and contact your MEPs and national governments at
Save Your Internet.
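
The error-prone matching the EFF describes can be sketched in a few lines. This is a toy illustration only, under invented assumptions: the `fingerprint` and `BLOCKLIST` names are hypothetical, and real Article 13-style filters match audio and video with perceptual hashes rather than text hashes. The point it makes is the structural one from the article: a filter sees only that content matches a fingerprint, never whether the use is lawful quotation, parody or review.

```python
import hashlib

def fingerprint(text: str) -> str:
    # Hash a whitespace-normalized, lowercased excerpt. The filter can only
    # compare fingerprints; it has no access to the legal context of a post.
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode()).hexdigest()

# Hypothetical rightsholder-supplied blocklist of protected excerpts.
BLOCKLIST = {fingerprint("to be or not to be that is the question")}

def is_blocked(upload: str) -> bool:
    # Slide a 10-word window over the upload and block on any match.
    words = upload.lower().split()
    n = 10
    return any(
        fingerprint(" ".join(words[i:i + n])) in BLOCKLIST
        for i in range(len(words) - n + 1)
    )

# A review quoting the protected line is blocked, even though quotation for
# criticism is lawful -- the fingerprint match cannot tell the difference.
review = "My essay argues that to be or not to be that is the question remains the core of the play."
assert is_blocked(review)
assert not is_blocked("a wholly original sentence that matches nothing on the list")
```

Widening the window or loosening the match only trades missed infringements for more false positives, which is exactly the dilemma that liability rules with penalties on one side only cannot legislate away.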

14th December 2018

Self-censorship in the age of big data. By Nik Williams. See article from opendemocracy.net

13th December 2018

Internet TV and the unnecessary censorship of The Marvellous Mrs Maisel. See article from news18.com

The government's age verification scheme, which leaves people's sensitive sexual preferences unprotected by law, is to be presented for approval by the House of Lords

10th December 2018

See article from lordsbusiness.parliament.uk
The following four motions are expected to be debated together in the House of Lords on 11th December 2018:

Online Pornography (Commercial Basis) Regulations 2018
Lord Ashton of Hyde to move that the draft Regulations laid before the House on 10 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 38th Report; 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B).

Guidance on Age-verification Arrangements
Lord Ashton of Hyde to move that the draft Guidance laid before the House on 25 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 39th Report; 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B).

Lord Stevenson of Balmacara to move that this House regrets that the draft Online Pornography (Commercial Basis) Regulations 2018 and the draft Guidance on Age-verification Arrangements do not bring into force section 19 of the Digital Economy Act 2017, which would have given the regulator powers to impose a financial penalty on persons who have not complied with their instructions to require that they have in place an age verification system which is fit for purpose and effectively managed so as to ensure that commercial pornographic material online will not normally be accessible by persons under the age of 18.

Guidance on Ancillary Service Providers
Lord Ashton of Hyde to move that the draft Guidance laid before the House on 25 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 39th Report; 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B).

The DCMS and BBFC age verification scheme has been widely panned, as fundamentally the law provides no requirement to actually protect people's identity data, which can be coupled with their sexual preferences and sexuality. The scheme only offers voluntary suggestions that age verification services and websites should protect their users' privacy. But one only has to look at Google, Facebook and Cambridge Analytica to see how worthless mere advice is. GDPR is often quoted, but that only requires that user consent is obtained. One will simply have to tick the consent to the 'improved user experience' box to watch the porn, and thereafter the companies can do what the fuck they like with the data. See criticism of the scheme:
Security expert provides a detailed breakdown of the privacy and security failures of the age verification scheme
Parliamentary scrutiny committee condemns BBFC Age Verification Guidelines
Parliamentary scrutiny committee condemns as 'defective' a DCMS Statutory Instrument excusing Twitter and Google images from age verification.

GCHQ will work around encrypted communications by the mass hacking of people's devices

10th December 2018

See article from theguardian.com
The UK's intelligence agencies are to significantly increase their use of large-scale data hacking after claiming that more targeted operations are being rendered obsolete by technology. The move will see an expansion in what is known as the bulk equipment interference (EI) regime -- the process by which GCHQ can target entire communication networks overseas in a bid to identify individuals who pose a threat to national security. [Note that the idea this is somehow only targeted at foreigners is misleading. Five countries cooperate so that they can mutually target each other's users to work around limits on snooping on one's own country.] A letter from the security minister, Ben Wallace, to the head of the intelligence and security committee, Dominic Grieve, quietly filed in the House of Commons library last week, states:

Following a review of current operational and technical realities, GCHQ have ... determined that it will be necessary to conduct a higher proportion of ongoing overseas focused operational activity using the bulk EI regime than was originally envisaged.

Israel passes law implementing selectable options for ISPs to block adult sites

8th December 2018

See article from timesofisrael.com
A bill that would force ISPs in Israel to censor pornographic sites by default has been amended after heavy criticism from lawmakers over privacy concerns. An earlier version of the bill was unanimously approved by the Ministerial Committee for Legislation in late October, but now a new version of the legislation, sponsored by Likud MK Miki Zohar and Jewish Home MK Shuli Moalem-Refaeli, has been passed. The differences seem subtle, amounting to whether customers opt in or opt out of network-level website blocking. Customers will have to confirm their preferences for website blocking every 3 months, but may change their settings at any time. The bill will incentivize internet companies to actively market existing website blocking software to families: ISPs will receive NIS 0.50 ($0.13) for every subscriber who opts to block adult sites. In a refreshing divergence from UK internet censorship, ISPs will be legally required to delete all data related to their users' surfing habits, to prevent the creation of de facto -- and easily leaked -- blacklists of pornography consumers. In comparison, internet companies are allowed to use or sell UK customer data for any purpose they desire, as long as customers tick a consent box with some woolly text about improving the customer's experience.

Update: Netanyahu voices privacy concerns

10th December 2018. See article from sputniknews.com
See also Netanyahu against anti-porn bill, rejects online regulation from al-monitor.com

Israeli Prime Minister Benjamin Netanyahu moved to halt the adoption of a new law aimed at curbing pornographic content on the Internet and possibly keeping tabs on people who watch porn. Netanyahu inquired:
We don't want our children to be exposed to harmful content, but my concern is about inserting regulation into a space in which there is no government regulation. Who will decide which content is permitted and which is
forbidden? Who will decide the interpretations?

Or else Facebook will censor your advances, no matter how subtle

8th December 2018

6th December 2018. See article from pcmag.com
See sexual solicitation censorship rules on Facebook
Facebook has added a new category of censorship: sexual solicitation. It added the update on 15th October, but no one really noticed until recently. The company has quietly updated its content-moderation policies to censor implicit requests for sex. The expanded policy specifically bans sexual slang, hints of sexual roles, positions or fetish scenarios, and erotic art when mentioned with a sex act. Vague but suggestive statements such as "looking for a good time tonight" when soliciting sex are also no longer allowed. The new policy reads:

15. Sexual Solicitation Policy

Do not post:

Content that attempts to coordinate or recruit for adult sexual activities, including but not limited to:
- Filmed sexual activities
- Pornographic activities, strip club shows, live sex performances, erotic dances
- Sexual, erotic, or tantric massages

Content that engages in explicit sexual solicitation by, including but not limited to the following, offering or asking for:

Content that engages in implicit sexual solicitation, which can be identified by offering or asking to engage in a sexual act and/or acts identified by other suggestive elements such as any of the following:
- Vague suggestive statements, such as "looking for a good time tonight"
- Sexualized slang
- Using sexual hints such as mentioning sexual roles, sex positions, fetish scenarios, sexual preference/sexual partner preference, state of arousal, act of sexual intercourse or activity (sexual penetration or self-pleasuring), commonly sexualized areas of the body such as the breasts, groin, or buttocks, state of hygiene of genitalia or buttocks
- Content (hand drawn, digital, or real-world art) that may depict explicit sexual activity or suggestively posed person(s)

Content that offers or asks for other adult activities such as:

Sexually explicit language that adds details and goes beyond mere naming or mentioning of:
- A state of sexual arousal (wetness or erection)
- An act of sexual intercourse (sexual penetration, self-pleasuring or exercising fetish scenarios)
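
As a toy illustration of how rules like those quoted above behave once enforced by software, the sketch below flags posts containing the policy's own example phrases. Facebook's real classifiers are not public; `BANNED_PHRASES` and `flags` are invented names for this example. A literal phrase matcher has no notion of intent, which is exactly the over-blocking problem critics point to.

```python
# Hypothetical phrase list modelled on the examples quoted in the policy above.
BANNED_PHRASES = [
    "looking for a good time tonight",
    "netflix and chill",
]

def flags(post: str) -> list:
    """Return the banned phrases found in a post (case-insensitive)."""
    text = " ".join(post.lower().split())
    return [p for p in BANNED_PHRASES if p in text]

# A genuine solicitation is flagged...
assert flags("Looking for a good time tonight?") == ["looking for a good time tonight"]
# ...but so is an innocent scheduling message: the matcher cannot tell intent.
assert flags("We're looking for a good time tonight to run the fireworks display") == ["looking for a good time tonight"]
# Unrelated posts pass.
assert flags("See you at the meeting tomorrow") == []
```

Whether Facebook's systems are this crude is unknown, but any rule written as a list of example phrases invites exactly this literal, context-free enforcement once it is handed to software or to overworked human moderators.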
Comment: Facebook's Sexual Solicitation Policy is a Honeypot for Trolls

8th December 2018. See article from eff.org by Elliot Harmon
Facebook just quietly adopted a policy that could push thousands of innocent people off of the platform. The new "sexual solicitation" rules forbid pornography and other explicit sexual content (which was already functionally banned under a different statute), but they don't stop there: they also ban "implicit sexual solicitation", including the use of sexual slang, the solicitation of nude images, discussion of "sexual partner preference," and even expressing interest in sex. That's not an exaggeration: the new policy bars "vague suggestive statements, such as 'looking for a good time tonight.'" It wouldn't be a stretch to think that asking "Netflix and chill?" could run afoul of this policy.

The new rules come with a baffling justification, seemingly blurring the line between sexual exploitation and plain old doing it:

[P]eople use Facebook to discuss and draw
attention to sexual violence and exploitation. We recognize the importance of and want to allow for this discussion. We draw the line, however, when content facilitates, encourages or coordinates sexual encounters between adults.
In other words, discussion of sexual exploitation is allowed, but discussion of consensual, adult sex is taboo. That's a classic censorship model: speech about sexuality being permitted only when sex is presented as dangerous and
shameful. It's especially concerning since healthy, non-obscene discussion about sex--even about enjoying or wanting to have sex--has been a component of online communities for as long as the Internet has existed, and has for almost as long been the target of governmental censorship efforts. Until now, Facebook has been a particularly important place for groups who aren't well represented in mass media to discuss their sexual identities and practices. At the very least, users should get the final say about whether they want to see such speech in their timelines.

Overly Restrictive Rules Attract Trolls

Is Facebook now a sex-free zone? Should we be afraid of meeting potential partners on the platform or even disclosing our sexual orientations? Maybe not. For many users, life on Facebook might continue as it always has. But therein lies the problem: the new rules put a substantial portion of Facebook users in danger of violation. Fundamentally, that's not how platform moderation policies should work--with such broadly sweeping rules, online trolls can take advantage of reporting mechanisms to punish groups they don't like. Combined with opaque and one-sided flagging and reporting systems, overly restrictive rules can incentivize abuse from bullies and other bad actors. It's not just individual trolls either: state actors have systematically abused Facebook's flagging process to censor political enemies. With these new rules, organizing that type of attack just became a lot easier. A few reports can drag a user into Facebook's labyrinthine enforcement regime, which can result in having a group page deactivated or even being banned from Facebook entirely. This process gives the user no meaningful opportunity to appeal a bad decision. Given the rules' focus on sexual interests and activities, it's easy to imagine who
would be the easiest targets: sex workers (including those who work lawfully), members of the LGBTQ community, and others who congregate online to discuss issues relating to sex. What makes the policy so dangerous to those communities is that it forbids
the very things they gather online to discuss. Even before the recent changes at Facebook and Tumblr, we'd seen trolls exploit similar policies to target the LGBTQ community and censor sexual health resources. Entire harassment campaigns have organized to use payment processors' reporting systems to cut off sex workers' income. When online platforms adopt moderation policies and reporting processes, it's essential that they consider how those policies and systems might be weaponized against marginalized groups.

A recent Verge article quotes a Facebook representative as saying that people sharing sensitive information in private Facebook groups will be safe, since Facebook relies on reports from users. If there are no tattle-tales in your group, the reasoning goes, then you can speak freely without fear of punishment. But that assurance rings rather hollow: in today's world of online bullying and brigading, there's no question of if your private group will be infiltrated by the trolls; it's when.

Did SESTA/FOSTA Inspire Facebook's Policy Change?

The rule change comes a few months after Congress passed the Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act (SESTA/FOSTA), and it's hard not to wonder if the policy is the direct result of the new Internet censorship laws. SESTA/FOSTA opened online
platforms to new criminal and civil liability at the state and federal levels for their users' activities. While ostensibly targeted at online sex trafficking, SESTA/FOSTA also made it a crime for a platform to "promote or facilitate the
prostitution of another person." The law effectively blurred the distinction between adult, consensual sex work and sex trafficking. The bill's supporters argued that forcing platforms to clamp down on all sex work was the only way to curb
trafficking--never mind the growing chorus of trafficking experts arguing the very opposite. As SESTA/FOSTA was debated in Congress, we repeatedly pointed out that online platforms would have little choice but to over-censor: the
fear of liability would force them not just to stop at sex trafficking or even sex work, but to take much more restrictive approaches to sex and sexuality in general, even in the absence of any commercial transaction. In EFF's ongoing legal challenge to
SESTA/FOSTA, we argue that the law unconstitutionally silences lawful speech online. While we don't know if the Facebook policy change came as a response to SESTA/FOSTA, it is a perfect example of what we feared would happen:
platforms would decide that the only way to avoid liability is to ban a vast range of discussions of sex. Wrongheaded as it is, the new rule should come as no surprise. After all, Facebook endorsed SESTA/FOSTA. Regardless of
whether one caused the other or not, both reflect the same vision of how the Internet should work--a place where certain topics simply cannot be discussed. Like SESTA/FOSTA, Facebook's rule change might have been made to fight online sexual exploitation.
But like SESTA/FOSTA, it will do nothing but push innocent people offline. |
|
Italian authorities have fined Facebook for their abuse of people's personal data
|
|
|
| 8th
December 2018
|
|
| See article from
theguardian.com |
Facebook has been fined €10m (£8.9m) by Italian authorities for misleading users over its data practices. The two fines issued by Italy's competition watchdog are some of the largest levied against the social media company for data misuse. The
Italian regulator found that Facebook had breached the country's consumer code by:
- Misleading users in the sign-up process about the extent to which the data they provide would be used for commercial purposes.
- Emphasising only the free nature of the service, without informing users of the "profitable ends that
underlie the provision of the social network", and so encouraging them to make a decision of a commercial nature that they would not have taken if they were in full possession of the facts.
- Forcing an "aggressive practice" on
registered users by transmitting their data from Facebook to third parties, and vice versa, for commercial purposes.
The company was specifically criticised for the default setting of the Facebook Platform services, which in the words of the regulator, prepares the transmission of user data to individual websites/apps without express consent from users.
Although users can disable the platform, the regulator found that its opt-out nature did not provide a fully free choice. The authority has also directed Facebook to publish an apology to users on its website and on its app. |
|
|
|
|
| 8th
December 2018
|
|
|
No sex please, we're beholden to our advertisers. By Violet Blue See article from engadget.com |
|
Tumblr is banning all adult images of sex and nudity from 17th December 2018
|
|
|
| 7th December 2018
|
|
| Thanks to Nick 4th December 2018. See article from
tumblr.zendesk.com See article from theguardian.com |
Image hosting service Tumblr is banning all adult images of sex and nudity from 17th December 2018. This seems to have been sparked by the app being removed from Apple's App Store after a child porn image was detected being hosted by Tumblr. Tumblr explained
the censorship process in a blog post: Starting Dec 17, adult content will not be allowed on Tumblr, regardless of how old you are. You can read more about what kinds of content are not allowed on Tumblr in our Community
Guidelines. If you spot a post that you don't think belongs on Tumblr, period, you can report it: From the dashboard or in search results, tap or click the share menu (paper airplane) at the bottom of the post, and hit Report. Adult content primarily includes photos, videos, or GIFs that show real-life human genitals or female-presenting nipples, and any content, including photos, videos, GIFs and illustrations, that depicts sex acts.
Examples of exceptions that are still permitted are exposed female-presenting nipples in connection with breastfeeding, birth or after-birth moments, and health-related situations, such as post-mastectomy or gender confirmation
surgery. Written content such as erotica, nudity related to political or newsworthy speech, and nudity found in art, such as sculptures and illustrations, can also still be freely posted on Tumblr.
Any images identified as
adult will be set as viewable only by the poster. There will be an appeals process to contest decisions held to be incorrect. Inevitably, Tumblr's algorithms are not exactly accurate when it comes to detecting sex and nudity. The Guardian
noted that ballet dancers, superheroes and a picture of Christ have all fallen foul of Tumblr's new pornography ban, after the images were flagged up as explicit content by the blogging site's artificial intelligence (AI) tools. The actor and
Tumblr user Wil Wheaton posted one example: An image search for beautiful men kissing, which was flagged as explicit within 30 seconds of me posting it. These images are not explicit. These
pictures show two adults, engaging in consensual kissing. That's it. It isn't violent, it isn't pornographic. It's literally just two adult humans sharing a kiss.
Other users chronicled flagged posts, including historical images of
(clothed) women of colour, a photoset of the actor Sebastian Stan wearing a selection of suits with no socks on, an oil painting of Christ wearing a loincloth, a still of ballet dancers and a drawing of Wonder Woman carrying fellow superhero Harley
Quinn. None of the images violate Tumblr's stated policy. Update: Petition 5th December 2018. See petition
from change.org
Tumblr, after years of being a space for nsfw artists to reach a community of like-minded individuals to enjoy their work, has decided to close their metaphorical doors to adult content. Solution: Stop it. Let people post porn,
it's 90% of the reason anybody is on the site in the first place. Or, if you really want a non-18+ tumblr, start a new one with that specific goal in mind. Don't rip down what people have spent years working on. ...sign the
petition from change.org. At the time of writing, 368,000 people had signed. Comment: Censored whilst
claiming to be uncensored 6th December 2018. See article from avn.com
The Free Speech Coalition [representing the US adult trade] released the following statement regarding the recent announcement about censorship at Tumblr: The social media platform Tumblr has announced that on December 17, it will
effectively ban all adult content. Tumblr follows the lead of Facebook, Instagram, YouTube and other social media platforms, who over the past few years have meticulously scrubbed their corners of the internet of adult content, sex, and sexuality, in the
name of brand protection and child protection. While some in the adult industry may cheer the end of Tumblr as a never-ending source of free content, specifically pirated content, it is concerning that of the major social media
platforms, only Twitter and Reddit remain in any way tolerant of adult workers -- and there are doubts as to how much longer that will last. As legitimate platforms ban or censor adult content -- having initially benefited from
traffic that adult content brought them -- illegitimate platforms for distribution take their place. The closure of Tumblr only means more piracy, more dispersal of community, and more suffering for adult producers and performers.
Free Speech Coalition was founded to fight government censorship -- set raids and FBI entrapment, bank seizures and jail terms. The internet gave us freedom from much that had plagued us, particularly local ordinances and overzealous prosecutors. But
now, when corporate censors suspend your account, the only choice is to abandon the platform; there is no opportunity for arbitration or appeal. When companies like Google and Facebook (and subsidiaries like YouTube and
Instagram) control over 70% of all web traffic, adult companies are denied a market as effectively as a state-level sex toy ban. And when sites like Tumblr and Twitter can close an account with millions of followers without warning, the effect is the
same on a business -- particularly a small, performer-run one -- as an FBI seizure. As social media companies become more powerful, we must demand recourse, but we also must look beyond our industry and continue to build alliances
-- with women, with LGBTQ groups, with sex workers and sex educators, with artists -- who implicitly understand the devastating effect of this new form of censorship. These communities have seen the devastation wreaked when
platforms use purges of adult content as a sledgehammer, broadly banning sexual health information, vibrant communities based around non-normative genders and sexualities, resources for sex workers, and political and cultural commentary that engages with
such topics. The loss of these platforms isn't just about business, it's about the loss of vital communities and education -- and organizing. We use these platforms not only to grow our reach, but to communicate with one another,
to rally, to drive awareness of issues of sex and sexuality. They have become a central source of power. And today, we're one step closer to losing that as well.
Offsite Comment: Filters don't work 6th December 2018. See article from eff.org
Dear Tumblr: Banning Adult Content Won't Make Your Site Better But It Will Harm Sex-Positive Communities Offsite Article: Alternatives 7th December 2018. See
article from wired.com
Wired has penned an article considering alternatives for Tumblr users seeking a new home for adult content. The initial suggestions are PillowFort.io and
Dreamwidth. |
|
Poland stands up to the EU to champion the livelihoods of thousands of Europeans against the disgraceful EU that wants to grant large, mostly American companies dictatorial copyright control of the internet
|
|
|
| 6th December 2018
|
|
| See Creative Commons article from boingboing.net by Cory Doctorow
|
In 2011, Europeans rose up over ACTA, the misleadingly named "Anti-Counterfeiting Trade Agreement," which created broad surveillance and censorship regimes for the internet. They were successful in large part thanks to the Polish activists who
thronged the streets to reject the plan, which had been hatched and exported by the US Trade Representative. Now, Europe is on the verge of an even farther-reaching scheme to censor and surveil the internet: the new Copyright
Directive, which limits who can link to (and criticise) the news and sets up crowdsourced
databases of blacklisted content that anyone can add anything to, and which cannot thereafter be published online. The Poles aren't having any of it: a broad coalition of Poles
from the left and the right have come together to oppose the new Directive, dubbing it "ACTA2," which should give you an idea of how they feel about the matter. There are now enough national governments opposed to
the Directive to constitute a "blocking minority" that could stop it dead. Alas, the opposition is divided on whether to reform the offending parts of the Directive, or eliminate them outright (this division is why the Directive squeaked
through the last vote, in September), and unless they can work together, the Directive still may proceed. A massive coalition of 15,000 Polish creators whose videos, photos and text are enjoyed by over 20,000,000 Poles
has signed an open letter supporting the idea of a strong, creator-focused copyright and rejecting the new Copyright Directive as a direct path to censoring filters that will deprive them of their livelihoods. The coalition
points out that online media is critical to the lives of everyday Poles for purposes that have nothing to do with the entertainment industry: education, the continuation of Polish culture, and connections to the global Polish diaspora.
Polish civil society and its ruling political party are united in opposing ACTA2; Polish President Andrzej Duda vowed to oppose it. Early next month, the Polish Internet Governance Forum will host a roundtable
on the question; they have invited proponents of the Directive to attend and publicly debate the issue.
|
|
Uganda blocks 27 internet porn websites
|
|
|
| 6th December 2018
|
|
| See article from the-star.co.ke
|
ISPs in Uganda have blocked 27 pornography websites after a directive was issued by the Uganda Communications Commission. Pornhub, Xvideos, and Youporn were among the top 100 most visited websites. The Daily Monitor reports that at least 25 of
the 27 banned websites cannot be accessed on mobile phones. However, users of Virtual Private Networks can access the banned sites. Chairperson of the Pornography Control Committee Annette Kezaabu told the Monitor there is a drop in the number of
people accessing pornography after they blocked the prominent porn sites. She said: We have a team that is compiling a list of other porn sites that will be blocked. We anticipate that some
people will open up new sites but this is a continuous process.
|
|
|
|
|
| 6th December 2018
|
|
|
The Daily Mail reports on large scale harvesting of your data and notes that Paypal have been passing on passport photos used for account verification to Microsoft for their facial recognition database. See
article from dailymail.co.uk |
|
Parliament publishes a set of enlightening emails about Facebook's pursuit of revenue and how it allows people's data to be used by app developers
|
|
|
| 5th December 2018
|
|
| See article from bbc.co.uk See
Facebook emails [pdf] from parliament.uk See
Mark Zuckerberg's response on Facebook |
Parliament's fake news inquiry has published a cache of seized Facebook documents including internal emails sent between Mark Zuckerberg and the social network's staff. The emails were obtained from the chief of a software firm that is suing the tech
giant. About 250 pages have been published, some of which are marked highly confidential. Facebook had objected to their release. Damian Collins MP, the chair of the parliamentary committee involved, highlighted several key issues in an
introductory note. He wrote that:
- Facebook allowed some companies to maintain "full access" to users' friends' data even after announcing changes to its platform in 2014/2015 to limit what developers could see. "It is not clear that there was any user consent for this,
nor how Facebook decided which companies should be whitelisted," Mr Collins wrote
- Facebook had been aware that an update to its Android app that let it collect records of users' calls and texts would be controversial. "To mitigate any
bad PR, Facebook planned to make it as hard as possible for users to know that this was one of the underlying features," Mr Collins wrote
- Facebook used data provided by the Israeli analytics firm Onavo to determine which other mobile apps
were being downloaded and used by the public. It then used this knowledge to decide which apps to acquire or otherwise treat as a threat
- there was evidence that Facebook's refusal to share data with some apps caused them to fail
- there
had been much discussion of the financial value of providing access to friends' data
In response, Facebook has said that the documents had been presented in a very misleading manner and required additional context. See Mark
Zuckerberg's response on Facebook
Offsite Analysis: New Documents Show That Facebook Has Never Deserved Your Trust 7th December 2018. See article from eff.org by
Bennett Cyphers and Gennie Gebhart
|
|
Mastercard and Microsoft get together to pool their data on you in the name of identity verification
|
|
|
| 5th
December 2018
|
|
| See article from alphr.com
|
Mastercard and Microsoft are collaborating in an identity management system that promises to remember users' identity verification and passwords between sites and services. Mastercard highlights four particular areas of use: financial services,
commerce, government services, and digital services (eg social media, music streaming services and rideshare apps). This means the system would let users manage their data across both websites and real-world services. However, the inclusion of
government services is an eyebrow-raising one. Microsoft and Mastercard's system could link personal information including taxes, voting status and criminal record, with consumer services like social media accounts, online shopping history and bank
accounts. As well as the stifling level of tailored advertising you'd receive if the system knew everything you did, this sets a dangerous precedent of every byte of users' information being stored under one roof -- perfect for an opportunistic
hacker or businessman. Mastercard mentions it is working closely with players like Microsoft, suggesting that many businesses will have access to the data. Neither Microsoft nor Mastercard has slated a release date for the system, only promising
additional details on these efforts will be shared in the coming months. |
|
Reddit explains to European users that it won't be able to operate effectively under forthcoming EU copyright law
|
|
|
| 5th December 2018
|
|
| See article from redditblog.com See
also dontwreckthe.net |
Defending equal access to the free and open internet is core to Reddit's ideals, and something that redditors have told us time and again they hold dear too, from the SOPA/PIPA battle to the fight for Net Neutrality. This is why even though we are an
American company with a user base primarily in the United States, we've nevertheless spent a lot of time this year
warning about how an overbroad EU Copyright Directive could restrict Europeans' equal access to the open
Internet--and to Reddit. Despite these warnings, it seems that EU lawmakers still don't fully appreciate the law's potential impact, especially on small and medium-sized companies like Reddit. So we're stepping things up to draw
attention to the problem. Users in the EU will notice that when they access Reddit via desktop, they are greeted by a modal informing them about the Copyright Directive and referring them to
detailed resources on proposed fixes . The problem with the Directive lies in Articles 11 (link licensing fees) and 13 (copyright filter requirements), which
set sweeping, vague requirements that create enormous liability for platforms like ours. These requirements eliminate the previous safe harbors that allowed us the leeway to give users the benefit of the doubt when they shared content. But under the new
Directive, activity that is core to Reddit, like sharing links to news articles, or the use of existing content for creative new purposes (r/photoshopbattles, anyone?) would suddenly become questionable under the law, and it is not clear right now that
there are feasible mitigating actions that we could take while preserving core site functionality. Even worse, smaller but similar attempts in various countries in Europe in the past have shown that
such efforts have actually harmed publishers and creators . Accordingly, we hope that today's action will drive the point home that there are grave
problems with Articles 11 and 13, and that the current trilogue negotiations will choose to remove both entirely. Barring that, however, we have a number of suggestions for ways to improve both proposals. Engine and the Copia Institute have compiled them
here at https://dontwreckthe.net/ . We hope you will read them and consider calling your Member of
European Parliament ( look yours up here ). We also hope that EU lawmakers will listen to those who use and understand the internet the most, and reconsider these
problematic articles. Protecting rights holders need not come at the cost of silencing European internet users.
|
|
|
|
|
| 5th December 2018
|
|
|
New Zealand film censor, with a keen eye on upcoming UK censorship, publishes a report on porn viewing by the young and inevitably finds that they want porn to be censored. See
report [pdf] from classificationoffice.govt.nz |
|
Parliamentary scrutiny committee condemns as 'defective' a DCMS Statutory Instrument excusing Twitter and Google images from age verification. Presumably one of the reasons for the delayed introduction
|
|
|
| 3rd
December 2018
|
|
| See article from publications.parliament.uk
|
There's a joint committee to scrutinise laws passed in parliament via Statutory Instruments. These are laws that are not generally presented to parliament for discussion, and are passed by default unless challenged. The committee has now taken issue
with a DCMS law to excuse the likes of social media and search engines from requiring age verification for any porn images that may get published on the internet. The committee reports from a session on 21st November 2018 that the law was defective and
'makes an unexpected use of the enabling power'. Presumably this means that the DCMS has gone beyond the scope of what can be passed without full parliamentary scrutiny. Draft S.I.: Reported for defective drafting and for
unexpected use of powers Online Pornography (Commercial Basis) Regulations 2018 7.1 The Committee draws the special attention of both Houses to these draft Regulations on the grounds that they are defectively drafted and
make an unexpected use of the enabling power. 7.2 Part 3 of the Digital Economy Act 2017 ("the 2017 Act") contains provisions designed to prevent persons under the age of 18 from accessing internet sites which
contain pornographic material. An age-verification regulator is given a number of powers to enforce the requirements of Part 3, including the power to impose substantial fines. 7.3 Section 14(1) is the key requirement. It
provides: "A person contravenes [Part 3 of the Act] if the person makes pornographic material available on the internet to persons in the United Kingdom on a commercial basis other than in a way that secures
that, at any given time, the material is not normally accessible by persons under the age of 18".
7.4 The term "commercial basis" is not defined in the Act itself. Instead, section 14(2) confers a
power on the Secretary of State to specify in regulations the circumstances in which, for the purposes of Part 3, pornographic material is or is not to be regarded as made available on a commercial basis. These draft regulations would be made in exercise
of that power. Regulation 2 provides: "(1) Pornographic material is to be regarded as made available on the internet to persons in the United Kingdom on a commercial basis for the purposes of Part 3 of the Digital
Economy Act 2017 if either paragraph (2) or (3) are met. (2) This paragraph applies if access to that pornographic material is available only upon payment. (3) This paragraph applies (subject to paragraph
(4)) if the pornographic material is made available free of charge and the person who makes it available receives (or reasonably expects to receive) a payment, reward or other benefit in connection with making it available on the internet.
(4) Subject to paragraph (5), paragraph (3) does not apply in a case where it is reasonable for the age-verification regulator to assume that pornographic material makes up less than one-third of the content of the material made
available on or via the internet site or other means (such as an application program) of accessing the internet by means of which the pornographic material is made available. (5) Paragraph (4) does not apply if the internet
site or other means (such as an application program) of accessing the internet (by means of which the pornographic material is made available) is marketed as an internet site or other means of accessing the internet by means of which pornographic
material is made available to persons in the United Kingdom."
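Read as a decision rule, regulation 2 reduces to a short conditional test. The sketch below is only one hypothetical reading: the function and parameter names are invented for illustration, and the regulation itself never says how the one-third fraction in paragraph (4) is to be measured.

```python
def commercial_basis(paywalled: bool,
                     expects_benefit: bool,
                     porn_fraction: float,
                     marketed_as_porn: bool) -> bool:
    """One possible reading of regulation 2: is pornographic material
    'made available on a commercial basis'? (Illustrative only.)"""
    if paywalled:
        # Paragraph (2): access is available only upon payment.
        return True
    if expects_benefit:
        # Paragraph (3): free access, but the provider receives or
        # expects a payment, reward or other benefit.
        if porn_fraction < 1 / 3 and not marketed_as_porn:
            # Paragraph (4), qualified by (5): the one-third exception
            # applies unless the site is marketed as pornographic.
            return False
        return True
    return False

# The Committee's worry in 7.15: 10 pages of pornography per 21 pages
# of other content sits just under the one-third threshold.
print(commercial_basis(paywalled=False, expects_benefit=True,
                       porn_fraction=10 / 31, marketed_as_porn=False))
# prints False
```

On this reading, an ad-funded site can escape the "commercial basis" test entirely by padding its content, which is exactly the loophole the Committee goes on to describe.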
7.5 The Committee finds these provisions difficult to understand, whether as a matter of simple English or as legal propositions. Paragraphs (4) and
(5) are particularly obscure. 7.6 As far as the Committee can gather from the Explanatory Memorandum, the policy intention is that a person will be regarded as making pornographic material available on the internet on a commercial
basis if: (A) a charge is made for access to the material; OR (B) the internet site is accessible free of charge, but the person expects to receive a payment or other commercial benefit, for
example through advertising carried on the site.
7.7 There is, however, an exception to (B): in cases in which no access charge is made, the person will NOT be regarded as making the pornographic material available on
a commercial basis if the material makes up less than one-third of the content on the internet site--even if the person expects to receive a payment or other commercial benefit from the site. But that exception does not apply in a case where the person
markets it as a pornographic site, or markets an "app" as a means of accessing pornography on the site. 7.8 As the Committee was doubtful whether regulation 2 as drafted is effective to achieve the intended result, it
asked the Department for Digital, Culture, Media and Sport a number of questions. These were designed to elicit information about the regulation's meaning and effect. 7.9 The Committee is disappointed with the Department's
memorandum in response, printed at Appendix 7: it fails to address adequately the issues raised by the Committee. 7.10 The Committee's first question asked the Department to explain why paragraph (1) of regulation 2 refers to
whether either paragraph (2) or (3) "are met" rather than "applies". The Committee raised this point because paragraphs (2) and (3) each begin with "This paragraph applies if ...". There is therefore a mismatch between
paragraph (1) and the subsequent paragraphs, which could make the regulation difficult to interpret. It would be appropriate to conclude paragraph (1) with "is met" only if paragraphs (2) and (3) began with "The condition in this paragraph
is met if ...". The Department's memorandum does not explain this discrepancy. The Committee accordingly reports regulation 2(1) for defective drafting. 7.11 The first part of the Committee's second question sought to
probe the intended effect of the words in paragraph (4) of regulation 2 italicised above, and how the Department considers that effect is achieved. 7.12 While the Department's memorandum sets out the policy reasons for setting the
one-third threshold, it offers little enlightenment on whether paragraph (4) is effective to achieve the policy aims. Nor does it deal properly with the second part of the Committee's question, which sought clarification of the concept of "one-third
of ... material ... on ... [a] means ... of accessing the internet ...". 7.13 The Committee is puzzled by the references in regulation 2(4) to the means of accessing the internet. Section 14(2) of the 2017 Act confers a
power on the Secretary of State to specify in regulations circumstances in which pornographic material is or is not to be regarded as made available on the internet on a commercial basis. The means by which the material is accessed (for example, via an
application program on a smart phone) appears to be irrelevant to the question of whether it is made available on the internet on a commercial basis. The Committee remains baffled by the concept of "one-third of ... material ... on [a] means ... of
accessing the internet". 7.14 More generally, regulation 2(4) fails to specify how the one-third threshold is to be measured and what exactly it applies to. Will the regulator be required to measure one-third of the pictures
or one-third of the words on a particular internet site or both together? And will a single webpage on the site count towards the total if less than one-third of the page's content is pornographic--for example, a sexually explicit picture occupying 32%
of the page, with the remaining 68% made up of an article about fishing? The Committee worries that the lack of clarity in regulation 2(4) may afford the promoter of a pornographic website opportunities to circumvent Part 3 of the 2017 Act.
7.15 The Committee is particularly concerned that a promoter may make pornographic material available on one or more internet sites containing multiple pages, more than two-thirds of which are non-pornographic. For every 10 pages of
pornography, there could be 21 pages about (for example) gardening or football. Provided the sites are not actively marketed as pornographic, they would not be regarded as made available on a commercial basis. This means that Part 3 of the Act would not
apply, and the promoter would be free to make profits through advertising carried on the sites, while taking no steps at all to ensure that they were inaccessible to persons under 18. 7.16 The Committee anticipates that the
shortcomings described above are likely to cause significant difficulty in the application and interpretation of regulation 2(4). The Committee also doubts whether Parliament contemplated, when enacting Part 3 of the 2017 Act, that the power conferred by
section 14(2) would be exercised in the way provided for in regulation 2(4). The Committee therefore reports regulation 2(4) for defective drafting and on the ground that it appears to make an unexpected use of the enabling power.
|
|
Chinese rules requiring internet companies to record all users online activity have commenced
|
|
|
| 1st December 2018
|
|
| See article from edition.cnn.com |
Chinese internet companies have started keeping detailed records of their users' personal information and online activity. The new rules from China's internet censor went into effect Friday. The new requirements apply to any company that provides
online services which can influence public opinion or mobilize the public to engage in specific activities, according to a notice posted on the Cyber Administration of China's website. Citing the need to safeguard national security and social
order, the Chinese internet censor said companies must be able to verify users' identities and keep records of key information such as call logs, chat logs, times of activity and network addresses. Officials will carry out inspections of companies'
operations to ensure compliance. But the Cyber Administration didn't make clear under what circumstances the companies might be required to hand over logs to authorities.
|
|
Baby on Netflix
|
|
|
| 1st December 2018
|
|
| See article from endsexualexploitation.org
|
Morality in Media (now calling themselves the National Center on Sexual Exploitation) writes: This Friday, Netflix will begin streaming a new show, Baby . Based loosely on the account of the Baby Squillo
scandal, the show portrays a group of teenagers entering into prostitution as a glamorized coming-of-age story. Under international and U.S. federal law, anyone engaged in commercial sex who is under 18 years old is by definition a sex trafficking
victim. In the real-life scandal that Baby is based on, the mother of one of the teenagers was arrested for sex trafficking. In January, the National Center on Sexual Exploitation, along with 55 other survivors of sex trafficking
and/or subject matter experts, social service providers, and advocates for the abolition of sexual exploitation sent a letter to Netflix executives to express their deep concern regarding Netflix's forthcoming Italian drama, Baby, which normalizes child
sexual abuse and the sex trafficking of minors as prostitution. Despite being at ground zero of the #MeToo movement, Netflix appears to have gone completely tone-deaf on the realities of sexual exploitation, said Dawn Hawkins,
executive director of the National Center on Sexual Exploitation. Despite the outcry from survivors of sex trafficking, subject matter experts, and social service providers, Netflix promotes sex trafficking by insisting on streaming Baby. Clearly,
Netflix is prioritizing profits over victims of abuse. Erik Barmack, VP of International Originals at Netflix, has previously described the new show as edgy. There is absolutely nothing edgy
about the sexual exploitation of minors. This show glamorizes sexual abuse and trivializes the experience of countless underage women and men who have suffered through sex trafficking.
|
|
|
|
|
| 1st December 2018
|
|
|
GCHQ pushes for the ability to silently join and snoop on encrypted messaging conversations See article from theregister.co.uk
|
|
|