Instagram will detect nude photos in private messages and initially blur them

21st April 2024
See blog post from about.instagram.com
New Tools to Help Protect Against Sextortion and Intimate Image Abuse

We're testing new features to help protect young people from sextortion and intimate image abuse, and to make it more difficult for potential scammers
and criminals to find and interact with teens. We're also testing new ways to help people spot potential sextortion scams, encourage them to report and empower them to say no to anything that makes them feel uncomfortable. We've started sharing more
signals about sextortion accounts to other tech companies through Lantern, helping disrupt this criminal activity across the internet. While people overwhelmingly use DMs to share what they love with their friends, family or
favorite creators, sextortion scammers may also use private messages to share or ask for intimate images. To help address this, we'll soon start testing our new nudity protection feature in Instagram DMs, which blurs images detected as containing nudity
and encourages people to think twice before sending nude images. This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending
their own images in return. Nudity protection will be turned on by default for teens under 18 globally, and we'll show a notification to adults encouraging them to turn it on. When nudity protection is
turned on, people sending images containing nudity will see a message reminding them to be cautious when sending sensitive photos, and that they can unsend these photos if they've changed their mind. Anyone who tries to forward a
nude image they've received will see a message encouraging them to reconsider. When someone receives an image containing nudity, it will be automatically blurred under a warning screen, meaning the recipient isn't confronted with
a nude image and they can choose whether or not to view it. We'll also show them a message encouraging them not to feel pressure to respond, with an option to block the sender and report the chat. Nudity protection uses on-device
machine learning to analyze whether an image sent in a DM on Instagram contains nudity. Because the images are analyzed on the device itself, nudity protection will work in end-to-end encrypted chats, where Meta won't have access to these images --
unless someone chooses to report them to us.
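The on-device flow described above can be pictured with a short sketch. This is purely illustrative: the classifier class, the 0.8 threshold and the returned actions are assumptions for the sake of the example, since Meta has not published its model or decision logic. Only the yes/no outcome is acted on locally, which is why the same check can run inside end-to-end encrypted chats.

```python
from dataclasses import dataclass

@dataclass
class IncomingImage:
    image_bytes: bytes
    sender_id: str

class OnDeviceNudityClassifier:
    """Stand-in for a small ML model shipped with the app and run locally."""
    def probability_of_nudity(self, image_bytes: bytes) -> float:
        raise NotImplementedError  # a real app would call a local inference engine here

def prepare_for_display(msg: IncomingImage,
                        classifier: OnDeviceNudityClassifier,
                        protection_enabled: bool,
                        threshold: float = 0.8) -> dict:
    """Decide, entirely on the client, whether to show the image behind a blur.

    Because classification happens on the device, nothing about the image needs
    to be sent to a server, so the check also works in end-to-end encrypted DMs.
    """
    if protection_enabled and classifier.probability_of_nudity(msg.image_bytes) >= threshold:
        return {
            "blurred": True,
            "warning": "Photo may contain nudity",
            "actions": ["view anyway", "block sender", "report chat"],
        }
    return {"blurred": False, "warning": None, "actions": []}
```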

Instagram extends its option for users to block message requests containing banned words

23rd October 2022
See article from about.instagram.com
Since launching Hidden Words last year, more than one in five people with large followings have turned on the feature, giving them a powerful tool to automatically filter harmful content from their comments and message requests. We've seen that Hidden
Words has been really effective at keeping people safe. When people turn on Hidden Words for comments, on average, they see 40% fewer comments that might be offensive. We want to help more creators benefit from this protection, so
we're starting to test automatically turning on Hidden Words for Creator accounts. Everyone will continue to be able to turn these settings on or off at any time and build a custom list with additional words, phrases, and emojis they may want to hide. We're also continuing to improve Hidden Words to offer more protections, including:

Expanding Hidden Words to cover Story replies, so offensive replies from people you don't follow will be sent to your Hidden Requests folder and you never have to see them.

Supporting new languages, including Farsi, Turkish, Russian, Bengali, Marathi, Telugu, and Tamil.

Improving our filtering to spot and hide more intentional misspellings of offensive terms, for instance, if someone uses a "1" instead of an "i" (a sketch of this kind of matching follows below).

Adding new terms to filter message requests that might contain scams or spam. We'll start doing this in English in certain countries, with more languages and countries coming soon.
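As a rough sketch of the misspelling-tolerant matching mentioned above (the "1" for "i" case), the snippet below normalises common character substitutions before checking a hidden-words list. The substitution table, word list and regexes are invented for illustration; Instagram's actual Hidden Words filter is not public.

```python
import re

# Hypothetical character-substitution table ("leetspeak" style).
LEET_MAP = str.maketrans({"1": "i", "!": "i", "3": "e", "4": "a", "@": "a",
                          "0": "o", "$": "s", "5": "s", "7": "t"})

def normalize(word: str) -> str:
    """Lowercase, map common substitutions, collapse long character repeats."""
    word = word.lower().translate(LEET_MAP)
    return re.sub(r"(.)\1{2,}", r"\1", word)   # "looooser" -> "loser"

def should_hide(message: str, hidden_words: set[str]) -> bool:
    """Return True if any token matches the hidden-words list after normalization."""
    tokens = re.findall(r"[\w@$!]+", message)
    return any(normalize(t) in hidden_words for t in tokens)

# Example usage with a made-up custom list:
hidden = {"idiot", "loser"}
print(should_hide("what an 1d10t", hidden))   # True: "1d10t" normalizes to "idiot"
print(should_hide("nice photo!", hidden))     # False
```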

Instagram steps down the content feed for new young teen users

26th August 2022
See article from about.instagram.com
Last summer, we launched the Sensitive Content Control so people could choose how much or how little sensitive content to see in Explore from accounts they don't follow. The Sensitive Content Control has three options, which we've
renamed from when we first introduced the control to help explain what each option does. The three options are: More, Standard and Less. Standard is the default state, and will prevent people from seeing some sensitive content and
accounts. More enables people to see more sensitive content and accounts, whereas Less means they see less of this content than the default state. For people under the age of 18, the More option is unavailable. The Sensitive
Content Control has only two options for teens: Standard and Less. New teens on Instagram under 16 years old will be defaulted into the Less state. For teens already on Instagram, we will send a prompt encouraging them to select the Less experience.
This will make it more difficult for young people to come across potentially sensitive content or accounts in Search, Explore, Hashtag Pages, Reels, Feed Recommendations and Suggested Accounts. In addition, we
are testing a new way to encourage teens to update their safety and privacy settings. We'll show prompts asking teens to review their settings including: controlling who can reshare their content, who can message and contact them, what content they can
see and how they can manage their time spent on Instagram.

Instagram introduces two new ways to verify age

23rd June 2022
Instagram explains in an article from about.fb.com
Instagram is testing new options for people on Instagram to verify their age, starting with people based in the US. If someone attempts to edit their date of birth on Instagram from under the age of 18 to 18 or over, we'll require them to verify their
age using one of three options: upload their ID, record a video selfie or ask mutual friends to verify their age. We're testing this so we can make sure teens and adults are in the right experience for their age group. We are also partnering with Yoti, a
company that specializes in online age verification, to help ensure people's privacy. In 2019, we began asking people to provide their age when signing up for Instagram. Since then, we've made this a requirement. Knowing people's
age allows us to provide appropriate experiences to different age groups, specifically teens. We require people to be at least 13 years old to sign up for Instagram. In some countries, our minimum age is higher. When we know if
someone is a teen (13-17), we provide them with age-appropriate experiences like defaulting them into private accounts, preventing unwanted contact from adults they don't know and limiting the options advertisers have to reach them with ads.
In addition to having someone upload their ID, we're testing two new ways to verify a person's age: Video Selfie: You can choose to upload a video selfie to verify your age. If you choose this option,
you'll see instructions on your screen to guide you. After you take a video selfie, we share the image with Yoti, and nothing else. Yoti's technology estimates your age based on your facial features and shares that estimate with us. Meta and Yoti then
delete the image. The technology cannot recognize your identity, just your age. Social Vouching: This option allows you to ask mutual followers to confirm how old you are. The person vouching must be at least 18 years
old, must not be vouching for anyone else at that time and will need to meet other safeguards we have in place. The three people you select to vouch for you will receive a request to confirm your age and will need to respond within three days.
You will still be able to upload your ID to verify your age with forms of identification like a driver's license or ID card. We will use your ID to confirm your age and help keep our community safe. Your ID will be stored securely on
our servers and is deleted within 30 days.
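The Social Vouching rules described above (three chosen vouchers, each at least 18, not vouching for anyone else at the time, and a three-day response window) amount to a small validation protocol. The sketch below models those constraints with invented field names and statuses; it is not Instagram's implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Voucher:
    user_id: str
    age: int
    currently_vouching_for: str | None = None  # hypothetical "one vouch at a time" flag

@dataclass
class VouchRequest:
    requester_id: str
    vouchers: list            # the three people the teen selected
    created_at: datetime = field(default_factory=datetime.utcnow)
    responses: dict = field(default_factory=dict)  # voucher user_id -> confirmed age (bool)

    RESPONSE_WINDOW = timedelta(days=3)

    def is_valid(self) -> bool:
        """Structural rules checked before any responses are counted."""
        return (len(self.vouchers) == 3
                and all(v.age >= 18 for v in self.vouchers)
                and all(v.currently_vouching_for in (None, self.requester_id)
                        for v in self.vouchers))

    def outcome(self, now: datetime) -> str:
        if not self.is_valid():
            return "rejected"
        if now - self.created_at > self.RESPONSE_WINDOW and len(self.responses) < 3:
            return "expired"          # vouchers did not all reply within three days
        if len(self.responses) == 3 and all(self.responses.values()):
            return "age_confirmed"
        return "pending"
```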

Instagram introduces a policy to nudge teenagers towards what they 'should' be reading about

6th December 2021
See blog post from about.instagram.com by Adam Mosseri, Head of Instagram
At Instagram, we've been working for a long time to keep young people safe on the app; as part of that work, today we're announcing some new tools and features to keep young people even safer on Instagram. We'll be taking a
stricter approach to what we recommend to teens on the app, we'll stop people from tagging or mentioning teens that don't follow them, we'll be nudging teens towards different topics if they've been dwelling on one topic for a long time and we're
launching the Take a Break feature in the US, UK, Ireland, Canada, Australia and New Zealand, which we previously announced. We'll also be launching our first tools for parents and guardians early next year to help them get more
involved in their teen's experiences on Instagram. Parents and guardians will be able to see how much time their teens spend on Instagram and set time limits. And we'll have a new educational hub for parents and guardians. Parents
and guardians know what's best for their teens, so we plan to launch our first tools in March to help them guide and support their teens on Instagram. Parents and guardians will be able to view how much time their teens spend on Instagram and set time
limits. We'll also give teens a new option to notify their parents if they report someone, giving their parents the opportunity to talk about it with them. This is the first version of these tools; we'll continue to add more options over time.
We're also developing a new educational hub for parents and guardians that will include additional resources, like product tutorials and tips from experts, to help them discuss social media use with their teens.
It's important to me that people feel good about the time they spend on Instagram, so today we're launching Take A Break to empower people to make informed decisions about how they're spending their time. If someone has been scrolling
for a certain amount of time, we'll ask them to take a break from Instagram and suggest that they set reminders to take more breaks in the future. We'll also show them expert-backed tips to help them reflect and reset. We're also
starting to test a new experience for people to see and manage their Instagram activity. We know that as teens grow up, they want more control over how they show up both online and offline so, for the first time, they will be able to bulk delete content
they've posted like photos and videos, as well as their previous likes and comments. While available to everyone, I think this tool is particularly important for teens to more fully understand what information they've shared on Instagram, what is visible
to others, and to have an easier way to manage their digital footprint. This new experience will be available to everyone in January. In July, we launched the Sensitive Content Control, which allows people to decide how much
sensitive content shows up in Explore. The control has three options: Allow, Limit and Limit Even More. Limit is the default state for everyone and is based on our Recommendation Guidelines. Allow enables people to see more sensitive content, whereas Limit Even More means they see less of this content than the default state. The Allow option is unavailable to people under the age of 18. We're exploring expanding the Limit Even More state beyond Explore for teens. This will make it more difficult for teens to come across potentially harmful or sensitive content or accounts in Search, Explore, Hashtags, Reels and Suggested Accounts. We're in the early stages of this idea and will have more to share in time.

Lastly, our research shows -- and external experts agree -- that if people are dwelling on one topic for a while, it could be helpful to nudge them towards other topics at the right moment. That's why we're building a new experience that will nudge people towards other topics if they've been dwelling on one topic for a while. We'll have more to share on this, and changes we're making when it comes to content and accounts we recommend to teens, soon.

Instagram introduces a new sanction for users that break its censorship rules

20th October 2021
See article from vice.com
Instagram has started showing some users a popup message explaining that it will soon take away their ability to post link stickers, which many creators use to send their followers to other sites, digital stores, and platforms where they can make money.
Instagram doesn't allow adult content on its platform, but many adult content creators use it for promotional reasons, inviting their Instagram audience to follow them to other platforms or personal sites. Link stickers are an option where
users can add an external link to their photo or video. In August, Instagram removed the ability to link and send users off platform by swipe-up on a story and replaced it with a sticker, a small clickable icon that hovers over the image. The
Instagram popup warns: Starting October 25, you will no longer have access to the link sticker because you have shared content that violates our Community Guidelines. There is no option to appeal this decision, only an OK button and a link to the Community Guidelines.
An Instagram spokesperson told Motherboard in a statement: As part of our efforts to
limit the spread of harmful content that violates our Community Guidelines, we'll restrict people who have repeatedly or severely violated these policies from using the link sticker. However, we're investigating an issue where people may have mistakenly
been notified that they will be restricted, and we're working on resolving this as soon as possible.

44 US states call for an end to Instagram's idea to introduce a version for under 13s

13th May 2021
See article from theverge.com
44 US states have come out against Instagram's idea for a version of the social networking site for under-13s. In an open letter, the National Association of Attorneys General called on Facebook to abandon plans for an Instagram platform focused on children under the age of 13. The letter is signed by 44 state-level attorneys general, who said: It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account. The attorneys general urge Facebook to abandon its plans to launch this new platform. While the letter has no formal legal
power, it emphasizes the significant legal risk Facebook will face in undertaking the project. In the US, children under 13 are subject to enhanced legal protections under the Children's Online Privacy Protection Act (or COPPA), which places particularly
stringent rules against data collection. Facebook said it would not sell ads on any Instagram app targeted at young children but did not back off on its interest in developing the app.
Miserable moralists from the Campaign for a Commercial-Free Childhood complain about Facebook's idea of an Instagram for kids

16th April 2021
See article from bbc.co.uk
A moralist campaign group called the Campaign for a Commercial-free Childhood wants Facebook to scrap its plans to launch a version of Instagram for children. A letter from the group, signed by 99 individuals and groups including the Electronic
Privacy Information Center, Global Action Plan and Kidscape, claims that the image-obsessed platform is dangerous for children's health and privacy. In the letter, the signatories point out that those under the age of 13 already on Instagram
are unlikely to abandon it for a new site that seems babyish. The real target of Instagram for kids will be much younger children. Josh Golin, Campaign for a Commercial-Free Childhood executive director, said: Instagram's
business model relies on extensive data collection, maximising time on devices, promoting a culture of over-sharing and idolising influencers, as well as a relentless focus on often altered physical appearance. It is certainly not appropriate for
seven-year olds.
Plans for an Instagram for under-13s have been mooted in recent weeks. Facebook, which owns Instagram, said it would be managed by parents. It is a response to state censors who want under-13s to be banned from social media. Facebook explained: Kids are already online, and want to connect with their family and friends, have fun, and learn. We want to help them do that in a safe and age-appropriate way, and find practical
solutions to the ongoing industry problem of kids lying about their age to access apps. We're working on new age verification methods to keep under-13s off Instagram, and have just started exploring an Instagram experience for
kids that is age-appropriate and managed by parents.
We agree that any experience we develop must prioritise their safety and privacy, and we will consult with experts in child development, child safety
and mental health, and privacy advocates to inform it. We also won't show ads in any Instagram experience we develop for people under the age of 13.

Instagram updates its censorship policy to allow pictures of women holding their breasts

27th October 2020
See article from xbiz.com
Instagram has announced it will be introducing a new nudity policy this week, which will now allow pictures of women holding, cupping or wrapping their arms around their breasts. Instagram said the change was prompted by a campaign by Nyome
Nicholas-Williams, a Black British plus-sized model, who had accused the Facebook-owned company of removing images showing her covering her breasts with her arms due to racial biases in its algorithm. According to Thomson Reuters, Instagram
apologized last month to Nicholas-Williams and said it would update its policy, amid global concern over racism in technology following the Black Lives Matter protests this year.

1st August 2020
Big Tech's Morality Police Are Going After Adult Content. See article from vice.com

But doesn't the transgender journey convert a gay person into a straight person?

10th July 2020
See article from bbc.co.uk
Instagram will block the promotion of conversion therapy, which tries to change a person's sexuality or gender identity. Campaigners are urging the government to act now on a two-year-old promise to make the practice illegal. This year, 200,000 people
have signed an online petition calling for action. In 2018, the government announced that gay conversion therapies were to be banned as part of a government plan to improve the lives of gay and transgender people, but activists note that such a ban
has not been initiated. The government has since said it will consider all options for ending the practice. Speaking exclusively to the BBC, Tara Hopkins, EMEA public policy director, Instagram, said the company is changing the way it handles
conversion therapy content: We don't allow attacks against people based on sexual orientation or gender identity and are updating our policies to ban the promotion of conversion therapy services. We are always
reviewing our policies and will continue to consult with experts and people with personal experiences to inform our approach.
Earlier this year, Instagram banned the promotion of conversion therapy in ads. From Friday, any content
linked to it will now be banned across the platform.

Instagram thinks that its AI systems can recognise bullying captions

19th December 2019
Thanks to Nick. See article from petapixel.com
Instagram has launched a new censorship feature that uses AI to recognize potentially offensive language and warn you that you're about to post something that might be deemed 'problematic'. The feature uses a machine learning algorithm that Instagram
developed and tested to recognize different forms of bullying and provide a warning if and when a caption crosses that line. The warning reads: This caption looks similar to others that have been reported. From
there, you can choose to either Edit the Caption, Learn More, or Share Anyway. If the AI makes a mistake, you can report it by clicking Learn More. The feature joins another AI-powered pop-up, released earlier this year, which warns users when
their comments may be considered offensive. Instagram said: We've found that these types of nudges can encourage people to reconsider their words when given a chance. Additionally, Instagram hopes that the feature
will be informative, helping educate people on what is and is not allowed.
The warning will roll out around the world in the next few months.
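As a rough illustration of the pre-post nudge described here, the sketch below flags a draft caption when it closely resembles previously reported captions and offers the same three options. The similarity measure, example captions and threshold are placeholders; the article does not describe how Instagram's model actually works.

```python
from difflib import SequenceMatcher

REPORTED_CAPTIONS = ["you are such a loser", "nobody likes you"]  # illustrative only

def looks_like_reported(caption: str, threshold: float = 0.75) -> bool:
    """Return True if the caption closely resembles a previously reported one."""
    caption = caption.lower().strip()
    return any(SequenceMatcher(None, caption, ref).ratio() >= threshold
               for ref in REPORTED_CAPTIONS)

def pre_post_check(caption: str) -> dict:
    """Mirror the three options shown to the user: edit, learn more, or share anyway."""
    if looks_like_reported(caption):
        return {"warning": "This caption looks similar to others that have been reported.",
                "options": ["Edit the Caption", "Learn More", "Share Anyway"]}
    return {"warning": None, "options": ["Share"]}

print(pre_post_check("You are SUCH a loser"))   # triggers the warning
```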

Instagram is considering the monstrosity of full identity verification for users

5th December 2019
See article from telegraph.co.uk
Instagram is actively considering bringing in gambling app-style full identity verification in the name of preventing underage children from joining. Vishal Shah, Instagram's head of product, said the social media site would not rule out asking new users to submit proof of age as it looked at ways to tighten up how it verifies users' ages. His comments come as Instagram announced it would now start asking all new members to give their date of birth when signing up. The social
network also said it would soon start using the date of birth users had given on Facebook to verify ages on Instagram. Currently, Instagram asks if new users are over or under 18, and then only asks for a date of birth for those who say they are 17 or
younger. Parent company Facebook said: We understand not everyone will share their actual age. How best to collect and verify the age of people who use online services is something that the whole industry is exploring
and we are committed to continuing to work with industry and governments to find the best solutions. Nobody will have their date of birth publicly displayed on their Instagram profile.

Because they don't like what they do for a living

28th November 2019
See article from bbc.com
Hundreds of porn stars and sex workers had their Instagram accounts deleted this year, and many say that they're being held to a different standard than mainstream celebrities. I should be able to model my Instagram account on
Sharon Stone or any other verified profile, but the reality is that doing that would get me deleted, says Alana Evans, president of the Adult Performers Actors Guild and one of the leading voices in the battle that adult stars are waging to stay on the
platform. Ms Evans' group has collected a list of more than 1,300 performers who claim that their accounts have been deleted by Instagram's content moderators for violations of the site's community standards, despite not showing
any nudity or sex. They discriminate against us because they don't like what we do for a living, Ms Evans says. ...Read the full article from bbc.com

9th November 2019
Guardian article calls for Instagram to ease up on censorship of a queer arts collective and anything not white, average-sized and cis. See article from theguardian.com

Instagram adds facility to flag posts as 'fake news'

17th September 2019
See article from bbc.com
Facebook has launched a new feature allowing Instagram users to flag posts they claim contain fake news to its fact-checking partners for vetting. The move is part of a wider raft of measures the social media giant has taken to appease the authorities
who claim that 'fake news' is the root of all social ills. Launched in December 2016 following the controversy surrounding the impact of Russian meddling and online fake news in the US presidential election, Facebook's partnership now involves
more than 50 independent 'fact-checkers' in over 30 countries. The new flagging feature for Instagram users was first introduced in the US in mid-August and has now been rolled out globally. Users can report potentially false posts by clicking or tapping on the three dots that appear in the top right-hand corner, selecting report, then it's inappropriate and then false information. No doubt the facility will be more likely to be used to report posts that people don't like rather than for 'false information'.

5th September 2019
Marius Sperlich's provocative pics are the antidote to Instagram censorship. See article from dazeddigital.com

Instagram to allow users to report 'fake news' but no doubt this will be used to harass those with opposing views

18th August 2019
See article from theguardian.com
Instagram is adding an option for users to report posts they claim are false. The photo-sharing website is responding to increasing pressure to censor material that governments do not like. Posts rated as false are removed from search tools, such as Instagram's explore tab and hashtag search results. The new report facility on Instagram is being initially rolled out only in the US. Stephanie Otway, a Facebook company spokeswoman, said: This is
an initial step as we work towards a more comprehensive approach to tackling misinformation. Posting false information is not banned on any of Facebook's suite of social media services, but the company is taking steps to limit the
reach of inaccurate information and warn users about disputed claims.

Instagram adds another reason to ban users but promises better warnings of impending censorship and also a better appeal process

19th July 2019
See article from instagram-press.com
Instagram explains in a blog post: Under our existing policy, we disable accounts that have a certain percentage of violating content. We are now rolling out a new policy where, in addition to removing accounts with a certain
percentage of violating content, we will also remove accounts with a certain number of violations within a window of time. Similarly to how policies are enforced on Facebook, this change will allow us to enforce our policies more consistently and hold people accountable for what they post on Instagram. We are also introducing a new notification process to help people understand if their account is at risk of being disabled. This notification will also offer the opportunity to appeal deleted content. To start, appeals will be available for content deleted for violations of our nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, but we'll be expanding appeals in the coming months. If content is found to be removed in error, we will restore the post and remove the violation from the account's record. We've always given people the option to appeal disabled accounts through our Help Center, and in the next few months, we'll bring this experience directly within Instagram.

US adult performers protest the injustice of Instagram, which summarily removes accounts without warning, explanation, or right to appeal

20th June 2019
Thanks to Nick. See article from theguardian.com
Dozens of adult performers have picketed outside of Instagram's Silicon Valley headquarters over censorship guidelines and the arbitrary, inconsistent enforcement of the rules. They said that this has led to hundreds of thousands of account suspensions
and is imperiling their livelihoods. Adult performers led the protest on Wednesday, but other users including artists, sex workers, queer activists, sex education platforms and models say they have been affected by the platform's opaque removal
system. The action was organized by the Adult Performer Actors Guild, the largest labor union for the adult film industry. They were complaining in particular about the way that the company takes down accounts without warning or explanation and provides no real recourse or effective appeal system. Amber Lynn, an American porn star based in Los Angeles, said her account was terminated without warning or explanation two months ago. She had more than 100,000 followers.
I sent [Instagram] multiple emails through my lawyer and they will still not tell me why they did it, she said. They do not answer you, do not give you an opportunity to correct any problems or even tell you what problems they had to
begin with so you can avoid it in the future.

29th May 2019
The Photographers Fighting Instagram's Censorship of Nude Bodies. By Kelsey Ables. See article from artsy.net

30th April 2019
The platform's new policy will disproportionately affect women and sex workers. By Jesselyn Cook. See article from huffpost.com

Images of butchered meat are now defined as sensitive and liable to offend on Instagram

7th March 2019
See article from independent.co.uk
A chef has criticised Instagram after it decided that 'sensitive' users needed to be protected from a photograph she posted of two pigs' trotters and a pair of ears. Olia Hercules, a writer and chef who regularly appears on Saturday Kitchen and Sunday Brunch, shared the photo alongside a caption in which she praised the quality and affordability of the ears and trotters before asking why the cuts had fallen out of favour with people in the UK. However, Hercules later discovered
that the image had been censored by the photo-sharing app with a warning that read: Sensitive content. This photo contains sensitive content which some people may find offensive or disturbing. Hercules hit back at the decision on Twitter,
condemning Instagram and the general public for becoming detached from reality.

Instagram apologises for its censorship of a gay kiss

2nd July 2018
Thanks to Nick. See article from indy100.com
Instagram has apologised for censoring a photo of two men kissing for violating community guidelines. The photo - featuring Jordan Bowen and Luca Lucifer - was taken down from photographer Stella Asia Consonni's Instagram. A spokesperson for
the image sharing site regurgitated the usual apology for shoddy censorship, saying: This post was removed in error and we are sorry. It has since been reinstated.
The photo was published in i-D
magazine as part of a series of photos by Stella exploring modern relationships, which she plans to exhibit later this year. It only reappeared after prominent people in fashion and LGBT+ rights raised awareness about the removal of the photo.

Instagram is trying to inform posters that their post has been saved, snapped or recorded before it self-destructs

13th February 2018
Thanks to Nick. See article from dazeddigital.com
Some users have reported seeing pop-ups in Instagram (IG) informing them that, from now on, Instagram will be flagging when you record or take a screenshot of other people's IG stories and informing the originator that you have snapped or recorded the post. According to a report by TechCrunch, those who have been selected to participate in the IG trial can see exactly who has been creeping and snapping their stories. Those who have screenshotted an image or recorded a video will have a little camera shutter logo next to their usernames, much like Snapchat. Of course, users have already found a nifty workaround to avoid social media stalking exposure. So here's the deal: turning your phone on airplane mode after you've loaded the story and then taking your screenshot means that users won't be notified of any impropriety (it sounds easy for Instagram to fix this by saving the screenshot event until the next time the app communicates with the Instagram server, as sketched below). You could also download the stories from Instagram's website or use an app like Story Reposter. Maybe PC users just need another small window on the desktop, then move the mouse pointer to the small window before snapping the display. Clearly, there are concerns on Instagram's part about users' content being shared without their permission, but if the post is shared with someone for viewing, it is pretty tough to stop them from grabbing a copy for themselves as they view it.
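The fix suggested in the parenthetical above (hold the screenshot event until the app next talks to the server) could look something like the sketch below. Everything here is hypothetical, from the queue file to the function names; it simply shows local event queuing that survives airplane mode.

```python
import json
import time
from pathlib import Path

QUEUE_FILE = Path("pending_screenshot_events.json")  # hypothetical local store

def record_screenshot(story_id: str, viewer_id: str) -> None:
    """Append the event to a local queue; works even with no network connection."""
    events = json.loads(QUEUE_FILE.read_text()) if QUEUE_FILE.exists() else []
    events.append({"story_id": story_id, "viewer_id": viewer_id, "ts": time.time()})
    QUEUE_FILE.write_text(json.dumps(events))

def flush_queue(send_to_server) -> None:
    """Call on every app sync: deliver any queued events, then clear the queue."""
    if not QUEUE_FILE.exists():
        return
    for event in json.loads(QUEUE_FILE.read_text()):
        send_to_server(event)          # e.g. an HTTPS POST once back online
    QUEUE_FILE.unlink()
```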

18th April 2017
The Official Full List of Hashtags Banned From Instagram. See article from bet.com

Study finds that it doesn't take long for the wit of man to dream up new words to replace those banned by Instagram

13th March 2016
Thanks to Therumbler. See article from independent.co.uk
A scientific study has found that Instagram's decision to ban certain words linked to pro-anorexia posts may have actually made the problem worse. The study, conducted by a team at Georgia Tech, found that the censoring of terms like thighgap, thinspiration and secretsociety, commonly used by anorexia sufferers, initially caused a decrease in use. However, they found that users adapted by simply making up new, almost identical words to get around Instagram's moderation, often by altering spellings to create terms like thygap and thynspooo. Instagram's censoring of pro-eating disorder (ED) content began in 2012, when it began limiting what users could see when searching for certain terms. Some terms, like #thinspiration, simply return no results when searched for in the app. Other terms, like #thin, are still searchable, but users first have to read a message warning them about the content and directing them towards ED support services before they can see any pictures. The researchers believe that by accidentally prompting the creation of these terms, Instagram polarised the vulnerable pro-ED community and actually increased how much members engaged with the content. Munmun De Choudhury, an assistant professor at the school, said: Likes and comments on these new tags were 15 to 30 per cent higher than the originals.
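The study's core finding, that exact-match tag bans are easily dodged by near-identical spellings, can be illustrated in a few lines of code. The blocklist, threshold and similarity measure below are arbitrary examples, not anything Instagram or the Georgia Tech team used; note that the heavier respelling still slips through, which is exactly the adaptation the researchers observed.

```python
from difflib import SequenceMatcher

BLOCKED = {"thighgap", "thinspiration"}   # illustrative blocklist only

def exact_blocked(tag: str) -> bool:
    """The 2012-style approach: block only exact matches."""
    return tag.lower() in BLOCKED

def fuzzy_blocked(tag: str, min_ratio: float = 0.7) -> bool:
    """Also catch near-identical respellings via edit-distance similarity."""
    tag = tag.lower()
    return any(SequenceMatcher(None, tag, b).ratio() >= min_ratio for b in BLOCKED)

for variant in ["thighgap", "thygap", "thynspooo"]:
    print(f"{variant}: exact={exact_blocked(variant)} fuzzy={fuzzy_blocked(variant)}")
# thighgap: exact=True fuzzy=True
# thygap: exact=False fuzzy=True       -- slips past the exact list only
# thynspooo: exact=False fuzzy=False   -- altered enough to evade both checks
```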

Instagram updates its censorship rules

18th April 2015
See article from washingtonpost.com. See Instagram censorship rules from help.instagram.com
Instagram has updated its censorship rules to give users more insight into how it polices content on its site. Nicky Jackson Colaco, director of public policy for Instagram said: We're not changing any of the policies. But
the company has added in detail around questions we've gotten over and over, and into places where [users] needed more information.
Parent company Facebook also updated its censorship rules several weeks ago, and many of the policies outlined in Instagram's latest guidelines are the same as the ones Facebook explained in its latest rewrite. These include specific prohibitions against messages that support or praise terrorism or hate groups, serious threats of harm to public or private safety, and clear statements against abuse of all kinds. Rules common to both websites say:
We remove content that contains credible threats or hate speech, content that targets private individuals to degrade or shame them, personal information meant to blackmail or harass someone, and repeated unwanted
messages.
On the question of nudity, Instagram says that nudity in general -- and pornography specifically -- is off-limits. But photos of post-mastectomy scarring and women actively breastfeeding are allowed, the guidelines say. Nudity in photos of paintings and sculptures is OK, too.

Chelsea Handler vs Putin: International competition for best instagrams and best takedowns

3rd November 2014
See article from gawker.com
Chelsea Handler is an American comedienne, actress, author, television host, writer and producer. She hosted a late-night talk show called Chelsea Lately on the E! network. Chelsea Handler's bare breasts were on Instagram for roughly half
an hour after she shared a topless photo of herself riding a horse. The pic was a protest against an unfair double standard: Vladimir Putin can freely post topless pictures on horseback anywhere online without fear of censorship, but a lady's nipples are still considered obscene by many websites. Chelsea explained: Anything a man can do, a woman has the right to do better #kremlin. Instagram repeated the censorship three times before Chelsea got the message that the US can be more censorial than Russia, and that free speech does not apply when people are supposedly offended or outraged.

Another example of incompetent social media censorship

25th June 2014
See article from dailylife.com.au. See also So is this innocent picture of my child inappropriate, too? from telegraph.co.uk
In a picture, a little girl is seen lifting her dress to admire her new underpants, evidence to her of her first steps in toilet training. But the tummy and underpants are considered by Instagram to be nudity. Adamo was warned by the site about posting inappropriate content but, not being able to recognise the supposed sexual tones in her children's photos fast enough, she had her account deleted before she could resolve it. Adamo's account has since been reactivated after mounting furore. But an incident like this still begs the question: are photography sharing sites being unnecessarily rigid about content and prudish about flesh? Facebook, for instance, has only just lifted its long held ban on the appearance of female nipples in breastfeeding photos. Indeed, there's a deliberate reluctance by such sites to involve themselves in the debate required for interpreting content. Blanket policies save social media sites from needing to pay people, rather than inexpensive filter programs, to do specialised decision making. Adamo, cofounder of a fashionable online baby boutique, had over 36,000 followers of her family photo album on Instagram before her account was removed.

Rihanna offends Instagram with sexy French magazine cover

1st May 2014
See article from independent.co.uk
Rihanna shared a picture of her appearance on the cover of French magazine Lui, in which she appears in a hat and a pair of coral briefs. The image was shot by fashion photographer Mario Sorrenti. However, nudity, partial nudity or sexually suggestive
photographs are banned on Instagram and the social media platform temporarily closed her account until the picture was taken down. Instagram's censorship rules read: If you wouldn't show the photo or video you are
thinking about uploading to a child, or your boss, or your parents, you probably shouldn't share it on Instagram. The same rule applies to your profile photo. Accounts found sharing nudity or mature content will be disabled and
your access to Instagram may be discontinued.

12th November 2013
Soon everything will be banned on Instagram, continuing with tags related to drugs. See article from bbc.co.uk

American Apparel t-shirts

18th October 2013
8th October 2013. See article from dailymail.co.uk
The Daily Mail goes into overdrive and asks: Has American Apparel gone too far? Shoppers attack vile and disturbing T-shirt showing menstruation.

A unisex T-shirt emblazoned with a drawing of a woman's vagina stained with menstrual blood has been deemed vile, gross and disturbing since going on sale at American Apparel. The retailer notes on its website that the $32 Period Tee was designed by 20-year-old Toronto-based artist, Petra Collins, who
creates portraits exploring female sexuality and teen girl culture.
However, the Daily Mail couldn't really back up the 'outrage'. All they have are a few unimaginative tweets, eg: @sawissinger wrote, this is the worst thing I can possibly think of, while @kawaii_origami tweeted, are you kidding me?
Update: More 'outrage', 18th October 2013. See article from dailymail.co.uk
The designer behind American Apparel's menstruation T-shirt has had her Instagram account deleted after uploading an image of her unshaven bikini line. Petra Collins posted a snap of herself from the waist down wearing a bathing suit bottom
with her pubic hair peeping out to the photo-sharing site last week. However, she revealed to OysterMag.com today that the picture attracted so many complaints from users that her account - which had over 25,000 followers - has since been deleted.
The artist said dozens of people deemed her self-portrait horrible and disgusting. Voicing her annoyance, she wrote: To those who reported me, to those who are disgusted by my body . . . I
want you to thoughtfully dissect your own reaction to these things. Please think about WHY you felt this way, WHY this image was so shocking, WHY you have no tolerance for it.