Today Facebook Fine-Tuned Its Rules for You!
Facebook’s community standards are getting a facelift today: For the first time ever, the social media platform will outright ban content that promotes sexual violence or exploitation, including “revenge porn,” as well as posts that can be considered “violent, criminal or hateful.”
Facebook’s posting guidelines have been inconsistent in the past — the company has waffled on banning everything from breastfeeding photos to beheading videos — but the updated rules spell out what users can and cannot post more clearly than ever before. The following content categories were some of the most heavily impacted:
Nudity. Porn has always been banned on Facebook, but the new guidelines are more nuanced about what kind of nudity is allowed. Don’t expect to see any “fully exposed buttocks” in your News Feed, “vivid” depictions of sexual acts, or genitals (in general). Photos of “paintings, sculptures, and other art that depicts nude figures” are still fair game, as are pics of women “actively engaged in breastfeeding or showing breasts with post-mastectomy scarring.” The #FreeTheNipple war rages on, though: Facebook will continue to restrict “some images of female breasts” if nipples make an appearance.
Sexual violence and exploitation. The new guidelines say that “photographs or videos depicting incidents of sexual violence” will be removed, as will images shared in the spirit of revenge or without permission. Sexual solicitations, any sexual content involving minors, “threats to share intimate images,” or offers of sexual services are also banned, and Facebook will turn things over to law enforcement when it seems appropriate.
Self-injury. The new policy prohibits content that encourages self-harm, including suicide, self-mutilation, and eating disorders. However, users can share this type of content if it’s meant to support others in distress. Additionally, posts that identify or target specific victims for attack will be removed. This section is also careful to clarify that body modification does not count as self-mutilation.
Hate speech. Any content that directly attacks people based on race, ethnicity, national origin, sexual orientation, gender identity, disabilities, or illnesses will be removed.
What impact, if any, these sweeping changes will have remains to be seen. Facebook says it still has no plans to automatically scan for and remove content that potentially violates the community standards. Monika Bickert, the company’s head of global policy management, told The New York Times that the platform will continue to rely on users to report when the rules are being broken. With 1.39 billion members across the world to police, the team responsible for responding to those flagged posts certainly has its work cut out for it. And given that the new standards include language policing the “spirit” of various posts, or whether the poster has permission to share a given piece of content, it’s unclear how the site plans to undertake this massive crackdown. Its standards, we’d have to guess, will continue to be a moving target.