FAQ

Breaking:

IT WORKED! Read our statement here

Q:  Do you have examples of the kind of pages advertisers should be aware their ads may appear on?

A:  Here is a list of pages and content illustrating a widespread, systematized acceptance of violence against women. In many instances, this content was reported, passed Facebook’s review process, sometimes garnered tens of thousands of supporters, and was removed only after active media exposure and public protest. Some of these pages are still live and have not been removed, despite complaints.

Q: Why are you focusing this protest/initiative/project on Facebook?

A: Facebook has more than a billion users and is a powerful social medium and culture.  It is part of the Internet, but it is not “the Internet.” The company has procedures, terms and community guidelines that it interprets and enforces. Unacceptably, it does so in prejudicial ways that marginalize girls and women and contribute to violence against them.

Q: Why are you focusing on advertisers and not working with Facebook directly to address the problem?

A: Over the past several years multiple individuals and organizations have approached Facebook and worked with the company on this and related issues. However, despite these efforts, Facebook still regularly approves content that promotes violence against women. Recent attempts to engage Facebook similarly yielded no results.

Q: Aren’t Facebook advertisements targeted to individual users rather than pages, so advertisers are not able to control where their ads appear?

A: This campaign is not intended to demonize advertisers – we are drawing their attention to the fact that their ads are appearing on these pages, which they may not even realize. However, because Facebook’s advertising system works this way, as long as Facebook continues to allow content depicting and inciting rape and domestic violence, advertisers have to be aware that their ads could appear on such pages. Even if advertisers ask Facebook to remove a specific page or pages on which their ads may appear, unless Facebook takes the steps we have asked for and publicly and clearly bans such content from its site, advertisers should be aware that their brands will appear alongside, and effectively sponsor, content that mocks, trivializes or promulgates gendered violence. So in just the same way that a company would have to decide whether or not to advertise in the pages of a magazine that included hate speech against a particular group, advertisers must choose whether to withdraw their ads from Facebook until the company addresses this issue.

Q: What exactly do Facebook’s guidelines prohibit?

A: Facebook’s Community Guidelines prohibit nudity and pornography; graphic violence shared for sadistic intent; hate speech; bullying and harassment, including “abusive behavior directed at private individuals”; and any content that appears to constitute a legitimate threat of self-harm. Content deemed to be “a genuine risk of physical harm, or a direct threat to public safety” is prohibited, as is any content indicating that a user is organizing “acts of real-world violence.”

Q: If Facebook bans attacks on identified groups and hate speech, what is the problem?

A: The company’s community standards state: “We do not permit individuals … to attack others based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition.” However, in practice, Facebook deals with some forms of hate speech more aggressively than others. Gender-based hate speech in which women are targeted for being women is not treated as a legitimate form of hate speech by Facebook.

Q: But what about free speech? Aren’t you advocating censorship?

A: Facebook is a corporation, not a government actor, and is therefore not subject to the First Amendment. Despite claims that it is dedicated to the principle of free speech, Facebook makes millions of decisions about what speech is allowable through its moderation process. It is, by virtue of that process, policing speech every day. We are demanding that Facebook openly and actively confront sex- and gender-based hate speech on its platform, and stop hiding behind sexist interpretations of words like “free speech”, “safety”, “humor” and “credible threats.” The company would like to give the impression that engaging within Facebook is like going to a dinner party where diverse people have the opportunity to express diverse views openly and freely. In this depiction, Facebook might appear to be the table at which the guests sit. However, this is not the case. In this analogy, by virtue of the moderation process and guidelines in place, Facebook is the host of the dinner party. It can and does ask people to leave when it has a problem with certain expressed opinions.

Q: How is Facebook specifically hurting women?

A: There are four primary ways:

1) Facebook prohibits hate speech. The company’s moderators deal with content that is violently homophobic, Islamophobic, and anti-Semitic every day. So, for example, “I hate Muslims” or “I hate Jews” is not allowable (for good reason!). However, pages that express similar sentiments targeted at girls and women – often through misogynistic and sexist language – are not treated the same way. Our objection here is to Facebook’s inconsistency in deciding what constitutes hate speech, and who is treated as a valid target of hate.

2) Facebook employs a grossly sexist double standard when it comes to representations of women’s bodies. Photographs that are pornographic – also not allowed by Facebook’s terms – and display women’s bodies as sex objects, including pictures of women fully exposed, women tied up, women in extreme pain, and dead and abused women, remain on the site, while content posted by groups that represent women’s bodies for other purposes – health and education, art, political protest – is rejected and removed. For example, pictures of breastfeeding mothers, placentas, or medical illustrations of women’s sexual and reproductive organs are regularly banned. Facebook moderators remove this content – often created by women for women – while they routinely allow photographs of girls and women to be used without those women’s consent for the purposes of harassment, bullying, slut-shaming and sexual ranking, review and commentary.

Last week, the Daily Beast’s Facebook page was suspended after it published a famous painting depicting a topless Bea Arthur.

3) Facebook declines to treat women’s speech as political or artistic while arguing that it is dedicated to enabling free speech in the service of social justice. A common refrain in Facebook’s defense of its commitment to allow certain content, including content containing graphic violence (also in violation of its terms), is that the company wants to reveal the real world and help catalyze social change. But Facebook routinely penalizes activist feminists on the site by removing their content, suspending their accounts, and disabling their links, as in the case of the Uprising of Women in the Arab World. For example, Hildur Lilliendahl, an Icelandic feminist, established a page to protest images like this one: a picture of a woman in her underwear, tied up with ropes, gagged with an apple like a suckling pig on a spit, suspended from a long metal pole carried in procession by a gang of men. The caption read: “Feminist found in town this morning—captured and put on the grill.” When Lilliendahl reposted a threat made against her, she was suspended and her account was blocked at least four times. The threats stayed up. Facebook eventually apologized, but made no statement regarding how its policies or processes were changed to make sure this doesn’t happen again. Other examples involving similar circumstances include: Rapebook, The Uprising of Women in the Arab World, Women on Waves, The Girls Guide to Taking Over the World, Rabid Feminist, Thorlaug Agustsdottir, Mama to Mama, and Feminists at Sea. In many instances, after the fact, Facebook apologizes and restores the account or content. However, there appears to be a bias in favor of censoring the political and artistic speech of women, while allowing hate speech against women.

4) The way in which Facebook’s moderation process is structured – treating each incident and report as isolated and unrelated, on a case-by-case basis – fails to address the overall environment this approach creates: one of harm and hostility toward women users, in which they are disproportionately silenced.

Q: Can’t you people take a joke?

A: Many people object to the idea that rape jokes and domestic violence jokes constitute a form of hate speech, or cause harm and create an unsafe environment. Facebook routinely categorizes rape and domestic violence content as “Humor” or “Controversial Humor.” Studies show that jokes making light of violence against women are triggering to victims and degrade the ability of people exposed to the content to empathize with victims. Not only does this content normalize violence, making it increasingly tolerable to bystanders, but it also encourages abusers to find each other and form communities with enthusiastic social sanction, creating an unsafe space where girls and women are easily harassed, bullied and silenced. Studies repeatedly demonstrate that rape and domestic violence jokes create an environment of harm that has real-world consequences for women. These jokes constitute an entire category of what is called “disparagement humor,” defined as racist or sexist content that denigrates, belittles or marginalizes a targeted group. In general, disparagement humor that targets people of color or a religious group is not allowed by Facebook. However, disparagement humor that targets women is.

We are hard pressed to understand exactly what makes the rape and battery of women funny or controversial.

Q: Isn’t some speech, which includes graphically violent content, political and important?

A: Yes. However, again, Facebook has demonstrated an inability to consider women’s speech equally. Take the example of Amina Tyler, a women’s rights activist in Tunisia, whose topless protest photos, like all photos of women protesting either partially or entirely undressed, were airbrushed or removed from Facebook pages. In another case, The Uprising of Women in the Arab World, photos that women shared of themselves, fully dressed but without headscarves and captioned with expressions of wanting, for example, to “feel the air on my hair and body,” were removed. The administrators of this page, engaged in non-violent protest involving the use of women’s bodies for non-sexualized purposes, were suspended by Facebook. Conservatives had reported the photos of women without headscarves appearing on the page. After minimal response, Facebook explained that one removal was in error and the other a legitimate violation of its standards. Facebook’s sexualized, male-gaze nipple policies (rape jokes, yes; nursing mothers, no; pornified pinups, yes; barechested strollers, no) are not only puritanical and sexist, but result in the suppression of women’s speech.

Q: Isn’t it true that Facebook is not in the business of establishing rules for civility for all of society?

A: If that is true, then it should eliminate its guidelines for content moderation and provide a truly open “marketplace of ideas.”  We are only asking that Facebook apply its own, existing standards banning hate speech to hate speech that targets women.

Q: Isn’t there a risk that overregulation will suppress speech?

A: Women’s speech is already suppressed disproportionately in the current (theoretically “neutral”) system. When women are harassed and threatened online, or encounter gender-based hate speech, they are less likely to participate openly in that space. Facebook is already suppressing the speech of women by allowing gender-based hate speech to flourish on its pages.

Q: Isn’t the Internet just offensive? Doesn’t Facebook allow lots of offensive material?

A: What is offensive is that the infliction of pain on girls and women – pain inflicted because they are female – is common, entertaining and acceptable. Content that reduces girls and women to their body parts and mocks sexual and domestic violence sends the message that they are violable for other people’s purposes and entertainment. At the same time, this content perpetuates harmful stereotypes about what makes men “real” – violence, control, infliction of pain on others, lack of empathy, never being weak or helpless.

Facebook’s position is this: “We occasionally see people post distasteful or crude content. While it may be vulgar and offensive, distasteful content on its own does not violate our policies. However, there is no place on Facebook for content that is hateful, threatening, or incites violence, and we will not tolerate material deemed to be genuinely or directly harmful.” That position is fine on the surface, but Facebook fails to recognize violence against girls and women as “hateful, threatening, or incit[ing] violence” far, far too often. That is why we are calling on Facebook to change its policies. We are resisting an environment that systemically tolerates hate, degradation, objectification and marginalization of girls and women, behind which loiters actual violence. Women, acculturated to a world where one in three women will be sexually assaulted or battered (in the US, that number is one in five; for men, one in 77), cannot separate this reality from their online experiences. Domestic violence statistics reflect a similar epidemic. When victims are older teens or adults, the majority of perpetrators in either case are men. This dynamic is reflected in and perpetuated by online misogyny. Facebook has a distinctly male-centric view of what “genuinely or directly harmful” means when assessing safety.

Q: What does this initiative have to do with safety?

A: Men and women, on the whole, understand “safety” differently, which is evident in a double-digit, real-world, sex-based safety gap. So, for example, in the United States, 89% of men surveyed feel that they can safely walk at night in their neighborhoods; only 62% of women do. This gap is reflected in the fact that 75% of online abuse is targeted at women. Girls and women are acculturated to a world where one in three women will be sexually assaulted (in the US, that number is one in five; for men, one in 77) and cannot reasonably be expected to separate this reality from their online experiences. Domestic violence and homicide statistics show similar levels of gender-based violence. Women experience and assess safety differently from men, and Facebook’s policies do not take this into account.

Q: Can’t you just stop using Facebook?

A: Facebook is a contested public space, meaning there are differences of opinion about who should be engaged publicly, and these opinions are often expressed against women in hostile ways. Women should not be forced by default to cede this public space because of the company’s failure to examine its own policies and processes. Recently, a page called Rapebook, which was set up to provide a place where people could collectively identify and report content to Facebook, ended its efforts after the founders – who have since left Facebook – were targeted by an online attack. It is important to note that people who supported Rapebook’s efforts were unwilling to show their support publicly on Facebook, for fear of similar targeting, which crossed into the founders’ “real” world. This is a genuine loss of free speech for these users (overwhelmingly women), resulting from bullying, harassment and sexist online abusiveness. The people left feeling comfortable at Facebook are rape apologists and those who create content glorifying the debasement of women. Facebook’s interpretation of “freedom of speech” is being used as a defense of unjust actions that intimidate and silence female users and create a hostile space.

Q: Why focus on girls and women and not on violence in general?

A: Because women are universally marginalized and subjected to high levels of violence, and Facebook’s approach exacerbates this reality. In order to effectively and fairly enforce guidelines, Facebook cannot ignore the effect of speech on the social status of women as a group. Click here, here, here and here for statistics regarding violence against women, and here for statistics regarding attitudes toward violence against women.

Q:  What can Facebook do to address this imbalance?

A:  We are asking Facebook to commit to three steps:

  1. Recognize speech that trivializes or glorifies violence against girls and women as hate speech and make a commitment that you will not tolerate this content.

  2. Effectively train moderators to recognize and remove gender-based hate speech.

  3. Effectively train moderators to understand how online harassment differently affects women and men, in part due to the real-world pandemic of violence against women.

Training would mean that Facebook commits to a policy development process regarding violence against women that involves input from women’s rights and feminist organizations and activists.

Q: Didn’t Facebook, along with other online companies such as Twitter and Yahoo, form an “Anti-Cyberhate Working Group”?

A: Yes. However, this group seems to be concerned with how companies like Facebook can apply their guidelines transnationally. We are saying that their guidelines are flawed and damaging to begin with. Writing recently about this group in The New Republic, Jeffrey Rosen explained that early in the definition of Facebook’s process, it was decided that “vague standards prohibiting speech that creates a ‘hostile environment’ weren’t practical,” and that the person responsible for creating a new process used “university anti-harassment codes” as a model. Rates of sexual and domestic violence on American college campuses are HIGHER than those in the general population. Schools across the country are being charged with Title IX violations for failing to protect female students from harassment, sexual assault and violence. These institutions are institutionally tolerant of violence against women and a terrible basis on which to build a model.

Q: Why do you think Facebook’s free speech norms are biased against women?

A: Facebook has argued that it wants to provide a forum for the free exchange of ideas and a diversity of opinion, and to allow independent, autonomous people to decide for themselves. Facebook also argues that speech should only be constrained when it is intended or likely to result in “real” violence or crime. The company completely fails to incorporate the reality that speech that demeans, denigrates, objectifies and threatens women with violence has the exact effect on women that Facebook claims to protect against: women’s speech is constrained and not free. In addition, “real” violence does not seem to include effects such as anxiety or changes in behavior that women experience in hostile environments. While on the surface the company is interested in diversity and pluralism, in reality it is perpetuating deeply patriarchal ideas that silence and punish women. Otherwise, it is very difficult to explain demonstrable double standards in representations of women’s bodies and speech. The company’s claims of neutral liberalism with regard to speech have, in actuality, conservative, sexist results.