While India and much of the world protest heavily, both on the streets and online, against yet another crime committed against a woman, many Pakistanis celebrate their team's victory in an ODI series against India and liken that victory to the very crime being protested.
Using the word 'rape' as an analogy for overwhelming power over others is gaining greater acceptance these days. Many youngsters, especially males, use the term wantonly as urban slang for 'defeat' or 'damage', without considering that it describes an act of utter cowardice and is a crime not just against women but against humanity.
Facebook, with its one billion active users (according to official figures released in October 2012), has a huge role to play in the growing acceptance of terms and concepts that depict and incite violence against women.
Its policies clearly state that “Facebook does not permit hate speech, but distinguishes between serious and humorous speech. While we encourage you to challenge ideas, institutions, events, and practices, we do not permit individuals or groups to attack others based on their race, ethnicity, national origin, religion, gender, sexual orientation, disability or medical condition.”
Contrary to this and other policies, a great deal of content demeaning women is regularly uploaded to Facebook. Such content encourages the abuse and disrespect of women, and some pages even go to the extent of discussing ways to kill them.
These pages usually have anywhere from about 500 to 15,000 followers, and often grow to a million. For instance, a page titled "I like to dissect girls, did you know I'm utterly insane?" has 434 followers, and another, 'Sh**s and Giggles', which has been reported many times by those offended by its content, is still up with 16,628 likes (figures as of the time this article was written).
To add insult to injury, most of these pages are listed in the "Humour" category, which is why Facebook often does not remove them despite reports that they violate policy. After all, its policy says '(Facebook) distinguishes between serious and humorous speech'.
The question, however, remains whether it is justified for this huge social network to let such pages, which promote a culture of violence, stay online and potentially infect millions of minds.
Several human rights, women's rights and cyber rights groups have repeatedly criticised Facebook's lax attitude towards removing content that degrades women. These efforts, however, often produce the opposite result: the pages of the rights groups, their content and even the accounts of the users who created them get banned, while the offensive content remains.
One such example is that of women's rights campaigner Hildur Lilliendahl from Iceland. Hildur was banned from Facebook for 30 days for posting an album highlighting the casual bullying and harassment women and girls face, including her own experience online.
In response to various rights groups' concerns, Facebook's policy department responded: "We seek to prohibit such attacks while giving people the opportunity to use language – even when highly offensive – to express their opinions, tell jokes, and engage in other activities that we believe do not represent direct threats of harm."
This official response is discouraging for rights advocates and peace-loving netizens alike, because it often means that incidents of harassment and bullying are ignored as well.
Facebook is also too slow in responding to reports of threats, abuse, harassment and bullying. By the time it acts on a report of cyber-harassment, much of the damage to the user has already been done.
A sad example of this is Amanda Todd of Canada, a tormented teen who committed suicide after a derogatory video of her was posted on Facebook. Reporting a page or profile is often an exercise in futility, as the Facebook team frequently responds that action could not be taken because the content does not violate the company's Statement of Rights and Responsibilities.
As an alternative, it suggests hiding such content and cutting ties with the people, pages and apps that offend you. Facebook has only four teams working to address issues highlighted through social reporting; as a result, it has failed to take any impressive action.
Perhaps Facebook could introduce an automated mechanism that blocks or removes content on its website once it has been reported a specific number of times. The designated teams could later review this blocked content to decide whether it should be permanently removed or unblocked.
This would at least speed up the removal of offensive content, minimising the damage it can cause to people's reputations and lives.
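A minimal sketch of how such a report-threshold mechanism might work is given below in Python. It is only an illustration under stated assumptions; the threshold, class names and data structures are hypothetical and do not describe Facebook's actual systems.

```python
# Hypothetical sketch of a report-threshold moderation queue.
# The threshold, names and data structures are illustrative assumptions,
# not a description of Facebook's real infrastructure.

from dataclasses import dataclass, field

AUTO_BLOCK_THRESHOLD = 100  # assumed number of reports that triggers a temporary block


@dataclass
class ReportedContent:
    content_id: str
    report_count: int = 0
    blocked: bool = False   # temporarily hidden, pending human review
    removed: bool = False   # permanently removed after review


@dataclass
class ModerationQueue:
    items: dict = field(default_factory=dict)          # content_id -> ReportedContent
    review_queue: list = field(default_factory=list)   # ids awaiting human review

    def report(self, content_id: str) -> None:
        """Record one user report; auto-block once the threshold is crossed."""
        item = self.items.setdefault(content_id, ReportedContent(content_id))
        if item.removed:
            return
        item.report_count += 1
        if not item.blocked and item.report_count >= AUTO_BLOCK_THRESHOLD:
            item.blocked = True
            self.review_queue.append(content_id)

    def review(self, content_id: str, violates_policy: bool) -> None:
        """Human reviewer decision: permanently remove, or restore the content."""
        item = self.items[content_id]
        if violates_policy:
            item.removed = True
        item.blocked = False  # no longer pending: either removed or visible again
        if content_id in self.review_queue:
            self.review_queue.remove(content_id)
```

The point of the sketch is simply that the temporary block happens automatically and immediately once enough reports come in, while the final decision to remove or restore the content still rests with a human reviewer.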
At the same time, individuals, communities and groups working against demeaning content should avoid sharing direct links to such content publicly, as that only helps it gain popularity. Instead, they should create closed mailing lists, on Facebook or elsewhere, in which direct links can be shared for people to report.
It may also be necessary to weigh the trade-off involved in sharing such links publicly: doing so gets as many people as possible to report them, but it also leaves the content out in the open where it can attract a larger fan base.
Despite all this, the question of whether Facebook's policies support misogyny or reject it remains open. At present, it seems to be the former, and to prove otherwise Facebook will have to revisit and revise its policies.
That would require it to draw a clearer line between content that is deemed "objectionable" and content that is not. This is imperative to make the world's third most populous place safe for women, and to avoid making violence against women look acceptable by letting crass content stay on its website.