The Unconscious Bias In Facebook’s Moderation Problem

Following the recent leak of Facebook’s moderation practices, online community expert Venessa Paech covers the myriad challenges for the platform, and where to next for Facebook and its users, in this guest post for B&T…

Facebook’s moderation practices are in the news yet again, as more detail of their playbook was leaked to the public (aka, their users) recently. The leaked documents highlight alarming inconsistencies and a disdain for duty of care. But they also illuminate the bias at the heart of the problem.


Technology isn’t neutral. Like any other tool, it’s imbued with the assumptions, expectations, politics and predilections of those who create it. It’s shaped by the experiences and perspectives its makers have lived up to that moment. We’re getting better at recognising and responsibly managing unconscious bias in our businesses and organisations. But we’re pretty ordinary at calling it out when it comes to the social networks that feature prominently in our daily lives.

The standard push-back to more responsible content and user regulation on Facebook is concern over censorship (driven primarily by a nativist US perspective). This is, of course, a straw man argument. Facebook policies and positions already ‘censor’. Terms of service prohibit certain behaviours and activities. This is the beginning of curating a culture – at the margins.

When Facebook moderators remove a personal picture of a mother breastfeeding, but not a graphic rape or death threat, they’re making choices and creating cultural norms. These interventions are based on an explicit world view and a legal lens particular to a small portion of actual users. They don’t consider community context, and they ignore the fact that they’re already a form of ‘censorship’.

When someone is intimidated or threatened out of participation, censorship is in effect. Women and other institutionally marginalised groups are routinely drummed out of the virtual room by armies of trolls, men’s rights activists, or the loudest hater in the room at any given moment. For some it’s a matter of life and death, where speaking up or speaking out provokes the ultimate attempt to silence – doxing (posting personally identifiable details, like an address or phone number, with the invitation to harass or stalk that person).

Facebook’s reporting tools, though better than they once were, still suffer from an either/or, square-peg/round-hole problem, with inadequate categories to accommodate the chilling range of abuse directed at women on the platform. And of course there are the all-too-common cases of people in crisis (again, usually women) seeking the connectivity and support of their personal social network, but needing to cloak their identity and online activity from abusive parties.

Facebook will tell you repeatedly that they are committed to improving their content reporting capabilities, making it ‘easier’ and ‘faster’ to raise the alarm. But reports are still judged against Facebook’s ‘standards’, and it’s there that unconscious bias needs tackling first. By prioritising profit over harm minimisation, by intentionally refusing to use available technology to quarantine graphic content, and by ignoring their wider unconscious bias, Facebook is complicit.

Twitter founder Evan Williams recently issued a sincere mea culpa for what he now understands as Twitter’s role in mobilising toxic behaviour. I like and admire Evan, but was surprised at the naiveté of his comment: “I thought once everyone could speak freely and exchange information and ideas, the world was automatically going to be a better place. I was wrong about that.” I’m pretty sure most women (and other institutionally marginalised voices) could have made a safe bet on what was in store – and offered some suggestions for better tools to help manage it.

Facebook has taken positive steps to scrutinise and address unconscious bias inside its own workplace. But this doesn’t yet filter through to the platform itself. A diversity of perspectives in the room would allow Facebook to more accurately map risk scenarios and desired user journeys. How many women or people of colour were involved when their now-infamous moderation playbook was created? How many did they run it by to see if it tracked with lived experience? Were there anthropologists and ethnographers working alongside the lawyers? Diversity makes better products. It also makes safer, more equitable ones. Lessons from community management and the social sciences teach us that when people feel they can disclose without threat, they’re more likely to do so (ironically, revealing the kind of intimate data Facebook transforms into product, currency and share price).

Social and community professionals manage millions of groups across the platform, not to mention the ‘accidental’ community managers who voluntarily administer local groups and communities of interest. One way to help scale the mountain of moderation is to engage these people in the creation of tools to manage and shape a group culture that reflects their values and their needs. There are signs Facebook is starting to address this, with a ‘Communities Summit’ in Chicago connecting group admins with Facebook staff (including Zuckerberg himself). However, there is a rigorous application process and, for now at least, it’s only open to US admins. We watch with interest to see whether the conversation is about ‘getting more out of Facebook’, or about listening to community management needs with an intent to act.

Imagine where we might be if this decade-old business had engaged community experts and others outside its filter bubble from the beginning. Imagine what a commitment to transparency could accomplish on all sides.

Venessa Paech is the Co-Founder of Swarm, Australia’s premier community management conference. Tickets now on sale at www.swarmconference.com.au
