The Unconscious Bias In Facebook’s Moderation Problem

Following the recent leak of Facebook’s moderation practices, online community expert Venessa Paech (pictured below) covers the myriad challenges for the platform, and where to next for Facebook and its users, in this guest post for B&T…

Facebook’s moderation practices are in the news yet again, as more detail of their playbook was leaked to the public (aka, their users) recently. The leaked documents highlight alarming inconsistencies and a disdain for duty of care. But they also illuminate the bias at the heart of the problem.

[Pictured: Venessa Paech]

Technology isn’t neutral. Like any other tool, it’s imbued with the assumptions, expectations, politics and predilections of those who create it. It’s shaped by the experiences and perspectives its makers have lived up to that moment. We’re getting better at recognising and responsibly managing unconscious bias in our businesses and organisations. But we’re pretty ordinary at calling it out when it comes to the social networks that feature prominently in our daily lives.

The standard push-back to more responsible content and user regulation on Facebook is concern over censorship (driven primarily by a nativist US perspective). This is, of course, a straw man argument. Facebook policies and positions already ‘censor’. Terms of service prohibit certain behaviours and activities. This is the beginning of curating a culture – at the margins.

When Facebook moderators remove a personal picture of a mother breastfeeding, but not a graphic rape or death threat, they’re making choices and creating cultural norms. These interventions are based on an explicit world view and legal lens particular to a small portion of actual users. They don’t consider community context and they ignore the fact they’re already a form of ‘censorship’.

When someone is intimidated or threatened out of participation, censorship is in effect. Women and other institutionally marginalised groups are routinely drummed out of the virtual room by armies of trolls, men’s rights activists or the loudest hater in the room at any given moment. For some it’s a matter of life and death, where speaking up or speaking out provokes the ultimate attempt to silence – doxing (posting personally identifiable details like an address or phone number, with an open invitation to harass or stalk that person).

Facebook’s reporting tools, though better than they once were, still suffer from an either/or, square-peg-in-round-hole problem, with inadequate categories to accommodate the full range of abuse directed at women on the platform. And of course there are the all-too-common cases of people in crisis (again, usually women) seeking the connectivity and support of their personal social network, but needing to cloak their identity and online activity from abusive parties.

Facebook will tell you repeatedly that they are committed to improving their content reporting capabilities, making it ‘easier’ and ‘faster’ to raise the alarm. But reports are still assessed against Facebook’s ‘standards’, and it’s there that unconscious bias needs tackling first. By prioritising profit over harm minimisation, by refusing to use available technology to quarantine graphic content, and by ignoring their wider unconscious bias, Facebook is complicit.

Twitter founder Evan Williams recently issued a sincere mea culpa for what he now understands as Twitter’s role in mobilising toxic behaviour. I like and admire Evan, but was surprised at the naiveté of his comment: “I thought once everyone could speak freely and exchange information and ideas, the world was automatically going to be a better place. I was wrong about that.” I’m pretty sure most women (and other institutionally marginalised voices) could have placed a safe bet on what was in store, and offered some suggestions for better tools to help manage it.

Facebook has taken positive steps to scrutinise and address unconscious bias within its own workforce. But this doesn’t yet filter through to the platform itself. A diversity of perspectives in the room would allow Facebook to more accurately map risk scenarios and desired user journeys. How many women or people of colour were involved when their now-infamous moderation playbook was created? How many did they run it by to see if it tracked with lived experience? Were there anthropologists and ethnographers working alongside the lawyers? Diversity makes better products. It also makes safer, more equitable ones. Lessons from community management and the social sciences teach us that when people feel they can disclose without threat, they’re more likely to (ironically, revealing the kind of intimate data Facebook transforms into product, currency and share price).

Social and community professionals manage millions of groups across Facebook’s platform, not to mention the ‘accidental’ community managers who voluntarily administer local groups and communities of interest. One way to help scale the mountain of moderation is to engage these people in the creation of tools to manage and create a group culture that reflects their values and their needs. There are signs Facebook is starting to address this, with a ‘Communities Summit’ in Chicago connecting group admins with Facebook staff (including Zuckerberg himself). However, there is a rigorous application process and, for now at least, it’s only open to US admins. We watch with interest to see whether the conversation is about ‘getting more out of Facebook’, or about listening to community management needs with an intent to act.

Imagine where we might be if this decade-old business had engaged community experts and others outside its filter bubble to begin with? Imagine what a commitment to transparency could accomplish on all sides?

Venessa Paech is the Co-Founder of Swarm, Australia’s premier community management conference. Tickets now on sale at www.swarmconference.com.au

 
