© 2025 B&T. The Misfits Media Company Pty Ltd.
Opinion

The Unconscious Bias In Facebook’s Moderation Problem

Staff Writers
Published on: 29th June 2017 at 8:02 AM

Following the recent leak of Facebook’s moderation practices, online community expert Venessa Paech (pictured below) covers the myriad challenges for the platform, and where to next for Facebook and its users, in this guest post for B&T…

Facebook’s moderation practices are in the news yet again, as more detail of their playbook was leaked to the public (aka, their users) recently. The leaked documents highlight alarming inconsistencies and a disdain for duty of care. But they also illuminate the bias at the heart of the problem.

Venessa Paech

Technology isn’t neutral. Like any other tool, it’s imbued with the assumptions, expectations, politics and predilections of those who create it. It’s shaped by the experiences and perspectives its makers have lived up to that moment. We’re getting better at recognising and responsibly managing unconscious bias in our businesses and organisations. But we’re pretty ordinary at calling it out when it comes to the social networks that feature prominently in our daily lives.

The standard push-back to more responsible content and user regulation on Facebook is concern over censorship (driven primarily by a nativist US perspective). This is, of course, a straw man argument. Facebook policies and positions already ‘censor’. Terms of service prohibit certain behaviours and activities. This is the beginning of curating a culture – at the margins.

When Facebook moderators remove a personal picture of a mother breastfeeding, but not a graphic rape or death threat, they’re making choices and creating cultural norms. These interventions are based on an explicit world view and legal lens particular to a small portion of actual users. They don’t consider community context and they ignore the fact they’re already a form of ‘censorship’.

When someone is intimidated or threatened out of participation, censorship is in effect. Women and other institutionally marginalised groups are routinely drummed out of the virtual room by armies of trolls, men’s rights activists or the loudest hater in the room at any given moment. For some it’s a matter of life and death, where speaking up or speaking out provokes the ultimate attempt to silence – doxing (posting personally identifiable details, like an address or phone number, with an invitation to harass or stalk that person).

Facebook reporting tools, though better than they once were, still suffer from an either/or, square-peg/round-hole problem, with inadequate categories to accommodate the range of abuse directed at women on the platform. And of course there are the all-too-common cases of people in crisis (again, usually women) seeking the connectivity and support of their personal social network, but needing to cloak their identity and online activity from abusive parties.

Facebook will tell you repeatedly that they are committed to improving their content reporting capabilities, making it ‘easier’ and ‘faster’ to raise alarm. But reports are still in relation to Facebook’s ‘standards’, and it’s there that unconscious bias needs tackling first. By prioritising profit over harm minimisation, by intentionally refusing to use available technology to quarantine graphic content, and by ignoring their wider unconscious bias, Facebook is complicit.

Twitter founder Evan Williams recently issued a sincere mea culpa for what he now understands as Twitter’s role in mobilising toxic behaviour. I like and admire Evan, but was surprised at the naiveté of his comment: “I thought once everyone could speak freely and exchange information and ideas, the world was automatically going to be a better place. I was wrong about that.” I’m pretty sure most women (and other institutionally marginalised voices) could have made a safe bet about what was in store – and offered some suggestions for better tools to help manage it.

Facebook has taken positive steps to scrutinise and address employer-side unconscious bias. But this doesn’t yet filter through to the platform itself. A diversity of perspectives in the room allows Facebook to more accurately map risk scenarios and desired user journeys. How many women or people of colour were involved when their now infamous moderation playbook was created? How many did they run it by to see if it tracked with lived experience? Were there anthropologists and ethnographers working alongside the lawyers? Diversity makes better products. It also makes safer, more equitable ones. Lessons from community management and the world of social science teach us that when people feel they can disclose without threat, they’re more likely to (ironically, revealing the kind of intimate data Facebook transforms into product, currency and share price).

Social and community professionals manage millions of groups across Facebook’s platform, not to mention the ‘accidental’ community managers who voluntarily administer local groups and communities of interest. One way to help scale the mountain of moderation is to engage these people in the creation of tools to manage and create a group culture that reflects their values and their needs. There are signs Facebook is starting to address this, with a ‘Communities Summit’ in Chicago connecting group admins with Facebook staff (including Zuckerberg himself). However, there is a rigorous application process and, for now at least, it’s only open to US admins. We watch with interest to see whether the conversation is about ‘getting more out of Facebook’, or listening to community management needs with an intent to act.

Imagine where we might be if this decade-old business had engaged community experts and others outside its filter bubble from the beginning. Imagine what a commitment to transparency could accomplish on all sides.

Venessa Paech is the Co-Founder of Swarm, Australia’s premier community management conference. Tickets now on sale at www.swarmconference.com.au


B&T is Australia’s leading news publication for the advertising, marketing, media and PR industries.
© 2025 B&T. The Misfits Media Company Pty Ltd. All Rights Reserved.