Following the recent leak of Facebook’s moderation practices, online community expert Venessa Paech covers the myriad challenges for the platform, and where to next for Facebook and its users, in this guest post for B&T…
Facebook’s moderation practices are in the news yet again, after more detail of their playbook was recently leaked to the public (aka, their users). The leaked documents highlight alarming inconsistencies and a disdain for duty of care. But they also illuminate the bias at the heart of the problem.
Technology isn’t neutral. Like any other tool, it’s imbued with the assumptions, expectations, politics and predilections of those who create it. It’s shaped by the lived experience and perspectives of those doing the tooling, up to that moment. We’re getting better at recognising and responsibly managing unconscious bias in our businesses and organisations. But we’re pretty ordinary at calling it out when it comes to the social networks that feature prominently in our daily lives.
The standard push-back to more responsible content and user regulation on Facebook is concern over censorship (driven primarily by a nativist US perspective). This is, of course, a straw man argument. Facebook policies and positions already ‘censor’. Terms of service prohibit certain behaviours and activities. This is the beginning of curating a culture – at the margins.
When Facebook moderators remove a personal picture of a mother breastfeeding, but not a graphic rape or death threat, they’re making choices and creating cultural norms. These interventions are based on a world view and legal lens particular to a small portion of actual users. They don’t consider community context, and they ignore the fact that they’re already a form of ‘censorship’.
When someone is intimidated or threatened out of participation, censorship is in effect. Women and other institutionally marginalised groups are routinely drummed out of the virtual room by armies of trolls, men’s rights activists or the loudest hater in the room at any given moment. For some it’s a matter of life and death, where speaking up or speaking out provokes the ultimate attempt to silence – doxing (posting personally identifiable details like an address or phone number, with the invitation to harass or stalk that person).
Facebook’s reporting tools, though better than they once were, still suffer from an either/or, square-peg/round-hole problem, with inadequate categories to accommodate the chilling range of abuse directed at women on the platform. And of course there are the all-too-common cases of people in crisis (again, usually women) seeking the connectivity and support of their personal social network, but needing to cloak their identity and online activity from abusive parties.
Facebook will tell you repeatedly that it is committed to improving its content reporting capabilities, making it ‘easier’ and ‘faster’ to raise the alarm. But reports are still assessed against Facebook’s ‘standards’, and it’s there that unconscious bias needs tackling first. By prioritising profit over harm minimisation, by intentionally refusing to use available technology to quarantine graphic content, and by ignoring its wider unconscious bias, Facebook is complicit.
Twitter founder Evan Williams recently issued a sincere mea culpa for what he now understands as Twitter’s role in mobilising toxic behaviour. I like and admire Evan, but was surprised at the naiveté of his comment: “I thought once everyone could speak freely and exchange information and ideas, the world was automatically going to be a better place. I was wrong about that.” I’m pretty sure most women (and other institutionally marginalised voices) could have safely predicted what was in store, and offered some suggestions for better tools to help manage it.
Facebook has taken positive steps to scrutinise and address unconscious bias in its own workplace. But this doesn’t yet filter through to the platform itself. A diversity of perspectives in the room would allow Facebook to more accurately map risk scenarios and desired user journeys. How many women or people of colour were involved when its now infamous moderation playbook was created? How many did they run it by to see if it tracked with lived experience? Were there anthropologists and ethnographers working alongside the lawyers? Diversity makes better products. It also makes safer, more equitable ones. Lessons from community management and the social sciences teach us that when people feel they can disclose without threat, they’re more likely to do so (ironically, revealing the kind of intimate data Facebook transforms into product, currency and share price).
Social and community professionals manage millions of groups across the platform, not to mention the ‘accidental’ community managers who voluntarily administer local groups and communities of interest. One way to help scale the mountain of moderation is to engage these people in the creation of tools to manage and create a group culture that reflects their values and their needs. There are signs Facebook is starting to address this, with a ‘Communities Summit’ in Chicago connecting group admins with Facebook staff (including Zuckerberg himself). However, there is a rigorous application process and, for now at least, it’s only open to US admins. We watch with interest to see whether the conversation is about ‘getting more out of Facebook’, or about listening to community management needs with an intent to act.
Imagine where we might be if this decade-old business had engaged community experts and others outside its filter bubble from the beginning. Imagine what a commitment to transparency could accomplish, on all sides.
Venessa Paech is the Co-Founder of Swarm, Australia’s premier community management conference. Tickets now on sale at www.swarmconference.com.au