In this guest post, Quiip’s team lead and community consultant, Larah Kennedy, takes a look at the recent decision to hold media companies liable for defamatory comments left on their pages. Kennedy argues it should be a wake-up call for all brands that use social media platforms…
The recent first-round ruling by a NSW Supreme Court judge in the Dylan Voller defamation trial found media companies liable for comments left by others on their Facebook pages.
While this has many journos and news organisations up in arms, it reinforces what the Australian Community Management industry has known all along: organisations, brands and media outlets should be responsible for the things people say on their social media pages and online forums. This includes defamatory statements, discrimination, racism, sexism, religious vilification, obstruction of justice, disclosures of self-harm or suicide and much more. The list is long, and it’s only one reason moderation and active community management are fundamental for organisations operating in the social media space.
While organisations are potentially legally liable for these conversations and comments, I would take this a step further and argue they are also more widely responsible for ensuring the online spaces they create are safe and welcoming. We wouldn’t step into an organisation’s bricks-and-mortar establishment and expect to be verbally attacked or discriminated against by another patron. And if it did happen, we would certainly expect the business to intervene. Yet the comments sections of Facebook pages and groups are often free-for-alls where customers or readers can hurl abuse at each other from the safety of their keyboards. As businesses operating in the social media space, we can do better.
Moderation isn’t just about protecting an organisation from risk and liability (although that’s important); it is fundamental to ensuring that the social media space they’ve created is an inviting and pleasant place to be. It’s about protecting vulnerable populations. It’s about duty of care. It’s about reinforcing positive behaviours and denouncing bad ones. It’s about cultivating connections and building loyalty and advocacy. It’s so much more than just not getting sued. For an organisation to claim it is not aware of the comments or discussions happening in an online space it has created is lazy at best and downright negligent at worst.
Within the Australian Community Management industry are thousands of experienced and passionate community managers, working across media, large corporations, not-for-profits and government, who take the role and responsibility of moderating online spaces very seriously. In fact, there’s even a Code of Conduct outlining the ethical and legal framework we strive to operate within. More organisations need to look to community management experts and invest in moderation resourcing if they want to protect themselves from legal risk AND build safe and welcoming online spaces.
Facebook isn’t a broadcast channel; it’s a social network, and it’s time organisations wanting to play in that space took moderation and community management more seriously.