Who’s To Blame? The Ins And Outs Of Social Media Defamation


Facebook and the NSW Supreme Court don’t see eye to eye on who is responsible for defamatory comments made on the platform.

Last week the court ruled Australian media giants News Corp and Nine would be held responsible for defamatory comments made on their Facebook pages last year concerning former Northern Territory youth detainee Dylan Voller.

In response, Facebook senior executive Richard Allan said responsibility should lie with the individual who makes the comment.

It’s an issue that has sparked debate around content moderation and poses the question: who is responsible for defamatory comments made on digital platforms?

Much of the court’s ruling against the media outlets rested on the fact that these companies use social media platforms for commercial purposes, while Facebook and Allan argue “primary responsibility” lies with the person making the comment.

Social media expert and Pepper IT founder Ryan Shelley, who prepared a report for the court, can see both sides.

“For business, it is a stern reminder that you may be held responsible for what you publish – or allow to be published – on social media,” Shelley told B&T.

“You should at the very least have procedures in place where you are making an effort to monitor and possibly hide or delete any inappropriate content posted by others.”

In his initial report for the case, Shelley said the likes of Nine and News Corp have “significant available resources” to monitor the comments on their social media posts. However, this is not the case for everyone.

Evidence suggests Australia’s community managers – those often responsible for monitoring social media posts – find it difficult to connect with platform providers like Facebook.

In a recent survey, just 13 per cent of community managers said it is easy to communicate with a platform when there is a problem.

This is coupled with an industry-wide sense of underappreciation, with only 22 per cent of community professionals reporting their role is understood and valued by their organisation.

According to Allan, Facebook itself has a responsibility in certain cases to monitor content posted on the platform.

However, this can be difficult, said Shelley.

“Due to the vast amounts of information, it would be technically difficult for a platform like Facebook to monitor all content,” he said.

“While they may be actively monitoring for specific keywords at specific times, they largely rely on users to report content.”
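The keyword monitoring Shelley describes amounts to a simple watchlist filter applied to incoming comments. Below is a minimal, hypothetical sketch in Python of how a page’s community manager might automate that first pass; the watchlist, sample comments and moderation actions are illustrative assumptions, not Facebook’s actual tooling.

```python
# A minimal sketch of keyword-based comment monitoring.
# The watchlist terms and moderation actions are illustrative only.

WATCHLIST = {"defamatory-term", "slur-example"}  # hypothetical flagged terms

def review_comment(comment: str) -> str:
    """Return a moderation decision for a single comment."""
    # Normalise each word: strip trailing punctuation, lowercase.
    words = {w.strip(".,!?").lower() for w in comment.split()}
    if words & WATCHLIST:
        return "hide-pending-review"  # matched a watched keyword
    return "publish"  # otherwise, rely on user reports

comments = [
    "Great article, thanks for sharing!",
    "That bloke is a slur-example and everyone knows it.",
]

for c in comments:
    print(f"{review_comment(c):>20}  <- {c}")
```

Even a crude filter like this only catches exact keyword matches, which is why, as Shelley notes, platforms still lean heavily on user reports for everything the watchlist misses.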

Hold individuals accountable

“For individuals, it’s a reminder that there could be consequences for your online posts,” Shelley said. “Take a moment and consider whether you would want your post read aloud in court.”

But this approach can also be problematic.

Fake profiles can be set up with relative ease on Facebook, while one need only scan Twitter to see the vast number of active anonymous accounts.

“Identifying the true owner of a deliberately faked social account can be challenging,” said Shelley.

“It’s relatively easy for a would-be fake to create a throwaway email account, and then set up a social account using that fake email.”

As Shelley points out, there are ways for authorities to trace fake accounts in such cases. However, even here “a savvy user could also further obscure their identity by routing their web traffic via a VPN.”

There is every likelihood the court’s decision will be appealed, meaning the final outcome remains unknown.

In the meantime, it seems both individuals and businesses must remain vigilant online.



