Is The Algorithm The New Gatekeeper Of Content?

Carat Sydney’s Head of Digital, Fiona Harrop, examines the flip side of our ever-increasing levels of content personalisation, and the role it plays in ‘attitude isolation’.

The increased personalisation of content on the internet, and our ability to harness it, is becoming the holy grail of marketing. For marketers, this personalisation has a huge number of benefits, driving more effective and efficient communications.

However, there is another side to this personalisation that we have a duty as marketers to think about. The internet is perceived as an open, democratic source of information, an enabler of diversity. The reality is less idealistic: the use of algorithms across multiple platforms means we are increasingly steered towards topics that reflect our own ideologies.

This leaves a gap, as we are no longer exposed to points of view that differ from our own. In marketing, we use this personalised approach to steer people towards products we know they will like, or believe they will like, based on life stage, demographics, behaviour and geographic location.

A number of terms have been coined for this personalisation, including ‘Social media bubble’, ‘filter bubble’ and ‘echo chamber’.

Researchers from Boston University have described an ‘echo chamber’ as “like-minded people who share controversial theories, biased views, and selective news.” The information is repeated back and forth until it ends up being believed as fact. The term is now being applied in the social media space as well.

Facebook is one such ‘echo chamber’, browsed by 1.13 billion people each day (11 million daily in Australia). The Facebook algorithm means the news and views in our feeds come from the people we follow, limiting our exposure to viewpoints that differ from our own.
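To see how that narrowing can happen mechanically, here is a minimal, hypothetical sketch in Python (emphatically not Facebook’s actual algorithm): posts are scored by how closely their topics match what a user has already engaged with, so each round of engagement tilts the next feed further towards the same views.

```python
# Hypothetical illustration of an engagement-based ranker, not any platform's real system.
from collections import Counter

def rank_feed(posts, engagement_history, top_n=3):
    """Score each post by overlap with the topics the user has already engaged with."""
    interest = Counter(engagement_history)  # e.g. {'remain': 3, 'sport': 1}
    scored = [(sum(interest[t] for t in post["topics"]), post) for post in posts]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [post for score, post in scored[:top_n]]

posts = [
    {"id": 1, "topics": ["remain"]},
    {"id": 2, "topics": ["leave"]},
    {"id": 3, "topics": ["remain", "economy"]},
    {"id": 4, "topics": ["sport"]},
]

# A user who has mostly engaged with 'remain' content keeps being shown 'remain' content,
# which generates yet more of the same engagement -- the feedback loop narrows the feed.
print(rank_feed(posts, ["remain", "remain", "remain", "sport"]))
```

Even this toy version shows the feedback loop: the ranking never penalises sameness, so dissenting content simply falls off the bottom of the feed.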

I experienced this first-hand with Brexit, where the majority of content in my news feed, and the opinions of the people I followed on Twitter, were all ‘vote remain’. Had my reading been limited to social media platforms, I would have been left with the false perception that this was the majority viewpoint. The final vote to leave proved otherwise.

This insular consumption of information and exchange of shared views is leading to ‘attitude polarisation’. Cass Sunstein, a Harvard law professor, has been studying these effects, highlighting how this narrow view of information in the social space is leading to a polarisation of opinion.

Being around people with similar attitudes, or in a social space where we only read similar opinions, simply makes that opinion seem more valid in our own minds.

Taking this a step further, could it be influencing people to become even more extreme in their views on a particular issue?

Is it easier to become more radicalised?

Polarisation has always been present, but is new media speeding up the way this information spreads?

From a political standpoint, Facebook likes and shares are now how voters communicate their political views. Those views are then reinforced as content skews towards those behaviours.

This removes the opportunity for people to be exposed to differing opinions and information; instead, they continue to be served content that backs their existing beliefs.

In January 2016, 44 per cent of US adults reported having learned about the 2016 presidential election from a social media channel, outpacing both local and national print newspapers. Following the candidates on these channels gave a view of the campaign through a narrow window.

CNN tagged Donald Trump as the first ‘social media president’. He strategically gained traction through social media and syndicated news networks like Breitbart and Infowars rather than trying to gain ground with the media elite, an approach now being heavily reviewed by other politicians.

Editors were the original gatekeepers of content; technology is now taking control in a far more efficient way. If algorithms continue to curate in this way, our society is not only at risk of being biased towards particular views, but this continued personalisation of media will be detrimental to diversity of thought.

We have a responsibility to ensure this evolves so that we get a balanced diet of content and information, not just reinforcement of our own beliefs and our social circle’s opinions and perspectives. As more channels become personalised, we need to address this before we reach the tipping point of a homogeneous society.

As Eli Pariser summarised, “if algorithms are going to curate the world for us, then… we need to make sure that they also show us things that are uncomfortable or challenging or important”.

We have a stake in the way this plays out, and we have the ability to take control back from the new gatekeeper of diversity of thought. We can actively seek out other views by clicking on links that share diverse opinions, following new people in the social sphere and opening up our news feeds to a wider array of topics. Don’t limit yourself.
