The Greatest Lie Told On The Internet Is “I Have Read And Agree To The Terms And Conditions”


In this guest post, Katja Forbes, international director on the Interaction Design Association board and MD of DesignIt ANZ, discusses the challenge of trusting invisibility in a highly tech-driven milieu …

There has been a lot of talk lately in my industry about trusting invisibility, but what does that actually refer to and how does it affect everyone else? The online environment came under scrutiny when we collectively realised that we don’t understand the invisible decisions being made by systems, algorithms and digital ecosystems. If we don’t understand how it works, yet can’t stay away from it if we want to participate in digital society, that’s problematic. Can these invisible, non-human decision-makers be trusted?

Invisibility itself has become a constant in the world of technology. When the wireless (radio) and television were first invented and made their way into homes, I expect there was much more interest in how these things actually worked. Now we have phones that offer something to everyone’s taste, and as long as they do what they are meant to, and we can access our vital social media accounts, listen to our podcasts and reply to our emails, there is little discussion anymore about how they do the job, only whether they actually do. We sacrifice privacy for functionality, and the greatest lie told on the internet is “I have read and agree to the terms and conditions”.

Forbes and Juniper have estimated that 2.5 quintillion (18 zeroes!) bytes of data are created every single day. Can we, as consumers, even comprehend this? Going back to that one wireless and television: when inventors seemed to be inherently good people who always had our best interests at heart, we had no reason to distrust them. Nowadays, however, almost everything works behind the scenes, invisibly, and research by Edelman tells us that trust in many countries is decreasing.

The Edelman barometer highlights a number of factors. Trust has declined year-on-year in 10 of 15 sectors. In 20 of 28 countries, including Australia, there is a general distrust in institutions. Of those, the United States has experienced the biggest drop, from 52 per cent to 43 per cent. Clearly, we are starting to have misgivings about what we can’t see, and rightly so. You can imagine the difficulty that creates for people like me who have been engaged to design customer experiences.

Humans rely on trust for positive interactions across many parts of life: relationships, systems and human connection. Now we have customers who are effectively mistrustful of the experience they need to have with an organisation because it involves technology. People must go online and engage with an organisation in order to work with it, but instead of systems simply doing as asked, we see data theft, the perpetuation of sexist and racist biases, and the malicious use of people’s personal information for profit.

So what do we do, those of us tasked with creating interactions that involve the online world, knowing that a portion of customers are mistrustful? Designers now face the enormous task of persuading end-users that it’s OK to trust this invisible technology and algorithmic decision-making. This was never considered in design training just a few years ago, yet now it’s the most important part of the job. So what do we know about trust?

Firstly, it’s earned. Secondly, you (or the piece of technology in question) need to do what you say you’re going to do. In other words, action needs to match intention. If a system starts doing something different, even if it’s better, alarm bells start ringing. Transparency also builds trust. We consider it fair, and legally require, that a food manufacturer list its ingredients; why not list how an algorithm makes its decisions?

Trust is a two-way relationship: it is mutual. When customers trust companies with their data and attention, companies must not only provide value but also extend trust in return. And because trust can break, companies need to prepare for breaches and be ready to rebuild it.

Trust is also emotional: a feeling that belongs to living organisms. Yet human interactions and decisions are increasingly being taken over by machines. Trust requires empathy, a quality technology doesn’t have (yet). You might be communicating with a machine, but you are not connecting, because machines don’t give feedback or understand emotions the way people do.

Smart customer experience specialists and designers know this, and they take trust into account when planning their solutions. No longer is trust something that can be taken for granted by either the organisation or the designer. Enough people now work in this field that bad actors are common enough for customers to recognise the risk and ask more questions.

The answer? Ensure that building trust is part of the brief when creating something that makes invisible decisions. We are at a defining moment in time. Change is happening at incredible speed. We can’t predict where we are going, but we can help shape the direction with honest, transparent interactions. We need to design what’s happening behind the scenes as well as what’s going on up-front, because we want technology to improve our lives, not control them.
