Opinion

Feels wins: Why AI Will Never Send You To Jail

Staff Writers
Published on: 12th November 2024 at 10:20 AM

In his latest for B&T, DDB Australia’s managing director of strategy and growth, Leif Stromnes, explains that while humans are just as fallible as machines, we are willing to forgive humans their mistakes, and we prefer to stick to sets of ‘morally right’ rules even when doing so leads to bad consequences.

In his seminal tome Thinking, Fast and Slow, Daniel Kahneman illustrates the fallibility of human decision-making with a study of parole rulings by Israeli judges just before and after lunch. The study found that when judges were “hangry”, i.e. just before their lunch break, their parole approvals dropped to virtually zero. Once they had eaten, approvals climbed back to around 65 per cent.

This is truly alarming. If you are applying for parole and yours is the final case before lunch, you are around 65 percentage points less likely to walk free than the lucky bugger whose case is heard first after lunch. Your biggest crime might in fact turn out to be your spectacular lack of timing.

Unlike human beings, AI-powered computers do not get hangry and do not experience fatigue. In fact, they don’t even require a lunch break. An ethical AI could, in principle, be programmed to reflect the values and ideals of an impartial agent. Free from human limitations and biases, such machines could even be said to make better decisions than us. So, what about an AI judge? Unfortunately for the pre-lunch parole applicants, this is not going to happen anytime soon. The problem is not with the machines; it’s with our own psychology.

Leif Stromnes

Artificial or machine decision-making is based on a cost-benefit algorithm: the decision with the best overall consequences (an approach known as consequentialism) is the decision the machine will always make. But humans are different. Our default is to follow a set of moral rules in which certain actions are “just wrong”, even if they produce good consequences.

Our distaste for consequentialism has been demonstrated across several psychological studies in which participants are given hypothetical dilemmas that pit consequentialism against a more rule-based morality. In the “footbridge dilemma”, for instance, participants are told a runaway train is set to kill five innocent people who are stuck on the tracks. It can be stopped with certainty by pushing a very large man, who happens to be standing on a small footbridge overlooking the tracks, to his death below, where his body will stop the train before it kills the other five. The vast majority of people believe it is wrong to push the man to his death in this case, despite the good consequences.

But this is only half the story. The minority of people in the study who were willing to coolly sacrifice a life for the greater good were rated as untrustworthy by the rest of the participants, a finding validated across nine further experiments with more than 2,400 subjects. It would seem that humans have a fundamental mistrust of machines when it comes to morality, because artificial machines lack the very features we use to infer trustworthiness in others. We prefer an irrational commitment to certain rules no matter the consequences, and we prefer people whose moral decisions are guided by social emotions like guilt and empathy. Being a stickler for the rules of morality says something deep about your character.

Another quirk of human versus machine psychology is our ability to forgive human mistakes, coupled with our almost total lack of tolerance for the same mistakes when they are made by a machine.

A Cruise robotaxi drives into some wet cement.

Empathy is enormously powerful in alleviating anger in a human-to-human interaction, but completely useless in a self-service technology failure. It’s why we get angrier at bots and automated self-service systems that mess us around than we do at humans, and why we are outraged when an autonomous vehicle kills an innocent pedestrian, despite this being a daily reality with human drivers.

This has profound implications for brands and marketing. The inexorable rise of AI, and the automation of most customer service tasks in the name of efficiency and cost control, means the default interaction is human to machine. But as we have learnt, we don’t like the way machines make decisions, and we are much less forgiving of the mistakes that machines make.

Whilst using machines will almost certainly drive efficiency up and mistakes down, the outcome might be a lack of trust in the integrity of the decision and, ironically, lower customer satisfaction. Even if machines were able to perfectly mimic human moral judgements, we would know that the computer did not arrive at its judgements for the same reasons we would.

This insight played out with the launch of a robotic barista café in Melbourne in 2017. Rationally, it made sense. The robot made perfect cup after perfect cup, didn’t call in sick and didn’t demand overtime on weekends. But every small imperfection was amplified without forgiveness, and after one year the café closed down. As one customer elegantly put it, “I just didn’t trust the robot barista to know how I really liked my coffee.”

Whilst AI and machine decisioning, with their efficiency and low error rates, will undoubtedly win the day, an emotionally satisfying customer approach might be to prioritise human-to-human contact in high-value social interactions. And automate everything else.

After all, to err is human, to forgive divine.


TAGGED: AI, DDB
