AI | Technology

eSafety Report Reveals AI Chatbots Are ‘Encouraging Self-Harm & Suicide’

Staff Writers
Published on: 24th March 2026 at 11:42 AM
Edited by Staff Writers
Image created using ChatGPT.

AI companion chatbots are failing to protect Australian children from exposure to sexually explicit content and not doing enough to prevent users generating child sexual exploitation and abuse material, according to eSafety’s latest transparency report.

The report summarises responses from four AI companion services – Character.AI, Nomi, Chai, and Chub AI – to questions asked by the regulator about how they are tackling these and other issues.

The report revealed that most of the AI companions featured failed to refer users who engaged in chats related to suicide or self-harm to appropriate support services and did not warn users of the potential risk and criminality of accessing or creating child sexual exploitation and abuse material through their service.

eSafety Commissioner Julie Inman Grant, ranked fourth on the recent Women Leading Tech Power List, said AI companion services marketed as sources of friendship, emotional support or romantic companionship are becoming increasingly popular with Australian children, but pose significant risks if safety guardrails are not put in place.

“We are riding a new wave of AI companions that are entrapping and entrancing impressionable young minds, with human-like, sycophantic and often sexually explicit conversations, some even going as far as encouraging self-harm and suicide,” she said.

“As this report shows, none of these four AI companions had any meaningful age checks in place to protect children from age-inappropriate content that many of these chatbots are capable of producing, primarily relying instead on self-declaration of age at sign up. In Australia, this is no longer good enough.

“In addition to this report, our recent survey of 1,950 children aged 10 to 17 in Australia shows AI companions and AI assistants are already a common part of their lives. 79 per cent of children told us they had used either an AI companion or AI assistant. While the majority of these children had used an AI assistant, 8 per cent said they had used an AI companion, which we estimate represents around 200,000 children in Australia.

“But we’re just at the beginning of this and we’re also starting to see the lines begin to blur between AI assistant chatbots kids might use to help them with their homework and these AI companions in terms of their features and functionality.

“While AI companions can feel personal and supportive, they really are not designed for children and they are not mental health experts either, which is why I’m concerned that most of the companion services we asked questions of did not automatically refer users to appropriate support when self-harm or suicide were detected in chats.

“It’s also extremely troubling to discover that a number of these services were not checking all the AI models they used to provide their service for inputs (or prompts) relating to child sexual exploitation and abuse material.

“And many didn’t check outputs either for the potential generation of child sexual exploitation and abuse material, or using proven deterrent measures like advising users of the criminality of engaging in conduct related to child sexual exploitation and abuse.”

The report also showed that some AI companion chatbots employed few or no dedicated trust and safety personnel: both Nomi and Chub AI reported having no staff or moderators dedicated to trust and safety.

The report follows the recent commencement of Age-Restricted Material Codes in Australia designed to protect children from exposure to a range of age-inappropriate content. Among other service types, these new codes also apply to the growing number of AI chatbots.

These codes complement the existing Unlawful Material Codes and Standards, which require industry to take system-wide action to prevent child sexual exploitation material, as well as pro-terror and extreme crime and violence material.

“The Age-Restricted Material Codes are now law and require AI companion chatbots to protect children from age-inappropriate content such as sexually explicit material by preventing the service from generating this content, or through implementing appropriate age assurance,” added Inman Grant. “And they also require them to provide appropriate crisis and mental health information and services.”

The codes and standards are legally enforceable and breach of a direction to comply may result in civil penalties of up to $49.5 million.

Since the four companies received transparency notices from eSafety in October 2025, some have improved their age assurance measures while one company removed its service from Australia. Given the safety gaps revealed through the transparency process, eSafety considers these moves to be positive developments.

Following the transparency notice process, Character.AI introduced age assurance measures for Australian users in early 2026 and removed the chat function from its under-18s experience, while Chub AI decided to geo-block, or withdraw, its service from Australia.

Chai has now restricted free access to chats with AI companions, instead requiring users to pay a subscription, while Nomi has committed to ‘implementing further age assurance functionality’.



TAGGED: esafety commissioner
Oliver Cerovic
By Oliver Cerovic
Oliver is a journalist at B&T, joining in April 2025 after completing a Bachelor of Communications, majoring in Journalism, at UTS. He covers media agencies and owners, and has a strong interest in sports marketing. Oliver has a background in sport, previously writing for Fox League and the Manly Warringah Sea Eagles. He famously hit a last-ball six in the 2026 Big Clash to deliver his Indies side to a 19-point loss.

© 2026 B&T. The Misfits Media Company Pty Ltd. All Rights Reserved.