It’s no secret that abuse and hate speech are a massive problem on Twitter.
And the social media platform is trying to address it, launching a new prompt that asks users to reconsider before they send something that might be deemed offensive.
Twitter has trained its algorithms to detect content that is usually associated with abuse, and the prompt asks the poster: “Want to review this before Tweeting?”
The user is then given the option to send the Tweet as-is, edit it, or delete it.
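The flow described above can be sketched in a few lines. This is a hypothetical illustration only: the classifier, word list, and function names (`looks_offensive`, `submit_tweet`) are assumptions for the sketch, not Twitter’s actual implementation.

```python
# Hypothetical sketch of the review-prompt flow. The toy word list
# stands in for Twitter's trained classifier; all names are assumed.

def looks_offensive(text: str) -> bool:
    """Stand-in for a classifier flagging language commonly
    associated with abuse."""
    flagged_terms = {"idiot", "stupid"}  # toy list for illustration
    return any(term in text.lower() for term in flagged_terms)


def submit_tweet(text: str, user_choice: str) -> str:
    """Return the action taken: 'sent', 'edited', or 'deleted'."""
    if looks_offensive(text):
        # User would see: "Want to review this before Tweeting?"
        if user_choice == "delete":
            return "deleted"
        if user_choice == "edit":
            return "edited"
    # Non-flagged tweets, or a user choosing to proceed, go through.
    return "sent"
```

The key point is that the prompt is advisory: the user can always choose to send the original reply unchanged.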
Twitter has been experimenting with the new tool for a year now and has found that prompting users to reconsider offensive tweets has improved the quality of conversation on the site.
The new tool is being rolled out this week.
“These tests ultimately resulted in people sending less potentially offensive replies across the service, and improved behaviour on Twitter,” the social media platform said.
“We learned that: if prompted, 34 per cent of people revised their initial reply or decided to not send their reply at all. After being prompted once, people composed, on average, 11 per cent fewer offensive replies in the future; if prompted, people were less likely to receive offensive and harmful replies back.”
Twitter also said the algorithm will take into consideration how often two accounts interact with one another, in an effort to recognise the ‘nuances’ and banter in conversation that exist on the platform.
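One plausible way to factor in interaction history, sketched below purely as an assumption (Twitter has not published its method), is to raise the prompt threshold between accounts that frequently reply to each other, so friendly banter is less likely to trigger the prompt.

```python
# Hypothetical sketch: relax the prompt threshold between accounts
# that interact often. Scores, weights, and caps are illustrative.

BASE_THRESHOLD = 0.5  # assumed offensiveness-score cutoff


def should_prompt(offense_score: float, interaction_count: int) -> bool:
    """Prompt only when the score exceeds a threshold that grows
    with how often the two accounts have interacted."""
    # Each prior interaction nudges the threshold up, capped at 0.9.
    threshold = min(0.9, BASE_THRESHOLD + 0.02 * interaction_count)
    return offense_score > threshold
```

Under this sketch, a borderline reply between strangers would trigger the prompt, while the same reply between long-time mutuals would not.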