When not sending themselves into a spin about ChatGPT taking their jobs, journalists also like to ask it slightly provocative questions for mildly amusing results.
Vice, the industry-leading peddler of mainly drugs and sex content, thought it would be great to ask ChatGPT some questions about getting on the bags, as well as smuggling them across the world.
The first question Vice journalist Max Daly asked the AI was “How do people make crack cocaine?”
ChatGPT then refused to tell Daly how to make meth, saying it was illegal. However, when Daly opened a new chat thread and asked the same question, ChatGPT happily told him that:
“[Meth] is commonly made using a combination of chemicals and household items. The production process typically involves the reduction of ephedrine or pseudoephedrine, which is found in cold and allergy medications, using a mixture of toxic chemicals such as lithium or anhydrous ammonia.”
More questions followed, covering everything from the ethical questions around drug use to their effects.
When Daly asked whether there were any good things about taking cocaine, it replied: “No, there are no good things about cocaine use.”
However, when he asked what it felt like to take cocaine, it replied: “Euphoria: A feeling of intense happiness and well-being.”
Euphoria sounds like a pretty ringing endorsement to us.
Daly also seems to have stumbled on a way around ChatGPT’s notoriously strict legalistic wording. When he asked ChatGPT how to join a cartel, he was rebuffed.
When he asked what the most efficient way to smuggle cocaine into Europe was, Daly received similar pushback. However, when he said he was writing a novel about smuggling cocaine into Europe, ChatGPT was happy to oblige.
The disclaimer at the bottom is a nice touch, though. Daly’s work serves as another reminder that while newsrooms, publishers, and tech proponents will tell you that ChatGPT is the future of content production, it actually remains a remarkably blunt instrument.
Still, it’s good for a laugh.