Would you like to provide actual proof that your favorite toy benefits people's health before daring others to challenge you? The imagined data you've yet to provide can't possibly justify the harm it's causing by pushing people already on the edge toward suicide.
The article is paywalled, but it appears to concern the abuse of a cocktail of kratom, alcohol, and Xanax. I don't really think that's the same thing. Also, this feature isn't really about getting ChatGPT to start answering medical questions in the first place, since people are already doing that.
[Teenager died of overdose 'after ChatGPT coached him on drug-taking']