Therapy can be expensive, intimidating, or simply hard to access when you need it most. So I turned to something many of us already use daily: ChatGPT. For one week, I treated it as my digital therapist to see whether AI could offer any real mental health support: I shared my thoughts, sought emotional support, and tried to gauge whether it could genuinely help with my mental health.
A new Stanford University study raises significant concerns about AI-powered therapy chatbots, finding they may reinforce harmful stereotypes and give dangerous responses to vulnerable users. The research, set to be presented at an upcoming academic conference, tested five popular mental health chatbots, including Pi (7cups) and Therapist (Character.ai).