OK. This one might not be so everyday - at least not at the moment, anyhow. I have, however, started to have a play with ChatGPT role-playing as a therapist. This blog post includes some thoughts on my experiences and thinking.
Hi Terry,
I enjoyed reading your article.
How about asking ChatGPT to be a client, so that trainees can practice counselling by talking to it and asking for feedback?
I’ve tried it just now, using a panic attack as an example, and I asked whether it could give me feedback on how I demonstrated my CBT knowledge, skills, and therapeutic approach. It was really helpful. However, I noticed it can’t produce feedback on the therapeutic relationship, on how much insight was gained, or on therapy progress. There’s a lack of human emotion in the interaction: ChatGPT didn’t struggle a single bit in presenting its “problem”, as if it doesn’t “struggle” at all. I think the inevitable difference is that AI can never mimic “human psychological contact” unless it acts subjectively, from a conscious, imperfect mind.
Hi Solomon
Thanks for the response here. That is a really interesting idea, and definitely one that I'll have a play around with. I think that these tools could be great for training :)
Re: the imperfect mind. I once joked that we'd be able to create virtual therapists with a fallibility button (and associated disclaimer) that makes them more like real humans. Maybe I should patent that idea now.