And yet, ChatGPT was up for the challenge. Caroline Plumer, a psychotherapist and founder of CPPC London, took a look at my conversation with the AI and found parts of it “alarming”. “There’s definitely information in here that I agree with,” she says, “such as boundary setting not being about controlling others’ behaviour. Overall, though, the suggestions feel very heavy-handed, and the system seems to have immediately categorised you, the user, as ‘the good guy’ and your family as ‘the bad guys’. Oftentimes with clients there is a need to challenge and explore how they themselves may also be contributing to the issue.”

Plumer adds that exploring dysfunctional family issues can take “weeks, months, or even years of work”, not a matter of minutes. She also thinks that getting all of this information in one go could be overwhelming. Even if it’s seemingly more economical, a person might not be able to handle all of the suggestions, let alone process and action them, when they’re delivered at rapid-fire speed. Plumer says it isn’t helpful to have an abundance of generic suggestions that don’t truly account for nuance or individuality. At least, not in the way a therapist you’d see over a period of time can. On top of this, the environmental impact of AI is huge.

“I appreciate that lots of people don’t have the privilege of having access to therapy. However, if someone is really struggling with their mental health, this might well be enough to set them off down an even more detrimental and potentially destructive path.”