Canta Point: Should AI be your therapist?
Image by Youssef Naddam on Unsplash
Supplied by Joni Lester (she/her)
Opinion
Canta Point is a column written by the University of Canterbury Debating Society showcasing different sides of an issue and encouraging flexible thinking.
The society is open to members of all abilities. Visit the UC Debating Instagram @ucdebsoc for more information.
In favour of ChatGPT…
Although stigmas surrounding mental health have been broken down in recent years, there are still strong barriers to support services and a lack of access to information. Resources such as therapy sessions and treatments are incredibly expensive, preventing many people from receiving the support they need.
Because of the expense of these resources, many people have started searching for cheaper alternatives, one of these being AI, and more specifically ChatGPT. Using this platform as a “therapist” costs nothing, and it lets users receive advice drawn from numerous mental health sites, which they can use to identify their feelings and work through them.
Booking in-person therapy appointments is not only costly but can also mean long waits, as professionals often have limited availability. ChatGPT is available at all times of the day, with no waiting for an appointment, which is especially important when you are experiencing something that needs more urgent support.
Finally, it can be difficult to find a therapist you feel comfortable opening up to without fear of judgement. It can be easier to discuss personal and sensitive topics with ChatGPT, as you can sit down and talk through your feelings and dilemmas wherever and whenever you like.
Against ChatGPT…
While venting to ChatGPT about your latest menty b may relieve some pent-up frustration, turning to it for truthful and helpful advice on more serious mental health issues has the potential to do more harm than good. ChatGPT draws on data from a range of internet sources to answer its users’ questions. This poses a massive risk: if that data is biased, outdated, or unrepresentative, then the responses users receive will reflect those flaws. People could end up with advice that makes harmful generalisations, leaving them feeling even worse about their experiences.
Not only could the responses be biased and push stereotypes, but AI tools like ChatGPT lack source validation, meaning they can repeat information from unverified sites. This could give users a false perception of their mental health, leaving them unable to find real solutions.
A core part of in-person therapy is the ability to make a human connection and have discussions catered to you as an individual. While ChatGPT can pretend to have human instincts and imitate the compassion required of a therapist, it lacks a therapist’s ability to connect with personal stories and understand a range of circumstances.