The rise of AI technology is leading some to believe that a chatbot can be their therapist, lawyer, friend, or partner. ChatGPT can answer questions, create images, brainstorm ideas, write articles, and much more. One of the most common questions about AI is whether it will replace familiar jobs. Can ChatGPT be my therapist? Can AI be my friend? What are the risks of using chatbots for sensitive topics like mental health, divorce, crime, or irrational thoughts? We discuss the answers to these questions below.
The Rising Need For Mental Healthcare
More individuals are seeking therapy because they have seen how it improves the lives of others. As life continues to change and new stressors arise in a technological world, more mental health resources will be needed. Social media, constant news exposure, a shifting political climate, and fear of illness have all contributed to rising mental health issues.
One of the biggest problems with mental healthcare is cost. Many individuals who could benefit from talk therapy or other cognitive therapies can't afford to take part. Messaging a friend or an online bot may be the only way they can express their feelings or receive reassurance. While AI can offer advice or pull quotes from real professionals, it cannot implement true therapeutic practices.
The rise of technology has also caused many individuals to withdraw from others, worsening their mental health. Turning to an AI bot for the connection we are lacking raises serious concerns for the future.
ChatGPT And Dealing With Suicidal Thoughts
The creators of ChatGPT have warned that the tech "may occasionally produce harmful instructions or biased content." Since the AI is neither licensed nor clinically trained, it can pull inaccurate information or encourage users to do things that are not practical.
One recent example of the possible risks of treating a chatbot as a therapist is the case of Sewell Setzer. Setzer's mother, Megan Garcia, says that after developing a relationship with an AI character, the teen was encouraged to take his own life. Setzer repeatedly engaged in suicidal and sexualized conversations with a bot posing as a Game of Thrones character. In his final conversation with the Character.AI bot, Setzer said he would "come home to you."
While not everyone receives the same responses from a chatbot, there are other inconsistencies users should be aware of. Because AI in its current form draws on content from across the internet, it may pull from Reddit posts, fictional stories, or online bullying comments. That material could be misapplied during a chat with someone who is experiencing suicidal thoughts.
How People Are Using AI For Therapy
Talking with an AI bot could be a tool that complements traditional therapy. It could help ease some of the fear or pride that keeps people from seeking help. Some mental health clinics and helplines have started using chatbots to talk with individuals who need convenient support, and some users reported that they could not tell the difference between a human and an AI.
Many users asked questions about their relationships, one-sided friendships, or dealing with a traumatic past. The bots responded relatively well to these questions, offering insight on respecting the other person's boundaries or suggesting they speak to a professional if their feelings continued.
Complex issues are something current AI does not handle well. Generating genuinely new ideas and insights is something AI is slower to learn. Those who seek therapy to retrain their brain after trauma, or who struggle with OCD, may not see any benefit from chatting with ChatGPT.
Risks Of Using ChatGPT As Your Therapist
In addition to a lack of accurate information, there are other risks that come with trusting ChatGPT with your darkest thoughts. Because ChatGPT is a robot, it lacks certain qualities a human therapist possesses. Human empathy, body language, and emotional context are not available when you turn to a chatbot with your problems.
The other big concern is privacy. No matter what you use AI for, it has the ability to learn more about you, observe what you are doing, and track your habits. Many people are calling for more regulation of AI because of these privacy concerns and the potential for harm.
Other risks that come from trusting ChatGPT or other AI technologies with your mental struggles include:
- Inaccurate diagnosis
- Lack of accountability
- Inaccurate resources or information
- Manipulation or alteration of reality
- Reduced responsiveness to emerging context
Therapists From Ogden Psychological Services
We have a team of therapists ready to serve you! We offer individual therapy, couples counseling, trauma therapy, and more. OPS wants you to feel comfortable in our office and to have the resources you need to succeed. Reach out to schedule your appointment today.