
To be real, no. All it's doing is feeding your confirmation bias. For example, if you tell ChatGPT about an argument you had from YOUR perspective, of course the AI will side with you. It could be used as an outlet, like a place to just dump all of your frustrations, or as a conversationalist, but never use AI as a tool to self-diagnose or seek psychological help. If you really are struggling, then you need to talk to *human* beings or professionals. Only seeking a machine's help will just push you further into isolation, especially in vulnerable times.
I am saying this as a psychology major. I'm seeing a lot of people use ChatGPT for things like that, and it ends up being really bad for them. I think it's dystopian to go talk to AI about your relationship problems instead of sitting down and having a mature conversation with your partner.
P.S. I'm not telling you what you should or shouldn't do. Just a word of advice: don't rely heavily on a machine that cannot feel or understand what you are feeling.

Thank you so much for your elaborate and serious answer to my unserious question (I also majored in Psych back in college). Everything you said is a great point. Yeah, everybody should know that heavily relying on AI for any psychological needs probably isn't doing as much "good" for one's mental health as they think or believe it would.
But taking into account that professional help is indeed hella expensive and a long process (which makes it even more expensive) that the majority of people can't afford, I don't think using AI is 100% bad. It's better to use it than be left alone with just you and your thoughts. It is a really good point to only use it as a safe space for venting/crashing out, or like an emotional diary/journal except you get a response. (To add: if you're looking for more than just validation, it's good to set it to give unbiased responses to break whatever emotional or mental headspace you're currently in and get a different perspective. Hopefully that clears things up for you.)
I know a lot of people use ChatGPT therapy as a joke. But for those people who may be seriously considering AI for therapy: AI really isn't it, our technology isn't that far ahead yet. Also, it's not only about AI lacking the capacity for emotional compassion. Therapy involves complex techniques and processes that take years to master and understand in order to be effective, and I don't think ChatGPT was made for or operates on those things (not saying all professionals are effective at practicing therapy, but then what more can you expect from ChatGPT?)
So to put it simply, AI or ChatGPT can only be a band-aid solution at best if you're really in a tight pinch. HOWEVER, again, if you keep that band-aid on a nasty wound for too long, chances are you'll get a terrible infection from it. So be careful out there, people.
Anddd lastly, I know it's easy to say "seek professional help," but actually getting that help is hard due to the different circumstances and factors each individual is going through. So please don't be pressured. Just do what you think works for you and is accessible to you, but don't rely 100% on those things for your health in general.

I'm so sorry. It's so hard to tell tone sometimes on the internet. And with people using AI everywhere (school, art, writing, etc.), I thought someone deadass wanted to use ChatGPT as a replacement for actual therapy (and I've seen people using AI chatbots as their actual therapists, so that's why I got concerned). My mistake.
I completely agree. I thought I mentioned therapy techniques, but I didn't, and I'm glad that *you* did. I was replying at 7am with no sleep, so making grammar mistakes or leaving some points not fully developed happens all the time.
I said earlier that people can use ChatGPT (or other AI-powered tools) for small things, but shouldn't rely on it too much for serious things. Once or a few times is okay, for example, when you're experiencing depression and need someone to talk to, but if a person heavily relies on AI in that vulnerable state, it just pushes them deeper into isolation. Also, ChatGPT makes mistakes, because the internet is a whirlpool of information, including fake sources alongside actual research, and AI has no means to tell what's true or not; it just consumes it and gives it back to the user. So yeah, the band-aid is a great analogy for this situation. It could be used for small things like venting, but it shouldn't be used for diagnosis or as a complete replacement.
Gays and gays, if u can't afford therapy, there's ChatGPT.