How Wise Is It to Turn to AI for Mental Health Help? New Research Offers a Surprising Answer
AI Mental Health Risks: According to a new study, AI chatbots such as ChatGPT can go some way towards reducing the stigma around mental health, but they cannot replace professional therapy. The research suggests these chatbots can be useful for people who are hesitant to seek traditional face-to-face help, because they offer confidential, easily accessible conversations.
Edith Cowan University Study
A team at Edith Cowan University (ECU) in Australia surveyed 73 people who had used ChatGPT to talk through a personal concern, examining how they used the tool and its impact on stigma.
Scott Hanna, a Master of Clinical Psychology student at ECU, said the results show the tool is effective and plays an important role in reducing concerns about external judgment.
Mental Health and Stigma
Stigma, the fear of being judged or labelled, is a major barrier to seeking mental health help. It can worsen symptoms and stop people from getting support.

The study focused on two forms of stigma: public stigma, the fear of being judged or discriminated against by others, and self-stigma, the internalisation of those negative attitudes. Both erode self-confidence and the willingness to seek help.
The Privacy Factor
People who felt ChatGPT was effective were more likely to use it and were less afraid of being judged. As AI tools become more common, more people are turning to them to discuss their mental health, and the privacy these tools offer appears to be one reason why.
Hanna added that the results show AI tools like ChatGPT are increasingly being used for mental health support, even though they were not designed for that purpose.
Be Wary
The team said AI may make it easier for people to open up, but caution is warranted because anonymous digital tools lack the ethical safeguards of professional care.
Hanna said ChatGPT was not designed for therapeutic use, and recent research has shown that its answers can sometimes be wrong. Users are therefore advised to engage with AI-based mental health tools responsibly.