Frequent AI chatting linked to depression, study finds


NEW YORK (HealthDay News) — Do you find yourself spending hours chatting with AI programs like ChatGPT, Microsoft Copilot, Google Gemini, Claude or DeepSeek?

If so, you may be at greater risk of depression, new research suggests.

People who use AI chatbots daily are about 30% more likely to suffer from at least moderate levels of depression, researchers reported Jan. 21 in JAMA Network Open.

“We found that daily AI use was common and significantly associated with depressive symptoms and other negative affects” such as anxiety and irritability, concluded the research team led by psychiatrist Dr. Roy Perlis, director of the Center for Quantitative Health at Massachusetts General Hospital in Boston.

Age also appears to play a role: middle-aged adults who frequently use generative AI had particularly high odds of depression, according to the researchers.

Regular AI users aged 45 to 64 had a 54% higher risk of depression, compared with a 32% higher risk among those aged 25 to 44, according to the results. These findings indicate that “some people may be more likely to experience depressive symptoms associated with AI use,” the researchers wrote.

For the new study, researchers surveyed nearly 21,000 American adults between April and May 2025, using a standard mental health questionnaire to track symptoms of depression. Participants were also asked how often they used AI.

About 10% said they use generative AI daily, including more than 5% who said they use it multiple times a day.

From the study design, it is difficult to know whether AI is promoting depression or whether depressed people turn to AI for comfort, according to the researchers.

Dr. Sunny Tang, an assistant professor of psychiatry at the Feinstein Institutes for Medical Research at Northwell Health in Manhasset, N.Y., agreed that it’s hard to know how the association works.

“People already experiencing mental health symptoms may be more likely to use generative AI for personal use by seeking help and support for their symptoms, coping with loneliness, or finding validation,” said Tang, who was not involved in the study.

“When thinking about the relationship between AI and mental health, we need to think in multiple directions: Could the use of AI negatively affect mental health? But also, how do differences in mental health change the way we interact with AI?” said Tang, who practices at Zucker Hillside Hospital in Queens, New York.

Loneliness could be a major factor, Tang said.

“Many people feel increasingly isolated lately, whether because they work remotely or for other reasons,” Tang said. “We know that loneliness is a very strong predictor of mental health symptoms like depression, anxiety, and irritability. I think that’s definitely one of the directions in which we should try to understand these relationships.”

The results also underscore that AI companies need to design products with users’ mental health in mind, Tang said.

“They should keep in mind that people with mental illness and those who have mental health symptoms are going to be actively engaging with their products,” Tang said. “As all doctors know: first, do no harm.”

Tang said there must be “better safeguards” to ensure that AI does not offer advice that worsens existing mental health symptoms. “Companies should ask themselves, ‘Is there a way to build AI so that it can be more supportive of people with mental health needs?’” Tang said.


