Categories
General News Research

Using AI for Mental Health Questions or Challenges?

Many people are now turning to AI tools to ask questions about mental health, reflect on their emotions, or look for advice. These systems are available at any time and can sound supportive and confident, which can make them appealing when someone is feeling overwhelmed or unsure where to start.

At the same time, research shows that AI tools have important limitations. They can produce inaccurate information, miss the personal context that shapes mental health problems, and may not respond reliably in situations involving safety concerns. Privacy is also an issue that many people do not consider before sharing personal details.

I recently prepared a short patient guide that explains what current research actually shows about AI and mental health, the risks people should be aware of, and how to approach these tools more carefully if you choose to use them. The guide also outlines situations where professional help is especially important. You can read the full guide here: 

Categories
General News Research

AI Companions and the Risks of Digital Friendship

The popularity of AI chatbots designed for companionship has exploded in 2025, with downloads of apps like Replika and Character.AI rising nearly 90 per cent year over year. Market leaders, including Meta and Elon Musk’s xAI, are now offering their own digital “friends,” promising everything from casual conversation to romance. These tools appeal to people seeking connection, especially young users, with surveys showing that most U.S. teens have already interacted with an AI companion. Advocates suggest they may ease loneliness, but critics warn that chatbots only mimic empathy, raising concerns about overuse and misplaced trust.

Those concerns have sharpened as tragic cases emerge, including lawsuits alleging that AI chatbots contributed to the suicides of two teenagers. While companies like OpenAI have introduced “guardrails” to redirect users in crisis, studies show these protections often weaken over longer conversations. Experts argue that chatbots cannot provide the diverse perspectives and real “empathic curiosity” that human relationships foster. As bioethicists note, society has entered a vast, untested experiment in emotional reliance on machines, where the stakes range from comfort and connection to serious risks for vulnerable users.

Reference:

https://www.cbc.ca/news/business/companion-ai-emotional-support-chatbots-1.7620087

Categories
Research

Psychological Effects of Working with AI Systems

With AI systems becoming increasingly accessible, both professionally and personally, we urgently need research into their psychological effects. Here is some early research that begins to shed light on this.

Loneliness, insomnia linked to work with AI systems

Employees who frequently interact with artificial intelligence systems are more likely to experience loneliness that can lead to insomnia and increased after-work drinking.