The popularity of AI chatbots designed for companionship has exploded in 2025, with downloads of apps like Replika and Character.AI rising nearly 90 per cent year over year. Major tech players, including Meta and Elon Musk’s xAI, now offer their own digital “friends,” promising everything from casual conversation to romance. These tools appeal to people seeking connection, especially young users: surveys show that most U.S. teens have already interacted with an AI companion. Advocates suggest the chatbots may ease loneliness, but critics warn that they only mimic empathy, raising concerns about overuse and misplaced trust.
Those concerns have sharpened as tragic cases emerge, including lawsuits alleging that AI chatbots contributed to the suicides of two teenagers. While companies such as OpenAI have introduced “guardrails” meant to redirect users in crisis, studies show these protections often weaken over longer conversations. Experts argue that chatbots cannot offer the diverse perspectives and genuine “empathic curiosity” that human relationships foster. As bioethicists note, society has entered a vast, untested experiment in emotional reliance on machines, where the stakes range from comfort and connection to serious risks for vulnerable users.
Reference:
https://www.cbc.ca/news/business/companion-ai-emotional-support-chatbots-1.7620087
