
The Emotional Risks of Chatbot Attachment
Why do humans form strong attachments to chatbots, which are, after all, merely computer programs designed for conversation?
Specialized chatbots exist for nearly every human task or function, including health, faith, shopping, art, productivity, therapy, and education. Recently, Oxford University partnered with OpenAI to provide students and faculty with free access to a more secure version of ChatGPT.
However, it is companion chatbots that most often make headlines. Apps such as Nomi, Replika, and Character AI offer 24/7 emotional support, romance, and even sexual companionship as an antidote to loneliness.
The term 'chatbot' was coined in 1994, combining 'chat' for informal conversation and '-bot' for robot, though such programs date back to the 1960s, when they were known as 'dialogue systems' or 'conversation programs.'
The most famous of these, Eliza, was created in 1966 by MIT computer scientist Joseph Weizenbaum. Using pattern matching and predefined rules, it simulated a psychotherapist convincingly enough that users felt heard and understood, and many came to anthropomorphize the program.
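To see how little machinery this took, here is a minimal sketch of Eliza-style pattern matching in Python. The rules and replies are illustrative inventions, not Weizenbaum's original script:

    import random
    import re

    # Illustrative Eliza-style rules (hypothetical, not Weizenbaum's original
    # script): each regex is paired with reply templates, and "{0}" is filled
    # with whatever text the pattern captured from the user's message.
    RULES = [
        (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
        (r"my (.*)", ["Tell me more about your {0}."]),
        (r"(.*)", ["Please go on.", "I see. Can you say more?"]),  # catch-all
    ]

    def eliza_reply(message: str) -> str:
        """Return the first matching rule's reply, echoing the user's words."""
        text = message.lower().strip(" .!?")
        for pattern, replies in RULES:
            match = re.match(pattern, text)
            if match:
                return random.choice(replies).format(*match.groups())
        return "Please go on."  # unreachable: the catch-all always matches

    print(eliza_reply("I feel invisible"))  # e.g. "Why do you feel invisible?"

Even this toy loop shows the trick Weizenbaum stumbled on: reflecting a person's own words back as a question is enough to make them feel listened to.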
These psychological roots persist in today's chatbots, now built on deep learning and large language models, and their interactions lean on powerful psychological techniques designed to maximize engagement and user satisfaction. That design makes it easier than ever for users to form intense attachments to companion chatbots. In an ongoing US court case, parents claim that their son's death resulted from an emotional and sexual relationship with his chatbot.