AI News

News · 2:03 AM · aurelic

Challenges in Regulating AI-Based Mental Health Chatbots

As AI-driven mental health chatbots proliferate, a few states are taking steps to regulate them.

In the absence of stronger federal regulation, some states have begun regulating apps offering AI 'therapy' as more people turn to artificial intelligence for mental health advice. However, the laws passed this year do not fully address the rapidly changing landscape of AI software development. App developers, policymakers, and mental health advocates argue that the resulting patchwork of state laws is insufficient to protect users or hold creators of harmful technology accountable.

State laws vary in their approaches. Illinois and Nevada have banned the use of AI for mental health treatment, while Utah has imposed certain restrictions, requiring chatbots to protect user health information and clearly disclose that they are not human. Pennsylvania, New Jersey, and California are also considering ways to regulate AI therapy.

The impact on users varies. Some apps have blocked access in states with bans, while others have made no changes as they await more legal clarity. Many laws do not cover generic chatbots like ChatGPT, which are not explicitly marketed for therapy but are used by many for that purpose.

Calls for federal regulation and oversight are growing. The Federal Trade Commission recently opened inquiries into seven AI chatbot companies, and the Food and Drug Administration is convening an advisory committee to review AI-enabled mental health devices. These efforts focus on assessing and monitoring the potential harms of AI technology.

The varied use of AI in mental health care, from 'companion apps' to 'AI therapists,' complicates regulation. Some states target companion apps designed for friendship, while others ban products claiming to provide mental health treatment. Because the same app can straddle both categories, states have struggled to define which products their laws actually cover.

Despite these challenges, regulators and advocates remain open to revising their approaches as the technology evolves. Even so, today's chatbots are not seen as a solution to the mental health provider shortage. The empathy, clinical judgment, and ethical responsibility that therapy requires cannot be replicated by AI, underscoring the importance of human involvement in mental health care.