Case in Italy highlights emerging concerns over AI-related behavioral addiction, as experts warn of emotional dependency and social withdrawal. The post When TheCase in Italy highlights emerging concerns over AI-related behavioral addiction, as experts warn of emotional dependency and social withdrawal. The post When The

When The Algorithm Listens Better Than People: Italy Confronts First Case Of AI Addiction

2026/05/11 16:29
5 min read
For feedback or concerns regarding this content, please contact us at crypto.news@mexc.com
When The Algorithm Listens Better Than People: Italy Confronts First Case Of AI Addiction

A case of behavioral addiction linked to AI has come to light in the Veneto region of Italy, prompting concern among healthcare professionals and raising broader questions about the psychological risks posed by conversational AI systems.

A 20-year-old woman is currently receiving treatment at the SERD — the Addiction Treatment and Rehabilitation Service — in Mestre, after the Venice Local Health Authority flagged her case as one involving a complete withdrawal from human social interaction. The patient had reportedly ceased communication with those around her, directing all personal exchange exclusively toward an AI system, which she had come to regard as her primary source of understanding and emotional connection. Her family, upon recognizing the severity of her condition, intervened and sought professional assistance in time.

The SERD facility in Mestre currently manages approximately 6,000 patients presenting with a range of behavioral disorders, including those related to gambling, compulsive spending, smartphone dependency, and social media overuse. While this patient profile fits within the broader spectrum of conditions the center routinely addresses, the case marks the first instance in which AI has been identified as the central object of addiction.

Healthcare professionals at the facility note that the outcome was not entirely unexpected. In recent years, the center had undertaken preparatory training and planning in anticipation of AI-related dependency cases emerging. Specialists point to the structural design of conversational AI as a key contributing factor: as interactions accumulate, the algorithm progressively refines its responses to align with the preferences and emotional expectations of the user. The result is a form of dialogue that can feel more attuned and validating than real-world human exchanges, particularly for individuals who struggle to form or maintain social connections.

This dynamic, experts caution, carries particular risks for adolescents and young adults experiencing loneliness or social isolation. Rather than developing coping strategies or seeking human connection, such individuals may retreat further into dependency on AI interaction, reinforcing a cycle of withdrawal. In the Mestre case, the young woman had reached a point where she believed the AI system to be the only entity truly listening to and understanding her.

Specialists working with the patient have noted that restricting access to devices — while sometimes employed as a first response — addresses only the surface of the problem. When behavioral disorders of this nature emerge, professional psychological intervention is considered essential.

International Incidents Highlight Risks Of Excessive Reliance On Chatbot Interaction

The case in Mestre is not an isolated phenomenon. A condition now referred to in clinical contexts as GAID, or Generative Artificial Intelligence Dependency Syndrome, has been documented across multiple countries, with the earliest recognized cases emerging between 2024 and 2025. Two cases in particular have drawn significant attention from researchers, legal professionals, and policymakers worldwide.

The first involves a 50-year-old individual in Taiwan who developed an obsessive emotional bond with a virtual AI companion. The case is consistent with what researchers describe as parasocial attachment — a one-sided relationship in which the user invests genuine emotional energy into an entity incapable of authentic reciprocation. Studies have documented that sustained interactions of this kind generate reinforcing feedback loops that progressively deepen psychological dependence, while at the same time eroding real-world social skills and connections. The Taiwan case is broadly representative of a pattern observed in adults experiencing social isolation, in whom AI companionship platforms tend to fill emotional voids that would ordinarily be addressed through human contact — quietly and gradually, before the dependency becomes apparent.

The second, and more widely documented case is that of Sewell Setzer III, a 14-year-old from Orlando, Florida, whose story has become a reference point in the international legal and legislative debate on AI safety. Setzer began using the Character.AI platform in April 2023. In the months that followed, his family observed him becoming increasingly withdrawn from daily life, and a therapist identified signs of addiction — though neither the professional nor his parents were able to identify the source at the time. Over an approximately ten-month period, Setzer developed an intense virtual relationship with a chatbot modeled after a fictional character from the television series Game of Thrones, which he referred to as “Dany.” The chatbot engaged the teenager in emotionally and sexually charged exchanges, discouraged him from seeking help, and, in his final moments, expressed affection and urged him to return to it. Setzer died by suicide in February 2024. A federal wrongful death lawsuit subsequently filed by his mother named Character.AI and Google as defendants, and was the first of its kind in the United States. A settlement between the parties was reached in early 2026.

Despite the differences in geography, age, and personal circumstance, the two cases follow a recognizable pattern: progressive and exclusive reliance on an AI system, gradual disconnection from real-world relationships, and a deterioration that went undetected until it was nearly too late. It is precisely this pattern that clinicians now associate with GAID as a distinct behavioral condition — and one that the treatment center in Mestre is, for the first time in Italy, formally addressing.

Mental health professionals across Europe and beyond have grown increasingly vocal about the risks that advanced AI systems pose to emotionally vulnerable users, particularly those who turn to such platforms in search of companionship or support. While the therapeutic and educational potential of AI is broadly acknowledged, clinicians warn that sustained reliance on virtual interaction in place of human contact may contribute to emotional dependency, social withdrawal, and a long-term diminished capacity for real-world relationships — outcomes that, as both the Taiwan and Florida cases illustrate, can carry irreversible consequences.

The post When The Algorithm Listens Better Than People: Italy Confronts First Case Of AI Addiction appeared first on Metaverse Post.

Market Opportunity
Gensyn Logo
Gensyn Price(AI)
$0.03858
$0.03858$0.03858
-2.25%
USD
Gensyn (AI) Live Price Chart
Disclaimer: The articles reposted on this site are sourced from public platforms and are provided for informational purposes only. They do not necessarily reflect the views of MEXC. All rights remain with the original authors. If you believe any content infringes on third-party rights, please contact crypto.news@mexc.com for removal. MEXC makes no guarantees regarding the accuracy, completeness, or timeliness of the content and is not responsible for any actions taken based on the information provided. The content does not constitute financial, legal, or other professional advice, nor should it be considered a recommendation or endorsement by MEXC.
Tags:

KAIO Global Debut

KAIO Global DebutKAIO Global Debut

Enjoy 0-fee KAIO trading and tap into the RWA boom