Abstract
A significant minority of young people use ChatGPT, often on a regular basis, for emotional purposes in the broadest sense of the term: seeking life advice, seeking friendship, finding relief from emotional distress, asking sexual questions, and so on. The mere existence of this dynamic, directed towards a machine, is already cause for concern.
Our field experience, together with blind tests we conducted on ChatGPT, shows that its response quality is unpredictable. Some responses are acceptable, even very good; others are unhealthy, toxic and antisocial, spreading illusions about what a human relationship really is. In the sexual domain, while some responses are valid, there is also a generally permissive, libertarian atmosphere.
The risks for young people are high: loss of lucidity and confusion of ideas; encouragement of antisocial behaviour; addiction; heightened feelings of insignificance and loneliness; despair and suicide.