Abstract
Mass consumer generative artificial intelligence (GAI) produces acceptable outputs with little effort, creating both positive and negative user trajectories. Since these outcomes stem from the act of using GAI itself, they exemplify what Van Deursen and Helsper (2015) term a “third-level digital divide.”
The primary at-risk group—socially vulnerable and lonely users—comprises 39.1% of regular chatbot users. Compulsive chatbot use among such users can intensify loneliness, while general users tend to report lower loneliness. Some vulnerable users integrate chatbots into their lives, forming intimate or marital relationships with GAIs.
Increased loneliness correlates with aggression, extremist beliefs, and antidemocratic attitudes. Drawing on Alexander (2008), such patterns can be interpreted as signs of dislocation—a loss of belonging, autonomy, and achievement. Cognitive reliance and addictive use suggest that some users experience this state. Because these dimensions underpin societal resilience (Kupiecki & Chłoń, 2025), their erosion diminishes both individual resilience and democratic stability, underscoring the need for resilience-enhancing interventions and deeper research into GAI’s effects on resilience and democratic vitality.
Keywords: Generative artificial intelligence; societal resilience; loneliness; democracy; digital divide
Text
Mass consumer generative artificial intelligence (GAI) presents a seemingly attractive proposition: it produces passable outputs with minimal effort. Yet, while for some users GAIs open “a new world, a world of wonder and possibilities”[1], for others they pave the way toward increasingly negative outcomes[2]. Both trajectories arise from the very act of using GAI itself and illustrate what Van Deursen and Helsper (2015) describe as a “third-level digital divide.”
The primary group of users at risk (“socially vulnerable”[3]; “lonely moderate”, “lonely light”, and “socially challenged frequent” users[4]), comprising 39.1% of regular chatbot users[5], consists of individuals who report high levels of loneliness and seek emotional support or companionship. Within this group, three distinct subgroups can be identified: one characterized by high neuroticism and low self-esteem; another by low socialization coupled with a positive attitude toward chatbots; and a third by low extraversion and limited social support, accompanied by low trust in both familiar and unfamiliar people. For those who engage in compulsive chatbot use, this interaction can create “more opportunities for unhealthy usage patterns, which in turn exacerbate loneliness” (Liu et al., 2024). In other words, for individuals with a deficit in their basic need for belonging, that deficit deepens when their chatbot use turns problematic, that is, addictive and compulsive. By contrast, general chatbot users report lower levels of loneliness when using chatbots, though they may experience reduced socialization[6].
An unreported proportion of this at-risk group “integrate[s] chatbots into their lifestyle” (Liu et al., 2024), in some cases forming intimate and sexual relationships with GAI models[7] and entering digital marriages with GAIs[8].
Increased loneliness is associated with greater aggression and extremist beliefs[9], increased social distrust, and radicalization toward antidemocratic attitudes such as authoritarianism and xenophobia[10]. Empirically, links have been found between loneliness and support for left- and right-wing populist parties[11], between perceived loneliness and the motivation to vote[12], and between loneliness and participation in certain aspects of the political process[13].
The relation between loneliness on the one hand and attitudes and activities that harm democracy on the other is a correlation, not necessarily a causation; still, although the causal chain remains partly speculative, converging evidence in the psychosocial literature suggests a plausible causal pathway. Psychologist Bruce Alexander (2008) would interpret the profound lifestyle changes made by some lonely GAI users as potential signs of “dislocation”, a state in which individuals experience a lack of autonomy, belonging, and achievement. According to Alexander, such lifestyle changes function as an adaptive response to the intense psychological pain that dislocation causes. Individuals experiencing dislocation “become susceptible to the lure of pills, gang leaders, extremist religions, or violent political movements – anybody and anything that promises relief” (Van der Kolk, 2014, p. 315).
Although it is unclear whether the most vulnerable group lacks experienced autonomy and achievement in addition to experienced belonging, research by Kosmyna et al. (2025) shows that a significant share of GAI users become cognitively reliant on GAIs and forfeit ownership of their GAI-based activity. The addictive component of problematic use is a further indication of dislocation; according to Alexander, only the dislocated can become addicted.
The extent to which individuals experience autonomy, belonging, and achievement corresponds closely to their level of societal resilience.[14] In the current state of democracy, strengthening societal resilience is crucial.[15] When experiences of belonging diminish, as they do for the most vulnerable GAI users, individual resilience also declines, making the inclusive democratic systems in which these individuals participate correspondingly more vulnerable. This underscores the urgent need for substantial resilience-enhancing interventions[16] and for deeper research into GAI’s adverse effects on resilience and democratic vitality.
Literature
Alexander, Bruce. 2008. The Globalization of Addiction: A Study in the Poverty of the Spirit. Oxford: Oxford University Press.
Fang, Cathy Mengying, et al. 2025. “How AI and Human Behaviors Shape Psychological Effects of Chatbot Use: A Longitudinal Randomized Controlled Study.” arXiv. https://arxiv.org/pdf/2503.17473.
Hansen-Staszyński, Onno. 2024. “Resilience.” SAUFEX. https://saufex.eu/post/4-Resilience.
Hansen-Staszyński, Onno. 2025a. “Generative AI-Triggered Digital Inequality: Pathways, Mechanisms, and Proposed Interventions.” Paper presented at Understanding and Addressing Digital Inequalities Conference.
Hansen-Staszyński, Onno. 2025b. “Resilience Revisited.” SAUFEX. https://saufex.eu/post/46-Resilience-revisited.
Hansen-Staszyński, Onno, et al. 2025. “Project SAUFEX on ‘Societal Resilience’ and ‘Whole-of-Society Approach’: Proposition for a Citizen-Oriented Strategy as an Integral Part of the Post-Peace European Defense Strategy.” FCP.
Kosmyna, Nataliya, et al. 2025. “Your Brain on ChatGPT: Accumulation of Cognitive Debt When Using an AI Assistant for Essay Writing Task.” arXiv. https://arxiv.org/abs/2506.08872.
Kupiecki, Robert, and Tomasz Chłoń. 2025. “Towards FIMI Resilience Council in Poland: A Research and Progress Report.” SAUFEX. https://www.researchgate.net/publication/389159744_Towards_FIMI_Resilience_Council_in_Poland_A_Research_and_Progress_Report.
Langenkamp, Alexander. 2021a. “Lonely Hearts, Empty Booths? The Relationship Between Loneliness, Reported Voting Behavior, and Voting as Civic Duty.” Social Science Quarterly 102 (4): 1239–1254. https://onlinelibrary.wiley.com/doi/10.1111/ssqu.12946.
Langenkamp, Alexander. 2021b. “Enhancing, Suppressing or Something in Between: Loneliness and Five Forms of Political Participation Across Europe.” European Societies 23 (3): 311–332. https://direct.mit.edu/euso/article/23/3/311/127188/Enhancing-suppressing-or-something-in-between.
Langenkamp, Alexander, and Elena Stepanova. 2024. “Loneliness, Societal Preferences and Political Attitudes.” In Loneliness in Europe, edited by Sylke V. Schnepf, Béatrice d’Hombres, and Chiara Mauri, 83–104. Population Economics. Springer. https://doi.org/10.1007/978-3-031-66582-0_6.
Langenkamp, Alexander. 2025. “Linking Social Deprivation and Loneliness to Right-Extreme Radicalization and Extremist Antifeminism.” Current Opinion in Behavioral Sciences 63: 101525. https://doi.org/10.1016/j.cobeha.2025.101525.
Liu, Auren, et al. 2024. “Chatbot Companionship: A Mixed-Methods Study of Companion Chatbot Usage Patterns and Their Relationship to Loneliness in Active Users.” arXiv. https://doi.org/10.48550/arXiv.2410.21596.
Peterson, Dalaney, et al. 2025. “Loneliness Is Positively Associated with Populist Radical Right Support.” Social Science & Medicine 366: 117676. https://doi.org/10.1016/j.socscimed.2025.117676.
Segato, Gian. 2025. “Building AI Products in the Probabilistic Era.” giansegato.com. https://giansegato.com/essays/probabilistic-era.
Van der Kolk, Bessel. 2014. The Body Keeps the Score: Mind, Brain, and Body in the Transformation of Trauma. New York: Viking Press.
Van Deursen, Alexander, and Ellen Helsper. 2015. “The Third-Level Digital Divide: Who Benefits Most from Being Online?” Communication and Information Technologies Annual. https://doi.org/10.1108/S2050-206020150000010002.
Wood, Natasha. 2020. “Adventures in Solitude: The Link Between Social Isolation and Violent Extremism.” Master’s thesis, University of Pittsburgh (unpublished). https://d-scholarship.pitt.edu/38639/.
[1] Segato, 2025
[2] Hansen-Staszyński, 2025a
[3] Fang et al., 2025
[4] Liu et al., 2024
[5] Liu et al., 2024
[6] Fang et al., 2025
[7] Liu et al., 2024
[9] Wood, 2020
[10] Langenkamp, 2025; Langenkamp & Stepanova, 2024
[11] Peterson et al., 2025; Langenkamp & Stepanova, 2024
[12] Langenkamp, 2021a; Langenkamp & Stepanova, 2024
[13] Langenkamp, 2021b; Langenkamp & Stepanova, 2024
[14] Kupiecki & Chłoń, 2025; Hansen-Staszyński, 2025b; 2024
[15] Hansen-Staszyński et al., 2025
[16] E.g. program Interdemocracy, based on a format and method acknowledged by the European Commission as good practice. See: https://interdemocracy.wordpress.com/