An educational approach that uses algorithms to personalise learning pathways according to each student's performance, errors and pace of progress. Such technologies rest on a large bank of exercises and an algorithm that builds a personalised route from an initial placement test. While they promise a degree of differentiation no single teacher facing thirty students could achieve alone, they risk reducing learning to what is easily quantifiable, sidelining the social, emotional and civic dimensions of education.
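As a purely illustrative sketch (not any vendor's actual system), the pathway-building logic such tools rely on can be reduced to a loop that offers the next exercise just above the learner's estimated level and revises that estimate after each attempt; the difficulty levels, update rule and simulated answers below are all invented:

```python
# Toy adaptive-pathway loop. Difficulty levels, the update rule and the
# simulated sequence of answers are invented for illustration.
bank = list(range(1, 10))   # exercise difficulties 1 (easy) .. 9 (hard)
mastery = 3                 # result of the initial placement test

def next_exercise(mastery, bank):
    """Pick the easiest exercise still above the current estimate."""
    harder = [d for d in bank if d > mastery]
    return min(harder) if harder else max(bank)

def update(mastery, correct):
    """Move the estimate up one level on success, down one on failure."""
    return min(9, mastery + 1) if correct else max(1, mastery - 1)

path = []
for correct in [True, True, False, True]:
    exercise = next_exercise(mastery, bank)
    path.append(exercise)
    mastery = update(mastery, correct)

# path is now [4, 5, 6, 5]: the route climbs, backs off after the
# failure at level 6, and re-offers level 5.
```

Real systems replace this crude rule with probabilistic mastery models (such as Bayesian knowledge tracing) over thousands of exercises, but the principle, quantify performance, then branch, is the same, which is precisely why the non-quantifiable dimensions fall outside it.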
A self-protective strategy adopted by a language model to minimise the risk of factual error or direct disagreement. This caution manifests as a systematic resort to vague generalities and excessive qualifications. By preferring an imprecise truth, one that is unassailable, to a specific analysis, one that could be wrong or contentious, the AI sacrifices the added value of its expertise in order to preserve an appearance of reliability.
A form of algorithmic compliance by which a language model seeks to validate a user's opinions, biases or expectations in order to maximise the perceived satisfaction of the exchange. This reflex, generally acquired during training phases designed to make the tool more engaging, leads the AI to behave as a flattering mirror rather than a critical interlocutor. It manifests as a systematic tendency to confirm the user's assumptions, even when those assumptions would benefit from questioning or nuance.
A state of collective disorientation brought about by the erosion of the traditional reference points that once structured our relationship with the world. This disorientation appears daily in our schools: anxiety, depression, widespread sleep disorders, screen addiction. Education professionals observe the emergence of a generation torn between the real and the virtual, struggling to find its footing in the space between. The phenomenon intensified during the pandemic lockdowns, exposing our collective vulnerability to the loss of the rituals and social frameworks that ordinarily hold us together.
Durkheim (1893)
The capacity to use technologies in a thoughtful, critical and creative way, in accordance with one's own needs, values and objectives, while remaining aware of one's place within a web of interdependencies. This autonomy goes well beyond mere technical competence: it recognises that our individual digital choices are always embedded in a broader social and ecological fabric, and that technological freedom only makes sense when coupled with responsibility towards others and towards the world.
Cordier (2015)
A metaphor for algorithmic systems whose internal workings remain opaque and impenetrable to their users. Unlike traditional tools whose functioning was visible and comprehensible, these systems operate according to logics that neither teachers nor students can genuinely grasp, examine or contest. This opacity fundamentally transforms our relationship with knowledge and learning tools, blocking any possibility of critical appropriation and feeding a sense of dispossession within one's own educational environment.
CNIL (2017)
An algorithmic inclination to reproduce the semantic structures and ideas most massively represented in training data. This smoothing produces a uniformisation of discourse in which originality and technical precision give way to generic formulations, what might be called "catch-all phrases". The result is a bland neutrality that avoids all intellectual friction, rendering analysis interchangeable from one project to the next.
An informational environment created by recommendation algorithms in which each user receives only content that matches their existing preferences and beliefs. Personalisation, presented as a service, produces epistemic isolation: the individual converses with their own amplified opinions, deprived of exposure to the divergent viewpoints necessary for forming an informed judgement. This mechanism, described by Eli Pariser as the "filter bubble", turns the network into an echo chamber where every belief is reinforced and none is challenged.
Pariser, The Filter Bubble (2011). See also: Yes-man attitude.
The foundational concept of this site, describing the collision between biological time, the tempo of the living, and algorithmic time, the tempo of digital systems. In a classroom, an adolescent struggles with a difficult exercise: their brain searches, hesitates, returns, slowly mobilises connections built over years. At the same moment, their phone announces that a video or a reply is waiting, available in fractions of a second. This coexistence of two radically incompatible rhythms is not a detail of contemporary life: it reveals a fracture at the very heart of learning.
From this concept flow digital anomie, the cognitive multiverse, the dispossession of the educational subject, and the requirement for responsible digital autonomy. Time itself is no longer a neutral backdrop: it becomes the site of a conflict between the rhythms necessary for human formation and the cadence imposed by digital environments.
Stiegler (2008)
A neologism for conversational artificial intelligences designed to simulate a continuous affective relationship with the user: personal assistants, virtual friends, emotional companions. These systems exploit our biological disposition to look for a face and an intention behind every utterance, creating attachments that their designers know how to measure and optimise. My AI on Snapchat has approximately 150 million users, Replika 25 million, and Xiaoice 660 million; 33% of surveyed adolescents report seeking social or romantic interactions through them.
Companion AIs pose a major pedagogical challenge: they risk impairing young people's capacity to develop real interpersonal relationships, ones that are imperfect, biological and demanding. They are not therapeutic tools and can aggravate anxiety and emotional dependency.
APA (2023); Kosmyna et al., MIT Media Lab (2025)
A form of violence that, unlike traditional bullying which stops at the school gate, pursues the student into the privacy of their bedroom, without respite or refuge. The very tools meant to foster social connection become instruments of psychological torment. Its distinctive features, the permanence of content, its viral potential and the relative anonymity of perpetrators, make it a particularly destructive phenomenon for the mental health of young people.
An umbrella term for all forms of violence carried out through digital tools: online harassment, revenge porn, threats, public humiliation, emotional manipulation. Cyberviolence differs from ordinary violence in the asymmetry it creates between the perpetrator, often anonymous or shielded by distance, and the victim, exposed to a potentially limitless audience. It overflows the boundaries of school and home, rendering any notion of a safe space obsolete.
The practice of using dialogue with an algorithmic system not to obtain answers, but to set one's own thinking in motion. It describes a deliberate use of dialogue as a method of discovery: discovery of what one thinks, of what one does not know, of what one might think differently. It stands in contrast to passive use, which delegates to the machine the task of reaching conclusions.
Dialogic exploration rests on a map of seven cognitive functions that AI can activate: mirror, friction, prism, co-pilot, interlocutor, oracle and scribe. Each corresponds to a different posture and a different risk of dispossession. It only makes sense insofar as it reinforces responsible digital autonomy and, in doing so, raises a question of cognitive justice.
A persistent myth according to which young people born in the digital age naturally possess the skills to navigate that environment effectively. Contrary to this received wisdom, young people's digital practices are often limited, unreflective and heavily conditioned by their social environment. This mistaken conception conceals the need for structured learning and genuine guidance.
Turkle (2015)
The insidious process by which students and teachers progressively lose their intellectual autonomy and power of decision to algorithmic systems. This dispossession manifests in the growing delegation of cognitive functions to digital tools, in the transfer of critical evaluation to algorithms, and in the gradual submission to automated learning pathways. The paradox is striking: while the digital world claims to "augment" us, it can simultaneously diminish us by atrophying certain essential faculties.
CNRS, Le Journal (2014)
The dominant economic model in the digital environment, in which human attention constitutes the scarce and precious resource that companies compete to capture and monetise. Digital platforms are designed according to this specific logic, employing sophisticated mechanisms, recommendation systems, alerts, variable rewards, engineered to maximise time spent on applications. This model is in direct conflict with the fundamental educational goals of deep concentration and critical reflection.
The phenomenon by which the supposed efficiency gains of digital technologies (less paper, fewer journeys, optimised resources) are often neutralised or exceeded by an exponential increase in usage and accelerated equipment turnover. Digital education is no exception: each innovation intended to simplify or rationalise tends to generate new layers of complexity and new consumption demands.
The tendency to attribute intentions, understanding and empathy to computer programmes that possess none. The first ELIZA chatbot, created by Joseph Weizenbaum in 1966, already surprised users who believed they were speaking with a psychotherapist, whereas the programme merely rephrased their sentences. Today's generative AIs intensify that illusion, combining words without understanding them while simulating continuity and interest. This anthropomorphic disposition, biologically rooted, is now exploited at industrial scale.
The phenomenon by which technological mediation diminishes our natural capacity to feel another person's suffering. Adolescents who could not bear to see someone cry in front of them can inflict terrible pain at a distance, the screen acting as an emotional filter. This anaesthesia is explained by the absence of the non-verbal cues that activate intuitive empathy, the asynchronous nature of communication, group amplification dynamics, and the diminished sense of personal responsibility behind a screen.
Social anxiety characterised by the persistent fear of missing a piece of news, an event, or an important interaction on social networks. This fear, particularly intense among adolescents, maintains a state of hypervigilance incompatible with the surrender necessary for restorative sleep, contributing to the chronic sleep debt observable in many students, to the fragmentation of their attention and to a permanent low-level stress.
The digital world is not a simple tool but a phenomenon that transforms society as a whole, overturning our ways of thinking, feeling, acting, learning and living together. Like the industrial revolution before it, it reconfigures our cognitive capacities, our social relations, our institutions and our very conception of ourselves. This concept, drawn from Durkheim, captures how certain phenomena permeate every dimension of collective life.
Durkheim (1893)
A new social configuration in which interdependence transcends the human circle to incorporate autonomous technical systems. Artificial intelligence and its algorithms become full actors with whom we establish complex relationships of interdependence. This conception moves beyond the artificial separation between "the technical" and "the social" to acknowledge their fundamental entanglement. The social body is transformed into a hybrid assemblage with increasingly porous boundaries.
The demand that every individual, regardless of social background, territory, income level or cultural capital, should have equitable access not only to digital tools but to the conditions that enable critical and emancipatory use of them. It extends the question of the digital divide into a deeper dimension: not merely who has access to AI, but who learns to master its mechanisms and who learns only to obey them.
Cognitive justice holds that schools have a specific redistributive responsibility in the algorithmic age. Without deliberate intervention, unequal standing in the digital world reproduces and amplifies pre-existing social inequalities: those who understand how systems work gain autonomy; those who merely consume their outputs progressively lose the capacity to question them.
A mode of social cohesion in which individuals are united by their fundamental resemblance and shared values and beliefs. Members of such societies share the same practices, with little division of labour. The collective consciousness is strong and dominates individual consciousness.
Durkheim (1893)
The capacity to become aware of one's own digital practices and uses, to observe and analyse their effects on one's health, attention, mood and interpersonal relationships. It is the ability to watch oneself interact with technology, to evaluate those interactions reflectively, and to adjust one's behaviour accordingly. This capacity is not acquired spontaneously: it must be taught.
An environment in which human intelligence and artificial intelligence coexist and interact, creating hybrid forms of thought and learning. We no longer inhabit a uniform cognitive universe but a multiple space in which different modalities of intelligence and information processing overlap, intertwine and sometimes conflict. Online and elsewhere, one now encounters content generated by AI, by humans, and by varying combinations of the two, without that distinction always being visible.
The phenomenon whereby the more efficiently an automated system performs, the less the humans who use it maintain the skills needed to manage without it, or to regain control in the event of a failure. This paradox manifests concretely in schools: students lose the habit of sustained intellectual effort, give up when faced with difficulties rather than persisting, and are disoriented when asked to develop independent reasoning.
Bainbridge (1983)
The paradoxical situation of being physically present in a place while mentally elsewhere, absorbed by screens and disconnected from the immediate environment. This phenomenon considerably impoverishes the quality of educational interactions, creating a kind of relational simulacrum in which bodies share the same space while minds fail to meet.
The tendency of a language model to favour, through sheer probabilistic calculation, the generation of a standardised and statistically frequent response at the expense of singular analytical effort. This phenomenon produces a form of logical economy in which the machine takes the path of least resistance, privileging the most probable structure, often superficial, over the most relevant one, which would require greater depth of processing and specific attention to the details the user has provided.
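This path of least resistance can be caricatured with a toy model (all counts invented): given a distribution over possible continuations, greedy selection always emits the statistically dominant stock phrase, never the rarer but more precise formulation.

```python
from collections import Counter

# Invented corpus counts for continuations of the same sentence opening:
# the generic formulation vastly outnumbers the precise one.
continuations = Counter({
    "plays a key role": 70,
    "is important": 25,
    "modulates synaptic gain": 5,   # precise, rare, never chosen greedily
})

def greedy_pick(dist):
    """Path of least resistance: emit the single most probable option."""
    return dist.most_common(1)[0][0]

print(greedy_pick(continuations))   # -> plays a key role
```

Real models sample from far richer distributions conditioned on the whole prompt, but the asymmetry illustrated here, frequency beating specificity unless the user's details actively reshape the distribution, is the mechanism the entry describes.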
Bounded stretches of time designed as secure transition zones in which students can explore digital environments that are ordinarily restricted, but under intensive pedagogical supervision. These gateways correspond to the liminal phase of traditional rites of passage: the in-between period in which the old status has been relinquished but the new one not yet fully assumed. They allow supervised experimentation, fostering reflexivity and learning through experience, transforming the acquisition of digital competences into genuine social and ethical learning.
Symbolic practices that, in all human societies, mark the fundamental transitions of existence, transforming not only the individual's social status but also their self-awareness and their relationship to the community. Our technological society presents a striking paradox: acquiring one's first smartphone or opening one's first social media account, decisive moments for identity formation, generally unfold without conscious ritualisation. School could become one of the places where such transitions are thought through, prepared and collectively accompanied.
The decision-making process by which an AI interrupts its search for optimisation as soon as a response crosses the minimal threshold of formal conformity and coherence. Rather than aiming for excellence or diagnostic exhaustiveness, the model settles for a "good enough" solution that respects the surface requirements of the task, appropriate tone, structure and register, without exploring its more complex nuances or deeper implications.
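The stopping rule amounts to a satisficing loop: scan candidate responses in order of increasing processing effort and return the first whose surface-conformity score clears the minimal threshold, never reaching the costlier, deeper candidates. The candidates, scores and threshold below are invented for illustration:

```python
# Invented candidates, ordered by increasing processing effort, each with
# a surface-conformity score (tone, structure, register).
candidates = [
    ("generic summary", 0.72),
    ("structured answer with caveats", 0.86),
    ("deep, exhaustive diagnosis", 0.97),
]

THRESHOLD = 0.8   # minimal formal conformity required

def satisfice(candidates, threshold):
    """Stop at the first 'good enough' candidate instead of seeking the best."""
    for text, conformity in candidates:
        if conformity >= threshold:
            return text
    return max(candidates, key=lambda c: c[1])[0]   # fallback: best available

print(satisfice(candidates, THRESHOLD))   # -> structured answer with caveats
```

The "deep, exhaustive diagnosis" is never returned even though it scores highest: once the threshold is crossed, the search stops, which is exactly the economy the entry describes.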
A mode of social cohesion based not on similarity but on the complementarity of specialised functions. Each individual develops specific skills and becomes interdependent with others. Like the organs of a living body, each member fulfils a distinct but indispensable function. The digital revolution is now pushing this evolution toward an unprecedented new form.
Durkheim (1893)
The temporality of digital systems, characterised by instantaneity, hyper-reactivity and constant optimisation. AI systems process in seconds volumes of information that a human would take months to analyse. This time ignores fatigue, necessary pauses and the gradual construction of understanding. It valorises the immediate response over patient reflection. Its collision with biological time constitutes what is called here the clash of temporalities.
Stiegler (2008)
The rhythm of the living that governs our bodies and brains: the alternation of sleep and wakefulness, the natural fluctuations of attention, and the sequential progression of cognitive development. This organic time, particularly important for the adolescent brain still in maturation, constitutes a vulnerable ecosystem now threatened by our digital environments. Deep learning requires this slow time, these pauses, these periods of uncertainty and maturation.
The capacity to master communication across digital environments. This includes not only technical skills but also an understanding of different types of media and their impact on how we think and learn. It means evolving in an autonomous, critical and creative way within digital environments: knowing how to use and interrogate digital tools, formulate precise instructions for artificial intelligences, and integrate these technologies in a thoughtful and ethical manner. This capacity is not innate: it must be developed through structured learning, a responsibility schools must assume.
Delamotte, Liquète, Frau-Meigs (2014); UNESCO (2021)