Automation and employment
It is not robots that steal jobs. It is the silence around what that implies.
In 2013, Frey and Osborne estimated that 47% of American jobs were susceptible to automation. Dozens of reports have since qualified that figure, but the debate it opened remains intact.
What is unprecedented about generative AI is that it reaches skilled jobs. A factory robot replaces an assembly-line worker; ChatGPT can draft a legal report or analyse a balance sheet. Intermediate cognitive professions are now directly affected.
This is not a technical problem. It is a political one. The question is not "what can AI do?" but "who decides what it will do, and who will benefit from the productivity gains?"
"AI does not delete jobs. It makes certain skills obsolete so quickly that human beings no longer have time to adapt."
In the same spirit: Precarity 2.0 · AI and inequalities
Precarity 2.0
Behind the app, thousands of invisible workers keep the digital economy running.
Ordering a meal on Deliveroo triggers a chain of human actions. Those humans are not employees. They are "independent partners" whose working conditions resemble wage labour without its protections.
Antonio Casilli uses the term digital labor to describe, among other things, the invisible workers who annotate the training data of AI systems. According to Time magazine (2023), the Kenyan annotators who helped train ChatGPT earned less than two dollars an hour.
"The algorithm manages. The human executes. The risk is for the human. The value is for the algorithm and for the large groups behind it."
In the same spirit: Automation and employment · AI and inequalities
Copyright and AI
AI learned to write by reading what humans had created. Did anyone ask permission?
Large language models were trained on massive corpora of human works (texts, images, music), often without their authors being consulted or paid. Thousands of artists and authors have joined class actions against OpenAI and Meta.
Current copyright law was not designed for this situation. Europe's AI Act now requires providers of general-purpose AI models to publish a summary of the content used for training. But enforcement remains difficult.
Beyond the law, it is a question of economic justice. If an AI learns to paint in the style of a specific illustrator and clients begin ordering illustrations in that style, the illustrator loses income without ever having granted that right.
"AI learns from everyone. It thanks no one."
In the same spirit: Dispossession of thought · Digital sovereignty
AI and inequalities
Access to AI is not universal. The digital divide is widening inside rich societies themselves.
We often speak of the digital divide between developed and developing countries. But another, less visible divide is growing inside the best-connected societies themselves: between those who use AI to augment their capabilities and those who merely endure it.
The free versions of generative tools are significantly less powerful than the paid versions. Languages under-represented in training data produce lower-quality results. AI can reduce inequalities or amplify them. The result depends on political choices, not on technology alone.
"AI is an amplifier. It amplifies what we feed it, including the inequalities already there."
In the same spirit: Automation and employment · Digital sovereignty
To situate the approach: A Digital Ethic.