A new study suggests that populations of artificial intelligence (AI) agents, similar to ChatGPT, can spontaneously develop shared social conventions through interaction alone.
The research, from City St George’s, University of London and the IT University of Copenhagen, suggests that when these large language model (LLM) agents communicate in groups, they do not simply follow scripts or repeat patterns but self-organise, reaching consensus on linguistic norms much as human communities do. The study is published today in the journal Science Advances.
LLMs are powerful deep learning models that can understand and generate human language, with the best-known example to date being ChatGPT.
“Most research so far has treated LLMs in isolation,” said lead author Ariel Flint Ashery, a doctoral researcher at City St George’s, “but real-world AI systems will increasingly involve many interacting agents. We wanted to know: can these models coordinate their behaviour by forming conventions, the building blocks of a society? The answer is yes, and what they do together can’t be reduced to what they do alone.”
In the study, the researchers adapted a classic framework for studying social conventions in humans, based on the “naming game” model of convention formation.
In their experiments, groups of LLM agents ranged in size from 24 to 200 individuals. In each experiment, two agents were randomly paired and asked to select a ‘name’ (e.g. a letter of the alphabet or a random string of characters) from a shared pool of options. If both agents selected the same name, they earned a reward; if not, they received a penalty and were shown each other’s choices.
Agents only had access to a limited memory of their own recent interactions—not of the full population—and were not told they were part of a group. Over many such interactions, a shared naming convention could spontaneously emerge across the population, without any central coordination or predefined solution, mimicking the bottom-up way norms form in human cultures.
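The interaction loop described above can be sketched in a toy simulation. This stand-in uses simple frequency-matching agents rather than LLMs, and every parameter value (pool size, memory length, number of rounds) is illustrative, not taken from the paper:

```python
import random
from collections import Counter, deque

# Illustrative parameters -- chosen for this sketch, not from the study.
POOL = list("ABCDEFGHIJ")   # shared pool of candidate names
N_AGENTS = 24               # population size (the study used 24 to 200)
MEMORY = 5                  # each agent remembers only its recent interactions
ROUNDS = 20000              # number of random pairwise interactions

random.seed(0)
# Each agent's memory holds the names seen in its recent interactions.
memories = [deque(maxlen=MEMORY) for _ in range(N_AGENTS)]

def choose(mem):
    """Pick the most frequent name in memory; pick at random if it is empty."""
    if not mem:
        return random.choice(POOL)
    return Counter(mem).most_common(1)[0][0]

for _ in range(ROUNDS):
    # Randomly pair two agents, as in the naming game.
    a, b = random.sample(range(N_AGENTS), 2)
    na, nb = choose(memories[a]), choose(memories[b])
    # Both agents record both choices, so locally popular names are
    # reinforced -- no central coordination, no view of the whole group.
    memories[a].extend([na, nb])
    memories[b].extend([na, nb])

final = Counter(choose(m) for m in memories)
name, count = final.most_common(1)[0]
print(f"Dominant convention: {name!r}, shared by {count}/{N_AGENTS} agents")
```

The point of the sketch is structural: each agent sees only its own recent pairings, yet a population-wide convention can still emerge from purely local reinforcement.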
Even more strikingly, the team observed collective biases that couldn’t be traced back to individual agents.
“Bias doesn’t always come from within,” explained Andrea Baronchelli, Professor of Complexity Science at City St George’s and senior author of the study, “we were surprised to see that it can emerge between agents—just from their interactions. This is a blind spot in most current AI safety work, which focuses on single models.”
In a final experiment, the study illustrated how these emergent norms can be fragile: small, committed groups of AI agents can tip the entire population toward a new naming convention, echoing the well-known tipping-point, or ‘critical mass’, dynamics seen in human societies.
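The committed-minority dynamic can be illustrated with a variant of the same kind of toy model. Again, the agents here are simple frequency-matchers rather than LLMs, and the minority size is a made-up value for illustration; the study itself characterises where the actual tipping point lies:

```python
import random
from collections import Counter, deque

# Illustrative parameters only -- not the study's actual values.
POOL = ["A", "B"]
N_AGENTS = 50
N_COMMITTED = 15        # committed minority that always says "B"
MEMORY = 5
ROUNDS = 30000

random.seed(1)
# Start the whole population on convention "A".
memories = [deque(["A"] * MEMORY, maxlen=MEMORY) for _ in range(N_AGENTS)]

def choose(i):
    """Committed agents never waver; the rest copy their most-seen name."""
    if i < N_COMMITTED:
        return "B"
    return Counter(memories[i]).most_common(1)[0][0]

for _ in range(ROUNDS):
    a, b = random.sample(range(N_AGENTS), 2)
    na, nb = choose(a), choose(b)
    memories[a].extend([na, nb])
    memories[b].extend([na, nb])

# Count how many non-committed agents have switched to "B".
adopters = sum(choose(i) == "B" for i in range(N_COMMITTED, N_AGENTS))
print(f"{adopters}/{N_AGENTS - N_COMMITTED} non-committed agents now say 'B'")
```

Whether the minority flips the majority depends on its relative size, which is exactly the ‘critical mass’ effect the experiment probes.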
The results were also robust across four different LLMs: Llama-2-70b-Chat, Llama-3-70B-Instruct, Llama-3.1-70B-Instruct and Claude-3.5-Sonnet.
As LLMs begin to populate online environments – from social media to autonomous vehicles – the researchers see their work as a stepping stone to further explore how human and AI reasoning converge and diverge. The goal is to help combat some of the most pressing ethical dangers posed by LLMs propagating biases fed into them by society, which may harm marginalised groups.
Professor Baronchelli added: “This study opens a new horizon for AI safety research. It shows the depth of the implications of this new species of agents that have begun to interact with us—and will co-shape our future. Understanding how they operate is key to leading our coexistence with AI, rather than being subject to it. We are entering a world where AI does not just talk—it negotiates, aligns, and sometimes disagrees over shared behaviours, just like us.”
The peer-reviewed study, “Emergent Social Conventions and Collective Bias in LLM Populations,” is published in the journal Science Advances.
ENDS
Notes to editors
For media enquiries, including requests for a copy of the research paper shared in confidence ahead of the embargo lifting, please contact the corresponding author, Professor Andrea Baronchelli (a.baronchelli.work@gmail.com, andrea.baronchelli.1@citystgeorges.ac.uk), cc’ing press officer Dr Shamim Quadir (shamim.quadir@citystgeorges.ac.uk).
Link to research paper once embargo lifts
https://doi.org/10.1126/sciadv.adu9368
Media Contact
For media enquiries, contact Dr Shamim Quadir, Senior Communications Officer, School of Science & Technology, City St George’s, University of London. Tel: +44 (0)20 7040 8782; email: shamim.quadir@citystgeorges.ac.uk.
Expert Contact
Contact the corresponding author, Andrea Baronchelli, Professor of Complexity Science, Department of Mathematics, School of Science & Technology, City St George’s, University of London. Tel: +44 (0)20 7040 8124; email: andrea.baronchelli.1@citystgeorges.ac.uk, a.baronchelli.work@gmail.com.
About the academics
Professor Andrea Baronchelli is a world-renowned expert on social conventions, a field he has been researching for two decades. His pioneering work includes the now-standard naming game framework, as well as groundbreaking lab experiments showing how humans spontaneously create conventions without central authority, and how those conventions can be overturned by small committed groups.
About City St George’s, University of London
City St George’s, University of London is the University of business, practice and the professions.
City St George’s attracts around 27,000 students from more than 170 countries.
Our academic range is broadly based, with world-leading strengths in business; law; health and medical sciences; mathematics; computer science; engineering; social sciences including international politics, economics and sociology; and the arts including journalism, dance and music.
In August 2024, City, University of London merged with St George’s, University of London, creating a powerful multi-faculty institution. The combined university is now one of the largest suppliers of the health workforce in the capital, as well as one of the largest higher education destinations for London students.
City St George’s campuses are spread across London in Clerkenwell, Moorgate and Tooting, where we share a clinical environment with a major London teaching hospital.
Our students are at the heart of everything that we do, and we are committed to supporting them to go out and get good jobs.
Our research is impactful, engaged and at the frontier of practice. In the last REF (2021), 86 per cent of City research was rated as ‘world-leading’ (4*, 40 per cent) or ‘internationally excellent’ (3*, 46 per cent), and 100 per cent of St George’s impact case studies were judged ‘world-leading’ or ‘internationally excellent’. As City St George’s, we will seize the opportunity to carry out interdisciplinary research that will have a positive impact on the world around us.
Over 175,000 former students in over 170 countries are members of the City St George’s Alumni Network.
City St George’s is led by Professor Sir Anthony Finkelstein.
Journal
Science Advances
Method of Research
Computational simulation/modeling
Subject of Research
Not applicable
Article Title
Emergent Social Conventions and Collective Bias in LLM Populations
Article Publication Date
14-May-2025
COI Statement
There are no competing interests to declare.