Image: The joint research area of Paderborn University and Bielefeld University focuses on social artificial intelligence.
Credit: TRR 318
The Transregional Collaborative Research Centre ‘Constructing Explainability’ (TRR 318) at Paderborn and Bielefeld Universities, funded by the German Research Foundation (DFG), will begin its second funding phase in January 2026. Today, the DFG announced that the interdisciplinary research collaboration on ‘social artificial intelligence’ will be extended for a further three and a half years, following a successful first term of four and a half years. Around 14 million euros in funding has been approved.
‘This decision highlights the importance of research into social artificial intelligence and demonstrates the exceptional interdisciplinary expertise, at an elite international level, that we are pooling here at Paderborn and Bielefeld Universities’, said Professor Matthias Bauer, President of Paderborn University. ‘The universities are collaborating on this project as strong regional partners. This collaboration clearly showcases the research strength and innovative capacity that set Ostwestfalen-Lippe apart – this Collaborative Research Centre is an international flagship for the region’, added Professor Angelika Epple, Rector of Bielefeld University.
TRR 318 is a research collaboration between scholars from the fields of computer science, linguistics, media studies, philosophy, psychology, sociology and economics. Since July 2021, the team has been examining how artificial intelligence (AI) can be made more comprehensible. This work goes beyond traditional approaches to what is known as ‘explainable artificial intelligence’ and makes a vital contribution to developing systems that adapt to users’ needs.
Explanation as a two-way process
Over the last four and a half years, the research conducted by TRR 318 has demonstrated that explanations are only effective if they take into account the perspective of the person receiving them. ‘Although we often want a perfect explanation, one delivered as a monologue may not succeed. A dialogue, by contrast, gives the people on both sides the opportunity to actively shape what they understand and how’, explained Professor Katharina Rohlfing, spokesperson for TRR 318 and Professor of Psycholinguistics at Paderborn University.
In empirical studies, the researchers examined how people use language, gestures and reactions to signal what they have understood, and how this information could be used in AI systems. The team also tackled the vital question of how explainability manifests itself in everyday contexts, conducting interviews about AI applications that people already use in daily life. Current developments, such as the release of large language models like ChatGPT, were incorporated into the research at an early stage.
Next steps: spotlight on context
The second funding phase will focus in particular on the context in which explanations are given. The scholars apply this term to various situations, settings, people and interpretations, as well as to the shared knowledge built up through dialogue. In the next phase, they will examine how explanations should be adapted to contextual circumstances. ‘In one context, a brief, technical explanation may be helpful, whilst another requires a more detailed, everyday approach. Explanation requirements can also vary within a single setting’, said Professor Philipp Cimiano, deputy spokesperson for TRR 318 and Professor of Semantic Databases at Bielefeld University. In future, AI systems should be able to respond to such changes and flexibly shape their dialogue with users.
‘We are delighted about the trust the DFG has placed in us’, said Professor Katharina Rohlfing. ‘In the second phase, we look forward to developing a more social form of explainable AI and applying our findings in practical settings, so that AI explanations are comprehensible, helpful and relevant for users.’
Interdisciplinary collaboration continues
TRR 318 brings together more than 60 researchers across 20 sub-projects in seven scientific disciplines. The projects are organised into three research areas and supplemented by a graduate school that promotes young academic talent. The extension enables this outstanding interdisciplinary collaboration to continue and further consolidates the role of Paderborn and Bielefeld Universities as strong national and international hubs of AI research.