09/23/2025 | News release | Archived content
From chatbots that explain AI models to the question of whether we feel excluded by robots: the FAITH focus area at Bielefeld University is researching, in an interdisciplinary and practice-oriented way that takes a social perspective, how humans and artificial intelligence work together. Current projects show why hybrid teams are more than just a technical experiment: they are changing our understanding of work.
AI is no longer just a tool; it is increasingly becoming an interaction partner: it recognizes situations, makes predictions, and suggests solutions. But how must an AI system be designed so that it works in a team with humans as a trustworthy, fair, and competent partner? FAITH brings together researchers from psychology, sociology, linguistics, and computer science to explore this question.
© Stefan Sättele
'We are currently experiencing a phase in which AI systems are profoundly changing our working world, not only technically but also socially,' says Professor Dr Philipp Cimiano, spokesperson for the focus area. 'FAITH is our response to this development. We don't want AI to replace people. We want them to complement each other in a purposeful and constructive manner.'
FAITH goes beyond classic human-machine research. It focuses on the team with all the challenges that teams involve: trust, fairness, role understanding, autonomy. The researchers are investigating how such hybrid teams emerge in the first place, how they organize themselves, and what social and organizational consequences this has.
What happens, for example, when humans are socially excluded by robots? Academics are asking this unusual question as part of the SAIL joint project 'Sustainable Life-Cycle of Intelligent Socio-Technical Systems', with surprising results.
Studies by psychologist Clarissa Sabrina Arlinghaus show that when people are ignored or excluded by AI, they react emotionally, similar to the way they would react to interpersonal exclusion. This suggests that we increasingly perceive AI systems as social beings, and we have corresponding expectations of fairness and respect.
This has far-reaching consequences for the design of AI in the workplace: 'If robots act as social actors, they must also abide by social rules,' emphasizes Cimiano.
One example of FAITH's practical research is Project B01 in the TRR 318 Collaborative Research Centre. Here, researchers are investigating how to design dialogue systems so that humans can understand complex AI models.
An applied example is the police force in North Rhine-Westphalia, Germany. They use AI-supported prediction models to identify potential burglary hotspots. However, it is not always clear why a particular neighbourhood is classified as high-risk, neither for police officers nor for citizens.
An interactive chatbot aims to change this. Instead of providing one-sided explanations, the system engages in a genuine dialogue: it asks questions, allows follow-up questions, and adapts to the user's knowledge level. 'It was a real change of perspective,' says Cimiano. 'Particularly in a police context, communication is often hierarchical. Our system focuses on participation. That creates trust and acceptance.'
The team led by Professor Anja Abendroth at the 'NRW Research College Work 4.0: Designing Flexible Working Environments' is investigating the operational conditions under which these changes take place. The focus is on algorithmic work instructions and their impact on work autonomy.
An analysis of employees in large companies reveals that the extent to which algorithmic control has unintended consequences for the quality of employees' work depends heavily on how well employees are integrated into their team, and on whether they can continue to contribute their own skills. 'It's also about how the use of AI in the workplace is socially prepared, negotiated, and regulated,' says Abendroth.
The question of how we work together with AI goes far beyond technical processes. It is about responsibility, participation, education, and equal opportunities in an increasingly automated world. 'We are living in a time when voices are being raised saying that soon we will no longer need programmers, for example,' says Cimiano, referring to statements such as that of Mark Zuckerberg. 'We see things differently. People have unique abilities, and we want to develop systems that respect and complement these, not replace them.'
Spokespersons: Professor Dr Philipp Cimiano, Professor Dr Anja Abendroth, Professor Dr Stefan Kopp, Professor Dr Günter W. Maier, Professor Dr Sina Zarrieß
Participating faculties: Faculty of Technology, Faculty of Sociology, Faculty of Psychology and Sport Science, Faculty of Linguistics and Literary Studies, Faculty of Business Administration and Economics, and the Medical School OWL.
Website: FAITH - Bielefeld University