11/06/2025 | News release | Distributed by Public on 11/06/2025 03:51
How can we harness the power of data without compromising privacy? That's the question Chalmers and the University of Gothenburg want to equip future engineers to answer. With the new master's course Data Privacy, students will learn how to combine data-driven innovation with strong privacy protection.
Across Europe, organizations in health, mobility, finance, and the public sector face a growing dilemma: how to unlock the value of data without exposing people's private lives. As digitalization and AI accelerate, data has become a critical resource - but it also brings rising privacy risks and increasing regulatory demands through frameworks such as the GDPR, the Data Governance Act, and the European Health Data Space.
To meet these challenges, Chalmers and the University of Gothenburg are launching a new master's course in Data Privacy. The course equips students with science-based methods to analyze and share data responsibly, combining technical innovation with regulatory compliance and privacy protection.
"Digital transformation can only be sustainable when privacy is engineered into the core of our systems. This course bridges the gap between research and practice, giving students the tools to build data analysis solutions that are both innovative and compliant," says Alejandro Russo, professor at the Department of Computer Science and Engineering, and course responsible for the new course.
Students learn to address realistic societal and industrial needs using Differential Privacy - a mathematical technique that enables analysis of patterns and trends in data about groups while safeguarding information about individuals. The course was initiated by Alejandro Russo, whose research evolved from secure programming languages to privacy-preserving data analytics platforms. It is grounded in academic work at Chalmers and the University of Gothenburg and draws on practical experience from his research-based startup DPella, which develops tools for responsible data analyses.
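The core idea of Differential Privacy can be illustrated with a small sketch: a counting query over a dataset is answered with calibrated random noise added, so that any single individual's presence or absence changes the answer's distribution only slightly. The example below is a minimal illustration (not the course material or DPella's implementation); the function name, data, and parameters are invented for demonstration.

```python
import math
import random

def dp_count(values, predicate, epsilon):
    """Differentially private count using the Laplace mechanism.

    A counting query has sensitivity 1: adding or removing one person
    changes the true count by at most 1. Adding Laplace noise with
    scale 1/epsilon therefore yields epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    # Inverse-CDF sampling from Laplace(0, 1/epsilon):
    # u is uniform in [-0.5, 0.5); the log term gives the Laplace shape.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(math.log(1 - 2 * abs(u)), u)
    return true_count + noise

# Hypothetical usage: how many people in a survey are 65 or older?
ages = [34, 71, 52, 68, 29, 80, 45, 66, 73, 58]
noisy_answer = dp_count(ages, lambda a: a >= 65, epsilon=1.0)
```

A smaller epsilon means more noise and stronger privacy; an analyst sees group-level trends while any one individual's data is masked by the randomness.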
Through hands-on exercises inspired by real scenarios, students perform "red team" and "blue team" activities - identifying privacy vulnerabilities and then designing systems resilient to them. They also gain insights from guest lectures: leading experts address the GDPR and EU data initiatives, and AI Sweden presents its LeakPro project on privacy risks and protections in AI systems.
The course offers a level of mathematical depth comparable to courses at other top universities. The content is closely linked to ongoing research and builds on material from flagship international conferences such as IEEE Security & Privacy, USENIX Security, and the ACM Conference on Computer and Communications Security (CCS).
Open to both master's and doctoral students, the course expects around 40 participants in its first run. By combining theory with realistic scenarios, it provides students with the skills needed to design responsible, privacy-aware data systems for a data-driven society.