Mirko Schäfer

Associate Professor at Utrecht University, co-founder of the Data School, and its Science Lead.

BIO

Mirko Tobias Schäfer is Associate Professor at Utrecht University's research area Governing the Digital Society and the Department of Information & Computing Sciences. He is co-founder of the Data School and its Science Lead. Mirko is a Visiting Professor at the Helsinki Institute for Social Sciences & Humanities at the University of Helsinki.


In your work at Utrecht University’s Data School, you work a lot with external stakeholders. You have developed the concept of "entrepreneurial research". What are the key differences from traditional research approaches, and how does your approach benefit students, societal stakeholders and researchers alike?

First, it is important to stress that entrepreneurial research must not be confused with academic entrepreneurship. We do not utilize research results for commercial gain; we develop services and products that we can then utilize for academic research. Our entrepreneurship is focused on identifying needs for our activities within a societal sector and then developing a research project situated there that helps our external partners solve problems while simultaneously enabling us to gain insights for academic research, and even allows for societal intervention. The entrepreneurial quality of our activities lies in identifying opportunities for immersing ourselves in a societal sector and developing research projects that respond to the needs of the external partners but also feed our academic inquiry. We also gain funding through the cooperation with external partners and are largely independent of traditional funding schemes. The Data School works in two research areas: Responsible AI & data practices, and political debates on social media. For the latter we make use of digital methods, machine learning, text mining, network analysis etc., and for the former we often use qualitative methods. Our external partners are mostly public management and media organisations.

Our research practice closely resembles participatory action research. We have the ambition that our research leads to change and that we can provide applicable results. For us, effective knowledge transfer, the societal impact of our work, and creating learning opportunities for students and professionals are of utmost importance. We always seek opportunities to intervene and to contribute to building the checks and balances for a digital democracy. Our Data Ethics Decision Aid (DEDA), a process for evaluating data projects and shaping responsible design and practice, is widely used in public management organisations in the Netherlands and has recently been rolled out in Finland by the Digital and Population Data Services Agency. It is also available and used in German, English, Greek and Swedish.

Together with media companies, we identify which values inform their recommender systems, or how they can use data analysis to review their impact on societal debates. A collaboration with the Dutch weekly magazine De Groene Amsterdammer investigates how machine learning affects investigative journalism while actually developing ML applications. The combined team of our researchers and their journalists won the prestigious Dutch journalism award De Tegel for their investigation of misogyny and sexism against female politicians.

Summarizing the key differences from traditional research: our research starts with the external partners and their need to respond to challenges posed by the emergence of AI and data practices; we always have a component of applied research focused on problem solving and intervention; and our empirical research in the field must yield data and insights that inform the academic research.

For further reading on our practice, I recommend our paper “Investigating the Datafied Society. Entrepreneurial Research as Approach” (https://library.oapen.org/bitstream/handle/20.500.12657/61199/9789048555444.pdf?sequence=1#page=267).

How do infrastructures matter for citizens?  

Infrastructures are usually understood as physical infrastructures. In our research area, these might be data centres and related aspects such as energy supply. These matter for citizens as they affect environmental issues, real estate, labour, public spending and more. In the Netherlands, Meta’s attempt to build a data centre in a small municipality has led to heated debates about public resources and the compliance of platform companies. In Finland, the sanctions against Russia have affected the use of excess heat from a Russian-owned data centre for heating houses in its direct vicinity.

In our digitized world, the physical infrastructures also carry immaterial infrastructures, such as cloud services. Think of the software applications used for public administration, schools, hospitals, and in many other areas. Here, issues such as data sovereignty, vendor independence and interoperability affect citizens as well, even if they are not always experienced directly. These services have a profound impact on our societies, and citizens should be represented in decision-making on regulating and procuring them. This is already visible in the Data Act, the Digital Services Act, the GDPR, and the AI Act.

For a more inclusive deliberation on these issues, other infrastructures need to be considered. In the Netherlands, citizens participate in ethics committees of municipalities, and parties in city councils advocate for responsible use of data, scrutinize contracts with providers, or inquire how AI systems or data projects might violate fundamental rights. On school boards, informed and critical parents might question the procurement of services like Google Classroom and opt for open-source equivalents. In Scotland, the public administration has set up elaborate dialogues with citizens to define how to use AI. Oversight authorities are also now turning to inspecting algorithms in their respective societal sectors and to monitoring how companies comply with AI and data regulation. In addition, we see experiments with novel instruments such as the algorithm registers in municipalities in Finland, France and the Netherlands; these are attempts to provide more information for scrutiny and, eventually, accountability. We can see a development towards adapting our traditional checks and balances to the challenges of a digital society.

How should knowledge about infrastructures be communicated and what role can journalists, researchers or public officials play?  

The media, as the fourth estate, but also advocacy organisations and activists, are essential for covering these topics, providing critical commentary, rigorous investigation, and alternative proposals. Journalists well versed in technology and corporate affairs are needed to scrutinize companies, procurements, terms of service, and political decision-making in this realm.

Researchers can contribute to these conversations and also participate directly in informing policy or developing applicable solutions. At Utrecht University’s Data School, we do exactly this through educational formats for professionals, often in public management. We train city council members to understand the political qualities of data projects and how AI carries and transforms public values. This contributes to informed deliberation and decision-making. Our Fundamental Rights & Algorithms Impact Assessment (FRAIA) helps public administrations to evaluate AI systems prior to procurement or implementation and to put responsible practices of use in place. A large majority in the Dutch parliament opted for making it mandatory for all public management organisations, and the AI Act in its current form also demands such impact assessments. We train government employees in the Netherlands and abroad in using FRAIA. Beyond that, it inspires other public administrations to develop similar instruments: the Norwegian guidelines for preventing discrimination in AI are based on FRAIA. In addition, my colleagues and I serve on expert panels, advisory boards, and ethics committees of municipalities, provinces and national government organisations to provide hands-on advice and expertise in tackling the ethical, legal and social issues that arise from the use of AI.

In doing such things, researchers must not only work within academia and their various peer groups but also learn to understand what is needed in the respective societal sectors and how it can be effectively implemented. More about the challenges and opportunities of societally engaged research is outlined in this article: “Engaged research and teaching respond to urgency and needs in different societal sectors”.