“…decolonizing the curriculum means creating spaces and resources for a dialogue among all members of the university on how to imagine and envision all cultures and knowledge systems in the curriculum, and with respect to what is being taught and how it frames the world” (Keele University Manifesto for Decolonising the Curriculum)

The Department of Computer Science at Oxford was founded in 1957, making it one of the oldest university computer science departments in the UK. Although relatively young by Oxford's standards, it is nevertheless part of an ancient institution which, as has been acknowledged, has benefitted from and perpetuated attitudes and practices rooted in bias and prejudice. The Department recognises its position within this institution and understands that being non-racist is insufficient: carrying out research that is truly representative requires an anti-racist position. This includes working to understand what it means to decolonise the curriculum, and examining preconceptions that have been taken for granted for decades, if not centuries.

One aspect of our work here is a growing awareness in computer science, and its related disciplines, that new technologies can have a detrimental effect on individuals, communities and entire societies. We need to go beyond understanding these effects to recognise that they are often rooted in a colonial past which, even at its most benign, sought to impose Western standards and understandings on other countries, and at its worst enslaved and exploited local populations, creating divisions and hierarchies of value that are replicated in the vast datasets so often used in machine learning. We teach our students about these issues in courses such as Computers in Society and Ethics and Responsible Innovation, and encourage them to consider how such problems might be addressed. We believe this work must go further, to emphasise how global histories of domination and subjugation have shaped the structures of science our students see and the assumptions they encounter. We can further encourage them to reflect on their own role, both whilst they are studying and in their future careers, highlighting their responsibility to ask questions such as: What values do we have as computer scientists? How can we ensure we work for the social good? What assumptions are we bringing to our work?

Several of our research projects and activities also seek to co-create technological innovation and develop novel methods to address structural social inequalities. Using an "ethical hackathon" model, we work with students to consider how 'fairness' and responsibility mechanisms can be embedded into the design of tools and systems. For example, with colleagues in InSiS and at the Harare Institute of Technology in Zimbabwe, we broadened the ethical hackathon into a laboratory hackathon that tackled the challenge of resource scarcity in STEM education in southern Africa. The students who participated used their expert local knowledge of the STEM education context, and of the resources available, to prototype low-cost lab equipment that could be easily replicated. Similarly, several projects seek to develop low-cost, open-source conservation technology, including novel wildlife tracking solutions that use GPS, acoustics and motion monitoring. These projects focus on working with conservationists to create efficient, effective low-cost manufacturing and distribution models that can challenge commercial technology, which is often prohibitively priced. In these projects we are directed by local researchers and collaborators, and seek to support their work rather than imposing our own assumptions on them.

Much of our work also centres on positive ways in which innovations and technologies can be channelled to the benefit of society. One of our projects helps to train UK doctoral students in Responsible Research and Innovation, an approach to research and development that seeks to assess the possible harms a technology might generate and to change course to avoid undesirable effects. We believe computer science, and technology in general, should work for the benefit of society; this involves many things, including giving the under-represented a voice. Our Women in AI & Ethics conference in 2019 sought to give a platform to women working in STEM, who are often in an extreme minority in their teams or departments. This isolation can be amplified for women of colour, and we wanted to provide an opportunity to connect and collaborate.

Future work

We understand that our search for 'the best' students and staff must take into account factors such as representation and wider social impact, as well as academic excellence. To this end we are investigating 'blind' CV-sifting processes, in which identifying details are removed before applications are assessed, as research suggests that such anonymisation can have a significant positive impact on diversity.

We also understand that computer science itself has been characterised as a colonial system, exporting technology designed for particular cultural and social contexts to other parts of the globe without regard to local needs or contexts. In particular, our previous research points to an urgent need to decolonise digital innovation, digital content and digital data, and to explore how databases and images might support indigenous knowledge systems. To address this, we seek to incorporate perspectives from other areas of the world into our research and teaching, particularly within research groups such as Human Centred Computing.

Finally, we acknowledge the lessons that the events of 2020 made so clear, not only in other countries but in the UK and here in Oxford. The University, and this Department within it, seek to be the best we can be. This means making clear that we reject the conscious and unconscious biases of the past, and that we seek a future for our work that incorporates a multitude of voices at every level.