"First, do no harm" - transdisciplinarity and responsible software engineering


Transdisciplinary Research for Responsible Software Engineering

When we think about professions guided by the principle 
“First, do no harm,” medicine usually comes to mind. Although the phrase doesn’t actually appear in the Hippocratic Oath, the spirit of it permeates healthcare: a duty to safeguard the wellbeing of the public by maximising the potential benefit and minimising the risk of harm. Software engineering now finds itself facing a similar responsibility.

Digital systems shape our safety, our autonomy, our relationships, and our access to opportunity. As software increasingly blends with the physical and social world, the boundaries between the technical and the human become blurred. This means software engineers must grapple with a fundamental ethical question: How do we build technology that maximises benefits while minimising the risk of harm?

From Clear Boundaries to Complex Realities
Traditional software engineering assumed neat, well‑defined system boundaries. We designed algorithms, data structures, and protocols in relative isolation from the messy real world. But modern systems are:
  • Cyber‑physical (think IoT, smart homes)
  • Socio‑technical (think social media, digital identity, automated decision-making)
  • Deeply embedded in political, cultural, psychological, environmental and health contexts

A login mechanism can influence someone’s sense of security. A recommendation algorithm can shape a public conversation. A poorly designed system can put a vulnerable person at risk. Technical design choices now ripple outward into society.

This is why responsible software engineering—ensuring software maximises benefits while minimising harm—is no longer optional. It’s a professional and societal imperative.

Why Interdisciplinary Approaches Aren’t Enough
Many disciplines have valuable perspectives to offer: ethics, law, psychology, human‑computer interaction, sociology, security engineering and more. Interdisciplinary collaborations bring these fields together to enrich the technical domain.

But even this isn’t sufficient.

Interdisciplinary work typically starts with research questions defined by researchers. Yet responsible software engineering requires us to understand lived experience: how real people, in real contexts, encounter real risks.

This calls for transdisciplinary research—an approach where:

  • Researchers from multiple disciplines work with practitioners, stakeholders, and affected communities.
  • The problem space is co‑defined.
  • Diverse expertise is integrated to produce actionable, context‑specific solutions.
  • Knowledge extends beyond academia to include practical, experiential know‑how.

In short: we cannot design responsible digital systems without involving the people who live with their consequences.

Transdisciplinarity “In the Small” - The Centre for Protecting Women Online
Large‑scale transdisciplinary programmes can be resource‑intensive and difficult to sustain. But targeted, problem‑specific initiatives—transdisciplinarity in the small—can unlock meaningful progress. The Centre for Protecting Women Online (https://cpwo.open.ac.uk/), based at The Open University, offers a powerful example.

Why focus on online safety for women and girls? The UN defines violence against women as any gender‑based harm—physical, sexual, or psychological—whether threatened or enacted. Increasingly, these harms are enabled, amplified, or entirely created through digital technologies: harassment, stalking, coercive control, doxxing, and more. Understanding such harms requires input from survivors, frontline support workers, social workers and law enforcement, policy makers, technologists, ethicists, designers and developers.

By bringing these groups together, the Centre aims to enable the development of “safe by design” software technologies and improve the broader digital ecosystem for women and girls. This work not only identifies risks but co‑creates tools, frameworks, and processes that help developers build safer, pro‑social technologies—from privacy‑aware design guidelines to guardrails that limit harmful outcomes that might result from the integration of generative AI into software products.

Challenges on the Path to Transdisciplinarity
Doing this work isn’t easy. Some barriers are structural, others cultural:
  • Funding mechanisms tend to prioritise single‑discipline research.
  • PhD training often doesn’t prepare researchers to collaborate across domains.
  • High‑profile publication venues rarely encourage transdisciplinary output.
  • Teams need to bridge differences in terminology, methods, and expectations.
  • Practitioners and communities need time and support to participate meaningfully.

Yet the benefits—impact, innovation, and inclusion—make it worth pursuing.

A Credo for Transdisciplinary Responsible Software Engineering
To support teams and individuals who want to pursue a transdisciplinary research agenda for responsible software engineering, I propose the following credo:
  • Competence – bring your expertise, but recognise its limits.
  • Courage – face complex problems and ethical trade‑offs honestly.
  • Curiosity – seek out the perspectives you don’t have.
  • Openness – embrace new methods, models, and voices.
  • Focus – anchor your efforts in real-world harm and specific problem contexts.

The Takeaway: Building Technology That Truly Helps
Responsible software engineering doesn’t mean slowing innovation. It means doing it better—by grounding our design choices in lived experience, understanding harm in all its forms, and incorporating diverse expertise from the start.

To “first, do no harm,” software engineers must look beyond the technical system and towards the human system it affects. And the only way to do that well is through transdisciplinary collaboration.

This post is a summary of the keynote talk I delivered at the Responsible Software Engineering Workshop, which took place as part of the 2025 ACM International Conference on the Foundations of Software Engineering (FSE 2025). It is an edited version of a summary produced using MS Copilot based on the talk abstract and slides.
