Software now governs hospital records, legal judgments, and financial fates. Yet accountability remains diffuse. A meditation on moral responsibility in the age of systems.
In 2024 I built a Legal Case Management System — software that would track the legal fates of real people. For the first time, I felt the weight of a question philosophers have long wrestled with: what does it mean for a human to be responsible for a machine's decision?
Hannah Arendt wrote about the "banality of evil" — harm caused not by monsters but by functionaries who never asked moral questions. I wonder whether we software engineers are today's bureaucrats, building systems whose downstream effects we rarely trace.
Consider: a bug in a case-management system could delay a hearing. A delay could mean weeks more in pre-trial detention for someone who is innocent.
Code is not neutral. It encodes assumptions, power structures, and values. The least we can do is know whose values we're embedding.
Abdi Tefera
Software Engineer · Addis Ababa, Ethiopia