Moral vs Technical Solutions

By Joshua Hovsha and Katie Doherty
February 6, 2020
13 min read

Writing in 1968, at a time of great faith in scientific progress, the ecologist Garrett Hardin sent out a warning to the world that technology alone is not enough. Hardin’s article “The Tragedy of the Commons” argued that the overexploitation of finite common resources cannot be fixed with ‘technical solutions’. Instead, he stressed that a revolution of moral imagination is the only way to overcome these challenges.

Those fighting for global action on climate change will be all too familiar with the problems Hardin highlighted. The tension between technical solutions and moral revolutions is key to this battle.


We may have already reached the moment when our faith in the capacity of technical progress has begun to collapse. More and more attention is being paid to the long-predicted ‘Techlash’.

This is the context in which three of our team woke up unnaturally early a couple of weeks ago to make our way to the annual CPDP (Computers, Privacy & Data Protection) Conference in Brussels.

The conference brought together over 1,000 policymakers, consultants, academics, scientists, civil society representatives, legal advocates, practitioners, IT experts and activists. The primary theme of CPDP 2020 was “Data Protection and Artificial Intelligence”, focusing on a range of ethical, legal and policy issues related to new technology and data analytics. At the core of these issues was the question of whether we are to find our solutions in technical innovation or in moral imagination.

One of the first talks we attended discussed issues posed by autonomous vehicles, including the classic ‘Trolley Problem’. The panel, made up of advocates, regulators and representatives from car manufacturers, discussed how autonomous vehicles may be the first product to bring AI to the masses, and the ethical issues that will ensue.

The Trolley Problem is a classic thought experiment used to test ethical standards by forcing us to decide who will live and who will die in an extreme situation. This once-unlikely thought experiment is now seen as a real issue requiring human decisions, as autonomous cars may face choices over whom to save and whom to sacrifice when loss of life on the road cannot be avoided.

This is a case in which technical advances have forced moral decisions upon us, rather than resolving them for us.

Another issue raised was how we define AI, for if we cannot define it, we cannot regulate it. AI is an umbrella term, and to have a meaningful discussion about regulation, ethical AI and the future of AI, we must be prepared to narrow down and define what we mean and what we do not mean.

Similar challenges may arise around the transformation of our homes, cars and shared public spaces into surveillance spaces, complete with digital assistants and constant monitoring.

Do we require technical solutions or a moral revolution to protect autonomy?

Some place great faith in the capacity of Big Privacy to overcome Big Tech. However, as a panel on Privacy Enhancing Technologies (PETs) noted, these technologies are simply tools. They may help in a given situation, but they cannot do so unguided.

As in our Trolley Problem, the tool cannot make the problem disappear. Instead, moral imagination is going to determine our path forward.

For some in the tech industry, the revolution in imagination means it is time to let go of outmoded beliefs in privacy. To borrow from Nietzsche, many will simply say that privacy is dead, and we are the ones who have killed it.

As Prof. Jennifer Golbeck of the University of Maryland notes, the self-congratulation would quickly end if we replaced the word ‘privacy’ with ‘consent’. Celebrating the end of consent is certainly not something that would get past a tech firm’s PR team.

So, we must ask: What does giving consent to use one’s personal data mean in a public space or in the context of a public transport system? What happens if you say no? Is your only choice simply to accept this invasion or to no longer use public spaces or public transport?

This brings us to the question of digital divides, as your personal resources will determine whether or not you can simply opt out of public spaces and transport. But what happens to the privacy rights of those with fewer resources?

Privacy as a luxury for the rich is not an idea which sits well. So we are left to determine what the cities of our future will look like, not only in terms of technology, but in terms of equality of access and protection. The technical revolution and its backlash are under way. It is up to us to ensure that the moral revolution accompanies them.
