
Utilitarianism offers a powerful, formulaic approach to normative ethics, in which we judge the right thing to do by its consequences (hence ‘consequentialism’). It is a popular and persuasive theory.

The hard task for the utilitarian, though, is to define what kinds of consequences should be maximised in order to bring the most good to the world. Philosophers such as Jeremy Bentham, John Stuart Mill, and Peter Singer define ‘good’ in terms of outcomes like more pleasure and less suffering for the greatest number of people overall. But utilitarianism, if it is the best approach, carries some strange logical consequences.

One is the ‘utility monster’: Robert Nozick’s hypothetical being who derives such enormous pleasure from consuming resources that utilitarianism seems to mandate feeding its pleasure, on the premise that it outweighs everyone else’s suffering.

One conceivable real-world example is a strident free-speech advocate who takes such disproportionate pleasure in supporting and exercising free speech that it swamps the suffering caused by their hate speech.

A similar example is people who love to eat bacon: the sensory pleasure they gain, combined with the pleasure of exercising their freedom to eat whatever they want, might be claimed to outweigh the suffering involved.

There are a number of problems for utilitarianism here. When tallying pleasure, good and bad people aren’t necessarily distinguishable, nor are humans and other animals. Moreover, we’re logically required to prioritise pleasure for people who are entitled and feel sorry for themselves.

Then again, the utility monster is just a theoretical possibility: in most real-world scenarios, no one would reason that we should feed a ‘monster’ ahead of other people. But, intriguingly, if we were morally mandated to give pleasure to as many people as possible, whether ‘good’ or ‘bad’, why wouldn’t we? Wouldn’t it then make sense to hook up everyone’s brain to pleasure-simulating devices (à la The Matrix (1999))? And, in a world where artificial wombs are on the horizon, and have already been made for premature lambs, are we already on our way there?