Moral conflicts between groups are inevitable in modern life, writes Harvard University professor Joshua Greene in his book Moral Tribes: Emotion, Reason, and the Gap Between Us and Them (Penguin, 2013). The tendency to separate ourselves into distinct groups arose from the tribal lives of our ancestors, who had to get along with members of their own tribe and fend off threats from members of other tribes to survive. As technology and a population explosion draw them ever closer together, groups—from neighbors to companies to nations—face the difficult challenge of negotiating differences involving questions of fairness, values, and morality.
In Moral Tribes, Greene explains why moral conflicts between groups are so challenging and offers a new spin on an old philosophical theory as a means of resolving these differences through negotiation.
A common tragedy
Human conflicts typically spring from one of two challenges, writes Greene. First, we face the problem of cooperation, or the tension between doing what is best for ourselves as individuals and doing what is best for society at large.
The problem of cooperation is often illustrated with ecologist Garrett Hardin’s well-known parable, “the tragedy of the commons.” According to the parable, a single group of herders shares a common pasture. Each herder has a self-interested motive to add to his flock to maximize his profits when he sells the animals at market. At the same time, each herder has a broader interest in ensuring that the pasture is managed sustainably: if flocks grow too big, the pasture will be depleted, and there will be nothing left for any of the animals to eat.
As the parable illustrates, the problem of cooperation is difficult but not unsolvable, writes Greene, thanks in part to how our sense of morality has evolved. We have become open to cooperating with other members of the groups to which we belong. For example, we aspire to obey laws, treat others kindly, and engage in actions that will benefit society, such as recycling. Our natural inclination to behave ethically is often reciprocated by others.
When morality isn’t commonsense
The second challenge is harder: cooperating with members of groups to which we do not belong. Greene illustrates this dilemma with a new parable, “the tragedy of commonsense morality.”
According to this parable, four tribes of herders live on the four sides of a great forest. Each tribe has established very different rules based on differing conceptions of morality. One tribe, for instance, gives each family the same number of sheep to be tended on a common pasture, another tribe gives each family its own plot of land and allows families to buy land from one another, and so on.
One summer, a fire followed by heavy rains transforms the forest separating the four tribes into a perfect grazing pasture. The tribes rush in to claim the land, each trying to impose its own morality: one tribe insists that the pasture be managed jointly by all four tribes, another tries to divide up the land, and so on. These differing views turn into bitter fights that eventually lead to violence. Disputes escalate further when tribal members violate the rules and customs of other tribes.
The tribes have moral conflicts “not because they are fundamentally selfish but because they have incompatible visions of what a moral society should be,” writes Greene. This view explains why individuals, organizations, political parties, and governments clash on issues ranging from the appropriate role of government in citizens’ lives to gay marriage to environmental crises. Even a swimming pool can become a moral battleground between those who view it as a resource to be shared and those who do not.
What’s fair is fair?
Notably, our sense of what constitutes a fair solution to moral conflicts depends a great deal on where we stand. In 1995, for example, U.S. News & World Report asked some of its readers, “If someone sues you and you win the case, should he pay your legal costs?” Eighty-five percent of respondents said yes. Other readers were asked the mirror-image question: “If you sue someone and lose the case, should you pay his costs?” Only 44% of this group said yes. The discrepancy suggests that our sense of fairness is biased by egocentrism, or the tendency to have difficulty seeing a situation from another person’s perspective. We assume that we are much less likely than others to file a frivolous lawsuit, and thus that we should not be held responsible for the other side’s costs if we lose.
Research by Carnegie Mellon University professors Linda Babcock and George Loewenstein and their colleagues supports this conclusion. In one set of experiments, they assigned participants to the role of plaintiff or defendant in a simulated settlement negotiation based on a real case involving a motorcyclist who had been hit by a car. Before negotiating, the participants read materials about the actual case and were asked to guess what the judge in that case had deemed to be a fair settlement. Despite knowing that they would be rewarded for the accuracy of their guesses, those assigned to play the plaintiff made significantly higher guesses about the judge’s award than those assigned to play the defendant.
Moreover, the farther apart plaintiffs’ and defendants’ guesses were, the more likely they were to reach an impasse in their subsequent negotiation. By contrast, participants who were not told which role they would be playing in the negotiation when they made their guesses were much more likely to reach agreement.
This research suggests that egocentrism and our tendency to perceive fairness in self-serving ways make it difficult to reach agreement, whether we are making judgments on our own behalf or on behalf of the groups to which we belong, including our families and organizations.
Egocentrism, biased fairness, and other common cognitive biases, including the tendency to escalate commitment to conflict, help explain why an argument may recur not just over days or weeks, but sometimes over years.
A utilitarian solution to moral conflicts
If we are hardwired to follow our own group’s view of what constitutes a fair and moral outcome, then how can we effectively negotiate with those who see things differently?
In Moral Tribes, Greene writes that utilitarianism, the philosophy developed by Jeremy Bentham in the late 18th century and refined by John Stuart Mill in the 19th, offers the best framework to help us fairly and rationally resolve our disagreements. Utilitarianism dictates that we should choose the solution that maximizes happiness and reduces suffering in society at large. Under utilitarianism, the moral value of an action is determined only by its outcome.
As “Of Footbridges and Trolleys,” below, suggests, our emotions and biases can prevent us from applying utilitarian logic consistently. But when considered carefully, utilitarianism’s goal of creating the greatest amount of happiness guides us toward the morally correct action. As Greene discusses in his book, it offers solutions to the greatest moral conflicts of our time, including health care and carbon emissions. And it suggests that all the residents of the London Terrace apartment complex should share the benefits and costs of the swimming pool.
6 rules for resolving moral conflicts
How can we navigate our most contentious “us versus them” negotiations? Greene offers six rules, summarized here:
1. Question your moral instincts. In the face of moral controversy, such as a dispute over an area of land that one or more sides view as sacred, it would be a mistake to rely on intuition alone.
2. Avoid appeals to rights and duties. People often try to “win” moral dilemmas by asserting their rights or calling on others to meet their obligations. Unfortunately, such assertions preclude compromise and end negotiations before they can even begin.
3. Focus on the facts. Before forming strong views on policies and practices we don’t fully understand, we should seek objective, scientific evidence about how they would actually play out in the real world—and require our counterparts to do the same.
4. Beware biased fairness. As noted earlier, we have incentives to view proposals that favor our side as fair. We need to learn to look beyond this bias and consider each side’s perspective.
5. Use a common currency. All of us want to be happy and to be treated well by others. These desires form a “common currency” that allows us to negotiate principled compromises.
6. Make small sacrifices. The logical conclusion of utilitarianism is that those who have resources to spare should share whatever they can with the needy. Though few of us are willing to make this sacrifice, we can at least strive to be more generous toward members of other tribes near and far.
Of footbridges and trolleys: Exposing bias in our negotiations
Our emotions and cognitive biases often prevent us from making decisions that would maximize happiness for all parties in a negotiation and in other realms.
The best illustration of this fact may be philosopher Judith Jarvis Thomson’s hypothetical “footbridge dilemma,” paraphrased here:
Imagine that a runaway trolley is bearing down on five railway workers. You are standing on a footbridge above the tracks, midway between the trolley and the five workers, next to a railway worker who is wearing a large backpack. The only way to save the five workers is to push the worker standing next to you onto the tracks. His body and backpack would stop the trolley from reaching the other workers. You are not big enough to stop the trolley, and there’s no time for you to put on the backpack. Do you push the worker to his death to save the five people below?
For most people, the answer is a resounding no. Yet utilitarianism would support the decision to push the man off the footbridge because doing so would promote the greater good: One person would die instead of five.
Interestingly, people’s choices change dramatically when the problem is tweaked. In this alternate version, known as the “trolley problem,” you can save the lives of the five workers on the track by flipping a switch that diverts the trolley onto a side track where a single worker stands.
For many of us, flipping the switch seems like a regrettable but moral action. Because flipping a switch distances us from the physical violence of pushing someone to his death, we are able to make the tough decision to sacrifice one life to save five. Together, the two problems attest to the difficulty of applying utilitarian logic uniformly to our decisions.
Have you encountered similar moral conflicts? How did you resolve them?