
Recently I had a falling-out with a friend I have known since we were both teenagers. Had you asked me back then whether I thought our relationship could ever morph into something that felt uncertain and fragile, I would have stared at you in disbelief and bewilderment.
The friction with my friend led me to reflect on how I, as a moral agent, decide between ethical¹ choices. It made me consciously attuned to my gut reactions to moral issues, and to how intuition can serve as a moral guide. I find it fascinating how rapidly the body can weigh in on right and wrong, drawing on complex social information about other people, norms and values, and possible future outcomes.
Before I delve into some of the science behind gut-level morality, let me first describe how the disagreement with my friend evoked contemplation about ethical decision-making.
A Cocktail of Emotions
The falling-out induced the familiar emotional suspects in such circumstances: anger, sadness, disappointment, guilt, and confusion, to name but a few. The mix of these potent emotions tempted me to contact another good friend, who is also good friends with the first one.
The temptation made me feel a sense of relief and ease, and I felt energized. I pictured myself sharing screenshots of the alienating text conversation I had had with my old friend. I imagined our mutual friend becoming indignant on my behalf, defending me, and criticizing the other. The idea of sharing the perceived injustice that had been done to me felt good.
However, moments after the surge of positive emotions, I felt awful; a gut feeling of wrongness arose before any deliberation about right and wrong; before any weighing of reasons for one or the other choice; and before any awareness of ethical virtues or consideration of outcomes. It felt as if something or someone within me — an inner moral philosopher — communicated to me via intuition and emotion. As if he was saying, “You will regret this.”
Involving our mutual friend in the disagreement by placing him in an awkward and difficult middle position felt viscerally, morally wrong; it wasn’t fair. Nor did it feel fair towards my old friend — a disagreement always has two sides — with whom I also wish to reconcile. My action would probably have resulted in more harm than good. These reflections followed the gut response, and I decided not to send or mention anything to our mutual friend. This restraint on my initial impulses was a victory over my darker and more primitive self.
The brief yet important ethical process made me contemplate what had just happened. I realized that this intuitive moral signaling occurs quite often: when facing an ethical dilemma, my gut will often veto dubious choices I am consciously entertaining, before any rationalization.
This made me curious about moral decision-making and the role of intuition and emotion. How can my body know that an action is wrong before my mind has gone through a careful process of reflection? And to what degree can I trust my gut feeling?
Moral Intuitionism
According to psychologist Jonathan Haidt, who can be labeled as a “moral intuitionist”, these gut-level perceptions are direct causes of ethical judgment and not merely additional pieces of evidence used as input in a rational decision process. He defines moral intuition in the following way:
“The sudden appearance in consciousness of a moral judgment, including an affective valence (good-bad, like-dislike), without any conscious awareness of having gone through steps of searching, weighing evidence, or inferring a conclusion.”(1)
This definition aligns well with my own experiences described above. Let us read a brief and controversial example from an article by Haidt that further serves to demonstrate moral intuitionism at work:
“Julie and Mark are brother and sister. They are traveling together in France on summer vacation from college. One night they are staying alone in a cabin near the beach. They decide that it would be interesting and fun if they tried making love. At the very least it would be a new experience for them. Julie was already taking birth control pills, but Mark uses a condom, too, just to be safe. They both enjoy making love, but they decide not to do it again. They keep that night as a special secret, which makes them feel even closer to each other. What do you think about that? Was it OK for them to make love?”(2)
Most people will probably state, intuitively and rapidly, that what Julie and Mark did was morally unacceptable. The verdict will perhaps be accompanied by an emotion of disgust. But if asked to provide reasons for the ethical judgment, these same people might find it harder than one would expect. For instance, the siblings used double protection, greatly decreasing the chance of a baby, which would have had an increased risk of developing disorders. Neither of them was harmed emotionally, according to the story; on the contrary, the experience made them feel closer to each other. Yet although it might be challenging to articulate precisely why sibling sex is wrong, most people will nonetheless hold on to their initial emotion-based conclusion; they will probably still claim they know that it is wrong.(3) This suggests that such judgments are the result of unconscious processing.
This supports the intuitionist model proposed by Haidt, in which the gut reaction is the cause of the moral judgment. Additionally, Haidt states that intuitive, emotion-based ethical verdicts are at times followed by a verbal effort to provide reasons for the gut-level response, and that the person will seek to confirm the already established moral evaluation. There is evidence to support these claims.(4)

This is interesting. The proposition that “moral judgment is caused by quick moral intuitions and is followed (when needed) by slow, ex post facto moral reasoning”(5) strikes against the traditional rationalist view of morality, in which ethical judgments are arrived at through a careful step-by-step process using reason and evidence, and intuition and emotions are typically seen as noise.(6) In Haidt’s model, reason takes a back seat and becomes, to some extent, an instrument that serves emotions. Or as Hume so famously wrote, “Reason is, and ought only to be, the slave of the passions.”(7) Although I don’t think Haidt would agree with the “ought” part.
A long and thorough discussion of the intricacies of theories of ethical judgment is beyond the scope of this post. But it is worth noting that the intuitionist model is mainly descriptive, not normative; that is, it does not say anything about if or when it is better to use intuition than reason. Such a general prescription would also be difficult to settle on, given the complexity of morality, and would in all likelihood fail to do justice to the rich and multi-dimensional nature of moral judgments.
Nevertheless, I do believe the intuitionist model has a lot of credibility, given how well it fits with my own experiences and those of many others, not to mention the substantial body of scientific evidence that corroborates it.(8) In sum, many of our moral judgments seem to be products of automatic unconscious processes rather than careful rational examination.²
A Proposal
Given the scientific “facts” about intuitive ethical judgments and my own positive experiences with my inner moral philosopher, I propose that we as ethical agents should become more aware of our gut feelings and respect these bodily perceptions; their vote should be taken seriously. Why? Our biopsychological systems are complex machinery that evolved in social niches, in which making the wrong ethical choice could lead to exclusion and ultimately death.
Through the long process of evolution, our bodies have become highly attuned to socio-ethical information — for example, virtues — and have acquired abilities to navigate moral landscapes. Could we say that natural selection favors the ethically sensitive individual, that is, that it favors genes that play a role in the expression of morality?
Perhaps this moral “intelligence”³ is — to some degree — akin to other rapid emotional evaluations of situations. The perception of danger and threats is also a rapid and automatic process and for good evolutionary reasons. Having a system that can make snap, effortless decisions can hypothetically mean the difference between life and death. Analogously, taking action based on ethical intuition — could we say moral reflex — might mean the difference between social inclusion and exclusion.
However, perceptions of danger can be false positives. Is the same true of intuitively derived moral judgments? Certainly. Maybe my intuition tells me that euthanasia is wrong; when seeing what abortions look like later in pregnancy, my inner judge might scream, WRONG. These verdicts might hypothetically be “false positives” in the sense that they are misaligned with the needs and values of the moral space I am in, or perhaps of an ideal and objective moral space. This raises the question of what the “right” or “true” ethical values and judgments are. Do they even exist, and if so, how could we know them? Classic age-old ethical conundrums.
At any rate, such intuitively based moral conclusions demand closer scrutiny, since they are far more complex than my own personal ethical dilemma, which I was able to resolve in minutes.⁴ The gut feeling is not an isolated intelligence; it is mediated through biology, mind, and the cultural context within which it operates. It is therefore vulnerable to biases and can even induce a state of “cognitive dissonance”; for example, strong negative moral judgments about others may conflict with values I claim to hold, such as justice, equality, liberty, and fairness.
Even though our ethical intuitions can be powerful and useful, an appropriate dose of skepticism is always warranted. Especially if we are inclined to rationally confirm our initial moral conclusions, as Haidt believes.
Non-Emotional Moral Agent?
What is interesting to speculate about is how a person would navigate a socio-moral landscape without some sort of grounding in intuitive and emotional intelligence. What would moral reasoning look like? Well, speculate no more:
“Prior research reveals that the ventromedial prefrontal cortex (vmPFC) is a critical area underpinning affect and morality, and patients with vmPFC lesions show abnormalities in moral judgment and moral behavior.”(9)
The research shows that a disturbance in brain regions associated with emotions and morality increases the likelihood of inappropriate moral evaluations and actions. Connecting back to my dilemma, if I had had damage to my vmPFC, I might have sent those screenshots without any reservations. This suggests that gut-level morality, despite its shortcomings, is not only useful for social functioning but necessary.
Ending Thoughts
Being a social creature with the ability to think means moral challenges and struggles. Discerning between right and wrong has plagued, and will probably always plague, the human species. Building upon my own experiences, I have come to respect and rely on my gut intelligence when it comes to moral questions. It is not a blind trust, but I try to listen carefully to the voice within. It has so far served me well in my personal affairs by pulling the brakes on choices that probably would have caused more bad than good. Sometimes I do give in to certain temptations — I am human after all — and learn that my moral alarm system indeed had a wise understanding of the situation.
So … what is your relationship to your inner moral philosopher?
Footnotes
[1] I use ‘moral’ and ‘ethical’ synonymously for variation.
[2] An important note: the intuitionist theory does not make claims about objective ethical truths. Ascertaining whether a moral judgment is objectively true, whether it is derived from intuition/emotion or from reason, is a difficult problem, which also demands exploration of the concept of “truth”.
[3] ‘Intelligence’ is used loosely.
[4] That of course does not mean it lacked significance.
References
(1) Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108(4), 814–834. https://doi.org/10.1037/0033-295X.108.4.814
(2) Ibid.
(3) Ibid.
(4) Sapolsky, R. M. (2017). Behave: the biology of humans at our best and worst (pp. 481–488). Penguin Books.
(5) Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108(4), 814–834. https://doi.org/10.1037/0033-295X.108.4.814
(6) Ibid.
(7) Ibid.
(8) Sapolsky, R. M. (2017). Behave: the biology of humans at our best and worst (pp. 481–488). Penguin Books.
(9) Cameron, C. D., Reber, J., Spring, V., & Tranel, D. (2018). Damage to the ventromedial prefrontal cortex is associated with impairments in both spontaneous and deliberative moral judgments. Neuropsychologia, 111, 261–268. https://doi.org/10.1016/j.neuropsychologia.2018.01.038