Philosophy Ph.D. Dissertations


Subjective Moral Biases & Fallacies: Developing Scientifically & Practically Adequate Moral Analogues of Cognitive Heuristics & Biases


Degree Name

Doctor of Philosophy (Ph.D.)


Philosophy, Applied

First Advisor

Sara Worley (Advisor)

Second Advisor

Richard Anderson (Other)

Third Advisor

Theodore Bach (Committee Member)

Fourth Advisor

Michael Bradie (Committee Member)

Fifth Advisor

Michael Weber (Committee Member)


In this dissertation, I construct scientifically and practically adequate moral analogues of cognitive heuristics and biases. Cognitive heuristics are reasoning “shortcuts” that are efficient but flawed, and their flaws yield systematic judgment errors known as cognitive biases. For example, the availability heuristic infers an event’s probability from how easily similar events come to mind. Since dramatic events like airplane crashes are disproportionately easy to recall, this heuristic explains systematic overestimation of their probability (availability bias). The research program on cognitive heuristics and biases (e.g., Daniel Kahneman’s work) has been scientifically successful and has yielded useful error-prevention techniques known as cognitive debiasing. I apply this framework to moral reasoning to yield moral heuristics and biases. For instance, a moral bias of unjustified differences in the treatment of animal species might be explained by a moral heuristic that dubiously infers animals’ moral status from their aesthetic features.

While the basis for identifying judgments as cognitive errors is often unassailable (e.g., violation of the laws of logic), identifying moral errors seemingly requires appealing to moral truth, which, I argue, is problematic within science. Such appeals can be avoided by repackaging moral theories as mere “standards-of-interest” (i.e., non-normative metrics of purported right-making features/properties). However, standards-of-interest carry no authority, which is needed for effective debiasing. Nevertheless, since each person deems their own subjective morality authoritative, subjective morality (qua standard-of-interest, not moral subjectivism) satisfies both scientific and practical concerns. Accordingly, (idealized) subjective morality grounds a moral analogue of cognitive biases: subjective moral biases (e.g., committed non-racists unconsciously discriminating).

I also argue that “cognitive heuristic” is defined by its relation to rationality; consequently, heuristics explain biases, which are likewise so defined. However, such relations are causally irrelevant to cognition. This frustrates the heuristic’s presumed usefulness in causal explanation, wherein categories should be defined by causally efficacious properties. In the moral case, I therefore jettison this role and tailor categories solely to relational explanations: “moral heuristic” is replaced with subjective moral fallacy, which is defined by its relation to subjective morality and explains subjective moral biases. The resultant framework of subjective moral biases and fallacies can undergird future empirical research.