Why is it that so many otherwise informed, intelligent, rational people are uninformed and epistemically irrational when it comes to their knowledge and beliefs about human-nonhuman relations, veganism, and animal rights? Why aren’t otherwise informed people knowledgeable about the atrocities in animal agriculture and other areas of animal use? Why do animal rights advocates hear so many absurd and implausible objections to animal rights and veganism? Why are the best arguments against animal rights, put forth by professional philosophers, merely classic examples of confirmed prejudice and tortured logic? I believe a good explanation can be found in two ideas generated in the field of economics during the past 60 years: rational ignorance and rational irrationality. There are several sources of information on these two ideas, but the primary resource I’ve used for this essay is “Why People Are Irrational about Politics” by Michael Huemer, Associate Professor of Philosophy at the University of Colorado, Boulder. For those who are interested, another interesting paper on the topic is “Rational Ignorance versus Rational Irrationality” by Bryan Caplan, 2001. The purpose of this essay is to introduce these two ideas and apply them specifically to objections to veganism and animal rights. 
The Prevalence of Disagreement
Professor Huemer starts his paper by noting that political disagreement (and moral and religious disagreement) is widespread, strong, and persistent. That is, any two randomly chosen people are likely to disagree about many issues; they are likely to be very confident that they are right; and long discussion or rational argument is unlikely to bring them to agreement.
Why don’t we have such widespread, strong, and persistent disagreements in subjects like mathematics or science? While there are disagreements in these subjects, their frequency, intensity of conviction, and tenacity do not compare to those in politics, morality, and religion. After this brief introduction, Professor Huemer considers four theories of why there is so much strong and persistent disagreement in politics, and concludes that while there are several reasons for it, the most important factor is “rational irrationality”, where “rationality” in the first term refers to instrumental rationality (i.e. the “means-ends” or purely self-interested kind of rationality that economists refer to) and “rationality” in the second term refers to epistemic rationality (i.e. the disinterested kind of rationality that seeks only truth, regardless of the implications of that truth).
The economic theory of rational ignorance holds that people often rationally choose to remain ignorant of a topic because the perceived utility value of the knowledge is low or even negative. For an example of where the perceived utility value is low, consider what you will gain from going through the time and trouble of learning the specific voting records of all the politicians who represent you. You won’t gain much. The fact is that the next politician elected will be the person the other tens or hundreds of thousands of voters in your district vote for.
For an example where the perceived utility value is negative, consider what you would gain from knowing exactly what happened to the chickens who laid the eggs you purchased, or to the chickens who were slaughtered for your lunch today. If you have a conscience, that knowledge will likely ruin the meal for you, and may affect the way you see your eating habits in general. In the instrumental, purely self-interested sense of the word “rational”, it is irrational to want to know what happened to the sentient beings who were tortured and slaughtered for your next meal.
This explains why vegans, when we start to gently introduce the plight of ‘food’ animals to non-vegans, so often get a response along the lines of “Stop, I don’t want to know”. It’s not that we’re about to bore our non-vegan associate with the voting records of a dozen politicians (a perceived low utility value), it’s that the non-vegan is insisting on maintaining (instrumental, self-interested) rational ignorance in the face of highly disturbing information that bears heavily on certain decisions we make about three times a day (perceived negative utility value).
Similarly, the economic theory of rational irrationality holds that it is often rational, in a purely self-interested, economic sense, to adopt epistemically irrational beliefs because the cost of epistemically rational beliefs exceeds the benefits of adopting them. So if I accept epistemically irrational beliefs against animal rights – for example, that sentient nonhumans don’t feel pain or that their pain doesn’t matter as much as human pain “because they’re not human” or that we’ll be overrun by billions of cows, pigs, and chickens if we stop slaughtering them – I bear none of the cost of accepting such absurd beliefs.
Rational irrationality rests on two assumptions: 1) that individuals have, as Huemer puts it, “non-epistemic belief preferences (otherwise known as ‘biases’). That is, there are certain things people want to believe, for reasons independent of the truth of those propositions or how well they are supported by the evidence”; and 2) that individuals exercise some control over their beliefs. Quoting Huemer again: “Given the first assumption, there is a “cost” to thinking rationally—namely, that one may not get to believe the things one wants to believe. Given the second assumption (and given that individuals are usually instrumentally rational), most people will accept this cost only if they receive greater benefits from thinking rationally.” Since individuals don’t perceive any personal benefit from being epistemically rational about animal rights and veganism, we can predict that they will often choose to be epistemically irrational about these topics. (Huemer draws this conclusion regarding political issues generally.)
Huemer points out that some people will highly value epistemic rationality itself, and therefore will be epistemically rational about political issues (and in our case here, animal rights and veganism). But there’s no reason to think everyone (or even most people) will have this value preference.
Non-epistemic Belief Preferences (i.e. Biases)
So what are some of the specific sources of non-epistemic belief preferences (biases and prejudices)? Huemer suggests four, although he qualifies the suggestions by noting that a comprehensive answer would require extensive psychological study. I will significantly modify the details of Huemer’s suggestions to apply them to animal rights and veganism.
Biases from Self-Interest
Due in large part to persistent marketing by the food industry, the confused message of new welfarists, and the anti-animal rights countermovement, most people falsely perceive veganism as ‘difficult’ at best, and at worst hold a caricature of veganism as a diet of ‘rabbit food’ (with mental images of barely surviving on things like iceberg lettuce, cucumbers, and carrots). Regardless of how delightful vegan food really is, how much vegan junk food there is, and how many substitutes there now are for formerly favorite animal products, it is ultimately the perception of ‘difficulty’ that represents a ‘cost’ of going vegan. Of course, the greater the perception of ‘difficulty’, the greater the perceived ‘cost’. And the greater the perceived ‘cost’, the greater the likelihood of rational ignorance and rational irrationality.
Beliefs as Tools of Social Bonding
Most people want to go along with the beliefs of the people they like and associate with regularly. Although veganism is becoming increasingly common and widely accepted in most social groups, many people are afraid of the social consequences of becoming vegan. They may fear being challenged or even ridiculed over their decision. They may fear awkward social situations or the loss of friends. These fears of social consequences (justified or not) can be powerful motivations for rational ignorance and rational irrationality regarding veganism and animal rights.
Beliefs as Self-Image Constructors
People generally want to adopt beliefs that support the self-image they want to maintain and project. If animal rights and veganism don’t fit that preconceived self-image, for whatever reason, then rational ignorance and rational irrationality about animal rights and veganism are likely to follow.
Coherence Bias
People usually prefer to hold beliefs that fit well with their other beliefs. Someone who believes X as an evaluative proposition will likely be biased in favor of descriptive propositions, or other evaluative propositions, that support X. This tendency to prefer coherence can be either epistemically rational (unbiased) or irrational (biased). For example, one will prefer an epistemically rational (unbiased) coherence when one is genuinely and disinterestedly seeking epistemically sound beliefs. Conversely, one will often prefer an epistemically irrational (biased) coherence when one is seeking ways of ‘justifying’ a self-serving belief by adopting erroneous premises that fit a self-serving (but epistemically false) conclusion.
Coherence bias is, by far, the most interesting bias in the case of animal rights and veganism and deserves its own essay. Why? Because, arguably, the most wildly incoherent set of beliefs in our society is the set most people hold regarding sentient nonhuman beings. Further, people go to great lengths in rational ignorance and rational irrationality to cover up this incoherence born of bias.
Consider that so many of us love and coddle the family dog, or even a stranger’s dog (familiarity with the dog generally doesn’t matter) and then stick a fork in the equally sentient tortured chicken or drink the milk of the raped and slaughtered cow, who lost her calf to the veal industry. This is a classic example of an incoherence of evaluative beliefs that is wildly irrational epistemically. How do we cope with this epistemic incoherence that we’d normally scoff at? We cope with it via rational ignorance (“Stop, I don’t want to know what happens to the (‘food’) animals”) and rational irrationality (“They’re bred for food.” “What would happen to the millions of cows if we didn’t milk and slaughter them?” [and dozens of other epistemically irrational objections]).
Non-epistemic Belief Preferences Supported by Mechanisms of Belief Fixation
Huemer suggests that while perhaps we cannot believe obviously false propositions at will, we can still exercise substantial control over our political beliefs (and, in our case, beliefs resisting veganism and animal rights). He suggests a few mechanisms by which we exercise such control.
Biased Weighting of Evidence
If we attribute slightly more weight to pieces of evidence supporting our preferred beliefs and slightly less weight to pieces of evidence against our preferred beliefs, the cumulative effect of these small biases in weighting evidence can be substantial.
Selective Attention and ‘Rationalization’
We tend to pay more attention to our own beliefs and the ideas supporting them than to alternative beliefs. Also, as I discussed in the essay “Understanding the Anti-Animal Rights Viewpoint”, we tend to start from a non-epistemically preferred belief as a conclusion and work backwards to find ‘premises’, ‘reasons’, or ‘rationalizations’ for it. When we encounter evidence supporting our non-epistemically preferred conclusion, we tend to accept it at face value. When we encounter evidence against our preferred conclusion, we tend to scrutinize it for what is ‘wrong’ with it. Rational ignorance and rational irrationality are often the result.
We also tend to read and interact with sources we already agree with, and these sources provide a steady stream of ‘evidence’ supporting our non-epistemically preferred beliefs. Indeed, one of the common complaints among people genuinely looking for solutions to society’s problems is that most people are buried in information they already agree with. So there is plenty of dialogue, but the vast majority of it is clustered within specific causes, with very little productive dialogue with ‘outsiders’. And it’s not just those concerned with a specific cause who contribute to this isolated bubble effect; the ‘outsiders’ are generally indifferent and often involved in isolated bubbles of their own.
Intelligence and Belief Fixation
One might think a high degree of intelligence or education would protect a person from holding on to false beliefs, but this is not necessarily the case. As Huemer points out, the highly intelligent or highly educated person often uses her or his intelligence or education as tools to find more support for non-epistemically preferred beliefs. Where a less intelligent or educated person might give up and admit error, the highly intelligent or educated person has more drive and resources to prop up false beliefs.
The relationship of intelligence and bias to finding out the truth of a matter is as follows: 1) high intelligence and low bias yield the best prospects of obtaining truth; 2) low intelligence and low bias yield good prospects; 3) low intelligence and high bias yield poor prospects; and 4) high intelligence and high bias yield the worst prospects of all.
Irrationality Is a Big Problem
As Huemer concludes about irrationality regarding politics, so I conclude about irrationality regarding veganism and animal rights: irrationality is the greatest problem we face. It is the greatest problem because it prevents us from solving our other problems. It is analogous to an immune-deficiency disorder, in which the body’s methods of overcoming disease are themselves diseased. Rational ignorance and rational irrationality are widespread diseases of clear thinking and problem solving.
What Can We Do About It?
As with many problems and diseases, the first step to overcoming rational ignorance and rational irrationality is to recognize or admit that the problem exists, both in ourselves and in others. Once we diagnose the problem, we can look for likely causes. We can ask what ulterior motives we, or others, have for believing a certain claim. We can explore the beliefs underlying our preferred beliefs to see what instrumental (self-interested) and epistemic (disinterested) reasons we have for believing what we believe.
Are there any biases from self-interest? For example, do we refuse to think rationally about veganism and animal rights because of preconceptions of what it might be like for us to be vegan? Do we believe something to reaffirm our desired self-image or to fit in with a social group? For example, do we refuse to think rationally about veganism because of a lack of self-esteem or fear of rejection? What do we really have to fear personally or socially – anything? Do we believe underlying claims because they are true or because they cohere well with other claims we want to believe? For example, do we accept irrational beliefs about nonhuman beings and their interests in not being exploited, tortured, and killed because they cohere well with our continued consumption of them and their reproductive products?
We can also make an effort to develop good thinking habits. We should hear out and carefully consider both sides of an argument before accepting either side. We should become familiar with informal logic and common fallacies. When we feel inclined to assert a claim, we can ask what epistemic reasons we have for believing it, and also why we might want to believe it (independent of its truth). We should develop a higher degree of skepticism toward beliefs we suspect have ulterior motives behind them, regardless of whether those ulterior motives are our own or someone else’s. Our first assumption, especially where there is an ulterior motive such as profit or any other conflict of interest, should be that the information provided to us is false, misleading, or incomplete until we’ve subjected it to further scrutiny and verification. Such skepticism should be applied not merely to positive assertions, i.e. “X is true”, but also to negative assertions, i.e. “X is false” (in other words, proper skepticism is not just about avoiding erroneous acceptance, but equally about avoiding erroneous rejection).
Most of all, we should eliminate our ignorance about animal agriculture and be epistemically rational about it. We should face the facts with courage. Animal agriculture, regardless of what label it is marketed under (e.g. “free-range” or “humane certified”), is a deplorable business and we should know what we’re contributing to. Upon obtaining the facts about animal agriculture, we should beware of epistemically irrational attempts to ‘justify’ our participation in it. We should examine the issue impartially, with a particular effort to recognize our underlying motivations, if any, for accepting or rejecting certain descriptive or evaluative claims.
In the end, we should dispel our ignorance, cultivate epistemic rationality, and go vegan as a result.
For example: Carl Cohen’s “of-a-kind” argument is probably the strongest of an incredibly weak collection of arguments manufactured to ‘justify’ severe animal exploitation, but it amounts to nothing more than confirmed prejudice (“yes, I’m a speciesist”) and a question-begging fallacy: Cohen’s “of-a-kind” premise does not logically connect to his conclusion that species is morally relevant; it merely begs the question by assuming species is relevant as a dogmatic ‘given’. If Cohen insists that he is assuming not that species is relevant, but only that conceptual rationality is, then he must maintain that it is morally permissible to force painful and lethal experiments on mentally-challenged humans; logically, he cannot have it both ways. Further, and most importantly, Cohen never establishes why possessing conceptual-symbolic rationality (as opposed to mere sentience and perceptual intelligence) would be necessary to an interest in one’s own well-being in the first place.
In writing this essay, I have relied heavily on Professor Huemer’s paper, since it effectively applies rational ignorance and rational irrationality, as well as many of their causes, to political disagreement in general, disagreements about animal rights and veganism being subsets of general moral and political disagreement. That said, I have also substantially ignored, diverged from, and added to sections of Huemer’s paper, so this essay should not be taken as representative of it; if one is interested in Huemer’s paper, I encourage opening the link in the introduction to this essay and reading it.