
About

[Photos: steep cliffs of the Faroe Islands with the ocean below; a fork in an outdoor path; a fisherman's foot on thin, cracked lake ice.]

“It ain’t what you don’t know that gets you into trouble.  It’s what you know for sure that just ain’t so.”

 

-- Mark Twain

 

 

“Twain was half right.  Sometimes you also get into trouble for what you don’t know, when the reason you don’t know is that you only considered the evidence, arguments, and sources that suited you.”

 

-- Tim Sawyer (founder, Epistemic Crossroads)


“… because myside bias is an exception, an outlier, it is the bias where the cognitive elites most often think they are unbiased, when in fact they are just as biased as everyone else.”


-- Keith Stanovich (one of the world's foremost experts on epistemic rationality)


“The amount of evidence and its quality do not count for much, because poor evidence can make a very good story. For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs. Considering how little we know, the confidence we have in our beliefs is preposterous.”


-- Daniel Kahneman (Psychologist, and winner, 2002 Nobel Prize in Economics)


"Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”


-- Daniel Kahneman


“Maybe the world doesn’t make as much sense as we think it does. And maybe the people who realize this have the upper hand.”


-- Daniel Kahneman


"Of the 50-odd biases discovered by Kahneman, Tversky, and their successors, forty-nine are cute quirks, and one is destroying civilization."  


-- Scott Alexander


Natural selection takes time.  Your brain and your mind clearly did not evolve for the purpose of discovering objective truth from the massive avalanche of never-ending, constantly rotating, fragmented, one-sided, and incomplete political information bombarding you in today’s Information Age. 

 

So why do you think you are good at it?


-- A question worth pondering, as you begin exploring the world of epistemic rationality ...

 


  ________________________

 


The civilization-destroying bias to which Alexander refers in the above quote is confirmation bias, which, in the realm of political thinking, is more aptly referred to as myside bias.  Myside bias is defined below.  As Stanovich put it, this bias is "the primary psychological contributor to our society's failure to achieve belief convergence on numerous critical issues."  I explain why in my first book:


The SCIENCE of Epistemic Rationality ... 

and how the Error-Prone Ways in which the Highly-Intelligent, the Highly-Educated, and the Rest of Us form Political Beliefs are Putting Democracy at Risk


Much of what follows is also found in the book's introduction.  (By the way -- don't buy the book!  An updated version will be out in late July.)


__________________


The western world is becoming more and more polarized.  People have formed groups and taken sides, with each side entirely convinced that it is right, and that the other side is dumb, gullible, irrational, evil, or even all four.  Democracy itself, it seems, is now at risk.

     The significant majority of us, however – including both the highly intelligent and the less intelligent, the highly educated and those with only high school degrees, the political right and the political left – are not very good at forming objective and accurate political beliefs.  We feel like we are, because we feel we view the world from a position of “enlightened objectivity.”  But we don’t.

     What I have written on this web page and in my book is based on years of actively studying and synthesizing the books, articles, manuscripts, and key ideas of thought leaders in the Science of Epistemic Rationality – the science of how we form our beliefs, and whether our beliefs are true.  Based on this research, I believe that maybe – just maybe – an understanding of the highly biased and error-prone ways in which most of us form very strong political beliefs from very limited and very one-sided information can help us bridge our increasingly intense, and increasingly worrisome, political divide.

     


If your sole goal were objective truth when forming a new political belief (or a belief about a scientific, economic, or other issue that has become political):

 

  1. You would ask yourself -- am I attempting to build or bolster an argument, or am I trying to arrive at objective truth?  These are very different goals, requiring very different approaches.

  2. You would make a conscious effort to stay as open-minded and objective as possible.  You would resist the urge to simply defer to the first intuitive answer that pops into your mind.  You would attempt to separate yourself from your existing beliefs, convictions, worldview, and political ideology; from your favored party's political platform; and from the beliefs of those you associate with as you draw your conclusions.

  3. You would carefully gather evidence and arguments from the most credible sources on each side of the issue.  You would spend at least as much time considering evidence and arguments that conflict with your existing beliefs, deep convictions, worldview, and political ideology as you spend on evidence and arguments that reinforce them.

  4. You would assimilate and analyze the information gathered, including using “specialized” forms of thinking as indicated, such as probabilistic reasoning, scientific reasoning, and statistical reasoning.

  5. You would spend a significant amount of time reflecting, and just thinking.

  6. You would reach a conclusion you then treat as a working hypothesis, as opposed to a firmly established fact gripped in a tightly clenched fist. In other words, you would keep an open mind, and you would adjust or even change your belief as often as evidence and superior arguments lead you to do so.

 

     You would take the same approach when determining your overall worldview and overarching political ideology, reading, assimilating, and contemplating many diverse political and philosophical viewpoints before settling on your own perspectives. 

     You would again take a similar approach when choosing which politicians, media organizations, and other sources to trust for political information and opinions. You would thoroughly vet them by carefully comparing the information and opinions provided by sources that generally support your existing views with the information and opinions provided by sources from the other side of the political spectrum, and by carefully comparing and contrasting the issues and events each side chooses to cover in the first place. And you would constantly remind yourself that the sources you pay attention to – the sources that almost always tell you good things about the politicians and political party you support and bad things about the politicians and political party you don’t support – are not honest, objective, and accurate just because they reinforce your existing political leanings.

     Many of us never approach complex political issues and / or choose information sources as described above, and of those who do, most do it rarely.  This is true whether or not you are highly intelligent and whether or not you are highly educated.  It is true whether you reside on the political right or the political left. Yet you likely reach firm conclusions, and regularly feel quite confident in the beliefs you end up with.  

     So how do we actually form political beliefs?  If we don’t approach political issues as outlined above, how objective and how accurate are the conclusions we reach?  Why do we feel so confident about complex subjects for which we have invested little time and mental energy, about which we have memorized a few facts but which we don’t really and truly understand, for which we have never open-mindedly explored alternative viewpoints, and for which we have relied on information and opinion sources that tell us what makes sense to us and reinforce our existing convictions, but which we have never properly vetted for either accuracy or honesty?  And other than objective truth, what other goals do we have when we reason about politics?  


We generally use non-reflective, heuristics-based approaches that allow us to quickly draw conclusions in which we are highly confident, without going through any of the above-outlined steps.  But “confident,” as I will explain, does not mean “accurate.” 


     Judgment heuristics are mental shortcuts – such as deferring to our intuitions, using "common sense," listening to our "gut feelings,” and relying on “rules of thumb” -- that allow us to come to conclusions quickly and without putting in much work.  As I’ll explain, however, this type of reasoning is associated with bias, and with multiple other forms of “mis-thinking” that impede objective thinking and the discovery of objective truth. 

     Before I go on, in case you (like most people) initially find the concept of judgment heuristics confusing:  You are utilizing judgment heuristics when you come to conclusions without performing deep analysis, without searching for and analyzing the best evidence and arguments you can find on both sides of an issue, and without then spending time reflecting on the information you have gathered.  In other words, you have probably utilized one or more judgment heuristics if you have come to a conclusion regarding a complex political issue without utilizing a truth discovery process such as the six-point process provided earlier in this essay.  If you are typical – and yes, even if you are highly intelligent and highly educated -- you have probably utilized judgment heuristics for almost all of your political beliefs!

     When you accept the first conclusion that pops into your mind, when you defer to your gut feelings or insist that for a particular issue, deep analysis and open-minded reflection are unnecessary and common sense is all that is needed, you are utilizing judgment heuristics.  To simplify, you can think of utilizing judgment heuristics as simply deferring to your intuitions.  Your intuitions, though, are not as accurate as you think they are.

     In attempting to understand our usual approach to political belief formation, consider the following cyclically recurring, complex question:  Is a particular politician guilty of the scandal of which they are currently accused?  Forms of mis-thinking associated with heuristics-based belief formation include: 

 

1.  We substitute.  In approaching the above question, we in essence address simpler, much easier substitution questions, such as Do I think this politician is a good person?  Do I support and trust the political party to which this politician belongs?  How do I feel about those who are doing the accusing?  How worried am I about the prospect of the party I favor losing power, and the other side taking over?  What conclusions have my friends reached?  What have I heard most recently from my trusted political information and opinion sources in the media (the ones that almost always tell me good things about the party I favor and bad things about the party I disfavor)?  

     Various categories of heuristic questions are explained in Chapter 5 of the book.

 

2.  We answer the substitution questions – that is, we form our belief -- in a manner that is highly susceptible to cognitive biases.  Cognitive biases are essentially errors that we make when we use heuristic thinking to form our beliefs.  Psychologist and Nobel Prize in Economics winner Daniel Kahneman and his research partner Amos Tversky revolutionized the ways in which we think about human judgment by describing how we use judgment heuristics to reach our conclusions, and by describing many of the biases that occur when we think this way.

     The most significant of the cognitive biases when it comes to political thinking is myside bias.  The political beliefs we form are highly influenced by our existing convictions, those deep beliefs in which our emotions and even our sense of self-identity are involved, including our political party alignment.  Our existing beliefs, opinions, and attitudes all influence the new beliefs we form.  Myside bias is a form of motivated reasoning.  We want certain beliefs – those that line up nicely with what we already believe – to be true. And we want to believe them. Far more often than not, myside bias leads us to conclude that if the accused politician belongs to the party we support, they are innocent, and vice versa.  

     Myside bias explains why we so often think the coach of our child’s sports team should have given our own child more playing time.  It’s why we so often feel the referees were biased against the team we favor.  Myside bias explains why we generally see political leaders on our side as all good, while seeing the other side’s leaders as all bad.  It explains why we so often believe our own side’s conclusions about complex scientific issues we barely understand.  It explains why we generally support the military actions our political party supports, and why we generally oppose military interventions supported by the other side (unless, of course, our side supports the intervention as well).  Myside bias is why we generally believe that any new law or policy proposed by politicians on our side of the political aisle is likely to be successful, while believing that the policies favored by the other side are destined to fail.  It explains why we so often attribute favorable economic circumstances to our side’s leaders, and unfavorable economic circumstances to the other side. It helps explain why we so often form very strong beliefs about highly complex and controversial issues after consulting only the information sources that support our side.

     As I explain in the book, the highly intelligent and highly educated are at least as susceptible to myside bias as are those without elite-level intelligence and advanced degrees – and perhaps more so!  

     Another critically important cognitive bias is referred to by many different names, so I simply refer to it as the tendency to form new beliefs biased by the beliefs of our associates.   People of all levels of intelligence are highly influenced by this bias as well.


3.    Once we have formed our belief, we think backward, building an argument by gathering confirmatory evidence in support of the belief we have formed.  We think it’s the other way around – that we formed our belief in response to the evidence we have gathered – but it rarely is. We decide – that is, we form a belief – and we then gather evidence that supports the belief we have formed, to create a neat new belief-plus-supporting-evidence-and-other-reasons argument that makes sense to us, that is coherent with our existing beliefs and convictions.  


4.   We ignore conflicting evidence and arguments.  In creating our beliefs and belief arguments, we simply ignore evidence and arguments that would cause us to consider an alternative conclusion, and we write off sources that might provide it as unreliable and dishonest.  We now have the sense of satisfaction that we know, and we typically feel no need to question or explore further.

 

5.   We become quite confident – yes, overconfident – in the belief and the narrative we have formed.  As long as the story is coherent with the beliefs we already have, as long as the story we have created makes sense to us, we know we are right.  The level of confidence we have, however, is much greater than the level of confidence we should have, given the approach we have used to arrive at our conclusion.  

     Confidence, it turns out, comes from having created a coherent story that makes sense to us, with little or no conflicting information. It comes from having a set of beliefs that fit together well. However, while a high level of confidence in one’s beliefs is generally associated with the illusion of knowing, it has little to do with whether objective truth has been achieved.  A high level of confidence is often even a warning sign that one does not understand the complexity of one’s subject matter, and that one does not understand one’s own belief-forming approaches and limitations!

 

6.   We develop belief perseverance.  That is, we cling ferociously to the belief we have formed, refuse to entertain any evidence and arguments that might cause us to reconsider, and steadfastly refuse to even consider changing our minds.

 

     Our worldview and overarching political ideology, usually based heavily on the views of those most influential in our lives during our late teens and early twenties, are generally formed similarly (though genetics appears to play a role as well). 

     And how do we determine the reliability of the sources we turn to for political information and opinions? In November 2023, at the annual meeting of the Society for Judgment and Decision Making in San Francisco, I made the argument that once again, almost all of us utilize the non-reflective, heuristics-based approaches outlined above.  If a politician or other political information source provides information that meshes well with our existing beliefs, convictions, worldview, and ideology, and with the beliefs of those we associate with; and if the source is one our associates also trust; we tend to consider the source competent, objective, and honest. We even assume its goals are aligned with our own. Once again, we generally write off sources favorable to the other side as incompetent and untruthful, and we often assign malevolent motives.

 

     It’s relatively easy to see the above forms of mis-thinking in those we disagree with.  It’s extremely difficult to see them in ourselves.

 

     Based on the Science of Epistemic Rationality – once again, the science of how we form our beliefs, and whether our beliefs are true -- the confidence we have in many of our beliefs, and in our personal political ideology (again, generally formed as young adults, and often unchanged and unquestioned since) is unwarranted. Yes, the highly intelligent and highly educated do have some advantages when it comes to getting it right.  However, as I explain in my book, they also have at least nine key disadvantages. 

     Intelligence and epistemic rationality are very different concepts.

 

 

The 6-step process for epistemically rational thinking provided at the top of this web page demonstrates the three key mental components of epistemically rational thinking according to the model outlined by Stanovich, West, and Toplak in their 2016 academic volume The Rationality Quotient.  (Stanovich and his colleagues have argued compellingly that intelligence and epistemic rationality are such different concepts that they should be measured separately.)  

 

     These components include:

 

     1.  Thinking dispositions, or cognitive styles, associated with the reflective mind.

     2.  Algorithmic-level processing. 

     3.  Mindware.

 

1.  Thinking Dispositions, or Cognitive Styles, Associated with the Reflective Mind:

 

     These thinking dispositions, also referred to as cognitive styles, are those that, in the words of Stanovich, foster “thorough and prudent thought, unbiased thought, and unbiased knowledge acquisition.” Thinking dispositions are essentially “ways of thinking.”  Thinking dispositions associated with epistemically rational reasoning include:


  • The tendency to base new beliefs on evidence

  • The tendency to weight new evidence against a favored belief heavily

  • The tendency to seek various points of view before coming to a conclusion

  • The tendency to calibrate the degree of strength of one’s opinion to the degree of evidence available

  • The tendency to seek nuance and avoid absolutism

  • The willingness to change one’s mind in the face of new evidence

  • The willingness to consider alternative opinions and evidence

  • The tendency to think extensively about a problem before responding

  • The tendency to spend a great deal of time on a problem before giving up

 

2.  Algorithmic-level Processing:


     Examples of algorithmic-level processing include:


  • Reading and comprehending

  • Assimilating and applying information

  • Analyzing

 

     These are the types of thinking that are measured by IQ tests.  It follows that people who are good at these types of thinking have high IQs.  Keep in mind, though, that this is only one of the three key components of epistemically rational thinking.  Intelligence, by itself, is not enough when objective truth is the goal.

 

3.  Mindware:

 

     Still, what if you have the ideal thinking dispositions, you have great reading, comprehending, assimilating, and analytical skills and are willing to put in some work, but you don't have the right specialized thinking skills for the political issue in question? For example, what if you are highly intelligent but have no background in scientific thinking or the scientific method, and you are attempting to tackle a science-related question that has become political (or to tackle a non-scientific question using rigorous, scientific-style thinking)?

     When a reasoner has the ideal thinking dispositions and high-level algorithmic processing ability, missing or insufficient mindware, or specialized “process knowledge,” is still often a barrier to reaching objective truth.  Scientific reasoning isn’t the only type of mindware that may be required. A few examples of key mindware include:

 

  • Probabilistic thinking (such as Bayesian reasoning; a brief sketch follows this list)

  • Scientific thinking, including understanding of the scientific method

  • General statistical reasoning and understanding of general statistical methodology
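
To make the first of these mindware categories concrete, here is a minimal sketch in Python of a single Bayesian update. It is my illustration, not material from the book, and every probability in it is a hypothetical number chosen purely for demonstration. The point: evidence from a source that would have reported it regardless of the truth should move your belief only slightly.

```python
# A minimal sketch of Bayesian updating (illustrative only; all
# probabilities below are hypothetical numbers, not from the book).

def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Return P(claim | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Start undecided about a political claim (prior = 0.5). A one-sided
# source reports supporting evidence -- but such a source reports
# "evidence" almost as readily when the claim is false as when it is
# true, so the rational update is small.
posterior = bayes_update(prior=0.5, p_evidence_if_true=0.9,
                         p_evidence_if_false=0.7)
print(f"after a one-sided source:  {posterior:.2f}")   # roughly 0.56

# A source that rarely reports such evidence unless the claim is true
# is far more diagnostic, and rationally moves the belief much more.
posterior = bayes_update(prior=0.5, p_evidence_if_true=0.9,
                         p_evidence_if_false=0.1)
print(f"after a diagnostic source: {posterior:.2f}")   # 0.90
```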


When it comes to forming maximally accurate and objective political beliefs, however, ideal thinking dispositions, high-level algorithmic processing, and sufficient mindware still aren't enough! 


     Stanovich and others have argued compellingly that the above components of epistemically rational thinking still leave us susceptible to myside bias.  And people of all levels of intelligence, all levels of education, and all political ideologies appear to be equally susceptible.

     The 6-step process for epistemically rational thinking found toward the top of this web page also includes two mechanisms for minimizing myside bias:

  • resisting the urge to simply defer to the first intuitive answer that pops into your mind. 

  • attempting to separate yourself from your existing beliefs, convictions, worldview, and political ideology; from your favored party's political platform, and from the beliefs of those you associate with, as you draw your conclusions.


     Still, myside bias permeates all aspects of our thinking.  It is extremely difficult to control.  For most people, as I argue below and in my book, it's probably impossible.


Forming beliefs and creating arguments regarding scientific issues is very different from true scientific reasoning.

 

     Science essentially involves making observations, collecting and analyzing data, and running experiments using established approaches that minimize the influence of observer bias, random chance, and confounding variables (a confounding variable is a third variable that influences two others, making it appear that one causes the other when no such causal relationship actually exists).  Science as a process involves the dispassionate and open-minded search for objective truth.  It involves impartially following the data, wherever it leads.  
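
To make that parenthetical definition concrete, here is a minimal simulated example in Python. It is my illustration, with made-up numbers, not an example from the book: a third variable, summer heat, drives both ice cream sales and drownings, producing a spurious correlation between the two. A naive regression finds a strong apparent effect of ice cream on drownings; a regression that controls for the confounder makes the effect largely vanish.

```python
# A minimal confounding-variable sketch (illustrative only; the
# scenario and all numbers are hypothetical).

import numpy as np

rng = np.random.default_rng(0)
n = 1000

heat = rng.normal(size=n)                      # confounder: daily temperature
ice_cream = 2.0 * heat + rng.normal(size=n)    # driven by heat
drownings = 1.5 * heat + rng.normal(size=n)    # also driven by heat, NOT by ice cream

# Naive regression: drownings ~ ice_cream (confounder ignored)
X_naive = np.column_stack([np.ones(n), ice_cream])
slope_naive = np.linalg.lstsq(X_naive, drownings, rcond=None)[0][1]

# Adjusted regression: drownings ~ ice_cream + heat (confounder controlled)
X_adj = np.column_stack([np.ones(n), ice_cream, heat])
slope_adj = np.linalg.lstsq(X_adj, drownings, rcond=None)[0][1]

print(f"apparent effect of ice cream, ignoring heat: {slope_naive:.2f}")  # roughly 0.6
print(f"estimated effect after controlling for heat: {slope_adj:.2f}")    # near zero
```

This is what the regression bullet in the list of scientific-thinking examples below is doing: statistically adjusting for variables that would otherwise masquerade as causes.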

     Many people who believe they are thinking scientifically actually are not.  Many who insist that they “believe in science” actually have little understanding of the scientific method.  It is very common for people to utilize judgment heuristics when drawing a conclusion about a scientific issue, while believing that because the issue concerns science, they are thinking scientifically.

 

Here are some examples of scientific thinking:

 

  • Designing a double-blinded randomized controlled clinical trial, to help minimize the impact of bias

  • Collecting data, and then performing a regression analysis, to minimize the impact of confounding variables

  • Open-mindedly reviewing scientific data and reading scientific arguments in an attempt to disprove a favored hypothesis

  • Open-mindedly reading and reflecting upon the totality of peer-reviewed studies on a particular issue, including studies that come to conflicting conclusions

  • Maintaining a healthy level of skepticism regarding a scientific conclusion reached by one's favored political party

 

Here are some examples of non-scientific, non-epistemically rational thinking about a scientific issue:


  • Memorizing science-related data and other material for a high school science test

  • Adopting the science-related belief that one's political party, one's associates, and / or one's favored media sources have adopted, without additional analysis, without reflection, and without skepticism

  • Simply adopting the conclusion of the scientific consensus, especially regarding conviction-level beliefs for which both you and the scientists care about the outcome and have your sense of self-identity involved.  Sure, this will often (but not always!) get you to the right answer -- but it is not an example of scientific reasoning.

  • Reading only the scientific papers that support one's existing beliefs

  • Building an argument in support of a favored hypothesis

  • Building an argument to support a conclusion you formed via the utilization of judgment heuristics (highly influenced by myside bias and the beliefs of those you associate with), by collecting large amounts of supporting scientific data and organizing it into a highly compelling cause-and-effect narrative  


      

A little bit more about arguments:

     I have already explained the concept of backward thinking, whereby a belief is formed first, and evidence is then gathered to support it.  Highly intelligent people are able to assimilate and organize large amounts of scientific (and other) data, and are able to create particularly compelling belief-and-supporting-evidence arguments this way – so compelling that both they and their associates are fooled.  Above, I mentioned that the highly intelligent have nine disadvantages when it comes to forming accurate political beliefs.  This is one of them. The more confirmatory data one gathers, and the more compelling the argument one is able to create, the easier it is to fool oneself!

     To reiterate, memorizing and organizing scientific facts is not scientific thinking.  And once again, building compelling arguments and pursuing objective truth are very different goals that require very different approaches. Those creating highly compelling arguments are often pursuing instrumental goals (see below), not epistemic ones.


Using nothing but factual information, one can make any argument, and tell any story, one wants.

 

     In addressing heuristics, cognitive biases, the creation of arguments via backward thinking, and the ignoring of contradictory evidence, I have thus far focused on subconscious processing, whereby people believe they are pursuing maximally objective thinking and the discovery of objective truth (but actually are not).

     It should now be apparent that via the deliberate use of carefully selected facts, quotes, and evidence, one can create any argument, and tell any story, one wants.

     Think about it.  If someone who does not particularly care for you were able to locate and interview many of the people who think the least of you, were willing to take things you have said and done out of context, and were willing to focus exclusively on incriminating evidence while ignoring the exculpatory – and then carefully wove the information into a “you suck” narrative – could they convince people that you are a jerk?  Or that you are not very bright?  Or both?

     For almost all of us, the answer is yes.  Historical events can be massaged, and cause-and-effect arguments can be created, the same way.


Beliefs are built on beliefs are built on beliefs.  

 

     The political arguments (narratives) we form, each generally distilling a highly complex issue down to a belief plus a few pieces of supporting evidence and other reasons we identified after the belief was formed, usually include little or no conflicting information.  Despite this, our beliefs and our narratives seem very real to us. They fit together nicely with our other political beliefs, with the beliefs of those we associate with, with what we are told by the media sources we choose to pay attention to, and with the large number of diverse and often unrelated social, political, economic, and moral positions that make up the platform of our favored political party.  Our beliefs on a wide variety of political issues line up neatly with those of our party leaders. We begin to favor policies based on who is advocating them, and not the other way around.  

     Together, our collection of mutually reinforcing political narratives becomes our way of making sense of highly complex issues and a highly, highly complex world -- a world that is far too complicated for any of us to truly comprehend. We become extremely confident in the set of narratives we form. Together, they become our truth, with very little subtlety or gray. It becomes impossible to see the world from any other perspective. We generally don't try to.  One perspective -- our perspective -- is enough.  Our day-to-day political beliefs, our deep convictions on major issues (such as equity, equality, systemic racism, abortion, climate change, and gun control), our worldview, and our overarching political ideology -- often unquestioned and unaltered since we formed them -- seem so obvious to us that it feels like we carefully reasoned our way to them, methodically and open-mindedly considering competing evidence, viewpoints, worldviews, and ideologies along the way. Few of us, of course, have actually done this.

     Meanwhile, while political philosophy should serve as the underpinning for our political beliefs, few of us have read and compared diverse and competing works from respected conservative versus progressive political philosophers.  In fact, few of us read any political philosophy at all.


When we utilize heuristic thinking to form our beliefs about political issues, objective thinking and the discovery of objective truth often are not our primary motivations.  We have subconscious instrumental goals that compete with and often supersede the pursuit of truth, such as:

 

  • Speed, with minimal expenditure of mental energy.  We don’t want to spend a lot of time thinking and analyzing.  And we don’t want to expend a lot of mental energy.  Thinking – just sitting there, and really thinking -- is hard.  Most of us find it unpleasant.  Most of us do it rarely.

  • A sense of confidence, with minimal doubt. We crave the firm sense of confidence that comes with having created a story that makes sense to us and that fits with our existing beliefs and convictions, with little or no conflicting information.  Once again, however, a sense of confidence has little to do with whether truth has been achieved.

  • Social goals, such as the desire to be seen favorably by others, group acceptance, group communication, group cooperation, and group cohesion.

  • The desire to be moral, and to feel good about ourselves and our groups.

  • The desire for a sense of purpose, to have meaning in our lives.

  • Material goals, ranging from food, housing, physical safety, and access to healthcare; to financial security, college educations for our children, exotic vacations in faraway places, fancy watches, diamonds, and private jets.

 

     Heuristic thinking often does lead to the epistemic goal of objective truth.  However, it also often leads to beliefs that are just "good enough," and it sometimes leads to beliefs that are outright false! While the use of judgment heuristics is not optimal for truth discovery, it is quite good at helping us achieve other subconscious goals we have when we form beliefs about political issues, such as the six goals outlined above.  So no, most of our belief forming processes regarding political issues are not optimized for, and are not designed for, the attainment of objective truth.  


Based on the ways in which our brains and our minds evolved, and based on the purposes for which they did so, truly open-minded, objective thinking may not even be possible.

 

     As human beings evolved, they acquired thinking capabilities, including the abilities to reason and to learn, that differentiate them from all other animals.  But why did our abilities to reason and learn develop?  What is human reasoning, and what is human learning, for?  What, it follows, are our reasoning and our learning functions good at?

     Many or all of the above instrumental goals, clearly of great importance to humans in the present day, were likely also, in the same or similar form, of great importance to the survival and reproduction of our ancestors during the tens of thousands of years during which they evolved. It follows that the human brain and the human mind, and the biological cognitive mechanisms (defined below) they comprise, developed for the purpose of helping our ancestors achieve these goals. It is reasonable to hypothesize that our brains are designed to “mis-think,” epistemically speaking, because in the environments in which our ancestors lived, thinking this way conferred adaptive advantages related to the achievement of instrumental goals. Unfortunately, brain development that conferred advantages on our ancestors then may not be advantageous for the discovery of truth now, and may even serve as an impediment.

     Evolution occurs in significant part via the process of natural selection.  Spontaneous, random genetic mutations (changes in DNA) occasionally result in new physical or biochemical characteristics that confer on the host animal an improved ability to survive and/or reproduce. Descendants with the new adaptation thrive, outcompete, and multiply.  Those without the adaptation die off, and the species evolves. For humans, the ability to walk upright and the presence of opposable thumbs are characteristics that evolved over time and that have conferred survival advantages.
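
For readers who prefer to see a process run rather than read a description of it, here is a minimal simulation sketch of the selection dynamic just described. It is my illustration, not material from the book; the population size, the 5% fitness advantage, and the generation count are all hypothetical numbers.

```python
# A minimal sketch of natural selection (illustrative only; all numbers
# are hypothetical). A variant conferring even a small reproductive
# advantage tends to spread through a population over many generations.

import random

random.seed(1)

POP_SIZE = 1000
ADVANTAGE = 1.05   # carriers leave 5% more offspring on average

# Start with a small minority (5%) carrying the advantageous variant.
population = [True] * 50 + [False] * (POP_SIZE - 50)

for generation in range(400):
    # Each individual's chance of parenting a member of the next
    # generation is proportional to its fitness; population size is
    # held constant.
    weights = [ADVANTAGE if carrier else 1.0 for carrier in population]
    population = random.choices(population, weights=weights, k=POP_SIZE)

# With these settings, the variant's share typically climbs toward 100%.
share = sum(population) / POP_SIZE
print(f"share carrying the variant after 400 generations: {share:.0%}")
```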

     Based on principles fundamental to the discipline of evolutionary psychology, a field still in its relative infancy, the brain evolved similarly, developing a significant number of specialized information-processing cognitive mechanisms, or ways of thinking, designed to address specific adaptive problems our ancestors faced. These ways of thinking, it follows, conferred survival and reproductive advantages.

     Related to political thinking, humans are of course not born with specific political thoughts.  However, our political thinking is likely deeply influenced by our evolved mental mechanisms, by the ways of reasoning and learning for which our brains and our minds were designed, for the purpose of meeting various instrumental goals of our ancestors.

     In June 2023, at the annual meeting of the Human Behavior and Evolution Society, I presented the theories that myside bias (by facilitating in-group flourishing, empathy toward one's in-group, and antipathy toward one's out-groups) and the tendency to form new beliefs biased by the beliefs of one's associates (by facilitating learning from others, the only way our ancestors could learn other than by personal experience) are natural selection-derived cognitive mechanisms.  The theories are summarized in this poster, and I elaborate on them in much more detail in this essay (written to be understood by those who are not avid students of epistemic rationality science or evolutionary psychology).  

     If these theories are correct, it may actually be objective, open-minded thinking that should be considered aberrant thinking, thinking for which the brain was not designed – and therefore, thinking that may be impossible for many or most people.

     While these are admittedly only theories, and while some respected academicians disagree with them, what is clear is that during the tens of thousands of years during which the human brain and the human mind evolved, the hand of natural selection did not design either reasoning or learning for pursuits which would have had minimal value to our ancestors, such as:


  • Reaching objective conclusions about one’s in-group versus one’s out-groups

  • Learning via the assimilation and open-minded analysis of large amounts of information from disparate sources

  • Finding objective truth from short, carefully selected snippets of political information on a wide range of constantly rotating complex political issues most people know little about, and that have little to do with their daily lives

  • Developing a highly objective, open-minded, extensively-reasoned, and comprehensive worldview and / or political ideology

  • Identifying accurate and objective information sources concerning complex political issues that have little bearing on one’s day-to-day survival


     When we reason, think abstractly, analyze, learn, communicate, interrelate, and remember, we are doing so with brains that were designed, by the forces of evolution, to address adaptive problems that our ancestors faced tens and hundreds of thousands of years ago.  We are not doing so with brains designed for objective political thinking in the information-abundant twenty-first-century world.

     Human thinking, including both reasoning and learning, developed for a reason -- or probably, for a multitude of reasons – but probably not for maximizing objectivity and accuracy as we determine our political beliefs.  Despite the confidence we have in these beliefs (a strong sense of confidence probably does confer a survival advantage, as I explain in my book, and we are generally great at becoming quite confident in the beliefs we form), there is no reason to think we are naturally good at forming them objectively and accurately.  At least, not without a lot of work.

 

     I’ll close this section with a question:   If you think you are capable of seeing the world from an objective point of view … why would you be?


We are not nearly as good at recognizing and choosing honest, objective, and accurate political information sources as we think we are, either.

 

     Most of us feel quite sure the politicians, media, and other information and opinion sources we pay attention to are objective and accurate, and that it is the people on the other side of the political divide who are tuning in to sources that are biased and dishonest. Most of us are quite sure we would know it if our sources were misleading us.  However, just as we use heuristic thinking to determine our beliefs about complex political issues, we generally utilize heuristic thinking when we determine our beliefs regarding which information sources are both honest and accurate. Cognitive biases such as myside bias and the tendency to form beliefs biased by the beliefs of our associates exert significant influence.   If a politician or political information source reinforces our political beliefs, convictions, worldview, and ideology; and if the source is trusted by those we associate with; we tend to consider the source as competent, reliable, and truthful; and even assume its goals are aligned with our own.  Meanwhile, we simply write off sources favorable to the other side as incompetent and untruthful, and we often assign malevolent motives.

     I cover this issue in more detail in Chapters 15-17 of my book, but for now, the bottom line is that we trust the sources we view as being on our side.  And we trust the sources that provide information that makes sense to us, that allow us to create nice, neat belief-plus-supporting-evidence-and-other-reasons narratives that are coherent with what we already believe.


Almost everyone feels that they, and those who think what they think, are the exception.

 

     Succinctly:

 

  • What I have written above applies to nearly everyone.  

  • When we observe those who disagree with us, it is relatively easy to see that the above claims are true. 

  • When we observe those who agree with us, it is much more difficult to see.

  • It is extremely difficult to see it in ourselves.

 

     If you are able to open your mind to the possibility that you do not utilize optimal processes when you form even many of your most cherished political beliefs and when you choose your information and opinion sources, that you have subconscious goals other than objective truth when you reason about politics, and that objective thinking may not even be possible for you; it may be possible to improve the accuracy of the beliefs you form.  Unfortunately, most are simply unable to open their minds to these possibilities, and most are simply unable to actively consider evidence and arguments in a truly open-minded fashion.  For most people, based on the ways in which they form political beliefs and on the purposes for which they form them, the psychological barriers are just too high. 

     Those who can do this, however, have a significant advantage when truth discovery is the dominant goal.    


     Open-minded, objective thinking about complex political issues is highly challenging for most of us.  For many of us, as I explain above and in the book, it may not even be possible. 

     However, it's necessary.

     Regardless of your level of intelligence, regardless of your level of education, and regardless of your political orientation, consider attempting to erase your brain, at least with regard to your political beliefs, deep political convictions, worldview, and political ideology.  Implement the principles of epistemically rational thinking explained above.  And start over. 

     You just might end up in a different place.


________________________


We are much more susceptible to disinformation than we think we are.  

 

     By now, we are all familiar with the concept of “fake news,” and much has been written about the dissemination of both disinformation and propaganda.  Similarly, much has been written about the subtle methods governments and other organizations use to “nudge” you toward certain behaviors or beliefs. Most of us assume that only other people – generally, those on the other side of the political aisle, and those who are less intelligent – are susceptible to misinformation, disinformation, propaganda, and nudges. Most of us assume our own political beliefs and behaviors are rooted in logic, careful reasoning, objectivity, and truth.  But based on the way we form our beliefs about politics and based on the way we form our beliefs about the reliability of our information sources, the Science of Epistemic Rationality does not back up these assumptions. 

 

 

Yes, the highly intelligent and the highly educated are susceptible, too.  

 

     Once again, it is not just the simple-minded and minimally educated who are susceptible to error-prone thinking, to error-prone identification of information and opinion sources, and to disinformation.  It is not just the simple-minded and minimally educated who subconsciously prioritize goals other than objective truth when they form their political beliefs.

     As I explain in the book, there is a tremendous difference between intelligence, as measured by IQ (or GMA, general mental ability), and epistemic rationality, our ability to think in ways that maximize both objectivity and our odds of reaching objective truth.  Many people (in fact, most people) who are highly intelligent and/or highly educated do not think in highly epistemically rational ways when they form their political beliefs.  Meanwhile, some who do not possess elite-level intelligence and who do not have advanced degrees approach political issues in ways that are much more objective and accurate.  Very bright people do have some advantages when it comes to getting it right.  However, as I explain above and in Chapters 10 and 11 of the book, they also have at least nine meaningful disadvantages.

 


Democracy itself is now at stake.  

 

     In a widely circulated 2022 article titled “Why the Past 10 Years of American Life Have Been Uniquely Stupid,” social psychologist and New York University Professor Jonathan Haidt draws from the Bible in explaining the current political state of America (though the article is relevant to many other countries as well). In the Book of Genesis, God took offense to the decision of the descendants of Noah to build a great tower to make a name for themselves, and to punish them He deliberately confused their language. Haidt explains it this way:

 

“The story of Babel is the best metaphor I have found for what happened to America in the 2010s, and for the fractured country we now inhabit.  Something went terribly wrong, very suddenly.  We are disoriented, unable to speak the same language or recognize the same truth. We are cut off from one another, and from the past.

 

It’s been clear for quite a while now that red America and blue America are like two different countries claiming the same territory, with two different versions of the Constitution, economics, and American history.  But Babel is not a story about tribalism.  It’s a story about the fragmentation of everything.”

 

     We have entered a dangerous era.  Armed with a rapidly improving understanding of the ways in which people think and form beliefs, and with increasingly sophisticated information technology (to include artificial intelligence), those propagating disinformation in the pursuit of power are becoming more and more effective. America and many other countries are becoming more and more polarized.  And yes, democracy itself is now at stake.

 

________________________

 

Epistemic Crossroads' Purpose

 

     In 2023, I created Epistemic Crossroads™ (and in very early 2024, I will complete my first book) for the purpose of teaching people about the Science of Epistemic Rationality – in other words, for the purpose of teaching people how to believe.  Not what to believe, but how.  Yes, it’s a strange and unconventional mission. However, we live in strange times -- increasingly dangerous times, actually -- so this goal is a critically important one.  In fact, in this era of misinformation, disinformation, technologically enhanced propaganda distribution and amplification, and hyperpolarization, I believe how to believe is one of the most important things you could learn.

     In creating Epistemic Crossroads™, I have three main missions, all related to the Science of Epistemic Rationality and all covered in detail in the book.

 

First, I explain the science behind the highly error-prone ways in which most people, including the highly intelligent and highly educated, form political beliefs. In doing so, I synthesize and summarize the key findings and conclusions of thought leaders Daniel Kahneman, Amos Tversky, Keith Stanovich, Philip Tetlock, Jonathan Haidt, Steven Pinker, Hugo Mercier, Dan Sperber, Jonathan Baron, Dan Kahan, and many others.  

 

Second, I explain more objective, more accurate, more epistemically rational ways to form political beliefs. Emphasizing the work of Stanovich and his colleagues, Baron, and other leaders in the field, I explain the reasoning methods and thinking styles that have the potential to lead much closer to objective truth, but which most of us seldom use.

 

     In approaching these first two missions, I attempt to provide people of all political persuasions a belief-forming framework that will give them a common understanding of where their political beliefs come from and how to form them better -- and that just might allow them to communicate with each other more effectively.


Third, I emphasize the implications for society, and the risks to democracy itself, if we continue forming political beliefs the way we do, as opposed to utilizing the epistemically rational reasoning approaches that maximize objective thinking and the odds of arriving at objective truth.  Misinformation, disinformation, fake news, and propaganda are not going away; too many people benefit from disseminating them, and they are becoming better at it.

 


So yes, I think the United States and the rest of western society are in trouble. But why not just teach people what to believe, instead of how?  

 

     While providing information and opinions for the purpose of helping people determine what to believe regarding political issues (and other issues, such as scientific issues, that have been politicized) is a critically important function, and while some news and opinion organizations do it competently, honestly, and accurately, I believe it is also critically important to teach people how to believe.  People will always form many of their political beliefs under the influence of their friends, their family, their teachers, their co-workers, and agents who are actively trying to change their opinions, via both traditional media and social media.  And of course, they will also continue to form beliefs about which information and opinion sources are both objective and honest in the first place.  Those who understand the Science of Epistemic Rationality – including both their own belief-forming limitations as well as approaches that are more likely to lead to objective truth – are more likely to identify and choose accurate and honest information sources, and they are more likely to form accurate beliefs from the available evidence.  In addition, maybe – just maybe -- a mutually understood belief-forming framework will begin to allow people with very different beliefs, who trust very different information sources, to begin to relate and communicate.

     Perhaps the way to keep society from breaking apart is to teach people the processes by which political beliefs are formed in the first place, and how incredibly susceptible those processes are to error. Perhaps instead of attempting to argue our opinions, we should instead first try to teach our political adversaries about the Science of Epistemic Rationality, or at least point them to EpistemicCrossroads.com. 

 


We have now reached a crossroads.  

 

     We can learn how to form beliefs and to choose reliable information sources more accurately.  Or we can live in a manufactured reality, in which we think how others want us to think, think about what others want us to think about, and believe what others want us to believe.

     An understanding of the Science of Epistemic Rationality has become critical.  Forming beliefs heuristically, and then confidently proclaiming them as accurate (while having little or no understanding of rationality science), must become socially unacceptable.  The uncritical acceptance of various news and information sources as objective, accurate, and honest, simply because they give us information that we are comfortable with and that meshes with our existing convictions and political leanings, must become unacceptable as well.  

     If democracy is going to survive, we are going to have to advance in the way we believe.  Epistemically rational thinking must become commonplace. The teaching of the Science of Epistemic Rationality to schoolchildren must become ubiquitous.  An understanding of rationality science must become expected, and because of the belief-forming limitations we all have, we must all become much humbler in the conclusions we reach.  When we form our beliefs heuristically, we must become less certain!

    The Science of Epistemic Rationality – again, the science of where our political beliefs come from, and whether our beliefs are true -- must no longer be confined to the world of academia.  It must be taught to all, and it must be understood by all.  People must have a common framework for understanding how political beliefs are generally formed, for understanding their limitations, and for understanding how to believe better.

 

     So this is Epistemic Crossroads’ purpose, and this is the purpose of my first book.  

 

     Success is far from guaranteed.  But I think it’s important to try.  Because, once again, democracy itself is now at stake.


________________________


I considered the three photos at the top of this web page for the cover of my first book, and considered three names for the organization and website: Epistemic Precipice, Epistemic Thin Ice, and the one I ultimately chose, Epistemic Crossroads.

 

     Those seeking to maintain and grow power are well aware of the epistemic errors virtually everyone makes when they form political beliefs.  In addition, just a few powerful companies have now gained the technological ability to reach tremendous swaths of the population with carefully designed political messaging, and to influence our reality. So yes, we are at an Epistemic Crossroads.  We can turn in one direction, toward a post-truth society in which we live the reality others may decide they want us to live.  Or we can learn to recognize our epistemic limitations, become better believers, and take the other road. 


     Those you trust the most are also those most capable of fooling you.  

 

     And yes, it’s easier to fool you than to convince you you’ve been fooled.


     I hope you will read the book. 


     Tim Sawyer, MD
