
October 24, 2024


Additional Thought.  Essay on Scientific Reasoning

 

Last weekend, I mentioned I would be sending a new essay I have been working on, about scientific reasoning.

 

The bottom line: scientific reasoning is an arduous, complex process, rooted in skepticism, that requires a great deal of learning and a great deal of practice. It also requires an open mind!

 

It simply is not enough to point to anecdotes or memorize facts that support the conclusion of one's political party, and then proclaim "I believe in science!" while calling those who conclude differently "science deniers."

 

The essay does contain and explain the six-step process for epistemically rational reasoning, as well as the six pitfalls of heuristics-based processing, both of which I have explained in prior essays. If you already have a handle on these, feel free to skip those sections!

 

A PDF version of the essay can be found on this website, here.

 

As always, I would appreciate your feedback.

 

I hope you are having a great fall.

 

Tim


October 20, 2024


Additional Thought.  Epistemic Crossroads to Present in NYC, and an Upcoming Essay on Scientific Reasoning


Good news.  Epistemic Crossroads’ latest submission has been accepted; for the second year in a row, we will be presenting at the annual meeting of the Society for Judgment and Decision Making. This year’s conference will be in New York City, in late November. 

 

The presentation represents an attempt to develop a model that helps people understand the degree to which their political belief-forming processes are subject to myside bias (the tendency to form beliefs biased by one’s existing beliefs, opinions, and attitudes), and to multiple other forms of “mis-thinking.”  Myside bias permeates all of our thinking; it appears to be “built in” to our reasoning processes.  Yet while it is relatively easy to see it in others, it is extremely difficult to see it in ourselves.

 

The model I will be presenting compares the way we generally reason about politics to the U.S. trial system, where jurors are forced to listen to both sides, encouraged to consider both sides, and forced to deliberate before reaching a verdict (forming a belief).  Yes, the trial system was designed this way for a reason!

 

Here is the submission.  Due to word count limits, it is abbreviated and written for an academic audience.  If you have already read the longer, clearer essay on the subject I forwarded during the summer, there is no real reason to read the submission.  I’ll forward the poster I’ll use for the actual presentation once I have finished it.

 

Early this week, I will be forwarding a new essay I plan to submit for publication, on scientific reasoning.  Yes, I remain perplexed and amazed by the extent to which people with little background in science, and minimal understanding of the scientific method, are willing to utilize judgment heuristics to form their beliefs about scientific issues that have become political, to become extremely confident in the beliefs they form, and to then refer to those who reach different conclusions as “science deniers.”  Sigh.

 

More soon ...

 

Tim

 


August 17, 2024

 

Additional Thought.  Ground Truth at MIT:  Facebook's Fact-checkers?


     At MIT, how do the scientists who study political belief formation determine political reality?  I wanted to know.

     After all, if one wanted to study the cause or causes of inaccurate belief formation, one would first need to know which beliefs are inaccurate and which are accurate.

 

     Last November in San Francisco, at the annual meeting of the Society for Judgment and Decision Making, I gave a very small presentation arguing that the primary factor almost all of us use in determining the reliability of our political information sources – that is, the honesty, objectivity, and accuracy of the newspapers, television networks, political pundits, and websites we turn to for political information – is the degree to which the sources give us information that conforms well with the political beliefs we already have.  My argument was in part a rebuttal to a 2020 book by French philosopher and cognitive scientist Hugo Mercier, Not Born Yesterday: The Science of Who We Trust and What We Believe.  The display poster I used to give the presentation can be found here.  It’s short.  Feel free to read it, of course; it’s written to make sense even if you have not read Mercier’s book.  A year later, I continue to stand by my argument and my conclusions.

 

     At the same meeting, at a much more prominent and well-attended session on how people form their political beliefs, all three presentations were given by academicians from MIT.  In the final presentation, the presenter argued that those on the political right are less apt to reach objective truth, because they are more apt to form their beliefs intuitively.

 

     Recall that perhaps the most fundamental finding of the early heuristics-and-biases work by (Nobel Prize winner) Daniel Kahneman and his research partner Amos Tversky is that when we form our beliefs about complex subjects, we tend to utilize judgment heuristics – intellectual shortcuts, basically, that allow us to form our beliefs quickly and without expending a great deal of mental energy, but that are associated with cognitive biases (such as myside bias and the tendency to form beliefs biased by the beliefs of our associates), as well as multiple other forms of “mis-thinking.”  Utilization of intuitions is the classic judgment heuristic in the world of political belief formation.  In other words, the MIT presenter was arguing that those on the right are less likely to reach objective truth regarding complex political issues, because they are more likely to utilize judgment heuristics.

 

     In his presentation, the speaker from MIT cited two examples of false political beliefs common to those on the right:  that the 2020 election was stolen, and that the Covid-19 vaccines have poor efficacy.  He did not cite examples of false political beliefs common to those on the left.

 

     How, I wondered, did the MIT speaker form his own beliefs about the 2020 election and Covid vaccine efficacy?  It’s MIT, after all, so he must have extensively analyzed all of the relevant data, making every effort to remain as open-minded as possible, before reaching conclusions about these complex scientific and political issues.  He must have utilized something that resembles the six-step epistemically rational belief-forming process I provide on the EpistemicCrossroads.com website, and in the essay I forwarded a few weeks ago.  Right?

 

     At the end of his presentation, as he opened it up to questions from the floor, I raised my hand. I found the exchange enlightening.  You might, too.


     I had already asked a question of one of his colleagues that was perhaps a bit off-putting, so he did not seem to want to call on me.  He glanced at me and then scoured the room, but finding no one else with their hand up, he pointed to me.

 

MIT presenter:  Okay, you.

 

Me:  Thank you.  That was an excellent presentation.  You referred to beliefs about 2020 election integrity and Covid-19 vaccine effectiveness.  Is it safe to assume that you believe that the 2020 election was not stolen, and that the Covid-19 vaccines are highly effective?

 

MIT presenter, nodding in agreement:  Yes.

 

Me:  How did you form these beliefs?  Did you form them intuitively, and is it possible your conclusions are influenced by myside bias? (Recall from my other writing, and from the EpistemicCrossroads.com website, that myside bias is the tendency to form beliefs that are biased by one’s existing beliefs, opinions, and attitudes, including one’s political leanings.)

 

MIT presenter, after pausing:  No, I used information sources that are considered highly reliable.  (He named a few mainstream, highly recognizable media entities.)

 

Me:  How did you determine the reliability of your information sources?  Do you have a method for carefully vetting them?  Or did you determine their reliability intuitively, and again, is it possible you were influenced by myside bias?

 

MIT presenter, after pausing:  No, we use highly reputable fact checkers to ensure the information is accurate.

 

Me:  There are a lot of fact checkers out there.  How did you determine which fact checkers are reliable?  Did you determine this intuitively, and again, is it possible myside bias was a factor?

 

MIT presenter:  No, definitely not.  We use the same fact checkers that the most reputable online sources utilize.  We use the same fact checkers that organizations such as Facebook turn to.

 

At that point, someone from the audience (presumably also from MIT) jumped in and turned to me: There are several studies showing that these fact checkers are highly reliable in determining what’s true and what’s false.

 

     The Society for Judgment and Decision Making is an academic society, and politically, social science academia resides almost exclusively on the hard left.  I’m a relatively new member, and I was concerned about jeopardizing opportunities to present at future meetings, or worse yet, about getting kicked out of the club.  The next question I wanted to ask is probably obvious to you, but at that point, I decided to refrain from pushing it any further.  I simply thanked them, and the presenter moved on to another questioner.

 

     I’ll let you determine for yourself whether Facebook’s fact checkers are reliable (honest, objective, and accurate) bestowers of truth, and whether the studies showing their reliability were conducted in an objective and scientifically rigorous fashion.  If you are typical, the answers may very well depend on whether their true-versus-false determinations are consistent with what you already believe.

 

     So that’s how the scientists at MIT – at least, these particular scientists – determined their political reality regarding the integrity of the 2020 U.S. presidential election, and the effectiveness of Covid-19 vaccines.

 

     Now you know.

 


July 15, 2024


Some Additional Thoughts related to yesterday’s attempted assassination of President Trump:

 

Much detail has already unfolded, but think back to the initial minutes after you learned that President Trump had fallen during his rally, following several “loud popping noises” (as some media outlets put it) – while chaos still reigned, and before you knew a single thing about the shooter.

 

If you are on the left, did you immediately have some suspicion that this might be fake; that this might be a set-up by the Trump team to garner support for Trump and to blame the left?

 

If you are on the right, did you immediately suspect it was an assassination attempt organized by the Deep State to take Trump out?

 

If you are on the left, were you supportive of the left-oriented press’s initial refusal to conclude Trump had been shot, and then its initial refusal to conclude it was an attempted assassination?

 

If you are on the right, did you immediately conclude that the left-oriented press’s initial refusals to fully endorse the shooting and assassination attempt narratives were purposefully biased?

 

If you are on the right, did you immediately conclude that the shooter had been radicalized by repeated claims by the left that Trump is Hitler, and that he and his supporters are a threat to democracy?

 

If you are on the left, did you immediately conclude that at least part of what motivated the shooter was related to Trump’s rhetoric and MAGA extremism?

 

If the answer to any of these questions is yes, you just might be experiencing myside bias.  

 

I say this tongue-in-cheek; to at least some degree, we ALL experience myside bias, on a regular basis. Myside bias is everywhere. It permeates all of our thoughts.  Those who recognize this, and who take steps to minimize myside bias, can come closest to objective truth.

 

To paraphrase Keith Stanovich, myside bias occurs when new beliefs are influenced by existing beliefs, opinions, and attitudes, and when those beliefs, opinions, and attitudes are convictions (very strong beliefs in which we have emotion and even ego invested).  Our views on climate change, equity, equality, abortion, gun rights, systemic racism, and of course party loyalty are examples of conviction-level beliefs.  Myside bias is different from confirmation bias, whereby new beliefs are biased by established facts.

 

It’s worth remembering that not everyone experiences myside bias to the same degree.  It’s also worth remembering that myside bias tends to be strongest in the highly intelligent and the highly educated.

 

Finally, please note that I am not addressing the accuracy of the beliefs you formed in the initial post-shooting moments.  I’m only addressing your initial intuitive responses, and why you had them. Many intuitive (heuristics-based) beliefs, of course, turn out to be accurate.

 

Tim

 


July 5, 2024


Hi everyone.

 

I’m finally getting back to writing essays and other Additional Thoughts related to today’s political world.  It took more time than I expected to do some deep dives into a few concepts I felt would help strengthen the book.  I’m new to the publishing process; it’s slow and arduous!  The “new and improved” version of the book should be released on Amazon (and other online booksellers) in the next few weeks.  And I’m of course always happy to forward you (and anyone else who wants one) a PDF version for free.


I also just finished submitting an academic abstract for presentation at this fall’s annual meeting of the Society for Judgment and Decision Making.  I was able to give a couple of (small!) presentations at academic meetings in 2023 as well; I’ll fill you in on all of this in future Additional Thoughts.

 

I’m now working on a series of essays I hope to have published.  They are meant to be read by the general population – that is, by people who are not experts in the science of epistemic rationality.  In aggregate, they are designed to give a reasonably complete picture of epistemic rationality science as it relates to politics:  what the key flaws in our reasoning and learning processes are, how to reason better, and why in today’s world, it really, really matters.  I’m a better technical writer than I am a writer for general audiences; I would love your feedback (even if negative!).

 

I’m not overly optimistic that these essays will be accepted by a reputable (and widely read) online publisher.  I’m a complete unknown, obviously.  The subject is not as sexy as, say, the latest high-level political scandal.  And besides, who is open to being told that their reasoning is deeply flawed; that they are not particularly rational (epistemically speaking) when they reason about politics; that the degree of faith they have placed in various news and information sources has little to do with whether those sources are accurate, objective, and honest; that, subconsciously, objective truth isn’t even their primary goal when they reason and learn about politics; and that their perception of political reality is basically a construct, consisting of a series of narratives (arguments) created and retained more for how well they fit with each other and with one’s worldview than for their accuracy?


The first of these essays, here, is an initial attempt to tackle an issue I have been thinking a lot about lately:  assuming it is possible (it probably is not), how do you get the masses (or even individuals) who are not open to the idea to accept that their political reasoning is deeply biased and flawed?  


Again, I would love your feedback.

 

 I hope you are having a great summer.

 

Tim

 

 

November 19, 2023


Hello!  I'm finally back, having taken some time off to finish my book (now available on Amazon and most other online bookseller websites), and to work on two theoretical presentations for academic meetings.  I'll be posting more frequently again from here on out.


I presented the first theory June 1, 2023, in Palm Springs, CA, at the annual meeting of the Human Behavior and Evolution Society.  The gist is that, based on the ways in which human reasoning and human learning evolved over hundreds of thousands of years, and based on the purposes for which they did so, both myside bias (the tendency to base new beliefs on our existing convictions, including our party alignment) and the tendency to base new beliefs on the beliefs of our associates are "built in."  Our brains are hard-wired to think in ways that support our team, and hard-wired to learn by adopting the beliefs of others.


The poster I used to present at the above meeting can be found here.


And here is a much longer essay, written to be understood by the general public.


Today, I will be presenting a separate theory, on the error-prone ways in which we assess the reliability of our political information sources, at the annual meeting of the Society for Judgment and Decision Making in San Francisco.  You can view it here.


Hope you are doing well!


Tim

