I
If I were worried about being right about everything, no doubt I would be having a rough time of it with confirmation bias. I would start to think about something, form an opinion, become petrified that this was merely me seeing what I wanted to see, and so explore the opposite opinion, only to become petrified that this too was me seeing what I wanted to see, just from the opposite point of view: a kind of grim truth to wear like a hairshirt, to convince myself of my epistemic virtues.
I’m not in this spiral, largely because I’m not aiming to be right. That is to say, my primary aim in inquiry (outside of day-to-day life) is not to have true, justified beliefs. I’ll justify this extraordinary claim in a moment, but for the sake of logical ordering, let me start by saying what I am trying to be. I am trying to be interesting and honest (and you will observe these are at least potentially in tension). By interesting, I mean bringing to the table ideas that are novel, or, more plausibly, at least novel to much of my audience. By honest, I mean not misrepresenting, or deceiving others about, the strength of the evidence for these ideas.
You’ll notice that both these virtues are other-directed. That is, I have described my approach to thinking in terms of interacting with others. So now we come to why I’m not particularly worried about confirmation bias. My goal isn’t to be right individually. My goal is to make my little contribution to the world as a whole getting it right.
Given that overarching goal, bringing novel thoughts forward while seeking not to exaggerate my case or distort “the discourse” is the best I can do.
In a way, the fact that I am prone to confirmation bias might even be a good thing from the point of view of the world getting it right. It allows me to range deep into epistemic tunnels that, from an outside view, probably lead nowhere but just might lead to a golden trove. The confidence that allows me to plumb these tunnels is, from a certain perspective, irrational, but in expectation it may make society overall more likely to arrive at true beliefs.
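To make “in expectation” concrete, here is a toy back-of-the-envelope calculation. Every number in it is invented for illustration; nothing here is drawn from the essay itself:

```python
# Toy numbers for the "epistemic tunnels" point above (all invented).
p_trove = 0.01   # chance a niche line of inquiry hits a golden trove
payoff = 500     # social value of a genuine find, in arbitrary units
cost = 1         # cost of one thinker digging in vain, same units

# Expected social value of one overconfident explorer:
ev = p_trove * payoff - (1 - p_trove) * cost
print(f"expected value per explorer: {ev:+.2f}")   # +4.01: positive
print(f"...even though {1 - p_trove:.0%} of explorers find nothing")
```

On these made-up numbers, a population of explorers whose individual confidence is “irrational” still comes out ahead in expectation, even though almost every one of them is wasting their time.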
In many ways, adopting this perspective has freed me from a poisonous dialectic. The man who wants to be right in everything he propounds is alternately caught by the fear that he is wrong and by an overabundance of confidence that he is right. Both leave you less limber.
II
Why adopt this perspective rather than a more individualistic approach to rationality? There are really two questions here, viz:
What is intrinsically more important, society being right or you being right?
If society being right is more important than you being right, what’s the best way, in expectation, to help society become more correct? How can we be sure it’s not just each individual trying to be rational?
I think the first question almost answers itself. The second is more interesting; I hope to treat it in more detail at a later date.
However, as a preliminary sketch, I would suggest that a society in which every single individual tried to hold correct beliefs, and pursued that in a rational way, would probably underinvest in niche ideas and possibilities. Eccentricity ensures a diversity of ideas.
We also have to consider what I have elsewhere termed the paradox of the crowd. If you want to be right as an individual, your best bet is to adopt the most widely held beliefs, on the basis of a wisdom-of-the-crowds argument, except in cases where there is reason to expect systematic bias. However, if everyone did this, overall epistemic quality would go down. The crowd can only be wise if the individuals who make it up are not too eager to follow its wisdom. (This is a bit like the problem of a stock market becoming overburdened with index-fund investors.)
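To see the paradox in action, here is a toy simulation, entirely my own sketch rather than anything from the essay proper. In the spirit of the Condorcet jury theorem, each independent voter is right 60% of the time, while “herders” simply copy the running majority; the parameters (the 60% accuracy, the herding rule, the crowd size) are all illustrative assumptions:

```python
import random

def run_trial(n_agents=101, herd_fraction=0.0, p_correct=0.6):
    """One round of voting on a binary question whose true answer is True.
    Independent agents are right with probability p_correct; herders copy
    the majority of the votes already cast (or vote independently on a tie)."""
    votes = []
    for _ in range(n_agents):
        if random.random() < herd_fraction and votes:
            yes = sum(votes)
            if yes * 2 != len(votes):            # an actual majority exists
                votes.append(yes * 2 > len(votes))   # copy the crowd
                continue
        votes.append(random.random() < p_correct)    # independent judgment
    return sum(votes) * 2 > len(votes)  # did the crowd's majority get it right?

def crowd_accuracy(herd_fraction, trials=5000):
    return sum(run_trial(herd_fraction=herd_fraction) for _ in range(trials)) / trials

for f in (0.0, 0.3, 0.6, 0.9):
    print(f"herding fraction {f:.1f}: crowd right {crowd_accuracy(f):.1%} of the time")
```

With no herding, the majority of 101 such voters is right roughly 98% of the time; as the herding fraction rises, early votes cascade and the crowd’s accuracy collapses back toward that of a single voter. That is the paradox in miniature.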
III
I previously suggested the two cardinal virtues for someone trying to help society get it right are being interesting and being honest. There’s a third virtue as well, but it comes in at another stage of analysis.
Let’s say that your aim is to help society find truths, but suppose that not all truths are equally in everyone’s interests. You then start to think about which truths are, so to speak, likely to suffer from underinvestment.
I’m sure you’ll agree that at least one set of underinvested truths will be those that, if uncovered, can help advance the position of the weak at the expense of the strong (1).
So we come to another epistemic virtue, which I call acting in service: helping to advance those ideas that are likely to be systematically squelched because they are inconvenient and difficult for those who hold power. You will note that while this virtue of service could be justified in non-epistemic terms, our argument for it here is framed purely in terms of the epistemic health of society: ideas that disadvantage the powerful are likely to be systematically neglected, so there is value in focusing our investigations there.
A couple of things you might like if you enjoyed this post: my free book, Live More Lives Than One, and my subreddit, r/PhilosophyBear. Please share this post if you liked it.
————————————————————————
Footnote:
(1) Not that this epistemic case is the only general case for favoring the weak over the strong, or even the best such case. However, it is one case, and it is the one relevant to this essay. As an obiter dictum: probably the best argument for favoring the weak in general is that, being weaker, they are by definition less likely to have what they deserve and/or need.