I want to, as it were, speak of the elephant in the room of traditional epistemology. Now, maybe this is a mendacious elephant- maybe this is a lying pachyderm- but it seems odd that no one has taken the time to talk about it. That elephant is this: there’s a way of seeing the Bayesian project in the theory of rational beliefs as simply replacing, wholesale, many of the traditional concerns and positions of epistemology. Now, I hasten to add that I’m not an expert on epistemology, Bayesianism or, for that matter, the family Elephantidae, but I look on, from my amateurish position, and it seems to me there are some things that aren’t being said openly about traditional epistemology which maybe should be.
Traditional epistemology- the big positions
The two big views in traditional epistemology are coherentism and foundationalism:
“A belief or set of beliefs is justified, or justifiably held, just in case the belief coheres with a set of beliefs, the set forms a coherent system or some variation on these themes.”
i.e. we look at all the things you believe and consider them justified so long as they have sufficient coherence.
Meanwhile, according to foundationalism:
“(a) there are some “basic” or “foundational” beliefs that have a positive epistemic status—e.g., they count as justified or as knowledge—without depending on any other beliefs for this status, and (b) any other beliefs with a positive epistemic status must depend, ultimately, on foundational beliefs for this status.”
The debate between these two views is a very old one. Must beliefs form a self-consistent web, or a tower rising from secure but unproven foundations?
Bayesianism, meanwhile, holds that we must update our beliefs according to the probability calculus. This is a very comprehensive doctrine- given an initial set of beliefs expressed as degrees of belief, it will guide you through changing them pretty completely. Moreover, there are powerful proofs- dynamic Dutch book arguments- that doing anything except this is irrational. Subjective Bayesianism holds, further, that our initial beliefs- our priors- are constrained only by coherence requirements. Now it’s important to be clear that the requirement that priors be coherent is not equivalent to saying, as coherentism says, that beliefs are justified because they are coherent. Foundationalism also requires coherence in this sense- everyone wants coherence, so coherence isn’t enough for coherentism. Subjective Bayesianism is thus, at face value, not committed to Coherentism, or for that matter Foundationalism.
So on the Subjective Bayesian (SB) approach, we have arbitrary but coherent initial priors, and then we update those priors as we receive new evidence, in accordance with the laws of the probability calculus. If we do that right, our changes to our beliefs are in some sense warranted.
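The update rule doing the work here is just conditionalization via Bayes’ theorem. A minimal sketch in Python- the hypothesis, evidence, and all the numbers are made up purely for illustration:

```python
# Bayesian conditionalization: on learning evidence E, the new credence in
# hypothesis H becomes P(H | E) = P(E | H) * P(H) / P(E).

def conditionalize(prior_h, likelihood_e_given_h, likelihood_e_given_not_h):
    """Return the posterior credence in H after observing evidence E."""
    # Total probability of the evidence under the old credences (law of
    # total probability over H and ~H).
    p_e = (likelihood_e_given_h * prior_h
           + likelihood_e_given_not_h * (1 - prior_h))
    return likelihood_e_given_h * prior_h / p_e

# An arbitrary (but coherent) prior of 0.3 in H, updated on evidence that
# is far more likely if H is true (0.9) than if it is false (0.2).
posterior = conditionalize(prior_h=0.3,
                           likelihood_e_given_h=0.9,
                           likelihood_e_given_not_h=0.2)
print(round(posterior, 3))
```

On the subjective Bayesian picture, nothing constrains the 0.3 beyond coherence; what is rationally assessable is whether the move from 0.3 to the posterior followed this rule.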
It seems to me that if we take subjective Bayesianism seriously, then coherentism and foundationalism are just done away with, and to the best of my limited knowledge, no one has said this out loud yet. Subjective Bayesianism can be seen as embodying an alternative to Coherentism and Foundationalism which we might call Dynamism:
Epistemological Dynamism: Beliefs, so long as they are consistent, aren’t justified or unjustified, only changes to our beliefs are justified or unjustified. Coherentism and foundationalism are hence category errors- they try to explain something that doesn’t exist (the justification of beliefs).
Subjective Bayesianism as Epistemology (SBE) seems to me to be a kind of dynamism about epistemology, though perhaps not the only sort. Because it has some metric by which priors might be wrongheaded- inconsistency- we could perhaps call it moderate dynamism, but there is nevertheless an important sense in which most of the action on the Bayesian story is in changes to degrees of belief. That’s the primary locus of normative epistemology on this view.
Dynamism and the problem of obviously irrational beliefs
At face value, dynamism does not seem true to life. Suppose I ask Sam what Jessica does for a job and Sam says “She’s a category theorist.” I ask him how he knows that and he simply replies, “Oh, I just think that.” There seems to be a pretty important sense in which this belief is unjustified. Can dynamism explain this? I think it can.
It’s very unlikely that Sam, upon meeting Jessica, just arbitrarily decided- as a pure prior having nothing whatsoever to do with her behavior, appearance or his background information- that Jessica was a category theorist. He almost certainly inferred this from something, and if that inference wasn’t a strong one, which is probably the case because of his non-committal answer, he has changed a belief of his in an unjustified way at some point. Hence he has been irrational by the dynamist’s lights.
But what if he really did, for no reason whatsoever, just assign a high credence to Jessica being a category theorist? Saying there’s no problem with this seems like quite a bullet to bite!
Yes, sure, it does seem like a bullet to bite, but it’s not as if the subjective Bayesian wants to be a subjective Bayesian. Subjective Bayesians would love nothing more than to uncover a scheme for assigning priors binding on all rational agents; it’s just that so far they think- with some plausibility- that the search for one hasn’t succeeded. No one has a fleshed-out way to make prior assignments non-arbitrary, at least not yet. We may just live in a world where, sadly, priors are largely beyond the reach of normative epistemology.
Furthermore, even if the search does succeed at some point- even if we find a rationally binding way to assign priors- there is no reason to think that it will recover foundationalism or coherentism. Suppose a form of objective Bayesianism turns out to be correct on which the principle of indifference is the rule by which we should form priors on novel subjects. All well and good, but this seems to me neither especially foundationalist nor coherentist.
Subjective Bayesian epistemology and the problem of skepticism
Subjective Bayesianism even offers a kind of solution to the problem of skepticism- a solution with a broadly Moorean flavor. I started out with a set of priors endowed upon me by nature. I then continued to update these beliefs as appropriate. These beliefs give a very high probability to the claim that the world is as it seems to be, and a very low probability to the claim that I am being deceived by an evil demon. Unless for some reason I should have updated my beliefs to give a high probability to the claim that I am being deceived by an evil demon or equivalent [and perhaps a positive argument for such can be found in the simulation argument], I should keep my beliefs as they are. Since priors themselves are no longer in need of justification, Moore was right all along. We started out with a high degree of belief in the external world, we’ve run into nothing to compel us otherwise, and so we will remain. Even if you don’t buy my total story about replacing traditional epistemology with Bayesianism, I think there’s something interesting about this answer to skepticism.
There’s a depth here- I think, for reasons I won’t completely outline in this piece, that the external world skeptic needs a kind of suspension of belief to be an option, but that’s not possible in the Bayesian framework. When you consider a question you must, from a Bayesian point of view, have a prior on that question- there is no such thing as neutrality. You might think that you can withhold belief in both P and ~P, and in a way you can, but it only amounts to assigning both P and ~P a credence of 50% each. This Bayesian picture of credence, on which judgment is never really suspended, only affirmative, negative, or uncertain, is toxic to the skeptical worldview. The challenge for the skeptic about the external world is either to reject Bayesianism and defend genuine suspension of belief as an option, or to show why people must lower their high credence in the existence of the external world to one low enough that they can no longer plausibly say “I know the external world exists”.
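The point that there is no neutral stance follows from the probability axioms themselves: credences in P and ~P must sum to 1, so “withholding” collapses into a definite assignment. A trivial sketch, just to make the arithmetic explicit:

```python
# Coherence (the probability axioms) forces P(~P) = 1 - P(P), so there is
# no "no credence" state: maximal uncertainty is the 0.5 / 0.5 assignment,
# a definite credence rather than a suspension of judgment.

def credence_in_negation(credence_in_p):
    """Credence in ~P forced by coherence, given a credence in P."""
    assert 0.0 <= credence_in_p <= 1.0
    return 1.0 - credence_in_p

# "Withholding" on P still pins down a definite credence in ~P:
print(credence_in_negation(0.5))
```

Any attempt to assign no credence at all simply fails to be a probability function, which is why suspension is not an option inside the framework.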
The flaws of Bayesianism
Of course, Bayesianism is not a fully worked-out approach to epistemology. People are still arguing over the difference between subjective and objective Bayesianism. Bayesianism seems to require logical omniscience, and it has many other problems. It is unclear to what degree it is good enough to simply ‘approximate’ perfect Bayesian reasoning; maybe, as with economics, imperfectly imitating the perfect is a poor strategy. Still, there does seem to be at least a sense in which Bayesianism is knocking menacingly on the door of traditional epistemology.
Think about a perfect Bayesian reasoner with priors over everything (constrained only by coherence), updating as appropriate. It seems to me there’s little here that corresponds with the concerns of traditional epistemology. Furthermore, Bayesianism seems to have a number of powerful arguments going for it- like the aforementioned Dutch book arguments, which indicate we must update our beliefs in accordance with it. I’m unsure, then, whether there’s anything more to say. Perhaps the Bayesian story is wrong. Perhaps the Bayesian story is incomplete. Perhaps there is some profound sense in which good Bayesian reasoning conforms to the dictates of foundationalism, coherentism or perhaps both. Still, there seems to be at least a prima facie case for Bayesianism as total epistemology, or for replacement Bayesianism- for the thesis that Bayesianism is, in large degree, the successor of traditional epistemology.
Bonus Edit: Also, it seems to make the a priori/a posteriori distinction redundant, or at least in need of updating.
Bonus, bonus edit: Also a fully worked-out version of this needs to say something about credence versus belief, among many other topics, of course.
Bonus, bonus, bonus edit: Of course, since Bayesianism doesn’t deal with truths that would already be known by any logically omniscient being, we’ll need an alternative epistemological account of these.