8 Comments
Sep 10, 2023

BTW, the coherentism section cuts off without a

author

Thanks, fixed.


This just seems like foundationalism. You assign priors which function as foundational beliefs, then update.

author

I can start with a prior of 0.01 and end up thinking 0.99; it seems odd to say 0.01 is foundational to my current beliefs, at least in the way the foundationalist intends the term "foundational".


What about the propositions you conditionalize on when you update your credences (evidence)? Those seem kind of like basic beliefs.

Sep 10, 2023·edited Sep 10, 2023

> Subjective Bayesianism even offers a kind of solution to the problem of skepticism- a solution with a broadly Moorean flavor. I started out with a set of priors endowed upon me by nature. I then continued to update these beliefs as appropriate. These beliefs give a very high probability to the claim that the world is as it seems to be, and a very low probability to the claim that I am being deceived by an evil demon. Unless for some reason I should have updated my beliefs to give a high probability to the claim that I am being deceived by an evil demon or equivalent [and perhaps a positive argument for such can be found in the simulation argument], I should keep my beliefs as they are. Since priors themselves are no longer in need of justification Moore was right all along.

This (false dichotomy, heuristic-driven post-hoc rationalization) seems like a fine example of how Bayesianism implodes when one tries to move from the abstract to the concrete. At least Rationalists would throw a "probably" in there to make it more rhetorically persuasive!

And of course: apologies for "pedantry". ;)

author

I can only argue against arguments.


Well that's a shame.
