BTW, the coherentism section cuts off without a
Thanks, fixed.
This just seems like foundationalism. You assign priors which function as foundational beliefs, then update.
I can start with a prior of 0.01 and end up thinking 0.99; it seems odd to say the 0.01 is foundational to my current beliefs, at least in the way the foundationalist intends the term "foundational."
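To make the arithmetic concrete, here's a minimal sketch of how iterated conditionalization can carry a 0.01 prior past 0.99. The likelihood ratios and number of updates are made-up illustrative values, not anything from the post:

```python
# Minimal sketch (hypothetical numbers): repeated Bayesian updating can move
# a prior of 0.01 past 0.99, which is why the starting prior looks nothing
# like a fixed "foundation" of the resulting belief.

def update(prior: float, likelihood_ratio: float) -> float:
    """One round of Bayes' rule in odds form, using the likelihood
    ratio P(E|H) / P(E|not-H)."""
    odds = prior / (1 - prior)        # probability -> odds
    odds *= likelihood_ratio          # Bayes' rule in odds form
    return odds / (1 + odds)          # odds -> probability

credence = 0.01                       # skeptical starting prior
for _ in range(7):                    # seven pieces of evidence, each four
    credence = update(credence, 4.0)  # times likelier under H than not-H
print(round(credence, 3))             # -> 0.994
```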
What about the propositions you conditionalize on when you update your credences (evidence)? Those seem kind of like basic beliefs.
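One way to see the force of this point: strict conditionalization sends the probability of the evidence proposition to 1, after which it is never revised. A toy sketch, with an entirely made-up joint distribution:

```python
# Minimal sketch (hypothetical toy distribution): after conditionalizing,
# the evidence E gets probability 1 and behaves like an unquestioned
# "basic belief".

# Joint distribution over binary propositions H and E (made-up numbers).
joint = {("H", "E"): 0.30, ("H", "~E"): 0.10,
         ("~H", "E"): 0.20, ("~H", "~E"): 0.40}

def conditionalize(joint, observed="E"):
    """Zero out worlds incompatible with the evidence and renormalize."""
    mass = sum(p for (h, e), p in joint.items() if e == observed)
    return {(h, e): (p / mass if e == observed else 0.0)
            for (h, e), p in joint.items()}

post = conditionalize(joint)
print(sum(p for (h, e), p in post.items() if e == "E"))  # -> 1.0: E is certain
print(sum(p for (h, e), p in post.items() if h == "H"))  # -> 0.6: P(H | E)
```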
> Subjective Bayesianism even offers a kind of solution to the problem of skepticism: a solution with a broadly Moorean flavor. I started out with a set of priors endowed upon me by nature. I then continued to update these beliefs as appropriate. These beliefs give a very high probability to the claim that the world is as it seems to be, and a very low probability to the claim that I am being deceived by an evil demon. Unless for some reason I should have updated my beliefs to give a high probability to the claim that I am being deceived by an evil demon or equivalent [and perhaps a positive argument for such can be found in the simulation argument], I should keep my beliefs as they are. Since priors themselves are no longer in need of justification, Moore was right all along.
This (false dichotomy, heuristic-driven post-hoc rationalization) seems like a fine example of how Bayesianism implodes when one tries to move from the abstract to the concrete. At least Rationalists would throw a "probably" in there to make it more rhetorically persuasive!
And of course: apologies for "pedantry". ;)
I can only argue against arguments.
Well, that's a shame.