Doesn't an obligation to your future self arise naturally, precisely because of the high degree of similarity to them (i.e. you expect them to have similar values, so you have instrumental reasons to value their existence/wellbeing/power)? This seems consistent with all of A, B and C at once.
Right, but so long as similarity is not 100%, there should be some time discounting.
I don't know if it's correct to call that time discounting per se, or at least pure time discounting, since it's instrumental. Consider, for instance, that you'd value having a job much more during the Great Depression than ten years before or after it. I don't think moral philosophers or decision theorists would find that irrational; instrumental goals shifting in priority in response to changes in the environment seems pretty reasonable (and commonplace).
Similarly, you could say that the (expected) degree of similarity to your future self is a feature of the environment (a vague and hand-wavy term, but it'll do here). On that view, Travyon1 could end up discounting the interests of Travyon2 (tomorrow's self) less than those of, say, Travyon3 (the self twenty years from now), while remaining completely free of pure time discounting with respect to intrinsic goals.
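A rough way to formalize the distinction (a sketch only; the welfare terms and similarity weights are my own illustrative notation, not anything anyone here committed to): pure time discounting weights a future self's welfare by a function of delay alone, while similarity-based weighting uses expected similarity, which need not shrink with delay.

% Pure time discounting: the weight on the welfare u_t of your self at time t depends only on the delay t
U_0^{\text{pure}} = \sum_{t > 0} \delta^{t} u_t, \qquad 0 < \delta < 1

% Similarity-based weighting: s(t) is your current expected similarity to your self at time t
U_0^{\text{sim}} = \sum_{t > 0} s(t)\, u_t

Since nothing forces s(t) to decrease monotonically in t, the nearer self can get a higher weight than the more distant one, or a lower one if similarity is expected to dip and then recover, without any pure time preference entering the picture.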
A test case for similarity as a basis for time discounting would be a case where you expect to become less like your current self for a while and then go back to being more like your current self.
Dreaming may be a case like this! The me that is asleep is much less like current waking me than other waking mes are, and I do feel that I probably care about dreaming me less. Not zero, but even granting that dream-harms aren't real and that I'm less likely to remember them, I'd still rather my dreaming self experience a harmless pain than my waking self. There are other confounds there too, of course.