A better example of this is “how sure are you that 2+2=4?” It makes sense to assign a prior probability of 1 to such mathematical certainties, because they don’t depend on our uncertain world. On the other hand, how sure are you that 8858289582116283904726618947467287383847 isn’t prime?
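(As an aside, the standard way to check primality for a number that size is itself probabilistic. Here is a minimal sketch using the Miller-Rabin test, whose “prime” verdict always carries a nonzero, if astronomically small, chance of error; the 40 rounds below are an arbitrary choice:)

```python
import random

def is_probable_prime(n: int, rounds: int = 40) -> bool:
    """Miller-Rabin: False means n is certainly composite; True means
    n is prime except with probability at most 4**-rounds."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # found a witness: n is definitely composite
    return True  # no witness found: prime with overwhelming probability

print(is_probable_prime(8858289582116283904726618947467287383847))
```

So even “certainty” about arithmetic at that scale is really a 4**-40 kind of certainty: nonzero, but negligible.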
For a die in a thought experiment – sure, it can’t be 7. But in a physical universe, a die could indeed surprise you with a 7.
More to the point, why do you believe the probability that the hallucination problem will be solved (at least to the point that hallucinations are rare and mild enough not to matter) is literally 0? Do you think that the existence of fanatical AI zealots makes it less likely?
Okay, so by your logic the probability of literally everything is 1. That’s absurd, and that’s not how Laplace’s law of succession is supposed to be applied. The point I’m trying to make is that some things are literally impossible; you can’t just hand-wave that away!
And I’m not saying that solving hallucinations is impossible! What I’m saying is that it could be impossible, and I’m criticizing your blind faith in progress, because you just believe the probability is literally 1. I can’t say, for sure, that it’s impossible. At the same time, you can’t say, for sure, that it is possible. You can’t just assume the problem will inevitably be fixed; otherwise you’ve talked yourself into a cult.
I’m not saying the probability of literally everything is 1. I’m saying it’s nonzero. 0.00003 is neither 1 nor 0.
I am not assuming the problem will inevitably be fixed. I think 0.5 is a reasonable p for most questions like this.
You do not know that it is nonzero; that’s just an assumption you made up.
Also, Laplace’s law of succession necessarily implies that, over an infinite number of attempts, as long as each attempt has a nonzero probability of success, the probability that at least one of them succeeds approaches 1.
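For concreteness, here is a minimal sketch of what the rule of succession actually computes, and of why a fixed nonzero per-attempt probability (reusing the 0.00003 above as a made-up stand-in) pushes the chance of eventual success toward 1:

```python
from fractions import Fraction

def laplace_estimate(successes: int, attempts: int) -> Fraction:
    """Rule of succession: estimated probability that the next
    attempt succeeds, after `successes` out of `attempts`."""
    return Fraction(successes + 1, attempts + 2)

# With zero successes so far, the estimate shrinks but never hits 0:
print(laplace_estimate(0, 0))     # 1/2, the uninformed prior
print(laplace_estimate(0, 10))    # 1/12
print(laplace_estimate(0, 1000))  # 1/1002

# And if each independent attempt really does succeed with some fixed
# p > 0, the chance that at least one of n attempts succeeds is
# 1 - (1 - p)**n, which approaches 1 as n grows:
p = 0.00003
for n in (10**3, 10**5, 10**7):
    print(n, 1 - (1 - p) ** n)
```

The first half is the “never exactly 0” point; the second is why “nonzero per attempt” and “eventually near-certain” are two different claims that happen to be linked.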