(06-07-2025, 11:30 PM)Mauro Wrote: but nothing of the knowledge one has, needs to go into a 'prior'. Knowledge = evidence, and it can be uniformly treated as such without separating some of that evidence into a 'prior' and calling 'evidence' only what remains. [...] the difficulty in determining the odds (of each piece of evidence) cannot be used as a criticism of Bayesian logic: it's a problem inherent in the data at hand, not a specific feature of Bayesian reasoning.
Again, the probability of a proposition is not a measurable property of the proposition itself. It is a numeric expression of
one's belief that the proposition is true. Thus it is inherently subjective, and depends on everything one knows about the circumstances and on all arguments and computations one can make.
Probability Theory (PT) does not tell us how to choose probabilities. It only tells us how to combine probabilities that we have chosen for some propositions in order to obtain probabilities for related propositions in a consistent manner.
PT does not say that Prob(head) = 0.5 in a coin toss. That is merely a choice that most people will make based on their knowledge (scientific or intuitive) of the mechanics of a coin toss, and/or their experience with doing them.
PT does say that, if one's Prob(head) is 0.7, then one's Prob(tail) should be 0.3, provided one believes that those outcomes are distinct and there is no other possible outcome.
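In code, that consistency constraint is trivial (a toy sketch; the 0.7 is the subjective choice, PT only forces the rest):

```python
# The chosen probability is subjective; PT does not supply it.
prob_head = 0.7

# PT then *forces* the complement, given exhaustive, mutually
# exclusive outcomes: Prob(tail) = 1 - Prob(head).
prob_tail = 1.0 - prob_head

# Consistency check: the two probabilities must sum to 1.
assert abs(prob_head + prob_tail - 1.0) < 1e-12
print(prob_head, prob_tail)
```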
And PT does say that Bayes's formula is the correct way to compute Prob(X_i|A_j), if one has already chosen the values of Prob(A_j|X_i) and the priors Prob(X_i). Belief in the formula itself is not subjective; it can be proved directly from the definition of probability and the rules of elementary logic.
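A minimal sketch of that computation, with made-up numbers for two hypotheses X1 and X2 and one observation A (the priors and likelihoods here are arbitrary choices, which is exactly the point):

```python
# Chosen priors Prob(X_i) -- subjective inputs, not outputs of PT:
priors = {"X1": 0.6, "X2": 0.4}
# Chosen likelihoods Prob(A|X_i) -- also subjective inputs:
likelihood = {"X1": 0.2, "X2": 0.9}

# Bayes's formula:
#   Prob(X_i|A) = Prob(A|X_i)*Prob(X_i) / sum_j Prob(A|X_j)*Prob(X_j)
evidence = sum(likelihood[x] * priors[x] for x in priors)
posterior = {x: likelihood[x] * priors[x] / evidence for x in priors}
print(posterior)  # X1 and X2 end up at about 0.25 and 0.75
```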
A quantity X is called "random" if one does not know its value. If one has chosen values for Prob(X=v) for all possible values v, then one can compute the entropy of X, which is a numeric measure (commonly expressed in bits) of one's ignorance about its value.
One may think that the entropy will only decrease as one gains more information about X; but, paradoxically, it can increase as well. If X is the millionth digit of pi, my Prob(X=7) (or any other digit value) is currently 0.1, because I cannot compute it and I have not looked it up; and my entropy for X would be ~3.3 bits. If someone told me that he looked it up and X is actually 5, my Prob(X=7) would drop to near zero, my Prob(X=5) would rise to near 1, and my entropy of X would drop to near zero. But if the guy then told me that he got that information from ChatGPT, my Prob(X=5) would drop back to a bit more than 0.1 and my entropy would go back to almost 3.3.
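The two entropy values in that example are easy to check (the 0.991/0.001 distribution below is just an illustrative stand-in for "near certain"):

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Total ignorance about the millionth digit of pi: ten equally
# likely values, entropy log2(10) ~ 3.32 bits.
uniform = [0.1] * 10
print(entropy_bits(uniform))

# After a trusted report that the digit is 5: near-certainty,
# entropy drops to roughly 0.1 bits.
near_certain = [0.001] * 9 + [0.991]
print(entropy_bits(near_certain))
```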
When choosing a probability, one could pretend to not know certain things; but there would be no point in doing that. Probabilities are meant to help us take decisions, and therefore we want to use all the information and computations we have.
When discussing probabilities, one could also consider what one's probability was before one gained certain information (like, what my Prob(H) was before I saw T&T's paper), or how one's probability would change if one received certain information (like what my Prob(H) would be if I were to learn that Barschius bought the VMS from Edward Kelley). Or we could assume some of the other party's probabilities in order to show them what their probabilities for related events should be (like what your Prob(H|A) should be given your Prob(H) and Prob(A|H) etc.)
Quote:Can Bayesian logic help with the VMS? Unfortunately I'm not sure it can, even for the most basic case, meaningless vs. meaningful.
Yep. My Prob(H) is obviously much smaller than yours, and it would take a lot of arguing to
perhaps change either of them...
All the best, --jorge