Some more probability thoughts
Jul. 9th, 2010 10:30 am
Weirdly, it isn't the counterintuitive 13/27 answer that caused the controversy (thus showing that you guys actually do understand numbers), but whether the starting point is a half or a third (thus showing that we don't really understand semantics, language and philosophy).
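(If you want to check that 13/27 number for yourself, here's a quick simulation sketch under the standard assumptions: two children, sexes independent and 50:50, birth days uniform across the week, and we only keep families with at least one boy born on a Tuesday. The code and its names are my own scaffolding, not anything from the original puzzle.)

```python
# A minimal sketch of the "boy born on a Tuesday" puzzle, under the
# standard assumptions stated above.
import random

def child():
    # (sex, day of week), with day 2 standing in for Tuesday
    return (random.choice("BG"), random.randrange(7))

trials, kept, both_boys = 500_000, 0, 0
for _ in range(trials):
    kids = [child(), child()]
    if any(k == ("B", 2) for k in kids):        # at least one Tuesday boy
        kept += 1
        both_boys += all(s == "B" for s, _ in kids)

print(both_boys / kept)  # converges to 13/27, roughly 0.481
```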
I've been thinking about this "two children" problem, and I think I've come up with an interesting, and probably solvable (given a few assumptions about certain likelihoods) poser.
Let's apply this to gambling!
A man walks in. Slaps his palm down on the table on top of two coins and looks at them. He says to you (Mr Aardvark) "There are two coins here. At least one of them is a head."
A second man (a Mr Birks) (who knows no more than you do about this state of affairs) now says: "I will offer you 3-to-2 on both coins being heads".
Do you take the bet?
Well, you, Mr Aardvark, are in the 50:50 camp. So obviously you do.
As it happens, Mr Birks wins the bet (the second coin is a tail), and you are ten quid down.
"Mr Aardvark", scoffs Mr Birks, "You are a fool! Everyone knows that the chance of the other coin being a head is one-third!"
But you, Mr Aardvark, are not non-plussed. "Double or nothing?" you say.
"Hah HAH!" says Mr Birks. "This is going to pay for my holiday!"
The first man then walks out of the door. Ten seconds later he comes back in. He repeats the exercise, and says:
"At least one of these coins is a tail."
At this point, Mr Birks (foolishly) says "ahh, it's exactly the same problem, so clearly the chance of the second coin being a tail is still 1/3." He offers you 3-to-2, and you absolutely lump on, because now you are certain (incorrectly, as it happens, but we can cover that later) that the chance of the other coin being a tail is 50%.
Let's take the sample of four.
(A) Coins are HH. Man says "At least one of these coins is a head"
(B) Coins are TT. Man says "At least one of these coins is a tail"
(C) Coins are TH. Man says "At least one of these coins is a tail"
(D) Coins are TH. Man says "At least one of these coins is a head"
The "law of restricted choice" in A and B causes the probability of the other coin to be the same to move up to 50% from 33%.
But, suppose the man comes in the second time, tosses the two coins and puts his palm down, and, once again, says... "At least one of the coins is a head". Once again, Mr Birks offers you 3-to-2 on the other coin being a head. Do you, Mr Aardvark, still lump on?
This is a tougher prospect.
Clearly, therefore, what matters here (in terms of making money) is the range of "choices of what to say" available to the man who tosses the coins.
Suppose he comes in, tosses the two coins, looks at them. Says nothing. And then tosses the two coins again. At which point he says "At least one of the coins is a head".
Even you, Mr Aardvark, will deduce from the fact that the man said nothing after the first toss that there is a state of affairs that commits him to silence. Thus, when he makes the second toss and states "At least one of them is a head", you will induce (perhaps incorrectly, but it's a reasonable induction) that his previous silence was because he had tossed two tails.
At this point you, Mr Aardvark, will conclude that the chance of the other coin being a head is 1/3.
In real life, of course, Mr Birks would deduce, as soon as the man said the second time "at least one of them is a tail" (having said the first time that "at least one of them is a head"), that the chance of the other coin being a tail was, because of the law of restricted choice, 50%, not 33.3%.
The key, therefore, is what the man is instructed to say. Even the above is simplistic.
I have, in fact, assumed two possibilities (nos (1) & (4) below) when there are more than this. Remember, we are talking about real prop bets here.
1) The man is told (or has decided) to say either "at least one is a head" or "at least one is a tail". If it's a head and a tail, he can say either.
2) The man is instructed to say "at least one is a tail" ONLY if both are tails. Otherwise, he says "at least one is a head".
3) The man is instructed to say "at least one is a head" ONLY if both are heads. Otherwise, he says "at least one is a tail".
4) The man is instructed to say "at least one is a head" if at least one is a head. Otherwise, he stays silent and tosses again.
5) The man is instructed to say "at least one is a tail" if at least one is a tail. Otherwise, he tosses again.
There are many other combinational possibilities, and clearly it would be erroneous to apply an equal probability to all of them. So let's just stick to cases (1) and (4) for the moment.
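(Here's a rough simulation of just these two cases, to make the gap concrete. The rule labels match (1) and (4) above; everything else is scaffolding of my own.)

```python
# A sketch comparing cases (1) and (4) above. In both runs we record how
# often BOTH coins are heads, given that the man ends up saying
# "at least one is a head". Fair coins assumed.
import random

def toss():
    return [random.choice("HT"), random.choice("HT")]

def case_1():
    # Rule (1): always speak; with one of each, pick either statement at random.
    coins = toss()
    return random.choice(coins), coins

def case_4():
    # Rule (4): say "head" if there's at least one head, otherwise re-toss.
    while True:
        coins = toss()
        if "H" in coins:
            return "H", coins

for rule in (case_1, case_4):
    said_head = both_heads = 0
    for _ in range(500_000):
        statement, coins = rule()
        if statement == "H":
            said_head += 1
            both_heads += coins.count("H") == 2
    print(rule.__name__, both_heads / said_head)  # ~0.50 for case_1, ~0.33 for case_4
```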
Now, let's go back to the first coin toss again. He tosses it, and says "At least one is a head". What is the chance that the other is a head?
If we take the simple sample here (choices of (1) and (4) being the "actual state of affairs"), we could say: "He might have been told to stay silent if neither was a head, or he might have been told to say either "at least one is a head" or "at least one is a tail" (his choice) if it was a head and a tail. These are equally likely states of affairs (under the principle of 'in the beginning, everything was even money')".
Now, this is where it gets interesting. In effect, if we are talking about having a bet on this, we know that there's an evens chance that it's 33.3% (state of affairs 4 is true), and an evens chance that it's 50:50 (state of affairs 1 is true).
This gives us the intermediate, real, I'm-prepared-to-put-my-money-on-it chance of 41.67% that the other coin is a head!
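(The arithmetic behind that figure, for anyone checking, is just an even-money blend of the two conditional chances, nothing fancier:)

```python
# The even-money blend described above: half weight on the 1/3 world
# (state of affairs 4) and half weight on the 1/2 world (state of affairs 1).
p = 0.5 * (1/3) + 0.5 * (1/2)
print(round(p * 100, 2))  # 41.67
```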
Once you add in the other "possible states of affairs", things become more complicated. I can return to that later.
Now, if the man goes out, comes back in, and says "At least one is a tail", I think that the odds shift directly to 50%.
However, if he once again says "at least one is a head", we have an interesting calculation.
It's my guess (and it is a guess -- I haven't seriously worked it out) that this would shift the best "prop bet" likelihood to about 37.5% that the other coin is a head.
For every time the man comes in and says "at least one is a head", rather than saying "at least one is a tail" or saying nothing and tossing again, you halve the distance between your previous percentage and 33.33%.
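(Applied mechanically, that halving rule gives the following sequence, creeping down towards 33.33% -- a quick sketch of the rule as stated, nothing more:)

```python
# The halving rule stated above, applied repeatedly: each further
# "at least one is a head" moves you halfway from where you are to 33.33%.
p, limit = 41.67, 100 / 3
for n in range(1, 6):
    p = (p + limit) / 2
    print(n, round(p, 2))   # 37.5, 35.42, 34.38, 33.85, 33.59
```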
If anyone wants to test this financially, I am happy to walk into a room and be the coin tosser and statement maker while two people bet against each other!
PJ
no subject
Date: 2010-07-09 01:24 pm (UTC)
Ahh, that's what I feared.
I haven't spent time thinking about this yet, but more than anything else I'm reminded of the Monty Hall problem (http://en.wikipedia.org/wiki/Monty_Hall_problem). Specifically, see the "Other host behaviours" section. Depending upon the way you ask the host to behave, you can get the probability to be pretty much anything you want.
I suspect that it's probably possible to use a similar line of thought on the "boy born on Tuesday" issue, but I haven't worked through the logic here.
no subject
Date: 2010-07-09 01:30 pm (UTC)
no subject
Date: 2010-07-09 01:48 pm (UTC)
I haven't got around to re-evaluating my (presumably false, because Those Who Know in academia, etc, disagree with me) two boys stance. Basically it boiled down to the use of language and my interpretation, based on the specific use of language, that the law of restricted choice didn't apply in the case quoted. I mean, everybody is capable of handling a "classic" four-state HH HT TH TT problem, because, as you say, the maths are trivial. (Doesn't always help me, though, as my arse-first "solution" to Monty Hall demonstrates all too clearly.)
I'll concede an assumption of defeat on yesterday's BB (I'm not being mealy mouthed, just admitting that testing that assumption is pretty fruitless) and have a think about this current formulation, which has interesting subtleties.
1 In 4, Baby, 4 In 1
Date: 2010-07-09 02:55 pm (UTC)
For added stringency, case (1) needs a verbal toughening up -- something along the lines of the man tossing a coin whilst outside to predetermine his choice of words if the result inside is HT or TH. But it's fair to assume that some such aleatory procedure takes place.
I actually agree with everything you say after "Now, let's go back to the first coin toss again." I had to think quite hard about it, but I'm fairly sure I'd have reached each separate conclusion myself, given the wording. Of course that's easy for me to say, because you've already written it down ... Not entirely sure about all the preceding stuff, but then I haven't looked carefully enough at it.
The interesting thing, of course, is that the world changes the minute the man walks in, tosses the coins, and says nothing. At that point you've eliminated (1) and the odds permanently shift to 33%. If you expect Aardvark not to spot this and consequently not adjust his bet, there's a certain amount of arbitrage available on the previous bets to make sure he thinks he's winning...
To return briefly to the BB problem, I found this (http://mathforum.org/library/drmath/view/52186.html) interesting on the topic of "choosing" the candidate family. To simplify, if you pick a boy out of the pool to determine the family, the chances are 50:50. If you pick the family out of the pool and find a boy, the chances are 33:66.
Except ... if the set of all families with two children has a cardinality of one -- there's only one such family in the whole world -- then the chances are, apparently, simultaneously 50:50 and 33:66. Which just goes to show how difficult it is to formulate these problems.
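(A quick sketch of the two selection procedures just described -- pick a boy and look at his sibling, versus pick a two-child family known to contain a boy. The setup is my own; it just assumes a large population of two-child families with independent 50:50 sexes.)

```python
# A minimal sketch of the two sampling procedures discussed above.
import random

families = [[random.choice("BG"), random.choice("BG")] for _ in range(500_000)]

# Procedure 1: pick a random child, keep the boys, and ask whether the
# sibling is also a boy.
boys_picked = sibling_boy = 0
for fam in families:
    i = random.randrange(2)              # pick one child of this family at random
    if fam[i] == "B":
        boys_picked += 1
        sibling_boy += fam[1 - i] == "B"
print(sibling_boy / boys_picked)          # ~0.50

# Procedure 2: pick a family at random from those with at least one boy,
# and ask whether both children are boys.
with_boy = [f for f in families if "B" in f]
both = sum(f.count("B") == 2 for f in with_boy)
print(both / len(with_boy))               # ~0.33
```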
Re: 1 In 4, Baby, 4 In 1
Date: 2010-07-09 03:19 pm (UTC)
(If you're going to insist on specifying the gender, no matter what, then I believe that the Bayesian formulation in the wikipedia article is the way to go.)
Re: 1 In 4, Baby, 4 In 1
Date: 2010-07-10 01:51 pm (UTC)
PJ
Re: 1 In 4, Baby, 4 In 1
Date: 2010-07-10 05:12 pm (UTC)
"Thus, when he makes the second toss and states "At least one of them is a head", you will induce (perhaps incorrectly, but it's a reasonable induction) that his previous silence was because he had tossed two tails."
Induction in the mathematical sense does pretty much what it says on the tin. You start with the assumption of no historical knowledge, and you build up from one or more minimalist preconditions. (Usually it's one, but as long as it covers the entire problem space for preconditions, that's fine.) The reason you use induction is that (a) it eliminates all things that you do not know -- or else it fails, by definition -- and (b) you can use established rules to build from a lemma to a proof.
In this case, you are clearly (and correctly) using deduction, which is mathematically a process that involves historical knowledge. I'm not sure I've got that definition completely right, so let me assert that proof by induction is a recursive proof, whereas proof by deduction is linear.
I'm prepared to consider that it is possible to present a "proof by deduction" as a "proof by induction." Within the set of proofs (given axioms blah blah don't forget Goedel blah blah) there may be proofs that are, idempotently, both inductive and deductive.
Let's say you found one of those.
In that case, using a common verb that reminds people of using mechanical and/or pharmaceutical means for causing a birth to occur is almost certainly an unfortunate alternative to using another common verb that reminds people of Sherlock Holmes.
Re: 1 In 4, Baby, 4 In 1
Date: 2010-07-10 05:54 pm (UTC)
The latter is right (and induce is wrong) when you say "It's a number from 1 to 8, but it isn't 8". I therefore deduce that it's "a number from 1 to 7".
I "induce" the general case that the silence was because he had two tails, because his statement "at least one of them is a head" after his previous silence acts in a different logical way on the previous silence after a coin toss from the way the statement "but it isn't 8" acts on the previous statement "it's a number from 1 to 8".
It was possible, for example, that the man was operating under a rule of which we were unaware (but which was not the rule that, if he threw two tails, he remained silent and tossed again) that required him to toss the coins again.
Therefore I felt that I was not using mathematical deduction in the Holmesian sense (eliminating all cases until whatever remained, must, however improbable, be the truth).
I was, instead, inducing a state of affairs in the world from a single instance.
The unfortunate fact that "induction" has other medical associations is an irritation, but I don't think I should allow that to discourage me from using it in the sense which does exist.
PJ
Re: 1 In 4, Baby, 4 In 1
Date: 2010-07-10 06:36 pm (UTC)
Or, in other words, and God knows I've been reading you for 30 years because you obviously believe in this, language is about clarity.
(It's still a side-issue, though.)
I can't think of a single instance of a logical or mathematical framework that allows you to "induc[e] a state of affairs in the world from a single instance in the future."
Well, I can, actually. That single instance (and as I say, it doesn't have to be a single instance -- as long as it covers the problem domain) is by necessity the single instance from which you infer the subsequents. Given my assumption that there is a part of the proof space where deduction is exactly equivalent to induction (and don't forget, I'm not sure there is one; I'm just allowing you that possibility), then, I believe, you are taking a deductive proposition such as
D <- C <- B <- A
and, in your words, "inducing [actually inducting, I think] a state of affairs in the world from a single instance."
I wish to put forth the proposition (in a Jim Morrison sort of way) that you cannot induce D from A. You cannot. Dum de dum de soft machine dum de dum ...
Your single instance is in the historical present. Your conclusions are based upon analysis of the historical past.
That would be deduction -- not induction.
Induction is normally represented as a recursive process. There is no recursion present in your argument.
On the side: No, you weren't using Holmesian elimination. You weren't even suggesting that you were. I used that as a metaphor, and you are throwing it back as an accusation. What you were doing was to use a set of (reasonably assumed) preconditions and arguing based on those preconditions. That still makes it a deduction when you are talking about the set of results based upon those preconditions.
Basically, I think what I'm saying is that you can inductively generate a statistical spread on "first principles," without history. If you are going to use (perfectly validly) a set of historical results (which already represents a statistical spread), then it's deduction. If only because you know more. (Which is what I thought was the point at issue in the first place.)
Of course, this might only go to prove that I was right to do two A-levels in Maths & Mechanics rather than one in pure maths and one in stats.
Can't say I feel the loss after all these years.
no subject
Date: 2010-07-09 03:30 pm (UTC)
1) Is the chance that he has the pocket pair different if you've chosen which of the two cards to see to the chance that he has the pocket pair if he's chosen which of the two cards to show you?
2) Is the chance that he has the pocket pair different if it's your idea to pick which card to see than if it's his idea to pick which card to see?
no subject
Date: 2010-07-09 05:26 pm (UTC)
If you chose card 1, you're 50/50.
If you chose card 2, you're 50/50.
You are therefore 50/50.
If villain chooses, you sort of have to specify whether he knows the pair you're looking for or not. (We can assume that he knows you're looking for a pair.)
We can assume that he wants you to turn the other card over.
If he knows that you're looking for HH, then he'll flip an H (and you've excluded TT). You are therefore 33/66, because this is a matter of combinations and not the number of Hs available.
If he just knows you're looking for a pair, then his choice is totally random. However, he hasn't flipped a T, so you're still left with {HH HT TH}. In this case, however, his random selection encompassed all six cards, four of which are H. In two of those cases, you hit the pocket pair. Your chances are therefore 50/50.
Well, that's my analysis anyway. Probably gruesomely wrong, as usual. Now, if you're going for a 52 card pack, less known cards, with a given range at a certain percentage confidence, and you're not dealing with {HH HT TH TT}, and and and ... then all bets are off.
Except in the case where villain knows that you're looking for HH, and you can replace 'T' with 'o' where 'o' doesn't necessarily represent a pair. In which case, your odds if you let villain pick are still 33/66.
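(For what it's worth, a quick simulation of the two headline cases above, using the same {HH HT TH TT} coin-style model and conditioning on an H being shown. The code and its labels are mine, just to check the 50/50 and 33/66 figures.)

```python
# Two reveal policies: a card chosen at random versus villain deliberately
# flipping an H whenever he holds one. In both cases we ask: given an H was
# shown, how often is it the pocket pair (HH)?
import random

def deal():
    return [random.choice("HT"), random.choice("HT")]

# Case 1: you (or pure chance) pick which card is revealed.
shown_h = pair = 0
for _ in range(500_000):
    cards = deal()
    if random.choice(cards) == "H":
        shown_h += 1
        pair += cards.count("H") == 2
print(pair / shown_h)   # ~0.50

# Case 2: villain knows you're looking for HH and flips an H when he has one.
shown_h = pair = 0
for _ in range(500_000):
    cards = deal()
    if "H" in cards:                 # TT never shows an H
        shown_h += 1
        pair += cards.count("H") == 2
print(pair / shown_h)   # ~0.33
```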
no subject
Date: 2010-07-09 05:30 pm (UTC)
no subject
Date: 2010-07-11 12:49 am (UTC)
If he doesn't have the pocket pair, then either (a) he would prefer you to see he had it or (b) he would prefer you to see he didn't have it.
In case (a),
1) If you've chosen, then you've got a 50% chance of seeing the card you want. If he's chosen, then you've got a 100% chance of seeing the card you want.
In case (b),
1) If you've chosen, then you've got a 50% chance of seeing the card you want. If he's chosen, then you've got a 0% chance of seeing the card you want.
I can't work it out now because I'm off to bed, but there might be some way of drawing up a tree diagram which goes through the eight possibilities and comes up with an appropriate function. Possibly I will have a go at this tomorrow.
I'm sure I had a point with question 2 at the time, but I can't remember what it was and on reflection it may well not make a difference.
no subject
Date: 2010-07-11 07:15 am (UTC)
It basically depends on whether villain wants you to think you made the right decision or the wrong decision when you folded. Usually one wants people in tournies to think that folding was the right decision, so, yes, villain would choose to show you the card that confirmed you were right to fold more often than he would choose to show you the "other" card.
no subject
Date: 2010-07-10 02:34 pm (UTC)
The famous example of this, of course, was when a player offered to let his opponent pick one of his hole cards for $25. The player who paid the $25 thought that seeing one of the hole cards would give him more information about his opponent's hand, whereas in fact it did not. I'll dig out the example when I have time. I'm fairly sure that it is in Al Alvarez's first book on poker, "The Biggest Game In Town".
This does bring up a minor philosophical point. Suppose I am drawing to a flush with one card to come. We casually say that "your chance of hitting is 9/46", but, since the order of the cards is predetermined, we could (if we knew the order of the cards still in the deck) say that "your chance is 1" or "your chance is 0".
Both of us could be "right" simultaneously, simply because we (who know the order of the cards) are essentially inhabiting a different universe (in probabilistic terms) than the person who does not know the order of the cards. This has a large number of real-world applications, obviously.
PJ
no subject
Date: 2010-07-10 03:08 pm (UTC)
Board is x72, Strauss has raised PF with 72, and I think Mr. Other has played back at him, such that Strauss thinks that his two pair is behind. So Strauss bets, then he offers to show a card for $25 while villain is considering a call. The implication is that he has a set, so he gets his opponents to fold a better hand.
(I suspect it might be the turn and the turn paired the board, vs an overpair.) AN Other MIGHT be Jesse Alto.
no subject
Date: 2010-07-10 03:56 pm (UTC)
It makes me wonder whether there would be players interested in playing a poker variant with the extra rule that, at any point before the deal of the final card, any player still in the pot can propose that the hand concludes at that point. If all players still in the pot agree, then all pocket cards are revealed and the pot is shared in proportion to the probability of each player winning the hand, based on the cards yet undealt. Effectively, it's not so much "run it twice", as "run it through the entire deck". Perhaps this is a rare gap in the market by which a poker site might yet distinguish itself. Is this still an interesting game to you? Do you think there would be many players who would enjoy this?
no subject
Date: 2010-07-10 04:20 pm (UTC)
If X & Y agree that, whenever they are all in against each other, they will "run it infinite times", but that, whenever either X or Y is in a pot against player C, they will only run it once, then player C may have the same long-term expectation, but he faces far higher volatility.
Since volatility has a (negative) value (stocks with high beta are valued less than equivalent stocks with low beta), such an agreement between X and Y works to the joint benefit of X and Y and to the detriment of C. It is, therefore, collusion.
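(A toy sketch of the volatility point, treating the runs as independent for simplicity -- a shared deck changes the numbers slightly but not the conclusion. The pot size, win probability and trial counts are made-up illustration values.)

```python
# Same expectation, lower variance: an all-in pot "run" N times and split
# by results, with the runs treated as independent draws.
import random, statistics

POT, P_WIN, TRIALS = 100.0, 0.6, 200_000

def run_it(n_runs):
    wins = sum(random.random() < P_WIN for _ in range(n_runs))
    return POT * wins / n_runs            # your share of the pot

for n in (1, 2, 10):
    results = [run_it(n) for _ in range(TRIALS)]
    print(n, round(statistics.mean(results), 2), round(statistics.stdev(results), 2))
# The mean stays around 60 in every case; the standard deviation shrinks as n grows.
```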
In live poker games, if you are playing high-low PLO or something, the effect is even more marked. C might be well up on the night. X & Y have pulled out more cash. Raising and reraising commences. C folds, at which point X & Y run it twice and, nearly always, split it.
Or, to put it another way, suppose X & Y raised like nutters, getting C to fold, and then split the pot between them in the toilets immediately afterwards, no matter who "won" the hand?
PJ
no subject
Date: 2010-07-10 05:29 pm (UTC)
To put it another way, if you've worked a perfectly sensible line based on perfectly sensible judgement and a reasonable approximation of opponents' range and a fair understanding of the way their stacks and your stack affect the odds and a good few guesses on the side pots, if any, and some idiot comes up and says:
Game over, boys! I've got an overpair on the turn! Let's stop betting right now, because I don't feel like risking a possible flush draw on the river. We'll just share what we've got...
Hell, I might be totally ignorant about poker, but this is a seriously unattractive proposition.
no subject
Date: 2010-07-11 12:53 am (UTC)
Have you ever gambled money on a coin flip in a circumstance where you have convinced yourself that it's an even money proposition - for instance, you're flipping a coin that you provided yourself and believed to be fair? I'm prepared to believe that you're rational enough and EV-focused enough that you haven't, or at least not while you were sober, but I bet that many poker players have.
no subject
Date: 2010-07-11 07:09 am (UTC)
PJ
no subject
Date: 2010-07-11 11:01 pm (UTC)
Cheers old mate.
Richard