>Not true. There is a chance, no matter how minuscule, that a million
>flipped coins will all be heads.
I dropped out of the conversation for a while, but after noticing this discussion is still going on, I have to say this.
You're thinking statistics, not probability. Probability is what actually happens, what we actually observe; statistics is what's supposedly probable in pure mathematics.
Take the lottery, for example. Statistically, any combination is just as likely as any other combination. But when you enter all those drawings into a database and do a comparison, you notice that is not the case: 1 2 3 4 5 6 never occurs, nor does any other sequence of 6, whether by 1s, 2s, or whatever. Why not? After all, a number is just a mark; a 3 is the same kind of thing as a 22. I don't try to explain such things (though I probably could if I cared to put the energy into it). Instead, I notice what has never happened compared to what has happened, and I make the logical assumption, based on observation, that what has happened the majority of the time will continue to happen. So I throw out every sequence-of-6 combination and greatly improve my odds. (Note: I don't play the lottery, but it's fun to watch.)
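For what it's worth, the filtering idea can be checked with a quick simulation. This is just a sketch under my own assumptions: a 6-of-49 format and a 100,000-draw sample, neither of which is from any particular lottery, and `is_run` is a helper name I made up.

```python
import random

def is_run(draw):
    """True if the sorted numbers form an evenly spaced sequence (by 1s, 2s, ...)."""
    nums = sorted(draw)
    step = nums[1] - nums[0]
    return step > 0 and all(b - a == step for a, b in zip(nums, nums[1:]))

random.seed(1)
# Simulate 100,000 independent 6-of-49 draws and count how many form a run.
draws = (random.sample(range(1, 50), 6) for _ in range(100_000))
hits = sum(is_run(d) for d in draws)
print(hits)  # runs are possible but very rare in a sample this size
```

Out of the 13,983,816 possible 6-of-49 combinations, only a couple hundred are evenly spaced runs, so you'd expect to see one only every several hundred thousand draws.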
So with coin flipping, I take the 50/50 probability, based on the flips I've already done, and compare it to the statistical value of the chance of it coming up heads that many times in a row. I have roughly 0.5 compared to roughly 0.01, and 0.5 is much greater than 0.01. They can't both be true, because that would be contradictory, so I reason from the size of the gap that the 50/50 will always happen while the 1/100 never will.
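For concreteness, here is how the two numbers in that comparison could be computed. The run length `n = 7` is my own illustrative choice; the post itself doesn't fix one.

```python
n = 7  # assumed number of flips in the run (illustrative only)

p_next_heads = 0.5      # probability that any single fair flip is heads
p_all_heads = 0.5 ** n  # probability that all n flips come up heads

print(p_next_heads)  # 0.5
print(p_all_heads)   # 0.0078125, i.e. about 1 in 128
```

Note that the two numbers answer different questions: the first is about one flip, the second is about a whole run of them.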