Impossibility of a gambling system

The principle of the impossibility of a gambling system is a concept in probability. It states that in a random sequence, the methodical selection of subsequences does not change the probability of specific elements. The first mathematical demonstration is attributed to Richard von Mises (who used the term collective rather than sequence).[1][2]

Figure: A random walk on a cubic three-dimensional lattice.

The principle states that no method for forming a subsequence of a random sequence (the gambling system) improves the odds for a specific event. For instance, a sequence of fair coin tosses produces equal and independent 50/50 chances for heads and tails. A simple system of betting on heads every 3rd, 7th, or 21st toss, etc., does not change the odds of winning in the long run. As a mathematical consequence of computability theory, more complicated betting strategies (such as a martingale) also cannot alter the odds in the long run.
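
The claim is easy to check empirically. The following Python sketch (a simulation written for illustration, not taken from von Mises) tosses a fair coin many times and applies the simple system above, betting on heads on every 3rd, 7th, or 21st toss; the win rate on the selected tosses stays at about 1/2.

```python
# Hypothetical simulation of the "every 3rd, 7th, or 21st toss" betting system
# described above.  The selection buys no long-run advantage on a fair coin.
import random

random.seed(0)
N = 1_000_000
tosses = [random.randint(0, 1) for _ in range(N)]  # 1 = heads, 0 = tails

# Bet on heads whenever the 1-based toss index is a multiple of 3, 7, or 21
# (multiples of 21 are already multiples of 3 and 7, kept only to mirror the prose).
bets = [t for i, t in enumerate(tosses, start=1)
        if i % 3 == 0 or i % 7 == 0 or i % 21 == 0]

print(f"heads frequency over all tosses:    {sum(tosses) / len(tosses):.4f}")
print(f"win frequency on the selected bets: {sum(bets) / len(bets):.4f}")
# Both numbers hover around 0.5.
```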

Von Mises' mathematical demonstration defines an infinite sequence of zeros and ones as a random sequence if it is not biased, that is, if it has the frequency stability property: the frequency of zeros in the sequence stabilizes at 1/2, and every subsequence selected from it by any systematic method is likewise not biased.[3]
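
As a small illustration (a sketch in which a pseudorandom bit source stands in for von Mises' idealized collective), the running frequency of zeros settles near 1/2:

```python
# Minimal sketch of the frequency stability property: the running frequency of
# zeros in a fair pseudorandom bit sequence stabilizes near 1/2.
import random

random.seed(1)
zeros = 0
for n in range(1, 100_001):
    zeros += 1 - random.randint(0, 1)
    if n in (100, 1_000, 10_000, 100_000):
        print(f"after {n:>7} bits: frequency of zeros = {zeros / n:.4f}")
```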

The subsequence selection criterion is important: although the sequence 0101010101... is not biased, selecting the odd positions yields 000000..., which is not random. Von Mises never fully defined what constituted a "proper" selection rule for subsequences, but in 1940 Alonzo Church defined it as any recursive function which, having read the first N elements of the sequence, decides whether to select element number N+1. Church was a pioneer in the field of computable functions, and his definition relied on the Church–Turing thesis for computability.[4][5][6]
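
A short sketch of a Church-style selection rule follows (the function names and the particular rule are illustrative, not Church's own): a computable function reads the first N bits and decides whether bit N+1 is selected. Applied to the periodic sequence 0101..., the rule "take the odd positions" extracts the non-random 000000...; applied to a pseudorandom sequence, the selected bits are still about half zeros.

```python
# Illustrative Church-style selection: rule(prefix) sees the bits read so far
# and decides whether the next bit joins the subsequence.
import random

def select(sequence, rule):
    selected, prefix = [], []
    for x in sequence:
        if rule(prefix):
            selected.append(x)
        prefix.append(x)
    return selected

# "Take the odd positions" (1-based): select when the prefix length is even.
odd_positions = lambda prefix: len(prefix) % 2 == 0

periodic = [0, 1] * 10
print(select(periodic, odd_positions))   # [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

random.seed(2)
bits = [random.randint(0, 1) for _ in range(100_000)]
chosen = select(bits, odd_positions)
print(f"zeros among the selected bits: {chosen.count(0) / len(chosen):.4f}")
```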

In the mid-1960s, A. N. Kolmogorov and D. W. Loveland independently proposed a more permissive selection rule.[7][8] In their view, Church's recursive-function definition was too restrictive because it read the elements in order. Instead, they proposed a rule based on a partially computable process which, having read any N elements of the sequence, decides whether to select another element that has not been read yet, as in the sketch below.
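
The following is a rough sketch of such a non-monotonic selector in the spirit of Kolmogorov and Loveland, restricted to finite sequences for simplicity; the helper names and the particular scanning rule are invented for illustration. At each step the rule, seeing only the positions and values read so far, names an unread position and commits to selecting or skipping it before the bit is revealed.

```python
# Illustrative non-monotonic (Kolmogorov–Loveland style) selection over a finite
# sequence.  next_index and keep see only the history of (position, value) pairs
# already read; keep must decide before the newly chosen bit is revealed.
import random

def kl_select(bits, next_index, keep):
    history, read, selected = [], set(), []
    while len(read) < len(bits):
        i = next_index(history, len(bits))
        if i in read:
            break                         # malformed rule; stop scanning
        decision = keep(history)          # commit before seeing bits[i]
        read.add(i)
        if decision:
            selected.append(bits[i])
        history.append((i, bits[i]))
    return selected

# Invented rule: read positions alternately from the two ends of the sequence.
def ends_inward(history, n):
    step = len(history)
    return step // 2 if step % 2 == 0 else n - 1 - step // 2

random.seed(3)
bits = [random.randint(0, 1) for _ in range(100_000)]
# Select a bit whenever the previously read bit was a 1.
sel = kl_select(bits, ends_inward, keep=lambda h: bool(h) and h[-1][1] == 1)
print(f"zeros among the selected bits: {sel.count(0) / len(sel):.4f}")
```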

The principle influenced modern concepts of randomness, e.g. A. N. Kolmogorov's proposal to consider a finite sequence random (with respect to a class of computing systems) if any program that can generate the sequence is at least as long as the sequence itself.[9][10]
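
Kolmogorov complexity itself is uncomputable, but the flavour of the definition can be conveyed with a crude stand-in: in the sketch below, zlib compression length serves as a rough proxy for "length of a program that generates the sequence" (an illustrative analogy, not Kolmogorov's construction).

```python
# Rough analogy only: compressed length as a stand-in for the length of the
# shortest program generating a sequence.  A regular sequence has a short
# description; a pseudorandom one does not compress much below one bit per symbol.
import random
import zlib

random.seed(4)
periodic = ("01" * 50_000).encode()                          # highly regular
noisy = bytes(random.choice(b"01") for _ in range(100_000))  # pseudorandom 0/1 characters

for name, data in [("periodic 0101...", periodic), ("pseudorandom bits", noisy)]:
    print(f"{name}: {len(data)} bytes -> {len(zlib.compress(data))} bytes compressed")
```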


References

  1. Richard von Mises, Probability, Statistics and Truth, Dover, 1928/1981, ISBN 0-486-24214-5, page 25.
  2. William Stanley Peters, Counting for Something: Statistical Principles and Personalities, 1986, ISBN 0-387-96364-2, page 3.
  3. Laurent Bienvenu, "Kolmogorov–Loveland Stochasticity", in STACS 2007: 24th Annual Symposium on Theoretical Aspects of Computer Science, ed. Wolfgang Thomas, ISBN 3-540-70917-7, page 260.
  4. Alonzo Church, "On the Concept of Random Sequence", Bull. Amer. Math. Soc., 46 (1940), pp. 254-260.
  5. Ivor Grattan-Guinness (ed.), Companion Encyclopedia of the History and Philosophy, Volume 2, ISBN 0801873975, page 1412.
  6. J. Alberto Coffa, "Randomness and Knowledge", in PSA 1972: Proceedings of the 1972 Biennial Meeting of the Philosophy of Science Association, Volume 20, Springer, 1974, ISBN 90-277-0408-2, page 106.
  7. A. N. Kolmogorov, "Three Approaches to the Quantitative Definition of Information", Problems of Information Transmission, 1(1):1-7, 1965.
  8. D. W. Loveland, "A New Interpretation of von Mises' Concept of Random Sequence", Z. Math. Logik Grundlagen Math., 12 (1966), pp. 279-294.
  9. Ian Hacking, An Introduction to Probability and Inductive Logic, 2001, ISBN 0-521-77501-9, page 145.
  10. Jan von Plato, Creating Modern Probability, 1998, ISBN 0-521-59735-8, pages 23-24.