GITNUXREPORT 2026

Random Statistics

The blog post explores both pseudorandom algorithms and fundamental principles of true randomness.

Alexander Schmidt

Research Analyst specializing in technology and digital transformation trends.

First published: Feb 13, 2026

Our Commitment to Accuracy

Rigorous fact-checking · Reputable sources · Regular updates

Imagine a universe where the very fabric of chance is woven by algorithms like the Mersenne Twister, with its staggering period of 2^19937 − 1 (≈ 4.3 × 10^6001), and where quantum mechanics blurs the line between probability and reality. This blog post explores the intricate science behind random number generation, from flawed historical generators to modern cryptographic standards.

Key Takeaways

  • The period of the Mersenne Twister pseudorandom number generator MT19937 is exactly 2^19937 − 1, which is approximately 4.3 × 10^6001
  • In Python's random module, the default random number generator uses the Mersenne Twister algorithm with a state size of 19937 bits
  • The RANDU pseudorandom number generator, infamous for poor quality, produces numbers where x_{n+1} = (65539 * x_n) mod 2^31, leading to lattice structures in 3D space
  • A Bernoulli random variable takes value 1 with probability p and 0 with 1-p, with variance p(1-p) maximized at p=0.5 where Var=0.25
  • Binomial random variable B(n,p) has mean np and variance np(1-p), for n=100, p=0.5 mean=50 std dev≈5
  • Poisson random variable with λ=5 has P(X=k) = e^{-5} 5^k / k!, mean=5, variance=5, skewness≈0.632
  • Simple symmetric random walk on Z starts at 0, P(return to 0 at step 2n) ~ 1/sqrt(π n) asymptotically, recurrent in 1D
  • In 2D random walk, probability of returning to origin is 1, recurrent, but expected return time infinite
  • 3D simple random walk is transient, probability of ever returning to origin is about 0.340537
  • Quantum random number generators using photon detection achieve up to 1 Gbps entropy rate with Bell inequality violation confirming randomness
  • Bell-inequality violations in Aspect's 1982 experiments showed quantum correlations exceeding the classical local-hidden-variable bound by several standard deviations
  • Loophole-free Bell test by Hensen et al. 2015 closed detection and locality loopholes with CHSH violation S=2.42 ±0.20
  • Blum Blum Shub pseudorandom generator computes x_{n+1} = x_n^2 mod M, where M=pq is a Blum integer; it is provably secure assuming quadratic residuosity/factoring is hard, with period dividing λ(λ(M))
  • NIST Statistical Test Suite includes 15 tests (Frequency, Runs, FFT, …); at significance level α=0.01 a good PRNG should pass each test with proportion close to 1−α=0.99 over 10^6-bit sequences
  • Dieharder battery (Robert G. Brown's extension of Marsaglia's Diehard) has 31 tests; ChaCha20-based PRNGs pass all of them with uniformly distributed p-values

Algorithmic Randomness

  • Kolmogorov complexity K(x), the length of the shortest program that outputs x, is uncomputable but formalizes randomness as incompressibility
  • Martin-Löf randomness: a sequence passes every effective null cover; by the Levin–Schnorr theorem this is equivalent to all of its prefixes being incompressible
  • von Mises–Church stochasticity: the sequence is unbiased under every admissible (computable) selection rule; by Ville's theorem this is strictly weaker than Martin-Löf randomness
  • Chaitin's halting probability Ω is a Martin-Löf random real that is Turing equivalent to the halting problem
  • Compressibility ratio C(x)/|x| tends to 1 for random (incompressible) sequences and drops well below 1 for compressible ones
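Kolmogorov complexity itself is uncomputable, but the compressibility ratio in the last bullet can be crudely approximated with an ordinary compressor. A minimal Python sketch using zlib as a stand-in for C(x) (the thresholds are illustrative, not part of any formal definition):

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Crude upper-bound proxy for C(x)/|x| using zlib at maximum effort."""
    return len(zlib.compress(data, 9)) / len(data)

# Highly regular data compresses drastically (ratio far below 1).
regular = b"ab" * 50_000
# os.urandom output is incompressible in practice (ratio near or above 1).
random_bytes = os.urandom(100_000)

print(f"regular: {compression_ratio(regular):.3f}")
print(f"random:  {compression_ratio(random_bytes):.3f}")
```

A real compressor only ever gives an upper bound on K(x): it can certify compressibility, never certify randomness, mirroring the uncomputability result above.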

Algorithmic Randomness Interpretation

While grappling with the unattainable ideal of true randomness, we find it inevitably defined by all the clever statistical filters we can invent to fail it, the incompressible core that forever mocks our attempts at a perfectly concise explanation.

Combinatorial Probability

  • The birthday problem: probability of collision in n=365 days with k=23 people is ≈0.507
  • Coupon collector problem: expected trials to collect n coupons is n H_n ≈ n ln n + γ n, γ=0.57721
  • Hat check problem derangement !n ≈ n!/e, P(no fixed point)=1/e≈0.367879 for large n
  • Monty Hall problem: switching doors gives 2/3 win probability vs 1/3 stay
  • Secretary problem optimal stop at 1/e≈37%, success prob 1/e
  • Banach matchbox problem: a smoker with n matches in each pocket first finds a pocket empty while k matches remain in the other with probability C(2n−k, n)/2^{2n−k}; the expected number remaining is ≈ 2√(n/π) − 1
  • Counterfeit-coin puzzle: among 12 coins, 3 balance weighings identify the fake and whether it is heavy or light, since 3^3 = 27 > 24 possible states
  • In a 52-card deck, the probability of a royal flush is 4/2,598,960 ≈ 1.5×10^{-6}
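The birthday figure in the first bullet is easy to verify exactly rather than by simulation; a short Python check:

```python
from math import prod

def birthday_collision_probability(k: int, n: int = 365) -> float:
    """Exact probability that at least two of k people share one of n birthdays."""
    # P(all distinct) = (n/n) * ((n-1)/n) * ... * ((n-k+1)/n)
    p_all_distinct = prod((n - i) / n for i in range(k))
    return 1.0 - p_all_distinct

print(birthday_collision_probability(23))  # ≈ 0.507
```

The same function shows how fast the probability saturates: with 70 people a collision is all but certain.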

Combinatorial Probability Interpretation

Life is a chaotic carnival of improbable odds: chance ensures that in a room of just 23 people, two will likely share a birthday; teaches us it takes painfully many cereal boxes to complete a set; shows that lost hats rarely find their owners; demands you switch doors to survive Monty's trickery; advises hiring the next good candidate after rejecting the first 37%; proves a smoker will likely empty one pocket of matches before the other; reveals that finding a single bad coin among twelve requires the logic of a scale three times over; and reminds us that while a royal flush is an astronomical long shot, you should still play the hand you're dealt.

Game Theory Randomness

  • Matching pennies game has mixed strategy Nash eq (0.5,0.5), value 0

Game Theory Randomness Interpretation

Despite the game being a statistical stalemate, it still demands your full attention—lest you become the sucker betting tails while everyone else is thinking heads.

Information Theory

  • Entropy H(X)=-sum p log p, for fair coin 1 bit, uniform n symbols log2 n bits max
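The entropy formula in the bullet is a one-liner to compute; a small Python sketch covering the fair-coin and uniform cases mentioned there:

```python
from math import log2

def entropy(probs) -> float:
    """Shannon entropy H(X) = -sum p log2 p, in bits (terms with p=0 contribute 0)."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1 bit
print(entropy([1/52] * 52))   # uniform draw from a deck: log2(52) ≈ 5.7 bits
```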

Information Theory Interpretation

Entropy measures how unsurprised you are, from the perfectly predictable single outcome (zero bits) to the frantic gossip of a fair coin flip (one bit) all the way up to the complete, uniform chaos of picking one card from a fresh deck (log2(52) bits of delightful uncertainty).

Limit Theorems

  • Law of large numbers: sample avg → μ almost surely for iid with finite E|X|
  • Central limit theorem: sqrt(n)(bar X_n - μ)/σ → N(0,1) for iid finite var
  • Berry–Esseen theorem bounds CLT convergence: sup_x |F_n(x) − Φ(x)| ≤ Cρ/(σ^3 √n), with best known constant C ≈ 0.4748
  • Large deviation principle: P(S_n /n ≥ μ+ε) ≤ exp(-n I(μ+ε)) with rate I(x)=sup_t (t x - log M(t))

Limit Theorems Interpretation

It's nice that the Law of Large Numbers promises the party average will approach the truth, but I find the Central Limit Theorem comforting as the responsible chaperone who ensures the chaos of the sampling journey is normally distributed and politely bounded, while the Large Deviation Principle is the sober friend quietly reminding us that truly wild statistical deviations will be exponentially rare.

Probability and Random Variables

  • A Bernoulli random variable takes value 1 with probability p and 0 with 1-p, with variance p(1-p) maximized at p=0.5 where Var=0.25
  • Binomial random variable B(n,p) has mean np and variance np(1-p); for n=100, p=0.5 the mean is 50 and the standard deviation exactly 5
  • Poisson random variable with λ=5 has P(X=k) = e^{-5} 5^k / k!, mean=5, variance=5, skewness 1/√5 ≈ 0.447
  • Normal distribution N(μ,σ^2) has 68.27% probability within 1σ, 95.45% within 2σ, 99.73% within 3σ of mean
  • Exponential distribution Exp(λ) models interarrival times, mean 1/λ, variance 1/λ^2, memoryless property P(X>s+t|X>s)=P(X>t)
  • Uniform continuous U(a,b) has mean (a+b)/2, variance (b-a)^2/12, pdf 1/(b-a) on [a,b]
  • Geometric distribution Geo(p) counts failures before the first success, mean (1-p)/p, variance (1-p)/p^2 (counting trials instead shifts the mean to 1/p)
  • Gamma distribution Γ(α,β) generalizes exponential, mean α/β, variance α/β^2, for α integer it's Erlang
  • Chi-squared with k df has mean k, variance 2k, used in goodness-of-fit tests
  • Student's t-distribution with ν df approaches normal as ν→∞, heavier tails for small ν
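The 68–95–99.7 figures for the normal distribution follow directly from the error function; a minimal check:

```python
from math import erf, sqrt

def normal_within(z: float) -> float:
    """P(|X - mu| < z*sigma) for X ~ N(mu, sigma^2), via the error function."""
    return erf(z / sqrt(2))

for z in (1, 2, 3):
    print(f"within {z} sigma: {normal_within(z):.4%}")
```

Running it prints 68.2689%, 95.4500%, and 99.7300%, matching the bullet above.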

Probability and Random Variables Interpretation

Life in statistics: a coin toss’s unpredictability peaks at 50/50, a fair crowd of 100 averages 50 with a 5-person sway, the rare event λ=5 whispers its own variance, the normal bell dutifully confines 68% within one standard deviation, the exponential waits without memory, the uniform spreads its chances thinly, the geometric counts failures until a lucky break, the gamma patiently stacks time, chi-squared scrutinizes fit with degrees of freedom, and the student’s t, with its heavier tails, reminds us that small samples hold their surprises.

Pseudorandomness

  • Blum Blum Shub pseudorandom generator computes x_{n+1} = x_n^2 mod M, where M=pq is a Blum integer; it is provably secure assuming quadratic residuosity/factoring is hard, with period dividing λ(λ(M))
  • NIST Statistical Test Suite includes 15 tests (Frequency, Runs, FFT, …); at significance level α=0.01 a good PRNG should pass each test with proportion close to 1−α=0.99 over 10^6-bit sequences
  • Dieharder battery (Robert G. Brown's extension of Marsaglia's Diehard) has 31 tests; ChaCha20-based PRNGs pass all of them with uniformly distributed p-values
  • Cryptographically secure PRNGs like Fortuna use AES in counter mode, rekeying at least every 2^20 output bytes and reseeding from an entropy pool
  • RC4 stream cipher is biased in its early keystream bytes; even discarding the first 256 bytes leaves detectable biases, and related-key weaknesses enabled practical attacks on WEP
  • Salsa20/20 has 20 rounds; the best published cryptanalysis breaks only the 8-round reduced version (≈2^251 operations), and the cipher underlies XSalsa20
  • TestU01 BigCrush suite applies roughly 160 statistical tests and consumes >10^9 bits; LCGs fail while PCG passes
  • PRG stretching: by the standard hybrid argument, composing an ε-secure length-doubling generator to produce l output blocks keeps the distinguishing advantage at most lε
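The Frequency (monobit) test mentioned above is the simplest member of the NIST battery: count the +1/−1 balance of the bits and turn it into a p-value with the complementary error function. A compact Python version of that statistic (the test-battery context is from the bullets; this sketch implements only this single test):

```python
import math
import random

def monobit_pvalue(bits) -> float:
    """NIST frequency (monobit) test: p-value erfc(|S_n| / sqrt(2n))."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2 * n))

random.seed(1)
bits = [random.getrandbits(1) for _ in range(10**6)]
print(monobit_pvalue(bits))  # a good PRNG should land well above 0.01 almost always
```

Degenerate streams are rejected instantly: an all-ones stream gets a p-value of essentially zero, while a perfectly alternating stream (zero imbalance) gets p = 1 despite being obviously nonrandom, which is why a battery of many tests is needed.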

Pseudorandomness Interpretation

The Blum Blum Shub generator is a mathematician's elegant fortress, provably secure but painfully slow, while modern statistical test suites like NIST and Dieharder ruthlessly hunt for flaws, revealing that true cryptographic security demands both rigorous proofs and demonstrable statistical randomness.

Quantum Randomness

  • Quantum random number generators using photon detection achieve up to 1 Gbps entropy rate with Bell inequality violation confirming randomness
  • Bell-inequality violations in Aspect's 1982 experiments showed quantum correlations exceeding the classical local-hidden-variable bound by several standard deviations
  • Loophole-free Bell test by Hensen et al. 2015 closed detection and locality loopholes with CHSH violation S=2.42 ±0.20
  • Vacuum fluctuations in QED provide fundamental randomness source, Casimir effect measured force 10^{-7} N between plates 0.6 μm apart
  • Radioactive decay is quantum random, half-life of Carbon-14 is 5730 years, decay constant λ=ln(2)/5730 ≈1.21×10^{-4} yr^{-1}
  • Shot noise in photodiodes from Poisson photon arrival gives entropy rate up to 20 Gbps in commercial QRNGs
  • EPR paradox resolved by non-locality, no-signaling theorem preserves causality in quantum randomness sharing
  • Certified randomness from device-independent protocols such as DI-QKD extracts close to one random bit per Bell-test round, with security holding even against quantum adversaries
  • Homodyne detection QRNGs measure vacuum quadrature fluctuations, achieving min-entropy >0.99 bits per 8-bit sample
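The Carbon-14 numbers above translate directly into simulated decay times, since quantum decay produces exponentially distributed lifetimes with rate λ. A brief sketch (the sample size and seed are arbitrary):

```python
import random
from math import log

# Carbon-14: half-life 5730 years, so decay constant lam = ln(2) / 5730 per year.
HALF_LIFE = 5730.0
LAM = log(2) / HALF_LIFE  # ≈ 1.21e-4 per year

random.seed(0)
# Exponential lifetimes: the mean lifetime is 1/lam = 5730 / ln(2) ≈ 8267 years.
lifetimes = [random.expovariate(LAM) for _ in range(100_000)]
print(sum(lifetimes) / len(lifetimes))  # close to 8267
```

Note the distinction the simulation makes visible: the half-life (5730 y) is the median lifetime, while the mean lifetime 1/λ (≈8267 y) is longer because the exponential distribution is skewed.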

Quantum Randomness Interpretation

Quantum randomness, certified by the persistent violation of local reality in experiments like Aspect's, proves that nature's dice are not just rolling but are fundamentally rigged against classical prediction.

Random Graphs

  • In random graph G(n,p=1/2), expected edges n(n-1)/4 ≈ n^2/4, threshold for connectivity p=ln n /n
  • Erdős–Rényi G(n,p) has giant component emerging at p=1/n, size ~n for p>(1+ε)/n
  • Diameter of G(n,p) concentrates around ln n / ln(np), which is Θ(ln n / ln ln n) just above the connectivity threshold; Hamilton cycles appear once p exceeds (ln n + ln ln n)/n
  • In Barabási–Albert scale-free random graph, degree distribution P(k)~k^{-3}, preferential attachment α=1
  • Watts-Strogatz small-world model at rewiring p=0.1 has clustering coeff 0.3-0.4 and path length ln n
  • Expected clique number ω(G(n,1/2)) ~ 2 log_2 n
  • Chromatic number χ(G(n,p)) ≈ n / (2 log_2 n) for dense case
  • Matching number ν(G(n,1/2)) = floor(n/2) w.h.p.
  • In configuration model random graph with degree sequence d_i, multiedge prob ~ sum d_i^2 / (2m)^2
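The giant-component threshold at p = 1/n can be seen directly in simulation. A self-contained sketch that samples G(n,p) and measures the largest component with union-find (n, the seed, and the two p values are illustrative choices):

```python
import random

def gnp_largest_component(n: int, p: float, seed: int = 0) -> int:
    """Size of the largest component of one Erdős–Rényi G(n, p) sample."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(x):
        # Union-find with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    # Include each of the n(n-1)/2 possible edges independently with prob p.
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)

    sizes = {}
    for i in range(n):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values())

n = 2000
print(gnp_largest_component(n, 0.5 / n))  # subcritical: largest piece is O(log n)
print(gnp_largest_component(n, 2.0 / n))  # supercritical: a giant component, Θ(n)
```

At p = 2/n the giant component should cover roughly the fraction x solving x = 1 − e^{-2x} ≈ 0.797 of the vertices.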

Random Graphs Interpretation

While you might think navigating these random graphs is a lonely affair, in reality, you're never more than two introductions away from anyone, a giant crowd forms the instant people stop being picky, and you're almost guaranteed to find a perfect dance partner even if you have to color-code the guest list to avoid clashes.

Random Number Generation

  • The period of the Mersenne Twister pseudorandom number generator MT19937 is exactly 2^19937 − 1, which is approximately 4.3 × 10^6001
  • In Python's random module, the default random number generator uses the Mersenne Twister algorithm with a state size of 19937 bits
  • The RANDU pseudorandom number generator, infamous for poor quality, produces numbers where x_{n+1} = (65539 * x_n) mod 2^31, leading to lattice structures in 3D space
  • Linear congruential generators (LCGs) are defined by X_{n+1} = (a * X_n + c) mod m, with Hull-Dobell theorem requiring conditions for full period m
  • The Xorshift128+ algorithm by Vigna has a period of 2^128 − 1 and passes the BigCrush test suite, though its low-order bits retain detectable linear artifacts
  • PCG (Permuted Congruential Generator) family achieves 2^64 state space with high speed and passes TestU01 battery
  • In hardware, Intel's RdRand instruction draws on thermal-noise entropy conditioned through AES, delivering random values at rates on the order of hundreds of MB/s on modern CPUs
  • True random number generators (TRNGs) based on radioactive decay have bit rates around 1 kbps due to Geiger counter limitations
  • The Lehmer random number generator uses multiplicative congruential method with multiplier 16807 and modulus 2^31 - 1
  • Chaos-based RNGs using logistic map x_{n+1} = r x_n (1 - x_n) with r=4 achieve full [0,1] coverage but sensitive to initial conditions
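RANDU's lattice pathology can be shown in a few lines: because 65539² ≡ 6·65539 − 9 (mod 2^31), every output satisfies a three-term linear recurrence, which is exactly why its consecutive triples fall on a handful of planes in 3D:

```python
def randu(seed: int, count: int):
    """RANDU: x_{n+1} = 65539 * x_n mod 2^31 (a notoriously bad LCG)."""
    x = seed
    out = []
    for _ in range(count):
        x = (65539 * x) % 2**31
        out.append(x)
    return out

xs = randu(1, 9)
# Every triple satisfies x_{k+2} = (6*x_{k+1} - 9*x_k) mod 2^31, since
# 65539^2 = (2^16 + 3)^2 = 2^32 + 6*2^16 + 9 ≡ 6*65539 - 9 (mod 2^31).
for a, b, c in zip(xs, xs[1:], xs[2:]):
    assert c == (6 * b - 9 * a) % 2**31
print("all RANDU triples satisfy the 3-term linear relation")
```

Plotting normalized triples (x_k, x_{k+1}, x_{k+2}) in the unit cube would show them confined to just 15 planes, the "lattice structure" the bullet refers to.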

Random Number Generation Interpretation

Despite the endless pursuit of algorithmic complexity in pseudorandom number generators, from Mersenne Twister's astronomically long period to PCG's nimble efficiency, our most reliable source of true entropy still hinges on the slow, chaotic poetry of subatomic decay.

Random Walks

  • Simple symmetric random walk on Z starts at 0, P(return to 0 at step 2n) ~ 1/sqrt(π n) asymptotically, recurrent in 1D
  • In 2D random walk, probability of returning to origin is 1, recurrent, but expected return time infinite
  • 3D simple random walk is transient, probability of ever returning to origin is about 0.340537
  • Root mean square displacement after n unit steps is √n in any dimension; it is √(nd) only when each of the d coordinates takes an independent ±1 step per time unit
  • Polya's urn model leads to random walk where proportion converges almost surely to Beta distribution
  • Self-avoiding walk connective constant is μ≈2.638 on the 2D square lattice and μ≈4.684 on the 3D cubic lattice
  • Lévy flight random walk has step lengths from stable distribution α<2, superdiffusive with <r^2> ~ t^{2/α}
  • Random walk on hypercube graph of dimension d has mixing time O(d log d)
  • Bridge random walk conditioned to return to start at time n has distribution related to Ballot theorem
  • Continuous Brownian motion has quadratic variation <B>_t = t, paths continuous but nowhere differentiable
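The 1D return probability in the first bullet comes from counting paths: P(S_{2n} = 0) = C(2n, n)/4^n, which a few lines of Python compare against the 1/√(πn) asymptotic:

```python
from math import comb, pi, sqrt

def return_prob(n: int) -> float:
    """P(simple symmetric walk on Z is back at 0 after 2n steps) = C(2n, n) / 4^n."""
    return comb(2 * n, n) / 4**n

n = 10_000
print(return_prob(n))     # exact value
print(1 / sqrt(pi * n))   # asymptotic approximation 1/sqrt(pi*n)
```

The two numbers agree to several digits already at this n; summing return_prob over all n diverges, which is one way to see that the 1D walk is recurrent.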

Random Walks Interpretation

You get back home every time in one and two dimensions but with diminishing likelihood and infinite patience, while a third-dimensional escape is nearly a two-to-one bet; generally you wander about with a growing sense of lostness, except when you avoid your own footsteps, take erratic leaps, hop between cube corners, or are forced to return—all while secretly carving out a continuous yet utterly jagged path.