Key Takeaways
- In discrete random variables, the possible values are countable, such as the number of heads in 10 coin flips ranging from 0 to 10
- Expected value for discrete RV is E[X] = Σ x_i P(X=x_i)
- Discrete variables often result from counting processes, like number of defects in manufacturing
- A Poisson distribution models the number of events occurring in a fixed interval, with PMF P(X=k) = (λ^k * e^{-λ}) / k! where λ is the average rate
- Binomial distribution counts successes in n independent Bernoulli trials, PMF P(X=k) = C(n,k) p^k (1-p)^{n-k}
- Geometric distribution gives trials until first success, PMF P(X=k)=(1-p)^{k-1}p
- Continuous random variables take any value in an interval, like height measured to any precision
- Variance of continuous RV is Var(X) = E[(X-μ)^2] = ∫ (x-μ)^2 f(x) dx
- Cumulative distribution function for continuous is F(x)=P(X≤x)=∫_{-∞}^x f(t)dt
- The normal distribution has PDF f(x) = (1/(σ√(2π))) * e^{-(x-μ)^2/(2σ^2)}, used for continuous data like IQ scores
- Uniform continuous distribution over [a,b] has PDF f(x)=1/(b-a) for x in [a,b], modeling random selection from interval
- Exponential distribution models time between events, PDF f(x)=λ e^{-λx} for x≥0
- Discrete variables use probability mass functions (PMF), while continuous use probability density functions (PDF)
- Discrete RVs have P(X=x)>0 for specific points, continuous have P(X=x)=0 for any single x
- Continuous supports can be subdivided arbitrarily finely (uncountable), while discrete supports are countable
The blog post explains that discrete variables are countable while continuous variables are measurable.
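The expected-value and PMF formulas above can be checked numerically; a minimal Python sketch using only the standard library (function and variable names are illustrative, not from the post):

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """PMF of the binomial: P(X = k) = C(n, k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Expected number of heads in 10 fair coin flips: E[X] = sum x_i P(X = x_i)
n, p = 10, 0.5
pmf = {k: binomial_pmf(k, n, p) for k in range(n + 1)}
expected = sum(k * prob for k, prob in pmf.items())

assert abs(sum(pmf.values()) - 1.0) < 1e-12  # PMF sums to 1
assert abs(expected - n * p) < 1e-12         # matches closed form E[X] = np
```

The support here is the countable set {0, 1, ..., 10}, exactly as the first bullet describes.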
Continuous Distributions
- Gamma distribution generalizes exponential, PDF f(x)= (x^{α-1} e^{-x/β}) / (β^α Γ(α))
- Lognormal distribution: X is lognormal when ln X is normal; PDF f(x) = (1/(xσ√(2π))) e^{-(ln x - μ)^2/(2σ^2)} for x > 0
- Beta distribution on [0,1], PDF f(x)= x^{α-1}(1-x)^{β-1} / B(α,β)
- Weibull distribution for reliability, PDF f(x)= (k/λ)(x/λ)^{k-1} e^{-(x/λ)^k}
- Chi-squared distribution sum of squared standard normals, PDF involves gamma function
- Student's t-distribution for small-sample inference, heavier-tailed than the normal, with tail weight controlled by degrees of freedom ν
- Cauchy distribution heavy-tailed, no mean or variance, PDF 1/[π(1+x^2)]
- Laplace distribution double exponential, PDF (1/(2b))exp(-|x-μ|/b)
- Pareto distribution for incomes, PDF f(x) = α x_m^α / x^{α+1} for x ≥ x_m
- F-distribution ratio of chi-squareds, used in ANOVA
- Logistic distribution for growth models, PDF sech^2 form
- Rayleigh distribution for wind speeds, PDF (x/σ^2) e^{-x^2/(2σ^2)}
- Inverse Gaussian for Brownian motion hitting times
- Gumbel distribution extreme value type I
- Birnbaum–Saunders distribution fatigue life
- Irwin–Hall distribution sum of uniforms
- Tracy–Widom distribution random matrix extremes
- Holtsmark distribution gravitational fields
- von Mises distribution circular continuous
- Wrapped normal circular continuous analog
- Arcsine distribution U-shaped
- Noncentral chi-squared noncentrality parameter λ
- Hotelling's T-squared multivariate continuous
- Lévy distribution stable with α=1/2
- Multivariate normal density product of conditionals
- Rice distribution amplitude of sinusoid in noise
- Singh–Maddala (Burr XII) distribution for income modeling
- Beta prime ratio of gammas
- Generalized extreme value distribution unifies extreme value types I–III
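Several of the PDFs listed above can be sanity-checked numerically, since any valid density must integrate to 1 over its support. A minimal standard-library sketch (the midpoint integrator and the integration limits are illustrative choices, truncating negligible tails):

```python
from math import exp, pi, sqrt

def normal_pdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """f(x) = 1/(sigma*sqrt(2*pi)) * exp(-(x-mu)^2 / (2*sigma^2))"""
    return exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

def exponential_pdf(x: float, lam: float) -> float:
    """f(x) = lam * exp(-lam * x) for x >= 0, else 0."""
    return lam * exp(-lam * x) if x >= 0 else 0.0

def integrate(f, a: float, b: float, n: int = 100_000) -> float:
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# A valid PDF integrates to (approximately) 1 over its support.
assert abs(integrate(normal_pdf, -10, 10) - 1.0) < 1e-6
assert abs(integrate(lambda x: exponential_pdf(x, 2.0), 0, 20) - 1.0) < 1e-6
```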
Continuous Random Variables
- Quantiles for continuous defined via inverse CDF
- For continuous, PDF integrates to 1 over support, no mass at points
- Median for continuous solves ∫_{-∞}^m f(x)dx=0.5
- Mode of continuous multimodal if multiple peaks in PDF
- Continuous uniform CDF F(x)=(x-a)/(b-a)
- Mean absolute deviation for continuous ∫|x-μ|f(x)dx
- Digitized signals are discrete RVs that approximate underlying continuous signals
- Mean of the exponential distribution is 1/λ
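The variance, CDF, and median bullets above can be verified for the exponential distribution by direct numerical integration; a sketch with an arbitrary illustrative rate λ = 0.5 (the known closed forms are mean 1/λ, variance 1/λ², median ln 2 / λ):

```python
from math import exp, log

lam = 0.5  # illustrative rate parameter

def exp_pdf(x: float) -> float:
    """Exponential PDF f(x) = lam * exp(-lam * x) for x >= 0."""
    return lam * exp(-lam * x) if x >= 0 else 0.0

def integrate(f, a: float, b: float, n: int = 200_000) -> float:
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# E[X] = integral of x f(x); Var(X) = integral of (x - mu)^2 f(x).
mean = integrate(lambda x: x * exp_pdf(x), 0, 60)
var = integrate(lambda x: (x - mean) ** 2 * exp_pdf(x), 0, 60)
# The median m solves F(m) = 0.5; here m = ln(2) / lam.
cdf_at_median = integrate(exp_pdf, 0, log(2) / lam)

assert abs(mean - 1 / lam) < 1e-3
assert abs(var - 1 / lam**2) < 1e-2
assert abs(cdf_at_median - 0.5) < 1e-6
```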
Differences and Comparisons
- The Poisson limit theorem lets Poisson(np) approximate Binomial(n, p) for rare events (large n, small p)
- Binomial converges to normal as n→∞ by CLT, bridging discrete-continuous
- Goodness-of-fit tests such as Kolmogorov–Smirnov compare an empirical sample against a hypothesized continuous distribution
- Continuous approximations like normal to binomial valid when np>5, n(1-p)>5
- Discrete can have atoms in distribution, continuous are atomless
- Quantization turns continuous into discrete, losing information
- Continuous RVs modeled by stochastic differential equations, discrete by difference eqs
- Le Cam's theorem bounds the error of approximating a sum of independent Bernoulli trials by a Poisson distribution
- Kolmogorov complexity characterizes the algorithmic regularity of discrete sequences
- Donsker's theorem functional CLT from discrete to continuous paths
- Continuity correction for normal approximations to discrete RVs: shift boundaries by ±0.5, e.g. P(X ≤ k) ≈ Φ((k + 0.5 − μ)/σ)
- Lévy's continuity theorem: pointwise convergence of characteristic functions implies convergence in distribution, covering discrete-to-continuous limits
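The continuity-correction bullet can be demonstrated concretely: approximating a binomial CDF with a normal CDF is noticeably more accurate after shifting the boundary by 0.5. A standard-library sketch (the parameters below are illustrative and satisfy the np > 5, n(1−p) > 5 rule of thumb):

```python
from math import comb, erf, sqrt

def binom_cdf(k: int, n: int, p: float) -> float:
    """Exact P(X <= k) for Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

def normal_cdf(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

n, p, k = 40, 0.5, 22                 # np = n(1-p) = 20 > 5
mu, sigma = n * p, sqrt(n * p * (1 - p))

exact = binom_cdf(k, n, p)
no_correction = normal_cdf((k - mu) / sigma)
with_correction = normal_cdf((k + 0.5 - mu) / sigma)  # continuity correction

# The +0.5 shift brings the continuous approximation much closer to the exact value.
assert abs(with_correction - exact) < abs(no_correction - exact)
```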
Discrete Distributions
- Negative binomial distribution counts trials needed for r successes, PMF P(X=k) = C(k-1, r-1) p^r (1-p)^{k-r} for k = r, r+1, ...
- Hypergeometric distribution for sampling without replacement, PMF P(X=k)= [C(K,k)C(N-K,n-k)] / C(N,n)
- Multinomial distribution generalizes binomial for multiple categories, PMF product of powers
- Discrete uniform on {1,2,...,n}, P(X=k)=1/n
- Zipf distribution for rank-frequency, discrete power law P(k)∝1/k^s
- Pascal distribution synonym for negative binomial
- Zeta distribution, discrete analog of the Pareto, P(X=k) = k^{-s}/ζ(s) for k ≥ 1
- Yule-Simon distribution for species richness, discrete
- Discrete phase-type distributions: time to absorption in a finite-state Markov chain
- Categorical distribution multinomial special case
- Skellam distribution difference of Poissons
- Compound Poisson distribution: sum of a Poisson-distributed number of iid random terms
- Binomial coefficient C(n,k) central in discrete probs
- Hermite distribution overdispersed Poisson
- Logarithmic distribution for species abundance, P(X=k) = -p^k / (k ln(1-p)) for k ≥ 1
- Delaporte distribution: convolution of a Poisson and a negative binomial
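Several of the PMFs listed above can be evaluated and sanity-checked directly: each should sum to 1 over its support and reproduce its known mean. A short standard-library sketch (all parameter values are arbitrary illustrations):

```python
from math import comb, exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) = lam^k e^(-lam) / k!"""
    return lam**k * exp(-lam) / factorial(k)

def geometric_pmf(k: int, p: float) -> float:
    """Trials until first success: P(X = k) = (1-p)^(k-1) p, k = 1, 2, ..."""
    return (1 - p) ** (k - 1) * p

def hypergeometric_pmf(k: int, N: int, K: int, n: int) -> float:
    """Sampling without replacement: C(K,k) C(N-K,n-k) / C(N,n)."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

lam = 4.0
assert abs(sum(poisson_pmf(k, lam) for k in range(100)) - 1.0) < 1e-9
assert abs(sum(k * poisson_pmf(k, lam) for k in range(100)) - lam) < 1e-9  # E[X] = lam

p = 0.3
assert abs(sum(geometric_pmf(k, p) for k in range(1, 500)) - 1.0) < 1e-9
assert abs(sum(k * geometric_pmf(k, p) for k in range(1, 500)) - 1 / p) < 1e-6  # E[X] = 1/p

N, K, n = 50, 5, 10   # e.g. 5 defectives in a lot of 50, sample 10
assert abs(sum(hypergeometric_pmf(k, N, K, n) for k in range(6)) - 1.0) < 1e-12
```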
Discrete Random Variables
- Discrete RVs have finite or countably infinite support sets
- Moments for discrete are E[X^n]=Σ x_i^n P(X=x_i)
- Skewness for discrete calculated as E[(X-μ)^3]/σ^3
- Bernoulli RV takes 0 or 1, E[X]=p, Var(X)=p(1-p)
- Discrete RVs model counts, such as customer arrivals per hour
- Discrete RVs have step-function CDF, continuous have smooth CDF
- Entropy for discrete H(X)= -Σ P(x) log P(x)
- Support of discrete RV is countable set, often {0,1,2,...}
- PMF sums to 1 for discrete: Σ P(X=x_i)=1
- Excess kurtosis for a discrete RV is E[(X-μ)^4]/σ^4 − 3, computed from the PMF
- Discrete RVs in Markov chains have countable states
- Covariance for bivariate discrete E[XY]-E[X]E[Y]
- Discrete RVs have probability generating function G(s)=E[s^X]
- Independence for discrete P(X=x,Y=y)=P(X=x)P(Y=y)
- Discrete Laplace distribution: two-sided geometric on the integers, discrete analog of the Laplace
- Queue lengths in queueing models such as M/D/1 are discrete RVs
- Generating functions multiply for independent discrete RVs
- A PMF is non-negative, P(X=x) ≥ 0, in addition to summing to 1
- Characteristic function for a discrete RV: φ(t) = E[e^{itX}] = Σ e^{it x_i} P(X=x_i)
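Several of the bullets above (PMF normalization, moments, entropy, and the probability generating function) can be illustrated on one small discrete RV; the Binomial(4, 0.5) chosen here is an arbitrary example:

```python
from math import comb, log2

# PMF of Binomial(4, 0.5) as an explicit dictionary {x: P(X = x)}.
pmf = {k: comb(4, k) * 0.5**4 for k in range(5)}

mean = sum(x * p for x, p in pmf.items())               # E[X] = sum x P(X=x)
var = sum((x - mean) ** 2 * p for x, p in pmf.items())  # Var(X) = E[(X-mu)^2]
entropy = -sum(p * log2(p) for p in pmf.values() if p > 0)  # H(X) in bits

def pgf(s: float) -> float:
    """Probability generating function G(s) = E[s^X]."""
    return sum(p * s**x for x, p in pmf.items())

assert abs(sum(pmf.values()) - 1.0) < 1e-12  # PMF sums to 1
assert abs(mean - 2.0) < 1e-12               # np = 4 * 0.5
assert abs(var - 1.0) < 1e-12                # np(1-p)
assert abs(pgf(1.0) - 1.0) < 1e-12           # G(1) = 1 for any PMF
```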