Deterministic: No randomness involved; the concept of probability does not apply
Sample Space: Set of possible outcomes of a random experiment; denoted $S$
Uniform Probability Model: Each outcome in $S$ is equally likely
Permutations: Ordered arrangements of $k$ objects chosen from $n$; ${}_nP_k = n^{(k)} = \frac{n!}{(n-k)!}$
Stirling's Approximation: $n! \approx \left(\frac{n}{e}\right)^n\sqrt{2\pi n}$
Combinations: Arrangements where order is arbitrary (i.e. "sort, then remove dupes"); ${n \choose k} = \frac{{}_nP_k}{k!} = \frac{n!}{k!(n-k)!}$
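A quick numerical check of these counting formulas (a minimal sketch in Python; the function names are just for illustration):

```python
import math

def permutations(n, k):
    # nPk = n! / (n-k)!: ordered arrangements of k objects from n
    return math.factorial(n) // math.factorial(n - k)

def combinations(n, k):
    # nCk = nPk / k!: order doesn't matter
    return permutations(n, k) // math.factorial(k)

def stirling(n):
    # Stirling's approximation: n! ~ (n/e)^n * sqrt(2*pi*n)
    return (n / math.e) ** n * math.sqrt(2 * math.pi * n)

print(permutations(5, 3))                # 60
print(combinations(5, 3))                # 10, same as math.comb(5, 3)
print(math.factorial(10), stirling(10))  # 3628800 vs ~3598695.6
```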
Random Variables: Function from sample space → ℝ (e.g. $X=\text{the number of heads when a coin is flipped thrice}$)
Cumulative Distribution Function: $F(x)=P(X\le x)$; $F(x) \in [0, 1]$; $\lim_{x→-∞}F(x)=0$ and $\lim_{x→∞}F(x)=1$
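As a concrete illustration (a minimal sketch, assuming a fair coin), the CDF of $X$ = number of heads in three flips:

```python
from itertools import product

# Sample space: all 2^3 equally likely H/T sequences (uniform probability model)
S = list(product("HT", repeat=3))

# Random variable: X maps each outcome to its number of heads
def X(outcome):
    return outcome.count("H")

def F(x):
    # CDF: F(x) = P(X <= x) under the uniform model
    return sum(1 for s in S if X(s) <= x) / len(S)

print([F(x) for x in (-1, 0, 1, 2, 3)])  # [0.0, 0.125, 0.5, 0.875, 1.0]
```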
Binomial Distribution: $n$ independent trials, each with two outcomes, "success" (probability $p$) or "failure".
$f(x)=P(X=x)={n\choose x}p^x(1-p)^{n-x}\text{ for }x=0, 1, 2, \ldots, n$
X ~ B(n, p), where X is the number of successes
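A direct transcription of the Binomial pmf (a sketch; the numbers are only illustrative):

```python
from math import comb

def binomial_pmf(x, n, p):
    # P(X = x) = C(n, x) * p^x * (1-p)^(n-x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# e.g. probability of exactly 3 successes in 10 trials with p = 0.4
print(binomial_pmf(3, 10, 0.4))  # ≈ 0.215
```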
Hypergeometric Distribution: Binomial without replacement; draw $n$ objects from $N$, of which $r$ are successes
$f(x)=P(X=x)=\frac{{r \choose x}{N-r \choose n-x}}{{N \choose n}}$
X ~ Hypergeometric(n, r, N), where X is the number of successes in the sample
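The same, for the Hypergeometric pmf (a sketch; parameters follow the convention above: $N$ objects, $r$ successes, a sample of size $n$):

```python
from math import comb

def hypergeom_pmf(x, n, r, N):
    # P(X = x) = C(r, x) * C(N-r, n-x) / C(N, n): sampling without replacement
    return comb(r, x) * comb(N - r, n - x) / comb(N, n)

# e.g. probability of drawing 2 red balls when picking 5 from an urn
# with 7 red and 13 non-red balls (N = 20, r = 7, n = 5)
print(hypergeom_pmf(2, 5, 7, 20))  # ≈ 0.387
```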
Negative Binomial Distribution: Binomial, except we keep doing the experiment until we get $k$ successes
$f(x)=P(X=x)={x+k-1\choose x}p^k(1-p)^x\text{ for }x=0, 1, 2, \ldots$
X ~ NB(k, p), where X is the number of failures before the $k$th success
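And the Negative Binomial pmf (a sketch; $x$ counts failures before the $k$th success):

```python
from math import comb

def neg_binomial_pmf(x, k, p):
    # P(X = x) = C(x+k-1, x) * p^k * (1-p)^x
    return comb(x + k - 1, x) * p**k * (1 - p)**x

# e.g. probability of 4 failures before the 3rd success with p = 0.5
print(neg_binomial_pmf(4, 3, 0.5))  # ≈ 0.117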
Geometric Distribution: Negative Binomial with $k=1$ success
$f(x)=P(X=x)=p(1-p)^x\text{ for }x=0, 1, 2, \ldots$
X ~ NB(1, p), where X is the number of failures before the first success
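The Geometric case is just the $k=1$ specialization (sketch):

```python
def geometric_pmf(x, p):
    # P(X = x) = p * (1-p)^x: x failures, then the first success
    return p * (1 - p)**x

# matches the Negative Binomial pmf with k = 1
print(geometric_pmf(2, 0.5))  # 0.125 = P(TTH) for a fair coin, "success" = heads
```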
Poisson Distribution from Binomial: Binomial as $n→∞, p→0$ with $np→μ$
$f(x)=P(X=x)=\frac{μ^xe^{-μ}}{x!}\text{ for }x=0, 1, 2, \ldots$
X ~ Poisson(μ), where X is the number of events
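A quick numerical check of the Binomial → Poisson limit (a sketch, holding $μ = np$ fixed while $n$ grows):

```python
from math import comb, exp, factorial

def binomial_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x, mu):
    # P(X = x) = mu^x * e^(-mu) / x!
    return mu**x * exp(-mu) / factorial(x)

mu, x = 2.0, 3
for n in (10, 100, 1000):
    print(n, binomial_pmf(x, n, mu / n))  # approaches the Poisson value below
print(poisson_pmf(x, mu))  # ≈ 0.180
```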
Poisson Distribution from Poisson Process: Events that occur at points in time or space at rate $λ$; over an interval of length $t$, $μ=λt$ and
$f_t(x)=\frac{μ^xe^{-μ}}{x!}\text{ for }x=0,1,2,\ldots$
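For a Poisson process observed over an interval of length $t$ (a sketch, assuming rate $λ$ so that $μ = λt$; the numbers are only illustrative):

```python
from math import exp, factorial

def poisson_process_pmf(x, rate, t):
    # Number of events in an interval of length t: Poisson with mu = rate * t
    mu = rate * t
    return mu**x * exp(-mu) / factorial(x)

# e.g. events arriving at 3 per hour: probability of exactly 5 events in 2 hours
print(poisson_process_pmf(5, rate=3.0, t=2.0))  # ≈ 0.161
```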