Probability Guide
Classical Probability
P(A) = (favorable outcomes) / (total equally likely outcomes). Example: P(heads on a fair coin) = 1/2; P(rolling a 6 on a fair die) = 1/6
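A minimal sketch of the classical definition in Python; the helper name `classical_probability` is illustrative, and `Fraction` keeps the ratios exact:

```python
from fractions import Fraction

def classical_probability(favorable, total):
    """Classical probability: favorable / total equally likely outcomes."""
    return Fraction(favorable, total)

# P(heads) on a fair coin: 1 favorable outcome of 2
print(classical_probability(1, 2))  # → 1/2
# P(rolling a 6) on a fair die: 1 favorable outcome of 6
print(classical_probability(1, 6))  # → 1/6
```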
Addition Rule
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
If mutually exclusive: P(A ∪ B) = P(A) + P(B)
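The addition rule can be sketched as a small helper (the name `prob_union` and the card-deck numbers are illustrative); passing an intersection of 0 covers the mutually exclusive case:

```python
def prob_union(p_a, p_b, p_a_and_b=0.0):
    """P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
    For mutually exclusive events, leave p_a_and_b at 0."""
    return p_a + p_b - p_a_and_b

# Drawing one card: P(king or heart); the king of hearts is counted once
p_king, p_heart, p_king_of_hearts = 4/52, 13/52, 1/52
print(prob_union(p_king, p_heart, p_king_of_hearts))  # 16/52 ≈ 0.3077
```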
Multiplication Rule
P(A ∩ B) = P(A) × P(B|A)
If independent: P(A ∩ B) = P(A) × P(B)
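A sketch of both cases of the multiplication rule, using an assumed card-drawing example for the dependent case:

```python
# Dependent events: drawing two aces without replacement.
# P(second ace | first ace) = 3/51 because one ace is already gone.
p_first_ace = 4/52
p_second_ace_given_first = 3/51
p_both_aces = p_first_ace * p_second_ace_given_first  # ≈ 0.0045

# Independent events: two fair coin flips both landing heads.
p_two_heads = 0.5 * 0.5  # 0.25

print(p_both_aces, p_two_heads)
```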
Conditional Probability
P(A|B) = P(A ∩ B) / P(B) — probability of A given that B has occurred (defined only when P(B) > 0).
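The definition can be checked by brute-force counting over a finite sample space; the two-dice events below are an assumed example:

```python
from itertools import product

# Two fair dice: P(sum = 8 | first die is even), by counting outcomes
outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely pairs
b = [o for o in outcomes if o[0] % 2 == 0]        # event B: first die even (18 outcomes)
a_and_b = [o for o in b if sum(o) == 8]           # A ∩ B: (2,6), (4,4), (6,2)
p_a_given_b = len(a_and_b) / len(b)
print(p_a_given_b)  # 3/18 ≈ 0.1667
```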
Bayes' Theorem
P(A|B) = P(B|A) × P(A) / P(B)
Used to update a prior belief in light of new evidence. Foundation of Bayesian statistics and of many ML classifiers.
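A sketch of a Bayesian update, expanding P(B) with the law of total probability; the diagnostic-test numbers (1% prevalence, 99% sensitivity, 5% false-positive rate) are assumed for illustration:

```python
def bayes(p_b_given_a, p_a, p_b_given_not_a):
    """P(A|B) = P(B|A)·P(A) / P(B), with
    P(B) = P(B|A)·P(A) + P(B|¬A)·P(¬A)."""
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
    return p_b_given_a * p_a / p_b

# P(disease | positive test): even a 99%-sensitive test yields a
# modest posterior when the condition is rare.
posterior = bayes(p_b_given_a=0.99, p_a=0.01, p_b_given_not_a=0.05)
print(round(posterior, 4))  # 0.1667
```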
Expected Value
E(X) = Σ xᵢ · P(xᵢ) — the weighted average of all possible outcomes. Example: E(fair die) = (1+2+3+4+5+6)/6 = 3.5
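The weighted average can be computed directly from a distribution table; the helper name `expected_value` is illustrative, and `Fraction` keeps the die example exact:

```python
from fractions import Fraction

def expected_value(dist):
    """E(X) = Σ x · P(x) for a discrete distribution {outcome: probability}."""
    return sum(x * p for x, p in dist.items())

fair_die = {face: Fraction(1, 6) for face in range(1, 7)}
print(expected_value(fair_die))  # 7/2, i.e. 3.5
```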