What Did Buhayan Learn?? 🤔

IV. Discrete Probability

Discrete probability deals with the likelihood of events occurring in situations where the set of possible outcomes (the sample space) is finite or countably infinite. It provides a mathematical framework for quantifying uncertainty and making predictions in scenarios involving chance.

Real-life context: Discrete probability is used everywhere, from determining the odds in games of chance (like lotteries or card games) and weather forecasting, to risk assessment in finance and insurance, quality control in manufacturing, and evaluating the effectiveness of medical treatments.

A. Probability Theorems & Axioms

What it is: Probability theory is built upon a few fundamental axioms and rules:

  • Axiom 1: For any event E, 0 ≤ P(E) ≤ 1. (Probabilities are non-negative and no greater than 1).
  • Axiom 2: P(S) = 1. (The probability of the entire sample space, i.e., that some outcome occurs, is 1).
  • Axiom 3 (For mutually exclusive events): If E1, E2, ..., En are pairwise mutually exclusive events (i.e., no two of them can occur at the same time: Ei ∩ Ej = Ø for i ≠ j), then P(E1 ∪ E2 ∪ ... ∪ En) = P(E1) + P(E2) + ... + P(En).
From these axioms, several important rules can be derived (a small numeric check with two dice follows this list):
  • Rule of Complement: P(E') = 1 - P(E), where E' is the complement of E (event E not happening).
  • Addition Rule (General): For any two events A and B, P(A ∪ B) = P(A) + P(B) - P(A ∩ B). (The probability of A or B or both occurring).
  • Conditional Probability: The probability of event A occurring given that event B has already occurred is P(A|B) = P(A ∩ B) / P(B), provided P(B) > 0.
  • Multiplication Rule: P(A ∩ B) = P(A|B)P(B) = P(B|A)P(A). (The probability of both A and B occurring).
  • Independence: Two events A and B are independent if the occurrence of one does not affect the probability of the other. If independent, P(A ∩ B) = P(A)P(B), and P(A|B) = P(A).
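
As promised above, here is a minimal Python sanity check of these rules. It brute-forces the 36-outcome sample space of two fair dice; the specific events A ("first die shows 6") and B ("the dice sum to 7") are illustrative choices, not from the notes:

```python
# Sanity-check the addition rule and independence by enumerating the
# 36 equally likely outcomes of rolling two fair six-sided dice.
from fractions import Fraction
from itertools import product

sample_space = list(product(range(1, 7), repeat=2))  # all (die1, die2) pairs

def p(event):
    """Probability of an event, given as a predicate over outcomes."""
    hits = sum(1 for outcome in sample_space if event(outcome))
    return Fraction(hits, len(sample_space))

def A(o):  # event A: the first die shows 6
    return o[0] == 6

def B(o):  # event B: the two dice sum to 7
    return o[0] + o[1] == 7

# Addition rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
p_union = p(lambda o: A(o) or B(o))
assert p_union == p(A) + p(B) - p(lambda o: A(o) and B(o))

# Independence: P(A ∩ B) = P(A)·P(B) holds here, since knowing the sum
# is 7 tells you nothing about the first die: P(A|B) = 1/6 = P(A).
assert p(lambda o: A(o) and B(o)) == p(A) * p(B)

print(p(A), p(B), p_union)  # 1/6 1/6 11/36
```

Using Fraction keeps the arithmetic exact, so the asserts verify the rules with no floating-point fuzz.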

B. Bayes' Theorem

What it is: Bayes' Theorem describes how to update the probability of a hypothesis H based on new evidence E. It relates the conditional probability of H given E to the conditional probability of E given H; a worked numeric example follows the list of terms below. The formula is:
P(H|E) = [P(E|H) * P(H)] / P(E)
Where:

  • P(H|E) is the posterior probability: the probability of hypothesis H being true after observing evidence E.
  • P(E|H) is the likelihood: the probability of observing evidence E if hypothesis H is true.
  • P(H) is the prior probability: the initial probability of hypothesis H being true before observing evidence E.
  • P(E) is the evidence probability (or marginal likelihood): the total probability of observing evidence E. It can be calculated as P(E) = P(E|H)P(H) + P(E|¬H)P(¬H) for a simple case with H and its complement ¬H.
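
To make the terms concrete, here is a minimal Python sketch of the classic medical-testing example. All numbers (1% prevalence, 99% sensitivity, 5% false-positive rate) are assumed purely for illustration, not taken from any real test:

```python
# Bayes' Theorem with assumed numbers: H = "has the disease",
# E = "test came back positive".
p_h = 0.01              # prior P(H): disease prevalence (assumed 1%)
p_e_given_h = 0.99      # likelihood P(E|H): test sensitivity (assumed 99%)
p_e_given_not_h = 0.05  # P(E|¬H): false-positive rate (assumed 5%)

# Evidence / marginal likelihood: P(E) = P(E|H)P(H) + P(E|¬H)P(¬H)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior: P(H|E) = P(E|H) * P(H) / P(E)
p_h_given_e = p_e_given_h * p_h / p_e
print(f"P(H|E) = {p_h_given_e:.3f}")  # ≈ 0.167
```

Note the punchline: even with a 99%-sensitive test, a positive result here only raises the probability of disease to about 17%, because the low prior means most positives come from the much larger healthy group.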

C. Expected Value and Variance

What it is:

  • Expected Value (E[X] or μ): For a discrete random variable X that can take values x1, x2, ..., xn with probabilities P(X=x1), P(X=x2), ..., P(X=xn), the expected value is the weighted average of these values:
    E[X] = Σ [xi * P(X=xi)]
    It represents the long-run average outcome if the experiment is repeated many times.
  • Variance (Var(X) or σ²): Measures the spread or dispersion of the random variable's values around its expected value.
    Var(X) = E[(X - E[X])²] = Σ [(xi - E[X])² * P(X=xi)]
    An alternative formula is Var(X) = E[X²] - (E[X])²; both are checked numerically in the sketch after this list.
  • Standard Deviation (σ): The square root of the variance (σ = √Var(X)). It is in the same units as the random variable, making it easier to interpret the spread.
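
Here is a minimal Python sketch, using a fair six-sided die as an assumed example, that computes E[X] and σ and confirms that the two variance formulas above agree:

```python
# Expected value, variance (both formulas), and standard deviation
# for a fair six-sided die, using exact rational arithmetic.
from fractions import Fraction
import math

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # P(X = x) for each face

mean = sum(x * p for x, p in pmf.items())                   # E[X] = Σ x·P(X=x)
var_def = sum((x - mean) ** 2 * p for x, p in pmf.items())  # E[(X - E[X])²]
var_alt = sum(x**2 * p for x, p in pmf.items()) - mean**2   # E[X²] - (E[X])²

assert var_def == var_alt  # the two variance formulas agree
sigma = math.sqrt(var_def)  # standard deviation, same units as X

print(mean, var_def, round(sigma, 3))  # 7/2 35/12 1.708
```

So a fair die averages 3.5 in the long run, with a standard deviation of about 1.71 pips around that average.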

Where can all this be used?

  • Gaming: Calculating the chances of winning a dice or card game
  • Cybersecurity: Predicting the likelihood of attack vectors or password cracking
  • Operations Research: Modeling queueing systems and machine failures (discrete events)
  • Epidemiology: Counting the number of cases or outbreaks (discrete counts of events)
  • Genetics: Probabilities of trait inheritance (Punnett squares, Mendelian genetics)