Probability Formula Booklet

Complete Reference Guide for All Mathematics Students

📐 What is Probability?

Definition:
Probability is a measure of the likelihood that an event will occur. It quantifies uncertainty and is expressed as a number between 0 and 1, where 0 means the event will never occur and 1 means it will certainly occur.

Key Concepts:
Experiment: A repeatable process that produces a well-defined outcome
Sample Space (S): The set of all possible outcomes
Event (A): A subset of the sample space
Favorable Outcomes: The outcomes that satisfy the event

📊 Basic Probability Formula

\[ P(A) = \frac{n(A)}{n(S)} = \frac{\text{Number of favorable outcomes}}{\text{Total number of possible outcomes}} \]

Where:
• P(A) = Probability of event A
• n(A) = Number of favorable outcomes
• n(S) = Total number of possible outcomes in sample space

Probability Range:
\[ 0 \leq P(A) \leq 1 \]

Example: Probability of rolling a 3 on a fair die

Favorable outcomes = 1 (only one face shows 3)
Total outcomes = 6 (six faces)
\[ P(\text{rolling 3}) = \frac{1}{6} \approx 0.167 \text{ or } 16.7\% \]
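A minimal Python sketch of this example, counting favorable outcomes over the sample space and cross-checking with a quick simulation (the 100,000-roll count is an arbitrary choice):

```python
import random

# Exact probability by counting favorable outcomes over the sample space
sample_space = [1, 2, 3, 4, 5, 6]
favorable = [outcome for outcome in sample_space if outcome == 3]
p_exact = len(favorable) / len(sample_space)           # 1/6 ~ 0.167

# Quick simulation cross-check
rolls = 100_000
hits = sum(random.choice(sample_space) == 3 for _ in range(rolls))
print(p_exact, hits / rolls)                           # both ~ 0.167
```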

🔄 Complementary Events

\[ P(A') = 1 - P(A) \]

or

\[ P(A) + P(A') = 1 \]

Where:
• A' (or Aᶜ) = Complement of event A (event A does NOT occur)
• P(A') = Probability that event A does not happen
• The sum of probabilities of an event and its complement is always 1

Example: Probability of NOT getting a head in a coin toss

P(Head) = 1/2
\[ P(\text{Not Head}) = 1 - \frac{1}{2} = \frac{1}{2} \]
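A short Python sketch of the complement rule; the three-flip "at least one head" case at the end is an added illustration of the same rule (assumed here, not part of the worked example above):

```python
# Complement rule for a single fair coin toss
p_head = 1 / 2
p_not_head = 1 - p_head
print(p_not_head)                     # 0.5

# "At least one head" in 3 independent flips = 1 - P(no heads)
p_at_least_one_head = 1 - (1 - p_head) ** 3
print(p_at_least_one_head)            # 0.875
```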

➕ Addition Rule (Union of Events)

General Addition Rule:

\[ P(A \cup B) = P(A) + P(B) - P(A \cap B) \]

(Probability of A OR B occurring)

Where:
• P(A ∪ B) = Probability of A or B (union)
• P(A ∩ B) = Probability of both A and B (intersection)
• We subtract P(A ∩ B) to avoid counting overlap twice

For Mutually Exclusive Events:
(Events that cannot occur simultaneously)

\[ P(A \cap B) = 0 \]

\[ P(A \cup B) = P(A) + P(B) \]

Example: Drawing a King or a Heart from a deck

P(King) = 4/52, P(Heart) = 13/52, P(King of Hearts) = 1/52
\[ P(\text{King or Heart}) = \frac{4}{52} + \frac{13}{52} - \frac{1}{52} = \frac{16}{52} = \frac{4}{13} \]
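A Python sketch of the same calculation, applying the addition rule directly and then cross-checking by enumerating a 52-card deck (the rank and suit labels are illustrative):

```python
from fractions import Fraction
from itertools import product

# Addition rule applied directly
p_union = Fraction(4, 52) + Fraction(13, 52) - Fraction(1, 52)
print(p_union)                          # 4/13

# Cross-check by enumerating a 52-card deck
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = list(product(ranks, suits))
count = sum(1 for rank, suit in deck if rank == "K" or suit == "hearts")
print(Fraction(count, len(deck)))       # 4/13
```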

✖️ Multiplication Rule (Intersection of Events)

For Independent Events:

\[ P(A \cap B) = P(A) \times P(B) \]

(Probability of A AND B both occurring)

For Dependent Events:

\[ P(A \cap B) = P(A) \times P(B|A) \]

where P(B|A) = probability of B given A has occurred

Independent vs Dependent:
Independent: One event does not affect the other (e.g., two coin flips)
Dependent: One event affects the probability of the other (e.g., drawing cards without replacement)

Example (Independent): Flipping two coins, getting heads both times

\[ P(\text{2 heads}) = P(\text{Head}) \times P(\text{Head}) = \frac{1}{2} \times \frac{1}{2} = \frac{1}{4} \]
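A small Python sketch of the independent case: the multiplication rule on one line, then a cross-check by listing the four equally likely outcomes of two flips:

```python
from itertools import product

# Multiplication rule for independent events
p_rule = (1 / 2) * (1 / 2)

# Cross-check by enumerating the four equally likely outcomes
outcomes = list(product("HT", repeat=2))
p_enum = sum(outcome == ("H", "H") for outcome in outcomes) / len(outcomes)
print(p_rule, p_enum)                   # 0.25 0.25
```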

🔗 Conditional Probability

\[ P(B|A) = \frac{P(A \cap B)}{P(A)} \]

(Probability of B given that A has occurred)

Where:
• P(B|A) = Probability of event B occurring given that A has already occurred
• P(A ∩ B) = Probability of both A and B occurring
• P(A) = Probability of event A (must be > 0)

Alternative form:
\[ P(A \cap B) = P(A) \times P(B|A) = P(B) \times P(A|B) \]

Example: Drawing 2 aces without replacement

P(1st Ace) = 4/52
P(2nd Ace | 1st Ace) = 3/51
\[ P(\text{2 Aces}) = \frac{4}{52} \times \frac{3}{51} = \frac{1}{221} \]
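The same dependent-events calculation as a Python sketch, kept in exact fractions:

```python
from fractions import Fraction

p_first_ace = Fraction(4, 52)           # 4 aces in a full deck
p_second_given_first = Fraction(3, 51)  # one ace and one card already removed
p_two_aces = p_first_ace * p_second_given_first
print(p_two_aces)                       # 1/221
```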

🎓 Bayes' Theorem

\[ P(A|B) = \frac{P(B|A) \times P(A)}{P(B)} \]

Where:
• P(A|B) = Posterior probability (what we want to find)
• P(B|A) = Likelihood (probability of B given A)
• P(A) = Prior probability
• P(B) = Marginal probability (total probability of B)

Extended form with total probability:

\[ P(A|B) = \frac{P(B|A) \times P(A)}{P(B|A) \times P(A) + P(B|A') \times P(A')} \]

Use case: Updating probabilities based on new evidence (medical diagnosis, spam filtering, etc.)
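A minimal Python sketch of a diagnostic-test style update; the 1% prevalence, 95% true-positive rate, and 10% false-positive rate below are assumed numbers for illustration, not values from the booklet:

```python
# Assumed inputs
p_a = 0.01                  # prior P(A): condition is present
p_b_given_a = 0.95          # likelihood P(B|A): positive test if condition present
p_b_given_not_a = 0.10      # P(B|A'): positive test if condition absent

# Marginal P(B) via the law of total probability
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Posterior P(A|B): probability of the condition given a positive test
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))            # ~0.088
```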

🎲 Binomial Probability Formula

\[ P(X = k) = \binom{n}{k} p^k (1-p)^{n-k} \]

\[ = \frac{n!}{k!(n-k)!} p^k (1-p)^{n-k} \]

Where:
• n = Number of trials
• k = Number of successes
• p = Probability of success on a single trial
• (1-p) = Probability of failure
• \(\binom{n}{k}\) = Binomial coefficient (combinations)

Conditions for Binomial Distribution:
• Fixed number of trials (n)
• Each trial is independent
• Only two outcomes (success/failure)
• Constant probability of success (p)

Example: Probability of getting exactly 3 heads in 5 coin flips

n = 5, k = 3, p = 0.5
\[ P(X=3) = \binom{5}{3} (0.5)^3 (0.5)^2 = 10 \times 0.125 \times 0.25 = 0.3125 \]
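The same binomial calculation as a Python sketch, using the standard library's `math.comb` for the binomial coefficient:

```python
from math import comb

n, k, p = 5, 3, 0.5
p_exactly_3_heads = comb(n, k) * p**k * (1 - p)**(n - k)
print(p_exactly_3_heads)                # 0.3125
```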

📈 Expected Value and Variance

Expected Value (Mean):

\[ E(X) = \mu = \sum_{i=1}^{n} x_i \cdot P(x_i) \]

Variance:

\[ \text{Var}(X) = \sigma^2 = \sum_{i=1}^{n} (x_i - \mu)^2 \cdot P(x_i) \]

\[ = E(X^2) - [E(X)]^2 \]

Standard Deviation:

\[ \sigma = \sqrt{\text{Var}(X)} \]

For Binomial Distribution:
• E(X) = np
• Var(X) = np(1-p)
• σ = √[np(1-p)]
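A Python sketch of these formulas; the fair six-sided die and the n = 10, p = 0.5 binomial values are assumed examples, not taken from the text above:

```python
# Discrete case: a fair six-sided die
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mean = sum(x * p for x, p in zip(values, probs))                    # E(X) = 3.5
variance = sum((x - mean) ** 2 * p for x, p in zip(values, probs))  # ~2.9167
std_dev = variance ** 0.5                                           # ~1.7078
print(mean, round(variance, 4), round(std_dev, 4))

# Binomial shortcuts: E(X) = np, Var(X) = np(1-p)
n, p = 10, 0.5
print(n * p, n * p * (1 - p), (n * p * (1 - p)) ** 0.5)             # 5.0 2.5 ~1.5811
```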

🔢 Permutations and Combinations

Permutations (Order Matters):

\[ P(n,r) = \frac{n!}{(n-r)!} \]

(Arranging r objects from n objects)

Combinations (Order Doesn't Matter):

\[ C(n,r) = \binom{n}{r} = \frac{n!}{r!(n-r)!} \]

(Selecting r objects from n objects)

Key Difference:
Permutations: ABC ≠ BAC (order matters)
Combinations: {A,B,C} = {B,A,C} (order doesn't matter)

Examples:

Permutation: Arranging 3 books from 5 = P(5,3) = 60
Combination: Choosing 3 books from 5 = C(5,3) = 10
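Both counts can be checked with the standard library (`math.perm` and `math.comb`, available since Python 3.8):

```python
from math import comb, perm

print(perm(5, 3))   # 60 ordered arrangements of 3 books chosen from 5
print(comb(5, 3))   # 10 unordered selections of 3 books chosen from 5
```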

📉 Normal Distribution (Gaussian)

Probability Density Function:

\[ f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2\sigma^2}} \]

Where:
• μ = Mean (center of distribution)
• σ = Standard deviation (spread)
• σ² = Variance
• π ≈ 3.14159
• e ≈ 2.71828

Standard Normal (Z-score):

\[ Z = \frac{X - \mu}{\sigma} \]

(Converts to standard normal with μ = 0, σ = 1)

Empirical Rule (68-95-99.7):
• 68% of data within 1σ of mean
• 95% of data within 2σ of mean
• 99.7% of data within 3σ of mean
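A Python sketch of the Z-score and the empirical rule; the mean 100, standard deviation 15, and observation 130 are assumed illustrative values:

```python
from statistics import NormalDist

# Z-score for an assumed example
mu, sigma = 100, 15
x = 130
z = (x - mu) / sigma
print(z)                                 # 2.0

# Empirical rule checked against the standard normal CDF
std_normal = NormalDist(0, 1)
for k in (1, 2, 3):
    within = std_normal.cdf(k) - std_normal.cdf(-k)
    print(k, round(within, 4))           # ~0.6827, ~0.9545, ~0.9973
```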

🎯 Law of Total Probability

\[ P(B) = \sum_{i=1}^{n} P(B|A_i) \cdot P(A_i) \]

For a partition {A₁, A₂, ..., Aₙ} of the sample space

For two events:
\[ P(B) = P(B|A) \cdot P(A) + P(B|A') \cdot P(A') \]

Use case: Finding the total probability when event B can occur through multiple pathways (the mutually exclusive, exhaustive events A₁, A₂, ..., Aₙ)
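A small Python sketch of the two-event form; the numbers are an assumed example (60% of parts from machine A with a 2% defect rate, 40% from elsewhere with a 5% defect rate):

```python
# Assumed two-pathway example
p_a, p_not_a = 0.60, 0.40
p_defect_given_a, p_defect_given_not_a = 0.02, 0.05

# P(defect) = P(defect|A)P(A) + P(defect|A')P(A')
p_defect = p_defect_given_a * p_a + p_defect_given_not_a * p_not_a
print(p_defect)                          # 0.032
```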

📊 Common Probability Distributions

Bernoulli Distribution

Single trial with two outcomes
\[ P(X = x) = p^x(1-p)^{1-x} \text{ for } x \in \{0,1\} \]

Poisson Distribution

Number of events in fixed interval
\[ P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} \]
where λ = average number of events per interval

Geometric Distribution

Number of trials until first success
\[ P(X = k) = (1-p)^{k-1} \cdot p \]

Uniform Distribution

All outcomes equally likely
\[ P(X = x) = \frac{1}{n} \text{ for discrete} \]
\[ f(x) = \frac{1}{b-a} \text{ for continuous on } [a,b] \]
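A Python sketch evaluating one point of each discrete distribution above; the parameter values (p = 0.3, λ = 2, k = 4, n = 6) are illustrative assumptions:

```python
from math import exp, factorial

p = 0.3                                      # success probability (illustrative)

# Bernoulli: P(X = 1)
print(p**1 * (1 - p)**(1 - 1))               # 0.3

# Poisson: P(X = 3) with average rate lambda = 2
lam, k = 2.0, 3
print(lam**k * exp(-lam) / factorial(k))     # ~0.1804

# Geometric: first success on trial 4
print((1 - p)**(4 - 1) * p)                  # ~0.1029

# Discrete uniform over n = 6 equally likely outcomes
print(1 / 6)                                 # ~0.1667
```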

⚡ Important Probability Properties

1. Sum of all probabilities = 1
\[ \sum P(x_i) = 1 \]

2. Probability of impossible event = 0
\[ P(\emptyset) = 0 \]

3. Probability of certain event = 1
\[ P(S) = 1 \]

4. For independent events A and B:
\[ P(A|B) = P(A) \text{ and } P(B|A) = P(B) \]

5. De Morgan's Laws:
\[ P((A \cup B)') = P(A' \cap B') \]
\[ P((A \cap B)') = P(A' \cup B') \]

📋 Quick Reference Summary

Basic: \(P(A) = \frac{n(A)}{n(S)}\) | Range: \(0 \leq P(A) \leq 1\)

Complement: \(P(A') = 1 - P(A)\)

Addition: \(P(A \cup B) = P(A) + P(B) - P(A \cap B)\)

Multiplication (Independent): \(P(A \cap B) = P(A) \times P(B)\)

Conditional: \(P(B|A) = \frac{P(A \cap B)}{P(A)}\)

Bayes': \(P(A|B) = \frac{P(B|A) \times P(A)}{P(B)}\)

🎲 Master Probability for Mathematical Success!

Probability is fundamental to statistics, data science, and decision-making

💡 Pro Tips:
• Always check if events are independent or dependent
• Verify your probabilities sum to 1 for all possible outcomes
• Draw diagrams (tree diagrams, Venn diagrams) to visualize problems
• Remember: probability is always between 0 and 1
• For "at least" problems, often easier to use complement rule
• Practice identifying which formula applies to each problem type
