Bayes’ Theorem
Maths: Statistics for machine learning
2 min read
Published Oct 22 2025, updated Oct 23 2025
Tags: Machine Learning · Maths · NumPy · Pandas · Python · Statistics
Bayes’ Theorem describes how to update the probability of a hypothesis when given new evidence.
It tells us:
“How likely is something (a hypothesis) to be true, given we’ve seen some data (evidence)?”
High-level formula
Posterior = (Likelihood × Prior) / Evidence
It starts with what you already believe (prior) and updates it using new data (likelihood).
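As a minimal sketch, the formula above maps directly onto a one-line helper (the function name and the example numbers here are illustrative, not from the guide):

```python
# Sketch of Bayes' theorem: Posterior = (Likelihood × Prior) / Evidence
def bayes_posterior(prior, likelihood, evidence):
    # prior:      P(H), what you believed before seeing the data
    # likelihood: P(E|H), how well the data fits the hypothesis
    # evidence:   P(E), the overall probability of seeing the data
    return (likelihood * prior) / evidence

# Illustrative numbers: prior 0.5, likelihood 0.8, evidence 0.6
print(bayes_posterior(0.5, 0.8, 0.6))  # ≈ 0.667
```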
Real-World Example — Medical Test
Let’s say:
- 1% of people have a certain disease (P(Disease) = 0.01)
- A test is 99% accurate:
- If you have the disease, it’s positive 99% of the time (P(Pos|Disease) = 0.99)
- If you don’t, it’s negative 99% of the time (P(Neg|NoDisease) = 0.99)
Now, you take the test and it comes back positive.
What’s the probability you actually have the disease? (P(Disease|Pos))
Python example:
# Given values
P_disease = 0.01
P_no_disease = 1 - P_disease
P_pos_given_disease = 0.99
P_pos_given_no_disease = 0.01
# Calculate total probability of testing positive
P_positive = (P_pos_given_disease * P_disease) + (P_pos_given_no_disease * P_no_disease)
# Apply Bayes' theorem
P_disease_given_positive = (P_pos_given_disease * P_disease) / P_positive
print(f"Probability of disease given positive test: {P_disease_given_positive:.2%}")
Output:
Probability of disease given positive test: 50.00%
Even with a 99% accurate test, if the disease is rare, a positive result only means a 50% chance you actually have it.
Intuition
- The prior captures what you already believe (disease is rare).
- The likelihood shows how well the evidence fits that belief (positive test).
- The posterior gives your updated belief after seeing the new data.
This “belief updating” is what powers Bayesian inference — the foundation of probabilistic learning.
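One way to see belief updating in action is to feed each posterior back in as the next prior. A hedged sketch reusing the medical-test numbers above (the `update` helper is illustrative, not part of the guide's code):

```python
# Sequential Bayesian updating: the posterior after one positive test
# becomes the prior for the next test.
def update(prior, p_pos_given_disease=0.99, p_pos_given_no_disease=0.01):
    # Total probability of a positive result under the current belief
    p_positive = p_pos_given_disease * prior + p_pos_given_no_disease * (1 - prior)
    # Bayes' theorem: P(Disease | Pos)
    return (p_pos_given_disease * prior) / p_positive

belief = 0.01  # start from the 1% prior
for test_number in (1, 2):
    belief = update(belief)
    print(f"After positive test {test_number}: {belief:.2%}")
# After positive test 1: 50.00%
# After positive test 2: 99.00%
```

A second independent positive result pushes the 50% posterior up to 99%, which is the core of Bayesian inference: evidence accumulates by repeatedly applying the same update rule.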