Bayes Theorem
Bayes theorem is a theorem in probability and statistics, named after the Reverend Thomas Bayes, that helps in determining the probability of an event based on another event that has already occurred. Bayes rule has many applications, such as Bayesian inference and, in the healthcare sector, estimating the chances of developing health problems with increasing age, among many others.
The Bayes theorem is based on finding P(A | B) when P(B | A) is given. Here, we aim to understand the use of the Bayes rule in determining the probability of events, along with its statement, formula, and derivation, with the help of examples.
1. What is Bayes Theorem?
2. Bayes Theorem Proof
3. Bayes Theorem Formula
4. Difference Between Conditional Probability and Bayes Rule
5. Terms Related to Bayes Theorem
6. Bayes Theorem Examples
7. FAQs on Bayes Theorem
What is Bayes Theorem?
Bayes theorem, in simple words, determines the conditional probability of event A given that event B has already occurred based on the following:
- Probability of B given A
- Probability of A
- Probability of B
Bayes Law is a method to determine the probability of an event based on the occurrences of prior events. It is used to calculate conditional probability: the probability of a hypothesis is updated in the light of the evidence observed. Bayes rule states that the conditional probability of an event A, given the occurrence of another event B, is equal to the product of the likelihood of B given A and the probability of A, divided by the probability of B. It is given as:
\(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\)
Here, P(A) = how likely A happens (Prior) - the probability that the hypothesis is true before any evidence is observed.
P(B) = how likely B happens (Evidence/Marginalization) - the probability of observing the evidence.
P(A|B) = how likely A happens given that B has happened (Posterior) - the probability that the hypothesis is true given the evidence.
P(B|A) = how likely B happens given that A has happened (Likelihood) - the probability of seeing the evidence if the hypothesis is true.
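As a quick numerical illustration of this formula, here is a minimal Python sketch; the prior, likelihood, and evidence values in it are made up purely for demonstration.

```python
# A minimal sketch of Bayes rule: P(A|B) = P(B|A) * P(A) / P(B).
# The numbers below are illustrative, not taken from the article.

def bayes_rule(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """Return the posterior P(A|B) from the likelihood, prior, and evidence."""
    if p_b == 0:
        raise ValueError("P(B) must be non-zero")
    return p_b_given_a * p_a / p_b

# Example: prior P(A) = 0.01, likelihood P(B|A) = 0.8, evidence P(B) = 0.096
posterior = bayes_rule(p_b_given_a=0.8, p_a=0.01, p_b=0.096)
print(round(posterior, 4))  # 0.0833
```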
Bayes Theorem - Statement
The statement of Bayes Theorem is as follows: Let \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) be a set of events associated with a sample space S, where all events \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) have non-zero probability of occurrence and they form a partition of S. Let A be any event which occurs together with \(E_{1}\) or \(E_{2}\) or \(E_{3}\) ... or \(E_{n}\). Then, according to Bayes Theorem,
\(P(E_{i} | A) = \dfrac{P(E_{i})P(A|E_{i})}{\sum_{k=1}^{n}P(E_{k})P(A|E_{k})} , i=1,2,3,...,n\)
- Here E\(_i\) ∩ E\(_j\) = φ, where i ≠ j, i.e., the events are mutually exclusive.
- The union of all the events of the partition should give the sample space.
- 0 ≤ P(E\(_{i}\)) ≤ 1
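To see how the statement works in practice, here is a minimal Python sketch: given the priors P(E\(_i\)) and the likelihoods P(A|E\(_i\)), it returns the posterior P(E\(_i\)|A) for every event of the partition. The two-event partition and its numbers are made up for illustration.

```python
# Bayes theorem over a partition E_1, ..., E_n of the sample space:
# posterior[i] = P(E_i) P(A|E_i) / sum_k P(E_k) P(A|E_k)

def bayes_partition(priors, likelihoods):
    """priors[i] = P(E_i), likelihoods[i] = P(A|E_i); returns P(E_i|A) for all i."""
    evidence = sum(p * l for p, l in zip(priors, likelihoods))  # total probability P(A)
    return [p * l / evidence for p, l in zip(priors, likelihoods)]

# Arbitrary two-event partition, chosen only to demonstrate the formula:
print(bayes_partition(priors=[0.5, 0.5], likelihoods=[0.7, 0.3]))  # [0.7, 0.3]
```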
Bayes Theorem Proof
To prove the Bayes Theorem, we will use the total probability and conditional probability formulas.
- The total probability of an event A is used when we do not have enough direct information about A; we then use other events related to A to determine its probability.
- Conditional probability is the probability of event A given that other related events have already occurred.
Let \(E_{i}\), i = 1, 2, 3, ..., n, be a partition of the sample space S, and let A be an event that has occurred. Let us express A in terms of the \(E_{i}\).
A = A ∩ S
= A ∩ (\(E_{1} \cup E_{2} \cup E_{3} \cup ... \cup E_{n}\))
A = (A ∩ \(E_{1}\)) ∪ (A ∩ \(E_{2}\)) ∪ (A ∩ \(E_{3}\)) ∪ ... ∪ (A ∩ \(E_{n}\))
P(A) = P[(A ∩ \(E_{1}\)) ∪ (A ∩ \(E_{2}\)) ∪ (A ∩ \(E_{3}\)) ∪ ... ∪ (A ∩ \(E_{n}\))]
We know that when A and B are disjoint sets, then P(A∪B) = P(A) + P(B)
Thus here, P(A) = P(A ∩ \(E_{1}\)) + P(A ∩ \(E_{2}\)) + P(A ∩ \(E_{3}\)) + ... + P(A ∩ \(E_{n}\))
According to the multiplication theorem for dependent events, we have
P(A) = P(\(E_{1}\)) P(A|\(E_{1}\)) + P(\(E_{2}\)) P(A|\(E_{2}\)) + P(\(E_{3}\)) P(A|\(E_{3}\)) + ... + P(\(E_{n}\)) P(A|\(E_{n}\))
Thus, the total probability is P(A) = \(\sum_{i=1}^{n}P(E_{i})P(A|E_{i})\) --- (1)
Recalling the conditional probability, we get
\(P(E_{i}|A) = \dfrac{P(E_{i}\cap A)}{P(A)} , i=1,2,3,...,n\) ---(2)
Using the formula for conditional probability of \(P(A|E_{i})\), we have
\(P(E_{i}\cap A) = P(A|E_{i}) P(E_{i})\) --- (3)
Substituting equations (1) and (3) in equation (2), we get
\(P(E_{i}|A) = \dfrac{P(A|E_{i}) P(E_{i})}{\sum_{k=1}^{n}P(E_{k})P(A|E_{k})}, i=1,2,3,...,n\)
Hence, Bayes rule is proved.
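As a sanity check on the steps (1)-(3), the short Python sketch below works through a concrete toy example of our own (not part of the proof): a fair die with the partition E1 = {1, 2}, E2 = {3, 4}, E3 = {5, 6} and A = the event that an even number is rolled. It confirms that the total-probability and Bayes expressions agree with direct counting.

```python
from fractions import Fraction

# Toy check of the proof on a fair die (illustrative example only).
outcomes = {1, 2, 3, 4, 5, 6}
partition = [{1, 2}, {3, 4}, {5, 6}]      # E_1, E_2, E_3: a partition of S
A = {2, 4, 6}                             # the event "an even number is rolled"

def prob(event):
    """Probability of an event under the uniform distribution on the die."""
    return Fraction(len(event), len(outcomes))

# Equation (1): total probability  P(A) = sum_i P(E_i) P(A|E_i)
total = sum(prob(E) * (prob(E & A) / prob(E)) for E in partition)
assert total == prob(A)                   # both equal 1/2

# Equations (2) and (3): Bayes posterior vs. direct conditional probability
for E in partition:
    direct = prob(E & A) / prob(A)                       # P(E_i ∩ A) / P(A)
    bayes = prob(E) * (prob(E & A) / prob(E)) / total    # P(E_i) P(A|E_i) / P(A)
    assert direct == bayes == Fraction(1, 3)
print("Bayes theorem verified on the die example")
```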
Bayes Theorem Formula
The Bayes formula exists for both events and random variables, and in each case it is derived from the definition of conditional probability: it can be stated for events A and B, as well as for continuous random variables X and Y. Let us first see the formula for events.
Bayes Theorem Formula for Events
The formula for events derived from the definition of conditional probability is:
\(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}, P(B) \neq 0\)
Derivation:
According to the definition of conditional probability, \(P(A|B) = \dfrac{P(A \cap B)}{P(B)}, P(B) \neq 0\) and we know that \(P(A \cap B) = P(B \cap A) = P(B|A)P(A)\), which implies,
\(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\)
Hence, the Bayes theorem formula for events is derived.
Bayes Theorem for Continuous Random Variables
The formula for continuous random variables X and Y derived from the definition of the conditional probability of continuous variables is:
\(f_{X|Y=y}(x) = \dfrac{f_{Y|X=x}(y)f_{X}(x)}{f_{Y}(y)}\)
Derivation:
According to the definition of conditional density or conditional probability of continuous random variables, we know that \(f_{X|Y=y}(x)=\dfrac{f_{X,Y}(x,y)}{f_{Y}(y)}\) and \(f_{Y|X=x}(y)=\dfrac{f_{X,Y}(x,y)}{f_{X}(x)}\), which implies,
\(f_{X|Y=y}(x) = \dfrac{f_{Y|X=x}(y)f_{X}(x)}{f_{Y}(y)}\)
Hence, the Bayes Theorem formula for random continuous variables is derived.
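The continuous form can also be checked numerically. The sketch below assumes, purely for illustration, a standard normal prior for X and a normal likelihood for Y given X = x with standard deviation 0.5; it approximates the marginal density \(f_{Y}(y)\) on a grid and evaluates the posterior density \(f_{X|Y=y}(x)\).

```python
import numpy as np

# Illustrative continuous Bayes: prior X ~ N(0, 1), likelihood Y | X = x ~ N(x, 0.5^2).
# These distributions are assumptions chosen only to demonstrate the formula.

def normal_pdf(z, mean, std):
    return np.exp(-0.5 * ((z - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

x = np.linspace(-5.0, 5.0, 2001)                  # grid over the support of X
dx = x[1] - x[0]
prior = normal_pdf(x, mean=0.0, std=1.0)          # f_X(x)
y_obs = 1.2                                       # the observed value of Y
likelihood = normal_pdf(y_obs, mean=x, std=0.5)   # f_{Y|X=x}(y_obs) as a function of x

# Evidence f_Y(y) = integral of f_{Y|X=x}(y) f_X(x) dx, approximated by a Riemann sum
evidence = np.sum(likelihood * prior) * dx

posterior = likelihood * prior / evidence         # f_{X|Y=y}(x) on the grid
print("posterior integrates to", round(float(np.sum(posterior) * dx), 4))  # ≈ 1.0
```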
Difference Between Conditional Probability and Bayes Rule
Conditional Probability | Bayes Theorem |
---|---|
Conditional Probability is the probability of an event A that is based on the occurrence of another event B. | Bayes theorem is derived using the definition of conditional probability. The Bayes theorem formula includes two conditional probabilities. |
Formula: \(P(A|B) = \dfrac{P(A \cap B)}{P(B)}\) | Formula: \(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\) |
This formula gives the probability of A given B. | This formula gives the probability of A given B when the probability of B given A is known. |
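Both formulas give the same number; they differ only in which quantities you start from. The sketch below uses an arbitrary, made-up joint distribution for two events A and B and computes P(A|B) both ways.

```python
# Arbitrary probabilities for two events A and B (illustrative numbers only).
p_a_and_b = 0.12          # P(A ∩ B)
p_a = 0.30                # P(A)
p_b = 0.40                # P(B)

# Conditional probability definition: P(A|B) = P(A ∩ B) / P(B)
direct = p_a_and_b / p_b

# Bayes rule: P(A|B) = P(B|A) P(A) / P(B), with P(B|A) = P(A ∩ B) / P(A)
p_b_given_a = p_a_and_b / p_a
via_bayes = p_b_given_a * p_a / p_b

print(direct, via_bayes)  # both 0.3
```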
Terms Related to Bayes Theorem
As we have studied about Bayes theorem in detail, let us understand the meanings of a few terms related to the concept which have been used in the Bayes theorem formula and derivation:
- Conditional Probability - Conditional Probability is the probability of an event A based on the occurrence of another event B. It is denoted by P(A|B) and represents the probability of A given that event B has already happened.
- Joint Probability - Joint probability measures the probability of two or more events occurring together at the same time. For two events A and B, it is denoted by \(P(A \cap B)\).
- Random Variables - Random variable is a real-valued variable whose possible values are determined by a random experiment. The probability of such variables is also called the experimental probability.
- Posterior Probability - Posterior probability is the probability of an event that is calculated after all the information related to the event has been accounted for. It is the conditional probability of the event given the observed evidence.
- Prior Probability - Prior probability is the probability of an event that is calculated before considering the new information obtained. It is the probability of an outcome that is determined based on current knowledge before the experiment is performed.
Important Notes on Bayes Law:
- Bayes theorem is used to determine conditional probability.
- When two events A and B are independent, P(A|B) = P(A) and P(B|A) = P(B)
- Conditional probability can be calculated using the Bayes theorem for continuous random variables.
Bayes Theorem Examples
Example 1: Amy has two bags. Bag I has 7 red and 4 blue balls, and bag II has 5 red and 9 blue balls. Amy draws a ball at random, and it turns out to be red. Determine the probability that the ball was drawn from bag I.
Solution: Let A be the event of drawing a red ball, and let X and Y be the events that the ball is drawn from bag I and bag II, respectively. Since each bag is equally likely to be chosen,
P(X) = P(Y) = 1/2
Since there are 7 red balls out of a total of 11 balls in bag I, P(drawing a red ball from bag I) = P(A|X) = 7/11
Similarly, P(drawing a red ball from bag II) = P(A|Y) = 5/14
We need to determine P(the ball drawn is from bag I given that it is a red ball), that is, P(X|A). Using Bayes theorem, we have the following:
\(P(X|A) = \dfrac{P(A|X)P(X)}{P(A|X)P(X)+P(A|Y)P(Y)}\)
= [(7/11)(1/2)] / [(7/11)(1/2) + (5/14)(1/2)]
= 0.64
Answer: ∴ The probability that the ball drawn is from bag I is 0.64
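The arithmetic in Example 1 can be reproduced with a few lines of Python using exact fractions; the probabilities are exactly those stated in the solution.

```python
from fractions import Fraction

# Example 1: P(X|A) = P(A|X)P(X) / (P(A|X)P(X) + P(A|Y)P(Y))
p_x, p_y = Fraction(1, 2), Fraction(1, 2)          # each bag equally likely
p_a_given_x = Fraction(7, 11)                      # red ball from bag I
p_a_given_y = Fraction(5, 14)                      # red ball from bag II

posterior = (p_a_given_x * p_x) / (p_a_given_x * p_x + p_a_given_y * p_y)
print(posterior, "≈", round(float(posterior), 2))  # 98/153 ≈ 0.64
```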
Example 2: Assume that the chance of a person having a skin disease is 40%. Using skin creams and drinking enough water reduces the risk of the disease by 30%, while a certain prescription drug reduces it by 20%. At a time, a patient can choose only one of the two options, with equal probabilities. It is given that, after picking one of the options, a patient selected at random has the skin disease. Using the Bayes theorem, find the probability that the patient picked the option of skin creams and drinking enough water.
Solution: Assume E1: The patient uses skin creams and drinks enough water; E2: The patient uses the drug; A: The selected patient has the skin disease
P(E1) = P(E2) = 1/2
Using the probabilities known to us, we have
P(A|E1) = 0.4 × (1-0.3) = 0.28
P(A|E2) = 0.4 × (1-0.2) = 0.32
Using Bayes rule, the probability that the selected patient uses skin creams and drinks enough water is given by,
\(P(E1|A) = \dfrac{P(A|E1)P(E1)}{P(A|E1)P(E1)+P(A|E2)P(E2)}\)
= (0.28 × 0.5)/(0.28 × 0.5 + 0.32 × 0.5)
= 0.14/(0.14 + 0.16)
= 0.47
Answer: ∴ The probability that the patient picked the first option is 0.47
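Likewise, the computation in Example 2 can be verified directly; the values are those stated in the solution.

```python
# Example 2: posterior probability that the patient chose creams and water
p_e1 = p_e2 = 0.5                 # each option chosen with equal probability
p_a_given_e1 = 0.4 * (1 - 0.3)    # 0.28: disease risk after creams and water
p_a_given_e2 = 0.4 * (1 - 0.2)    # 0.32: disease risk after the drug

posterior = (p_a_given_e1 * p_e1) / (p_a_given_e1 * p_e1 + p_a_given_e2 * p_e2)
print(round(posterior, 2))        # 0.47
```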
Example 3: A man is known to speak the truth 3/4 of the time. He draws a card and reports that it is a king. Find the probability that it is actually a king.
Solution:
Let E be the event that the man reports that a king is drawn from the pack of cards,
A be the event that a king is drawn, and
B be the event that a king is not drawn.
Then we have P(A) = probability that a king is drawn = 1/4
P(B) = probability that a king is not drawn = 3/4
P(E|A) = probability that the man truthfully says a king is drawn when a king is actually drawn = P(truth) = 3/4
P(E|B) = probability that the man falsely says a king is drawn when a king is actually not drawn = P(lie) = 1/4
Then, according to the Bayes formula, the probability that it is actually a king = P(A|E)
=\(\dfrac{P(A)P(E|A)}{P(A)P(E|A)+P(B)P(E|B)}\)
= [1/4 × 3/4] ÷ [(1/4 × 3/4) + (3/4 × 1/4)]
= 3/16 ÷ 6/16
= 3/16 × 16/6
= 1/2 = 0.5
Answer: ∴ The probability that the drawn card is actually a king = 0.5
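The same kind of check works for Example 3, using the probabilities exactly as stated in the solution above.

```python
from fractions import Fraction

# Example 3: P(A|E) = P(A)P(E|A) / (P(A)P(E|A) + P(B)P(E|B))
p_a = Fraction(1, 4)              # a king is drawn
p_b = Fraction(3, 4)              # a king is not drawn
p_e_given_a = Fraction(3, 4)      # he tells the truth
p_e_given_b = Fraction(1, 4)      # he lies

posterior = (p_a * p_e_given_a) / (p_a * p_e_given_a + p_b * p_e_given_b)
print(posterior)                  # 1/2
```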
FAQs on Bayes Theorem
State Bayes Theorem Probability.
Bayes theorem is a statistical formula to determine the conditional probability of an event. It describes the probability of an event based on prior knowledge of events that have already happened. Bayes rule is named after the Reverend Thomas Bayes and Bayesian probability formula for random events is \(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\), where
- P(A) = how likely A happens
- P(B) = how likely B happens
- P(A|B) = how likely A is to happen given that B has happened
- P(B|A) = how likely B is to happen given that A has happened
What Does the Bayes Theorem State?
Let \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) be a set of events associated with a sample space S, where all events \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) are mutually exclusive and exhaustive events of the sample space S. Let A be an event related to S, then according to Bayesian probability, \(P(E_{i} | A) = \dfrac{P(E_{i})P(A|E_{i})}{\sum_{k=1}^{n}P(E_{k})P(A|E_{k})} , i=1,2,3,...,n\).
Is Conditional Probability the Same as Bayes Theorem?
Conditional probability is the probability of the occurrence of an event based on the occurrence of other events, whereas the Bayes theorem is derived from the definition of conditional probability. The Bayes law formula involves two conditional probabilities.
How to Use Bayes Theorem?
To determine the probability of an event A given that the related event B has already occurred, that is, P(A|B) using the Bayes Theorem, we calculate the probability of the event B, that is, P(B); the probability of event B given that event A has occurred, that is, P(B|A); and the probability of the event A individually, that is, P(A). Then, we substitute these values into the Bayes formula \(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\) to determine the probability.
Is Bayes Rule for Independent Events?
If two events A and B are independent, then P(A|B) = P(A) and P(B|A) = P(B). In this case, applying Bayes theorem gives no new information, since the conditional probabilities reduce to the unconditional probabilities of the events.
What is the Bayes Theorem in Machine Learning?
Bayes theorem provides a method to determine the probability of a hypothesis based on its prior probability, the probabilities of observing various data given the hypothesis, and the observed data itself. Hence, in machine learning, the Bayes rule is used whenever a problem requires the conditional probability of a hypothesis given observed data.