Algebra - Revision Notes on Probability
The sum of all the probabilities in the sample space is 1.
The probability of an event which cannot occur is 0.
The probability of any event which is not in the sample space is zero.
The probability of an event which must occur is 1.
The probability of the sample space is 1.
The probability of an event not occurring is one minus the probability of it occurring.
The complement of an event E is denoted by E', and its probability is given by P(E') = 1 - P(E).
P (A∪B) is written as P (A + B) and P (A ∩ B) is written as P (AB).
If A and B are mutually exclusive events, P(A or B) = P (A) + P (B)
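A minimal Python sketch of the complement rule and the addition rule for mutually exclusive events, using a fair die as an assumed example (the event names are illustrative only):

```python
from fractions import Fraction

# Sample space for one roll of a fair die; each outcome has probability 1/6.
sample_space = {1, 2, 3, 4, 5, 6}
p = lambda event: Fraction(len(event), len(sample_space))

even = {2, 4, 6}          # event E
odd = {1, 3, 5}           # complement of E

# Complement rule: P(E') = 1 - P(E)
assert p(sample_space - even) == 1 - p(even)

# Addition rule for mutually exclusive events: even and odd cannot both occur.
assert even & odd == set()
assert p(even | odd) == p(even) + p(odd)

print(p(even), p(sample_space - even))   # 1/2 1/2
```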
When two events A and B are independent, i.e. when event A has no effect on the probability of event B, the conditional probability of event B given event A is simply the probability of event B, that is P(B|A) = P(B).
If events A and B are not independent, then the probability of the intersection of A and B (the probability that both events occur) is defined by P (A and B) = P (A) P (B|A).
A and B are independent if P(B|A) = P(B) and P(A|B) = P(A).
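A small sketch of the multiplication rule P(A and B) = P(A) P(B|A), using the (illustrative) example of drawing two cards from a standard deck without replacement:

```python
from fractions import Fraction
from itertools import permutations

# Ordered draws of two cards (without replacement) from a 52-card deck,
# tracking only whether each card is an ace.
deck = ["ace"] * 4 + ["other"] * 48
draws = list(permutations(range(52), 2))      # equally likely ordered pairs

def prob(pred):
    favourable = sum(1 for i, j in draws if pred(deck[i], deck[j]))
    return Fraction(favourable, len(draws))

p_first_ace = prob(lambda a, b: a == "ace")                   # P(A) = 4/52
p_both_aces = prob(lambda a, b: a == "ace" and b == "ace")    # P(A and B)

# P(B|A): probability the second card is an ace given the first was an ace.
p_second_given_first = Fraction(3, 51)

# Multiplication rule: P(A and B) = P(A) * P(B|A)
assert p_both_aces == p_first_ace * p_second_given_first
print(p_both_aces)   # 1/221
```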
If E1, E2, ..., En are n independent events, then P(E1 ∩ E2 ∩ ... ∩ En) = P(E1) P(E2) P(E3) ... P(En).
Events E1, E2, E3, ..., En will be pairwise independent if P(Ei ∩ Ej) = P(Ei) P(Ej) for all i ≠ j.
Bayes' theorem: if H1, H2, ..., Hn are mutually exclusive and exhaustive events and A is an event with P(A) > 0, then P(Hi | A) = P(A | Hi) P(Hi) / Σj P(A | Hj) P(Hj).
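A minimal sketch of this formula in Python; the priors and likelihoods below are made-up numbers for illustration:

```python
# Hypothetical setup: three urns (hypotheses H1, H2, H3) chosen with the
# given priors, and A = "a red ball is drawn" with the given likelihoods.
priors      = [0.5, 0.3, 0.2]        # P(Hi)
likelihoods = [0.1, 0.4, 0.7]        # P(A | Hi)

# Total probability of A: sum over j of P(A | Hj) P(Hj)
p_a = sum(l * p for l, p in zip(likelihoods, priors))

# Posterior for each hypothesis: P(Hi | A) = P(A | Hi) P(Hi) / P(A)
posteriors = [l * p / p_a for l, p in zip(likelihoods, priors)]

print(round(p_a, 3), [round(x, 3) for x in posteriors])
assert abs(sum(posteriors) - 1.0) < 1e-12   # posteriors sum to 1
```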
If A1, A2, ..., An are exhaustive events and S is the sample space, then A1 ∪ A2 ∪ A3 ∪ ... ∪ An = S.
If E1, E2, ..., En are mutually exclusive events, then P(E1 ∪ E2 ∪ ... ∪ En) = ΣP(Ei).
If the events are not mutually exclusive, then P(A or B) = P(A) + P(B) - P(A and B).
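A quick check of the general addition rule on a die roll; the two events are chosen only for illustration:

```python
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}
p = lambda event: Fraction(len(event), len(sample_space))

A = {2, 4, 6}        # "even"
B = {4, 5, 6}        # "greater than 3" -- A and B are NOT mutually exclusive

# General addition rule: P(A or B) = P(A) + P(B) - P(A and B)
assert p(A | B) == p(A) + p(B) - p(A & B)
print(p(A | B))      # 2/3
```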
Three events A, B and C are said to be mutually independent if P(A∩B) = P(A).P(B), P(B∩C) = P(B).P(C), P(A∩C) = P(A).P(C), P(A∩B∩C) = P(A).P(B).P(C)
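A quick verification of all four conditions for three tosses of a fair coin (the particular events are just an example):

```python
from fractions import Fraction
from itertools import product

# Sample space: all 8 equally likely outcomes of three fair coin tosses.
outcomes = list(product("HT", repeat=3))
p = lambda event: Fraction(len(event), len(outcomes))

A = {o for o in outcomes if o[0] == "H"}   # first toss is heads
B = {o for o in outcomes if o[1] == "H"}   # second toss is heads
C = {o for o in outcomes if o[2] == "H"}   # third toss is heads

# Pairwise product conditions plus the triple product condition
# together give mutual independence.
assert p(A & B) == p(A) * p(B)
assert p(B & C) == p(B) * p(C)
assert p(A & C) == p(A) * p(C)
assert p(A & B & C) == p(A) * p(B) * p(C)
print("A, B, C are mutually independent")
```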
The concept of mutually exclusive events is set theoretic in nature while the concept of independent events is probabilistic in nature.
If two events A and B are mutually exclusive, then
P(A ∩ B) = 0, but in general P(A) P(B) ≠ 0
⇒ P(A ∩ B) ≠ P(A) P(B)
⇒ Mutually exclusive events (with nonzero probabilities) cannot be independent.
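This can be checked directly on a die roll; the two events below are illustrative:

```python
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}
p = lambda event: Fraction(len(event), len(sample_space))

A = {1, 2}     # "roll at most 2"
B = {5, 6}     # "roll at least 5" -- A and B cannot occur together

assert A & B == set()                      # mutually exclusive: P(A ∩ B) = 0
assert p(A & B) != p(A) * p(B)             # 0 != 1/3 * 1/3, so not independent
print(p(A & B), p(A) * p(B))               # 0 1/9
```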
The probability distribution of a count variable X is said to be the binomial distribution with parameters n and p, abbreviated B(n, p), if it satisfies the following conditions:
The total number of observations (trials), n, is fixed.
The observations are independent.
Each observation represents either a success or a failure.
The probability of success, i.e. p, is the same for every observation.
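Under these conditions, P(X = r) = nCr p^r q^(n-r). A minimal sketch of this pmf in Python; the helper name binomial_pmf is ours, not a library routine:

```python
from math import comb

def binomial_pmf(r, n, p):
    """P(X = r) when X ~ B(n, p): nCr * p^r * (1-p)^(n-r)."""
    return comb(n, r) * p**r * (1 - p)**(n - r)

# Example: probability of exactly 3 heads in 5 tosses of a fair coin.
print(binomial_pmf(3, 5, 0.5))                       # 0.3125
# The pmf sums to 1 over all possible counts.
assert abs(sum(binomial_pmf(r, 5, 0.5) for r in range(6)) - 1.0) < 1e-12
```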
Some important facts related to binomial distribution:
(p + q)^n = nC0 p^n + nC1 p^(n-1) q + ... + nCr p^(n-r) q^r + ... + nCn q^n
The probability of getting at least k successes in n trials is
P(X ≥ k) = Σ (x = k to n) nCx p^x q^(n-x).
Summing over all x gives Σ (x = 0 to n) nCx p^x q^(n-x) = (q + p)^n = 1.
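A short sketch computing P(X ≥ k) by summing the binomial terms (the function name is illustrative):

```python
from math import comb

def prob_at_least(k, n, p):
    """P(X >= k) for X ~ B(n, p): sum of nCx p^x q^(n-x) for x = k..n."""
    q = 1 - p
    return sum(comb(n, x) * p**x * q**(n - x) for x in range(k, n + 1))

# Example: at least 8 successes in 10 trials with p = 0.5.
print(prob_at_least(8, 10, 0.5))          # 0.0546875
assert abs(prob_at_least(0, 10, 0.5) - 1.0) < 1e-12   # full sum is (q + p)^n = 1
```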
Mean of binomial distribution is np
Variance is npq
Standard deviation is given by (npq)^(1/2), where n is the number of trials, p is the probability of success and q = 1 - p is the probability of failure.
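These three formulas can be checked against the pmf directly; a small sketch with arbitrarily chosen n = 10, p = 0.3:

```python
from math import comb, sqrt, isclose

n, p = 10, 0.3
q = 1 - p
pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

mean = sum(x * pmf[x] for x in range(n + 1))                 # E[X]
var  = sum((x - mean) ** 2 * pmf[x] for x in range(n + 1))   # Var(X)

assert isclose(mean, n * p)            # np = 3.0
assert isclose(var, n * p * q)         # npq = 2.1
assert isclose(sqrt(var), sqrt(n * p * q))
print(mean, var, sqrt(var))
```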
Sum of binomials is also binomial i.e. if X ~ B(n, p) and Y ~ B(m, p) are independent binomial variables with the same probability p, then X + Y is again a binomial variable with distribution X + Y ~ B(n + m, p).
If X ~ B(n, p) and, conditional on X, Y ~ B(X, q), then Y is a simple binomial variable with distribution Y ~ B(n, pq).
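Both facts can be sanity-checked with a quick Monte Carlo simulation; the parameters and sample size below are arbitrary choices, and only the means are compared:

```python
import random
from statistics import mean

random.seed(0)
n, m, p, q = 8, 5, 0.4, 0.5
trials = 100_000

def binom_sample(k, prob):
    """One draw from B(k, prob) by summing k Bernoulli trials."""
    return sum(random.random() < prob for _ in range(k))

# X + Y with X ~ B(n, p), Y ~ B(m, p): sample mean should be near (n + m) * p.
sums = [binom_sample(n, p) + binom_sample(m, p) for _ in range(trials)]
print(mean(sums), (n + m) * p)           # ~5.2 vs 5.2

# Y | X ~ B(X, q) with X ~ B(n, p): sample mean should be near n * p * q.
thinned = [binom_sample(binom_sample(n, p), q) for _ in range(trials)]
print(mean(thinned), n * p * q)          # ~1.6 vs 1.6
```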
The Bernoulli distribution is a special case of the binomial distribution, where n = 1. Symbolically, X ~ B (1, p) has the same meaning as X ~ Bern (p).
If an experiment has only two possible outcomes, then it is said to be a Bernoulli trial. The two outcomes are success and failure.
Any binomial distribution, B (n, p), is the distribution of the sum of n independent Bernoulli trials Bern (p), each with the same probability p.
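This construction can be verified exactly by convolving n copies of the Bern(p) pmf and comparing with the B(n, p) pmf (a small sketch with arbitrarily chosen n and p):

```python
from math import comb, isclose

n, p = 6, 0.35
bern = [1 - p, p]                 # pmf of a single Bern(p) trial: P(0), P(1)

# Distribution of the sum of n independent Bern(p) trials, built by convolution.
dist = [1.0]                      # the sum of zero trials is 0 with probability 1
for _ in range(n):
    new = [0.0] * (len(dist) + 1)
    for k, prob in enumerate(dist):
        new[k]     += prob * bern[0]    # this trial is a failure
        new[k + 1] += prob * bern[1]    # this trial is a success
    dist = new

# Compare with the closed-form binomial pmf.
for k in range(n + 1):
    assert isclose(dist[k], comb(n, k) * p**k * (1 - p)**(n - k))
print(dist)
```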
The binomial distribution is a special case of the Poisson Binomial Distribution which is a sum of n independent non-identical Bernoulli trials Bern(pi). If X has the Poisson binomial distribution with p1 = … = pn = p then X ~ B(n, p).
A cumulative binomial probability refers to the probability that the binomial random variable falls within a specified range (e.g., is greater than or equal to a stated lower limit and less than or equal to a stated upper limit).
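A short sketch of such a cumulative probability P(a ≤ X ≤ b); the helper name is ours:

```python
from math import comb

def binom_range_prob(a, b, n, p):
    """P(a <= X <= b) for X ~ B(n, p), summing the pmf over the range."""
    q = 1 - p
    return sum(comb(n, x) * p**x * q**(n - x) for x in range(a, b + 1))

# Example: probability of between 4 and 6 heads (inclusive) in 10 fair tosses.
print(binom_range_prob(4, 6, 10, 0.5))     # 0.65625
```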