Definition: A random variable is said to be discrete if it takes only a finite or countably infinite number of values. Its values are usually isolated points on the number line.
Examples: 1. Number of heads in 10 tosses of a coin. 2. Number of students present in a class.
Property: For a distribution function F, P(a < X <= b) = F(b) - F(a).
Proof: By definition of the Distribution Function F(x), we have:
F(b) = P(X <= b) and F(a) = P(X <= a)
The event (X <= b) can be expressed as the union of two mutually exclusive events: (X <= a) and (a < X <= b).
Therefore, P(X <= b) = P(X <= a) + P(a < X <= b)
F(b) = F(a) + P(a < X <= b)
P(a < X <= b) = F(b) - F(a) [Hence Proved]
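The identity just proved can be checked numerically. A minimal Python sketch, using a fair six-sided die as an arbitrary example distribution (not from the notes):

```python
# Numeric check of P(a < X <= b) = F(b) - F(a) on a fair die (example pmf).
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair six-sided die

def F(t):
    """Distribution function F(t) = P(X <= t)."""
    return sum(p for x, p in pmf.items() if x <= t)

a, b = 2, 5
direct = sum(p for x, p in pmf.items() if a < x <= b)  # P(a < X <= b) directly
assert direct == F(b) - F(a)  # both sides equal 3/6
```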
Discrete Case: The conditional probability mass function of Y given X = x is defined as P(Y = y | X = x) = P(X = x, Y = y) / P(X = x), provided P(X = x) > 0.
Continuous Case: The conditional probability density function of Y given X = x is f(y | x) = f(x, y) / fX(x), provided fX(x) > 0.
We know that the Variance of a random variable X is always non-negative: Var(X) >= 0.
The formula for Variance is: Var(X) = E(X²) - [E(X)]²
Since Var(X) >= 0, it implies: E(X²) - [E(X)]² >= 0
E(X²) >= [E(X)]² [Hence Proved]
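The inequality can be confirmed on any concrete distribution. A quick sketch with a made-up three-point pmf:

```python
# Sketch: E(X^2) >= (E X)^2 for an arbitrary discrete distribution (made up).
values = [(-1, 0.2), (0, 0.5), (3, 0.3)]  # (value, probability) pairs
mean = sum(x * p for x, p in values)               # E(X)
second_moment = sum(x * x * p for x, p in values)  # E(X^2)
variance = second_moment - mean ** 2
assert variance >= 0  # hence E(X^2) >= [E(X)]^2
```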
For f(x) = 2x, 0 < x < 1: E(X) = Integral of (x * 2x) dx from 0 to 1 = [ (2x^3)/3 ] from 0 to 1 = 2/3.
P(-1/2 < X < 1/2): Since f(x) = 0 for x <= 0, the effective range is 0 < x < 1/2.
Integral of 2x dx from 0 to 1/2 = [ x² ] from 0 to 1/2 = (1/2)² - 0 = 1/4.
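Both results of this worked example can be verified by numerical integration; a midpoint-rule sketch:

```python
# Midpoint-rule check of the f(x) = 2x example on 0 < x < 1.
def integrate(g, lo, hi, n=100_000):
    w = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * w) for i in range(n)) * w

f = lambda x: 2 * x
mean = integrate(lambda x: x * f(x), 0.0, 1.0)  # expect ~ 2/3
prob = integrate(f, 0.0, 0.5)                   # expect ~ 1/4
```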
Definition: The mathematical expectation of a random variable is the weighted average of all possible values it can take, where weights are the probabilities.
Proof: E(aX + b) = Integral [ (ax + b) f(x) dx ]
= a * Integral [ x f(x) dx ] + b * Integral [ f(x) dx ]
Since Integral [ f(x) dx ] = 1 and Integral [ x f(x) dx ] = E(X),
E(aX + b) = aE(X) + b.
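The linearity property holds in the discrete case too; a sketch with an arbitrary made-up pmf and constants:

```python
# Sketch: E(aX + b) = a E(X) + b on a small made-up discrete distribution.
pmf = [(0, 0.1), (1, 0.4), (2, 0.5)]  # (value, probability) pairs
a, b = 3.0, -1.0
E = lambda g: sum(g(x) * p for x, p in pmf)  # expectation of g(X)
lhs = E(lambda x: a * x + b)
rhs = a * E(lambda x: x) + b
assert abs(lhs - rhs) < 1e-12
```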
MGF: M(t) = E(e^(tX)). For discrete: Sum [ e^(tx) * P(x) ]. For continuous: Integral [ e^(tx) * f(x) dx ].
Characteristic Function: Phi(t) = E(e^(itX)), where i is the imaginary unit. Unlike the MGF, it exists for every random variable, since |e^(itX)| = 1.
Let Z = X + Y, where X and Y are independent.
MZ(t) = E(e^(t(X+Y))) = E(e^(tX) * e^(tY))
Since X and Y are independent, E(g(X)h(Y)) = E(g(X))E(h(Y)).
MZ(t) = E(e^(tX)) * E(e^(tY)) = MX(t) * MY(t).
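This factorization can be checked numerically for any two independent discrete variables; the two pmfs below are arbitrary examples:

```python
# Numeric check that MZ(t) = MX(t) * MY(t) for independent X, Y (made-up pmfs).
from math import exp
from itertools import product

px = {0: 0.3, 1: 0.7}  # pmf of X (assumed example)
py = {0: 0.6, 2: 0.4}  # pmf of Y (assumed example)

def mgf(pmf, t):
    return sum(exp(t * x) * p for x, p in pmf.items())

t = 0.5
# MGF of Z = X + Y, using independence: P(X=x, Y=y) = px[x] * py[y]
mz = sum(exp(t * (x + y)) * px[x] * py[y] for x, y in product(px, py))
assert abs(mz - mgf(px, t) * mgf(py, t)) < 1e-12
```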
Mean np = 4, Variance npq = 2. Dividing the variance by the mean gives q = 2/4 = 0.5. Hence p = 1 - q = 0.5.
n(0.5) = 4 => n = 8. X ~ B(8, 0.5).
P(3 < X <= 5) = P(X=4) + P(X=5)
= [8C4 * (0.5)^8] + [8C5 * (0.5)^8] = (70 + 56) / 256 = 126 / 256 = 0.492.
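The arithmetic of this problem is easy to verify with `math.comb`; a sketch:

```python
# Verification of the B(8, 0.5) problem above.
from math import comb

n, p = 8, 0.5
pmf = lambda x: comb(n, x) * p**x * (1 - p)**(n - x)

mean = n * p                  # 4
var = n * p * (1 - p)         # 2
answer = pmf(4) + pmf(5)      # P(3 < X <= 5) = (70 + 56) / 256
```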
Hypergeometric Distribution: It is a discrete probability distribution that describes the probability of k successes in n draws, without replacement, from a finite population of size N that contains exactly M successes.
Definition: The Geometric Distribution represents the number of failures before the first success in a series of independent Bernoulli trials, each with success probability p.
P(X=x) = q^x * p ; x = 0, 1, 2, ..., where q = 1 - p.
Mean: E(X) = Sum [ x * q^x * p ] = p * q * d/dq [ Sum q^x ] = p * q * (1-q)^-2 = q/p.
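The series result E(X) = q/p can be checked by summing a long truncation of the series; a sketch with an arbitrary p:

```python
# Truncated-series check of E(X) = q/p for the geometric pmf above (p chosen arbitrarily).
p = 0.3
q = 1 - p
approx_mean = sum(x * q**x * p for x in range(1000))  # tail beyond x=999 is negligible
assert abs(approx_mean - q / p) < 1e-9
```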
p.d.f (Beta Distribution of the Second Kind): f(x) = [ x^(m-1) ] / [ B(m,n) * (1+x)^(m+n) ] ; x > 0.
Mean: E(X) = m / (n - 1) for n > 1.
f(x) = 1 / (b - a) for a <= x <= b. Else 0.
Mean: Integral of [ x / (b-a) ] dx from a to b = [ x² / (2(b-a)) ] from a to b = (b² - a²) / (2(b-a)) = (a + b) / 2.
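A quick midpoint-rule check of the uniform mean, with arbitrary endpoints:

```python
# Midpoint-rule check that the Uniform(a, b) mean is (a + b) / 2 (a, b chosen arbitrarily).
a, b = 2.0, 5.0
n = 100_000
w = (b - a) / n
mean = sum((a + (i + 0.5) * w) / (b - a) for i in range(n)) * w  # integral of x/(b-a)
assert abs(mean - (a + b) / 2) < 1e-9
```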
PDF: A function f(x) is a probability density function if f(x) >= 0 and its total integral over the real line is 1.
Given P(X < b) = P(X > b), each side must equal 1/2; hence 'b' is the Median.
Integral from 0 to b [ 6x - 6x² ] dx = 0.5
[ 3x² - 2x³ ] from 0 to b = 0.5 => 3b² - 2b³ = 0.5
By inspection, if b = 0.5: 3(0.25) - 2(0.125) = 0.75 - 0.25 = 0.5.
Value of b = 0.5.
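The root found by inspection can also be located numerically; a bisection sketch (valid because 3b² - 2b³ is increasing on (0, 1)):

```python
# Bisection check that b = 0.5 solves 3b^2 - 2b^3 = 0.5 on (0, 1).
g = lambda b: 3 * b**2 - 2 * b**3 - 0.5  # increasing on (0, 1): g' = 6b(1 - b) > 0

lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if g(mid) < 0:
        lo = mid
    else:
        hi = mid
b = (lo + hi) / 2
assert abs(b - 0.5) < 1e-6
```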
CDF: F(x) = P(X <= x). Limits: F(x) → 0 as x → -infinity and F(x) → 1 as x → +infinity.
Solve for K: Sum of P(x) = 1.
0 + k + 2k + 2k + 3k + k² + 2k² + 7k² + k = 1
10k² + 9k - 1 = 0 => (10k - 1)(k + 1) = 0. Since k > 0, k = 0.1.
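A quick arithmetic check that k = 0.1 satisfies the normalization condition:

```python
# Checking k = 0.1 against the pmf-sum condition above.
k = 0.1
coeffs_k = [0, 1, 2, 2, 3, 1]   # coefficients of the terms linear in k (total 9k)
coeffs_k2 = [1, 2, 7]           # coefficients of the k^2 terms (total 10k^2)
total = sum(c * k for c in coeffs_k) + sum(c * k * k for c in coeffs_k2)
assert abs(total - 1.0) < 1e-9  # probabilities sum to 1
```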
The PDF of the sum of two independent continuous random variables is the convolution of their individual PDFs.
This is derived from the joint PDF f(x,y) = fX(x)fY(y) by performing a change of variables (U=X+Y, V=X) and integrating out the nuisance variable V.
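As an illustration of the convolution formula, the sum of two independent Uniform(0, 1) variables has the well-known triangular density (f_Z(z) = z on [0, 1], 2 - z on [1, 2]); a numerical sketch:

```python
# Convolution f_Z(z) = integral of f_X(x) * f_Y(z - x) dx, for X, Y ~ Uniform(0, 1).
def f_uniform(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_sum(z, n=10_000):
    # Midpoint rule over the support [0, 1] of f_X
    w = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * w
        total += f_uniform(x) * f_uniform(z - x)
    return total * w

assert abs(f_sum(0.5) - 0.5) < 1e-3  # triangular density: f_Z(z) = z on [0, 1]
assert abs(f_sum(1.5) - 0.5) < 1e-3  # and f_Z(z) = 2 - z on [1, 2]
```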
X\Y        |  0  |  1  | Marginal X
0          | 2/9 | 1/9 | 3/9
1          | 1/9 | 5/9 | 6/9
Marginal Y | 3/9 | 6/9 | 1
Conditional Distribution X|Y=1: P(X=0|Y=1) = (1/9)/(6/9) = 1/6; P(X=1|Y=1) = (5/9)/(6/9) = 5/6.
Independence: P(X=0, Y=0) = 2/9. P(X=0)*P(Y=0) = (3/9)*(3/9) = 1/9. Since 2/9 != 1/9, X and Y are Not Independent.
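The marginal, conditional, and independence calculations for this table can be reproduced exactly with rational arithmetic; a sketch:

```python
# Recomputing the marginals, P(X | Y=1), and the independence check for the table above.
from fractions import Fraction as F

joint = {(0, 0): F(2, 9), (0, 1): F(1, 9),
         (1, 0): F(1, 9), (1, 1): F(5, 9)}

px = {x: sum(p for (a, b), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (a, b), p in joint.items() if b == y) for y in (0, 1)}

cond_x_given_y1 = {x: joint[(x, 1)] / py[1] for x in (0, 1)}

assert px == {0: F(3, 9), 1: F(6, 9)}
assert cond_x_given_y1 == {0: F(1, 6), 1: F(5, 6)}
assert joint[(0, 0)] != px[0] * py[0]  # 2/9 != 1/9, so X, Y not independent
```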
E(X+Y) = Double Integral [ (x+y) f(x,y) dx dy ]
= Integral [ x (Integral f(x,y) dy) dx ] + Integral [ y (Integral f(x,y) dx) dy ]
= Integral [ x fX(x) dx ] + Integral [ y fY(y) dy ] = E(X) + E(Y).
Based on the provided bivariate table in Source 19:
Marginal X probabilities: P(-1)=0.2, P(0)=0.6, P(1)=0.2.
E(X) = (-1)(0.2) + 0 + (1)(0.2) = 0.
V(X) = E(X²) - 0 = [(-1)²(0.2) + 0 + 1²(0.2)] = 0.4.
(Similar calculation for Y based on marginals).
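The arithmetic for E(X) and V(X) from the stated marginal can be checked directly:

```python
# Arithmetic check of E(X) = 0 and V(X) = 0.4 from the marginal of X above.
px = {-1: 0.2, 0: 0.6, 1: 0.2}
ex = sum(x * p for x, p in px.items())       # E(X)
ex2 = sum(x * x * p for x, p in px.items())  # E(X^2)
assert abs(ex) < 1e-12
assert abs(ex2 - ex**2 - 0.4) < 1e-12        # V(X) = 0.4
```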
Mean: E(X) = Sum [ x * nCx * p^x * q^(n-x) ] = np.
Variance: E(X²) - [E(X)]² = npq.
This represents the even-order moments about the mean. Using the MGF of Normal Distribution:
M(t) = exp(μt + 0.5σ²t²). For central moments, μ=0.
Expanding exp(0.5σ²t²) = Sum over k of [ σ^(2k) t^(2k) / (2^k * k!) ], the coefficient of t^(2n)/(2n)! gives μ_2n = (2n)! σ^(2n) / (2^n * n!) = 1 * 3 * 5 * ... * (2n - 1) * σ^(2n), the product of the odd integers up to 2n - 1 times σ^(2n); the odd-order central moments all vanish.
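The equality between the coefficient-extraction form and the odd-integer product can be verified for several n; a sketch:

```python
# Check that (2n)! / (2^n * n!) * sigma^(2n) equals 1*3*5*...*(2n-1) * sigma^(2n).
from math import factorial

def central_moment(n, sigma):
    # mu_{2n} from the MGF expansion: (2n)! / (2^n n!) * sigma^(2n)
    return factorial(2 * n) // (2**n * factorial(n)) * sigma**(2 * n)

def double_factorial_form(n, sigma):
    prod = 1
    for k in range(1, 2 * n, 2):  # odd integers 1, 3, ..., 2n-1
        prod *= k
    return prod * sigma**(2 * n)

for n in range(1, 6):
    assert central_moment(n, 2) == double_factorial_form(n, 2)
```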