Define random variable with example. State the conditions for probability mass function (p.m.f.).
Definition: A random variable is a real-valued function defined on the sample space of a random experiment. It assigns a unique real number to every outcome in the sample space.
Example: In tossing two fair coins, if X denotes the number of heads, then X can take values 0, 1, or 2, with P(X=0) = 1/4, P(X=1) = 1/2, P(X=2) = 1/4.
Conditions for p.m.f.: For a discrete random variable X with possible values x1, x2, ..., the function p(xi) is a p.m.f. if:
(i) p(xi) >= 0 for all i, and
(ii) Sum[p(xi)] over all i = 1.
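The two p.m.f. conditions can be checked directly for the two-coin example above (a minimal sketch; the dictionary of probabilities is written out by hand):

```python
# p.m.f. of X = number of heads in two fair coin tosses:
# p(0) = 1/4, p(1) = 1/2, p(2) = 1/4.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

# Condition (i): every probability is non-negative.
assert all(p >= 0 for p in pmf.values())

# Condition (ii): the probabilities sum to 1.
print(sum(pmf.values()))  # 1.0
```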
Define cumulative distribution function (c.d.f.) of a random variable for both discrete and continuous cases.
The Cumulative Distribution Function F(x) is defined as the probability that the random variable X takes a value less than or equal to x: F(x) = P(X <= x).
Discrete case: F(x) = Sum[p(xi)] over all xi <= x.
Continuous case: F(x) = Integral[f(t) dt] from -infinity to x, so that f(x) = dF(x)/dx wherever the derivative exists.
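For the discrete case, the c.d.f. is just a running sum of the p.m.f. A small sketch using the two-coin p.m.f. (values hard-coded for illustration):

```python
# p.m.f. of X = number of heads in two fair coin tosses.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

def cdf(x):
    # F(x) = Sum of p(xi) over all xi <= x
    return sum(p for xi, p in pmf.items() if xi <= x)

print(cdf(-1), cdf(0), cdf(1), cdf(2))  # 0 0.25 0.75 1.0
```

Note that F is a step function here: it is 0 below the smallest value and reaches 1 at the largest.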
Define conditional probability density function and joint distribution function of a two-dimensional random variable.
Joint Distribution Function: F(x, y) = P(X <= x, Y <= y). It represents the probability that X is less than or equal to x AND Y is less than or equal to y simultaneously.
Conditional p.d.f.: The conditional p.d.f. of X given Y=y is defined as:
f(x|y) = f(x, y) / f2(y), provided f2(y) > 0
Where f(x,y) is the joint p.d.f. and f2(y) is the marginal p.d.f. of Y.
The joint p.d.f. of X and Y is f(x,y) = 4xy exp(-(x^2 + y^2)); x >= 0, y >= 0. Find marginal p.d.f. of X and Y. Are X and Y independent? Find conditional distributions.
Step 1: Marginal p.d.f. of X
Integrate f(x,y) with respect to y from 0 to infinity:
f1(x) = Integral[4xy exp(-x^2) exp(-y^2) dy]
f1(x) = 4x exp(-x^2) * Integral[y exp(-y^2) dy]
Let y^2 = t, 2y dy = dt. Integral becomes 1/2 * Integral[exp(-t) dt] = 1/2.
f1(x) = 2x exp(-x^2); x >= 0
Step 2: Marginal p.d.f. of Y
By symmetry, f2(y) is calculated similarly:
f2(y) = 2y exp(-y^2); y >= 0
Step 3: Independence
X and Y are independent if f(x,y) = f1(x) * f2(y).
(2x exp(-x^2)) * (2y exp(-y^2)) = 4xy exp(-(x^2+y^2)) = f(x,y).
Conclusion: X and Y are independent.
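The marginal and independence results above can be verified numerically (a sanity-check sketch, not part of the derivation; the cutoff 10 for the improper integral and the test point (0.7, 1.3) are arbitrary choices):

```python
import math

def f_joint(x, y):
    # Joint p.d.f.: f(x,y) = 4xy exp(-(x^2 + y^2)), x >= 0, y >= 0.
    return 4 * x * y * math.exp(-(x**2 + y**2))

def f1(x):
    # Marginal p.d.f. of X (and, by symmetry, of Y): 2x exp(-x^2).
    return 2 * x * math.exp(-x**2)

# Riemann sum of f1 over [0, 10]; the tail beyond 10 is negligible.
dx = 1e-4
total = sum(f1(i * dx) * dx for i in range(int(10 / dx)))
print(abs(total - 1.0) < 1e-4)  # True: the marginal integrates to 1

# Independence: the joint density factorizes pointwise.
x, y = 0.7, 1.3
print(abs(f_joint(x, y) - f1(x) * f1(y)) < 1e-12)  # True
```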
Step 4: Conditional Distributions
Since X and Y are independent, each conditional p.d.f. equals the corresponding marginal:
f(x|y) = f(x,y) / f2(y) = 2x exp(-x^2); x >= 0
f(y|x) = f(x,y) / f1(x) = 2y exp(-y^2); y >= 0
Given c.d.f. F(x), find c.d.f. and p.d.f. of (1) Y=X+a and (2) Y=X^2.
(1) Y = X + a:
Fy(y) = P(Y <= y) = P(X + a <= y) = P(X <= y - a) = Fx(y - a).
p.d.f. gy(y) = d/dy [Fx(y - a)] = fx(y - a).
(2) Y = X^2:
Fy(y) = P(X^2 <= y) = P(-sqrt(y) <= X <= sqrt(y)) = Fx(sqrt(y)) - Fx(-sqrt(y)) for y >= 0; Fy(y) = 0 for y < 0.
p.d.f. gy(y) = d/dy [Fx(sqrt(y)) - Fx(-sqrt(y))] = [fx(sqrt(y)) + fx(-sqrt(y))] / [2 * sqrt(y)].
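The formula for the c.d.f. of Y = X^2 can be checked by simulation. A sketch assuming, purely for illustration, that X ~ N(0,1) (so Fx is the standard normal c.d.f., expressible via the error function); the point y = 1.5 and sample size are arbitrary:

```python
import math
import random

def Phi(x):
    # Standard normal c.d.f. via the error function.
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

random.seed(42)
y = 1.5
samples = [random.gauss(0, 1) ** 2 for _ in range(200_000)]

# Empirical P(X^2 <= y) versus Fx(sqrt(y)) - Fx(-sqrt(y)).
empirical = sum(s <= y for s in samples) / len(samples)
theoretical = Phi(math.sqrt(y)) - Phi(-math.sqrt(y))
print(abs(empirical - theoretical) < 0.01)  # True
```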
Define mathematical expectation. Prove E[X - E(X)] = 0.
Definition: Expectation E(X) is the weighted average of all possible values of X, where weights are probabilities. E(X) = Sum[x * p(x)] for discrete or Integral[x * f(x) dx] for continuous.
Proof:
E[X - E(X)] = E(X) - E[E(X)]
Since E(X) is a constant (say k), E(k) = k.
E[X - E(X)] = E(X) - E(X) = 0.
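The result E[X - E(X)] = 0 can be illustrated with the two-coin p.m.f. (a minimal numerical sketch):

```python
# p.m.f. of X = number of heads in two fair coin tosses.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

EX = sum(x * p for x, p in pmf.items())          # E(X)
dev = sum((x - EX) * p for x, p in pmf.items())  # E[X - E(X)]
print(EX, dev)  # 1.0 0.0
```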
Define Variance. Prove V(aX + bY) = a^2 V(X) + b^2 V(Y) + 2ab Cov(X,Y).
Definition: V(X) = E[(X - E(X))^2] = E(X^2) - [E(X)]^2.
Proof:
Let E(X) = ux and E(Y) = uy.
E(aX + bY) = a*ux + b*uy.
V(aX + bY) = E[((aX + bY) - (a*ux + b*uy))^2]
= E[(a(X-ux) + b(Y-uy))^2]
= E[a^2(X-ux)^2 + b^2(Y-uy)^2 + 2ab(X-ux)(Y-uy)]
= a^2 E(X-ux)^2 + b^2 E(Y-uy)^2 + 2ab E[(X-ux)(Y-uy)]
V(aX + bY) = a^2 V(X) + b^2 V(Y) + 2ab Cov(X,Y)
If independent: Cov(X,Y) = 0, so V(aX + bY) = a^2 V(X) + b^2 V(Y).
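Because the same algebra holds for sample moments, the identity can be verified exactly (up to floating-point rounding) on simulated data. A sketch using correlated variables X and Y = X + Z, with Z independent noise; the constants a, b and the sample size are arbitrary:

```python
import random

random.seed(0)
n = 100_000
a, b = 2.0, 3.0
X = [random.gauss(0, 1) for _ in range(n)]
Z = [random.gauss(0, 1) for _ in range(n)]
Y = [x + z for x, z in zip(X, Z)]  # Y is correlated with X

def var(v):
    m = sum(v) / len(v)
    return sum((t - m) ** 2 for t in v) / len(v)

def cov(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((s - mu) * (t - mv) for s, t in zip(u, v)) / len(u)

# V(aX + bY) versus a^2 V(X) + b^2 V(Y) + 2ab Cov(X,Y).
lhs = var([a * x + b * y for x, y in zip(X, Y)])
rhs = a**2 * var(X) + b**2 * var(Y) + 2 * a * b * cov(X, Y)
print(abs(lhs - rhs) < 1e-6)  # True
```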
Derive Poisson distribution from Binomial distribution.
Poisson is a limiting case of the Binomial distribution under the conditions:
(i) n -> infinity, (ii) p -> 0, (iii) np = lambda remains finite and constant.
P(X=x) = nCx * p^x * q^(n-x). Substituting p = lambda/n gives
P(X=x) = nCx * (lambda/n)^x * (1 - lambda/n)^(n-x); letting n -> infinity,
P(X=x) = [exp(-lambda) * lambda^x] / x!
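The limit can be illustrated numerically: with p = lambda/n, the Binomial p.m.f. approaches the Poisson p.m.f. as n grows (the values lambda = 2 and x = 3 are arbitrary choices):

```python
import math

lam, x = 2.0, 3

def binom_pmf(n):
    # Binomial(n, lambda/n) p.m.f. evaluated at x.
    p = lam / n
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

# Poisson(lambda) p.m.f. at x.
poisson = math.exp(-lam) * lam**x / math.factorial(x)

# The gap shrinks as n increases.
for n in (10, 100, 10_000):
    print(n, round(abs(binom_pmf(n) - poisson), 6))
```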
Obtain m.g.f. of Binomial distribution and find mean/variance.
Mx(t) = E[exp(tX)] = Sum[exp(tx) * nCx * p^x * q^(n-x)]
Mx(t) = Sum[nCx * (p*exp(t))^x * q^(n-x)]
Mx(t) = (q + p*exp(t))^n
Mean: M'x(t) = n(q + p*exp(t))^(n-1) * p*exp(t); at t = 0, M'x(0) = n(q + p)^(n-1) * p = np (since q + p = 1).
Variance: E(X^2) = M''x(0) = n(n-1)p^2 + np, so V(X) = E(X^2) - [E(X)]^2 = n(n-1)p^2 + np - (np)^2 = np(1 - p) = npq.
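The results E(X) = np and V(X) = npq can be checked by computing the moments directly from the Binomial p.m.f. (a sketch; n = 12, p = 0.3 are arbitrary):

```python
import math

n, p = 12, 0.3
q = 1 - p

# Binomial p.m.f. over x = 0, 1, ..., n.
pmf = [math.comb(n, x) * p**x * q ** (n - x) for x in range(n + 1)]

mean = sum(x * px for x, px in enumerate(pmf))
var = sum(x**2 * px for x, px in enumerate(pmf)) - mean**2

print(abs(mean - n * p) < 1e-10, abs(var - n * p * q) < 1e-10)  # True True
```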
Obtain the median of normal distribution.
For a Normal distribution N(u, sigma^2), the p.d.f. is symmetric about x = u.
Hence P(X <= u) = P(X >= u) = 1/2, i.e. F(u) = 1/2, so the median of the normal distribution is u (equal to the mean).
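The condition F(u) = 1/2 can be confirmed using the closed-form normal c.d.f. in terms of the error function (a sketch; the values u = 5, sigma = 2 are arbitrary):

```python
import math

u, sigma = 5.0, 2.0

def F(x):
    # C.d.f. of N(u, sigma^2) via the error function.
    return 0.5 * (1 + math.erf((x - u) / (sigma * math.sqrt(2))))

print(F(u))  # 0.5, so u is the median
```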