Subject: Statistics
Paper Name/Code: Mathematical Statistics / STADSM-252
Semester: 4th Semester (FYUG)
Time: 3 Hours | Full Marks: 70
Pass Marks: 28
Define discrete and continuous random variables with illustration.
A discrete random variable is one that can take on only a finite or countably infinite number of distinct values.
Illustration: The number of heads obtained in three tosses of a coin (0, 1, 2, or 3).
A continuous random variable is one that can take any value within a specified range or interval.
Illustration: The height of students in a class or the lifetime of an electric bulb.
Define probability mass function (p.m.f.) of a random variable.
For a discrete random variable X, the probability mass function (p.m.f.) is a function P(X=x) that assigns a probability to each possible value x such that P(X=x) >= 0 and the sum of all P(X=x) equals 1.
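The two defining properties of a p.m.f. can be checked directly for the coin-toss illustration above; a minimal Python sketch (the dictionary layout is just one convenient representation):

```python
from math import comb

# P.m.f. of X = number of heads in three fair coin tosses:
# P(X = x) = C(3, x) * (1/2)^3 for x = 0, 1, 2, 3.
pmf = {x: comb(3, x) * (0.5 ** 3) for x in range(4)}

# Both defining properties of a p.m.f. hold:
assert all(p >= 0 for p in pmf.values())        # P(X = x) >= 0 for every x
assert abs(sum(pmf.values()) - 1.0) < 1e-12     # probabilities sum to 1
```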
State the properties of cumulative distribution function.
The c.d.f. F(x) = P(X <= x) satisfies: (i) 0 <= F(x) <= 1 for all x; (ii) F is non-decreasing; (iii) F(-infinity) = 0 and F(+infinity) = 1; (iv) F is right-continuous.
Define p.d.f. and cumulative distribution function of a random variable.
Probability Density Function (p.d.f.): For a continuous random variable X, the p.d.f. f(x) represents the density of probability at point x such that the integral over its entire range is 1.
Cumulative Distribution Function (c.d.f.): Defined as F(x) = P(X <= x). It gives the probability that the random variable X takes a value less than or equal to x.
A discrete random variable X has the given probability distribution. Find k, P(X>=6), P(X<6), and P(2
The sum of probabilities must equal 1: 0 + k + 2k + 2k + 3k + k^2 + 2k^2 + (7k^2 + k) = 1.
Equation: 10k^2 + 9k - 1 = 0. Solving gives k = 0.1 (since k cannot be negative).
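With k = 0.1 from the equation above, the full distribution and the requested probabilities follow by direct summation; a short Python sketch:

```python
# Probabilities for x = 0..7: 0, k, 2k, 2k, 3k, k^2, 2k^2, 7k^2 + k,
# using k = 0.1 obtained from 10k^2 + 9k - 1 = 0.
k = 0.1
probs = [0, k, 2*k, 2*k, 3*k, k**2, 2*k**2, 7*k**2 + k]

assert abs(sum(probs) - 1.0) < 1e-12   # total probability is 1

p_ge_6 = sum(probs[6:])   # P(X >= 6) = 2k^2 + 7k^2 + k = 0.19
p_lt_6 = sum(probs[:6])   # P(X < 6)  = 1 - P(X >= 6)   = 0.81
```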
Define mathematical expectation.
The mathematical expectation E(X) of a random variable X is the weighted average of all possible values of X, where the weights are the respective probabilities (for discrete) or the p.d.f. (for continuous).
Prove that E(X^2) >= [E(X)]^2.
Since the variance V(X) = E(X^2) - [E(X)]^2 and variance is always non-negative (V(X) >= 0), it follows directly that E(X^2) >= [E(X)]^2.
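The inequality also holds for any sample, since a sample variance is non-negative; a quick Monte Carlo illustration (the distribution and seed are arbitrary choices):

```python
import random

# Draw a sample from an arbitrary distribution and compare the sample
# second moment with the square of the sample mean.
random.seed(0)
xs = [random.gauss(2.0, 1.5) for _ in range(100_000)]

ex = sum(xs) / len(xs)                 # sample estimate of E(X)
ex2 = sum(x * x for x in xs) / len(xs) # sample estimate of E(X^2)

# E(X^2) >= [E(X)]^2, since their difference is the (non-negative) variance.
assert ex2 >= ex ** 2
```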
State and prove the additive property of mathematical expectation.
Statement: For any two random variables X and Y, E(X + Y) = E(X) + E(Y), provided the expectations exist.
Proof (continuous case): Let f(x, y) be the joint p.d.f. of (X, Y), with marginal densities f_X(x) and f_Y(y). Then
E(X + Y) = ∫∫ (x + y) f(x, y) dx dy
= ∫∫ x f(x, y) dx dy + ∫∫ y f(x, y) dx dy
= ∫ x [∫ f(x, y) dy] dx + ∫ y [∫ f(x, y) dx] dy
= ∫ x f_X(x) dx + ∫ y f_Y(y) dy = E(X) + E(Y).
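Note that the proof nowhere uses independence. A small discrete check with a hypothetical (deliberately non-independent) joint distribution:

```python
# Hypothetical joint distribution of (X, Y) on {0, 1} x {0, 1, 2};
# the entries are chosen so that X and Y are NOT independent.
joint = {(0, 0): 0.1, (0, 1): 0.2, (0, 2): 0.1,
         (1, 0): 0.3, (1, 1): 0.1, (1, 2): 0.2}
assert abs(sum(joint.values()) - 1.0) < 1e-12

e_x = sum(x * p for (x, y), p in joint.items())        # E(X)
e_y = sum(y * p for (x, y), p in joint.items())        # E(Y)
e_sum = sum((x + y) * p for (x, y), p in joint.items())  # E(X + Y)

# Additivity holds even without independence.
assert abs(e_sum - (e_x + e_y)) < 1e-12
```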
Define moment-generating function (m.g.f.).
The moment-generating function M_X(t) of a random variable X is defined as M_X(t) = E(e^(tX)), for all values of t for which the expectation exists.
Prove that m.g.f. of the sum of independent random variables is equal to the product of their individual m.g.f.s.
Let Z = X + Y, where X and Y are independent. Then
M_Z(t) = E(e^(tZ)) = E(e^(t(X+Y))) = E(e^(tX) * e^(tY))
= E(e^(tX)) * E(e^(tY))   (by independence)
= M_X(t) * M_Y(t).
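This multiplicative property can be verified numerically for, say, two independent fair dice; a minimal Python sketch:

```python
import math

# X, Y independent, each uniform on {1, ..., 6}.
def mgf_die(t):
    # M(t) = E(e^{tX}) for a single fair die
    return sum(math.exp(t * x) for x in range(1, 7)) / 6

def mgf_sum(t):
    # direct m.g.f. of X + Y over all 36 equally likely outcomes
    return sum(math.exp(t * (x + y))
               for x in range(1, 7) for y in range(1, 7)) / 36

# m.g.f. of the sum equals the product of the individual m.g.f.s
for t in (-0.5, 0.1, 1.0):
    assert abs(mgf_sum(t) - mgf_die(t) ** 2) < 1e-10
```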
Define bivariate random variable.
A bivariate random variable (X, Y) is a pair of random variables defined on the same sample space, representing two different characteristics of the same outcome.
Given joint p.d.f. f(x,y) = 8xy for 0 < x < y < 1. Find marginal densities and check independence.
Marginal of X: f_X(x) = ∫ from y = x to 1 of 8xy dy = 4x(1 - x^2), for 0 < x < 1.
Marginal of Y: f_Y(y) = ∫ from x = 0 to y of 8xy dx = 4y^3, for 0 < y < 1.
Since f_X(x) * f_Y(y) = 16x y^3 (1 - x^2) ≠ 8xy = f(x, y), X and Y are not independent.
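The marginals of f(x, y) = 8xy on 0 < x < y < 1 work out to f_X(x) = 4x(1 - x^2) and f_Y(y) = 4y^3, which can be sanity-checked numerically with midpoint Riemann sums (no external libraries assumed):

```python
def f_X(x):
    # marginal of X: integral of 8xy over y in (x, 1) -> 4x(1 - x^2)
    return 4 * x * (1 - x ** 2)

def f_Y(y):
    # marginal of Y: integral of 8xy over x in (0, y) -> 4y^3
    return 4 * y ** 3

n = 2000
h = 1.0 / n
mx = sum(f_X((i + 0.5) * h) * h for i in range(n))  # should be close to 1
my = sum(f_Y((i + 0.5) * h) * h for i in range(n))  # should be close to 1
assert abs(mx - 1.0) < 1e-3 and abs(my - 1.0) < 1e-3

# Not independent: f_X(x) * f_Y(y) differs from f(x, y), e.g. at (0.5, 0.5)
assert abs(f_X(0.5) * f_Y(0.5) - 8 * 0.5 * 0.5) > 0.5
```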
State the properties of binomial distribution.
For X ~ B(n, p): (i) the experiment consists of a fixed number n of independent trials, each with exactly two outcomes (success/failure) and constant success probability p; (ii) P(X = x) = C(n, x) p^x q^(n-x), where q = 1 - p and x = 0, 1, ..., n; (iii) mean = np and variance = npq, so the variance is always less than the mean; (iv) the distribution is symmetric when p = 1/2.
Define Poisson distribution and find its mean.
A discrete random variable X follows Poisson distribution if P(X=x) = [e^(-lambda) * lambda^x] / x! for x = 0, 1, 2, ..., where lambda > 0.
Mean: E(X) = Sum over x >= 0 of x * [e^(-lambda) * lambda^x] / x! = lambda * e^(-lambda) * Sum over x >= 1 of lambda^(x-1) / (x-1)! = lambda * e^(-lambda) * e^(lambda) = lambda.
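The result E(X) = lambda can be confirmed numerically by truncating the series far into the tail; a short Python sketch with an arbitrary choice of lambda:

```python
import math

# For lambda = 3.5, truncate the Poisson p.m.f. at x = 99 (the tail is negligible)
# and check that sum_x x * P(X = x) recovers lambda.
lam = 3.5
pmf = [math.exp(-lam) * lam ** x / math.factorial(x) for x in range(100)]

assert abs(sum(pmf) - 1.0) < 1e-12            # probabilities sum to 1
mean = sum(x * p for x, p in enumerate(pmf))
assert abs(mean - lam) < 1e-9                 # E(X) = lambda
```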