FYUG Even Semester Exam, 2025 STATISTICS (2nd Semester) Course No.: STADSC-151

Full Marks: 50


UNIT-I

Question 1(a) [2+2=4]

Define random variable with example. State the conditions for probability mass function (p.m.f.).

Definition: A random variable is a real-valued function defined on the sample space of a random experiment. It assigns a unique real number to every outcome in the sample space.

Example: In tossing two fair coins, if X denotes the number of heads, then X can take values 0, 1, or 2.

Conditions for p.m.f.: For a discrete random variable X with possible values x1, x2, ..., the function p(xi) is a p.m.f. if:

  1. p(xi) >= 0 for all i (Non-negativity).
  2. Sum over all i of p(xi) = 1 (Total probability is unity).
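The two p.m.f. conditions can be verified for the two-coin example in a short sketch (exact arithmetic via the standard-library `fractions` module):

```python
from fractions import Fraction
from itertools import product

# Sample space of tossing two fair coins; X = number of heads
pmf = {}
for outcome in product("HT", repeat=2):
    x = outcome.count("H")
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, 4)

# Condition 1: non-negativity
assert all(p >= 0 for p in pmf.values())
# Condition 2: total probability is unity
assert sum(pmf.values()) == 1
# X takes the values 0, 1, 2 with probabilities 1/4, 1/2, 1/4
assert pmf == {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
```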

Question 1(b) [4]

Define cumulative distribution function (c.d.f.) of a random variable for both discrete and continuous cases.

The Cumulative Distribution Function F(x) is defined as the probability that the random variable X takes a value less than or equal to x: P(X <= x).

  • Discrete Case: F(x) = Sum of p(xi) for all xi <= x.
  • Continuous Case: F(x) = Integral from -infinity to x of f(t) dt, where f(t) is the p.d.f.
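Both cases can be illustrated numerically (a sketch: the discrete p.m.f. is the two-coin example above, and the continuous p.d.f. f(t) = exp(-t), t >= 0, is an illustrative choice with known c.d.f. 1 - exp(-x)):

```python
import math
from fractions import Fraction

# Discrete case: two-coin p.m.f.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def F_disc(x):
    # F(x) = sum of p(xi) over all xi <= x
    return sum(p for xi, p in pmf.items() if xi <= x)

assert F_disc(-1) == 0
assert F_disc(1) == Fraction(3, 4)
assert F_disc(2) == 1

# Continuous case: f(t) = exp(-t) for t >= 0, so F(x) = 1 - exp(-x)
def F_cont(x, n=100000):
    # midpoint-rule approximation of the integral of f from 0 to x
    h = x / n
    return sum(math.exp(-(i + 0.5) * h) for i in range(n)) * h

assert abs(F_cont(2.0) - (1 - math.exp(-2.0))) < 1e-6
```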

Question 1(c) [4]

Define conditional probability density function and joint distribution function of a two-dimensional random variable.

Joint Distribution Function: F(x, y) = P(X <= x, Y <= y). It represents the probability that X is less than or equal to x AND Y is less than or equal to y simultaneously.

Conditional p.d.f.: The conditional p.d.f. of X given Y=y is defined as:

f(x|y) = f(x, y) / f2(y), provided f2(y) > 0

Where f(x,y) is the joint p.d.f. and f2(y) is the marginal p.d.f. of Y.

UNIT-II

Question 2(a)(i) [10]

The joint p.d.f. of X and Y is f(x,y) = 4xy exp(-(x^2 + y^2)); x >= 0, y >= 0. Find marginal p.d.f. of X and Y. Are X and Y independent? Find conditional distributions.

Step 1: Marginal p.d.f. of X
Integrate f(x,y) with respect to y from 0 to infinity:
f1(x) = Integral[4xy exp(-x^2) exp(-y^2) dy]
f1(x) = 4x exp(-x^2) * Integral[y exp(-y^2) dy]
Let y^2 = t, 2y dy = dt. Integral becomes 1/2 * Integral[exp(-t) dt] = 1/2.
f1(x) = 2x exp(-x^2); x >= 0

Step 2: Marginal p.d.f. of Y
By symmetry, f2(y) is calculated similarly:
f2(y) = 2y exp(-y^2); y >= 0

Step 3: Independence
X and Y are independent if f(x,y) = f1(x) * f2(y).
(2x exp(-x^2)) * (2y exp(-y^2)) = 4xy exp(-(x^2+y^2)) = f(x,y).
Conclusion: X and Y are independent.

Step 4: Conditional Distributions
Since X and Y are independent, each conditional p.d.f. reduces to the corresponding marginal:
f(x|y) = f(x,y)/f2(y) = f1(x) = 2x exp(-x^2); x >= 0
f(y|x) = f(x,y)/f1(x) = f2(y) = 2y exp(-y^2); y >= 0
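Steps 1-3 can be verified numerically (a sketch using midpoint-rule integration; the grid size and the cutoff ymax = 8 are arbitrary choices, the tail beyond being negligible):

```python
import math

def joint(x, y):
    return 4 * x * y * math.exp(-(x**2 + y**2))

def f1(x):
    # claimed marginal p.d.f. of X (and, by symmetry, of Y)
    return 2 * x * math.exp(-x**2)

def marginal_x(x, n=20000, ymax=8.0):
    # numerically integrate the joint p.d.f. over y on [0, ymax]
    h = ymax / n
    return sum(joint(x, (i + 0.5) * h) for i in range(n)) * h

# Step 1/2: the computed marginal matches 2x exp(-x^2)
for x in (0.3, 1.0, 2.0):
    assert abs(marginal_x(x) - f1(x)) < 1e-6

# Step 3: the joint p.d.f. factorizes into the product of the marginals
for x, y in ((0.5, 1.2), (1.7, 0.4)):
    assert abs(joint(x, y) - f1(x) * f1(y)) < 1e-12
```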

Question 2(b)(i) [6]

Given c.d.f. F(x), find c.d.f. and p.d.f. of (1) Y=X+a and (2) Y=X^2.

(1) Y = X + a:
Fy(y) = P(Y <= y) = P(X + a <= y) = P(X <= y - a) = Fx(y - a).
p.d.f. gy(y) = d/dy [Fx(y - a)] = fx(y - a).

(2) Y = X^2 (for y >= 0; Fy(y) = 0 when y < 0):
Fy(y) = P(Y <= y) = P(X^2 <= y) = P(-sqrt(y) <= X <= sqrt(y)) = Fx(sqrt(y)) - Fx(-sqrt(y)).
p.d.f. gy(y) = d/dy [Fx(sqrt(y)) - Fx(-sqrt(y))] = [fx(sqrt(y)) + fx(-sqrt(y))] / [2 * sqrt(y)].
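The Y = X^2 formulas can be checked numerically; here Fx is taken to be the standard normal c.d.f. (an illustrative assumption, computed via `math.erf`):

```python
import math

SQRT2 = math.sqrt(2.0)

def Phi(x):
    # standard normal c.d.f. (illustrative choice of Fx)
    return 0.5 * (1.0 + math.erf(x / SQRT2))

def phi(x):
    # standard normal p.d.f. (the corresponding fx)
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def Fy(y):
    # c.d.f. of Y = X^2 from the derivation above
    return Phi(math.sqrt(y)) - Phi(-math.sqrt(y)) if y > 0 else 0.0

def gy(y):
    # closed-form p.d.f. of Y from the derivation above
    s = math.sqrt(y)
    return (phi(s) + phi(-s)) / (2.0 * s)

# gy should match a numerical derivative of Fy
for y in (0.5, 1.0, 3.0):
    h = 1e-6
    numeric = (Fy(y + h) - Fy(y - h)) / (2 * h)
    assert abs(numeric - gy(y)) < 1e-5
```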

UNIT-III

Question 3(a) [4]

Define mathematical expectation. Prove E[X - E(X)] = 0.

Definition: Expectation E(X) is the weighted average of all possible values of X, where weights are probabilities. E(X) = Sum[x * p(x)] for discrete or Integral[x * f(x) dx] for continuous.

Proof (using linearity of expectation):
E[X - E(X)] = E(X) - E[E(X)].
Since E(X) is a constant (say k), E(k) = k, so E[E(X)] = E(X).
Therefore E[X - E(X)] = E(X) - E(X) = 0.
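The result can be confirmed directly from the definition on an arbitrary discrete p.m.f. (the values and probabilities below are hypothetical; exact arithmetic via fractions):

```python
from fractions import Fraction

# arbitrary discrete random variable (hypothetical p.m.f.)
pmf = {1: Fraction(1, 6), 2: Fraction(1, 3), 5: Fraction(1, 2)}

mean = sum(x * p for x, p in pmf.items())      # E(X)
# E[X - E(X)] computed from the definition is exactly zero
assert sum((x - mean) * p for x, p in pmf.items()) == 0
```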

Question 4(a)(i) [5]

Define Variance. Prove V(aX + bY) = a^2 V(X) + b^2 V(Y) + 2ab Cov(X,Y).

Definition: V(X) = E[(X - E(X))^2] = E(X^2) - [E(X)]^2.

Proof:
Let E(X) = ux and E(Y) = uy.
E(aX + bY) = a*ux + b*uy.
V(aX + bY) = E[((aX + bY) - (a*ux + b*uy))^2]
= E[(a(X-ux) + b(Y-uy))^2]
= E[a^2(X-ux)^2 + b^2(Y-uy)^2 + 2ab(X-ux)(Y-uy)]
= a^2 E(X-ux)^2 + b^2 E(Y-uy)^2 + 2ab E[(X-ux)(Y-uy)]
V(aX + bY) = a^2 V(X) + b^2 V(Y) + 2ab Cov(X,Y)

If independent: Cov(X,Y) = 0, so V(aX + bY) = a^2 V(X) + b^2 V(Y).
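The identity can be verified exactly on a small joint p.m.f. with dependence (the table and the constants a, b below are hypothetical):

```python
from fractions import Fraction

# a small joint p.m.f. with dependence (hypothetical numbers)
joint = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 8),
         (1, 0): Fraction(1, 8), (1, 1): Fraction(1, 2)}

def E(g):
    # expectation of g(X, Y) under the joint p.m.f.
    return sum(g(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
VX = E(lambda x, y: (x - EX)**2)
VY = E(lambda x, y: (y - EY)**2)
Cov = E(lambda x, y: (x - EX) * (y - EY))

a, b = 3, -2
lhs = E(lambda x, y: (a*x + b*y - (a*EX + b*EY))**2)   # V(aX + bY)
assert lhs == a*a*VX + b*b*VY + 2*a*b*Cov
```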

UNIT-IV

Question 7(b) [4]

Derive Poisson distribution from Binomial distribution.

Poisson is a limiting case of Binomial distribution under conditions:

  • n (number of trials) is very large (n -> infinity).
  • p (probability of success) is very small (p -> 0).
  • np = lambda (a constant).

P(X=x) = nCx * p^x * q^(n-x). Substituting p = lambda/n:
P(X=x) = [n(n-1)...(n-x+1) / x!] * (lambda/n)^x * (1 - lambda/n)^(n-x).
As n -> infinity, n(n-1)...(n-x+1)/n^x -> 1 and (1 - lambda/n)^(n-x) -> exp(-lambda), giving:
P(X=x) = [exp(-lambda) * lambda^x] / x!
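The limiting behaviour can be seen numerically by comparing the two p.m.f.s for large n (the choices lambda = 2 and n = 100000 are illustrative):

```python
import math

lam = 2.0
n = 100000
p = lam / n          # np = lambda held constant

def binom_pmf(x):
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x):
    return math.exp(-lam) * lam**x / math.factorial(x)

# for large n the binomial probabilities are close to the Poisson ones
for x in range(6):
    assert abs(binom_pmf(x) - poisson_pmf(x)) < 1e-4
```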

Question 8(a)(i) [4]

Obtain m.g.f. of Binomial distribution and find mean/variance.

Mx(t) = E[exp(tX)] = Sum[exp(tx) * nCx * p^x * q^(n-x)]
Mx(t) = Sum[nCx * (p*exp(t))^x * q^(n-x)]
Mx(t) = (q + p*exp(t))^n

Mean: M'x(t) = n(q + p*exp(t))^(n-1) * p*exp(t); at t = 0, since p + q = 1, Mean = np.

Variance: M''x(0) gives E(X^2) = n(n-1)p^2 + np, so Variance = E(X^2) - [E(X)]^2 = [n(n-1)p^2 + np] - (np)^2 = np(1 - p) = npq.
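The m.g.f. and the moment results can be cross-checked numerically against the p.m.f. (n = 12, p = 0.3, and the evaluation point t = 0.3 are illustrative choices):

```python
import math

n, p = 12, 0.3
q = 1 - p
pmf = [math.comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

def M(t):
    # m.g.f. from the definition E[exp(tX)]
    return sum(math.exp(t * x) * pmf[x] for x in range(n + 1))

# matches the closed form (q + p*exp(t))^n
assert abs(M(0.3) - (q + p * math.exp(0.3))**n) < 1e-9

# mean and variance from the p.m.f. match np and npq
mean = sum(x * pmf[x] for x in range(n + 1))
var = sum(x * x * pmf[x] for x in range(n + 1)) - mean**2
assert abs(mean - n * p) < 1e-9
assert abs(var - n * p * q) < 1e-9
```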

UNIT-V

Question 10(a)(i) [3]

Obtain the median of normal distribution.

For a Normal distribution N(u, sigma^2), the p.d.f. is symmetric about x = u.
By definition of median (M): Integral from -infinity to M of f(x) dx = 0.5.
Since the total area under the symmetric curve is 1 and it is divided equally at the mean (u), we have:
Median = Mean = Mode = u
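Since the normal c.d.f. can be written with the error function, F(u) = 0.5 can be checked directly (the parameters mu and sigma below are arbitrary):

```python
import math

mu, sigma = 1.5, 2.0          # arbitrary illustrative parameters

def F(x):
    # N(mu, sigma^2) c.d.f. expressed via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# the c.d.f. at the mean is exactly 1/2, so the median equals mu
assert abs(F(mu) - 0.5) < 1e-15
```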