FYUG Even Semester Exam, 2024
STATISTICS: Probability Distribution

Subject: Statistics (2nd Semester)
Course No: STADSC-151T
Time: 3 Hours | Full Marks: 70
Instructions: Figures in the margin indicate full marks. Answer any ten from Section A and five from Section B. (Solutions to all questions are given below.)

SECTION-A (Short Answer Type)

1. Define discrete random variable with examples. State two properties of random variable.

[2 Marks]

Definition: A random variable is said to be discrete if it takes only a finite or countably infinite number of values. Its values are usually isolated points on the number line.

Examples: 1. Number of heads in 10 tosses of a coin. 2. Number of students present in a class.

Properties: 1. The probability mass function satisfies 0 <= P(X = x) <= 1 for every value x. 2. The probabilities over all possible values sum to 1: Sum [ P(X = x) ] = 1.

2. Show that P(a < X <= b) = F(b) - F(a).

[2 Marks]

By definition of the Distribution Function F(x), we have:

F(b) = P(X <= b) and F(a) = P(X <= a)

The event (X <= b) can be expressed as the union of two mutually exclusive events: (X <= a) and (a < X <= b).

Therefore, P(X <= b) = P(X <= a) + P(a < X <= b)

F(b) = F(a) + P(a < X <= b)

P(a < X <= b) = F(b) - F(a) [Hence Proved]

3. Explain conditional probability distribution of Y given X = x.

[2 Marks]

Discrete Case: The conditional probability mass function of Y given X = x is defined as:

P(Y=y | X=x) = P(X=x, Y=y) / P(X=x), provided P(X=x) > 0

Continuous Case: The conditional probability density function of Y given X = x is:

f(y|x) = f(x,y) / f1(x), provided f1(x) > 0, where f1(x) is the marginal p.d.f. of X.

4. Show that E(X²) >= [E(X)]².

[2 Marks]

We know that the Variance of a random variable X is always non-negative: Var(X) >= 0.

The formula for Variance is: Var(X) = E(X²) - [E(X)]²

Since Var(X) >= 0, it implies: E(X²) - [E(X)]² >= 0

E(X²) >= [E(X)]² [Hence Proved]

5. Given f(x) = 2x for 0 < x < 1. Find E(X) and P(-1/2 < X < 1/2).

[2 Marks]

E(X): Integral of (x * 2x) dx from 0 to 1 = [ (2x^3)/3 ] from 0 to 1 = 2/3.

P(-1/2 < X < 1/2): Since f(x) = 0 outside (0, 1), the effective range of integration is 0 < x < 1/2.

Integral of 2x dx from 0 to 1/2 = [ x² ] from 0 to 1/2 = (1/2)² - 0 = 1/4.
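Both answers in Q5 can be checked numerically with a simple midpoint Riemann sum (a quick sketch, assuming only the p.d.f. f(x) = 2x on (0, 1) from the question; no external libraries):

```python
# Midpoint Riemann sum check of E(X) and P(0 < X < 1/2) for f(x) = 2x on (0, 1)
n = 100000
dx = 1.0 / n
xs = [(i + 0.5) * dx for i in range(n)]   # midpoints of each sub-interval
f = lambda x: 2 * x                       # the given p.d.f.
EX = sum(x * f(x) * dx for x in xs)       # approximates Integral of x*f(x) dx
P = sum(f(x) * dx for x in xs if x < 0.5) # approximates Integral over (0, 1/2)
print(round(EX, 4))  # -> 0.6667
print(round(P, 4))   # -> 0.25
```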

6. Define mathematical expectation. Prove E(aX + b) = aE(X) + b.

[2 Marks]

Definition: The mathematical expectation of a random variable is the weighted average of all possible values it can take, where weights are the probabilities.

Proof: E(aX + b) = Integral [ (ax + b) f(x) dx ]

= a * Integral [ x f(x) dx ] + b * Integral [ f(x) dx ]

Since Integral [ f(x) dx ] = 1 and Integral [ x f(x) dx ] = E(X),

E(aX + b) = aE(X) + b.

7. Define Moment Generating Function (MGF) and Characteristic Function.

[2 Marks]

MGF: M(t) = E(e^(tX)). For discrete: Sum [ e^(tx) * P(x) ]. For continuous: Integral [ e^(tx) * f(x) dx ].

Characteristic Function: Phi(t) = E(e^(itX)), where i is the imaginary unit. It always exists for any random variable.

8. State properties of Characteristic Function.

[2 Marks]

Properties: 1. Phi(0) = 1. 2. |Phi(t)| <= 1 for all t. 3. Phi(-t) is the complex conjugate of Phi(t). 4. Phi(t) is uniformly continuous on the real line.

9. Prove MGF of sum of independent variables is product of MGFs.

[2 Marks]

Let Z = X + Y, where X and Y are independent.

Mz(t) = E(e^(t(X+Y))) = E(e^(tX) * e^(tY))

Since X and Y are independent, E(g(X)h(Y)) = E(g(X))E(h(Y)).

Mz(t) = E(e^(tX)) * E(e^(tY)) = Mx(t) * My(t).
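This product rule can be sanity-checked on a concrete case: two independent Bernoulli(p) variables each have MGF (q + p e^t), and their sum is Binomial(2, p). A minimal sketch (the values p = 0.3 and t = 0.5 are illustrative choices, not from the paper):

```python
import math

p, q, t = 0.3, 0.7, 0.5
M_bern = q + p * math.exp(t)     # MGF of a single Bernoulli(p) at t
M_product = M_bern * M_bern      # product rule for the sum of two independent copies
# Direct MGF of Binomial(2, p): Sum over x of e^(tx) * P(X = x)
M_binom = sum(math.exp(t * x) * math.comb(2, x) * p**x * q**(2 - x)
              for x in range(3))
print(abs(M_product - M_binom) < 1e-12)  # -> True
```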

10. Binomial variate: mean = 4, variance = 2. Find P(3 < X <= 5).

[2 Marks]

Mean np = 4, Variance npq = 2. Dividing gives q = 2/4 = 0.5. Hence p = 0.5.

n(0.5) = 4 => n = 8. X ~ B(8, 0.5).

P(3 < X <= 5) = P(X=4) + P(X=5)

= [8C4 * (0.5)^8] + [8C5 * (0.5)^8] = (70 + 56) / 256 = 126 / 256 = 0.492.
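The arithmetic for Q10 can be reproduced directly with `math.comb`:

```python
from math import comb

n, p = 8, 0.5                                  # from np = 4, npq = 2
pmf = lambda x: comb(n, x) * p**x * (1 - p)**(n - x)
prob = pmf(4) + pmf(5)                         # P(3 < X <= 5) = P(X=4) + P(X=5)
print(prob)  # -> 0.4921875  (i.e. 126/256)
```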

11. What is Hypergeometric Distribution?

[2 Marks]

It is a discrete probability distribution that describes the probability of k successes in n draws, without replacement, from a finite population of size N that contains exactly M successes.
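The definition above translates directly into a p.m.f.; a small sketch (the deck-of-cards numbers are an illustrative example, not from the paper):

```python
from math import comb

def hypergeom_pmf(k, N, M, n):
    """P(k successes in n draws without replacement from a population of
    size N containing exactly M successes."""
    return comb(M, k) * comb(N - M, n - k) / comb(N, n)

# e.g. drawing 5 cards from a 52-card deck with M = 4 aces:
# probability of exactly one ace
print(round(hypergeom_pmf(1, 52, 4, 5), 4))  # -> 0.2995
```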

12. Define Geometric Distribution and obtain its mean.

[2 Marks]

Definition: It represents the number of failures before the first success in a series of independent Bernoulli trials with probability p.

P(X=x) = q^x * p ; x = 0, 1, 2...

Mean: E(X) = Sum [ x * q^x * p ] = p * q * d/dq [ Sum q^x ] = p * q * (1-q)^-2 = q/p.
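The result E(X) = q/p can be confirmed by summing the series numerically (p = 0.25 is an illustrative choice; the tail beyond 500 terms is negligible):

```python
p, q = 0.25, 0.75
# Truncated sum of x * q^x * p over x = 0, 1, 2, ...
mean = sum(x * q**x * p for x in range(500))
print(round(mean, 6))  # -> 3.0, which equals q/p
```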

13. Define Beta distribution of second kind and find its mean.

[2 Marks]

p.d.f: f(x) = [ x^(m-1) ] / [ B(m,n) * (1+x)^(m+n) ] ; x > 0.

Mean: E(X) = m / (n - 1) for n > 1.

14. Definition of Continuous Uniform Distribution and obtain its mean.

[2 Marks]

f(x) = 1 / (b - a) for a <= x <= b. Else 0.

Mean: Integral of [ x / (b-a) ] dx from a to b = [ x² / 2(b-a) ] = (b² - a²) / 2(b-a) = (a + b) / 2.

15. Discuss two importances of Normal Distribution.

[2 Marks]

1. By the Central Limit Theorem, the sum (or mean) of a large number of independent random variables is approximately normally distributed whatever the parent distribution, so the sampling distributions of many statistics can be approximated by the normal.

2. Most large-sample tests of significance and much of statistical quality control rest on the assumption of normality, and several standard distributions (e.g. Binomial, Poisson) tend to the normal for large parameter values.

SECTION-B (Long Answer Type)

16. (a) Define PDF. Given f(x) = 6x(1-x), 0 <= x <= 1. Find 'b' such that P(X < b) = P(X > b).

[4 Marks]

PDF: A function f(x) is a probability density function if f(x) >= 0 and its total integral over the real line is 1.

Given P(X < b) = P(X > b), this means 'b' is the Median.

Integral from 0 to b [ 6x - 6x² ] dx = 0.5

[ 3x² - 2x³ ] from 0 to b = 0.5 => 3b² - 2b³ = 0.5

By inspection, if b = 0.5: 3(0.25) - 2(0.125) = 0.75 - 0.25 = 0.5.

Value of b = 0.5.
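The "by inspection" step can be replaced by a root-finding sketch: g(b) = 3b² - 2b³ - 0.5 is increasing on (0, 1) (since g'(b) = 6b(1 - b) > 0), so bisection converges to the unique median:

```python
# Bisection on g(b) = 3b^2 - 2b^3 - 0.5, confirming the median b = 0.5
g = lambda b: 3 * b**2 - 2 * b**3 - 0.5
lo, hi = 0.0, 1.0          # g(0) < 0 < g(1), and g is increasing on (0, 1)
for _ in range(60):
    mid = (lo + hi) / 2
    if g(mid) < 0:
        lo = mid
    else:
        hi = mid
print(round((lo + hi) / 2, 6))  # -> 0.5
```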

16. (b) Define CDF. Solve for K and probabilities in discrete distribution.

[6 Marks]

CDF: F(x) = P(X <= x). It is non-decreasing, with F(-inf) = 0 and F(+inf) = 1.

Solve for K: Sum of P(x) = 1.

0 + k + 2k + 2k + 3k + k² + 2k² + 7k² + k = 1

10k² + 9k - 1 = 0 => (10k - 1)(k + 1) = 0. Since k > 0, k = 0.1.
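The question's table is not reproduced here; assuming the standard textbook version in which X takes the values 0 to 7 with probabilities 0, k, 2k, 2k, 3k, k², 2k², 7k² + k (consistent with the sum equation above), the solution k = 0.1 can be verified:

```python
# Assumed table: P(X = 0..7) = 0, k, 2k, 2k, 3k, k^2, 2k^2, 7k^2 + k
k = 0.1
probs = [0, k, 2 * k, 2 * k, 3 * k, k**2, 2 * k**2, 7 * k**2 + k]
print(abs(sum(probs) - 1) < 1e-9)  # -> True: the probabilities sum to 1
```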

17. (a) Joint PDF and Distribution of U = X + Y (Convolution).

[4 Marks]

The PDF of the sum of two independent continuous random variables is the convolution of their individual PDFs.

h(u) = Integral from -inf to +inf [ fX(v) * fY(u - v) ] dv

This is derived from the joint PDF f(x,y) = fX(x)fY(y) by performing a change of variables (U=X+Y, V=X) and integrating out the nuisance variable V.
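A numeric sketch of the convolution formula, assuming the textbook example X, Y ~ Uniform(0, 1) independent: the sum U = X + Y then has the triangular density h(u) = u on [0, 1] and 2 - u on [1, 2].

```python
# Midpoint approximation of h(u) = Integral of fX(v) * fY(u - v) dv
def fU(v):
    return 1.0 if 0 <= v <= 1 else 0.0   # Uniform(0, 1) density

def h(u, n=20000):
    dv = 2.0 / n
    return sum(fU(v) * fU(u - v) * dv
               for v in [(i + 0.5) * dv for i in range(n)])

print(round(h(0.5), 3), round(h(1.5), 3))  # -> 0.5 0.5 (triangular density)
```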

17. (b) Joint Distribution Table Analysis.

[6 Marks]

X\Y |  0  |  1  | Marginal X
 0  | 2/9 | 1/9 |    3/9
 1  | 1/9 | 5/9 |    6/9
Marginal Y: 3/9, 6/9 (Total = 1).

Conditional Distribution X|Y=1: P(X=0|Y=1) = (1/9)/(6/9) = 1/6; P(X=1|Y=1) = (5/9)/(6/9) = 5/6.

Independence: P(X=0, Y=0) = 2/9. P(X=0)*P(Y=0) = (3/9)*(3/9) = 1/9. Since 2/9 != 1/9, X and Y are Not Independent.
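The whole table analysis can be reproduced exactly with `fractions.Fraction` (a sketch using the joint probabilities stated above):

```python
from fractions import Fraction as F

# Joint table of Q17(b): keys are (x, y)
joint = {(0, 0): F(2, 9), (0, 1): F(1, 9),
         (1, 0): F(1, 9), (1, 1): F(5, 9)}
pX = {x: sum(p for (i, _), p in joint.items() if i == x) for x in (0, 1)}
pY = {y: sum(p for (_, j), p in joint.items() if j == y) for y in (0, 1)}
# Conditional distribution of X given Y = 1
cond = {x: joint[(x, 1)] / pY[1] for x in (0, 1)}
print(cond)  # P(X=0|Y=1) = 1/6, P(X=1|Y=1) = 5/6
# Independence holds iff every cell factors into its marginals
independent = all(joint[(x, y)] == pX[x] * pY[y] for x in (0, 1) for y in (0, 1))
print(independent)  # -> False
```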

18. (a) Prove E(X + Y) = E(X) + E(Y).

[3 Marks]

E(X+Y) = Double Integral [ (x+y) f(x,y) dx dy ]

= Integral [ x (Integral f(x,y) dy) dx ] + Integral [ y (Integral f(x,y) dx) dy ]

= Integral [ x fX(x) dx ] + Integral [ y fY(y) dy ] = E(X) + E(Y).

18. (b) Find E(X), E(Y), V(X), V(Y) from Table.

[4 Marks]

Based on the bivariate table given in the question:

Marginal X probabilities: P(-1)=0.2, P(0)=0.6, P(1)=0.2.

E(X) = (-1)(0.2) + 0 + (1)(0.2) = 0.

V(X) = E(X²) - 0 = [(-1)²(0.2) + 0 + 1²(0.2)] = 0.4.

E(Y) and V(Y) follow by the same calculation applied to the marginal probabilities of Y.
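The computation of E(X) and V(X) from the stated marginal of X, as a one-liner check:

```python
# Marginal of X from the answer: P(-1) = 0.2, P(0) = 0.6, P(1) = 0.2
marginal_X = {-1: 0.2, 0: 0.6, 1: 0.2}
EX = sum(x * p for x, p in marginal_X.items())
EX2 = sum(x**2 * p for x, p in marginal_X.items())
VX = EX2 - EX**2                 # Var(X) = E(X^2) - [E(X)]^2
print(EX, VX)  # -> 0.0 0.4
```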

22. (a) Mean and Variance of Binomial Distribution.

[3 Marks]

Mean: E(X) = Sum [ x * nCx * p^x * q^(n-x) ] = np * Sum [ (n-1)C(x-1) * p^(x-1) * q^(n-x) ] = np, since the remaining sum is (q + p)^(n-1) = 1.

Variance: E[X(X-1)] = n(n-1)p², so E(X²) = n(n-1)p² + np, and Var(X) = E(X²) - [E(X)]² = n(n-1)p² + np - n²p² = np(1 - p) = npq.

24. (a) Show for Normal Distribution μ(2n) = 1.3.5...(2n-1)σ^(2n).

[4 Marks]

This represents the even-order moments about the mean. The MGF of the Normal Distribution is M(t) = exp(μt + 0.5σ²t²); for central moments take μ = 0, so M(t) = exp(σ²t²/2).

Expanding the exponential series: M(t) = Sum over k of [ (σ²t²/2)^k / k! ]. The coefficient of t^(2n)/(2n)! in this expansion is μ(2n), so

μ(2n) = (2n)! σ^(2n) / (2^n * n!) = 1.3.5...(2n-1) σ^(2n),

since (2n)! / (2^n n!) = 1.3.5...(2n-1). (The odd-order central moments are all zero.)
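The identity μ(2n) = 1.3.5...(2n-1) σ^(2n) can be checked numerically by integrating x^(2n) against the normal density (a sketch; σ = 1.5 is an illustrative choice):

```python
import math

def central_moment_numeric(two_n, sigma=1.5, n_steps=200000):
    """Midpoint integration of x^(2n) * N(0, sigma^2) density over [-12s, 12s]."""
    a, b = -12 * sigma, 12 * sigma
    dx = (b - a) / n_steps
    c = 1 / (sigma * math.sqrt(2 * math.pi))
    total = 0.0
    for i in range(n_steps):
        x = a + (i + 0.5) * dx
        total += (x ** two_n) * c * math.exp(-x * x / (2 * sigma**2)) * dx
    return total

def odd_product_formula(n, sigma=1.5):
    # 1 * 3 * 5 * ... * (2n-1) * sigma^(2n)
    return math.prod(range(1, 2 * n, 2)) * sigma ** (2 * n)

for n in (1, 2, 3):
    ratio = central_moment_numeric(2 * n) / odd_product_formula(n)
    print(n, round(ratio, 6))  # each ratio -> 1.0
```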