
Unit-1: Random Variables and Probability Distributions

1. Random Variables

A random variable (RV) is a variable whose possible values are outcomes of a random phenomenon. It can be classified as either discrete or continuous.

Cumulative Distribution Function (CDF)

The CDF of a random variable X is defined as:

F(x) = P(X \le x) = \sum_{t \le x} f(t)   (for a discrete random variable)

Properties of CDF:

- 0 \le F(x) \le 1 for all x
- F(x) is non-decreasing in x
- \lim_{x \to -\infty} F(x) = 0 and \lim_{x \to \infty} F(x) = 1
- F(x) is right-continuous
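
To make the discrete definition concrete, here is a minimal Python sketch that accumulates a hypothetical PMF into a CDF. The support {0, 1, 2, 3} and its probabilities are made-up illustration values, not taken from these notes.

```python
# Build a discrete CDF F(x) = P(X <= x) by summing the PMF over t <= x.
# The support points and probabilities below are hypothetical example values.
pmf = {0: 0.1, 1: 0.2, 2: 0.4, 3: 0.3}

def cdf(x):
    """Return F(x) = sum of f(t) over all support points t <= x."""
    return sum(p for t, p in pmf.items() if t <= x)

print(cdf(-1))    # 0.0  -> F is 0 below the smallest support point
print(cdf(1.5))   # ~0.3 -> P(X=0) + P(X=1)
print(cdf(3))     # ~1.0 -> F reaches 1 at the largest support point
```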

2. Probability Distributions

Probability Mass Function (PMF)

For a discrete random variable, the PMF P(X=x) satisfies:

0 \le P(X = x) \le 1
\sum_{x} P(X = x) = 1

Equivalently, writing f(x) = P(X = x):

f(x) \ge 0
\sum_{x} f(x) = 1
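
As a quick sanity check of both conditions, the sketch below verifies them for the PMF of a fair six-sided die; the die is only an illustrative choice.

```python
import numpy as np

# PMF of a fair die: P(X = x) = 1/6 for x in {1, ..., 6} (illustrative example).
pmf = {x: 1 / 6 for x in range(1, 7)}

# Condition 1: every probability lies in [0, 1].
assert all(0 <= p <= 1 for p in pmf.values())

# Condition 2: the probabilities sum to 1 (up to floating-point rounding).
assert np.isclose(sum(pmf.values()), 1.0)

print("valid PMF, total mass =", sum(pmf.values()))
```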

Probability Density Function (PDF)

For a continuous random variable, the PDF f(x) satisfies:

f(x) \ge 0
\int_{-\infty}^{\infty} f(x)\,dx = 1
P(a < X < b) = \int_{a}^{b} f(x)\,dx

Note: For continuous RVs, P(X=x)=0.
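
The same conditions can be checked numerically for a continuous example. The sketch below uses the exponential density f(x) = e^{-x} on x \ge 0 purely as an illustration and relies on scipy's quad for the integrals.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative PDF: f(x) = exp(-x) for x >= 0 (exponential with rate 1).
f = lambda x: np.exp(-x)

total, _ = quad(f, 0, np.inf)   # integral of f over its whole support
prob, _ = quad(f, 1, 2)         # P(1 < X < 2) as an area under the curve

print(total)   # ~1.0    -> f is a valid density
print(prob)    # ~0.2325 =  e^(-1) - e^(-2)
```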

Relation Between PDF and CDF:

F'(x) = \frac{d}{dx} F(x) = f(x)
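
A rough numerical check of this relation: differentiating the CDF of the standard normal distribution at a point should reproduce its PDF. The choice of the normal distribution and of the point x = 0.5 is arbitrary.

```python
from scipy.stats import norm

# Central-difference approximation of F'(x) compared against f(x).
x, h = 0.5, 1e-5
numeric_derivative = (norm.cdf(x + h) - norm.cdf(x - h)) / (2 * h)

print(numeric_derivative)   # ~0.3521
print(norm.pdf(x))          # ~0.3521, the standard normal density at x = 0.5
```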

Properties of CDF

| Property | Discrete Random Variable | Continuous Random Variable |
|---|---|---|
| Definition | F(x) = \sum_{x_i \le x} P(X = x_i) | F(x) = \int_{-\infty}^{x} f(t)\,dt |
| Range | 0 \le F(x) \le 1 | 0 \le F(x) \le 1 |
| Monotonicity | Non-decreasing | Non-decreasing |
| Continuity | Right-continuous | Continuous |
| Limits | \lim_{x \to -\infty} F(x) = 0, \; \lim_{x \to \infty} F(x) = 1 | \lim_{x \to -\infty} F(x) = 0, \; \lim_{x \to \infty} F(x) = 1 |
| Jump discontinuities | Jumps at each x where P(X = x) > 0 | No jumps (smooth curve) |
| Probability at a point | P(X = x) = F(x) - F(x^-) | P(X = x) = 0 |
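
The "probability at a point" row for the discrete case can be checked directly: the jump of the CDF at a support point equals the PMF there. The fair-die example below is only illustrative (scipy's randint(1, 7) is uniform on {1, ..., 6}).

```python
from scipy.stats import randint

X = randint(1, 7)   # fair die: uniform on {1, ..., 6}

x = 3
jump = X.cdf(x) - X.cdf(x - 1)   # F(x) - F(x^-) for an integer-valued RV
print(jump)                      # ~0.1667
print(X.pmf(x))                  # ~0.1667 = P(X = 3), matching the jump
```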

3. Joint Probability Distributions

Joint PMF

For two discrete random variables X and Y, the joint PMF P(X=x,Y=y) satisfies:

0 \le P(X = x, Y = y) \le 1
\sum_{x} \sum_{y} P(X = x, Y = y) = 1
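
A small sketch with a hypothetical 2x2 joint PMF table (the four probabilities are made up) confirms both conditions:

```python
import numpy as np

# Hypothetical joint PMF of (X, Y): rows index the x-values, columns the y-values.
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

assert ((joint >= 0) & (joint <= 1)).all()   # every entry lies in [0, 1]
assert np.isclose(joint.sum(), 1.0)          # total probability mass is 1

print(joint.sum())   # 1.0
```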

Joint PDF

For two continuous random variables X and Y, the joint PDF fX,Y(x,y) satisfies:

f_{X,Y}(x, y) \ge 0
\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx\,dy = 1
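
For a continuous check, the sketch below takes the joint density f(x, y) = x + y on the unit square (an illustrative choice) and verifies that it integrates to 1 with scipy's dblquad.

```python
from scipy.integrate import dblquad

# Illustrative joint PDF: f(x, y) = x + y on [0, 1] x [0, 1], zero elsewhere.
f = lambda y, x: x + y   # dblquad expects the inner variable (y) first

total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)
print(total)   # ~1.0 -> a valid joint density
```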

4. Marginal Probability Distribution

The marginal probability distribution is the probability distribution of a subset of random variables, obtained by summing or integrating over the other variables.

For Discrete Random Variables

Given two discrete random variables X and Y, the marginal PMF of X is:

P(X = x) = g(x) = \sum_{y} f(x, y)

Similarly, the marginal PMF of Y is:

P(Y = y) = h(y) = \sum_{x} f(x, y)
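
With the joint PMF stored as a table, the marginals are just row and column sums. The 2x2 table below reuses the hypothetical values from the joint-PMF example above.

```python
import numpy as np

# Hypothetical joint PMF f(x, y): rows are x-values, columns are y-values.
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

g = joint.sum(axis=1)   # marginal PMF of X: g(x) = sum over y of f(x, y)
h = joint.sum(axis=0)   # marginal PMF of Y: h(y) = sum over x of f(x, y)

print(g)   # [0.3 0.7]
print(h)   # [0.4 0.6]
```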

For Continuous Random Variables

Given two continuous random variables X and Y, the marginal PDF of X is:

f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy

Similarly, the marginal PDF of Y is:

f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx
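
Symbolically, integrating out one variable gives the marginal density. The sketch below does this with sympy for the illustrative joint PDF f(x, y) = x + y on the unit square.

```python
import sympy as sp

x, y = sp.symbols('x y')

# Illustrative joint PDF on [0, 1] x [0, 1].
f_xy = x + y

f_X = sp.integrate(f_xy, (y, 0, 1))   # marginal of X: integrate y out
f_Y = sp.integrate(f_xy, (x, 0, 1))   # marginal of Y: integrate x out

print(f_X)   # x + 1/2
print(f_Y)   # y + 1/2
```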

5. Conditional Probability Distribution

The conditional probability distribution describes the probability of one random variable given the value of another random variable.

For Discrete Random Variables

The conditional PMF of X given Y=y is:

P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)}

Similarly, the conditional PMF of Y given X=x is:

P(Y = y \mid X = x) = \frac{P(X = x, Y = y)}{P(X = x)}
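
Numerically, conditioning just rescales one column (or row) of the joint table by the corresponding marginal. Again the 2x2 table is hypothetical.

```python
import numpy as np

# Hypothetical joint PMF table: rows are x-values, columns are y-values.
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

h = joint.sum(axis=0)                  # marginal PMF of Y
cond_X_given_y0 = joint[:, 0] / h[0]   # P(X = x | Y = y0) for each x

print(cond_X_given_y0)         # [0.25 0.75]
print(cond_X_given_y0.sum())   # 1.0 -- a conditional PMF is itself a PMF
```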

For Continuous Random Variables

The conditional PDF of X given Y=y is:

f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}

Similarly, the conditional PDF of Y given X=x is:

f_{Y \mid X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)}
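
The continuous case works the same way symbolically. Continuing the illustrative example f(x, y) = x + y on the unit square:

```python
import sympy as sp

x, y = sp.symbols('x y')

f_xy = x + y                              # illustrative joint PDF
f_Y = sp.integrate(f_xy, (x, 0, 1))       # marginal of Y: y + 1/2

f_X_given_Y = sp.simplify(f_xy / f_Y)     # conditional PDF of X given Y = y
print(f_X_given_Y)                        # (x + y)/(y + 1/2)

# For any fixed y it integrates to 1 over x, as a conditional PDF must.
print(sp.simplify(sp.integrate(f_X_given_Y, (x, 0, 1))))   # 1
```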

6. Mean, Variance, and Covariance

Mean (Expected Value)

For a discrete random variable:

\mu = E(X) = \sum_{x} x\,f(x)
\mu_{g(X)} = E[g(X)] = \sum_{x} g(x)\,f(x)

For a continuous random variable:

\mu = E(X) = \int_{-\infty}^{\infty} x\,f(x)\,dx
\mu_{g(X)} = E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f(x)\,dx
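
Both formulas translate directly into code. The discrete PMF below and the exponential density are illustrative choices only.

```python
import numpy as np
from scipy.integrate import quad

# Discrete case: hypothetical support and PMF values.
xs = np.array([0, 1, 2, 3])
fx = np.array([0.1, 0.2, 0.4, 0.3])
print(np.sum(xs * fx))        # E(X) = 1.9
print(np.sum(xs**2 * fx))     # E[g(X)] with g(x) = x^2 -> 4.5

# Continuous case: f(x) = exp(-x) on x >= 0, whose mean is 1.
f = lambda x: np.exp(-x)
EX, _ = quad(lambda x: x * f(x), 0, np.inf)
print(EX)                     # ~1.0
```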

Linearity of Expectation:

E(aX + bY) = a\,E(X) + b\,E(Y)
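
Linearity can be verified against a joint distribution directly. The sketch reuses the hypothetical 2x2 joint PMF from the earlier examples and the arbitrary constants a = 2, b = 3.

```python
import numpy as np

joint = np.array([[0.10, 0.20],    # hypothetical joint PMF of (X, Y)
                  [0.30, 0.40]])
xs = np.array([0, 1])              # values of X (rows)
ys = np.array([0, 1])              # values of Y (columns)

EX = np.sum(xs * joint.sum(axis=1))    # E(X) from the marginal of X
EY = np.sum(ys * joint.sum(axis=0))    # E(Y) from the marginal of Y

a, b = 2, 3
E_lin = sum(joint[i, j] * (a * xs[i] + b * ys[j])
            for i in range(2) for j in range(2))   # E(aX + bY) from the joint PMF

print(E_lin)            # 3.2
print(a * EX + b * EY)  # 3.2 -- matches, as linearity predicts
```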

Variance

For any random variable:

\mathrm{Var}(X) = E(X^2) - [E(X)]^2

Properties:

- \mathrm{Var}(X) \ge 0, and \mathrm{Var}(c) = 0 for any constant c
- \mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)
- If X and Y are independent, \mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)
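
The shortcut formula and the scaling property can both be checked on simulated data; the normal sample below (mean 2, standard deviation 3) is an arbitrary illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=2.0, scale=3.0, size=100_000)   # illustrative data

# Shortcut formula Var(X) = E(X^2) - [E(X)]^2, estimated from the sample.
var_shortcut = np.mean(sample**2) - np.mean(sample)**2
print(var_shortcut)    # ~9 = 3^2
print(np.var(sample))  # essentially the same value, computed directly

# Scaling/shift property: Var(aX + b) = a^2 Var(X).
a, b = 2, 5
print(np.var(a * sample + b))    # ~36
print(a**2 * np.var(sample))     # ~36
```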

Covariance

For two random variables X and Y:

\mathrm{Cov}(X, Y) = E(XY) - E(X)\,E(Y)

Properties:

- \mathrm{Cov}(X, X) = \mathrm{Var}(X)
- \mathrm{Cov}(X, Y) = \mathrm{Cov}(Y, X)
- \mathrm{Cov}(aX + b, cY + d) = ac\,\mathrm{Cov}(X, Y)
- If X and Y are independent, \mathrm{Cov}(X, Y) = 0 (the converse does not hold in general)
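
Computed from a joint PMF, the covariance needs only E(X), E(Y), and E(XY). The 2x2 table is the same hypothetical example used earlier.

```python
import numpy as np

joint = np.array([[0.10, 0.20],    # hypothetical joint PMF of (X, Y)
                  [0.30, 0.40]])
xs, ys = np.array([0, 1]), np.array([0, 1])

EX = np.sum(xs * joint.sum(axis=1))     # E(X) = 0.7
EY = np.sum(ys * joint.sum(axis=0))     # E(Y) = 0.6
EXY = sum(joint[i, j] * xs[i] * ys[j]
          for i in range(2) for j in range(2))   # E(XY) = 0.4

print(EXY - EX * EY)   # ~-0.02 -> X and Y are slightly negatively correlated
```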

7. Chebyshev's Theorem

For any random variable X with mean \mu and variance \sigma^2, and for any constant k > 0:

P(|X - \mu| \le k\sigma) \ge 1 - \frac{1}{k^2}

Complementary Form:

P(|X - \mu| > k\sigma) \le \frac{1}{k^2}

Key Features:

- It holds for any distribution with finite mean and variance; no assumption about its shape is needed
- It gives only a bound, not the exact probability, so it is often conservative
- It is informative only for k > 1 (for k \le 1 the bound is trivial)
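
The bound can be checked against simulated data from a deliberately non-normal distribution; the exponential sample below is an arbitrary choice, and the empirical probability comfortably exceeds the Chebyshev lower bound of 0.75 for k = 2.

```python
import numpy as np

rng = np.random.default_rng(1)
sample = rng.exponential(scale=1.0, size=200_000)   # illustrative, skewed data

mu, sigma = sample.mean(), sample.std()
k = 2
empirical = np.mean(np.abs(sample - mu) <= k * sigma)

print(empirical)        # ~0.95 for this distribution
print(1 - 1 / k**2)     # 0.75 -- Chebyshev's guaranteed lower bound
```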

MCQ Questions

1. Which of the following is true for a probability mass function (PMF)?

a) 0 \le P(X = x) \le 1
b) \sum_{x} P(X = x) = 1
c) Both a) and b)
d) None of the above

Answer: c) Both a) and b)

2. For k = 2, Chebyshev's Theorem states that the probability P(|X - \mu| \le 2\sigma) is at least:

Answer: b) 0.75