Main Points
Definition
A random variable X is a continuous random
variable (crv) if the range of X
consists of one or more intervals.
A crv is a theoretical representation of a continuous variable such
as length, mass or time.
Probability Density Function
A crv is specified by its probability density function
(pdf).
If X is a crv with pdf f(x), then
P(a ≤ X ≤ b) = ∫ₐᵇ f(x) dx.
If a = b, then
P(X = a) = P(a ≤ X ≤ a) = ∫ₐᵃ f(x) dx = 0.
Therefore, the probability that a crv will assume a fixed value is 0.
Thus, P(X < a) = P(X ≤ a), etc.
Note: f(x) need not
be continuous although X is a crv.
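For instance, take for illustration the pdf f(x) = 2x for 0 ≤ x ≤ 1 (and f(x) = 0 elsewhere); this is a valid pdf since ∫₀¹ 2x dx = [x²]₀¹ = 1. Then
P(0.25 ≤ X ≤ 0.5) = [x²] evaluated from 0.25 to 0.5 = 0.25 − 0.0625 = 0.1875.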
Cumulative Distribution Function
The cumulative distribution function (cdf)
of a rv X is the function defined on R by
F(x) = P(X ≤ x).
Properties of F(x)
- F(x) is defined for every real number x
- 0 ≤ F(x) ≤ 1
- When x → −∞, F(x) → 0; when x → ∞, F(x) → 1
- F(x) is a non-decreasing function: if s < t, then F(s) ≤ F(t)
- If X is a crv, then F(x) is a continuous function
- P(a < X ≤ b) = F(b) − F(a)
- If x is not an endpoint of an interval, then f(x) = F′(x)
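For the illustrative pdf f(x) = 2x on [0, 1] above, integrating from the lower end of the range gives
F(x) = 0 for x < 0, F(x) = x² for 0 ≤ x ≤ 1, F(x) = 1 for x > 1.
This F is continuous and non-decreasing, rises from 0 to 1, and F′(x) = 2x = f(x) at every interior point, in agreement with the properties above.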
Let m, l and u be the median, lower quartile and upper
quartile of X. Then
F(m) = 0.5, F(l) = 0.25, F(u)
= 0.75.
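For example, with F(x) = x² on [0, 1] as above: F(m) = 0.5 gives m = √0.5 ≈ 0.707, F(l) = 0.25 gives l = 0.5, and F(u) = 0.75 gives u = √0.75 ≈ 0.866.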
Expectation
For a crv X with pdf f(x), the expectation
of X, written as E(X), is given by
E(X) = ∫ x f(x) dx,
the integral being taken over the whole range of X (equivalently, from −∞ to ∞).
E(X) is also denoted by μ and referred
to as the mean of X.
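For the illustrative pdf f(x) = 2x on [0, 1],
E(X) = ∫₀¹ x · 2x dx = ∫₀¹ 2x² dx = [2x³/3]₀¹ = 2/3.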
Note: If f(x) is symmetrical
about a central value c, then E(X) = c.
In general, if g(X) is any function of the random variable X,
then
E[g(X)] = ∫ g(x) f(x) dx,
integrated over the whole range of X.
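For example, taking g(X) = X² with the same illustrative pdf,
E(X²) = ∫₀¹ x² · 2x dx = ∫₀¹ 2x³ dx = [x⁴/2]₀¹ = 1/2.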
Properties of E (similar to those for a drv)
Let a and b be any constants.
- E(a) = a
- E(aX) = aE(X)
- E(aX + b) = aE(X) + b
- E[f(X) ± g(X)] = E[f(X)] ± E[g(X)]
- E(XY) = E(X)E(Y) if X and Y are independent
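For example, with E(X) = 2/3 as above,
E(3X + 2) = 3E(X) + 2 = 3(2/3) + 2 = 4.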
Variance
The variance of a rv X, denoted by
Var(X), is defined as
Var(X) = E[(X − μ)²].
The standard deviation of X, denoted by σ,
is the square root of Var(X):
σ = √Var(X).
Computational formula for Var(X):
Var(X) = E(X²) − [E(X)]².
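Continuing the illustrative example, E(X²) = 1/2 and E(X) = 2/3, so
Var(X) = 1/2 − (2/3)² = 1/2 − 4/9 = 1/18 and σ = √(1/18) = √2/6 ≈ 0.236.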
Properties of Var (similar to those for a drv)
- Var(a) = 0
- Var(aX) = a²Var(X)
- Var(aX + b) = a²Var(X)
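For example, Var(3X + 2) = 3² Var(X) = 9 × 1/18 = 1/2; the added constant 2 shifts the distribution but does not change its spread.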
Sum of Two Random Variables
If X and Y are any two random variables, then for any
constants a and b,
E(aX ± bY) = aE(X)
± bE(Y).
If X and Y are also independent, then
Var(aX ± bY) =
a²Var(X) + b²Var(Y).
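For example, if X and Y are independent and each has the illustrative pdf above, so that E(X) = E(Y) = 2/3 and Var(X) = Var(Y) = 1/18, then
E(X − Y) = 0, while Var(X − Y) = Var(X) + Var(Y) = 1/9.
Note that the variances add even when the variables are subtracted.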