Therefore, the F test statistic is $F_o = s_M^2/s_m^2 = 0.36/0.25 = 1.44$.
The P-value is $P(F>F_o) = P(F>1.44)$ = UTPF($\nu_N$, $\nu_D$, $F_o$) = UTPF(20,30,1.44) = 0.1788…
Since 0.1788… > 0.05, i.e., P-value > $\alpha$, we cannot reject the null hypothesis $H_o: \sigma_1^2 = \sigma_2^2$.
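As a cross-check outside the calculator environment, the same upper-tail F probability can be reproduced, for example, with SciPy's F distribution; the sketch below is only an illustration (SciPy is not part of the 50g command set), with f.sf playing the role of UTPF.

    # Sketch: cross-checking UTPF(20, 30, 1.44) with SciPy's F distribution.
    from scipy.stats import f

    nu_N, nu_D, F_o = 20, 30, 1.44      # degrees of freedom and test statistic from the example
    p_value = f.sf(F_o, nu_N, nu_D)     # upper-tail probability P(F > F_o)

    alpha = 0.05
    print(f"P-value = {p_value:.4f}")   # should agree with the manual's 0.1788...
    print("Reject Ho" if p_value < alpha else "Cannot reject Ho")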
Additional notes on linear regression
In this section we elaborate the ideas of linear regression presented earlier in
the chapter and present a procedure for hypothesis testing of regression
parameters.
The method of least squares
Let x = independent, non-random variable, and Y = dependent, random
variable. The regression curve of Y on x is defined as the relationship between
x and the mean of the corresponding distribution of the Y’s.
Assume that the regression curve of Y on x is linear, i.e., the mean of the distribution of the Y's is given by Α + Β⋅x. Y differs from this mean (Α + Β⋅x) by a value ε, thus
Y = Α + Β⋅x + ε, where ε is a random variable.
To visually check whether the data follows a linear trend, draw a scattergram or
scatter plot.
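As an illustration of this step (not taken from the manual), the scattergram can also be drawn off the calculator, for example with matplotlib; the (x, y) values below are hypothetical.

    # Sketch: scattergram of hypothetical paired observations (x_i, y_i).
    import matplotlib.pyplot as plt

    x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]     # hypothetical independent values
    y = [2.1, 3.9, 6.2, 7.8, 10.1, 12.2]   # hypothetical observed responses

    plt.scatter(x, y)                      # one point per paired observation
    plt.xlabel("x")
    plt.ylabel("y")
    plt.title("Scattergram of y versus x")
    plt.show()

A roughly linear cloud of points supports proceeding with a linear regression.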
Suppose that we have n paired observations $(x_i, y_i)$; we predict y by means of $\hat{y} = a + b\cdot x$, where a and b are constants.
Define the prediction error as $e_i = y_i - \hat{y}_i = y_i - (a + b\cdot x_i)$.
The method of least squares requires us to choose a and b so as to minimize the sum of squared errors (SSE),
$$SSE = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left[y_i - (a + b x_i)\right]^2.$$
The minimizing values of a and b satisfy the conditions
$$\frac{\partial (SSE)}{\partial a} = 0, \qquad \frac{\partial (SSE)}{\partial b} = 0.$$
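Carrying out these two differentiations yields the standard normal equations and their closed-form solution; this is a well-known result, stated here for completeness rather than quoted from this part of the manual:
$$\sum_{i=1}^{n} y_i = n\,a + b\sum_{i=1}^{n} x_i, \qquad \sum_{i=1}^{n} x_i y_i = a\sum_{i=1}^{n} x_i + b\sum_{i=1}^{n} x_i^2,$$
so that
$$b = \frac{n\sum x_i y_i - \left(\sum x_i\right)\left(\sum y_i\right)}{n\sum x_i^2 - \left(\sum x_i\right)^2}, \qquad a = \bar{y} - b\,\bar{x}.$$
In words, b equals the sample covariance of x and y divided by the sample variance of x, and the fitted line passes through the point $(\bar{x}, \bar{y})$.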