Ŷ = β0 + β1X1 + β2X2

(A-5)

where β0 = ln *a*

β1 = *b*

β2 = *c*
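As a sketch of this linearization, assume an underlying power-law model Y = *a*·X1^*b*·X2^*c*; the coefficient values and data below are hypothetical, chosen only to illustrate that taking logarithms yields the linear form of Eq. (A-5):

```python
import numpy as np

# Hypothetical power-law model Y = a * X1**b * X2**c; a, b, c chosen for illustration
a, b, c = 2.0, 0.5, 1.5
rng = np.random.default_rng(0)
X1 = rng.uniform(1.0, 10.0, size=50)
X2 = rng.uniform(1.0, 10.0, size=50)
Y = a * X1**b * X2**c

# Taking logarithms gives the linear form of Eq. (A-5):
# ln Y = beta0 + beta1 * ln X1 + beta2 * ln X2, with beta0 = ln a, beta1 = b, beta2 = c
y, x1, x2 = np.log(Y), np.log(X1), np.log(X2)
print(np.allclose(y, np.log(a) + b * x1 + c * x2))  # transform is exact for noise-free data
```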

Now we can restate the problem in a linear form

min ∑ (*Y*i - Ŷi)² = ∑ (*Y*i - β0 - β1X1,i - β2X2,i)² .

(A-6)

The summation index *i* has now been added. The summation runs over the total number of observations *n*. To find the minimum of this expression, we differentiate with respect to each of the parameters and set the result to zero in each case

∂/∂β0 = ∑ (*Y*i - β0 - β1X1,i - β2X2,i) = 0

(A-7)

∂/∂β1 = ∑ *X*1,i (*Y*i - β0 - β1X1,i - β2X2,i) = 0

(A-8)

∂/∂β2 = ∑ *X*2,i (*Y*i - β0 - β1X1,i - β2X2,i) = 0 .

(A-9)

We now have three linear equations in the three unknown parameters, obtained by rearranging as follows

*n*β0 + β1 ∑ *X*1,i + β2 ∑ *X*2,i = ∑ *Y*i

(A-10)

β0 ∑ *X*1,i + β1 ∑ (*X*1,i)² + β2 ∑ *X*2,i X1,i = ∑ *Y*i X1,i

(A-11)

β0 ∑ *X*2,i + β1 ∑ (*X*1,i X2,i) + β2 ∑ (*X*2,i)² = ∑ *Y*i X2,i .

(A-12)
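To make the rearrangement concrete, the normal equations (A-10) through (A-12) can be assembled and solved numerically. The data and "true" coefficient values below are hypothetical; because the observations are noise-free, the solution recovers them exactly:

```python
import numpy as np

# Hypothetical noise-free observations so the fit recovers the coefficients exactly
rng = np.random.default_rng(1)
n = 40
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 1.0 + 2.0 * X1 - 0.5 * X2  # true beta0, beta1, beta2

# Coefficient matrix and right-hand side of Eqs. (A-10)-(A-12)
A = np.array([
    [n,          X1.sum(),        X2.sum()],
    [X1.sum(),   (X1**2).sum(),   (X1 * X2).sum()],
    [X2.sum(),   (X1 * X2).sum(), (X2**2).sum()],
])
rhs = np.array([Y.sum(), (Y * X1).sum(), (Y * X2).sum()])

beta = np.linalg.solve(A, rhs)
print(beta)  # recovers [1.0, 2.0, -0.5] up to floating-point error
```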

These equations can be written in matrix form as

[ A11  A12  A13 ] [ β0 ]   [ ∑ *Y*i      ]
[ A21  A22  A23 ] [ β1 ] = [ ∑ *Y*i X1,i ]
[ A31  A32  A33 ] [ β2 ]   [ ∑ *Y*i X2,i ]

(A-13)

where *A*11 = *n* and *A*22 = ∑ (*X*1,i)²
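The coefficient matrix of Eq. (A-13) is simply XᵀX for a design matrix X whose columns are 1, X1, and X2, so the same system can be formed and solved compactly. A minimal sketch with hypothetical data:

```python
import numpy as np

# Hypothetical data; the design matrix has columns [1, X1, X2]
rng = np.random.default_rng(2)
X1 = rng.uniform(size=25)
X2 = rng.uniform(size=25)
Y = 0.3 + 1.2 * X1 + 0.8 * X2  # true beta0, beta1, beta2

X = np.column_stack([np.ones_like(X1), X1, X2])
# X.T @ X reproduces the A matrix of Eq. (A-13): A11 = n, A12 = sum(X1,i), ...
A = X.T @ X
beta = np.linalg.solve(A, X.T @ Y)
print(beta)  # [0.3, 1.2, 0.8] up to floating-point error
```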
