Question 11:
We have seen that the result for \((x_{1},x_{2})\) that we obtained
above using the notation with \(\lambda^{*}\) and so on is the same as
the result obtained by the procedure in the preceding question. Why is
that?
True: We actually carried out the same procedure in the two cases, just
with slightly different notation.
False: It is because of the utility function \(u=\log{(x_{1})}+\log{(x_{2})}\) that we used. For other utility
functions, the two results obtained using the two procedures will not
always be the same.
Explanation:
Going through what we did before, one sees that we actually carried out
the same procedure in the two cases, just with slightly different
notation.
A side note: Actually, for any problem of maximizing \(u(x)\) under some
constraint \(g\left(x\right)\leq 0\) we can apply the following procedure:
- Write down \(L=u-\lambda g(x)\)
- For each \(i\in\{1,\ldots,n\}\) compute \(\frac{\partial L}{\partial x_{i}}\) and set it equal to \(0\)
- Solve the system of \(n+1\) equations \(\{\frac{\partial u}{\partial x_{1}}-\lambda\frac{\partial g}{\partial x_{1}}=0,\ldots,\frac{\partial u}{\partial x_{n}}-\lambda\frac{\partial g}{\partial x_{n}}=0,\ g(x)=0\}\) in the \(n+1\) unknowns \((x_{1},\ldots,x_{n},\lambda)\); a short computational sketch of these steps follows the list.
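As an illustration of the mechanics, here is a minimal symbolic sketch of the three steps applied to the utility function \(u=\log(x_{1})+\log(x_{2})\) from the question. It assumes, purely for illustration, a linear budget constraint \(g(x)=p_{1}x_{1}+p_{2}x_{2}-y\) with prices \(p_{1},p_{2}\) and income \(y\), and uses sympy to do the algebra; it is a sketch of the procedure, not part of the derivation above.

```python
import sympy as sp

# Symbols: quantities, multiplier, and the assumed prices p1, p2 and income y.
x1, x2, lam, p1, p2, y = sp.symbols("x1 x2 lam p1 p2 y", positive=True)

# Utility from the question and the assumed budget constraint g(x) = p1*x1 + p2*x2 - y.
u = sp.log(x1) + sp.log(x2)
g = p1 * x1 + p2 * x2 - y

# Step 1: write down the Lagrangian L = u - lam * g(x).
L = u - lam * g

# Step 2: partial derivatives of L with respect to each x_i, set to zero below.
foc = [sp.diff(L, v) for v in (x1, x2)]

# Step 3: solve the n + 1 equations (first-order conditions plus the binding constraint)
# in the n + 1 unknowns (x1, x2, lam).
solution = sp.solve(foc + [g], [x1, x2, lam], dict=True)
print(solution)  # [{x1: y/(2*p1), x2: y/(2*p2), lam: 2/y}]
```

With this utility function each good receives half of the income, which matches the demand functions obtained by hand earlier in the capsule.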
The only place where we used the specific properties of the consumer
choice problem was in the way we solved step 3, using the steps 3.1, 3.2, 3.3 and 3.4.
Our analysis from above can be slightly extended to show that, under very
general conditions (always satisfied in the applications we will
encounter in these capsules), the following holds: if this procedure yields a
solution \((x_{1},\ldots,x_{n})\), then this solution will also be a
solution to the problem of maximizing \(u(x)\) under the constraint \(g\left(x\right)\leq 0\).
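To make this claim concrete, the following sketch compares the candidate produced by the Lagrangian procedure with a direct numerical maximization of the constrained problem. The prices and income are arbitrary illustrative values (not from the capsule), and scipy is used only as a generic constrained optimizer.

```python
import numpy as np
from scipy.optimize import minimize

# Arbitrary illustrative values for the budget constraint p1*x1 + p2*x2 <= y.
p1, p2, y = 2.0, 3.0, 12.0

# Maximize u = log(x1) + log(x2) by minimizing its negative.
objective = lambda x: -(np.log(x[0]) + np.log(x[1]))
# The budget constraint written as y - p1*x1 - p2*x2 >= 0, i.e. g(x) <= 0.
constraint = {"type": "ineq", "fun": lambda x: y - p1 * x[0] - p2 * x[1]}

result = minimize(objective, x0=[1.0, 1.0], constraints=[constraint],
                  bounds=[(1e-6, None), (1e-6, None)])

# Candidate from the Lagrangian procedure: x1 = y/(2*p1), x2 = y/(2*p2).
print(result.x)                      # approximately [3.0, 2.0]
print([y / (2 * p1), y / (2 * p2)])  # [3.0, 2.0]
```

The numerical maximizer coincides (up to solver tolerance) with the solution of the Lagrangian system, illustrating the claim for this particular utility function and constraint.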