Basic Calculus
The limit of a function \(f(x)\) as \(x\) approaches \(a\) is the value that \(f(x)\) approaches as \(x\) gets closer and closer to \(a\). We write this as:
\[ \lim_{{x \to a}} f(x) = L \]
Here, \(L\) is the limit of the function \(f(x)\) as \(x\) approaches \(a\).
For example, consider the function \(f(x) = x^2\). The limit of \(f(x)\) as \(x\) approaches 2 is 4:
\[ \lim_{{x \to 2}} x^2 = 4 \]
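To see this numerically, one can evaluate \(x^2\) in Stata at values of \(x\) increasingly close to 2:

```stata
display 1.99^2     // 3.9601
display 1.999^2    // 3.996001
display 2.001^2    // 4.004001
display 2.0001^2   // 4.00040001
```

From both sides, the values approach 4.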
Limits can also be used to define derivatives. The derivative of a function \(f(x)\) is the slope of the function at a given point. The derivative of \(f(x)\) at \(x = a\) is written as \(f'(a)\). The derivative is defined as:
\[ f'(a) = \lim_{{h \to 0}} \frac{f(a+h) - f(a)}{h} \]
In other words, the derivative is the slope of the function at a particular point \(a\). This can be approximated numerically by choosing a very small value for \(h\).
For example, consider the function \(f(x) = x^2\). The derivative of \(f(x)\) at \(x = a\) is:
\[ \begin{aligned} f'(a) &= \lim_{{h \to 0}} \frac{(a+h)^2 - a^2}{h} \\ &= \lim_{{h \to 0}} \frac{a^2 + 2ah + h^2 - a^2}{h} \\ &= \lim_{{h \to 0}} (2a + h) = 2a. \end{aligned} \]
If other methods fail, one can always rely on numerical differentiation.
Stata can be used to calculate numerical derivatives: mata (Stata's matrix algebra language) has powerful routines for numerical differentiation, Stata itself has some capabilities, and you can always do it manually.
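For instance, here is a minimal sketch of both approaches for \(f(x) = x^2\) at \(x = 2\), using mata's deriv() suite (the evaluator name myfunc is just illustrative) and a manual difference quotient:

```stata
mata:
// illustrative evaluator for f(x) = x^2
void myfunc(x, y)
{
    y = x[1]^2
}
D = deriv_init()
deriv_init_evaluator(D, &myfunc())
deriv_init_params(D, 2)    // evaluate the derivative at x = 2
deriv(D, 1)                // first derivative; should be close to 4
end

* manual approximation with a small h
display ((2 + 1e-6)^2 - 2^2) / 1e-6
```

Both should return a value very close to the exact derivative \(f'(2) = 2 \cdot 2 = 4\).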
For most common functions, the derivative can be calculated using the following rules:
\[ \begin{aligned} \frac{d}{dx} c &= 0 \quad \text{(constant rule)}, \\ \frac{d}{dx} x^n &= n x^{n-1} \quad \text{(power rule)}, \\ \frac{d}{dx} e^x &= e^x, \\ \frac{d}{dx} \ln(x) &= \frac{1}{x}. \end{aligned} \]
There are other rules for derivatives, but these are the ones that will be used most often.
The derivative of a composite function \(f(g(x))\) is given by the chain rule:
\[ \frac{d}{dx} f(g(x)) = f'(g(x)) \cdot g'(x). \]
For example, consider the function \(f(x) = \ln(x^2)\). The derivative of \(f(x)\) is:
\[ \begin{aligned} \frac{d}{dx} \ln(x^2) &= \frac{1}{x^2} \cdot \frac{d}{dx} (x^2) \\ &= \frac{1}{x^2} \cdot 2x \\ &= \frac{2}{x}. \end{aligned} \]
The derivative of a sum of functions is the sum of the derivatives of the functions:
\[ \frac{d}{dx} (f(x) + g(x)) = \frac{d}{dx} f(x) + \frac{d}{dx} g(x). \]
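For example, \(\frac{d}{dx} \left( x^2 + \ln(x) \right) = 2x + \frac{1}{x}\).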
The derivative of a product of functions is given by the product rule:
\[ \frac{d}{dx} (f(x) \cdot g(x)) = f'(x) \cdot g(x) + f(x) \cdot g'(x). \]
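For example, \(\frac{d}{dx} \left( x \cdot \ln(x) \right) = 1 \cdot \ln(x) + x \cdot \frac{1}{x} = \ln(x) + 1\).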
The derivative of a quotient of functions is given by the quotient rule:
\[ \frac{d}{dx} \left( \frac{f(x)}{g(x)} \right) = \frac{f'(x) \cdot g(x) - f(x) \cdot g'(x)}{g(x)^2}. \]
This is a special case of the product rule: writing \(\frac{f(x)}{g(x)} = f(x) \cdot g(x)^{-1}\) and applying the product and chain rules gives
\[ \frac{d}{dx} \left( f(x) \cdot g(x)^{-1} \right) = f'(x) \cdot g(x)^{-1} - f(x) \cdot g(x)^{-2} \cdot g'(x) = \frac{f'(x) \cdot g(x) - f(x) \cdot g'(x)}{g(x)^2}. \]
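For example, \(\frac{d}{dx} \left( \frac{\ln(x)}{x} \right) = \frac{\frac{1}{x} \cdot x - \ln(x) \cdot 1}{x^2} = \frac{1 - \ln(x)}{x^2}\).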
Derivatives can be used to identify the maximum and minimum values of a function. Consider a function \(f(x)\).
To find the maximum (or minimum) value of \(f(x)\), we take the derivative of \(f(x)\) and set it equal to zero.
For example, consider the function \(f(x) = 5x^2 - 4x + 2\). Setting the derivative of \(f(x)\) equal to zero:
\[ \begin{aligned} f'(x) &= 10x - 4 = 0 \\ x &= \frac{4}{10} = 0.4. \end{aligned} \]
So when \(x\) is equal to 0.4, the slope of \(f(x)\) is zero: we have found a critical point, but we do not yet know whether it is a maximum or a minimum. To determine this, we take the second derivative of \(f(x)\), known as the second-order condition:
\[ f''(x) = 10 > 0. \]
Since the second derivative is positive, the function is convex at this point, so \(x = 0.4\) is a minimum.
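As a quick numerical check, here is a minimal sketch using mata's optimize() suite (the evaluator name myobj is just illustrative) to minimize the function directly:

```stata
mata:
// illustrative evaluator for f(x) = 5x^2 - 4x + 2
void myobj(todo, x, v, g, H)
{
    v = 5*x[1]^2 - 4*x[1] + 2
}
S = optimize_init()
optimize_init_evaluator(S, &myobj())
optimize_init_which(S, "min")    // minimize rather than maximize
optimize_init_params(S, 0)       // starting value x = 0
optimize(S)                      // should return approximately .4
end
```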
When considering multiple variables, we also need to rely on the first- and second-order conditions to find minimum and maximum values. Consider a function \(f(x, y)\). The first-order conditions are:
\[ \begin{aligned} \frac{\partial}{\partial x} f(x, y) &= 0, \\ \frac{\partial}{\partial y} f(x, y) &= 0. \end{aligned} \]
These conditions indicate that, in the direction of \(x\) and \(y\), the function \(f(x, y)\) is not changing anymore. Thus, we have a potential maximum or minimum. To distinguish between them, we need second-order conditions, which are based on the matrix of second derivatives:
\[ H = \begin{bmatrix} f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \end{bmatrix}, \]
where \(H\) is the Hessian matrix. At a critical point, the function has a minimum if \(H\) is positive definite (\(f_{xx} > 0\) and \(\det H > 0\)) and a maximum if \(H\) is negative definite (\(f_{xx} < 0\) and \(\det H > 0\)).
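For example, for \(f(x, y) = x^2 + y^2\) the Hessian is
\[ H = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}, \]
which is positive definite, so the critical point at \((0, 0)\) is a minimum.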
When optimizing a function with constraints, we can use the method of Lagrange multipliers. Consider a function \(f(x, y)\) subject to the constraint \(g(x, y) = z\). The Lagrangian is:
\[ L(x, y, \lambda) = f(x, y) + \lambda (z - g(x, y)). \]
The corresponding first-order conditions are:
\[ \begin{aligned} \frac{\partial}{\partial x} L(x, y, \lambda) &= 0, \\ \frac{\partial}{\partial y} L(x, y, \lambda) &= 0, \\ \frac{\partial}{\partial\lambda} L(x, y, \lambda) &= z - g(x, y) = 0. \end{aligned} \]
The last condition simply recovers the constraint, ensuring it is satisfied at the optimum. The second-order conditions are analogous to before.
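For example, consider maximizing \(f(x, y) = xy\) subject to the constraint \(x + y = 10\):
\[ \begin{aligned} L(x, y, \lambda) &= xy + \lambda (10 - x - y), \\ \frac{\partial}{\partial x} L &= y - \lambda = 0, \\ \frac{\partial}{\partial y} L &= x - \lambda = 0, \\ \frac{\partial}{\partial \lambda} L &= 10 - x - y = 0. \end{aligned} \]
The first two conditions give \(x = y = \lambda\), and the constraint then gives \(x = y = 5\), with maximum value \(f(5, 5) = 25\).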