Newton's approximation
Informally, if \( f \) is differentiable at \( x_0 \), then one has the approximation \( f(x) \approx f(x_0) + f'(x_0)(x - x_0) \), and conversely.
The formal version (Newton's approximation):
Let \( X \) be a subset of \( \mathbb{R} \), let \( x_0 \in X \) be a limit point of \( X \), let \( f : X \to \mathbb{R} \) be a function, and let \( L \) be a real number. Then the following statements are logically equivalent:
1. \( f \) is differentiable at \( x_0 \) with derivative \( L \).
2. For any \( \varepsilon > 0 \), there exists a \( \delta > 0 \) such that
for all \( x \in X \) with \( |x - x_0| < \delta \), \( f(x) \) is
\( \varepsilon |x - x_0| \)-close to \( f(x_0) + L(x - x_0) \).
In other words, we have:
\( |f(x) - (f(x_0) + L(x - x_0))| \le \varepsilon|x - x_0| \)
Statement 2 can be phrased as follows: fix \( x_0 \). Given a challenge of \( \varepsilon \), it is possible to find a neighbourhood around \( x_0 \) on which \( f(x_0) \) and \( L \) (which, by statement 1, is \( f'(x_0) \)) can be used to approximate \( f \), with an approximation error of at most \( \varepsilon |x - x_0| \) at each point \( x \) of the neighbourhood.
This seems like such a confusing way to formulate the idea.
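To make the quantifier game concrete, here is a small worked example (my own, not from the text): take \( f(x) = x^2 \) and \( x_0 = 1 \), so \( L = f'(1) = 2 \). The error of the linear approximation is
\( |f(x) - (f(x_0) + L(x - x_0))| = |x^2 - 1 - 2(x - 1)| = (x - 1)^2 = |x - 1| \cdot |x - 1| \).
So given a challenge \( \varepsilon > 0 \), choosing \( \delta = \varepsilon \) works: whenever \( |x - 1| < \delta \), the error is at most \( \varepsilon |x - 1| \), which is exactly statement 2. Conversely, the fact that a single value of \( L \) survives every \( \varepsilon \)-challenge is what forces \( L \) to be the derivative of \( f \) at \( x_0 \).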