
Thursday, December 26, 2013

Theoretical Remarks #1 - indefinite integrals, antiderivatives and the function $\int_{a}^{x}f(t)dt$

   Suppose we are given a continuous, real function $f(x)$ defined on an interval $\Delta \subseteq \mathbb{R}$ and let $a \in \Delta$ be a fixed point.
   A function $F(x)$, with domain $D_{F}= \Delta$, will be called an antiderivative function of $f$ if
$$
F'(x)=f(x)
$$
(notice that $F$ is, by definition, differentiable (and thus continuous) on $\Delta$.)
   The above definition implies that the antiderivative function is not uniquely determined; rather, there is a whole family of functions satisfying the above relation. Indeed, any other function $G(x)$, with $D_{G}= \Delta$, having the property
\begin{equation} \notag
G'(x)=F'(x)=f(x)
\end{equation}
will also be an antiderivative function. In such a case $F(x)$ and $G(x)$ will differ by a constant:
\begin{equation} \notag
G(x)=F(x)+c
\end{equation}
for some $c \in \mathbb{R}$ (this follows from a well-known theorem of elementary calculus). We can thus state the following
   Definition: We will call the antiderivative or indefinite integral of $f$, denoted by $\int f(x)dx$, the set of all functions satisfying the above property; thus:
\begin{equation} \notag
\int f(x)dx = \{F \mid F'(x)=f(x), \ x \in \Delta \} = \{G(x)+c \mid c \in \mathbb{R} \}
\end{equation}
where in the last equality $G$ is an antiderivative function (actually any antiderivative function) of $f$.
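The fact invoked above, that two antiderivatives of $f$ on an interval differ by a constant, is the standard corollary of the Mean Value Theorem; a brief sketch of the argument: set $H=G-F$ and apply the Mean Value Theorem to $H$ on an arbitrary $[x_{1},x_{2}] \subseteq \Delta$, obtaining, for some $\xi \in (x_{1},x_{2})$:
\begin{equation} \notag
H(x_{2})-H(x_{1}) = H'(\xi)(x_{2}-x_{1}) = \big( G'(\xi)-F'(\xi) \big)(x_{2}-x_{1}) = 0
\end{equation}
Hence $H$ is constant on $\Delta$. Notice that it is essential here that $\Delta$ is an interval.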

   Now it can be proved (the proof can be found in standard calculus texts and I will - hopefully - post it here later) that:
Proposition: One of the functions belonging to the above set (that is, one of the antiderivative functions of $f$, or equivalently, one of the indefinite integrals of $f$) is the function
\begin{equation} \notag
\int_{a}^{x} f(t)dt
\end{equation}
In other words: $(\int_{a}^{x} f(t)dt )'=f(x)$ for all $x \in \Delta$.
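As an informal numerical illustration of the proposition, one can approximate $\int_{a}^{x} f(t)dt$ by a Riemann sum and check that its difference quotient agrees with $f(x)$; a minimal Python sketch, where the choice $f(t)=\cos t$ and all the names are purely illustrative:

```python
import math

def integral(f, a, x, n=20000):
    # Midpoint Riemann-sum approximation of the definite integral of f over [a, x]
    h = (x - a) / n
    return h * sum(f(a + (k + 0.5) * h) for k in range(n))

def derivative(F, x, eps=1e-4):
    # Central-difference approximation of F'(x)
    return (F(x + eps) - F(x - eps)) / (2 * eps)

f = math.cos
a = 0.0

def F(x):
    # F(x) = integral of f(t) from a to x; by the proposition, F'(x) = f(x)
    return integral(f, a, x)

for x in (0.5, 1.0, 2.0):
    print(x, derivative(F, x), f(x))  # the last two columns should nearly coincide
```

The agreement is only up to the discretization errors of the Riemann sum and of the finite difference, of course; the proposition itself asserts exact equality.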

Remarks:
(1). Notice that the above proposition readily implies that $\int_{a}^{x} f(t)dt$ is differentiable (and thus continuous) for any $x \in \Delta$. Of course the $\ ' \ $ symbol indicates differentiation with respect to the variable $x$.
(2). Thus: the definite integral $\int_{a}^{x} f(t)dt $ with variable upper limit of integration, is an antiderivative function of $f$. Consequently, we can write
\begin{equation} \notag
\int f(x)dx = \int_{a}^{x} f(t)dt + c
\end{equation}
where $c$ ranges over all of $\mathbb{R}$ (the equality being an equality between sets of functions).
(3). What the above proposition actually tells us is that: any function $f$ which is continuous on an interval $\Delta \subseteq \mathbb{R}$, has an antiderivative function given by $\int_{a}^{x} f(t)dt $ for $a,x \in \Delta$.
(4). It is worth noticing the role of the number $a \in \Delta$: varying the value of $a$ produces different antiderivative functions (because varying $a$ simply alters the value of the constant of integration $c$). However, we cannot hope that varying the value of $a$ "covers" all possible antiderivatives of a given continuous function $f$. In other words: although the above proposition tells us that $\int_{a}^{x} f(t)dt$ is an antiderivative function of $f$, not every antiderivative function of $f$ can be expressed as $\int_{a}^{x} f(t)dt$ for some $a \in \Delta$. This can be clearly seen in the following example:
   If $f(x)=2x$ and $\Delta = \mathbb{R}$, then for  any real $a$ we have
\begin{equation} \notag
\int_{a}^{x} 2t \, dt = [t^{2}]_{a}^{x}=x^{2}-a^{2}
\end{equation}
But the family of functions $\{F(x)=x^{2}-a^{2}, \ x \in \mathbb{R} \mid a \in \mathbb{R} \}$ does not include, for example, the function $G(x)=x^{2}+1$, which is an obvious antiderivative function of $f(x)=2x$.
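Indeed, for $x^{2}+1$ to belong to this family we would need, for some fixed $a \in \mathbb{R}$:
\begin{equation} \notag
x^{2}+1 = x^{2}-a^{2} \iff a^{2}=-1
\end{equation}
which is impossible for real $a$.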
