Mistakes regarding derivatives

Regarding concepts

  • A function can be continuous and non-differentiable at the same time. Can we say that when a function is not differentiable at a point, the cause is a division by zero? No. This idea comes from the fact that we first see the derivative in a relationship with the tangent line. We have this limit: [math]\displaystyle{ \lim_{h \ \to \ 0} \frac{f(x + h) - f(x)}{h} }[/math] and [math]\displaystyle{ h \to 0 }[/math] means that we are making the distance between two points on the number line approach zero. Are we dividing by zero? No, and this is the level of abstraction that we are faced with when we first learn about limits. Sometimes we naively assume that saying a limit doesn't exist is the same as saying that we tried to divide by zero. The example below illustrates the point.
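
    For a concrete illustration, take [math]\displaystyle{ f(x) = |x| }[/math] at [math]\displaystyle{ x = 0 }[/math]. The function is continuous there, but the two one-sided limits of the difference quotient disagree, so the derivative doesn't exist, and at no point do we divide by zero:

    [math]\displaystyle{ \lim_{h \ \to \ 0^+} \frac{|0 + h| - |0|}{h} = \lim_{h \ \to \ 0^+} \frac{h}{h} = 1 \qquad \text{while} \qquad \lim_{h \ \to \ 0^-} \frac{|0 + h| - |0|}{h} = \lim_{h \ \to \ 0^-} \frac{-h}{h} = -1 }[/math]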


  • There is a confusion that can happen regarding partial derivatives. When we say "treat one variable as a constant", this shouldn't be interpreted as turning that variable into a constant function. There are many examples in physics where we have a multivariable function and integrate with respect to one variable while the others are treated as constants. This isn't the same as saying that the other variables don't change their respective values or that they are always constant. To treat a variable as a constant means that we are not interested in its value while we differentiate or integrate with respect to some other variable. We aren't ignoring its existence. See the short sketch below.
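
    A short sketch of what "treat as a constant" means in practice, using an arbitrarily chosen function: for [math]\displaystyle{ f(x,y) = x^2 y }[/math], the partial derivative with respect to [math]\displaystyle{ x }[/math] holds [math]\displaystyle{ y }[/math] fixed during the differentiation, yet [math]\displaystyle{ y }[/math] remains a variable of the result:

    [math]\displaystyle{ \frac{\partial f}{\partial x} = 2xy }[/math]

    The answer still depends on [math]\displaystyle{ y }[/math]; we only refrained from differentiating with respect to it.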


  • A small mistake happens with Leibniz's notation. [math]\displaystyle{ \frac{dy}{dx}(x + 1) \neq \frac{d}{dx}(x + 1) }[/math]. The right side really means to differentiate [math]\displaystyle{ x + 1 }[/math]. The left side is a product between [math]\displaystyle{ x + 1 }[/math] and [math]\displaystyle{ y' }[/math], the derivative of some unknown function. The example below makes the difference visible.
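
    To see the difference with a concrete choice of [math]\displaystyle{ y }[/math] (say [math]\displaystyle{ y = x^2 }[/math], picked only for illustration, so that [math]\displaystyle{ \frac{dy}{dx} = 2x }[/math]):

    [math]\displaystyle{ \frac{dy}{dx}(x + 1) = 2x(x + 1) \qquad \text{whereas} \qquad \frac{d}{dx}(x + 1) = 1 }[/math]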


  • Take a function [math]\displaystyle{ f(x,y) = x + y }[/math] for example. When we take the partial derivative with respect to one variable, the term containing only the other variable becomes zero. Each partial derivative is still a function of two variables, but the contribution of one of them is zero. Because of that, some people may think that the partial derivative is an operation that removes a variable from a function. Writing it out (below) shows that nothing has been removed.
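
    Writing it out for [math]\displaystyle{ f(x,y) = x + y }[/math]:

    [math]\displaystyle{ \frac{\partial f}{\partial x}(x,y) = 1 + 0 = 1 \qquad \text{and} \qquad \frac{\partial f}{\partial y}(x,y) = 0 + 1 = 1 }[/math]

    Both partial derivatives are still functions defined on pairs [math]\displaystyle{ (x,y) }[/math]; the other variable merely contributes a zero term, it hasn't been removed from the domain.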


  • Moving from one to many variables can impose a certain degree of challenge in regards to rates of change. A rate of change is always about the function assuming one value at one point and another value at another point; the two values may be equal or different. It may happen that some people think about rates of change as one variable with respect to the other, for example thinking that for [math]\displaystyle{ f(x,y) }[/math] one variable is the rise and the other is the run of a right triangle. We don't do this in calculus because we always assume that one variable has no dependency on the others. The sketch below shows the intended picture.
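
    A sketch of the intended picture, for the rate of change of [math]\displaystyle{ f(x,y) }[/math] in the [math]\displaystyle{ x }[/math] direction: the "rise" is the change in the value of [math]\displaystyle{ f }[/math], not the other variable, and the "run" is the increment of [math]\displaystyle{ x }[/math] with [math]\displaystyle{ y }[/math] held fixed:

    [math]\displaystyle{ \frac{\partial f}{\partial x}(x,y) = \lim_{h \ \to \ 0} \frac{f(x + h, y) - f(x, y)}{h} }[/math]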


  • It may happen that some people confuse the concepts of integration, differentiation and the inverse function. Integration and differentiation are inverses of each other, but that's not the same as finding the inverse of a function, as the example below shows.
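
    A quick example of the distinction, with [math]\displaystyle{ f(x) = x^2 }[/math] restricted to [math]\displaystyle{ x \geq 0 }[/math]: an antiderivative of [math]\displaystyle{ f }[/math] is [math]\displaystyle{ \frac{x^3}{3} }[/math], while the inverse function is [math]\displaystyle{ f^{-1}(x) = \sqrt{x} }[/math]. Differentiating the antiderivative gives back [math]\displaystyle{ x^2 }[/math]; composing [math]\displaystyle{ f }[/math] with its inverse gives back [math]\displaystyle{ x }[/math]. They are two unrelated operations.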

Regarding the chain rule

  • I think that the most common mistake with the chain rule is to differentiate the nested function twice, like this: [math]\displaystyle{ g'(x)f'(g'(x)) }[/math]. One way to avoid this common mistake is to remember that we have a product of derivatives, not a composition of derivatives (see the worked example after this item). The other cause for such confusion is the lack of meaningful units. [math]\displaystyle{ g'(x) }[/math] and [math]\displaystyle{ g(x) }[/math] are two different functions, and if they have different interpretations, their respective units can't be the same. When a function is used as the argument of another, their units must make sense and be compatible with that operation.

    [math]\displaystyle{ v(t) }[/math], for example, is velocity that depends on time. The composition [math]\displaystyle{ v(v(t)) }[/math] is meaningless, because since when does velocity depend on another velocity? In this case the inner function should have been something like [math]\displaystyle{ t(x) }[/math], time that depends on something else.

    Many math textbooks (high school level or pre-calculus) contain exercises about composite functions such as [math]\displaystyle{ f(f(f(x))) }[/math]. It's algebraically correct, but we don't see such operations in physics.
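
    As a worked illustration of the correct form (the functions here are chosen arbitrarily): with [math]\displaystyle{ f(u) = \sin(u) }[/math] and [math]\displaystyle{ g(x) = x^2 }[/math],

    [math]\displaystyle{ \frac{d}{dx} f(g(x)) = g'(x) f'(g(x)) = 2x \cos(x^2) }[/math]

    The mistaken form [math]\displaystyle{ g'(x) f'(g'(x)) = 2x \cos(2x) }[/math] evaluates the outer derivative at the derivative of the inner function instead of at the inner function itself.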


  • [math]\displaystyle{ \frac{\partial}{\partial x} \sin(x + y) \neq \cos(x) }[/math].
  • [math]\displaystyle{ \frac{\partial}{\partial x} \sin(x + y) \neq \cos(y) }[/math].
  • [math]\displaystyle{ \frac{\partial}{\partial x} \sin(x + y) \neq \cos(y + 1) }[/math].
  • [math]\displaystyle{ \frac{\partial}{\partial x} \sin(x + y) \neq \cos(x + 1) }[/math]. The correct answer for this partial derivative is [math]\displaystyle{ \cos(x+y) }[/math], computed below. The mistake is the same one that happens with single-variable functions; with multiple variables it only gets worse.
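
    Step by step, treating [math]\displaystyle{ y }[/math] as a constant and applying the chain rule to the inner function [math]\displaystyle{ x + y }[/math]:

    [math]\displaystyle{ \frac{\partial}{\partial x} \sin(x + y) = \cos(x + y) \cdot \frac{\partial}{\partial x}(x + y) = \cos(x + y) \cdot (1 + 0) = \cos(x + y) }[/math]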


  • [math]\displaystyle{ \frac{\partial}{\partial x} \sin(x + xy) = (y + 1) \cos(x+xy) }[/math]. How about this one? Now [math]\displaystyle{ y }[/math] is treated as a constant, but [math]\displaystyle{ xy }[/math] is no longer a constant. To calculate this derivative we cannot forget the chain rule! The derivative of [math]\displaystyle{ x }[/math] with respect to [math]\displaystyle{ x }[/math] is 1 and the derivative of [math]\displaystyle{ xy }[/math] with respect to [math]\displaystyle{ x }[/math] is [math]\displaystyle{ y }[/math]. The rest is the same as in the example above.
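
    Written out step by step:

    [math]\displaystyle{ \frac{\partial}{\partial x} \sin(x + xy) = \cos(x + xy) \cdot \frac{\partial}{\partial x}(x + xy) = \cos(x + xy) \cdot (1 + y) = (y + 1)\cos(x + xy) }[/math]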

Regarding calculations

  • [math]\displaystyle{ f(x) = x^{\frac{1}{2}} }[/math] and [math]\displaystyle{ f(x) = \frac{1}{x} }[/math]. A very common mistake is to forget that roots are rational exponents and that negative exponents mean the reciprocal of a number or variable. This leads to calculating these derivatives the wrong way instead of rewriting first, as below.
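
    Rewriting first and then applying the power rule:

    [math]\displaystyle{ f(x) = x^{\frac{1}{2}} \implies f'(x) = \frac{1}{2} x^{-\frac{1}{2}} = \frac{1}{2\sqrt{x}} \qquad \text{and} \qquad f(x) = \frac{1}{x} = x^{-1} \implies f'(x) = -x^{-2} = -\frac{1}{x^2} }[/math]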


  • [math]\displaystyle{ f(x) = x^n \implies f'(x) = nx^{n \ - \ 1} }[/math]. A not uncommon misstep here is to write [math]\displaystyle{ f'(x) = (n - 1)x^{n \ - \ 1} }[/math]. I'd say that among all errors in mathematics, any formula that has an [math]\displaystyle{ (n \pm 1) }[/math] term in it is susceptible to this very mistake. Another mistake is to forget the constant "c" in [math]\displaystyle{ f(x) = cx^n }[/math] if there is one (there is always a constant 1, by the way). Yet another mistake is to apply this rule unchanged to something like [math]\displaystyle{ [\sin(x)]^n }[/math], where it is not enough on its own (see below).
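
    For the last case the power rule alone is not enough, because the base is itself a function of [math]\displaystyle{ x }[/math]; the chain rule supplies the missing factor:

    [math]\displaystyle{ \frac{d}{dx} [\sin(x)]^n = n[\sin(x)]^{n \ - \ 1} \cos(x) }[/math]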


  • [math]\displaystyle{ f(x) = e^x \implies f'(x) = e^x }[/math]. Sometimes we memorize this and make this mistake: [math]\displaystyle{ f(x) = e^{2x} \not\!\!\!\implies f'(x) = e^{2x} }[/math] because we forget that this is a composite function.
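
    Applying the chain rule with the inner function [math]\displaystyle{ g(x) = 2x }[/math]:

    [math]\displaystyle{ f(x) = e^{2x} \implies f'(x) = e^{2x} \cdot \frac{d}{dx}(2x) = 2e^{2x} }[/math]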


  • l'Hospital's rule says that when we have a quotient whose numerator and denominator both converge to infinity or both converge to zero, we can differentiate the two functions separately and evaluate the limit of the new quotient, which (when it exists) equals the original limit. Sometimes people may see the quotient [math]\displaystyle{ f'/g' }[/math] and use the quotient rule of derivatives. What we are doing with l'Hospital's rule is evaluating a limit, not trying to calculate the derivative of a quotient!
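
    A small example of the intended use, with the indeterminate form [math]\displaystyle{ \frac{0}{0} }[/math]: to evaluate [math]\displaystyle{ \lim_{x \ \to \ 0} \frac{e^x - 1}{x} }[/math] we differentiate the numerator and the denominator separately and obtain [math]\displaystyle{ \lim_{x \ \to \ 0} \frac{e^x}{1} = 1 }[/math]. At no point do we form [math]\displaystyle{ \left( \frac{f}{g} \right)' = \frac{f'g - fg'}{g^2} }[/math].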


  • When we have limits of products or quotients, we take the product or quotient of the limits (provided those limits exist). The derivative is a limit, but there is no rule saying that the product or quotient of derivatives is the derivative of the product or the quotient. I think this happens mostly because people memorize rules without associating them with a meaning.
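
    A one-line counterexample: with [math]\displaystyle{ f(x) = g(x) = x }[/math], the derivative of the product is [math]\displaystyle{ (x \cdot x)' = 2x }[/math], while the product of the derivatives is [math]\displaystyle{ 1 \cdot 1 = 1 }[/math]. The correct rule is the product rule, [math]\displaystyle{ (fg)' = f'g + fg' }[/math].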


  • With the definition of the derivative as a limit, one may go wrong when substituting into [math]\displaystyle{ f(x + h) }[/math]: for the function [math]\displaystyle{ f(x) = x^3 }[/math], the substitution ends up as [math]\displaystyle{ h^3 }[/math] instead of the correct [math]\displaystyle{ (x + h)^3 }[/math]. I remember making this mistake many times. It's some quirk where we see the variable [math]\displaystyle{ x }[/math] but for some reason ignore it and think that the increment is the variable itself. It happens with multivariable functions as well.
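
    The correct substitution, carried through the limit:

    [math]\displaystyle{ \lim_{h \ \to \ 0} \frac{(x + h)^3 - x^3}{h} = \lim_{h \ \to \ 0} \frac{3x^2 h + 3xh^2 + h^3}{h} = \lim_{h \ \to \ 0} \left( 3x^2 + 3xh + h^2 \right) = 3x^2 }[/math]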


  • When computing directional derivatives, some people substitute the vector's components into the function, writing [math]\displaystyle{ f(a,b) }[/math] as if [math]\displaystyle{ f(x,y) = f(a,b) }[/math]. It's a confusion between the vector's coordinates and the function's variables (see the formula after this item).

    [math]\displaystyle{ \frac{\partial f}{\partial x}(x,y) }[/math] is a partial derivative with respect to [math]\displaystyle{ x }[/math].

    [math]\displaystyle{ \frac{\partial f}{\partial \overrightarrow{v}}(a,b) }[/math] is a directional derivative in the direction of [math]\displaystyle{ \overrightarrow{v} = (a,b) }[/math], which requires us to compute the partial derivatives with respect to [math]\displaystyle{ x }[/math] and [math]\displaystyle{ y }[/math].
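
    Assuming [math]\displaystyle{ f }[/math] is differentiable and [math]\displaystyle{ \overrightarrow{v} = (a,b) }[/math] is a unit vector, the directional derivative at a point combines both partial derivatives; the components [math]\displaystyle{ a }[/math] and [math]\displaystyle{ b }[/math] act as weights, they are not values substituted into [math]\displaystyle{ f }[/math]:

    [math]\displaystyle{ \frac{\partial f}{\partial \overrightarrow{v}}(x,y) = \frac{\partial f}{\partial x}(x,y) \, a + \frac{\partial f}{\partial y}(x,y) \, b }[/math]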


  • l'Hospital's rule is applicable when we consider the limit of the quotient [math]\displaystyle{ f(x)/g(x) }[/math] as a whole, not the product of the first function by the reciprocal of the second. [math]\displaystyle{ \frac{\lim_{x \ \to \ 0}\sin(x)}{\lim_{x \ \to \ 0} x} }[/math] is not the same thing as [math]\displaystyle{ \lim_{x \ \to \ 0} \sin(x) \cdot \lim_{x \ \to \ 0}\frac{1}{x} }[/math], and neither of them equals [math]\displaystyle{ \lim_{x \ \to \ 0} \frac{\sin(x)}{x} }[/math].
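
    Evaluating the limit as a single quotient with the indeterminate form [math]\displaystyle{ \frac{0}{0} }[/math]:

    [math]\displaystyle{ \lim_{x \ \to \ 0} \frac{\sin(x)}{x} = \lim_{x \ \to \ 0} \frac{\cos(x)}{1} = 1 }[/math]

    Neither the quotient of the two limits nor the product with [math]\displaystyle{ \lim_{x \ \to \ 0} \frac{1}{x} }[/math] gives this result, because [math]\displaystyle{ \lim_{x \ \to \ 0} x = 0 }[/math] and [math]\displaystyle{ \lim_{x \ \to \ 0} \frac{1}{x} }[/math] does not exist.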