Limits at or with infinity

From Applied Science

In calculus we aren't concerned with counting elements of a set. What we do is either calculate limits that have no boundary, limits that diverge to infinity itself, or calculate limits that converge to a number, some finite quantity.

Some people may find limits somewhat contradictory in the sense of having a number on one side and something that isn't a number on the other side. I think this is where people may make a leap of faith: if one is unable to comprehend a limit, they may accept or reject results out of faith.

Asymptotes

For the purposes of calculus and analytic geometry, asymptotes are dashed lines that represent a boundary: a curve converges to that boundary at infinity. There isn't much that we do with asymptotes in calculus, except to trace graphs. The difference between [math]\displaystyle{ f(x) = x^2 }[/math] and [math]\displaystyle{ f(x) = 1/x^2 }[/math] is that the former has no asymptote, while the latter does. An asymptote represents a limit: the function gets arbitrarily close to it but, for the functions considered here, never goes beyond it. For functions of two variables the idea is the same, except that the asymptote would be a whole plane, not just a line.

Limit going to infinity

[math]\displaystyle{ \lim_{x \ \to \ \infty} x^2 = \infty }[/math] because the function can grow indefinitely. [math]\displaystyle{ \lim_{x \ \to \ \infty} \frac{1}{x} = 0 }[/math] because we are dividing a number by something very large, or into infinitely many small parts. Is there a rigorous way to prove that our intuition is correct in both cases? Yes, there is. The idea is pretty abstract because the whole concept is: "there is a number that is very large; we can add one and make it even larger, and continue this process indefinitely". The other side of the same coin is to divide a number and repeat that indefinitely. We know that dividing anything in half yields smaller parts. Doing it again and again would come to an end at the atomic and then the subatomic level. But mathematics allows us to go beyond the smallest atom, and there lies the abstraction: how can something be smaller than a subatomic particle?
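A quick numerical sketch of both intuitions. The sample points 10, 1000 and 1 000 000 are an arbitrary choice standing in for "ever larger numbers":

```python
# Numerical check: x**2 grows without bound while 1/x shrinks toward 0
# as x takes ever larger values. The sample points are arbitrary.
for x in [10, 1000, 1_000_000]:
    print(f"x = {x:>9}  x^2 = {x**2:<13}  1/x = {1 / x}")
```

Trying still larger values only pushes `x**2` further up and `1/x` closer to 0; the process never ends, which is exactly the informal picture above.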

The definition of an infinite limit at a given, finite, point [math]\displaystyle{ a }[/math] has the same shape as the usual definition: for every [math]\displaystyle{ M \gt 0 }[/math] there is a [math]\displaystyle{ \delta \gt 0 }[/math] such that

[math]\displaystyle{ 0 \lt |x - a| \lt \delta \implies f(x) \gt M }[/math]

The left inequality says that the distance between [math]\displaystyle{ x }[/math] and [math]\displaystyle{ a }[/math] is positive, that is, [math]\displaystyle{ x \neq a }[/math]. [math]\displaystyle{ \delta }[/math] is greater than that distance.

M is a large number, as large as one could think of. We are stating that no matter how large M is, whenever [math]\displaystyle{ x }[/math] is close enough to [math]\displaystyle{ a }[/math], [math]\displaystyle{ f(x) }[/math] is still larger than that.

(If you didn't quite grasp the concept, try a large number such as 99 or 1000 for M and find a [math]\displaystyle{ \delta }[/math] that works. Then try an even bigger M and do it again. This is the role that the letters M and [math]\displaystyle{ \delta }[/math] play in the definition.)
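The suggestion above can be carried out concretely. A sketch for [math]\displaystyle{ f(x) = 1/x^2 }[/math] at [math]\displaystyle{ a = 0 }[/math] (the choice of function and of [math]\displaystyle{ \delta = 1/\sqrt{M} }[/math] are ours, just for illustration): given any M, every x with [math]\displaystyle{ 0 \lt |x| \lt 1/\sqrt{M} }[/math] satisfies [math]\displaystyle{ 1/x^2 \gt M }[/math].

```python
import math

def f(x):
    return 1 / x**2  # f(x) = 1/x^2, which blows up near x = 0

# For lim_{x -> 0} 1/x^2 = infinity: given any M, delta = 1/sqrt(M) works,
# because 0 < |x| < 1/sqrt(M) implies x^2 < 1/M, hence 1/x^2 > M.
for M in [100, 10_000, 1_000_000]:
    delta = 1 / math.sqrt(M)
    x = delta / 2          # any x with 0 < |x - 0| < delta would do
    print(f"M = {M:>9}: delta = {delta:.4f}, f({x:.5f}) = {f(x):.1f} > M")
```

Making M bigger only makes [math]\displaystyle{ \delta }[/math] smaller; the function clears every "ceiling" M as long as we stay close enough to 0.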

Now for the side limits:

If [math]\displaystyle{ f(x) \gt M }[/math] whenever [math]\displaystyle{ 0 \lt x - a \lt \delta }[/math], we write

[math]\displaystyle{ \lim_{x \ \to \ a^{+}} f(x) = +\infty }[/math]

If [math]\displaystyle{ f(x) \gt M }[/math] whenever [math]\displaystyle{ -\delta \lt x - a \lt 0 }[/math], we write

[math]\displaystyle{ \lim_{x \ \to \ a^{-}} f(x) = +\infty }[/math]

If both sides diverge in the same way, the two-sided limit exists in this extended sense and is infinite itself. The other case is [math]\displaystyle{ f(x) \lt -M }[/math], for negative infinity.
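A standard pair of examples, at [math]\displaystyle{ a = 0 }[/math]: the sided limits of [math]\displaystyle{ 1/x }[/math] disagree, while those of [math]\displaystyle{ 1/x^2 }[/math] agree.

```latex
% Sided limits need not agree. For f(x) = 1/x at a = 0:
\lim_{x \to 0^{+}} \frac{1}{x} = +\infty
\qquad
\lim_{x \to 0^{-}} \frac{1}{x} = -\infty
% so the two-sided limit does not exist. For f(x) = 1/x^2 both sides
% agree, and we write:
\lim_{x \to 0} \frac{1}{x^2} = +\infty
```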

Graphically we have this:

One may ask: does the graph ever touch or cross the dashed line (asymptote)? We can imagine the graph touching the dashed line at infinity, but for all practical purposes it never will. It is wrong to write [math]\displaystyle{ f(x) = \infty }[/math]; infinity can never be reached. The graph shows that M is large, but the function assumes values above it. There is a level of abstraction here because [math]\displaystyle{ a }[/math] is a finite value on the number line, yet way up towards infinity the function keeps going up; there is no "ceiling" or upper boundary. For negative infinity it's the same concept, except that the graph goes down: M is negative and the function lies below it.

Another question that may arise here: the graph is symmetrical, but the concept applies to any continuous function. It's drawn symmetrical to show that the function has two limits, one to the left of [math]\displaystyle{ a }[/math] and another to the right, that diverge in the same way. If the function is [math]\displaystyle{ f(x) = e^x }[/math], for example, the limit at infinity is infinite but the graph has no symmetry. In this case it's harder to picture, because even assuming that we plot up to some large number, between it and infinity there are always larger numbers that we just won't be able to see on the graph. The graph exists for arbitrarily large values, but we can't plot all of it. Some people may find this abstract concept really hard to grasp; for others it can be easy.

Limit with infinity

Now for the opposite problem: [math]\displaystyle{ x \to \infty }[/math] with the function having a finite limit there. This may be easier to grasp because, when a limit exists and is a number, it is easier to see it as a limit: it's really a visible boundary.

For all [math]\displaystyle{ \epsilon \gt 0 }[/math], there is a [math]\displaystyle{ \delta \gt 0 }[/math] such that [math]\displaystyle{ |f(x) - L| \lt \epsilon }[/math] for all [math]\displaystyle{ x \gt \delta }[/math]. (Here [math]\displaystyle{ \delta }[/math] plays the role of a large threshold rather than a small distance.)

In other words: [math]\displaystyle{ \lim_{x \ \to \ \infty} f(x) = L }[/math]
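A worked instance of the definition, for [math]\displaystyle{ f(x) = 1/x }[/math] and [math]\displaystyle{ L = 0 }[/math]: given any [math]\displaystyle{ \epsilon }[/math], the choice [math]\displaystyle{ \delta = 1/\epsilon }[/math] does the job.

```latex
% Claim: \lim_{x \to \infty} 1/x = 0.
% Given \epsilon > 0, choose \delta = 1/\epsilon. Then, for all x > \delta:
x > \frac{1}{\epsilon}
\implies 0 < \frac{1}{x} < \epsilon
\implies \left| \frac{1}{x} - 0 \right| < \epsilon
```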

Graphical representation:

Note: the dashed line means that, at infinity, the graph converges to that value. It is not right to write [math]\displaystyle{ f(\infty) = L }[/math]. We can calculate [math]\displaystyle{ f(x) }[/math] for any extremely large number, but infinity can never be reached. What the graph shows is that past the point [math]\displaystyle{ \delta }[/math] the function stays within the band [math]\displaystyle{ L - \epsilon \lt f(x) \lt L + \epsilon }[/math]. We can indefinitely choose numbers beyond [math]\displaystyle{ \delta }[/math], and as we do, the distance between the graph and the limit keeps getting smaller and smaller. That's the idea: this process of finding numbers larger than the previous one, while the distance between the function's graph and the limit shrinks more and more, is infinite; it never ends.

Another question that may arise here: what about the sided limits? This question doesn't quite make sense, because it amounts to thinking that there is something to the right of infinity. Think of [math]\displaystyle{ ]\delta, \infty[ }[/math], an open interval where [math]\displaystyle{ \delta }[/math] is a very large number. To the right of it there is an even larger number, and another: there is always a bigger number. Is there some number to the right of infinity? First, infinity is not a number, so there can't be a number to the right of it. Second, we are already regarding [math]\displaystyle{ \delta }[/math] as the largest number we can think of. Infinity is never a point or position on the number line, because if we placed it there, there would always be another point or position to the right of it.

Reference: https://brilliant.org/wiki/epsilon-delta-definition-of-a-limit/

One may ask: is there a limit that mixes up zero with infinity? It can happen that we end up with an indeterminate form such as [math]\displaystyle{ 0 \cdot \infty }[/math] or [math]\displaystyle{ \infty / \infty }[/math]. For such cases we have to compare the problematic limit with another one that we can solve, the same reasoning applied to evaluate improper integrals when we can't know for sure whether the integral converges to some number or diverges to infinity. However, we have to be sure that we didn't make any mistakes with the algebraic operations, so that the limit isn't wrong because of an error in some step.
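An example of comparing a limit that mixes zero with infinity against one we can solve (the function and the bound are our choice, for illustration):

```latex
% Example: \lim_{x \to \infty} x e^{-x}, where x grows but e^{-x} shrinks.
% From the series e^x = 1 + x + x^2/2 + \dots we get e^x > x^2/2 for x > 0, so:
0 < x e^{-x} < \frac{x}{x^2 / 2} = \frac{2}{x}
% Since 2/x \to 0, the squeeze theorem gives \lim_{x \to \infty} x e^{-x} = 0.
```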