QUICK FACTS
Created Jan 0001
Status Verified Sarcastic
Type Existential Dread

Root Test



Criterion for the convergence of an infinite series

In the vast, often tedious landscape of mathematics, where the infinite insists on being tamed by finite minds, determining the behavior of an infinite series is a recurring, if somewhat uninspired, task. Does it settle down, eventually converging to a finite value, or does it spiral into an endless, unruly divergence? One of the more direct, though not universally applicable, tools for this assessment is the root test. It provides a criterion for the convergence of such a series, acting as a convergence test by scrutinizing the n-th root of the absolute value of its terms.

The efficacy of this test hinges on the behavior of a particular quantity:

{\displaystyle \limsup _{n\rightarrow \infty }{\sqrt[{n}]{|a_{n}|}},}

where $a_n$ represents the individual terms of the series. The verdict is rather straightforward: if this calculated value is less than one, the series graciously converges absolutely. If, however, it stubbornly insists on being greater than one, the series diverges, throwing all pretense of finite summation to the wind. For those unfortunate souls dealing with power series, this test proves to be particularly, almost annoyingly, useful, often clarifying their domain of convergence with a singular, decisive stroke.

Root test explanation

The genesis of the root test can be traced back to the rather prolific Augustin-Louis Cauchy, a figure whose contributions to calculus are as undeniable as they are extensive. He formally introduced this test, among countless other fundamental concepts, in his seminal textbook, Cours d’analyse, published in 1821. This era marked a crucial period in the rigorous formalization of mathematical analysis, moving away from the more intuitive, sometimes shaky, foundations laid by earlier pioneers. Cauchy’s insistence on precise definitions and rigorous proofs, while undoubtedly a boon for modern mathematics, probably added years to the collective suffering of students then and now. Consequently, this criterion is occasionally referred to, with a slight historical flourish, as the Cauchy root test or Cauchy’s radical test.

For any given infinite series structured as:

{\displaystyle \sum _{n=1}^{\infty }a_{n}}

the root test requires the computation of a specific value, denoted here as C. This value is defined as:

{\displaystyle C=\limsup _{n\rightarrow \infty }{\sqrt[{n}]{|a_{n}|}},}

where “lim sup” refers to the limit superior. For those unfamiliar with this concept, the limit superior of a sequence is essentially the largest limit point of that sequence. Think of it as the ultimate upper boundary that the sequence’s terms repeatedly approach, or even exceed infinitely often, as n tends towards infinity. It’s a more robust concept than a simple limit, especially when the sequence oscillates or doesn’t converge in the traditional sense, and it can, rather dramatically, be +∞. It’s a useful distinction for sequences that refuse to be well-behaved. It’s worth noting, for clarity’s sake, that if the simpler, more common limit:

{\displaystyle \lim _{n\rightarrow \infty }{\sqrt[{n}]{|a_{n}|}},}

actually exists and converges, then it will inherently be equal to C. In such agreeable circumstances, this simpler limit can be used interchangeably within the root test, sparing you the intellectual gymnastics of the limit superior.
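For the numerically inclined, the distinction can be poked at with a few lines of Python — a rough sketch, with a hypothetical `limsup_estimate` helper that merely samples a long tail of the sequence (no finite computation can truly certify a limit superior):

```python
def limsup_estimate(seq, tail=1000):
    """Crudely estimate lim sup by taking the max over a long tail.

    Only a sketch: the true lim sup is inf_N sup_{n >= N} x_n, which no
    finite computation can certify.
    """
    return max(seq(n) for n in range(tail, 2 * tail))

# An oscillating sequence with no ordinary limit: 1/2 + (-1)^n / 4.
# Its limit points are 1/4 and 3/4, so the lim sup is 3/4.
osc = lambda n: 0.5 + (-1) ** n / 4
print(limsup_estimate(osc))  # 0.75

# A convergent sequence: n^(1/n) -> 1, so lim sup = ordinary limit = 1.
conv = lambda n: n ** (1 / n)
print(limsup_estimate(conv))  # just above 1, shrinking as the tail grows
```

For the oscillating sequence the simple limit does not exist, yet the lim sup is perfectly well-defined — exactly the robustness described above.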

The root test then lays down its pronouncements with an almost judicial finality:

  • If C < 1, the series is said to converge absolutely. This is the strongest form of convergence, implying that even if you were to take the absolute value of every term in the series, it would still sum to a finite number. It’s the mathematical equivalent of having your life entirely together, even under scrutiny.
  • If C > 1, the series diverges. It simply grows without bound, refusing to settle. A clear and unambiguous outcome, for once.
  • If C = 1 and $\sqrt[n]{|a_n|}$ approaches 1 strictly from above (meaning the terms remain greater than 1 for all sufficiently large n, even though their limit is 1), then the series diverges, since its terms cannot tend to zero. This is a subtle but crucial distinction.
  • Otherwise, if C = 1 under any other circumstances, the test is, regrettably, inconclusive. This is where the test throws its hands up in exasperation, offering no definitive answer. The series could still diverge, converge absolutely, or even converge conditionally (meaning it converges, but only if its terms retain their original signs; taking the absolute value would cause it to diverge). It’s a common point of frustration, forcing one to resort to other, equally demanding convergence tests.

Consider, for instance, the series $\textstyle \sum 1/{n^{2}}$. Here, C = 1, yet the series famously converges (it’s a p-series with p=2 > 1). Conversely, the harmonic series, $\textstyle \sum 1/n$, also yields C = 1, but it equally famously diverges. These examples underscore the test’s inherent ambiguity when C equals one, highlighting the need for a more nuanced approach in such cases.
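Both failures can be watched in slow motion. The following Python sketch (the helper name `root_test_value` is our own) evaluates $\sqrt[n]{|a_n|}$ for both series; the values creep toward 1 from below in each case, which is precisely why the test shrugs:

```python
def root_test_value(a, n):
    """The quantity the root test inspects: the n-th root of |a(n)|."""
    return abs(a(n)) ** (1.0 / n)

p_series = lambda n: 1.0 / n**2   # famously converges (p-series, p = 2)
harmonic = lambda n: 1.0 / n      # famously diverges

for n in (10, 1000, 1_000_000):
    print(n, root_test_value(p_series, n), root_test_value(harmonic, n))
# Both columns crawl up toward 1 from below; C = 1 in both cases, so the
# root test alone cannot tell the convergent series from the divergent one.
```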

Application to power series

The true utility of the root test often shines brightest when applied to the analysis of power series. These are formidable constructs in mathematical analysis, representing functions as infinite sums of terms involving powers of a variable. They are indispensable for approximating functions, solving differential equations, and defining many elementary and special functions. A general power series takes the form:

{\displaystyle f(z)=\sum _{n=0}^{\infty }c_{n}(z-p)^{n}}

In this expression, $c_n$ represents the coefficients of the series, $p$ is the center of the series (a point in the complex plane around which the series is expanded), and $z$ is the complex variable itself. For those with a preference for the real number line, these can, of course, be real numbers, but the full power of the concept often resides in the complex plane.

When we apply the root test to such a series, the terms $a_n$ are defined as $c_n (z - p)^n$. The test then proceeds as described above, evaluating the n-th root of $|a_n|$. The critical insight here is that the convergence of a power series is intimately tied to its radius of convergence. This radius, typically denoted by R, defines the largest interval (for real series) or disc (for complex series) centered at $p$ within which the series is guaranteed to converge for all points $z$ strictly in its interior. The behavior of the series precisely on the boundary of this interval or disc (e.g., at the endpoints of an interval) usually requires separate, often more intricate, investigation, a detail that the root test doesn’t deign to resolve.

A particularly elegant and practical corollary that emerges directly from the application of the root test to power series is the celebrated Cauchy–Hadamard theorem. This theorem provides a precise formula for the radius of convergence, R, stating that it is exactly:

{\displaystyle 1/\limsup _{n\rightarrow \infty }{\sqrt[{n}]{|c_{n}|}}.}

One must, naturally, exercise a modicum of caution here: if the denominator, $\limsup _{n\rightarrow \infty }{\sqrt[{n}]{|c_{n}|}}$, happens to be zero, then the radius of convergence R is considered to be infinity, implying that the series converges for all complex numbers $z$. Conversely, if this limit superior is infinite, R is zero, meaning the series converges only at its center $p$, a rather trivial and uninteresting case.
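As a numerical sanity check, one can approximate the Cauchy–Hadamard formula by sampling $|c_n|^{1/n}$ at a single large $n$ — a crude sketch (the function and its sampling strategy are our own illustration, valid only when the ordinary limit exists, in which case it equals the lim sup):

```python
import math

def radius_of_convergence(c, n=500):
    """Approximate R = 1 / limsup |c_n|^(1/n) by sampling one large n.

    A numerical sketch only: it assumes the ordinary limit exists, in
    which case it coincides with the lim sup.
    """
    root = abs(c(n)) ** (1.0 / n)
    return math.inf if root == 0.0 else 1.0 / root

# c_n = 2^n: |c_n|^(1/n) = 2, so R = 1/2 -- the geometric series in disguise.
print(radius_of_convergence(lambda n: 2.0**n))

# c_n = 3^(-n): |c_n|^(1/n) = 1/3, so R = 3.
print(radius_of_convergence(lambda n: 3.0**-n))
```

Sanity check: $\sum 2^n z^n$ is a geometric series in $2z$, which indeed converges exactly when $|z| < 1/2$.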

Proof

The proof of the root test’s assertions regarding the convergence or divergence of a series $\sum a_n$ is fundamentally an elegant application of the direct comparison test, a foundational tool in the analysis of infinite sums.

Let’s first address the case of convergence. If, for all $n$ greater than or equal to some fixed natural number $N$, we find that:

{\displaystyle {\sqrt[{n}]{|a_{n}|}}\leq k<1}

for some constant $k$ which is strictly less than 1, then a simple rearrangement reveals that:

{\displaystyle |a_{n}|\leq k^{n}<1}

What we have here is a comparison. The terms $|a_n|$ are bounded above by the terms of a geometric series, $\sum_{n=N}^{\infty} k^n$. Given that $k < 1$, this geometric series is known to converge absolutely. Since the absolute values of our series’ terms are less than or equal to the terms of a convergent series, the direct comparison test unequivocally states that $\sum_{n=N}^{\infty} |a_n|$ also converges. Consequently, the original series $\sum a_n$ converges absolutely, which, as we know, implies its plain convergence as well. The initial finite number of terms (from $n=1$ to $N-1$) do not affect the ultimate convergence behavior of an infinite series, merely its sum.
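The comparison at the heart of this argument can be watched numerically. A sketch, using the illustrative series $a_n = n(1/2)^n$ and the constant $k = 0.6$ (both choices are ours): once the n-th roots dip to $k$ or below, the tail of the series is pinned beneath the corresponding geometric tail $k^N/(1-k)$:

```python
def a(n):
    # Illustrative terms: a_n = n * (1/2)^n, so |a_n|^(1/n) -> 1/2 < 1.
    return n * 0.5**n

k = 0.6  # any k with 1/2 < k < 1 will do

# Find an N beyond which the n-th roots stay at or below k.
N = next(
    n for n in range(1, 1000)
    if all(a(m) ** (1.0 / m) <= k for m in range(n, n + 200))
)

# For n >= N we have |a_n| <= k^n, so the tail is dominated by the
# convergent geometric tail  sum_{n >= N} k^n = k^N / (1 - k).
tail = sum(a(n) for n in range(N, 2000))
geometric_bound = k**N / (1 - k)
print(N, tail, geometric_bound, tail <= geometric_bound)
```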

Now, consider the case of divergence. If, for infinitely many values of $n$, we observe that:

{\displaystyle {\sqrt[{n}]{|a_{n}|}}>1}

this implies that $|a_n| > 1^n = 1$ for those infinitely many $n$. If the absolute values of infinitely many terms remain stubbornly greater than 1, then the terms $a_n$ themselves simply cannot approach zero as $n \to \infty$. A necessary, though not sufficient, condition for any series to converge is that its terms must tend to zero (the term test). Since $a_n$ fails this fundamental requirement, the series is, without question, divergent.
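A concrete illustration of this failure mode (the oscillating example is our own): if $\sqrt[n]{|a_n|}$ exceeds 1 along a subsequence, the terms along that subsequence are stuck above 1 and the term test is violated:

```python
def a(n):
    # n-th roots oscillate: 3/2 at even n, 1/2 at odd n, so limsup = 3/2 > 1.
    return (1.5 if n % 2 == 0 else 0.5) ** n

roots = [abs(a(n)) ** (1.0 / n) for n in range(1, 11)]
print(roots)  # alternates 0.5, 1.5, 0.5, 1.5, ...

# Along the even subsequence |a_n| = 1.5^n > 1 and in fact blows up, so the
# terms cannot tend to 0: the series fails the term test and diverges.
print(a(100))
```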

The proof of the Cauchy–Hadamard theorem, the corollary concerning the radius of convergence for a power series $\sum a_n = \sum c_n (z - p)^n$, follows directly from the root test.

From the root test, we know that the series converges if there exists some natural number $N$ such that for all $n \geq N$, the following condition holds:

{\displaystyle {\sqrt[{n}]{|a_{n}|}}={\sqrt[{n}]{|c_{n}(z-p)^{n}|}}<1,}

This expression can be simplified by taking the n-th root of the product:

{\displaystyle {\sqrt[{n}]{|c_{n}|}}\cdot |z-p|<1}

For the series to converge, this inequality must hold for all sufficiently large $n$. This logically implies that the absolute value of the difference between $z$ and $p$ must be bounded:

{\displaystyle |z-p|<1/{\sqrt[{n}]{|c_{n}|}}}

for all sufficiently large $n$. To ensure convergence for all $z$ within a certain radius, we must consider the worst-case scenario for the denominator, meaning the smallest possible value for $1/{\sqrt[{n}]{|c_{n}|}}$, which corresponds to the largest possible value for ${\sqrt[{n}]{|c_{n}|}}$. This leads us to the limit superior:

{\displaystyle |z-p|<1/\limsup _{n\rightarrow \infty }{\sqrt[{n}]{|c_{n}|}}.}

From this, it is evident that the radius of convergence, $R$, must be less than or equal to this quantity:

{\displaystyle R\leq 1/\limsup _{n\rightarrow \infty }{\sqrt[{n}]{|c_{n}|}}.}

The only other scenario where convergence might occur is when:

{\displaystyle {\sqrt[{n}]{|a_{n}|}}={\sqrt[{n}]{|c_{n}(z-p)^{n}|}}=1,}

(as any value greater than 1 guarantees divergence). However, these specific points correspond precisely to the boundary of the interval or disc of convergence. By definition, the radius of convergence is the largest radius for which the series converges in the interior. The behavior on the boundary does not alter the radius itself. Thus, the equality holds:

{\displaystyle R=1/\limsup _{n\rightarrow \infty }{\sqrt[{n}]{|c_{n}|}}.}

This elegantly connects the local behavior of the series terms to the global domain of convergence for a power series.

Examples

One might imagine that these abstract criteria are purely theoretical constructs, but they do, regrettably, have practical applications. Let’s examine a couple of instances where the root test proves its worth.

Example 1: A Series Determinedly Divergent

Consider the series:

{\displaystyle \sum _{i=1}^{\infty }{\frac {2^{i}}{i^{9}}}}

To apply the root test, we need to evaluate the limit superior of the n-th root of the absolute value of the terms. Here, $a_n = \frac{2^n}{n^9}$.

{\displaystyle C=\lim _{n\to \infty }{\sqrt[{n}]{\left|{\frac {2^{n}}{n^{9}}}\right|}}}

Since $2^n$ and $n^9$ are positive for $n \ge 1$, we can drop the absolute value signs:

{\displaystyle C=\lim _{n\to \infty }{\sqrt[{n}]{\frac {2^{n}}{n^{9}}}}}

Now, we can separate the n-th root:

{\displaystyle C=\lim _{n\to \infty }{\frac {\sqrt[{n}]{2^{n}}}{\sqrt[{n}]{n^{9}}}}}

This simplifies to:

{\displaystyle C=\lim _{n\to \infty }{\frac {2}{(n^{1/n})^{9}}}}

We invoke a well-known limit property: $\lim _{n\to \infty }n^{1/n}=1$. This is a fundamental result, often derived using logarithms or L’Hôpital’s rule. Substituting this into our expression:

{\displaystyle C=\lim _{n\to \infty }{\frac {2}{(1)^{9}}}=2}

Since $C = 2$, and $C > 1$, the root test unequivocally declares that this series diverges. It’s not even a close call; the exponential growth of $2^n$ completely overwhelms the polynomial growth of $n^9$.
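The creep toward $C = 2$ can be observed directly — note that the early values sit well below 1, a reminder that the limit, not the opening act, is what the test consults. A quick sketch:

```python
def nth_root_term(n):
    # n-th root of 2^n / n^9, simplified: 2 / n^(9/n).
    return 2.0 / n ** (9.0 / n)

for n in (10, 100, 10_000, 1_000_000):
    print(n, nth_root_term(n))
# The values start below 1 but climb inexorably toward C = 2 > 1:
# n^(9/n) -> 1, so the limit -- the only thing the test cares about -- is 2.
```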

Example 2: The Root Test’s Superiority to the Ratio Test

Now, for a more nuanced case, showcasing where the root test might outmaneuver its close relative, the ratio test. Consider the series:

{\displaystyle \sum _{n=0}^{\infty }{\frac {1}{2^{\lfloor n/2\rfloor }}}=1+1+{\frac {1}{2}}+{\frac {1}{2}}+{\frac {1}{4}}+{\frac {1}{4}}+{\frac {1}{8}}+{\frac {1}{8}}+\ldots }

Here, the terms $a_n$ are defined based on the floor function $\lfloor n/2 \rfloor$. Let’s explicitly list the terms: $a_0 = 1/2^0 = 1$, $a_1 = 1/2^0 = 1$, $a_2 = 1/2^1 = 1/2$, $a_3 = 1/2^1 = 1/2$, $a_4 = 1/2^2 = 1/4$, $a_5 = 1/2^2 = 1/4$, and so on.

Applying the root test requires evaluating:

{\displaystyle r=\limsup _{n\to \infty }{\sqrt[{n}]{|a_{n}|}}}

Let’s look at the terms for even and odd $n$. If $n = 2k$ (even), then $a_{2k} = 1/2^{\lfloor 2k/2 \rfloor} = 1/2^k$. Then ${\sqrt[{2k}]{|a_{2k}|}} = {\sqrt[{2k}]{1/2^k}} = (1/2^k)^{1/(2k)} = (1/2)^{k/(2k)} = (1/2)^{1/2} = 1/\sqrt{2}$.

If $n = 2k+1$ (odd), then $a_{2k+1} = 1/2^{\lfloor (2k+1)/2 \rfloor} = 1/2^k$. Then ${\sqrt[{2k+1}]{|a_{2k+1}|}} = {\sqrt[{2k+1}]{1/2^k}} = (1/2^k)^{1/(2k+1)} = (1/2)^{k/(2k+1)}$. As $k \to \infty$, $k/(2k+1) \to 1/2$. So, $(1/2)^{k/(2k+1)} \to (1/2)^{1/2} = 1/\sqrt{2}$.

Both subsequences converge to $1/\sqrt{2}$. Therefore, the limit superior is:

{\displaystyle r=\limsup _{n\to \infty }{\sqrt[{n}]{|a_{n}|}}={\frac {1}{\sqrt {2}}}}

Since $1/\sqrt{2} \approx 0.707$, which is clearly less than 1 ($r < 1$), the root test confirms that this series converges absolutely.

Now, for a brief moment of schadenfreude, let’s observe the ratio test’s struggle with this particular series. The ratio test relies on the limit of the ratio of consecutive terms, $|a_{n+1}/a_n|$. If $n$ is even, say $n=2k$, then $a_{2k} = 1/2^k$ and $a_{2k+1} = 1/2^k$, so $a_{n+1}/a_n = a_{2k+1}/a_{2k} = (1/2^k) / (1/2^k) = 1$.

If $n$ is odd, say $n=2k+1$, then $a_{2k+1} = 1/2^k$ and $a_{2k+2} = 1/2^{k+1}$, so $a_{n+1}/a_n = a_{2k+2}/a_{2k+1} = (1/2^{k+1}) / (1/2^k) = 1/2$.

Because the ratio $|a_{n+1}/a_n|$ alternates between 1 and 1/2, the limit $\lim _{n\to \infty }|a_{n+1}/a_{n}|$ simply does not exist. Thus, the ratio test is, rather uselessly, inconclusive for this series. This example neatly illustrates a scenario where the root test is demonstrably “stronger” – it provides a definitive answer where the ratio test fails due to oscillatory behavior. It’s almost as if some tests are designed to be more discerning than others.
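Both verdicts are easy to reproduce. The sketch below computes the n-th roots (which settle toward $1/\sqrt{2}$) and the consecutive ratios (which alternate between 1 and 1/2 indefinitely):

```python
import math

def a(n):
    # a_n = 1 / 2^(floor(n / 2))
    return 1.0 / 2 ** (n // 2)

# Root test: the n-th roots settle toward 1/sqrt(2) ~ 0.707 < 1.
roots = [a(n) ** (1.0 / n) for n in (10, 11, 1000, 1001)]
print(roots, 1 / math.sqrt(2))

# Ratio test: consecutive ratios alternate between 1 and 1/2 forever,
# so the limit of |a_{n+1}/a_n| does not exist.
ratios = [a(n + 1) / a(n) for n in range(20)]
print(ratios)
```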

Root tests hierarchy

For those who find the standard root test insufficient for the more obstinate series, mathematicians, in their ceaseless pursuit of granularity, have developed a hierarchy of root tests. This structure is analogous to the ratio tests hierarchy (as detailed in Section 4.1 of the ratio test article, particularly Subsection 4.1.4). It provides increasingly sensitive criteria for determining convergence when the simpler tests, like the standard Cauchy root test, yield the frustrating “inconclusive” result of $C=1$.

For a series $\sum_{n=1}^{\infty} a_n$ composed entirely of positive terms (a common simplification for these advanced tests), we can delve into this hierarchy.

Let $K \geq 1$ be an integer, and let $\ln_{(K)}(x)$ denote the $K$-th iterate of the natural logarithm. This means:

  • $\ln_{(1)}(x) = \ln(x)$
  • For any $2 \leq k \leq K$, $\ln_{(k)}(x) = \ln_{(k-1)}(\ln(x))$. For instance, $\ln_{(2)}(x) = \ln(\ln(x))$, and $\ln_{(3)}(x) = \ln(\ln(\ln(x)))$, and so on. These iterated logarithms grow excruciatingly slowly, allowing for incredibly fine distinctions between series behavior.
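A throwaway Python sketch (the helper name `iterated_log` is our own) makes the glacial growth palpable:

```python
import math

def iterated_log(x, k):
    """k-th iterate of the natural log: ln_(1) = ln, ln_(k) = ln_(k-1) o ln."""
    for _ in range(k):
        x = math.log(x)
    return x

# Iterated logarithms grow excruciatingly slowly: even at n = 10^9,
# the third iterate has barely cleared 1.
n = 1e9
print(iterated_log(n, 1), iterated_log(n, 2), iterated_log(n, 3))
```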

Suppose that the expression ${\sqrt[{-n}]{a_n}}$ (which can be rewritten as $(a_n)^{-1/n}$ or $1/(a_n)^{1/n}$), for sufficiently large $n$, can be represented in the following highly specific and rather intimidating form:

{\displaystyle {\sqrt[{-n}]{a_{n}}}=1+{\frac {1}{n}}+{\frac {1}{n}}\sum _{i=1}^{K-1}{\frac {1}{\prod _{k=1}^{i}\ln _{(k)}(n)}}+{\frac {\rho _{n}}{n\prod _{k=1}^{K}\ln _{(k)}(n)}}.}

(Here, the “empty sum” for $i=1$ to $0$ is taken to be 0, simplifying the expression when $K=1$.) This form essentially attempts to capture the precise asymptotic behavior of $(a_n)^{1/n}$ as it approaches 1 from below (or above), providing a more detailed “expansion” around 1. The term $\rho_n$ is the crucial component here, a sequence whose limit behavior dictates the fate of the series.

The hierarchy then provides these conditions for convergence or divergence, based on the behavior of $\rho_n$:

  • The series converges if $\liminf _{n\to \infty }\rho _{n}>1$. This means that the smallest limit point of $\rho_n$ is strictly greater than 1, implying that $\rho_n$ eventually stays above 1.
  • The series diverges if $\limsup _{n\to \infty }\rho _{n}<1$. This means that the largest limit point of $\rho_n$ is strictly less than 1, implying that $\rho_n$ eventually stays below 1.
  • Otherwise, if $\liminf \rho_n \leq 1 \leq \limsup \rho_n$, the test remains inconclusive. This occurs, for example, if $\rho_n$ oscillates around 1, or converges to exactly 1. Even with this level of detail, mathematics occasionally refuses to give a definitive answer.

Proof

To prove this advanced hierarchy, we begin by observing the relationship between ${\sqrt[{-n}]{a_n}}$ and $\ln a_n$. Given the definition:

{\displaystyle {\sqrt[{-n}]{a_{n}}}=\mathrm {e} ^{-{\frac {1}{n}}\ln a_{n}}}

where $\mathrm{e}$ is Euler’s number, we can equate this with the given expansion:

{\displaystyle \mathrm {e} ^{-{\frac {1}{n}}\ln a_{n}}=1+{\frac {1}{n}}+{\frac {1}{n}}\sum _{i=1}^{K-1}{\frac {1}{\prod _{k=1}^{i}\ln _{(k)}(n)}}+{\frac {\rho _{n}}{n\prod _{k=1}^{K}\ln _{(k)}(n)}}.}

This is where the magic (or rather, the meticulous algebra) happens. We can take the natural logarithm of both sides. For sufficiently large $n$, the right-hand side, being $1 + (\text{small terms})$, can be expanded using the Taylor series for $\ln(1+x) = x - x^2/2 + x^3/3 - \ldots$. Here, $x$ is the entire sum of terms after the ‘1’.

Applying the natural logarithm to both sides, and using the Taylor expansion for $\ln(1+X) \approx X - X^2/2 + \ldots$ where $X = {\frac {1}{n}}+{\frac {1}{n}}\sum _{i=1}^{K-1}{\frac {1}{\prod _{k=1}^{i}\ln _{(k)}(n)}}+{\frac {\rho _{n}}{n\prod _{k=1}^{K}\ln _{(k)}(n)}}$, we get:

{\displaystyle -{\frac {1}{n}}\ln a_{n}=\ln \left(1+{\frac {1}{n}}+{\frac {1}{n}}\sum _{i=1}^{K-1}{\frac {1}{\prod _{k=1}^{i}\ln _{(k)}(n)}}+{\frac {\rho _{n}}{n\prod _{k=1}^{K}\ln _{(k)}(n)}}\right).}

Expanding the right-hand side using the Taylor series for $\ln(1+X)$ and carefully collecting terms, while ignoring higher-order terms that become insignificant for large $n$ (represented by $O(1/n)$), we obtain:

{\displaystyle \ln a_{n}=-1-\sum _{i=1}^{K-1}{\frac {1}{\prod _{k=1}^{i}\ln _{(k)}(n)}}-{\frac {\rho _{n}}{\prod _{k=1}^{K}\ln _{(k)}(n)}}+O\left({\frac {1}{n}}\right).}

This intermediate step is crucial. It relates the logarithm of the series term $a_n$ to the iterated logarithms and the $\rho_n$ sequence. From this, we can exponentiate to find $a_n$ itself. The structure of $a_n$ then dictates its convergence behavior.

The final form of $a_n$ can be expressed as:

{\displaystyle a_{n}={\begin{cases}\mathrm {e} ^{-1+O(1/n)}{\frac {1}{(n\prod _{k=1}^{K-2}\ln _{(k)}n)\ln _{(K-1)}^{\rho _{n}}n}},&K\geq 2,\\\mathrm {e} ^{-1+O(1/n)}{\frac {1}{n^{\rho _{n}}}},&K=1.\end{cases}}}

(The “empty product” for $k=1$ to $K-2$ is set to 1 if $K-2 < 1$.)

This detailed asymptotic form of $a_n$ is now in a state where its convergence can be directly compared to known convergent or divergent series. Specifically, the final result concerning the convergence or divergence based on $\rho_n$ follows directly from the application of the integral test for convergence. The integral test allows us to determine the convergence of a series by comparing it to the convergence of an improper integral of a related function. The series with terms of the form $1/(n \ln n \ln(\ln n) \ldots (\ln_{(K-1)} n) (\ln_{(K)} n)^\alpha)$ are classic examples of series whose convergence depends on the value of $\alpha$ (converging if $\alpha > 1$, diverging if $\alpha \le 1$). The structure derived for $a_n$ mirrors these forms, with $\rho_n$ effectively acting as that critical exponent $\alpha$. Thus, if $\rho_n$ is ultimately greater than 1, the series behaves like a convergent integral; if less than 1, like a divergent one. A rather elegant, if convoluted, conclusion.
