# A little-known series test

I just encountered this one on the AP Calculus EDG on March 26, 2012, courtesy of Doug Kuhlmann, with the original article from Mathematics Magazine, Vol. 57, No. 4, September 1984.  The article is available here.

I thought it was cool enough to repost here. The wording is slightly reworked and LaTeXed from Doug’s AP Calculus EDG posting.

Suppose $f$ is a function such that $f(\frac{1}{n})=a_n$ and $f''(0)$ exists.  Then $\sum_{n=1}^{\infty}a_n$ converges absolutely if $f(0)=0$ and $f'(0)=0$, and diverges otherwise.
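A sketch of why the test works (not from the original posting, just the standard argument): since $f''(0)$ exists, $f$ has a second-order Taylor expansion at $0$, and substituting $x=\frac{1}{n}$ gives

$$a_n = f\!\left(\tfrac{1}{n}\right) = f(0) + \frac{f'(0)}{n} + \frac{f''(0)}{2n^2} + o\!\left(\frac{1}{n^2}\right).$$

If $f(0)\neq 0$, then $a_n \not\to 0$ and the series diverges. If $f(0)=0$ but $f'(0)\neq 0$, then $a_n \sim \frac{f'(0)}{n}$ and the series diverges by limit comparison with the harmonic series. If $f(0)=f'(0)=0$, then $|a_n|\le \frac{C}{n^2}$ for large $n$, so $\sum a_n$ converges absolutely by comparison with $\sum \frac{1}{n^2}$.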

Here are two examples:

1. If $a_n=\frac{1}{n^2}$ then $f(x)=x^2$.  In this case, $f(0)=0$ and $f'(0)=0$, so $\sum_{n=1}^{\infty}\frac{1}{n^2}$ converges.
2. If $a_n=\frac{1}{n}$ then $f(x)=x$.  Since $f'(0)=1$, $\sum_{n=1}^{\infty}\frac{1}{n}$ diverges.

### One response to “A little-known series test”

1. Doug Kuhlmann

Here are two more examples that are easily handled by this test: $\sum_{n=1}^{\infty}\sin(\frac{1}{n})$ and $\sum_{n=1}^{\infty}(1-\cos(\frac{1}{n}))$.

In the first let $f(x)=\sin x$ and in the second let $f(x)=1-\cos x$.  The first series diverges since $f'(0)=\cos 0=1$.  The second converges absolutely since $f(0)=0$ and $f'(0)=\sin 0=0$.
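A quick numerical sanity check of these two examples (my own sketch in Python, not from the posting; the cutoffs 1000 and 2000 are arbitrary):

```python
import math

def partial_sum(term, N):
    """Partial sum of term(n) for n = 1..N."""
    return sum(term(n) for n in range(1, N + 1))

def sin_series(n):
    return math.sin(1 / n)        # behaves like 1/n near n = infinity: diverges

def cos_series(n):
    return 1 - math.cos(1 / n)    # behaves like 1/(2n^2): converges absolutely

# The divergent series keeps growing between the two cutoffs (roughly ln 2) ...
print(partial_sum(sin_series, 2000) - partial_sum(sin_series, 1000))
# ... while the convergent series barely moves over the same range.
print(partial_sum(cos_series, 2000) - partial_sum(cos_series, 1000))
```

The gap between consecutive partial sums shrinks like $\frac{1}{2n^2}$ tail sums for the cosine series, while the sine series tracks the harmonic series.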