
Nathaniel Z. answered 06/11/24
Experienced Calculus, Statistics, and Econometrics Tutor
An arithmetic sequence has the general form a_n = a_1 + (n-1)d, where d is the common difference. For example, suppose you are working with the sequence defined by a_n = a_(n-1) + 2, i.e., to go from one term to the next you add 2. If you apply this rule a finite number of times, say 10 times, the term you reach is just your first term + 20, a perfectly ordinary finite number. But convergence is a question about the infinite sequence: there is no final term, and the terms grow without bound, so this sequence diverges. (In fact, an arithmetic sequence diverges whenever d ≠ 0.)

Be careful to distinguish a sequence from a series. A sequence is just a progression of numbers like 1, 3, 5, 7, ..., whereas a series is the sum of those terms, 1 + 3 + 5 + 7 + ...
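If it helps to see this numerically, here is a small Python sketch (the function name `arithmetic_term` is just an illustrative choice) that computes terms with the formula a_n = a_1 + (n-1)d and shows the terms, and the partial sums of the corresponding series, growing without bound:

```python
def arithmetic_term(a1, d, n):
    # n-th term of an arithmetic sequence: a_n = a_1 + (n - 1) * d
    return a1 + (n - 1) * d

# First few terms of the example sequence 1, 3, 5, 7, ... (a_1 = 1, d = 2)
terms = [arithmetic_term(1, 2, n) for n in range(1, 6)]
print(terms)  # [1, 3, 5, 7, 9]

# Partial sums of the corresponding series 1 + 3 + 5 + ...
# These also grow without bound, so the series diverges too.
partial_sums = [sum(terms[:k]) for k in range(1, 6)]
print(partial_sums)  # [1, 4, 9, 16, 25]
```

Notice the contrast with, say, a geometric sequence with ratio between -1 and 1, whose terms do approach a limit; here each term is 2 more than the last, so no limit exists.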