A runner starts a race fast, but slows down along the way. Her starting pace was 6 minutes per mile. After every 2 miles, she slows down by 1 minute per mile. The sequence should show how many minutes it will take to run 12 miles.

OK, let's see if we can solve this, PJ:

First two miles take 12 minutes total. 2 miles * 6 min. per mile = 12 minutes (miles 1 and 2)

After the first two miles she slows down to 7 min. per mile, so the second 2 miles takes

2 miles * 7 min. per mile = 14 minutes. (miles 3 and 4)

After the second two miles are finished, she slows down to 8 min. per mile.

2 miles * 8 min. per mile = 16 min. (miles 5 and 6)

Similarly, by the time she has completed 6 miles, she has slowed down to 9 min. per mile.

2 miles * 9 min. per mile = 18 minutes. (miles 7 and 8)

And by miles 9 and 10 she has slowed to 10 min. per mile.

2 miles * 10 min. per mile = 20 minutes. (miles 9 and 10)

And finally, miles 11 and 12 are run at 11 min. per mile:

2 miles * 11 min. per mile = 22 minutes

Trying to write this as a sequence, we would have something like:

Total time = (2 miles)(6 min/mile) + (2 miles)(7 min/mile) + (2 miles)(8 min/mile) + (2 miles)(9 min/mile) + (2 miles)(10 min/mile) + (2 miles)(11 min/mile)

We could write this more compactly as

Total time = (2 miles) × ∑_{s=6}^{11} s

where I have allowed s to signify her speed (expressed in min/mile).

Total time = (2 miles)*(6 + 7 + 8 + 9 + 10 + 11 min/mile)

= (2 miles)(51 min/mile) =

**102 minutes**
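The step-by-step arithmetic above can be checked with a short sketch. The function name and parameters here are my own (not from the problem statement): it walks the run in 2-mile segments, with the pace starting at 6 min/mile and increasing by 1 min/mile after each segment.

```python
def total_time(miles=12, start_pace=6):
    """Total minutes to cover `miles`, run in 2-mile segments.

    Segment k (k = 0, 1, 2, ...) is run at (start_pace + k) min/mile,
    matching the runner slowing by 1 min/mile after every 2 miles.
    """
    segments = miles // 2  # number of 2-mile segments
    return sum(2 * (start_pace + k) for k in range(segments))

print(total_time())  # 2*(6 + 7 + 8 + 9 + 10 + 11) = 102
```

Running it reproduces the 102-minute total found by summing the six segments by hand.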