A computer takes 2 seconds to compute a particular definite integral accurate to 4 decimal places. Approximately how long does it take the computer to get 12 decimal places of accuracy using each of the LEFT, MID, and SIMP rules?

Round your answers to one decimal place.

(a) LEFT ≈ years

(b) MID ≈ hours

(c) SIMP ≈ minutes

## Comments

You are correct. The relationship is not linear.

I haven't thought through the answer completely, but you have to take into account that some of the methods are more accurate than others for the same number of steps. For example, MID can be more accurate than either LEFT or RIGHT for functions where the 2nd derivative of the function being integrated is relatively small, yet the work involved is similar. SIMP generally requires fewer steps to reach a given level of accuracy, but each step requires roughly twice as much work as LEFT or RIGHT, since it uses both midpoint and endpoint evaluations.

I don't believe the question can be answered exactly without knowing the function being integrated. Perhaps the instructor provided some rules of thumb to generate estimates.
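The usual rules of thumb are that the error behaves like $C/n$ for LEFT, $C/n^2$ for MID, and $C/n^4$ for SIMP, with run time proportional to $n$. Assuming also that "$k$ decimal places" means an error of about $10^{-k}$ (both assumptions, not stated in the problem), gaining 8 decimal places means shrinking the error by $10^8$, and the required growth in $n$ follows. A quick sketch:

```python
def estimated_seconds(p, base_time=2.0, extra_digits=8):
    """Time to gain `extra_digits` decimal places with an order-p rule.

    Assumes error ~ C / n**p and time proportional to n, so the
    error shrinks by 10**extra_digits when n grows by
    (10**extra_digits) ** (1/p).
    """
    growth = (10.0 ** extra_digits) ** (1.0 / p)
    return base_time * growth

# Order of accuracy p for each rule (standard textbook values).
left = estimated_seconds(1)   # n must grow by 1e8
mid = estimated_seconds(2)    # n must grow by 1e4
simp = estimated_seconds(4)   # n must grow by 1e2

print(f"LEFT: {left / (365 * 24 * 3600):.1f} years")
print(f"MID:  {mid / 3600:.1f} hours")
print(f"SIMP: {simp / 60:.1f} minutes")
```

This gives roughly 6.3 years for LEFT, 5.6 hours for MID, and 3.3 minutes for SIMP, which matches the rounding requested in the problem; a different function or interval would change the constant $C$ but not these ratios.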