A computer takes 2 seconds to compute a particular definite integral accurate to 4 decimal places. Approximately how long does it take the computer to get 12 decimal places of accuracy using each of the LEFT, MID, and SIMP rules?
Round your answers to one decimal place.
(a) LEFT ≈ ____ years
(b) MID ≈ ____ hours
(c) SIMP ≈ ____ minutes
The relationship between run time and digits of accuracy is not linear, and that is exactly why the answer blanks are in years, hours, and minutes. The run time is roughly proportional to the number of subdivisions n, but the three rules gain accuracy at very different rates: the error of LEFT shrinks like 1/n, MID like 1/n², and SIMP (Simpson's rule) like 1/n⁴. Going from 4 to 12 decimal places means shrinking the error by a factor of 10⁸, so LEFT needs about 10⁸ times as many subdivisions, MID needs (10⁸)^(1/2) = 10⁴ times as many, and SIMP needs (10⁸)^(1/4) = 10² times as many. Since the original run took 2 seconds, the new run times scale by the same factors:
(a) LEFT ≈ 2 × 10⁸ s ≈ 6.3 years
(b) MID ≈ 2 × 10⁴ s ≈ 5.6 hours
(c) SIMP ≈ 2 × 10² s ≈ 3.3 minutes
The work per subdivision is essentially the same for all three rules, so the enormous difference in run time comes entirely from how quickly each rule's error decreases as n grows.
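As a quick sanity check on the arithmetic, here is a minimal sketch (the error-scaling exponents 1, 2, and 4 for LEFT, MID, and SIMP are the standard ones assumed above, and run time is assumed proportional to the number of subdivisions):

```python
# Time to go from 4 to 12 decimal places, assuming run time is
# proportional to the number of subdivisions n and the error shrinks
# like 1/n^p for each rule.
base_time_s = 2.0        # seconds needed for 4 decimal places
error_factor = 10 ** 8   # error must shrink by 10^8 (4 -> 12 places)

rules = {"LEFT": 1, "MID": 2, "SIMP": 4}   # error ~ 1/n^p

for name, p in rules.items():
    n_factor = error_factor ** (1.0 / p)   # how much n must grow
    t = base_time_s * n_factor             # new run time in seconds
    print(f"{name}: {t:,.0f} s = {t/60:.1f} min "
          f"= {t/3600:.1f} h = {t/(3600*24*365):.1f} yr")
```

Running this prints roughly 6.3 years for LEFT, 5.6 hours for MID, and 3.3 minutes for SIMP, matching the answers above.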