
calc II problem! help please

A computer takes 2 seconds to compute a particular definite integral accurate to 4 decimal places. Approximately how long does it take the computer to get 12 decimal places of accuracy using each of the LEFT, MID, and SIMP rules?
Round your answers to one decimal place.

(a) LEFT ≈ years
(b) MID ≈ hours
(c) SIMP ≈ minutes

 


1 Answer

First, it's not clear that there is a straight linear relationship between the computing time and the number of digits of accuracy, but assuming there is, it would be 2 seconds per 4 decimal places, and thus 6 seconds for 12 decimal places (3 × 4 = 12 decimal places, so 3 × 2 = 6 seconds). I'm not sure what the reference to years, hours, and minutes is about. SIMP likely refers to Simpson's rule, which may take slightly longer to compute; the computational speed for LEFT and MID (midpoint) should be essentially identical, since it's about the same amount of work to code either one.

Comments

You are correct.  The relationship is not linear.

I haven't thought the answer through completely, but you have to take into account that some of the methods are more accurate than others for the same number of steps. For example, MID can be more accurate than either LEFT or RIGHT for functions where the second derivative of the function being integrated is relatively small, yet the work involved is similar. SIMP generally requires fewer steps to reach a given level of accuracy, but each step requires about 4 times as much work as LEFT or RIGHT.

I don't believe the question can be answered accurately without knowing the function being integrated. Perhaps the instructor provided some rules of thumb for generating estimates.
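
For what it's worth, the units in the question (years, hours, and minutes) do line up with the standard textbook convergence rates, which may be the rules of thumb the instructor had in mind. Here is a minimal Python sketch of that estimate; the error orders LEFT ~ 1/n, MID ~ 1/n², SIMP ~ 1/n⁴, and a runtime proportional to n, are assumptions on my part, since the thread never confirms them.

```python
# A sketch assuming the textbook error orders (not stated in the thread):
# LEFT error ~ 1/n, MID error ~ 1/n^2, SIMP error ~ 1/n^4,
# with runtime proportional to the number of subdivisions n.

base_time_s = 2.0          # 2 seconds buys 4 decimal places
shrink = 10.0 ** (12 - 4)  # error must shrink by a factor of 10^8

# If error ~ n**(-p), cutting the error by `shrink` multiplies n
# (and hence the runtime) by shrink**(1/p).
for rule, p, unit, seconds_per_unit in [
    ("LEFT", 1, "years",   365 * 24 * 3600),
    ("MID",  2, "hours",   3600),
    ("SIMP", 4, "minutes", 60),
]:
    time_s = base_time_s * shrink ** (1.0 / p)
    print(f"{rule} ≈ {time_s / seconds_per_unit:.1f} {unit}")
```

Under those assumptions this prints roughly 6.3 years for LEFT, 5.6 hours for MID, and 3.3 minutes for SIMP, which matches the units the question supplies.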
