Paridhi S. asked 09/25/22

In going from Chicago to Atlanta, a car averages 45 miles per hour, and in going from Atlanta to Miami, it averages 55 miles per hour. If Atlanta is halfway between Chicago and Miami, what is the average speed from Chicago to Miami? Discuss an intuitive solution. Write a paragraph defending your intuitive solution. Then solve the problem algebraically. Is your intuitive solution the same as the algebraic one? If not, find the flaw.
Average speed is total distance divided by total time, not the average of the two speeds. Pick any convenient total distance, say 1000 miles: the first half (Chicago to Atlanta, at 45 mph) takes 500/45 hours and the second half (Atlanta to Miami, at 55 mph) takes 500/55 hours.

Average speed = 1000/[(500/45) + (500/55)] = 49.5 mph

The chosen distance cancels out, so the answer comes out slightly below the intuitive guess of 50 mph because the car spends more time traveling at the slower speed.
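As a quick sanity check, here is a minimal Python sketch of the same calculation; the 500-mile half-distance is an arbitrary assumption that cancels out, and the second line shows the equivalent harmonic-mean form.

```python
half = 500                      # miles per leg (arbitrary; cancels out)
t1 = half / 45                  # hours Chicago -> Atlanta at 45 mph
t2 = half / 55                  # hours Atlanta -> Miami at 55 mph
print(round(2 * half / (t1 + t2), 1))   # total distance / total time -> 49.5

# Equivalent closed form: harmonic mean of the two speeds
print(2 * 45 * 55 / (45 + 55))          # 49.5
```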