Raymond B. answered 03/16/22
Math, microeconomics or criminal justice
Assume 40 mph for the first half of a 120-mile trip.
That's 60 miles in 1.5 hours.
If she went 80 mph for the second half, the second 60 miles would take 60/80 = 3/4 of an hour,
for a total of 1.5 + 0.75 = 2.25 hours.
It's not true, though, that the average speed is 60 mph. You can't just add 40 + 80 and divide by 2 to get the average speed.
The average speed is only 53 1/3 mph: the total distance of 120 miles divided by 2.25 hours = 53 1/3 mph.
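Written out as one formula, here is a sketch of that same arithmetic (total distance over total time):

```latex
% Average speed for the 40 mph / 80 mph split: total distance over total time.
\[
t = \frac{60}{40} + \frac{60}{80} = 1.5 + 0.75 = 2.25\ \mathrm{h},
\qquad
\bar{v} = \frac{120}{2.25} = 53\frac{1}{3}\ \mathrm{mph}.
\]
```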
If you went 40 mph for the first half, getting an average of 60 mph would require going 120 mph for the second half,
3 times the original speed.
40 mph for 1.5 hours means 40(1.5) = 60 miles, and 120(0.5) = 60 miles covers the second half in half an hour.
So 40 mph then 120 mph means 120 miles in 2 hours, an average speed of 60 mph.
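Here is a sketch of where the 120 mph comes from, solving for an unknown second-half speed v (my own notation, not from the original problem):

```latex
% Solve for the second-half speed v that makes the 120-mile average equal 60 mph.
\[
\frac{120}{\frac{60}{40} + \frac{60}{v}} = 60
\;\Longrightarrow\;
\frac{60}{v} = 2 - 1.5 = 0.5
\;\Longrightarrow\;
v = 120\ \mathrm{mph}.
\]
```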
Try it with 30 mph for the first 60 miles. That takes 2 hours.
To average 40 mph over the full 120 miles, the trip must take 120/40 = 3 hours in total,
so you need to finish the last 60 miles in one hour.
That means 60 mph.
Driving 30 mph and then 60 mph therefore averages out to 40 mph (not the 45 mph you'd get by averaging 30 and 60 directly).
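As a quick check, using total distance over total time again:

```latex
% Check of the 30 mph / 60 mph example: 2 hours plus 1 hour for 120 miles.
\[
\bar{v} = \frac{120}{\frac{60}{30} + \frac{60}{60}} = \frac{120}{2 + 1} = 40\ \mathrm{mph}.
\]
```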
Change the numbers to fit your specific problem; they seem to be missing from your post.
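If it helps, here is the general pattern in my own notation (v_1 is the first-half speed, v_avg the target average, v_2 the required second-half speed); the half-distance d cancels out, so only the speeds matter:

```latex
% General formula: each half covers the same distance d at speeds v_1 and v_2.
% The d cancels, leaving the required second-half speed in terms of v_1 and v_avg.
\[
v_{\mathrm{avg}} = \frac{2d}{\frac{d}{v_1} + \frac{d}{v_2}}
\;\Longrightarrow\;
v_2 = \frac{1}{\frac{2}{v_{\mathrm{avg}}} - \frac{1}{v_1}},
\qquad
\text{e.g. } v_2 = \frac{1}{\frac{2}{60} - \frac{1}{40}} = 120\ \mathrm{mph}.
\]
```

Note that if the target average is twice the first-half speed or more, the denominator is zero or negative, so no second-half speed is fast enough: the whole time budget is already used up on the first half.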