Raymond B. answered 12/29/21
Math, microeconomics or criminal justice
Two cars 5 miles apart clock a car at 55 mph and then, 4 minutes later, at 50 mph. What is the minimum speed the car must have reached to cover 5 miles in 4 minutes?
4 minutes = 4/60 hours = 2/30 = 1/15 of an hour
55 mph × 1/15 hour = 3 2/3 miles < 5 miles, so the car had to go faster than 55 mph at some point to cover 5 miles in 4 minutes.
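If you want to double-check that arithmetic, here's a quick Python sketch of the same calculation (just the unit conversion and the distance covered at a steady 55 mph, with the numbers hard-coded from this problem):

minutes = 4
hours = minutes / 60          # 4 minutes = 1/15 of an hour
distance_at_55 = 55 * hours   # miles covered holding 55 mph the whole time
print(hours, distance_at_55)  # 0.0666..., 3.666... miles -- short of the 5 miles actually covered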
Covering 5 miles in 1/15 of an hour means an average speed of 5 × 15 = 75 mph. If the average is 75 mph, and the car was also clocked at 55 and 50 mph (both below 75), then at some point it must have been going faster than 75 mph to bring the average up to 75.
It's similar to calculating your GPA: if your average grade is 75 and a couple of your test scores were 50 and 55, then you must also have had scores greater than 75.
5 miles per 1/15 of an hour = 5/(1/15) mph = 5 × 15 = 75 mph.
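Here's the same average-speed calculation in Python, plus a toy per-minute speed profile. The two middle readings are made up just to illustrate the averaging argument (they aren't given in the problem); only the 55 and 50 come from the radar readings:

total_miles = 5
total_hours = 4 / 60
average_mph = total_miles / total_hours
print(average_mph)   # 75.0

# A made-up profile, one reading per minute, that hits that average:
# the clocked 55 and 50 are below 75, so the middle readings must exceed 75.
profile_mph = [55, 95, 100, 50]
print(sum(profile_mph) / len(profile_mph))   # 75.0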
The car must have accelerated from 55 mph to something over 75 mph, then decelerated from over 75 mph back down to 50 mph.
For example, if it went 55 mph for the first minute and 50 mph for the last minute, that's an average of 52.5 mph over those 2 minutes. Since those 2 minutes carry the same weight as the middle 2 minutes, getting an overall average of 75 mph would require averaging 75 + (75 − 52.5) = 75 + 22.5 = 97.5 mph for the middle two minutes.
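Here's a quick check of that worked example in Python: 1 minute at 55 mph, 2 minutes at some unknown speed v, 1 minute at 50 mph, 5 miles total. The split into whole minutes is just my assumption for the illustration, not part of the problem:

# distance equation: 55*(1/60) + v*(2/60) + 50*(1/60) = 5 miles, solved for v
v = (5 * 60 - 55 - 50) / 2
print(v)   # 97.5 mph for the middle two minutes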
But this all looks a little suspicious if they try to give you a speeding ticket. Were their watches or clocks calibrated? Was the radar calibrated? If two officers both clocked speeds that aren't speeding, there's clearly reasonable doubt if they try to charge you with criminal speeding, reckless driving, or endangerment.