Ethan J. answered 07/13/21
Physics graduate with over 5 years of tutoring experience.
Let:
y = the distance the rock has fallen below the top of the cliff
v1 = initial velocity (0 m/s, since the rock is dropped from rest)
t = the time it takes for the rock to drop the distance y
g = gravitational acceleration (9.81 m/s^2)
Equation to use:
y2 = y1 + v1*t + (1/2)*a*t^2, where the acceleration a = g here
Define coordinate system:
Define the down direction as positive. Set the top of the cliff as the zero point (y1 = 0)
y1 = 0, y2 = y, v1 = 0
Let's take a look at how the time t varies as we adjust y.
Plug these values into the equation:
y = 0 + 0 + (1/2)(g)(t^2)
Algebraically rearrange the equation to solve for t
t = sqrt( 2*y/g )
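As a quick numerical cross-check, here is a short Python sketch of that rearranged formula (the helper name fall_time is just an illustrative choice, not something from the problem):

import math

def fall_time(y, g=9.81):
    # Time to fall a distance y starting from rest: t = sqrt(2*y/g)
    return math.sqrt(2 * y / g)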
For part (a), y = 100 m, so plug that directly into the equation for time:
t = sqrt( 2*100 m / 9.81 m/s^2 ) = 4.52 s
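Using the sketch above, fall_time(100) matches this hand calculation:

print(round(fall_time(100), 2))   # 4.52 (seconds)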
For part (b), let's find the time from the 50 m point to the ground by taking the difference of two fall times, each measured from the top of the cliff.
50 m point: since the cliff is 200 m tall, a point 50 m above the ground is 150 m below the top of the cliff. So let y = 150 m.
t = sqrt( 2*150 m / 9.81 m/s^2 ) = 5.53 s
This is the time from the top of the cliff to the 50 m point.
Ground: let y = 200 m
t = sqrt( 2*200 m / 9.81 m/s^2 ) = 6.39 s
This is the time from the top of the cliff to the ground.
Difference in times: 6.39 s - 5.53 s ≈ 0.86 s (0.855 s before rounding)
This is the time from the 50 m point to the ground, which is the answer for part (b).
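The same helper reproduces part (b) by subtracting the two fall times measured from the top of the cliff (again, fall_time is the assumed sketch defined earlier, not part of the original solution):

t_to_50m_point = fall_time(150)   # top of cliff to the 50 m point
t_to_ground = fall_time(200)      # top of cliff to the ground
print(round(t_to_ground - t_to_50m_point, 2))   # 0.86 (seconds)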