
Jake T. answered 04/02/20
Mechanical Engineering PhD Student
Assuming that the baseball is thrown perfectly horizontally.
First, find the time it takes for the ball to hit the ground. Because the ball is thrown horizontally, the initial velocity in the vertical direction is zero, i.e. v_y,0 = 0 m/s. Using the acceleration due to gravity, a = g = -9.81 m/s², we can apply the kinematic equation d_y = v_y,0*t + a*t²/2. With d_y = -2.5 m (taking the origin at the pitcher's hand, so the ball lands 2.5 m below that point) and a = -9.81 m/s², solve for t:
- -2.5 = 0*t + (-9.81)*t²/2 = -4.905*t²
- t = 0.714 s
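As a quick numerical check, here is a minimal Python sketch of this step (the variable names and rounding are my own; the values d_y = -2.5 m and g = -9.81 m/s² come from the problem):

```python
import math

g = -9.81    # acceleration due to gravity, m/s^2
dy = -2.5    # vertical displacement from the release point, m

# dy = v_y0*t + g*t^2/2 with v_y0 = 0  ->  t = sqrt(2*dy/g)
t = math.sqrt(2 * dy / g)
print(f"t = {t:.3f} s")   # ~0.714 s
```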
Now, we know that the velocity in the x direction is 21 m/s, i.e. v_x = 21 m/s. Assuming this velocity stays constant (no air resistance), the distance traveled in the x direction is the velocity multiplied by the time:
- d_x = v_x*t = 21*0.714 ≈ 15 m
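The same check for the horizontal distance, again as a sketch, using the time found above and assuming v_x stays at 21 m/s:

```python
t = 0.714    # time of flight from the previous step, s
vx = 21.0    # horizontal velocity, m/s (assumed constant, no air resistance)
dx = vx * t
print(f"dx = {dx:.1f} m")   # ~15.0 m
```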
Answers:
- t = 0.714 s
- d_x ≈ 15 m