A baseball player leads off the game and hits a long home run. The ball leaves the bat at an angle of 30 degrees from the horizontal with a velocity of 40 m/s. How far will it travel in the air?
The answer is 141 meters. But I don't understand why.
------------
Here is how I worked it out:
Using the following equation:
distance = (initial velocity)(time) + (1/2)(acceleration)(time)^2
Information I gathered from the problem:
The initial horizontal velocity is (40)(cos 30°) and the initial vertical velocity is (40)(sin 30°).
Horizontal acceleration is 0; vertical acceleration is 9.8 m/s^2 from gravity, acting downward, so it enters the vertical equation with a minus sign.
Then the next thing I have to do is find the time of flight and use it to get the distance. The problem is that I don't know how to solve for the time. I'm sure you math whizzes know how to do this.
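In case a concrete setup helps, here is a minimal Python sketch of the standard approach, assuming the ball lands at the same height it was hit from: set the vertical displacement to zero, solve that equation for the nonzero time, then multiply that time by the horizontal velocity. The variable names (v0x, t_flight, etc.) are just labels chosen for this sketch.

```python
import math

# Launch conditions from the problem
v0 = 40.0                  # launch speed, m/s
theta = math.radians(30)   # launch angle
g = 9.8                    # gravitational acceleration, m/s^2

# Split the launch velocity into components
v0x = v0 * math.cos(theta)   # horizontal: 40*cos(30°) ≈ 34.64 m/s
v0y = v0 * math.sin(theta)   # vertical:   40*sin(30°) = 20 m/s

# Vertical motion: y(t) = v0y*t - 0.5*g*t^2.
# The ball lands when y = 0 again, so factor out t:
#   t * (v0y - 0.5*g*t) = 0  ->  t = 0 (launch) or t = 2*v0y/g (landing)
t_flight = 2 * v0y / g       # ≈ 4.08 s

# Horizontal motion has no acceleration, so distance = v0x * t
distance = v0x * t_flight
print(f"time of flight: {t_flight:.2f} s")
print(f"range: {distance:.1f} m")   # ≈ 141 m
```

Running this gives a time of flight of about 4.08 s and a range of about 141 m, which matches the stated answer.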