A ball is kicked off a cliff at a height of 20 m above the ground and at an angle of 30 degrees from the horizontal. It follows projectile motion and lands after a time t. Its velocity at the maximum height it reaches is 20 m/s. How long does it take to land?

This problem should be split into two parts: the time it takes the ball to reach its maximum height, and the time it takes to fall to the ground from that maximum height.

Part 1) At its maximum height a projectile's vertical velocity is zero, so the 20 m/s given at the peak is entirely horizontal. Applying trigonometry to the launch velocity gives tan(30) = Vy/20, which can be rearranged to find Vy, the initial velocity in the vertical direction. We can then use v = u + at, setting v to 0 and a to -g (taking upwards as positive), to find t1, the time the ball takes to reach the maximum point.

Part 2) Next we find the time the ball takes to drop from the maximum point. We can work out how much higher the maximum point is than the height the ball was kicked from using s = ut + (1/2)at^2 over the time t1; the total height of the maximum point above the ground is this distance s plus 20 m. We can then solve s + 20 = (1/2)gt^2 for t to find t2, the fall time.

Finally, we simply add the two times together to get the time for the whole journey: t = t1 + t2.
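As a check on the method, here is a minimal numerical sketch of the same two-part calculation in Python. It assumes g = 9.81 m/s^2 (the original answer leaves g symbolic), and the variable names (vx, vy, t1, t2, rise) are illustrative choices, not part of the original working.

import math

# Assumptions: g taken as 9.81 m/s^2; values from the question.
g = 9.81          # gravitational acceleration, m/s^2
h0 = 20.0         # launch height above the ground, m
angle = math.radians(30)
v_at_peak = 20.0  # speed at maximum height, m/s

# Part 1: initial vertical velocity and time to reach the peak.
vx = v_at_peak                       # at the peak the velocity is purely horizontal
vy = vx * math.tan(angle)            # tan(30) = vy / vx
t1 = vy / g                          # from v = u + at with v = 0 at the peak

# Part 2: rise above the launch point, then the fall to the ground.
rise = vy * t1 - 0.5 * g * t1**2     # s = ut + (1/2)at^2 on the way up
t2 = math.sqrt(2 * (rise + h0) / g)  # drop from rest through rise + 20 m

print(f"t1 = {t1:.2f} s, t2 = {t2:.2f} s, total = {t1 + t2:.2f} s")
# With these assumptions: t1 ≈ 1.18 s, t2 ≈ 2.34 s, total ≈ 3.51 s.

The split into t1 and t2 mirrors the two parts of the written answer: the ball decelerates to zero vertical velocity, then falls from rest through the peak height plus the 20 m cliff.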

Answered by Patryk k. Physics tutor
