A projectile launched at 30 degrees will travel the same horizontal distance as a projectile launched at 60 degrees. Why is this the case?
CodePudding user response:
If at launch the body is moving at speed $L$, then it has velocity $v_{y0} = L \sin\theta$ in the $y$ direction and $v_{x0} = L \cos\theta$ in the $x$ direction.

Ignoring air resistance, $v_x$ doesn't change, so the horizontal position is $x(t) = v_{x0} t$.
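As a minimal sketch of that decomposition (in Python; the speed `L = 20` m/s and the 30-degree angle are arbitrary example values, not from the question):

```python
import math

L = 20.0                    # example launch speed, m/s (arbitrary)
theta = math.radians(30)    # example launch angle

v_x0 = L * math.cos(theta)  # horizontal component, constant in flight
v_y0 = L * math.sin(theta)  # initial vertical component

def x(t):
    """Horizontal position at time t, ignoring air resistance."""
    return v_x0 * t
```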
But $v_y$ does change with time. Newton says it decelerates uniformly due to gravity:
$$v_y(t) = v_{y0} - g t$$
So the vertical position is $y(t) = \int (v_{y0} - g t)\,dt = v_{y0} t - \frac{g t^2}{2} + C$. If we say the launch was at vertical position 0, then $C$ is also 0:
$$y(t) = v_{y0} t - \frac{g t^2}{2}$$
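In code, with $g \approx 9.81\ \mathrm{m/s^2}$ (same arbitrary example values as before, repeated so the snippet runs on its own):

```python
import math

g = 9.81                                  # gravitational acceleration, m/s^2
v_y0 = 20.0 * math.sin(math.radians(30))  # example initial vertical speed

def y(t):
    """Vertical position at time t; launch height taken as 0."""
    return v_y0 * t - g * t**2 / 2
```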
We want to know when that position is (again) 0, so solve
$$0 = v_{y0} t - \frac{g t^2}{2} = t \left( v_{y0} - \frac{g t}{2} \right)$$
This has the obvious solution $t = 0$ for the launch point. There's also the more interesting one:
$$0 = v_{y0} - \frac{g t}{2} \;\implies\; v_{y0} = \frac{g t}{2} \;\implies\; t = \frac{2 v_{y0}}{g}$$
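A quick numerical check of that flight time (self-contained sketch; example values only):

```python
import math

g = 9.81
v_y0 = 20.0 * math.sin(math.radians(30))  # example initial vertical speed

t_flight = 2 * v_y0 / g                   # time to return to launch height
y_final = v_y0 * t_flight - g * t_flight**2 / 2
print(t_flight, y_final)                  # y_final is 0, up to rounding
```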
That's how long the body takes to hit the ground again after launch. Note that it makes sense: the faster the vertical launch, the longer the body takes to return to earth. During that time it has traveled horizontally
$$v_{x0} t = (L \cos\theta) \frac{2 v_{y0}}{g} = \frac{2 (L \cos\theta)(L \sin\theta)}{g} = \frac{2 L^2}{g} \cos\theta \sin\theta$$
Everything in the first factor is constant for the problem, so we only need to look at $\cos\theta \sin\theta$.
For angles of 30 and 60 degrees, that factor has the same value: $\cos 60^\circ = \sin 30^\circ$ and $\sin 60^\circ = \cos 30^\circ$, so the horizontal distance travelled is the same in both cases.
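A quick numerical check of the range formula derived above (`L = 20` m/s is again an arbitrary example):

```python
import math

g = 9.81
L = 20.0   # example launch speed, m/s

def horizontal_range(theta_deg):
    """Range (2 L^2 / g) cos(theta) sin(theta) from the derivation above."""
    theta = math.radians(theta_deg)
    return (2 * L**2 / g) * math.cos(theta) * math.sin(theta)

print(horizontal_range(30))  # ~35.31 m
print(horizontal_range(60))  # ~35.31 m -- same range, as derived
```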
CodePudding user response:
Because gravity slows the body down on the way up and speeds it up on the way down at the same rate: the speed lost while going up is equal to the speed gained while coming down.
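A small sketch of that symmetry (assumed example values): at equal times before and after the peak, the vertical speed has the same magnitude and opposite sign.

```python
g = 9.81
v_y0 = 10.0              # example initial vertical speed, m/s
t_peak = v_y0 / g        # vertical speed is zero at the peak

def v_y(t):
    return v_y0 - g * t

dt = 0.3
print(v_y(t_peak - dt))  #  2.943 (still rising)
print(v_y(t_peak + dt))  # -2.943 (falling at the same speed)
```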