A driver in a car traveling at a speed of 21.8 m/s sees a car 101 m away on the road. How long will it take for the car to uniformly decelerate to a complete stop in exactly 99 m?

1 Answer
Apr 5, 2018

9.1 s

Explanation:

Since we know that velocity is uniformly decelerating, we can take the average velocity:

(V_i + V_f)/2

Letting V_f equal 0 m/s, we get (1/2)(21.8 m/s), or 10.9 m/s.

Now we know V = d/t, and rearranging this gives t = d/V.

Substitute our velocity of 10.9 m/s for V and 99 m for d:

t = (99 m)/(10.9 m/s)

Finally we get t ≈ 9.1 s (with significant-figure rounding).
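The steps above can be checked with a short calculation (a quick sketch; the variable names are mine, the values come from the problem):

```python
# Average-velocity method for uniform deceleration to rest.
v_i = 21.8   # initial speed, m/s
v_f = 0.0    # final speed, m/s (comes to a complete stop)
d = 99.0     # stopping distance, m

v_avg = (v_i + v_f) / 2   # average velocity under uniform deceleration
t = d / v_avg             # time to stop, from V = d/t rearranged

print(round(v_avg, 1))  # 10.9
print(round(t, 1))      # 9.1
```

As a cross-check, the constant deceleration would be a = v_i²/(2d) ≈ 2.40 m/s², and t = v_i/a gives the same ≈ 9.1 s.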

Hope this helps!

Cheers