An object is at rest at (2, 9, 8) and constantly accelerates at a rate of 1/5 m/s^2 as it moves to point B. If point B is at (6, 2, 7), how long will it take for the object to reach point B? Assume that all coordinates are in meters.

1 Answer
Jul 4, 2016

t = (66)^(1/4)sqrt(10) s

Explanation:

I'm assuming the object travels in a straight line. The distance between the two points is given by:

Deltar = sqrt((Deltax)^2+(Deltay)^2+(Deltaz)^2)

= sqrt((6-2)^2+(2-9)^2+(7-8)^2) = sqrt(4^2+(-7)^2+(-1)^2) = sqrt(16+49+1) = sqrt(66) m
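As a quick arithmetic check, here is a minimal Python sketch (not part of the working itself, just a verification of the distance) that computes the same value:

import math

A = (2, 9, 8)   # starting point, metres
B = (6, 2, 7)   # point B, metres

# straight-line distance between A and B
distance = math.sqrt(sum((b - a)**2 for a, b in zip(A, B)))
print(distance)          # 8.124..., i.e. sqrt(66)
print(math.sqrt(66))     # same value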

The object travels with a constant acceleration of 1/5 m/s^2 along this path, so its equation of motion is:

(d^2r)/(dt^2) = 1/5

Integrating gives

(dr)/(dt) = (1/5)t + C_1

This may be reminiscent of v = u + at because, well, it is the same thing: C_1 denotes the initial speed here. Since the object starts from rest, C_1 = 0.

Integrating again gives

r(t) = (1/10)t^2 + C_2

We know that r(0) = 0, which implies C_2 = 0.

therefore r(t) = (1/10)t^2
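If you want to verify the two integrations symbolically, a short SymPy sketch (assuming SymPy is available; again just a check, not part of the answer itself) reproduces the same expressions:

import sympy as sp

t = sp.symbols('t', nonnegative=True)
a = sp.Rational(1, 5)        # constant acceleration, m/s^2

v = sp.integrate(a, t)       # velocity: t/5, since the initial speed is zero
r = sp.integrate(v, t)       # displacement: t**2/10, since r(0) = 0
print(v, r)                  # prints: t/5 t**2/10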

The time at which r = sqrt(66) is then found by:

sqrt(66) = (1/10)t^2 implies t^2 = 10sqrt(66)

Hence t = (66)^(1/4)sqrt(10) s ≈ 9.01 s

(We discard the negative root because time must be positive.)
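For a numerical sanity check, the same answer can be recovered with a few lines of Python (a sketch under the straight-line, constant-acceleration assumptions above):

import math

distance = math.sqrt(66)     # straight-line distance, metres
a = 1 / 5                    # constant acceleration, m/s^2

# solve distance = (1/2) * a * t^2 for the positive root
t_hit = math.sqrt(2 * distance / a)
print(t_hit)                          # 9.013...
print(math.sqrt(10) * 66 ** 0.25)     # matches the closed-form answer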