An object is at rest at (1, 2, 1) and accelerates uniformly at a rate of 1/3 ms^-2 as it moves to point B. If point B is at (4, 4, 5), how long will it take for the object to reach point B? Assume that all coordinates are in meters.

1 Answer
Jan 11, 2018

The distance between the two points is sqrt(29)≈5.4 m, and the time taken to move from one to the other is about 5.7 s.

Explanation:

First step is to find the distance between the two points:

s=sqrt((x_2-x_1)^2+(y_2-y_1)^2+(z_2-z_1)^2)
=sqrt((4-1)^2+(4-2)^2+(5-1)^2)
=sqrt(3^2+2^2+4^2)
=sqrt(9+4+16)
=sqrt(29)≈5.4 m
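
As an informal check (this sketch is mine, not part of the original working), the same distance can be computed in Python with the standard library's math.dist:

```python
import math

# Start point A and end point B, coordinates in meters
A = (1, 2, 1)
B = (4, 4, 5)

# Euclidean distance between the two points
s = math.dist(A, B)
print(round(s, 1))  # 5.4 (exact value is sqrt(29) ~ 5.385 m)
```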

Now we know:

u=0 ms^-1 (since the object was initially at rest)
a=1/3 ms^-2
s=sqrt(29)≈5.4 m (some people use d for distance)
t=? s

We have the equation of motion:

s=ut+1/2at^2

The ut term goes to 0 because u=0, which makes life simpler:

s=1/2at^2

Rearranging to make t the subject:

t=sqrt((2s)/a)=sqrt((2xx5.4)/(1/3))=sqrt(32.4)≈5.7 s
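
Again as a quick sketch of my own, the rearranged formula in Python, using the exact sqrt(29) distance rather than the rounded 5.4 m (which is why the intermediate value differs slightly, though both round to 5.7 s):

```python
import math

s = math.sqrt(29)  # distance in meters (~5.385 m)
a = 1 / 3          # acceleration in m s^-2
u = 0              # initial speed in m s^-1 (object starts at rest)

# With u = 0, s = (1/2)*a*t^2 rearranges to t = sqrt(2*s/a)
t = math.sqrt(2 * s / a)
print(round(t, 1))  # 5.7 s
```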