An object is at rest at (1 ,8 ,4 ) and constantly accelerates at a rate of 5/4 m/s^2 as it moves to point B. If point B is at (1 ,5 ,3 ), how long will it take for the object to reach point B? Assume that all coordinates are in meters.

1 Answer
Sep 15, 2017

2.25 seconds

Explanation:

First thing is to determine the distance to be travelled. We can do this thanks to our old pal Mr. Pythagoras:

s = sqrt(deltaz^2 + deltay^2 + deltax^2)

= sqrt((3-4)^2 + (5-8)^2 + (1-1)^2)

= sqrt(1 + 9 + 0)

= sqrt(10)
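As a quick check, here is a small Python sketch of that distance calculation (the point names `A` and `B` are just illustrative labels):

```python
import math

# Start and end points in meters, from the problem statement
A = (1, 8, 4)
B = (1, 5, 3)

# 3D Pythagorean distance: s = sqrt(dx^2 + dy^2 + dz^2)
s = math.sqrt(sum((b - a) ** 2 for a, b in zip(A, B)))
print(s)  # sqrt(10) ≈ 3.162 m
```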

Now, we are given acceleration: (dv)/dt = 5/4

Integrate once, and we get v(t):

v = (5/4)t + c1

The constant of integration c1 represents the initial velocity. We are told our mass is at rest at time t = 0, so c1 = 0.

Integrate a second time, and we get the distance travelled, x(t) (x as a function of t).

x = (5/4)(1/2)t^2 + c2 = (5/8)t^2 + c2

The constant c2 represents the distance already travelled at time t = 0, i.e. the object's initial position along the path. At t = 0 it is sitting at the start point, so it has travelled none of the sqrt(10) meters yet, and we can set c2 to zero.
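If you want to verify the two integrations symbolically, a short SymPy sketch does it (using SymPy here is my choice, not part of the original answer; both constants of integration are dropped because they are zero):

```python
import sympy as sp

t = sp.symbols("t", nonnegative=True)
a = sp.Rational(5, 4)      # constant acceleration, m/s^2

# First integration: velocity, with v(0) = 0 so c1 = 0
v = sp.integrate(a, t)     # (5/4)t
# Second integration: distance travelled, with x(0) = 0 so c2 = 0
x = sp.integrate(v, t)     # (5/8)t^2
print(v, x)
```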

Now we have everything we need: we know the distance, so we can solve for the time.

sqrt(10) = 5/8 t^2

so, just a little algebra:

(8sqrt(10))/5 = t^2

t = sqrt((8sqrt(10))/5) ≈ 2.25 seconds (rounded)
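The final step is just the kinematic relation s = (1/2)at^2 rearranged for t, which can be sketched numerically:

```python
import math

s = math.sqrt(10)   # distance to travel, m
a = 5 / 4           # constant acceleration, m/s^2

# s = (1/2) a t^2  =>  t = sqrt(2s / a)
t = math.sqrt(2 * s / a)
print(round(t, 2))  # 2.25
```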