When a transistor radio is switched off, the current falls away according to the differential equation (dI)/(dt) = -kI, where k is a constant. If the current drops to 10% of its original value in the first second, how long will it take to drop to 0.1% of its original value?

Answer: 3 s

Explanation:

We have a current, I, at time t in a circuit that flows according to the DE:

(dI)/(dt) = -kI

This is a First Order Separable Ordinary Differential Equation, so we can collect terms and "separate the variables" to get:

int 1/I dI = int -k dt

Both sides are standard integrals, so we can integrate directly to get:

ln |I| = -kt + C

:. |I| = e^(-kt + C)

And, noting that the exponential function is positive over its entire domain, we can drop the modulus and write:

I(t) = e^(-kt)e^(C)

     = Ae^(-kt), say, where A = e^(C)
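As a sanity check, a computer algebra system reaches the same general solution. Here is a minimal sympy sketch (the symbol names are our own choice, not part of the original answer):

```python
import sympy as sp

t = sp.symbols('t')
k = sp.symbols('k', positive=True)
I = sp.Function('I')

# The ODE (dI)/(dt) = -k*I from the question.
ode = sp.Eq(I(t).diff(t), -k * I(t))

# dsolve returns the general solution with an arbitrary constant C1,
# which plays the role of A above.
print(sp.dsolve(ode, I(t)))  # Eq(I(t), C1*exp(-k*t))
```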

So, the initial current, I(0), flowing (at time t=0) is:

I(0) = Ae^(0) = A

We are given that the current drops to 10% of the initial value in the first second, so we can compute:

I(1) = Ae^(-k)

And:

I(1) = 10/100 * I(0) => Ae^(-k) = 1/10 * A

:. e^(-k) = 1/10 => k = ln(10)

Thus we can write the solution as:

I(t) = Ae^(-t ln 10)
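A quick numeric sketch (with an arbitrary initial current A = 1, our own choice) confirms that k = ln 10 reproduces the given 10% drop after one second:

```python
import math

# k was found from the condition I(1) = (1/10) * I(0).
k = math.log(10)

# With an arbitrary initial current A = 1, check the current after 1 s.
A = 1.0
print(A * math.exp(-k * 1.0))  # ~0.1, i.e. 10% of I(0), as given
```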

We want the time, T, such that I(T) is 0.1% of the initial current, I(0), so we seek T satisfying:

I(T) = 0.1/100 * I(0) => Ae^(-T ln 10) = 0.1/100 * A

:. e^(-T ln 10) = 1/1000

:. -T ln 10 = ln(1/1000) = -3 ln 10

:. T ln 10 = 3 ln 10

:. T = 3 s
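This matches the intuition that the current falls by a factor of 10 every second, so reaching 1/1000 = (1/10)^3 of the initial value takes 3 seconds. A short numeric sketch of the same calculation:

```python
import math

k = math.log(10)       # from the 10%-in-one-second condition
target = 0.1 / 100     # 0.1% of the initial current

# Solve e^(-T*k) = target for T by taking logarithms.
T = -math.log(target) / k
print(T)  # ~3.0, i.e. T = 3 seconds
```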