How do you factor #10x^2 + 11x + 3#?

1 Answer
May 17, 2015

#10x^2+11x+3# is of the form #ax^2+bx+c# with #a=10#, #b=11# and #c=3#, so we can calculate the discriminant as follows:

#Delta = b^2-4ac = 11^2 - (4xx10xx3) = 121 - 120 = 1 = 1^2#

...a positive perfect square. So #10x^2+11x+3# has two distinct rational roots, given by the quadratic formula:

#x = (-b+-sqrt(Delta))/(2a) = (-11+-1)/20#

That is, #x = (-11-1)/20 = -12/20 = -3/5# and #x = (-11+1)/20 = -10/20 = -1/2#.

Since #x = -3/5# is a root, #(5x+3)# must be a factor.

Since #x = -1/2# is a root, #(2x+1)# is the other factor.

So #10x^2+11x+3 = (5x+3)(2x+1)#