Michael H. answered 10/29/19
High School Math, Physics, Computer Science & SAT/GRE/AP/PRAXIS Prep
We could use the quadratic formula to get the roots, but I prefer to complete the square:
(1/a)x² + (2/b)x + 3/c = 0
Subtract 3/c from both sides, then multiply both sides by 1/a:
(x/a)² + (2/b)(x/a) = -3/(ac)
Let y = x/a:
y² + (2/b)y = -3/(ac)
Now complete the square by adding (1/b)² — half the coefficient of y, squared — to both sides:
y² + (2/b)y + (1/b)² = (1/b)² - 3/(ac)
The left side is, by construction, a perfect square:
(y + 1/b)² = (1/b)² - 3/(ac)
Take the square root of both sides:
y + 1/b = ±sqrt((1/b)² - 3/(ac))
Solving for y yields:
y = -1/b ± sqrt((1/b)² - 3/(ac))
Recalling that y = x/a, we get
x = a⋅y = -a/b ± a⋅sqrt((1/b)² - 3/(ac))
or
x = -a/b ± sqrt((a/b)² - 3a/c)
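As a sanity check, the final formula can be evaluated numerically. Here is a minimal Python sketch; the sample values a = 1, b = 1, c = -3 are my own choice purely for illustration (they turn the equation into x² + 2x - 1 = 0, whose roots are -1 ± sqrt(2)):

```python
import math

def roots(a, b, c):
    """Real roots of (1/a)x² + (2/b)x + 3/c = 0 using the derived formula:
    x = -a/b ± sqrt((a/b)² - 3a/c). Assumes (a/b)² - 3a/c >= 0."""
    disc = (a / b) ** 2 - 3 * a / c   # quantity under the square root
    r = math.sqrt(disc)
    return -a / b + r, -a / b - r

# Illustrative values: a=1, b=1, c=-3 gives x² + 2x - 1 = 0.
x1, x2 = roots(1, 1, -3)

# Plug each root back into (1/a)x² + (2/b)x + 3/c to confirm it vanishes.
f = lambda x: (1 / 1) * x**2 + (2 / 1) * x + 3 / (-3)
print(x1, x2)        # ≈ 0.41421, -2.41421  (i.e. -1 ± sqrt(2))
print(f(x1), f(x2))  # both ≈ 0
```

Substituting each computed root back into the original left-hand side and seeing it come out (numerically) zero confirms the algebra above.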