If a ∈ Z and the equation (x-a)(x-10)+1=0 has integral roots, then the values of 'a' are
We have the quadratic equation \(x^2 - (a+10)x + (10a+1) = 0\), whose roots are
\[ x = \frac{(a+10) \pm \sqrt{(a+10)^2 - 4(1)(10a+1)}}{2} = \frac{(a+10) \pm \sqrt{(a-10)^2 - 4}}{2}. \]
For the roots to be integral, the discriminant must be a perfect square:
\[ (a-10)^2 - 4 = N^2 \;\Rightarrow\; \big[(a-10)+N\big]\big[(a-10)-N\big] = 2^2; \]
the two factors have the same parity and their product is 4, so both must equal 2 or both must equal −2. Hence, for \(a \in \mathbb{Z}\), either
\[ (a-10)+N = 2,\quad (a-10)-N = 2 \;\Rightarrow\; a = 12, \]
or
\[ (a-10)+N = -2,\quad (a-10)-N = -2 \;\Rightarrow\; a = 8. \]
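As a quick sanity check (an illustrative sketch, not part of the original solution), a brute-force search over a small range of integer values of a confirms that only a = 8 and a = 12 give integral roots; the helper name `integer_roots` is just for this sketch.

```python
import math

def integer_roots(a):
    """Return the integer roots of (x - a)(x - 10) + 1 = 0, i.e. x^2 - (a+10)x + (10a+1) = 0."""
    disc = (a - 10) ** 2 - 4            # discriminant, simplified from (a+10)^2 - 4(10a+1)
    if disc < 0:
        return []
    s = math.isqrt(disc)
    if s * s != disc:                   # discriminant must be a perfect square
        return []
    candidates = {(a + 10 + s) // 2, (a + 10 - s) // 2}
    return sorted(x for x in candidates if (x - a) * (x - 10) + 1 == 0)

for a in range(-50, 51):
    roots = integer_roots(a)
    if roots:
        print(a, roots)                 # prints: 8 [9] and 12 [11]
```

Running this prints only a = 8 (root x = 9) and a = 12 (root x = 11), matching the two cases above.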