Let $\alpha$, $\beta$, and $\gamma$ be the real roots of the cubic equation $x^3 - 7x^2 + cx + 50 = 0$, and let their absolute values be the roots of another cubic $x^3 - bx^2 + c'x - d = 0$. Find the minimum integral value of $b + d$.
Hi,
Firstly, we can deduce that
$$\begin{cases} \alpha + \beta = 7 - \gamma \\ \alpha\beta = -50/\gamma \\ |\alpha| + |\beta| + |\gamma| = b \\ |\alpha\beta\gamma| = d = 50 \end{cases}$$
Since $\alpha\beta\gamma = -50 < 0$, an odd number of the roots are negative; all three cannot be negative because $\alpha + \beta + \gamma = 7 > 0$, so exactly one root is negative. WLOG, assume that $\gamma < 0 < \alpha \le \beta$.
$$\Rightarrow b = \alpha + \beta - \gamma = 7 - 2\gamma$$
Since $d = 50$ is fixed and $b + d = 57 - 2\gamma$, the problem reduces to finding the minimum integer value of $-2\gamma$.
For $\alpha, \beta, \gamma \in \mathbb{R}$, the quadratic equation $X^2 - (\alpha + \beta)X + \alpha\beta = 0$, i.e. $X^2 - (7 - \gamma)X - 50/\gamma = 0$, must have real root(s) for some real value of $\gamma$. Its discriminant must be non-negative:
$$\Leftrightarrow (7 - \gamma)^2 \ge -200/\gamma$$
Multiplying through by $-\gamma > 0$ (note that $\gamma < 0$) and rewriting in terms of $-2\gamma$:
$$(-2\gamma)^3 + 28(-2\gamma)^2 + 196(-2\gamma) - 1600 \ge 0 \quad (1)$$
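As a quick sanity check (my addition, not part of the original solution), a short Python snippet can confirm that inequality $(1)$ agrees with the discriminant condition at sample negative values of $\gamma$; the function names here are just illustrative:

```python
# Check: for gamma < 0, the discriminant condition (7 - gamma)^2 >= -200/gamma
# should agree with inequality (1) written in t = -2*gamma.

def discriminant_ok(gamma):
    return (7 - gamma) ** 2 >= -200 / gamma

def cubic_ok(gamma):
    t = -2 * gamma
    return t**3 + 28 * t**2 + 196 * t - 1600 >= 0

# The two conditions agree at every sampled point.
for gamma in [-0.5, -1.0, -2.5, -3.0, -10.0]:
    assert discriminant_ok(gamma) == cubic_ok(gamma)
```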
We use a little calculus here.
Consider the function $f(t) = t^3 + 28t^2 + 196t - 1600$, the left-hand side of $(1)$ with $t = -2\gamma$. Then
$$\begin{cases} f'(-14) = 0 \\ f'(-14/3) = 0 \\ f(-14)\,f(-14/3) > 0 \end{cases} \Rightarrow \exists!\,\delta \in \mathbb{R} : f(\delta) = 0$$
Moreover, $f(4)f(5) < 0 \Rightarrow 4 < \delta < 5$.
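These claims about $f$ can be verified numerically; the sketch below (my addition) checks the critical points and brackets, then locates $\delta$ by bisection:

```python
def f(t):
    return t**3 + 28 * t**2 + 196 * t - 1600

def fprime(t):
    return 3 * t**2 + 56 * t + 196

# Critical points of f are t = -14 and t = -14/3.
assert fprime(-14) == 0
assert abs(fprime(-14 / 3)) < 1e-9
# Both local extreme values are negative, so f has a unique real root delta.
assert f(-14) * f(-14 / 3) > 0
# Sign change on (4, 5) brackets delta.
assert f(4) < 0 < f(5)

# Locate delta by bisection on [4, 5].
lo, hi = 4.0, 5.0
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) < 0:
        lo = mid
    else:
        hi = mid
delta = lo  # approximately 4.62
```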
Therefore, $(1) \Leftrightarrow -2\gamma \ge \delta$. What we need here is the smallest integer that is greater than or equal to $\delta$.
Hence, $-2\gamma = \lceil \delta \rceil = 5$, so $\gamma = -5/2$ and $b = 7 + 5 = 12$.
In conclusion, the minimum "integral" value of $b + d$ is $12 + 50 = 62$, attained at $(\alpha, \beta, \gamma) = (u, v, -5/2)$ and its permutations, where $u, v$ are the roots of $y^2 - 9.5y + 20 = 0$.
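The final configuration is easy to verify end to end; the snippet below (my addition) recovers $\alpha, \beta$ from the stated quadratic, then checks the Vieta relations and the value of $b + d$:

```python
import math

# Claimed extremal configuration: gamma = -5/2, and alpha, beta
# are the roots of y^2 - 9.5y + 20 = 0.
gamma = -2.5
disc = 9.5**2 - 4 * 20                 # discriminant of y^2 - 9.5y + 20
alpha = (9.5 - math.sqrt(disc)) / 2
beta = (9.5 + math.sqrt(disc)) / 2

# Vieta relations of the original cubic: sum 7, product -50.
assert math.isclose(alpha + beta + gamma, 7)
assert math.isclose(alpha * beta * gamma, -50)

b = abs(alpha) + abs(beta) + abs(gamma)  # = 12
d = abs(alpha * beta * gamma)            # = 50
assert math.isclose(b + d, 62)
```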
The word "integral" in the problem statement is what confuses me.