Zeros at the same time

Algebra Level 1

Are there non-zero real numbers $a$, $b$, and $c$ such that $a+b+c$ and $\frac{1}{a}+\frac{1}{b}+\frac{1}{c}$ are both zero?

No Yes


28 solutions

Jim Chale
Jul 2, 2018

We assume that both are zero. Then we have:

$\frac{1}{a}+\frac{1}{b}+\frac{1}{c}=0 \iff \frac{ab+ac+bc}{abc}=0 \iff ab+ac+bc=0$

We also have: $a+b+c=0 \iff (a+b+c)^2=0 \iff a^2+b^2+c^2+2(ab+ac+bc)=0 \iff a^2+b^2+c^2=0$, which is a contradiction! Hence the two expressions cannot both be zero at the same time.
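For readers who want to check this algebra mechanically, here is an editor's sketch using sympy (an addition, not part of the original solution; any CAS would do):

```python
import sympy as sp

a, b, c = sp.symbols('a b c')

# (a+b+c)^2 = a^2+b^2+c^2 + 2(ab+ac+bc), so if a+b+c = 0 and
# ab+ac+bc = 0 hold together, then a^2+b^2+c^2 = 0 as well.
diff = sp.expand((a + b + c)**2 - (a**2 + b**2 + c**2) - 2*(a*b + a*c + b*c))
print(diff)  # 0

# Eliminating a via a = -b-c turns ab+ac+bc = 0 into -(b^2+bc+c^2) = 0,
# whose roots in b are non-real for any non-zero c:
roots = sp.solve(sp.expand((-b - c)*b + (-b - c)*c + b*c), b)
print(all(r.has(sp.I) for r in roots))  # True
```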

Hello, I was wondering if this is a valid solution: we have $ab+ac+bc=0$ and $a+b+c=0$, so $a=-b-c$. Plugging $a$ in, we get $-b^2-bc-c^2=0$; simplifying and multiplying by $-1$, we get $b^2+bc+c^2=0$. Dividing both sides by $b^2$ gives $\frac{c^2}{b^2}+\frac{c}{b}+1=0$. Letting $x$ be the ratio $\frac{c}{b}$, we get $x^2+x+1=0$, which has no real solutions. Therefore, if $b$ is a real number that is not zero, then $c$ must be non-real, a contradiction.

A Former Brilliant Member - 2 years, 11 months ago


My solution exactly. I'm pretty sure it's valid.

Jacob Swenberg - 2 years, 11 months ago

There are three factors of $2$ missing after you expand $(a+b+c)^2$. Nice symmetric solution.

Tom Verhoeff - 2 years, 11 months ago

Correction: (a+b+c)^2 = a^2 + b^2 + c^2 + 2ab +2bc + 2ca. But your solution still works!

Ans C - 2 years, 11 months ago


Thanks for your observation!

Jim chale - 2 years, 11 months ago

neat solution

Alex Mandelias - 2 years, 11 months ago

I do not see the contradiction

Laura Gao - 2 years, 11 months ago


It is a contradiction because if both $a+b+c$ and $a^2+b^2+c^2$ are equal to $0$, then each of $a^2$, $b^2$, $c^2$ must be $0$ (squares of real numbers are non-negative), so $a=b=c=0$, which the problem rules out.

Karthik Narayanan - 2 years, 11 months ago


For $a+b+c=0$, some term(s) must be negative and some positive.

But all of $a^2$, $b^2$, $c^2$ are positive. Their sum could not possibly be zero for non-zero $a$, $b$, $c$.

Sulayman Hussain - 2 years, 11 months ago


@Sulayman Hussain Ohhhhhhhh i get it now. Thank you for your explanation.

Laura Gao - 2 years, 11 months ago

The square of a nonzero real number is always positive. $a^2+b^2+c^2=0$ contradicts this.

Stewart Gordon - 2 years, 11 months ago

The contradiction is that a sum of squares can be zero only if all of them ($a$, $b$, $c$) are zero, but the question said $a$, $b$, $c$ are non-zero; hence the contradiction.

Vishal Yadav - 2 years, 11 months ago

I don't understand where the $\frac{ab+ac+bc}{abc}$ comes from or how it relates to this.

Nathaniel Graham - 2 years, 11 months ago


We can't add dissimilar fractions without expressing each of them over one common denominator, which is $abc$. We then divide $abc$ by $a$, $b$, and $c$ to get $bc$, $ac$, and $ab$ as the numerators, respectively. Later on it appears from factoring $2$ out of $2ab+2ac+2bc$ (which, together with $a^2+b^2+c^2$, results from squaring $a+b+c$).

Adriel Padernal - 2 years, 11 months ago
X X
Jul 2, 2018

Similar to this:

If $a+b+c=0$, then $c=-(a+b)$, so $\frac1a+\frac1b+\frac1c=\frac1a+\frac1b+\frac{1}{-(a+b)}=0$, i.e. $\frac1a+\frac1b=\frac{1}{a+b}$, which is impossible.

Not complete yet. You get $\frac{a+b}{ab}=\frac{1}{a+b} \iff (a+b)^2=ab \iff a^2+b^2+ab=0 \iff 2(a^2+b^2+ab)=0 \iff a^2+b^2+(a+b)^2=0$, which IS impossible.

Jim chale - 2 years, 11 months ago


I thought maybe someone will go to the similar problem and see that 1/a+1/b=1/(a+b) is impossible. Thanks for posting the comment so people can understand the solution more.

X X - 2 years, 11 months ago

Let $f(x)=\frac{1}{x}$; then $f(a)+f(b)=f(a+b)$. For real numbers $f(x)$ must be a linear function, and $f(x)=\frac{1}{x}$ apparently is not, so it is impossible.

John Lee - 2 years, 11 months ago

Why is it impossible? I do not comprehend.

Laura Gao - 2 years, 11 months ago


Look at the below comment,or check the "similar with this" out.

X X - 2 years, 11 months ago


Ok thank you

Laura Gao - 2 years, 11 months ago
Kelvin Hong
Jul 3, 2018

We have $\frac1a+\frac1b+\frac1c=\frac{ab+bc+ca}{abc}=0$, which says $ab+bc+ca=0$.

By assuming that $a,b,c$ are the real roots of a cubic equation, we have $a+b+c=0$, $ab+bc+ca=0$, and $abc=k$ for some real value of $k$.

These three roots satisfy the equation $x^3-k=0$, which has roots $\sqrt[3]{k}$, $\omega\sqrt[3]{k}$, $\omega^2\sqrt[3]{k}$, where $\omega$ is a primitive third root of unity. If $k$ is non-zero, this contradicts the fact that the roots are all real; but if $k$ is zero, then all of the roots are zero, which is also a contradiction. Hence, the answer is $\boxed{\text{No}}$.
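As a quick numerical illustration of this root structure (an editor's addition, not part of the original solution), numpy confirms that $x^3-k=0$ has exactly one real root whenever $k\neq0$:

```python
import numpy as np

def real_root_count(k, tol=1e-9):
    """Count the numerically-real roots of x^3 - k = 0."""
    roots = np.roots([1, 0, 0, -k])  # coefficients of x^3 + 0*x^2 + 0*x - k
    return int(np.sum(np.abs(roots.imag) < tol))

# One real root (plus a conjugate pair of complex roots) for each k != 0,
# so three real roots a, b, c of x^3 - k = 0 can never exist.
for k in (1.0, -2.5, 7.0):
    print(k, real_root_count(k))
```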

What is omega?

Orlando Moreno - 2 years, 11 months ago


It is the third root of unity, thanks for correction, I will add a note aside it.

Kelvin Hong - 2 years, 11 months ago


But $1$ is a third root of unity. If $\omega=1$, all three of the roots would be $\sqrt[3]{k}$, so there could be a real solution?

Orlando Moreno - 2 years, 10 months ago


@Orlando Moreno No, when we say the third root of unity, we usually exclude the root $1$; you may consider the $n$-th roots of unity here to be the complex roots of $\frac{z^n-1}{z-1}=0$. Hope this helps!

Kelvin Hong - 2 years, 10 months ago
Kimberly Rae
Jul 10, 2018

We have $a+b+c=0$.

Assume $a$ and $b$ are positive and $c$ is negative.

Thus $a+b=-c$ (other assumptions could be made, but they are all symmetrically equivalent).

Thus $\frac1a+\frac1b+\frac1c=0$

gives $\frac1a+\frac1b=-\frac1c$

$\frac{a+b}{ab}=-\frac1c$

$\frac{ab}{a+b}=-c$

Combining: $a+b=\frac{ab}{a+b}$

$(a+b)^2=ab$. Thus $ab$ is positive.

$a^2+2ab+b^2=ab$

$a^2+b^2=-ab$. This is a contradiction.

Where is the fact "Assume a and b are positive and c is negative" used in the proof to lead to a contradiction? In particular, why is it without loss of generality that two of the three numbers can be assumed positive and only one negative?

Vishal Shekhar - 2 years, 11 months ago


Since the numbers sum to zero, either two are positive and one negative or the other way around. If the latter produces a solution then so will -a, -b, -c. So we can assume two (label them a, b) are positive and one (c) is negative. That assumption is used to obtain a+b = ab/(a+b). Also used in my comment to Kimberly.

Anthony Cutler - 2 years, 11 months ago


Thank you.

Vishal Shekhar - 2 years, 10 months ago

Kimberly, you could more simply argue that $-c$ is larger than both $a$ and $b$, and so $-\frac1c$ is smaller than both $\frac1a$ and $\frac1b$, so that $\frac1a+\frac1b+\frac1c>0$.

Anthony Cutler - 2 years, 11 months ago
Omar Balbuena
Jul 9, 2018

First off, let's eliminate one variable: $a=-b-c$.

With that out of the way, we have $\frac{1}{-b-c}+\frac1b+\frac1c=0$, which can be manipulated to $b^2+bc+c^2=0$.

Applying the quadratic formula, solving for $b$, we get $b=\frac{-c\pm\sqrt{c^2-4c^2}}{2}$.

But then either $c$ must be imaginary, or we get an imaginary result for $b$.

Therefore the answer is no, it's not possible.
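The discriminant computation above is trivial to sanity-check; the following is an editor's sketch, not part of the original solution:

```python
def discriminant_in_b(c):
    """Discriminant of b^2 + c*b + c^2 = 0, viewed as a quadratic in b."""
    return c**2 - 4 * c**2  # = -3c^2

# Negative for every non-zero real c, so b is never real.
print(all(discriminant_in_b(c) < 0 for c in (-10, -0.5, 0.25, 3)))  # True
```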

Raymond Chan
Jul 2, 2018

Assume there exist such $a$, $b$, $c$. Then we have $c=-(a+b)$ and $\frac1a+\frac1b+\frac1c=\frac{a+b}{ab}-\frac{1}{a+b}=0$. WLOG, since obviously $a$, $b$, $c$ can't all have the same sign, we can pick $a$ and $b$ such that they share the same sign.

Simplifying, we have $(a+b)^2=ab$, so $|a+b|=\sqrt{ab}$. However, by the AM-GM inequality and the fact that $a$ and $b$ have the same sign, we have $|a+b|=|a|+|b|\ge 2\sqrt{|a||b|}>\sqrt{ab}$. Therefore the assumption leads to a contradiction, so there do not exist such values $a$, $b$, $c$.

Not only can we pick $a$ and $b$ s.t. they have the same sign; they have to have the same sign, since $ab$ is equal to a nonnegative number, as you can see in the next line. Moreover, the simplification is justified even without this remark, as far as I can tell.

Nadav Slotky - 2 years, 11 months ago


That is interesting, and possibly even maddening if you don't keep in mind that the whole proof is under a counterfactual!

He needed $a$ and $b$ with the same sign in order to use AM-GM, but what you've pointed out actually can provide an earlier contradiction. That is, we could choose $a,b$ of opposite sign, but then $ab=(a+b)^2\geq0$ is a contradiction.

Brian Moehring - 2 years, 11 months ago


Unfortunately, this is not a good argument. No one tells you that you can choose the sign of both as you want.

However, my previous argument shows that any two of the three must have the same sign. That is, all of them must have the same sign, which is a contradiction to $a+b+c=0$.

Nadav Slotky - 2 years, 11 months ago


@Nadav Slotky Really? Your argument and mine are dual to one another, so they're equivalent. In particular, the statements

  • From $a+b+c=0$ and nonzero $a,b,c$, we may choose the signs of $a,b$ such that they are opposite of one another.
  • From the proof that $ab>0$, we may conclude any two of $a,b,c$ have the same sign.

both are conclusions of the fact that the problem is symmetric with respect to permutations of $a,b,c$. This is often invoked in proofs by saying "without loss of generality".

That is, you can argue about which proof is more clear, but unless you have some extreme dislike of using symmetry in problems, both the arguments are "good".

Brian Moehring - 2 years, 11 months ago


@Brian Moehring Oh, I see now. You were referring to a sentence in the original proof I hadn't noticed up until now.

Then, indeed, our proofs are very similar. Thank you for the clarification.

Nadav Slotky - 2 years, 11 months ago
Pierre Stöber
Jul 12, 2018

My solution involves logic. We start by observing that one or two of the numbers must be negative. We can focus on the case where only one number is negative, because of symmetry. Let's say that $c$ is negative and $a$ and $b$ positive. We then have (with $d=-c$):

$a+b=d$

$\frac1a+\frac1b=\frac1d$

Because everything is positive, we can derive from the first equation that d is greater than a and b, and from the second equation, we get that d is smaller than a and b. The only possibility is that a=b=d, but that doesn't work out. So it's impossible.

Noel Lo
Jul 12, 2018

Suppose both are zero. Then $c=-(a+b)$ and $\frac1c=-\frac1a-\frac1b=-\frac{a+b}{ab}=\frac{c}{ab}$. This means that $c^2=ab$. Now, considering that $c=-(a+b)$, we have $(a+b)^2=ab$. Since $(a+b)^2$ is positive, $ab$ must also be positive. But on the other hand, upon subtracting $4ab$ from both sides, $(a-b)^2=-3ab$. Since $ab$ is positive, $(a-b)^2$ would then be negative! This is impossible!

Andrew James
Jul 15, 2018

We assume that:

a + b + c = 0 = a^(-1) + b^(-1) + c^(-1)

Adding both equations we receive:

a + a^(-1) + b + b^(-1) + c + c^(-1) = 0

Since any real number squared is always non-negative we can assume:

(a - 1)^2 >= 0

a^2 - 2a + 1 >= 0

a - 2 + a^(-1) >= 0

(We are allowed to divide by a because a ≠ 0; note that dividing preserves the inequality's direction only when a > 0, so this step assumes a > 0.)

a + a^(-1) >= 2

The same is also true for b and c.

Hence a + a^(-1) + b + b^(-1) + c + c^(-1) >= 6 and our assumption must be false.

(I am quite new here and don't know how to include fractions etc., so sorry for that)

Johannes H
Jul 15, 2018

Write $c=nb$.
The equations then say:
$a+b(n+1)=0$, i.e. $a=-b(n+1)$, and
$nb^2+anb+ab=0 \iff a(n+1)+nb=0$, i.e. $a=-\frac{nb}{n+1}$.
Equating the two expressions for $a$ and dividing by $-b$ gives
$n+1=\frac{n}{n+1}$, i.e. $(n+1)^2=n$, i.e. $n^2+n+1=0$,
which has no real solution for $n$.

Matias Apablaza
Jul 14, 2018

Equation 1:

$a+b=-c$


Equation 2:

$\frac{1}{a}+\frac{1}{b}=-\frac{1}{c}$

$-c=\frac{1}{1/a+1/b}$


Equalizing equations:

$a+b=\frac{1}{1/a+1/b}$

$\frac{1}{a+b}=\frac{1}{a}+\frac{1}{b}$

$\frac{1}{a+b}=\frac{a+b}{ab}$

$ab=(a+b)^2$

$a^2+b^2+2ab=ab$

$a^2+ab+b^2=0$

$a=\frac{-b\pm\sqrt{b^2-4b^2}}{2}$

$\sqrt{b^2-4b^2}=\sqrt{-3b^2}$

There is no real nonzero $b$ with $-3b^2\geq0$, so

$a\notin\mathbb{R}$

Contradiction.

Diane Tu
Jul 11, 2018

1) $a+b=-c$; 2) $\frac1a+\frac1b=-\frac1c$, which gives $ab=c^2$. Square 1) on both sides: $(a+b)^2=a^2+2ab+b^2=c^2=ab$, so $a^2+ab+b^2=0$, i.e. $\left(a+\frac12b\right)^2+\frac34b^2=0$.

So $b=0$, hence $a=0$, hence $c=0$. Thus, contradiction.

Edith Dubiner
Jul 11, 2018

Without loss of generality, we can assume that $a$ and $b$ are positive and $c$ is negative. Then $a+b=-c=|c|$.

In particular, $0<a,b<|c|$. But then $\frac1a,\frac1b>\frac{1}{|c|}$.

Therefore $\frac1a+\frac1b+\frac1c=\frac1a+\frac1b-\frac{1}{|c|}>\frac1a+0>0$, and hence $\frac1a+\frac1b+\frac1c\ne0$.
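This sign argument lends itself to a quick numerical spot-check (an editor's addition): for positive $a,b$ and $c=-(a+b)$, the reciprocal sum stays strictly positive.

```python
import random

random.seed(0)  # reproducible spot-check

def reciprocal_sum(a, b):
    c = -(a + b)            # enforce a + b + c = 0
    return 1/a + 1/b + 1/c  # equals 1/a + 1/b - 1/|c| when a, b > 0

# Since 0 < a, b < |c|, each of 1/a and 1/b exceeds 1/|c|,
# so the sum is always positive:
trials = [(random.uniform(0.1, 100), random.uniform(0.1, 100)) for _ in range(1000)]
print(all(reciprocal_sum(a, b) > 0 for a, b in trials))  # True
```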

Mandira Basu
Jul 11, 2018

If $a+b+c=0$, then

$a+b+c>\frac{1}{\frac1a+\frac1b+\frac1c}$ [AM-HM inequality]

Hence $\frac{1}{\frac1a+\frac1b+\frac1c}$ is not equal to $0$.

Mark Lunt
Jul 11, 2018

If there is a real solution, $\frac1a+\frac1b=\frac{1}{a+b}$. Multiplying through by $(a+b)$: $1+\frac ba+\frac ab+1=1$. Simplifying and multiplying through by $a$: $\frac{a^2}{b}+a+b=0$. Solving for $a$: $a=\frac{-1\pm\sqrt{1-4}}{2/b}$.

Therefore $a$ has no real solution, since it involves the square root of $-3$.

Utkarsh Duvey
Jul 10, 2018

Consider the polynomial $x^3+px^2+qx+r=0$ and let $a,b,c$ be its roots. Then the conditions say that $p=0$ and $q=0$, so the polynomial becomes $x^3=-r$. Since $f(x)=x^3$ is an increasing function, this equation has only one real root, so $a,b,c$ cannot be three non-zero real numbers.

$a=-(b+c)$ and $\frac1b+\frac1c=\frac{1}{b+c}$, therefore:

$\frac{b+c}{bc}=\frac{1}{b+c}$, which is $(b+c)^2=bc$.

$(b+c)^2\geq0$, therefore $bc\geq0$.

If $(b+c)^2=bc=0$, then $b=0$ or $c=0$, which is false; so in other words $(b+c)^2=bc>0$.

Manipulating the expression $(b+c)^2=bc$ you get $b^2+c(b+c)=0$; $b^2$ is always positive, so $c(b+c)$ has to be negative.

That leads to:

$c$ and $b+c$ having opposite signs. But $bc>0$ means $b$ and $c$ share a sign; taking both positive (negating all three numbers if necessary), we get $c>0$ and $b+c>0$, so $c(b+c)>0$, a contradiction.

So there is no solution for the problem!

Simon Barley
Jul 10, 2018

Not the simplest solution, but I like it:

Let $a,b,c$ be solutions to a cubic equation. Since $a+b+c=0$, the coefficient of the $x^2$ term in the cubic is zero.

The second property means $ab+ac+bc=0$, so the coefficient of the $x$ term in the cubic is zero.

Hence, $a,b,c$ are solutions to a cubic of the form $mx^3+n=0$, where $m$ and $n$ are constants.

Since this cubic only has one real solution (obvious if you consider the graph), $a,b,c$ must all be the same, so the only way $a+b+c=0$ is if they are all zero.

Ervyn Manuyag
Jul 10, 2018

Because $\frac{1}{a}$ cannot equal $0$.

We can see that a number with the opposite sign is necessary in the equation to cancel the values of the other two numbers. For example, in $a+b+c=0$ we can have:

$1+2-3=0$

$-1+(-2)+3=0$

Now we know that not only do we need a number with an opposite sign, but also that this number must be the greatest in magnitude in the case of $a+b+c=0$.

The equation $\frac{1}{a}+\frac{1}{b}+\frac{1}{c}=0$, taken on its own, can be satisfied by several values:

$\frac13+\frac16+\frac{1}{-2}=0$

$\frac{1}{-12}+\frac{1}{-4}+\frac13=0$

But this time it is the number among $a$, $b$, $c$ with the least magnitude that needs to have the opposite sign, which creates a contradiction with the first equation.

Therefore there is no solution for any real numbers different from zero.

We are given $c=-(a+b)$ and $\frac{1}{a}+\frac{1}{b}=-\frac{1}{c}$

So we have $\frac{1}{a}+\frac{1}{b}=\frac{1}{a+b}$

$\Rightarrow \frac{a+b}{ab}=\frac{1}{a+b}$

$\Rightarrow (a+b)^2=ab$

$\Rightarrow a^2+2ab+b^2=ab$

$\Rightarrow a^2+ab+b^2=0$

Now if we try to find the roots of the above equation for either $a$ or $b$, we can see there are no real roots.

That is,

$a=\frac{-b\pm\sqrt{b^2-4b^2}}{2}=\frac{-b\pm\sqrt{-3b^2}}{2}$

So, $a$ cannot be real.

James Felling
Jul 9, 2018

I did it differently. It is obvious that $a$, $b$, and $c$ cannot all have the same sign, since if they did, $a+b+c$ would be non-zero.

Let $a$ and $b$ share the same sign. Then $c$ has the opposite sign and $c=-(a+b)$.

So the sum of the inverses is $\frac1a+\frac1b-\frac{1}{a+b}=0$.

Multiply through by $ab(a+b)$: $b(a+b)+a(a+b)-ab=0$, so $a^2+b^2+ab=0$. Since all three of those terms are non-zero and positive (because $a$ and $b$ have the same sign), we have a contradiction.

So it's impossible.

Rocco Dalto
Jul 9, 2018

Assume there exist non-zero real numbers $a$, $b$, $c$ such that $a+b+c=0$ and $\frac{1}{a}+\frac{1}{b}+\frac{1}{c}=0$.

$\implies$

$b+c=-a$ and $b+c=-\frac{bc}{a}$

$b+c=-a \implies c=-a-b \implies b^2+ab+a^2=0 \implies b=a\left(-\frac{1}{2}\pm\frac{\sqrt{3}}{2}i\right)$

$a$ is a nonzero real number $\implies b\in\mathbb{C}$ and $c\in\mathbb{C}$ (both non-real).

$\therefore$ there do not exist non-zero real numbers $a$, $b$, $c$ such that $a+b+c=0$ and $\frac{1}{a}+\frac{1}{b}+\frac{1}{c}=0$.

Shukraditya Bose
Jul 9, 2018

By the given conditions it is found that $a^{2}+b^{2}+c^{2}=0$, while the question says that $a$, $b$, and $c$ are non-zero. But a sum of squares can't be zero unless every term is zero. So values for $a,b,c$ do not EXIST.

e1:

$a+b=-c$

e2:

$-c=\frac{1}{\frac{1}{a}+\frac{1}{b}}$

e1 = e2:

$a+b=\frac{1}{\frac{1}{a}+\frac{1}{b}}$

In pq form:

$b^2+ab+a^2=0$

The term under the root is $<0$, so $a$, $b$, or $c$ must be $0$.

I know that's not the question, but it should work if:

$b=-\frac{a}{2}\pm\frac{\sqrt{3}}{2}ai$

And

$c=-a-b$

So if two of the three numbers are non-real?

Andre Bourque
Jul 8, 2018

Similar to Kelvin Hong, we can deduce that $a,b,c$ are roots of some cubic equation of the form $x^3=k$. Because $y=x^3$ has a well-defined inverse, this equation has exactly one real solution for every real $k$; so if $a,b,c$ were all real they would be equal, and $a+b+c=0$ would force them all to be zero. Thus $a,b,c$ cannot all be real and non-zero.

Edwin Gray
Jul 4, 2018

Suppose: (1) $a+b+c=0$; (2) $\frac1a+\frac1b+\frac1c=0$. Multiply (2) by $abc$, resulting in (3) $bc+ac+ab=0$; multiply (1) by $a$, resulting in (4) $a^2+ab+ac=0$. Equating (3) and (4): (5) $a^2=bc$. But $a^2=b^2+2bc+c^2$ by (1), and substituting (5) gives $a^2+b^2+c^2=0$. Then $a=b=c=0$, and the reciprocals in (2) are undefined. Ed Gray

Brian Moehring
Jul 2, 2018

Assume $a+b+c=0$ and $\frac{1}{a}+\frac{1}{b}+\frac{1}{c}=0$.

The second equation is only defined for $b\neq0$, and both equations are homogeneous, so we may set $b=1$. Then, by the first equation, $c=-1-a$. Putting this into the second equation gives $\frac{1}{a}+1=\frac{1}{1+a} \implies a^2+a+1=0 \implies a=\frac{-1\pm i\sqrt{3}}{2}$, which is not real.
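This normalization can be replayed symbolically; the following is an editor's sketch using sympy (an assumption, any CAS works):

```python
import sympy as sp

a = sp.symbols('a')

# With b = 1 and c = -1 - a, clearing denominators in
# 1/a + 1 + 1/(-1 - a) = 0 leaves a quadratic:
eq = sp.cancel((1/a + 1 + 1/(-1 - a)) * a * (1 + a))
print(eq)  # a**2 + a + 1

# Its roots (-1 +/- i*sqrt(3))/2 are non-real:
roots = sp.solve(eq, a)
print(all(r.has(sp.I) for r in roots))  # True
```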
