\[ \begin{vmatrix} e^{a} & e^{b} & e^{c} \\ e^{2a} & e^{2b} & e^{2c} \\ e^{3a} - 1 & e^{3b} - 1 & e^{3c} - 1 \end{vmatrix} \]
If $a$, $b$ and $c$ are the cube roots of unity, find the determinant of the matrix above.
Just the thing I did ;)
Yeah, absolutely ;)
First put $a = 1$, $b = \omega$, $c = \omega^2$, and let the value of the determinant be $A$.
Then put $a = \omega^2$, $b = \omega$, $c = 1$, and let the value of the determinant be $B$.
Since the problem does not say which root is labelled $a$, $b$ or $c$, the answer cannot depend on the labelling, so $B = A$. But the second assignment simply swaps the first and third columns of the matrix, and swapping two columns negates a determinant, so $B = -A$.
$A = B = -A \Rightarrow A = 0$
Classic JEE style.
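As a quick numerical sanity check (not part of the original solution), a few lines of Python/NumPy can evaluate the determinant for one labelling of the roots, say $a = 1$, $b = \omega$, $c = \omega^2$; the result comes out as zero up to floating-point error, in agreement with the argument above.

```python
# Sanity check of the answer A = 0 (a sketch, assuming plain NumPy is acceptable).
# Build the 3x3 matrix from the problem with a = 1, b = omega, c = omega^2,
# where omega = e^(2*pi*i/3) is a primitive cube root of unity, and check
# that its determinant is numerically ~0.
import numpy as np

omega = np.exp(2j * np.pi / 3)       # primitive cube root of unity
a, b, c = 1, omega, omega**2         # the three cube roots of unity

M = np.array([
    [np.exp(a),         np.exp(b),         np.exp(c)],
    [np.exp(2 * a),     np.exp(2 * b),     np.exp(2 * c)],
    [np.exp(3 * a) - 1, np.exp(3 * b) - 1, np.exp(3 * c) - 1],
])

print(abs(np.linalg.det(M)))         # prints a value on the order of 1e-15
```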