Given that $\alpha_1, \alpha_2, \alpha_3$ are column vectors that form a basis for $\mathbb{R}^3$, and $A$ is a $3 \times 3$ matrix such that:

$$\begin{cases} A\alpha_1 = \alpha_1 \\ A\alpha_2 = \alpha_1 + \alpha_2 \\ A\alpha_3 = \alpha_2 + \alpha_3 \end{cases}$$

Is it always true that $A$ and $A^{-1}$ are similar, i.e. there exists an invertible matrix $P$ such that $P^{-1}AP = A^{-1}$?
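A concrete representation makes the question easier to experiment with: in coordinates relative to the basis $(\alpha_1, \alpha_2, \alpha_3)$, the three relations say $A$ is represented by the single Jordan block $J_3(1)$. A minimal sympy sketch of that setup (the coordinate convention and variable names are illustrative, not part of the original problem):

```python
import sympy as sp

# In coordinates w.r.t. the basis (alpha_1, alpha_2, alpha_3), the
# relations A*a1 = a1, A*a2 = a1 + a2, A*a3 = a2 + a3 say the columns
# of A are e1, e1 + e2, e2 + e3 -- the Jordan block J_3(1).
A = sp.Matrix([[1, 1, 0],
               [0, 1, 1],
               [0, 0, 1]])

e1, e2, e3 = (sp.Matrix(c) for c in ([1, 0, 0], [0, 1, 0], [0, 0, 1]))
assert A * e1 == e1            # A alpha_1 = alpha_1
assert A * e2 == e1 + e2       # A alpha_2 = alpha_1 + alpha_2
assert A * e3 == e2 + e3       # A alpha_3 = alpha_2 + alpha_3
print(A.inv())                 # invertible: det(A) = 1
```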
To be honest, I am not convinced by my own solution now that I think more about it. I am open to counter-arguments. Concluding that $v_1$ is a null vector contradicts the given information that the vectors are linearly independent. Or am I making a more fundamental conceptual error?
In the line $(I - A^{-1})v_2 = (A - I)v_2 = v_1$, the equality $(I - A^{-1})v_2 = (A - I)v_2$ does not guarantee that $I - A^{-1} = A - I$: two matrices $M$ and $N$ that agree on a single vector $v_2$ only satisfy $(M - N)v_2 = 0$, i.e. $v_2 \in \ker(M - N)$; they need not be equal as matrices.
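To see this concretely, here is a tiny numpy illustration with arbitrarily chosen matrices that agree on one vector without being equal:

```python
import numpy as np

# Two different matrices that happen to agree on one vector:
M = np.array([[1.0, 2.0],
              [0.0, 1.0]])
N = np.eye(2)
v = np.array([1.0, 0.0])          # v lies in ker(M - N)

print(np.allclose(M @ v, N @ v))  # True  -- M and N agree on v
print(np.array_equal(M, N))       # False -- yet M != N
```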
We are told that the three vectors ($v_1$, $v_2$, and $v_3$) are linearly independent. I am not using the vector symbol and am dropping the Greek letter $\alpha$ for the sake of typing convenience. So it is given that:
$$A v_1 = v_1, \qquad A v_2 = v_1 + v_2, \qquad A v_3 = v_2 + v_3$$
It is obvious from the first equation that $v_1$ is an eigenvector of the matrix $A$ and the corresponding eigenvalue is unity. Looking at the second equation:
$$\begin{aligned} (A - I)v_2 &= v_1 \\ (A - AA^{-1})v_2 &= v_1 \\ A(I - A^{-1})v_2 &= v_1 \\ (I - A^{-1})v_2 &= A^{-1}v_1 \end{aligned}$$

$$\because Av_1 = v_1 \implies A^{-1}v_1 = v_1$$
$$(I - A^{-1})v_2 = A^{-1}v_1 = Av_1 = v_1$$
$$(I - A^{-1})v_2 = (A - I)v_2 = v_1 \implies I - A^{-1} = A - I$$
$$A + A^{-1} = 2I$$
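The comment earlier on the page questions exactly this step; with the concrete Jordan-block representation of $A$ (same illustrative coordinates as above), the claimed identity can be tested directly:

```python
import sympy as sp

A = sp.Matrix([[1, 1, 0],
               [0, 1, 1],
               [0, 0, 1]])   # A in the basis (alpha_1, alpha_2, alpha_3)

print(A + A.inv())                   # Matrix([[2, 0, 1], [0, 2, 0], [0, 0, 2]])
print(A + A.inv() == 2 * sp.eye(3))  # False: A + A^{-1} = 2I does not hold here
```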
Let us look at the third equation now:
$$(A - I)v_3 = v_2$$
We already know that:
$$(I - A^{-1})v_2 = v_1 \implies (I - A^{-1})(A - I)v_3 = v_1 \implies (A + A^{-1} - 2I)v_3 = v_1$$
Combining this with $A + A^{-1} = 2I$ from above, the left-hand side vanishes:
$$\therefore v_1 = 0$$
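For the concrete representation above, this product can also be checked directly: $(A + A^{-1} - 2I)v_3$ comes out equal to $v_1 \neq 0$, which is the contradiction raised in the comments. A sketch:

```python
import sympy as sp

A = sp.Matrix([[1, 1, 0],
               [0, 1, 1],
               [0, 0, 1]])
v1 = sp.Matrix([1, 0, 0])     # alpha_1 in basis coordinates
v3 = sp.Matrix([0, 0, 1])     # alpha_3 in basis coordinates

lhs = (A + A.inv() - 2 * sp.eye(3)) * v3
print(lhs == v1)              # True: the product is v_1, not the zero vector
```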
So, $v_1$ is essentially the null vector. Coming back to the relations:
$$(I - A^{-1})v_2 = (A - I)v_2 = v_1$$
If $v_2$ is a nonzero vector, then it must be true that:
$$A = A^{-1} = I$$
This makes sense, as an eigenvalue of the identity matrix is unity and all vectors in $\mathbb{R}^3$ are eigenvectors of the identity matrix. So essentially:
$$P^{-1}AP = A^{-1} \implies P^{-1}P = I$$
which is an identity for every invertible matrix $P$. So the answer to the question is $\boxed{\text{YES}}$.
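Independently of the disputed steps, the final answer checks out: in basis coordinates $A$ is the Jordan block $J_3(1)$, and $A^{-1}$ is also unipotent with $\operatorname{rank}(A^{-1} - I) = 2$, so both have the same Jordan form and are therefore similar. A sympy sketch that also produces an explicit $P$ (building $P$ from the two Jordan decompositions is my own illustration, not from the solution above):

```python
import sympy as sp

A = sp.Matrix([[1, 1, 0],
               [0, 1, 1],
               [0, 0, 1]])
B = A.inv()

# jordan_form returns (P, J) with M = P * J * P^{-1}.
PA, JA = A.jordan_form()
PB, JB = B.jordan_form()
assert JA == JB               # both equal the single block J_3(1)

# From A = PA J PA^{-1} and B = PB J PB^{-1}, P = PA * PB^{-1} works:
P = PA * PB.inv()
assert P.inv() * A * P == B   # A and A^{-1} are similar
print(P)
```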