Similarity of $A$ and $A^{-1}$

Algebra Level pending

Given that $\vec{\alpha_1}, \vec{\alpha_2}, \vec{\alpha_3}$ are column vectors forming a basis for $\mathbb{R}^3$, and $A$ is a $3 \times 3$ matrix such that:

$$\begin{cases} A\vec{\alpha_1}=\vec{\alpha_1} \\ A\vec{\alpha_2}=\vec{\alpha_1}+\vec{\alpha_2} \\ A\vec{\alpha_3}=\vec{\alpha_2}+\vec{\alpha_3} \end{cases}$$

Is it always true that $A$ and $A^{-1}$ are similar, i.e. there exists an invertible matrix $P$ such that $P^{-1}AP=A^{-1}$?

Yes No


1 solution

Karan Chatrath
Jan 4, 2021

We are told that the three vectors $v_1$, $v_2$, and $v_3$ are linearly independent. (For typing convenience, I drop the vector arrows and write $v_i$ in place of $\vec{\alpha_i}$.) So it is given that:

$$Av_1 = v_1, \qquad Av_2 = v_1+v_2, \qquad Av_3 = v_2+v_3$$
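Written in coordinates with respect to the basis $\{v_1, v_2, v_3\}$ (so that $v_i$ becomes the standard basis vector $e_i$ — a coordinate choice I introduce for illustration, not part of the problem statement), the three relations pin down the matrix of $A$ completely. A minimal NumPy sketch:

```python
import numpy as np

# Coordinates with respect to the basis {v1, v2, v3}:
# v_i is represented by the standard basis vector e_i.
e1, e2, e3 = np.eye(3)

# The columns of A (in this basis) are the images of e1, e2, e3,
# read off directly from the three given relations.
A = np.column_stack([e1,        # A v1 = v1
                     e1 + e2,   # A v2 = v1 + v2
                     e2 + e3])  # A v3 = v2 + v3

print(A)
# A is upper triangular with 1s on the diagonal,
# so det(A) = 1 and A is invertible.
assert np.isclose(np.linalg.det(A), 1.0)
```

In this basis $A$ is a single Jordan block with eigenvalue $1$, which is why it is invertible regardless of which basis $\{v_1, v_2, v_3\}$ actually is.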

It is obvious from the first equation that $v_1$ is an eigenvector of the matrix $A$ and the corresponding eigenvalue is unity. Looking at the second equation:

$$(A-I)v_2 = v_1$$
$$(A-AA^{-1})v_2 = v_1$$
$$A(I-A^{-1})v_2 = v_1$$
$$(I-A^{-1})v_2 = A^{-1}v_1$$

$$\because Av_1 = v_1 \implies A^{-1}v_1 = v_1$$
$$(I-A^{-1})v_2 = A^{-1}v_1 = Av_1 = v_1$$
$$(I-A^{-1})v_2 = (A-I)v_2 = v_1$$
$$\implies I-A^{-1} = A-I$$
$$\boxed{A + A^{-1} = 2I}$$

Let us look at the third equation now:

$$(A-I)v_3 = v_2$$

We already know that $(I-A^{-1})v_2 = v_1$, so:
$$(I-A^{-1})(A-I)v_3 = v_1 \implies (A + A^{-1} - 2I)v_3 = v_1$$
Since $A + A^{-1} = 2I$, the left-hand side vanishes, $\therefore v_1 = 0$.

So, $v_1$ is essentially the null vector. Coming back to the relations:

$$(I-A^{-1})v_2 = (A-I)v_2 = v_1$$

If $v_2$ is a nonzero vector, then it must be true that:

$$A = A^{-1} = I$$

This makes sense, as the only eigenvalue of the identity matrix is unity and all vectors in $\mathbb{R}^3$ are eigenvectors of the identity matrix. So essentially:

$$P^{-1}AP = A^{-1} \implies P^{-1}P = I$$

which is an identity for every invertible matrix $P$. So the answer to the question is $\boxed{\mathrm{YES}}$.
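Whatever the status of the intermediate steps, the final answer itself can be sanity-checked numerically. In coordinates with respect to $\{v_1, v_2, v_3\}$ (again my own illustrative coordinate choice), the three given relations make $A$ the Jordan block $J$ below, whose only eigenvalue is $1$. Two matrices with a single common eigenvalue are similar exactly when the ranks of $(M - I)^k$ agree for every $k$, since those ranks determine the Jordan structure:

```python
import numpy as np

# Matrix of A in the basis {v1, v2, v3}, read off from the given relations.
J = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [0., 0., 1.]])
Jinv = np.linalg.inv(J)

def ranks(M):
    # Ranks of (M - I)^k for k = 1, 2, 3 determine the Jordan
    # structure of a 3x3 matrix whose only eigenvalue is 1.
    N = M - np.eye(3)
    return [int(np.linalg.matrix_rank(np.linalg.matrix_power(N, k)))
            for k in (1, 2, 3)]

# Identical rank sequences => identical Jordan forms => J and J^{-1}
# are similar, consistent with the answer YES.
print(ranks(J), ranks(Jinv))  # both [2, 1, 0]
```

Note that this check also shows $J \neq I$, so the similarity of $A$ and $A^{-1}$ holds for a more interesting reason than $A$ being the identity.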

To be honest, I am not convinced by my own solution now that I think more about it. I am open to counter-arguments. Concluding that $v_1$ is the null vector contradicts the given information that the vectors are linearly independent. Or am I making a more fundamental conceptual error?

Karan Chatrath - 5 months, 1 week ago


In the line $(I-A^{-1})v_2 = (A-I)v_2 = v_1$, the equality $(I-A^{-1})v_2 = (A-I)v_2$ does not guarantee that $I-A^{-1} = A-I$.
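A concrete illustration of this point (my own $2 \times 2$ example, not taken from the problem): two matrices can agree on one particular vector without being equal.

```python
import numpy as np

# Two different matrices that act identically on one chosen vector.
M = np.array([[1., 0.],
              [0., 2.]])
N = np.array([[1., 0.],
              [0., 3.]])
v = np.array([1., 0.])

# M and N agree on this particular v...
assert np.array_equal(M @ v, N @ v)
# ...but are not equal: "Mv = Nv" constrains only the action on v,
# not the matrices themselves.
assert not np.array_equal(M, N)
```

So the step $I-A^{-1} = A-I$ (and with it the boxed relation $A + A^{-1} = 2I$) does not follow from equality of the two products on the single vector $v_2$.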

Alice Smith - 5 months, 1 week ago
