Two non-null matrices \(A\) and \(B\) satisfy \(AB = B\) and \(BA = A\). Then the matrix \(A^2 + B^2\) equals
Here \(I\) stands for the identity matrix.
Let the matrix \(A\) have \(m\) rows and \(n\) columns, and let the matrix \(B\) have \(n\) rows and \(p\) columns. The product \(AB\) will therefore have \(m\) rows and \(p\) columns. (The number of columns of \(A\) must equal the number of rows of \(B\); otherwise the product \(AB\) is not defined.)

Since \(AB = B\), the shapes \(m \times p\) and \(n \times p\) must agree, so we can conclude that \(m = n\). A similar argument applied to \(BA = A\) gives
\[ m = n = p; \]
the small shape check below illustrates the first step.
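Here is a quick numerical illustration of that shape constraint using numpy; the particular dimensions are arbitrary choices of mine, just to show the mismatch:

```python
import numpy as np

# If A is m x n and B is n x p, then AB is m x p.
# AB = B can only hold if m = n, since B itself is n x p.
A = np.ones((2, 3))   # m = 2, n = 3
B = np.ones((3, 4))   # n = 3, p = 4

print((A @ B).shape)  # (2, 4): a 2 x 4 matrix can never equal the 3 x 4 matrix B
```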
Therefore \(A\) and \(B\) are square matrices. Consider the expression
\[ X = A^2 + B^2 = A \cdot A + B \cdot B. \]
Assuming the matrices \(A\) and \(B\) are invertible, from the given information we get
\[ A = BB^{-1} = I, \qquad B = AA^{-1} = I. \]
This leads to
\[ X = A \cdot A + B \cdot B = A\left(BB^{-1}\right) + B\left(AA^{-1}\right), \]
which simplifies to
\[ X = A^2 + B^2 = A + B = 2I. \]
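As a sanity check of this derivation, here is a minimal numpy sketch of the invertible case; under the assumption above the conditions force \(A = B = I\) (shown for \(2 \times 2\) matrices, a size of my choosing):

```python
import numpy as np

# Under the invertibility assumption, the argument above gives A = B = I,
# and then A^2 + B^2 = 2I.
A = np.eye(2)
B = np.eye(2)

assert np.allclose(A @ B, B)                      # AB = B
assert np.allclose(B @ A, A)                      # BA = A
assert np.allclose(A @ A + B @ B, 2 * np.eye(2))  # A^2 + B^2 = 2I
print("A = B = I satisfies both conditions and gives A^2 + B^2 = 2I")
```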
Is there a way of proving this result without assuming that \(A\) and \(B\) are invertible? I am curious. If not, the problem statement should mention that the matrices are full rank (row or column).
What I have done is as follows:
\[ AB = B \implies AB = IB \implies (A - I)B = O \implies A = I. \]
Similarly, \(BA = A \implies B = I\). Hence \(A^2 = A\), \(B^2 = B\), and \(A^2 + B^2 = A + B\).
Even though it is implicit that the matrices \(A\) and \(B\) are invertible, it was not necessary to use this fact explicitly.
The matrices
\[ A = \begin{pmatrix} 1 & 2 \\ 0 & 0 \end{pmatrix} \quad \text{and} \quad B = \begin{pmatrix} 1 & -2 \\ 0 & 0 \end{pmatrix} \]
satisfy the conditions, but \(A \neq I\) and neither matrix is invertible...
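For what it's worth, here is a quick numpy check of these two matrices (as I read them):

```python
import numpy as np

A = np.array([[1.0,  2.0],
              [0.0,  0.0]])
B = np.array([[1.0, -2.0],
              [0.0,  0.0]])

assert np.allclose(A @ B, B)              # AB = B
assert np.allclose(B @ A, A)              # BA = A
assert not np.allclose(A, np.eye(2))      # A is not the identity
assert np.isclose(np.linalg.det(A), 0.0)  # A is singular
assert np.isclose(np.linalg.det(B), 0.0)  # B is singular

# The conclusion that survives without invertibility:
assert np.allclose(A @ A + B @ B, A + B)  # A^2 + B^2 = A + B (and here A + B != 2I)
```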
@Chris Lewis – Brilliant! In fact, the matrices
\[ A = \begin{bmatrix} 1 & a \\ 0 & 0 \end{bmatrix} \quad \text{and} \quad B = \begin{bmatrix} 1 & b \\ 0 & 0 \end{bmatrix} \]
are non-invertible, non-identity matrices satisfying those conditions.
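A symbolic check of this family with sympy, treating \(a\) and \(b\) as free parameters:

```python
import sympy as sp

a, b = sp.symbols('a b')

A = sp.Matrix([[1, a], [0, 0]])
B = sp.Matrix([[1, b], [0, 0]])

assert A * B == B                      # AB = B for every a, b
assert B * A == A                      # BA = A for every a, b
assert A**2 + B**2 == A + B            # A^2 + B^2 = A + B
assert A.det() == 0 and B.det() == 0   # neither matrix is invertible
print("family verified for symbolic a, b")
```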
Aha - found a way! Not assuming \(A\) and \(B\) are invertible:
\[ AB = B \implies (AB)A = BA = A \implies A(BA) = A \implies A^2 = A \]
and
\[ BA = A \implies (BA)B = AB = B \implies B(AB) = B \implies B^2 = B, \]
so \(A^2 + B^2 = A + B\).
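If one wants this argument checked formally, here is a short Lean 4 sketch (assuming Mathlib is available). Associativity is the only property used, so it is stated for an arbitrary semigroup, of which square matrices are an instance:

```lean
import Mathlib

-- From a*b = b and b*a = a, associativity alone forces a*a = a;
-- swapping the roles of a and b gives b*b = b in the same way.
example {M : Type*} [Semigroup M] (a b : M)
    (h1 : a * b = b) (h2 : b * a = a) : a * a = a := by
  calc a * a = a * (b * a) := by rw [h2]
    _ = a * b * a := by rw [mul_assoc]
    _ = b * a := by rw [h1]
    _ = a := h2
```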
We have
\[ A^2 = BA \cdot BA = B \cdot AB \cdot A = B \cdot B \cdot A = B \cdot BA = B \cdot A = BA = A \]
and
\[ B^2 = AB \cdot AB = A \cdot BA \cdot B = A \cdot A \cdot B = A \cdot AB = A \cdot B = AB = B, \]
so \(A^2 + B^2 = A + B\).