Given the matrix A, let sin(A) denote the sine of the matrix A. This is defined in terms of the well-known Taylor expansion of the sine function. Find sin(A).
Firstly, thank you for this interesting follow-up problem.
I used the fact that a diagonalizable square matrix can be factorised via its eigendecomposition, as described in this link.
So, the matrix A can be written as:
\[ A = V \Lambda V^{-1} \]
Here, \( \Lambda \) is a diagonal matrix whose diagonal entries are the eigenvalues of A, while V is a square matrix whose columns are the corresponding eigenvectors of A.
\[ \Lambda = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix} \]
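For the 2x2 case, the eigendecomposition can be carried out by hand with the quadratic formula. A minimal sketch in pure Python, using a hypothetical symmetric matrix since the problem's actual matrix is not reproduced here:

```python
import math

# Hypothetical 2x2 matrix for illustration (not the problem's matrix).
A = [[1.0, 2.0],
     [2.0, 1.0]]

def eig2(M):
    """Eigenvalues and eigenvector matrix V of a 2x2 matrix,
    assuming real, distinct eigenvalues and M[0][1] != 0."""
    a, b = M[0]
    c, d = M[1]
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)   # real by assumption
    l1, l2 = (tr + disc) / 2, (tr - disc) / 2
    # For each eigenvalue lam, the vector (b, lam - a) solves (M - lam I) v = 0.
    V = [[b, b],
         [l1 - a, l2 - a]]                # columns are eigenvectors
    return (l1, l2), V

vals, V = eig2(A)
```

For the sample matrix above this yields eigenvalues 3 and -1, with eigenvectors proportional to (1, 1) and (1, -1).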
Having decomposed the above matrix, we now look at the sine series expansion for matrices:
\[ \sin(A) = A - \frac{A^3}{3!} + \frac{A^5}{5!} - \cdots \]
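This series can also be evaluated numerically by truncating it after a fixed number of terms, which gives a useful check on the closed form derived below. A sketch in pure Python, again using a hypothetical matrix in place of the one from the problem:

```python
import math

# Hypothetical 2x2 matrix for illustration (not the problem's matrix).
A = [[1.0, 2.0],
     [2.0, 1.0]]

def matmul2(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def sin_series(M, terms=20):
    """sin(M) = M - M^3/3! + M^5/5! - ..., truncated after `terms` odd powers."""
    total = [[0.0, 0.0], [0.0, 0.0]]
    power = [row[:] for row in M]       # current odd power of M, starting at M^1
    M2 = matmul2(M, M)
    sign, k = 1.0, 1
    for _ in range(terms):
        coeff = sign / math.factorial(k)
        for i in range(2):
            for j in range(2):
                total[i][j] += coeff * power[i][j]
        power = matmul2(power, M2)      # advance to the next odd power
        sign, k = -sign, k + 2
    return total
```

Twenty terms are far more than enough here: the factorials dominate the matrix powers quickly, so the truncation error is negligible for small matrices.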
\[ \sin(A) = V \Lambda V^{-1} - \frac{(V \Lambda V^{-1})^3}{3!} + \frac{(V \Lambda V^{-1})^5}{5!} - \cdots \]
Remember that \( V V^{-1} = I \) and \( V^{-1} V = I \).
The series above can be rewritten as:
\[ \sin(A) = V \Lambda V^{-1} - \frac{1}{3!}\, V \Lambda V^{-1} V \Lambda V^{-1} V \Lambda V^{-1} + \cdots \]
Every inner \( V^{-1} V \) product collapses to the identity matrix, so the expression simplifies to:
\[ \sin(A) = V \Lambda V^{-1} - \frac{V \Lambda^3 V^{-1}}{3!} + \cdots \]
or:
\[ \sin(A) = V \left( \Lambda - \frac{\Lambda^3}{3!} + \cdots \right) V^{-1} \]
This gives us the result:
\[ \sin(A) = V \sin(\Lambda)\, V^{-1} \]
Applying the same series to the diagonal matrix \( \Lambda \), whose powers remain diagonal, shows that the sine acts entrywise on the diagonal:
\[ \sin(\Lambda) = \begin{pmatrix} \sin(\lambda_1) & 0 \\ 0 & \sin(\lambda_2) \end{pmatrix} \]
Finally, the required result is:
\[ \sin(A) = V \begin{pmatrix} \sin(\lambda_1) & 0 \\ 0 & \sin(\lambda_2) \end{pmatrix} V^{-1} \]
From here, it is just a matter of computation.
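That final computation can be sketched end-to-end in pure Python. Since the problem's matrix is not reproduced here, a hypothetical symmetric 2x2 matrix stands in; the steps are the eigendecomposition, the entrywise sine on the diagonal, and the 2x2 adjugate formula for the inverse:

```python
import math

# Hypothetical 2x2 matrix for illustration (not the problem's matrix).
A = [[1.0, 2.0],
     [2.0, 1.0]]

def matmul2(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Eigendecomposition, assuming real, distinct eigenvalues and A[0][1] != 0.
a, b = A[0]
c, d = A[1]
tr, det = a + d, a * d - b * c
disc = math.sqrt(tr * tr - 4 * det)
l1, l2 = (tr + disc) / 2, (tr - disc) / 2
V = [[b, b],
     [l1 - a, l2 - a]]                  # columns are eigenvectors

# Inverse of V via the 2x2 adjugate formula.
dV = V[0][0] * V[1][1] - V[0][1] * V[1][0]
Vinv = [[ V[1][1] / dV, -V[0][1] / dV],
        [-V[1][0] / dV,  V[0][0] / dV]]

# sin(A) = V sin(Lambda) V^{-1}, with sin applied entrywise on the diagonal.
sinL = [[math.sin(l1), 0.0],
        [0.0, math.sin(l2)]]
sinA = matmul2(matmul2(V, sinL), Vinv)
```

For this sample matrix, with eigenvalues 3 and -1, the result has diagonal entries \( (\sin 3 + \sin(-1))/2 \) and off-diagonal entries \( (\sin 3 - \sin(-1))/2 \).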