Exponent function in Matrix

Algebra Level 4

If \( A = \begin{bmatrix} a & b \\ -b & a \end{bmatrix} \), where \( a, b \) are real numbers, then find \( e^{At} \).

The answer choices are:

  • \( \begin{bmatrix} e^{a} & e^{b} \\ e^{-b} & e^{a} \end{bmatrix} t \)
  • \( e^{at}\begin{bmatrix} \cos bt & \sin bt \\ -\sin bt & \cos bt \end{bmatrix} \)
  • \( e^{bt}\begin{bmatrix} \cos at & \sin at \\ -\sin at & \cos at \end{bmatrix} \)
  • \( e^{bt}\begin{bmatrix} \cos at & \sin bt \\ -\sin bt & \cos at \end{bmatrix} \)
  • \( e^{bt}\begin{bmatrix} \cos at & -\sin at \\ \sin at & \cos at \end{bmatrix} \)
  • \( e^{at}\begin{bmatrix} \cos at & \sin bt \\ -\sin bt & \cos at \end{bmatrix} \)
  • \( e^{at}\begin{bmatrix} \cos bt & -\sin bt \\ \sin bt & \cos bt \end{bmatrix} \)
  • \( \begin{bmatrix} a^{t}-b^{t} & abt \\ -abt & a^{t}-b^{t} \end{bmatrix} \)


1 solution

Brian Moehring
Feb 11, 2017

Let's start with some basics:

  • If \( \mathbf{M} \) is a square matrix, we define \( e^\mathbf{M} = \exp(\mathbf{M}) = \sum_{k=0}^\infty \frac{\mathbf{M}^k}{k!} \) and note that the infinite series on the right side always converges component-wise.
  • If \( \mathbf{M}, \mathbf{N} \) are square matrices that commute (i.e. \( \mathbf{M}\mathbf{N} = \mathbf{N}\mathbf{M} \)), then \( e^{\mathbf{M}+\mathbf{N}} = e^\mathbf{M} e^\mathbf{N} \). (A quick numerical check of both facts is sketched right after this list.)
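Both facts are easy to verify numerically. Here is a minimal sketch, assuming NumPy and SciPy are available (`scipy.linalg.expm` computes the matrix exponential); the matrices `M` and `N` are arbitrary test inputs chosen so that they commute:

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential

# Two commuting matrices: any polynomial in M commutes with M.
M = np.array([[1.0, 2.0], [0.0, 3.0]])
N = 2.0 * M + np.eye(2)

assert np.allclose(M @ N, N @ M)                    # they commute
assert np.allclose(expm(M + N), expm(M) @ expm(N))  # e^{M+N} = e^M e^N

# Partial sums of the defining series converge to expm(M).
partial = np.zeros_like(M)
term = np.eye(2)                # M^0 / 0!
for k in range(25):
    partial += term
    term = term @ M / (k + 1)   # next term: M^{k+1} / (k+1)!
print(np.max(np.abs(partial - expm(M))))  # ~ 0
```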

Therefore, we may write \[ At = at\left[\begin{array}{cc} 1 & 0 \\ 0 & 1\end{array}\right] + bt\left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right] = at\,\mathbf{I}_2 + bt\left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right], \] and since \( \mathbf{I}_2 \) commutes with every matrix, we can use the property mentioned above to split the problem into two simpler parts:

First, we compute \[ \exp\left(at\mathbf{I}_2\right) = \sum_{k=0}^\infty \frac{(at)^k\mathbf{I}_2^k}{k!} = \sum_{k=0}^\infty \frac{(at)^k}{k!}\,\mathbf{I}_2 = e^{at}\mathbf{I}_2. \] Next, splitting the series into even and odd powers, \[ \exp\left(bt\left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right]\right) = \sum_{k=0}^\infty \frac{(bt)^k}{k!} \left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right]^k = \sum_{k=0}^\infty \frac{(bt)^{2k}}{(2k)!} \left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right]^{2k} + \sum_{k=0}^\infty \frac{(bt)^{2k+1}}{(2k+1)!} \left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right]^{2k+1}. \]

To simplify this latter expression, note that \[ \left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right]^{2k} = (-\mathbf{I}_2)^k = (-1)^k \mathbf{I}_2, \qquad\qquad \left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right]^{2k+1} = (-1)^k \left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right], \] so that it becomes \[ \exp\left(bt\left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right]\right) = \sum_{k=0}^\infty \frac{(-1)^k(bt)^{2k}}{(2k)!}\,\mathbf{I}_2 + \sum_{k=0}^\infty \frac{(-1)^k(bt)^{2k+1}}{(2k+1)!} \left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right] = \cos(bt)\,\mathbf{I}_2 + \sin(bt)\left[\begin{array}{cc} 0 & 1 \\ -1 & 0\end{array}\right] = \left[\begin{array}{cc} \cos(bt) & \sin(bt) \\ -\sin(bt) & \cos(bt)\end{array}\right]. \]
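This power pattern is easy to confirm numerically; the following is a small sketch (assuming NumPy is available, with `J` denoting the skew matrix above):

```python
import numpy as np
from numpy.linalg import matrix_power

J = np.array([[0.0, 1.0], [-1.0, 0.0]])
I2 = np.eye(2)

assert np.allclose(J @ J, -I2)  # J^2 = -I
for k in range(5):
    assert np.allclose(matrix_power(J, 2 * k), (-1) ** k * I2)     # even powers
    assert np.allclose(matrix_power(J, 2 * k + 1), (-1) ** k * J)  # odd powers
print("power pattern verified")
```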

Finally, putting it all together, we have \[ e^{At} = e^{at}\mathbf{I}_2 \left[\begin{array}{cc} \cos(bt) & \sin(bt) \\ -\sin(bt) & \cos(bt)\end{array}\right] = e^{at} \left[\begin{array}{cc} \cos(bt) & \sin(bt) \\ -\sin(bt) & \cos(bt)\end{array}\right]. \]
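As a sanity check, the closed form can be compared against a general-purpose matrix exponential routine. A minimal sketch, assuming NumPy/SciPy and using arbitrary test values of \( a, b, t \):

```python
import numpy as np
from scipy.linalg import expm

a, b, t = 0.7, -1.3, 2.1                 # arbitrary real test values
A = np.array([[a, b], [-b, a]])

lhs = expm(A * t)                        # numerical e^{At}
rhs = np.exp(a * t) * np.array([[np.cos(b * t),  np.sin(b * t)],
                                [-np.sin(b * t), np.cos(b * t)]])
print(np.allclose(lhs, rhs))             # True
```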

Nice solution, Brian. The time you put into writing all this up is appreciated. There is a more brute-force way of demonstrating the same result, but this one is better at showing some of the interesting properties of matrix exponentiation.

Michael Mendrin - 4 years, 3 months ago


Thanks! Yeah, there are a lot of ways to see this same result. The fastest way I know of is just to note that the eigenvalues of \( At \) are \( at \pm bti \), so under a change of basis, \( e^{At} \) just looks like \( e^{at \pm bti} = e^{at}(\cos(bt) \pm i\sin(bt)) \). Therefore, the final result can be seen as a real matrix form of complex exponentiation!
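To illustrate this viewpoint concretely, here is a small numerical sketch (assuming NumPy/SciPy; \( a, b, t \) are arbitrary test values): diagonalize \( At \) over the complex numbers, exponentiate the eigenvalues, and change basis back.

```python
import numpy as np
from scipy.linalg import expm

a, b, t = 0.7, -1.3, 2.1
A = np.array([[a, b], [-b, a]])

vals, vecs = np.linalg.eig(A * t)        # eigenvalues are at ± bti
via_eig = vecs @ np.diag(np.exp(vals)) @ np.linalg.inv(vecs)

print(np.allclose(via_eig.imag, 0))            # imaginary parts cancel
print(np.allclose(via_eig.real, expm(A * t)))  # matches e^{At}
```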

Brian Moehring - 4 years, 3 months ago


I used that approach, but I kinda doubted its validity. Thanks! :)

A Former Brilliant Member - 4 years, 3 months ago


@A Former Brilliant Member To be a formal solution, you would need to check a few things (that \( At \) is diagonalizable, and what the basis-change matrix is), but it is a correct intuition in this case.

Brian Moehring - 4 years, 3 months ago


@Brian Moehring Yep, yep. It is a pretty cool method too! :P

A Former Brilliant Member - 4 years, 3 months ago

@SUMUKHA ADIGA

rakshith lokesh - 3 years, 1 month ago
