- if \(\mathrm{Re}(\lambda_j)>0\) for some \(j=1,2,\cdots,n\), the system is "unstable": the norm of \(U\) grows in time without bound: \(\displaystyle\lim_{t\to\infty}\|U(t)\|=\infty\)
- if \(\mathrm{Re}(\lambda_j)<0\) for all \(j=1,2,\cdots,n\), the system is "stable": the norm of \(U\) remains bounded in time: \(\displaystyle\lim_{t\to\infty}\|U(t)\|<\infty\)
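This criterion is easy to check numerically. Below is a minimal sketch (the helper name `is_stable` is mine, not from any library): a matrix is declared stable when every eigenvalue has negative real part.

```python
import numpy as np

def is_stable(M):
    """Stability of dU/dt = M U: stable iff every eigenvalue of M
    has strictly negative real part (illustrative helper)."""
    return bool(np.all(np.linalg.eigvals(M).real < 0))

# Stable example: eigenvalues -1 and -2, both in the left half-plane.
assert is_stable(np.diag([-1.0, -2.0]))
# Unstable example: one eigenvalue is +1.
assert not is_stable(np.diag([1.0, -2.0]))
```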
- Matrix \(A\) is Hermitian (or symmetric) and negative-definite.
- Matrix \(B\) is skew-Hermitian (or skew-symmetric).
Can adding a skew-Hermitian matrix to a stable matrix make the sum \(A+B\) unstable?
Then I came to the idea of generating lots of random 2-by-2 matrices with the same properties I wanted for \(A\) and \(B\), to see whether \(A+B\) is stable or not. And it was always stable! So I got really suspicious and more motivated to continue my search. One day, Google took me to Michele Benzi's website outlining the topic of the course "Matrix Analysis", and something caught my attention: Bendixson's theorem!
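That random-matrix experiment can be sketched as follows (my reconstruction, not the original script; the construction \(A=-GG^*\) is one common way to get a random Hermitian negative-definite matrix):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hermitian_negdef(n):
    # A = -G G* is Hermitian and (almost surely) negative-definite.
    G = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return -(G @ G.conj().T)

def random_skew_hermitian(n):
    # B = (S - S*)/2 satisfies B* = -B, i.e. it is skew-Hermitian.
    S = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (S - S.conj().T) / 2

# Is A + B always stable (all eigenvalues in the left half-plane)?
for _ in range(1000):
    A = random_hermitian_negdef(2)
    B = random_skew_hermitian(2)
    assert np.all(np.linalg.eigvals(A + B).real < 0)
print("A + B was stable in every trial")
```

Every trial passes, which is exactly the empirical pattern described above.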
I was, of course, curious enough to google this. I found another paper, by Helmut Wielandt, rephrasing Bendixson's original result. And this is what I was looking for: it says that the real parts of the \(\lambda_j\)'s are bounded by the (real) eigenvalues of the Hermitian matrix \(A\), while the imaginary parts are bounded by the (imaginary) eigenvalues of the skew-Hermitian matrix \(B\).
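In modern notation, the bounds can be written as follows (my paraphrase of the statement above, for any eigenvalue \(\lambda\) of \(A+B\)):

\[
\lambda_{\min}(A)\;\le\;\mathrm{Re}(\lambda)\;\le\;\lambda_{\max}(A),
\qquad
\lambda_{\min}(-iB)\;\le\;\mathrm{Im}(\lambda)\;\le\;\lambda_{\max}(-iB),
\]

where \(-iB\) is Hermitian, so its eigenvalues are real. In particular, since \(A\) is negative-definite, \(\lambda_{\max}(A)<0\), and therefore every eigenvalue of \(A+B\) lies in the left half-plane: the sum is always stable.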
This result was obtained by Bendixson in 1901 (for real matrices) and then extended to complex matrices in 1902 by Hirsch. My problem, fortunately, was solved by this interesting theorem and ended up as a journal paper. But I am still surprised by how many mathematicians are unaware of Bendixson's theorem, even the great Terence Tao (!). To be continued!