Oct 31, 2020

Discovering the lost "Bendixson's theorem"

Working on my PhD thesis, I had to investigate the "stability" of an "evolutionary system" \(U'(t)=(A+B)\,U\), where \(U=(u_1,u_2,\cdots,u_n)\) is a vector and both \(A\) and \(B\) are square matrices of size \(n\). Such a system describes the dynamics of the vector \(U\) over time. Here, stability asks whether the norm of the solution vector, denoted by \(\|U(t)\|\), remains bounded as \(t\to\infty\) or blows up. It is a well-established result in control theory and differential equations that stability is governed by the "eigenvalues" of the matrix \(A+B\), denoted by \((\lambda_j)_{j=1}^n\). What matters, in fact, is the sign of the real parts of the eigenvalues:
  • if \(\mathrm{Re}(\lambda_j)>0\) for some \(j\in\{1,2,\cdots,n\}\), the system is "unstable": the norm of \(U\) grows in time without any bound, \(\displaystyle\lim_{t\to\infty}\|U(t)\|=\infty\) (for generic initial data)
  • if \(\mathrm{Re}(\lambda_j)<0\) for all \(j=1,2,\cdots,n\), the system is "stable": the norm of \(U\) stays bounded in time, and in fact \(\displaystyle\lim_{t\to\infty}\|U(t)\|=0\) (a quick numerical check of this criterion is sketched right after this list)
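As a quick sanity check of the criterion, here is a minimal sketch in Python with NumPy/SciPy (the post itself contains no code; the matrix and all names here are mine, purely for illustration). The toy matrix has eigenvalues \(-1\pm 2i\), so the criterion predicts a stable system:

```python
import numpy as np
from scipy.linalg import expm

# Toy system U'(t) = M U with eigenvalues -1 +/- 2i:
# all real parts are negative, so the criterion predicts stability.
M = np.array([[-1.0,  2.0],
              [-2.0, -1.0]])
print(np.linalg.eigvals(M))      # [-1.+2.j -1.-2.j]

U0 = np.array([1.0, 1.0])        # arbitrary initial condition
for t in [0.0, 1.0, 5.0, 10.0]:
    Ut = expm(t * M) @ U0        # exact solution: U(t) = e^{tM} U(0)
    print(f"t = {t:5.1f}   ||U(t)|| = {np.linalg.norm(Ut):.6f}")
```

The printed norms decay like \(e^{-t}\); flipping the signs of the diagonal entries makes them blow up instead.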
Of course, I knew this famous result. My problem, however, was that I could not determine the sign of \(\mathrm{Re}(\lambda_j)\) for my matrix \(A+B\), because the entries were parametric! What I had, in addition, was
  • Matrix \(A\) is Hermitian (or symmetric) and negative-definite.
  • Matrix \(B\) is skew-Hermitian (or skew-symmetric).
As matrix \(A\) is Hermitian and negative-definite, its eigenvalues are real and negative, so this part by itself is "stable". The question of stability of \(U'(t)=(A+B)\,U\) thus boils down to an algebraic question:

Can adding a skew-Hermitian matrix to a stable matrix make the sum \(A+B\) unstable?

\[ \underbrace{A}_{\text{stable}} + \underbrace{B}_{\text{skew-Hermitian}} \Longrightarrow \text{stable?} \] I dug through the literature to answer this question. I found tons of inequalities (partially discussed on Terence Tao's blog) giving bounds on the eigenvalues of matrix sums, but nothing that could help me!

Then, I came to the idea of generating lots of random 2-by-2 matrices with the same properties as I wanted for \(A\) and \(B\), to see whether \(A+B\) is stable or not. And it was always stable! So, I got really suspicious, and more motivated to continue my search.
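Here is roughly what that experiment looked like (a reconstruction in Python with NumPy; the original script is not in the post, and the helper names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2  # 2-by-2 matrices, as in the experiment

def random_hermitian_negative_definite(n):
    # -X X^* is Hermitian and (almost surely) negative-definite
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return -(X @ X.conj().T)

def random_skew_hermitian(n):
    # (Y - Y^*)/2 is skew-Hermitian by construction
    Y = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (Y - Y.conj().T) / 2

unstable = 0
for _ in range(100_000):
    A = random_hermitian_negative_definite(n)
    B = random_skew_hermitian(n)
    # flag any trial where A + B fails the strict stability criterion,
    # i.e. some eigenvalue has a nonnegative real part
    if np.linalg.eigvals(A + B).real.max() >= 0:
        unstable += 1

print(f"unstable cases out of 100000: {unstable}")  # prints 0
```

No counterexample ever shows up, which is exactly the kind of stubborn pattern that suggests a theorem is hiding behind it.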
One day, Google took me to Michele Benzi's website outlining the topics of his course "Matrix Analysis", and something caught my attention: Bendixson's theorem! I was, of course, curious enough to Google it. I found a paper by Helmut Wielandt rephrasing Bendixson's original result, and it was just what I was looking for: it says that the real parts of the \(\lambda_j\)'s are bounded by the (real) eigenvalues of the Hermitian matrix \(A\), while the imaginary parts are bounded by the (imaginary parts of the) eigenvalues of the skew-Hermitian matrix \(B\).
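In modern notation (my paraphrase of the statement, using the notation of this post): write \(\mu_1\le\mu_2\le\cdots\le\mu_n\) for the eigenvalues of the Hermitian matrix \(A\), and \(i\nu_1,i\nu_2,\cdots,i\nu_n\) for the (purely imaginary) eigenvalues of the skew-Hermitian matrix \(B\). Then every eigenvalue \(\lambda_j\) of \(A+B\) satisfies \[ \mu_1 \;\le\; \mathrm{Re}(\lambda_j) \;\le\; \mu_n, \qquad \min_k \nu_k \;\le\; \mathrm{Im}(\lambda_j) \;\le\; \max_k \nu_k. \] In particular, my \(A\) was negative-definite, so \(\mu_n<0\) and hence \(\mathrm{Re}(\lambda_j)<0\) for every \(j\): the sum \(A+B\) stays stable no matter which skew-Hermitian \(B\) is added.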

This result was obtained by Bendixson in 1901 (for real matrices) and then extended to complex matrices in 1902 by Hirsch.
My problem, fortunately, was solved by this interesting theorem, and the work ended up as a journal paper. But I am still surprised at how many mathematicians are unaware of Bendixson's theorem, even the great Terence Tao (!)
To be continued!
