It’s fascinating to learn linear algebra knowing its enormous applications across all kinds of verticals: from video games and neural networks to Google Search, epidemic modeling, and forensic detection.
Thanks to 3Blue1Brown, we learn to look at a matrix not as an abstract symbol, but as a function/transformation of the coordinate system, carrying the original data points to new positions.
During this transformation, there are some special circumstances where an original vector (a position, a proportion, call it whatever you wish) is not knocked off its own span — it only gets scaled by some factor λ. In symbols, M·v = λ·v, or equivalently (M − λI)·v = 0; for a nonzero v this forces det(M − λI) = 0. This v is the eigenvector and λ the eigenvalue. So obviously, if M is a diagonal matrix, meaning it just scales the basis vectors up or down, each basis vector is an eigenvector, and together they are called an eigenbasis.
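A tiny sketch of the definition above, using a made-up 2×2 matrix M just for illustration: each eigenvector is only scaled by its eigenvalue, and det(M − λI) vanishes.

```python
import numpy as np

# A hypothetical 2x2 transformation, purely for illustration.
M = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# numpy finds the eigenvalues and eigenvectors for us.
vals, vecs = np.linalg.eig(M)

for lam, v in zip(vals, vecs.T):
    # Each eigenvector is only scaled: M @ v == lambda * v.
    assert np.allclose(M @ v, lam * v)
    # Equivalently, (M - lambda*I) has determinant 0.
    assert np.isclose(np.linalg.det(M - lam * np.eye(2)), 0.0)

# For a diagonal matrix, the basis vectors themselves are
# eigenvectors (the "eigenbasis" case).
D = np.diag([2.0, 5.0])
e1 = np.array([1.0, 0.0])
assert np.allclose(D @ e1, 2.0 * e1)
```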
Eigenvectors/eigenvalues reveal long-term behavior or equilibrium. The fox–rabbit population model illustrates this eigenvector concept beautifully. The math context is that the fox population depends on its own current population, while the rabbit population depends on both its own population (positively) and the fox population (negatively). The derivative equations in the top-right corner can be expressed in linear algebra language in the matrix form below.
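Here is a rough sketch of how the eigen-analysis plays out, with coefficients I made up (the post's actual matrix may differ): the sign of the dominant eigenvalue tells you whether the populations grow or die out, and its eigenvector gives the long-run rabbit-to-fox ratio.

```python
import numpy as np

# Hypothetical coefficients for a linear model
#   d/dt [R, F] = A @ [R, F]
# Rabbits grow on their own (+1.1 R) and are eaten by foxes (-0.4 F);
# the fox row is an illustrative guess, not taken from the post.
A = np.array([[1.1, -0.4],
              [0.4, -0.1]])

vals, vecs = np.linalg.eig(A)

# The dominant eigenvalue governs the long-term fate:
# positive -> its eigenvector direction grows, negative -> it decays.
i = np.argmax(vals.real)
dominant = vals[i].real
direction = vecs[:, i].real

print("dominant eigenvalue:", dominant)
print("long-term R:F ratio ~", direction[0] / direction[1])
```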
When the coefficients of R and F vary, the eigenvalue λ and the eigenvector (the borderline in the graph above) move from left to right, favoring foxes or rabbits respectively. In a nutshell, once we can figure out this eigenvector/eigenvalue pair, we become like an omniscient god, able to predict the evolution of the fox/rabbit populations precisely.
Similarly, there is a famous zombie/human puzzle that is solved with the same approach, using a matrix with a special name: the Markov matrix.
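A minimal sketch of the Markov-matrix idea, with transition probabilities I invented for illustration: repeatedly applying the matrix drives any starting split of humans and zombies toward the eigenvector with eigenvalue 1, the equilibrium.

```python
import numpy as np

# A hypothetical 2-state Markov matrix: columns are today's state
# (human, zombie), rows are tomorrow's; each column sums to 1.
# These probabilities are made up for illustration.
P = np.array([[0.8, 0.1],   # stays/becomes human
              [0.2, 0.9]])  # stays/becomes zombie

# Power iteration: any starting distribution converges to the
# eigenvector with eigenvalue 1.
x = np.array([0.5, 0.5])
for _ in range(1000):
    x = P @ x

print("steady state:", x)
print("unchanged by P:", np.allclose(P @ x, x))
```

With these made-up numbers the population settles at 1/3 human, 2/3 zombie, and applying P again changes nothing: that is exactly the eigenvector equation P·x = 1·x.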
Next, on to its application in PageRank, the core algorithm underlying the behemoth that is the Google Search Engine.
I previously wrote a blog post on PageRank intending to explain the basic notion, but it merely touched on the mathematical logic. The question is well known, yet most people still can’t answer it: why is PageRank an eigenvector problem?
It can be depicted with a simplified 4-page network:
The original 0.25 comes from this assumption: if you spent a long time randomly clicking links (with equal time between clicks), what percent of your time would be spent on each site? If there are only 4 pages, each starts with 0.25.
You repeatedly multiply R1 by the ABCD matrix to get R2, R3, …, Rn. As n grows toward infinity, the R vector converges to a stable set of constants, the equilibrium state: that is the eigenvector.
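The iteration above can be sketched in a few lines. The link structure here is a hypothetical 4-page network (the post's actual diagram may differ); each column of M is one page's outgoing links, weighted equally so columns sum to 1.

```python
import numpy as np

# Hypothetical link matrix for pages A, B, C, D (columns sum to 1):
# column j spreads page j's vote equally over the pages it links to.
M = np.array([[0.0, 0.5, 0.0, 0.5],
              [1/3, 0.0, 0.0, 0.5],
              [1/3, 0.0, 0.0, 0.0],
              [1/3, 0.5, 1.0, 0.0]])

# Start from the uniform guess R1 = [0.25, 0.25, 0.25, 0.25] ...
R = np.full(4, 0.25)

# ... and repeatedly multiply by M; R converges to the equilibrium,
# i.e. the eigenvector of M with eigenvalue 1.
for _ in range(100):
    R = M @ R

print("PageRank scores:", R)
print("unchanged by another step:", np.allclose(M @ R, R))
```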
Knowing that raising a non-diagonal matrix to a high power by brute-force multiplication is expensive, we can apply the eigenbasis technique: compute A⁻¹MA to convert M into a diagonal matrix (its eigenbasis), raise that diagonal matrix to the nth power (which only requires raising each diagonal entry to the nth power), then convert back.
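A minimal sketch of that diagonalization trick, on a made-up symmetric matrix: if M = A·D·A⁻¹ with D diagonal, then Mⁿ = A·Dⁿ·A⁻¹, and Dⁿ is cheap.

```python
import numpy as np

# An illustrative matrix; any diagonalizable M works.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, A = np.linalg.eig(M)   # columns of A are eigenvectors
n = 10

# M^n = A @ D^n @ A^-1, where D^n just raises each
# diagonal entry (eigenvalue) to the nth power.
Mn_fast = A @ np.diag(vals ** n) @ np.linalg.inv(A)

# Same answer as repeated multiplication:
print(np.allclose(Mn_fast, np.linalg.matrix_power(M, n)))
```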
That being said, it does strike me as odd that this eigenvector is not derived from the determinant calculation taught in a classical linear algebra course. (to be continued…)