Proof That Distinct Eigenvalues Have Distinct Eigenvectors

Theorem
If  
\[\lambda_1 , \lambda_2\]
  are eigenvalues of a square matrix  
\[A\]
  with  
\[\lambda_1 \neq \lambda_2\]
  and  
\[\mathbf{v}_1 , \: \mathbf{v}_2\]
  are eigenvectors associated with  
\[\lambda_1 , \: \lambda_2\]
  respectively, then  
\[\mathbf{v}_1 \neq \mathbf{v}_2\]
.
Proof
\[\mathbf{v}_1 , \lambda_1\]
  satisfy  
\[A \mathbf{v}_1 = \lambda_1 \mathbf{v}_1\]

\[\mathbf{v}_2 , \lambda_2\]
  satisfy  
\[A \mathbf{v}_2 = \lambda_2 \mathbf{v}_2\]

Suppose, for a contradiction, that  
\[\mathbf{v}_2 = \mathbf{v}_1\]
  and subtract the first equation from the second:
\[A(\mathbf{v}_1 - \mathbf{v}_1) = (\lambda_2 - \lambda_1)\mathbf{v}_1\]

The left hand side is zero, so the right hand side is also zero. Since an eigenvector is by definition nonzero,  
\[\mathbf{v}_1 \neq \mathbf{0}\]
  forces  
\[\lambda_2 - \lambda_1 = 0\]
, that is,  
\[\lambda_1 = \lambda_2\]
. This contradicts the assumption  
\[\lambda_1 \neq \lambda_2\]
, so distinct eigenvalues give rise to distinct eigenvectors.
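
As a sanity check, separate from the proof itself, here is a minimal numerical sketch in Python, assuming numpy is available. The matrix A below is a hypothetical example (not from the text) with distinct eigenvalues 1 and 3:

```python
import numpy as np

# A hypothetical symmetric matrix with distinct eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding eigenvectors (in no guaranteed order).
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)          # 3.0 and 1.0, in some order

v1 = eigenvectors[:, 0]
v2 = eigenvectors[:, 1]
print(np.allclose(v1, v2))  # False: the eigenvectors differ
```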
The converse is not true: distinct eigenvectors can share the same eigenvalue.
The matrix  
\[ \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \]
  has eigenvectors  
\[ \begin{pmatrix}1\\0\end{pmatrix} , \: \begin{pmatrix}0\\1\end{pmatrix}\]
  corresponding to the eigenvalue  
\[\lambda =1\]
.
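A quick numerical check of this identity-matrix example, again a sketch assuming numpy:

```python
import numpy as np

# The 2x2 identity matrix has the single repeated eigenvalue 1.
I = np.eye(2)
eigenvalues, eigenvectors = np.linalg.eig(I)
print(eigenvalues)   # [1. 1.]: the same eigenvalue twice
print(eigenvectors)  # columns (1, 0) and (0, 1): distinct eigenvectors
```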
