Linear Transformations

A linear transformation  
\[T\]
  is an operation on a set of objects  
\[S\]
  such that for any elements  
\[x, y \in S\]
  and for any scalar  
\[\alpha \]

1.  
\[T(\alpha x)= \alpha T(x)\]

2. 
\[T(x+y) =T(x)+T(y)\]

or equivalently, for all scalars  
\[ \alpha , \beta \]
  and  
\[x,y \in S\]
  ,
\[T( \alpha x + \beta y) = \alpha T(x) + \beta T(y)\]

These conditions imply that the zero element of the domain is sent to the zero element of the codomain, and that the additive inverse of any element of the domain is sent to the additive inverse of that element's image.
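The two defining conditions can be checked numerically. The sketch below (the matrix `A` and the test vectors are illustrative choices, not from the text) verifies both conditions, and their combined form, for a transformation given by matrix multiplication:

```python
import numpy as np

# Illustrative transformation T(v) = A v; any matrix works here.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
T = lambda v: A @ v

x = np.array([1.0, 2.0])
y = np.array([-3.0, 0.5])
alpha, beta = 2.5, -1.5

# Condition 1: T(alpha x) = alpha T(x)
assert np.allclose(T(alpha * x), alpha * T(x))
# Condition 2: T(x + y) = T(x) + T(y)
assert np.allclose(T(x + y), T(x) + T(y))
# Equivalent combined form: T(alpha x + beta y) = alpha T(x) + beta T(y)
assert np.allclose(T(alpha * x + beta * y), alpha * T(x) + beta * T(y))
```

All three assertions pass for any matrix `A`, since matrix multiplication distributes over vector addition and commutes with scalar multiplication.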
The set  
\[S\]
  may be any set of polynomials, vectors, real or complex numbers, Hamiltonians, matrices, or functions, and the transformation  
\[T\]
  may be any transformation satisfying the conditions above. On a finite-dimensional space, every linear transformation may be represented by a matrix, and every element of  
\[S\]
  may be represented by a vector.
Example. Differentiation is linear. We can define a linear transformation on the set of polynomials of degree at most 2:
\[\frac{d}{dx}(1)=0,\frac{d}{dx}(x)=1, \frac{d}{dx}(x^2)=2x\]
.
We represent  
\[1, x, x^2\]
  by the vectors  
\[\begin{pmatrix}1\\0\\0\end{pmatrix}, \begin{pmatrix}0\\1\\0\end{pmatrix}, \begin{pmatrix}0\\0\\1\end{pmatrix}\]
  Hence  
\[\frac{d}{dx}(a+bx+cx^2)=b+2cx\]
.
The columns of the matrix representing  
\[T\]
  can be found by differentiating  
\[1, x, x^2\]
  in turn and representing the results as vectors.
The matrix representing the linear transformation is
\[ \left( \begin{array}{ccc} 0 & 1 & 0 \\ 0 & 0 & 2 \\ 0 & 0 & 0 \end{array} \right) \]
.
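This column-by-column construction can be sketched in code. Here the helper `derivative_coeffs` (a name introduced for illustration) differentiates a polynomial given by its coefficients in the basis \(\{1, x, x^2\}\) and returns the result in the same basis:

```python
import numpy as np

def derivative_coeffs(c):
    """Derivative of c[0] + c[1]*x + c[2]*x^2, as coefficients in {1, x, x^2}."""
    return np.array([c[1], 2.0 * c[2], 0.0])

# The standard basis vectors represent the polynomials 1, x, x^2.
basis = np.eye(3)

# Differentiate each basis polynomial in turn; the results form the columns.
D = np.column_stack([derivative_coeffs(e) for e in basis])
print(D)
```

The printed matrix has columns \((0,0,0)^T\), \((1,0,0)^T\), and \((0,2,0)^T\), matching the derivatives \(0\), \(1\), and \(2x\) above.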
The polynomial  
\[2+3x+5x^2\]
  is represented by the vector  
\[\begin{pmatrix}2\\3\\5\end{pmatrix}\]

\[T(2+3x+5x^2)= \left( \begin{array}{ccc} 0 & 1 & 0 \\ 0 & 0 & 2 \\ 0 & 0 & 0 \end{array} \right) \begin{pmatrix}2\\3\\5\end{pmatrix}=\begin{pmatrix}3\\10\\0\end{pmatrix}\]
  which returns the polynomial  
\[3+10x\]
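The final matrix–vector product can be reproduced in a few lines (the matrix is copied from the derivation above):

```python
import numpy as np

# Matrix representing d/dx on polynomials of degree at most 2,
# in the basis {1, x, x^2}.
D = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0],
              [0.0, 0.0, 0.0]])

p = np.array([2.0, 3.0, 5.0])  # coefficients of 2 + 3x + 5x^2
print(D @ p)                   # coefficients of the derivative 3 + 10x
```

The output vector \((3, 10, 0)^T\) is read back as the polynomial \(3 + 10x\).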