A matrix is a concise and useful way of uniquely representing and working with linear transformations. In particular, every linear transformation can be represented by a matrix, and every matrix corresponds to a unique linear transformation. The matrix, and its close relative the determinant, are extremely important concepts in linear algebra, and were first formulated by Sylvester (1851) and Cayley.
In his 1851 paper, Sylvester wrote, "For this purpose we must commence, not with a square, but with an oblong arrangement of terms consisting, suppose, of m lines and n columns. This will not in itself represent a determinant, but is, as it were, a Matrix out of which we may form various systems of determinants by fixing upon a number p, and selecting at will p lines and p columns, the squares corresponding of pth order." Because Sylvester was interested in the determinants formed from the rectangular array of numbers and not the array itself (Kline 1990, p. 804), Sylvester used the term "matrix" in its conventional usage to mean "the place from which something else originates" (Katz 1993). Sylvester (1851) subsequently used the term matrix informally, stating "Form the rectangular matrix consisting of n rows and (n+1) columns.... Then all the determinants that can be formed by rejecting any one column at pleasure out of this matrix are identically zero." However, it remained up to Sylvester's collaborator Cayley to use the terminology in its modern form in papers of 1855 and 1858 (Katz 1993).
In his 1867 treatise on determinants, C. L. Dodgson (Lewis Carroll) objected to the use of the term "matrix," stating, "I am aware that the word 'Matrix' is already in use to express the very meaning for which I use the word 'Block'; but surely the former word means rather the mould, or form, into which algebraical quantities may be introduced, than an actual assemblage of such quantities...." However, Dodgson's objections have passed unheeded and the term "matrix" has stuck.
The transformation given by the system of equations

$$x_1' = a_{11} x_1 + a_{12} x_2 + a_{13} x_3 \qquad (1)$$
$$x_2' = a_{21} x_1 + a_{22} x_2 + a_{23} x_3 \qquad (2)$$
$$x_3' = a_{31} x_1 + a_{32} x_2 + a_{33} x_3 \qquad (3)$$
$$x_4' = a_{41} x_1 + a_{42} x_2 + a_{43} x_3 \qquad (4)$$

is represented as a matrix equation by

$$\begin{bmatrix} x_1' \\ x_2' \\ x_3' \\ x_4' \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \\ a_{41} & a_{42} & a_{43} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} \qquad (5)$$

where the $a_{ij}$ are called matrix elements.
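To make equation (5) concrete, here is a brief numerical sketch; NumPy and the particular numbers are assumptions made for illustration only, not part of the original discussion. It builds a 4×3 matrix of elements $a_{ij}$, applies it to a vector, and checks that the product reproduces equations (1)-(4) component by component.

    import numpy as np

    # Arbitrary matrix elements a_ij for the 4x3 coefficient matrix of equation (5).
    A = np.array([[1.0,  2.0,  3.0],
                  [4.0,  5.0,  6.0],
                  [7.0,  8.0,  9.0],
                  [10.0, 11.0, 12.0]])
    x = np.array([1.0, -1.0, 2.0])   # the vector (x_1, x_2, x_3)

    # Matrix form of the transformation: x' = A x.
    x_prime = A @ x

    # Component form, equations (1)-(4): x_i' = a_i1 x_1 + a_i2 x_2 + a_i3 x_3.
    componentwise = np.array([sum(A[i, j] * x[j] for j in range(3)) for i in range(4)])

    assert np.allclose(x_prime, componentwise)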
An $m\times n$ matrix consists of $m$ rows and $n$ columns, and the set of $m\times n$ matrices with real coefficients is sometimes denoted $\mathbb{R}^{m\times n}$. To remember which index refers to which direction, identify the indices of the last (i.e., lower right) term, so the indices 4, 3 of the last element $a_{43}$ in the above matrix identify it as a 4×3 matrix. Note that while this convention matches the one used for expressing measurements of a painting on canvas (where height comes first, then width), it is opposite that used to measure paper, room dimensions, and windows (in which the width is listed first followed by the height; e.g., 8 1/2 inch by 11 inch paper is 8 1/2 inches wide and 11 inches high).
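The same rows-first convention appears when a matrix is stored on a computer. In the brief sketch below (NumPy is an assumption of this example, not part of the original text), the shape of a 4×3 array is reported with the row count first:

    import numpy as np

    A = np.zeros((4, 3))   # 4 rows, 3 columns
    print(A.shape)         # (4, 3): rows listed first, as in "m x n"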
A matrix is said to be square if $m = n$, and rectangular if $m \neq n$. An $m\times 1$ matrix is called a column vector, and a $1\times n$ matrix is called a row vector. Special types of square matrices include the identity matrix $\mathsf{I}$, with $a_{ij} = \delta_{ij}$ (where $\delta_{ij}$ is the Kronecker delta), and the diagonal matrix with $a_{ij} = c_i \delta_{ij}$ (where the $c_i$ are a set of constants).
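A quick sketch of these special cases, again assuming NumPy and arbitrary sizes: column and row vectors as $n\times 1$ and $1\times n$ arrays, the identity matrix with entries $\delta_{ij}$, and a diagonal matrix carrying the constants $c_i$ on its diagonal.

    import numpy as np

    col = np.array([[1.0], [2.0], [3.0]])   # a 3x1 column vector
    row = np.array([[1.0, 2.0, 3.0]])       # a 1x3 row vector

    I = np.eye(3)                  # identity matrix: entries delta_ij
    c = np.array([2.0, 5.0, 7.0])
    D = np.diag(c)                 # diagonal matrix: entries c_i * delta_ij

    assert np.allclose(I @ col, col)    # the identity leaves vectors unchanged
    assert np.allclose(np.diag(D), c)   # the diagonal of D recovers the constants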
In this work, matrices are represented using square brackets as delimiters, but in the general literature, they are more commonly delimited using parentheses. This latter convention introduces the unfortunate notational ambiguity between matrices of the form $\begin{pmatrix} a \\ b \end{pmatrix}$ and the binomial coefficient
$$\binom{a}{b} = \frac{a!}{b!\,(a-b)!}. \qquad (6)$$
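For readers typesetting the two objects, the following minimal LaTeX fragment (assuming the standard amsmath package; not part of the original text) shows how closely the parenthesized $2\times 1$ matrix and the binomial coefficient of equation (6) resemble one another on the page:

    \documentclass{article}
    \usepackage{amsmath}
    \begin{document}
    % A 2x1 matrix delimited by parentheses ...
    $\begin{pmatrix} a \\ b \end{pmatrix}$
    % ... versus the binomial coefficient of equation (6).
    $\dbinom{a}{b} = \dfrac{a!}{b!\,(a-b)!}$
    \end{document}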
When referenced symbolically in this work, matrices are denoted in a sans serif font, e.g., $\mathsf{A}$, $\mathsf{B}$, etc. In this concise notation, the transformation given in equation (5) can be written
$$\mathbf{x}' = \mathsf{A}\,\mathbf{x} \qquad (7)$$
where $\mathbf{x}'$ and $\mathbf{x}$ are vectors and $\mathsf{A}$ is a matrix. A number of other notational conventions also exist, with some authors preferring an italic typeface.
It is sometimes convenient to represent an entire matrix in terms of its matrix elements. Therefore, the $(i,j)$th element of the matrix $\mathsf{A}$ could be written $a_{ij}$, and the matrix composed of entries $a_{ij}$ could be written as $(a_{ij})_{i,j=1}^{m,n}$, or simply $(a_{ij})$ for short.
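This element-wise description translates directly into code. The sketch below (again assuming NumPy; the rule $a_{ij} = i + 10j$ and the helper a(i, j) are arbitrary illustrative choices) fills an $m\times n$ matrix from a formula for its $(i,j)$th entry:

    import numpy as np

    m, n = 4, 3

    def a(i, j):
        # Arbitrary illustrative formula for the (i, j)th matrix element (1-based indices).
        return i + 10 * j

    A = np.array([[a(i, j) for j in range(1, n + 1)] for i in range(1, m + 1)])
    print(A[0, 0], A[m - 1, n - 1])   # a_11 and a_mn (NumPy itself indexes from 0)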
Two matrices may be added (matrix addition) or multiplied (matrix multiplication) together to yield a new matrix. Other common operations on a single matrix are matrix diagonalization, matrix inversion, and transposition.
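A compact sketch of these operations, assuming NumPy and two arbitrary 2×2 matrices: addition, multiplication, transposition, inversion, and diagonalization of a symmetric matrix via its eigendecomposition.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    B = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

    S = A + B                   # matrix addition
    P = A @ B                   # matrix multiplication
    T = A.T                     # transposition
    A_inv = np.linalg.inv(A)    # matrix inversion
    assert np.allclose(A @ A_inv, np.eye(2))

    # Diagonalization of the symmetric matrix A: A = V diag(w) V^T.
    w, V = np.linalg.eigh(A)
    assert np.allclose(V @ np.diag(w) @ V.T, A)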
The determinant $\det(\mathsf{A})$ or $|\mathsf{A}|$ of a matrix $\mathsf{A}$ is a very important quantity which appears in many diverse applications. The sum of the diagonal elements of a square matrix is known as the matrix trace and is also an important quantity in many sorts of computations.
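Continuing the previous sketch (NumPy assumed, matrix arbitrary), the determinant and trace of the same 2×2 matrix can be computed and checked against the familiar 2×2 closed forms $ad - bc$ and $a + d$:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])

    det_A = np.linalg.det(A)   # determinant: 2*3 - 1*1 = 5
    tr_A = np.trace(A)         # trace: sum of the diagonal elements, 2 + 3 = 5

    assert np.isclose(det_A, 5.0) and np.isclose(tr_A, 5.0)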