An $n$th-rank tensor in $m$-dimensional space is a mathematical object that has $n$ indices and $m^n$ components and obeys certain transformation rules. Each index of a tensor ranges over the number of dimensions of space. However, the dimension of the space is largely irrelevant in most tensor equations (with the notable exception of the contracted Kronecker delta). Tensors are generalizations of scalars (which have no indices), vectors (which have exactly one index), and matrices (which have exactly two indices) to an arbitrary number of indices.
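As an illustrative sketch (not part of the original text), an $n$th-rank tensor in $m$ dimensions can be modeled numerically as a NumPy array with $n$ axes, each of length $m$, giving $m^n$ components:

```python
import numpy as np

m, n = 3, 2              # dimension of space, rank of the tensor
T = np.zeros((m,) * n)   # a rank-n tensor: n indices, each running over m values

print(T.shape)           # (3, 3)
print(T.size)            # 9 == m**n components
```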
Tensors provide a natural and concise mathematical framework for formulating and solving problems in areas of physics such as elasticity, fluid mechanics, and general relativity.
The notation for a tensor is similar to that of a matrix (i.e., $A=(a_{ij})$), except that a tensor $a_{ijk\cdots}$, $a^{ijk\cdots}$, $a_i^{jk\cdots}$, etc., may have an arbitrary number of indices. In addition, a tensor with rank $r+s$ may be of mixed type $(r,s)$, consisting of $r$ so-called "contravariant" (upper) indices and $s$ "covariant" (lower) indices. Note that the positions of the slots in which contravariant and covariant indices are placed are significant, so, for example, $a_{\mu\nu}{}^\lambda$ is distinct from $a_\mu{}^{\nu\lambda}$.
While the distinction between covariant and contravariant indices must be made for general tensors, the two are equivalent for tensors in three-dimensional Euclidean space, and such tensors are known as Cartesian tensors.
Objects that transform like zeroth-rank tensors are called scalars, those that transform like first-rank tensors are called vectors, and those that transform like second-rank tensors are called matrices. In tensor notation, a vector $v$ would be written $v_i$, where $i=1,\ldots,m$, and a matrix is a tensor of type $(1,1)$, which would be written $a_i^j$ in tensor notation.
Tensors may be operated on by other tensors (such as metric tensors, the permutation tensor, or the Kronecker delta) or by tensor operators (such as the covariant derivative). The manipulation of tensor indices to produce identities or to simplify expressions is known as index gymnastics, which includes index lowering and index raising as special cases. These can be achieved through multiplication by a so-called metric tensor $g_{ij}$, $g^{ij}$, $g_i^j$, etc., e.g.,
$g_{ij} A^j = A_i$   (1)

$g^{ij} A_j = A^i$   (2)
(Arfken 1985, p. 159).
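A minimal numerical sketch of index lowering and raising, assuming NumPy and an illustrative Minkowski metric (the particular metric and variable names are assumptions, not from the original):

```python
import numpy as np

# Illustrative metric: diag(-1, 1, 1, 1) (an assumed example)
g = np.diag([-1.0, 1.0, 1.0, 1.0])       # g_{ij}, covariant metric
g_inv = np.linalg.inv(g)                  # g^{ij}, contravariant metric

A_upper = np.array([1.0, 2.0, 3.0, 4.0])  # contravariant components A^j

# Index lowering, equation (1): A_i = g_{ij} A^j
A_lower = np.einsum('ij,j->i', g, A_upper)

# Index raising, equation (2), recovers the original components: A^i = g^{ij} A_j
assert np.allclose(np.einsum('ij,j->i', g_inv, A_lower), A_upper)
```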
Tensor notation can provide a very concise way of writing vector and more general identities. For example, in tensor notation, the dot product $\mathbf{u}\cdot\mathbf{v}$ is simply written
$\mathbf{u}\cdot\mathbf{v} = u_i v^i$   (3)
where repeated indices are summed over (Einstein summation). Similarly, the cross product can be concisely written as
$(\mathbf{u}\times\mathbf{v})_i = \epsilon_{ijk} u^j v^k$   (4)
where $\epsilon_{ijk}$ is the permutation tensor.
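The following sketch (an assumed NumPy example) evaluates equations (3) and (4) directly with the Einstein summation convention, building the permutation tensor by hand:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Dot product: u . v = u_i v^i  (repeated index i is summed)
dot = np.einsum('i,i->', u, v)
assert np.isclose(dot, np.dot(u, v))

# Permutation (Levi-Civita) tensor eps_{ijk} in three dimensions
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

# Cross product: (u x v)_i = eps_{ijk} u^j v^k
cross = np.einsum('ijk,j,k->i', eps, u, v)
assert np.allclose(cross, np.cross(u, v))
```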
Contravariant second-rank tensors are objects which transform as
$A'^{ij} = \dfrac{\partial x'_i}{\partial x_k}\dfrac{\partial x'_j}{\partial x_l}\, A^{kl}$   (5)
Covariant second-rank tensors are objects which transform as
$C'_{ij} = \dfrac{\partial x_k}{\partial x'_i}\dfrac{\partial x_l}{\partial x'_j}\, C_{kl}$   (6)
Mixed second-rank tensors are objects which transform as
$B'^i_j = \dfrac{\partial x'_i}{\partial x_k}\dfrac{\partial x_l}{\partial x'_j}\, B^k_l$   (7)
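For a linear change of coordinates $x' = Lx$, the partial derivatives in equation (5) are just the constant entries of $L$, so the transformation rule can be checked numerically; this is a sketch under that assumption, with arbitrary test data:

```python
import numpy as np

rng = np.random.default_rng(0)
L = rng.normal(size=(3, 3))          # dx'_i/dx_k for the linear map x' = L x
A = rng.normal(size=(3, 3))          # contravariant components A^{kl}

# Equation (5): A'^{ij} = (dx'_i/dx_k)(dx'_j/dx_l) A^{kl}
A_prime = np.einsum('ik,jl,kl->ij', L, L, A)

# The same rule written as a matrix product: A' = L A L^T
assert np.allclose(A_prime, L @ A @ L.T)
```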
If two tensors $A$ and $B$ have the same rank and the same covariant and contravariant indices, then they can be added in the obvious way,
$A^{ij} + B^{ij} = C^{ij}$   (8)

$A_{ij} + B_{ij} = C_{ij}$   (9)

$A^i_j + B^i_j = C^i_j$   (10)
The generalization of the dot product applied to tensors is called tensor contraction, and consists of setting two unlike indices equal to each other and then summing using the Einstein summation convention. Various types of derivatives can be taken of tensors, the most common being the comma derivative and covariant derivative.
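As a sketch (an assumed NumPy example), contracting the two indices of a mixed second-rank tensor $A^i_j$ sets the upper and lower index equal and sums, which is simply the trace:

```python
import numpy as np

A = np.arange(9.0).reshape(3, 3)     # components A^i_j of a mixed (1,1) tensor

# Contraction: set the two unlike indices equal and sum (Einstein convention)
contraction = np.einsum('ii->', A)

assert np.isclose(contraction, np.trace(A))
```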
If the components of a tensor of any rank vanish in one particular coordinate system, they vanish in all coordinate systems. A transformation of the variables of a tensor changes the tensor into another whose components are linear homogeneous functions of the components of the original tensor.
A tensor space of type $(r,s)$ can be described as a vector space tensor product between $r$ copies of vector fields and $s$ copies of the dual vector fields, i.e., one-forms. For example,
$T^{(3,1)} = TM \otimes TM \otimes TM \otimes T^*M$   (11)
is the vector bundle of $(3,1)$-tensors on a manifold $M$, where $TM$ is the tangent bundle of $M$ and $T^*M$ is its dual. Tensors of type $(r,s)$ form a vector space. This description generalizes to any tensor type, and an invertible linear map $J: V \to W$ induces a map $\tilde{J}: V \otimes V^* \to W \otimes W^*$, where $V^*$ is the dual vector space and $J$ the Jacobian, defined by
$\tilde{J}(v \otimes \omega) = Jv \otimes J^{-\mathsf{T}}\omega$   (12)
where $J^{-\mathsf{T}}$, the pullback map on a form, is defined using the transpose of the Jacobian. This definition can be extended similarly to other tensor products of $V$ and $V^*$. When there is a change of coordinates, tensors transform similarly, with $J$ the Jacobian of the linear transformation.
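A small sketch (an assumed NumPy example) of the induced map of equation (12) acting on a $(1,1)$ tensor: the vector factor is carried by $J$ and the form factor by $J^{-\mathsf{T}}$, which in matrix form amounts to $T \mapsto J\,T\,J^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(1)
J = rng.normal(size=(3, 3))          # invertible linear map (Jacobian), assumed test data
J_inv_T = np.linalg.inv(J).T         # acts on the dual (form) factor

v = rng.normal(size=3)               # vector in V
w = rng.normal(size=3)               # one-form in V*

# Induced map on V (x) V*:  v (x) w  ->  Jv (x) J^{-T} w
T = np.outer(v, w)                   # components of the (1,1) tensor v (x) w
T_mapped = np.outer(J @ v, J_inv_T @ w)

# In matrix form this is the transformation J T J^{-1}
assert np.allclose(T_mapped, J @ T @ np.linalg.inv(J))
```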