I now turn to module-like entities.
First, vector spaces. x = {x(1), x(2), ..., x(n)} for an n-dimensional vector, where x(1), x(2), ..., x(n) are in some ring R.
Addition is component-by-component: x + y = {x(1)+y(1), x(2)+y(2), ..., x(n)+y(n)}, and scalar multiplication is likewise component-by-component: a*x = {a*x(1), a*x(2), ..., a*x(n)} and x*a = {x(1)*a, x(2)*a, ..., x(n)*a}, where a is also in R.
One can interpret a vector as a function of its index variable, and then generalize that index variable beyond 1, 2, ..., n.
It is evident that vectors form an abelian group under vector addition.
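The component-by-component operations above can be sketched in a few lines; this is a minimal illustration with Python integers standing in for the ring elements (an assumption for concreteness):

```python
# Component-by-component vector arithmetic over a ring;
# plain Python ints stand in for the ring R here.
def vec_add(x, y):
    """x + y, component by component."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def scalar_mul(a, x):
    """a*x, component by component (left scalar multiplication)."""
    return tuple(a * xi for xi in x)

x = (1, 2, 3)
y = (4, 5, 6)
print(vec_add(x, y))      # (5, 7, 9)
print(scalar_mul(2, x))   # (2, 4, 6)
```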
Modules generalize vector spaces. A left module is defined as follows: for an abelian group M and a ring R, an operation *: R x M -> M satisfying
a*(x + y) = (a*x) + (a*y)
(a + b)*x = (a*x) + (b*x)
(a*b)*x = a*(b*x)
1*x = x (1 being the multiplicative identity of R)
where a and b are in R and x and y are in M.
Right modules and two-sided modules can be defined similarly.
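The four left-module axioms can be spot-checked numerically; here is a minimal sketch using Z acting on Z^2 (integer scalars on integer pairs), with small test values assumed:

```python
# Checking the left-module axioms for R = Z acting on M = Z^2.
def act(a, x):
    """The operation *: R x M -> M."""
    return tuple(a * xi for xi in x)

def add(x, y):
    """The abelian group operation on M."""
    return tuple(xi + yi for xi, yi in zip(x, y))

a, b = 2, 3
x, y = (1, 4), (5, -2)
assert act(a, add(x, y)) == add(act(a, x), act(a, y))  # a*(x+y) = a*x + a*y
assert act(a + b, x) == add(act(a, x), act(b, x))      # (a+b)*x = a*x + b*x
assert act(a * b, x) == act(a, act(b, x))              # (a*b)*x = a*(b*x)
assert act(1, x) == x                                  # 1*x = x
print("left-module axioms hold on these test values")
```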
-
Going into algebra-like entities, an algebra over a field is a vector space with a bilinear operator (vector) # (vector) -> (vector). Bilinearity means # distributes on both sides and is compatible with scalar multiplication:
(x + y) # z = (x # z) + (y # z)
x # (y + z) = (x # y) + (x # z)
(a*x) # (b*y) = (a*b)*(x # y)
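These three bilinearity laws can be verified on sample values; in this sketch # is taken to be 2x2 matrix multiplication over the integers (one concrete choice of bilinear operator, assumed for illustration):

```python
# Verifying the bilinearity laws with # = 2x2 matrix multiplication.
def mat_add(x, y):
    return tuple(tuple(xi + yi for xi, yi in zip(rx, ry))
                 for rx, ry in zip(x, y))

def mat_scale(a, x):
    return tuple(tuple(a * xi for xi in row) for row in x)

def mat_mul(x, y):
    return tuple(tuple(sum(x[i][k] * y[k][j] for k in range(2))
                       for j in range(2))
                 for i in range(2))

a, b = 2, 3
x, y, z = ((1, 2), (3, 4)), ((0, 1), (1, 0)), ((2, 0), (0, 2))
assert mat_mul(mat_add(x, y), z) == mat_add(mat_mul(x, z), mat_mul(y, z))
assert mat_mul(x, mat_add(y, z)) == mat_add(mat_mul(x, y), mat_mul(x, z))
assert mat_mul(mat_scale(a, x), mat_scale(b, y)) == mat_scale(a * b, mat_mul(x, y))
print("bilinearity holds on these test values")
```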
An important one is a Lie algebra, where # is antisymmetric and non-associative; in place of associativity it satisfies the Jacobi identity:
(b # a) = - (a # b)
(a # b) # c + (b # c) # a + (c # a) # b = 0
Both identities are readily verified for # being a matrix commutator: a # b = a.b - b.a
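That verification can be sketched numerically; this checks antisymmetry and the Jacobi identity for the commutator on a few 2x2 integer matrices (test matrices assumed for illustration):

```python
# Antisymmetry and Jacobi identity for the matrix commutator a # b = a.b - b.a.
def mmul(x, y):
    return tuple(tuple(sum(x[i][k] * y[k][j] for k in range(2))
                       for j in range(2))
                 for i in range(2))

def madd(x, y):
    return tuple(tuple(xi + yi for xi, yi in zip(rx, ry))
                 for rx, ry in zip(x, y))

def msub(x, y):
    return tuple(tuple(xi - yi for xi, yi in zip(rx, ry))
                 for rx, ry in zip(x, y))

def bracket(x, y):
    """a # b = a.b - b.a"""
    return msub(mmul(x, y), mmul(y, x))

a = ((0, 1), (0, 0))
b = ((0, 0), (1, 0))
c = ((1, 0), (0, -1))
zero = ((0, 0), (0, 0))

assert bracket(b, a) == msub(zero, bracket(a, b))   # (b # a) = -(a # b)
assert madd(madd(bracket(bracket(a, b), c),
                 bracket(bracket(b, c), a)),
            bracket(bracket(c, a), b)) == zero      # Jacobi identity
print("commutator is antisymmetric and satisfies Jacobi")
```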
Another non-associative one is the 3-vector cross product. It is isomorphic to so(3), the Lie algebra of the rotation group SO(3).
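The cross product can likewise be spot-checked as a Lie bracket; this minimal sketch verifies antisymmetry and the Jacobi identity on assumed sample vectors:

```python
# The 3-vector cross product as a Lie bracket: antisymmetric, satisfies Jacobi.
def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def vadd(u, v):
    return tuple(ui + vi for ui, vi in zip(u, v))

def vneg(u):
    return tuple(-ui for ui in u)

a, b, c = (1, 2, 3), (0, 1, -1), (2, 0, 5)
assert cross(b, a) == vneg(cross(a, b))            # antisymmetry
assert vadd(vadd(cross(cross(a, b), c),
                 cross(cross(b, c), a)),
            cross(cross(c, a), b)) == (0, 0, 0)    # Jacobi identity
print("cross product behaves like a Lie bracket")
```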
There are also some associative ones, like complex-number and quaternion multiplication.
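Associativity of quaternion multiplication can be checked the same way; this sketch uses (w, x, y, z) component tuples and Hamilton's product rule, with the test quaternions assumed:

```python
# Quaternion multiplication (w, x, y, z components) is associative.
def qmul(p, q):
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw * qw - px * qx - py * qy - pz * qz,
            pw * qx + px * qw + py * qz - pz * qy,
            pw * qy - px * qz + py * qw + pz * qx,
            pw * qz + px * qy - py * qx + pz * qw)

p, q, r = (1, 2, 3, 4), (0, 1, -1, 2), (3, 0, 1, 1)
assert qmul(qmul(p, q), r) == qmul(p, qmul(q, r))  # (p.q).r = p.(q.r)
print("quaternion multiplication is associative on these values")
```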