
Symmetric Algebra

Right. You want me to… explain this. As if it's some kind of parlor trick. Fine. Let's not pretend this is about illumination. It's about… necessity. Like a sharp edge you can't avoid.


"Smallest" Commutative Algebra Containing a Vector Space

Let's get this straight from the start: this isn't about being cute or trying to impress anyone. This is about structure. About building something that respects certain rules, and doing it with the least amount of fuss possible. You could call it the symmetric algebra S(V), or Sym(V), on a vector space V over some field K. It's a commutative algebra over K, and it contains V. The "smallest" part? That's just efficiency. It means it's the most direct, the most economical way to embed V into a commutative algebra.

Think of it this way: it satisfies a universal property. For any linear map f from V to any commutative algebra A, there's precisely one algebra homomorphism g from S(V) to A that makes f happen. Specifically, f has to be g composed with the inclusion map i of V into S(V). No detours, no unnecessary complications. Just the most straightforward path.
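Spelled out, since words only get you so far:

```latex
% Universal property of S(V): any linear map f into a commutative
% K-algebra A factors uniquely through the inclusion i : V -> S(V).
\[
  \forall\, f \in \operatorname{Hom}_K(V, A)\quad
  \exists!\ g : S(V) \to A \ \text{a $K$-algebra homomorphism}
  \quad\text{with}\quad f = g \circ i .
\]
```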

If V has a basis, say B, then S(V) is essentially the polynomial ring K[B]. The elements of B are treated as variables. So, S(V) is like a polynomial ring, but without being tied down to specific coordinates. It's "coordinate-free," as they say. A concept, rather than a collection of numbers.

How do we construct it? We start with the tensor algebra T(V). Then, we take a quotient. We essentially smash the elements v ⊗ w and w ⊗ v together – make them equal. That's done by taking the ideal generated by all elements of the form x ⊗ y − y ⊗ x. It’s a way of enforcing commutativity.
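If you want to see the quotient doing its work, here's a minimal sketch in Python (the representation is mine, not canonical): over a chosen basis, a monomial is stored as a sorted tuple of basis indices, and sorting is exactly what modding out v ⊗ w − w ⊗ v buys you.

```python
from collections import Counter
from itertools import product

def monomial(*indices):
    """Canonical form of a product of basis vectors: sort the factors.
    In the quotient, the order of factors carries no information."""
    return tuple(sorted(indices))

def multiply(p, q):
    """Multiply two elements of S(V), each a dict {monomial: coefficient}."""
    result = Counter()
    for (m1, c1), (m2, c2) in product(p.items(), q.items()):
        result[monomial(*m1, *m2)] += c1 * c2
    return dict(result)

# x and y are degree-one elements: basis vectors 0 and 1 of V.
x = {monomial(0): 1}
y = {monomial(1): 1}

# Commutativity holds by construction: xy == yx.
assert multiply(x, y) == multiply(y, x) == {(0, 1): 1}
```

The point is that the ideal never needs to be computed: choosing a canonical order of factors is the same act as passing to the quotient.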

And yes, this whole dance extends. If V is a module over a commutative ring (not necessarily free), the same principles apply. It's just a matter of adapting the rules.

Construction

From the Tensor Algebra

We can build S(V) using the tensor algebra T(V). It's a quotient algebra of T(V), where we've modded out the two-sided ideal generated by the commutators of the form v ⊗ w − w ⊗ v.

It's not hard to see that this resulting algebra has that universal property I mentioned. Because the tensor algebra itself has a universal property, any linear map f from V to a commutative algebra A can be extended to an algebra homomorphism from T(V) to A. Since A is commutative, this map will automatically factor through S(V). And because V generates S(V) as a K-algebra, this extension of f to S(V) is unique.

This is also a consequence of a more general idea in category theory. The forgetful functor from commutative algebras to vector spaces is a composition of two other forgetful functors: from commutative algebras to associative algebras, and from associative algebras to vector spaces. The tensor algebra construction and the quotient by commutators are left adjoints to these forgetful functors. Since left adjoints compose, their composition is left adjoint to the overall forgetful functor from commutative algebras to vector spaces, and this in turn guarantees the universal property. A chain reaction of adjointness, if you want to be precise.

From the Polynomial Ring

Alternatively, we can build S(V) from polynomial rings.

If V is a K-vector space or a free K-module with basis B, we take the polynomial ring K[B], where elements of B are the indeterminates. The homogeneous polynomials of degree one in this ring form a vector space or free module that's essentially V. This makes K[B] a solution to our universal problem. Therefore, K[B] and S(V) are canonically isomorphic. This is also apparent from category theory – free modules and polynomial rings are the free objects in their respective categories.

What if V isn't free? We can express it as a quotient: V = L / M, where L is a free module and M is a submodule of L. Then, S(V) is S(L/M), which is S(L) / ⟨M⟩. Here, ⟨M⟩ is the ideal generated by M. This, again, can be proven through direct computation, or more elegantly, using category theory and the universal property of quotients. It's about mapping to zero a specific subset. The kernel plays a role, whether it's a normal subgroup, a submodule, or an ideal.
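A small worked example, if you must (my choice of L and M, not anything canonical): take L = K² with basis e₁, e₂, and let M be spanned by e₁ − e₂, so V = L/M is one-dimensional. Then:

```latex
% S(L) = K[x,y], and the ideal generated by M is (x - y).
\[
  S(V) \;\cong\; S(L)/\langle M \rangle
       \;=\; K[x,y]/(x - y)
       \;\cong\; K[t],
\]
```

which is indeed the symmetric algebra of a one-dimensional space, as it should be.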

Grading

The symmetric algebra isn't just a jumbled mess; it's a graded algebra. It breaks down into a direct sum:

S(V) = ⨁_{n=0}^{∞} Sⁿ(V)

Here, Sⁿ(V) is the nth symmetric power of V. It's the vector subspace (or submodule) spanned by products of n elements of V. The second symmetric power S²(V) is sometimes called the symmetric square of V.

How do we know this? The tensor algebra construction helps. T(V) is graded, and S(V) is its quotient by a homogeneous ideal – the one generated by all x ⊗ y − y ⊗ x, where x and y are in V (degree one elements).

For vector spaces or free modules, this grading corresponds to the total degree of polynomials. If V is not free, say V = L/M with L free, then S(V) is the quotient of the graded S(L) (a polynomial ring) by the homogeneous ideal generated by elements of M (which are degree one).

We can also define Sⁿ(V) as the solution to a universal problem for n-linear symmetric maps from Vⁿ into some vector space or module. Then, the direct sum of all these Sⁿ(V) satisfies the universal problem for the symmetric algebra itself. It's a recursive elegance.
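For a single degree, the statement reads (notation mine, content standard):

```latex
% Universal property of the n-th symmetric power: every symmetric
% n-linear map h : V^n -> W factors uniquely through the product map.
\[
  h(v_1, \dots, v_n) = \tilde h\,(v_1 v_2 \cdots v_n)
  \quad\text{for a unique linear map } \tilde h : S^n(V) \to W .
\]
```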

Relationship with Symmetric Tensors

Let's be clear: a symmetric algebra element isn't a tensor, and it's certainly not a symmetric tensor in the direct sense. They're related, but not identical.

A symmetric tensor of degree n lives in Tⁿ(V) and is invariant under the action of the symmetric group Sₙ. This means if you swap the order of any two factors in a tensor product, the tensor remains unchanged. The symmetric tensors of degree n form a subspace, Symⁿ(V) ⊂ Tⁿ(V). The direct sum of all Symⁿ(V) is a graded vector space, but it's not an algebra, because the product of two symmetric tensors isn't generally symmetric.

There's a map πₙ from Symⁿ(V) to Sⁿ(V), the restriction of the canonical projection Tⁿ(V) → Sⁿ(V). If n! is invertible in the ground field (as in characteristic zero), then πₙ is an isomorphism. Its inverse is the symmetrization map, which averages a product over all permutations of its factors.
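Here's the symmetrization map in miniature (again, my own modeling: a degree-n tensor over a basis is a dict from index tuples to coefficients, so e₁ ⊗ e₂ is {(0, 1): 1}; exact rationals, since we divide by n!):

```python
from collections import Counter
from fractions import Fraction
from itertools import permutations
from math import factorial

def symmetrize(t, n):
    """Average a degree-n tensor over all n! permutations of its factors.
    Requires n! to be invertible, hence the exact rationals."""
    result = Counter()
    for idx, coeff in t.items():
        for sigma in permutations(range(n)):
            permuted = tuple(idx[i] for i in sigma)
            result[permuted] += coeff * Fraction(1, factorial(n))
    return {k: v for k, v in result.items() if v != 0}

# Symmetrizing e1 ⊗ e2 gives (1/2)(e1 ⊗ e2 + e2 ⊗ e1).
sym = symmetrize({(0, 1): 1}, 2)
assert sym == {(0, 1): Fraction(1, 2), (1, 0): Fraction(1, 2)}

# The result is already symmetric: symmetrizing again changes nothing.
assert symmetrize(sym, 2) == sym
```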

However, if the field has positive characteristic p ≤ n, this map isn't an isomorphism. For example, in characteristic two, x ⊗ y + y ⊗ x maps to 2xy = 0 in S²(V), yet it's a nonzero element of Sym²(V). And even over the integers, where the characteristic is zero but 2 isn't invertible, the map need not be surjective: xy is not the image of any symmetric tensor with integer coefficients. The symmetric tensors and the symmetric algebra are isomorphic as graded vector spaces over fields of characteristic zero, but not as algebras. And this isomorphism definitely doesn't hold for fields of positive characteristic or rings that don't contain the rational numbers.
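The characteristic-two failure, checked by hand (same modeling as before: tensors as dicts from index tuples to coefficients, the projection to S²(V) done by sorting factors and reducing coefficients mod 2):

```python
from collections import Counter

def project_mod2(tensor):
    """Project a degree-2 tensor to S^2(V) over GF(2): sort each index
    tuple, collect coefficients, and reduce modulo 2."""
    result = Counter()
    for idx, coeff in tensor.items():
        result[tuple(sorted(idx))] += coeff
    return {k: v % 2 for k, v in result.items() if v % 2 != 0}

t = {(0, 1): 1, (1, 0): 1}      # x ⊗ y + y ⊗ x, a symmetric tensor
assert t[(0, 1)] == t[(1, 0)]   # nonzero and invariant under the swap
assert project_mod2(t) == {}    # but its image in S^2(V) is 2xy = 0
```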

Categorical Properties

Given a module V over a commutative ring K, the symmetric algebra S(V) is defined by this universal property: for any K-linear map f from V to a commutative K-algebra A, there is a unique K-algebra homomorphism g : S(V) → A such that f = g ∘ i, where i is the inclusion of V into S(V).

This universal property means that S(V) is unique up to a canonical isomorphism. All its properties can be derived from this.

S(V) is a functor. A module homomorphism f : V → W can be uniquely extended to an algebra homomorphism S(f) : S(V) → S(W).

The universal property also states that the symmetric algebra is a left adjoint to the forgetful functor that takes a commutative algebra to its underlying module.

Symmetric Algebra of an Affine Space

We can construct something similar for an affine space. The key difference is that the symmetric algebra of an affine space isn't graded. It's a filtered algebra. You can determine the degree of a polynomial on an affine space, but not its homogeneous parts. Why? Because there's no distinguished point, no origin, to evaluate at and separate the constant term from the rest.

Analogy with Exterior Algebra

The Sᵏ are functors and are comparable to the exterior powers. But here, the dimension of Sᵏ(V) grows with k. If n is the dimension of V, then dim Sᵏ(V) = C(n+k−1, k), the binomial coefficient. This is simply the number of monomials of degree k in n variables.
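A quick sanity check of that count, since monomials of degree k in n variables are exactly multisets of size k drawn from n symbols, i.e. combinations with replacement (the particular values of n and k are arbitrary small choices of mine):

```python
from itertools import combinations_with_replacement
from math import comb

n, k = 3, 4   # dim V = 3, symmetric power k = 4

# Each monomial of degree k corresponds to a weakly increasing k-tuple
# of basis indices, which is a combination with replacement.
monomials = list(combinations_with_replacement(range(n), k))
assert len(monomials) == comb(n + k - 1, k) == 15
```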

The symmetric algebra and the exterior algebra are actually the isotypical components of the trivial and sign representations, respectively, of the action of Sₙ on V⊗ⁿ. This is a more advanced point, of course, usually discussed over fields like the complex numbers.

As a Hopf Algebra

The symmetric algebra can be endowed with the structure of a Hopf algebra. You can find details on this in the section on Tensor algebra.

As a Universal Enveloping Algebra

S(V) is also the universal enveloping algebra of an abelian Lie algebra. That is, a Lie algebra where the Lie bracket is always zero.


So, there you have it. A structure defined by its necessity, built by imposing order on chaos, and with properties that ripple through various branches of mathematics. It’s not inherently exciting, but it is… fundamental. Don't expect me to elaborate further unless you have a genuinely interesting question.