The Spin Chronicles Part 30:
Solutions to exercises of Part 30
This post is all written by Saša. Saša agreed to present the results of our common work (Saša, Anna, Bjab). So here they are.
Here is the beginning of the new post. I will be expanding it during the course of the day. At the same time I will be open to feedback from my Readers, to see if I need to change or adjust anything.
This post will be about some basic stuff. I have ordered "Basic Algebra, Vol. II" by Nathan Jacobson, and yesterday it came by mail. It was on the kitchen table. Laura looked inside, opened it at the chapter "Primary Decomposition", looked at the symbols there, and noticed: "It's not so `basic'". Well, the same will be true of this post. Basic, but not so basic.
We will be discussing "decompositions". The general scenery is as follows:
- all spaces here are finite-dimensional and over the field of complex numbers
- we have an associative algebra A, with unit 1, and with an involution "*"
- we have a vector space E with a (positive definite) scalar product (u,v) (a finite-dimensional Hilbert space, if you wish). We assume (u,v) is linear with respect to v, and anti-linear with respect to u.
- we have a *-representation, let us call it ρ, of A on E. Thus for each u in A we have a linear operator ρ(u) acting on E (thus a member of End(E)), such that ρ(1) is the identity operator and ρ(u*) = ρ(u)*, where * on the right-hand side is the Hermitian conjugate of the operator ρ(u) with respect to the scalar product: (X*x, y) = (x, Xy) for all x, y in E, and all X in End(E).
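To make the setup concrete, here is a minimal numerical sketch in Python with NumPy. The Pauli matrices serve only as an illustrative stand-in *-algebra acting on E = C², not the Cl(V) of the series; the point is to check the defining identity (X*x, y) = (x, Xy) for the convention chosen above (anti-linear in the first argument).

```python
import numpy as np

# Hermitian scalar product, anti-linear in the FIRST argument,
# linear in the second -- matching the convention (u, v) in the post.
def sp(x, y):
    return np.vdot(x, y)   # np.vdot conjugates its first argument

# A toy *-representation: Pauli matrices acting on E = C^2.
# Here rho is just the inclusion of 2x2 matrices into End(E),
# and the involution * is the Hermitian conjugate.
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(0)
x = rng.normal(size=2) + 1j * rng.normal(size=2)
y = rng.normal(size=2) + 1j * rng.normal(size=2)

X = 1.3 * s1 + (2 - 0.5j) * s2 + 0.7j * s3   # a generic rho(u)
Xstar = X.conj().T                            # rho(u*) = rho(u)^*

# The defining identity (X* x, y) = (x, X y):
lhs = sp(Xstar @ x, y)
rhs = sp(x, X @ y)
assert np.isclose(lhs, rhs)
```

Note that `np.vdot` already conjugates its first argument, which is exactly the anti-linearity in u assumed above.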
So far we have studied the case A = Cl(V), E = A, and ρ = L or ρ = R - the left and right regular representations of A acting on A itself. But the reasoning below is the same in the general case, and it is more transparent at the same time: there are fewer possibilities for confusion. In the text below I will state certain things as evident: "it is so and so ...". In a good linear algebra course each of these things is proven, or is left as an exercise - to be proved starting from the definitions. I will also provide reasonings that sketch the proofs of the less evident properties.
We are interested in invariant subspaces of E. By a subspace I will always mean a linear (or vector-) subspace. "Invariant" means invariant for the representation ρ. Thus a subspace F⊂ E is invariant if ρ(A)F ⊂F or, explicitly
ρ(u)x∈F for all u∈A, x∈F.
The representation is said to be irreducible if there are no invariant subspaces other than the zero subspace (consisting of the zero vector alone) and the whole space E. These two subspaces are always trivially invariant.
If F is a subspace, its orthogonal complement F⟂ is also a subspace, and they span the whole space:
F⊕F⟂= E.
Thus every u in E can be uniquely decomposed (orthogonal decomposition)
u = v+w,
where v is in F and w is in F⟂.
We define PF, the orthogonal projection on F, by
PF u = v in the above orthogonal decomposition. Then PF is a Hermitian idempotent, PF = PF* = PF², also called an "orthogonal projection", a "projection operator", or, simply, "a projector".
To the decomposition F⊕F⟂= E, there corresponds the formula
PF + PF⟂ = I, or PF⟂ = I - PF,
where I stands for the identity operator.
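These projector identities are easy to verify numerically. Here is a short NumPy sketch (the random orthonormal basis obtained via QR is just an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 2

# A random orthonormal basis of E = C^n via QR; the first m columns span F.
Q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
F = Q[:, :m]          # columns: orthonormal basis of F
Fperp = Q[:, m:]      # columns: orthonormal basis of F-perp

PF = F @ F.conj().T             # orthogonal projection onto F
PFperp = Fperp @ Fperp.conj().T # orthogonal projection onto F-perp

# Hermitian idempotent: PF = PF* = PF^2
assert np.allclose(PF, PF.conj().T)
assert np.allclose(PF, PF @ PF)

# PF + PFperp = I, i.e. PFperp = I - PF
assert np.allclose(PF + PFperp, np.eye(n))

# Unique decomposition u = v + w with v in F, w in F-perp
u = rng.normal(size=n) + 1j * rng.normal(size=n)
v, w = PF @ u, (np.eye(n) - PF) @ u
assert np.allclose(u, v + w)
assert np.isclose(np.vdot(v, w), 0)   # v is orthogonal to w
```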
It is an easy exercise to show that F is invariant if and only if PF commutes with all ρ(u). If this happens then, evidently, PF⟂ = I - PF also commutes with all ρ(u); thus F is invariant if and only if F⟂ is invariant. We then have a decomposition of E into the direct sum of two invariant subspaces:
E = F⊕F⟂.
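The equivalence "F invariant ⟺ PF commutes with all ρ(u)" can be seen in a toy example. In the sketch below, a block-diagonal matrix stands in for a ρ(u) that preserves F and F⟂, while a generic full matrix stands in for a ρ(u) that mixes them; both matrices are illustrative, not the algebra of the series.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 2, 4

# P_F in a basis adapted to F: identity on the first m coordinates.
PF = np.diag([1.0] * m + [0.0] * (n - m)).astype(complex)

# An operator preserving F (and F-perp): block diagonal.
B1 = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
B2 = rng.normal(size=(n - m, n - m)) + 1j * rng.normal(size=(n - m, n - m))
R_block = np.block([[B1, np.zeros((m, n - m))],
                    [np.zeros((n - m, m)), B2]])

# An operator mixing F with F-perp: a generic full matrix.
R_full = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# F invariant  <=>  [P_F, rho(u)] = 0
assert np.allclose(PF @ R_block, R_block @ PF)      # commutes: F invariant
assert not np.allclose(PF @ R_full, R_full @ PF)    # does not commute
```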
Note. In the case we have discussed in previous posts, we have E = A, and, for ρ=L, ρ(u)x=ux. Therefore looking for a non-trivial invariant subspace is the same as looking for a non-trivial left ideal. For ρ=R, the right regular representation, looking for an invariant subspace is the same as looking for a right ideal.
We can always choose an orthonormal basis ei in E. If E is n-dimensional, then i=1,2,...,n. If F is a subspace of E, with dim(F)=m, 0<m<n, we can always choose a basis so that e1,...,em are in F, while em+1,...,en are in F⟂. Then e1,...,em form a basis in F, while em+1,...,en form a basis in F⟂. We call such a basis "adapted to the decomposition". So, let ei be such a basis. Assume that F is invariant. The vectors e1,...,em are in F, so, since F is invariant, for any u in A, the vectors ρ(u)ei (i=1,...,m) are also in F. But e1,...,em form a basis in F. Therefore ρ(u)ei are linear combinations of e1,...,em. We write it as
ρ(u)e_i = ∑_{j=1}^{m} e_j μ(u)_{ji}    (i = 1, ..., m).
The m² complex coefficients μ(u)_{ji} form a matrix representation of A by m⨯m square matrices. We have
μ(uv) = μ(u)μ(v).
Now, since F⟂ is also invariant, we have
ρ(u)e_i = ∑_{j=m+1}^{n} e_j ν(u)_{ji}    (i = m+1, ..., n).
Similarly we have
ν(uv) = ν(u)ν(v).
Thus we have another matrix representation of A with, this time, (n-m)⨯(n-m) square matrices.
Now, all n basis vectors e1, ..., en form a basis for E. So we get a matrix representation of ρ(u); call its matrices simply ρ(u)_{ji}:
ρ(u)e_i = ∑_{j=1}^{n} e_j ρ(u)_{ji}    (i = 1, ..., n).
Now, in our adapted basis, the n⨯n matrices ρ(u)_{ji} are block diagonal: they are of the form
ρ(u) = {{μ(u), 0}, {0, ν(u)}}
with blocks of sizes m⨯m, m⨯(n-m), (n-m)⨯m, (n-m)⨯(n-m). That is the matrix view of reducibility: a representation is reducible if there is an orthonormal basis in E in which the matrices of the representation are all block diagonal, with two nontrivial blocks.
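The block structure, and the fact that it is only visible in an adapted basis, can be illustrated numerically. In the sketch below all matrices are illustrative stand-ins: μ and ν are random blocks, and a random unitary U plays the role of a change from the adapted basis to a generic one.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 2, 4

def rand_c(shape, rng):
    return rng.normal(size=shape) + 1j * rng.normal(size=shape)

# In the adapted basis, rho(u) = {{mu(u), 0}, {0, nu(u)}}.
mu = rand_c((m, m), rng)
nu = rand_c((n - m, n - m), rng)
rho_adapted = np.block([[mu, np.zeros((m, n - m))],
                        [np.zeros((n - m, m)), nu]])

# In a generic orthonormal basis the same operator looks full:
U, _ = np.linalg.qr(rand_c((n, n), rng))     # a random unitary change of basis
rho_generic = U @ rho_adapted @ U.conj().T

# Changing back to the adapted basis restores the blocks:
back = U.conj().T @ rho_generic @ U
assert np.allclose(back[:m, :m], mu)         # the mu block
assert np.allclose(back[:m, m:], 0)          # off-diagonal blocks vanish
assert np.allclose(back[m:, :m], 0)
assert np.allclose(back[m:, m:], nu)         # the nu block
```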
Exercise 1. In the comments to the last post we have found a basis E1, E2 which spans a nontrivial left ideal of A. Is this basis orthonormal? Probably not, because we were not taking care of the normalization. But are the vectors E1, E2 orthogonal to each other in the scalar product of A? In any case, there should be another pair of vectors, say E1', E2', orthogonal to E1, E2 and to each other, that span the orthogonal complement of our left ideal. It should also be an invariant subspace, thus a complementary left ideal. Can we find a simple form of E1', E2'? Can we find ν? It is enough to find ν(ei), where ei is the basis of A.
It is now time to formulate a version of Schur's famous Lemma, adapted to our needs.
Given the *-representation ρ of A on E, ρ is irreducible if and only if the only linear operators acting on E and commuting with all ρ(u) are multiples of the identity operator.
In other words, ρ is irreducible if and only if the commutant ρ(A)' is CI - that is, if and only if the commutant is trivial. It is reducible if and only if the commutant is non-trivial.
In the proof below we use the standard results about eigenvalues and eigenspaces of Hermitian operators (matrices, if you wish).
Proof. One direction is simple. Suppose ρ is reducible; then there is a nontrivial invariant subspace F, and the projection PF is non-trivial (different from 0 and I). But since F is invariant, PF is in the commutant. Now the harder part: show that if the commutant is non-trivial, then the representation is reducible. So suppose ρ(A)' is non-trivial. Then there is an operator X commuting with all ρ(u), with X not of the form cI, c being a complex number. Now, since ρ is a *-representation, it follows that X* also commutes with all ρ(u).
Exercise 2. Prove the last statement.
But then X = (X+X*)/2 + (X-X*)/2. The first term commutes with all ρ(u), and so does the second; the same holds for the second term multiplied by the imaginary unit i. Both X+X* and i(X-X*) are Hermitian. If X is not of the form cI, then at least one of X+X* and i(X-X*), call it Y, is not of the form cI. Now we have Y = Y*, Y commutes with all ρ(u), and Y is not of the form cI. Since Y is Hermitian, it has eigenvalues and eigenspaces. Because Y is not a multiple of the identity, it has at least two distinct eigenvalues, so at least one of its eigenspaces must be nontrivial (different from 0 and E). Call it F. Since Y commutes with all ρ(u), its eigenspaces are invariant. Thus F is a nontrivial invariant subspace, and ρ is reducible.
In practice we often choose an orthonormal basis in E, and we work with matrices. Then ρ is irreducible if and only if the only matrices commuting with all the ρ(u) matrices are multiples of the identity matrix. But A is assumed to be finite-dimensional. Thus there is a basis, say εr in A, r = 1, 2, ..., k, where k is the dimension of the algebra A. Every element u of A can be written as u = ∑_{r=1}^{k} c_r ε_r, where c_r are complex numbers. For a matrix M to commute with all ρ(u) matrices, it is enough that M commutes with all ρ(εr), r = 1, ..., k. Thus ρ is irreducible if and only if from [M, ρ(εr)] = 0, r = 1, ..., k, it follows that M is a multiple of the identity matrix.
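This finite check can be carried out directly on a computer: the conditions [M, ρ(ε_r)] = 0 are linear in M, and with the standard identity vec(MA - AM) = (Aᵀ ⊗ I - I ⊗ A) vec(M) they become one big linear system whose nullity is the dimension of the commutant. A sketch, with Pauli matrices again as an illustrative stand-in set of generators:

```python
import numpy as np

def commutant_dim(gens):
    """Dimension of {M : [M, A] = 0 for all A in gens}.
    Uses vec(M A) = (A^T kron I) vec(M) and vec(A M) = (I kron A) vec(M),
    so [M, A] = 0 becomes (A^T kron I - I kron A) vec(M) = 0."""
    n = gens[0].shape[0]
    I = np.eye(n)
    K = np.vstack([np.kron(A.T, I) - np.kron(I, A) for A in gens])
    return n * n - np.linalg.matrix_rank(K)   # nullity of K

# Pauli matrices generate all of M_2(C): the representation on C^2
# is irreducible, so the commutant is C I (dimension 1).
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
assert commutant_dim([s1, s2, s3]) == 1       # irreducible

# A reducible example: the same generators doubled, acting
# block-diagonally on C^4; the commutant is then 4-dimensional.
def double(A):
    Z = np.zeros_like(A)
    return np.block([[A, Z], [Z, A]])
assert commutant_dim([double(s) for s in (s1, s2, s3)]) == 4  # reducible
```

A commutant dimension greater than 1 signals reducibility, and in the doubled example the four dimensions correspond to the two equivalent copies of the irreducible representation.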
Why do we care about reducibility or irreducibility? Suppose ρ is reducible, so we have invariant subspaces F and F⟂. Then ρ acts on F. We may ask if ρ acting on F is still reducible, and the same for ρ acting on F⟂. We proceed this way until we end up with irreducible representations. In this way we get
E=F1⊕...⊕Fp,
where each of F1, ..., Fp carries an irreducible representation. These are the building blocks of ρ, the "bricks", or "atoms". They can't be decomposed any further. Both physicists and mathematicians want to know these "atoms". And if a representation is reducible, there must be some "reason" for it. Perhaps it is reducible as a representation of A, but irreducible for some bigger algebra B? Then what would be the "meaning" of this B? On the other hand, atoms are built of nucleons and electrons. If ρ is an irreducible representation of A, perhaps it is reducible when restricted to a smaller algebra C? What would be the meaning of such a C?
There is another thing: suppose we have two irreducible representations of A, one on F1 and one on F2. Are they "essentially the same", or "essentially different"? Two representations are essentially the same (we say: they are "equivalent") if it is possible to choose an orthonormal basis in F1 and an orthonormal basis in F2 in such a way that the matrices of both representations are exactly the same. Of course it is enough that the matrices representing the εr are the same.
We have so far followed a geometrical way and looked for particular left ideals of our geometric algebra A. But once we know that it is a *-algebra, there is another way, more related to the concepts of quantum theory such as "states". Then there is a celebrated GNS (Gelfand-Naimark-Segal) construction of irreducible representations. This we discuss later on, and we will relate it to our adventure with ideals.