Sunday, December 22, 2024

Spin Chronicles Part 28: Left and Right Regular

As it is Sunday, and Christmas Eve is coming soon - it should be an easy talk today. In fact it is my intention that everything should be easy in my posts. By 'easy' I mean that even I myself can understand it. So, as Christmas is coming and light is a foundation of being, we decorated our home with light. The dedicated photographer in our extended family recorded it on a medium, for me to show you - the structure constants of the geometric algebra are visible in the photo, to a trained eye:

Geometric algebra home

For this post I will denote our geometric Clifford algebra of space, Cl(V), by the bold letter A. It is an algebra over the complex numbers, and we have a basis e0,e1,e2,e3 in A. For calculation purposes, especially when dealing with matrices, it is more prudent to number the basis differently: e1,e2,e3,e4, with e4 = e0 - the identity, the unit of A. And that is the numbering I am going to use below. Thus every element of A can be written uniquely as

u = u1e1+... + u4e4,

where uμ are complex numbers.
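
For readers who like to experiment, here is a minimal numerical sketch in Python/NumPy (the post's own computations were done in Mathematica). It assumes the multiplication table e_i e_j = δij e4 + i εijk e_k for i,j = 1,2,3, with e4 as the unit - the rule that reproduces the matrix L(u) displayed further below.

import numpy as np

# Elements of A are stored as coefficient vectors: indices 0,1,2 <-> e1,e2,e3, index 3 <-> e4.
e1, e2, e3, e4 = np.eye(4, dtype=complex)

# Levi-Civita symbol eps_ijk
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1

def mul(u, v):
    # product uv in A, assuming e_i e_j = delta_ij e4 + i eps_ijk e_k, e4 = unit
    w = np.zeros(4, dtype=complex)
    w[3] = u[3]*v[3] + u[:3] @ v[:3]                                                  # e4 (scalar) part
    w[:3] = u[3]*v[:3] + v[3]*u[:3] + 1j*np.einsum('ijk,i,j->k', eps, u[:3], v[:3])   # vector part
    return w

print(mul(e1, e1))   # e4
print(mul(e1, e2))   # i e3
print(mul(e2, e1))   # -i e3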

A is endowed with an (antilinear) involution "*"; it is an involutive algebra. We notice that (eμ)* = eμ, μ = 1,...,4. For u,v in A we have (uv)* = v*u*.

A is also endowed with a positive definite scalar product

<u,v> = (u*v)4,

where (w)4 denotes the coefficient of e4 in w.

We notice that the basis vectors form an orthonormal basis of A:

<eμ, eν> = δμν.

Once we have a positive-definite scalar product, we have a norm, defined by ||u||2 = <u,u>, and we notice that

||u*|| = ||u||.

We also know, from the previous post,  that A is a Hilbert algebra - we have

<ba,c> = <b,ca*>.
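
The involution and the Hilbert algebra property can be checked numerically on random elements. The only extra assumption in this sketch is that, since (eμ)* = eμ and * is antilinear, u* simply has complex-conjugated coefficients (the product mul is repeated from the sketch above so the snippet runs on its own).

import numpy as np

rng = np.random.default_rng(0)

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1

def mul(u, v):
    # product in A, as in the previous sketch
    w = np.zeros(4, dtype=complex)
    w[3] = u[3]*v[3] + u[:3] @ v[:3]
    w[:3] = u[3]*v[:3] + v[3]*u[:3] + 1j*np.einsum('ijk,i,j->k', eps, u[:3], v[:3])
    return w

def star(u):
    return np.conj(u)          # (e_mu)* = e_mu, * antilinear

def sp(u, v):
    return mul(star(u), v)[3]  # <u,v> = (u* v)_4

def rand():
    return rng.standard_normal(4) + 1j*rng.standard_normal(4)

a, b, c = rand(), rand(), rand()
print(np.allclose(star(mul(a, b)), mul(star(b), star(a))))    # (ab)* = b* a*
print(np.isclose(sp(star(a), star(a)), sp(a, a)))             # ||u*|| = ||u||
print(np.isclose(sp(mul(b, a), c), sp(b, mul(c, star(a)))))   # <ba,c> = <b,ca*>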

Like any algebra, A acts on itself. It can act from the left, from the right, or from both sides at once. Let us denote these actions by L, R, and LR:

L(u)w = uw,

R(u)w = wu,

LR(u)w = uwu*.

From the associativity of the algebra it then follows that the left and right actions commute,

L(u)R(v) = R(v)L(u),

and evidently

LR(u) = L(u)R(u*) = R(u*)L(u).
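
These relations can also be checked numerically: build L(u) and R(u) as 4x4 matrices column by column and compare (a NumPy sketch under the same assumptions as above).

import numpy as np

rng = np.random.default_rng(1)

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1

def mul(u, v):
    w = np.zeros(4, dtype=complex)
    w[3] = u[3]*v[3] + u[:3] @ v[:3]
    w[:3] = u[3]*v[:3] + v[3]*u[:3] + 1j*np.einsum('ijk,i,j->k', eps, u[:3], v[:3])
    return w

basis = np.eye(4, dtype=complex)

def L(u):
    return np.column_stack([mul(u, e) for e in basis])   # column nu = coefficients of u e_nu

def R(u):
    return np.column_stack([mul(e, u) for e in basis])   # column nu = coefficients of e_nu u

def rand():
    return rng.standard_normal(4) + 1j*rng.standard_normal(4)

u, v, w = rand(), rand(), rand()
print(np.allclose(L(u) @ R(v), R(v) @ L(u)))                              # L(u)R(v) = R(v)L(u)
print(np.allclose(L(u) @ R(np.conj(u)) @ w, mul(mul(u, w), np.conj(u))))  # LR(u)w = u w u*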

The map L from A to End(A) is a faithful representation of A. It is called the left regular representation. Similarly for R (strictly speaking, R is an anti-representation, since R(uv) = R(v)R(u)). Moreover, L is a *-representation, that is, we have

L(u*) = L(u)*.

What does the above equality mean? On the left we have L(u*) - the meaning is clear. On the right we have L(u)*. What does that mean? It is the Hermitian adjoint of the operator L(u) with respect to the Hilbert space scalar product. How is the Hermitian adjoint operator defined? Here is the defining relation:

<L(u)*v,w> = <v,L(u)w>,

or, if you prefer:

<L(u)v,w> = <v,L(u)*w>.

Exercise 1. Use the Hilbert algebra property to show that L is indeed a *-representation. Do the same for R.

Note. This is the right place for a side remark. We do not really need it, but, nevertheless, here it is: A is endowed with a norm ||u||. But we also have the faithful representation L of A on the Hilbert space A. To each u in A there corresponds the linear operator L(u) acting on this Hilbert space. This operator has a norm, as is the case with every bounded linear operator. We can therefore equip A with another norm, denoted ||u||', defined as

||u||' = ||L(u)||.

If we do this, we have a nice property:

||u*u||' = ||u||'2,

because the operator norm on any Hilbert space has this property. *-algebras with such a norm are called C*-algebras. So A can be thought of as a particularly simple example of a C*-algebra. There is a whole theory of abstract C*-algebras (in the finite-dimensional case they are the same as von Neumann algebras).
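
As an illustration (again a NumPy sketch under the same assumptions), one can compute ||u||' as the operator norm of L(u) and check the C*-identity on a random element; note that ||u||' differs in general from the Hilbert-space norm ||u||.

import numpy as np

rng = np.random.default_rng(2)

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1

def mul(u, v):
    w = np.zeros(4, dtype=complex)
    w[3] = u[3]*v[3] + u[:3] @ v[:3]
    w[:3] = u[3]*v[:3] + v[3]*u[:3] + 1j*np.einsum('ijk,i,j->k', eps, u[:3], v[:3])
    return w

def L(u):
    return np.column_stack([mul(u, e) for e in np.eye(4, dtype=complex)])

def cstar_norm(u):
    return np.linalg.norm(L(u), 2)    # ||u||' = operator (spectral) norm of L(u)

u = rng.standard_normal(4) + 1j*rng.standard_normal(4)
print(np.isclose(cstar_norm(mul(np.conj(u), u)), cstar_norm(u)**2))   # ||u*u||' = ||u||'^2
print(np.linalg.norm(u), cstar_norm(u))                               # ||u|| vs ||u||' - generally different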

In the discussion under the last post Bjab calculated the matrix form of L(u) in a basis. Taking into account the change of indexing (index 0 replaced by index 4), L(u) is given by the matrix:

{{u4, -iu3, iu2, u1},
{iu3, u4, -iu1, u2},
{-iu2, iu1, u4, u3},
{u1, u2, u3, u4}
}.

I have moved the first row to the end and the first column to the last position, and replaced u0 by u4. Selecting u = eμ, with (eμ)ν = δμν, we get the matrices Lμ calculated by Saša:

L1 = {{0,0,0,1},{0,0,-i,0},{0,i,0,0},{1,0,0,0}},

L2 = {{0,0,i,0},{0,0,0,1},{-i,0,0,0},{0,1,0,0}},

L3 = {{0,-i,0,0},{i,0,0,0},{0,0,0,1},{0,0,1,0}},

L4 = {{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}}.
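
As a quick cross-check of these matrices (a NumPy sketch), one can verify that they are Hermitian (cf. Exercise 2 below) and that they multiply exactly as the eμ do, as a faithful representation should.

import numpy as np

i = 1j
L1 = np.array([[0, 0, 0, 1], [0, 0, -i, 0], [0, i, 0, 0], [1, 0, 0, 0]])
L2 = np.array([[0, 0, i, 0], [0, 0, 0, 1], [-i, 0, 0, 0], [0, 1, 0, 0]])
L3 = np.array([[0, -i, 0, 0], [i, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])
L4 = np.eye(4, dtype=complex)

print(all(np.allclose(M, M.conj().T) for M in (L1, L2, L3, L4)))   # all four are Hermitian
print(np.allclose(L1 @ L1, L4),        # e1 e1 = e4
      np.allclose(L1 @ L2, i * L3),    # e1 e2 = i e3
      np.allclose(L2 @ L3, i * L1),    # e2 e3 = i e1
      np.allclose(L3 @ L1, i * L2))    # e3 e1 = i e2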

The matrices Rμ are the transposes of the matrices Lμ.

Exercise 2. The matrices Lμ and Rμ are Hermitian. Why is it so?

Exercise 3. Why are the matrices Rμ simply the transposes of the matrices Lμ?

The space End(A) - the space of all linear operators acting on A - has complex dimension 16 (= 4x4). We can build 16 matrices LμRν. There are enough of these matrices to form a basis of End(A). But to be a basis the matrices must be linearly independent. Are they?

One way to address this question is to note that End(A) is also a Hilbert space with a natural scalar product - for X,Y in End(A) the scalar product is given by the trace:

<X,Y> = Tr(X*Y).

So, if our 16 matrices happen to be orthonormal, then we automatically have linear independence. Using Mathematica I verified that

<LμRν,LσRρ> = 4 δμσ δνρ.

Therefore indeed our 16 matrices LμRν form a basis in End(A). Nice to know.
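
The same verification can be repeated in NumPy (a sketch, with the Rμ taken as the transposes of the Lμ, as stated above).

import numpy as np

i = 1j
L1 = np.array([[0, 0, 0, 1], [0, 0, -i, 0], [0, i, 0, 0], [1, 0, 0, 0]])
L2 = np.array([[0, 0, i, 0], [0, 0, 0, 1], [-i, 0, 0, 0], [0, 1, 0, 0]])
L3 = np.array([[0, -i, 0, 0], [i, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])
L4 = np.eye(4, dtype=complex)
Ls = [L1, L2, L3, L4]
Rs = [M.T for M in Ls]                 # R_mu = transpose of L_mu

# Gram matrix of the 16 products L_mu R_nu in the scalar product <X,Y> = Tr(X* Y)
prods = [Lm @ Rn for Lm in Ls for Rn in Rs]
gram = np.array([[np.trace(X.conj().T @ Y) for Y in prods] for X in prods])
print(np.allclose(gram, 4 * np.eye(16)))   # mutually orthogonal, each of squared norm 4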

Let us concentrate now on L(A) - the image of A under the representation L. In other words: the set of all matrices L(u), u in A. L(A) is an algebra, a subalgebra of End(A). While End(A) is 16-dimensional, L(A) is only 4-dimensional. It is closed under Hermitian conjugation: if X is in L(A), then X* is also in L(A). The same is true of R(A). We know that every element in R(A) commutes with every element in L(A).

Exercise 4. Why is it so?

In algebra whenever we have a subalgebra S of an algebra T, we denote by S' the commutant of S in T:

S' = {X in T: XY = YX for all Y in S}.

The fact that every element in R(A) commutes with every element in L(A) can be expressed by the formulas:

R(A) ⊂ L(A)',

L(A) ⊂ R(A)'.

It would be cruel of me to ask the Reader, on Sunday, two days before Christmas Eve,  to prove that, in fact, we have

R(A) = L(A)',

L(A) = R(A)'.

So, I leave the proof for the next post. But, perhaps it is not so cruel to ask the following

Exercise 5. Show that L(A)∩R(A) = C, where C denotes here the algebra of cI, where c is a complex number and I is the identity matrix.
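
As for the commutant claims above, here is a numerical check (a NumPy sketch, not the promised proof): the commutant of L(A) inside End(A) comes out 4-dimensional, which is consistent with R(A) = L(A)'.

import numpy as np

i = 1j
L1 = np.array([[0, 0, 0, 1], [0, 0, -i, 0], [0, i, 0, 0], [1, 0, 0, 0]])
L2 = np.array([[0, 0, i, 0], [0, 0, 0, 1], [-i, 0, 0, 0], [0, 1, 0, 0]])
L3 = np.array([[0, -i, 0, 0], [i, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])
L4 = np.eye(4, dtype=complex)
I = np.eye(4)

# L(A)' is the joint null space of X -> [L_mu, X].  With column-stacking vec(X),
# vec(M X - X M) = (I kron M - M^T kron I) vec(X), so stack these maps and count the nullity.
comm = np.vstack([np.kron(I, M) - np.kron(M.T, I) for M in (L1, L2, L3, L4)])
print(16 - np.linalg.matrix_rank(comm))    # prints 4 = dim L(A)'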



4 comments:

  1. "By 'easy" I mean that even I myself can understand it."
    :)))
    A good one, really, as an "easy" and understandable is way above my pay grade with this today's post.
    Merry Christmas! (if no new posts appear until after Wednesday)

  2. R(u)w = w u, ->
    R(u)w = wu,

    The map L form A ->
    The map L from A

  3. The fact that that ->
    The fact that

    that every elements ->
    that every element


