Sunday, December 22, 2024

Spin Chronicles Part 28: Left and Right Regular

As it is Sunday, and Christmas Eve is coming soon, it should be an easy talk today. In fact, it is my intention that everything should be easy in my posts. By 'easy' I mean that even I myself can understand it. So, as Christmas is coming and light is a foundation of being, we decorated our home with light. The dedicated photographer in our extended family recorded it on a medium for me to show you - the structure constants of the geometric algebra are visible in the photo, for a trained eye:

Geometric algebra home

For this post I will denote our geometric Clifford algebra of space, Cl(V), by the bold letter A. It is an algebra over complex numbers, and we have a basis e0,e1,e2,e3 in A. For calculation purposes, especially when dealing with matrices, it is more prudent to number the basis differently: e1,e2,e3,e4, with e4 = e0 - the identity, the unit of A. And that is what I am going to use below. Thus every element of A can be written uniquely as

u = u1e1+... + u4e4,

where uμ are complex numbers.

A is endowed with an involution "*"; it is an involutive algebra. We notice that (eμ)* = eμ, μ = 1,...,4. For u,v in A we have (uv)* = v*u*.

A is also endowed with a positive definite scalar product

<u,v> = (u*v)4,

where (w)4 denotes the e4-component of w, that is the coefficient of the unit in the basis expansion.

We notice that the basis vectors form an orthonormal basis of A:

<eμ, eν> = δμν.

Once we have a positive-definite scalar product, we have a norm, defined by ||u||2 = <u,u>, and we notice that

||u*|| = ||u||.

We also know, from the previous post,  that A is a Hilbert algebra - we have

<ba,c> = <b,ca*>.
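
If you like to check such identities numerically, here is a minimal Python sketch (not part of the original exposition). It assumes the familiar realization of A by 2x2 complex matrices - e1, e2, e3 represented by the Pauli matrices, e4 by the identity, the involution * by Hermitian conjugation, so that (w)4 = Tr(w)/2 - and the helper names are mine:

```python
# Sketch only: A realized as 2x2 complex matrices (e_i -> Pauli matrices, e4 -> 1).
import numpy as np

e = [np.array([[0, 1], [1, 0]], dtype=complex),      # e1
     np.array([[0, -1j], [1j, 0]], dtype=complex),    # e2
     np.array([[1, 0], [0, -1]], dtype=complex),      # e3
     np.eye(2, dtype=complex)]                        # e4 = 1

def star(u):                  # the involution *: here, Hermitian conjugation
    return u.conj().T

def scalar(u, v):             # <u,v> = (u* v)_4, i.e. Tr(u^dagger v)/2 in this realization
    return np.trace(star(u) @ v) / 2

# <e_mu, e_nu> = delta_mu_nu : the basis is orthonormal
gram = np.array([[scalar(a, b) for b in e] for a in e])
print(np.allclose(gram, np.eye(4)))                   # True

# Hilbert algebra property <ba, c> = <b, ca*> for random a, b, c
rng = np.random.default_rng(0)
a, b, c = (sum((rng.normal() + 1j * rng.normal()) * ei for ei in e) for _ in range(3))
print(np.isclose(scalar(b @ a, c), scalar(b, c @ star(a))))   # True
```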

As any algebra, A acts on itself. It can act from the left, from the right, or from both sides at once. Let us denote these actions by L, R, and LR:

L(u)w = uw,

R(u)w = wu,

LR(u)w = uwu*.

From associativity of the algebra it follows then that left and right actions commute

L(u)R(v) = R(v)L(u),

and evidently

LR(u) = L(u)R(u*) = R(u*)L(u).
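
Here is a tiny sketch (same assumed 2x2 matrix realization as above, my own helper functions) that lets you see the three actions at work and checks numerically that L(u) and R(v) commute and that LR(u) = L(u)R(u*):

```python
# Sketch only: the actions L, R, LR on A realized as 2x2 complex matrices.
import numpy as np

sigma = [np.array([[0, 1], [1, 0]], dtype=complex),
         np.array([[0, -1j], [1j, 0]], dtype=complex),
         np.array([[1, 0], [0, -1]], dtype=complex),
         np.eye(2, dtype=complex)]

def rand_elem(rng):           # a random element of A
    return sum((rng.normal() + 1j * rng.normal()) * s for s in sigma)

def L(u):                     # L(u)w = uw
    return lambda w: u @ w

def R(u):                     # R(u)w = wu
    return lambda w: w @ u

def LR(u):                    # LR(u)w = u w u*
    return lambda w: u @ w @ u.conj().T

rng = np.random.default_rng(1)
u, v, w = rand_elem(rng), rand_elem(rng), rand_elem(rng)

print(np.allclose(L(u)(R(v)(w)), R(v)(L(u)(w))))         # L(u)R(v) = R(v)L(u)
print(np.allclose(LR(u)(w), L(u)(R(u.conj().T)(w))))     # LR(u) = L(u)R(u*)
```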

The map L from A to End(A) is a faithful representation of A. It is called the left regular representation. Similarly for R (strictly speaking, R is an anti-representation, since R(u)R(v) = R(vu), but it is faithful as well). Moreover, L is a *-representation, that is, we have

L(u*) = L(u)*.

What does the above equality mean? On the left we have L(u*) - the meaning is clear. On the right we have L(u)*. What does that mean? It is the Hermitian adjoint of the operator L(u) with respect to the Hilbert space scalar product. How is the Hermitian adjoint operator defined? Here is the defining relation:

<L(u)*v,w> = <v,L(u)w>,

or, if you prefer:

<L(u)v,w> = <v,L(u)*w>.

Exercise 1. Use the Hilbert algebra property to show that L is indeed a *-representation. Do the same for R.

Note. This is the right place for a side remark. We do not really need it, but, nevertheless, here it is: A is endowed with a norm ||u||. But we also have a faithful representation L of A on the Hilbert space A. To each u in A there corresponds the linear operator L(u) acting on a Hilbert space. This operator has a norm, as is the case with every bounded linear operator. We can therefore equip A with another norm, denoted ||u||', defined as

||u||' = ||L(u)||.

If we do this, we have a nice property:

||u*u||' = ||u||'2,

because bounded operators on a Hilbert space have this property. *-algebras with such a norm are called C*-algebras. So A can be thought of as a particularly simple case of a C*-algebra. There is a whole theory of abstract C*-algebras (in the finite-dimensional case they are the same as von Neumann algebras).
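
For the curious, a rough numerical illustration of this side remark (again a sketch under the assumed 2x2 matrix realization, with ad hoc helper names): we build the 4x4 matrix of L(u) in our orthonormal basis, take its operator norm, and check the C*-identity:

```python
# Sketch only: the C*-norm ||u||' = ||L(u)|| and the identity ||u* u||' = (||u||')^2.
import numpy as np

e = [np.array([[0, 1], [1, 0]], dtype=complex),
     np.array([[0, -1j], [1j, 0]], dtype=complex),
     np.array([[1, 0], [0, -1]], dtype=complex),
     np.eye(2, dtype=complex)]

def scalar(u, v):                    # <u,v> = (u* v)_4
    return np.trace(u.conj().T @ v) / 2

def L_matrix(u):                     # matrix of L(u): entry (rho, nu) is <e_rho, u e_nu>
    return np.array([[scalar(er, u @ ev) for ev in e] for er in e])

def cstar_norm(u):                   # ||u||' = operator (spectral) norm of L(u)
    return np.linalg.norm(L_matrix(u), 2)

rng = np.random.default_rng(2)
coeff = rng.normal(size=4) + 1j * rng.normal(size=4)
u = sum(c * ei for c, ei in zip(coeff, e))

print(np.isclose(cstar_norm(u.conj().T @ u), cstar_norm(u) ** 2))   # True
```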

In the discussion under the last post Bjab calculated the matrix form of L(u) in a basis. Taking into account the change of indexing (index 0 replaced by index 4), L(u) is given by the matrix:

{{u4, -iu3, iu2, u1},
{iu3, u4, -iu1, u2},
{-iu2, iu1, u4, u3},
{u1, u2, u3, u4}
}.

I have moved the first row to the end, the first column became the last one, and replaced u0 by u4. Selecting u = eμ, with (eμ)ν = δμν, we get the matrices Lμ = L(eμ) calculated by Saša:

L1 = {{0,0,0,1},{0,0,-i,0},{0,i,0,0},{1,0,0,0}},

L2 = {{0,0,i,0},{0,0,0,1},{-i,0,0,0},{0,1,0,0}},

L3 = {{0,-i,0,0},{i,0,0,0},{0,0,0,1},{0,0,1,0}},

L4 = {{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}}.

The matrices Rμ are the transposes of the matrices Lμ.
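
A quick numerical sanity check of the matrices above (a sketch only; the Lμ are typed in exactly as listed, and Rμ is taken to be the transpose of Lμ, as just stated): each Lμ is Hermitian, and every Lμ commutes with every Rν:

```python
# Sketch only: the matrices L_mu as listed above, with R_mu = (L_mu)^T.
import numpy as np

i = 1j
L = [np.array(m, dtype=complex) for m in [
    [[0, 0, 0, 1], [0, 0, -i, 0], [0, i, 0, 0], [1, 0, 0, 0]],   # L1
    [[0, 0, i, 0], [0, 0, 0, 1], [-i, 0, 0, 0], [0, 1, 0, 0]],   # L2
    [[0, -i, 0, 0], [i, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]],   # L3
    [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]],    # L4
]]
R = [m.T for m in L]

print(all(np.allclose(m, m.conj().T) for m in L))                  # each L_mu is Hermitian
print(all(np.allclose(Lm @ Rn, Rn @ Lm) for Lm in L for Rn in R))  # L_mu and R_nu commute
```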

Exercise 2. The matrices Lμ and Rμ are Hermitian. Why is it so?

Exercise 3. Why are the matrices Rμ simply the transposes of the matrices Lμ?

The space End(A) - the space of all linear operators acting on A - has complex dimension 16 (= 4x4). We can build the 16 matrices LμRν. There are enough of these matrices to form a basis of End(A). But to form a basis the matrices must be linearly independent. Are they?

One way to address this question is to note that End(A) is also a Hilbert space with a natural scalar product - for X,Y in End(A) it is given by the trace:

<X,Y> = Tr(X*Y).

So, if our basis happens to be orthonormal, then we automatically have linear independence. Using Mathematica I verified that

<LμRν,LσRρ> = 4 δμσ δνρ.

Therefore indeed our 16 matrices LμRν form a basis in End(A). Nice to know.
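
The same verification can be reproduced in a few lines of Python (a sketch equivalent to the Mathematica computation, with the Lμ typed in from the list above and Rν taken as the transpose of Lν):

```python
# Sketch only: <L_mu R_nu, L_sigma R_rho> = Tr((L_mu R_nu)* L_sigma R_rho) = 4 delta delta.
import numpy as np

i = 1j
L = [np.array(m, dtype=complex) for m in [
    [[0, 0, 0, 1], [0, 0, -i, 0], [0, i, 0, 0], [1, 0, 0, 0]],
    [[0, 0, i, 0], [0, 0, 0, 1], [-i, 0, 0, 0], [0, 1, 0, 0]],
    [[0, -i, 0, 0], [i, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]],
    [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]],
]]
R = [m.T for m in L]

prods = [Lm @ Rn for Lm in L for Rn in R]           # the 16 matrices L_mu R_nu
gram = np.array([[np.trace(X.conj().T @ Y) for Y in prods] for X in prods])
print(np.allclose(gram, 4 * np.eye(16)))            # True: orthogonal, hence independent
```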

Let us concentrate now on L(A) - the image of A under the representation L. In other words: the set of all matrices L(u), u in A. L(A) is an algebra, a subalgebra of End(A). While End(A) is 16-dimensional, L(A) is only 4-dimensional. It is closed under Hermitian conjugation: if X is in L(A), then X* is also in L(A). The same is true of R(A). We know that every element of R(A) commutes with every element of L(A).

Exercise 4. Why is it so?

In algebra whenever we have a subalgebra S of an algebra T, we denote by S' the commutant of S in T:

S' = {X in T: XY = YX for all Y in S}.

The fact that every element in R(A) commutes with every element in L(A) can be expressed by the formulas:

R(A) ⊂ L(A)',

L(A) ⊂ R(A)'.

It would be cruel of me to ask the Reader, on Sunday, two days before Christmas Eve,  to prove that, in fact, we have

R(A) = L(A)',

L(A) = R(A)'.
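
Before deferring the proof, here is a brute-force numerical check (a sketch of mine, not the promised proof): the condition X Lμ = Lμ X for all μ is a linear system in the 16 entries of X, and its solution space comes out 4-dimensional, exactly the dimension of R(A):

```python
# Sketch only: the commutant of L(A) in End(A) is 4-dimensional (consistent with L(A)' = R(A)).
import numpy as np

i = 1j
L = [np.array(m, dtype=complex) for m in [
    [[0, 0, 0, 1], [0, 0, -i, 0], [0, i, 0, 0], [1, 0, 0, 0]],
    [[0, 0, i, 0], [0, 0, 0, 1], [-i, 0, 0, 0], [0, 1, 0, 0]],
    [[0, -i, 0, 0], [i, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]],
    [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]],
]]
I4 = np.eye(4)

# [L_mu, X] = 0  <=>  (I kron L_mu - L_mu^T kron I) vec(X) = 0, with column-stacking vec
M = np.vstack([np.kron(I4, Lm) - np.kron(Lm.T, I4) for Lm in L])
print(16 - np.linalg.matrix_rank(M))       # 4: the commutant has complex dimension 4
```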

So, I leave the proof for the next post. But, perhaps it is not so cruel to ask the following

Exercise 5. Show that L(A)∩R(A) = C, where C denotes here the algebra of cI, where c is a complex number and I is the identity matrix.

P.S. 24-12-24 16:32 A neighbour and friend wrote to me a while ago that he had found on Academia.edu my paper "Revisiting Wigner's mind-body problem", and that "A lot of good things are said in this paper." Well, I never dared to publish this paper, thinking that it contains too many controversial and speculative statements. But, perhaps, it is not too bad? I really don't know.


72 comments:

  1. "By 'easy" I mean that even I myself can understand it."
    :)))
    A good one, really, as "easy" and understandable is way above my pay grade with today's post.
    Merry Christmas! (if no new posts appear until after Wednesday)

  2. R(u)w = w u, ->
    R(u)w = wu,

    The map L form A ->
    The map L from A

  3. The fact that that ->
    The fact that

    that every elements ->
    that every element

  4. "In algebra whenever we have a subalgebra S of an algebra T, we denote by S' the commutant of S in T:
    S' = {X in T: XY = YX for all Y in T}."

    Is this definition correct? Should there maybe be "for all Y in S" at the end, instead "for all Y in T", because as it is now it does not have anything to do with the set S with respect to which the S' is being defined.

    https://en.m.wikipedia.org/wiki/Centralizer_and_normalizer

    Replies
    1. OK.

      Then proving
      "R(A) = L(A)',
      L(A) = R(A)'."

      Assuming otherwise would mean that there are elements in End(A) which commute with all elements in L(A) (or R(A)) that are not in R(A) (or L(A)), meaning that they can not be expressed using basis Rμ (or Lμ), while their 'product', with any element in L(A) (or R(A)) written as linear combination of basis elements Lμ (or Rμ), can be expressed using basis LμRν for End(A).
      Well, it seems like a contradiction, thus there are no elements in End(A) that commute with all elements in L(A) (or R(A)) that are not in R(A) (or L(A)).
      Therefore,
      R(A) = L(A)',
      L(A) = R(A)'.

      Would that do?

    2. "Well, it seems like a contradiction,"

      Can you elaborate on this point. And concentrate on just one case, not two at once.

    3. It seems I made an error, implicitly assuming that XY is in End(A), which I don't know if it is true.
      So, the previous comment proved another thing, not R(A) = L(A)', but that for Y in L(A), XY is in End(A) if and only if X is in R(A).

      Regarding the commutant of L(A).
      There is an element X in End(A), but not in R(A), that commutes with all elements Y in L(A), meaning that there is an element (xμν LμRν), which is not possible to express as (rρ Rρ), that satisfies (xμν LμRν) (lσ Lσ) = (lσ Lσ) (xμν LμRν), for any (lσ Lσ).

      The complex coefficients (complex scalars) xμν are the simple product of complex coefficients lμ and rν, thus we have (lμ rν lσ) Lμ Rν Lσ = (lσ lμ rν) Lσ Lμ Rν, and as every element in R(A) commutes with every element in L(A), we get (lμ lσ) Lμ Lσ (rν Rν) = (lσ lμ) Lσ Lμ (rν Rν), meaning LμLσ = LσLμ.
      For arbitrary Lσ, LμLσ = LσLμ only if μ=σ or if μ=4, meaning LμLσ = I for μ=σ and LμLσ = I Lσ for μ=4, where I is the identity 4×4 matrix or L4.

      In the latter case LμRν = Rν, thus X = (xμν LμRν) = (xμν Rν) which is obviously in R(A), or to stay in line with that "assuming otherwise" we see that the 'product' XY = (lμ rν lσ) Lμ Rν Lσ = (lμ rν lσ) Rν Lσ can be expressed as a linear combination of RνLσ, i.e. in the basis for End(A), meaning that XY is in End(A), and we go to that statement "for Y in L(A), XY is in End(A) if and only if X is in R(A)".

      In the former case though, when LμRν = LσRν, we immediately go to XY = (lμ rν lσ) Lσ Rν Lσ = (lμ rν lσ) Rν which is obviously in R(A) and thus also in End(A), so we can again finish with the statement "for Y in L(A), XY is in End(A) if and only if X is in R(A)".

      Therefore, any X in End(A) that commutes with all Y in L(A) must necessarily be in R(A), or R(A) = L(A)', that is R(A) is commutant of L(A).

      Analogous argumentation proves "the same" for L(A), that is L(A) = R(A)' or that L(A) is commutant of R(A).

      Is that OK now?

    4. "The complex coefficients (complex scalars) xμν are the simple product of complex coefficients lμ and rν"

      How did you get this conclusion?

    5. Assumption that linear combination in basis LμRν can be written as product of linear combinations in bases Lμ and Rν.
      Since it does not affect the proof in any way and only reduces its generality, maybe better to omit it.

    6. "(lμ lσ) Lμ Lσ (rν Rν) = (lσ lμ) Lσ Lμ (rν Rν), meaning LμLσ = LσLμ."

      OK. How do you explain this implication?

    7. As the complex scalars commute with every other element, the equality between left and right side holds if LμLσ = LσLμ, that is for those Lμ that commute with arbitrary Lσ.

  5. Interesting observation:
    Matrices LμLν are Hermitian adjoints of matrices LνLμ (and vice versa), meaning that they are conjugate transposes (or transpose conjugates) of one another. The same holds for matrices RμRν and RνRμ.
    FWIW.

    Replies
    1. @Saša, how did you show that "LμLν are Hermitian adjoints of LνLμ" - implicitly by hand or using some 'Wolfram Mathematica'-like software? I cannot obtain this result. What is LνLμ, 16x16 matrix, right?

    2. LμLν are pretty much similar to LμRν, 4×4 matrices, as Ark showed in new post Part 29, they are in fact products of L(eμ)L(eν), basis elements for L(A).
      After multiplying each of the 4 Lμ with themselves, i.e. with each of the 4 Lν, arranged them in 4×4 matrix-like grid, with 4 Id on the "main diagonal", as Lμ^2=Id, and noticed that those above the diagonal are transposes (from the signs of Re elements, i.e. +/- 1) and complex conjugates (from the signs of Im elements, i.e. +/- i), of those under the diagonal.

    3. @Anna "how did you show that "LμLν are Hermitian adjoints of LνLμ""

      We already know that Lμ = Lμ* - don't we?
      Then (LμLν)* = Lν*Lμ* = LνLμ, since in general (XY)*=Y*X*.
      Or I am missing something?

    4. Yesterday, for some reason (after our corporate new-year party apparently) i decided that Lμ x Lν should be multiplied tensorially and got 16x16-matrices😂 And they looked almost as required, but not exactly.
      By the way, i thought that 'transposing' means changing rows with columns, but not 1 with -1. It is not the same thing...

    5. "'transposing' means changing rows with columns,"

      You are perfectly right, though it may happen that it is the same as exchanging +1 with -1, like in the second Pauli matrix.

    6. Yes, Ark explained it; the exchange of the positions of +1 and -1 made it possible to deduce by observation that they are transposed. And also Ark showed in one line that this Hermitian adjoint observation between LμLν and LνLμ is basically a trivial thing. :)

      So, a short recap: Lμ and Rμ are Hermitian, involutory and unitary, while Li and Ri are also traceless, almost like Pauli sigma_i matrices, with only difference (apart from being 4×4) in det(Li)=+1 instead of det(sigma_i)=-1.

      Anything important to add?

  6. "Exercise 5. Show that L(A)∩R(A) = C, where C denotes here the algebra of cI, where c is a complex number and I is the identity matrix."

    L(A)∩R(A) are those elements of End(A), i.e. (lμ rν) LμRν for which:
    lσ Lσ = rρ Rρ,
    Lσ \ lσ Lσ = rρ Rρ / Rρ
    lσ LσLσ Rρ = rρ Lσ RρRρ
    lσ Rρ = rρ Lσ => lσ = rρ and Rρ = Lσ which works only for σ=ρ=4, i.e. Rρ = R4 = L4 = Lσ.

    Therefore, L(A)∩R(A) = (lμ rν) LμRν = l4 r4 L4 R4 = c I = C, where c = l4 r4 is a product of two complex numbers, that is complex number itself, and I is the identity matrix.

    Replies
    1. "Lσ \ lσ Lσ = rρ Rρ / Rρ"

      Can you explain the meaning of this?

    2. Multiplying from the right with Rρ and from the left with Lσ.
      As every element of R(A) commutes with every element of L(A), it does not make any difference when multiplying from right or from left in this case, so maybe can be changed to just "multiplying" to avoid unnecessary confusion and improve clarity.

    3. But in lσ Lσ the index σ is the summation index: lσ Lσ stands for:

      lσ Lσ = l1 L1 +l2 L2 + l3 L3 +l4 L4.

      So, you multiply this sum by what?

    4. Yeah, I realized that after posting the comment; similar is with the right side, there's a summation index ρ.

      So it's maybe more correct manner to write the sums explicitly, which makes it evident that equality holds only for l4 L4 = r4 R4, and then we are back at the last paragraph of the previous comment, that is proof.

    5. Corrected answer to Exercise 5.:
      L(A)∩R(A) are those elements of End(A), i.e. aμν LμRν, for which:
      lσ Lσ = aμν LμRν = rρ Rρ,
      with summations over repeated indices.
      Left equality holds only for Rν = R4 = I, while right equality works only for Lμ = L4 = I, leading to:
      lσ Lσ = l4 L4 = l4 I = a44 L4R4 = a44 I I = r4 I = r4 R4 = rρ Rρ,
      which also means that l4 = a44 = r4 is the same complex coefficient in all three cases, that is
      L(A)∩R(A) = c I, where c is complex number and I is identity matrix (I = L4 = R4).

    6. This reasoning is correct, but "Left equality holds only for for Rν = R4 = I" needs a justification. How do we know that?

    7. Well, Lσ are linearly independent as they form the basis for L(A), and they are a subset of LμRν which are also all linearly independent as those are basis elements for End(A).
      As such Lσ can not be expressed as a linear combination using other elements of the basis LμRν, that is lσ Lσ = aμν LμRν = aσ4 LσR4 = lσ Lσ I.

    8. That is what I wanted to hear: "linear independence". Good job, Saša.

    9. @Saša "As such Lσ can not be expressed as a linear combination..."
      Probably, you wanted to say "can be expressed as a linear combination..." ?

    10. No, linear combination of basis elements for L(A) can not be expressed using the remaining basis elements for End(A) as they are linearly independent. And basis of L(A) is part of basis for End(A).

    11. In other words, only way to express linear combination (l1 L1 + l2 L2 + l3 L3 + l4 L4) in basis LμRν is by itself, i.e. just by multiplying it with R4=I.

    12. Sorry, i don't understand😒 What is the expression "aμν LμRν" if not a linear combination of "LμRν" (which are the basis for End(A)) with coefficients aμν ?

    13. Yes, "aμν LμRν" is a linear combination of basis elements for End(A), and only way to express (l1 L1 + l2 L2 + l3 L3 + l4 L4) in the basis LμRν is:
      l1 L1 + l2 L2 + l3 L3 + l4 L4 = (l1 L1 + l2 L2 + l3 L3 + l4 L4) R4,
      that is by itself (just by multiplying it with identity matrix I=R4) because of linear independence.
      Hope that made it a bit clearer now.

    14. "As such Lσ can not be expressed as a linear combination using other elements of the basis LμRν, ..."

      The key word in the passage above is "other", or "remaining" in the passage below.

      "linear combination of basis elements for L(A) can not be expressed using the remaining basis elements for End(A)"

    15. Yes, thank you very much! Now it seems very simple. After you've given the key idea. Easy to do but hard to guess what should be done.

    16. Then, these considerations should be generalized. Intersection of subspaces is the smallest subspace that contains all the common elements of the intersecting subspaces. If two subspaces are linearly independent, they intersect only across the identity. Is there such a property, or i'm fantasizing again?

    17. @Anna
      " Intersection of subspaces is the smallest subspace that contains all the common elements "

      perhaps you meant "largest"?

    18. @Ark, i borrowed this phrase from some lecture course on the Internet. And did not switch on my brain when quoted it here. The "largest" surely. The smallest is the empty set.

    19. I would rather say: the smallest is the zero subspace. But on Christmas day, after a glass of wine, I am not sure of anything.

    20. "In vino veritas" they say. :))
      Enjoy the day dear Ark, by all means, you deserve it.

      @Anna
      Not sure about the generalization; was using (sub)sets, not subspaces in my reasoning.
      As the basis for L(A) shares only identity matrix with the basis for R(A), i.e. L4=I=R4, it basically means that the intersection of L(A) and R(A) consists of only those elements that can be expressed as a product of complex scalar and identity matrix I, that is as a complex linear combination of the shared element(s) of both bases, because all other elements of their bases, except the identity matrix I, are mutually linearly independent as they also form (non overlapping) subsets in basis for End(A).

    21. Exactly, i thought about "zero subspace". But then i doubt whether we are working in a space or in a group (you say more often "basis elements" than "basis vectors") and chose more vague "empty set".
      Well, zero is zero, no matter what you call it, even after a glass of wine.🙃

    22. @Saša "was using (sub)sets, not subspaces in my reasoning".
      Oh, i knew that it makes difference whether we use sets or spaces and inclined to sets.
      "As the basis for L(A) shares only identity matrix with the basis for R(A)..."
      Yes, this is very clear reasoning, which i intuitively understood but could not formalize.
      As concerned the generalization, i thought that Ark agreed with it, with some remarks. Let us return to the question later.

    23. @Anna
      My original idea was to not use the basis. Instead I thought something like this:

      Let X be in End(A) and assume X is in L(A) and in R(A). X being in L(A) means there is a in A such that X=L(a). Similarly, there is b in A such that X=R(b). That means
      Xu = L(a)u = Rb(u) for all u in A. Or:
      au = ub for all u in A. With u=1 we find that b=a. That means
      au = ua for all u. But that implies that a is a scalar.

    24. Correction R(b)u instead of Rb(u) .

    25. Of course, to get "au = ua for all u. But that implies that a is a scalar." the realization of A as the 2x2 matrix algebra is handy (Schur's lemma).

    26. "P.S. 24-12-24 16:32 A neigbour and friend wrote to me a while ago that he has found on Academia.edu my paper "Revisiting Wigner's mind-body problem", and that "A lot of good things are said in this paper." Well, I never dared to publish this paper thinking that it contains too many controversial and speculative statements. But, perhaps, it is not too bad? I really don't know."

      Agree with your friend, some very good ideas are presented in that paper. After reading it again last night and in relation to Anna's comment about "generalization", sort of a realization or insight appeared, probably trivial, but here it is. FWIW.

      L(A) and R(A) are not completely linearly independent, they are connected through shared Id, and in that sense each 'preserves' its own identity, its own 3 'self' dimensions, while by combining with the 3 'self' dimensions of the other, they together create 9 'new' dimensions of their joined reality, together providing for in total 1+3+3+9=16 dimensions of the overall reality for End(A).
      FWIW.

    27. Schur's lemma - my stumbling block for a long time. It seemed so intricate until Ark gave that nice explanation above.
      Schur's Lemma: "Endomorphisms of an irreducible representation over a field can only be multiplication by a scalar".
      Or like this: "If operator A commutes with all irreducible representations of a group, then A is proportional to unity".
      Irreducibility appears as linear independence in our case.
      There is still much to be grasped for me about Schur's lemma, for example, does it have anything to do with the fundamental theorem of algebra: "Every irreducible polynomial over a field has degree 1" (in one of formulations). There is something in common between them, isn't it?
      As regards my generalization: "If two subspaces are linearly independent, they intersect only across the identity", there is some catch there, i will try to find it myself.

    28. @Anna "If operator A commutes with all irreducible representations of a group, then A is proportional to unity".

      It would be more correct to say "with all operators of an irreducible representation".

      But it occurs to me that in order to use this lemma, we must know that Mat(2,C) forms an irreducible representation of Mat(2,C). And this alone would be another good exercise.

    29. @Saša "together providing for in total 1+3+3+9=16 dimensions of the overall reality for End(A)".
      I like such observations. The more so that this one concerns the interrelation of spaces with n+1 and n dimensions. Such "bulk-boundary" relations seem to be of crucial role in the structure of our world, since getting from (n+1)-dim bulk into another, qualitatively different one, requires crossing their n-dim boundary.

    30. Oh no, Ark, please, i am not ready for such exercises, not yet!

    31. @Ark "we must know that Mat(2,C) form an irreducible representation of Mat(2,C). And this alone would be another good exercise".
      Could you give a hint how to show that, please?
      I've just convinced myself that representation of Cl(3) elements by Pauli matrices is actually a representation, i.e., A(u)A(v) = A(uv), where A(u) is {{u0 + u3, u1 - i u2}, {u1 + i u2, u0 - u3}} from Mat(2,C). Will this do as a warm-up before that 'good exercise'?

    32. Explaining more: we can do it within A, we do not have to use matrices. To apply Schur's lemma we need to show that any u commuting with all v is a multiple of the identity. Write u and v in terms of the basis. Use the structure constants as in Part 29. The same reasoning. Of course you can do the same within Mat(2,C).

  7. So many comments, i hope, Ark will review for other readers the main ideas, but i can't see clear answers to Ex. 1-3, probably, because they are considered too easy. Not in my case, so i would like you to check my answers below please.
    "Exercise 1. Use the Hilbert algebra property to show that L is indeed a *-representation. Do the same for R."
    For R it is really easy to show, the Hilbert algebra property is applied straightforwardly.
    But for L we have to make one more step since the factor which jumps into its conjugate stands at the first place, not at the second like in <ba,c> = <b,ca*>.

    "Exercise 2. The matrices Lμ and Rμ are Hermitian. Why is it so?"
    Finally, because of the anticommutation of the basis elements e_i, matrices Lμ and Rμ reflect the antisymmetry of elements e_i labeling the rows and columns.

    "Exercise 3. Why the matrices Rμ are simply transposed to the matrices Lμ?"
    Because the left column and upper row of elements e_i, used to compose the left-product matrix, should be rotated by pi/2 clockwise to become, correspondingly, the upper row and the right column, which we use to compose the right-product matrix.

  8. @Anna
    Exercise 1. Indeed, I may have overlooked something. It was simple in my mind, now this is not so simple anymore. So, I have to think. Be patient, please.

    Replies
    1. Yes, I overlooked something. The definition of the Hilbert algebra, quoted from Wolfram, reads:

      3. For any a in A, there exists an adjoint element a* in A such that <ab,c> = <b,a*c>, and <ba,c> = <b,ca*>.

      I have forgotten about the first property. The proof, in our case, uses the same method as for the second. You then use one of them, the one that fits your needs. So, thank you, Anna.

    2. Correction:

      such that (ab,c) = (b,a*c) and (ba,c) = (b,ca*)

    3. @Anna Exercise 2. we have eμ = (eμ)* - vectors eμ are in V, they are real. Once we know that L(a*)= L(a)* for all a in A, taking a = eμ should give you the result.

    4. Indeed. So, a *-representation is always given by a Hermitian matrix. But the reason behind this property in our case is anticommutation of eμ. Above I wrote erroneously "antisymmetry" of eμ. No, of course not, i meant anticommutation.

    5. Not that fast! L(eμ) are Hermitian, but L(u), for u being a complex linear combination of eμ, are, in general, not Hermitian.

    6. Yes, thank you, it was a bit hasty. Ok, i am not so selfish as to hold your attention still further on the Christmas Eve😊

    7. @Anna. Indeed it is time for me to pause the calculations and join the family. Thank you for the smile. Smile back. 😊

  9. Let me wish Merry Christmas to everyone!🎄🎄🎄
    Many happy new insights to all of us, and let it be easy since those are wise who make a hard problem simpler and not vice versa. Getting new knowledge is not only one’s own pleasure, it puts a little piece of chaotic being into a better order and thus adds to the harmony of the Universe. Let us appreciate the fact that we are able to go along this infinite path and enjoy it.

    Replies
    1. Thank you very much. I am being told that my younger sister will be cooking borshch and pierogi, very Polish, but perhaps Russian as well? Laura is working on pies and scallops and the turkey. While the girls are doing their job and the boys are taking care of firewood - I can work on the next posts. Quantum logic and left and right ideals coming soon!

    2. Great menu for a Christmas party. So deliciously described that I can almost smell the aromas. By the way, what is the difference between pies and pirogi? I thought they are two names for almost the same thing. As for the atmosphere of the Blog, the spinors spirit has definitely appeared here. Yes, ideals are coming next, you have arranged everything and did it so smoothly and delicately… As if taken by hand and led through a dark forest that always teased and scared me at the same time.

    3. Pie:
      https://www.bbcgoodfood.com/recipes/key-lime-pie-1
      Pierogi:
      https://simple.wikipedia.org/wiki/Pierogi
      There was also herring for start.
      But key-lime-pie will be for today. Yet untouched.
      Working on spinor bases - which I have yet to understand. Didn't let me sleep last night.

  10. "We know that every elements in R(A) commutes with every element in L(A).
    Exercise 4. Why is it so?"

    Because 'multiplication' is associative,
    L(v)R(w)z = v(zw) = (vz)w = R(w)L(v)z.

    Replies
    1. Or more precisely, because their basis elements commute, thus L(u)R(v) = (lσ Lσ) (rρ Rρ) = (rρ Rρ) (lσ Lσ) = R(v)L(u).

    2. Correction:
      L(u)R(v) = (uσ Lσ) (vρ Rρ) = (vρ Rρ) (uσ Lσ) = R(v)L(u).


