Tuesday, October 22, 2024

The Spin Chronicles (Part 5): Exterior algebra of space

In this post we will construct the Grassmann (or exterior) algebra of a 3D real vector space. It will be eight-dimensional: its basis will have eight elements.

The basis of Λ(V) has eight elements

With this post we will start a new chapter of our spin chronicles, with another approach - via Clifford algebra - the Clifford algebra of space. What is space? For us space will be a three-dimensional affine Euclidean space, let's call it M. The fact that it is affine means that there is a 3-dimensional real vector space, let us call it V (this is not the same V as in previous posts, this is the V of the new chapter), and we can translate any point x of M by a vector a in V, to make another point x+a. The fact that M is Euclidean means that in V we have a positive definite scalar product that we will denote a·b. In the following we will deal exclusively with V, so we will also use letters x, y, etc. for vectors in V. And to ease the notation we will write x, y, ... instead of boldface x, y, .... In V we will restrict ourselves to orthonormal bases e^i, that is, we will require e^i·e^j = δ_ij. Any two such bases are related by a unique orthogonal matrix, an element of the group O(3) of 3x3 matrices R such that R^T R = R R^T = I:

e'^i = e^j R_ji.

Note: I will be using lower indices to number vector components, and upper indices to number basis vectors.

Every orthogonal matrix R has determinant +1 or -1. Bases connected by an R of determinant +1 are said to be of the same parity; those connected by an R of determinant -1 are said to be of opposite parity. In the following we will restrict ourselves to one parity, which, by convention, we will call positive. This restricts transformations between bases to the subgroup SO(3) of O(3) - the special orthogonal group in three (real) dimensions, consisting of orthogonal matrices R with det(R) = 1. But all this will become relevant only in the next post.
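For those who like to verify such statements numerically, here is a minimal sketch (Python with numpy; the particular matrices are just examples): an orthogonal matrix satisfies R^T R = R R^T = I, a rotation has determinant +1, a reflection has determinant -1.

```python
import numpy as np

# An example basis change: rotation by 90 degrees about the third axis.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

# Orthogonality: R^T R = R R^T = I, so R belongs to O(3).
assert np.allclose(R.T @ R, np.eye(3))
assert np.allclose(R @ R.T, np.eye(3))

# det(R) = +1, so R belongs to SO(3): it preserves parity.
assert np.isclose(np.linalg.det(R), 1.0)

# A reflection has det = -1: it connects bases of opposite parity.
P = np.diag([1.0, 1.0, -1.0])
assert np.isclose(np.linalg.det(P), -1.0)
```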

The Grassmann (or exterior) algebra Λ(V)

To define Λ(V) we do not need any scalar product in V. We need only its vector space structure. So the Grassmann algebra is a pre-metric construction.

We consider the space of multi-vectors. They will form the Grassmann (or "exterior") algebra Λ(V). There will be scalars (these form a sub-algebra, essentially the one-dimensional algebra of real numbers R), vectors (they make up our V), bi-vectors, and three-vectors. No more. Sometimes we may use the term "rank" or "degree". So, scalars are of rank (or degree) 0, vectors of rank 1, bi-vectors of rank 2, and three-vectors, of course, of rank 3. All multi-vectors of rank higher than 3 are automatically zero for a three-dimensional V. Therefore we do not include them in Λ(V).

We know what vectors are; let us introduce bi-vectors (or two-vectors). In any basis of V, a vector v is represented by its components v_i. A bi-vector f is represented by an anti-symmetric matrix f_ij = -f_ji. Similarly, a three-vector is represented by a totally anti-symmetric array f_ijk = -f_jik = -f_ikj. Since i,j,k can only take the values 1,2,3, every three-vector is of the form

f_ijk = c ε_ijk,

where c is a real number, and ε_ijk is the totally anti-symmetric Levi-Civita tensor taking the values 0, 1, -1, with ε_123 = 1.
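Since the approach here is computer-friendly, a minimal sketch in Python (with 0-based indices 0,1,2 instead of 1,2,3) of the Levi-Civita symbol, and of the fact that a totally anti-symmetric three-index array is determined by the single number c:

```python
import numpy as np

# Levi-Civita symbol: eps[i,j,k] = +1 for even permutations of (0,1,2),
# -1 for odd permutations, 0 whenever an index repeats.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
for i, j, k in [(0, 2, 1), (2, 1, 0), (1, 0, 2)]:
    eps[i, j, k] = -1.0

# Every totally anti-symmetric f_ijk is c * eps_ijk; the constant c
# is just the (0,1,2) component (here c = 2.5 as an arbitrary example).
f = 2.5 * eps
assert f[0, 1, 2] == 2.5 and f[1, 0, 2] == -2.5 and f[0, 0, 1] == 0.0
```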

Kronecker deltas

It will be convenient to use Kronecker delta symbols. One of them, δ_i^j, is well known. Then we have (writing lower indices directly under upper indices on a web page is too complicated for me, so my formulas look different than in a "real" math text)

δ_ij^kl = δ_i^k δ_j^l - δ_j^k δ_i^l,

and

δ_ijk^lmn = δ_i^l δ_jk^mn - δ_i^m δ_jk^ln + δ_i^n δ_jk^lm.

It is easy to see the pattern. We can verify that the following identities hold for contractions (summation) over repeated indices (a convention we will always use):

δ_123^lmn = ε_lmn,   δ_lmn^123 = ε_lmn,

δ_ijk^imn = δ_jk^mn,

(1/2!) δ_ijk^ijn = δ_k^n.

δ_ij^kl and δ_ijk^lmn are, by construction, anti-symmetric with respect to the lower indices, and also with respect to the upper indices. They are equal to +1 if the lower indices are an even permutation of the upper indices, -1 for an odd permutation, and zero otherwise. We will use them to define the exterior product of multi-vectors.
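These symbols are easy to implement: a generalized delta with p lower and p upper indices is the determinant of the p x p matrix of ordinary Kronecker deltas, which reproduces the two expansions above. A sketch, with brute-force checks of the contraction identities:

```python
import numpy as np
from itertools import product

def gdelta(lower, upper):
    """Generalized Kronecker delta: determinant of the matrix of ordinary deltas."""
    m = np.array([[1.0 if a == b else 0.0 for b in upper] for a in lower])
    return int(round(np.linalg.det(m)))

idx = (1, 2, 3)
# delta_123^lmn = eps_lmn: e.g. eps_231 = +1, eps_213 = -1
assert gdelta((1, 2, 3), (2, 3, 1)) == 1 and gdelta((1, 2, 3), (2, 1, 3)) == -1

# delta_ijk^imn, summed over i, equals delta_jk^mn
for j, k, m, n in product(idx, repeat=4):
    lhs = sum(gdelta((i, j, k), (i, m, n)) for i in idx)
    assert lhs == gdelta((j, k), (m, n))

# (1/2!) delta_ijk^ijn, summed over i and j, equals delta_k^n
for k, n in product(idx, repeat=2):
    lhs = sum(gdelta((i, j, k), (i, j, n)) for i in idx for j in idx) / 2
    assert lhs == (1 if k == n else 0)
```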

Exterior product

We will denote it by "∧". Multiplication by scalars is the normal one: we just multiply any multi-vector by a real number, from the left or from the right - it is the same. Multiplication by vectors is defined as follows. If v,w are two vectors, then v∧w is a bi-vector with components:

(v∧w)_ij = δ_ij^kl v_k w_l = v_i w_j - v_j w_i.

Notice that v∧w = - w∧v. In particular v∧v = 0 for every vector v.
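In code (a minimal sketch with numpy, again with 0-based indices):

```python
import numpy as np

def wedge_vv(v, w):
    """(v ∧ w)_ij = v_i w_j - v_j w_i: an anti-symmetric 3x3 matrix."""
    return np.outer(v, w) - np.outer(w, v)

v = np.array([1.0, 2.0, 3.0])
w = np.array([0.0, 1.0, 5.0])
assert np.allclose(wedge_vv(v, w), -wedge_vv(w, v))  # v∧w = -w∧v
assert np.allclose(wedge_vv(v, v), 0.0)              # v∧v = 0
```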

Multiplication of vectors with bi-vectors is defined in a similar way. If v is a vector and f is a bi-vector, then v∧f is a three-vector with components

(v∧f)_ijk = (1/2!) δ_ijk^lmn v_l f_mn.

The factor (1/2!) compensates for the sum running over both orderings of the anti-symmetric index pair mn; with it, v∧(w∧u) has components δ_ijk^lmn v_l w_m u_n, with no spurious factor of 2.

Similarly from the right

(f∧w)_ijk = (1/2!) δ_ijk^lmn f_lm w_n.

Since δ_ijk^lmn = δ_ijk^nlm (a cyclic permutation of the upper indices is even),

we have v∧f = f∧v. Thus vectors commute with bi-vectors.
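A numerical check of this commutation and, as a bonus, of the geometric meaning of a three-vector as an oriented volume: a sketch implementing both products with the (1/2!) normalization used above. The component (v∧w∧u)_123 comes out equal to det[v,w,u], the oriented volume spanned by v, w, u.

```python
import numpy as np
from itertools import product

def gdelta(lower, upper):
    """Generalized Kronecker delta (determinant of ordinary deltas)."""
    m = np.array([[1.0 if a == b else 0.0 for b in upper] for a in lower])
    return np.linalg.det(m)

def wedge_vf(v, f):
    """(v ∧ f)_ijk = (1/2!) δ_ijk^lmn v_l f_mn  (0-based indices)."""
    out = np.zeros((3, 3, 3))
    for i, j, k in product(range(3), repeat=3):
        out[i, j, k] = 0.5 * sum(
            gdelta((i, j, k), (l, m, n)) * v[l] * f[m, n]
            for l, m, n in product(range(3), repeat=3))
    return out

def wedge_fv(f, w):
    """(f ∧ w)_ijk = (1/2!) δ_ijk^lmn f_lm w_n."""
    out = np.zeros((3, 3, 3))
    for i, j, k in product(range(3), repeat=3):
        out[i, j, k] = 0.5 * sum(
            gdelta((i, j, k), (l, m, n)) * f[l, m] * w[n]
            for l, m, n in product(range(3), repeat=3))
    return out

v = np.array([1.0, 2.0, 3.0])
w = np.array([-1.0, 0.0, 2.0])
u = np.array([0.0, 1.0, 1.0])
f = np.outer(w, u) - np.outer(u, w)                 # the bi-vector w ∧ u
assert np.allclose(wedge_vf(v, f), wedge_fv(f, v))  # vectors commute with bi-vectors
# (v ∧ w ∧ u)_123 is det[v,w,u], the oriented volume spanned by v, w, u
assert np.isclose(wedge_vf(v, f)[0, 1, 2], np.linalg.det(np.array([v, w, u])))
```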

Exercise 1: Show that for any bi-vector f we have: 

(1/2!) δ_ij^kl f_kl = f_ij,

and for any three-vector f we have:

(1/3!) δ_ijk^lmn f_lmn = f_ijk.

Exercise 2: Show that if f is a three-vector, then it commutes with every element of the algebra.


Finally, a three-vector multiplied by a vector or a bi-vector gives 0. The same for the product of two bi-vectors. This way we have defined the algebra of multi-vectors Λ(V), known also as the Grassmann algebra of V. One can verify that the product is associative. The unit element of this algebra is the scalar 1. As a vector space, Λ(V) is of dimension 1+3+3+1 = 8 = 2^3. This happens to be twice the dimension of the algebra of quaternions. Later on we will see that there are reasons for this.

If e^i is an arbitrary basis of V, we introduce the basic multi-vectors of rank 0, 1, 2, 3, respectively, defined by their components:

rank 0: scalar 1

rank 1: (e^i)_j = δ^i_j,

rank 2: e^ij = e^i∧e^j = (1/2!) δ_kl^ij e^k∧e^l, with components (e^ij)_kl = δ_kl^ij,

rank 3: e^ijk = e^i∧e^j∧e^k = (1/3!) δ_lmn^ijk e^l∧e^m∧e^n, with components (e^ijk)_lmn = δ_lmn^ijk.

The basis of Λ(V) then consists of 8 elements:

1,

e^1, e^2, e^3,

e^12 = e^1∧e^2,

e^23 = e^2∧e^3,

e^31 = e^3∧e^1,

e^123 = e^1∧e^2∧e^3.

We do not need, for example, e^21, because e^21 = -e^12. Similarly, we do not need e^231, since e^231 = e^123. We can easily figure out the multiplication table of these eight basis elements, for instance

e^1∧e^23 = e^123,   e^12∧e^1 = 0,   e^23∧e^1 = e^123, etc.

We can also check that the product is associative, for instance (e^1∧e^2)∧e^3 = e^1∧(e^2∧e^3).
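The whole multiplication table fits in a few lines. Below is a minimal sketch of Λ(V) as Python dictionaries mapping strictly increasing index tuples to coefficients (0-based, so e^1 is the key (0,), e^12 the key (0, 1), and the empty tuple is the scalar part). The product concatenates index lists, kills any term with a repeated index, and sorts the rest with the sign of the permutation:

```python
def wedge(a, b):
    """Exterior product of two multi-vectors of a three-dimensional V."""
    out = {}
    for ia, ca in a.items():
        for ib, cb in b.items():
            idx = ia + ib
            if len(set(idx)) != len(idx):
                continue  # repeated index: the term vanishes
            # sign of the permutation that sorts idx
            perm = sorted(range(len(idx)), key=lambda p: idx[p])
            inversions = sum(perm[i] > perm[j]
                             for i in range(len(perm))
                             for j in range(i + 1, len(perm)))
            sign = -1.0 if inversions % 2 else 1.0
            key = tuple(sorted(idx))
            out[key] = out.get(key, 0.0) + sign * ca * cb
    return {k: c for k, c in out.items() if c != 0.0}

e1, e2, e3 = {(0,): 1.0}, {(1,): 1.0}, {(2,): 1.0}
e23 = wedge(e2, e3)
assert wedge(e1, e23) == {(0, 1, 2): 1.0}            # e^1 ∧ e^23 = e^123
assert wedge(wedge(e1, e2), e1) == {}                # e^12 ∧ e^1 = 0
assert wedge(wedge(e1, e2), e3) == wedge(e1, wedge(e2, e3))  # associativity
```

(In three dimensions any four indices must contain a repetition, so ranks above 3 vanish automatically.)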

Every bi-vector f is then:

f = Σ_{i<j} f_ij e^ij,

and every three-vector f is then:

f = Σ_{i<j<k} f_ijk e^ijk.
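For instance (a small sketch with numpy): the three independent components of an anti-symmetric matrix, those with i < j, are exactly the coefficients of this expansion, and they reconstruct the whole matrix:

```python
import numpy as np

# An example bi-vector: f_12 = 4, f_13 = -5, f_23 = 6 (0-based: f[0,1] etc.)
f = np.array([[ 0.0,  4.0, -5.0],
              [-4.0,  0.0,  6.0],
              [ 5.0, -6.0,  0.0]])

# Coefficients of the expansion f = sum_{i<j} f_ij e^ij ...
coeffs = {(i, j): f[i, j] for i in range(3) for j in range(3) if i < j}

# ... and the reconstruction: e^ij contributes +1 at (i,j) and -1 at (j,i).
g = np.zeros((3, 3))
for (i, j), c in coeffs.items():
    g[i, j] += c
    g[j, i] -= c
assert np.allclose(f, g)
```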

Note: In mathematics the Grassmann algebra is defined in a different way, without indices, as a quotient of the infinite-dimensional tensor algebra by an infinite-dimensional ideal, so as to end up with a finite-dimensional space. That approach has its advantages. Here I have chosen a computer-friendly, constructive approach.

The future

In the next post we will deform the product in Λ(V) to obtain a new algebra structure on the same space - that will be the Clifford geometric algebra Cl(V) of V. So far, in defining the multiplication, we have never used the scalar product a·b of V. That will change with the geometric algebra product: the scalar product will enter the formula defining the deformation.

Note: Ultimately we will need a separate Grassmann (and Clifford) algebra at each point of our space (a field of algebras). This will lead us to an infinite number of dimensions of the field. But let us deal with just one point at a time.

Infinity of the exterior algebra field

P.S. Here are two relevant and useful pages dealing with Kronecker deltas, from the Mathematical Handbook by G. Korn and T. Korn (Справочник по математике, 1974).



P.S. Reading "Unreal Probabilities: Partial Truth with Clifford Numbers" by Carlos C. Rodriguez. There, at the beginning:

"The main motivation for this article has come from realizing that the
derivations in Cox [4] still apply if real numbers are replaced by complex
numbers as the encoders of partial truth. This was first mentioned by
Youssef [12] and checked in more detail by Caticha [2] who also showed
that non-relativistic Quantum theory, as formulated by Feynman [5], is the
only consistent calculus of probability amplitudes. By measuring propo-
sitions with Clifford numbers we automatically include the reals, complex,
quaternions, spinors and any combination of them (among others) as special
cases."

And at the end:

"Comments and conclusion What the hell is this all about and what it
may be likely to become...."

My answer: It is about fields of Clifford algebras and Clifford algebra-valued "measures".

Reading "Intelligent machines in the twenty-first century: foundations of inference and inquiry" by Kevin H. Knuth. There:

"Complex numbers and quaternions also conform to Cox’s consistency requirements (Youssef 1994;
also S. Youssef (2001), unpublished work), as do the more general Clifford algebras (Rodrıguez 1998), which are multivectors in the geometric algebra (Hestenes & Sobczyk 1984) described in Lasenby et al . (2000). Furthermore, Caticha (1998) has derived the calculus of wave function amplitudes and the Schrodinger equation entirely by constructing a poset of experimental set-ups and using the consistency requirements with degrees of inclusion represented with complex numbers. This leads to a very satisfying description of quantum mechanics in terms of measurements, which explains how it looks like probability theory—yet is not. We expect that the generalizations of lattice theory described here will not only identify unrecognized relationships among disparate fields, but also allow new measures to be developed and understood at a very fundamental level."

A partial truth value can be a multi-vector!


11 comments:

  1. A few questions.
    What's the underlying motivation to use anti-symmetric matrices for the definitions of rank 2 and rank 3 multivectors? Is there something "special" about them?
    I mean, an example of a bivector would be electromagnetic field tensor, right? What would be an example in nature of a three-vector?

    What is the meaning of the term "partial truth" and what would be "partial truth values"?


    And a few interesting snippets about Clifford from
    https://en.wikipedia.org/wiki/William_Kingdon_Clifford

    Mind stuff:
    "That element of which, as we have seen, even the simplest feeling is a complex, I shall call Mind-stuff. A moving molecule of inorganic matter does not possess mind or consciousness; but it possesses a small piece of mind-stuff. When molecules are so combined together as to form the film on the under side of a jelly-fish, the elements of mind-stuff which go along with them are so combined as to form the faint beginnings of Sentience. When the molecules are so combined as to form the brain and nervous system of a vertebrate, the corresponding elements of mind-stuff are so combined as to form some kind of consciousness; that is to say, changes in the complex which take place at the same time get so linked together that the repetition of one implies the repetition of the other. When matter takes the complex form of a living human brain, the corresponding mind-stuff takes the form of a human consciousness, having intelligence and volition."
    — "On the Nature of Things-in-Themselves" (1878)

    Tribal self:
    Tribal self, on the other hand, gives the key to Clifford's ethical view, which explains conscience and the moral law by the development in each individual of a 'self,' which prescribes the conduct conducive to the welfare of the 'tribe.'

    Clifford principle from
    https://en.wikipedia.org/wiki/Clifford%27s_principle

    While Clifford's principle can be found in the works of earlier philosophers such as John Milton and Samuel Coleridge, it is best known in Clifford's iconic formulation:
    "It is wrong always, everywhere, and for anyone to believe anything on insufficient evidence."
    This principle can also be found in a slight variation, often called Clifford's Other Principle:
    "It is wrong always, everywhere, and for anyone to ignore evidence that is relevant to his beliefs, or to dismiss relevant evidence in a facile way."

    1. As long as we have only space and no time, it would be only the magnetic field. A 3-vector is an oriented parallelepiped of space itself. Partial truth is truth under some specified assumptions. Its value is an element of the set of values that we decide to use for navigation.

    2. Thank you.

      Re "Exercise 2: Show that if f is a three-vector, then it commutes with every element of the algebra.",
      it's probably not what you had in mind with this exercise, but can a logical argument like the following be a satisfactory resolution:
      Since "a three-vector multiplied by a vector or two-vector gives 0" it's trivially evident that 3-vector commutes with vectors and bivectors, and as "Multiplication by scalars is the normal one: we just multiply any multi-vector by the real number, from the left or from the right - it is the same." it's also obvious that a product of 3-vector with a scalar is commutative. A product of a 3-vector with a 3-vector is not defined, i.e. assuming it's also 0, so a 3-vector commutes with everybody, that is by definition with scalars, which is the only defined non-zero product result of all possible combinations for a product with 3-vectors.
      Would that do?

    3. Re "Exercise 1: Show that for any bi-vector f we have:
      (1/2!) δ_ij^kl f_kl = f_ij,"

      using
      δ_ij^kl = δ_i^k δ_j^l - δ_j^k δ_i^l
      for left-hand side we get
      (1/2!) (f_ij - f_ji) = (1/2) * 2 * f_ij = f_ij,
      which is what we wanted to get, i.e. the right-hand side of the expression;

      and for the other part of the exercise:
      "and for any three-vector f we have:
      (1/3!) δ_ijk^lmn f_lmn = f_ijk."

      using
      δ_ijk^lmn = δ_i^l δ_jk^mn - δ_j^l δ_ik^mn + δ_k^l δ_ij^mn,
      and for example the above
      δ_ij^kl = δ_i^k δ_j^l - δ_j^k δ_i^l,
      for left hand side we get,
      (1/6)*(f_ijk - f_ikj - f_jik + f_jki + f_kij - f_kji) = (1/6) * 6 * f_ijk = f_ijk,
      which is what we wanted to get, i.e. the right-hand side.

  2. =. ->
    =

    same Multiplication ->
    same. Multiplication

    Finally a three-vector multiplied by ...
    What about bi times bi?

    "two-vector" occurs once.

    vis a three ->
    f is a three

    vlm ->
    vm

    If ei is an arbitrary basis ->
    superscript

    1. @Bjab
      "What about bi times bi?"
      A product of two anti-symmetric matrices in principle is not an anti-symmetric matrix, even when we know how and can multiply them, so a convenient definition would be needed.
      And going with analogy, multi-vector rank 1 (vector) times multi-vector rank 1 gives multi-vector rank 2 (bivector), rank 1 times rank 2 gives multi-vector rank 3 (3-vector), so bivector (rank 2) times bivector (rank 2) would then give multi-vector rank 4 which would be 'outside' of our 3d space V, just like a product of 3-vector with vector or bivector. So, in the similar fashion, we might say that bivector times bivector gives 0.
      Does that make sense?

    2. @Bjab. Fixed. Thanks again.
      @Saša. Added a sentence "All multi-vectors of rank higher than 3 are automatically zero for three-dimensional V. Therefore we do not include then in Λ(V)."


    3. "Therefore we do not include then in Λ(V)."

      Them.
      They, their, theirs.

      https://en.wikipedia.org/wiki/Singular_they
      Fortunately we (!) don't have such problems in Polish.

    4. Thanks. But already noticed and corrected in the text.


