Tuesday, June 25, 2024

AI at work

 Today (25-06-24) I got an email from ai.feedback@outlook.com, with the subject "Feedback on your recent arXiv preprint 2401.14217". Here it is:

Hi there,

Congratulations on your recent preprint on arXiv, titled "The explicit form of the unitary representation of the Poincar\'e group for vector-valued wave functions (massive and massless), with applications to photon's localization and position operators". We are grateful for your hard work and dedication to the field, and we value your contributions!

We are part of a team from Northwestern University, Stanford University, and Cornell University, committed to providing research feedback to scholars with the assistance of advance AI models. We have followed your work closely and, upon a thorough examination, generate the suggestions below. These suggestions cover various aspects of your work, including the writing style, research design, and title. We hope they offer you fresh insights that may enhance the depth and impact of your research.

To view detailed comments regarding your research, please visit this link: http://feedback.kellogg.northwestern.edu/xxx.html

Once again, congratulations on your achievement. We are certain that your work will have an impact on the future of your field and will inspire fellow researchers worldwide.

Should you have any questions, concerns, or suggestions, please do not hesitate to reach out to us at ai.feedback@outlook.com. Your insights would be invaluable in helping us better support the community's research development.

Best regards,

Feedback Team

And here is their (AI) feedback: 

1. Significance and Novelty
2. Suggestions for Improvement
3. Suggestions on Title
4. Grammar Check for Abstract
5. Grammar Check for Introduction

Feedback for

The explicit form of the unitary representation of the Poincar\'e group for vector-valued wave functions (massive and massless), with applications to photon's localization and position operators

1. Significance and Novelty

  • Geometric Derivation:
  • Innovatively builds a unitary representation of the Poincaré group using a geometric framework.
  • Photon Position Operators:
  • Offers an explicit form for the photon position operator with commuting components, aligning with the Hawton-Baylis formulation.
  • Affine Connections:
  • Compares the Pryce and Hawton-Baylis connections, highlighting differences in semi-symmetry and metric properties.
  • Photon Localization:
  • Provides a rigorous analysis of photon states localized on loops, applying both the Hawton-Baylis operators and the Jauch-Piron-Amrein POV measure.

2. Suggestions for Improvement

  • Mathematical Rigor and Clarification:
  • Contextual Clarification:
  • The paper should more explicitly clarify the context and motivation behind the study of photon position operators. While the historical background is comprehensive, linking it directly to the benefits and applications of photon localization in contemporary quantum mechanics could enhance reader engagement.
  • Detailed Proofs:
  • Include more detailed proofs for the derivations of the photon boost eigenmodes and related transformations. For example, the section detailing the explicit form of the so-obtained unitary representation and its limit to m=0 would benefit from step-by-step mathematical exposition.
  • Figures and Tables:
  • Inclusion of Visual Aids:
  • The manuscript lacks visual aids. Diagrams illustrating the geometric constructs, and tables summarizing the comparative properties of different connections (e.g., metric semi-symmetry versus flatness) could significantly enhance comprehension.
  • Clarity in Captions:
  • Ensure that all figures and tables have clear, self-explanatory captions. The absence of such captions in the current outline is a noticeable omission.
  • Comprehensive Literature Review:
  • Expand Bibliography:
  • While the references are robust, ensure that all recent relevant studies are included. For instance, adding discussions about recent empirical studies or theoretical advancements related to photon localization and affine connections could provide a more comprehensive view.
  • Terminology and Notation:
  • Consistency:
  • Ensure that terminology and notations are used consistently throughout the paper. For example, the term Hawton-Baylis connection should be used uniformly without alternation if it is referred to by other names.
  • Structure and Readability:
  • Section Consistency:
  • Improve the flow between sections. The transition from one main section to another sometimes feels abrupt. Providing brief introductory and concluding paragraphs within sections could improve overall coherence.
  • Abstract Brevity and Focus:
  • The abstract is too technical and dense for a general reader. Consider distilling it to highlight the core contributions and findings succinctly, reserving detailed technicalities for the main text.
  • Theoretical Implications:
  • Implications of Findings:
  • Expand on the potential implications of your theoretical findings. For example, how do the new photon position operator and the findings on different connections impact future research perspectives or practical applications in quantum field theory or optics?
  • By addressing these suggestions, the authors can significantly improve the clarity, impact, and comprehensiveness of their paper. The emphasis on more explicit motivational contexts, visual aids, detailed proofs, and thorough discussions on potential implications will not only make the paper more accessible but also more valuable to the scientific community.

3. Suggestions on Title

Original Title

The explicit form of the unitary representation of the Poincar\'e group for vector-valued wave functions (massive and massless), with applications to photon's localization and position operators

Recommended Titles

  • Unitary Representations of the Poincaré Group in Photon Localization. Reasoning: This title is concise and highlights the core mathematical and physical concepts of unitary representations and photon localization.
  • Photon Localization through Unitary Poincaré Group Representations. Reasoning: Reverses the structure for emphasis on photon localization.
  • Geometric Derivations of Unitary Poincaré Representations and Photon Position Operators. Reasoning: Incorporates the geometric aspect.
  • Affine Connections and Photon States in Unitary Poincaré Representations. Reasoning: Specifically mentions affine connections.
  • Photon Position Operators and Localization via Unitary Poincaré Group Theory. Reasoning: This title combines the key elements - photon position operators and localization - with the Poincaré group.

4. Grammar Check for Abstract

  • 1.Original Sentence: We geometrically derive the explicit form of the Unitary representation of the Poincaré group and use it to apply speed-of-light boosts to simple polarization basis to end up with Hawton-Baylis photon position operator with commuting components.
        ErrorType: Subject-Verb Agreement Errors
        Explanation: The verb 'apply' should be in its base form as it follows 'use it to'
        Recommended Fragment: apply speed-of-light boosts to a simple polarization basis
  • 2.Original Sentence: We geometrically derive the explicit form of the Unitary representation of the Poincaré group and use it to apply speed-of-light boosts to simple polarization basis to end up with Hawton-Baylis photon position operator with commuting components.
        ErrorType: Incorrect Word Usage
        Explanation: The phrase 'simple polarization basis' should include an article 'a' to be grammatically correct.
        Recommended Fragment: apply speed-of-light boosts to a simple polarization basis
  • 3.Original Sentence: Finally we discuss localizabil- ity of photon states localized on closed loops and show that photon states on the circle, both unnormalized improper states and finite norm wave packet smeared over washer-like regions are strictly localized with respect to Hawton-Baylis oper- ators with commuting components and also with respect to the noncommutative Jauch-Piron-Amrein POV measure.
        ErrorType: Spelling Errors
        Explanation: The word 'localizabil- ity' is split incorrectly over two lines and should be corrected to 'localizability'.
        Recommended Fragment: Finally we discuss localizability
  • 4.Original Sentence: photon states on the circle, both unnormalized improper states and finite norm wave packet smeared over washer-like regions are strictly localized with respect to Hawton-Baylis oper- ators with commuting components and also with respect to the noncommutative Jauch-Piron-Amrein POV measure.
        ErrorType: Spelling Errors
        Explanation: The word ‘oper- ators’ is split incorrectly over two lines and should be corrected to ‘operators’.
        Recommended Fragment: operators with commuting components and also

* Disclaimer: The grammar suggestions provided are checked by advanced AI models and are intended for reference purposes only.

5. Grammar Check for Introduction

  • 1.Original Sentence: Introduction The group-theoretical analysis of elementary relativistic quantum systems lead to the concept of imprimitivity systems, developed by G.W. Mackey (cf. e.g. [2, Ch. VI] and references therein), and to the associated concept of the localization of elemen- tary quantum particles.
        ErrorType: Run-On Sentence
        Explanation: The sentence lacks proper punctuation to separate distinct ideas.
        Recommended Fragment: Introduction. The group-theoretical analysis of elementary relativistic quantum systems led to the concept of imprimitivity systems, developed by G.W. Mackey (cf. e.g. [2, Ch. VI] and references therein), and to the associated concept of the localization of elementary quantum particles.
  • 2.Original Sentence: A.S. Wightman [3] applied these concepts to the study of localizability of quantum mechanical systems and came to conclusion confirming the previous analysis of T.D. Newton and E.P Wigner [4], namely that photons (as well as other particles of rest mass zero and helicity ≥ 1) are covariantly non-localizable in a strict sense of an imprimitivity system bases on the 3-d Euclidean group acting on R 3 ..
        ErrorType: Run-On Sentence
        Explanation: The sentence lacks proper punctuation to separate distinct ideas.
        Recommended Fragment: A.S. Wightman [3] applied these concepts to the study of localizability of quantum mechanical systems and came to the conclusion confirming the previous analysis of T.D. Newton and E.P. Wigner [4]. They determined that photons (as well as other particles of rest mass zero and helicity ≥ 1) are covariantly non-localizable in a strict sense of an imprimitivity system based on the 3-D Euclidean group acting on R3.
  • 3.Original Sentence: J.M. Jauch and C. Piron [5], developed a concept of \"weak localizability\" replacing projection-valued measure by POV (positive operator-valued) measures, and A.O. Amrein [6] proved that there exist photon states strictly POV-localized in arbitrarily small regions of space, while, more recently, I. and Z. Bialynicki-Birula [7] argued that photons cannot be sharply localized because of a kind of complementarity between magnetic and electric energy localization.
        ErrorType: Run-On Sentence
        Explanation: The sentence lacks proper punctuation to separate distinct ideas.
        Recommended Fragment: J.M. Jauch and C. Piron [5] developed a concept of \"weak localizability\" by replacing projection-valued measures with POV (positive operator-valued) measures. Additionally, A.O. Amrein [6] proved that there exist photon states strictly POV-localized in arbitrarily small regions of space. More recently, I. and Z. Bialynicki-Birula [7] argued that photons cannot be sharply localized due to a kind of complementarity between magnetic and electric energy localization.
  • 4.Original Sentence: It is rather easy to show that the standard requirements of the covariance with respect to the Euclidean group and inversions lead to a unique Q, - known as the Pryce photon position operator 2 [8], the trouble is that the components Q i do not commute, which makes the simple probabilistic interpretation for the photon's localization problem impossible.
        ErrorType: Run-On Sentence
        Explanation: The sentence lacks proper punctuation to separate distinct ideas.
        Recommended Fragment: It is rather easy to show that the standard requirements of covariance with respect to the Euclidean group and inversions lead to a unique Q, known as the Pryce photon position operator [8]. The trouble is that the components Q i do not commute, which makes the simple probabilistic interpretation for the photon's localization problem impossible.
  • 5.Original Sentence: acting on R 3 ..
        ErrorType: Sentence Fragment
        Explanation: The sentence lacks a subject and verb to create a complete thought.
        Recommended Fragment: acting on R3.

* Disclaimer: The grammar suggestions provided are checked by advanced AI models and are intended for reference purposes only.

Altogether not too bad! 

Saturday, June 15, 2024

The Goldilocks Enigma – Part Six

 by Laura Knight-Jadczyk

In the previous post, we learned how scientists have been able to use the ‘hot big bang’ theory to extrapolate how the chemicals for life came into being.  I noted that the theory includes the idea that the initial ‘searing hot plasma’ was composed of ‘freely moving atomic components’ such as protons, neutrons and electrons, which came from where?  No clue.

We learned that the universe is composed of mostly hydrogen and helium in a 3 to 1 ratio, and the theorized processes of the big bang have been confirmed (at least in part, I guess) in atom colliders. We also learned that, at the initial stages of the theory, the universe was allegedly “squeezed into a volume of space no larger than the solar system, with temperature almost a million times hotter than the center of the sun.”  In passing, I noted Weizsäcker’s ‘information-theoretic’ idea.


Created by A.J. using AI software Kandinsky 3.1 with the following command: 'a drawing of an inflating universe at the instant of initial inflation'

One important thing learned was that the rate of expansion vs deceleration played a very important role in the physical processes taking place at a given time, i.e. the creation of the different materials and conditions necessary for life. 

Then, there was Alan Guth and his ‘inflation’ as opposed to ‘expansion’.  In this theory the universe jumps from something the size of a single proton to the size of a grapefruit virtually instantaneously, i.e. almost like creation by fiat!  According to Guth, this is all thanks to a scalar field that has a pressure comparable to its energy and produces anti-gravity, though this process had to stop pretty quick or everything would go kaflooey.  So, conveniently, Guth’s scalar field is ‘inherently unstable’ and just peters out after a bit. That is, instead of an explosion, there’s just this instant inflation, and the energy stored in the inflation field turned into heat, and this heat created all the 10⁵⁰ tons of matter in the universe.  Nobody has yet observed a scalar field, according to Davies.

I thought about all of that for a while and the thing that bothered me is this: The initial Standard Model, according to Davies, proposes that the primal universe was “squeezed into a volume of space no larger than the solar system, with temperature almost a million times hotter than the center of the sun.”  The solar system is more than 28 billion miles in diameter.  Then Davies said: "the universe doubled in size between 1 and 2 microseconds (a millionth of a second)".  That means that opposite edges of that volume receded from each other by another 28 billion miles in a single microsecond.  That means that this expansion took place waaay faster than the speed of light.  Now, I’m not a mathematician, so I mentioned this to Ark, who noted that the solution to this problem usually given is that ‘the speed of light does not apply to the edge of the universe.’ He then pointed me to this page: The Universe Could Be Eternal, According to This Controversial Theory.  There we read:

·       Controversial research suggests the Big Bang may be a myth due to its reliance on the Doppler effect theory.

·       This idea says the universe is neither expanding, nor contracting; instead it is steady, and has no beginning and no end.

The author of this theory thinks that the redshift, on which everything we have been looking at so far has been based, is a red herring. We read there:

“The Doppler’s effect is a 180-year old theory nobody has backed up with experimental evidence,” Wilenchik tells Popular Mechanics. To look at different planets and moons in the solar system, Wilenchik, who is a lawyer by trade and an amateur astronomer, borrowed a simple spectroscopy test English astronomer William Huggins had first used in 1868. Spectroscopy is the study and measurement of spectra, or the charts or graphs that depict the intensity of light from an astronomical body like a star. Wilenchik also used data from the Hawaii-based Keck Observatory’s spectrometers—available online—and had a professional astrophysicist process it for him. The results of his study align with a different, incompatible idea about the universe: the tired light model.

The 1929 brainchild of Swiss astronomer Fritz Zwicky, the tired light hypothesis attributes the universe’s redshift to the fact that photons, the tiny packets of electromagnetic energy that make up light, lose energy as they pass through the great cosmos. Therefore, a decrease or increase in energy doesn’t necessarily mean movement, so no stretching universe can exist. This model indicates that light simply loses energy over time—and so the universe must be static.


Read the article for the pros and cons as well as another article about the same: Quasi Steady-State theory: the Big Bang alternative explained.   There we read: 

"To his death, Hoyle would never submit to the Big Bang theory. A small subset of cosmologists still work on resurrecting a steady state model; but, on the whole, the community overwhelmingly supports the Big Bang theory."

Ark further pointed me to this discussion  where he says “Here is the discussion that shows how it is all confusing. It is like arguing lawyers. Whatever arguments one lawyer would have, another lawyer will always be able to find arguments to the contrary.”

It is here that we notice that apparently Guth’s model proposes that the primal universe, before inflation, was ‘the size of a coin’.  Obviously, that’s a lot smaller than a solar system sized primal atom and far more satisfactory to those who don’t want to have to explain where all that primal mass came from.  And here we see my question: ‘Can space expand with unlimited speed?’ The first answer says ‘yes’, more or less:

The expansion hasn't got a speed. It is a misnomer to say that it is a speed. It should be called expansion rate. It is not like two points having a relative speed, it is more like a scaling rate of the unit distance. If there were no masses in the universe we would not sense any expansion at all. – Oktay Doğangün, Commented Apr 15, 2018

Then, there is a long, complicated answer with equations and graphs.  Go check it out at the link. The bottom line is the following:

·       The current (co-moving) distance of the edge of the observable universe is 46.2 billion ly. Of course, the total universe can be much bigger, and is possibly infinite. The observable universe will keep expanding to a finite maximum co-moving distance at cosmic time t=∞, which is 62.9 billion ly. We will never observe any source located beyond that distance.

·       The edge of the observable universe is receding from us with a recession velocity of more than 3 times the speed of light. 3.18c, to be exact. In other words, we can observe sources that are moving away from us faster than the speed of light. Sources at co-moving distances of 10, 20, 30 and 40 Gly are receding from us at 0.69, 1.38, 2.06 and 2.75 times the speed of light, respectively.

·       Sources outside our particle horizon are moving away even faster. There is no a priori limit to the maximum recession velocity: it is proportional to the size of the total universe, which could be infinite.

A third answer to the question:

Yes, the expansion of space itself is allowed to exceed the speed-of-light limit because the speed-of-light limit only applies to regions where special relativity – a description of the spacetime as a flat geometry – applies. In the context of cosmology, especially a very fast expansion, special relativity doesn't apply because the curvature of the spacetime is large and essential.

So, apparently, the inflation/expansion of the universe at faster than the speed of light does not violate relativity, because it is space that is expanding, not the objects themselves moving within that space.  And, by fiat, there is no limit to the expansion rate of space.
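To get a feel for the numbers in the answer quoted above, here is a minimal sketch in Python of Hubble's law, v = H0 × d. The Hubble constant value of 67.4 km/s/Mpc is my assumption (a commonly cited present-day figure), not a number taken from the quoted answer:

```python
# Recession velocity from Hubble's law, v = H0 * d (a rough sketch, not a full
# cosmological calculation; H0 = 67.4 km/s/Mpc is an assumed value).
H0 = 67.4                          # km/s per megaparsec (assumed)
C = 299_792.458                    # speed of light in km/s
MPC_PER_GLY = 1.0e9 / 3.2616e6     # megaparsecs in one billion light-years (~306.6)

for d_gly in (10, 20, 30, 40, 46.2):   # comoving distances in billions of light-years
    v = H0 * d_gly * MPC_PER_GLY       # recession velocity in km/s
    print(f"{d_gly:5.1f} Gly  ->  v/c = {v / C:.2f}")
```

Run as is, this reproduces the quoted figures to within rounding (about 0.69c, 1.38c, 2.07c, 2.76c and 3.18c): distant regions recede faster than light simply because recession velocity grows in proportion to distance, with no object locally outrunning a light beam.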

Finally, the very day Ark and I were discussing this, I found this article:   'Physics itself disappears': How theoretical physicist Thomas Hertog helped Stephen Hawking produce his final, most radical theory of everything

So, yet another lawyer arguing his case.

When Davies wrote “The Goldilocks Enigma” in about 2006, it was as up-to-date as he could make it.  But it is clear that the ‘science is not settled’ at all.  In the next post, we’ll pick up with Davies again and see where he was going.  I just thought an interlude showing where things are today would be useful. 

Saturday, June 8, 2024

The Goldilocks Enigma – Part Five


 

By Laura Knight-Jadczyk

In the previous post we learned that, apparently, the better our technology, the older the universe appears to be.  Hubble can, apparently, see 28 billion years back in time. We also learned that the universe is unbounded, but still finite, but we still cannot see over the cosmic visual horizon.  We also learned that, apparently, we are more or less trapped in our 3 dimensional world and it is irrelevant that we may be embedded in some higher dimensional space since we can’t perceive it.  I’m not so sure about that since I have long thought that what we call ‘paranormal’ may actually be instances of perception of other/higher dimensions.  We touched on how other dimensions could be hidden from us and Davies mentioned compactification and branes. We also learned that it seems that it appears to be necessary for us to live in a 3D world since a 4D world (or higher) would not be sustainable;  3 dimensions are ‘just right’.  Ark added a note that he would not be surprised if there is both compactification and branes as well as other possibilities to explain what is going on that hides other dimensions from us.  Finally, I included two articles that appear to contradict the Standard Model which I am endeavoring to describe in these posts.

So now, let us continue.  Among the top requirements for a life-friendly universe is a good supply of the chemical elements that are utilized to make and sustain living organisms.  Scientists have been able to use the ‘hot big bang’ theory to extrapolate how these chemicals came into being.  Apparently, at the moment of the big bang, such elements did not exist because, at one second after the big bang, the temperature was about ten billion degrees.  Atoms cannot survive those temperatures, so, at that moment, there was only searing hot plasma of ‘freely moving atomic components.’  Now, Davies writes that “even atomic nuclei would be smashed apart” at such high temperatures, but when he writes that the plasma was composed of ‘atomic components’ he lists protons, neutrons and electrons and I cannot help but ask: where did the protons, neutrons and electrons come from? Who/what decided that plasma should/could/would be composed of these little critters that can make atoms?  According to Wikipedia: “By the first second, the universe is made up of fundamental particles and energy: quarks, electrons, photons, neutrinos and less familiar types. These particles smash together to form protons and neutrons.”  It seems to me that these ‘fundamental particles’ might tell us something about the nature of the alleged primordial mass that ‘exploded’ and became the universe. Well, anyway, let me quote Davies here:

“Most of the protons that came out of the big bang remained free, and were destined to form hydrogen atoms once the universe had cooled enough for each proton to capture an electron.  (That final step didn’t happen for nearly 400,000 years.)  Meanwhile, however, not all the protons were left isolated.  Some of them collided with neutrons and stuck to form deuterium, a relatively rare isotope of hydrogen with one proton and one neutron apiece in each nucleus.  Other protons became incorporated into helium, the next simplest element, which has a nucleus consisting of two protons and two neutrons.  What I am describing is nuclear fusion, a process which is very well understood.  Protons and neutrons could begin combining together to make composite nuclei only once the temperature had fallen enough so that the newly minted nuclei would not immediately be fragmented again by the intense heat.  The window of opportunity for nuclear fusion was limited, however, opening up at 100 seconds or so and closing again after only a few minutes.  Once the temperature dropped below about a hundred million degrees, fusion ground to a halt because the protons lacked the energy to overcome their mutual electrical repulsion.”

Apparently scientists can also calculate how much helium was made and how many protons were left over to make hydrogen, and the answer is roughly three parts hydrogen to one part helium by mass, with nothing else except a tiny amount of deuterium and lithium. The ratio is apparently confirmed by astronomical observations, since every chemical element has a spectral ‘barcode’ in the light it emits by which it can be identified.
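To make the arithmetic behind that ratio concrete, here is a back-of-the-envelope sketch in Python. It assumes (my assumption for illustration, consistent with standard accounts of big-bang nucleosynthesis, not a number quoted by Davies) that there was roughly one neutron for every seven protons when fusion began, and that essentially every neutron ended up inside a helium-4 nucleus:

```python
# Back-of-the-envelope big-bang nucleosynthesis arithmetic (illustrative sketch only).
# Assumed input: about one neutron per seven protons when fusion starts, with
# essentially every neutron ending up bound in a helium-4 nucleus (2 protons + 2 neutrons).

n_per_p = 1 / 7                              # assumed neutron-to-proton number ratio

# Mass locked up in helium: each He-4 holds 2 neutrons plus 2 protons,
# i.e. 2n nucleons out of every (n + p) nucleons overall.
helium_mass_fraction = 2 * n_per_p / (1 + n_per_p)
print(f"helium mass fraction   ~ {helium_mass_fraction:.2f}")       # ~0.25
print(f"hydrogen mass fraction ~ {1 - helium_mass_fraction:.2f}")   # ~0.75, i.e. roughly 3:1 by mass

# Counting atoms instead of mass: the leftover protons become hydrogen,
# giving roughly a dozen hydrogen atoms for every helium atom.
h_atoms_per_he_atom = 2 * (1 / n_per_p - 1)
print(f"hydrogen atoms per helium atom ~ {h_atoms_per_he_atom:.0f}")  # ~12
```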

So, we know that the universe is made mostly of hydrogen and helium in a 3 to 1 ratio and helium is a relic of the first minutes of the big bang.  

The processes of the big bang have been tested and confirmed in high-energy physics experiments in atom colliders such as the one at Brookhaven National Laboratory. Scientists there can see what happened at the point when the universe was “squeezed into a volume of space no larger than the solar system, with temperature almost a million times hotter than the center of the sun.  It turns out that under these extreme conditions even protons and neutrons cannot exist as discrete entities.  Instead, they were melded into an amorphous cocktail of subnuclear fragments.” But still, I ask, where did those subnuclear fragments come from and what can they tell us? There is an alternative idea that tries to suss out this problem:

Quantized Elementary Alternatives 

The quantum theory of the elementary alternative was formulated by German physicist and philosopher Carl Friedrich von Weizsäcker in a series of papers entitled Komplementarität und Logik (Complementarity and Logic) I-III between 1955 and 1958. Weizsäcker calls the elementary alternative ‘das Ur’ (pronounced more like ‘poor’ than ‘pure’), after the German prefix ur-, denoting something like primitive or primordial (compare: Ursuppe, the primordial soup, or Urknall, the primordial bang, or big bang). Hence, the theory of the elementary alternative is known as ur theory, which doesn’t do its googleability any favors.

Weizsäcker’s starting point thus is basic logic—how we should reason about the things in the world. Complementarity, then, is the central phenomenon of quantum theory that entails the necessity of formulating the description of a system in terms that are both mutually exclusive and jointly necessary (as in wave/particle duality; see the previous discussion here). As Weizsäcker argues, this should be a fundamental building block of the logic used to reason about and construct scientific theories. But this itself constrains the theories that can be built, in surprising and illuminating ways.

Weizsäcker’s outlook is, thus, at first brush broadly Kantian: there are certain concepts that we may consider ‘innate’, that dictate the form of our experience of the world. Kant considered, e. g., space and time to be among these; hence, no non-spatial experience is possible, or even imaginable.

This is a break with the atomist tradition. Rather than simply being at the receiving end of unbiased data emanating from the world, the observer in this picture mediates the data through the process of observation—thus, the sorts of theories that can be built do not describe unvarnished reality, but the experience of an observer in the world. The idea of an observer-less world is immediately nonsensical, as the notion of ‘world’ carries that of the ‘observer’ with it.

From there, Weizsäcker proposes to build a theory of physics, taking as its point of origin nothing but the ur theory, that is, the quantum theory of the elementary alternative—the qubit, in modern parlance. In this way, he proposed an information-theoretic grounding for physics three decades before Wheeler ever coined the famous slogan ‘It from Bit’.

Getting back to Davies: notice that he wrote that the universe was “squeezed into a volume of space no larger than the solar system, with temperature almost a million times hotter than the center of the sun.” Our solar system is pretty darn big relative to our planet and us.  Voyager 1 has been traveling for more than 40 years and still has not escaped the influence of our sun at almost 14 billion miles out.  So that ‘primordial atom thingy’ was really huge.  Was it flat like a pancake or round like a ball? What was it? How did it come into being?

Whatever it was, apparently the science tells us that the universe doubled in size between 1 and 2 microseconds (a millionth of a second), but by one second, the expansion rate had dropped to a trillionth of what it was at one microsecond.  The apparent reason for this rapid slow-down was gravitation. The attraction between all forms of matter put the brakes on especially because of the extraordinarily compressed state of matter at the time, i.e. that giant, solar-system sized ‘primal atom’.  Notice that we are talking about matter before matter was supposed to exist.  It was “an amorphous cocktail of subnuclear fragments” that we don’t know anything about.

Davies provides a graph of the rate of expansion of the universe that resembles the curved line designated ‘open’ in the graph below:


The caption beneath Davies’ single line graph says: “How the size of the universe should increase with time according to the general theory of relativity.  It starts out expanding explosively fast at the big bang origin, but progressively slows as the attractive force of gravitation acts like a brake.”  Obviously, there is some discussion about whether the universe is open or not nowadays. 

·       Open universe: One that continues to expand. Gravity slows the rate of expansion but is not strong enough to stop it.

·       Closed universe: One that will eventually collapse back on itself. This would result in a BIG CRUNCH which is the reverse of the Big Bang.

·       Flat universe: The force of gravity keeps slowing down the expansion but theoretically, it'll take an infinite amount of time for it to come to rest.

Research in this area is ongoing and much is not well understood, so keep that in mind.  In 1997, cosmologists determined that the universe appears to be more open  than expected.  They concluded that there must be some other previously unknown force, acting in opposition to gravity, which is pushing the universe apart. This was designated ‘Dark Energy’.

In any event, it appears that the rate of expansion vs deceleration played a very important role in the physical processes taking place at a given time and this was extremely important for the creation of the atoms that are necessary to life. So it appears as though this whole process was ‘controlled’ in some way so as to definitely result in a life-friendly environment. As Davies writes:

“Our universe has picked a happy compromise: it expands slowly enough to permit galaxies, stars and planets to form, but not so slowly as to risk rapid collapse.  … Explosions are normally rather messy affairs. If the big bang had been slightly uneven, so that the expansion rate in one direction outstripped that in another, then over time the universe would have grown more and more lopsided as the faster galaxies receded.  We don’t see that.  Evidently the big bang had exactly the same vigour in all directions, and in all regions of space, tuned to very high precision.  How has the entire cosmos cooperated to achieve this?”

Enter theoretical physicist Alan Guth.  Guth’s idea was ‘inflation’ as opposed to ‘expansion.’  According to Guth, the traditional big bang didn’t need to be uniform or orchestrated; it could be as messy as any other explosion.  Then, the universe almost immediately jumped in size by a huge factor.  An analogy would be something that jumps from the size of a proton to the size of a grapefruit virtually instantaneously.  At that point, the rapid ‘inflation’ stopped and normal ‘expansion’ took over, according to the story of the early universe already described.  This almost instantaneous inflation has the effect of smoothing the universe the same way blowing up a balloon gets rid of any wrinkles.
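Just to put a number on that analogy: taking a proton to be roughly 10⁻¹⁵ m across and a grapefruit roughly 0.1 m (both rough sizes assumed here for illustration, not figures from Davies), the jump only requires a few dozen doublings of size:

```python
import math

# Rough arithmetic for the proton-to-grapefruit analogy (assumed ballpark sizes).
proton_size = 1e-15        # metres, rough proton diameter (assumed)
grapefruit_size = 0.1      # metres, rough grapefruit diameter (assumed)

growth_factor = grapefruit_size / proton_size      # ~1e14
doublings = math.log2(growth_factor)               # how many size doublings that takes

print(f"growth factor    ~ {growth_factor:.0e}")   # ~1e+14
print(f"doublings needed ~ {doublings:.0f}")       # ~47
```

So 'virtually instantaneously' here means on the order of fifty successive doublings of size, each taking an unimaginably small fraction of a second.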


Guth’s idea also included that inflating space in this way made it less curved; inflated enough and it is indistinguishable from flat. Et voila!  Inflation explains uniformity and the apparent flat geometry of space! 


Now, I don’t know about you, but this instantaneous ginormous expansion smacks of the paranormal to me.  You know, cases where objects sort of just materialize out of nowhere and are just suddenly there. But, whatever floats your boat.  Davies acknowledges this:

“Guth’s inflation seems little more than a magic wand.  It would have fallen on deaf ears had Guth not provided a credible physical mechanism to explain how inflation might have occurred… The gravitational pull of the universe serves to diminish the expansion rate progressively.  Inflation does just the opposite: it is a brief episode in which the expansion rate accelerates hugely, causing the universe to swell up super-fast.  Guth proposed that a type of antigravity force was responsible.”  

Conveniently, anti-gravity is built into Einstein’s general theory of relativity.  But where does it come from? Guth proposed a scalar field and he called this hypothetical entity the ‘inflation field.’

(Wikipedia: In affine geometry, uniform scaling (or isotropic scaling) is a linear transformation that enlarges (increases) or shrinks (diminishes) objects by a scale factor that is the same in all directions. The result of uniform scaling is similar (in the geometric sense) to the original. A scale factor of 1 is normally allowed, so that congruent shapes are also classed as similar. Uniform scaling happens, for example, when enlarging or reducing a photograph, or when creating a scale model of a building, car, airplane, etc.)


Back to Davies’ explanation:

“In Newton’s theory, gravitation is generated by mass.  In Einstein’s general theory of relativity, mass is also a source of gravitation, as is energy (remember that Einstein’s equation E = mc² tells us that energy has mass).  But it doesn’t stop there.  Pressure too is a source of gravitation in the general theory of relativity. …if the pressure gets seriously big, it can rival the energy in its gravitating power. ‘Seriously big’ here means the sort of pressure found inside a collapsing star … Another example, however, is a scalar field: it has a pressure comparable to its energy. …But why does the scalar field produce anti-gravity?  The crucial factor is the pressure: for a scalar field it is negative.  Negative pressure isn’t especially exotic: it is no more than what we normally call tension – a stretched elastic band provides a familiar example.  In three dimensions, a block of rubber pulled in all directions would have negative pressure.  Now negative pressure implies negative gravitation – a repulsive, antigravity force.  So a scalar field generates gravity by virtue of its energy, but antigravity by virtue of its (negative) pressure.  A calculation shows that the antigravity beats the gravity by a factor of three, so the net effect of the scalar field is to antigravitate.”
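Davies' 'factor of three' can be seen in the standard acceleration equation of relativistic cosmology (a textbook formula, not something spelled out in the quoted passage), in which pressure sources gravity three times as strongly as the same amount of energy density:

```latex
% Acceleration equation for the cosmic scale factor a(t):
\frac{\ddot{a}}{a} \;=\; -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right),
% and for a vacuum-like scalar field with p = -\rho c^{2}:
\rho + \frac{3p}{c^{2}} \;=\; \rho - 3\rho \;=\; -2\rho
\quad\Longrightarrow\quad \ddot{a} > 0 .
```

With negative pressure equal in magnitude to the energy density, the pressure term (−3ρ) outweighs the energy term (+ρ) by that factor of three, the net source of gravity flips sign, and the expansion accelerates, which is the 'antigravity' Davies is describing.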

So it was: Guth theorized that during the first instant after the birth of the universe a scalar field permeated space, exerting a powerful antigravity effect which induced the universe to leap into runaway expansion. (What manifested this scalar field? How did it come into being?) This antigravity effect had to be strong enough to overpower the incredible gravity of the ‘normal matter in the universe,’ i.e. the solar-system-sized primal atom.  He plugged in some numbers and discovered that the antigravity would not only easily overwhelm that gravity, it would be so strong that the universe would double in size every 10⁻¹⁴ seconds.

The only problem is: what stopped this almost unthinkable expansion?  Guth had an answer for that: the inflation field was inherently unstable and only existed for a brief time.  It just decayed and disappeared, more or less.  Poof! And once it disappeared, the big bang proceeded according to the Standard Model. Well, more or less.  According to Guth, the energy stored in the inflation field became heat, and it was this heat that created protons and electrons and all the 10⁵⁰ tons of matter in the universe.  Afterward, with the field decayed, that heat became the hot radiation whose afterglow we now detect as the CMB.

In any event, Guth’s theory had a flaw: the exit from inflation. Note he thought it was just ‘inherently unstable’, but the decay of the inflation field is a quantum process and thus is subject to the usual unpredictability of quantum fluctuations.  Davies writes:

“As a result, it would decay at different times in different places, in the form of randomly distributed bubbles – bubbles of space, that is, in which the inflation field had decayed surrounded by regions of space where it had not.  The energy given up by the decayed inflation field would be concentrated in the bubble walls.  Bubble collisions would release this energy, as heat, but the process would be utterly chaotic and generate as much inhomogeneity as inflation was designed to remove.  … The solution was to find a theoretical scheme that would avoid bubble collisions and enable the bubbles to grow to a size much larger than the observable universe.  One way to do this is called eternal inflation.”

One thing to notice here: by its very nature, inflation erases the record of what went before and makes it impossible to deal with the question: What caused the Big Bang and what was before it?  Inflation may help to explain the fundamental features of the universe, describing them as purely physical processes, but it appears to prevent penetrating beyond that.

I don’t care much about the fancy mathematical/terminological footwork going on here; what he is describing still amounts to manifesting something out of nothing, or something moving between dimensions and suddenly appearing as an apport.  Notice also that nobody has yet observed a scalar field, according to Davies.

We live in a universe that doesn’t just allow life, it appears to promote it.  If any one of a goodly number of physical properties of our universe were other than they are, life would be impossible.  And so, next time I will continue to probe into these problems and look at the solutions and explanations that have been offered.  

Saturday, June 1, 2024

The Goldilocks Enigma – Part Four

by Laura Knight-Jadczyk



In the previous post, we talked about the beginnings of cosmology as described by Paul Davies.  Davies took on the task of ‘Explaining the Universe’, which means describing the Standard Model and how it came to be.  We learned that there appears to be no ‘center’ to the universe and everywhere you look, the space between galaxies gets bigger and bigger as time goes by.  Thus, there is the expanding balloon analogy which says that the universe can be finite without having a center or an edge.  It also makes us realize that, whatever the ‘Big Bang’ was, it wasn’t exactly an explosion the way we think of explosions.

Davies tells us that a telescope is a ‘timescope’, that when we observe images of distant galaxies, we are seeing them as they appeared long before Earth existed.  That is due to the fact that it takes time for the visual to reach us at the speed of light. 

As light traverses the expanding universe its wavelength stretches along with the stretching space. ….The amount of red shift depends on how long ago (and hence how far away) the light was emitted.  Working back towards the big bang, the red shift gets bigger and bigger. … Going much farther back in time (and out into space), we reach the epoch from which the CMB emanates.  … CMB has travelled to earth relatively undisturbed since about 380,000 years after the big bang.  Before that time the temperature was too high for atoms to exist because the electrons would have been stripped away from the nuclei by the intense heat, i.e. the atoms were ionized.  Physicists refer to a gas in this state as plasma.  Plasmas scatter light strongly and so they are opaque: that is why we can’t peer inside the sun…  when WMAP detects the CMB, it is in effect seeing as far back in time as is possible… No ordinary telescope or microwave antenna, however powerful, can penetrate the glowing fog beyond. 
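The stretching Davies describes is usually written in terms of the scale factor a(t) of the universe: a light wave's wavelength grows by exactly the factor by which space has expanded while the light was in transit,

```latex
1 + z \;=\; \frac{\lambda_{\text{observed}}}{\lambda_{\text{emitted}}}
        \;=\; \frac{a(t_{\text{observed}})}{a(t_{\text{emitted}})} .
```

So a redshift of z = 1 means space has doubled in size since the light set out; the CMB reaches us from a redshift of roughly 1100, which is why the glow of that primordial plasma now arrives stretched into faint microwaves.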

Going in another, but related, direction for a moment: I read an interesting article the other day: James Webb telescope discovers earliest galaxy in the known universe — and it's shockingly big.  In this article we are told:

According to new research, astronomers using the powerful infrared telescope have revealed what appears to be the two earliest, most distant galaxies in the known universe, dating to just 300 million years after the Big Bang.

Besides being exceptionally old, the newly discovered galaxies — named JADES-GS-z14-0 and JADES-GS-z14-1 — are also unusually large for such an early time in cosmic history, according to the discovery paper published May 28 to the preprint server arXiv. With the larger of the galaxies measuring an estimated 1,600 light-years across, the discovery adds to a mounting pile of evidence that the earliest galaxies in the universe grew up much faster than leading theories of cosmology predict to be possible.


That is interesting enough, but here is another problem I read about recently: Astronomers say we may live at the center of a cosmic void 2 billion light-years wide that defies the laws of cosmology, which says:

·       Evidence suggests that our galaxy is inside a cosmic void, a vast expanse of relatively empty space.

·       According to our laws of cosmology, however, this void should not exist.

·       New research says that such a void may explain unusual behavior in nearby galaxies. 

According to a growing list of evidence, we live in the crosshairs of a giant cosmic void — the largest ever observed. Astronomers first suggested such a void in 2013 and the evidence for its existence has been stacking up ever since.

But the kicker is that this giant void shouldn't exist in the first place. If it does exist, that means something is probably amiss with our understanding of the cosmos.

According to a fundamental theory of cosmology called the cosmological principle, matter in the universe should be uniformly distributed on very large scales.

The reason this matters is that by assuming uniformity, scientists can apply the same laws of physics to nearby objects as objects at the fringes of the early universe. In other words, everything operates under the same universal laws. 

However, multiple observations over the last decade suggest that matter in the universe may clump into regions of high- and low densities, meaning it's not so uniform, after all.

"By now it's pretty clear that we are in a significant underdensity," Indranil Banik, a postdoctoral research fellow at the University of St. Andrews, told Business Insider.

"There's a few people that are still opposed to it to a limited extent. For example, some people have correctly argued that such a void shouldn't exist in the standard model, which is true. That unfortunately doesn't prove it's not there," he added.

Banik co-authored a paper published late last year in the peer-reviewed journal Monthly Notices of the Royal Astronomical Society that suggests we may live near the center of this void — called the KBC void — about 2 billion light-years across. Wide enough to fit 20,000 Milky Way Galaxies in a row stretching from one end to the other. …

The KBC void isn't totally empty. It can't be, because we live in it. But, if Banik and his colleagues' calculations are correct, the void would be about 20% emptier than space outside its border.

That may not seem like a big deficit, but it's enough to cause some confusing behavior in our local cosmic neighborhood, according to the recent study.

In particular, nearby stars and galaxies are moving away from us faster than they should be. Cosmologists have a value, called the Hubble constant, which they use to help describe how fast the universe's expansion is accelerating.

The Hubble constant should be the same value wherever you look, whether it's close by or very far away. The problem is that the galaxies and stars in our local neighborhood appear to be moving away from us faster than the Hubble constant predicts, essentially defying our law of cosmology that describes how the universe grows and evolves.

Astronomers can't agree on what's causing this discrepancy in the Hubble constant, and the contention has become known as the Hubble tension.

Banik and his colleagues suggest that the void could be a solution because high-density regions with stronger gravity outside the void could be pulling galaxies and stars toward them.

Banik argues that these outflows could explain why cosmologists have calculated a higher value for the Hubble constant when looking at nearby objects. Stuff moves faster in the void, flying out of our empty region towards crowded outer space.

New research suggests the KBC void is a 2 billion light-year-wide expanse of relatively empty space, and our galaxy sits right near the center of it. (Image: © Pablo Carlos Budassi / Wikimedia Commons)

Back to our main topic.  According to the Standard Model, which Davies is describing, light can have travelled for at most 13.7 billion years, and we cannot see beyond that point; so, for a while, it was said that the universe is that old. (Well, getting older every second!)  Nowadays, we can see much further thanks to the Hubble telescope.  Apparently Hubble can see at least 28 billion years back in time.  We also have to keep in mind that even as light is moving across the universe, space itself is expanding ahead of it and therefore, travel time is greatly extended.

Scientists have estimated that the observable universe contains about 10⁵⁰ tons of visible matter, which combines to create a powerful gravitational field that warps the geometry of space.  “So, what is the shape of space?” Einstein asked himself in 1917.  Because gravitation warps the geometry of space, in Einstein’s mathematical model of the universe this warping, averaged over billions of light years, makes space a hypersphere. According to Einstein, one can set off in one direction and keep going and going and end up back where one started.  The universe is unbounded, but finite.

 


As Davies says, we don’t know what lies over the cosmic visual horizon but it is probably more of the same, especially considering the Einstein cosmic balloon model.  That being said, what is inside the balloon?  What is inside Einstein’s ‘hypersphere’? 

Well, apparently, we can’t know that because we are trapped on the 3-dimensional surface of the sphere.  And the same holds true for the ‘exterior’ of the balloon.  We really are like the beings in the novella “Flatland”, only it’s a little more complicated because we are 3D beings in a 3D world, and we are talking about hyperdimensions basically all around us.  Paul Davies thinks this is irrelevant:

Try to put yourself in the position of a pancake-like creature restricted to life on the surface of a round balloon.  The pancake might conjecture about what lies inside the balloon (air, empty space, green cheese…), but whatever there is doesn’t affect the pancake’s actual experience because it cannot access the space inside the balloon, or receive any information from it.  … the pancake doesn’t need a god’s eye view of the balloon to conclude that its world is spherical – closed and finite, yet without boundary.  The pancake can deduce this entirely by observations it can make from the confines of the spherical surface: the sphericity is intrinsic to the surface, and does not depend on it being embedded in an enveloping three-dimensional space.  How can the pancake tell?  Well, for example, by drawing triangles and measuring whether the angles add up to more than 180°.  Or the pancake could circumnavigate its world.  In the same vein, humans could deduce that we are living in a closed, finite, hyperspherical Einstein space without reference to any higher-dimensional embedding or enveloping space, merely by doing geometry within the space.  So, the existence or otherwise of an ‘interior’ or ‘exterior’ region of the Einstein universe, not to mention what it consists of, is quite simply irrelevant.  But if you would like to imagine inaccessible empty space there for ease of visualization, then go ahead.  It makes no difference.
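The triangle test Davies mentions has a precise form, Girard's theorem for a sphere of radius R: the amount by which the three angles exceed 180° is proportional to the triangle's area,

```latex
\alpha + \beta + \gamma \;=\; \pi + \frac{A}{R^{2}} .
```

A triangle covering one octant of a globe (from the pole down to the equator, a quarter of the way around, and back up) has three right angles, 270° in total, and the same sort of purely intrinsic measurement carries over, one dimension up, to the hypersphere Davies is describing.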

Yet, Davies then begins to discuss other dimensions than the 3 we experience.  Since we can’t see them, they must be hidden.  But how?  According to Davies, there are two ways.  The first was suggested by Oskar Klein in the 1920s.  His idea was to consider a hose which, from a distance, looks like a line.  When you get closer, the ‘line’ turns out to be a two-dimensional sheet rolled into a tube.  A point on the line would then turn out to be a circle around the circumference of the tube.  His suggestion was that what we take to be points of 3D space are actually little circles going around a 4th dimension.  This ‘rolling up’ of dimensions is called ‘compactification’ and there is no limit to the number of extra dimensions that can be compactified though there are a variety of ways they can compactify.  The different shapes that can result are referred to as ‘topologies’.  The more dimensions, the more possible topologies.  So, when talking about the shape of space, you have to specify how many ‘large’ (i.e. seen) dimensions there are, and how many are compactified (i.e. unseen).

The second way that extra dimensions might be hidden from view would be if we are trapped in the three dimensions we observe and are not able to move in the extra dimensions.  Such trapping would also trap light, which is why we could not see the 4th dimension.  The idea that we are prisoners in our 3D reality emerges naturally in what are known as ‘brane theories’.  There it is suggested that our 3D universe is a ‘three-brane’ embedded in four space dimensions.

Davies concludes that there seems to be a reason that nature has decreed that we live in a three-dimensional world (however many hidden dimensions there are).  According to the English mathematician Gerald Whitrow, if space had four dimensions and the laws of gravitation and electromagnetism remained unchanged, the inverse square law would become an inverse cube law, and the Earth would have spiraled into the Sun long ago, among many other disasters. Life would be impossible in space with any more ‘large’ dimensions than three.  So, as Davies notes, three dimensions are ‘just right’, like baby bear’s porridge was for Goldilocks.
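Whitrow's argument can be sketched with a standard result from classical mechanics (my paraphrase of the usual textbook reasoning, not Davies' wording): in d spatial dimensions the gravitational force falls off as the inverse (d − 1) power of distance, and circular orbits under an attractive force falling as 1/rⁿ survive small radial nudges only when n < 3:

```latex
F(r) \;\propto\; \frac{1}{r^{\,d-1}},
\qquad
\text{stable circular orbits require } d - 1 < 3
\;\Longleftrightarrow\; d < 4 .
```

So with four or more large space dimensions the inverse square law becomes inverse cube or steeper, planetary orbits (and, by a related argument, electron orbits in atoms) are unstable, and Whitrow's conclusion that three large dimensions are 'just right' follows.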



To be continued .......
